Technology

Predictive Coding and Technology-Assisted Review After Da Silva Moore

Editor: Skip, give us some background about your roles at Dorsey.

Durocher: I am a litigator in the trial department of Dorsey & Whitney and work on e-discovery matters. Caroline works on the technology side of e-discovery. We have both been doing technology-assisted review for many years.

Editor: With Da Silva Moore and other cases, predictive coding is top of mind for many. What is your take on predictive coding and technology-assisted review?

Durocher: Predictive coding has long been on the horizon. As a result of Judge Peck’s decision in the Da Silva Moore case, it has now entered the mainstream. Judge Peck wrote an article about a year ago in which he advocated the use of predictive coding. This gave litigators some comfort, but there was still a bit of reluctance until a court actually “blessed” its use. It is no surprise that the judicial blessing came from Judge Peck.

The affirmation of Judge Peck’s decision in Da Silva Moore by District Court Judge Carter opens up the use of this technology. Technology-assisted review is certainly here to stay, and predictive coding is going to become much more commonplace with the Da Silva Moore case and the continued push by clients to reduce costs.

Sweeney: I would like to add that if law firms are to remain competitive, they are going to have to adapt to the use of predictive coding. This will be driven by clients’ concerns about controlling costs. With the court’s affirming that it is an appropriate approach to document review, it will be used more frequently.

Durocher: When technology-assisted review became feasible, people realized that the old gold standard, attorney review of every document, was no longer affordable. Over the years, we also learned that technology-assisted review is more accurate and more consistent than subjective human review, in which each person makes his or her own judgment call.

Editor: Please discuss Dorsey’s approach to technology-assisted review, the technologies it uses, the workflow and the benefits.

Sweeney: We typically start with negotiating with opposing counsel and targeting our collection of data. The more you can limit the scope, the more you are able to control costs.

We are using technology-assisted review in almost every litigation document review that we do. Almost every litigation case we receive is put into the Ringtail system. Ringtail's functionality allows us to use the full spectrum of analytics capabilities for different portions of a technology-assisted review.

Once we have the data in hand, we use the Ringtail platform to help us reduce the data population for review. It enables us to perform de-duplication, removal of system files, email thread identification, identification of foreign-language content, and date culling.
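As a rough illustration of the culling steps Sweeney describes, the sketch below applies exact de-duplication by hash, a system-file filter and a negotiated date window to a toy collection. It is not Ringtail's implementation; the file names, extension list and date range are hypothetical.

```python
# Illustrative sketch of common culling steps (de-duplication, system-file
# removal, date culling). Not Ringtail's implementation; the paths, fields
# and thresholds are hypothetical.
import hashlib
from datetime import date

documents = [
    {"path": "mail/0001.msg", "sent": date(2011, 3, 14), "body": b"Quarterly numbers attached."},
    {"path": "mail/0002.msg", "sent": date(2011, 3, 14), "body": b"Quarterly numbers attached."},
    {"path": "sys/ntuser.dat", "sent": date(2010, 1, 1), "body": b"\x00\x01"},
]

SYSTEM_EXTENSIONS = {".dat", ".dll", ".exe"}           # hypothetical exclusion list
REVIEW_WINDOW = (date(2011, 1, 1), date(2012, 6, 30))  # negotiated date range

seen_hashes = set()
review_population = []
for doc in documents:
    ext = "." + doc["path"].rsplit(".", 1)[-1]
    if ext in SYSTEM_EXTENSIONS:
        continue                                       # drop system files
    if not (REVIEW_WINDOW[0] <= doc["sent"] <= REVIEW_WINDOW[1]):
        continue                                       # date culling
    digest = hashlib.md5(doc["body"]).hexdigest()
    if digest in seen_hashes:
        continue                                       # exact de-duplication
    seen_hashes.add(digest)
    review_population.append(doc)

print(f"{len(review_population)} of {len(documents)} documents survive culling")
```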

After processing, we start with the Ringtail Analytics platform, which enables us to do early case assessment by looking at the key concepts in the documents and vetting our keyword searches. For example, we can see how many search terms hit per custodian and what date ranges were collected, allowing us to prioritize the review. Or, we might find concepts in the documents that lead us to consider new search terms or to locate key documents.
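A minimal sketch of the early case assessment reporting described above, tallying search-term hits and collected date ranges per custodian; the custodians, terms and documents are invented for illustration.

```python
# Illustrative early case assessment tally: search-term hits per custodian
# and the date range collected for each. Hypothetical data, not Ringtail code.
from collections import Counter, defaultdict

SEARCH_TERMS = ["merger", "forecast", "write-down"]
documents = [
    {"custodian": "Smith", "date": "2011-04-02", "text": "Revised forecast for the merger."},
    {"custodian": "Jones", "date": "2011-05-17", "text": "No write-down expected this quarter."},
    {"custodian": "Smith", "date": "2011-06-30", "text": "Board deck on the merger timeline."},
]

hits = defaultdict(Counter)
date_range = {}
for doc in documents:
    lowered = doc["text"].lower()
    for term in SEARCH_TERMS:
        if term in lowered:
            hits[doc["custodian"]][term] += 1
    lo, hi = date_range.get(doc["custodian"], (doc["date"], doc["date"]))
    date_range[doc["custodian"]] = (min(lo, doc["date"]), max(hi, doc["date"]))

for custodian in hits:
    print(custodian, dict(hits[custodian]), "dates:", date_range[custodian])
```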

We then target the more relevant portions of the data collection for the review process. Using the Ringtail automated workflow system, we put data into review batches.

Dorsey’s review team uses the Document Mapper functionality of Ringtail to conduct the document review, looking at the documents in concept clusters. Document Mapper and the concept-clustering functionality consistently receive positive feedback from our review attorneys. They like the visual clustering, which has increased our review rates while offering greater consistency and quality, since similar documents, including near duplicates, cluster together.

Mapper also allows us to seed key documents within a review set, so similar documents cluster around the seed document and we can identify the responsive documents more quickly.
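The concept-clustering and seeding behavior described here can be approximated with generic text-analytics building blocks. The sketch below, which assumes scikit-learn and is not Ringtail's proprietary algorithm, vectorizes a handful of toy documents, clusters them and ranks them by similarity to a seed document.

```python
# Rough sketch of the concept-clustering and seed-document idea: vectorize
# text, cluster it, and rank unreviewed documents by similarity to a known
# key ("seed") document. Generic scikit-learn components, invented documents.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Quarterly forecast and revenue projections for the merger.",
    "Merger forecast deck, revenue projections attached.",
    "Office holiday party planning and catering menu.",
    "Catering menu options for the holiday party.",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(docs)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print("cluster assignments:", clusters.tolist())

# Seed on the first document and rank the others by similarity to it.
seed_similarity = cosine_similarity(vectors[0], vectors).ravel()
ranked = sorted(range(1, len(docs)), key=lambda i: seed_similarity[i], reverse=True)
print("review order relative to seed:", ranked)
```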

We also upload our privilege and keyword search terms so that during the review process those terms are highlighted. That helps the review team decide whether a document is privileged or which keyword term might make it responsive. And, again, similar documents will cluster together, allowing for consistent privilege and responsiveness calls.
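A simple sketch of that term highlighting, treated here as a flagging step with hypothetical privilege and responsiveness terms; a production review platform does this inside the document viewer rather than with ad hoc regular expressions.

```python
# Minimal sketch of flagging privilege and responsiveness terms during review.
# The term lists and the plain regex matching are illustrative only.
import re

PRIVILEGE_TERMS = ["attorney-client", "legal advice", "privileged"]
RESPONSIVE_TERMS = ["merger", "forecast"]

def flag_terms(text, terms):
    """Return the terms from `terms` that appear in `text` (case-insensitive)."""
    return [t for t in terms if re.search(re.escape(t), text, re.IGNORECASE)]

doc = "Privileged and confidential: legal advice regarding the merger forecast."
print("privilege hits:", flag_terms(doc, PRIVILEGE_TERMS))
print("responsive hits:", flag_terms(doc, RESPONSIVE_TERMS))
```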

In one case, we received 26 Concordance databases and terabytes of data from the other party. We were able to use the Ringtail Analytics function to quickly identify the key documents. As a result, we did not have to do a linear review of those documents to figure out what had been produced to us. We are particularly impressed by the ability of Ringtail to identify concepts and provide visual analytics in a cost-efficient and effective manner.

Many other firms and their clients send documents outside the firm for review to third-party vendors in India or other countries. Ringtail enables us to do document review on a cost-competitive basis, whether or not we are handling the case.

People are still needed in the document review process – it is not all about the technology. Our attorneys with case expertise are involved in the initial assessment process of identifying key concepts, prioritizing documents for review and selecting seed documents. Contract attorneys conduct the review with the Mapper platform, and associates conduct quality control, using a sampling method, to be sure we are getting correct responsive, non-responsive and privilege calls. It is an iterative process that goes back and forth, producing better and more refined results.
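The sampling-based quality control Sweeney mentions can be thought of as drawing a random sample of first-pass calls and measuring how often a second reviewer disagrees. The sketch below is purely illustrative, with an invented population, sample size and disagreement rate.

```python
# Hedged sketch of sampling-based quality control: draw a random sample of
# first-pass coding calls, re-review them, and estimate the error rate.
import random

random.seed(0)
# (document id, first-pass call) pairs from the contract-attorney review.
first_pass = [(i, random.choice(["responsive", "non-responsive"])) for i in range(1000)]

sample = random.sample(first_pass, 50)  # QC sample for associate re-review

def second_review(call):
    # Stand-in for the associate's independent call; here we flip about 5%.
    return call if random.random() > 0.05 else (
        "non-responsive" if call == "responsive" else "responsive")

disagreements = sum(1 for _, call in sample if second_review(call) != call)
print(f"observed disagreement rate: {disagreements / len(sample):.1%}")
```

If the observed disagreement rate is too high, the affected batches go back for re-review, which is what makes the process iterative.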

Once the whole process is done and we have our set for production, associates on the Dorsey legal team are able to use the documents in the set to support our case.

Durocher: Partners and senior associates need to be involved in the critical aspects of the predictive-coding process. A high-level understanding of the case is necessary to generate the initial seed documents and then to evaluate each round of documents fed into the predictive coding software.
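One round of that feeding-and-evaluation loop might look like the following sketch: a classifier is trained on attorney-coded seed documents and then scores the unreviewed population so the next review round can be prioritized. Actual predictive coding platforms differ in their models and sampling strategies; this uses generic scikit-learn components and invented documents.

```python
# Simplified sketch of one round of a predictive-coding loop: train a text
# classifier on attorney-coded seeds, then score unreviewed documents so the
# next review batch can be prioritized. Generic components, toy data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

seed_texts = [
    "Merger forecast and revenue projections.",  # coded responsive
    "Board approval of the merger terms.",       # coded responsive
    "Holiday party catering menu.",              # coded non-responsive
    "Parking garage access badge request.",      # coded non-responsive
]
seed_labels = [1, 1, 0, 0]

unreviewed = [
    "Updated merger projections for the board.",
    "Lunch menu for the holiday party.",
]

vectorizer = TfidfVectorizer(stop_words="english")
X_seed = vectorizer.fit_transform(seed_texts)
model = LogisticRegression().fit(X_seed, seed_labels)

scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
for text, score in sorted(zip(unreviewed, scores), key=lambda p: p[1], reverse=True):
    print(f"{score:.2f}  {text}")
# Attorneys review the top-scored documents, correct the model's calls, and
# those corrected calls become additional training examples for the next round.
```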

Sweeney: Our approach is about people, process and technology; it is not just turning everything over to a black box and saying, “Find the responsive documents in this collection.”

Editor: Is the usefulness of technology governed by the size of the case or the type of matter?

Sweeney: Technology-assisted review and the use of analytics and document-concept clustering are not dependent on the type of matter or the size of the case. However, we do see definite benefits in large cases when a quick turnaround is important.

Durocher: The cases that generate the most documents seem to be the huge intellectual property lawsuits and antitrust cases. But any business dispute can also generate great volumes of documentation and email. The Da Silva Moore case was an employment class action lawsuit that generated 2.4 million emails.

Editor: What questions do your clients ask about predictive coding and technology-assisted review?

Durocher: Clients are most concerned about the accuracy and consistency of the system. They want to know whether it will be accepted by the court or challenged by the other side. They also ask about the cost savings it generates.

Sweeney: We have many clients with very active litigation profiles. Those clients tend to be savvy about technology and push us to use it appropriately. There is still a lot of trepidation as to the different approaches to predictive coding and technology-assisted review because they come in different flavors with the different vendors. I see a lot of clients trying to gain knowledge about what is out there and what the differences are so they will know when to use it in their matters.

Editor: What advice would you give to anyone considering technology-assisted review?

Durocher: They should resist the tendency to turn things over to a vendor and say, “This is your problem.” I see this happening most frequently when both clients and law firms lack significant in-house capability. The client and the law firm both have to be intimately involved in the process for the reasons we’ve discussed.

Sweeney: I agree. Clients need to understand what they are getting with the predictive coding platform that is being used. They need to understand to some extent how the black box works and what the quality control and the overall workflow processes are.

Durocher: What is really important to those looking into technology-assisted review is transparency and cooperation. In the past, there was typically not a lot of cooperation during the discovery process. With the advent of high-stakes litigation, with millions of documents in play, litigators are being pushed in the direction of cooperation. The Da Silva Moore case is a good example because the parties actually agreed to the use of predictive coding – they just couldn’t agree on the details of that coding process. Even if you reach some general level of agreement on the big-picture issues of how the discovery process should proceed, you may still need the Judge Pecks of the world to manage the process and to resolve disputes.

Editor: Why did you select Ringtail as your technology platform?

Sweeney: Dorsey selected Ringtail about five years ago when we were making the move from Summation. As we discussed earlier in this interview, we selected Ringtail because of its functionality and roadmap, and we liked, and continue to approve of, where FTI was taking it.
