Discovery

Wowed by Wizardry? Clients want practical tools for typical problems

MCC: What was the basis for establishing UBIC’s review center in Washington, D.C.?

Guttman: There were two major considerations. The first relates to our reading of the D.C. market. Given the high volume of cases that are regulatory in nature, it’s important to have a local presence where attorneys can come onsite and work side by side with our review teams. The second was really an internal consideration. Setting up shop in D.C. fit with UBIC’s strategic plan for growth within the industry; we felt that being in the district really was a prerequisite to that goal.

MCC: How broad is the scope of capabilities this facility can handle?

Guttman: We're ready to engage with clients on any data analysis request, most of which involve the legal review of documents. Production in response to a subpoena, internal investigations and regulatory responses are the three main drivers of the bulk of our review work.

That said, we also help businesses with structured and unstructured data analysis, either to investigate areas that don't quite fall into the legal sphere, which may relate to HR or internal policy, or in the happier circumstance where we are helping a company with business intelligence and giving guidance that helps them make money. The D.C. facility is configured for a range of projects, from large second-request projects to very small, discrete teams doing legal research.

MCC: What are your goals in terms of e-discovery?

Guttman: The true goal of e-discovery is substantial compliance with the production demand within a subpoena. Goal two is knowledge extraction, which isn't talked about as much in the industry, and that’s really a disservice to our clients. Producing to a subpoena makes your adversary happy, but the knowledge that resides within the story of your data bears directly on your own ability to win that case.

The framework for achieving those two goals must be defensibility, which, in turn, is driven by communication, memorializing decisions, and making sure that your team is well trained and stays on target. Our workflow is set up to achieve all of that.

MCC: How do you select and develop the tools you employ as part of a managed review?

Guttman: My group understands that technology is the only way to get through the massive amounts of data that even the simplest projects demand. There are wonderful tools out there, everything from predictive-coding engines to the concept clustering and e-mail threading that exist within various technology-assisted review (TAR) applications. All of these technologies can be extraordinarily useful under certain circumstances. Nothing's a panacea, of course, and there's no one-size-fits-all solution, so we are very careful about knowing all available tools and recommending those that provide the greatest advantage and efficiencies. Even very well-managed human review has issues of scale, and it's very difficult to get a team of 200 attorneys all to make consistent judgments across a review protocol. That's where technology can vastly improve the results.
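
To make that consistency problem concrete, here is a minimal sketch of one common quality-control measure: Cohen's kappa, computed over an overlap sample that two reviewers both coded. The data is hypothetical, and this illustrates the general technique rather than UBIC's own QC process.

```python
# Minimal sketch: chance-corrected agreement (Cohen's kappa) between two
# reviewers' binary responsiveness calls on the same overlap sample.
# The coding decisions below are hypothetical.

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length lists of 0/1 coding decisions."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    p_a, p_b = sum(a) / n, sum(b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)  # agreement expected by chance
    return (observed - expected) / (1 - expected)

# 1 = responsive, 0 = not responsive, on ten documents both reviewers coded
reviewer_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
reviewer_2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 0]

print(f"kappa = {cohens_kappa(reviewer_1, reviewer_2):.2f}")  # 0.40: moderate at best
```

Run across hundreds of reviewer pairs, a statistic like this flags drift early, which is exactly the scale problem technology is meant to address.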

In terms of developing our own tools, UBIC is an artificial intelligence company, a technology company that has made a deep investment in its development team and approach. That enables our review teams, as users of the tools, to share insights on what makes a great review platform. We are akin to a Formula 1 racing team. Ford, Chevy and the other major manufacturers all have racing teams, partially for bragging and branding rights, of course, but also to serve as research and development incubators. For example, when Al Unser, Jr. drives a Ford car, he can provide valuable feedback, such as “the car drifts to the left when I go above 150 miles per hour.” Ford engineers will take that insight and make the requisite improvements to my own Ford Edge, which now drives more smoothly and handles better. Even though I'll never be driving at 150 miles per hour, the car and the consumer still benefit from that knowledge.

Because our managed review group routinely interacts with UBIC’s tools, we can tell our technology teams: “Here's what makes a review efficient, and here's what end users need to be able to do as seamlessly as possible.” It's not enough to say that the database contains all of the records within a given review. Users need to be able to see through dashboards, reporting and targeted searches what's going on with their projects. They want information that enhances decision making and strategy development and, ultimately, shows how well prepared they are to win their case. That's really what we're doing here, right? It's not about sorting. It's about winning a case.

MCC: How do review-centric tools translate into the cost benefits that corporate clients are looking for?

Guttman: It's not about cost being higher or lower, per se. The big concern we hear from corporate counsel is spend certainty – knowing on day one what the budget will look like regardless of what happens before you reach production.

In 2012, the RAND Corporation found that human review accounts for about 73 percent of e-discovery costs, and to be frank, there's a lot of dissatisfaction right now with the value story, as well as uncertainty about quality. Clients want good technology and review platforms that drive efficiency without sacrificing defensibility. Our review teams need to be able to attest that our process was purposefully designed, reasonable and proportional – really, that it meets all the given criteria. Put simply, we lawyers need to know whether we're performing our jobs diligently or whether we're being negligent. We have to perform at a high level for our clients.

At UBIC, we’ve found that tool selection is not only a big factor in driving down that 73 percent but also a key lever for addressing the spend issue more fundamentally, through the pricing model itself. Our robust workflow for managed review can be wrapped in a per-document billing model, which gives clients spend certainty on day one.
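
As a rough illustration of why per-document billing produces day-one certainty, consider the arithmetic below; the counts and rates are hypothetical, not UBIC's pricing.

```python
# Hypothetical illustration of spend certainty under per-document billing,
# contrasted with an hourly model whose total floats with reviewer pace.

def per_document_budget(doc_count: int, rate_per_doc: float) -> float:
    """Total spend is fixed the moment the review population is known."""
    return doc_count * rate_per_doc

population = 150_000  # documents promoted to review (hypothetical)
print(f"Day-one budget: ${per_document_budget(population, 0.85):,.0f}")

# Hourly exposure depends on throughput, which is unknown on day one:
for docs_per_hour in (40, 55, 70):            # hypothetical review paces
    cost = population / docs_per_hour * 60.0  # at a $60/reviewer-hour rate
    print(f"Hourly model at {docs_per_hour} docs/hr: ${cost:,.0f}")
```

The per-document figure is known immediately; the hourly figure swings by tens of thousands of dollars depending on pace alone.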

Further, before we begin a managed review project, we work closely with clients through a culling process that reduces the review population to the smallest possible target. As a result, our clients can tackle the “73 percent issue” through a combination of a strong managed review workflow, creative pricing and an advanced technology suite. They are in control, and they have the knowledge they need to make early-stage decisions about the strategy of the e-discovery process.
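
The culling process itself follows a familiar funnel. The sketch below, with hypothetical field names and filter stages, shows the general shape: de-duplicate, apply date and search-term restrictions, and only then price and review what survives. It is an illustration, not UBIC's actual workflow.

```python
# Hypothetical culling funnel; field names ("md5", "sent", "text") and
# the filter stages are illustrative.
from datetime import date

def cull(docs, start: date, end: date, terms):
    """Return the documents that survive three common culling stages."""
    seen, survivors = set(), []
    for doc in docs:
        if doc["md5"] in seen:                     # stage 1: de-duplication
            continue
        seen.add(doc["md5"])
        if not (start <= doc["sent"] <= end):      # stage 2: date restriction
            continue
        body = doc["text"].lower()
        if not any(t in body for t in terms):      # stage 3: search terms
            continue
        survivors.append(doc)
    return survivors

# Each stage shrinks the population that per-document pricing applies to.
```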

MCC: How has UBIC’s proprietary artificial intelligence-based software program, Virtual Data Scientist (VDS), expanded your discovery capabilities?

Guttman: Honestly, VDS operates at a higher level than I can explain, but the crux lies in its ability to acquire expert tacit knowledge, meaning non-explicit information, such as body language or even human speech patterns. You are not taught those things. You acquire them. Similarly, VDS acquires knowledge about what is responsive in a document population, not by matching documents to predefined tags, for example, but rather by examining the decision makers’ choices within a given training set. In this way, VDS is unique in the market.

That said, clients are not always interested in the wizardry behind the curtain. They want practical tools that address typical problems. For example, during one project, we were a week out from production, and suddenly 200,000 documents were discovered that needed to be reviewed. Virtual Data Scientist, and the predictive coding workflow that attaches to it, allowed us to use a very small seed set of documents, just several hundred, to code the rest of the database quickly and effectively. The client was just flabbergasted that we were able to solve that conundrum with such a light footprint of human effort.
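
For readers curious about the mechanics, the sketch below shows the general shape of such a seed-set workflow using an off-the-shelf classifier from scikit-learn. It is a generic stand-in for illustration, not the Virtual Data Scientist implementation.

```python
# Illustrative predictive-coding pass: a few hundred attorney-coded seed
# documents train a classifier that then ranks the unreviewed population.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def rank_unreviewed(seed_texts, seed_labels, population_texts):
    """Fit on the seed set; score every unreviewed document for responsiveness."""
    vectorizer = TfidfVectorizer(max_features=50_000, stop_words="english")
    model = LogisticRegression(max_iter=1000)
    model.fit(vectorizer.fit_transform(seed_texts), seed_labels)  # labels: 0/1
    scores = model.predict_proba(vectorizer.transform(population_texts))[:, 1]
    # Highest-scoring documents go to human review first; low scorers
    # can be sampled for validation rather than reviewed exhaustively.
    return sorted(zip(scores, population_texts), key=lambda p: p[0], reverse=True)
```

A few hundred labeled examples ranking 200,000 documents is precisely the "light footprint of human effort" described above.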

VDS also helps with up-front organization – culling, concept clustering, predictive coding and all the common techniques in our industry. By embedding this software within review tools, UBIC’s clients are able to approach discovery at a much higher level, find data that helps them win cases and consistently reduce that “73 percent” to a much lower proportion of the overall e-discovery spend.

MCC: Let’s go back to regulatory investigations. Please talk about targeted capabilities within the new D.C. facility.

Guttman: In specific response to the D.C. market, we developed a second-request workflow to handle requests by the Department of Justice when it examines a possible merger of two companies in a like industry. The DOJ is looking for issues that may negatively affect consumers post-merger, and it does this by examining company documents for clues about the marketplace, conglomeration and the pricing effect of eliminating products, and about whether such a merger would enable the merged entity to assert monopoly power.

Virtual Data Scientist helps us identify clusters of documents that discuss concepts related to market power. The documents may not contain the words “monopoly” or “product” or “price,” but they're talking about those concepts, and it’s important for GCs and their outside counsel to look at these key documents while developing strategies during the early stages. Depending on the story within those documents, they may feel more or less confident about their ability to get the deal approved, which will affect how much they're willing to spend and how aggressive they want to be.
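
A simple way to picture concept-level search is with text embeddings, where documents are matched on meaning rather than keywords. The sketch below uses the open-source sentence-transformers library and a small public model purely as a stand-in for what a tool like VDS does with far greater sophistication; the query and documents are hypothetical.

```python
# Illustrative concept search: surface documents that discuss market power
# without containing words like "monopoly" or "price". Library, model and
# sample data are public stand-ins, not UBIC's technology.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small public model

query = "ability to raise prices after a competing product is eliminated"
docs = [
    "Once their line is gone we can move the whole category up 15 percent.",
    "The lunch menu for the quarterly offsite is attached.",
]
scores = util.cos_sim(model.encode(query, convert_to_tensor=True),
                      model.encode(docs, convert_to_tensor=True))[0]
for doc, score in zip(docs, scores):
    # The first memo ranks high despite sharing no keywords with the query.
    print(f"{float(score):.2f}  {doc}")
```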

VDS provides a powerful, fact-based search that accelerates this process. A lot of artistry goes into it, especially since this is not a solo activity. You’re beholden to what a judge and opposing counsel are comfortable with, or what they might further demand should you deviate from their prescription. Also, there is a lot of uncertainty right now about sharing one’s work product within a TAR process. Do I have to turn over my seed set and the thinking behind culling it? If so, then the decision about whether to use these really great tools is not black and white, which is unfortunate. As software providers, we need to take a much more advanced approach to recommending tools, one that comprehends those nuances and focuses with laser precision on the ultimate goal of making projects successful.

MCC: Please expand on the consultative aspect of managed review, the strategic difference it brings to the process, and how that adds value for your clients.

Guttman: There are a lot of players in this industry, on the firm side, the corporate side and the vendor side. Many different toolkits have emerged on this landscape, and many different data populations may be implicated in legal, regulatory and internal investigatory proceedings. We now have years of experience, and whether we're lawyers or non-lawyers in the industry, we know that the opposing side is likely just as sophisticated.

In this day and age, we've got to be prepared to act as consultants and say to clients, “Your focus is on winning your case. Let us provide guidance that draws on expertise with technology tools and with how humans interact with them, plus experience handling the data.” Review teams can be a tremendous source of consultative knowledge provided that everyone is aligned with the e-discovery process, from forensic collection, through hosting, processing and review management, and back out to production and trial prep. Everyone in that spectrum has to be prepared not just to report on what they're doing but to provide normative statements about best practices. We’re too far into this not to give our clients the benefit of all that experience.
