Discovery

Patience Yields Optimal Results: Acquiring and comprehending the right data makes any legal response more economical and efficient

Corporate litigation often means high-profile investigations, front-page news articles, tight deadlines and millions of dollars at issue – usually with zero room for error. There’s almost always a lot at stake, but this is the business we’ve chosen. Attorneys and litigation consultants are routinely thrust into high-pressure situations to help clients navigate complex litigation. We are expected to hit the ground running, wherever the matter takes us, and rapidly build a command of the business that clients have spent years or even decades acquiring. The best-performing litigators accomplish this in short order.

Data management is a key task for litigation data consultants. This includes prospecting, acquiring, validating, centralizing and analyzing vast amounts of information for clients under immense pressure.

For example, consider a multinational corporation enmeshed in potential litigation over regulatory issues. Senior management is grappling with internal, external and media pressures, and it would benefit from the services of an accomplished data-consulting team. To provide a cohesive response, the legal team must develop and execute a strategy based on limited information while extracting years of data from multiple disparate systems spread across multiple jurisdictions.

How do data consultants help execute a cohesive legal response and make their clients’ lives easier? The easy answer, which many data consultants give by default, is this: “Give us everything from everywhere.” That is not always appropriate.

Defining the Scope

Defining the scope of data to be used in litigation is one of the most important factors in executing a successful engagement. Companies now generate data at an astronomical pace, aided by improved technology and shrinking costs. That means companies and counsel must review and analyze seemingly insurmountable volumes of data in an effort to identify relevant information. But acquiring the right targeted data set makes subsequent phases of any legal response easier, more economical and more efficient. Conversely, acquiring vast amounts of data, much of which may never be needed, creates wholly avoidable complexity; analyzing material beyond the scope of the current investigation invites distraction and unnecessary delay.

It’s vital to ask the right questions in this phase to understand what sort of information is needed. Will it come from financial systems or customer relationship management systems? Is the information in current or legacy systems? Which jurisdictions are in scope? What’s the time frame for the required data? Targeted questions like these define the parameters of the data to extract and analyze.
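To make those scoping answers concrete, here is a minimal sketch – in Python, purely for illustration – of how the parameters might be captured before extraction begins. The DataScope name and its fields are hypothetical, not part of any standard toolkit.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataScope:
    """Illustrative container for the scoping answers described above."""
    source_systems: list[str]   # e.g., financial vs. customer relationship systems
    legacy_systems: list[str]   # retired platforms that still hold relevant data
    jurisdictions: list[str]    # where the data physically resides
    period_start: date          # time frame for the required data
    period_end: date

# A hypothetical scope for the regulatory matter described earlier
scope = DataScope(
    source_systems=["general_ledger", "crm"],
    legacy_systems=["legacy_erp_2008"],
    jurisdictions=["US", "UK", "DE"],
    period_start=date(2015, 1, 1),
    period_end=date(2023, 12, 31),
)
```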

The next hurdle is usually integrating and maintaining the acquired data set. Large investigations often depend on data from multiple sources and systems. Each of these data sets is unique and requires customized methodologies for integration into the overall legal strategy. Think of this phase as building the foundation of a house: if the foundation is not solid, the house will have weaknesses no matter the subsequent efforts. The same principle applies to integrating and maintaining data critical to the end goal. If this process is rushed and appropriate measures to understand the data are not taken, all subsequent analyses may be vulnerable or, at the very least, may require intense remediation efforts. This can be avoided by taking the time to understand the nuances of the systems and the data they contain. Adding a week up front can prevent adding months to the back end of the data-analysis timeline.
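As a hedged illustration of that integration step, the sketch below (Python with pandas; the system names and column mappings are invented for this example) normalizes two differently structured extracts into one common schema before they are combined:

```python
import pandas as pd

# Hypothetical column mappings: each source system names the same fields
# differently, so each gets its own translation into a common schema.
COLUMN_MAPS = {
    "general_ledger": {"txn_dt": "date", "amt_usd": "amount", "desc": "description"},
    "legacy_erp_2008": {"TransDate": "date", "Value": "amount", "Memo": "description"},
}

def normalize(frame: pd.DataFrame, system: str) -> pd.DataFrame:
    """Rename source-specific columns and tag each row with its origin."""
    out = frame.rename(columns=COLUMN_MAPS[system])
    out["source_system"] = system
    out["date"] = pd.to_datetime(out["date"])  # unify the date formats
    return out[["source_system", "date", "amount", "description"]]

# Combining the extracts, assuming a dict of raw DataFrames per system:
# combined = pd.concat([normalize(df, name) for name, df in extracts.items()])
```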

A good understanding of the data allows data consultants to identify the tools and methodologies best suited to the data set. This does not mean that clients are left out of the loop – it’s crucial to keep counsel and clients apprised of all the facts as they are uncovered. We find the best client relationships are formed when we provide as much information about the data as possible, as early as possible, usually within the first few days of an engagement. Full transparency can provide clients with critical information early, and that can help drive the strategy and response.

Validation Creates Confidence

Validating the data extracted from various systems is the next key step, and its importance should not be overlooked. Finding the shortcomings in a data set is crucial to producing useful analysis. This is done by applying confidence factors – numerical coefficients that help indicate degrees of accuracy – to the various data sets. Two assessments determine the confidence factor of each data set: a mechanical assessment and a sniff-test assessment. A mechanical assessment ensures the data set has been completely and accurately extracted from the client’s information systems. A sniff-test assessment ensures that the same information meets the project’s expectations – that the team has responded appropriately to the business request.
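The article does not prescribe a formula for these coefficients. One plausible sketch, with entirely hypothetical weights and checks, combines a mechanical score (reconciliation of row counts and control totals to the source system) with a sniff-test score (the share of business-expectation checks that pass):

```python
def mechanical_score(extracted_rows: int, source_rows: int,
                     extracted_total: float, source_total: float) -> float:
    """Share of rows and control totals that reconcile to the source system."""
    row_ratio = min(extracted_rows / source_rows, 1.0)
    total_ratio = min(extracted_total / source_total, 1.0) if source_total else 1.0
    return (row_ratio + total_ratio) / 2

def sniff_test_score(checks: dict[str, bool]) -> float:
    """Fraction of business-expectation checks that pass."""
    return sum(checks.values()) / len(checks)

def confidence_factor(mech: float, sniff: float, w_mech: float = 0.6) -> float:
    """Weighted blend of the two assessments; the weighting is an assumption."""
    return w_mech * mech + (1 - w_mech) * sniff

# Illustrative numbers only
cf = confidence_factor(
    mechanical_score(998_500, 1_000_000, 4.21e9, 4.22e9),
    sniff_test_score({"dates_in_scope": True, "all_jurisdictions": True,
                      "no_duplicate_keys": False}),
)
print(f"confidence factor: {cf:.2f}")  # ~0.87 on this illustrative data
```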

It is imperative that counsel be aware of these confidence levels when making strategic decisions and executing methodologies. Analysis and reporting should not begin in earnest until data gathering is complete and everyone understands the confidence levels of the data sets. This doesn’t mean there are no reports until the data is perfect – sharing interim reports and analyses is part of the data management process. The key point is that clients should know the potential shortcomings of the data while it is being validated.

As these processes continue, the validated data is moved into a central repository, which serves as the “single point of truth” for the remainder of the proceedings. In large investigations with multiple constituents, it is vital that all source data be stored in a central repository and that all analyses be performed from the same data set. The inclusion of additional data and updates from the validation processes means the data set is subject to constant change, especially during the initial phases of an engagement.

While it may be easier for consultants to maintain their own data sets as they work to finalize them, doing so can produce different answers to the same question – whether because the data sets have drifted out of alignment or because a validation step applied to one was missed in another. It is prudent to take the time to ensure the data analytics team works from the central repository, which carries the formal validation checks, change logs and other critical attributes describing how the data was extracted and validated.
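As one hedged sketch of that bookkeeping (the function and file format are assumptions, not the authors’ system), each load into the central repository might append an auditable change-log entry so analysts can later confirm they queried the same data set the validation checks were run against:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_change(log_path: str, system: str, action: str,
               row_count: int, payload_bytes: bytes) -> None:
    """Append an auditable entry describing a load into the central repository."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source_system": system,
        "action": action,              # e.g., "initial_load", "revalidation"
        "row_count": row_count,
        # Fingerprint of the loaded data, so later analyses can be tied
        # back to exactly the bytes that were validated.
        "fingerprint": hashlib.sha256(payload_bytes).hexdigest(),
    }
    with open(log_path, "a") as fh:
        fh.write(json.dumps(entry) + "\n")

# Example call, assuming the raw extract is held in memory as bytes:
# log_change("changes.jsonl", "legacy_erp_2008", "initial_load", 998_500, raw_bytes)
```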

In most high-profile investigations, clients want data-driven answers produced rapidly, especially during the early stages. That’s important, but so is keeping an eye on the end goal. Spending a little more time at the outset formalizing the scope and the integration and validation methodologies for client data sets can save months of work later in the process.

The opinions expressed are those of the authors and do not necessarily reflect the views of AlixPartners, LLP, its affiliates, or any of its or their respective professionals or clients.

Charles A. Cipione, based in Dallas, and Vineet Sehgal, based in New York, are managing directors in AlixPartners’ Information Management Services practice. [email protected], [email protected]
