Do You Know What You Don't Know? Protecting Your Blind Side!

Since last month's interview, "Controlling Legal Costs - Service Providers Defensible Predictive Coding Will Change The Economics Of eDiscovery In 2010," I have received many inquiries from readers about how to start evaluating technology that will change the economics of eDiscovery. So we thought it would be helpful to walk through the evaluation process, prepare you for the task ahead, and offer some strategies.

The technology evaluation process can be a long and winding road with no end in sight unless you assemble the right team, establish a testing/evaluation protocol with clear objectives, document your findings, and set milestone deadlines. This may seem simple, but with the myriad eDiscovery technology suppliers and service providers in the market claiming to do it all, it is not. To separate the contenders from the pretenders, consider these best practice suggestions for a timely and successful evaluation.

1. Be a team player

Yes, this sounds a bit trite, but it is critical. Your company's eDiscovery problem is a holistic one, and you must view it as such. Your evaluation team should include members from legal, the IT-legal liaison, IT, security, and third-party experts you trust, such as your outside counsel and consultants. Above all, assign one or two grizzled eDiscovery veterans to the team if you have them within your company. If not, look for outside assistance here as well. These veterans have usually stepped on a few eDiscovery landmines in their time, and they will help protect your blind side.

2. Paint a vision

Create a set of detailed use cases with objectives that describe what the evaluation team seeks to accomplish. Consider objectives such as reducing unnecessary collection and preservation, lowering the cost of processing, increasing review efficiency, upgrading from a linear review tool to one that supports non-linear review, and/or increasing the accuracy and defensibility of your review process. Preparing these use cases may seem arduous, but they can be leveraged again down the road during implementation, testing, and training.

When preparing these use cases, there are three common mistakes we see far too often that should be avoided.

Don't make the mistake of "habituos erroneous," that is, making the same mistake over and over again. Don't simply outline your existing workflow, which may be flawed or part of a systemic problem, and use that as your use cases. View your current process objectively and allow no "sacred cows." Question and verify every step. Remember, you are painting a vision, so be creative and engage your experts. Collaborating with a group of your peers to share best practices can supplement your internal team. Sometimes vendors will share evaluation criteria or use cases that you can cherry-pick. Engaging these folks will help you identify what you don't know that you don't know.

Next, don't make the mistake of relying solely on the same testing scripts and methodologies used for general IT tools. Instead, consider supplementing your company's testing protocols with eDiscovery-specific tests. For example, evaluating the indexing/processing of data for an eDiscovery application is substantively different from evaluating a general search engine. Industry-leading search engines often value performance over accuracy and will partially index documents, skip large documents, and ignore some metadata. eDiscovery indexing/processing must be evaluated for defensibility, and well-designed tests will uncover what you did not know. eDiscovery tools have some unique functionality, so your evaluation criteria should reflect that. In-sourcing will also bring a new set of challenges previously handled by outsourced partners. For example, scalability and performance are critical to success and must be evaluated despite some vendors' desire to focus on features alone. Test for processing speeds, post-processing index size, search response times, navigation speed, and general scalability. Test for features and performance; do not limit your evaluation to a feature beauty pageant.
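To make those performance criteria concrete, here is a minimal sketch of a timing harness you might run during a proof of concept. The process_collection and run_search functions are hypothetical stand-ins for whatever processing and search interface the vendor exposes, not any product's actual API; the numbers they return are placeholders.

```python
import time

# Hypothetical stand-ins for the vendor's processing and search calls during a
# POC. These names are illustrative placeholders, not any product's real API;
# replace them with whatever interface the vendor actually exposes.
def process_collection(path_to_data):
    """Stub: submit a collection for indexing/processing and block until done."""
    time.sleep(0.1)  # placeholder for real processing time
    return {"documents": 10_000, "index_size_gb": 1.2, "exceptions": 42}

def run_search(query):
    """Stub: run a search against the post-processing index."""
    time.sleep(0.01)  # placeholder for real search latency
    return {"hits": 137}

def benchmark(data_path, data_size_gb, queries):
    # Processing throughput and post-processing index size
    start = time.perf_counter()
    result = process_collection(data_path)
    elapsed_hr = (time.perf_counter() - start) / 3600
    print(f"Throughput: {data_size_gb / elapsed_hr:,.0f} GB/hr")
    print(f"Index size: {result['index_size_gb']:.2f} GB "
          f"({result['index_size_gb'] / data_size_gb:.0%} of source volume)")
    print(f"Exception documents: {result['exceptions']}")

    # Search response times, query by query
    for q in queries:
        start = time.perf_counter()
        hits = run_search(q)["hits"]
        print(f"Search {q!r}: {hits} hits in {time.perf_counter() - start:.2f}s")

if __name__ == "__main__":
    benchmark("/poc/custodian_data", data_size_gb=50,
              queries=["breach of contract", "indemnif*"])
```

Even a simple harness like this forces the vendor to demonstrate throughput, index bloat, and search latency on your terms rather than theirs, and the printed figures become part of your documented findings.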

Finally, avoid the trap of going to a vendor website and using their RFP template as your sole evaluation scorecard. These templates are generalized, often outdated, and typically assembled so that the capabilities of the vendor's own technology are weighted to score high. Again, leverage your peers and independent authorities for a template, and then customize it to the objectives of your company.

3. Be on the lookout for the eDiscovery Peacock

First, be aware that the eDiscovery space is littered with eDiscovery Peacocks: self-proclaimed eDiscovery "gurus" who will cite eDiscovery case law chapter and verse while avoiding questions about how the law practically applies to your situation. We are not suggesting that knowledge of case law is a bad thing, but rely on your in-house and outside counsel for legal advice. Don't be enchanted when the eDiscovery Peacock spreads its colorful feathers of eDiscovery "brilliance" for all to see. Remember, the eDiscovery Peacock struts from event to event with an agenda. They want to sell you something, so their perspective may be a bit self-serving.

Once the parameters of the evaluation have been defined, the next phase is to meet the technology and service providers, evaluate their technology, and learn more about what you may not know. During this process you may be greeted by a cast of characters that includes some of the following:

The Illusionist: The Illusionists insist upon their own pre-prepared demonstration rather than engaging with you on a Proof of Concept (POC) with your own data. They are often heard telling you that you "can't touch this." You should insist on a hands-on POC using your data and testing all of your use cases.

The Bully: The Bullies will cloak their refusal to meet your evaluation requirements in insults framed as a helping hand. "You don't need to evaluate the technology because it will take up too many of your resources." "Our clients never need a hands-on POC, and I will call your boss to help you reduce the testing burden." Don't allow yourself to be bullied.

The Socialite: The Socialites will wine and dine you at the most elegant establishments and then "BAM!" you are supposed to buy their software. While it is fine to get to know your potential technology partner, stick to your evaluation plan and avoid excessive entertaining or the appearance of a compromised evaluation. Many damaging compromises in the evaluation process are made over a glass of wine and a steak. In the end, you want to be able to proclaim that the award went to the technology that met your objectives.

The Magicians: As the Magicians mysteriously float into the evaluation, they will distract you with skillful and subtle sleight of hand to avoid a live demonstration. When you ask about their conceptual search and grouping technology, they provide an entertaining narrative to illustrate the point rather than demonstrating it live: for example, a riff on the word "dog," noting that a dog could be a canine, or a boy trying to date your daughter, or Randy Jackson's favorite term for his American Idol contestants. The technique is as clever and entertaining as it is effective, because you still haven't evaluated the technology. Words are not enough if you aren't able to test the technology. Did you test the concept search's precision and scalability? Does it identify multiple concepts per document? Don't be distracted by entertaining explanations without a hands-on demonstration.

The Cleaners: The Cleaners will want you to evaluate their processing using exclusively "standard" data, because "standard" data usually shows well. That is fine, but does your company only have standard data? Not many do, so evaluate the vendor's processing capabilities using your company's own data, including the hard stuff. The Cleaners may even combine their demand for standard data with a Simon Cowell-style dismissal, telling you that you only need to test common use cases because that is how all of their other clients test and that it will take too long to find edge documents. In eDiscovery, however, exceptions are the rule, and they are a standard that needs testing.

Your evaluation should include known file exceptions and edge document types to test the vendor's handling of exceptions. Examples of information to test include: large documents, to see if processing skips them; a 500-page contract, to ensure the technology indexes every page; searchable embedded objects, to test indexing; foreign language documents, to evaluate indexing; documents with tracked changes and comments, to explore handling of indexing and display; all types of compound documents (such as nested container files like .zip or .tar), password-protected files, and encrypted files, to test exception categorization versus silent skipping; and finally (and most importantly) any company-specific file types or previously identified trouble documents, to ensure that the vendor's system handles them correctly. If you don't include at least one document that stumps the application, you need a better test set.
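That list of edge documents lends itself to a simple audit harness. Below is a minimal sketch, assuming the vendor can give you a per-document processing or exception report you can parse; the file names, outcome labels, and the faked results are purely illustrative placeholders, not any vendor's actual output format.

```python
# A hypothetical test manifest pairing edge documents with the outcome you
# expect a defensible processing engine to produce. File names and outcome
# labels are illustrative, not drawn from any vendor's documentation.
EDGE_CASES = [
    ("500_page_contract.pdf",      "indexed_fully"),           # every page searchable
    ("embedded_spreadsheet.docx",  "embedded_objects_indexed"),# embedded objects extracted
    ("japanese_memo.docx",         "indexed_fully"),           # foreign language handled
    ("redline_with_comments.docx", "tracked_changes_indexed"), # revisions and comments kept
    ("nested_archives.zip",        "children_extracted"),      # compound/container files
    ("password_protected.xlsx",    "exception_reported"),      # flagged, not silently skipped
    ("legacy_cad_drawing.dwg",     "exception_reported"),      # company-specific trouble file
]

def audit_results(results):
    """Compare vendor-reported outcomes against expectations and collect gaps."""
    failures = []
    for filename, expected in EDGE_CASES:
        actual = results.get(filename, "missing_from_output")
        if actual != expected:
            failures.append((filename, expected, actual))
    return failures

if __name__ == "__main__":
    # In a real POC, 'results' would be parsed from the vendor's exception
    # report or processing log; here it is faked to show the audit output.
    sample_results = {name: outcome for name, outcome in EDGE_CASES}
    sample_results["password_protected.xlsx"] = "skipped"  # simulate a silent skip
    for name, expected, actual in audit_results(sample_results):
        print(f"FAIL: {name} expected {expected}, got {actual}")
```

Recording expectations before the demonstration, and auditing the vendor's output against them, keeps the exception testing from quietly shrinking to whatever the tool happens to handle well.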

The Tommy Flanagan: Do you remember the pathological liar character played by Saturday Night Live's Jon Lovitz? Believe it or not, vendors will "stretch the truth." Some companies have been known to instruct their teams to respond "yes" to any question during the RFI/RFP process. It is therefore important that you appoint a historian to track every failed use case during your evaluation, track every follow-up response promised by the vendor, and then audit these against the initial RFP/RFI. Do not rely on the vendor to record these follow-up items. Beware of companies willing to mislead you during the evaluation process; that is a warning sign for the road ahead. As Tommy often said, "Yeah, that's the ticket."

4. Check references

Consider these three suggestions when checking references. First, create a list of questions to ask references, because it is often hard to get a second conversation. Include quantifiable questions about performance results (e.g., GB/hr processing speed), grades (e.g., grade implementation, performance, and customer service from A to F), and lessons learned (e.g., What would you do differently next time? How did you handle change management?). Any response graded less than an A should be pursued further to uncover possible issues.

Second, inform the vendor about the types of companies you want to speak with. Ask for peers within those companies, and provide the vendor with a list of your trusted outside counsel to see if any are their clients. Your outside counsel is more likely to give you a candid, critical reference. Also, ask for five references to get three; if you ask for three, you may only get one or two.

Third, request to attend the vendor's customer advisory board session. You will see whether the vendor has a formal process for gathering customer feedback and facilitating collaboration among users. If the vendor doesn't have a formal process or will not give you access, consider it a serious red flag.

The late British psychiatrist R.D. Laing wrote, "The range of what we think and do is limited by what we fail to notice. And because we fail to notice that we fail to notice, there is little we can do to change until we notice how failing to notice shapes our thoughts and deeds." This certainly holds true for technology assessments. It is often what we fail to notice that presents the greatest issues; the better we become at noticing issues, the better we become at addressing them. In terms of eDiscovery technology, it all starts with setting your objectives for how you will test and evaluate competing (and complementary) technologies. Armed with this knowledge, you should be well on your way to identifying eDiscovery technology vendors who can meet your company's objectives.
