The Most Important E-Discovery Metric: The Responsive Rate

Posted October 12, 2014 by Ian Wilson & filed under Cost Containment, Statistical Sampling, Predictive Review, Search

An essential step to improving quality and efficiency is to measure the effectiveness of the current process. Without measurement, we cannot evaluate the effect that new technology and process changes will deliver.

For years, the e-discovery industry has focused on review speed (number of documents reviewed per hour). There is no doubt that the speed of review is important. But I would argue that the “responsive rate” is the most important metric for e-discovery cost containment – and one that is commonly overlooked.

The “responsive rate” represents the percentage of the reviewed documents that are responsive to the discovery request. The simple formula for determining the responsive rate is:

Responsive Rate = Total Responsive Documents / Total Documents Reviewed

Assuming that the documents submitted for legal review are the result of search protocols, the responsive rate is essentially the same as the search “precision” measure referred to by search technologists.
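The calculation itself is trivial; as a hypothetical illustration (the function name and figures here are invented for the example, not drawn from any particular matter), it can be expressed as:

```python
def responsive_rate(responsive_docs: int, reviewed_docs: int) -> float:
    """Fraction of reviewed documents coded responsive (search 'precision')."""
    if reviewed_docs == 0:
        raise ValueError("no documents reviewed")
    return responsive_docs / reviewed_docs

# Example: 12,000 documents coded responsive out of 60,000 reviewed
rate = responsive_rate(12_000, 60_000)
print(f"{rate:.0%}")  # 20%
```

The point of writing it down is not the arithmetic but the discipline: if the two inputs are tracked throughout the review, the metric is available at any time.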

While far from scientific, we have made a point over the last six months to ask people about their general experience in “responsive rates.” Lawyers consistently report responsive rates in the 20% range, with many attorneys noting that responsive rates are often much lower – even single digit responsive rates in some cases. Indeed, many firms typically assume a 20% responsive rate for reviews when they prepare internal project budgets.

A completed review with a responsive rate of 20% means that 80% of the total cost of review was spent paying lawyers to review irrelevant documents. If we were to accept the analysts’ $10 billion estimate of the annual amount incurred in legal review costs that we discussed in our previous post, and we assume that the average responsive rate is 20%, that means that companies are paying approximately $8 billion a year for lawyers to review irrelevant documents. As a profession we can do better.

Low responsive rates commonly result from the ineffective search techniques that have become standard practice in e-discovery. In many cases, a simple list of terms, phrases, and Boolean queries is drafted and submitted to vendors to serve as the filter governing which documents are assigned for expensive review. These general, simplistic search terms are often created without access to the collected data and without an understanding of how the search performs against the data set.

We are also seeing the simplistic-search trend permeate the enterprise as clients increasingly gain in-house capabilities for basic searching of their internal data stores. This approach commonly decreases the responsive rate because not only is the search simplistic, it is often run against irrelevant custodians, thereby introducing a whole new level of irrelevant data into the review stage.

There are three main points that counsel should keep in mind as they consider addressing responsive rates to control costs.

First, lawyers should always measure the responsive rate to understand the efficiency of the process. Taking a sample of documents returned by the search prior to full review will allow the legal team to estimate the “responsive rate” and improve the search protocol if the percentage of irrelevant documents is high. Every corporate counsel should ask their lawyers to report on the responsive rate after the completion of a review so that they can evaluate how much money was spent on the review of irrelevant documents.
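A pre-review sample of this kind is straightforward to set up. The sketch below is a minimal, hypothetical illustration (the function, the simulated document population, and the 400-document sample size are all assumptions for the example): draw a simple random sample from the documents the search returned, have reviewers code just the sample, and compute a point estimate with a rough margin of error using the standard normal approximation for a proportion.

```python
import math
import random

def estimate_responsive_rate(is_responsive, population, sample_size=400, z=1.96):
    """Estimate the responsive rate from a simple random sample.

    is_responsive: callable standing in for a reviewer's coding decision
    on a sampled document. Returns (point_estimate, margin_of_error)
    at roughly 95% confidence (z = 1.96).
    """
    sample = random.sample(population, min(sample_size, len(population)))
    hits = sum(1 for doc in sample if is_responsive(doc))
    p = hits / len(sample)
    # Normal-approximation margin of error for a proportion
    moe = z * math.sqrt(p * (1 - p) / len(sample))
    return p, moe

# Hypothetical: simulate a search result set where ~20% of docs are responsive
random.seed(42)
docs = [{"responsive": random.random() < 0.2} for _ in range(60_000)]
p, moe = estimate_responsive_rate(lambda d: d["responsive"], docs)
print(f"Estimated responsive rate: {p:.1%} ± {moe:.1%}")
```

With a sample of a few hundred documents, the margin of error lands in the low single digits – more than precise enough to tell a 20% responsive rate from a 60% one before committing to a full review.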

Second, the search and analysis process must be iterative. Search is not a single event – it is a learning process. Few if any lawyers I know would conduct legal research by creating a single list of terms that may appear in the relevant case law and then read every opinion that contains any one of the terms. To the contrary, lawyers are trained to search iteratively, learning from the cases returned by the search results, refining the search and exploring the relationships among the legal opinions. E-discovery is no different. Lawyers should explore the search results to understand the context of search hits and intelligently refine the search. A little work will pay great dividends in improving the responsive rate and controlling costs.

Third, the review must be connected with the search phase so that the legal team has access to the litigation hold data to intelligently refine and test the search protocols. We must remove the time and complexity hurdles facing lawyers today as a result of the disjointed approach of separating the search and review phases. And, of course this means that the legal team can’t procrastinate on the task and turn e-discovery into a “fire drill” if they expect to meaningfully contain costs. A single unified platform that makes an iterative search and review process practical for diligent counsel is essential to achieving cost containment.

Servient’s Predictive Review technology is a significant leap in the direction of improved responsive rates. Once the legal team has refined the search protocol, Servient’s active learning technology kicks in and begins to further separate the relevant and irrelevant documents during the review. By learning from the document decisions made early in the process, and controlling the documents that are reviewed, Servient’s Predictive Review technology can substantially improve the responsive rate and fundamentally change the economics of e-discovery. Measure it.
