The Federal Rules of Civil Procedure require experts to prepare a written report if they are retained to testify. State rules may or may not impose the same obligation. Even when state rules do not demand production of a report, a judge’s scheduling order might require a testifying expert to prepare one.
Lawyers often tell the expert that a report is needed, advise the expert of the deadline for its production, and await the result. If an expert has a track record of writing excellent expert reports, further guidance may not be necessary. In most cases, however, a lawyer should remind the expert of the need to describe the expert’s methodology in detail.
The Basics
In federal court, every report must include:
(i) a complete statement of all opinions the witness will express and the basis and reasons for them;
(ii) the facts or data considered by the witness in forming them;
(iii) any exhibits that will be used to summarize or support them;
(iv) the witness’s qualifications, including a list of all publications authored in the previous 10 years;
(v) a list of all other cases in which, during the previous 4 years, the witness testified as an expert at trial or by deposition; and
(vi) a statement of the compensation to be paid for the study and testimony in the case.
Lawyers should make certain to provide an expert witness with a list of the report’s required contents. If the governing state rule differs from the federal rule, lawyers should make sure the expert follows the applicable rule.
Description of Methodology
While the federal rule does not use the word “methodology,” the Daubert decision requires experts to base their opinions on reliable methods. Describing the expert’s methodology — the process by which the expert formed opinions — is an essential part of explaining the “basis and reasons for” each opinion the expert will express.
When judges decide that expert testimony is inadmissible, they usually conclude that the expert report failed to describe a reliable methodology. Experts may take it for granted that they use reasonable methods to arrive at opinions, but the Daubert decision requires experts to “show their work.”
Experts should provide a step-by-step description of the process by which they formed each opinion. For example, it isn’t enough for an accident reconstruction engineer to write “I examined the accident scene and determined that the defendant’s car was traveling at 80 mph when the collision occurred.” That’s a statement of an opinion, not an explanation of the methodology that produced the opinion.
Instead, the expert should describe the observations she made at the accident scene, including measurements of skid marks, the distances that accident debris traveled from the point of impact, and any other facts that inform her opinion. The expert should then explain the principles of physics she used to draw conclusions about the vehicle’s speed, and should describe the mathematical calculations that establish the relationship between those facts and her conclusion about the vehicle’s speed. Describing the process the expert followed to reach an opinion is usually the most important part of an expert report.
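To illustrate the kind of “show your work” calculation at issue, here is a minimal sketch of one standard skid-mark computation: the minimum initial speed implied by the length of a skid, derived from the kinematic relation v² = 2ad with deceleration a equal to the drag factor times gravitational acceleration. The skid length and drag factor below are hypothetical values chosen for illustration; a real reconstruction would also account for roadway grade, braking efficiency, and post-impact travel.

```python
import math

def min_speed_from_skid(skid_length_m: float, drag_factor: float,
                        g: float = 9.81) -> float:
    """Minimum initial speed (m/s) implied by a skid of the given length.

    From kinematics, v^2 = 2 * a * d, where the deceleration a is the
    drag factor (tire/road friction) times g. This is a simplified
    textbook formula, not a complete reconstruction method.
    """
    return math.sqrt(2.0 * drag_factor * g * skid_length_m)

# Hypothetical example: 45 m of skid marks on dry asphalt
# (drag factor of roughly 0.7 is a common assumed value).
speed_ms = min_speed_from_skid(45.0, 0.7)
speed_mph = speed_ms * 2.23694  # convert m/s to mph
```

An expert report would walk the reader through exactly this chain: the measured inputs, the physical principle, the arithmetic, and the stated assumptions behind each number.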
As another example, a vocational expert who assesses future employability should explain the sources of information (including doctors’ reports, the injury victim’s work history and education, and interviews) that inform the expert’s opinions about work limitations. The expert should then describe the method used to assess job availability, including consultation with databases that describe jobs in the economy that a person with the victim’s limitations can perform. The report should cite evidence that those methods are traditionally used by vocational experts to form opinions about employability.
Establishing the Reliability of a Methodology
Supreme Court decisions that address reliability have focused on scientific opinions. They describe factors that scientists have identified as affecting experimental reliability. Have the study results been published and peer reviewed? Have other scientists replicated the results? Was the experiment conducted according to accepted standards? Did the experimental procedure have a known error rate? Is there a consensus within the relevant scientific community concerning the validity of the results of studies that inform an expert’s opinion?
When an expert’s opinion is based on an assessment of scientific studies, the expert establishes reliability by discussing all of the relevant studies, by explaining why some of those studies produced more reliable results than others, and by articulating the reasoning that guided the expert’s acceptance of particular study results. Cherry-picking results that support the expert’s opinion while ignoring less favorable results is not a sign of reliability.
Many experts offer opinions outside the realm of “hard science.” While judicial decisions that focus on scientific opinions do not necessarily fit well with opinions that are not derived from experimental studies, some judges inflexibly insist that experts always discuss error rates and other reliability factors that simply do not apply to the expert’s analysis. Experts should take care to explain why their methodologies are reliable, even when those methodologies are not based on an analysis of experimental studies.
For example, an expert in the History of Science provided an expert opinion about the validity of Michael Mann’s climate change studies in Mann’s lawsuit claiming that he was defamed by bloggers. The expert testified that her methodology was “reading and thinking.” The court concluded that “reading and thinking” is not an expert methodology because everyone, including jurors, can read and think.
The expert could have explained that, throughout history, scientists have identified specific methodologies that produce reliable results. She could have identified those scientists and explained why the scientific community embraced the scientific method and rejected alternative means of forming opinions. She could then have explained whether Mann followed the scientific method, comparing his work to the expectations of scientists who have developed and refined that method over the years.
The expert likely had that literature review in mind when she talked about “reading and thinking,” but she didn’t show her work. By condensing the process of evaluating literature into the phrase “reading and thinking,” the expert failed to persuade the judge that she used a reliable methodology to evaluate Mann’s work.
There is little doubt that the expert was qualified to opine about the reliability of Mann’s methods. She likely thought that her own method — identifying the factors that determine reliability and applying those factors to Mann’s research — was self-evident. The court’s rejection of her testimony should be a lesson to lawyers about the need to remind experts that their reports must identify a detailed methodology and must explain why the methodology is reliable.