Author: MacLean, Carla L.

    Over many centuries, courts have developed evidentiary and procedural rules that are aimed at preventing unreliable information from tainting the decision-making process. These systems of rules do well in some ways, but less well in other ways. With respect to expert opinion evidence, the courts have attempted to eliminate or correct for possible bias that is predominantly intentional. However, the courts have not, to date, developed robust ways to identify and counteract experts' biases caused by factors that unconsciously affect the quality of their evidence (e.g. extraneous contextual information that biases decision making).

    Cognitive psychologists have researched the ways in which experts can be influenced, sometimes without their own awareness, by biases that affect their reports and the information they present in court. Some of that research is described in this paper.

    Cognitive psychologists use the term "bias" to refer to systematic errors in how people understand information or reach judgments; it is not meant to suggest conscious or culpable bias. Cognitive psychologists have demonstrated that people's judgments are often not entirely rational and fact-based; rather, a substantial psychological literature has long established that people's interpretations and judgments are affected by extraneous material, motivations, and experiences. Often, experts are unaware of their own biases; they believe in their own impartiality and that they are abiding by their duty to offer impartial evidence to the court. (1)

    Given the powerful role of expert evidence in court proceedings and the great weight it may be given, it is important to examine possible sources of bias in that evidence. A biased expert who is unaware of their bias is especially convincing and dangerous. This paper discusses the role of the expert in providing evidence in court, as well as the nature of human cognition and information processing.

    The paper begins with a review of the literature that demonstrates that the judgments of highly trained experts can be biased not only by the nature of the legal system but also by a host of other motivational and environmental factors. The review covers the bias mitigation literature and discusses what has been demonstrated to be effectual and ineffectual in efforts to mitigate the impact of systematic bias. The paper then presents a three-step process addressing how professionals in the legal system may attempt to reveal and expose such cognitive bias in expert reports and evidence presented at court.

    All humans, even experts who give evidence to the court, cognitively interact with incoming information. Such cognitive interaction affects what we perceive, how we interpret and evaluate it, and our decision-making processes. The courts are increasingly aware of the ways certain kinds of evidence--for example, eyewitness identifications--are susceptible to error. However, there are multiple sources of error affecting not only witnesses of fact, but also scientific expert witnesses.

    Courts have to make difficult decisions, often based on partial information. In many circumstances, judges or jurors are required to reach conclusions about matters that require technical or experiential expertise, and find that they are safer (i.e. more likely to be accurate) in doing so with the assistance of experts in areas ranging from the physical sciences to psychiatry. Ideally, experts provide an independent opinion, presumably based on impartial analysis, utilizing objective scientific methods. Furthermore, they accurately and fairly communicate their findings to the court, with transparency as to any weaknesses, limitations, and susceptibility to bias. However, the functioning of the human brain affects the extent to which the ideal of independent impartial expert evidence can be achieved. We know from inquiries into wrongful convictions or miscarriages of justice, (2) legal literature, (3) and government reports (4) that, despite experts' duty to the court to remain impartial, well-intentioned experts' opinions are often not entirely evidence-based.

    The rules governing expert opinion evidence vary among jurisdictions. In Canada, for example, R v Mohan (5) established the criteria for admissibility of expert evidence in court. The Mohan criteria state that expert evidence can be received in cases in which the expert's opinion is relevant, necessary to assist the trier of fact, provided by a properly qualified expert, and not barred by any exclusionary rule. The Mohan criteria do not explicitly query whether the expert's evidence is biased. (6)

    The criteria used for determining admissibility of expert opinion evidence have evolved over the years (R v Abbey; (7) White Burgess Langille Inman v Abbott and Haliburton (8)) to recognize the role of the judge as a "gatekeeper" who undertakes a cost-benefit analysis of the expert evidence and considers its value to the court's proceedings. As gatekeepers, judges consider the independence and impartiality of the expert, the time that the testimony may add to legal proceedings, and the potential for the evidence to mislead the trier of fact.

    Recent empirical analysis has explored judicial use of the gatekeeper role and found that although the frequency of challenges related to expert biases has increased since White Burgess, the percentage of experts excluded from providing their opinion has remained basically unchanged. (9)

    For members of the legal community, learning about the research findings of cognitive psychologists with respect to possible sources of bias can be useful. Understanding these findings can assist lawyers in instructing their retained experts in ways that minimize the risk of cognitive bias. It can help experts themselves approach their task in a bias-minimizing way. It can also guide cross-examining lawyers toward ways of revealing weaknesses in evidence presented by the opposing party. Finally, for judges, awareness of possible sources of expert bias can provide highly useful context for analyzing the admissibility and weight of expert opinion evidence.

    This paper describes a three-step process of detecting expert bias, stemming from the conceptual frameworks and literature discussed below. This paper is relevant to lawyers, expert witnesses, and judges who wish to see the decisions in legal proceedings based upon the highest possible quality of reliable evidence.


    Conceptual frameworks have been developed regarding what makes an expert's opinion credible. They can be a useful starting point in evaluating the quality of the opinions given by experts before the court. Frameworks like the Hierarchy of Expert Performance (HEP) developed by Dror (10) or the Cochran, Weiss, and Shanteau (CWS) index (11) emphasize the value of intra-expert consistency: an expert's ability to reach the same decision when provided with the same evidence and decision-making criteria at different times, that is, to be consistent with themselves.

    Further, HEP introduces the concept of biasability: a measure of how much an expert's judgment can be affected by factors external and irrelevant to the properties of the evidence or the task at hand. Biasability is different from reliability, and both are important when considering the veracity of an expert's opinion. (12) The scientific term "reliability" (as used in HEP) refers to the reproducibility and repeatability of results when the relevant factors remain the same. HEP suggests that the reliability and biasability of judgments should be considered not only between but also within experts (i.e. not only differences between different experts, but also differences within the same expert when presented with the same evidence at two different times). These need to be examined both at the level of the observations experts make (i.e. what they see in the evidence) and at the level of their conclusions (i.e. how they interpret what they see and use it to draw conclusions).

    In recent years, a groundswell of research has revealed problems with reliability as well as biasability between and even within experts. (13) This research has raised concerns about expert opinion, and a desire for evaluations of expert domains prior to experts delivering testimony to the court. (14) In Canada, for example, expert evidence is not guaranteed admissibility simply because it has been used in the past. In its 2007 ruling in R v Trochym, (15) the Supreme Court of Canada made this clear:

    While some forms of scientific evidence become more reliable over time, others may become less so as further studies reveal concerns. Thus, a technique that was once admissible may subsequently be found to be inadmissible. An example of the first situation, where, upon further refinement and study, a scientific technique becomes sufficiently reliable to be used in criminal trials, is DNA matching evidence, which this Court recognized in R v Terceira. An example of the second situation, where a technique that has been employed for some time comes to be questioned, is so-called "dock", or in-court, identification evidence. In R v Hibbert, Arbour J., writing for the majority, stated that despite its long-standing use, dock identification is almost totally unreliable. Therefore, even if it has received judicial recognition in the past, a technique or science whose underlying assumptions are challenged should not be admitted in evidence without first confirming the validity of those assumptions. (16)

    This is an important principle, but in addition to assessing expert reliability and proficiency, there needs to be an understanding of expert biasability.

    To date, the literature on cognitive bias and expert evidence has largely discussed how biases introduced by the adversarial system can contaminate an expert's...
