    Wisconsin Lawyer
    November 01, 2015

    As I See It
    Reforming the ‘Science’ in Forensic Science

    Despite flaws in the forensic sciences on which the criminal justice system has long relied to find and convict offenders, they remain increasingly critical components of the fact-finding process in criminal cases. To make forensic science evidence more reliable, a wide range of reforms must take place.

    Keith A. Findley

    Keith Findley visits the DNA lab in the U.W. Biotechnology Center, Madison. He says many proposals to make forensic science evidence more reliable require action at the national level, but some are amenable to state action. Photos: Andy Manis

    Five years ago, in a press release hailing the court of appeals’ decision in State v. Jones,1 which upheld the admission of “ballistics” evidence and affirmed a murder conviction, then-Wisconsin Attorney General J.B. Van Hollen declared, “Murderers should fear forensic science.”2 Few would quibble with that sentiment. Forensic science can and should be a powerful tool for identifying and convicting the guilty – and for clearing and exonerating the innocent.

    Today, however, as we continue to learn more about the challenges facing forensic science, Van Hollen’s declaration reads as much aspirational as congratulatory. Indeed, even then there was some irony in Van Hollen’s choice of that particular case to celebrate forensic science, because it involved some of the most suspect and least scientific expert testimony imaginable.

    In Jones, an analyst visually compared a bullet from a murder to the defendant’s gun, and testified that no “other gun in the world would have left those particular type of markings on that bullet.”3 When asked what the error rate was for his analysis, he said, “There is no error rate.”4

    Ballistics – more properly called firearms and toolmark analysis – can be a useful tool in identifying sources of bullets from a crime scene. But this examiner’s testimony went well beyond what his discipline could support. There simply is no scientific basis for declaring that a particular bullet can be matched to a particular gun to the exclusion of all other guns in the world or for implying that the discipline is error free.

    No research indicates whether the markings and striations on the bullet observed by the analyst could be replicated by any other guns. No scientific principles define how many points of similarity must be identified to warrant calling a gun and a bullet a match, or even which protocols analysts should employ when making their visual comparisons. No databases have been assembled of bullet striations to permit any statistical analysis of the frequency with which certain marking patterns appear.

    And certainly no science supports the bold claim that the method has no error rate, if, as the parties and the court of appeals appeared to take it, the analyst was implying that the technique is error free. No real science, not even DNA analysis, can claim such perfection. Error is inevitable in all human endeavors, including all real science.

    Now, five years later, the Jones case still exemplifies the challenges facing the criminal justice system as it tries to ensure that evidence admitted in court is as reliable and scientific as possible. It is a challenge that has begun to get attention at the highest levels of government. But it is also a challenge the Wisconsin legal system has done relatively little to meet. Indeed, although in January 2011, only one year after the Jones decision, the legislature adopted the Daubert5 standard for screening out bad science, there has still been only one Daubert decision on forensic science in Wisconsin, and it did not address any of the identification disciplines (for example, pattern, impression, or trace evidence).6

    Challenges Confronting Forensic Science

    Until recently, few participants in the criminal justice system paused to question or examine the reliability of forensic science evidence; forensic science was a staple of criminal cases, viewed as virtually infallible and precise, its validity seemingly established by decades of adversary testing through litigation. The reality, however, is different from that perception. It is a reality with which the legal system is just now beginning to come to terms.

    The sense of infallibility began to change when DNA evidence began to reveal criminal case errors at rates never before imagined. Since 1989, when the first two men in the United States were exonerated by postconviction DNA testing, at least 330 convicted individuals have walked free in serious cases after DNA testing proved their innocence.7 Many hundreds, indeed thousands, more have been exonerated by other types of evidence.8

    Surprisingly, flawed forensic science evidence is the second-leading contributor to the wrongful convictions in those DNA cases. Of the first 325 DNA exonerations, 154, or 47 percent, included misapplication of forensic science.9 Only eyewitness-misidentification evidence contributed to more false convictions.

    A detailed analysis of the cases in which a forensics expert testified at trial and DNA evidence later proved the defendant’s innocence found that 60 percent of the experts proffered scientifically inappropriate testimony.10 In the much larger pool of exonerations counted by the National Registry of Exonerations since 1989 (which includes exonerations based on all types of evidence, not just DNA), misapplied forensic evidence played a smaller but still significant role – flawed science was present in 363, or 23 percent, of the 1,600 exonerations in that database.11

    Of course, the problem of shaky forensic science evidence concerns much more than wrongful conviction of innocent persons. Flawed forensic science inevitably means that the system also fails to identify the truly guilty.

    Hence, this is a problem recognized by more than just advocates for the innocent. Most notably, in 2009 the National Academy of Sciences (NAS) – the preeminent scientific authority in the United States – published its groundbreaking study of forensic sciences.12

    Among its many findings, the NAS concluded that, despite their long pedigree in the criminal justice system, most forensic identification disciplines (those whose objective is to match evidentiary traces found on crime scene evidence to a particular individual) are fundamentally unscientific. The NAS wrote: “With the exception of nuclear DNA analysis, ... no forensic method has been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source.”13

    This is not to say that traditional forensic sciences have no value. As the NAS also wrote, “For decades, the forensic science disciplines have produced valuable evidence that has contributed to the successful prosecution and conviction of criminals as well as the exoneration of innocent people.”14 But it is to say that forensic sciences are not all that they were presented as being, and that they can and must be improved.

    The NAS explained that, unlike science-based disciplines such as DNA analysis, serology, and chemical analyses, the identification disciplines “have been developed heuristically. That is, they are based on observation, experience, and reasoning without an underlying scientific theory, experiments designed to test the uncertainties and reliability of the method, or sufficient data that are collected and analyzed scientifically.”15

    In its voluminous report, the NAS examined a number of specific identification forensic specialties, including friction-ridge analysis (fingerprints) and other pattern-impression evidence (shoe prints, tire tracks, lip prints, ear prints, glove prints, and so on); forensic odontology (bite marks); microscopic hair and fiber analysis; questioned-document analysis, including handwriting analysis; firearms and toolmark analysis; and forensic DNA profiling.

    For each specialty, except DNA, the NAS found that there is little if any underlying scientific research, an absence of uniform protocols and standards, no databases of evidence to enable calculating the statistical significance of a “match” between crime scene evidence and a suspect, and wide room for subjective judgments by analysts.

    The critique applies to even the most venerated of the traditional, pre-DNA identification disciplines – fingerprints. The NAS concluded that, because there are no set standards for declaring a fingerprint “match,” “[e]xaminers must make subjective assessments throughout. In the United States, the threshold for making a source identification is deliberately kept subjective, so that the examiner can take into account both the quantity and quality of comparable details. As a result, the outcome of a friction ridge analysis is not necessarily repeatable from examiner to examiner. In fact, recent research by Dror16 has shown that experienced examiners do not necessarily agree with even their own past conclusions when the examination is presented in a different context some time later.”17

    Finally, the NAS Report also addressed serious challenges confronting forensic pathology.18 Of concern in criminal cases, for example, are growing doubts about the validity of medical diagnoses in matters such as “shaken baby syndrome,” or “abusive head trauma” cases. Numerous courts, including the Wisconsin Court of Appeals19 and the U.S. Supreme Court,20 and legal and medical literature21 have noted the increasing scientific challenges to previously held beliefs about the medical diagnosis of murder in such cases.

    The Discovery of “Junk” Science

    New scrutiny has even exposed some forensic disciplines as essentially “junk” sciences, which laboratories are abandoning as useless, or worse.

    Comparative Bullet Lead Analysis. For more than 40 years, for example, analysts from the esteemed FBI Crime Lab testified that, using a technique known as comparative bullet lead analysis (CBLA), they could match crime-scene bullet fragments to bullets found in a suspect’s possession by their distinctive elemental makeup. An exhaustive analysis of the technique, however, again by the NAS, found no scientific basis for such claims.22

    Although the analysis of the bullet lead was valid, no science permitted analysts to claim that a particular lead mixture was the signature of any particular manufacturer, batch, or box of bullets. Indeed, a single batch or box of bullets could contain bullets with different compositional signatures, and bullets manufactured miles and years apart could share “matching” compositional attributes.

    When the NAS’s National Research Council issued its report on CBLA in 2004, the FBI initially responded defensively.23 By 2005, however, only a year later, the FBI had abandoned CBLA altogether.24 By 2007, the FBI fully conceded that any testimony suggesting that CBLA could identify a bullet as coming from any particular box of bullets was insupportable. Thereafter, the FBI worked with a task force of defense lawyers assembled by the National Association of Criminal Defense Lawyers and the Innocence Project to identify cases in which individuals might have been wrongly convicted based on CBLA evidence.25

    Microscopic Hair Comparison. A similar story is now emerging about microscopic hair analysis. For decades, analysts have visually compared crime-scene hairs, under comparison microscopes, to reference hairs plucked from suspects to determine whether they match. As with other forensic disciplines, however, there were no standards, no databases, and no set protocols. Careful examiners tried to make clear in their testimony that they could not definitively link a crime-scene hair to a suspect, but their caveats were often ignored by prosecutors, courts, juries, and defense attorneys, and some analysts testified far beyond the warrant of their discipline.

    The advent of DNA testing revealed that with unsettling frequency, hairs that analysts said could have a common source in fact did not – and conversely, that hairs that analysts said were distinct were in fact from the same person. Because DNA was so much more reliable, the Wisconsin State Crime Laboratory stopped doing hair microscopy several years ago.

    The scope of the damage from the flawed science is only now being recognized. Already, approximately 75 individuals convicted in part as a result of expert testimony about hair “matches” have been exonerated by DNA.26 As with CBLA, the FBI has now acknowledged that its analysts routinely testified about hair microscopy in ways that were unsupported by science. The FBI is now working jointly with the U.S. Department of Justice, the National Association of Criminal Defense Lawyers, and the Innocence Project to reexamine old cases to identify individuals who might have been wrongly convicted because of those errors.27

    In a joint report issued April 20, 2015, the government identified nearly 3,000 cases in which FBI examiners used microscopic hair analysis. As of March 2015, the FBI had reviewed approximately 500 of those cases. In the 268 cases in which examiners provided testimony that inculpated a defendant at trial, analysts made erroneous statements in 257, or 96 percent, of the cases.28

    The FBI has identified four such cases in which its hair analysts testified in Wisconsin – two in state court and two in federal court.29 Unfortunately, however, the FBI also trained hundreds or even thousands of analysts across the country in the same flawed techniques and methods. Indeed, at least four of Wisconsin’s eight known DNA exonerations included state analyst (as opposed to FBI) testimony at trial about hair microscopy that suggested a match, when the DNA later proved there was no match.30 It remains unknown how many other Wisconsin convictions based on microscopic hair comparison testimony from Wisconsin Crime Laboratory analysts are also tainted by these errors.

    Recognizing the real possibility of more extensive error, the FBI has recently called on individual states to undertake their own independent reviews of hair microscopy cases. Although Wisconsin has stopped using hair microscopy, to date nothing has been done to identify and rectify past errors in a systematic way.

    Forensic Odontology. Other disciplines are gradually assuming the mantle of “junk” science as well. Prominent among them is forensic odontology (dentistry).31 Forensic odontology has two distinct applications. The first, identifying remains in mass disasters such as airplane crashes by comparing the teeth of the deceased with dental records showing the location of fillings and crowns and the like, is noncontroversial. The second, however, comparing apparent bite marks on the flesh of victims to the teeth of a suspect, has proved to be highly unreliable and unscientific. Although bite mark evidence is still used in some jurisdictions and still has adherents, its record of error is causing many prosecutors, including many in Wisconsin, to abandon it.32

    Keith A. Findley, Yale 1985, is an assistant professor at the U.W. Law School, where he is also cofounder and codirector of the Wisconsin Innocence Project. He is past president of the Innocence Network, an affiliation of 70 innocence projects in the United States and nearly a dozen other countries. He previously was a trial and appellate assistant state public defender in Madison.

    A prime example is the Wisconsin case of Robert Lee Stinson. Stinson was convicted in 1985 of murder, based almost entirely on a forensic odontologist’s claim that bite marks found on the victim’s chest and abdomen “had to have been made by teeth identical” to Stinson’s, and that there was “no margin for error.”33

    Twenty-three years later, in 2009, Stinson was exonerated and freed when re-analysis of the bite marks suggested they could not have been made by Stinson, and when DNA from saliva on the victim’s shirt excluded Stinson and matched another convicted offender in CODIS, the DNA databank.

    In June 2012, that other man, Moses Price Jr., entered a guilty plea to second-degree murder for the crime that took 23 years of Stinson’s life. A month later, the court sentenced Price to 19 years in prison, to be served concurrently with a 35-year sentence he had received for the 1991 murder of a Milwaukee man.34

    Following Stinson’s release, the State Claims Board awarded him the maximum compensation allowed under Wisconsin law – $25,000. In 2014, the legislature passed a private bill awarding him an additional $90,000.35 Stinson is the only wrongly convicted individual in recent Wisconsin history to receive such legislative compensation.

    Cognitive Bias in Forensic Analysis

    Even putting the “junk” techniques aside, the inherent subjectivity of most forensic analyses means not only that the disciplines lack scientific rigor, but also that they are subject to the distortions of cognitive biases. That is not to say that the disciplines are useless, or even that they are incorrect in any specific percentage of cases. But it does mean they must be approached with caution and sensitivity to, among other things, the damaging effects of cognitive biases.

    Cognitive bias refers not to some moral deficiency, misconduct, or malicious intent by laboratory analysts. Rather, it refers to nothing more than the unavoidable nature of human thinking – the tendency all human beings have, despite our best efforts, to be influenced in our judgments by expectations, desires, prior judgments, and irrelevant information.

    It is now widely acknowledged that cognitive biases are ubiquitous and can pose serious challenges to any human inquiry, including forensic analyses. The cognitive biases at play in this context are far too numerous and complex to discuss fully in this short article. It suffices here to mention a few, including context effects, role effects, groupthink, and confirmation bias.

    Context Effects. Context effects can be understood as a form of what Professor Dan Simon calls “inferential spillage.”36 When analysts test and analyze data (particularly ambiguous data – which is almost always the case with crime-scene evidence), non-domain-relevant information can spill into the analysis and skew interpretations. Research shows, for example, that when fingerprint analysts are provided with case-relevant but task-irrelevant information suggesting that the source of the questioned print either is or is not in fact the true perpetrator, that “context” information can lead the analysts to interpret the fingerprint data in ways that are consistent with that extraneous information.37 Other evidence of guilt – the context evidence – such as a confession or an eyewitness identification, might be ultimately case relevant, but it is utterly irrelevant to the analyst’s task of determining if two sets of prints match each other.

    Context effects are so powerful that they have even been shown to affect the way analysts interpret DNA results, at least in cases that involve low-level DNA and mixed-DNA profiles, in which significant interpretation is required.38 It is precisely because of this risk of bias that scientific tests in almost all other contexts are always run in a blind or double-blind manner, so as to shield the scientists and subjects from context information that might skew results.

    Role Effects and Groupthink. Related biases, such as role effects and groupthink, can skew the interpretation of data in ways that conform to what analysts think is expected of them in their roles or by their peers.39 Research shows that analysts who are organizationally part of a law enforcement team, or who know that law enforcement agents are hoping to connect a suspect to a piece of evidence, are more likely to interpret their data in ways that comport with those expectations.40

    So powerful is this bias that research shows that experts have a tendency to skew their conclusions in favor of whoever hires them, even if they have no other or long-term affiliation with that party, whether it is the prosecution or the defense.41

    Confirmation Bias. Another related bias, confirmation bias, refers to the tendency that all people have, once they have formed an initial hypothesis or conclusion, to seek, recall, and interpret subsequent data in ways that confirm their initial hypothesis or conclusion. Thus, when an analyst is presented with a working hypothesis that a particular suspect is the perpetrator, confirmation bias would suggest that the analyst will subconsciously seek and interpret data in ways that, if true, would confirm that suspect’s guilt.42

    Countering Cognitive Biases. To a large degree, the need to mute cognitive biases such as these (and other more overt motivational biases) underlies the NAS recommendation, discussed below, that crime laboratories be removed from administrative control of law enforcement.43 Short of that, others have proposed steps that can be taken to shield, or “blind,” analysts from biasing context information. Blinding forensic analysts to non-domain-relevant information can be difficult, however, because frequently analysts must know significant information about a case to enable them to make the necessary judgments about what and how to test the evidence.

    But there are ways to minimize the pernicious effects of context information while giving analysts needed case information, through procedures such as “sequential unmasking” – a process by which analysts are given only the information needed to perform their testing at each step of the process, with additional information revealed or “unmasked” as it is needed.44
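
    To make the idea concrete, here is a minimal sketch of sequential unmasking as a staged workflow. It is only a toy model – the stage definitions, field names, and analyst interface are hypothetical, not any laboratory’s actual protocol – but it captures the core discipline: the analyst documents a conclusion at each stage before any further information is revealed.

```python
# Toy model of sequential unmasking. The stage contents and field
# names are hypothetical illustrations, not any lab's actual protocol.

CASE_FILE = {
    "evidence_profile": "raw data from the crime-scene sample",
    "reference_profile": "the suspect's reference sample",
    "investigator_notes": "task-irrelevant context, e.g., a confession",
}

# Information is unmasked in a fixed order; biasing context is
# withheld unless a later stage genuinely requires it.
UNMASKING_ORDER = [
    ("interpret evidence alone", ["evidence_profile"]),
    ("compare to reference", ["evidence_profile", "reference_profile"]),
]

def run_examination(case_file, analyst):
    """Reveal information stage by stage, recording a documented
    conclusion before each additional piece is unmasked."""
    visible, audit_trail = {}, []
    for stage_name, fields in UNMASKING_ORDER:
        for f in fields:                 # unmask only what this stage needs
            visible[f] = case_file[f]
        conclusion = analyst(stage_name, dict(visible))
        audit_trail.append((stage_name, conclusion))
    return audit_trail

# Example: a stub analyst that simply records what it was shown.
if __name__ == "__main__":
    for stage, note in run_examination(CASE_FILE, lambda s, seen: sorted(seen)):
        print(stage, "->", note)
```

    The resulting audit trail makes any later shift in the analyst’s conclusions – and the information that was revealed just before the shift – visible to reviewers and to the parties.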

    Challenges to Cutting-edge Sciences

    The advancement of science into new frontiers constantly expands the capacity for developing useful evidence, but at the same time raises challenges for ensuring reliability. Cell phone- and other digital-tracking technologies, for example, are producing important forensic evidence, but are also increasingly complicated and hence subject to misunderstanding and misuse.45

    Emerging issues in DNA analysis also present new challenges for the legal system. Recently, the FBI announced that it had discovered errors in the population statistics it had been using since 1999 to calculate the statistical significance of a DNA “match.”46 Additionally, complex new software packages are being developed to estimate the probabilities of “matches” in cases in which DNA mixtures from several people are present. While such software may be promising, to date it has been deemed proprietary, so it has not been accessible for peer review and adversary testing for reliability.
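
    To see concretely why errors in population statistics matter, consider how the significance of a match is conventionally computed under the “product rule”: the frequency of the genotype observed at each tested locus is estimated from population allele frequencies (2pq for a heterozygote with alleles of frequency p and q), and the per-locus frequencies are multiplied together to yield a random match probability. The sketch below uses invented allele frequencies purely for illustration; real casework draws on published population databases – the very tables in which the FBI disclosed errors.

```python
# Illustrative random match probability under the product rule.
# The allele frequencies are invented; real calculations draw on
# published population databases (the tables the FBI corrected).

loci = [
    (0.10, 0.20),   # (p, q) for a heterozygous genotype at locus 1
    (0.05, 0.15),   # locus 2
    (0.12, 0.08),   # locus 3
]

rmp = 1.0
for p, q in loci:
    rmp *= 2 * p * q          # heterozygote genotype frequency: 2pq

print(f"random match probability: about 1 in {1 / rmp:,.0f}")
# 0.04 x 0.015 x 0.0192 = 1.152e-5, i.e., roughly 1 in 87,000.
# Small errors in p and q compound multiplicatively across loci,
# so corrected frequencies can shift the reported figure noticeably.
```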

    DNA technology is also rapidly advancing to expand its ability to obtain useful data from low-quality or low-quantity samples. These “enhanced detection methods,” or related techniques called “low-template” or “low-copy” DNA analysis, use several methods to increase the sensitivity of the testing. With increased sensitivity, however, comes increased risk of allelic drop-in, and its converse, allelic drop-out. One possible consequence is the incorrect inclusion (from allelic drop-in) or exclusion (from allelic drop-out) of an individual as the source of the biological material.
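
    A simplified, hypothetical comparison illustrates the mechanism. If a single-locus profile is treated as just a set of alleles, a naive comparison rule that ignores the possibility of drop-out and drop-in can err in both directions:

```python
# Hypothetical single-locus illustration of allelic drop-out and drop-in.
# A true contributor has genotype {12, 15} at this locus.
true_contributor = {12, 15}

# Drop-out: allele 15 fails to amplify, so the evidence shows only {12}.
evidence_after_dropout = {12}
# A naive rule demanding that every suspect allele appear in the
# evidence now wrongly EXCLUDES the true contributor.
wrongly_excluded = not true_contributor <= evidence_after_dropout
print(wrongly_excluded)  # True

# Drop-in: a stray allele 17 appears, so the evidence shows {12, 15, 17}.
evidence_after_dropin = {12, 15, 17}
# A naive rule asking only whether the suspect's alleles are all
# present now fails to exclude a non-contributor with genotype {15, 17}.
non_contributor = {15, 17}
wrongly_included = non_contributor <= evidence_after_dropin
print(wrongly_included)  # True
```

    Probabilistic interpretation methods attempt to account for the chance of such events rather than treating the presence or absence of an allele as definitive – which is part of what the emerging guidelines discussed below address.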

    Given these risks, new guidelines are emerging from sources such as the FBI and its National DNA Index System (NDIS),47 as well as the national Scientific Working Group on DNA Analysis Methods.48 The guidelines are too complex to address here, but readers should be aware of the issues and the guidelines and reference them if confronted with a case involving those techniques.

    Prescriptions for Improvement

    Despite these challenges, forensic sciences remain important – indeed increasingly critical – components of the fact-finding process in criminal cases. To make them as scientific and reliable as possible, a wide range of reforms have been proposed. Many require action at the national level, but some are amenable to state action.

    For its part, the NAS made 13 specific recommendations for reform. The recommendations centered on a proposal that Congress create a new, independent governmental agency – the National Institute of Forensic Science (NIFS) – whose responsibility would be to set standards, encourage research, and generally regulate the field.

    Other key recommendations focused on validating the various disciplines and establishing testing protocols; removing forensic science laboratories from administrative control by law enforcement and prosecutors’ offices (to ensure objectivity and minimize cognitive biases); establishing uniform standards for report writing, sharing of information, and testifying; and mandating accreditation of laboratories and certification of analysts.

    Congress has been in no mood to create new independent governmental agencies and has taken no action on the NAS recommendations. Stepping into the void, in 2013 the Department of Justice and the National Institute of Standards and Technology (NIST) created the National Commission on Forensic Science and charged it with undertaking many of the tasks the NAS had envisioned for the NIFS.49 The two agencies entered into a memorandum of understanding delegating to NIST the responsibility for establishing measurement standards and foundational validity for the various forensic disciplines.

    To advance this mission, the National Commission has created six subcommittees on a variety of specific issues.50 For its part, NIST has also created the Organization of Scientific Area Committees (OSAC) “to support the development and promulgation of forensic science consensus documentary standards and guidelines, and to ensure that a sufficient scientific basis exists for each discipline,”51 which also includes committees on a range of specific issues.52

    While change is clearly coming at the national level, action can also be taken at the state level to prepare for and more fully implement the reforms needed to make forensic science evidence more reliable. Several years ago, the Wisconsin Department of Justice, the State Bar of Wisconsin, and the Marquette and U.W. law schools jointly sponsored the Wisconsin Criminal Justice Study Commission, whose mission was to bring together stakeholders from throughout the criminal justice system to explore ways to make the system perform more reliably in identifying the guilty and protecting the innocent. Among the issues on the agenda for that commission was forensic science reform.

    In 2007-08, the commission considered various steps Wisconsin could take, without waiting for federal government action, to address the challenges confronting forensic science. Ideas included making State Crime Lab services more accessible to the defense and courts, making crime lab files more transparent and open to review by all parties and courts (both recommended as ways to make the labs more neutral and thus less susceptible to biasing pressures), and creating a Wisconsin Forensic Science Commission to set standards and oversee the work of forensic science laboratories in Wisconsin.53 At the time, however, the issues proved too controversial, and the commission ended its work without agreeing on any forensic science reforms.

    As the problems with forensic sciences become more widely recognized and the federal government moves toward recommending or even mandating changes, the time may be upon us to once again take a serious look at the way forensic science evidence is produced and used in Wisconsin.

    Admissibility Issues

    In the meantime, as reforms are evaluated for improving forensic science upstream of the courtroom, lawyers and courts must wrestle with what to do with forensic evidence as it exists today. As the Jones case suggests, in the pre-Daubert days Wisconsin courts had little to say about the quality of forensic evidence. But now that the legislature has made Wisconsin a Daubert state, courts have a duty to screen out inadequately validated and unreliable expert evidence.

    Indeed, the Jones court itself made that very point when it upheld admission of the evidence in that case. Suggesting that the outcome might have been different under a Daubert regime, the court wrote:

    “Unlike in the federal system, where the trial judge is a powerful gatekeeper with respect to the receipt of proffered expert evidence, Wisconsin gives to the trial judge a more-limited role: the trial judge ‘merely require[s] the evidence to be “an aid to the jury” or “reliable enough to be probative.”’ … Simply stated, this is a ‘relevancy test.’”54

    No longer is that true. In January 2011, the legislature amended Wis. Stat. section 907.02 to require circuit courts to screen out unreliable expert evidence – to act as the “powerful gatekeeper” referenced in Jones – under the standards articulated in Daubert. Whether Wisconsin courts will in fact aggressively apply Daubert to forensic science evidence in criminal cases, to screen out the kind of invalid testimony that was received in Jones, remains to be seen.

    If Wisconsin courts do play that role, they will be swimming against the tide of federal case law. Empirical analyses consistently show that, while there are notable exceptions, federal courts applying Daubert almost never exclude prosecution-proffered forensic science evidence (conversely, they routinely exclude defense-proffered expert evidence).55

    That reality is especially striking given that courts are quite aggressive in excluding expert evidence in civil cases under Daubert.56 Why courts guard much more vigilantly against unreliable science in civil cases than in criminal cases is not fully understood, but the pattern is indisputable. Perhaps with the increasing number of wrongful convictions tied to the misapplication of forensic science, this trend will change.

    Writing on a clean Daubert slate, Wisconsin courts have the opportunity to examine forensic science in criminal cases without the burdens of precedent. Surprisingly, however, to date there have been almost no Wisconsin appellate decisions applying the new Daubert requirements in criminal cases, and none have confronted the admissibility of the traditional forensic identification sciences. In those few cases that do exist, Wisconsin courts, not surprisingly, have relied extensively on case law from federal courts and other Daubert states.57 The pattern reminds us that asking courts to take a serious and hard look at forensic sciences that we have become so accustomed to relying on is asking a lot – even if it is asking for what Daubert demands.

    Conclusion

    Criminal cases are increasingly science dependent, and the traditional forensic sciences have played a crucial role in the way we dispense justice. Recent years have shown, however, that forensic science is no silver bullet. A weak scientific foundation, sparse research support, and surprisingly high error rates beset most forensic sciences.

    Enhanced support, investments in research, greater oversight, greater transparency and accessibility, efforts to ensure neutrality and minimize context effects, and more demanding scrutiny in litigation are the future. They are also essential to ensuring that the evidence we rely on to determine guilt and innocence is as reliable as it can be.

    Endnotes

    1 State v. Jones, 2010 WI App 133, 329 Wis. 2d 498, 791 N.W.2d 390.

    2 Keith Findley, No Silver Bullets in Forensic Science, Milw. J. Sent., Sept. 2, 2010.

    3 Jones, 2010 WI App 133, ¶ 10, 329 Wis. 2d 498.

    4 Id. ¶ 11.

    5 Daubert v. Merrell Dow Pharms., Inc., 509 U.S. 579 (1993).

    6 See State v. Giese, 2014 WI App 92, 356 Wis. 2d 796, 854 N.W.2d 687 (admitting expert testimony concerning retrograde extrapolation of blood-alcohol concentration over a Daubert objection).

    7 The Innocence Project, www.innocenceproject.org.

    8 The National Registry of Exonerations.

    9 The Innocence Project, The Causes of Wrongful Conviction.

    10 Brandon L. Garrett & Peter J. Neufeld, Invalid Forensic Science Testimony and Wrongful Convictions, 95 Va. L. Rev. 1, 2 (2009).

    11 The National Registry of Exonerations, % Exonerations by Contributing Factor.

    12 National Academy of Sciences, Committee on Identifying the Needs of the Forensic Sciences Community; Committee on Science, Technology, and Law; Committee on Applied and Theoretical Statistics; Policy and Global Affairs; Division on Engineering and Physical Sciences; National Research Council, Strengthening Forensic Science in the United States: A Path Forward (2009).

    13 Id. at 7.

    14 Id. at 4.

    15 Id. at 128.

    16 I.E. Dror & D. Charlton, Why Experts Make Errors, 56 J. Forensic Identification 600 (2006).

    17 National Academy of Sciences, supra note 12, at 139.

    18 Id. at 241-68.

    19 State v. Edmunds, 2008 WI App 33, 308 Wis. 2d 374, 746 N.W.2d 590.

    20 Cavazos v. Smith, 132 S. Ct. 2 (2011).

    21 Deborah Tuerkheimer, Flawed Convictions: “Shaken Baby Syndrome” and the Inertia of Injustice (2014); Keith Findley et al., Shaken Baby Syndrome, Abusive Head Trauma, and Actual Innocence: Getting It Right, 12 Hous. J. Health L. & Pol’y 209 (2012).

    22 Nat’l Res. Council, Forensic Analysis: Weighing Bullet Lead Evidence (2004).

    23 John Solomon, Silent Injustice: Bullet-matching Science Debunked, Wash. Post, Nov. 19, 2007, at A1.

    24 Id.

    25 Keith A. Findley, Innocents at Risk: Adversary Imbalance, Forensic Science, and the Search for Truth, 38 Seton Hall L. Rev. 893, 970-71 (2008).

    26 Innocence Project, Wrongful Convictions Involving Unvalidated or Improper Forensic Science that Were Later Overturned through DNA Testing.

    27 National Association of Criminal Defense Lawyers News Release, FBI Testimony on Microscopic Hair Analysis Contained Errors in at least 90% of Cases in Ongoing Review: 26 of 28 FBI Analysts Provided Testimony or Reports with Errors.

    28 Id.

    29 Flawed Forensic Hair Testimony from the FBI Lab.

    30 See The Innocence Project; State v. Hicks, 202 Wis. 2d 150, 549 N.W.2d 435 (1996); State v. Avery; State v. Armstrong, 2005 WI 119, 283 Wis. 2d 639, 700 N.W.2d 98.

    31 See Radley Balko, How the Flawed “Science” of Bite Mark Analysis Has Sent Innocent People to Prison, Wash. Post, Feb. 13, 2015; Erica Beecher-Monas, Reality Bites: The Illusion of Science in Bite-Mark Evidence, 30 Cardozo L. Rev. 1369 (2009).

    32 See, e.g., Gabriel E. Fuentes, Op-Ed: Bite-Mark Evidence Proving Unreliable, Nat’l L.J.

    33 National Registry of Exonerations, Robert Lee Stinson.

    34 Id.

    35 Id.

    36 Inside the Mind of Forensic Science: An Interview with Dan Simon (Part 2).

    37 See, e.g., Dror & Charlton, supra note 16.

    38 Itiel Dror & Greg Hampikian, Subjectivity and Bias in Forensic DNA Mixture Interpretation, 51 Sci. & Justice 204 (2011).

    39 See Sandra Guerra Thompson, Cops in Lab Coats: Curbing Wrongful Convictions through Independent Forensic Laboratories 131 (2015).

    40 See D. Michael Risinger et al., The Daubert/Kumho Implications of Observer Effects in Forensic Science: Hidden Problems of Expectation and Suggestion, 90 Cal. L. Rev. 1, 18-19 (2002).

    41 Daniel C. Murrie et al., Are Forensic Experts Biased by the Side That Retained Them?, Psychol. Sci. (epub Aug. 22, 2013).

    42 See Saul M. Kassin, Itiel Dror & Jeff Kukucka, The Forensic Confirmation Bias: Problems, Perspectives, and Proposed Solutions, 2 J. Applied Research in Memory & Cognition 42 (2013).

    43 See Paul C. Giannelli, Independent Crime Laboratories: The Problem of Motivational and Cognitive Biases, 2010 Utah L. Rev. 247.

    44 See Dan E. Krane et al., Sequential Unmasking: A Means of Minimizing Observer Effects in Forensic DNA Interpretation, 53 J. Forensic Sci. 1006 (2008); Itiel E. Dror, William C. Thompson & Christian A. Meissner, Context Management Toolbox: A Linear Sequential Unmasking Approach for Minimizing Cognitive Bias in Forensic Decision Making, J. Forensic Sci. (in press, 2015).

    45 Mark Hansen, Prosecutors’ Use of Mobile Phone Tracking is “Junk Science,” Critics Say, ABA J. (June 1, 2013).

    46 Spencer S. Hsu, FBI Notifies Crime Labs of Errors Used in DNA Match Calculations Since 1999, Wash. Post, May 29, 2015.

    47 www.fbi.gov/about-us/lab/biometric-analysis/codis/ndis-procedures-manual.

    48 Scientific Working Group on DNA Analysis Methods, Guidelines for STR Enhanced Detection Methods.

    49 See www.justice.gov/ncfs. See also Charter, U.S. Dep’t of Justice, National Commission on Forensic Science.

    50 See www.justice.gov/ncfs/subcommittees.

    51 See www.nist.gov/forensics/osac/index.cfm.

    52 See NIST, Organization of Scientific Area Committees.

    53 See U.W. Law School, Wisconsin Criminal Justice Study Commission (agendas and meeting summaries).

    54 Jones, 2010 WI App 133, ¶ 22, 329 Wis. 2d 498 (quoting State v. Walstad, 119 Wis. 2d 483, 519, 351 N.W.2d 469 (1984)).

    55 See Findley, supra note 25, at 939-42; Jennifer L. Groscup et al., The Effects of Daubert on the Admissibility of Expert Testimony in State and Federal Criminal Cases, 8 Psychol. Pub. Pol’y & L. 339, 342 (2002); D. Michael Risinger, Navigating Expert Reliability: Are Criminal Standards of Certainty Being Left on the Dock?, 64 Alb. L. Rev. 99, 143-49 (2000).

    56 Id.

    57 See, e.g., Giese, 2014 WI App 92, 356 Wis. 2d 796.

