United States v. Bonds, No. 18-2670 (7th Cir. 2019)

Justia Opinion Summary

Bonds was convicted of robbing two banks in 2015, 18 U.S.C. 2113(a), and sentenced to 60 months’ imprisonment. The evidence against him included the testimony of Glass, a fingerprint examiner in the FBI’s Latent Print Operations Unit, that Bonds’s fingerprints appeared on the demand notes used in the robberies. In 2004 the same Unit incorrectly identified Mayfield as a person whose fingerprints suggested involvement in a terrorist bombing in Spain. Bonds wanted to use this episode to illustrate the potential for mistakes in the “analysis, comparison, evaluation, and verification” (ACE-V) method. The district judge permitted Bonds to cross-examine Glass about the reliability of the ACE-V method and to present other evidence suggesting that the approach is more error-prone than jurors might believe, but excluded the Mayfield evidence itself: evidence about one particular error, the judge concluded, would be more distracting and time-consuming than its incremental value could justify. The Seventh Circuit affirmed, rejecting a Confrontation Clause argument. Presenting jurors with details of one wrongful imprisonment (especially on a mistaken charge of terrorism) would appeal to emotion rather than to reason, and Bonds had ample opportunity to supply the jury with evidence about the reliability of the ACE-V method, including changes made in the last decade. In 2016, the President’s Council of Advisors on Science and Technology concluded that those changes have bolstered ACE-V’s accuracy; the report’s summary gives the defense bar paths for cross-examining witnesses who use the ACE-V approach.

In the United States Court of Appeals For the Seventh Circuit
____________________
No. 18-2670
UNITED STATES OF AMERICA, Plaintiff-Appellee, v. MYSHAWN BONDS, Defendant-Appellant.
____________________
Appeal from the United States District Court for the Northern District of Illinois, Eastern Division. No. 15 CR 573 — Sara L. Ellis, Judge.
____________________
ARGUED APRIL 16, 2019 — DECIDED APRIL 24, 2019
____________________

Before EASTERBROOK, KANNE, and SCUDDER, Circuit Judges.

EASTERBROOK, Circuit Judge. A jury convicted Myshawn Bonds of bank robbery, 18 U.S.C. §2113(a), and a judge sentenced him to sixty months’ imprisonment plus three years’ supervised release. The evidence against him included the testimony of Kira Glass, a fingerprint examiner in the FBI’s Latent Print Operations Unit. Glass concluded that Bonds’s fingerprints appeared on the demand notes used in the two robberies.

In 2004 the Latent Print Operations Unit incorrectly identified Brandon Mayfield as a person whose fingerprints suggested involvement in a terrorist bombing in Spain. Mayfield was arrested and held for more than two weeks as a material witness, until the FBI acknowledged that its assessment resulted from operational errors. The United States released Mayfield, apologized, and paid him substantial compensation. Bonds wanted to use this episode to illustrate for the jury the potential for mistakes in the application of a method that fingerprint analysts dub ACE-V, for analysis, comparison, evaluation, and verification. That method miscarried for Mayfield, and Bonds contended that it could miscarry for him too. But the district judge concluded that evidence about Mayfield’s arrest in 2004 would take the trial far afield from the question whether Bonds robbed two banks in 2015. The judge permitted Bonds to cross-examine Glass about the reliability of the ACE-V method and to present other evidence suggesting that the approach is more error-prone than jurors are likely to believe after watching forensic labs operate to perfection on television. Evidence about one particular error, the judge concluded, would be more distracting and time-consuming than its incremental value could justify.

Bonds contends that the district court’s decision to exclude evidence about Mayfield’s mistaken identification and arrest violated the Confrontation Clause of the Constitution’s Sixth Amendment. United States v. Rivas, 831 F.3d 931 (7th Cir. 2016), rejected an identical contention, holding that a district court did not violate the Constitution when excluding evidence about the Mayfield situation. Bonds asks us to distinguish Rivas on the ground that Glass works in the same FBI division that mistakenly identified Mayfield, but Bonds does not contend that Glass was involved in that error. Guilt by association would be a poor reason to deny a district judge the discretion otherwise available under Fed. R. Evid. 403.

Defense counsel suggested at oral argument that jurors respond more strongly to concrete examples than to data about error rates. That may well be true—but it is a reason to limit the use of extrinsic evidence, not to require judges to admit it. Presenting jurors with details of one wrongful imprisonment (especially on a mistaken charge of terrorism) would appeal to their emotion rather than to their reason. Emotional responses can be strong, but reason should underlie a verdict.
Bonds had ample opportunity to supply the jury with evidence about the reliability of the ACE-V method, including the extent to which changes the FBI made in the last decade have improved its reliability. Shortly after the Mayfield fiasco the National Research Council concluded that the ACE-V method is too subjective and too unreliable to deserve the label “scientific.” Strengthening Forensic Science in the United States: A Path Forward 136–45 (2009). More recently, the President’s Council of Advisors on Science and Technology concluded that changes in ACE-V have bolstered its accuracy. Forensic Science in the Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods 87–103 (2016). This report concluded that the error rates shown by well-designed studies ranged from 1 in 18 to 1 in 604. (These are rates of false positives—incorrectly declaring a match when the prints differ—taking account of statistical confidence intervals.) The bottom line (id. at 101–02; emphasis in original):

Foundational validity. Based largely on two recent appropriately designed … studies, [we find] that latent fingerprint analysis is a foundationally valid subjective methodology—albeit with a false positive rate that is substantial and is likely to be higher than expected by many jurors based on longstanding claims about the infallibility of fingerprint analysis. Conclusions of a proposed identification may be scientifically valid, provided that they are accompanied by accurate information about limitations on the reliability of the conclusion—specifically, that (1) only two properly designed studies of the foundational validity and accuracy of latent fingerprint analysis have been conducted, (2) these studies found false positive rates that could be as high as 1 error in 306 cases in one study and 1 error in 18 cases in the other, and (3) because the examiners were aware they were being tested, the actual false positive rate in casework may be higher. At present, claims of higher accuracy are not warranted or scientifically justified. Additional … studies are needed to clarify the reliability of the method.

Validity as applied. Although we conclude that the method is foundationally valid, there are a number of important issues related to its validity as applied.

(1) Confirmation bias. Work by FBI scientists has shown that examiners typically alter the features that they initially mark in a latent print based on comparison with an apparently matching exemplar. Such circular reasoning introduces a serious risk of confirmation bias. Examiners should be required to complete and document their analysis of a latent fingerprint before looking at any known fingerprint and should separately document any additional data used during their comparison and evaluation.

(2) Contextual bias. Work by academic scholars has shown that examiners’ judgments can be influenced by irrelevant information about the facts of a case. Efforts should be made to ensure that examiners are not exposed to potentially biasing information.

(3) Proficiency testing. Proficiency testing is essential for assessing an examiner’s capability and performance in making accurate judgments. As discussed elsewhere in this report, proficiency testing needs to be improved by making it more rigorous, by incorporating it within the flow of casework, and by disclosing tests for evaluation by the scientific community.
From a scientific standpoint, validity as applied requires that an expert: (1) has undergone appropriate proficiency testing to ensure that he or she is capable of analyzing the full range of latent fingerprints encountered in casework and reports the results of the proficiency testing; (2) discloses whether he or she documented the features in the latent print in writing before comparing it to the known print; (3) provides a written analysis explaining the selection and comparison of the features; (4) discloses whether, when performing the examination, he or she was aware of any other facts of the case that might influence the conclusion; and (5) verifies that the latent print in the case at hand is similar in quality to the range of latent prints considered in the foundational studies.

This summary provides the defense bar with paths to cross-examine witnesses who used the ACE-V approach. Have they avoided confirmation bias? Have they avoided contextual bias? Has their proficiency been confirmed by testing? Cf. Florida v. Harris, 568 U.S. 237, 247–49 (2013). And the report’s observation about the rate of false positives will inform jurors that TV shows do not depict the actual state of latent-fingerprint analysis. Bonds does not contest any restrictions the district court placed on how these matters were explored on cross-examination; his sole appellate contention is that the court erred by preventing reference to Mayfield.

To say that an error rate is troubling is not to suggest that ACE-V is too uncertain for use in litigation. Assessment must be comparative. What are the alternatives? Grainy pictures taken by bank surveillance cameras of robbers wearing masks, or confederates who testify for the prosecution, have problems of their own. Witnesses may lie on the stand; there is no science of credibility enabling jurors to detect who is telling the truth, and some witnesses who think that they are telling the truth nonetheless may be confused or incorrect. Eyewitness identification is notoriously subject to the vagaries of memory. A judicial system that relies heavily on fallible lay testimony cannot be improved by excluding professional analysis that may well have a lower error rate—or by diverting jurors’ attention to particular errors made by other analysts years earlier. What the judicial system can do is subject the forensic evidence to cross-examination about a method’s reliability and whether the witness took appropriate steps to reduce errors. Bonds enjoyed that opportunity.

AFFIRMED
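Note on the error-rate figures (not part of the court’s opinion): the rates quoted above (“1 error in 306 cases,” “1 error in 18 cases”) are upper confidence bounds derived from study data rather than raw observed error counts, which is what the opinion means by “taking account of statistical confidence intervals.” As a rough, hedged illustration of how such a bound can be computed, the Python sketch below applies a one-sided Clopper-Pearson bound to hypothetical study counts; the counts are placeholders, not the figures from the studies PCAST reviewed, and PCAST’s exact method may have differed.

# Illustrative sketch only: turning a study's raw counts into an
# upper-bound false-positive rate of the "1 error in N comparisons" form.
# The counts below are hypothetical placeholders, not PCAST's data.
from scipy.stats import beta

def upper_bound_fp_rate(false_positives: int, comparisons: int, confidence: float = 0.95) -> float:
    """One-sided Clopper-Pearson upper bound on a false-positive rate."""
    if false_positives >= comparisons:
        return 1.0
    # Upper bound is the `confidence` quantile of Beta(k + 1, n - k).
    return beta.ppf(confidence, false_positives + 1, comparisons - false_positives)

if __name__ == "__main__":
    # Hypothetical study: 6 erroneous identifications in 3,000 nonmated comparisons.
    rate = upper_bound_fp_rate(false_positives=6, comparisons=3000)
    print(f"Upper 95% bound: {rate:.4f} (about 1 in {round(1 / rate)})")

For these placeholder counts the bound comes out noticeably worse than the raw observed rate of 1 in 500, which illustrates why a report stating bounds rather than point estimates may strike jurors as less reassuring than claims of near-infallibility.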
Primary Holding

The Seventh Circuit upheld the district court's refusal to let the defense use a single past incident of misidentification, the 2004 Mayfield case, to challenge the accuracy of the ACE-V method of fingerprint analysis.

