Summary

  • Detroit woman wrongly arrested for carjacking and robbery due to facial recognition technology error.
  • Porsche Woodruff, 8 months pregnant, mistakenly identified as the culprit based on an outdated 2015 mug shot.
  • Surveillance footage did not match Woodruff; the victim nonetheless picked her from a photo lineup built from the same outdated 2015 photo.
  • Woodruff arrested, detained for 11 hours, charges later dismissed; she files lawsuit against Detroit.
  • Case highlights facial recognition technology’s documented weakness at identifying women and people with darker skin.
  • Several US cities banned facial recognition; debate continues due to lobbying and crime concerns.
  • Law enforcement prioritized the technology’s output over contradictory visual evidence, raising questions about how the technology is integrated into investigations.
  • ACLU Michigan involved; outcome of lawsuit uncertain, impact on law enforcement’s tech use in question.
  • flatbield@beehaw.org
    11 months ago

    This is the issue with big data: in a big enough database you will always find a closest match that seems pretty good. False positives are a huge concern in any big-data approach, and couple that with lazy policing and you get this.
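
    The nearest-match effect the comment describes can be sketched with random vectors standing in for face embeddings. Everything here is an illustrative assumption (the embedding dimension, the database sizes, cosine similarity as the metric), not a detail of any real recognition system:

    ```python
    import math
    import random

    def random_unit_vector(dim, rng):
        """Random direction in `dim` dimensions -- a stand-in for a face embedding."""
        v = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        norm = math.sqrt(sum(x * x for x in v))
        return [x / norm for x in v]

    def cosine(a, b):
        """Cosine similarity of two unit vectors is just their dot product."""
        return sum(x * y for x, y in zip(a, b))

    rng = random.Random(42)
    dim = 32                              # illustrative embedding size
    query = random_unit_vector(dim, rng)  # the "probe" photo

    # Every database entry is pure noise, unrelated to the query, yet the
    # *best* match tends to look more convincing as the database grows.
    best_by_size = {}
    for n in (10, 1_000, 100_000):
        best_by_size[n] = max(
            cosine(query, random_unit_vector(dim, rng)) for _ in range(n)
        )
        print(f"db size {n:>7}: best cosine similarity = {best_by_size[n]:.3f}")
    ```

    The top-hit similarity climbs with database size purely by chance, which is the sense in which a "pretty good" closest match is all but guaranteed: a match threshold tuned on a small gallery will produce false positives at scale.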