Senate bill aims to stop Uncle Sam using facial recognition at airports / Legislation would eliminate TSA permission to use the tech, require database purge in 90 days

  • NocturnalMorning@lemmy.world · +40 / −2 · 1 year ago

    Facial recognition is bad for a multitude of privacy reasons. But the biggest problem is that it is also often wrong, and it is frequently trained on biased data (which is almost impossible to remove completely).

    • bobgusford@lemmy.world · +7 / −2 · 1 year ago

      Sorry, this needs more clarification! Do you mean “intent recognition”, where some AI, trained with biased data, will assume that some brown person is up to no good? Or do you mean that they will misidentify black and brown people more often due to how cameras work? Because the latter has nothing to do with biased data.

      • yeather@lemmy.ca · +7 / −1 · 1 year ago

        Both, in fact. Systems like this are regularly trained on data that mixes up minority faces. If Omar is an upstanding citizen but gets his face confused with Haani, a known terrorist, Omar gets treated unfairly, potentially with lethal consequences.

        • bobgusford@lemmy.world · +4 / −1 · 1 year ago

          For “intent recognition”, I agree. A system trained mostly on data of black people committing crimes might flag more black people as having ill intent.

          But for the sake of identification at security checkpoints, if a man named Omar - who has an eerie resemblance to Haani the terrorist - walks through the gates, then they probably need to do a more thorough check. If they confirm with secondary data that Omar is who he says he is, then the system needs to be retrained on more images of Omar. The bias was only that they didn’t have enough images of Haani and Omar for the system to make a good enough distinction. With more training, it will probably be less biased and more accurate than a human.
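
          To make that false-match failure mode concrete: face matchers typically compare an embedding of the live photo against enrolled embeddings and accept the nearest one that falls under a distance threshold. The sketch below is purely illustrative; the names, vectors, and 0.6 threshold are assumptions, not anything from the article or the TSA’s actual system.

          ```python
          # Illustrative sketch of nearest-match face identification; all values are made up.
          from __future__ import annotations

          import numpy as np

          def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
              """1 - cosine similarity between two face-embedding vectors."""
              return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

          def identify(probe: np.ndarray, gallery: dict[str, np.ndarray],
                       threshold: float = 0.6) -> str | None:
              """Return the closest enrolled identity if it falls within the threshold."""
              best_name, best_dist = None, float("inf")
              for name, emb in gallery.items():
                  dist = cosine_distance(probe, emb)
                  if dist < best_dist:
                      best_name, best_dist = name, dist
              return best_name if best_dist <= threshold else None

          # With too few training images of either man, the model can place Omar's and
          # Haani's faces close together in embedding space, so Omar matches the watchlist.
          gallery = {"haani_watchlist": np.array([0.90, 0.10, 0.20])}
          omar_at_checkpoint = np.array([0.85, 0.15, 0.25])

          print(identify(omar_at_checkpoint, gallery))  # -> "haani_watchlist" (a false match)
          ```

          Retraining on more images of both men pushes their embeddings apart, which is roughly what the parent comment means by the system becoming less biased and more accurate with more data.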

    • paysrenttobirds@sh.itjust.works · +4 / −6 · 1 year ago

      There is nothing in the article to suggest that the TSA programs’ errors have inconvenienced people, since an agent is right there to correct them, and more scans improve the accuracy. I get what you’re saying, but the same biases are undoubtedly programmed into the brains of the agents and are just as hard to eradicate.

      There are many places I don’t want to see facial recognition employed, but where people are already mandated to positively identify themselves seems like a natural fit. I think the senators and the ACLU can find much more persuasive examples of overreach.