A major UK biometric security company, Facewatch, is in hot water after its facial recognition system wrongly identified a 19-year-old girl as a shoplifter.

    • CeeBee@lemmy.world · 3 months ago

      No, they aren’t. This is a narrative that keeps getting repeated over and over, and the usual citation for it is the ACLU’s test of Amazon’s Rekognition system, which was deliberately flawed to produce this exact outcome (years later, people are still repeating the same claim).

      The top FR systems have no issues with any skin tones or complexions.

        • CeeBee@lemmy.world · 3 months ago

          I promise I’m more than aware of all the studies, technologies, and companies involved. I worked in the industry for many years.

          The technical studies you’re referring to show that the difference in error rate between a white man and a black woman (usually the polar opposites in terms of results) is around 0.000001%. But this usually gets blown out of proportion by media outlets.

          If you have white men at 0.000001% error rate and black women at 0.000002% error rate, then what gets reported is “facial recognition for black women is 2 times worse than for white men”.

          It’s technically true, but in practice it’s a misleading and disingenuous statement.
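
          A rough sketch in Python (using the hypothetical rates from the example above, not real benchmark figures) of how the same two numbers yield both a negligible absolute gap and a “2 times worse” headline:

              # Hypothetical error rates from the example above, not real benchmark data
              white_men = 0.000001 / 100    # 0.000001% expressed as a fraction
              black_women = 0.000002 / 100  # 0.000002% expressed as a fraction

              absolute_gap = black_women - white_men    # what the raw numbers show
              relative_ratio = black_women / white_men  # what the headline reports

              print(f"absolute gap:   {absolute_gap:.10%}")    # prints 0.0000010000%
              print(f"relative ratio: {relative_ratio:.0f}x")  # prints 2x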

          • AwesomeLowlander@lemmy.dbzer0.com · 3 months ago

            Would you kindly link some studies backing up your claims, then? Nothing I’ve seen online has numbers anywhere near the ones you’re claiming.