A big biometric security company in the UK, Facewatch, is in hot water after its facial recognition system caused a major snafu: the system wrongly identified a 19-year-old woman as a shoplifter.

  • PseudorandomNoise@lemmy.world

    Despite concerns about accuracy and potential misuse, facial recognition technology seems poised for a surge in popularity. California-based restaurant CaliExpress by Flippy now allows customers to pay for their meals with a simple scan of their face, showcasing the potential of facial payment technology.

    Oh boy, I can’t wait to be charged for someone else’s meal because they look just enough like me to trigger a payment.

    • Cethin@lemmy.zip

      I have an identical twin. This stuff is going to cause so many issues even if it worked perfectly.

      • Telodzrum@lemmy.world

        If it works anything like Apple’s Face ID, twins don’t actually map all that similarly. In the general population, the probability of a matching map of the underlying facial structure is approximately 1 in 1,000,000. It’s slightly higher for identical twins, and higher again for prepubescent identical twins.

        • MonkderDritte@feddit.de

          Meaning about 8,000 potential false positives per user globally. Roughly 300 in the US, 80 in Germany, 7 in Switzerland.

          Might be enough for Iceland.
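
          A quick back-of-the-envelope version of that estimate, assuming a uniform, independent 1-in-1,000,000 match rate and rough population figures (so the counts come out slightly different from the ones above):

          ```python
          # Expected number of strangers whose faces would falsely match yours,
          # assuming a uniform 1-in-1,000,000 chance of a matching facial map.
          MATCH_RATE = 1 / 1_000_000

          populations = {
              "World": 8_000_000_000,
              "US": 330_000_000,
              "Germany": 84_000_000,
              "Switzerland": 9_000_000,
              "Iceland": 380_000,
          }

          for place, people in populations.items():
              print(f"{place}: ~{people * MATCH_RATE:,.1f} potential false matches per person")
          ```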

          • Telodzrum@lemmy.world

            Yeah, which is a really good number and allows for near complete elimination of false matches along this vector.

            • 4am@lemm.ee

              I promise bro it’ll only starve like 400 people please bro I need this

              • Telodzrum@lemmy.world

                No, you misunderstood. That is a reduction in commonality by a literal factor of one million. Any secondary verification point is sufficient to reduce the false positive rate to effectively zero.
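
                A minimal sketch of that argument, assuming the face scan and a hypothetical second factor (a card tap, a PIN, whatever) fail independently; both rates are illustrative assumptions, not measured figures:

                ```python
                # If two independent checks must BOTH falsely pass for a false
                # positive, the combined probability is the product of the two.
                face_false_match = 1 / 1_000_000        # the 1-in-a-million figure above
                second_factor_false_match = 1 / 10_000  # assumed rate for the second check

                combined = face_false_match * second_factor_false_match
                print(f"Combined false-positive probability: {combined:.0e}")  # ~1e-10
                ```

                The catch is that this only helps if the two checks really are independent and the second one is actually enforced.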

                • BassTurd@lemmy.world

                  Which means the face recognition was never necessary. It’s a way for companies to build a database that will eventually get exploited. 100% guarantee.

                • AwesomeLowlander@lemmy.dbzer0.com

                  secondary verification point

                  Like, running a card-sized piece of plastic across a reader?

                  It’d be nice if they were implementing this to combat credit card fraud or something similar, but that’s not how this is being deployed.

        • Cethin@lemmy.zip

          Yeah, people with totally different facial structures get identified as the same person all the time with the “AI” facial recognition, especially if you’re darker-skinned. Luckily (or unluckily) I’m white as can be.

          I’m assuming Apple’s software is a purpose-built algorithm that detects facial features and compares them, rather than a black-box AI where you feed in data and it returns a result. That’s the smart way to do it, but it takes more effort.
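
          For context, the matching step usually looks much the same in either case: some model turns each face into an embedding vector, and two faces count as a “match” if the vectors are close enough. A minimal sketch of that step; the embeddings and the 0.6 threshold are illustrative placeholders, not any vendor’s actual API or setting:

          ```python
          import numpy as np

          # Sketch of the usual face-matching step: compare embedding vectors
          # produced by some face model. The inputs would come from that model;
          # the 0.6 threshold is an illustrative placeholder, not a real setting.

          def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
              return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

          def is_same_person(embedding_a: np.ndarray, embedding_b: np.ndarray,
                             threshold: float = 0.6) -> bool:
              # Where the threshold sits trades false positives against false
              # negatives, which is the knob behind stories like this one.
              return cosine_similarity(embedding_a, embedding_b) >= threshold
          ```

          Whether the embedding comes from a hand-built pipeline or a trained network, the failure mode is the same: two different people landing too close together in that vector space.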

        • 4am@lemm.ee

          And yet this woman was mistaken for a 19-year-old 🤔

          • Telodzrum@lemmy.world

            Shitty implementation doesn’t mean shitty concept; you’d think a site full of tech nerds would understand such a basic distinction.

            • Hawk@lemmy.dbzer0.com

              Pretty much everyone here agrees that it’s a shitty concept. Doesn’t solve anything and it’s a privacy nightmare.

      • CeeBee@lemmy.world

        Ok, some context here from someone who built and worked with this kind of tech for a while.

        Twins are no issue. I’m not even joking, we tried for multiple months in a live test environment to get the system to trip over itself, but it just wouldn’t. Each twin was detected perfectly every time. In fact, I myself could only tell them apart by their clothes. They had very different styles.

        The reality with this tech is that, just like everything else, it can’t be perfect (at least not yet). For all the false detections you hear about, there have been millions upon millions of correct ones.

          • CeeBee@lemmy.world

            Yes, because like I said, nothing is ever perfect. There can always be a billion little things affecting each and every detection.

            A better statement would be “only one false detection out of 10 million”

            • Zron@lemmy.world

              You want to know a better system?

              What if each person had some kind of physical passkey that linked them to their money, and they used that to pay for food?

              We could even have a bunch of security put around this passkey that makes it really easy to disable it if it gets lost or stolen.

              As for shoplifting, what if we had some kind of societal system that levied punishments against people by providing a place where the victim and accused can show evidence for and against the infraction, and an impartial pool of people decides if they need to be punished or not.

            • fishpen0@lemmy.world

              Another way to look at that is ~810 people having an issue with a different 810 people every single day assuming only one scan per day. That’s 891,000 people having a huge fucking problem at least once every single year.

              I have this problem with my face in the TSA pre and passport system, and every time I fly it gets worse, because their confidence that it’s correct keeps going up and their trust in my actual fucking ID keeps going down.
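
              One way to reproduce the rough daily figure; the population, scan frequency, and error rate are all assumptions here, and the annual total depends on how often the same people get re-flagged:

              ```python
              # Back-of-the-envelope daily false matches, assuming ~8.1 billion
              # people scanned once a day at a 1-in-10-million false match rate.
              population = 8_100_000_000
              scans_per_person_per_day = 1
              false_match_rate = 1 / 10_000_000

              false_matches_per_day = population * scans_per_person_per_day * false_match_rate
              people_involved_per_day = 2 * false_matches_per_day  # flagged person plus their look-alike

              print(f"~{false_matches_per_day:.0f} false matches per day")      # ~810
              print(f"~{people_involved_per_day:.0f} people involved per day")  # ~1,620
              ```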

              • CeeBee@lemmy.world

                I have this problem with my face in the TSA pre and passport system

                Interesting. Can you elaborate on this?

                Edit: downvotes for asking an honest question. People are dumb

        • MonkderDritte@feddit.de

          it can’t be perfect (at least not yet).

          Or ever, because it locks you out after a drunken night otherwise.

          • CeeBee@lemmy.world

            Or ever, because there is no such thing as 100% in reality. You can only add more digits to the end of your accuracy; it will never reach 100.

        • boatswain@infosec.pub

          In fact, I myself could only tell them apart by their clothes. They had very different styles.

          This makes it sound like you only tried one particular set of twins–unless there were multiple sets, and in each set the two had very different styles? I’m no statistician, but a single set doesn’t seem statistically significant.

        • Cethin@lemmy.zip

          This tech (AI detection) or purpose built facial recognition algorithms?

      • CeeBee@lemmy.world

        No they aren’t. This is the narrative that keeps getting repeated over and over. And the citation for it is usually the ACLU’s test on Amazon’s Rekognition system, which was deliberately flawed to produce this exact outcome (people years later still saying the same thing).

        The top FR systems have no issues with any skin tones or complexions.

          • CeeBee@lemmy.world

            I promise I’m more aware of all the studies, technologies, and companies involved. I worked in the industry for many years.

            The technical studies you’re referring to show that the difference in error rate between a white man and a black woman (usually the polar opposites in terms of results) is around 0.000001%. But this usually gets blown out of proportion by media outlets.

            If you have white men at 0.000001% error rate and black women at 0.000002% error rate, then what gets reported is “facial recognition for black women is 2 times worse than for white men”.

            It’s technically true, but in practice it’s a misleading and disingenuous statement.
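
            To make the reporting point concrete, a tiny sketch using the illustrative figures above; the relative ratio and the absolute gap are both technically true, they just tell very different stories:

            ```python
            # Illustrative error rates from above, expressed as percentages.
            white_men_error_pct = 0.000001
            black_women_error_pct = 0.000002

            relative = black_women_error_pct / white_men_error_pct      # the "2 times worse" framing
            absolute_gap = black_women_error_pct - white_men_error_pct  # tiny in absolute terms

            print(f"Relative difference: {relative:.0f}x")
            print(f"Absolute difference: {absolute_gap:.6f} percentage points")
            ```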

            • AwesomeLowlander@lemmy.dbzer0.com

              Would you kindly link some studies backing up your claims, then? Because nothing I’ve seen online has similar numbers to what you’re claiming

    • TrickDacy@lemmy.world

      Are we assuming there is no PIN or any other auth method? That would be unlike any other payment system I’m aware of. I have to scan my fingerprint on my phone to use my credit cards, even though I just unlocked my phone to attempt it.

  • MentalEdge@sopuli.xyz

    Even if someone did steal a Mars bar… banning them from all food-selling establishments seems… disproportionate.

    Like if you steal out of necessity, and get caught once, you then just starve?

    Obviously not all grocers/chains/restaurants are that networked yet, but are we gonna get to a point where hungry people are turned away at every business that provides food, once they are on “the list”?

    • DivineDev@kbin.run

      No no, that would be absurd. You’ll also be turned away if you are not on the list if you’re unlucky.

    • FuryMaker@lemmy.world

      get caught once, you then just starve?

      Maybe they send you to Australia again?

      The world hasn’t changed, has it?

    • lolcatnip@reddthat.com

      They’ve essentially created their own privatized law enforcement system. They aren’t allowed to enforce their rules the same way a government would be, but punishment like banning a person from huge swaths of economic life can still be severe. The worst part is that private legal systems almost never have any concept of rights or due process, so there is absolutely nothing stopping them from being completely arbitrary in how they apply their punishments.

      I see this kind of thing as being closely aligned with right wingers’ desire to privatize everything, abolish human rights, and just generally turn the world into a dystopian hellscape for anyone who isn’t rich and well connected.

  • Vipsu@lemmy.world

    Can’t wait for something like this to get hacked. There’ll be a lot of explaining to do.

    • Jayjader@jlai.lu

      Still, I think the only way that would result in change is if the hack specifically went after someone powerful like the mayor or one of the richest business owners in town.

  • Buffalox@lemmy.world

    This is why some UK leaders wanted out of EU, to make their own rules with way less regard for civil rights.

    • ᕙ(⇀‸↼‶)ᕗ@lemm.ee

      Nah, I think the main thing was a super fragile identity. I mean, they have been shit all the time, since before the EU. When talks between France, Germany, and the UK took place, they insisted on taking control of the EU.

      If you live on an island for generations with limited new genetic input… well, that’s where you end up.

      • sailingbythelee@lemmy.world

        We humans have these things called “boats” that have enabled the British Isles to receive regular inputs of new genetic material. Pretty useful things, these boats, and somewhat pivotal in the history of the UK.

    • Lad@reddthat.com

      It’s the Tory way. Authoritarianism, culture wars, fucking over society’s poorest.

  • Jackthelad@lemmy.world

    Well, this blows the “if you’ve not done anything wrong, you have nothing to worry about” argument out of the water.

    • raspberriesareyummy@lemmy.world

      That argument was only ever made by dumb fucks or evil fucks. The article reports about an actual occurrence of one of the problems of such technology that we (people who care about privacy) have warned about from the beginning.

    • refalo@programming.dev

      the way I like to respond to that:

      “ok, pull down your pants and hand me your unlocked phone”

  • NeoNachtwaechter@lemmy.world

    This raises questions about how ‘good’ this technology is.

    But it also raises the question of how well your police can deal with false suspicions and false accusations.

    • CeeBee@lemmy.world

      This raises questions about how ‘good’ this technology is.

      No it doesn’t. For every 10 million good detections you only hear about the 1 or 2 false detections. The issue here is the policies around detections and how to verify them. Some places are still taking a blind-faith approach to the detections.

      • NeoNachtwaechter@lemmy.world

        For every 10 million good detections you only hear about the 1 or 2 false detections.

        Considering the impact of these faults, it is obviously not good enough.

        • CrayonMaster@midwest.social

          But that really says more about the user than the tech. The issue here isn’t that the tech has too many errors; it’s that stores use it and it alone to ban people despite it having a low but well-known error rate.

          • NeoNachtwaechter@lemmy.world

            says more about the user than the tech.

            “You need to have a suitable face for our face recognition?” ;-)

            stores use it and it alone to ban people

            No. Read again. The stores did not use the technology; they used the services of that tech company.

          • nyan@lemmy.cafe

            stores use it and it alone to ban people despite it having a low but well-known error rate.

            And it is absolutely predictable that some stores would do that, because humans. At the very least, companies deploying this technology need to make certain that all the store staff are properly trained on what it does and doesn’t mean, including new hires who arrive after the system is put in. Forcing that is going to require that a law be passed.

  • HiramFromTheChi@lemmy.world

    Not the first time facial recognition tech has been misused, and certainly won’t be the last. The UK in particular has caught a lotta flak around this.

    We seem to have a hard time connecting the digital world to the physical world and realizing just how interwoven they are at this point.

    Therefore, I made an open source website called idcaboutprivacy to demonstrate the importance—and dangers—of tech like this.

    It’s a list of news articles that demonstrate real-life situations where people are impacted.

    If you wanna contribute to the project, please do. I made it simple enough to where you don’t need to know Git or anything advanced to contribute to it. (I don’t even really know Git.)

  • Clbull@lemmy.world

    Please tell me a lawyer is taking this on pro bono and is about to sue the shit out of Facewatch.

    • Madison420@lemmy.world

      For what? A private business can exclude anyone for any reason or no reason at all so long as the reason isn’t a protected right.

      • howrar@lemmy.ca

        I’d be surprised if being born with a specific face configuration isn’t protected in the same way that race and gender are.

        • Madison420@lemmy.world

          In the UK you can pretty much guarantee that won’t happen, because it would shut down their surveillance state.

    • orrk@lemmy.world

      you will sit down and be quiet, all you parasites stifling innovation, the market will solve this, because it is the most rational thing in existence, like trains, oh god how I love trains, I want to be f***ed by trains.

      ~Rand

    • Alexstarfire@lemmy.world

      Are you suggesting they shouldn’t be allowed to ban people from stores? The only problem I see here is misused tech. If they can’t verify the person, they shouldn’t be allowed to use the tech.

      I do think there need to be repercussions for situations like this.

      • Queen HawlSera@lemm.ee

        Well, there should be a limited ability to do so. I mean, there should be police reports or something at the very least. What if facial recognition AI catches on in grocery stores? Is this woman just banned from all grocery stores now? How the fuck is she going to eat?

        • Alexstarfire@lemmy.world

          That’s why I said this was a misuse of tech. Because that’s extremely problematic. But there’s nothing to stop these same corps from doing this to a person even if the tech isn’t used. This tech just makes it easier to fuck up.

          I’m against the use of this tech to begin with but I’m having a hard time figuring out if people are more upset about the use of the tech or about the person being banned from a lot of stores because of it. Cause they are separate problems and the latter seems more of an issue than the former. But it also makes fucking up the former matter a lot more as a result.

          • TonyOstrich@lemmy.world

            I wish I could remember where I saw it, but years ago I read something in relation to policing that said a certain amount of human inefficiency in a process is actually a good thing, to help balance the bias and overreach that could occur when technology can technically do in seconds what would take a human days or months.

            In this case, if a person is enough of a problem that their face becomes known at certain branches of a store, it’s entirely reasonable for that store to post a sign with their face saying they aren’t allowed in. In my mind it would essentially create a certain equilibrium in terms of consequences and results. In addition to getting in trouble for the stealing itself, that person also has a certain amount of hardship placed on them that may require they travel 40 minutes to do their shopping instead of 5 minutes to the store nearby. A sign and people’s memory also aren’t permanent, so it’s likely that after a certain amount of time that person would be able to go back to that store if they had actually grown out of it.

            Or something to that effect. If they steal so much that they become known to the legal system there should be processes in place to address it.

            And even with all that said, I’m just not that concerned with theft at large corporate retailers considering wage theft dwarfs thefts by individuals by at least an order of magnitude.

  • The Menemen!@lemmy.world

    Even if she were the shoplifter, how would that work? “Sorry mate, you shoplifted when you were 16, so now you can never buy food again”?

  • Otter@lemmy.ca

    This makes me think of people who have trouble in airports because their name is similar to someone else’s.

    Only this is going to be much harder to deal with