An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

  • AbouBenAdhem@lemmy.world · ↑165 ↓15 · 1 year ago

    It’s less a reflection on the tech, and more a reflection on the culture that generated the content that trained the tech.

    Wang told The Globe that she was worried about the consequences in a more serious situation, like if a company used AI to select the most “professional” candidate for the job and it picked white-looking people.

    This is a real potential issue, not just “clickbait”.

    • HumbertTetere@feddit.de · ↑36 ↓2 · 1 year ago

      If companies go pick the most professional applicant by their photo that is a reason for concern, but it has little to do with the image training data of AI.

      • AbouBenAdhem@lemmy.world · ↑21 ↓1 · 1 year ago

        Some people (especially in business) seem to think that adding AI to a workflow will make obviously bad ideas somehow magically work. Dispelling that notion is why articles like this are important.

        (Actually, I suspect they know they’re still bad ideas, but delegating the decisions to an AI lets the humans involved avoid personal blame.)

        • Square Singer@feddit.de · ↑6 · 1 year ago

          It’s a massive issue that many people (especially in business) have this “the AI has spoken” bias.

          Similar to how they implement whatever the consultant says, whether or not it actually makes sense, they just blindly follow what the AI says.

        • Water1053@lemmy.world · ↑5 · 1 year ago

          Businesses will continue to use bandages rather than fix their root issue. This will always be the case.

          I work in factory automation and almost every camera/vision system we’ve installed has been a bandage of some sort because they think it will magically fix their production issues.

          We’ve had a sales rep ask if our cameras use AI, too. 😵‍💫

    • JeffCraig@citizensgaming.com · ↑9 ↓1 · 1 year ago

      Again, that’s not really the case.

      I have Asian friends who have used these tools and generated headshots that were fine. Just because this one Asian student used a model that wasn’t trained on her demographic doesn’t make it a reflection of anything other than the fact that she doesn’t understand how ML models work.

      The worst thing that happened when my friends used it was results with too many fingers or multiple sets of teeth 🤣

    • drz@lemmy.ca · ↑4 ↓2 · 1 year ago

      No company would use ML to classify who’s the most professional-looking candidate.

      1. Anyone with any ML experience at all knows how ridiculous this concept is. Who’s going to go out there and create a dataset matching “professional-looking scores” to headshots?
      2. The amount of bad press and ridicule this would attract isn’t worth it to any company.

      • kbotc@lemmy.world · ↑7 · 1 year ago

        Companies already use resume scanners that have been found to bias against Black-sounding names. They’re designed to feed successful candidates back into the training loop, and guess what shit the ML learned real quick?
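
The feedback loop described in that last comment can be sketched in a few lines. This is a toy illustration with synthetic data and made-up group labels ("A"/"B"), not any real screening system: a screener trained to imitate past hiring decisions inherits whatever bias those decisions had, so two equally skilled candidates end up with different scores.

```python
# Toy sketch of a biased hiring feedback loop (all data synthetic).
import random

random.seed(0)

# Synthetic history: equally qualified candidates, but past recruiters
# hired group A far more often than group B at the same skill level.
history = []
for _ in range(1000):
    group = random.choice("AB")
    skill = random.random()
    hired = skill > 0.5 and (group == "A" or random.random() < 0.3)
    history.append((group, skill, hired))

def hire_rate(g):
    # "Training": the screener learns the historical hire rate per group.
    rows = [(grp, s, h) for grp, s, h in history if grp == g]
    return sum(1 for _, _, h in rows if h) / len(rows)

prior = {g: hire_rate(g) for g in "AB"}

def score(group, skill):
    # The learned prior now scales the score: group membership leaks in,
    # and each round of "successful candidates" reinforces it.
    return skill * prior[group]

# Two candidates with identical skill get different scores.
print(round(score("A", 0.8), 3), round(score("B", 0.8), 3))
```

Identical skill, different score — exactly the loop the comment describes: the model learns the bias in the labels, then its own picks become the next round's labels.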