“Suno’s training data includes essentially all music files of reasonable quality that are accessible on the open internet.”

“Rather than trying to argue that Suno was not trained on copyrighted songs, the company is instead making a Fair Use argument to say that the law should allow for AI training on copyrighted works without permission or compensation.”

Archived (also bypasses the paywall): https://archive.ph/ivTGs

  • Dr. Moose@lemmy.world · 2 months ago

    You’re free to learn from any piece of music too. Whether AI is actually learning is still debatable, but you have the same rights right now.

    I’m still on the fence, tbh. I feel like it is learning and it is transformative, but it’s just too powerful for our current copyright framework.

    Either way, that’ll be such a headache for the transformative-work clause of copyright for years to come. Also, policing training would be completely unenforceable, so any decision here would be rather moot in real-world practice.

    • jadelord@discuss.tchncs.de · 2 months ago

      We are free to learn, but learning is not free.

      Freedom vs. cost. One cannot pick up a skill without time, effort, and, more importantly, access to guidance and a vast library of content. The same applies to man or machine. The difference is how corporations have essentially reinvented piracy to serve their own selfish ends, after decades of dictating what’s right with the DMCA, DRM, and whatnot.

      • ClamDrinker@lemmy.world · 2 months ago

        That’s where open-source AI comes in. If we have the same freedoms, then all it takes is a grassroots effort to ensure that the tools born of humanity’s information remain free to be used by all of humanity. We should also be able to use the same tool without having to pay those companies a dime.

    • just another dev@lemmy.my-box.dev · 2 months ago

      Also, policing training would be completely unenforceable

      That’s where laws would come in. Obviously it would be civil law rather than criminal law, but making it enforceable would have to be part of such laws. For example, forcing model makers to disclose their training datasets in one way or another.

      • Dr. Moose@lemmy.world · 2 months ago

        But you can already train models at home, and you can also just extend existing models with new training data. Will that be regulated too? How?
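
        (As a rough illustration of how low the barrier is: the sketch below extends an existing open model with local data. It assumes the Hugging Face transformers and datasets libraries, a small open checkpoint such as distilgpt2, and a local file my_corpus.txt; those names are example choices for this sketch, not anything from the thread.)

          from datasets import load_dataset
          from transformers import (AutoModelForCausalLM, AutoTokenizer,
                                     DataCollatorForLanguageModeling, Trainer,
                                     TrainingArguments)

          base = "distilgpt2"  # any small open checkpoint works as a starting point
          tokenizer = AutoTokenizer.from_pretrained(base)
          tokenizer.pad_token = tokenizer.eos_token  # GPT-2 family ships without a pad token
          model = AutoModelForCausalLM.from_pretrained(base)

          # Fold whatever local data you like into the model (hypothetical corpus file).
          raw = load_dataset("text", data_files={"train": "my_corpus.txt"})
          tokenized = raw["train"].map(
              lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
              batched=True, remove_columns=["text"])

          collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)
          args = TrainingArguments(output_dir="finetuned", num_train_epochs=1,
                                   per_device_train_batch_size=2)

          # Continue training the published weights on the new data; nothing in this
          # loop is observable from outside the local machine.
          Trainer(model=model, args=args, train_dataset=tokenized,
                  data_collator=collator).train()

        Swap in an audio model and a folder of songs and the same pattern applies, which is exactly why policing it looks so hard.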

        • CosmoNova@lemmy.world · 2 months ago

          They’re literally already about to heavily regulate hobby AI to make sure the giant corporations that hoard all our information get to make even more mountains of money with it. The idea that anyone gets to use any media for machine learning is already a relic of the past, and in fact it is not remotely comparable to learning things for yourself, especially not in the legal sense. Did you really naively believe, even for a second, that AI would democratize anything?

    • helenslunch@feddit.nl · 2 months ago

      Also, policing training would be completely unenforceable

      But…it’s already been enforced, several times.