Just like Scarlett doesn’t own all voices that sound mildly like her, Spike Jonze doesn’t own the concept of an AI companion.
I’m not really sure what your point is; there’s nothing to rip off. No matter what they make it sound like, there’s going to be similarities with the movie. There’s nothing wrong with leaning into these for advertising purposes.
No matter what they make it sound like, there’s going to be similarities with the movie.
I don’t follow.
They literally disabled the ‘Sky’ voice Sunday night and now users can’t pick a voice that sounds like the character from Her.
And, mind you, this is not a ‘huh, they sorta sound the same’ situation; this is a ‘they sound very similar, and have the same personality’ situation. Add to that the fact that Sam Altman is on the record talking about being obsessed with the movie Her - which is circumstantial. What isn’t circumstantial is that they literally referenced the movie’s name in their marketing materials. Sam tweeted a vague hint, and his colleagues confirmed it. It’s not speculative.
There’s nothing wrong with leaning into these for advertising purposes.
Actually, intellectual property theft is either wrong or merely technically illegal, depending on where you stand on copyright; either way, it’s wrong. Then there’s trying to mislead the public into thinking that GPT-4o was endorsed in some way by those involved in the movie Her. A false endorsement is also illegal. So - wrong there, too.
I’m sure an actual lawyer could find more wrong with it, but just those two things are actual, literal crimes.
I’m saying practically any voice with that bubbly, flirty personality is going to make you think of the movie Her in such a context.
Sure, they leaned into it for advertising purposes, but a tweet referencing it and showcasing the one voice out of the five that sounds like her isn’t crossing the line imo.
I think it’s a slippery slope to say any AI assistant with a timbre and personality similar to an AI in a movie is off limits.
As long as they don’t infringe by calling it “Scarjo” or saying “From the movie Her” I don’t see a problem.