Why ChatGPT's Fake ScarJo Voice Is So Disturbing

SdĂ­let
VloĆŸit
  • Uploaded 29 May 2024
  • Controversy surrounds OpenAI's new GPT-4o model and its talking AI assistant, Sky. Scarlett Johansson, in a statement, expressed concern that OpenAI had copied her voice without her permission. She claims that OpenAI reached out to her on multiple occasions to be the voice of their new chatbot, and Sam Altman, the CEO of OpenAI, appeared to joke on Twitter that his new virtual assistant was modelled after the AI Samantha, voiced by Johansson in the Spike Jonze movie "Her." What does all of this mean for the future of AI, and what does it say about the gender politics of modern technology? Here we'll talk about the controversy, we'll analyze the movie Her, and we'll turn to C.S. Lewis for some insight.
    #ai #her #technology
    Support this channel 👉 / greatbooksprof
    📗 CONNECT WITH ME! I use Instagram to keep in touch with students and friends.
    / greatbooksprof
    📓 Find more on TIKTOK! ⏱ Short videos on philosophy, books, and life.
    / greatbooksprof
    📗 STUDY WITH ME: Learn about the Great Books Program at St. Thomas University 🇹🇩
    www.stu.ca/greatbooks/
    📗 CHECK OUT MY BOOKS:
    Shakespeare between Machiavelli and Hobbes: Dead Body Politics
    rowman.com/ISBN/9781498514095...
    Magic in Early Modern England: Literature, Politics and Supernatural Power
    rowman.com/ISBN/9781498575515...
    đŸ“« E-MAIL: Business inquiries should be directed to greatbooksprof@gmail.com
  • Science & Technology

Komentáƙe • 18

  • @harrietparnell6318
    @harrietparnell6318 Pƙed měsĂ­cem +4

    You deliver commentary gold EVERY TIME. This channel is one of the only places on the internet I’ve found that explores current events in such a worthwhile and thoughtful way. I love the way this channel introduces us to past thinkers and ideas that are so relevant to today. Please keep making these videos! (And I love that C.S. Lewis quote! )

    • @GreatBooksProf
      @GreatBooksProf  Pƙed měsĂ­cem

      I really appreciate that, Harriet! Thanks so much! This is exactly what I’ve been trying to do over the past few months with the channel, so it’s good to know it’s making sense to at least one person. 😅

  • @myles1451
    @myles1451 Pƙed měsĂ­cem +1

    I think this is a non-event if they demonstrated that they used another voice actor. In this instance it has nothing to do with the unethical use of AI. It doesn’t even really sound like her tbh

  • @moorefamily4016
    @moorefamily4016 Pƙed 27 dny +1

    This is my dad

  • @jeff20928
    @jeff20928 Pƙed 29 dny

    This was a good video, and the most striking part to me was the C.S. Lewis quote: for a quote from 1943, it's eerie how prescient it is in mirroring today's tech landscape.

    • @GreatBooksProf
      @GreatBooksProf  Pƙed 23 dny

      It's always so uncanny how a book from 100 (or 1,000!) years ago seems to speak directly to our current situation!

  • @stephenspackman5573
    @stephenspackman5573 Pƙed měsĂ­cem

    It's interesting. I suppose I'm one of those tech people (though I retired just before the LLMs hit the public consciousness), and you'd call me male and white and protestant and anglophone and all those bad things.
    For myself, I want my information source to be knowledgeable and authoritative, and to me that makes an adult female voice the natural preference among human-based models. Most (though certainly not all) of the good teachers, doctors and librarians in my life have been female. Children, I don't expect to know the answers to questions about the wider world, and men, I might worry in the back of my mind that they are lying to me. And I know I set my GPS to the female voice because it doesn't make me panic the way that a male voice suddenly shouting instructions does; panicking while driving is a bad thing. Clearly all this _is_ sexism on my part, in that I'm able to map my preferences directly onto a sex, and whenever I've tried a male voice I've ended up switching back, but I'm not convinced that a creepy imaginary girlfriend explanation of my motives is correct. Indeed, that “flirty” aspect rather creeps me out, and once again makes me worry that I'm being manipulated. Perhaps I'd go for the grandmother persona, if offered, though that might make more sense if conceptualised as the voice of the cloud rather than the voice of my shiny new local device.
    Of course, since the current generation of these tools is engineered to say what's plausible and not what's true, they _are_ by nature lying and manipulative, so

    Finally, as to power dynamics, I do very much want to be in charge of my relationship with a computer. That's been a defining characteristic of my life. But I'm fairly confident that _isn't_ a sex thing and has little to do with my computer's voice settings.

  • @gaylenwoof
    @gaylenwoof Pƙed měsĂ­cem +2

    From what I’ve seen, AIs typically come with a variety of voice options. As long as that’s true, I don’t see any problem with offering sexy female voices as options. Also, IF OpenAI really did find another voice actor whose voice is similar to Scarlett’s, then I see nothing wrong with that either. It’s only problematic if it really is an AI copy of Scarlett’s voice made against her will. I’m skeptical, however, because it would be easy for OpenAI to prove if another actor was used. Just present that person and let her be interviewed.
    As for erotic relationships with machines, there are obviously some serious potential problems but, ultimately, we need to let people make their own choices - just as we let people make their own choices about sex toys, gender preferences, non-monogamy, most fetishes, masturbation, abortion, etc. There is a loooooooooooong history of people feeling that anything other than vanilla heterosexual monogamy is creepy or immoral and should be illegal. The exact same judgmental attitudes are going to apply to sexbots. But what gives any of us the moral authority to judge other people’s private preferences and lifestyles in a way that oppresses their freedom to choose? Quick answer: Nothing unless such preferences lead to tangible harm. As I see it, sexbot fetishes should fall in the same general moral categories as other fetishes and/or potentially problematic life choices such as alcohol, pot, tobacco, gambling, pornography, etc. Virtually any of these things are harmful to some degree or can be potentially harmful, but most of them are legal and/or accepted to some degree.
    I should add that, for some people, sexbots might literally be the best lifestyle choice for them. Some people can’t realistically have human sexual partners and/or shouldn’t even try (e.g., people with certain types of mental or physical problems who are potentially harmful to others). Choosing a sexbot relationship should be no more stigmatized than using a sex toy, or being celibate, or being asexual. Just as with alcohol, etc., there will certainly be cases where people are harmed. Just as with any potentially problematic behaviors, there should be research and discussions so that people can make informed decisions about the risks (“
this product may be harmful
”, “don’t take too many of these pills
”, etc.)
    Bottom line: I hope people have learned something from centuries of various sorts of oppression so that, instead of re-creating all of the old oppressive judgementalism, we can move quickly into accepting that some people are going to have intimate relationships with machines and that’s none of our business. Make sure that everyone has the benefit of knowing the potential risks but, other than that, just let people be who they want to be.

  • @ujjwaltyagi3030
    @ujjwaltyagi3030 Pƙed měsĂ­cem +1

    If it's someone else, then it's not a problem; he is overreacting

    • @GreatBooksProf
      @GreatBooksProf  Pƙed měsĂ­cem

      Hey, thanks for watching. OpenAI pulled its “Sky” chatbot days after Scarlett Johansson made her statement, so clearly they are taking her allegations pretty seriously.

    • @Dupamine
      @Dupamine Pƙed měsĂ­cem

      ​@@GreatBooksProf sure they are. But they trained on a different person, so it doesn't matter. You're inflating and exaggerating things and blowing it out of proportion. It's just a voice. Stop crying

    • @GreatBooksProf
      @GreatBooksProf  Pƙed měsĂ­cem

      @@Dupamine If it doesn’t matter, why do you think they’ve pulled the voice? What exactly do you think is “exaggerated”?

    • @ujjwaltyagi3030
      @ujjwaltyagi3030 Pƙed měsĂ­cem

      @@GreatBooksProf they pulled it because Scarlett complained. Both Scarlett and you are overreacting over a voice.

    • @GreatBooksProf
      @GreatBooksProf  Pƙed měsĂ­cem

      @@ujjwaltyagi3030 They pulled it because they are worried they might get sued. If it was “not a big deal” they wouldn’t worry, or they would ignore her complaint. They reacted because they know the public and a judge might reasonably agree with SJ. But react as you see fit. 🙂