Why ChatGPT's Fake ScarJo Voice Is So Disturbing
- Added 29 May 2024
- Controversy surrounds OpenAI's new GPT-4o model and its talking AI assistant, Sky. Scarlett Johansson, in a statement, expressed concern that OpenAI had copied her voice without her permission. She claims that OpenAI reached out to her on multiple occasions to be the voice of their new chatbot, and Sam Altman, the CEO of OpenAI appeared to joke on Twitter that his new virtual assistant was modelled after the AI Samantha, voiced by Johansson in the Spike Jonze movie "Her." What does all of this mean for the future of AI, and what does it say about the gender politics of modern technology? Here we'll talk about the controversy, we'll analyze the movie Her, and we'll turn to C.S. Lewis for some insight.
#ai #her #technology
Support this channel: / greatbooksprof
CONNECT WITH ME! I use Instagram to keep in touch with students and friends.
/ greatbooksprof
Find more on TIKTOK! Short videos on philosophy, books, and life.
/ greatbooksprof
STUDY WITH ME: Learn about the Great Books Program at St. Thomas University
www.stu.ca/greatbooks/
CHECK OUT MY BOOKS:
Shakespeare between Machiavelli and Hobbes: Dead Body Politics
rowman.com/ISBN/9781498514095...
Magic in Early Modern England: Literature, Politics and Supernatural Power
rowman.com/ISBN/9781498575515...
E-MAIL: Business inquiries should be directed to greatbooksprof@gmail.com
You deliver commentary gold EVERY TIME. This channel is one of the only places on the internet I've found that explores current events in such a worthwhile and thoughtful way. I love the way this channel introduces us to past thinkers and ideas that are so relevant to today. Please keep making these videos! (And I love that C.S. Lewis quote!)
I really appreciate that, Harriet! Thanks so much! This is exactly what I've been trying to do over the past few months with the channel, so it's good to know it's making sense to at least one person.
I think this is a non-event if they demonstrated that they used another voice actor. In that case it has nothing to do with the unethical use of AI. It doesn't even really sound like her, tbh.
This is my dad
This was a good video, and the most striking part to me was the C.S. Lewis quote: for a quote from 1943, it's eerie how presciently it mirrors today's tech landscape.
It's always so uncanny how a book from 100 (or 1,000!) years ago seems to speak directly to our current situation!
It's interesting. I suppose I'm one of those tech people (though I retired just before the LLMs hit the public consciousness), and you'd call me male and white and protestant and anglophone and all those bad things.
For myself, I want my information source to be knowledgeable and authoritative, and to me that makes an adult female voice the natural preference among human-based models. Most (though certainly not all) of the good teachers, doctors and librarians in my life have been female. Children, I don't expect to know the answers to questions about the wider world, and men, I might worry in the back of my mind that they are lying to me. And I know I set my GPS to the female voice because it doesn't make me panic the way that a male voice suddenly shouting instructions does; panicking while driving is a bad thing. Clearly all this _is_ sexism on my part, in that I'm able to map my preferences directly onto a sex, and whenever I've tried a male voice I've ended up switching back, but I'm not convinced that a creepy imaginary girlfriend explanation of my motives is correct. Indeed, that "flirty" aspect rather creeps me out, and once again makes me worry that I'm being manipulated. Perhaps I'd go for the grandmother persona, if offered, though that might make more sense if conceptualised as the voice of the cloud rather than the voice of my shiny new local device.
Of course, since the current generation of these tools are engineered to say what's plausible and not what's true, they _are_ by nature lying and manipulative, so….
Finally, as to power dynamics, I do very much want to be in charge of my relationship with a computer. That's been a defining characteristic of my life. But I'm fairly confident that _isn't_ a sex thing and has little to do with my computer's voice settings.
From what I've seen, AIs typically come with a variety of voice options. As long as that's true, I don't see any problem with offering sexy female voices as options. Also, IF OpenAI really did find another voice actor whose voice is similar to Scarlett's, then I see nothing wrong with that either. It's only problematic if an AI really did copy Scarlett's voice against her will. I'm skeptical, however, because it would be easy for OpenAI to prove if another actor was used. Just present that person and let her be interviewed.
As for erotic relationships with machines, there are obviously some serious potential problems but, ultimately, we need to let people make their own choices - just as we let people make their own choices about sex toys, gender preferences, non-monogamy, most fetishes, masturbation, abortion, etc. There is a loooooooooooong history of people feeling that anything other than vanilla heterosexual monogamy is creepy or immoral and should be illegal. The exact same judgmental attitudes are going to apply to sexbots. But what gives any of us the moral authority to judge other people's private preferences and lifestyles in a way that oppresses their freedom to choose? Quick answer: nothing, unless such preferences lead to tangible harm. As I see it, sexbot fetishes should fall into the same general moral categories as other fetishes and/or potentially problematic life choices such as alcohol, pot, tobacco, gambling, pornography, etc. Virtually all of these things can be harmful to some degree, but most of them are legal and/or accepted to some extent.
I should add that, for some people, sexbots might literally be the best lifestyle choice for them. Some people can't realistically have human sexual partners and/or shouldn't even try (e.g., people with certain types of mental or physical problems who are potentially harmful to others). Choosing a sexbot relationship should be no more stigmatized than using a sex toy, or being celibate, or being asexual. Just as with alcohol, etc., there will certainly be cases where people are harmed. Just as with any potentially problematic behaviors, there should be research and discussions so that people can make informed decisions about the risks ("…this product may be harmful…", "don't take too many of these pills…", etc.)
Bottom line: I hope people have learned something from centuries of various sorts of oppression so that, instead of re-creating all of the old oppressive judgementalism, we can move quickly into accepting that some people are going to have intimate relationships with machines and that's none of our business. Make sure that everyone has the benefit of knowing the potential risks but, other than that, just let people be who they want to be.
If it's someone else then it's not a problem; he is overreacting.
Hey, thanks for watching. OpenAI pulled its "Sky" chatbot days after Scarlett Johansson made her statement, so clearly they are taking her allegations pretty seriously.
@@GreatBooksProf Sure they are. But they trained on a different person, so it doesn't matter. You're inflating and exaggerating things and blowing it out of proportion. It's just a voice. Stop crying.
@@Dupamine If it doesn't matter, why do you think they've pulled the voice? What exactly do you think is "exaggerated"?
@@GreatBooksProf They pulled it because Scarlett complained. Both Scarlett and you are overreacting over a voice.
@@ujjwaltyagi3030 They pulled it because they are worried they might get sued. If it was "not a big deal" they wouldn't worry, or they would ignore her complaint. They reacted because they know the public and a judge might reasonably agree with SJ. But react as you see fit.