It’s the new symbol of celebrity status that no one wants. Jennifer Aniston, Oprah Winfrey and Kylie Jenner have had their voices cloned by fraudsters. Online scammers used artificial intelligence to fake the Tiggerish tones of the TV financial adviser Martin Lewis. And this weekend David Attenborough described himself as “deeply disturbed” after discovering that his cloned voice had been used to broadcast partisan US news programmes.
AI-generated voice of David Attenborough covering Donald Trump – video
Now experts have warned that the law is failing to keep pace with voice cloning, as engineers advance previously clunky voice generators into models capable of mimicking the subtler pauses and breaths of human intonation.
Dr. Dominic Lees, an expert on AI in film and television who advises a British parliamentary committee, told the Guardian on Monday: “Our data protection and copyright laws are not up to date with what this new technology offers, so there is very little that David Attenborough can do.”
Lees is advising the House of Commons Select Committee on Culture, Media and Sport on an inquiry that will look into the ethical use of AI in film production. He is also launching the Synthetic Media Research Network, whose members include the company creating an AI version of the late chat show interviewer Michael Parkinson, a project that will result in an eight-part unscripted series with new guests, Virtually Parkinson. That voice cloning project is being carried out with the consent of Parkinson’s family and estate.
“The government definitely needs to address [voice cloning] because it is a huge fraud problem,” Lees said. “Government regulation is needed to stop [abuse] … we can’t allow it to be a free-for-all.”
The number of AI voice cloning fraud cases in the UK has increased by 30% over the past year, according to research from NatWest Bank this month. Another lender, Starling Bank, found that 28% of people had been the victim of an AI voice cloning scam at least once in the past year.
Voice cloning is also being used by fraudsters to run a version of the “Hi Mum” text scam, in which criminals pose as a child in desperate need of money from their parents. On an already crackly phone line, it can be difficult to tell that the pleading child is in fact a scammer’s clone. Consumers are advised to hang up and call back on a trusted number to verify.
People whose voices are cloned without their consent find it more than an annoyance. Attenborough told the BBC on Sunday: “After a lifetime of trying to speak what I believe to be the truth, I am deeply disturbed to find that these days my identity is being stolen by others, and I strongly object to them using it to say whatever they want.”
When a new voice option in OpenAI’s latest AI model, ChatGPT-4o, sounded uncannily close to her own, the actress Scarlett Johansson said she was shocked and upset because the voice “sounded so eerily similar to mine that my closest friends and the media couldn’t tell the difference”.
OpenAI’s Sky voice assistant sounds like Scarlett Johansson – video
The proliferation of cloned voices raises the question of what they still lack compared with real human voices. Lees said: “The big problem is that AI doesn’t understand emotion and how that changes the way a word or phrase can have emotional impact, and how you vary your voice to represent it.”
The voiceover industry, which provides voices for advertising, animation and training, needs to respond quickly to technological advances. Joe Lewis, head of audio at the Voiceover Gallery in London, which has provided real human voices for Specsavers and National Express adverts, said he had already cloned the voices of some of his artists.
He said the AI seemed to work best with male English voices, perhaps reflecting a bias in the type of recordings used to train the algorithms, but warned that in general “there is something about the way it is generated that makes you less attentive”.
“When the AI [voice] breathes, it is a very repetitive breath,” he said. “The breaths are in the right place, but they don’t sound natural … [but] can you get to the point where they’re really perfect? I don’t see why not, but reaching the full emotional spectrum is still a long way off.”
Guest 1: Michael Smith, Voice Actor
Question 1: How do you think the rise of AI-generated voice cloning will impact the voice acting industry?
Michael Smith: Well, it’s a mixed bag, really. On one hand, it could potentially take work away from voice actors who rely on regular bookings. On the other, it opens up new possibilities, such as working with voice cloning technology to create unique characters or even taking on more jobs at once. Either way, the industry has to adapt to this new technology and find new ways to use it.
Question 2: What do you think about the ethical implications of using voice cloning for commercial purposes without the consent of the person whose voice it is?
Michael Smith: I think it’s a big issue. Celebrities and public figures have every right to protect their own image and voice, and using them without permission is certainly wrong. Furthermore, fake voices can be very harmful to innocent people who may fall prey to fraudulent activities. Regulation is definitely needed to ensure that everyone’s rights are protected.
Guest 2: Dr. Sarah Johnson, AI Expert
Question 1: How close are AI-generated voices to replicating human speech patterns and emotions?
Dr. Sarah Johnson: I believe we are still some distance from fully replicating human speech patterns and emotions. AI still lacks the depth of understanding that humans have when it comes to context and emotionally charged language. However, we are making progress, and with further advancements in technology, this could certainly change.
Question 2: How do you think voice cloning can be used responsibly in the future?
Dr. Sarah Johnson: I think voice cloning has enormous potential for creating new forms of entertainment and storytelling, but it’s important that we use it responsibly. This means being transparent about using cloned voices and ensuring that they are not used to deceive or harm others. We also need to consider ways to protect the privacy of individuals whose voices may be cloned without their consent.