
A next-generation scam is coming for phone users. There is no way to protect yourself.

The call doesn’t come from an unknown number – it’s a relative or a friend. The voice on the line sounds familiar, too. Unfortunately, the request for a loan turns out to be fake, and the money disappears forever.

This is a new phone scam that combines the long-known technique of caller ID spoofing with artificial intelligence, warn Matthew Wright and Christopher Schwartz, two researchers at the Rochester Institute of Technology in New York State.

As computer security researchers, we see that advances in deep learning algorithms, audio editing and engineering, and synthetic voice generation make it increasingly possible to convincingly simulate a person’s voice.

– we read in the article published in “The Conversation”.

Fake any voice for tens of dollars.

In the United States, according to the experts, over 11 million dollars has already been stolen this way. And the phenomenon, in their opinion, will deepen and spread to other regions of the world, because the necessary tools are becoming more and more easily available. Examples include commercial voice cloning services such as Play.ht, Murf, or Respeecher, which will generate almost any voice for a few dozen dollars a month.

Wright and Schwartz agree that phone users will soon have to form a new habit: always hang up and call back whenever the request heard on the line concerns lending money or disclosing potentially sensitive data.

To make matters worse, chatbots like ChatGPT are starting to generate realistic scripts with adaptive real-time responses. By combining these technologies with voice generation, a deepfake goes from being a static recording to a live, lifelike avatar that can convincingly carry on a phone conversation.

– note Matthew Wright and Christopher Schwartz.

What’s more, the researchers are also concerned that the voice spoofing scam has yet to evolve into its truly dangerous form. Currently, most scammers reportedly use prepared, pre-recorded tracks, but by adding chatbots to the mix, they could conduct a conversation in real time, we read.

How can you protect yourself from this? you may ask. The researchers themselves admit that, at least for now, there is no good solution. As they rightly point out, there are no relevant regulations governing voice cloning. So we return, in a way, to the starting point: any request you hear must be verified at all costs. There is no better alternative.
