The development of artificial intelligence has generated numerous benefits, but it has also given rise to new forms of fraud and extortion, especially through messaging applications such as WhatsApp and Telegram. These criminal methods have grown in complexity and effectiveness, posing serious risks to users.
One of the most alarming types of extortion today is voice cloning using artificial intelligence. This process allows criminals to replicate a person’s voice with the aim of deceiving the victim and persuading them to hand over large sums of money. According to a study by cybersecurity company McAfee, with just a brief voice recording, AI programs can generate fake messages that are distributed through these messaging platforms.
Cloned messages often convey a sense of urgency, describing emergency situations that demand an immediate payment. To make the deception more credible, extortionists pose as family members or friends, which makes it harder for the victim to detect the fraud in time.
To protect yourself against these scams, it is recommended to establish security codes with family and friends to serve as keywords in emergency situations. It is also essential to always verify the source of a message by checking the phone number and the person's communication style. Sharing information only with trusted people and removing intermediary contacts can reduce the risk of falling into these traps.