Science and Technology

Did a Family Member Call You Asking for Money? It May Be an AI Voice Scam


Published on 2025-03-05 14:42:48 - CNET

  • Scammers can clone voices to trick you into thinking your loved one is in trouble. How to protect your money and identity.

The CNET article discusses the rising threat of scammers using AI to clone voices for fraud. Scammers use voice clips, often taken from social media or other public platforms, to create convincing replicas of a person's voice, letting them impersonate that person and trick friends, family, or colleagues into believing they are speaking with the real thing. The article outlines several protective measures: be cautious about the personal information you share online, use strong, unique passwords, enable two-factor authentication, and be skeptical of unsolicited calls or requests for money, even if the voice sounds familiar. It also suggests setting up a "safe word" or phrase with close contacts to verify identity in suspicious situations. The piece emphasizes awareness and vigilance as AI technology advances, making scams more sophisticated and harder to detect.

Read the Full CNET Article at:
[ https://www.cnet.com/personal-finance/scammers-are-using-voice-clips-to-create-ai-clones-heres-how-to-stay-safe/ ]