AI identity scams skyrocketed in 2025, warn security experts – here's how to stay safe


  • AI impersonation scams use voice cloning and deepfake video to convincingly imitate trusted people
  • Cybercriminals target individuals and businesses via calls, video meetings, messages, and emails
  • Experts say independent identity verification and multi-factor authentication are essential to protect yourself

Imagine getting a frantic call from your best friend. Their voice is trembling as they tell you they were in an accident and urgently need money. You recognize the voice instantly; after all, you've known them for years. But what if that voice isn't real?

In 2025, scammers increasingly used AI to clone voices, imitate faces, and impersonate the people we trust most.
