- AI lowers the barrier to entry for cybercrime
- Deepfakes and fake websites take only minutes to set up
- Users must be careful
Sophisticated scam campaigns that once took cybercriminals weeks to set up can now appear within minutes, Microsoft is warning. The difference? Generative artificial intelligence (GenAI).
The tools that emerged a few years ago, notably ChatGPT, Copilot, Midjourney and many others, have not only cut the time needed to prepare a sophisticated internet scam, they have also lowered the barrier to entry, allowing even novice fraudsters to prepare and run advanced campaigns.
In its latest Cyber Signals report on AI-assisted scams, Microsoft said cybercriminals are using GenAI for more than phishing emails. They create deepfakes (usually fake videos of celebrities endorsing a project) and build AI-generated "simulated websites" that imitate legitimate companies.
Phishing and fraud
“What used to take scammers days or weeks to create will now take minutes,” Microsoft said.
But in the end, it is still “just” phishing and fraud, and people can mitigate the risk by slowing down online. Microsoft says the best things to do are not to be fooled by “limited-time” offers and countdown timers, to click only on verified advertisements (many scam sites are distributed via optimized social media ads), and to be skeptical of social proof (scammers can use AI-generated reviews and social media testimonials).
Finally, users should never provide personal or financial information to people who reach out via unsolicited text messages or emails. “You should never provide a Social Security number, bank details or passwords to an unverified employer,” the company said.
Paying for a job opportunity is almost always a red flag. The best advice is usually the simplest: if it seems too good to be true, it probably is.
Artificial intelligence will make internet fraud even more dangerous, but with a little common sense and a little care, it shouldn't succeed.