- 58-59% of workers admit to using shadow AI at work
- Datasets, employee names, and finances are all shared with unapproved tools
- Could IT teams meet workers where they are to ensure better compliance?
AI tools are now commonplace in many companies, but a new study from BlackFog found that while most employees (86%) now say they use AI for their work tasks at least once a week, three-fifths (58%) admit they use unapproved AI or free, publicly available tools instead of company-provided ones, putting their business at risk.
Company-provided tools are important for providing enterprise-grade security, governance, and privacy, but many employees complain that the AI offered to them isn’t tailored to their needs.
More importantly, 63% think it's OK to use AI without IT approval and 60% agree that unapproved AI is worth the security risk if it helps them meet deadlines, suggesting a clear disconnect between companies' security goals and how those goals are communicated to staff.
Shadow AI Is Pervasive in Employee Workflows
Shadow AI “should alert security teams and highlights the need for increased monitoring and visibility into these security blind spots,” wrote BlackFog CEO Dr. Darren Williams.
This comes as 33% of workers admit to sharing research or datasets with unapproved AI, 27% have shared employee data such as names, payroll or performance, and 23% have shared financial or business data.
But while the onus is on IT teams to double down on AI policies and expectations, they face an uphill battle: senior executives and leaders are more likely than junior and administrative staff to believe that speed trumps privacy and security.
And BlackFog isn't the only company to reveal widespread use of shadow AI – Cybernews also found that 59% of workers use unapproved AI at work, and that 75% of those users have shared sensitive information with these unapproved tools.
Similarly, the report found that 57% of workers’ direct managers support the use of unapproved AI. “This creates a gray area where employees feel encouraged to use AI, but companies lose control over how and where sensitive information is shared,” warned security researcher Mantas Sabeckis.
Looking ahead, there are two clear steps toward eradicating shadow AI. First, IT teams need to reiterate the risks involved and guide users to approved tools. Second, since currently approved tools are clearly not suitable for many workers, IT teams need to meet employees where they are and deliver enterprise-grade versions of the applications workers actually want to use.