- Microsoft has clarified some terms and conditions associated with Copilot
- Responsibilities have been transferred to users of the AI tool
- Although it is intended for “entertainment purposes,” Copilot is still heavily marketed to workers
In a notable shift, Microsoft has clarified that Copilot is intended “for entertainment purposes only” and that, if used for business purposes, it should serve as the first of multiple fact-checking steps rather than as a source of truth.
“It may make errors and not perform as expected,” the company wrote. “Do not rely on Copilot for important advice. Use Copilot at your own risk.”
Although the company is keen for businesses and employees to keep using Copilot at work, there is a clear shift of responsibility onto the user, shielding Microsoft from accusations of spreading false information.
Microsoft says “use Copilot at your own risk”
In a roundabout way, Microsoft is effectively admitting the risk of AI hallucination amid ongoing concerns about copyrighted content, the ambiguity of intellectual property, and the legitimacy of results.
With this in mind, the company clearly wants us to view Copilot as a tool, not a decision maker, and for users to be able to independently verify facts and be careful with all sensitive and protected data.
“You agree to indemnify and hold us harmless…from and against any claims, losses, and expenses…arising from or relating to your use of Copilot,” Microsoft added in another paragraph.
More generally, the company also notes that prompts and responses can be used to improve Copilot, but that professional versions include additional safeguards for sensitive information. In other words, users retain rights to their inputs, but Microsoft still reserves the right to use that data to improve the service.
However, while Microsoft’s efforts to place some responsibility on users have attracted attention, it is not the only company to use such terms. OpenAI, Google, and Anthropic all include similar notices in their terms, covering user responsibility and offering no guarantees of accuracy.
Shifting responsibility from AI provider to user is an ongoing trend as the industry works out where the legal risks lie. But since Microsoft continues to sell Copilot tools to business users and consumers alike, this is clearly an exercise in rephrasing terms rather than a wholesale change in behavior.