Microsoft put the same disclaimer on Copilot that a psychic uses to avoid getting sued
TL;DR
- Microsoft’s Copilot terms of use explicitly state, “Copilot is for entertainment purposes only.”
- While other AI companies merely warn users to double-check AI output, Copilot’s disclaimer goes considerably further.
- Microsoft has been heavily promoting Copilot’s business uses despite the entertainment-only message.
For all the complaints people make about AI replacing human skills, there’s another side to it: The rise of AI has also forced humans to develop new skills, specifically the ability to sort useful AI output from incorrect, hallucinated garbage. Over the past couple of years, many of us have gotten pretty good at this, and have learned to make the most of the many limitations we encounter across so many AI agents. The companies behind these projects are just as aware of those limitations, but one of them seems to be overcompensating a bit in the legal department, as Copilot users have noticed some concerning language in Microsoft’s terms of service.
from Android Authority https://ift.tt/DHlMaAf