Microsoft recently updated its terms of service, effective at the end of September, to state clearly that its Copilot AI services should not be treated as a substitute for advice from actual human beings.
AI-based agents are emerging across industries. Chatbots are increasingly being used for customer service calls, health and wellness applications, and even legal advice. Nevertheless, Microsoft is once again reminding its customers that its chatbots' responses should not be taken as gospel. "AI services are not designed, intended, or to be used as substitutes for professional advice," the updated Service Agreement states.
The company singled out its health bots as an example. The bots "are not designed or intended to serve as substitutes for professional medical advice or for use in diagnosing, curing, alleviating, preventing, or treating diseases or other conditions," the new terms explain. "Microsoft is not accountable for any decision you make based on the information you receive from health bots."
The revised Service Agreement also spells out additional AI practices that are explicitly prohibited. Users, for example, may not use its AI services to extract data. "Unless explicitly permitted, you may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services," the agreement states. The company also forbids reverse-engineering attempts to uncover the models' weights, as well as using its data "to create, train, or improve (directly or indirectly) any other AI service."
“You may not use the AI services to uncover any underlying components of the models, algorithms, and systems,” the new terms read. “For example, you may not attempt to determine and remove the weights of models or extract any parts of the AI services from your device.”
Microsoft has long been outspoken about the risks of generative AI misuse. With these new terms of service, the company appears to be building legal cover for itself as its AI products become ubiquitous.