Microsoft is ushering in a new era of PCs – the Copilot+ era. It’s a novel category of device designed and built with AI at its core, and the key selling point of a Copilot+ PC lies in its remarkable Recall feature. However, I’m not fully on board with it just yet.
Recall is best described as a collection of small language models that run on the device at all times. These models track everything you do, from messages and emails to how you navigate Windows 11. And, as the name suggests, Copilot can recall that information whenever it's needed, using it as the foundation for how you interact with the PC.
In show-floor demos and idealized interactions with the PC, Copilot+ sounds like the AI "superpower" Microsoft has described it as. You can pull up a single sentence from a multipage document while typing an email later in the week, or recall a recipe you scrolled past but forgot to save. Microsoft claims Copilot+ PCs organize information the way humans do: through relationships and associations unique to each person's experience.
There's an obvious privacy concern with Copilot+, and Microsoft is already trying to get ahead of it by providing multiple settings to control how Recall scans and stores information. Even so, I'm not ready to fully embrace the world of Copilot+ PCs.
My hesitation comes down to the stark difference between how a PC is used in an idealized demo and how it's actually used. We all do plenty of things on our PCs that we'd rather not share widely, like searching for certain topics out of concern or curiosity, and feeding all of that into an AI invites trouble if Copilot+ misinterprets something or connects unrelated topics.

Especially in these early days of Copilot+ PCs, there are bound to be glitches and issues that neither users nor Microsoft anticipated.
In Microsoft's defense, the company recognizes the privacy issue and has built in several controls. Recall uses snapshots of your PC usage to provide context, and you can browse and delete those snapshots, adjust the range of time the models draw from, pause Recall from the System Tray, and filter specific apps or websites from being tracked.
Companies have been caught using customers' data without consent to train AI models before. Microsoft says it won't do so with Copilot+ PCs, but it remains unclear whether there are data requirements lurking underneath, or whether users truly have as much control as Microsoft suggests.
There are still a lot of details about Copilot+ PCs we don't know, and it will likely take some time before we fully understand how these devices work. Until then, I need more answers before I embrace what their AI can do.