Microsoft AI Scientist Addresses Privacy Concerns Regarding Windows 11 Recall

Last updated: June 18, 2024

Speaking at the Stanford Institute for Human-Centered Artificial Intelligence, Jaime Teevan, Microsoft’s chief scientist, addressed the privacy concerns surrounding Recall, emphasising the feature’s local operation and data privacy protocols. Teevan said Recall runs entirely on devices powered by Snapdragon X processors, with no data stored in the cloud.

Recall, designed to enhance user productivity, works by recording a user’s on-screen activity and making it searchable. For instance, it can retrieve details from past interactions or projects, leveraging Microsoft’s AI capabilities without uploading data externally.

In a test at the Build 2024 developer conference, Windows Latest found that Recall captures screenshots every five seconds. Although Microsoft allows users to exclude specific apps and websites from Recall’s recordings, concerns persist that the feature could still capture sensitive information, including passwords, because it extracts text from the images it stores.

According to Kevin Beaumont, a cybersecurity researcher and former Microsoft employee, Windows 11 runs Azure AI models locally to perform optical character recognition (OCR) on the screenshots and stores the extracted text in a local SQLite database. Because that database could be read by other software running with the user’s permissions, it raises concerns about data security and unauthorised access.
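
To see why storing OCR output in a plain SQLite file worries researchers, the sketch below shows how any program running under the user’s own account could query such a database. The file path, table name (screenshots), and column name (captured_text) are hypothetical placeholders for illustration, not Recall’s actual schema.

```python
import sqlite3
from pathlib import Path

# Hypothetical location and schema. The real path, table, and column names
# used by Recall's store are not confirmed here; these are placeholders to
# illustrate the concern, not Recall's actual layout.
DB_PATH = Path.home() / "AppData" / "Local" / "Recall" / "ocr.db"


def search_ocr_text(db_path: Path, keyword: str) -> list[str]:
    """Return stored OCR text containing the keyword.

    Any program running with the same user's permissions could issue a
    query like this against an unencrypted SQLite file, which is the
    crux of the security concern raised by researchers.
    """
    conn = sqlite3.connect(str(db_path))
    try:
        rows = conn.execute(
            "SELECT captured_text FROM screenshots WHERE captured_text LIKE ?",
            (f"%{keyword}%",),
        ).fetchall()
    finally:
        conn.close()
    return [text for (text,) in rows]


if __name__ == "__main__":
    # Example: look for any captured text mentioning "password".
    for snippet in search_ocr_text(DB_PATH, "password"):
        print(snippet)
```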

Teevan reiterated Microsoft’s stance on data security, emphasising that Recall’s local storage model protects user privacy. However, questions remain about how robust those protections are, particularly when it comes to safeguarding sensitive information.

In conclusion, while Recall offers innovative capabilities for users, including local data handling and AI-driven productivity enhancements, its implementation raises valid privacy concerns. The discussion surrounding Recall underscores the ongoing debate about balancing technological advancements with data privacy safeguards.
