Microsoft's Recall AI tool is making an unwanted return




Security and privacy advocates are girding themselves for another uphill battle against Recall, the AI tool rolling out in Windows 11 that will screenshot, index, and store everything a user does every three seconds.

When Recall was first introduced in May 2024, security practitioners roundly criticized it for creating a gold mine for malicious insiders, criminals, or nation-state spies if they managed to gain even brief administrative access to a Windows device. Privacy advocates warned that Recall was ripe for abuse in intimate partner violence settings. They also noted that there was nothing stopping Recall from preserving sensitive disappearing content sent through privacy-protecting messengers such as Signal.

Total recall

Following months of backlash, Microsoft suspended the feature. On Thursday, the company said it is reintroducing Recall. It is currently available only to insiders with access to Windows 11 Build 26100.3902. Over time, the feature will be rolled out more broadly. Microsoft officials wrote:

Recall (preview)* saves you time by offering an entirely new way to securely search for things you've seen or done on your PC. With the capabilities of Copilot+ PCs, you can find any app, website, image, or document just by describing its content. To use Recall, you will need to opt in to saving snapshots, which are images of your activity, and enroll in Windows Hello to confirm your presence, so only you can access your snapshots. You are always in control of which snapshots are saved and can pause saving snapshots at any time. As you use your Copilot+ PC throughout the day, working on documents or presentations, taking video calls, and switching context across activities, Recall will take regular snapshots and help you find things faster and easier. When you need to find something you've done previously, open Recall and authenticate with Windows Hello. When you've found what you were looking for, you can reopen the application, website, or document, or use Click to Do to act on any image or text in the snapshot you found.

Microsoft is hoping that the concessions requiring opt-in and the ability to pause Recall will quell the collective revolt that broke out last year. They likely won't, for various reasons.

First, even if User A never opts in to Recall, they will have no control over the setting on the machines of Users B through Z. That means anything User A sends them will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on the other users' devices. That would indiscriminately sweep up all manner of User A's sensitive material, including photos, passwords, medical conditions, and videos and messages sent encrypted. As Privacy Guides writer Em noted on Mastodon:

This feature will unfortunately extract your information from whatever secure software you may have used and store it on this person's computer in a possibly less secure way.

Of course, this person could already take a screenshot of all of this anyway, but this feature automates that, even for a well-intentioned person who may not realize it's running, or who may wrongly assume it's secure enough.

This feature hasn't been fully released yet, but it could be soon.

The presence of an easily searchable database capturing a machine's every waking moment will also be a bounty for others with no interest in the device owner's well-being. That level of detailed archival material will undoubtedly be subject to subpoena by lawyers and governments. Threat actors who manage to install their spyware on a device will no longer have to scour it for the most sensitive data stored there. Instead, they will mine Recall just as they now mine the browser databases that store passwords.
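To see why an indexed store of screen activity is such an attractive target, consider a purely hypothetical sketch. Recall's actual on-disk format and schema are not modeled here; the table name, columns, and sample rows below are invented for illustration. The point is only that once OCR'd screen text lands in any queryable database, surfacing sensitive material is a one-line search for anyone who can read the file:

```python
import sqlite3

# Hypothetical schema -- NOT Recall's real format. It merely illustrates
# how OCR'd snapshots of screen activity might sit in an indexed store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE snapshots (ts TEXT, app TEXT, ocr_text TEXT)")
db.executemany(
    "INSERT INTO snapshots VALUES (?, ?, ?)",
    [
        ("2025-04-10T09:00:03", "Signal", "meet at the usual place"),
        ("2025-04-10T09:00:06", "Browser", "your one-time passcode is 491203"),
        ("2025-04-10T09:00:09", "Mail", "attached: lab results and diagnosis"),
    ],
)

# A single keyword query surfaces material the sender believed was ephemeral,
# including text that originally arrived through an end-to-end encrypted app.
hits = db.execute(
    "SELECT ts, app FROM snapshots WHERE ocr_text LIKE ?",
    ("%passcode%",),
).fetchall()
print(hits)  # [('2025-04-10T09:00:06', 'Browser')]
```

Spyware that lands on such a machine no longer needs to hunt through scattered files; like browser password stores, the database concentrates everything in one queryable place.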

Microsoft did not immediately respond to a message asking why it is reintroducing the feature less than a year after it received such a chilly reception. For critics, Recall is likely to remain one of the most glaring examples of enshittification, the recently coined term for the shoehorning of unwanted AI and other features into existing products when they offer minimal benefit to users.

This story originally appeared on Ars Technica.




