- cross-posted to:
- technology@lemmy.world
It’s a complete nonstarter for entire fucking industries. That’s not hyperbole. I work in one of them.
Edit: scratch that. If any infosec team, anywhere, in any industry, at any corporation or organization, doesn’t categorically refuse to certify for use any system running MS Recall, they should be summarily fired and blackballed from the industry. It’s that bad. For real: this is how secrets (as in, cryptographic secrets) get leaked. The exposure and liability inherent in this service is comical in the extreme. This may actually kill the product.
E2: as for the title’s implication that such trust can be earned: it kinda can’t. That’s the whole point of really good passwords and secrets (private keys, basically): nobody else knows them. Trying to dance around that is fundamentally futile. Also: who am I kidding, this shit will sell like hotcakes. Everyone’s on fucking Facebook, and look how horrifically they exploit everyone’s data for goddamn everything. To the average, mostly tech-illiterate consumer, this isn’t much worse than that.
Accounting details, sensitive sysadmin credentials, HIPAA data, PII, etc.: there’s just so much stuff that’s understood to be temporarily unlocked, viewed, and then immediately deleted or locked again. Even home users shouldn’t turn this thing on. Check your bank balance? Account details are now always available. Use a password manager? Whatever you looked at has likely been captured.
Using it may not be legal for videoconferencing in states and countries where recording without notification is illegal.
Also, legalities aside, if any application displays the laptop’s webcam feed onscreen, Recall effectively becomes something that logs a series of snapshots of that feed (and then OCRs any text the camera can see). I can see potential problems there.
Microsoft’s solution will be to remove the feature from Enterprise versions of Windows while keeping it around for the plebs using Pro and Home.
Their solution is to let users filter out websites in compatible browsers. This lets them blame the user for not marking sensitive websites as such. I don’t know whether native applications can also be filtered.
Of course, they also filter out precious DRM-protected content. You wouldn’t steal a series of JPEGs.
deleted by creator
For all the invasive problems this feature causes, what the fuck does it actually do? The ability to ask an AI what website you were on last Thursday? Who needs this garbage?
I have my search history for that. Useless “feature”.
The most evil company that ever existed needs it. So you will have it by default.
Listen, Microsoft is super evil, but I think most pharmaceutical companies have them beat.
Also, Nestlé says hi.
Who needs cancer drugs??? Muahahaha HAHAHAHAHA. HAHAHAHAAHAHAH
Union Carbide says hi
The concept is useful. A well-known early articulation of it is the famous “As We May Think” article by Vannevar Bush, all the way back in 1945, which conceptualized a machine, the “Memex,” that would augment human capabilities such as memory and recall. A lot of people need help with this and use devices for it daily: notes, map lookups of where you parked, find-my-device features, analytics for photo libraries, and so on.
The only issue here is the implementation.
deleted by creator
Anything that takes data off the computer is a no-fly zone.
It doesn’t transmit the data; it supposedly stores it locally. The issue is that it’s a huge, convenient plaintext trove of information if the system is compromised.
Anything that copies or persists data to a new location should also be a no-fly zone.
I’ll keep my off-site backups, thank you very much.
As long as it’s your choice, sure.
But springing unexpected copies on users and system architects is going to break a lot of data-security models and behaviors.
When I read this, I’m glad I ain’t using Windows anymore.
If it were turned off by default, it would be different, since people would be consciously choosing it. But having it turned on by default should be illegal.
As some people are saying, a lot of this isn’t gonna be legal in some countries.
Just for people who haven’t searched it yet:
During setup of your new Copilot+ PC, and for each new user, you’re informed about Recall and given the option to manage your Recall and snapshots preferences. If selected, Recall settings will open where you can stop saving snapshots, add filters, or further customize your experience before continuing to use Windows 11. If you continue with the default selections, saving snapshots will be turned on.
It sounds like YOU need to TURN IT OFF
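If you’d rather not rely on the setup screen, the snapshot saving can reportedly also be forced off by policy. Below is a minimal Python sketch that sets the widely reported DisableAIDataAnalysis policy value for the current user; the registry path and value name come from public reporting rather than anything in this thread, so treat them as assumptions and verify against current Microsoft documentation before depending on this.

```python
# Minimal sketch, assuming the publicly reported policy value still applies:
# set DisableAIDataAnalysis = 1 under the current user's policy hive to stop
# Recall from saving snapshots. Key path and value name are assumptions
# based on reporting at the time of writing and may change between builds.
import winreg

POLICY_KEY = r"Software\Policies\Microsoft\Windows\WindowsAI"  # reported path

def disable_recall_snapshots() -> None:
    # CreateKeyEx opens the policy key if it exists, or creates it otherwise.
    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, POLICY_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "DisableAIDataAnalysis", 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    disable_recall_snapshots()
    print("Policy value set; sign out and back in (or reboot) for it to apply.")
```

The same value can presumably be set under HKLM with admin rights to apply machine-wide, but again, check the current documentation rather than taking this sketch’s word for it.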
In their defense, my mom hasn’t earned that level of trust from me, either.
What this opens the door to is MICROSOFT being able to get your database and ask it questions as if it were talking to you: an AI agent of you that they can do whatever they like with. This is insanely dangerous.
Particularly since they’re requiring everyone to log in using credentials via their infrastructure.
They absolutely have a way in.
I’m super interested to see how companies handle this when employees work with confidential data all the time.
This is the best summary I could come up with:
This, as many users in infosec communities on social media immediately pointed out, sounds like a potential security nightmare.
- Copilot+ PCs are required to have a fast neural processing unit (NPU) so that processing can be performed locally rather than sending data to the cloud.
- Local snapshots are protected at rest by Windows’ disk-encryption technologies, which are generally on by default if you’ve signed into a Microsoft account.
- Neither Microsoft nor other users on the PC are supposed to be able to access any particular user’s Recall snapshots.
- Users can choose to exclude apps or (in most browsers) individual websites from Recall’s snapshots.
This all sounds good in theory, but some users are beginning to use Recall now that the Windows 11 24H2 update is available in preview form, and the actual implementation has serious problems.
Security researcher Kevin Beaumont, first in a thread on Mastodon and later in a more detailed blog post, has written about some of the potential implementation issues after enabling Recall on an unsupported system (which is currently the only way to try Recall since Copilot+ PCs that officially support the feature won’t ship until later this month).
The short version is this: In its current form, Recall takes screenshots and uses OCR to grab the information on your screen; it then writes the contents of windows, plus records of different user interactions, to a locally stored SQLite database to track your activity.
Data is stored on a per-app basis, presumably to make it easier for Microsoft’s app-exclusion feature to work.
The original article contains 710 words, the summary contains 260 words. Saved 63%. I’m a bot and I’m open source!
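To make the “plaintext trove” concern in the summary above concrete: anything running as the logged-in user could sweep an OCR-text database like the one described for secrets. The sketch below is purely illustrative; the database path, table name, and column names are hypothetical placeholders, not Recall’s actual schema.

```python
# Minimal sketch, assuming a local SQLite store of OCR'd screen text.
# DB_PATH and the table/column names in QUERY are HYPOTHETICAL placeholders,
# not Recall's real schema; the point is only that text captured from the
# screen becomes trivially searchable by any process running as the user.
import re
import sqlite3

DB_PATH = r"C:\hypothetical\recall_store.db"  # placeholder path
QUERY = "SELECT captured_at, window_title, ocr_text FROM snapshots"  # placeholder schema

# Crude patterns for things that should never sit in a searchable text store.
PATTERNS = {
    "private key": re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
    "AWS access key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan(db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    try:
        for captured_at, title, text in conn.execute(QUERY):
            for label, pattern in PATTERNS.items():
                if pattern.search(text or ""):
                    print(f"{captured_at}  [{label}]  window={title!r}")
    finally:
        conn.close()

if __name__ == "__main__":
    scan(DB_PATH)
```

Disk encryption at rest doesn’t help against this kind of access, since the volume is unlocked while the user is logged in.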