These have different meanings. Microsoft is legally entitled to refuse a request from law enforcement, and subject to criminal penalties if it refuses a valid legal order.
It does illustrate a significant vulnerability in that Microsoft has access to user keys by default. The public cannot be sure that Microsoft employees or criminals are unable to access those keys.
They said “legal order”, which includes a variety of things ranging from administrative subpoenas to judicial warrants. Generally they say warrant if that was used.
A “request” is “Hi Microsoft man, would you please bypass your process and give me customer data?” That doesn’t happen unless it’s for performative purposes. (Like when the FBI was crying about the San Bernardino shooter’s iPhone) Casual asks are problematic for police because it’s difficult to use that information in court.
The volume sounds fishy: the article states that Microsoft only gets about 20 such requests a year and is responsive to 9 or fewer of them. Apple seems to get more and is typically more responsive. (https://www.apple.com/legal/transparency/us.html)
The other weird thing is that the Microsoft spokesman named in the Forbes article is an external crisis communications consultant. Why use an external guy, firewalled from the business, for what is a normal business process?
That just makes me think that Windows is generally less secure and there are likely a larger number of instances where the AHJ doesn't have to request help from Microsoft to access the data.
               |     Apple     |  Microsoft |
---------------+---------------+------------|
Users with     | approximately | small but  |
data in cloud  | everyone      | growing    |
---------------+---------------+------------|
Access to      | denied via    | easily     |
data on device | cryptography  | available  |
               | by default    | by default |

This is a problem, because Microsoft operates in a lot of jurisdictions, but one of them always wants to be the exception and claims that it has jurisdiction over all the others. Not that I personally am of the opinion that it is wise for the other jurisdictions to trust Microsoft, but if MS wants to keep operating in those other jurisdictions, it needs to separate itself from that outsider.
I think you need to rethink your position.
Causality here actually works both ways, because in free(ish) societies, law enforcement derives its authority more from people's intersubjective belief in that authority, and less from actual use of force.
It's quite clear that if law enforcement officers really are no better than regular thugs, the failed state will soon materialize regardless of what people think about the issue.
Moreover, isn't the fastest way to a failed state to have people believe that their security agencies are good and proper when in reality they aren't? That kind of naivete is surely a lot worse than a bit of paranoia.
> Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order.
I suspect the FBI part was added editorially since this specific legal order came from the FBI.
Now CIA, on the other hand, ... well, they won't need to ask for the crypto keys anyway.
There is reasonable suspicion, some might argue evidence, that Microsoft voluntarily cooperated with the U.S. Intelligence Community without being compelled by a court order, the most famous instances being leaked in the Snowden disclosures.
To be fair to Microsoft, here's their updated statement (emphasis mine):
"Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order. “While key recovery offers convenience, it also carries a risk of unwanted access, so Microsoft believes customers are in the best position to decide... how to manage their keys,” said Microsoft spokesperson Charles Chamberlayne."
First, that they are capable of fulfilling the request at all means their approach to encryption is inherently flawed. Second, companies can very much push back on such requests, with many examples of that working, but they need to make the attempt.
They do seem to be reasonable in the case that brought about this reporting, with substantial evidence that the suspects committed fraud and that evidence is on the devices in question.
So why should customers entrust their data to the company? It’s a transactional relationship and the less you do the less reason someone has to pay you.
Further, our legal system is adversarial: it assumes someone is going to defend you. Without that, there's effectively zero protection for individuals.
You're making a lot of assumptions about how people use their computers, their understanding of their own devices, and the banality of building argumentation around what someone should have done or should not have done in the face of how reality works.
One of the privacy protections is simply that it's a lot of work to go through that process. The FBI wouldn't have the resources to do it to everyone it's merely curious about even if it had the authority, which it doesn't because warrants require probable cause.
I believe that it's generally acceptable that when law enforcement has probable cause for a search warrant, third parties grant them what access they reasonably can. I also believe people who actually want to protect their privacy and security should learn fundamentals like whoever has the key can unlock it and if nobody has the key, it's gone forever. If I was building a consumer product, I'd have to care quite a bit about the fact that many people won't do that, but I'm not so I don't.
I realize it's not a court order, but just want to add to the stack that there are examples of them being requested to provide something within the public's interest in a legal context (a FOIA lawsuit) where their counsel pushed back by saying no.
It could be a bigger obstacle for other agencies. CBP can hold a device carried by someone crossing the border without judicial oversight. ICE is in the midst of a hiring surge and from what I've read lately, has an abbreviated screening and training process likely not matching the rigor of the FBI. Local law enforcement agencies vary greatly.
I keep seeing mentions in the news of FBI agents resigning suddenly.
Having said that I won’t go back to Windows.
In light of fascism coming to Democratic cities, and anyone documenting it being a registered domestic terrorist... well, that's pretty f'n insecure by default.
If you are running any kind of service, you should learn how warrants work in the country you are hosting in. If your service grows, eventually you will have to comply with an order.
If you want anything else, you will have to design your system such that you can't even see the data, a la Telegram. And even then, you will get into pretty murky waters.
This is an odd thing to split hairs over, IMO. Warrants, subpoenas, or just asking nicely: whatever bar you want to set is a secondary concern. The main issue is that they can, and will, hand the keys over to LEOs at all.
If you don’t trust the institutions issuing those court orders, that is an entirely reasonable stance but it should be addressed at its root cause using our democratic process, however rapidly eroding that process may seem to be.
The fourth amendment protects against warrantless search and seizure, it is not carte blanche to fill up your hard drive with child porn and expect Microsoft to fall on their swords to protect you.
I was understanding and felt your points had validity until you threw out this gross, emotionally manipulative, horrible misrepresentation of my stance.
Do we really, really, fully understand the implications of allowing for private contracts that can trump criminal law?
Given the abilities of the median MS client, the better choice is not obvious at all, while "protecting from a nation-state adversary" was definitely not one of the goals.
I could understand if the default is an online account + automatic key upload, but only if you add an opt-out option to it. It might not even be visible by default, like, idk, hide it somewhere so that you can be sure that the median MS user won't see it and won't think about it. But just fully refusing to allow your users to decide against uploading the encryption key to your servers is evil, straight up.
Until recently, normal people who got arrested and had their computer seized were 100% guaranteed that the cops could read their hard drive, and society didn't fall apart. Today, the chances the cops can figure out how to read a given hard drive are probably a bit less. If someone needs better security against the actual government (and I'm hoping that person is a super cool brave journalist and not a terrorist), they should be handling their own encryption at the application layer and keeping their keys safe on their own, and probably using Linux.
I really think that enabling BitLocker with an escrowed key during OOBE is the right choice, the protection to risk balance for a “normal” user is good. Power users who are worried about government compulsion can still set up their system to be more hardened.
Yes, you can opt out of it while manually activating BitLocker, but I find it infuriating that there's no such choice at system installation. It's stupid that after installing the system, a user is supposed to re-encrypt their system drive if they don't want this.
If they honestly informed customers about the tradeoff between security and convenience they'd certainly have far fewer customers. Instead they lead people to believe that they can get that convenience for free.
The obvious better choice is transparency.
What? Most people, thinking through the tradeoff, would 100% not choose to be in charge of safeguarding their own key, because they're more worried about losing everything on their PC, than they are about going to jail. Because most people aren't planning on doing crime. Yes, I know people can be wrongly accused and stuff, but overall most people aren't thinking of that as their main worry.
If you tell people, "I'll take care of safeguarding your key for you," it sounds like you're just doing them a favor.
It would be more honest to say, "I can hold on to a copy of your key and automatically unlock your data when we think you need it opened," but that would make it too obvious that they might do so without your permission.
Trust is a fundamental aspect of how the world works. It's a feature, not a bug.
Consider that e.g. your car mechanic, or domestic service (if you employ it), or housekeeping in a hotel you stay at, all have unsupervised access to some or all of your critical information and hardware. Yet these people are not seen as threat actors by most people, because we trust them not to abuse that access, and we know there are factors at play to ensure that trust.
In this context, I see Microsoft as belonging to the cohort above for most people. Both MS and your house cleaner will turn over your things to police should they come knocking, but otherwise you can trust them to not snoop through your stuff with malicious intent. And if you don't trust them enough - don't buy their services.
Protecting from specifically the nation state that hosts and regulates Microsoft and its biggest clients, probably not.
This story is just yet another confirmation of what used to be the "the americans have bugged most computers in the world" conspiracy theory.
I hope Microsoft wakes up to the changes in the way America is being viewed these days, because they stand to lose a lot of business if they don't.
It's a nightmare actually.
And AFAICT, they do ask, even if the flow is clearly designed to get the user to back up their keys online.
yes, it would be. So, the current way, 99% of people are benefitting from knowing their data is secure when very common thefts occur, and 1% of people have the same outcome as if their disk was unencrypted: When they're arrested and their computers seized, the cops have their crime secrets. What's wrong?
Of course this feature comes at the cost of no longer being able to have low level control over your device, but this isn't a binary choice.
Yes, phones just try to back up all of your data online.
>By giving the user the three options with consequences you empower the user to address their threat model how they see fit.
Making it too easy for uneducated users to make poor choices is terrible software design.
That defies the definition of "forced". Forced means no option. You can disagree all you want -- but at a technical level, you're incorrect.
Some even go that far that they push an update that exfiltrates data from a device (and some even do on their own initiative).
And even if you are not legally compelled. Money or influence can go a long way. For example, the fact that HTTPS communications were decipherable by the NSA for almost 20 years, or, whoops, no contract with DoD ("not safe enough"...)
Once the data is in the hands of the intelligence services, from a procedure perspective they can choose what to do next (e.g. to officialize this data collection through physical collection of the device, or do nothing and try to find a more juicy target).
It's not in the interest of anyone to prevent such collection agreement with governments. It's just Prism v2.
So it seems normal that Microsoft gives up the keys, the same way that Cloudflare may give information about you and others. They don't want to have their lives ruined for you.
Perhaps in this case they should be required to get a warrant rather than a subpoena?
The default behavior will never ever be to "encrypt the disk by a key and encrypt the key with the user's password." It just doesn't work in real life. You'll have thousands of users who lost access to their disks every week.
Inform, and Empower with real choices. Make it easy for end users to select an alternate key backup method. Some potential alternatives: Allow their bank to offer such a service. Allow friends and family to self host such a service. Etc.
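One way the "friends and family" option above could work is splitting the recovery key into shares held by different people. A minimal sketch of an n-of-n XOR split, where all shares are needed to reconstruct the key and any subset reveals nothing (illustrative only; a real deployment would likely use a threshold scheme such as Shamir's, so that losing a single share isn't fatal):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """n-of-n split: XORing all n shares restores the key."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:               # last share = key XOR all random shares
        last = xor_bytes(last, s)
    return shares + [last]

def recover_key(shares: list[bytes]) -> bytes:
    key = shares[0]
    for s in shares[1:]:
        key = xor_bytes(key, s)
    return key

recovery_key = secrets.token_bytes(32)   # stand-in for a disk recovery key
shares = split_key(recovery_key, 3)      # e.g. bank, friend, family member
assert recover_key(shares) == recovery_key
```

Each share is uniformly random on its own, so no single holder (or any n-1 of them) learns anything about the key.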
Basically, we need better education about the issue, but as this is the case with almost every contentious issue in the world right now, I can't imagine this particular issue will bubble to the top of the awareness heap.
I suppose this all falls apart when the PC unlock password is your MS account password, since the MS account can reset the local password. On macOS / Linux, if you reset the login password, you lose the keychain.
If you mean the secure boot auto-unlock type of setup and you don't have a key backup, then you cannot reset your login password at all. You have to wipe the drive.
Password managers shift the paradigm and the risk factors. In terms of MFA, a password in your manager is now "something you have" rather than "something you know". The only password I know nowadays is my sign-in password that unlocks the password manager's vault. So the passwords to my bank, my health care, my video games are no longer "in my fingers" or in my head anymore, they're unknown to me!
So vault management becomes the issue rather than password management. If passwords are now "something you have" then it becomes possible to lose them. For example, if my home burns down and I show up in a public library with nothing but the clothes on my back, how do I sign into my online accounts? If the passwords were in my fingers, I could do this. But if they require my smartphone to be operational and charged and having network access, and also require passwords I don't know anymore, I'm really screwed at that library. It'd be nearly impossible for me to sign back in.
So in the days of MFA and password managers, now we need to manage the vaults, whether they're in the cloud or in local storage, and we also need to print out recovery codes on paper and store them securely somewhere physical that we can access them after a catastrophe. This is an increase in complexity.
So I contend that password managers, and their cousins the nearly-ubiquitous passkeys, are the main driving factor in people forgetting their passwords and forgetting how to sign in without relying on an app to do it for them. And that is a decrease in opsec for consumers.
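The paper recovery codes mentioned above are cheap to generate. A minimal sketch of the usual pattern (the format and count here are illustrative; a real service would store only hashes of these codes server-side):

```python
import secrets

def make_recovery_codes(count: int = 10) -> list[str]:
    """Generate one-time recovery codes to print and store offline."""
    codes = []
    for _ in range(count):
        raw = secrets.token_hex(5)          # 40 bits of entropy per code
        codes.append(f"{raw[:5]}-{raw[5:]}")  # grouped for readability
    return codes

codes = make_recovery_codes()
print("\n".join(codes))  # print these; keep the paper copy somewhere safe
```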
(Separately, if you can get access to a computer I'm sure you can get access to a phone charger.)
I know the police can just break down my door, but that doesn't mean I should be ok with some random asshole having my keys.
This is being reported on because it seems newsworthy and a departure from the norm.
Apple also categorically says they refuse such requests.
It's a private device. With private data. Device and data owned by the owner.
Using sleight of hand and words to coax a password into a shared cloud and beyond just seems to indicate the cloud is someone else's computer, and you are putting the keys to your world and your data insecurely in someone else's computer.
Should windows users assume their computer is now a hostile and hacked device, or one that can be easily hacked and backdoored without their knowledge to their data?
Should Apple find itself with a comparable decryption key in its possession, it would have few options but to comply and hand it over.
This is a misrepresentation of what actually happened: the FBI even argued that they would accept a tool locked to the specific device in question so as to alleviate this concern.
This is still forced labor/creative work/engineering work/speech and not okay, but it was not a "master key."
It is entirely possible that Apple's Advanced Data Protection feature gets removed legally in the US as well, if the regime decides to target it. I suspect one of two reasons why they have not: either the US has an additional agreement with Apple behind the scenes somewhere, or the US regime has not yet felt this was important enough to go after.
There is precedent in the removal, Apple has shown they'll do the removal if asked/forced. What makes you think they wouldn't do the same thing in the US if Trump threatened to ban iPhone shipments from China until Apple complied?
The options for people to manage this stuff themselves are extremely painful for the average user for many reasons laid out in this thread. But the same goes for things like PGP keys. Managing PGP keys, uploading to key servers, using specialized mail clients, plugging in and unplugging the physical key, managing key rotation, key escrow, and key revocation. And understanding the deep logic behind it actually requires a person with technical expertise in this particular solution to guide people. It's far beyond what the average end user is ever going to do.
We live in far different times these days. I have no doubt in my mind that Apple is complying 100% with every LE request coming their way (not only because of the above gesture, but because it's actually the law)
American presidents are not dictators. The system has checks and balances and the courts decide. It doesn’t matter who the president is.
By simply not having the ability to do so.
Of course Microsoft should comply with the law, expecting anything else is ridiculous. But they themselves made sure that they had the ability to produce the requested information.
I'm honestly not entirely convinced that disk encryption should be enabled by default. How much of a problem were stolen personal laptops, really? Corporate machines, sure, but leave the master key with the IT department.
...it's not that at all. We don't want private contracts to enshrine the same imbalances of power; we want those imbalances rendered irrelevant.
We hope against hope that people who have strength, money, reputation, legal teams, etc., will be as steadfast in asserting basic rights as people who have none of those things.
We don't regard the FBI as a legitimate institution of the rule of law, but a criminal enterprise and decades-long experiment in concentration of power. The constitution does not suppose an FBI, but it does suppose that 'no warrant shall issue but upon probable cause... particularly describing the place to be searched, and the persons or things to be seized' (emphasis mine). Obviously a search of the complete digital footprint and history of a person is not 'particular' in any plain meaning of that word.
...and we just don't regard the state as having an important function in the internet age. So all of its whining and tantrums and pepper spray and prison cells are just childish clinging to a power structure that is no longer desirable.
Without doubt, this analogy surely breaks down as society changes to become more digital - what about a Google Glass type of device that records my entire life, or the glasses of all people detected around me? what about the device where I uploaded my conscience, can law enforcement simply probe around my mind and find direct evidence of my guilt? Any written constitution is just a snapshot of a social contract at a particular historical time and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights - the contract is renegotiated constantly through political means.
My question was more general: how could we draft that new social contract to the current age, how could we maintain the balance where the encrypted device of a suspected child predator and murderer is left encrypted, despite the fact that some 3rd party has the key, because we agreed that is the correct way to balance freedoms and law enforcement? It just doesn't sound stable in a democracy, where the rules of that social contract can change, it would contradict the moral intuitions of the vast majority.
But it isn't a warrant, it's a subpoena. Also, the locksmith isn't the one compelled to open it; if the government wants someone to do that they have to pay them.
> Any written constitution is just a snapshot of a social contract at a particular historical time and technological development point, so it cannot serve as the ultimate source of truth regarding individual rights - the contract is renegotiated constantly through political means.
The Fourth Amendment was enacted in 1791. A process to change it exists, implying that the people could change it if they wanted to, but sometimes they get it pretty right to begin with. And then who are these asshats craving access to everyone's "papers and effects" without a warrant?
The second, very clear, argument is that the state can't be trusted in the long run. Period. Maybe you love your elected officials today but tomorrow they could be actively out to harm you. Every tool we allow the state to use needs to be viewed with this level of extreme skepticism and even very clear benefits need to be debated vigorously.
Encryption, and technologies like it, may allow hiding criminal activity but they also provide people a sense of security to think freely and stave off political power grabs. We recognize the fundamental right to free speech and give great latitude to it even when it is harmful and hateful, we need to recognize the fundamental right to free thought and recognize that encryption and similar tools are critical to it.
With Intel Panther Lake (I'm not sure about AMD), Bitlocker will be entirely hardware-accelerated using dedicated SoC engines – which is a huge improvement and addresses many commonly known Full Disk Encryption vulnerabilities. However, in my opinion some changes still need to be made, particularly for machines without hardware acceleration support:
- Let users opt out of storing recovery keys online during setup.
- Let users choose between TPM or password based FDE during setup and let them switch between those options without forcing them to deal with group policies and the CLI.
- Change the KDF to a memory-hard KDF - this is important for both password and PIN protected FDE. It's 2026 - we shouldn't be spamming SHA256 anymore.
- Remove the 20 char limit from PIN protectors and make them alphanumerical by default. Windows 11 requires TPM 2.0 anyway so there's no point in enforcing a 20 char limit.
- Enable TPM parameter encryption for the same reasons outlined above.
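On the memory-hard KDF point above: Python's standard library happens to expose both plain iterated SHA-256 (PBKDF2) and scrypt, a memory-hard KDF, which makes the contrast easy to sketch. Parameter choices here are illustrative, not a tuned recommendation:

```python
import hashlib
import secrets

password = b"correct horse battery staple"
salt = secrets.token_bytes(16)

# PBKDF2-HMAC-SHA256: compute-only cost, cheap to parallelize on GPUs/ASICs.
weak = hashlib.pbkdf2_hmac("sha256", password, salt, iterations=600_000)

# scrypt: each guess also costs roughly 128 * n * r bytes of RAM
# (64 MiB here), which throttles massively parallel cracking hardware.
strong = hashlib.scrypt(password, salt=salt, n=2**16, r=8, p=1,
                        maxmem=2**27, dklen=32)

print(len(weak), len(strong))  # both derive a 32-byte key
```

The memory requirement, not the iteration count, is what makes the second derivation resistant to custom hardware.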
Apple asks you when you set up your Mac if you want to do this. You can just ask the user, Microsoft!
If that’s what you’re worried about, you shouldn’t be using computers at all. I can pretty much guarantee that Linux will adopt SoC based hardware acceleration because the benefits – both in performance and security – outweigh the theoretical risks.
Bryan Cantrill is trying to end this nonsense, but we shall see if they end up being the lone voice or not.
And if it's not there, a patch is pretty easy to write.
It's not like there's no source code ;)
Is how bitlocker works not well known perhaps? I don't think it's a secret. The whole schtick is that you get to manage windows computers in a corporate fleet remotely, that includes being able to lock-out or unlock volumes. The only other way to do that would be for the person using the device to store the keys somewhere locally, but the whole point is you don't trust the people using the computers, they're employees. If they get fired, or if they lose the laptop, them being the only people who can unlock the bitlocker volume is a very bad situation. Even that aside, the logistics of people switching laptops, help desk getting a laptop and needing to access the volume and similar scenarios have to be addressed. Nothing about this and how bitlocker works is new.
Even in the safer political climates of pre-2025, you're still looking at prosecution if you resist a lawful order. You can fight gag orders, or the legality of a request, but without a court order countermanding the feds' request, you have to comply.
Microsoft would do the same in China, Europe, the Middle East, etc. The FBI isn't special.
One would presume US agencies have leverage to access global data.
These sorts of things should be very unsurprising to the people who depend on them...
Based on the sheer number of third parties we're required to use for our day to day lives, that is ridiculous and Third Party Doctrine should be eliminated.
Is it the case with BitLocker? The voluntary part.
When someone is arrested, the police can get a subpoena to enter your house, right?
There they can collect evidence regarding the case.
Digital protections should exist, but should they exist beyond what is available in the physical world? If so, why?
I think the wording of this is far too lenient, and I understand the controversy of "if asked" vs "valid legal order", neither of which strictly says "subpoena", and of course the controversy of how laws are interpreted or ignored in one country in particular (yes, I'm looking at you, USA).
Should there be a middle ground? Or should we always consider anything that is digital off-limits?
Crazier question: what’s wrong with a well-intentioned surveillance state? Preventing crime is a noble goal, and sometimes I just don’t think some vague notion of privacy is more important than that.
I sometimes feel that the tech community would find the above opinion far more outlandish than the general population would.
https://en.wikipedia.org/wiki/Wings_of_Desire
tl;dw: A well-intentioned surveillance state may, in fact, love the beings they are surveilling. They may fall in love so deeply, that they want to become like us. I know it's a revolutionary concept.
Article and facts are “…if served with a valid legal order compelling it”
∴ Headline is clickbait.
I’d much rather they require a warrant than just give it to any enforcement agency that sends them an email asking. The former is what I expect.
The default setting is a good mix of protecting people from the trouble they’re far more likely to run into (someone steals their laptop) while still allowing them back in if they forget their password. The previous default setting was no encryption at all which is worse in every case.
The way it is is important; otherwise getting locked out is very easy. I think booting into safe mode, or messing with specific BIOS settings or certain BIOS updates, is enough to lock you out.
Either way, once the Windows OS volume is unlocked, it's all moot. There are many other ways to access one's machine remotely, such as pushing a targeted update to the specific machine; that's OS-agnostic, but easiest on Windows, as Windows Update fires off all the time despite patches nominally landing on a specific Tuesday. This method applies to phones as well, beyond the JTAG encryption bypass at power-up. Then a gag order is applied.
[1] - https://jetico.com/data-encryption/encrypt-hard-drives-bestc...
There were questions about their motivation at the time. There still are questions.
https://ubuntu.com/download/desktop
https://www.kali.org/get-kali/#kali-platforms
Every bad day for microsoft is yet another glorious day for linux.
Nah. If that were the case, Linux would dominate personal computer statistics. The reality is that most mainstream users just don't care. But, of course, that won't stop us.
Across the generations, there are always a few groups for whom cryptographic ownership really matters, such as journalists, protesters, and so on. Here on HN I feel like we tend to over-generalize these use cases to everybody, and then we are surprised when most people don't actually care.
http://slackware.osuosl.org/slackware64-current/ChangeLog.tx...
That is also exactly why people like myself are so against passkeys: there is no offline recovery.
Who holds/controls the keys on both ends?
There's a "data encryption key", encrypted with a key derived from your username + master password, and that data encryption key is used locally to decrypt the items of your vault. Even if everything is stored remotely, unless the provider got your raw master password (usually, a hash of that is used as the "password" for authentication), your information is totally safe.
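A hedged sketch of that split between authentication and decryption, using only stdlib primitives. Real password managers use AES key-wrapping and tuned KDF parameters; the XOR wrap and all names here are purely illustrative:

```python
import hashlib
import secrets

def derive_master_key(username: str, master_password: str) -> bytes:
    # KDF over username + password; the raw password never leaves the client.
    return hashlib.pbkdf2_hmac("sha256", master_password.encode(),
                               username.encode(), iterations=600_000)

def auth_hash(master_key: bytes, master_password: str) -> bytes:
    # A *further* derivation is what gets sent to the server as the login
    # "password", so the server can verify you without learning the master key.
    return hashlib.pbkdf2_hmac("sha256", master_key,
                               master_password.encode(), iterations=1)

master_key = derive_master_key("alice@example.com", "hunter2horse")
login_token = auth_hash(master_key, "hunter2horse")

# The data encryption key (DEK) is random; the server stores it only in
# "wrapped" form (XOR stands in for AES key-wrap in this sketch).
dek = secrets.token_bytes(32)
wrapped_dek = bytes(a ^ b for a, b in zip(dek, master_key))

# Client-side unwrap: only someone holding the master key recovers the DEK.
unwrapped = bytes(a ^ b for a, b in zip(wrapped_dek, master_key))
assert unwrapped == dek
```

The provider sees `login_token` and `wrapped_dek`, neither of which yields the vault contents without the master password.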
A whole other topic is communications, but we're talking decryption keys here
I'm all for criticizing tech companies but it's pointless to demand the impossible.
Besides, BitLocker keys are really quite hard to lose.
Pretty surprising they'd back up the disk encryption secrets to the cloud at all, IMHO, let alone that they'd back it up in plaintext.
"Tough luck, should have made a backup" is higher responsibility than securing anything in meatspace, including your passport or government ID. In the real world, there is always a recovery path. Security aficionados pushing non-recoverable traps on people are plain disconnected from reality.
Microsoft has the right approach here with Bitlocker defaults. It's not merely about UX - it's about not setting up traps and footguns that could easily cause harm to people.
Eventually they yielded on this, but their later updates had other usability traps. Because Google Auth was the household name for TOTP apps, this may have ruined TOTP's reputation early on.
Yes you should do the former. That doesn't say much about the latter.
Or maybe I missed something, and there is actually a way to download your phone backup from Google, or PC backup from Microsoft, as actual files you can browse, without having to have a sacrificial device to wipe and restore from backup?
That's the problem right there. Migrating my phone recently (without having broken/bricked the previous one, which is somehow even worse wrt. transferring 2FA these days than getting new phone after old one breaks!), I discovered that most sites I used did not allow more than one authenticator app. If I try to add new phone as second-factor auth method, the website deletes the entry for the old phone.
Do you feel equally strongly about people using drives that can fail? Is selling a computer without redundant drives also borderline malicious?
> In the real world, there is always a recovery path.
To accounts there is. But data gets lost all the time.
No. Drives wear out and fail, like all hardware. Much like the compressor in your fridge, or the V-belt in your car, you can extend the service life of your drive through proper care, and replace it when it fails to keep the system running. And in practice, hard drives are reliable enough that, with typical usage patterns, most people don't need RAID.
And, much like with fridges and cars, computers and their parts are subject to both market forces and (in more civilized places) consumer protection laws, which ensure computer hardware meets the usual, reasonable expectations of the common person.
> To accounts there is. But data gets lost all the time.
Data loss still happens, which kind of proves my point - computers are hard, and normal people can't even be expected to back things up properly. That's why every commercial PC and mobile OS vendor these days is pushing automated off-site backups using their cloud offerings. Might not be ideal, and even might be a tad anti-competitive, but it's a good deal for 99% of the users.
But this brings me back to my other pet peeve: 2FA, via authenticator apps, passkeys, and other such things that tie your credentials to a device via magic crypto keys. These crypto keys are data, and given how tech companies get away with having no actual customer support, 2FA ends up turning data loss into account access loss.
Mandatory 2FA is a trap, a time ticking bomb, because it's way too easy to make a mistake and lose the keys - and if the backend follows the current High Security Standards, this is irreversible even from the vendor side.
Compare that to expectations people have about the real world - if you lose all your keys to your home or your car, you... just go to a locksmith and show some plausible proof of ownership, and they'll legally break in and replace the locks for you. If you can't produce a plausible proof of ownership, you involve police in the process. And so on. There's always a recovery path.
To be fair, if you inadvertently get locked out of your Google account "tough luck, should have used a different provider" and Gmail is a household name so ...
Less snarky, I think that there's absolutely nothing wrong with key escrow (either as a recovery avenue or otherwise) so long as it's opt in and the tradeoffs are made abundantly clear up front. Unfortunately that doesn't seem to be the route MS went.
I am sad that this now appears unlikely. I suspect it may even be lower for people in their 20s today than a decade ago.
One of these things is not like the other...
That's why I'm stressing the comparison to e.g. government documents: nothing in meatspace requires regular people to show anywhere near as much conscientiousness as handling encryption keys.
Or: many people probably know, in the abstract, that "encrypted means gone if you lose the key", much like many people know slipping up while working on a HV line will kill you. Doesn't mean we should require everyone to play with them.
Apple manages a recovery path for users without storing the key in plain text. Must have something to do with those "security aficionados."
Linux can be fairly well-secured against state-level threat actors, but honestly, if your adversary is your own nation-state, then no amount of security is going to protect you!
For Microsoft and the other consumer-OS vendors, it is typically a bad user-experience for any user, particularly a paying subscriber, to lose access to their account and their cloud apps. There are many ways to try and cajole the naïve user into storing their recovery key somewhere safe, but the best way is to just do it for them.
A recovery key stored in the user's own cloud account is going to be secure from the typical threats that consumers will face. I, for one, am thankful that there is peace of mind both from the on-device encryption, as well as the straightforward disaster recovery methods.
But OneDrive is essentially a mass-surveillance tool. It's a way to load the contents of every single person's computer into Palantir or similar tools and, say, for instance, "give me a list of everyone who harbors anti-ICE sentiments."
By the way my windows computer nags me incessantly about "setting up backups" with no obvious way to turn off the nags, only a "remind me later" button. I assume at some point the option to not have backups will go away.
What is just as crazy as cloud storage is how you "go paperless" with all your service providers, such as health care, utility bills, banks, etc. They don't print a paper statement and send it to your snail mail box anymore. They produce a PDF and store it in their cloud storage, and then you need to go get it when you want/need it.
The typical consumer may never go get their paperwork from the provider's cloud. It is as if they said "Hey this document's in our warehouse! You need to drive across town, prove your identity, and look at it while you're here! ...You may not be permitted to take it with you, either!"
So I've been rather diligent and proactive about going to get my "paperless documents" from the various providers, and storing them in my own cloud storage, because, well, at least it's somewhere I can access it. I care a lot more about paying my medical bills, and accounting for my annual taxes, than someone noticing that I harbor anti-jew sentiment. I mean, I think they already figured that part out.
There are plenty of people that post clear positions on multiple social networks. I personally doubt that One-drive files will provide much more information for most of the people compared to what's already out there (including mobile phone location, credit card transactions, streaming services logs, etc.).
Where I think the danger lies is individual abuse. Someone "in power" wants one guy to have issues, they could check his OneDrive for something.
Best is to make people aware of how it works and let them figure it out. There are so many options (local only, encrypted cloud storage, etc.) I doubt there is an ideal solution for everything.
...in which case having a cloud backup of the full disk encryption key is pointless, because you don't have access to the disk any more.
Full-disk encryption is the opposite of pointless, my dude! The notebook-thief cannot access my data! That is the entire point!
No, I cannot recover the data from an HDD or SSD that I don't possess. But neither can the thief. The thief cannot access the keys in my cloud. Isn't that the point?
If a thief steals a notebook that isn't encrypted at all, then they can go into the storage, even forensically, and extract all my data! Nobody needs a "key" or credentials to do that! That was the status quo for decades in personal computing--and even enterprise computing. I've had "friends" give me "decommissioned" computers that still had data on their HDD from some corporation. And it would've been readable if I had tried.
The thief may have stolen a valuable piece of kit, but now all she has is hardware. Not my data. Not to mention, if your key was in a cloud backup, isn't most of your important data in the cloud, as well? Hopefully the only thing you lost with your device are the OS system files, and your documents are safely synced??
This isn't that simple.
But I guess it's not done more because the free data can't be analyzed and sold.
Given that the US government is happy to execute US citizens and invade other countries, that basically means everyone.
This kind of blurring of fact is what drives clickbait.
The origin of this is a Forbes article[0] where the quote is: "Microsoft confirmed to Forbes that it does provide BitLocker recovery keys if it receives a valid legal order."
[0] https://www.forbes.com/sites/thomasbrewster/2026/01/22/micro...
The following information may be available from iCloud if a user has enabled Advanced Data Protection for iCloud:
https://www.apple.com/legal/privacy/law-enforcement-guidelin...
Do you think Tim Cook gave that gold bar to Trump for nothing?
Don't know if the problem is on my end, but your link goes to a 20-page document. If this is not a mistake, you should quote the actual section and text you are referring to.
> Apple does not receive or retain encryption keys for customer’s end-to-end encrypted data. Advanced Data Protection uses end-to-end encryption, and Apple cannot decrypt certain iCloud content, including Photos, iCloud Drive, Backup, Notes, and Safari Bookmarks
Not in the US - THANKS for this hint: I googled it! Wow!!! They both do bribery (offering & accepting) in front of the recording camera in a government building!!
Really "impressive" :-X
For example, it is new in Tahoe that they store your filevault encryption key in your icloud keychain without telling you.
https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...
iCloud is much more secure than most people realize because most people don’t take the 30 minutes to learn how it is architected.
You can (and should) watch https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for all the details about how iCloud is protected, but especially the time-linked section. :)
That said, they could also roll out a small patch to a specific device to extract the keys. When you really want to be safe (and since you can be a called a 'left extremist' for moving your car out of the way, that now includes a lot of people), probably use Linux with LUKS.
Apple provides an optional encryption level (ADP) where they don't have a copy of your encryption key.
When Apple doesn't have the encryption key, they can't decrypt your data, so they can't provide a copy of the decrypted data in response to a warrant.
They explain the trade off during device setup: If Apple doesn't have a copy of the key, they can't help you if you should lose your copy of the key.
That's a Microsoft thing.
If you use a Local Account (which requires bypassing the OOBE internet check during setup) or explicitly disable key backup, the key never leaves the TPM. The issue isn't the encryption algorithm, it's the convenience defaults.
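For anyone who wants to verify what BitLocker actually did on their own machine, the built-in `manage-bde` tool lists the key protectors for a volume. The drive letter and the GUID placeholder below are illustrative; run from an elevated prompt:

```shell
# List BitLocker key protectors for the system volume (run as admin).
# A "Numerical Password" entry is the recovery key; depending on the
# Windows version, the output also notes whether it was backed up.
manage-bde -protectors -get C:

# If you decide a specific recovery protector shouldn't exist, it can
# be removed by ID (the GUID comes from the listing above - placeholder here):
# manage-bde -protectors -delete C: -id "{PROTECTOR-GUID}"
```

Whether the key was ever uploaded to a Microsoft account is a separate question from whether a recovery protector exists locally, so check your account's recovery-key page as well.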
People also forget how they kind of always played ball in similar governments.
Lockdown mode: https://support.apple.com/en-us/105120
Advanced Data Protection for iCloud: https://support.apple.com/en-us/108756
Besides, they fully comply with Chinese requirements, so...
PS. Others report Filevault keys are also being backed to iCloud since September and they didn't tell anyone: https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...
And if you don't want iCloud Keychain, you are still given the choice to encrypt and print the backup key.
Unless Apple is straight up lying about their technology and encryption methods used to secure iCloud and their hardware, the issue of a public standoff is moot, because Apple couldn't help them if they wanted to. And while perhaps it's possible that Apple would lie to consumers to please US law enforcement, it's a bit of a stretch to say that because there haven't been any high-profile cases where law enforcement tries to force Apple to give up what they don't have, that this must be evidence that they're in cahoots.
Which, to be clear, is perfectly possible. Apple has denied the existence of a deliberately backdoored system at least once before: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...
Apple has since confirmed in a statement provided to Ars that the US federal government “prohibited” the company “from sharing any information,” but now that Wyden has outed the feds, Apple has updated its transparency reporting and will “detail these kinds of requests” in a separate section on push notifications in its next report.
Who knows what else they're hiding, if we only found out about this scheme in 2023.

How is this any different?
If you encrypt your drive and upload the key to Microsoft, you are engaging in anti-competitive behavior since you give them access to your data, but not also to the local thief.
Just don't encrypt your drive if you cant be bothered to secure your key. Encryption-neutrality.
Just because the article is click bait doesn't mean the HN entry needs to be, too.
Sure, the fact that MS has your keys at all is no less problematic for it, but the article clearly explains that MS will do this if legally ordered to do so. Not "when the FBI asks for it".
Which is how things work: when the courts order you to do something, you either do that thing, or you are yourself violating the law.
sixcolors.com/post/2025/09/filevault-on-macos-tahoe-no-longer-uses-icloud-to-store-its-recovery-key/
Probably not if one is not using Apple cloud on their laptops.
> stored in your keychain (without telliing you!)
How to verify that? Any commands/tools/guides?
There's a saying that goes "not your keys not your crypto" but this really extends to everything. If you don't control the keys something else does behind the scenes. A six digit PIN you use to unlock your phone or messaging app doesn't have enough entropy to be secure, even to derive a key-encryption-key.
If you pass a four-digit PIN through a KDF with a hardness of ~5 seconds per guess to derive a key, then you can brute-force all 10,000 possible PINs in ~14 hours. After ~7 hours you would have a 50% chance of having guessed correctly. A six-digit PIN would take significantly longer, but most software uses a hardness nowhere near 5 seconds.
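To put numbers on that, here's a back-of-the-envelope calculation (plain arithmetic, no real KDF; the 5-second hardness figure is the one assumed above, and a single serial attacker is assumed):

```python
def brute_force_hours(pin_digits: int, kdf_seconds_per_guess: float) -> float:
    """Worst-case hours to try every PIN of the given length,
    assuming the attacker pays the full KDF cost for each guess."""
    keyspace = 10 ** pin_digits
    return keyspace * kdf_seconds_per_guess / 3600

for digits in (4, 6):
    worst = brute_force_hours(digits, 5.0)
    print(f"{digits}-digit PIN: ~{worst:.0f} h worst case, "
          f"~{worst / 2:.0f} h on average")
```

At 5 seconds per guess, four digits fall in about 14 hours of serial guessing and six digits in roughly 58 days - and a real attacker can parallelize, which is why the entropy has to come from somewhere other than the PIN.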
The PIN is not usually used for cryptography, it's used to authorize the TEE (secure enclave) to do it for you. It's usually difficult or impractical to get the keys from the TEE.
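A toy model of why enclave gating changes the math: the TEE can enforce delays (and hard attempt caps) that are independent of KDF cost, so the attacker can't buy speed with better hardware. The escalation schedule below is purely illustrative, not any vendor's actual policy:

```python
def cumulative_lockout_seconds(wrong_attempts: int) -> float:
    """Total enforced delay after N consecutive wrong guesses,
    using a hypothetical escalation schedule."""
    delay = 0.0
    for n in range(1, wrong_attempts + 1):
        if n <= 4:
            delay += 0       # a few free tries for fat-fingering
        elif n <= 9:
            delay += 60      # then one minute per attempt
        else:
            delay += 3600    # then one hour per attempt
    return delay

# Even a 4-digit PIN becomes impractical under this schedule: ~10,000
# guesses at an hour each is over a year of wall-clock time, enforced
# by the enclave rather than by the cost of the key derivation.
```

Real implementations typically also wipe or permanently lock the key material after some attempt cap, which ends the brute-force game entirely.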
We joke and say that maybe Microsoft could engineer a safer architecture, but they can also ship an OTA update changing the code ad-hoc. If the FBI demands cooperation from Microsoft, can they really afford to say "no" to the feds? The architecture was busted from the ground-up for the sort of cryptographic expectations most people have.
You can (and should) watch all of https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for the details about how iCloud is protected by HSMs and rate limits to understand why you’re wrong, but especially the time-linked section… instead of spreading FUD about something you know nothing about.
Where's the source code? Who audits this system?
Still crap, but the headline is intentionally inaccurate for the sake of clickbait.
> Microsoft confirms it will give the FBI your Windows PC data encryption key if asked
> Microsoft says it will hand those over to the FBI if requested via legal order
Microsoft complying with legal orders is not news. But why hire actual journalists when you can just lie in your headlines and still get clicks?
Edit: Nevermind.