Today, 2 out of 3 of my machines are running Fedora with KDE. The last one is TBD because my kids are using it.
I didn't have a choice for machine 1 because it wasn't eligible for Windows 11 and Windows 10 security updates had reached EOL. Machine 2 quickly followed.
At the time, there had been disappointing Windows news every few months. Since then, the disappointing Windows news has kept coming every few months.
I expect more to follow.
This is why the FBI can compel Microsoft to provide the keys. It's possible, perhaps even likely, that the suspect didn't even know they had an encrypted laptop. Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
This makes the privacy purists angry, but in my opinion it's the reasonable default for the average computer user. It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.
Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
Except the steps to do that are: disable BitLocker, create a local user account (assuming you initially signed in with a Microsoft account, because MS now forces it on you for Home editions of Windows), delete your existing keys from OneDrive, then re-encrypt using your local account, and make sure not to sign into your Microsoft account or link it to Windows again.
A much more sensible default would be to give the user a choice right from the beginning, much like Apple does. When you go through Setup Assistant on a Mac, it doesn't assume you are an idiot; it literally asks you up front: "Do you want to store your recovery key in iCloud or not?"
That's not so easy. Microsoft tries really hard to get you to use a Microsoft account. For example, logging into MS Teams will automatically link your local account with the Microsoft account, thus starting the automatic upload of all kinds of stuff unrelated to MS Teams.
In the past I also had Edge importing Firefox data (including stored passwords) without me agreeing to do so, and then uploading those into the Cloud.
Nowadays you just need to assume that all data on Windows computers is available to Microsoft; even if you temporarily find a way to keep your data out of their hands, an update will certainly change that.
I used to be a Windows user; it has really devolved to the point where it's easier for me to use Linux (though I'm technical). I really feel for the people who aren't technical and are forced to endure the crap that Windows pushes on users now.
That’s the real problem MS has. It’s becoming a meme how bad the relationship between the user and Windows is. It’s going to cause generational damage to their company, just so they can put ads in the Start menu.
At this rate, my next laptop might end up being a framework running Linux.
Those old habits have been creeping back lately through all the various *OS 26 updates. I too now have Linux on Framework. Not perfect, but so much better for my wellbeing.
I recommend it.
I've been using AMD EliteBooks, the firmware has Linux happy paths, the hardware is supported by the kernel and Modern Standby actually works well. Getting one with a QHD to UHD screen is mandatory, though, and I wouldn't buy a brand new model without confirming it has working hardware on linux-hardware.org.
If you look online, HP has a YouTube channel with instructional videos for replacing and repairing every part of their laptops. The machines are designed to make memory, storage and WiFi/5G card replacements easy, parts are cheap, and the aftermarket for them is healthy.
I've also had good luck with their support: they literally overnighted a new laptop, with a return box for the broken one.
Is linux support on the M1/M2 models as good as linux support on x86 laptops? My understanding was that there's still a fair bit of hardware that isn't fully supported. Like, external displays and Bluetooth.
Or more detailed results at:
https://asahilinux.org/docs/platform/feature-support/overvie...
I have done triple booting of MacOS, Linux and Windows on an old Mac Mini, and it was a nightmare to get them working, but worked well once set up.
I think well known brands and models of PCs are better for such alternative setups, rather than obscure PCs.
Realistically, a major Linux distro is the most user-beneficial thing you can do and today it is easier than ever. If my 12 year old can figure out how to use it productively, so can anyone. Switch today and enjoy.
Publicizing stories like that just shows that M$ execs couldn't care less about damage to their image.
Especially when profit rears its head.
(at least, I presume?!?)
I have been recommending Kubuntu to Windows people. I find it's an easier bet than Linux Mint. You get the stability of Ubuntu, plus the guarantee of a Windows-like environment.
Yes, I know, Linux Mint supports Plasma, but I honestly think the "choose your desktop" part of the setup process is more confusing to a newbie than just recommending a distro with the most Windows-like UI and a straightforward installation.
Truly, and to really drive it home, I’ve loved PopOS but this latest release is just too half baked. I think anyone considering it should either wait a year or use something else, and Kubuntu seems like a reasonable alternative for people coming from Windows or MacOS.
I have spent a good few days getting long battery life on Linux (Fedora), with sleep/hibernate + encryption. And I still think the Linux scheduler is not using Intel's P-cores/E-cores on 13th gen correctly.
If you want, I can try to help you debug it. I don't have a Fedora system, but I can spin up a VM or nspawn container to try to match your environment.
Btw - my definition of “possible” would include anything possible in the UI - but if you have to edit the registry or do shenanigans in the filesystem to disable the upload from happening, I would admit that it’s basically mandatory.
I get why the US would not, but I really wish the rest of the world looked at this like the security and sovereignty issue that it is.
I did this myself for about 8 years, from 2016-2024. During that time my desktop system at home was running Linux with ZFS and libvirt, with Windows in a VM. That Windows VM was my usual day-to-day interface for the entire system. It was rocky at first, but things did get substantially better as time moved on. I'll do it again if I have a compelling reason to.
With a VM running on an encrypted file system, whatever a warrant for a bitlocker key might normally provide will be hidden behind an additional layer that Microsoft does not hold the keys to.
(Determining whether that is useful or not is an exercise for the person who believes that they have something to hide.)
The only Windows I am using is the one my company makes me use, but I don't do anything personal on it. I have my personal computer next to it in my office, running Linux.
I mean, this is one application nobody should ever log into!
I, however, like getting my paycheck, and so I have no choice.
Just don’t use that machine for anything private.
Is anyone using their private devices for work? (Also there is teams for Linux and on the web, if that is not prevented by the policy of your org.)
Obviously enterprises aren’t commonly BYOD shops, but SMBs and startups certainly can be.
… whether the people who would do such BYOD things are at all likely to be Windows users who care about this Bitlocker issue, is a different debate entirely.
Unless you're a founder, you should always use company provided equipment.
I also notice that it helps in segmenting in the brain to use separate devices for private and business use.
What this means for the user is your personal device is rather invasively managed. If you want Linux, your distro choice may be heavily restricted. What you can do with that personal device might be restricted (all the EDR monitoring), and you’ll probably take a performance and reliability hit. Not better than just a second laptop for most people.
Admittedly, the risks of choosing this option are not clearly laid out, but the way you are framing it also isn't accurate
Whether you opt in or not, if you connect your account to Microsoft, then they do have the ability to fetch the BitLocker key if the account is not local-only. [0] Global Reader is built in to everything +365.
[0] https://github.com/MicrosoftDocs/entra-docs/commit/2364d8da9...
The question is do they ever fetch and transmit it if you opt out?
The expected answer would be no. Has anyone shown otherwise? Because hypotheticals that they could are not useful.
Why? They are useful to me and I appreciate the hypotheticals because it highlights the gaps between "they can access my data and I trust them to do the right thing" and "they literally can't access my data so trust doesn't matter."
Hell, what about all the times they keep enabling OneDrive despite it being really clear I don’t want it, and then uploading stuff to the cloud that I don’t want?
I have zero trust for Microsoft now, and not much better for them in the past either.
One day they came in and found an icon on their desktop labeled “Where are my files?” that explained they had all been moved in OneDrive following an update. This prompted my clients to go into full meltdown mode, as they knew exactly what this meant. We ultimately got a BAA from Microsoft just because we don’t trust them not to violate federal laws again.
This does not apply to standalone devices. MS doesn't have a magic way to reach into your laptop and pluck the keys.
Of course they do! They can just create a Windows Update that does it. They have full administrative access to every single PC running Windows in this way.
It's both extremely convenient and very unlikely to be detected; especially given that most current systems are associated to an account.
I'd be surprised if it's not widely used by law enforcement, when it's not possible to hack a device in more obvious ways.
Please check theupdateframework.io if you have a say in an update system.
Do updates run as root?
I don't know the status of the updating systems of the various distributions; if some use server-delivered scripts run as root, that's potentially a further powerful attack avenue.
But I was assuming that the update process itself is safe; the problem is that you usually don't have guarantees that the updates you get are genuine.
So if you update a component run as root, yes, the update could include malicious code that can do anything.
But even an update to a very constrained application could be very damaging: for example, if it is for a E2EE messaging application, it could modify it to have it send each encryption key to a law enforcement agency.
A point of order: you do have that guarantee for most Linux distro packages. All 70,000 of them in Debian's case. And all Linux distros distribute their packages anonymously, so they can never target just one individual.
That's primarily because they aren't trying to make money out of you. Making money requires a billing relationship, and tracking which of your customers own what. Off the back of that governments can demand particular users are targeted with "special" updates. Australia in particular demands commercial providers do that with its "Assistance and Access Bill (2018)" and I'm sure most governments in the OECD have equivalents.
- It's easy to set up and maintain immutable and reproducible builds.
- You only install the software you need, and even within each software item, you only build/install the specific features you need. For example, if you are building a server that will sit in a datacentre, you don't need to build software with Bluetooth support, and by extension, you won't need to install Bluetooth utilities and libraries.
- Both have a monolithic Git repository for packages, which is advantageous because you gain the benefit of a giant distributed Merkle tree for verifying you have the same packages everyone else has. As observed with xz-utils, you want a supply chain attacker to be forced to infect as many people as possible so more people are likely to detect it.
- Sandboxing is used to minimise the lines of code during build/install which need to have any sort of privileges. Most packages are built and configured as "nobody" in an isolated sandbox, then a privileged process outside of the sandbox peeks inside to copy out whatever the package ended up installing. Obviously the outside process also performs checks such as preventing cool-new-free-game from overwriting /usr/bin/sudo.
- The time between a patch hitting an upstream repository and that patch being part of an installed package in these distributions is short. This is important at the moment because there are many efforts underway to replace and rewrite old insecure software with modern secure equivalents, so you want to be using software with a modern design, not just 5-year-old long-term-support software. E.g. glycin is a relatively new library used by GNOME applications for loading untrusted images. You don't want to be waiting 3 years for a new long-term-support release of your distribution to get this software.
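The monolithic-repo point in the list above (everyone verifying the same package set via a giant distributed Merkle tree) can be illustrated with a toy sketch; the package names here are made up:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute a binary Merkle root over a list of leaves."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical package archives: everyone hashing the same set gets the same root.
packages = [b"xz-utils_5.6.0.tar", b"openssl_3.3.tar", b"coreutils_9.5.tar"]
root = merkle_root(packages)

# Tampering with any single package changes the root for everyone,
# so a targeted backdoor can't hide among identical copies.
tampered = [b"xz-utils_5.6.0-backdoored.tar", b"openssl_3.3.tar", b"coreutils_9.5.tar"]
assert merkle_root(tampered) != root
```

This is why an attacker who compromises a monorepo-backed distribution has to infect everyone at once, which maximizes the odds of detection.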
No matter which distribution you use, you'll get some common benefits such as:
- Ability to deploy user applications using something like Flatpak which ensures they are used within a sandbox.
- Ability to deploy system applications using something like systemd which ensures they are used within a sandbox.
Microsoft have long underinvested in Windows (particularly the kernel), and have made numerous poor and failed attempts to introduce secure application packaging/sandboxing over the years. Windows is now akin to the horse and buggy when compared to the flying cars of open source Linux, iOS, Android and HarmonyOS (v5+ in particular which uses the HongMeng kernel that is even EAL6+, ASIL D and SIL 3 rated).
I'd be curious to see a conclusive piece of documentation about this, though
If you really don't trust Microsoft at all then don't use Windows.
> sign into your Microsoft account or link it to Windows again.
For reference, I did accidentally log into my Microsoft account once on my local account (registered in the online accounts panel). While Edge automatically enabled synchronization without any form of consent on my part, it does not look like my BitLocker recovery key is listed on https://account.microsoft.com/devices/recoverykey. But since I unlinked my account, it could be that it was removed automatically (but possibly still cached somewhere).
Given that:
1. Retail licenses (instead of OEM ones) can be transferred to new machines
2. Microsoft seems to be making a pattern of allowing retail and OEM licenses to upgrade to newer versions of Windows for free
A $60 difference in license cost, one-time, isn't such a big deal unless you're planning on selling your entire PC down the line and including the license with it. Hell, at this point, I haven't purchased a Windows license for my gaming PC since 2013 - I'm still using the same activation key from my retail copy of Windows 8 Pro.
GPEdit → Computer Configuration → Administrative Templates → Windows Components → BitLocker Drive Encryption → Operating System Drives → “Choose how BitLocker-protected operating system drives can be recovered”
Repeat for other drives.
This seems to go against principles of key management. If your key escrow peer has defected, the correct response is to rotate your keys.
Microsoft has the KEK or passphrase that can be used to derive the KEK. The KEK protects the DEK which is used to encrypt the data. Rotating the KEK (or KEKs if multiple slots are used) will overwrite the encrypted DEK, rendering the old KEK useless.
Or does BitLocker work differently than typical data at rest encryption?
The current approach is weak, and strikes me as a design unlikely to be taken unless all the people involved were unfamiliar with secure design (unlikely IMO), or they intentionally left the door open to this type of access.
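The KEK/DEK rotation described above can be sketched in a few lines; the XOR "wrap" is a toy stand-in for a real wrapping algorithm like AES-KW, and every key here is randomly generated for illustration:

```python
import secrets

def wrap(kek: bytes, dek: bytes) -> bytes:
    # Toy stand-in for real key wrapping (e.g. AES-KW): XOR with the KEK.
    return bytes(a ^ b for a, b in zip(kek, dek))

unwrap = wrap  # XOR is its own inverse

dek = secrets.token_bytes(32)      # data encryption key; the data is never re-encrypted
old_kek = secrets.token_bytes(32)  # the key your escrow peer holds
stored = wrap(old_kek, dek)        # wrapped DEK sitting in the volume metadata

# Escrow peer defects: rotate the KEK and overwrite the wrapped DEK.
new_kek = secrets.token_bytes(32)
stored = wrap(new_kek, dek)

assert unwrap(new_kek, stored) == dek  # the new KEK still recovers the DEK
assert unwrap(old_kek, stored) != dek  # the escrowed old KEK is now useless
```

The point being: rotating the KEK only rewrites a few bytes of metadata, so "rotate after your escrow peer defects" is cheap if the design allows it.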
I would be using an operating system that wasn’t geared up to be cloud backed up and closed source.
1. Is there any indication it forcibly uploads your recovery keys to microsoft if you're signed into a microsoft account? Looking at random screenshots, it looks like it presents you an option https://helpdeskgeek.com/wp-content/pictures/2022/12/how-to-...
2. I'm pretty sure you don't have to decrypt and re-encrypt the entire drive. The actual key used for encrypting data is never revealed, even if you print or save a recovery key. Instead, it generates "protectors": each protector encrypts the actual key using something like the recovery key, then stores the encrypted version on the drive. If you remove a recovery method (i.e. a protector), the associated recovery key becomes immediately useless. Therefore, if your recovery keys were backed up to Microsoft and you want to opt out, all you have to do is remove the protector.
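A rough model of that protector scheme (toy XOR wrapping stands in for BitLocker's actual AES-based wrapping; the keys and names are purely illustrative):

```python
import secrets

def xor(key: bytes, value: bytes) -> bytes:
    # Toy stand-in for key wrapping; real BitLocker uses AES.
    return bytes(a ^ b for a, b in zip(key, value))

vmk = secrets.token_bytes(32)  # volume master key; never changes, so no re-encryption

# Each protector is an independently wrapped copy of the VMK stored on disk.
tpm_key = secrets.token_bytes(32)
recovery_key = secrets.token_bytes(32)  # the numerical key that may sit in escrow
protectors = {
    "tpm": xor(tpm_key, vmk),
    "recovery": xor(recovery_key, vmk),
}

# Opting out: delete the recovery protector. The drive stays encrypted under
# the same VMK, but the escrowed recovery key can no longer unwrap anything.
del protectors["recovery"]

assert xor(tpm_key, protectors["tpm"]) == vmk  # normal unlock still works
assert "recovery" not in protectors            # escrowed key now unlocks nothing
```

This is consistent with why removing a protector is instant while full re-encryption would take hours.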
Once the feature exists, it's much easier to use it by accident. A finger slip, a bug in a Windows update, or even a cosmic ray flipping the "do not upload" bit in memory, could all lead to the key being accidentally uploaded. And it's a silent failure: the security properties of the system have changed without any visible indication that it happened.
ETA: You're not wrong; folk who have specific, legitimate opsec concerns shouldn't be using certain tools. I just initially read your post a certain way. Apologies if it feels like I put words in your mouth.
Me-30-years-ago would have called today's government crimes and corruption an implausible fever dream.
Stats on this very likely scenario?
From the wikipedia article on "Soft error", if anyone wants to extrapolate.
So you could expect this to happen roughly once every two hundred million years per machine.
Assuming there are about 2 billion Windows computers in use, that’s about 10 computers a year that experience this bit flip.
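The back-of-the-envelope arithmetic checks out (both figures are the thread's own estimates):

```python
# One specific bit flips roughly once per 200 million machine-years;
# scale that rate across the assumed Windows installed base.
years_per_flip = 200_000_000   # expected machine-years per specific-bit flip
machines = 2_000_000_000       # assumed number of Windows computers in use

flips_per_year = machines / years_per_flip
print(flips_per_year)  # 10.0 affected machines per year
```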
That's wildly more than I would have naively expected to experience a specific bit-flip. Wow!
Big numbers are crazy.
This was covered at DEF CON 19.
Can't find an explanatory video though :(
More on the topic: Single-event upset[1]
That's all errors including permanent hardware failure, not just transient bit flips or from cosmic rays.
"We provide strong evidence that memory errors are dominated by hard errors, rather than soft errors, which previous work suspects to be the dominant error mode." [0]
"Memory errors can be caused by electrical or magnetic interference (e.g. due to cosmic rays), can be due to problems with the hardware (e.g. a bit being permanently damaged), or can be the result of corruption along the data path between the memories and the processing elements. Memory errors can be classified into soft errors, which randomly corrupt bits but do not leave physical damage; and hard errors, which corrupt bits in a repeatable manner because of a physical defect."
"Conclusion 7: Error rates are unlikely to be dominated by soft errors.
We observe that CE [correctable errors] rates are highly correlated with system utilization, even when isolating utilization effects from the effects of temperature. In systems that do not use memory scrubbers this observation might simply reflect a higher detection rate of errors. In systems with memory scrubbers, this observations leads us to the conclusion that a significant fraction of errors is likely due to mechanism other than soft errors, such as hard errors or errors induced on the datapath. The reason is that in systems with memory scrubbers the reported rate of soft errors should not depend on utilization levels in the system. Each soft error will eventually be detected (either when the bit is accessed by an application or by the scrubber), corrected and reported. Another observation that supports Conclusion 7 is the strong correlation between errors in the same DIMM. Events that cause soft errors, such as cosmic radiation, are expected to happen randomly over time and not in correlation.
Conclusion 7 is an interesting observation, since much previous work has assumed that soft errors are the dominating error mode in DRAM. Some earlier work estimates hard errors to be orders of magnitude less common than soft errors and to make up about 2% of all errors."
[0] https://www.cs.toronto.edu/~bianca/papers/sigmetrics09.pdf
Also, around 1999-2000, Sun blamed cosmic rays for bit flips for random crashes with their UltraSPARC II CPU modules.
Yep, hardware failures, electrical glitches, EM interference... All things that actually happen to actual people every single day in truly enormous numbers.
It ain't cosmic rays, but the consequences are still flipped bits.
This is absurd, because it's basically a generic argument about any sort of feature that vaguely reduces privacy. Sorry guys, we can't have automated backups in windows (even opt in!), because if the feature exists, a random bitflip can cause everything to be uploaded to microsoft against the user's will.
So that may not be a great example if you’re trying to make people like Microsoft.
This is a dismissal of an objection to a software system implemented such that it behaves discreetly by default (no info leaves until I explicitly tell it to; this would be a nice thing, if you hadn't noticed). You repudiate the challenge on the basis of "we want to implement $system that escrows keys by default": a bad thing, but great for the company and host government where said thing is widely adopted.
You may not have used the exact words; but the constellation of factors is still there. We can't have nice things (machines that don't narc, do what we tell them, etc.) because there are other forces at work in our society making these things an impossibility.
It is regrettable you do not see the pattern, but then again, that may be for the better for you. I wouldn't wish the experience of seeing things the way I do on anyone else. Definitely not a fun time. But it is certainly there.
We have mandatory identification for all kinds of things that are illegal to purchase or engage in under a certain age. Nobody wants to prosecute 12 year old kids for lying when the clicked the "I am at least 13 years old" checkbox when registering an account. The only alternative is to do what we do with R-rated movies, alcohol, tobacco, firearms, risky physical activities (i.e. bungee jumping liability waiver) etc... we put the onus of verifying identification on the suppliers.
I've always imagined this was inevitable.
When I go buy a beer at the gas station, all I do is show my ID to the cashier. They look at it to verify DOB and then that's it. No information is stored permanently in some database that's going to get hacked and leaked.
We can't trust every private company that now has to verify age to not store that information with whatever questionable security.
If we aren't going to do a national registry that services can query to get back only a "yes or no" on whether a user is of age or not, then we need regulation to prevent the storage of ID information.
We should still be able to verify age while remaining pseudo-anonymous.
Querying a national registry is not good because the timing of the queries could be matched up with the timing of site logins to possibly figure out the identities of anonymous site users.
A way to address this, at the cost of requiring the user to have secure hardware such as a smart phone or a smart card or a hardware security token or similar is for your government to issue you signed identity documents that you store and that are bound cryptographically to your secure hardware.
A zero knowledge protocol can later be used between your secure hardware and the site you are trying to use that proves to the site you have ID that says you are old enough and it is bound to your hardware without revealing anything else from your ID to the site.
This is what the EU has been developing for a few years. It is currently undergoing a series of large-scale field trials, with release to the public later this year, with smartphones as the initial secure hardware. Member states will be required to support it, and any mandatory age verification laws they pass will require sites to support it (they can also support other methods).
All the specs are open and the reference implementations are also open source, so other jurisdictions could adopt this.
Google has released an open source library for a similar system. I don't know if it is compatible with the EU system or not.
I think Apple's new Digital ID feature in Wallet is also similar.
We really need to get advocacy groups that are lobbying on age verification bills to try to make it so when the bills are passed (and they will be) they at least allow sites to support some method like those described above, and ideally require sites to do so.
And note that if we are, the records of the request to that database are an even bigger privacy timebomb than those of any given provider, just waiting for malicious actors with access to government records.
Beer, sure. But if you buy certain decongestants, they do log your ID. At least that's the case in Texas.
Yeah, but many people don't actually think War on Drugs policies are a model for civil liberties that should be extended beyond that domain (or, in many cases, even tolerated in that domain.) That policy has been effective, I guess, in promoting the sales of alternative “decongestants” (that don't actually work), though it did little to curb use and harms from the drugs it was supposed to control by attacking supply.
That's how it should be, but it's not how it is. Many places now scan your ID into their computer (the computer which, btw, tracks everything you buy). It may not go to a government database (yet) but it's most certainly being stored.
That would completely defeat the purpose. The goal is to identify online users, not protect children.
There are already plenty of entities that not only have a reliable way of proving it's you who has access to the account, but also enough info to return a user's age without disclosing anything else, like banks or government sites. They could (or better, should be forced to) provide an interface to that data.
Basically: "pick your identity provider" -> "authenticate on their site" -> "step showing that only your age will be shared" -> response with the user's age and a unique query ID that's not related to the user's account ID
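A minimal sketch of that flow, with an HMAC standing in for the identity provider's signature (a real deployment would use public-key signatures so the site holds no shared secret with the provider; all names and fields here are hypothetical):

```python
import hashlib
import hmac
import json
import secrets

PROVIDER_KEY = secrets.token_bytes(32)  # held by the identity provider (e.g. a bank)

def issue_age_assertion(user_is_over_18: bool) -> dict:
    """Provider signs only a boolean and a one-time query ID; no identity data."""
    payload = {"over_18": user_is_over_18, "query_id": secrets.token_hex(16)}
    body = json.dumps(payload, sort_keys=True).encode()
    return {"payload": payload,
            "sig": hmac.new(PROVIDER_KEY, body, hashlib.sha256).hexdigest()}

def site_verifies(token: dict) -> bool:
    """The site checks the provider's signature and learns age status, nothing else."""
    body = json.dumps(token["payload"], sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["payload"]["over_18"]

token = issue_age_assertion(True)
assert site_verifies(token)
# The token carries no name, account ID, or birth date:
assert set(token["payload"]) == {"over_18", "query_id"}
```

The random per-query ID is what keeps the site from correlating repeat visits back to one account, which is the property the parent flow is after.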
Nah, no shot.
How?
There is no point locking your laptop with a passphrase if that passphrase is thrown around.
Sure, maybe some thief can't get access, but they probably can if they can convince Microsoft to hand over the key.
Microsoft should not have the key, thats part of the whole point of FDE; nobody can access your drive except you.
The cost of this is that if you lose your key: you also lose the data.
We have trained users about this for a decade; there have been countless dialogs explaining it. Even if we were dumber than we are (we're not, despite what we're being told: users just have fatigue from overstimulation due to shitty UX everywhere), it would still be a bad default.
The important bit here is that ~*nobody* who is using Windows cares about encryption or even knows what it is! This is all on by default, which is a good thing, but also means that yes, of course Microsoft has to store the keys, because otherwise a regular user will happen to mess around with their bios one day and accidentally lock themselves permanently out of their computer.
If you want regular FDE without giving Microsoft the key you can go ahead and do it fairly easily! But realistically if the people in these cases were using Linux or something instead the police wouldn't have needed an encryption key because they would never have encrypted their laptop in the first place.
Seemingly once you've installed Windows and given the Microsoft your BitLocker keys in escrow, you could then use Remove-BitLockerKeyProtector to delete the VMK which is protected with mode 3 "Numerical password" (recovery key).[4] It appears that the escrow process (possibly the same as used by BackupToAAD-BitLockerKeyProtector) might only send the numerical key, rather than the VMK itself.[5][6] I couldn't find from a quick Internet search someone who has reverse engineered fveskybackup.dll to confirm this is the case though. If Microsoft are sending the VMK _and_ the numerical key, then they have everything needed to decrypt a disk. If Microsoft are only sending the numerical key, and all numerical key protected VMKs are later securely erased from the disk, the numerical key they hold in escrow wouldn't be useful later on.
Someone did however ask the same question I first had. What if I had, for example, a billion BitLocker recovery keys I wanted to ensure were backed up for my protection, safety and peace of mind? This curious person did however already know the limit was 200 recovery keys per device, and found out re-encryption would fail if this limit had been reached, then realised Microsoft had fixed this bug by adding a mechanism to automatically delete stale recovery keys in escrow, then reverse engineered fveskybackup.dll and an undocumented Microsoft Graph API call used to delete (or "delete") escrowed BitLocker recovery keys in batches of 16.[7]
It also appears you might only be able to encrypt 10000 disks per day or change your mind on your disk's BitLocker recovery keys 10000 times per day.[8] That might sound like a lot for particularly an individual, but the API also perhaps applies a limit of 150 disks being encrypted every 15 minutes for an entire organisation/tenancy. It doesn't look like anyone has written up an investigation into the limits that might apply for personal Microsoft accounts, or if limits differ if the MS-Organization-Access certificate is presented, or what happens to a Windows installation if a limit is encountered (does it skip BitLocker and continue the installation with it disabled?).
[1] https://learn.microsoft.com/en-us/purview/office-365-bitlock...
[2] https://itm4n.github.io/tpm-based-bitlocker/
[3] https://learn.microsoft.com/en-us/windows/win32/secprov/getk...
[4] https://learn.microsoft.com/en-us/powershell/module/bitlocke...
[5] https://learn.microsoft.com/en-us/graph/api/bitlockerrecover...
[6] https://learn.microsoft.com/en-us/powershell/module/bitlocke...
[7] https://patchmypc.com/blog/bitlocker-recovery-key-cleanup/
[8] https://learn.microsoft.com/en-us/graph/throttling-limits#in...
Right, so the solution is to silently upload their encryption keys to Microsoft's servers without telling them? If users don't understand encryption, they certainly don't understand they've just handed their keys to a third party subject to government data requests.
> otherwise a regular user will happen to mess around with their bios one day and accidentally lock themselves permanently out of their computer.
This is such transparent fear-mongering. How often does this actually happen versus how often are cloud providers breached or served with legal requests? You're solving a hypothetical edge case by creating an actual security vulnerability.
Encryption by default and cloud key escrow are separate decisions. You can have one without the other. The fact that Microsoft chose both doesn't make the second one necessary, it makes it convenient for Microsoft.
> If you want regular FDE without giving Microsoft the key you can go ahead and do it fairly easily!
Then why isn't that the default with cloud backup as opt-in? Oh right, because then Microsoft wouldn't have everyone's keys.
And the passphrase they log in to windows with is not the key, Microsoft is not storing their plain text passphrase in the cloud, just to be clear.
The only thing I would really fault Microsoft for here is making it overly difficult to disable the cloud storage for users who do understand all the implications.
Mate, if 99% of users don't understand encryption, they also don't understand that Microsoft now has their keys. You can't simultaneously argue that users are too thick to manage keys but savvy enough to consent to uploading them.
> If their keys weren't stored in the cloud, these users could easily lose access to their data without understanding how or why.
As opposed to losing access when Microsoft gets breached, or when law enforcement requests their keys, or when Microsoft decides to lock them out? You've traded one risk for several others, except now users have zero control.
The solution to "users might lock themselves out" is better UX for local key backup, not "upload everyone's keys to our servers by default and bury the opt-out". One is a design problem, the other is a business decision masquerading as user protection.
> The only thing I would really fault Microsoft for here is making it overly difficult to disable the cloud storage for users who do understand all the implications.
That's not a bug, it's the entire point. If it were easy to disable, people who understand the implications would disable it. Can't have that, can we?
> that just so happens to take a screenshot of your screen every few seconds
Recall is off by default. You have to go turn it on if you want it.
Microsoft also happens to own LinkedIn which conveniently "forgets" all of my privacy settings every time I decide to review them (about once a year) and discover that they had been toggled back to the privacy-invasive value without my knowledge. This has happened several times over the years.
There is the old passwords-for-a-chocolate-bar study: https://blog.tmb.co.uk/passwords-for-chocolate
Do users care? I would posit that the bulk of them do not, because they just don't see how it applies to them, until they run into some type of problem.
2) according to Microsoft
So, trust is not zero. It's deeply negative.
Most (though not all) users are looking for encryption to protect their data from a thief who steals their laptop and who could extract their passwords, banking info, etc. Not from the government using a warrant in a criminal investigation.
If you're one of the subset of people worried about the government, you're generally not using default options.
Even offices usually give people laptops over desktops so that they can bring it to meetings.
I think you're misunderstanding. You can rescue the files on your own disk when you place the key in your MS account.
There's no scenario where you can't but the police can.
What's the equivalent of thinking users are this stupid?
I seem to recall that the banks repeatedly tell me not to share my PIN number with anyone, including (and especially) bank staff.
I'm told not to share images of my house keys on the internet, let alone hand them to the government or what have you.
Yet for some unknown reason everyone should send their disk encryption keys to one of the largest companies in the world (largely outside of legal jurisdiction), because they themselves can't be trusted.
Bear in mind that with a(ny) TPM chip, you don't need to remember anything.
Come off it mate. You're having a laugh aren't you?
What's the equivalent of thinking security aficionados are clueless?
Security advice is dumb and detached from life, and puts an undue burden on people that's unlike anything else in life.
Sharing passwords is a feature, or rather a workaround because this industry doesn't recognize the concept of temporary delegation of authority, even though it's the basics of everyday life and work. That's what you do when you e.g. send your kid on a grocery run with your credit card.
Asking users to keep their 2FA recovery keys or disk encryption keys safe on their own - that's beyond ridiculous. Nothing else in life works that way. Not your government ID, not your bank account, not your password, not even the nuclear launch codes. Everything people are used to is fixable; there's always a recovery path for losing access to accounts or data. It may take time and might involve paying a notary or a court case, but there is always a way. But not so with encryption keys to your shitposts and vacation pictures in the cloud.
Why would you expect people to follow security advice correctly? It's detached from reality, dumb, and as Bitcoin showed, even having millions of dollars on the line doesn't make regular people capable of being responsible with encryption keys.
> Nothing else in life works that way. Not your government ID, not your bank account, not your password, not even the nuclear launch codes.
Brilliant examples of why you're wrong:
Government IDs have recovery because the government is the trusted authority that verified you exist in the first place. Microsoft didn't issue your birth certificate.
Nuclear launch codes are literally designed around not giving any single entity complete access, hence the two-person rule and multiple independent key holders. You've just argued for my position.
Banks can reset your PIN because they're heavily regulated entities with legal obligations and actual consequences for breaching trust. Microsoft's legal department is larger than most countries' regulators.
> even having millions of dollars on the line doesn't make regular people capable of being responsible with encryption keys.
Right, so the solution is clearly to hand those keys to a corporation that's subject to government data requests, has been breached multiple times, and whose interests fundamentally don't align with yours? The problem with Bitcoin isn't that keys are hard - it's that the UX is atrocious. The solution is better tooling, not surveillance capitalism with extra steps.
You're not arguing for usability. You're arguing that we should trust a massive corporation more than we trust ourselves, whilst simultaneously claiming users are too thick to keep a recovery key in a drawer. Pick a lane.
You're saying it's likely that a laptop thief is also capable of stealing the recovery key from Microsoft's servers?
So it would therefore be better that users lose all their data if:
- an update bungles the TPM trust
- their laptop dies and they extract the hard drive
- they try to install another OS alongside but fuck up the TPM trust along the way
- they have to replace a mainboard
- they want to upgrade their PC?
I know for a fact which has happened to me more often.
The question isn't "cloud escrow vs nothing". It's "cloud escrow vs local backup". One protects you from hardware failure. The other protects you from hardware failure whilst also making you vulnerable to data breaches, government requests, and corporate policy changes you have zero control over.
You've solved a technical problem by creating a political one. Great.
Okay, then take sharing your PINs with your spouse. Or for that matter, account passwords or phone unlock patterns. It's a perfectly normal thing that many people (including myself) do, because it enables ad-hoc delegation. "Honey, can you copy those photos to my laptop and send them to godparents?", asks my wife as she hands me her phone and runs to help our daughter with something - implicitly trusting me with access to her phone, thumbdrive, Windows account, e-mail account, and WhatsApp/Messenger accounts.
These kinds of ad-hoc requests happen for us regularly, in both directions, without much thought given to them[0]. It's common between couples, variants of it are also common within families (e.g. grandparents delegating most computer stuff to their adult kids on an ad-hoc basis), and variants of it also happen regularly in workplaces[1], despite the whole corporate and legal bureaucracy trying its best to prevent it[2].
> Government IDs have recovery because the government is the trusted authority that verified you exist in the first place. Microsoft didn't issue your birth certificate.
But Microsoft issued your copy of Windows and Bitlocker and is the one responsible for your data getting encrypted. It's obvious for people to seek recourse with them. This is how it works in every industry other than tech, which is why I'm a supporter of governments actually regulating in requirements for tech companies to offer proper customer support, and stop with the "screw up managing 2FA recovery keys, lose your account forever" bullshit.
> Banks can reset your PIN because they're heavily regulated entities with legal obligations and actual consequences for breaching trust.
As it should be. As it works everywhere, except tech, and especially except in the minds of security aficionados.
> Nuclear launch codes are literally designed around not giving any single entity complete access, hence the two-person rule and multiple independent key holders.
Point being, if enough right people want the nukes to be launched, the nukes will be launched. This is about the highest degree of responsibility on the planet, and relevant systems do not have the property of "lose the encryption key we told you 5 years ago to write down, and it's mathematically proven that no one can ever access the system anymore". It would be stupid to demand that.
That's the difference between the infosec industry and real life: in real life, there is always a way to recover. Infosec is trying to normalize data and access being fundamentally unrecoverable after even the slightest fuckup, which is a degree of risk individuals and society have not internalized yet, and are not equipped to handle.
> Right, so the solution is clearly to hand those keys to a corporation that's subject to government data requests, has been breached multiple times, and whose interests fundamentally don't align with yours?
Yes. For normal people, Microsoft is not a threat actor here. Nor is the government. Microsoft is offering a feature that keeps your data safe from thieves and stalkers (and arguably even organized crime), but that doesn't require you to suddenly treat your laptop with more care than you treat your government ID. They can do this, because for users of this feature, Microsoft is a trusted party.
Ultimately, that's what security aficionados and cryptocurrency people don't get: the world runs on trust. Trust is a feature.
--
[0] - Though less and less of that because everyone and their dog now wants to require 2FA for everything. Instead of getting the hint that passwords are not meant to identify a specific individual, they're doubling down and tying every other operation to a mobile phone, so delegating desktop operations often requires handing over your phone as well, defeating the whole point. This is precisely what I mean by the industry not recognizing or supporting the concept of delegation of authority.
[1] - The infamous practice of writing passwords on post-it notes isn't just because of onerous password requirements, it's also a way to facilitate temporary delegation of authority. "Can you do X for me? Password is on a post-it in the top drawer."
[2] - GDPR or not, I've heard from doctors I know personally that sharing passwords to access patient data is common, and so is bringing some of it back home on a thumb drive to do some work after hours. On the one hand, this creates some privacy risks for patients (and legal risk for hospitals) - but on the other hand, these doctors don't do it because they hate GDPR or their patients. They do it because it's the only way they can actually do their jobs effectively. If the rules were actually enforced to prevent it, people would die. This is what I mean when I say that security advice is often dumb and out of touch with reality, and ignored for very good reasons.
> Okay, then take sharing your PINs with your spouse.
Sharing with your spouse is consensual, temporary, and revocable. You know you've done it, you trust that specific person, and you can change it later. Uploading your keys to Microsoft is none of these things.
> But Microsoft issued your copy of Windows and Bitlocker and is the one responsible for your data getting encrypted.
Microsoft sold you software. They didn't verify your identity, they're not a regulated financial institution, and they have no duty of care beyond their terms of service. The fact that they encrypted your drive doesn't make them a trustworthy custodian of the keys any more than your locksmith is entitled to copies of your house keys.
> For normal people, Microsoft is not a threat actor here. Nor is the government.
"Normal people" includes journalists, lawyers, activists, abuse survivors, and anyone else Microsoft might be legally compelled to surveil. Your threat model is "thieves and stalkers". Mine includes the state. Both are valid, but only one of us is forcing our model on everyone by default.
> the world runs on trust. Trust is a feature.
Trust in the wrong entity is a vulnerability. You're arguing we should trust a corporation with a legal department larger than most countries' regulators, one that's repeatedly been breached and is subject to government data requests in every jurisdiction it operates.
Your doctors-breaking-GDPR example is particularly telling: you've observed that bad UX causes people to route around security, and concluded that security is the problem rather than the UX. The solution to "delegation is hard" isn't "give up and trust corporations". It's "build better delegation mechanisms". One is an engineering problem. The other is surrender dressed as pragmatism.
If you don't have backups of your data, you've already lost regardless of where your recovery key lives. That's not an encryption problem, that's a "you didn't do backups" problem, which, I'll agree is a common issue. I wonder if the largest software company on the planet (with an operating system in practically every home) can help with making that better. Seems like Apple can, weird.
> TPMs do fail on occasion.
So do Microsoft's servers. Except Microsoft's servers are a target worth attacking, whereas your TPM isn't. When was the last time you heard about a targeted nation-state attack on someone's motherboard TPM versus a data breach at a cloud provider?
> A bank PIN you can call and reset, they can already verify your identity through other means.
Banks can do that because they're regulated financial institutions with actual legal obligations and consequences for getting it wrong. They also verified your identity when you opened the account, using government ID and proof of address.
Microsoft is not your bank, not your government, and has no such obligations. When they hand your keys to law enforcement, which they're legally compelled to do, you don't get a phone call asking if that's alright.
The solution to TPM failure is a local backup of your recovery key, stored securely. Not uploading it to someone else's computer and hoping for the best.
If you're talking about Time Machine, Windows has had backup options built in since NT.
There are a lot of people here criticising MSFT for implementing a perfectly reasonable encryption scheme.
This isn’t some secret backdoor, but a huge security improvement for end-users. This mechanism is what allows FDE to be on by default, just like (unencrypted) iCloud backups do for Apple users.
Calling bs on people trying to paint this as something it’s not is not “whiteknighting”.
>Microsoft is an evil corporation, so we must take all bad stories about them at face value. You're not some corpo bootlicker, now, are you? Now, in unrelated news, I heard Pfizer, another evil corporation with a dodgy history[1] is insisting their vaccines are safe...
Yes. The thing is: Microsoft made the design decision to copy the keys to the cloud, in plaintext. And they made this decision with the full knowledge that the cops could ask for the data.
You can encrypt secrets end-to-end - just look at how password managers work - and it means the cops can only subpoena the useless ciphertext. But Microsoft decided not to do that.
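The password-manager model is simple: derive a key from a secret only the user knows, encrypt client-side, and upload only the ciphertext, so a subpoena to the server yields nothing usable. A toy Python sketch of the idea (the SHAKE-256 XOR stream here is a deliberately simplified stand-in with no integrity protection; a real implementation would use an authenticated cipher like AES-GCM, and all the names below are illustrative):

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy stream cipher: SHAKE-256 as a keystream generator.
    # Real systems use an authenticated cipher (AES-GCM, ChaCha20-Poly1305).
    return hashlib.shake_256(key + nonce).digest(length)

def seal(password: str, plaintext: bytes) -> dict:
    """Encrypt client-side; only the resulting blob is uploaded."""
    salt, nonce = os.urandom(16), os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    ks = _keystream(key, nonce, len(plaintext))
    ct = bytes(a ^ b for a, b in zip(plaintext, ks))
    return {"salt": salt, "nonce": nonce, "ciphertext": ct}

def open_blob(password: str, blob: dict) -> bytes:
    """Decrypt client-side; the server never learns the password or key."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), blob["salt"], 200_000)
    ks = _keystream(key, blob["nonce"], len(blob["ciphertext"]))
    return bytes(a ^ b for a, b in zip(blob["ciphertext"], ks))
```

The point is structural, not cryptographic detail: since `seal` runs on the client and the password never leaves it, the escrow service can only ever hand over the useless blob.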
I dread to think how their passkeys implementation works.
Apple does this too. So does Google. This is nothing new.
It's a commonly used feature by the average user who loses their password or their last device.
During set up, they even explicitly inform the user that their bitlocker keys are being backed up to the cloud. And, you can still choose to use bitlocker without key escrow.
If the user's macOS FileVault disk encryption key is "stored in iCloud", it resides in the user's iCloud Keychain, which is end-to-end encrypted. This creates a situation similar to the iPhone, where Apple does not have the ability to access the user's data and therefore cannot comply with a warrant for access (which really annoys organizations like the FBI and Interpol).
It's 2026. The abuses of corporations are well documented. Anyone who still chooses Windows of their own volition is quite literally asking for it and they deserve everything that happens to them.
I'd already long since migrated away from Windows but if I'd been harbouring any lingering doubts, that was enough to remove them.
What do you suggest? I’ll try it in a VM or live usb.
One warning: keep in mind that if your desktop PC motherboard has a MediaTek wifi+bluetooth chip, that chip will probably not work on any version of Linux (AFAIK). I don't use wifi on my desktop but I do use bluetooth game controllers. You can replace the chip (which is what I did, with https://www.amazon.com/dp/B08MJLPZPL), get a bluetooth dongle (my friend recommends https://www.amazon.com/Bluetooth-Wireless-External-Receiver-...), or get a PCIe one.
Regardless of which distro you choose, your "desktop experience" will be mostly based on what desktop environment you pick, and you are free to switch between them regardless of distro. Ubuntu for example provides various installers that come with different DEs installed by default (they call them "flavours": https://ubuntu.com/desktop/flavors), but you can also just switch them after installation. I say "mostly" because some distros will also customise the DE a bit, so you might find some differences.
"Nicest desktop experience" is also too generic to really give a proper suggestion. There are DEs which aim to be modern and slick (e.g. GNOME, KDE Plasma, Cinnamon), lightweight (LXQt), or somewhere in between (Xfce). For power users there's a multitude of tiling window managers (where you control windows with a keyboard). Popular choices there are i3/sway or, lately, Niri. All of these are just examples, there are plenty more DEs / WMs to pick from.
Overall my suggestion would be to start with something straightforward (Mint would probably be my first choice here), try all the most popular DEs and pick the one you like, then eventually (months or years later) switch to a more advanced distro once you know more about your goals and how you want to use the system. For example, I'm in the middle of migrating to NixOS because I want a fully declarative system, which gives you the freedom to experiment without breaking anything: you can switch between different temporary environments or just roll back to previous generations. But I definitely wouldn't have been ready for that at the outset, as it's way more complex than a more traditional distro.
Bon appetit!
I heard Kubuntu is not a great distro for KDE, but I can't comment on that personally.
If you want something that "just works," Linux Mint[1] is a great starting point. That gets you into Linux without any headache. Then, later when bored, you can branch out into the thousands[2] of Linux distributions that fill every possible niche.
Fedora is so significantly better.
I wouldn't confuse popularity for good. Ubuntu gave away free CDs in the 2000s and are living off old marketing.
Debian family is so bad. You will be in the terminal constantly just trying to get stuff to work. Stick to a well maintained, up to date, consumer distro, Fedora.
(reminder that Fedora is Not Arch)
If you care a little more about your privacy and are willing to sacrifice some convenience, go for Fedora. It's community-run and fairly robust. You may have issues with media codecs, Nvidia drivers, and a few other wrinkles though. The "workstation" flavor is the most mature, but you may want to give the KDE version a try.
If you want an adventure, try everything else people are recommending here :)
The real issue is that you can't be sure that the keys aren't uploaded even if you opt out.
At this point, the only thing that can restore trust in Microsoft is open sourcing Windows.
The fully security conscious option is to not link a Microsoft account at all.
I just did a Windows 11 install on a workstation (Windows mandatory for some software) and it was really easy to set up without a Microsoft account.
And newer builds of Windows 11 are removing these methods, to force use of a Microsoft account. [0]
[0] https://www.windowslatest.com/2025/10/07/microsoft-confirms-...
By "really easy" do you mean you had a checkbox? Or "really easy" in that there's a secret sequence of key presses at one point during setup? Or was it the domain join method?
Googling around, I'm not sure any of the methods could be described as "really easy" since it takes a lot of knowledge to do it.
Edit: ofc we all agree local accounts needs to be a supported option, but perhaps we should be more careful about yelling from the rooftops that it’s practically impossible. I’ve been told for years now that it’s really hard or impossible, and it really was not that hard (yet…)
Chastising people about "yelling" is not really an appropriate thing to say here.
This is a part of Settings that you will never see at a passing glance, so it's easy to forget that you may have it on.
I'd also like to gently push back against the cynicism expressed about having a feature like this. There are more people who benefit from a feature like this than not. They're more likely thinking "I forgot my password and I want to get the pictures of my family back" than fully internalizing the principles and practices of self custody - one of which is that if you lose your keys, you lose everything.
There are two ways to log into macOS: a local user account or an LDAP (e.g. OpenDirectory, Active Directory) account. Either of these types of accounts may be associated with an iCloud account. macOS doesn’t work like Windows where your Microsoft account is your login credential for the local machine.
FileVault key escrow is something you can enable when enabling FileVault, usually during initial machine setup. You must be logged into iCloud (which happens in a previous step of the Setup Assistant) and have iCloud Keychain enabled. The key that wraps the FileVault volume encryption key will be stored in your iCloud Keychain, which is end-to-end encrypted with a key that Apple does not have access to.
If you are locked out of your FileVault-encrypted laptop (e.g. your local user account has been deleted or its password has been changed, and therefore you cannot provide the key to decrypt the volume encryption key), you can instead provide your iCloud credentials, which will use the wrapping key stored in escrow to decrypt the volume encryption key. This will get you access to the drive so you can copy data off or restore your local account credentials.
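The trick that makes this work is that the volume encryption key itself is random and never changes; what gets stored are wrappings of it under different secrets, any one of which can unlock it. A simplified Python sketch of the dual-unlock idea (XOR with a PBKDF2-derived key-encryption key stands in for real AES key wrapping; nothing here is Apple's actual format, and the names are illustrative):

```python
import hashlib
import os

def _kek(secret: bytes, salt: bytes) -> bytes:
    # Derive a 32-byte key-encryption key (KEK). Real systems use proper
    # AES key wrapping; XOR against this KEK is a simplified stand-in.
    return hashlib.pbkdf2_hmac("sha256", secret, salt, 200_000)

def wrap(volume_key: bytes, secret: bytes) -> dict:
    """Wrap the (random, fixed) volume key under one unlock secret."""
    salt = os.urandom(16)
    kek = _kek(secret, salt)
    return {"salt": salt, "wrapped": bytes(a ^ b for a, b in zip(volume_key, kek))}

def unwrap(blob: dict, secret: bytes) -> bytes:
    """Recover the volume key from a wrapping, given the right secret."""
    kek = _kek(secret, blob["salt"])
    return bytes(a ^ b for a, b in zip(blob["wrapped"], kek))
```

Either secret alone recovers the same volume key: one wrapping under the local login password, another under an escrow secret. In Apple's scheme the escrow copy lives inside the end-to-end encrypted iCloud Keychain, which is why Apple can offer recovery without holding anything it can read itself.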
And just in case it wasn't clear enough, I'd add: a local user account is standard. The only way you'd end up with an LDAP account is if you're in an organization that deliberately set your computer up for networked login; it's not a typical configuration, nor is it a component used by iCloud.
In my humble opinion: the current state is better than no encryption at all. For example: laptop theft, scavengers trying to find pictures, etc. And if you think you are a target of either Microsoft or law enforcement, manage your keys yourself or go straight to Linux.
Bitlocker on by default (even if Microsoft does have the keys and complies with warrants) is still a hell of a lot better than the old default of no encryption. At least some rando can't steal your laptop, pop out the HDD, and take whatever data they want.
False. If you only put the keys on the Microsoft account, and Microsoft closes your account for whatever reason, you are done.
done here meaning you've lost your data which uhhh, is currently on a drive in the hands of thieves, so what did you lose again?
The issue is about getting locked out of your own data, which can easily happen in a number of cases.
And you don't necessarily need to actually have your account banned.
Let's just say you signed up for a Microsoft account when setting up a new PC (well, because you have to). You don't use that account anywhere else, and you forgot the password, even though you can log in via PIN or something else. Now you install Linux or just boot to a different system once. When you need to boot to Windows again, good luck.
And that's just one of the cases.
A real disaster happened to someone, although on a different platform, and the context is a bit different: https://hey.paris/posts/appleid/
However a hostile foreign government has less control over me.
As such using a tool of a hostile foreign government (Microsoft) needs to be understood and avoided.
Users absolutely 100% will lose their password and recovery key and not understand that even if the bytes are on a desk physically next to you, they are gone. Gone baby gone.
In university, I helped a friend set up encryption on a drive w/ his work after a pen drive with work on it was stolen. He insisted he would not lose the password. We went through the discussion of "this is real encryption. If you lose the password, you may as well have wiped the files. It is not in any way recoverable. I need you to understand this."
6 weeks is all it took him.
Microsoft seems to feel constant pressure to dumb Windows down, but if you look at the reasons people state when switching to Linux, control is a frequent theme. People want the dangerous power tools.
Table saw blade guards and riving knives are an ironic example here: I've yet to hear a story of a woodworker losing a finger on a table saw who couldn't have avoided that injury by keeping one of those safety devices on the saw. Everyone thinks the annoyance isn't worth it, since they're an 'expert', yet it happens frequently.
Improving the situation ... how exactly?
While it is true that NSLs or other coercion tactics will force them to give out the keys, it is also true that this is only possible because Microsoft implemented a fatally flawed system where they have access to the keys.
Any system where a third party has access to cleartext or the keys to decrypt to cleartext is completely broken and must not be used.
They can fight the warrant; if you don't at least object to it, then "giving the keys away" is not an incorrect characterization.
Often it is the case that companies hand over private data to law enforcement just by being asked for it nicely, no warrant needed.
We know iCloud has configurations that can't be disclosed, and I wonder if there is a middle ground between "if you lose the recovery key you are stuffed" and something like a recovery key unlocked by a password, similar to ssh keys.
> It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.
It allows /anyone/ to recover their data later. You don't have to be a "purist" to hate this.
With this scheme the drive is recoverable by the user and unreadable to everyone except you, Microsoft, and the police. Surely that's a massive improvement over sitting in plaintext readable by the world. The people who are prepared to do proper key management will know how to do it themselves.
Apple does the same thing with FileVault when you set up with your iCloud account where, again, previously your disk was just left unencrypted.
I think you just identified the problem clearly.
> Now the police don't even have to ask!
Security is not a switch you can turn on and forget about. Plus the police have extraordinary real world powers to compel you to disclose the necessary information anyways. Unless you're holding state secrets, which, c'mon, you're almost certainly going to give in and cooperate at some point. It wouldn't make for a great Hollywood movie but it would accurately reflect day to day reality.
> unreadable to everyone except you, Microsoft, and the police.
That's two too many. It should either be unreadable to everyone but me or readable by anyone with physical access. Does it not occur to people that you can still rely on physical security even in computing?
> Apple does the same thing
The two corporate computing giants do the same thing? I am not surprised but I also don't see it as a worthwhile data point.
Nah, the FileVault key is stored in your iCloud Keychain when you choose to backup the key to iCloud. And the keychain is end-to-end encrypted. Only the user has access.
I’m not sure how you’re criticizing the “gave” framing when you’re describing and stating Microsoft literally giving the keys to the FBI.
Better, and more accurate wording, would be that "Microsoft surrendered keys" or "Microsoft ceded keys". Or "Microsoft legally compelled to give the keys". If Microsoft did so without a warrant, then "gave" would be more tonally accurate.
In addition, none of this is new. They've been turning over keys when legally compelled to, for many years.
Fun fact: Apple does this too. https://support.apple.com/en-us/108756
In Apple's case, even when the user enables iCloud FileVault key backup, that key is still end-to-end encrypted and Apple cannot access it. As a matter of fact, while Apple regularly receives legal warrants for access, they are ineffective because Apple has no way to fulfill that request/requirement.
Microsoft has chosen to store the BitLocker key backups in a manner that maintains their (Microsoft's) access. But this is a choice Microsoft has made; it's not an intrinsic requirement of a key escrow system. And in the end, it enables law enforcement to compel them to turn over these keys when a judge issues a warrant.
I have W11 with a local account and no BitLocker on my desktop computer, but the sheer amount of nonsense MS has been doing these days has really made me question whether 'easy modding'* is really enough of a benefit for me to not just nuke it and install Linux yet again.
* You can get the MO2 mod manager running under linux, but it's a pain, much like you can also supposedly run executable mods (downgraders, engine patches, etc) in the game's context, but again, pain
I'd be more concerned about access to cloud data (emails, photos, files.)
So, yes. That is how it works: 1) Microsoft forces users into online accounts, 2) BitLocker keys are stored in an insecure manner allowing any US agency to ask for them. I intentionally say "ask for them" because the US government is a joke with respect to respecting its own citizens' privacy[1] at this point.
This type of apologetic half-truth on behalf of a multi-billion dollar corporation is getting old fast.
[0] https://www.forbes.com/sites/thomasbrewster/2026/01/22/micro... [1] https://www.npr.org/2026/01/23/nx-s1-5684185/doge-data-socia...
But the 5th Amendment is also why it's important not to rely on biometrics. Generally (there are some gray areas) in the US you cannot be compelled to give up your password, but biometrics are viewed as physical evidence and not protected by the 5th.
The 5th Amendment gives you the right to refuse speech that might implicate you in a crime. It doesn’t protect Microsoft from being compelled to provide information that may implicate one of its customers in a crime.
Only fix is apparently waiting until there's enough support to cram through an amendment or set a precedent to fix it.
One of the reasons given for (usually) now requiring a warrant to open a phone they grab from you is the amount of third-party data you can access through it, although IIRC they framed it as a regular 4th Amendment issue by saying that if you had a security camera inside your house, the police would be bypassing the warrant requirement by seeing directly into your abode.
In practice: https://en.wikipedia.org/wiki/In_re_Boucher
The government gets what the government wants.
You bet I have that enabled.
But this is irrelevant to the argument made above, right?
You mean "Install Linux", because that's easier than dealing with the steps required to do that on Windows
So all the state needs to get into your laptop is to get access from Apple to your iCloud account.
That said, when setting up FileVault, you have the option to escrow your recovery key with Apple. If you enable that, Apple can get the recovery key.
"For additional privacy and security, 15 data categories — including Health and passwords in iCloud Keychain — are end-to-end encrypted. Apple doesn't have the encryption keys for these categories, and we can't help you recover this data if you lose access to your account. The table below includes a list of data categories that are always protected by end-to-end encryption."
The FileVault keys are stored in the iCloud Keychain and Apple does not have access to them, full stop :-)
Unless they are given a warrant, then they magically have access to your encrypted data.
https://www.businessinsider.com/apple-fbi-icloud-investigati...
If they can get access to your icloud, they can get access to your laptop if you store your decryption key in your keychain.
I recently needed to make a bootable key and found that Rufus out of the box allows you to modify the installer, game changer.
Absolutely not. If my laptop tells me that it is encrypted by default, I don't like that the default is to also hold a copy of the keys in case big brother wants them.
Call me a "privacy purist" all you want, but it shouldn't be normal to expect the government to have access to a key to your house.
I think the reasonable default here would be to not upload to MS servers without explicit consent about what that means in practice. I suspect that if you actually asked the average person whether they're okay with MS having access to all of the data on their device (including browser history, emails, photos), they'd probably say no, if they could.
Maybe I'm wrong though... I admit I have a bad theory of mind when it comes to this stuff because I struggle to understand why people don't value privacy more.
Eg in England you're already an enemy of the state when you protest against Israel's actions in Gaza. In America if you don't like civilians being executed by ICE.
This is really a bad time to throw "enemy of the state" around as if this only applies to the worst people.
Current developments are the ideal time to show that these powers can be abused.
As of today at 00:00 UTC, no.
But there's an increasingly possible future where authoritarian governments will brand users who practice 'non-prescribed use' as enemies of the state. And when we have a government whose leader openly gifts deep, direct access to federal power to unethical tech leaders who've funded elections (ex: Thiel), that branding would be a powerful perk to have access to (even if indirectly).
But you can still help prevent abuses of mass surveillance without probable cause by making such surveillance as expensive and difficult as possible for the state
Criticizing the current administration? That sounds like something an enemy of the state would do!
Prepare yourself for the 3am FBI raid, evildoer! You're an enemy of the state, after all, that means you deserve it! /s
Companies know that putting themselves in a position where they can betray their users means they will be forced to do so. Famously demonstrated when Apple had to ban the Hong Kong protest app [1]. Yet they continue to do it, don't inform their users, and on the rare occasion that they offer an alternative, it is made unclear, complicated, and easy to get wrong [2].
They deserve every ounce of blame.
The choice is not between honoring the warrant and breaking the law.
They can go to a judge and fight the warrant. Other companies have done this.
Microsoft won’t, one more reason I will never use anything from them.
These two statements are in no way mutually exclusive. Microsoft is gobbling up your supposedly private encryption keys because they love cops and want an excuse to give your supposedly private data to cops.
Microsoft could simply not collect your keys and then would have no reason or excuse to hand them to cops.
Microsoft chose to do this.
Do not be charitable to fascists.
Definitely agree that choices like these are the most sane for the default user experience and that having these advanced options for power users to do with as they want is a fair compromise. Wish more people were open to designing software for the average person and compromising on a middle ground that benefits both kinds of users.
Will they shoot me in head?
What if I truly forgot the password to my encrypted drive? Will they also shoot me in the head?
What about your wife's head? Your kids' heads?
But Microsoft chose to keep them plain text, and thus they are, and will continue to be abused.
We must not victim blame. This is absolutely corruption on microsofts part.
Can they compel testimony? Keys, passcodes and the like are usually considered testimony. Did they try? The usual story here is that they don't have to: the big corporations will turn over any info they have on request, because they can, and the government makes a better friend than a single user. The article mentions 20 "requests" per year on average but doesn't say anything about the government using force.
I agree with your conclusion though: data you share with anyone is data you've shared with everyone, and that includes your encryption keys. If that matters to you, then you need to take active steps to ensure your own security, because compelled or not, the cloud providers aren't here to help keep you safe.
There is always a choice.
Which are both choices. Microsoft can for sure choose to block the government and so can individual workers. Let's not continue the fascism-enabling narratives of "no choice."
Tl;dr - "Basically, you’re either dealing with Mossad or not-Mossad. If your adversary is not-Mossad, then you’ll probably be fine if you pick a good password and don’t respond to emails from ChEaPestPAiNPi11s@ virus-basket.biz.ru. If your adversary is the Mossad, YOU’RE GONNA DIE AND THERE’S NOTHING THAT YOU CAN DO ABOUT IT" (Mickens, 2014)
but they didn't do so.
and it's surely just a coincidence, because m$ has always been such an ethical company.
and it's surely not by design to centralize power by locking out competing criminals from the user's data, but not themselves.
</s>
>This is why the FBI can compel Microsoft to provide the keys.
>in my opinion it's the reasonable default
I really can't imagine what kind of person would say that with a straight face. Hanlon's razor be damned, I have to ask: are you a Microsoft employee or investor?
Back in the day hackernews had some fire and resistance.
Too many tech workers decided to rollover for the government and that's why we are in this mess now.
This isn't an argument about law, it's about designing secure systems. And lazy engineers build lazy key escrow the government can exploit.
Most of the comments are fire and resistance, but they commonly take ragebait and run with the assumptions built into clickbait headlines.
> Too many tech workers decided to rollover for the government and that's why we are in this mess now.
I take it you've never worked at a company when law enforcement comes knocking for data?
The internet tough guy fantasy where you boldly refuse to provide the data doesn't last very long when you realize that it just means you're going to be crushed by the law and they're getting the data anyway.
The solution to that is to not have the data in the first place. You can't avoid the warrants for data if you collect it, so the next best thing is to not collect it in the first place.
The technology exists to trivially encrypt your data if you want to. That's not a product most people want, because the vast majority of people (1) will forget their password and don't want to lose their data, and (2) aren't particularly worried about the feds barging in and taking their laptop during a criminal investigation.
That's not what the idealists want, but that's the way the market works. When the state has a warrant, and you've got a backdoor, you're going to need to give the state the keys to the backdoor.
It shows that your idea of how the market works clearly is not representative of the actual market.
1. The famous 2016 San Bernardino case predates Advanced Data Protection technology of iCloud backups. It was never about encryption keys, it was about signing a ‘bad’ iOS update.
2. Details are limited, but it involved a third-party exploit to gain access to the device, not to break the encryption (directly). These are different things and should both be addressed for security, but separately.
Evidently, after this case ended, Apple continued its efforts. It rolled out protecting backups from Apple, and the requirement of successful user authentication before installing iOS updates (which is also protecting against Apple or stolen signing keys).
There is a market here.
Where I live, the government passed a law similar to the UK's online identification law not too long ago. It creates obligations for operating system vendors to provide secure identity verification mechanisms. You can't just ask the user if they're over 18 and believe the answer.
The goal is of course to censor social media platforms by "regulating" them under the guise of protecting children. In practice the law is meant for and will probably impact the mobile platforms, but if interpreted literally it essentially makes free computers illegal. The implication is that only corporation owned computers will be allowed to participate in computer networks because only they are "secure enough". People with their own Linux systems need not apply because if you own your machine you can easily bypass these idiotic verifications.
In Brazil, where I live, it's law 15.211/2025. It makes it so that the tech industry must verify everyone's identity in order to proactively ban children from harmful activities. It explicitly mentions "terminal operating systems" when defining which software the law is supposed to regulate.
Microsoft (and every other corporation) wants your data. They don't want to be a responsible custodian of your data, they want to sell it and use it for advertising and maintaining good relationships with governments around the world.
The same way companies used to make money, before they started bulk harvesting of data and forcing ads into products that we're _already_ _paying_ _for_?
I wish people would have integrity instead of squeezing out every little bit of profit from us they can.
The same can be said of using “allies” to mutually snoop on citizens then turning over data.
Microsoft does not sell / use for advertising data from your Bitlocked laptop.
They do use the following for advertising:
* Name / contact data
* Demographic data
* Subscription data
* Interactions
This seems like what a conspiracy theorist would imagine a giant evil corporation does.
https://www.microsoft.com/en-us/privacy/usstateprivacynotice
> I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.
FTA (3rd paragraph): don't default upload the keys to MSFT.
>If you design it so you don't have access to the data, what can they do?
You don't have access to your own data? If not, they can compel you to reveal testimony on who/what is the next step to accessing the data, and they chase that.
It has nothing to do with the state. It has to do with getting the RSUs to pay the down payment for a house in a HCOL area in order to maybe have children before 40, and making the KPIs so you don't get stack-ranked into the bottom 30% and fired at big tech, or grinding 996 to make your investors richer and you rich-ish in the process if you're lucky enough to exit in the upper decile with your idea. This doesn't include the contingent of people who fundamentally believe in the state, too.
Most people are activists only to the point of where it begins to impede on their comfort.
There was no “back in the day” where big tech was on our side. Stop being a poser
False. You can design truly end-to-end encrypted secure system and then the state comes at you and says that this is not allowed, period. [1]
[1] https://medium.com/@tahirbalarabe2/the-encryption-dilemma-wh...
govt := new_govt
terrorist := yous/workers/Corporations/
I hope you put your money where your mouth is.
The people going 'well of course' or 'this is for the user' drive me insane here because as said, there are secure ways you can build a key escrow system so that your data and systems are actually secure. From a secure design standpoint it feels more and more like we're living in Idiocracy as people argue insecure solutions are secure actually and perfectly acceptable.
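To make "secure key escrow" concrete, here is a toy Python sketch of the design property such systems aim for. All names are hypothetical, and the XOR wrapping is deliberately a stand-in (real designs use AES-KW or AES-GCM): the point is that the escrow server only ever stores the *wrapped* disk key, so a warrant against the server yields nothing usable without the user's passphrase.

```python
import hashlib
import os

def derive_kek(passphrase: bytes, salt: bytes) -> bytes:
    # Derive a key-encryption key (KEK) from the user's passphrase.
    return hashlib.scrypt(passphrase, salt=salt, n=2**14, r=8, p=1,
                          maxmem=2**26, dklen=32)

def wrap(disk_key: bytes, kek: bytes) -> bytes:
    # Toy wrapping: XOR against a keystream derived from the KEK.
    # NOT real crypto -- illustrates the escrow property only.
    stream = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(disk_key, stream))

unwrap = wrap  # XOR is its own inverse

salt = os.urandom(16)
disk_key = os.urandom(32)   # the real disk key never leaves the client
kek = derive_kek(b"correct horse battery staple", salt)
escrowed = wrap(disk_key, kek)  # the ONLY thing the server stores
```

Only the user (who holds the passphrase) can turn `escrowed` back into `disk_key`; the server by itself cannot.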
Trying to resist building ethically questionable software usually means quitting or being fired from a job.
In the 90s and 00s people overwhelmingly built stuff in tech because they cared about what they were building. The money wasn't bad, but no one started coding for the money. And that mindset was so obvious when you looked at the products and cultures of companies like Google and Microsoft.
Today however people largely come into this industry and stay in it for the money. And increasingly tech products are reflecting the attitudes of those people.
Hackernews is a public forum, and the people here change constantly. "Back in the day" there were mostly posts about LISP and startup equity. It's obviously not the same people here now.
> Too many tech workers decided to rollover for the government
Again, not the same group of people. In the 2000s "tech workers" might have mostly been Californians. Now they're mostly in India. Differing perspectives on government, to be sure.
> lazy engineers build lazy key escrow
Hey you should know this one, because it's something that HAS stayed constant since "back in the day": The engineers have absolutely no say in this whatsoever.
That's it. That's the whole thing. Whatever "secure system" you build will not have this property and users will lose their data, be mad at you, and eventually you'll have to turn it off by default leaving everyone's data in plaintext. It's a compromise that improves security for people who previously left their disk unencrypted. It changes nothing for people who previously did their own key management.
You won't be able to turn the first group into the second group. That's HN's "Average Familiarity" fallacy. The fact that basically every 2FA system has a means of recovering your account by removing it should tell you that even technical people are shit at key management.
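For what it's worth, a 2FA secret really is just another key to manage: a TOTP code is nothing more than HMAC over a time counter with a shared secret, so losing the secret means losing the factor. A minimal RFC 6238 sketch (SHA-1 variant), checked against the RFC's published test vector:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 8) -> str:
    # Pack the time-step counter, MAC it, then apply dynamic truncation.
    counter = struct.pack(">Q", unix_time // step)
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret, T = 59 -> "94287082"
print(totp(b"12345678901234567890", 59))  # prints 94287082
```

That 20-byte secret is the whole factor, which is exactly why every real 2FA deployment ends up bolting on a recovery path.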
When you get high up in an org, choosing Microsoft is the equivalent of the old "nobody ever got fired for buying IBM". You are off-loading responsibility. If you ever get high up at a fortune 500 company, good luck trying to get off of behemoths like Microsoft.
It isn't really about the government. It's about a bunch of people trying to convince you that the locked-down proprietary closed source corporate crap that they use isn't in and of itself a security risk, no matter what the quality of the code that you've never seen is. Apple, Microsoft, Google etc. aren't your friends; no matter how brand loyal you are, they'll never care whether you're alive or dead.
FOSS isn't your friend either, but they're not asking you to trust them. Any exposure to these world spanning juggernaut military and intelligence contractor companies is a security hole. It's insane that people (thinking of Europeans now) get fired up to switch from this stuff because Trump but not because of course you should. Instead they're busy calling being suspicious of Microsoft old and hatred of Apple's customer corral stuck up and the desire to own your own machine fanatical and judgemental. Have you ever considered that you've been programmed to say and encourage dumb stuff that is completely against your own interests and supports the interests of the people who sell things to you?
You're convinced by the argument that people dumber than you have to be protected from their own machines (by corporations who have no interest in or obligation to protect them) - have you ever thought that people are saying the same thing about you? That you have to be protected from writing things you shouldn't write or talking to people you shouldn't be talking to? And the world isn't a meritocracy: the people on the top are inbred creeps. You've given up your freedom to dummies with marketing departments.
So now I just use whatever I want. Someone else can be a tech moralist.
I'm glad the knee-jerk absolutists are marginal, for one. A world run by you people would be much worse for anyone who isn't you.
Ask a non techy user:
* How do they backup their data/do they backup their data at all?
* Do they know 3-2-1 rule? Are they following it?
I bet 90% of people will answer no to some of these questions.
And data backup is much more of an everyday topic compared to disk encryption.
I don’t get how people like you trust the corporation or the government that much. If we were all more cognizant of security and privacy, it would be much harder for large orgs to break our society the way they are doing today.
This is one such example.
This sort of utilitarian nitpicking over the convenience of a "median" user is like maximizing the happiness of a cow on a factory farm. The cow would be better off if it did not exist at all. It is a matter of freedom and dignity.
What happens if I forget my keys? Same thing that happens if my computer gets struck by a meteor. New drive, new key, restore contents from backups.
It's simple, secure, set-and-forget, and absolutely nobody but me and your favored deity have any idea what's on my drives. Microsoft and the USGov don't have any business having access to my files, and it's completely theoretically impossible for them to gain access within the next few decades.
Don't use Windows. Use a secure operating system. Windows is not security for you, it's security for a hostile authoritarian government.
What happens if you forget your backup keys?
At this point I think all of the modern, widely used symmetric cryptography that humans have invented will never be broken in practice, even by another more technologically advanced civilization.
On the asymmetric side, it's a different story. It seems like we were in a huge rush to standardize because we really needed to start PQ encrypting data in transit. All the lattice stuff still seems very green to me. I put P(catastrophic attack) at about 10% over the next decade.
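The "never be broken in practice" claim for symmetric crypto holds up to a quick back-of-the-envelope check. Assuming a wildly generous hypothetical attacker testing 10^18 AES-256 keys per second:

```python
# Brute-force time for a 256-bit keyspace at an absurd guess rate.
KEYSPACE = 2**256
GUESSES_PER_SECOND = 10**18
SECONDS_PER_YEAR = 31_557_600  # Julian year
AGE_OF_UNIVERSE_YEARS = 13_800_000_000

years_to_exhaust = KEYSPACE // (GUESSES_PER_SECOND * SECONDS_PER_YEAR)
# Roughly 3.7e51 years -- dozens of orders of magnitude past the
# age of the universe, even before counting energy constraints.
print(years_to_exhaust)
```

Any realistic break therefore has to come from a flaw in the cipher or the implementation, never from exhausting the keyspace.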
The only real defense of privacy these days is to literally not write anything down or store it in any way.
Privacy is not a crime.
Some ways around this is to either not store sensitive user data on servers, or if that needs to happen then encrypt it with user supplied keys.
It's time - it's never been easier, and there's nothing you'll miss about Windows.
I've tried to get them to use the web version of office, I've tried to get them to use OnlyOffice and LibreOffice, I've even tried showing them LaTeX as a last ditch effort, but no, if it isn't true Microsoft Branded Office 2024, the topic isn't even worth discussing [1].
I'm sure there are technical reasons why Wine can't run Office 2024, and I am certainly not trying to criticize the wine developers at all, but until I can show Wine running full-fat MS Office, my parents will always "miss" Windows.
To be clear, I hate MS Office. I do not miss it on Linux. I'm pretty sure my parents could get by just fine with LibreOffice or OnlyOffice or Google Docs, but they won't hear it.
I've also tried to get them to use macOS, since that does have a full-fat MS Office, I've even offered to buy them Macbooks so they can't claim it's "too expensive", and they still won't hear it. I love my parents but they can be stubborn.
[1] Before you accuse me of pushing for "developer UI", LaTeX was not something I led with. I tried the more "normy-friendly" options first.
If I had a dollar for every time MS Word failed to correctly handle the BIDI mix and put things in the wrong order, despite me repeatedly trying different ways to fix it, I'd be richer than Microsoft.
On the contrary, Google Docs, LibreOffice, and pretty much every text box outside of MS Office can effortlessly handle BIDI mixing, all thanks to the Unicode Bidirectional Algorithm [1] being widely implemented and standardized.
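As a small illustration of what the Unicode Bidirectional Algorithm operates on: every character carries a bidi category that implementations consult when ordering a mixed run, and Python's stdlib exposes it directly:

```python
import unicodedata

# Bidi categories the UBA resolves: strong LTR, strong RTL,
# Arabic letters, European numbers, and neutral whitespace.
samples = {
    "a": "L",        # Latin letter: strong left-to-right
    "\u05D0": "R",   # Hebrew alef: strong right-to-left
    "\u0627": "AL",  # Arabic alef: strong right-to-left (Arabic letter)
    "1": "EN",       # European number
    " ": "WS",       # whitespace (neutral)
}
for ch, expected in samples.items():
    print(repr(ch), unicodedata.bidirectional(ch))
```

A correct implementation resolves display order from these categories plus embedding levels, which is why text boxes built on the UBA get mixed Hebrew/Arabic/Latin right with no user intervention.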
I get the notion of shortcut conflicts, but, at a glance, this should be a trivial one click setup to set the desired shortcut config, wouldn’t it?
I agree that it might be trivial to set up for spreadsheets, and it would be really useful for other spreadsheets, and many other applications. I suppose a hurdle is how context sensitive the commands are depending on the cell or range of cells activated, and their contents and data type.
That's a 100% easy peasy safe mode, the worst they're likely to encounter is a brief 2 minute call with you, and in the worst case scenario, they get to go back to Windows without having to be scared of losing anything.
Afraid I don't get the reference if this is a joke, but no that is not my last name.
I've offered similar solutions to this; a VM that they can RDP into, or just a VM running locally with Winboat or Winapps so they could work with the apps they need to, but they won't entertain the idea.
Honestly I kind of think they're adding increasing conditions just so I stop bothering them about it. I think they very much do not want to change operating systems and they know that just saying that won't be a valid enough excuse to get me to shut up about it.
Before people give me shit over trying to force my dogma on them, I should point out that when their computers break (e.g. Windows Update decides to brick their computer), I am the one that is expected to fix them. I don't think it's unreasonable that if I'm expected to do the repairs on the computer that I get a say in what's installed on them.
Fedora is my recommendation. I remind people Fedora is not Arch. Fedora is a consumer grade OS that is so good, I don't lump it in with the word Linux.
In the past 3 years:
* mouse/cursor issues due to some kernel upgrade, I think, as Fedora stays close to upstream
* an unresponsive computer due to a bug in the AMD graphics driver
Both were easy to fix (kernel cmdline change or just kept updating my computer), and I absolutely recommend Fedora. That's what I'd use if I had servers. But, you'll probably have to debug _some_ issues if you use something less-used like AMD.
You can build your ideal fantasy setup piecewise, and I definitely recommend getting there, but Fedora is nice, and clean, and has plenty of "just works", and 99.999% of the problems you might run into, someone else has, too, and they wrote a treatise and tutorial on how to fix it and why it happened.
At least they are honest about it, but it's a good reason to switch over to Linux, particularly if you travel.
If Microsoft is giving these keys out to the US government, they are almost certainly giving them to all other governments that request them.
I use BitLocker on my Windows box without uploading the keys. I don't even have it connected to a Microsoft account. This isn't a requirement.
Probably not now but not something unimaginable in some future.
However, since Windows can still run on user-controlled hardware (non-secure boot or VMs), I guess this kind of behavior could be checked for by intercepting communications before TLS encryption.
Well at least you got that part correct. Do you just not know about security researchers? Or even bug bounty programs?
Why are you even on this forum? Doesn't seem like you know much about technology
> If they have a key in their possession [...]
So they do have a choice.
Regarding the article's Apple example:
> The FBI eventually found a third party to break into the phone, but the tension between privacy and security remains unresolved.
This is actually quite resolved.
- Tech companies in the US are free to write secure encryption technologies without backdoors.
- Government is free to try to break it when they have valid legal authority.
- Tech companies are obligated to turn over information in their possession when given a legal warrant signed by a judge based on probable cause that a crime has occurred.
- Tech companies are not required to help hack into systems on the government's behalf.
As far as I'm concerned, in the US things are perfectly resolved, and quite well I think. It's the government and fear-mongers who constantly try to "unresolve" things.
The only explanation that makes sense to me is that there's an element of irrationality to it. Apple has a well known cult, but Microsoft might have one that's more subtle? Or maybe it's a reverse thing where they hate Linux for some equally irrational reasons? That one is harder to understand because Linux is just a kernel, not a corporation with a specific identity or spokesperson (except maybe Torvalds, but afaik he's well-regarded by everyone)
> Here's what happens on your Dell computer:
> BitLocker turns on automatically when you first set up Windows 10 or Windows 11
> It works quietly in the background, you won't notice it's there
> Your computer creates a special recovery key (like a backup password) that's saved to your Microsoft account
> You might be reading this article because:
> Your computer is asking for a BitLocker recovery key
...such as after your laptop resets its tpm randomly which is often the first time many people learn their disk is encrypted and that there's a corresponding recovery key in their microsoft account for the data they are now unexpectedly locked out of.
https://www.dell.com/support/kbdoc/el-gr/000124701/automatic...
That doesn't mean Microsoft couldn't implement a way of storing these keys so that they can't be accessed by Microsoft itself. Still, it's better than nothing.
Will this make people care? Probably not, but you never know.
What governments and corporations (and plenty of bad actors in the FOSS world) have done is make this the default; made it easy to mindlessly hand people your privacy without even knowing. Opt-out, if you know the setting exists, and can find it.
I trust BitLocker and Apple’s encryption to protect my stuff against snooping thieves, but I have never, ever assumed for a moment that it’d protect me against a nation-state, and neither should you. All the back-and-forth you see in the media is just what’s public drama, and a thin veil of what’s actually going on behind the scenes.
If there's stuff you don't want a nation state to see, it better be offline, on an OSS OS, encrypted with thoroughly audited and properly configured security tooling. Even then, you're more likely to end up in jail for refusing to decrypt it [1][2].
[1] https://arstechnica.com/tech-policy/2020/02/man-who-refused-...
[2] https://www.vice.com/en/article/how-refusing-to-hand-over-yo...
https://cointelegraph.com/news/fbi-cant-be-blamed-for-wiping...
Perhaps next time, an agent will copy the data, wipe the drive, and say they couldn't decrypt it. 10 years ago agents were charged for diverting a suspect's Bitcoin, I feel like the current leadership will demand a cut.
Shufflecake ( https://shufflecake.net/ ) is a "spiritual successor" to TrueCrypt/VeraCrypt but vastly improved: it works at the block device level, supports any filesystem of choice, can manage many nested layers of secrecy concurrently in read/write, comes with a formal proof of security, and is blazing fast (so much so, in fact, that it exceeds the performance of LUKS/dm-crypt/VeraCrypt in many scenarios, including SSD use).
Disclaimer: it is still a proof of concept, only runs on Linux, has no security audit yet. But there is a prototype for the "Holy Grail" of plausible deniability on the near future roadmap: a fully hidden Linux OS (boots a different Linux distro or Qubes container set depending on the password inserted at boot). Stay tuned!
Don't think Apple wouldn't do the same.
If you don't want other people to have access to your keys, don't give your keys to other people.
As a US company, it's certainly true that given a court order Apple would have to provide these keys to law enforcement. That's why getting the architecture right is so important. Also check out iCloud Advanced Data Protection for similar protections over the rest of your iCloud data.
[0] https://sixcolors.com/post/2025/09/filevault-on-macos-tahoe-...
As of macOS Tahoe, the FileVault key you (optionally) escrow with Apple is stored in the iCloud Keychain, which is cryptographically secured by HSM-backed, rate-limited protections.
You can (and should) watch https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s for all the details about how iCloud is protected.
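The value of the HSM-backed rate limiting is easy to quantify. Assuming, hypothetically, a cap of 10 escrow-recovery attempts before the HSM locks the record, and a 6-digit numeric device passcode:

```python
ATTEMPT_CAP = 10          # assumed guess cap enforced by the HSM
PASSCODE_SPACE = 10**6    # 6-digit numeric passcode

p_success = ATTEMPT_CAP / PASSCODE_SPACE
# Without the cap, an offline attacker tries all 10**6 passcodes trivially;
# with it, a single record falls with probability only 1 in 100,000.
print(p_success)
```

The rate limit is what lets a weak secret protect a strong key: the attacker never gets enough guesses for the small passcode space to matter.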
Unbreakable phones are coming. We’ll have to decide who controls the cockpit: The captain? Or the cabin?
The security in iOS is not designed to make you safer, in the same way that cockpit security doesn't protect economy class from rogue pilots or business-class terrorists. Apple made this decision years ago; it's right there in Slide 5 of the Snowden PRISM disclosure. Today, Tim stands tall next to POTUS. Any preconceived principle that Apple might have once clung to is forfeit next to their financial reliance on American protectionism: https://www.cnbc.com/2025/09/05/trump-threatens-trade-probe-apple
Of course Apple offers a similar feature. I know lots of people here are going to argue you should never share the key with a third party, but if Apple and Microsoft didn't offer key escrow they would be inundated with requests from ordinary users to unlock computers they have lost the key for. The average user does not understand the security model and is rarely going to store a recovery key at all, let alone safely.
> https://support.apple.com/en-om/guide/mac-help/mh35881/mac
Apple will escrow the key to allow decryption of the drive with your iCloud account if you want, much like Microsoft will optionally escrow your BitLocker drive encryption key with the equivalent Microsoft account feature. If I recall correctly it's the default option for FileVault on a new Mac too.
Besides, Apple's lawyers aren't stupid enough to forget to carve out a law-enforcement demand exception.
For example, in the 20th century, a European manufacturer of encryption machines (Crypto AG [1]) built a backdoor at the request of governments and never got punished; instead, it got generous payments.
https://news.ycombinator.com/item?id=46252114
https://news.ycombinator.com/item?id=45520407
Uploading passwords to the cloud should count. Also this: https://sneak.berlin/20231005/apple-operating-system-surveil...
Are you expecting perfection here? Or are you just being argumentative?
"Conspiracy theory" is not the same as a crazy, crackhead theory. See: Edward Snowden.
Full quote from the article:
> Mind you, this is definitionally a conspiracy theory; please don’t let the connotations of that phrase bias you, but please feel free to read this (and everything else on the internet) as critically as you wish.
> and they fixed the cleartext transmission of hardware identifiers
Have you got any links for that?
> Are you expecting perfection here? Or are you just being argumentative?
I expect basic things people should expect from a company promoting themselves as respecting privacy. And I don't expect them to be much worse than GNU/Linux in that respect (but they definitely are).
They are immune to reputation damage. Teens and moms don't care.
Nope. For this threat model, E2E is a complete joke when both E's are controlled by the third party. Apple could be compelled by the government to insert code in the client to upload your decrypted data to another endpoint they control, and you'd never know.
An open source project absolutely cannot do that without your consent if you build your client from the source. That's my point.
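The "build your client from the source" argument rests on being able to verify what you're actually running. A minimal sketch of that verification step (function names and file paths are hypothetical, not from any specific project): hash your locally built artifact and compare it against a digest published out of band, which a silently modified binary would fail.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published(local_path: str, published_digest: str) -> bool:
    """True only if the locally built artifact matches the published digest."""
    return sha256_of(local_path) == published_digest.lower()
```

Note this only closes the loop if the project's builds are reproducible; otherwise two honest builds can legitimately differ.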
This also completely disregards the history of vulnerability incidents like XZ Utils, the infected NPM packages of the month, and even for example CVEs that have been found to exist in Linux (a project with thousands of people working on it) for over a decade.
Threat model A: I want to be secure against a government agency in my country using the ordinary judicial process to order engineers employed in my country to make technical modifications to products I use in order to spy on me specifically. Predicated on the (untrue in my personal case) idea that my life will be endangered if the government obtains my data.
Threat model B: I want to be secure against all nation state actors in the world who might ever try to surreptitiously backdoor any open source project that has ever existed.
I'm talking about threat model A. You're describing threat model B, and I don't disagree with you that fighting that is more or less futile.
Many open source projects are controlled by people who do not live in the US and are not US citizens. Someone in the US is completely immune to threat model A when they use those open source projects and build them directly from the source.
> For this threat model
We're talking about a hypothetical scenario where a state actor getting the information encrypted by the E2E encryption puts your life or freedom in danger.
If that's you, yes, you absolutely shouldn't trust US corporations, and you should absolutely be auditing the source code. I seriously doubt that's you though, and it's certainly not me.
The sub-title from the original forbes article (linked in the first paragraph of TFA):
> But companies like Apple and Meta set up their systems so such a privacy violation isn’t possible.
...is completely, utterly false. The journalist swallowed the marketing whole.
I also grant that many things are possible (where the journalist says "isn't possible").
However, what remains true is that Microsoft appears to store this data in a manner that can be retrieved through "simple" warrants and legal processes, compared to Apple where these encryption keys are stored in a manner that would require code changes to accomplish.
These are fundamentally different in a legal framework and while it doesn't make Apple the most perfect amazing company ever, it shames Microsoft for not putting in the technical work to accomplish these basic barriers to retrieving data.
The fact it requires an additional engineering step is not an impediment. The courts could not care less about the implementation details.
> compared to Apple where these encryption keys are stored in a manner that would require code changes to accomplish.
That code already exists at apple: the automated CSAM reporting apple does subverts their icloud E2E encryption. I'm not saying they shouldn't be doing that, it's just proof they can and already do effectively bypass their own E2E encryption.
A pedant might say "well that code only runs on the device, so it doesn't really bypass E2E". What that misses is that the code running on the device is under the complete and sole control of apple, not the device's owner. That code can do anything apple cares to make it do (or is ordered to do) with the decrypted data, including exfiltrating it, and the owner will never know.
That's not really true in practice, by all public evidence.
> the automated CSAM reporting apple does
Apple does not have a CSAM reporting feature that scans photo libraries, it never rolled out. They only have a feature that can blur sexual content in Messages and warn the reader before viewing.
We can argue all day about this, but yeah - I guess it's true that your phone is closed source so literally everything you do is "under the complete and sole control of Apple."
That just sends you back to the first point and we can never win an argument if we disagree about the level the government might compel a company to produce data.
Except for that time they didn't.
But given PRISM, I'm sure Apple will just give it up.
PGP WDE was a preferred corporate solution, but now you have to trust Broadcom.
first of all trim only affects write speed (somewhat), which is not really all that important for non-server use.
it also has some impact on wear which is probably more interesting than its performance impact.
We now know that BitLocker is not secure, and an intelligent open source dev who said it was secure was probably knowingly not telling the truth.
The best explanation to me is that this was said under duress, because somebody wanted people to move away from the good TrueCrypt [1][2] to something they could break.
[1] https://truecrypt.sourceforge.net
[2] https://en.wikipedia.org/wiki/TrueCrypt#End_of_life_announce...
To everyone saying 'time to use Linux!'; recognize that if these people were using Linux, their laptops wouldn't be encrypted at all!
And because of Bitlocker, their encryption is worth nothing in the end.
> if these people were using Linux, their laptops wouldn't be encrypted
Maybe, maybe not. Ubuntu and Fedora both have FDE options in the installer. That's objectively more honest and secure than forcing a flawed default in my opinion.
No, it's worth exactly what it's meant for: in case your laptop gets stolen!
> flawed default
Look, in terms of flaws I would argue 'the government can for legal reasons request the key to decrypt my laptop' is pretty low down there. Again, we're dealing with the general populace here; if it's a choice between them getting locked out of their computer completely vs the government being able to decrypt their laptop this is clearly the better option. Those who actually care about privacy will setup FDE themselves, and everyone else gets safety in case their laptop gets stolen.
Or remote access to the computer. Or access to an encrypted backup drive. Or remote access to a cloud backup of the drive. So no, physical access to the original hard drive is not necessarily a requirement to use the stolen recovery keys.
This is so much more reasonable than (for example) all the EU chat control efforts that would let law enforcement ctrl+f on any so-called private message in the EU.
This is incorrect. A full disk image can easily be obtained remotely, then mounted wherever the hacker is located. The host machine will happily ask for the BitLocker key and make the data available.
This is a standard process for remote forensic image collection and can be accomplished surreptitiously with COTS.
As for it being user hostile: I am pretty certain that thousands of users a year are delighted when something has gone wrong and they can recover their keys and data from the MS cloud.
There should perhaps be a screen in a wizard: "Do you want your data encrypted?" (y/n)
If yes: "Do you want to be able to recover your data if something bad happens? Otherwise it will be gone forever; you will never be able to access it again." (y/n)
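A sketch of how those two wizard questions could map to a decision (the function name and the plan dictionary are hypothetical, just to make the branching explicit): the recovery question only matters once encryption is on, because escrowing a recovery key is exactly the trade-off that makes recovery possible.

```python
def plan_encryption(wants_encryption: bool, wants_recovery: bool) -> dict:
    """Map the two wizard answers to an encryption plan.

    If the user declines encryption, the recovery answer is moot.
    If they accept it, 'wants_recovery' decides whether a recovery
    key is escrowed (recoverable) or kept only by the user (not).
    """
    if not wants_encryption:
        return {"encrypt": False, "escrow_recovery_key": False}
    return {"encrypt": True, "escrow_recovery_key": wants_recovery}
```

The point of making this explicit is that the default is a policy choice, not a technical necessity: either branch is a one-line change.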
when it comes to giving out encryption keys, the answer should always be "we don't have them; you can't get them."
Sad day for privacy at Microsoft.
Use LUKS instead.
(1) false advertisement
Companies like MS and Apple tell their clients they offer a way to encrypt and secure their data, but at best these claims are half-truths, mostly smoke and mirrors.
This is not OK. I don't want to get into the legal side of it, because I'm sure there's fine print somewhere that literally says it's smoke and mirrors, but it's despicable that these claims are made in the first place.
(2) the real need of ironclad encryption
I was born and raised in Eastern Europe. When I was a teenager it was common that police would stop me and ask me to show them contents of my backpack. Here you had two options - either (a) you'd show them the contents or (b) you would get beat up to a pulp and disclose the contents anyway.
It's at least a 5-hour debate whether that's good or not, but in my mind, in 90% of cases, if you're a law-abiding citizen you can simply unlock your phone and be done with it.
Sure, there's a remaining 10% of cases where you're a whistleblower, a journalist, or whatever, and you want to retain whatever is on your phone. But if you put yourself in that situation you'd better have a good understanding of the tech behind your wellbeing. Namely: use something else.
If you open the BitLocker control panel applet your drive(s) will be labelled as "Bitlocker waiting for activation".
Microsoft themselves [1] say:
> If a device uses only local accounts, then it remains unprotected even though the data is encrypted.
There is a further condition: if you explicitly enable bitlocker then the key is no longer stored on the disk and it is secure.
When I run "manage-bde -status" on my laptop it says "Key Protectors: None found". If the TPM was being used that would be listed.
Have you tried plugging the disk or ssd from your old laptop into another computer?
[1]: https://learn.microsoft.com/en-us/windows/security/operating...
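The check described above can be automated. A small sketch (assuming the typical layout of `manage-bde -status` output, where the protector list follows the "Key Protectors:" line; this is an illustration, not an official parser) that flags a volume whose protectors are "None Found":

```python
def has_key_protectors(manage_bde_output: str) -> bool:
    """Return True if manage-bde -status output lists any key
    protector, i.e. something other than 'None Found' on the line
    following 'Key Protectors:'."""
    lines = iter(manage_bde_output.splitlines())
    for line in lines:
        if line.strip().startswith("Key Protectors"):
            # The protector list (or 'None Found') follows on the next line.
            nxt = next(lines, "").strip()
            return nxt != "" and nxt.lower() != "none found"
    return False
```

A volume reporting no key protectors matches the "waiting for activation" state: the data is encrypted, but the key is trivially available on the disk itself.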
But what about unsophisticated users? In aggregate, it might be true that data exfiltration is worse than data loss; I don't know.
But what is true is enabling encryption by default without automated backup and escrow will lead to some data loss.
It's difficult for me to separate the aggregate scenarios from individual scenarios. The individual penalty of data loss can be severe. Permanent.
As far as I can see this particular case is a straightforward search warrant. A court absolutely has the power to compel Microsoft to hand over the keys.
The bigger question is why Microsoft has the recovery feature at all. But honestly I believe Microsoft cares so little about privacy and security that they would do it just to end the "help customers who lose their key" support tickets, with no shady government deal required. I'd want to see something more than speculation to convince me otherwise.
Have you heard of our lord and savior, Linux?
If I earn my living from a company that doesn't make Linux versions, should I still switch?
Should my customers?
It's a great idea, and my work does not touch the internet, but the confusing variations of Linux do not a happy workforce make.
Your 'lord and saviour' can fuck off, with all the others, I prefer science.
Then test Omarchy.
Bitlocker isn't serious security. What is the easiest solution for non-technical users? Does FDE duplicate Bitlocker's functionality?
Yes, the American government retrieves these keys "legally". But so what? The American courts won't protect foreigners, even if they are heads of state or dictators. The American government routinely frees criminals (the ones that donate to Republicans) and persecutes lawful citizens (the ones that cause trouble to Republicans). The "rule of law" in the U.S. is a farce.
And this is not just about the U.S. Under the "five eyes" agreement, the governments of Canada, the UK, Australia, and New Zealand could also grab your secrets.
Never trust the United States. We live in dangerous times. Ignore it at your own risk.
If it were preventing a mass murder I might feel differently...
But this is protecting the money supply (and indirectly the government's control).
Not a reason to violate privacy IMO, especially when at the time this was done these people were only suspected of fraud, not convicted.
Well you can't really wait until the conviction to collect evidence in a criminal trial.
There are several stages that law enforcement must go through to get a warrant like this. The police didn't literally phone up Microsoft and ask for the keys to someone's laptop on a hunch. They had to have already confiscated the laptop, which means they had to have collected enough early evidence to prove suspicion and get a judge to sign off and so on.
Similarly, your TPM is protected by keys Intel or AMD can give anyone.
If you want to extrapolate, your Yubikey was supplied by an American company with big contracts to supply government with their products. Since it's closed source and you can't verify what it runs, a similar thing could possibly happen with your smartcard/GPG/pass keys.