https://x.com/runasand/status/2017659019251343763?s=20
The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, meaning the agents were allowed to require her to unlock it.
I do not follow the logic here; what does that even mean? It seems very dubious. And what happens if one legitimately forgets? Do they just get to keep you there forever?
Assuming that number turns out to be close to reality, how do you weigh so many unnecessary deaths against VTOL rockets and the electric cars?
Perhaps a practitioner of Effective Altruism could better answer that question.
The US taxpayer has no moral obligation to send welfare "around the world". If you personally find this frustrating, you're welcome to donate that money yourself, directly. No one will stop you. If the world wishes to partake in the benefits of the American government, it should apply for statehood.
That in itself should make you hate the dude.
Wasn't Edison an asshole?
Children were exploited, and we're doing this net positive analysis on whether he should face the scorn. I'm not having a go at you - it's just frustrating to see very little happening after so much has been exposed, and I think part of it comes from this mindset - 'oh he's a good guy, this is a mistake/misstep' while people that were exploited as children can't even get their justice.
It's sickening.
I'd rather have both. Hawthorne doesn't get nuked if Elon Musk goes to jail.
> Children were exploited
Abuse. Exploitation. CSAM. We're mushing words.
Child rape. These men raped children. Others not only stayed silent in full knowledge of it, but supported it directly and indirectly. More than that, they arrogantly assumed–and, by remaining in the United States, continue to assume–that they're going to get away with it.
Which category is Elon Musk in? We don't know. Most of the people in the Epstein files are innocent. But almost all of them seem to have been fine with (a) partying with an indicted and unrepentant pedophile [1] and (b) not saying for decades–and again, today–anything to the cops about a hive of child rape.
A lot of them should go to jail. All of them should be investigated. And almost all of them need to be retired from public life.
[1] https://web.archive.org/web/20220224113217/https://www.theda...
Another reason to use my dog's nose instead of a fingerprint.
Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so immediately the instant they see you - they could have someone follow you and watch to see how you unlock your phone before seizing it.
https://news.ycombinator.com/item?id=44746992
This command will make your MacBook hibernate when lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that it does increase the amount of time it takes to resume.
A nice side benefit, though, is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
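For reference, the command being described is presumably macOS's `pmset` hibernate setting; a commonly cited incantation is below (verify against `man pmset` on your own machine before applying):

```shell
# Write RAM to disk and fully power down on sleep (hibernatemode 25),
# instead of the Mac laptop default of 3 (RAM stays powered, with a
# safety copy on disk). This is what forces re-encryption-at-rest
# behavior at the cost of slower resume.
sudo pmset -a hibernatemode 25

# To restore the laptop default later:
sudo pmset -a hibernatemode 3

# Check the current setting:
pmset -g | grep hibernatemode
```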
The real news here isn't privacy control in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.
Not really, because tools like Cellebrite are more limited with BFU (Before First Unlock) devices, hence the manual informing LEO to keep (locked) devices charged, and the countermeasure of iOS forcefully rebooting devices that have been locked for too long.
Out of habit, I keep my phone off during the flight and turn it on after clearing customs.
I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.
Why can't I just toggle an iMessage setting for "no link preview, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dick pics from strangers on iMessage" without also turning off my browser's JavaScript JIT and a bunch of other random crap?
Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.
Just to go through the features I don't want:
* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more
* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them
* Configuration profiles - I need this to install custom fonts
Apple's refusal to split out more granular options here hurts my security.
This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.
My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?
Educate us. What makes it less secure?
Funny to see disabling "features" itself described as "feature"
Why not call it a "setting"
Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google
"Lockdown Mode" is not a default setting
The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it
If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"
Curious.
How did it even know the print?
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
> Apple devices share fingerprint matching details and another device had her details
I looked into it quite seriously for Windows ThinkPads; unless Apple does it differently, you cannot share fingerprints: they're stored in a local chip and never move.
Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.
The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).
I think this is pretty unlikely here but it's within the realm of possibility.
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"
Anyone can do this for over a decade now, and it's fairly straightforward:
- 2014: https://www.zdziarski.com/blog/?p=2589
- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...
This goes beyond the "wired accessories" toggle.
Note that it behaves subtly differently from how you described if it was connected to something before being locked. In that case data access will remain (even though the phone is now locked) until the device is disconnected.
Set to ask for new accessories or always ask.
The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.
It's "attached" to the wifi and to the cell network. Pretty much the same thing.
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Modern iOS has an incredibly tight secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run all the way from secureROM to iOS userland. The exception is if the secureROM is somehow compromised and exploited (this requires hardware access at boot time, so I don't buy it).
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude more difficult.
Apple bought out all the jailbreakers, as Denuvo did the game crackers.
Do you have sources for these statements?
Even a cursory glance would show it's literally impossible on iOS to anyone with even a basic understanding.
1. If they can get in, now people - including high-value targets like journalists - will use bad security.
2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.
* Lockdown Mode needs to be turned on separately for your iPhone, iPad, and Mac.
* When you turn on Lockdown Mode for your iPhone, it's automatically turned on for your paired Apple Watch.
* When you turn on Lockdown Mode for one of your devices, you get prompts to turn it on for your other supported Apple devices.
FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]
https://storage.courtlistener.com/recap/gov.uscourts.vaed.58...
It's also a new account that only posted these two posts.
https://news.ycombinator.com/threads?id=Soerensen
Their comment got flagged, but it looks like they made a new one today and it's still active.
That account ('Soerensen') was created in 2024 and dormant until it made a bunch of detailed comments in the past 24-48 hrs. Some of them are multiple paragraph comments posted within 1 minute of each other.
One thing I've noticed is that they seem to be getting posted from old/inactive/never used accounts. Are they buying them? Creating a bunch and waiting months/years before posting?
Either way, both look like they're fooling people here. And getting better at staying under the radar until they slip up in little ways like this.
The truth is that the internet is all at once (what's the word for 'both' when you have three (four?) things?) dead, an active cyber- and information-warzone, and a dark forest.
I suppose it was fun while it lasted. At least we still have mostly real people in our local offline communities.
https://en.wikipedia.org/wiki/On_the_Internet%2C_nobody_know...
Also, some of us draft our comments offline, and then paste them in. Maybe he drafted two comments?
That said, as a general point, it’s reasonable to make scoped comments in the corresponding parts of the conversation tree. (Is that what happened here?)
About me: I try to pay attention to social conventions, but I rarely consider technology offered to me as some sort of intrinsically correct norm; I tend to view it as some minimally acceptable technological solution that is easy enough to build and attracts a lowest common denominator of traction. But most forums I see tend to pay little attention to broader human patterns around communication; generally speaking, it seems to me that social technology tends to expect people to conform to it rather than the other way around. I think it’s fair to say that the history of online communication has demonstrated a tendency of people to find workarounds to the limitations offered them. (Using punctuation for facial expressions comes to mind.)
One might claim such workarounds are a feature rather than a bug. Maybe sometimes? But I think you’d have to dig into the history more and go case by case. I tend to think of features as conscious choices not lucky accidents.
What's so hard about making 2-3 PINs, each giving access to different logged-in apps and files?
If Apple/Android were serious about it they would implement it, but from my research it seems someone is against it, as it's too good.
I don't want to remove my banking apps when I travel or go to "dangerous" places. If you're kidnapped, you will be forced to send out all your money.
What's so hard about adding a feature that effectively makes a single-user device multi-user? Which needs the ability to have plausible deniability for the existence of those other users? Which means that significant amounts of otherwise usable space need to be inaccessibly set aside for those other users on every device, to retain plausible deniability, despite an insignificant fraction of customers using such a feature?
What could be hard about that?
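For what it's worth, the cryptographic core of the multi-PIN idea is not the hard part; hidden-volume schemes (VeraCrypt-style) sketch it as "each PIN derives a key, and the key is tried against every opaque sealed blob, so storage never records which PIN owns which data." A minimal toy sketch under those assumptions (all names hypothetical; the XOR keystream is a stand-in for a real AEAD like AES-GCM and must not be used for anything real):

```python
import hashlib
import hmac

def derive_key(pin: str, salt: bytes) -> bytes:
    # Slow KDF so short PINs can't be brute-forced instantly
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)

def _stream(key: bytes, n: int) -> bytes:
    # Toy keystream (NOT a real cipher; stands in for AES-GCM)
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def seal(pin: str, salt: bytes, plaintext: bytes):
    key = derive_key(pin, salt)
    ct = bytes(p ^ k for p, k in zip(plaintext, _stream(key, len(plaintext))))
    tag = hmac.new(key, ct, "sha256").digest()
    return ct, tag  # an opaque envelope; nothing marks which PIN owns it

def try_open(pin: str, salt: bytes, envelopes):
    # Try the PIN against every envelope; only its own envelope authenticates
    key = derive_key(pin, salt)
    for ct, tag in envelopes:
        if hmac.compare_digest(tag, hmac.new(key, ct, "sha256").digest()):
            return bytes(c ^ k for c, k in zip(ct, _stream(key, len(ct))))
    return None  # wrong PIN: indistinguishable from "no such user"
```

The deniability property is that on-disk state is just N opaque blobs; a wrong PIN fails the same way whether or not other users exist. As the parent notes, the genuinely hard parts are elsewhere: reserving believable free space and making the decoy persona look lived-in.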
Isn't that the exact same argument against Lockdown Mode? The point isn't that the number of users is small; it's that it can significantly help that small set of users, something Apple clearly does care about.
Where CAs are concerned, not having the phone image 'cracked' still does not make it safe to use.
Also, I'd recommend the book The Mastermind by Evan Ratliff.
Whether he was involved in the organization and participated in it, is certainly up for debate, but it's not like he would admit it.
This could even be a developer feature accidentally left enabled.
Never ever use your personal phone for work things, and vice versa. It's bad for you and bad for the company you work for in dozens of ways.
Even when I owned my own company, I had separate phones. There's just too much legal liability and chances for things to go wrong when you do that. I'm surprised any company with more than five employees would even allow it.
It's actually annoying because every site wants to "remember" the browser information, and so I end up with hundreds of browsers "logged in". Or maybe my account was hacked and that's why there's hundreds of browsers logged in.
Android has supported multiple users per device for years now.
Multi-user that plausibly looks like single-user to three letter agencies?
Not even close.
While plausible deniability may be hard to develop, it's not some particularly arcane thing. The primary reason against it is the political balancing act Apple has to perform (remember San Bernardino and the trouble the US government tried to create for Apple?). Secondary reasons are cost to develop vs. addressable market, but they did introduce Lockdown Mode, so it's not unprecedented to improve security for those particularly sensitive to such issues.
This seems hard to justify. They share a lot of code yes, but many many things are different (meaningfully so, from the perspective of both app developers and users)
> What’s so hard to make 2-3 pins and each to access different logged in apps and files.
Besides the technical challenges, I think there's a pretty killer human challenge: it's going to be really hard for the user to create an alternate account that looks real to someone who's paying attention. Sure, you can probably fool some bored agent in customs line who knows nothing about you, but not a trained investigator who's focused on you and knows a lot about you.
No. Think about it for a second: you're a journalist being investigated to find your sources, and your phone says you mainly check sports scores and send innocuous emails to "grandma" in LLM-speak? It's not going to fool someone who's actually thinking.
For as long as law enforcement treats protection of privacy as implicit guilt, the best a phone can really do is lock down and hope for the best.
Even if there was a phone that existed that perfectly protected your privacy and was impossible to crack or was easy to spoof content on, law enforcement would just move the goal post of guilt so that owning the phone itself is incriminating.
Edit: I wanna be clear that I'm not saying any phone based privacy protections are a waste of time. They're important. I'm saying that there is no perfect solution with the existing policy being enforced, which is "guilty until proven dead"
A detective can have a warrant to search someone's home or car, but that doesn't mean the owner needs to give them the key as far as I know.
So do not have biometrics as device unlock if you are a journalist protecting sources.
It's not really that useful for a safe since they aren't _that_ difficult to open and, if you haven't committed a crime, it's probably better to open your safe for them than have them destroy it so you need a new one. For a mathematically impossible to break cipher though, very useful.
Deceiving investigators by using an alternate password, or destroying evidence by using a duress code on the other hand is almost always a felony. It's a very bad idea for a journalist to do that, as long as the rule of law is intact.
https://reason.com/2017/05/31/florida-man-jailed-180-days-fo...
>Doe vs. U.S. That case centered around whether the feds could force a suspect to sign consent forms permitting foreign banks to produce any account records that he may have. In Doe, the justices ruled that the government did have that power, since the forms did not require the defendant to confirm or deny the presence of the records.
Well, what if the defendant was innocent of that charge but guilty of or involved in an unrelated matter for which there was evidence in the account records?
Better for the foreseeable future to have separate devices and separate accounts (i.e. not in the same iCloud family for instance)
No, you did something fake to avoid doing what you were asked to do.
> If there is no way for the government to prove that you entered a decoy password that shows decoy contents, you are in the clear.
But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?
This sort of thing is already table stakes for CSAM prosecutions, for example. Law enforcement can read the same blog posts and know as much about technology as you do. Especially if we are hypothesizing an advertised feature of a commercial OS!
Yes, that is what plausible deniability is.
>But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?
I emphasized "done right". If existence of hidden encryption can be proven, then you don't have plausible deniability. Something has gone wrong.
My point was: OP claimed plausible deniability does not apply in legal cases, which is a weird take. If you can have plausible deniability, then it can save you legally. This does not only apply to tech of course, but encryption was the subject here. In all cases though, if your situation is not "plausible" (due to broken tech, backdoors, poor OPSEC in tech, and/or damning other evidence in other cases as well) then you don't have plausible deniability by definition.
Having ways of definitively detecting hidden encrypted volumes might be the norm today, might be impossible tomorrow. Then you will have plausible deniability and it will work legally as far as that piece of "evidence" is concerned.
That's a whole lot more to lose than your money and time.
Francis Rawls spent 4 years in jail despite pleading the Fifth all day long.
Biometric data doesn’t need the password.
And good luck depending on the US constitution.
There is a separate border search exception at the point a person actually enters the country which does allow searches of electronic devices. US citizens entering the country may refuse to provide access without consequences beyond seizure of the device; non-citizens could face adverse immigration actions.
To be clear, I do think all detentions and searches without individualized suspicion should be considered violations of the 4th amendment, but the phrase "constitution-free zone" is so broad as to be misleading.
It's one thing to allow police to search a phone. Another to compel someone to unlock the device.
We live in a world of grays and nuance and an "all or nothing" outlook on security discourages people from taking meaningful steps to protect themselves.
I've been advocating for this under-duress-PIN feature for years, as evidenced by this HN comment I made about 9 years ago: https://news.ycombinator.com/item?id=13631653
Maybe someday.
Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc.? If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.
___
_Ideally_ you wouldn't need to trust Apple as a corp to do the right thing. Of course, as this example shows, they seem to actually have done one right thing here, but you do not know if they always will.
That's why a lot of people believe that the idea of such tight vendor control is fundamentally flawed, even though in this specific instance it yielded positive results.
For completeness, No, I do not know either how this could be implemented differently.
The FBI doesn't have to tell anyone they accessed the device. That maintains Apple's outward appearance of security; the FBI can just use parallel construction later if needed.
Something like a hashed log (but an actually robust system), using an enclave, where the log entries are signed using your biometric, so that events such as network access where any data is exchanged are recorded and can only be removed using biometrics. Nothing against wrench-based attacks, of course.
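The tamper-evident part of that idea is a standard hash chain. A toy sketch under stated assumptions: the biometric-gated enclave key is replaced here by a plain in-memory key, and `AppendOnlyLog` is a hypothetical name, not any real API — the point is only that deleting or altering any entry breaks the chain:

```python
import hashlib
import hmac
import json

class AppendOnlyLog:
    def __init__(self, device_key: bytes):
        self.key = device_key          # stand-in for an enclave-held key
        self.entries = []              # list of (blob, mac) pairs
        self.head = b"\x00" * 32       # rolling hash of the chain so far

    def append(self, event: str):
        # Each entry commits to the previous chain head...
        entry = {"event": event, "prev": self.head.hex()}
        blob = json.dumps(entry, sort_keys=True).encode()
        # ...and is authenticated with the device key
        mac = hmac.new(self.key, blob, hashlib.sha256).digest()
        self.entries.append((blob, mac))
        self.head = hashlib.sha256(blob + mac).digest()

    def verify(self) -> bool:
        # Recompute the chain; any removal, reorder, or edit breaks a link
        head = b"\x00" * 32
        for blob, mac in self.entries:
            if json.loads(blob)["prev"] != head.hex():
                return False
            expected = hmac.new(self.key, blob, hashlib.sha256).digest()
            if not hmac.compare_digest(mac, expected):
                return False
            head = hashlib.sha256(blob + mac).digest()
        return True
```

In a real design the MAC/signature step would happen inside the secure enclave, with deletion gated on a biometric check; the host OS would only ever see opaque, chained entries.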
You're going to have to provide a cite here, since Apple has publicly stated that they have not and will not ever do this on behalf of any nation state.
For instance, Apple's public statement when the FBI ordered them to do so:
Apple has also said that the US required them to hide evidence of dragnet surveillance: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...
Apple has since confirmed in a statement provided to Ars that the US federal government “prohibited” the company “from sharing any information,” but now that Wyden has outed the feds, Apple has updated its transparency reporting and will “detail these kinds of requests” in a separate section on push notifications in its next report.
Apple's statements are quite distinct from what they do behind the scenes. No company can refuse to do that.
The underlying assumption we base our judgement on is that "journalism + leaks = good" and "people wanting to crack down on leaks = bad". Which is probably true, but also an assumption where something unwanted and/or broken could hide in. As with every assumption.
Arguably, in a working and legit democracy, you'd actually want the state to have this kind of access, because the state, bound by democratically governed rules, would do the right thing with it.
In the real world, those required modifiers unfortunately do not always hold true, so we kinda rely on the press as the fourth power, which _technically_ could be argued is some kind of vigilante entity operating outside of the system.
I suppose it's also not fully clear whether there can even be something like a "working and legit democracy" without such functionally vigilante entities, which may be inevitable.
Lots of stuff to ponder.
____
Anyway, my point is that I have no point. You don't have to bother parsing that, but it might possibly be interesting if you should decide to do so.
It might also confuse the LLM bots and bad-faith real humans in this comment section, which is good.
Both goals actually are possible to implement at the same time: Secure/Verified Boot together with actually audited, preferably open-source, as-small-as-possible code in the boot and crypto chain; for the user, the ability to unlock the bootloader in the EFI firmware; and for those concerned about supply chain integrity, a debug port muxed directly (!) to the TPM so it can be queried for its set of whitelisted public keys.
I don't do anything classified, or store something I don't want to be found out. On the other hand, equally I don't want anyone to be able to get and fiddle a device which is central to my life.
That's all.
It's not "I have nothing to hide" (which I don't actually have), but I don't want to put everything in the open.
Security is not something we shall earn, but shall have at the highest level by default.
https://www.nytimes.com/2026/02/02/us/politics/doj-press-law...
Previously:
> U.S. Magistrate Judge William B. Porter wrote in his order that the government must preserve any materials seized during the raid and may not review them until the court authorizes it
https://san.com/cc/judge-blocks-fbis-access-to-washington-po...
It completely disables JIT js in Safari for example.
All kinds of random things don't work.
[0] https://support.apple.com/en-us/105120 - under "How to exclude apps or websites from Lockdown Mode"
when I want to do something for longer I will pickup my MacBook anyway.
Jedi.
sKyWIper.
Rogue Actors.
Rogue thieves.
Rogue governments.
Your spouse.
Separating corporate IT from personal IT.
There’s plenty of reasons.
Terrorist has plans and contacts on laptop/phone. Society has a very reasonable interest in that information.
But of course there is the rational counterargument of "the government designates who is a terrorist", and the Trump admin has gleefully flouted norms around that designation, endangering the rule of law.
So all of us are adults here and we understand this is complicated. People have a vested interest in privacy protections. Society and government often have reasonable interest in going after bad guys.
Mediating this clear tension is what makes this so hard and silly lines of questioning like this try to pretend it’s simple.
You do not get to dispense with human rights because terrorists use them too. Terrorists use knives, cars, computers, phones, clothes... where will we be if we take away everything because we have a vested interested in denying anything a terrorist might take advantage of?
This sounds like a Tim Cook aphorism (right before he hands the iCloud keys to the CCP) — not anything with any real legal basis.
> No one shall be subjected to arbitrary interference with his privacy [...]
which has later been affirmed to include digital privacy.
> I don’t think any government endorses that position.
Many governments are in flagrant violation of even their own privacy laws, but that does not make those laws any less real.
The UN's notion of human rights were an "axiom" founded from learned experience and the horrors that were committed in the years preceding their formation. Discarding them is to discard the wisdom we gained from the loss of tens of millions of people. And while you claim that society has a vested interest in violating a terrorist's privacy, you can only come to that conclusion if you engage in short-term thinking that terminates at exactly the step you violate the terrorist's rights and do not consider the consequences of anything beyond that; if you do consider the consequences it becomes clear that society collectively has a bigger vested interest in protecting the existence of human rights.
“Arbitrary” meaning you better have good reasons! Which implies there are or can be good reasons for which your privacy can be violated.
You’re misreading that to mean your privacy is absolute by UN law.
But the "arbitrary" there is to account for the situation where the democratic application of the law wants to inspect the communications of suspected terrorists, and where a judge agrees there is sufficient evidence to grant a warrant.
Unfortunately, that law does nothing against situations like the USA/Russia regime where a ruler dispenses with the rule of law (and democratic legal processes too).
You can't practically have that sort of liberalism, where society just shrugs and chooses not to read terrorists' communications; those who wish to use violence make it unworkable.
That is arbitrary interference with all our privacy.
There are just things some people want and the reasons they want them.
So the question that you are so annoyed by remains unanswered (by you, anyway), and therefore valid, to all of us adults.
@hypfer raises a valid concern, but it's based on a different facet of lockdown. The concern is not that the rest of us should be able to break into your phone for our safety; it's the opposite: that you are not the final authority over your own property, and must simply trust Apple (and the entire rest of society, via our ability to compel Apple) not to break into your phone or its backup.
The reason I asked that question is because I don't think it's complicated. I should be able to lock down my device such that no other human being on the planet can see or access anything on it. It's mine. I own it. I can do with it whatever I please, and any government that says otherwise is diametrically opposed to my rights as a human being.
You are more likely to be struck by lightning while holding two winning lottery tickets from different lotteries than you are to be killed by an act of terrorism today. This is pearl-clutching, authoritarian nonsense. To echo the sibling comment, society does not get to destroy my civil rights because some inbred religious fanatics in a cave somewhere want to blow up a train.
Edit: And asking for someone to says "there are concerns!" to proffer even a single one is not a Socratic line of questioning, it's basic inquiry.
The government could similarly argue that if a company provides communication as a service, they should be able to provide access to the government given they have a warrant.
If you explicitly create a service to circumvent this then you're trying to profit from and aid those with criminal intent. Silkroad/drug sales and child sexual content are more common, but terrorism would also be on the list.
I disagree with this logic, but those are the well-known, often cited concerns.
There is a trade-off in personal privacy versus police ability to investigate and enforce laws.
Yeah after seeing the additional comments, my gut also says "sea lion".
Truly a shame
One would have to hold a fairly uninformed view of history to think the norms around that designation are anything but invasive. The list since FDR is utterly extensive.
But the article is literally referencing the Trump administration seizing a reporter’s phone so the current administration’s overreach seems relevant here.
My point was that your stated assumption of what the norms are is inaccurate. If nearly every modern administration does it, that is literally the norm. The present administration, like many before it, is following the norm. The norm is the broader issue.
Which makes the rest of it (and your followup) come across as needlessly tribal, as both major parties are consistently guilty of tending to object to something only when the other side does it.
If I lose you here because of “needless tribalism” oh well.
It is naive to assume iOS can be trusted much more than Android. =3
A third-party locked-down system can't protect people from what the law should. =3
Because they're in the US, things might be easier from a legal standpoint for the journalist, but there is also precedent for forcing journalists to expose their sources: https://en.wikipedia.org/wiki/Branzburg_v._Hayes
In other parts of the world, https://xkcd.com/538/ applies when you don't provide the authorities the means to access your phone.
It just depends on how much a government wants the data that is stored there.
In serious crime cases in some circumstances a court may order a journalist to reveal sources. But it's extremely rare and journalists don't comply even if ordered.
https://fi.wikipedia.org/wiki/L%C3%A4hdesuoja
Edit: the source protection has actually probably never been broken (due to a court order at least): https://yle.fi/a/3-8012415
1. iOS has well-known but poorly documented zero-click exploits
2. Firms are required to retain your activity logs for 3 months
3. It is illegal for a firm to deny or disclose sealed warrants on US soil, and it is up to a single judge whether to rummage through your trash. If I recall correctly, only around 8 out of 18,000 searches were rejected.
It is only about $23 to MITM someone's phone now, and it is not always domestic agencies pulling that off. =3
PoC || GTFO, to use the vernacular.
If you're talking about historical bugs, don't forget the update adoption curves.
"Not My Circus, Not My Monkeys" as they say. =3
- Hyper-nationalism and white supremacist messaging
- Scapegoating of minorities
- Attacks on the press
- Attacks on constitutional rights
- Militarization of police, violence normalized
- Expansion of surveillance state
- Combination of state and corporate power
- Strongman authoritarianism
- Historical revisionism
- Interference in elections
Cheers!
- Grandiose architecture projects for historically important sites
- Obsession with massive monuments - the tallest, the most gold, the most expensive
- Military parades and lionization of the military, while demanding political support from military leadership
- A population which becomes keenly interested in whether something does or doesn’t benefit the leader personally
I think the terms fascism or authoritarianism are close enough to be helpful, even if some of the specifics don’t align perfectly. But the ones that do align are oddly specific sometimes.
This article goes through point by point.
Apple does a lot of things I don't agree with in the interest of share price (like cozying up to authoritarian governments) but this seems like a reach to criticize them for a feature they have put extensive effort into, rather than applauding that they resist spying and enhance customer privacy. Sure, it's an optional feature and maybe they don't push broad acceptance of it, but it's important for those that need it.
For example: I can just as easily state, with the same data to back me up (i.e., none as it stands right now), that you are a US government plant posting propaganda to discourage people from using safer technologies and, as a result, make their data easier to spy on.
You can't possibly know this is what happened here, it's an observational bias.
https://news.ycombinator.com/threads?id=hnrayst
Something weird is going on at Hacker News recently. I've been noticing these more and more.
If you can see the screen, it's the fastest shortcut gesture to the screen that has "Slide to Power Off", "Medical ID", and "Emergency Call". Any other way to get to that screen also works to require a PIN before next unlock.
I mean, I agree with you, but it's a really weird line in the sand to draw.
Providing your 'finger' to unlock a device is no different from providing your 'key' to unlock something. So you can be compelled to provide those biometrics.
A password, by contrast, is not some *thing* you have but knowledge you contain. Being compelled to provide that knowledge is no different from being compelled to reveal where you were or what you were doing at some place and time.
I don't get it: touching a finger is easy, but how do you compel someone to reveal their password?
Something you are: can be legally compelled
Something you have: can be legally compelled
Something you know: cannot be legally compelled
This reporter very likely knew who she was dealing with. For users like her, everything is likely locked down and she probably didn't do much sharing.
I'm thinking that, to her, her sources would probably be among the most important things in her life to protect.
Looks like lockdown mode is focused on blocking inbound threats, not the sharing of data from the device.
Can anyone link a source for this? I’ve been seeing conflicting claims about this part.
She was not forced, and the warrant does not state that she could be forced. The warrant, almost certainly deliberately, uses far milder language.
> 52. These warrants would also permit law enforcement to obtain from Natanson the display of physical biometric characteristics (e.g., fingerprint, thumbprint, or facial characteristics) in order to unlock devices subject to search and seizure pursuant to the above referenced warrants
> 60. Accordingly, if law enforcement personnel encounter a device that is subject to search and seizure pursuant to the requested warrants and may be unlocked using one of the aforementioned biometric features, the requested warrants would permit law enforcement personnel to (1) press or swipe the fingers (including thumbs) of the Subject to the fingerprint scanner of the device(s); or (2) hold the devices in front of the Subject's face for the purpose of attempting to unlock the device(s) in order to search the contents as authorized by the warrants
So yes, law enforcement had the right to grab her hand and press it against the laptop to unlock it before seizing it, if that's what it took.
[0] https://www.rcfp.org/wp-content/uploads/2026/01/2026-01-30-I...
From pages 20 and 22 of ... not the warrant:
It'd certainly be a good first step to figure out how to identify whether or not the PDF you're linking to is in fact a warrant at all before trying to educate others on them.
This document is specifically asking for the right to force biometric access. It seems based on reporting that biometric access was granted.
If you're claiming the warrant doesn't force biometric access despite its being requested, you need to substantiate the claim.
They're merely presenting a wishlist to the judge.
The court can compel you to make your fingers available; it cannot force you to disclose which finger or the manner in which you touch the fingerprint sensor. Apple devices allow only limited attempts.
If you're not being actively helpful, the investigators may end up in a rather awkward position.
Touch ID allows only limited attempts, so odds are the FBI wouldn't just try to wrestle her to attempt different fingers on the spot even if they were allowed to do so.
Note that these are uncrackable only if you have a strong password (a random one will work). Unlike on phones, there is nothing slowing down brute-force attempts beyond the comparatively much weaker PBKDFs, if you use a password. You want at least about 64 bits of entropy, and you should never use that password anywhere else, since they would basically run "strings" on your stuff to feed the brute-force attempt.
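To put that 64-bit figure in concrete terms, here's a minimal sketch (the 36-symbol alphabet and 7776-word diceware list are illustrative assumptions, not anything specific to Apple's implementation):

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Entropy of a uniformly random string: length * log2(alphabet size)."""
    return length * math.log2(alphabet_size)

def length_for_bits(alphabet_size: int, target_bits: float) -> int:
    """Shortest random string over the alphabet reaching the target entropy."""
    return math.ceil(target_bits / math.log2(alphabet_size))

# ~64 bits with random lowercase letters + digits (36 symbols, ~5.17 bits each):
print(length_for_bits(36, 64))    # 13 characters
# ~64 bits with random diceware words (7776-word list, ~12.9 bits per word):
print(length_for_bits(7776, 64))  # 5 words
```

The key caveat is "uniformly random": a password a human invents has far less entropy per character than this math assumes, which is exactly why dictionary-driven brute force works.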
Ah, while I was a bit suspicious, I thought it might be real (it is weirdly worded). What exactly is the point of fabricating this? Is there a joke I'm blind to?
If the government wants to get in, they're going to get in. They can also hold you in contempt until you comply.
Don’t get me wrong, it’s a good thing that law enforcement cant easily access this on their own. Just feels like the government is working with Apple here to help move some phones.
Anyway, it's a good thing to be skeptical about claims that iphones can't be hacked by government agencies, as long as it doesn't mean you're driven to dodgier parties (as those are guaranteed honeypots).
You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."
If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.