You can argue about whether it's actually bulletproof or not, but the fact is, nobody else is even trying; everyone else has lost sight of privacy-focused features in their rush to ship anything and everything on my device to OpenAI or Gemini.
I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
How true is this when their devices are increasingly hostile to user repair and upgrades? macOS also tightens the screws on what you can run and from where, or at least requires more hoop-jumping over time.
If you allowed third-party components without restraint, there'd be no way to prevent someone from swapping in a compromised component.
Lock-in and planned obsolescence are also factors, and ones I'm glad the EU (and others) are pushing back on. But it isn't as if there are no legitimate tradeoffs.
Regarding screw tightening... if they ever completely remove the ability to run untrusted code, yes, then I'll admit I was wrong. But I am more than happy to have devices be locked down by default. My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
First, this is 100% false. Second, security through obscurity is almost universally discouraged and considered bad practice.
Think of some common sense physical analogies: a hidden underground bunker is much less likely to be robbed than a safe full of valuables in your front yard. A bicycle buried deeply in bushes is less likely to be stolen than one locked to a bike rack.
Without obscurity it is straightforward to know exactly what resources will be required to break something- you can look for a flaw that makes it easy and/or calculate exactly what is required for enough brute force.
When you add the element of well-executed obscurity on top of an already strong system, it becomes nearly impossible to even identify that there is something to attack, or to even start to form a plan to do so.
Combining both approaches is best, but in most cases I think simple obscurity is more powerful and requires fewer resources than non-obscure, strength-based security.
I’ve managed public servers that stayed uncompromised without security updates for a decade or longer using obscurity: an archaic old Unix OS of some type that does not respond to pings or other queries, runs services on non-standard ports, and blocks routes to hosts that even attempt scanning the standard ports - such a machine simply will not be compromised. Obviously, also using a secure OS with updates on top of these techniques is better overall.
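To make the "blocks routes to hosts that even attempt scanning the standard ports" part concrete, here is a rough sketch of the idea rather than my actual setup: a couple of decoy listeners on standard ports, and any host that touches one gets null-routed. It assumes Linux with iptables and root privileges; ports and commands are illustrative only.

    # Rough sketch only (not my actual setup): decoy listeners on standard
    # ports, and any host that touches one gets null-routed.
    # Assumes Linux with iptables and root privileges; details are illustrative.
    import socket
    import subprocess
    import threading

    DECOY_PORTS = [22, 23, 80]  # "standard" ports the real services do NOT use

    def block(ip):
        # Drop all further traffic from the probing host.
        subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"], check=False)
        print("blocked", ip)

    def watch(port):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", port))
        srv.listen(5)
        while True:
            conn, (ip, _) = srv.accept()
            conn.close()
            block(ip)

    for p in DECOY_PORTS:
        threading.Thread(target=watch, args=(p,), daemon=True).start()
    threading.Event().wait()  # run until killed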
Take Intel's Management Engine, for example: it was obscured very well. It wasn't found for years. Eventually people did find it, and you can't help but wonder how long it took for bad actors with deep pockets to find it. It's this obscured cubbyhole in your CPU, but if someone could exploit it, it would be really difficult to find out because of Intel's secrecy on top of the feature.
That's literally the practical basis of security through obscurity.
> Others, like my comment above, are talking about systems carefully engineered to have no predictable or identifiable attack surfaces- things like OpenBSD's memory allocation randomization,
That's exactly the opposite of 'security through obscurity' - you're literally talking about a completely open security mitigation.
> I’ve found when it is impossible for an external bad actor to even tell what OS and services my server is running- or in some cases to even positively confirm that it really exists- they can’t really even begin to form a plan to compromise it.
If one of your mitigations is 'make the server inaccessible via the public internet', for example - that is not security through obscurity - it's a mitigation which can be publicly disclosed and remain effective for the attack vectors it protects against. I don't think you quite understand what 'security through obscurity[0]' means. 'Security through obscurity' in this case would be you running a closed third-party firewall on this server (or some other closed software, like macOS for example) which has 100 different backdoors in it - the exact opposite of actual security.
[0] https://en.wikipedia.org/wiki/Security_through_obscurity
If you're not understanding how memory allocation randomization is security through obscurity- you are not understanding what the concept entails at the core. It does share a common method with, e.g., using a closed 3rd party firewall: in both cases direct flaws exist that could be overcome with methods other than brute force, yet identifying and pinning them down precisely enough to actually exploit them is non-trivial.
The flaw in your firewall example is not using obscurity itself, but: (1) not also using traditional methods of hardening on top of it - obscurity should be an extra layer, not the only layer, and (2) it's probably not really very obscure, e.g. if an external person could infer what software you are using by interacting remotely, and then obtain their own commercial copy to investigate for flaws.
Specific example of where I did this?
> literally gives the same examples to two of the main ones I mentioned at the very top of the article as key examples of security through obscurity: "Examples of this practice include disguising sensitive information within commonplace items, like a piece of paper in a book, or altering digital footprints, such as spoofing a web browser's version number"
I mean, I don't disagree that what you said about changing port numbers, for example, is security through obscurity. My point is that this is not any kind of defense from a capable and motivated attacker. Other examples like the OpenBSD mitigation you mentioned are very obviously not security through obscurity though.
> If you're not understanding how memory allocation randomization is security through obscurity- you are not understanding what the concept entails at the core.
No, you still don't understand what 'security through obscurity' means. If I use an open asymmetric key algorithm, the fact that I can't guess a private key does not make it 'security through obscurity'; it's the obscuring of the actual crypto algorithm that would make it 'security through obscurity'. Completely open security mitigations like the one you mentioned have nothing to do with security through obscurity.
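To make that concrete, a minimal sketch (assuming the third-party Python "cryptography" package): the algorithm and curve are entirely public, and the only secret is the private key - no obscurity of the design anywhere.

    # Minimal illustration, assuming the third-party "cryptography" package.
    # The algorithm and curve are public; only the private key is secret.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    private_key = ec.generate_private_key(ec.SECP256R1())  # the only secret
    public_key = private_key.public_key()                  # freely shareable

    message = b"attack at dawn"
    signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

    # Anyone can verify; raises InvalidSignature if the message was tampered with.
    public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("signature verified")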
> The flaw in your firewall example is not using obscurity itself, but: (1) not also using traditional methods of hardening on top of it
Sooo... you think adding more obscurity on top of a closed, insecure piece of software is going to make it secure?
> if an external person could infer what software you are using by interacting remotely,
There are soooo many ways for a capable and motivated attacker to figure out what software you're running. Trying to obscure that fact is not any kind of security mitigation whatsoever. Especially when you're dealing with completely closed software/hardware - all of your attempts at concealment are mostly moot - you have no idea what kind of signatures/signals that closed system exposes, you have no idea what backdoors exist, you have no idea what kind of vulnerable dependencies it has that expose their own signatures and have their own backdoors. Your suggestion is really laughable.
> not also using traditional methods of hardening on top of it
What 'traditional methods' do you use to 'harden' closed software/hardware? You literally have no idea what security holes and backdoors exist.
> if an external person could infer what software you are using by interacting remotely, and then obtain their own commercial copy to investigate for flaws.
Uhh yeah, now you're literally bringing up one of the most common arguments for why security through obscurity is bullshit. During WW1/WW2, security through obscurity was common in crypto - they relied on hiding their crypto algos instead of designing ones that would be secure even when publicly known. What happened is that enough messages, crypto machines, etc. were recovered by the other side to reverse-engineer these obscured algos and break them - since then crypto has pretty much entirely moved away from security through obscurity.
If there are advantages to a closed-source system, it is not in situations where the source is closed to you and contains bugs, but when it is closed to the attacker. If you have the resources and ability to, for example, develop your own internally used but externally unknown, yet still heavily audited and cryptographically secure system, it is going to be better than an open source tool.
Ok, let's start with a 'mathematically secure heavily public audited system' - let's take ECDSA, for example - how will you use obscurity to improve security?
> If you have the resources and ability to, for example, develop your own internally used but externally unknown, yet still heavily audited and cryptographically secure system, it is going to be better than an open source tool.
Literally all of the evidence we have throughout the history of the planet says you're 100% wrong.
You are so sure you’re right that you are not really thinking about what I am saying, and how it applies to real-world situations - especially things like real-life, high-stakes, life-or-death situations.
I am satisfied that your perspective makes the most sense for low stakes broad deployments like software releases, but not for one off high stakes systems.
For things like ECDSA, like anything else you implement obscurity on a one off basis tailored to the specific use case- know your opponent and make them think you are using an entirely different method and protocol that they’ve already figured out and compromised. Hide the actual channel of communication so they are unable to notice it exists, and over that you simply use ECDSA properly.
Oh, and store your real private key in the geometric design of a giant mural in your living room, while your house and computers are littered with thousands of wrong private keys on ancient media that is expensive to extract. Subscribe to and own every key wallet product or device, but actually use none of them.
Nah, you're just saying a lot of stuff that's factually incorrect and just terrible advice overall. You lack understanding of what you're talking about. And the stakes are pretty irrelevant to whether a system is secure or not.
> For things like ECDSA, like anything else you implement obscurity on a one off basis tailored to the specific use case- know your opponent and make them think you are using an entirely different method and protocol that they’ve already figured out and compromised.
You're going to make ECDSA more secure by making people think you're not using ECDSA? That makes so little sense in so many ways. Ahahahahaha.
If you say so.
> Think of some common sense physical analogies: a hidden underground bunker is much less likely to be robbed than a safe full of valuables in your front yard. A bicycle buried deeply in bushes is less likely to be stolen than one locked to a bike rack.
That's not what security through obscurity is. If you want to make an honest comparison - what is more likely to be secure: an open system built on the latest and most secure public standards, or a closed system built on (unknown)? The open system is going to be more secure 99.999% of the time.
> Without obscurity it is straightforward to know exactly what resources will be required to break something- you can look for a flaw that makes it easy and/or calculate exactly what is required for enough brute force.
The whole point of not relying on obscurity is that you design an actually secure system even assuming the attacker has a full understanding of your system. That is how virtually all modern crypto that's actually secure works. Knowing your system is insecure and trying to hide that via obscurity is not security.
> it becomes nearly impossible to even identify that there is something to attack
That's called wishful thinking. You're conflating 'system that nobody knows about or wants to attack' with 'system that someone actually wants to attack and is defending via obscurity of its design'. If you want to make an honest comparison you have to assume the attacker knows about the system and has some motive for attacking it.
> but in most cases I think simple obscurity is more powerful and requires fewer resources than non-obscure, strength-based security.
Except obscurity doesn't actually give you any security.
> I’ve managed public servers that stayed uncompromised without security updates for a decade or longer using obscurity: an archaic old Unix OS of some type that does not respond to pings or other queries, runs services on non-standard ports, and blocks routes to hosts that even attempt scanning the standard ports - such a machine simply will not be compromised.
That's a laughably weak level of security and does approximately zero against a capable and motivated attacker. Also, your claim of 'stayed uncompromised' is seemingly based on nothing.
Instead of simply labeling something you seem to not like as "laughably weak" - as in your last example - do you have any specific reasoning? Again, I'd like to emphasize that I don't advocate obscurity in place of other methods, but on top of additional methods.
Let's try some silly extreme examples of obscurity. Say I put up a server running OpenBSD (because it is less popular) - obviously a recent version with all security updates - and it has only one open port, SSH, reconfigured to run on port 64234, and attempting to scan any other port immediately and permanently drops the route to your IP. The machine does not respond to pings, and does other weird things like only being physically connected for 10 minutes a day at seemingly random times only known by the users, with a new IP address each time that is never reused. On top of that, the code and all commands of the entire OS have been secretly translated into a dead ancient language so that even with root it would take a long time to figure out how to work anything. It is a custom secret hacked fork of SSH only used in this one spot that cannot be externally identified as SSH at all, and exhibits no timing or other similar behaviors to identify the OS or implementation. How exactly are you going to remotely figure out that this is OpenBSD and SSH, so you can then start to look for a flaw to exploit?
If you take the alternate model, and just install a mainstream open source OS and stay on top of all security updates the best you can, all a potential hacker needs to do is quickly exploit a newly disclosed flaw before you actually get the update installed, or review the code to find a new one.
Is it easier to rob a high security vault in a commercial bank on a major public street, or a high security vault buried in the sand on a remote island, where only one person alive knows its location?
'without security updates for a decade or longer' - do I really need to go into detail on why this is hilariously terrible security?
'runs services on non-standard ports,' - ok, _maybe_ you mitigated some low-effort automated scans, does not address service signatures at all, the most basic nmap service detection scan bypasses this already.
'blocks routes to hosts that even attempt scanning the standard ports ' - what is 'attempt scanning the standard ports' and how are you detecting that- is it impossible for me to scan your server from multiple boxes? (No, it's not, it's trivially easy.)
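For reference, this is the kind of scan I mean - a hedged sketch, assuming nmap is installed and wrapped from Python; the target address is a placeholder. -p- probes every port, -sV fingerprints whatever answers, and -Pn skips the ping check your setup suppresses.

    # Sketch of a basic service-detection scan (assumes nmap is installed).
    # -Pn: don't rely on ping; -p-: all 65535 ports; -sV: fingerprint services,
    # so sshd on port 64234 still identifies itself as sshd.
    import subprocess

    target = "203.0.113.10"  # placeholder address
    subprocess.run(["nmap", "-Pn", "-sV", "-p-", target], check=False)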
> Say I put up a server running OpenBSD (because it is less popular) - obviously a recent version with all security updates - and it has only one open port, SSH,
Ok, so already far more secure than what you said in your previous comment.
> only being physically connected for 10 minutes a day at seemingly random times only known by the users
Ok, so we're dealing with a server/service which is vastly different in its operation from almost any real-world server.
> only known by the users, with a new IP address each time that is never reused
Now you have to explain how you force a unique IP every time, and how users know about it.
> On top of that, the code and all commands of the entire OS have been secretly translated into a dead ancient language so that even with root it would take a long time to figure out how to work anything
Ok, so completely unrealistic BS.
> It is a custom secret hacked fork of SSH only used in this one spot that cannot be externally identified as SSH at all
It can't be identified, because you waved a magic wand and made it so?
> and exhibits no timing or other similar behaviors to identify the OS or implementation
Let's wave that wand again.
> How exactly are you going to remotely figure out that this is OpenBSD and SSH, so you can then start to look for a flaw to exploit?
Many ways. But let me use your magic wand and give you a much better/secure scenario - 'A server which runs fully secure software with no vulnerabilities or security holes whatsoever.' - Makes about as much sense as your example.
> Is it easier to rob a high security vault in a commercial bank on a major public street, or a high security vault buried in the sand on a remote island, where only one person alive knows its location?
The answer comes down to what 'high security' actually means in each situation. You don't seem to get it.
More pragmatic advice would be to not rely solely on security through obscurity, but rather to practice defence in depth.
Widely deployed doesn't mean it's a positive action, and effective? It just can't be, because it isn't security at all. People really need to pay more attention to these things, or else we DO get nonsense rolled out as "effective".
This is stupid advice that is mindlessly repeated. Security by obscurity only is bad, sure. Adding obscurity to other layers of security is good.
Edit: formatting
It works via your keychain and your contacts, and the recipient gets a little notification to allow you to view their screen.
That’s it - no downloads, no login, no 20 minutes getting a Remote Desktop screen share set up.
Also, it only seems to work on a local network with hostnames.
It 100% works across the internet: it works with contact names, not just host names.
The answer is no, fyi
After you open it, press “Connections -> New” and start typing a contact name.
They get a little notification if they are online, and if they accept you have a seamless screen sharing experience ready to go. It’s honestly magic for the “my parents have an error message and don’t know what to do” situation.
I assume they have you in their contacts as well for it to work.
Glad to see your parents are tech savvy, but this reads like you live in a very different reality from mine.
Your parents don’t need to run the app at all. You need to run the app.
Unless you really do live in a different reality where you want your parents to help you out when you see an error message you don’t understand?
Some of us are old enough to remember the era of the officially authorised Apple clones in the 90's.
Some of us worked in hardware repair roles at the time.
Some of us remember the sort of shit the third-party vendors used to sell as clones.
Some of us were very happy the day Apple called time on the authorised clone industry.
The tight integration between Apple's OS and Apple's hardware is a big part of what makes their platform so good. I'm not saying perfect. I'm just saying that if you look at it honestly, as someone who's used their kit alongside PCs for many decades, you can see the difference.
Some of us might even be of the opinion that such a competition was beneficial to consumers that wanted more.
Yeah, but this is hacker news.
Apple gobbling up supply chains and production capacity is not something a hardware startup should be happy with.
Also, startup engineers don't necessarily like "alien technology", which is what Apple is becoming by developing everything behind closed doors and with little cooperation.
Startups don't like to pay 10-30% of their revenue just for running their software on a device someone has already paid for.
There are more reasons to dislike Apple than you can find on Slashdot.
You start with a very good and fair point.
> Also, startup engineers don't necessarily like "alien technology", which is what Apple is becoming by developing everything behind closed doors and with little cooperation.
> Startups don't like to pay 10-30% of their revenue just for running their software on a device someone has already paid for.
You ended with one that is pretty much at odds with my experience. Startups value distribution channels, and the App Store has been fantastic for this.
Furthermore, it's one thing to dislike Apple. It's another thing for people to veer off into conspiracy theory, which is what half the threads on here have started to do.
This is what the vast majority of discourse on these topics has been dominated by on every platform, not just HN. I wonder if there's a shorter term for this; none of 4chan /g/'s crass terms cover this kind of depiction.
Competition-enabled rationalism: https://en.wikipedia.org/wiki/Xserve
You can install whatever OS you want on your computer - Asahi Linux is the only one that's done the work to support that.
You can disable the system lockdowns that "tighten the screws" you refer to and unlock most things back to how they used to be.
But very distinctly, not all. Apple deliberately makes customers buy more than what they need while refusing to sell board-level ICs or allow donor boards to be disassembled for parts. If a $0.03 Texas Instruments voltage controller melts on your Macbook, you have to buy and replace the whole $600 board if you want it working again. In Apple's eyes, third party repairs simply aren't viable and the waste is justified because it's "technically" repaired.
> You can install whatever OS you want on your computer
Just not your iPhone, iPad or Apple Watch. Because that would simply be a bridge too far - allowing real competition in a walled garden? Unheard of.
> You can disable the system lockdowns that "tighten the screws" you refer to and unlock most things back to how they used to be.
And watch as they break after regular system upgrades that force API regressions and new, unjustified restrictions on your OS. Most importantly, none of this is a real option on Apple's business-critical products.
We're clearly talking about Macs for the software parts so I'm not sure why you're bringing in iPhone/iPad/Apple Watch where the status quo has remained unchanged since they were introduced. I'd love those to be opened up but that's another conversation.
Regarding system restrictions on macOS (putting aside the fact that it fully supports other operating systems on Apple hardware), the ability to disable the system restrictions hasn't changed for years. System Integrity Protection is still a toggle that most users never need to touch.
> Most importantly, none of this is a real option on Apple's business-critical products.
I don't understand what this means?
Not sure what you mean exactly by this, but to me their Self Service Repair program is a step in the right direction.
They could go out of their way to make things actually easy to work on and service, but that has never been the Apple Way. Compare to Framework, or building your own PC, or even repairing a laptop from another OEM.
Apple taking your data privacy seriously seems a worthy exception to me. You're free to disagree, and buy an Android.
Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want. Maybe I'm not the average user, but I use mostly open-source Unix tooling and have never had a problem with permissions or restrictions.
Are you talking about packaged applications that are made available on the App Store? If so, sure, have rules to make sure the store is high-quality, kinda like how Costco doesn't let just anyone put garbage on their shelves.
Try sharing a binary that you built but didn't sign and Notarize and you'll see the problem.
It'll run on the machine that it was built on without a problem, the problems start when you move the binary to another machine.
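For anyone curious, the signing-plus-notarization dance looks roughly like this - a hedged sketch wrapped in Python; the identity, team ID, file names, and "notary" keychain profile are placeholders, and it assumes the Xcode command line tools are installed.

    # Hedged sketch of signing and notarizing a self-distributed binary.
    # Identity, team ID, file names and the "notary" keychain profile are
    # placeholders; assumes the Xcode command line tools are installed.
    import subprocess

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    binary = "./mytool"                                               # placeholder
    identity = "Developer ID Application: Example Corp (ABCDE12345)"  # placeholder

    # 1. Sign with a Developer ID certificate and the hardened runtime.
    run(["codesign", "--force", "--options", "runtime", "--timestamp",
         "--sign", identity, binary])

    # 2. Zip it and submit to Apple's notarization service, waiting for the verdict.
    run(["ditto", "-c", "-k", "--keepParent", binary, "mytool.zip"])
    run(["xcrun", "notarytool", "submit", "mytool.zip",
         "--keychain-profile", "notary", "--wait"])

    # 3. App bundles, disk images and packages can also have the ticket stapled
    #    (xcrun stapler staple ...); bare executables can't, so Gatekeeper looks
    #    the ticket up online the first time the binary runs on another machine.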
https://git.sudo.is/mirrors/AsahiLinux-docs/wiki/M3-Series-F...
I tested Asahi and I genuinely love it, and I'll probably be happy to use it as my daily driver as soon as it's mature enough. And I'm impressed by how well it works (outside of what still doesn't work at all).
But buying an ARM Mac hoping to run Linux on it today without issues is just the wrong move. Just buy a classic PC if you want to be productive on Linux today.
I’m pretty confident it will happen, though, since the team itself looks pretty confident about supporting what is currently missing and has, in the past, achieved more than I hoped.
edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
"You need some cloud-based identity, and this is the best one," even granting its premises, doesn't make being forced into this one a good thing. I'm an Apple user, but there are plenty of people I need to message and share files with who aren't in the Apple ecosystem.
EDIT: As indicated in the reply (written before I added this edit), it sounds like I was ignoring the first part of the post, which pointed out that you aren't forced to use it. I agree that that is a sensible, and even natural and inevitable, reading. I actually wasn't ignoring that part, but I figured the only reason to include this edit was to say "that isn't true, but if it were true, then it would be OK." (Otherwise, what's the point? There's no more complete refutation needed of a false point than that it is false.) My argument is that, if it were true, then that wouldn't be OK, even if you need a cloud-based identity, and even if iCloud is the best one.
But you're not forced. You completely ignored the other response in order to continue grinding an axe.
Same with server parts using HBM: they won't let me upgrade memory there either.
That said, the Apple SSD situation is abysmal. At least with memory they have reasons.
I can neither repair nor upgrade my electric car, my furniture, or my plumbing. But they all still belong to me.
You either have very low standards or very low understanding if you think a completely closed OS on top of completely closed hardware somehow means it 'really belongs' to you, or that your data/privacy is actually being respected.
I disagree with OP celebrating Apple to be the least evil of the evils. Yes, there are not many (if any) alternatives, but that doesn't make Apple great. It's just less shitty.
I can’t even find how to open the Applications view in Finder 9/10 times on a Mac.
Apple isn’t perfect. They’re not better at privacy than some absolutist position where you run Tails on RISC V, only connect to services over Tor, host your own email, and run your own NAS.
But of all the consumer focused hardware manufacturers and cloud services companies, they are the only ones even trying.
If you're using the web, your privacy is about your browser and your ISP, not your OS.
At times, it's even about how you use your browser. No browser will save you from telling Google too much about yourself by using Gmail, viewing YouTube videos, and using search. The AIs and algorithms collating all that information on the backend see right through "incognito" mode.
Telling people they can get security and privacy by using Linux, or windows, or mac just betrays a fundamental misunderstanding of the threat surface.
Apple takes a 30% tax on all applications running on their mobile devices. Just let that sink in. We are so incredibly lucky that never happened to PC.
Another big selling point of Apple is the hardware. Their hardware and software are integrated so seamlessly. Things just work, and they work well. 99% of the time, anyway - there are always edge cases.
There are solutions for running Linux distros on some Apple hardware, but again you have to make sacrifices.
I keep around a Linux laptop and it's improved immensely in the past several years, but the experience still has rough edges to smooth out.
Uhh, this is just untrue. I have it running on three different laptops from different vendors, and Fedora, Pop!_OS, and Ubuntu were all pretty much drop-in replacements for Windows, no problems.
You "keep around a Linux laptop" but I daily drive them and it's fine. Sure, there's the odd compatibility problem which could be dealbreaking, but it's not like MacOS is superior in that regard.
macOS has its own oddities of course, but they don't impede such basic usage as video playback.
That's just security theater. As long as nobody can look inside their ICs, nobody knows what's really happening there.
> "Today we’re making these resources publicly available to invite all security and privacy researchers – or anyone with interest and a technical curiosity – to learn more about PCC and perform their own independent verification of our claims."
https://security.apple.com/documentation/private-cloud-compu...
There are also a million dollars of bounties to be had if you hack it
> Apple oversells its differential privacy protections. "Apple’s privacy loss parameters exceed the levels typically considered acceptable by the differential privacy research community," says USC professor Aleksandra Korolova, a former Google research scientist who worked on Google's own implementation of differential privacy until 2014. She says the dialing down of Apple's privacy protections in iOS in particular represents an "immense increase in risk" compared to the uses most researchers in the field would recommend.
https://www.wired.com/story/apple-differential-privacy-short...
Looking at a bunch of PCBs doesn't tell you much.
Assuming they go through with that, this alone puts them leagues ahead of any other cloud service.
It also means that to mine your data the way everyone else does, they would need to deliberately insert _hardware_ backdoors into their own systems, which seems a bit too difficult to keep secret and a bit too damning a scandal should it be discovered...
Occam's razor here is that they're genuinely trying to use real security as a competitive differentiator.
Among all the big tech companies, Apple is the closest you will get if you want privacy.
Privacy puts user interests first. Apple doesn't.
Try exporting your private data (e.g. photos) from any modern Apple device (one that you paid for and fully own) to a non-Apple device that is an industry standard, like a USB stick or another laptop. Monitor some network traffic going out from your laptop. Try getting replacement parts for your broken iDevice.
Others aren't pretending to put your interests first, Apple though...
Think for yourself.
I don't comment here often anymore. Don't bother.
Build a desktop PC, yes like a nerdy gamer. ^_^
Install Linux
Been the way for years.
What I meant is that it's been more popular for gamers to build their own PC due to price and customization.
Or for laptops, Thinkpad and Linux :)
What's most dangerous is that they own the closed hardware and they own the closed software, and then they also get away with being "privacy champions". It's worse than irony.
It's only 'bulletproof' in PR and ad copy, because as long as the US is capable of undermining any tech company that operates within its purview with NSLs, the 'perception of security' is a total fallacy.
In other words, the technology is not bulletproof, no matter how hard the marketing people work to make it appear so - only the society within which the provider operates can provide that safety.
For some, this is an intolerable state of affairs - for others, perfectly tolerable.
Let's see if they really care so much about privacy in 10 years, once LLM/AI has settled. But they do seem to respect it a lot more than Microsoft.
Every time you launch an app, macOS dials home.
Before you reply that it’s definitely true, I encourage you to actually look up the details of the thing you think you’re upset about.
There is no evidence at all that they are trying to ensure you can only run things from the App Store - I run a whole bunch of non-app-store binaries every single day. To make that claim is baseless and makes me de-rate the rest of what you write.
There is always a trade-off between privacy and security. This still falls well under the Google/Android/Chrome level, or indeed the Microsoft/Windows level with its targeted ads, IMHO.
Choose your poison, but this works for me.
My understanding is that they keep a local file with known malware signatures, just like the malware scanners on every other platform.
> macOS includes built-in antivirus technology called XProtect for the signature-based detection and removal of malware. The system uses YARA signatures, a tool used to conduct signature-based detection of malware, which Apple updates regularly
https://support.apple.com/guide/security/protecting-against-...
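For a rough idea of what signature-based detection looks like, here is a toy sketch using the third-party yara-python package; the rule and the marker string are invented for illustration, not real XProtect signatures.

    # Toy YARA example (assumes the third-party yara-python package).
    # The rule and marker string are made up, not real XProtect signatures.
    import yara

    RULE = r"""
    rule Example_Malware_Marker
    {
        strings:
            $a = "totally-not-a-malware-marker"
        condition:
            $a
    }
    """

    rules = yara.compile(source=RULE)
    matches = rules.match(data=b"...totally-not-a-malware-marker...")
    print(matches)  # -> [Example_Malware_Marker]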
The phone home functionality is notarization, where apple does a network call to check that the signature on an executable actually came from apple’s notarization process. It is in essence a reputation system, where developers must be on good terms with apple to have the ability to notarize and get a smooth install experience.
From what I understand, notarization is only done on the developer's side before publishing. Client-side, it's just a check against Apple certificates to verify that the binary hasn't been tampered with since notarization; no phoning home should be involved (or maybe just to update Apple's certificates).
https://eclecticlight.co/2023/03/09/how-does-ventura-check-a...
They also check the developer certificate in the OCSP stage.
Both of these are mechanisms where apple can effectively lock out developers from having a smooth install experience for their software at their discretion.
1. Most users are not capable of using general purpose computing technology in a wild, networked environment safely.
2. Too many people who matter to ignore insist, "something must be done."
3. And so something shall be done.
4. Apple is navigating difficult waters. As much as I disapprove of how they have chosen a path for iOS, the fact is many people find those choices are high value.
5. I do, for the most part, approve of their choices for Mac OS. I am not sure how they prevent malicious code without maintaining some sort of information for that purpose.
6. We are arriving at a crossroads many of us have been talking about for a long time. And that means we will have to make some hard choices going forward. And how we all navigate this will impact others in the future for a long time.
Look at Microsoft! They are collecting everything! And they absolutely will work with law enforcement anytime, any day, almost any way!
I sure as hell want nothing to do with Windows 11. Most technical people I know feel the same way.
Screenies every 3 to 5 seconds? Are they high? Good grief! Almost feels like raw rape. Metaphorically, of course.
Then we have Linux. Boy am I glad I took the time way back in the 90's to learn about OSS, Stallman, read words from interesting people, Raymond, Perkins, Searles, Lessig, Doctorow, many others!
Linus did all of tech one hell of a solid and here we are able to literally dumpster dive and build whatever we want just because we can. Awesome sauce in a jar right there
, but!
(And this really matters)
...Linux just is not going to be the general answer for ordinary people. At least not yet. Maybe it will be soon.
It is an answer in the form of a crude check and balance against those in power. Remember the "something shall be done" people? Yeah, those guys.
And here we are back to Apple.
Now, given the context I put here, Apple has ended up really important. Working professionals stand something of a chance choosing Mac OS rather than be forced into Windows 11, transparent edition!
And Apple does not appear willing to work against their users best interests, unless they are both compelled to by law, and have lost important challenges to said law.
If you want that, your choices are Apple and Linux!
7. Open, general purpose computing is under threat. Just watch what happens with Arm PC devices and the locked bootloaders to follow just like mobile devices.
Strangely, I find myself wanting to build a really nice Intel PC while I still can do that and actually own it and stand some basic chance of knowing most of what it doing for me. Or TO ME.
No Joke!
As I move off Win 10, it will be onto Linux and Mac OS. Yeah, hardware costs a bit more, and yeah it needs to be further reverse engineered for Linux to run on it too, but Apple does not appear to get in the way of all that. They also do not need to help and generally don't. Otherwise, the Linux work is getting done by great people we all really should recognize and be thankful for.
That dynamic is OK with me too. It is a sort of harsh mutual respect. Apple gets to be Apple and we all get to be who we are and do what we all do with general purpose computers as originally envisioned long ago.
We all can live pretty easily with that.
So, onward we go! This interesting time will prove to be more dangerous than it needs to be.
If it were not for Apple carving out a clear alternative things would look considerably more draconian, I could and maybe almost should say fascist and to me completely unacceptable.
Apple is priced beyond the reach of many "ordinary people", especially outside the western markets. A cheap (perhaps after-market) laptop with Ubuntu on it (often installed by the seller) is something that has been getting a lot of traction among regular users. Most of the things they do are via a browser, so as long as Chrome/FF works, they're good. They often install software that undermines the security that the platform natively offers, but still, it's a pretty decent compromise.
>Linux just is not going to be the general answer for ordinary people.
If so, I hear you. A decade or more ago, I had Ubuntu running as a general-use machine for family and friends to use.
It seemed almost there back then, and I saw some success.
Today it would be better, yes? I think so
Fact is, it often takes someone doing support to have it work well, and when that is gone, the software slips behind, leaving users to find help elsewhere.
Today, the numbers are much better. That happens less, but still does happen.
Your point on browser apps is solid. I agree, but those come with their own problems.
I see the most success when I set one up, including Void Tools, many visits to FossHUB...
When done, no network needed and one has a GREAT machine, ready for many tasks!
Both ways have merit and the more the merrier!
Your news bolsters the "soon" in my comment above.
I am quite happy to be proven wrong.
Used, great condition M1 Airs go for ~$450 around here and will last longer than anything Intel or AMD-based for that price, whether new or used.
You know I decided to take my old note 8 for a test drive as a PC of sorts. Went ahead and purchased one of those USB 3 port bricks so I could hook up a nice display, keyboard, mouse, removable storage.
Samsung Dex popped up and it works mostly!
I found one could do quite a lot.
That is not the way I would go, but if I had to? Bring it! Plenty can be done, good skills learned.
Fact is, large numbers of people will just end up on Windows 11 :(
If you are in the US, you need to either register as a developer, or register an Apple ID and register your app to run it for a week. That's how you run non-App Store code. Both of those require permission from Apple.
EDIT: Sorry, iOS.
I tried installing "Flameshot" via homebrew and it wouldn't run until I went into Finder, right clicked it and clicked open. Luckily it's mentioned in their docs [0] or I would have never guessed to do this.
[0] https://flameshot.org/docs/installation/installation-osx/
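For what it's worth, the right-click-and-Open dance works around the com.apple.quarantine extended attribute that downloads (and some Homebrew casks) get tagged with. A hedged sketch of inspecting and clearing it - the path is an example, and removing the attribute means opting that app out of the Gatekeeper check:

    # Inspect and clear the quarantine attribute (the path is an example).
    # Removing it opts this app out of the Gatekeeper prompt, so only do it
    # for software you trust.
    import subprocess

    app = "/Applications/flameshot.app"  # example path

    subprocess.run(["xattr", "-l", app])                           # list attributes
    subprocess.run(["xattr", "-dr", "com.apple.quarantine", app])  # remove recursively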
I also notice two other installation options in your link that do not come with those additional instructions - which to me suggests the issue lies with whatever they're doing on Homebrew.
If I were you, I would relax. At least you are not being shoved onto Win 11.
And then think about that. Seriously. I did. Have a few times off and on over the years as we sink into this mess.
I bet you find an OS that does a bit more than you may otherwise prefer to prevent trouble. If so, fair call in my book.
Just how big of a deal is that?
Compared to Android, Windows 10 and tons of network services and such and what they do not do FOR you, and instead do TO you.
And you can run a respectable and useful installation of Linux on that spiffy Apple hardware when it gets old. So make sure it gets old, know what I mean?
It could all be way worse.
As someone that just got out of a gig where I had to run Docker on MacOS - for the love of god, I would have done almost anything to use Windows 11.
Look - if I'm going to be treated like garbage, advertised to and patronized, at least let me use the system that can run Linux shells without turning into a nuclear reactor.
If I did not love computing, I would have bagged on all this long ago.
There are sets of deep roots in play here.
Phrasing struggles are rooted in the differences in these systems, and unless we have spent time in each, struggle seems likely.
That said, I spent time on the Apple side of the computing house early on... I know it helps.
I happen to be in the midst of a repair with Apple right now. And for me, the idea that they might replace my aging phone with a newer unit, is a big plus. As I think it would be for almost everyone. Aside from the occasional sticker, I don't have any custom hardware mods to my phone or laptop, and nor do 99.99% of people.
Can Apple please every single tech nerd 100% of the time? No. Those people should stick to Linux, so that they can have a terrible usability experience ALL the time, but feel more "in control," or something.
There was a time when Apple’s hardware was user-serviceable; I fondly remember my 2006 MacBook, with easily-upgradable RAM and storage. I also remember a time when Mac OS X didn’t have notarization and when the App Store didn’t exist. I would gladly use a patched version of Snow Leopard or even Tiger running on my Framework 13 if this were an option and if a modern web browser were available.
I run NixOS on a plain X11 environment with a browser, an editor and a terminal. It's really boring. For my favorite development stacks, everything works. Flakes make workflow easy to reproduce, and it's also easy to make dramatic setup changes at OS level thanks to declarativeness and immutability.
You may be smart enough to figure it out, but most people (even many smart tech people) get tired of these constant battles.
Here's an example from earlier this evening: I was buying a plane ticket from Japan Air Lines. Chrome automagically translates their website from Japanese to English. Other browsers, e.g. Firefox, and even Safari, do not - I checked. Is there a workaround or a fix? I'm sure you could find one, given time and effort. But who wants to constantly deal with these hassles?
Another very common example is communication apps. Or any time you're exchanging data in some proprietary format. Would it be great if no one used proprietary formats? Yes! Is that the world we live in? No. Can I force the rest of the world to adopt open standards, by refusing to communicate with them? No.
I would argue people are being tugged in that direction more than it being simply better.
You can bet when people start to get to work building things - all sorts of things, not just software - they find out pretty quickly just how important a simple desktop running on a general purpose computer really is!
And most laptops at this point have removable/exchangeable storage. Except for Apple.
Apple has full-disk encryption backed by the Secure Enclave, so it's not bypassable.
Sure, their standard question set asks you for your password when you submit it for repair.
But you don't have to give it to them. They will happily repair your machine without it because they can boot their hardware-test suite off an external device.
In particular, if a flaw were revealed in the Secure Enclave or the encryption, it would be too late to act on it after the machines have been sent in for years.
To be clear, I'm reacting on the "Apple is privacy focused" part. I wouldn't care if they snoop my bank statements on disk, but as a system I see them as behind what other players are doing in the market.
I hear the point you're making and I respect the angle, it's fair enough, but ...
The trouble with venturing into what-if territory is the same applies to you...
What if the disk you took out was subjected to an evil-maid attack ?
What if the crypto implementation used on the disk you took out was poor ?
What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years ?
The trouble with IT security is you have to trust someone and something, because even with open source you're never going to sit and read the code (of the program AND its dependency tree), and even with open hardware you still need to trust all those parts you bought that were made in China, unless you're planning to open your own chip fab and motherboard plant?
It's the same with Let's Encrypt certs; every man and his dog are happy to use them these days. But there's still a lot of underlying trust going on there, no?
So all things considered, if you did a risk-assessment, being able to trust Apple ? Most people would say that's a reasonable assumption ?
You don't have to. The fact that it's possible for you to do so, and the fact that there are many other people in the open source community able to do so and share their findings, already makes it much more trust-worthy than any closed apple product.
Back when I was new to all of this, the idea of people evaluating their computing environment seemed crazy!
Who does that?
Almost nobody by percentage, but making sure any of us CAN is where the real value is.
It was caught by sheer luck and chance, at the last minute - the project explicitly didn't have a bunch of eyeballs looking at it and providing a crowd-sourced verification of what it does.
I am all for open source - everything I produce through my company to make client work easier is open, and I've contributed to dozens of third party packages.
But let's not pretend that it's a magical wand which fixes all issues related to software development - open source means anyone could audit the code. Not that anyone necessarily does.
Well, have fun with my encrypted data. Then I get my laptop back, and it's either a) running the unmodified, signed and encrypted system I set before or b) obviously tampered with to a comical degree.
> What if the crypto implementation used on the disk you took out was poor ?
I feel like that is 100x more likely to be a concern when you can't control disc cryptography in any meaningful way. The same question applies to literally all encryption schemes ever made, and if feds blow a zero day to crack my laptop that's a victory through attrition in anyone's book.
> What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years ?
What if aliens did it?
Openness is a response to a desire for accountability, not perfect security (because that's foolish to expect from anyone, Apple or otherwise). People promote Linux and BSD-like models not because they cherry-pick every exploit like Microsoft and Apple do, but because anyone planting a deliberate backdoor must accept that it is being submitted to a hostile environment. Small patches will be scrutinized line-by-line; large patches will be delayed until they are tested and verified by maintainers. Maybe my trust is misplaced in the maintainers, but no serious exploit developer is foolish enough to assume they'll never be found. They are publishing themselves to the world, irrevocably.
Framework has demonstrated in more than one way that Apple's soldered/glued-in hardware strategy is not necessary.
Any claims about security of apple hardware or software are meaningless. If you actually need a secure device, apple is not an option.
I don't think this is precise, but the constraints seem a bit vague to me. What do you consider to be in the list of secure devices?
The fact that Apple refuses to let users bring their own keys, choose their disc encryption, and verify that they are secure makes their platforms no more "safe" than Bitlocker, in a relative sense.
Earlier, you mention people defending Apple security in a relative sense.
Later, you mention that Apple refusing to let users verify its security makes them no safer, in a relative sense.
Are you just talking about Apple employing security by obscurity?
I just want to understand your point better, or confirm my take is reasonable.
And for anyone reading, for the record I suppose, I do not consider much of anything secure right now. And yes, there are degrees. Fair enough.
I take steps in my own life to manage risk and keep that which needs to be really secure and or private off electronics or at the least off networks.
I suppose so they can do a boot test post-repair or something like that. I have only used their repair process like twice in my life and both times I've just automatically said "no" and didn't bother asking the question. :)
With Apple FDE, you get nowhere without the password. The boot process doesn't pass go. Which catches people out when they reboot a headless Mac: the password comes before boot, not after, even if the GUI experience makes you feel otherwise.
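(For headless reboots, the usual workaround is an authenticated restart, which unlocks FileVault for exactly one boot - a hedged sketch assuming macOS's fdesetup tool:)

    # One-shot workaround for headless Macs: unlock FileVault for the next
    # boot only. Prompts for credentials; requires admin rights.
    import subprocess

    subprocess.run(["sudo", "fdesetup", "authrestart"])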
That's good enough for most consumers, but a lot more sensitive for enterprises IMHO. It usually gets a pass by having the contractual relation with the repair shop cover the risks, but I know some roles that don't get macbooks for that reason alone.
except that isn't generally how factory repairs are handled.
I don't know about Apple specifically, but other groups (Samsung, Microsoft, Lenovo) will happily swap your unit with a factory refurbished or warranty-repaired unit as long as it was sufficiently qualified before hand -- so the 'replaced with a newer unit' concept might be fantasy.
Admittedly this was a few years ago. Has apple mended their ways or are they still on the "used car salesman" grindset?
Third parties and resellers though I’m convinced just call their returns/open box units that appear to be in decent condition “refurbished.”
You have a phone with a real, but subtle fault. Something not caught by the normal set of tests. You return it for repair, get sent a new one, they replace the battery in your old one and put into stock as 'reconditioned'.
My phone is perfect, save for a worn out battery. I send it in for battery replacement, they send me yours. Now I've swapped my perfect phone for your faulty phone - and paid $70 to do so.
Did I say it would be a "new one"?
> 'might replace my aging phone with a newer unit, '
unless you just want to argue about the semantics and differences between 'aging', 'newer' , and 'new'.
HN really has turned into reddit.
Semantics is literally the meaning of things. So, yes the difference between those phrases is semantics.
But your use of 'semantics' meant something subtly different. Ain't language weird?
Because Apple got sued for doing that once, and people including myself are in line to get checks from it.
It's called a warranty and not at all exclusive to apple whatsoever?
> Those people should stick to Linux, so that they can have a terrible usability experience ALL the time, but feel more "in control," or something.
Maybe you should stick to reading and not commenting, if this is the best you can do.
Consulting a certificate revocation list is a standard security feature, not a privacy issue.
Also, a CRL/OCSP check isn't a gating check — i.e. it doesn't "fail safe" by disallowing execution if the check doesn't go through. (If it did, you wouldn't be able to run anything without an internet connection!) Instead, these checks can pass, fail, or error out; and erroring out is the same as passing. (Or rather, technically, erroring out falls back to the last cached verification state, even if it's expired; but if there is no previous verification state — e.g. if it's your first time running third-party app and you're doing so offline — then the fallback-to-the-fallback is allowing the app to run.)
Remember that CRLs/OCSP function as blacklists, not whitelists — they don't ask the question "is this certificate still valid?", but rather "has anyone specifically invalidated this certificate?" It is by default assumed that no, nobody has invalidated the certificate.
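A toy sketch of that fail-soft logic (not Apple's actual trustd code; the OCSP endpoint is a placeholder): an error falls back to the cached verdict, and with no cache at all the app is allowed to run.

    # Toy model of a fail-soft revocation check; not Apple's actual code.
    # The OCSP endpoint is a placeholder.
    import urllib.request

    def ocsp_says_revoked(cert_id):
        # Placeholder for a real OCSP query; raises on network failure.
        url = "https://ocsp.example.com/" + cert_id
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.read() == b"revoked"

    def allowed_to_run(cert_id, cache):
        try:
            revoked = ocsp_says_revoked(cert_id)   # blacklist question:
            cache[cert_id] = revoked               # "has anyone invalidated this?"
        except Exception:
            revoked = cache.get(cert_id, False)    # offline/error: use cache,
        return not revoked                         # defaulting to "allow"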
https://www.sentinelone.com/blog/what-happened-to-my-mac-app...
> Last week, just after we covered the release of Big Sur, many macOS users around the world experienced something unprecedented on the platform: a widespread outage of an obscure Apple service caused users worldwide to be unable to launch 3rd party applications.
> As was well-documented over the weekend, trustd employs a “fail-soft” call to Apple’s OCSP service: If the service is unavailable or the device itself is offline, trustd (to put it simply) goes ahead and “trusts” the app.
Even at the time people quickly figured out you could just disconnect from the internet as a workaround until the issue was fixed.
This is just Gatekeeper asking you which code-signing CA certs you want to mark as trusted in its kernel-internal trust store (which is, FYI, a separate thing from the OS trust store): do you want just the App Store CA to be trusted? Or do you also want the Apple Developer Program's "Self-Published App" Notarization CA to be trusted?
Choosing which code-signing CA-certs to trust will, obviously, determine which code-signed binaries pass certificate validation. Just like choosing which TLS CAs to trust, determines which websites pass certificate validation.
Code-signing certificate validation doesn't happen online, though. Just like TLS certificate validation doesn't happen online. It's just a check that the cert you have has a signing path back to some CA cert in the local trust store.
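You can poke at this locally with Apple's own tools - a hedged sketch wrapped in Python, with the app path as an example: codesign walks the signature chain against the local trust store, and spctl asks Gatekeeper for its overall verdict.

    # Check a bundle's signature and Gatekeeper assessment locally
    # (the path is an example).
    import subprocess

    app = "/Applications/SomeApp.app"  # example

    subprocess.run(["codesign", "--verify", "--deep", "--strict", "--verbose=2", app])
    subprocess.run(["spctl", "--assess", "--type", "exec", "--verbose", app])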
This warning only triggers for legacy releases of apps, published before notarization existed. Since Catalina, notarization has been part-and-parcel of the same flow that gets the self-published app bundle code-signed by Apple. AFAIK it is no longer possible to create a code-signed but non-notarized app bundle through XCode. (It's probably still possible by invoking `codesign` directly, and third-party build systems might still be doing that... but they really shouldn't be! They've had years to change at this point! Catalina was 2019!)
Thus, the "Open anyway" option in this dialog is likely transitional. This warning is, for now, intended to not overly frighten regular users, while also indicating to developers (esp. the developer of the app) that they should really get out a new, notarized release of their app, because maybe, one day, this non-notarized release of the app won't be considered acceptable by Gatekeeper any more.
I'm guessing that once a sufficient percentage of apps have been notarized, such that macOS instrumentation reports this dialog being rarely triggered, the "Open anyway" option will be removed, and the dialog will merge back into the non-code-signed-app version of the dialog that only has "Cancel" and "Move to Trash" options. Though maybe in this instance, the dialog would have the additional text "Please contact the app developer for a newer release of this app" (because, unlike with an invalid digital signature, macOS wouldn't assume the app is infected with malware per se, but rather just that it might do low-level things [like calling private OS frameworks] that Apple doesn't permit notarized apps to do.)
You can't distribute software through the Apple or Microsoft app stores without the software being signed.
You can sign and distribute software yourself without having anything to do with the app stores of either platform, although getting a signing certificate that Windows will accept is more expensive for the little guys than getting a signing certificate that Macs will accept.
On Windows, allowing users to run your software without jumping through additional hoops requires you to purchase an Extended Validation Code Signing Certificate from a third party. Prices vary, but it's going to be at least several hundred dollars a year.
https://www.reddit.com/r/electronjs/comments/17sizjf/a_guide...
Apple includes a software signing certificate with a basic developer account, which runs $100 a year.
You can ignore that on either platform, but users will have to take additional actions before they can run your unsigned software.
Perhaps you turned some "make things ultra-secure" setting on at some point ?
It used to be that you could run any third-party application you downloaded. And then for a while you'd have to right-click and select Open the first time you ran an application you'd downloaded, and then click through a confirmation prompt. And in macOS 15, you have to attempt to open the application, be told it is unsafe, and then manually approve it via system settings.
Nope.
It has a built in malware scanner, but that just requires a downloaded list of known malware signatures.
Meanwhile you have a minimal set of developers with the ability to run arbitrary programs, and you can go from there with surveillance on MacOS like having every executable tagged with the developer's ID.
The greater the distance between the developer and the user, the more you can charge people to use programs instead of just copying them. But you can go much further under the guise of "quality control".
And you know this how?
This reads like every macOS fan’s worst nightmare, but there’s zero actual evidence that Apple is going in this direction.
Please share sources if you disagree.
They make the best selling laptop in the world, and other most-popular-in-class laptops. If their strategy is to have people not use laptops, they are going about it funny.
As for every executable being tagged, that is not required. People can build binaries with open tools and other people can run them.
A hash gets created so Apple can play same-or-different against binaries found to be nefarious somehow. Seems like a reasonable proposition.
not sharing my data with other parties, or using it to sell me stuff or show me ads, is what I would define as respecting my privacy; Apple checks those boxes where few other tech companies do
The problem with many self-repair people is they effectively value their time at zero.
I value my time realistically, i.e. above zero and above minimum wage. It is therefore a no brainer for me to buy AppleCare every ... single ..time. It means I can just drop it off and let someone else deal with messing around.
I also know how much hassle it is. Like many techies, I spent part of my early career repairing people's PCs. Even in big PC tower cases with easy accessibility to all parts it's still a fucking horrific waste of time. Hence these days I'm very happy to let some junior at Apple do it for the cost of an AppleCare contract.
Back in 2010 Apple quoted me €700 for a topcase replacement because of shattered display glass. Instead I paid €50 for a third party replacement pane and did 15 minutes of work with a heat gun.
What's more, they fold most of the cost of the repair into the price of parts. So you can either get a replacement screen for €499 and install it yourself, or have it officially repaired for €559. This effectively subsidizes official repairs and makes DIY repairs more expensive.
Apple does extreme gouging with repairs, it's hogwash to claim anything else.
My hope is that the machine will work for a long while, like most of them do. In my case it’s a ~$1200 machine so I prefer to self-insure. I’m taking the chance that if it goes bad, I’ll pay to fix or replace it.
This makes sense, for me, when I do it on everything that I buy.
1. people who arguably fall under the definition of careless, or have small children, need repair plans
2. people who are fastidious and nothing ever breaks, don't need repair plans
3. people who are fastidious, have small children, need repair plans
I was a #2 and I'm slowly transitioning into a #3 for specific purchases.
Everything is a tradeoff.
I’d love to live in the F-Droid alt-tech land, but everything really comes down to utility. Messaging my friends is more important than using the right IM protocol.
Much as I wish I could convince everyone I know and have yet to meet to message me on Signal or whatever, that simply isn’t possible. Try explaining that I am not on Whatsapp or insta to a girl I’ve just met…
Also it is nice to spend basically no time maintaining the device, and have everything work together coherently. Time is ever more valuable past a certain point.
This is just the "Why not Linux desktop" argument from the past two decades. Sure, in theory it can be configured to do a lot of different things. But you're probably gonna have to work out the details yourself because the downside of theoretically supporting everything is that it's impossible to just have it work out of the box with every single scenario.
People have been saying this ever since Apple added the App Store to the Mac in 2010. It’s been 14 years. I wonder how much time has to go by for people to believe it’s not on Apple’s todo list.
Genuinely asking: are there any specifics on this? I understand that blocking at the firewall level is an option, but I recall someone here mentioning an issue where certain local machine rules don’t work effectively. I believe this is the issue [1]. Has it been “fixed”?
[1] https://appleinsider.com/articles/21/01/14/apple-drops-exclu...
(So multiple binaries with the same team don't check either.)
And I'd expect all logging is disabled on the CDN.
Yeah because what’s being sent is not analytics but related to notarization, verifying the app’s integrity (aka is it signed by a certificate known to Apple?)
This came to light a few years ago when the server went down and launching apps became impossibly slow…
You are free to verify.
I don't think Apple's behavior actually reflects this if you look closely (although I can certainly see how someone could form that opinion):
As a counter example, Apple assisted with their own engineers to help port Blender to Metal (https://code.blender.org/2023/01/introducing-the-blender-met...):
> Around one year ago, after joining the Blender Development Fund and seeding hardware to Blender developers, Apple empowered a few of its developers to directly contribute to the Blender source code.
I'm assuming similar support goes to other key pieces of software, e.g., from Adobe, Maxon, etc... but they don't talk about it for obvious reasons.
The point being Apple considers these key applications to their ecosystem, and (in my estimation at least) these are applications that will probably never be included in the App Store. (The counterargument would be the Office Suite, which is in the App Store, but the key Office application, Excel, is a totally different beast than the flagship Windows version, that kind of split isn't possible with the Adobe suite for example.)
Now what I actually think is happening is the following:
1. Apple believes the architecture around security and process management that they developed for iOS is fundamentally superior to the architecture of the Mac. This is debatable, but personally I think it's true as well for every reason, except for what I'll go into in #2 below. E.g., a device like the Vision Pro would be impossible with macOS architecture (too much absolute total complete utter trash is allowed to run unfettered on a Mac for a size-constrained device like that to ever be practical, e.g., all that trash consumes too much battery).
2. The open computing model has been instrumental in driving computing forward. E.g., going back to the Adobe example, After Effects plugins are just dynamically linked right into the After Effects executable. Third party plugins for other categories often work similarly, e.g., check out this absolutely wild video on how you install X-Particles on Cinema 4D (https://insydium.ltd/support-home/manuals/x-particles-video-...).
I'm not sure if anyone on the planet even knows why, deep down, #2 is important, I've never seen anyone write about it. But all the boundary pushing computing fields I'm interested in, which is mainly around media creation (i.e., historically Apple's bread-and-butter), seems to depend on it (notably they are all also local first, i.e., can't really be handled by a cloud service that opens up other architecture options).
So the way I view it is that Apple would love to move macOS to the fundamentally superior architecture model from iOS, but it's just impossible to do so without hindering too many use cases that depend on that open architecture. Apple is willing to go as close to that line as they can (in making the use cases more difficult, e.g., the X-Particles video above), but not actually willing to cross it.
What has the EU done to stop Apple doing this? Are Apple currently rolling it out to everywhere but the EU?
That ship has well and truly sailed. This conspiracy might once have held water, but Apple's machines are far too commercially ubiquitous for them to have any designs on ringfencing all the software used by all the industries that have taken a liking to the hardware.
What are you talking about? I don’t run a single app from the app store and have never felt a need to.
I'm curious: what hardware and software stack do you use?
https://discuss.grapheneos.org/d/14344-cellebrite-premium-ju...
Edit: I have not posted a source for this claim, because what sort of source would be acceptable for a claim of the form "X has not occurred"?
If you are going to claim Apple's security model has been compromised, you need not only evidence of such a compromise but also an explanation for why such an "obvious" and "cheap" vulnerability has not been disclosed by any number of white or grey-hat hackers.
"Since then, technologies like Grayshift’s GrayKey—a device capable of breaking into modern iPhones—have become staples in forensic investigations across federal, state, and local levels."
"In other cases where the FBI demanded access to data stored in a locked phone, like the San Bernardino and Pensacola shootings, the FBI unlocked devices without Apple’s help, often by purchasing hacking tools from foreign entities like Cellebrite."
1 - https://www.firstpost.com/tech/the-fbi-was-able-to-hack-into...
- they can keep asking for backdoors to "stop terrorists"
- they're not on the hook if for whatever reason they can't access a particular phone in a very mediatized case
- most targets (the not so sophisticated ones at least) keep using a device the agencies have proper access to
Regardless of their actual technical means, I don't expect we ever get a "we sure can!" kind of public boasting any time soon.
I'm very happy to only run stuff approved on Apple's app store... ESPECIALLY following their introduction of privacy labels for all apps so you know what shit the developer will try to collect from you without wasting your time downloading it.
Also have you seen the amount of dodgy shit on the more open app stores ?
I am totally ok with this. I have personally seen Apple reject an app update and delist the app because a tiny library used within it had recent security concerns. Forced the company to fix it.
Well that’s just childish, pouty, and not a very well thought out train of thought on the subject.
The control isn’t over people, it’s about finding a solution to creating and preserving market share via device reliability on the platform. There are 1.4B iPhone users (and that’s a real number, not a fantasy), and not every one of those people is savvy enough to vet their applications before installation. If installation of any app was wide open you would have a large portion of those 1.4B accidentally installing crap. They may have 100 apps on their phone but if 1 is a piece of shit and broken (and yes conservatively at least 1% of apps out there probably have a bug bad enough to wreak some havoc) and it renders the reliability of the phone to shit that’s bad. If the market perceives that the reliability of the device is shit, Apple loses either in increasing or preserving market share for the device. Apple needs those devices to work reliably and it feels that one way to do that is vetting the apps that will be running on it. The hardware is great, the OS does its job making the hardware platform operational, but the one place where there is the opportunity to introduce instability is in the apps. So you do your best to control that area of instability opportunity on your platform.
Here is the beautiful thing for you…there are plenty of other phones out there that will allow you to install whatever the hell you want. Apple only has 16% of the worldwide smartphone market share.
> conservatively at least 1% of apps
That's another made up number of yours, with a similarly made up qualifier
> the market perceives that the reliability of the device is shit
Since the vast majority of devices aren't so locked down, isn't "the market" yelling at you that you're wrong?
However, I stand firm in my argument about why the iPhone is locked down and why it’s a good thing. Even if you spread into other smartphone manufacturers like Samsung, you still find similar attempts to control the lay user’s ability to install unvetted apps on the devices. It may even be more important for them to do that too since they don’t fully control the OS on their devices.
> That's another made up number of yours, with a similarly made up qualifier
Obviously it was made up and obviously it was set as an intentionally low bar for software quality because who would argue (especially on HN) that 100% of available software out there is bug free, but if you want to believe that all available software is 100% safe to use, I encourage you to download and install everything you come across no matter whether the device is a smartphone, a Mac, or any other device you use and rely upon. I am sure you will be fine.
Sure, though it doesn't mean what you want it to mean since you just ignore the $$$ elephant in the room that explains the desire for more control. For the same reason, you "stand firm" in ignorance as to "why the iPhone is locked down"
> Obvious it was made up
Glad you realise that.
> intentionally low bar
Intentionally appearing like one
> if you want to believe ... software is 100% safe to use
Again with your fantasies. I believe the justification should be grounded in reality, both in terms of the % estimate as well as in terms of the severity (so no, "bug free" is irrelevant, you need severe billions-affecting bugs that can only be eliminated by hard-forcing the app store, which you can't have since the reality doesn't align with you).
And as to your standing firm in your argument "why it’s a good thing", well, you don't really have an argument, just a desire for one with made up stats and corporate motivations
Thanks for the education in the importance of precision and the rejection of experience in determining reality. I’ll ignore my decades of having to clean up all the messes that apparently non-existent buggy shit software managed to do to novice and lay users who willy-nilly installed it…or maybe didn’t install it, since it was imaginary.
By the way…before you respond again you might read up a bit on situational irony. You seemed to have missed it on my prior comment…and this one is dripping with it.
People who are installing things using a terminal are probably (a) slightly computer savvy and (b) therefore aware that this might not be a totally safe operation.
The things we talk about here which annoy us are for the much larger set of people who need them!
Put another way, it is all about the set of us who cannot really participate in this discussion.
From a skill and trust point of view, Google is doing a lot better than Apple ever will.
Including ondevice AI
Easy.
The App Store stores a lot of sensitive data about you and is not end-to-end encrypted. They operate it just like everyone else. You also use Gmail, which is just as sensitive as your iMessages, and Gmail is not end-to-end encrypted, so it's not clear you value that as much as you say.
End-to-end encryption is certainly the most relevant feature for these scenarios.
App store DRM is a red herring, as a developer I can still run as much untrusted code on my MBP as I want and I don't see that going away any time soon.
> "can someone see that I've installed an app"
You say preferences and you didn't say what you mean. One meaning of the word preferences: what if you installed Grindr?
Sure I use gmail, I've been locked in for 15 years. Someday I'll get fed up enough to bite the bullet and move off it.
Apple can push updates and change the rules on your device at any time. Rooted Android works better in that regard: you can still use Google stuff on rooted devices. Also I don't think Apple's security posture for users in China is better than every "other big tech co."
The takeaway for me is that Apple's storytelling is really good. They are doing a good job on taking leadership on a limited set of privacy issues that you can convince busy people to feel strongly about. Whether or not that objectively matters is an open question.
Apple doesn’t run open hardware, and supports features users want that involve opening a network connection back home? Hard privacy fail.
I heard a good definition from my dad: "Privacy for me is pedestrians walking past my window not seeing me step out of the shower naked, or my neighbours not overhearing our domestic arguments."
Basically, if the nude photos you're taking on your mobile phone can be seen by random people, then you don't have privacy.
Apple encrypts my photos so that the IT guy managing the storage servers can't see them. Samsung is the type of company that includes a screen-capture "feature" in their TVs so that they can profile you for ad-targeting. I guarantee you that they've collected and can see the pictures of naked children in the bathtub from when someone used screen mirroring from their phone to show their relatives pictures of their grandkids. That's not privacy.
Sure, I use Google services, but I don't upload naked kid pictures to anything owned by Alphabet corp, so no problem.
However, I will never buy any Samsung product for any purpose because they laugh and point at customer expectations of privacy.
[1] Actually not that weird. Now that I've worked in government departments, I "get" the need for these regulations. Large organisations are made up of individuals, and both the org and the individual people will abuse their access to data for their own benefit. Many such people will even think they're doing the "right thing" while destroying freedom in the process, like people that keep trying to make voting systems traceable... so that vote buying will become easy again.
This should illuminate for you that there is nothing special about iCloud privacy or security, in any sense. It has the same real weaknesses as any other service that provides UIs for normal people.
> there's a difference between a person being tricked to give up their credentlials and a zero day.
Only if you work at Apple.
We have mass conversational and voice cloning technology. It's not a question of if but when.
Chip | Geekbench Score (Process)
---- | ------------------------
M1 | 2,419 (5nm)
M2 | 2,658 (5nm)
M3 | 3,076 (3nm)
M4* | 3,810 (3nm)
In my experience, single-core CPU is the best all-around indicator of how "fast" a machine feels. I feel like Apple kind of buried this in their press release.
M4 benchmark source: https://browser.geekbench.com/v6/cpu/8171874
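To put the table above in relative terms, a quick sketch using just those quoted scores:

```python
# Generation-over-generation gains from the quoted single-core Geekbench scores.
scores = {"M1": 2419, "M2": 2658, "M3": 3076, "M4": 3810}

chips = list(scores)
for prev, curr in zip(chips, chips[1:]):
    print(f"{prev} -> {curr}: +{scores[curr] / scores[prev] - 1:.1%}")
print(f"M1 -> M4: +{scores['M4'] / scores['M1'] - 1:.1%}")
# M1 -> M2: +9.9%, M2 -> M3: +15.7%, M3 -> M4: +23.9%, M1 -> M4: +57.5%
```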
When a company adds a supercharger to a car does it not count as faster?
When I add more solar panels to my roof does it not count as more power?
Surely doing this kind of thing is exactly what we want companies to be doing to make their products faster/better.
If you add a supercharger you will get more power, but if the car's transmission is not upgraded, you might just get some broken gears and shafts.
If you add more solar panels to your roof, you might exceed the inverter power, and the panels will not bring benefits.
It's true that you will benefit from the changes above, but not just by themselves - something else needs to change so you can benefit. And in the case of the M4 and these extensions, the software needs to be changed and also to have a use case for these extensions.
In any case, the two main deep learning packages have already been updated so for the place this change was almost certainly targeted for, your complaint is answered. I'm just stunned that anyone would complain about hardware matrix multiplication? I've wondered why that hasn't been ubiquitous for the past 20 years.
Everyone should make that improvement in their hardware. Everyone should get rid of code implementing matrix mult and make the hardware call instead. It's common sense. Not to put too fine a point on it, but your complaint assumes that GeekBench is based on code that has implemented all those changes.
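A minimal illustration of the "call the optimized path instead of hand-rolling matrix mult" point, assuming a NumPy build linked against an optimized BLAS (on macOS that is typically Accelerate, which is generally reported to route through the matrix hardware):

```python
# Naive Python triple loop vs. one call into the optimized library path.
import time
import numpy as np

n = 128
a, b = np.random.rand(n, n), np.random.rand(n, n)

t0 = time.perf_counter()
# Hand-rolled matrix multiply: c[i][j] = sum_k a[i,k] * b[k,j]
c_naive = [[sum(a[i, k] * b[k, j] for k in range(n)) for j in range(n)]
           for i in range(n)]
t1 = time.perf_counter()

c_blas = a @ b  # dispatches to whatever BLAS this build was linked against
t2 = time.perf_counter()

print(f"naive loops: {t1 - t0:.3f}s   library matmul: {t2 - t1:.6f}s")
print("results agree:", np.allclose(c_naive, c_blas))
```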
The whole point is that these highly specialized scenarios are only featured in very specialized usecases, and don't reflect in overall performance.
We've been dealing with the regular release of specialized processor operations for a couple of decades. This story is not new. You see cherry-picked microbenchmarks used to plot impressive bar charts, immediately followed by the realization that a) in general this sort of operator is rarely invoked with enough frequency to be noticeable, b) you need to build code with specialized flags to get software to actually leverage this feature, c) even then it's only noticeable in very specialized workloads that already run in the background.
I still recall when fused multiply-add was such a game changer because everyone used polynomials and these operations would triple performance. Not the case.
And more to the point, do you believe that matrix multiplication is a breakthrough discovery that is only now surfacing? Computers were being designed around matrix operations way before they were even considered to be in a household.
I think the whole point is that microbenchmarks provide no data on overall performance. They just test a very specific and by no means common use case.
I'm not doubting the number represent real peak throughput on M4. There is just a taste to the timing lining up so well. Also they don't take advantage of fully SVE2 capable ARM cores to compare how much a full SVE2 implementation would help especially at accelerating more algorithms than those that neatly map to streaming SVE and SME.
The single core performance gains of M4 variants over their predecessors are distorted, because the streaming SVE and SME are apparently implemented by combining what used to be the AMX units of four cores.
The press release describes the single core performance as the fastest ever made, full stop:
"The M4 family features phenomenal single-threaded CPU performance with the world’s fastest CPU core"
The same statement is made repeatedly across most of the new M4 lineup marketing materials. I think that's enough to get the point across that it's a pretty quick machine.
If you're 30% faster than the previous generation, I'd rather see that because my assumption is it's 5%.
> Results are compared to previous-generation 1.7GHz quad-core Intel Core i7-based 13-inch MacBook Pro systems with Intel Iris Plus Graphics 645, 16GB of RAM, and 2TB SSD.
However, attention to keeping Intel Macs performant has taken a dive. My 2019 16" MBP died last week so I fell back to my standby 2014 MBP and it's much more responsive. No login jank or pauses. But it also hasn't been eligible for OS updates for 2 or 3 years.
My new M3 MBP is "screaming fast" with Apple's latest patched OS.
My god, it's ridiculous. I really prefer Linux desktops. They've been snappy for the past 30 years, and don't typically get slow UIs after a year or two of updates.
https://ark.intel.com/content/www/us/en/ark/products/88195/i...
Core i7 14700K was released in Q4 2023. 8 P-core, 12 E-core, 28 threads. ~5.5GHz for Performance-core max frequency.
https://ark.intel.com/content/www/us/en/ark/products/236783/...
This is the same CPU tier, just a later generation.
Passmark scores:
6700K: 8,929
14700K: 53,263
Yeah, that's practically the same performance.
But hey, that newer i7 has way more cores. Let's pick something with a closer core count for a fairer comparison: the Core i3-14100 with its 4C/8T and a turbo of 4.7GHz. Even then, its Passmark benchmark is 15,050.
https://www.cpubenchmark.net/compare/2565vs5719vs5831/Intel-...
I get it, an old CPU can still be useful. I'm still using an Ivy Bridge CPU for a server in my closet hosting various services for my home, but it is vastly slower than my Ryzen 7 3700x on my current gaming desktop and was even slower than the previous Ryzen 5 2600 I had before and sold to a friend.
The last Intel macbook pros were released 4 years ago. Their owners are starting to shop around for replacements. Their question will be "will the expense be worth it?"
As an example a single thread on an x64 core of an old Pentium 4 661 @ 3.6 GHz benchmarks at 315 with PassMark while a single x64 core of a current 285k @ 5.7 GHz turbo benchmarks at 5195. Some of that also comes down to things like newer RAM to feed the CPU but the vast majority comes down to the CPU doing more work per clock cycle.
After decades of Apple, people still believe them.
Flameproof suit donned. Please correct me because I'm pretty ignorant about modern hardware. My main interest is playing lots of tracks live in Logic Pro.
On the variants: An M1 Max is 10 CPU cores with 8 power and 2 efficiency cores.
M4 Max is 16 cores, 12 + 4. So each power core is 50% faster, but it also has 50% more of them. Add in twice as many efficiency cores, that are also faster for less power, plus more memory bandwidth, and it snowballs together.
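As a rough multiplicative estimate of that snowball (ignoring the efficiency cores and the extra memory bandwidth entirely):

```python
# Rough multiplicative estimate: more P-cores x faster P-cores.
m1_max_pcores, m4_max_pcores = 8, 12
per_core_speedup = 1.5            # "each power core is 50% faster"

estimate = (m4_max_pcores / m1_max_pcores) * per_core_speedup
print(f"~{estimate:.2f}x P-core throughput")   # ~2.25x, before E-cores and bandwidth
```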
One nice pseudo-feature of the M1 is that the thermal design of the current MacBook Pro really hasn't changed since then. It was designed with a few generations of headroom in mind, but that means it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move, while an M3 Max is easier to make (slightly) audible.
I routinely get my M1 fans spinning from compiling big projects. You don’t have to get the GPU involved, but when you do it definitely goes up a notch.
I read so much about the M1 Pros being completely silent that I thought something was wrong with mine at first. Nope, it just turns out that most people don’t use the CPU long enough for the fans to kick in. There’s a decent thermal capacity buffer in the system before they ramp up.
The M3 is much more typical behavior, but I guess it's just dumping more watts into the same thermal mass...
The M1s are likely to remain pretty usable machines for a few years yet, assuming your workload has not or does not significantly change.
I can't wait to buy one and finally be able to open more than 20 Chrome tabs.
I wouldn't be surprised to hear that the geekbench developers are heavily supported by apple's own performance engineers and that testing might not be as objective or indicative of real world perf as one would hope.
From the data side: the M4 hasn't made it to the charts yet, but the M3 already holds 4th place in PassMark's single-thread chart https://www.cpubenchmark.net/singleThread.html and also tops Cinebench 2024 https://www.cpu-monkey.com/en/cpu_benchmark-cinebench_2024_s...
The only areas the M series "lags" is in the high end workstation/server segment where they don't really have a 96+ core option or in spaces where you pop in beefy high end GPUs. Everything else the M4 tends to lead in right now.
My bad.
And just to be clear: I didn't speculate that Apple tunes its chips to Geekbench, I speculated that Geekbench was overly optimized towards Apple's latest chip.
I know iOS developers who recently upgraded their MacBooks and they claim they now feel more sluggish. I wouldn't be surprised if it was due to RAM constraints instead of CPU though.
So, take those artificial benchmarks with a grain of salt. They are so optimized that they are optimized out of the real world.
And the big thing you leave out is that it all depends on how well the software is optimised, how much animation it uses and things like that. My iPhone has MUCH better single-thread performance than my old PC, yet it feels much slower for almost everything.
And this is exactly how I feel about Apple Silicon Macs. On paper, impressive performance. In actual practice it doesn't feel that fast.
There's a massive difference between "pretty much every app is 80% faster" and "if you render a 4K ProRes video in Final Cut Pro it's 3x faster."
I insist my 2020 Macbook M1 was the best purchase I ever made
I've never kept any laptop as long as I've kept the M1. I was more or less upgrading yearly in the past because the speed increases (both in the G4 and then Intel generations) were so significant. This M1 has exceeded my expectations in every category; it's faster, quieter, and cooler than any laptop I've ever owned.
I've had this laptop since release in 2020 and I have nearly 0 complaints with it.
I wouldn't upgrade except the increase in memory is great, I don't want to have to shut down apps to be able to load some huge LLMs, and, I ding'ed the top case a few months ago and now there's a shadow on the screen in that spot in some lighting conditions which is very annoying.
I hope (and expect) the M4 to last just as long as my M1 did.
My 2015 MBP would like to have a word.
It’s the only laptop purchase I’ve made. I still use it to this day, though not as regularly.
I will likely get a new MBP one of these days.
When you upgrade, prepare to be astonished.
The performance improvement is difficult to convey. It's akin to traveling by horse and buggy. And then hopping into a modern jetliner, flying first class.
It's not just speed. Display quality, build quality, sound quality, keyboard quality, trackpad, ports, etc., have all improved considerably.
When I migrated all my laptops to SSDs (lenovos at the time, so it was drop-dead simple), I thought to myself, "this is a once-in-a-generation feeling". I didn't think I would ever be impressed by a laptop's speed ever again. It was nice to be wrong.
A family 2018 Macbook Air got a second life with a battery replacement. Cheap kit from Amazon, screwdrivers included, extremely easy to do. Still in use, no problems.
I also have a M1 from work that is absolutely wonderful, but I think it's time for me to upgrade the 2015 with one of these new M4s.
The longevity of Macbooks is insanely good.
Longevity is not only a thing of MBPs. OTOH, IIRC, some 2017-2019 MBPs (before the Mx switch) were terrible for longevity, given their problematic keyboard.
Last one with upgrade capabilities, now it has two fast SSDs and maximum Ram. I changed the battery once.
Only shame is that it doesn’t get major MacOS upgrades anymore.
Still good enough to browse the web, do office productivity and web development.
12 years of good use, I am not sure I can get so much value anywhere now
Thanks for reminding me that everything is possible, I may try Opencore to keep it even longer !
I think these M-series MacBook Airs are a worthy successor to the 2012 MBP. I fully intend to use this laptop for at least the same amount of time, ideally more. The lack of replaceable battery will probably be the eventual killer, which is a shame.
Sold mine last year for $100 to some dude who claimed to have some software that only runs on that specific laptop. I didn't question it.
For its time, the 2015 model was a fantastic package: reliable and robust in form and function.
Would've kept going on it had Apple silicon and 14 inch not come around.
Barring super niche LLM use cases, I don't see why one would need to upgrade.
Traded it for an M1 Air in 2021 and was astonished at how much faster it was. It even blew away my 2019 16" from work.
You're going to be even more blown away!
Anything you can buy online ships with all required screwdrivers, and dozens of YouTube videos or iFixit will give you step-by-step instructions.
10-15 minutes and you'll have the old battery replaced all by yourself.
It's that simple.
The 2012 MBP 15" Retina was probably the only machine I bought where the performance actually got better over the years, as the OS got more optimized for it (the early OS revisions had very slow graphics drivers dealing with the retina display)
The M1 Pro on the other hand, that was a true upgrade. Just a completely different experience to any Apple Intel laptop.
Rebuilding a bunch of Docker images on an older intel mac is quite the slow experience if you're doing it multiple times per day.
I recently replaced it with a used MBA M1, 16GB, 2TB.
It's insane how much faster it is, how long the battery lasts and how cool and silent it is. Completely different worlds.
I was considering upgrading to an M3 up until about a month ago when Apple replaced my battery, keyboard, top case, and trackpad completely for free. An upgrade would be nice as it no longer supports the latest MacOS, but at this point, I may just load Ubuntu on the thing and keep using it for another few years. What a machine.
It has gotten significantly slower the last 2 years, but the more obvious issues are the sound, the inability to use virtual backgrounds, and now the lack of software updates.
But if you had told me I'd need to replace it in 2022 I wouldn't believe you
Can't get the latest macOS on it though, but otherwise it still works perfectly well.
Kinda considering upgrading it to a used M1/M2 one at some point.
I have either assembled my own desktop computers or purchased ex corporate Lenovo over the years with a mix of Windows (for gaming obviously) and Linux and only recently (4 years ago) been given a MBP by work as they (IT) cannot manage Linux machines like they do with MacOS and Windows.
I have moved from an Intel i5 MBP to a M3 Pro (?) and it makes me want to throw away my dependable ThinkPad/Fedora machine I still use for personal projects.
My laptop is my work life and my personal life.
I spend easily 100 hours a week using it not-as-balanced-as-it-should-be between the two.
I don't buy them because I need something new, I buy them because in the G4/Intel era, the iterations were massive and even a 20 or 30% increase in speed (which could be memory, CPU, disk -- they all make things faster) results in me being more productive. It's worth it for me to upgrade immediately when apple releases something new, as long as I have issues with my current device and the upgrade is enough of a delta.
M1 -> M2 wasn't much of a delta and my M1 was fine. M1 -> M3 was a decent delta, but, my M1 was still fine. M1 -> M4 is a huge delta (almost double) and my screen is dented to where it's annoying to sit outside and use the laptop (bright sun makes the defect worse), so, I'm upgrading. If I hadn't dented the screen the choice would be /a lot/ harder.
I love ThinkPads too. Really can take a beating and keep on going. The post-IBM era ones are even better in some regards too. I keep one around running Debian for Linux-emergencies.
Could you get more money by selling it? Sure. But it's hard to beat the convenience. They ship you a box. You seal up the old device and drop it off at UPS.
I also build my desktop computers with a mix of Windows and Linux. But those are upgraded over the years, not regularly.
What different lives we live. This first M1 was in November 2020. Not even four years old. I’ve never had a [personal] computer for _less_ time than that. (Work, yes, due to changing jobs or company-dictated changes/upgrades)
Other example - I'm by no means rich, but I have a $300 mechanical keyboard - it doesn't make me type faster and it doesn't have additional functionality to a regular $30 Logitech one - but typing on it feels so nice and I spend so much of my life doing it, that to me it's completely justified and needed to have this one then.
That’s a feature, not a bug, for some. When I upgraded to an M series chip MacBook, I had to turn up the heat because I no longer had my mini space heater.
I still have a running Thinkpad R60 from 2007, a running Thinkpad T510 from 2012, and a modified running Thinkpad X61 (which I re-built as an X62 using the kit from 51nb in 2017 with a i7-5600U processor, 32 GB of RAM and a new display) in regular use. The latter required new batteries every 2 years, but was my main machine until 2 weeks ago when I replaced it with a ThinkCentre. During their time as my main machine, each of these laptops was actively used around 100 hours per week, and was often running for weeks without shutdown or reboot. The only thing that ever broke was the display of the R60 which started to show several green vertical bars after 6 years, but replacement was easy.
Me too. Only one complaint. After I accidentally spilled a cup of water into it on an airplane, it didn't work.
(However AppleCare fixed it for $300 and I had a very recent backup. :) )
What’s more annoying is that I’d just get a new one and recycle this one, but the SSD is soldered on. Good on you for having a backup.
Do not own a Mac unless you bought it used or have AppleCare.
Never had AppleCare or any other extended warranty program.
Did just fine up to now.
As I've noted in a sibling comment, I'll probably stop purchasing mobile Macs until the repair story on Macbooks is improved -- the risk for accidents and repairs is simply much higher on portable machines. That's only going to happen through third-party repair (which I think would simultaneously lead Apple to lower their first-party repair costs, too).
1) a slight overall savings, though I'm not sure about that. 2) a lack of stress when something breaks. Even if there isn't an overall savings, for me it's been worth it because of that.
Certainly, my recent Mac repair would have cost $1500 and I only paid $300, and I think I've had the machine for about 3 years, so there's a savings there but considerably less recent stress. That's similar to the experience I've had all along, although this recent expense would have probably been my most-expensive repair ever.
I’ve underwritten my own Mac ownership since the very first Intel MacBook Pro and just like you I’ve been just fine.
Apple is putting raw NAND chips on the board (and yes soldering them) and the controller for the SSD is part of the M-series chip. Yes, Apple could use NVMe here if you ignore the physical constraints and ignore the fact that it wouldn't be quite as fast and ignore the fact that it would increase their BOM cost.
I'm not saying Apple is definitively correct here, but, it's good to have choice and Apple is the only company with this kind of deeply integrated design. If you want a fully modular laptop, go buy a framework (they are great too!) and if you want something semi-modular, go buy a ThinkPad (also great!).
I don't truly mind that they solder on the SSD, embed the controller into the processor -- you're right that it's great we have choice here. I mind the exorbitant repair cost _on top of_ Apple's war on third party repair. Apple is the one preventing me from having a choice here; I have to do the repair through them, or wait until schematics are smuggled out of China and used/broken logic boards are available so that the repair costs what it should: $300 to replace 2 chips on my logic board (still mostly labor, but totally a fair price).
I love Apple for their privacy focus and will continue to support them because I need to do Mac and iOS development, but I will likely stop buying mobile workstations from them for this reason, the risk of repair is simply much higher and not worth this situation.
Day to day I don't mind but when needs change or something breaks it's unfortunate to have to replace the whole machine to fix it.
And yeah, this incident reminded me of why it's important to back up as close to daily as you can, or even more often during periods when you're doing important work and want to be sure you have the intermediate steps.
I actually use my laptop on my lap commonly and I think the i9 was going to sterilize me.
Now on M2 MBP and will probably be using it for a very long time.
- More RAM, primarily for local LLM usage through Ollama (a bit more overhead for bigger models would be nice)
- A bit niche, but I often run multiple external displays. DisplayLink works fine for this, but I also use live captions heavily and Apple's live captions don't work when any form of screen sharing/recording is enabled... which is how Displaylink works. :(
Not quite sold yet, but definitely thinking about it.
I'm definitely still happy with it, but job offers upgrade to M4 so... why not?
M1 series machines are going to be fine for years to come.
Actually, wasn't the M1 itself an evolution/upscale of their A-series CPUs, which by now they've been working on since before 2010? The iPhone 4 was the first one with their own CPU, although the design was from Samsung + Intrinsity; it was only the A6 that they claimed was custom designed by Apple.
Bought a reasonably well-specced Intel Air for $1700ish. The M1s came out a few months later. I briefly thought about the implication of taking a hit on my "investment", figured I might as well cry once rather than suffer endlessly. Sold my $1700 Intel Air for $1200ish on craigslist (if I recall correctly), picked up an M1 Air for about that same $1200 pricepoint, and I'm typing this on that machine now.
That money was lost as soon as I made the wrong decision, I'm glad I just recognized the loss up front rather than stewing about it.
That said...scummy move by Apple. They tend to be a little more thoughtful in their refresh schedule, so I was caught off guard.
This is the best machine I have ever owned. It is so completely perfect in every way. I can't imagine replacing it for many many years.
Best way I've discovered to find those sorts of deals is to use Slickdeals and set up alerts.
The only thing to keep in mind, is that the M1 was the first CPU in the transition from Intel CPUs (+ AMD GPUs) to Apple Silicon. The M1 was still missing a bunch of things from earlier CPUs, which Apple over time added via the M1 Pro and other CPUs. Especially the graphics part was sufficient for a small laptop, but not for much beyond. Better GPUs and media engines were developed later. Today, the M3 in a Macbook Air or the M4 in the Macbook Pro have all of that.
For me the biggest surprise was how well the M1 Macbook Air actually worked. Apple did an outstanding job in the software & hardware transition.
I don't understand how people are enamored with those things, sure it's better in some way than what it was before but it's also very compromised for the price.
With Whisky I feel like I'd never need anything else. That said, the benchmark jump in the M4 has me thinking I should save up and grab a refurb in a year or two
The battery performance is incredible too.
(of course, everyone else has a macbook too, there's always someone that can lend me a charger. Bonus points that the newer macbooks support both magsafe and USB-C charging. Added bonus points that they brought back magsafe and HDMI ports)
Personally, I practically never use MagSafe, because the convenience of USB C charging cables all over the house outweighs the advantages of MagSafe for me.
Unrelated but unified memory is a strange buzzword being used by Apple. Their memory is no different than other computers. In fact, every computer without a discrete GPU uses a unified memory model these days!
On PC desktops I always recommend getting a mid-range tower server precisely for that reason. My oldest one is about 8 years old and only now it's showing signs of age (as in not being faster than the average laptop).
On PCs some other hardware (notably the SSD) comes with its own memory. But here it's shared with the main DRAM too.
This is not necessarily a performance improvement, it can avoid copies but also means less is available to the CPU.
So apple manages decent GPU performance, a tiny package, and great battery life. It's much harder on the PC side because every laptop/desktop chip from Intel and AMD use a 128 bit memory bus. You have to take a huge step up in price, power, and size with something like a thread ripper, xeon, or epyc to get more than 128 bit wide memory, none of which are available in a laptop or mac mini size SFF.
It's not really a new idea, just unusual in computers. The custom SOCs that AMD makes for Playstation and Xbox have wide (up to 384-bit) unified memory buses, very similar to what Apple is doing, with the main distinction being Apples use of low-power LPDDR instead of the faster but power hungrier GDDR used in the consoles.
Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
No CPU uses a 128-bit memory channel, as it would result in overfetch of data, i.e., 128B per access, or two cache lines.
AFAIK Apple uses 128B cache lines, so they can do a much better design and customization of the memory subsystem, as they do not have to use DIMMs -- they simply solder DRAM to the motherboard, hence the memory interface is whatever they want.
Sure, per channel. PCs have 2x64 bit or 4x32 bit memory channels.
Not sure I get your point, yes PCs have 64-byte cache lines and Apple uses 128-byte. I wouldn't expect any noticeable difference because of this. Generally a cache miss is sent to a single memory channel and results in a wait of 50-100ns, then you get 4 or 8 bytes per cycle at whatever memory clock speed you have. So Apple gets twice the bytes per cache line miss, but the value of those extra bytes is low in most cases.
Other bigger differences is that apple has a larger page size (16KB vs 4KB) and arm supports a looser memory model, which makes it easier to reach a large fraction of peak memory bandwidth.
However, I don't see any relationship between Apple and PCs as far as DIMMS. Both Apple and PCs can (and do) solder dram chips directly to the motherboard, normally on thin/light laptops. The big difference between Apple and PC is that apple supports 128, 256, and 512 bit wide memory on laptops and 1024 bit on the studio (a bit bigger than most SFFs). To get more than 128 bits with a PC that means no laptops, no SFFs, generally large workstations with Xeon, Threadrippers, or Epyc with substantial airflow and power requirements
Also important to consider that the RTX 4090 has a relatively tiny 384-bit memory bus. Smaller than the M1 Max's 512-bit bus. But the RTX 4090 has 1 TB/s bandwidth and significantly more compute power available to make use of that bandwidth.
The M4 max is definitely not a 4090 killer, does not match it in any way. It can however work on larger models than the 4090 and have a battery that can last all day.
My memory is a bit fuzzy, but I believe the M3 Max did decent on some games compared to the laptop Nvidia 4070 (which is not the same as the desktop 4070). But it highly depended on if the game was x86-64 (requiring emulation) and if it was DX11 or Apple native. I believe Apple claims improvements in Metal (Apple's GPU lib) and that the M4 GPUs have better FP for ray tracing, but no significant changes in rasterized performance.
I look forward to the 3rd party benchmarks for LLM and gaming on the m4 max.
Many integrated graphics segregate the memory into CPU owned and GPU owned, so that even if data is on the same DIMM, a copy still needs to be performed for one side to use what the other side already has.
This means that the drivers, etc, all have to understand the unified memory model, etc. it’s not just hardware sharing DIMMs.
APUs with shared everything are not a new concept, they are actually older than programmable graphics coprocessors…
https://www.heise.de/news/Gamescom-Playstation-4-bietet-Unif...
And yes, the impressive part is that this kind of bandwidth is hard to get on laptops. I suppose I should have been a bit more specific in my remark.
Or you could buy a M3 max laptop for $4k, get 10+ hour battery life, have it fit in a thin/light laptop, and still get 546GB/sec. However those are peak numbers. Apple uses longer cache lines (double), large page sizes (quadruple), and a looser memory model. Generally I'd expect nearly every memory bandwidth measure to win on Apple over AMD's turin.
There are, in my experience, professionals who want to use the best tools someone else builds for them, and professionals who want to keep iterating on their tools to make them the best they can be. It's the difference between, say, a violin and a Eurorack. Neither's better or worse, they're just different kinds of tools.
I was sorely tempted by the Mac studio, but ended up with a 96GB ram Ryzen 7900 (12 core) + Radeon 7800 XT (16GB vram). It was a fraction of the price and easy to add storage. The Mac M2 studio was tempting, but wasn't refreshed for the M3 generation. It really bothered me that the storage was A) expensive, B) proprietary, C) tightly controlled, and D) you can't boot without internal storage.
Even moving storage between Apple studios can be iffy. Would I be able to replace the storage if it died in 5 years? Or expand it?
As tempting as the size, efficiency, and bandwidth were I just couldn't justify top $ without knowing how long it would be useful. Sad they just didn't add two NVMe ports or make some kind of raw storage (NVMe flash, but without the smarts).
This was really driven home to me by my recent purchase of an Optane 905p, a drive that is both very fast and has an MTBF measured in the hundreds of years. Short of a power surge or (in California) an earthquake, it's not going to die in my lifetime -- why should I not keep using it for a long time?
Many kinds of professionals are completely fine with having their Optanes and what not only be plugged in externally, though, even though it may mean their boot drive will likely die at some point. That's completely okay I think.
At full tilt an M3 Max will consume 50 to 75 watts, meaning you get 1 to 2 hours of runtime at best, if you use the thing full tilt.
That's the thing I find funny about the Apple Silicon MBP craze, sure they are efficient but if you use the thing as a workstation, battery life is still not good enough to really use unplugged.
Most claiming insane battery life are using the thing effectively as an information appliance or a media machine. At this game the PC laptops might not be as efficient but the runtime is not THAT different provided the same battery capacity.
Servers do have many channels but they run relatively slower memory
* Specifically, it being on-die
Also, DRAM is never on-die. On-package, yes, for Apple's SoCs and various other products throughout the industry, but DRAM manufacturing happens in entirely different fabs than those used for logic chips.
This is one of the reasons the "3D vcache" stuff with the giant L3 cache is so effective.
And no, manaskarekar, the M4 Max does 546 GB/s not GBps (which would be 8x less!).
GB/s and GBps mean the same thing, though GB/s is the more common way to express it. Gb/s and Gbps are the units that are 8x less: bits vs Bytes.
GB/s is the same thing as GBps
The "ps" means "per second"
https://en.wikipedia.org/wiki/DDR5_SDRAM (info from the first section):
> DDR5 is capable of 8GT/s which translates to 64 GB/s (8 gigatransfers/second * 64-bit width / 8 bits/byte = 64 GB/s) of bandwidth per DIMM.
So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
DDR4 clocks in at 3.2GT/s and the fastest DDR3 at 2.1GT/s.
DDR5 is an impressive jump. HBM is totally bonkers at 128GB/s per stack (HBM is the memory used in the top end Nvidia datacenter cards, and it comes in stacks on the package rather than DIMMs).
Cheers.
Not quite as it depends on number of channels and not on the number of DIMMs. An extreme example: put all 16 DIMMs on single channel, you will get performance of a single channel.
This means that in practice, consumer x86 CPUs have only 128GiB/s of DDR5 memory bandwidth available (regardless of the number of DIMM slots in the system), because the vast majority of them only offer two memory channels. Server CPUs can offer 4, 8, 12, or even more channels, but you can't just install 16 DIMMs and expect to get 1024GiB/s of bandwidth, unless you've verified that your CPU has 16 memory channels.
Happy Halloween!
An RTX4090 or H100 has memory extremely close to the processor but I don't think you would call it unified memory.
A huge part of optimizing code for discrete GPUs is making sure that data is streamed into GPU memory before the GPU actually needs it, because pushing or pulling data over PCIe on-demand decimates performance.
If you’re forking out for H100’s you’ll usually be putting them on a bus with much higher throughput, 200GB/s or more.
Bandwidth (GB/s) = (Data Rate (MT/s) * Channel Width (bits) * Number of Channels) / 8 / 1000
(8800 MT/s * 64 bits * 8 channels) / 8 / 1000 = 563.2 GB/s
This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
Was this example intended to describe any particular device? Because I'm not aware of anything that operates at 8800 MT/s, especially not with 64-bit channels.
I believe I saw somewhere that the actual chips used are LPDDR5X-8533.
Effectively the parent's formula describes the M4 Max, give or take 5%.
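Plugging the parent's formula into a small sketch; the 8 × 64-bit LPDDR5X-8533 configuration for the Max part is an assumption based on the figures mentioned in this thread:

```python
# Bandwidth (GB/s) = data rate (MT/s) * channel width (bits) * channels / 8 / 1000
def bandwidth_gb_s(data_rate_mts, channel_width_bits, channels):
    return data_rate_mts * channel_width_bits * channels / 8 / 1000

print(bandwidth_gb_s(8533, 64, 8))   # ~546.1 -> the 546 GB/s Max figure quoted above
print(bandwidth_gb_s(8800, 64, 8))   # 563.2  -> the parent's round-number example
print(bandwidth_gb_s(6400, 64, 2))   # 102.4  -> a typical dual-channel DDR5 desktop
```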
A top tier hosted model is fast and 100.
Past what specialized models can do, it's about a mixture/agentic approach and next level, nuclear power scale. Having a computer with lots of relatively fast RAM is not magic.
Most laptops will be 2 DIMMs (probably soldered).
The vast majority of x86 laptops or desktops are 128 bits wide. Often 2x64-bit channels until the last year or so, now 4x32-bit with DDR5. There are some benefits to 4 channels over 2, but generally you are still limited by 128 bits unless you buy a Xeon, Epyc, or Threadripper (or Intel equiv) that are expensive, hot, and don't fit in SFFs or laptops.
So basically the PC world is crazy behind the 256, 512, and 1024 bit wide memory busses apple has offered since the M1 arrived.
But it has more than 2x longer battery life and a better keyboard than a GPU card ;)
EDIT: wtf what's so bad about this comment that it deserves being downvoted so much
> Intel processor graphics architecture has long pioneered sharing DRAM physical memory with the CPU. This unified memory architecture offers [...]
It more or less seems like they use "unified memory" and "shared memory" interchangeably in that section.
Calling something "shared" makes you think: "there's not enough of it, so it has to be shared".
Calling something "unified" makes you think: "they are good engineers, they managed to unify two previously separate things, for my benefit".
For Apple to have come up with using the term "unified memory" to describe this kind of architecture, they would've needed to come up with it at least before 2016, meaning A9 chip or earlier. I have paid some attention to Apple's SoC launches through the years and can't recall them touting it as a feature in marketing materials before the M1. Do you have something which shows them using the term before 2016?
To be clear, it wouldn't surprise me if it has been used by others before Intel did in 2015 as well, but it's a starting point: if Apple hasn't used the term before then, we know for sure that they didn't come up with it, while if Apple did use it to describe A9 or earlier, we'll have to go digging for older documents to determine whether Apple came up with it
I think it’s super interesting to know real life workflows and performance of different LLMs and hardware, in case you can direct me to other resources. Thanks !
Smart move by Apple
An M2 is according to a reddit post around 27 tflops
So < 1/10 the performance in raw computation, let alone the memory.
What workflow would use something like this?
Memory and memory bandwidth matter most for inferencing. 819.2 GB/s for M2 Ultra is less than half that of an A100, but having 192GB of RAM instead of 80GB means they can run inference on models that would require THREE of those A100s, and the only real cost is that it takes longer for the AI to respond.
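A back-of-the-envelope sketch of why capacity is the limiter; the 70B-parameter model is an example figure, not one from the thread, and this counts weights only (no KV cache or activations):

```python
# Weights-only footprint: 1B parameters at 1 byte/param is ~1 GB.
def weights_gb(params_billion, bytes_per_param):
    return params_billion * bytes_per_param

for precision, bpp in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = weights_gb(70, bpp)   # hypothetical 70B-parameter model
    print(f"70B @ {precision}: ~{gb:.0f} GB  "
          f"(fits in 192 GB: {gb < 192}, fits in one 80 GB A100: {gb < 80})")
```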
3 A100 at $5300/mo each for the past 2 years is over $380,000. Considering it worked for them, I'd consider it a massive success.
From another perspective though, they could have bought 72 of those Ultra machines for that much money and had most devs on their own private instance.
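For reference, the arithmetic behind those two figures, using only the numbers quoted above (the per-Studio price is just what the 72-machine figure implies, not a list price):

```python
# Cost figures from the thread: 3 rented A100s at $5,300/month for 2 years.
a100_monthly = 5300
total = 3 * a100_monthly * 24
print(f"${total:,}")            # $381,600 -> "over $380,000"
print(f"${total / 72:,.0f}")    # ~$5,300 per machine, implied by the 72-Studio figure
```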
The simple fact is that Nvidia GPUs are massively overpriced. Nvidia should worry a LOT that Apple's private AI cloud is going to eat their lunch.
Small correction: the M2 Ultra isn't found in laptops, it's in the Studio.
Right?
If you need to do some work offline, or for some reason the place you work blocks access to cloud providers, it's not a bad way to go, really. Note that if you're on battery, heavy LLM use can kill your battery in an hour.
- Apple: all the capacity and bandwidth, but no compute to utilize it
- AMD/Nvidia: all the compute and bandwidth, but no capacity to load anything
- DDR5: all the capacity, but no compute or bandwidth (cheap tho)
The M4-Max I just ordered comes with 128GB of RAM.
Somewhat niche case, I know.
:P
I wonder if that has changed or is about to change as Apple pivots their devices to better serve AI workflows as well.
Surely the entire staff can't be out rock climbing, surfing, eating at trendy Asian-inspired restaurants at twilight, and having catered children's birthday parties in immaculately manicured parks.
It kind of just comes off as one of those YouTube liminal space horror videos when it's that empty.
Think about the early ipod ads, just individuals dancing to music by themselves. https://www.youtube.com/watch?v=_dSgBsCVpqo
You can even go back to 1983 "Two kinds of people": a solitary man walks into an empty office, works by himself on the computer and then goes home for breakfast. https://youtu.be/4xmMYeFmc2Q
It's weirdly dystopian. I didn't realize it bothered me until moments before my comment, but now I can't get it out of my head.
I would think that a brand that is at least trying to put some emphasis on privacy in their products would also extend the same principle to their workforce. I don’t work for Apple, but I doubt that most of their employees would be thrilled about just being filmed at work for a public promo video.
> liminal space horror
reminds me of that god awful crush commercial
This was reminder to me that art is subjective. I don’t get the outrage. I kinda like it.
> MacBook Pro with M4 Pro is up to 3x faster than M1 Pro (13)
> (13) Testing conducted by Apple from August to October 2024 using preproduction 16-inch MacBook Pro systems with Apple M4 Pro, 14-core CPU, 20-core GPU, 48GB of RAM and 4TB SSD, and production 16-inch MacBook Pro systems with Apple M1 Pro, 10-core CPU, 16-core GPU, 32GB of RAM and 8TB SSD. Prerelease Redshift v2025.0.0 tested using a 29.2MB scene utilising hardware-accelerated ray tracing on systems with M4 Pro. Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.
So they're comparing software that uses raytracing present in the M3 and M4, but not in the M1. This is really misleading. The true performance increase for most workloads is likely to be around 15% over the M3. We'll have to wait for benchmarks from other websites to get a true picture of the differences.
Edit: If you click on "go deeper on M4 chips", you'll get some comparisons that are less inflated, for example, code compilation on Pro:
14-inch MacBook Pro with M4 4.5x
14-inch MacBook Pro with M3 3.8x
13-inch MacBook Pro with M1 2.7x
So here the M4 Pro is 67% faster than the M1 Pro, and 18% faster than the M3 Pro. It varies by workload of course.
No benchmarks yet, but this article gives some tables of comparative core counts, max RAM and RAM bandwidths: https://arstechnica.com/apple/2024/10/apples-m4-m4-pro-and-m...
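For anyone checking the arithmetic, those percentages are just ratios of Apple's published multipliers, each of which is quoted relative to the same Intel baseline:
  m4, m3, m1 = 4.5, 3.8, 2.7     # Apple's "x faster" figures from the list above
  print(m4 / m1)                 # ~1.67 -> about 67% faster than the M1 machine
  print(m4 / m3)                 # ~1.18 -> about 18% faster than the M3 machine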
> ...the new MacBook Pro starts with 16GB of faster unified memory with support for up to 32GB, along with 120GB/s of memory bandwidth...
I haven't been an Apple user since 2012 when I graduated from college and retired my first computer, a mid-2007 Core2 Duo Macbook Pro, which I'd upgraded with a 2.5" SSD and 6GB of RAM with DDR2 SODIMMs. I switched to Dell Precision and Lenovo P-series workstations with user-upgradeable storage and memory... but I've got 64GB of RAM in the old 2019 Thinkpad P53 I'm using right now. A unified memory space is neat, but is it worth sacrificing that much space? I typically have a VM or two running, and in the host OS and VMs, today's software is hungry for RAM and it's typically cheap and upgradeable outside of the Apple ecosystem.
That's an architectural limitation of the base M4 chip; if you go up to the M4 Pro you can get up to 48GB, and the M4 Max goes up to 128GB.
The M4 Pro goes up to 48 GB
The M4 Max can have up to 128 GB
The M4 Pro with 14‑core CPU & 20‑core GPU can do 48GB.
If you're looking for ~36-48GB of memory, here are the options:
$2,800 = 48GB, Apple M4 Pro chip with 14‑core CPU, 20‑core GPU
$3,200 = 36GB, Apple M4 Max chip with 14‑core CPU, 32‑core GPU
$3,600 = 48GB, Apple M4 Max chip with 16‑core CPU, 40‑core GPU
So the M4 Pro could get you a lot of memory, but fewer GPU cores. Not sure how much those GPU cores factor into performance; I only really hear complaints about the memory limits... Something to consider if you're looking to buy in this range of memory.
Of course, a lot of people here probably consider it no big deal to throw an extra 3 grand at hardware, but I'm a hobbyist in academia when it comes to AI; I don't make big 6-figure salaries :-)
Do I get 2 extra CPU cores, build a budget gaming PC, or subscribe to creative suite for 2.5 years!?
M4 Max 14 core has a single option of 36GB.
M4 Max 16 core lets you go up to 128GB.
So you can actually get more RAM with the Pro than with the base-level Max.
Apple has hardware accelerated compressed swapping.
Windows has compressed swapping.
And Linux is a mess. You have to manually configure a non-resizable compressed zram, or use it without compression on a non-resizable swap partition.
Although machines with Apple Silicon swap flawlessly, I worry about degrading the SSD, which is non-replaceable. So ultimately I pay for more RAM so that I don't need swapping at all.
No Wifi 7. So you get access to the 6 GHz band, but not some of the other features (preamble puncturing, improved OFDMA):
* https://en.wikipedia.org/wiki/Wi-Fi_7
* https://en.wikipedia.org/wiki/Wi-Fi_6E
The iPhone 16s do have Wifi 7. Curious to know why they skipped it (and I wonder if the chipsets perhaps do support it, but it's a firmware/software-not-yet-ready thing).
I had just assumed that for sure this would be the year I upgrade my M1 Max MBP to an M4 Max. I will not be doing so knowing that it lacks WiFi 7. As one of the child comments notes, I count on getting a solid 3 years out of my machine, so future-proofing carries some value (and I already have WiFi 7 access points). I download terabytes of data in some weeks for the work I do, and not having to plug into Ethernet at a fixed desk to do that efficiently will be a big enough win that I will wait another year before shelling out $6k “off-cycle”.
Big bummer for me. I was looking forward to performance gains next Friday.
https://www.tomsguide.com/face-off/wi-fi-6e-vs-wi-fi-7-whats...
Laptops/desktops (with 16GB+ of memory) could make use of the faster speed/more bandwidth aspects of WiFi7 better than smartphones (with 8GB of memory).
Machines can last and be used for years, and it would be a presumably very simple way to 'future proof' things.
And though the IEEE spec hasn't officially been ratified as I type this, it is set to be by the end of 2024. Network vendors are also shipping APs with the functionality, so in coming years we'll see a larger and larger infrastructure footprint going forward.
One of the features is preamble puncturing, which is useful in denser environments:
* https://community.fs.com/article/how-preamble-puncturing-boo...
* https://www.ruckusnetworks.com/blog/2023/wi-fi-7-and-punctur...
MLO helps with resiliency and the improved OFDMA helps with spectrum efficiency as well. It's not just about speed.
The real use is transferring huge files within the LAN.
Wifi 6e/7 6GHz band with a bunch of non overlapping 160MHz channels is where the juice is at. But even then a lot of devices are limited to smaller channel widths.
Also, any recommendations for suitable ssds, ideally not too expensive? Thank you!
Here is the rabbit hole you might want to check out: https://dancharblog.wordpress.com/2024/01/01/list-of-ssd-enc...
With a TB4 case with an NVME you can get something like 2300MB/s read speeds. You can also use a USB4 case which will give you over 3000MB/s (this is what I'm doing for storing video footage for Resolve).
With a TB5 case you can go to like 6000MB/s. See this SSD by OWC:
I own a media production company. We use Sabrent Thunderbolt external NVMe TLC SSDs and are very happy with their price, quality, and performance.
I suggest you avoid QLC SSDs.
I have 2-4TB drives from Samsung, WD and Kingston. All work fine and are ridiculously fast. My favourite enclosure is from DockCase for the diagnostic screen.
I tried another brand or two of enclosures and they were HUGE, while the Acasis was credit-card sized (except for thickness).
AFAIK the main OEM producer is Winstars, though I could only find sketchy-looking AliExpress sellers so far.
Switched to samsung t9s, so far so good.
My only complaint is that Apple gouges you for memory and storage upgrades. (But in reality I don't want the raw and rendered video taking up space on my machine).
For video editing - even 8K RAW - you don't need insanely fast storage. A 10GBit/s external SSD will not slow you down.
Wish I could spin up a Linux OS on the hardware though. Not a bright spot for me.
It won't have all the niceties / hardware support of MacOS, but it seamlessly coexists with MacOS, can handle the GPU/CPU/RAM with no issues, and can provide you a good GNU/Linux environment.
I guess you could have a physical MBP in your house and connect it to some bring-your-own-infrastructure CI setup, but most people wouldn't want to do that.
It's a pretty hard problem to partially automate for setups with an engineer in the room. It doesn't sound at all feasible for an unattended data center setup that's designed to host Xcode for compiling apps under macOS.
(This isn't a dig on the Asahi project btw, I think it's great).
However, it doesn't support snapshots for Linux, so you need to power down each session.
> MacBook Air: The World’s Most Popular Laptop Now Starts at 16GB
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
I’ve been using the exact model for about a year and I rarely find limitations for my typical office type work. The only time I’ve managed to thermally throttle it has been with some super suboptimal Excel Macros.
I believe the rumor is that the MacBook Air will get the update to M4 in early spring 2025, February/March timeline.
Unfortunately Apple won’t tell you until the day they sell the machines.
That said, I believe you. Some press gets a hands-on on Wednesday (today) so unless they plan to pre-announce something (unlikely) or announce software only stuff, I think today is it.
Also, Studio and Pro are hanging there.
The streaming apps virtually all support downloading for offline viewing on iPhone, but the Apple TV just becomes a paperweight when the internet goes out, because I'm not allowed to use the 128GB of storage for anything.
If they're not going to let you use the onboard storage, then it seems unlikely for them to let you use USB storage. So, first, I would like them to change their app policies regarding internal storage, which is one of the purely software improvements I would like to see.
But there are some cases like e.g. watching high-res high-FPS fractal zoom videos (e.g. https://www.youtube.com/watch?v=8cgp2WNNKmQ) where even brief random skipped frames from other things trying to use WiFi at the same time can be really noticeable and annoying.
On my Mac I don't have any of these things, it's mostly for programming and some packages. I'm almost always connected to Wi-Fi (except on planes) so I don't really need any photos or videos.
The only people that I see have high storage requirements on Macs are probably video/media creators? As a programmer I'm totally fine with 512GB, but could probably live with 256GB if I wanted to be super lean.
And it's still only 512GB! The M4 version coming in the new year will surely bump this up to something more sensible.
Only in the US, it seems. India got a price increase of $120.
Makes me wonder what else will be updated this week (Studio or Mac Pro)?
I get that small form factor requires soldered ram, but RAM is also very cheap these days.
Just kidding! As an Apple Shareholder I feel like you should take what Apple gives you and accept the price. ;)
The Apple ARM laptops are just at an arbitrary point along the power/efficiency scale.
If it happens to match your needs: great
But it's not like it's ahead of the industry in any way ^^
P.S. writing to you from a six year old ThinkPad running Linux. I'm not an Apple fanboy. It is my opinion that Apple's products are leagues ahead of the rest of the industry and it's not even close: performance, efficiency, and build quality are incredible.
I guess it depends on the person which computer is better for them.
https://techcrunch.com/2020/11/17/yeah-apples-m1-macbook-pro...
I hope that the response times have improved, because it has been quite poor for a 120 Hz panel.
Will that break the display like it did for tape over the webcam?
For my current laptop, I finally broke down and bought a tempered glass screen protector. It adds a bit of glare, but wipes clean — and for the first time I have a one-year-old MacBook that still looks as good as new.
(I tend to feel if you want something specialized, you gotta pay for the expensive model)
It sounds more exciting than M4 is 12.5% faster than M3.
It's actually interesting to think about. Is there a speed multiplier that would get me off this machine? I'm not sure there is. For my use case the machine performance is not my productivity bottleneck. HN, on the other hand... That one needs to be attenuated. :)
Back when Moore's law was still working they didn't skip generations like this.
I've just ordered an (almost) top-of-the-range MBP Max, my current machine is an MBP M1-max, so the comparisons are pretty much spot-on for me.
Selling the M1 Ultra Studio to help pay for the M4 MBP Max, I don't think I need the Studio any more, with the M4 being so much faster.
Everyone knows SSDs made a big difference in user experience. For the CPU, normally if you aren't gaming at high settings or "crunching" something (compiling or processing video etc.) then it's not obvious why CPU upgrades should be making much difference even vs. years-old Intel chips, in terms of that feel.
There is the issue of running heavy JS sites in browsers but I can avoid those.
The main issue seems to be how the OS itself is optimized for snappiness, and how well it's caching/preloading things. I've noticed Windows 10 file system caching seems to be not very sophisticated for example... it goes to disk too often for things I've accessed recently-but-not-immediately-prior.
Similarly, when it comes to generating heat: if laptops are getting hot even while doing undemanding office tasks with huge periods of idle time, then basically it points to stupid software -- or let's say poorly balanced software (likely aimed more at benchmark numbers than at user experience).
https://nanoreview.net/en/cpu-compare/apple-m1-vs-amd-ryzen-...
M4 is built with TSMC's 2nd Gen 3nm process. M3 is on the 1st gen 3nm.
For the base M3 vs base M4:
- the CPU (4P+4E) & GPU (8) core counts are the same
- NPU perf is slightly better for M4, I think (M4's 38 TOPS @ INT8 vs M3's 18 TOPS @ INT16)
- Memory Bandwidth is higher for M4 (120 GB/s vs 102.4 GB/s; see the quick math after this list)
- M4 has a higher TDP (22W vs 20W)
- M4 has higher transistor count (28B vs 25B)
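The bandwidth gap falls straight out of the DRAM speed on the same 128-bit bus. Quick sketch (LPDDR5-6400 for the base M3 and LPDDR5X-7500 for the base M4 are my inference from the published bandwidth figures, not something Apple states in the comparison):
  bus_bytes = 128 / 8                  # both base chips use a 128-bit unified memory bus
  print(bus_bytes * 6400 / 1000)       # 102.4 GB/s (LPDDR5-6400, base M3)
  print(bus_bytes * 7500 / 1000)       # 120.0 GB/s (LPDDR5X-7500, base M4)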
Not all products got the M3, so in some lines this week is the first update in quite a while. In others like MBP it’s just the yearly bump. A good performing one, but the yearly bump.
I'd really like to justify upgrading, but a $4k+ spend needs to hit greater than 2x for me to feel it's justified. 1.8x is still "kind of the same" as what I have already.
wot, m8? Only Apple will call a 12 megapixel camera “advanced”. Same MPs as an old iPhone 6 rear camera.
Aside from that, it’s pretty much the same as the prior generation. Same thickness in form factor. Slightly better SoC. Only worth it if you jump from M1 (or any Intel mbp) to M4.
Would be godlike if Apple could make the chip swappable. Buy a Mac Studio M2 Ultra Max Plus. Then just upgrade SoC on an as needed basis.
It would probably meet their carbon neutral/negative goals much faster and reduce e-waste. Unfortunately this is an American company that has got to turn a profit. Profit over environment and consumer interests.
There will always be a long tail of niche Windows games (retro + indie especially). But you can capture the Fortnite (evergreen) / Dragon Age (new AAA) audience.
1) Either Apple wants to maintain the image of the Macbook as a "serious device", and not associate itself with the likes of "WoW players in their mom's basement".
2) Microsoft worked something out with Apple, where Apple would not step significantly on the gaming market (Windows, Xbox). I can't think of another reason why gaming on iOS would be just fine, but abysmal on MacOS. Developers release games on MacOS _despite_ the platform.
Laptop cameras are significantly smaller in all dimensions than phone cameras. Most laptop cameras are 1-4MP: most are 720p (1MP), and a few are 1080p (2MP). The previous MacBook was 1080p.
For reference, a 4k image is 8MP.
12MP is absolutely a massive resolution bump, and I’d challenge you to find a competitive alternate in a laptop.
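Quick pixel math behind those numbers (the 4032x3024 layout is just a typical geometry for a 12MP sensor, used here for illustration):
  print(3840 * 2160 / 1e6)   # ~8.3 MP in a 4K frame
  print(1920 * 1080 / 1e6)   # ~2.1 MP in a 1080p webcam frame
  print(4032 * 3024 / 1e6)   # ~12.2 MP, a common 12MP still-photo resolution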
I blame the confusion on PC & Android marketing people who pushed for years and years the idea that the higher the megapixel count, the better the camera. Non-Apple customers should be really pissed off about the years of misinformation and indoctrination on a false KPI.
The marketing gimmicks pushed generations of devices to optimize for meaningless numbers. At times, even Apple was forced to adopt those. Such a shame.
It’s essentially a matte coating, but the execution on iPad displays is excellent. While it doesn’t match the e-ink experience of devices like the Kindle or ReMarkable, it’s about 20-30% easier on the eyes. The texture feels also great (even though it’s less relevant for a laptop), and the glare reduction is a welcome feature.
I prefer working on the MacBook screen, but I nearly bought an Apple Studio Display XDR or an iPad as a secondary screen just for that nano-texture finish. It's super good news that this is coming to the MacBook Pro.
I am probably not the best example to emulate lol.
I will upgrade to the M4 Pro. I really hate the glare when I travel (and I do that a lot), but at the same time I don't want to lose any of the quality that the MBP delivers, which is quite excellent IMHO.
You can tell not because the system temp rises, but because suddenly Spotify audio begins to pop, constantly and irregularly.
It took me a year to figure out that the system audio popping wasn't hardware and indeed wasn't software, except in the sense that memory (or CPU?) pressure seems to be the culprit.
Even when I remove all "Intel" type apps in activity monitor, I still experience the issue though.
It definitely gets unstable in those situations, but you probably don't want your scripts randomly OOM killed either.
Gee, I wonder why.
$ man proc_setpcontrol
No manual entry for proc_setpcontrol
I've heard it's easier to just use cloud options, but I sill like the idea of being able to run actual models and train them on my laptop.
I have a M1 MacBook now and I'm considering trading in to upgrade.
I've seen somewhat conflicting things regarding what you get for the money. For instance, some reports recommending a M2 Pro for the money IIRC.
This is nice, and long overdue.
Great to see Affinity becoming so popular that it gets acknowledged by Apple.
I might still keep it another year or so, which is a testament to how good it is and how relative little progress has happened in almost 10 years.
I'm not using mine any more, but I noticed a big difference when I replaced the battery and got all the dust out this spring. Still quite a hard sell on the used market :)
Is an upgrade really worth it?
If you do any amount of 100% CPU work that blocks your workflow, like waiting for a compiler or typechecker, I think M1 -> M4 is going to be worth it. A few of my peers at the office went M1->M3 and like the faster compile times.
Like, a 20 minute build on M1 becoming a 10 minute build on M4, or a 2 minute build on M1 becoming a 1 minute build on M4, is nothing to scoff at.
I myself don’t need so much performance, so I tend to keep my devices for many, many years.
> All Apple Silicon Macs are in scope, as well as future generations as development time permits. We currently have support for most machines of the M1 and M2 generations.[^1][^2]
https://softwareengineeringdaily.com/2024/10/15/linux-apple-...
1. Nested virtualization doesn't work in most virtualization software, so if your workflow involves running stuff in VMs it is not going to work from within another VM. The exception is apparently now the beta version of UTM with the Apple Virtualization backend, but that's highly experimental.
2. Trackpad scrolling is emulated as discrete mouse wheel clicks, which is really annoying for anyone used to the smooth scrolling on macOS. So what I do is use macOS for most browsing and other non-technical stuff but do all my coding in the Linux VM.
https://developer.apple.com/documentation/virtualization/vzg...
This is the sad situation on my M2 MacBook Pro :(
$ swift repl
Welcome to Apple Swift version 6.0.2 (swiftlang-6.0.2.1.2 clang-1600.0.26.4).
Type :help for assistance.
1> import Virtualization
2> VZGenericPlatformConfiguration.isNestedVirtualizationSupported
$R0: Bool = false
It's 2024, and I still see most Windows users carrying a mouse to use with their laptop.
Wish the nano-texture display was available when I upgraded last year. The last MacBook I personally bought was in 2012 when the first retina MBP had just released. I opted for the "thick" 15" high-res matte option. Those were the days...
Another positive development was bumping up baseline amounts of RAM. They kept selling machines with just 8 gigabytes of RAM for way longer than they should have. It might be fine for many workflows, but feels weird on “pro” machines at their price points.
I’m sure Apple has been coerced to up its game because of AI. Yet we can rejoice in seeing their laptop hardware, which already surpassed the competition, become even better.
In January, after researching, I bought an Apple refurbished MBP with an M2 Max over an M3 Pro/Max machine because of the performance/efficiency core ratio. I do a lot of music production in DAWs, and many, even Apple's Logic Pro, don't really make use of efficiency cores. I'm curious what constraints have led to this... but perhaps this also factors into Apple's choice to increase the ratio of performance to efficiency cores.
I believe that’s the case. Most times, the performance cores on my M3 Pro laptop remain idle.
What I don’t understand is why battery life isn’t more like that of the MacBook Airs when not using the full power of the SOC. Maybe that’s the downside of having a better display.
Curious how you're measuring this. Can you see it in Activity Monitor?
> Maybe that’s the downside of having a better display.
Yes I think so. Display is a huge fraction of power consumption in typical light (browsing/word processing/email) desktop workloads.
I use an open source app called Stats [1]. It provides a really good overview of the system on the menu bar, and it comes with many customization options.
Yes, processor history in the activity monitor marks out specific cores as Performance and Efficiency.
Example: https://i.redd.it/f87yv7eoqyh91.jpg
Of course I'm rooting for competition, but Apple seems to be establishing a bigger and bigger lead with each iteration.
New video format or more demanding music software is released that slows the machine down, or battery life craters.
Well, I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP. I can’t remember it ever skipping a beat running a serious Ableton project, or editing in Resolve.
Can stuff be faster? Technically of course. But this is the first machine that even after several years I’ve not caught myself once wishing that it was faster or had more RAM. Not once.
Perhaps it’s my age, or perhaps it’s just the architecture of these new Mac chips are just so damn good.
Apple's M1 came at a really interesting point. Intel was still dominating the laptop game for Windows laptops, but generational improvements felt pretty lame. A whole lot of money for mediocre performance gains, high heat output and not very impressive battery. The laptop ecosystem changed rapidly as not only the Apple M1 arrived, but also AMD started to gain real prominence in the laptop market after hitting pretty big in the desktop and data center CPU market. (Addendum: and FWIW, Intel has also gotten a fair bit better at mobile too in the meantime. Their recent mobile chipsets have shown good efficiency improvements.)
If Qualcomm's Windows on ARM efforts live past the ARM lawsuit, I imagine a couple generations from now they could also have a fairly compelling product. In my eyes, there has never been a better time to buy a laptop.
(Obligatory: I do have an M2 laptop in my possession from work. The hardware is very nice, it beats the battery life on my AMD laptop even if the AMD laptop chews through some compute a bit faster. That said, I love the AMD laptop because it runs Linux really well. I've tried Asahi on an M1 Mac Mini, it is very cool but not something I'd consider daily driving soon.)
You say that, but I get extremely frustrated at how slow my Surface Pro 10 is (with an Ultra 7 165U).
It could be Windows of course, but this is a much more modern machine than my MacBook Air (M1), and at times it feels almost 10 years old in comparison, despite being 3-4 years newer.
That said, Intel still has yet to catch up to AMD on efficiency unfortunately, they've improved generationally but if you look at power efficiency benchmarks of Intel CPUs vs AMD you can see AMD comfortably owns the entire top of the chart. Also, as a many-time Microsoft Surface owner, I can also confirm that these devices are rarely good showcases for the chipsets inside of them: they tend to be constrained by both power and thermal limits. There are a lot of good laptops on the market, I wouldn't compare a MacBook, even a MacBook Air, a laptop, with a Surface Pro, a 2-in-1 device. Heck, even my Intel Surface Laptop 4, a device I kinda like, isn't the ideal showcase for its already mediocre 11th gen Intel processor...
The Mac laptop market is pretty easy: you buy the laptops they make, and you get what you get. On one hand, that means no need to worry about looking at reviews or comparisons, except to pick a model. They all perform reasonably well, the touchpad will always be good, the keyboard is alright. On the other hand, you really do get what you get: no touchscreens, no repairability, no booting directly into Windows, etc.
And it's not the same - running Windows natively on Mac would seriously degrade the Mac, while running macOS on a PC has no reason to make it worse than with Windows. Why not buy a PC laptop at that point? The close hardware/OS integration is the whole point of the product. Putting Windows into a VM lets you use best of both.
I'm pretty sure you would never use a Windows PC just to boot into a macOS VM, even if it was flawless. And there are people who would never boot a Mac, just to boot into a Windows VM, even if it was flawless. And no, it's not flawless. Being able to run a relatively old strategy game is not a great demonstration of the ability generally play any random Windows game. I have a Parallels and VMWware Fusion license (well... Had, anyway), and I'm a long time (20 years) Linux user, I promise that I am not talking out my ass when it comes to knowing all about the compromises of interoperability software.
To be clear, I am not trying to tell you that the interoperability software is useless, or that it doesn't work just fine for you. I'm trying to say that in a world where the marketshare of Windows is around 70%, a lot of people depend on software and workflows that only work on Windows. A lot of people buy PCs specifically to play video games, possibly even as a job (creating videos/streaming/competing in esports teams/developing video games and related software) and they don't want additional input latency, lower performance, and worse compatibility.
Even the imperfections of virtual machines aside, some people just don't like macOS. I don't like macOS or Windows at all. I think they are both irritating to use in a way that I find hard to stomach. That doesn't mean that I don't acknowledge the existence of many people who very much rely on their macOS and Windows systems, the software ecosystems of their respective systems, and the workflows that they execute on those systems.
So basically, aside from the imperfections of a virtual machine, the ability to choose to run Windows as your native operating system is really important for the obvious case where it's the operating system you would prefer to run.
Aside: Really, it's a combination of factors. First, Apple uses a bespoke boot chain, interrupt controller, etc. instead of UEFI and following ARM SystemReady standards like virtually all of the other desktop and server-class ARM machines, and didn't bother with any interoperability. The boot process is absolutely designed just to be able to boot XNU, with tiny escape hatches making it slightly easier to jam another payload into it. On the other hand, just out of pure coincidence, Windows apparently statically links the HAL since Windows 10 version 2004, making it impossible for a straight port to be done anymore. In any case, the Apple Silicon computers are designed to boot macOS, and "went out of their way to make it possible" is an absurd overstatement of what they did. What they did was "do the absolute minimum to make it possible without doing anything to make it strictly impossible." Going out of their way implies they actually made an effort to make it possible, but officially as far as I know Apple has only ever actually acknowledged virtual machines.
I think it would be fair to argue that the reverse is true, too: If you choose to buy a PC, you will be stuck with Windows, or an alternative PC operating system. (Of course, usually a Linux distribution, but sometimes a *BSD, or maybe Illumos. Or hell, perhaps Haiku.) That said, objectively speaking Windows has more marketshare and a larger ecosystem, for better or worse, so the number of people who strictly need and strictly want Windows is going to naturally be higher than the comparative numbers for macOS. This doesn't imply one is better than the other, but it still matters if you're talking about what laptop to buy.
> the platform is reasonably compatible with PC.
Not sure what you mean here. The Apple Silicon platform has basically nothing in common with the x64 PC. I guess it has a PCI express bus, but even that is not attached the same way as any typical x64 PC.
The Apple Silicon platform is actually substantially similar to the iOS platform.
> compared to devices with Qualcomm chips, for example
Also not sure what this is meant to mean, but with the Snapdragon X Elite platform, Qualcomm engineers have been working on upstream Linux support for a while now. In contrast I don't think Apple has contributed or even publicly acknowledged Asahi Linux or any of the Linux porting efforts to Apple Silicon.
Battery life is decent.
At this point I’m not switching from laptop Linux. The machines can even game (thanks proton/steam)
https://browser.geekbench.com/macs/macbook-pro-14-inch-2021-...
https://browser.geekbench.com/v6/cpu/4260192
Both of these CPUs perform well enough that most users will not need to be concerned at all about the compute power. Newer CPUs are doing better but it'd be hard to notice day-to-day.
As for other laptop features... That'll obviously be vendor-dependent. The biggest advantage of the PC market is all of the choices you get to make, and the biggest disadvantage of the PC market is all of the choices you have to make. (Edit: Though if anyone wants a comparison point, just for sake of argument, I think generally the strongest options have been from ASUS. Right now, the Zephyrus G16 has been reviewing pretty good, with people mostly just complaining that it is too expensive. Certainly can't argue with that. Personally, I run Framework, but I don't really run the latest-and-greatest mobile chipsets most of the time, and I don't think Framework is ideal for people who want that.)
those are another two reasons why I can't ignore Apple Silicon
The other thing I hate about the Thinkpads is that the build/screen/trackpad quality sucks in comparison to the Apple stuff. And for all the griping about Mac OS on this site, Windows is way worse - you can tell MS's focus is on linux in the cloud these days. All the ancillary stuff Apple is good at is underappreciated.
My Skylake one (I think that would be 6 years old now?) is doing absolutely fine. My Broadwell one is starting to feel a little aged but perfectly usable, I wouldn't even _consider_ upgrading it if I was in the bottom 95% of global income.
Compiling is very slow on these, but I think I'd avoid compilation on my laptop even if I had a cutting edge CPU?
YMMV.
I've had my xps 13 since 2016. Really the only fault I have against it nowadays is that 8gb of ram is not sufficient to run intellij anymore (hell, sometimes it even bogs down my 16gb mbp).
Now, I've also built an absolute beast of a workstation with a 7800x3d, 64gb ram, 24 gb vram and a fast ssd. Is it faster than both? Yeah. Is my old xps slow enough to annoy me? Not really. Youtube has been sluggish to load / render here lately but I think that's much more that google is making changes to make firefox / ublock a worse experience than any fault of the laptop.
FWIW, Qualcomm cancelled orders of its Windows devkit and issued refunds before the lawsuit. That is probably not a good sign
My work machine was upgraded from an M1 with 16GB of RAM to an M3 Max with 36GB and the difference in Xcode compile times is beyond belief: I went from something like 1-2 minutes to 15-20 seconds.
Obviously if opening a browser is the most taxing thing your machine is doing the difference will be minimal. But video or music editing, application-compiling and other intensive tasks, then the upgrade is PHENOMENAL.
I upgraded from a 13 Pro to a 15 Pro expecting zippier performance, and it feels almost identical, if not weirdly a bit slower, in rendering and typing.
I wonder what it will take to make Mac/iOS feel faster
I went from an iPhone 13 mini to an iPhone 16 and it's a significant speed boost.
I know, disabling shadows and customisable animation times ;) On a jailbroken phone I once could disable all animation delays, it felt like a new machine (must add that the animations are very important and generally great ux design, but most are just a tad too slow)
The new camera button is kinda nice though.
I was initially indifferent about the camera button, but now that I'm used to it it's actually very useful.
The CPU? Ah, never really felt a difference.
Infuriated by the 13.
The Lightning-to-3.5mm audio adapters disconnect more often than usual. All I need to do is tap the adapter and it disconnects.
And that Apple has now stopped selling them is even more infuriating, it's not a faulty adapter.
I use a small Anker USB-A to USB-C adapter [1]. They're rock solid.
As great as the AirPod Pro 2s are, a wired connection is superior in terms of reliability and latency. Although greatly improved over the years, I still have occasional issues connecting or switching between devices.
Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
Interestingly, the last time I used Android, I had to sideload Adguard (an adblocker). On the App Store, it's just another app alongside competing adblockers. No such apps existed in the Play Store to provide system-level blocking, proxying, etc. Yes, browser extensions can be used, but that doesn't cover Google's incessant quest to bypass adblockers (looking at you Google News).
[0] https://www.audio-technica.com/en-us/ath-m50xsts [1] https://www.amazon.com/Adapter-Anker-High-Speed-Transfer-Not...
I have custom scripts, Ad blocking without VPNs, Application firewalls.
I enjoy having most-full control of my device.
The what? is this the adapter for 3.5mm headphones? If so, you don't have to get Apple made dongles. Third parties make them also.
I'd guess the GPs actual problem is lint in the Lightning port though. Pretty common, relatively easy to clean out too, especially compared to USB-C.
Regardless of either, they both have the same fault.
The connector between the phone and the adapter is poor. It could just be a fault with my phone but I have no way of proving this.
I suspect this sounds like a problem with your specific phone. Never had a problem with any lightning accessories myself.
But it is wild that two years ago running any sort of useful genAI stuff on a MBP was more-or-less a theoretical curiosity, and already today you can easily run models that would have exceeded SotA 2 years ago.
Somewhat ironically, I got into the "AI" space a complete skeptic, but thinking it would be fun to play with nonetheless. After 2 years of daily work with these models I'm becoming increasingly convinced they are going to be disruptive. No AGI, but it will certainly reduce a lot of labor and enable things that weren't really feasible before. Best of all, it's clear a lot of this work will be doable from a laptop!
I upgraded my M1 MBP to a MacBook Air M3 15" and it was a major upgrade. It is the same weight but 40% faster and so much nicer to work on while on the sofa or traveling. The screen is also brighter.
I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
EDIT: The screens are not different in terms of brightness.
I can fairly easily get my M1 Air to have thermal issues while on extended video calls with some Docker containers running, and have been on calls with others having the same issue. Kind of sucks if it's, say, an important demo. I mostly use it as a thin client to my desktop when I'm away from home, so it's not really an issue, but if I were using it as a primary device I'd want a machine with a fan.
I try to avoid docker in general during local dev and luckily it has worked out for me even with microservice architectures. It reduces dramatically CPU and RAM needs and also reduces cycle time.
Air doesn't support 120Hz refresh either.
There's an app that allows you to unlock max brightness on Pros (Vivid) [0], even without HDR content (no affiliation).
HDR support is most noticeable when viewing iPhone photos and videos, since iPhones shoot in HDR by default.
I may or may not have seen HDR content accidentally, but I'm not sure.
[0] Hawaii LG Demo: https://www.youtube.com/watch?v=WBJzp-y4BHA [1] Nature Demo: https://www.youtube.com/watch?v=NFFGbZIqi3U
YouTube shows a small red "HDR" label on the video settings icon for actual HDR content. For this label to appear, the display must support HDR. With your M3 Pro, the HDR label should appear in Chrome and Safari.
You can also right-click on the video to enable "Stats for nerds" for more details. Next to color, look for "smpte2084 (PQ) / bt2020". That's usually the highest-quality HDR video [2,3].
You can ignore claims such as "Dolby Vision/Audio". YouTube doesn't support those formats, even if the source material used it. When searching for videos, apply the HDR filter afterward to avoid videos falsely described as "HDR".
Keep in mind that macOS uses a different approach when rendering HDR content. Any UI elements outside the HDR content window will be slightly dimmed, while the HDR region will use the full dynamic range.
I consider Vivid [4] an essential app for MacBook Pro XDR displays.
Once installed, you can keep pressing the "increase brightness" key to go beyond the default SDR range, effectively doubling the brightness of your display without sacrificing color accuracy. It's especially useful outdoors, even indoors, depending on the lighting conditions. And fantastic for demoing content to colleagues or in public settings (like conference booths).
[2] https://www.benq.com/en-us/knowledge-center/knowledge/bt2020... [3] https://encyclopedia.pub/entry/32320 (see section 4) [4] https://www.getvivid.app/
Ahh. Not Firefox, of course.
Thanks, I just ran a random nature video in Safari. It was pretty. The commercials before it were extremely annoying though. I don't think it's even legal here to have so many ads per minute of content as Google inserts on youtube.
How can people use anything that doesn't run ublock origin these days?
They can keep their HDR.
I’m looking forward to the day I notice the difference so I can appreciate what I have.
I can’t understand the people who notice the 120 hz adaptive refresh whatever and one guess is their use is a lot twitchier than mine.
Even 90Hz (like on some Pixels) is substantially better than the iPhone's 60Hz.
I just noticed that I don't really try to follow the screen when I scroll down HN, for example. Yes it's blurry but I seem not to care.
[1] Source: my Galaxy something phone that I keep on my desk for when I do Android development. It has no personal stuff on it, it's only used to test apps that I work on, and even that isn't my main job (nothing since early spring this year for example). It was very smooth when I bought it, now it takes 5+ seconds to start any application on it and they stutter.
I am due to update my Mac mini because my current one can't run Sonoma, but, apart from that, it's a lovely little box with more than enough power for me.
The modern AMD or Intel desktops I've tried obviously are much faster when performing large builds and such but for general computing, web browsing, and so forth I literally don't feel much of a difference. Now for mobile devices it's a different story due to the increased efficiency and hence battery life.
And yes. Web apps are not really great on low-spec machines.
The latest of whatever you have will be so much better than the intel one, and the next advances will be so marginal, that it's not even worth looking at a buyer's guide.
A 16GB model for about a thousand bucks?? I can't believe how far MacBooks have come in the last few years
but yes, I was looking at and anticipating the max RAM on the M4 as well as the max memory speed
128GB and 546GB/s memory bandwidth
I like it, I don't know yet on an upgrade. But I like it. Was hoping for more RAM actually, but this is nice.
Where this might shift is as we start using more applications that are powered by locally running LLMs.
Why would people feel the need to upgrade?
And this applies already to phones. Laptops have been slowing for even longer.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
Getting an extra five years of longevity (after RAM became fixed) for an extra 10% was a no-brainer imho.
It is absolutely, 100%, no doubt in my mind: the hardware.
One good use case for a 32GB Mac is being able to run 8B models at full precision, something that is not possible with 8-16GB Macs.
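The arithmetic behind that claim, roughly, if "full precision" means 16-bit weights (the usual meaning in local-LLM discussions):
  params = 8e9
  bytes_per_param = 2                        # fp16/bf16
  print(params * bytes_per_param / 1e9)      # ~16 GB for the weights alone
Add the KV cache and the OS on top and a 16GB machine has nothing left, which is why 32GB is the practical floor.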
I always catch myself in this same train of thought until it finally re-occurs to me that "no, the variable here is just that you're old." Part of it is that I have more money now, so I buy better products that last longer. Part of it is that I have less uninterrupted time for diving deeply into new interests which leads to always having new products on the wishlist.
In the world of personal computers, I've seen very few must-have advances in adulthood. The only two unquestionable big jumps I can think of off hand are Apple's 5K screens (how has that been ten years?!) and Apple Silicon. Other huge improvements were more gradual, like Wi-Fi, affordable SSDs, and energy efficiency. (Of course it's notable that I'm not into PC gaming, where I know there has been incredible advances in performance and display tech.)
Only recently I noticed some slowness. I think Google Photos changed something and they show photos in HDR and it causes unsmooth scrolling. I wonder if it's something fixable on Google's side though.
The MacBook Pro does seem to have some quality-of-life improvements: Thunderbolt 5, a 12-megapixel Center Stage camera (it follows you), three USB-C ports on all models, and battery life claims of 22-24 hours. Regardless, if you want a MacBook Pro and you don't have one, there is now an argument for not just buying the previous model.
In fact, I bought a highly discounted Mac Studio with M1 Ultra because the M1 is still so good and it gives me 10Gbit ethernet, 20 cores and a lot of memory.
The only thing I am thinking about is going back to the MacBook Air again since I like the lighter form factor. But the display, 24 GiB max RAM and only 2 Thunderbolt ports would be a significant downgrade.
That said, they are in a very comfortable position right now, with neither Intel, AMD, nor any other competitor able to produce anything close to the bang-per-watt that Apple is managing. There is little pressure from behind them to push for more performance.
It seems like they bump the base frequency of the CPU cores with every revision to get some easy performance gains (the M1 was 3.2 GHz and the M3 is now 4.1 GHz for the performance cores), but it looks like this comes at the cost of it not being able to maintain the performance; some M3 reviews noted that the system starts throttling much earlier than an M1.
The only reason the 2009 one now gets little use, is its motherboard now has some electronic issues, otherwise it would serve me perfectly well.
and probably it's good that at least one of the big players has a business model that supports driving that forward
An MB Air with an M3 and no fan out-gamed my old GTX 1080 box, which stuttered on NMS-sized games all the time.
Shows just how poorly Intel has done. That company should be razed to the ground figuratively and the infrastructure given to a new generation of chip makers; the last one is clearly out of their element
Other than that it cruises across all other applications. Hard to justify an upgrade purely for that one issue when everything else is so solid. But it does make the eyes wander...
I feel the same of my laptop of 2011 so I guess it is partly age (not feeling the urge to always have the greatest) and partly it is non LLM and gaming related computing is not demanding enough to force us to upgrade.
The last few years Chrome seems to have stepped up energy and memory use, which impacts most casual use these days. Safari has also become more efficient, but it never felt bloated the way Chrome used to.
What I do know is that Linux constantly breaks stuff. I don't even think it's treading water. These interfaces are actively getting worse.
> Up to 7x faster image processing in Affinity Photo when compared to the 13‑inch MacBook Pro with Core i7, and up to 1.8x faster when compared to the 13-inch MacBook Pro with M1.
I have a Macbook Air M1 that I'd like to upgrade, but they're not making it easy. I promised myself a couple of years ago I'll never buy a new expensive computing device/phone unless it supports 120 hertz and Wi-Fi 7, a pretty reasonable request I think.
I got the iPhone 16 Pro, guess I can wait another year for a new Macbook (hopefully the Air will have a decent display by then, I'm not too keen to downgrade the portability just to get a good display).
The quality stuff retains value, not brand.
They have the highest product quality of any laptop manufacturer, period. But to say that all Apple products hold value well is simply not true. All quality products hold value well, and most of Apple's products are quality.
I guarantee you that if Apple produced a trashy laptop it would have no resell value.
Again, the quality holds the value not the brand.
That said, they did suffer from some self inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.
(Old Pentium Pro, PII, multi chip desktop days) -- When I did a different type of work, I would be in love with these new chips. I just don't throw as much at my computer anymore outside of things being RAM heavy.
The M1 (with 16 GB ram) is really an amazing chip. I'm with you, outside of a repair/replacement? I'm happy to wait for 120hz refresh, faster wifi, and longer battery life.
They always have. If you want an objective measure of planned obsolescence, look at the resale value. Apple products hold their resale value better than pretty much every competitor because they stay useful for far longer.
But it's a heavy brick with a short battery life compared to the M1/2/3 Mac.
The base model is perfect. Now to decide between the M3/M4 Air and the M4 Pro.
Only downside is the screen. The brightness sort of has to be maxed out to be readable and viewing at a wrong angle makes even that imperfect
That said it’s about the same size / weight as an iPad Pro which feels much more portable than a pro device
I’ve tried a bunch of ways to do this - and frankly the translation overhead is absolute pants currently.
Not a showstopper though, for the 20-30% of complete pain in the ass cases where I can’t easily offload the job onto a VPS or a NUC or something, I just have a ThinkPad.
Has nothing whatsoever to do with CPU/memory/etc.
- Graphics: the GPU and compute capabilities are definitely not there when comparing to NVidia mid-range offering, it's more akin to a laptop 2060.
- Can only do 3 video outputs at most while there are enough hardware outputs for 5.
The CPU is definitely fast though !
Isn't that just Firefox deciding to let things stay in memory because the memory is there? Anyway, Safari seems to run fine with 8GB and will unload inactive tabs.
> building a c++ app at -j10
Strange you're getting OOM errors instead of more swapping. But why insist on running 10 tasks at the same time then?
Frankly though, if the mac mini was a slightly lower price point I'd definitely create my own mac mini cluster for my AI home lab.
That's me, I don't give a shit about AI, video editing, modern gaming or Kubernetes. That newest and heaviest piece of software I care about is VSCode. So I think you're absolutely correct. Most things new since Docker and VSCode has not contributed massively to how I work and most of the things I do could be done just fine 8-10 years ago.
To me it's more like 3d printing as a niche/hobby.
< 1% of all engagement with a category thing is niche/hobby, yes.
That's thoroughly unconvincing. That kind of talk is exactly what so many people are tired of hearing. Especially if it's coming from technically-minded people who don't have any reason to be talking like PR drones.
- boomer luddites
- primitive single-celled organisms
- NPCs
And even people who are enthusiastic about AI but aren't fanatical about running it locally get scorn from you.
I can understand and forgive some amount of confirmation bias leading you to overestimate the importance and popularity of what you work on, but the steady stream of broad insults at anyone who even slightly disagrees with you is dismaying. That kind of behavior is wildly inappropriate for this forum. Please stop.
How old are you?
"Bro" has been gender neutral for over a decade. Males and females under the age of 25 call each other "bro" all the time.
Example?
I bought my first Macbook pro about a year and a half ago and it's still working great.
Got the money, are in the consumerism camp: Switch to latest model every year because the camera island changed 5mm.
Got the professional need in games or video and your work isn't covering your device: Switch to new model every couple of generations.
Be me: I want to extend the lifecycle of things I use. Learn how to repair what you own (it's never been easier), be aware of how you can work in today's world (who needs laptop RAM if I can spin up containers in the cloud) - I expect not to upgrade until a similarly stellar step up in the category of Intel to Apple Silicon comes along.
All past Mx versions being mostly compared to Intel baselines: Boring. M4 1.8 times faster than M1 Pro: Nice, but no QoL change. For the few times I might need it, I can spin up a container in the cloud.
My display is excellent.
14 inch is the perfect screen size.
Battery life is perfect.
Multiply it out: 220 work days a year * $10/day is $2200 a year towards your laptop.
Upgrade accordingly.
Sounds like a good rule of thumb.
Can I also ask what kind of work you do on it? I suspect that some work probably wears out computers faster than other sorts of work.
- new display tech
- better wireless connectivity
- updated protocols on ports (e.g., support for higher res displays and newer displayport/hdmi versions)
- better keyboard
- battery life
Once a few of those changes accumulate over 4+ generations of improvements that’s usually the time for me to upgrade.
My laptops so far: first 2008 plastic macbook, 2012 macbook pro, 2015 macbook pro, and M1 pro 16 currently. I skipped 2016-2020 generation which was a massive step backwards on my upgrade criteria, and updated to 2015 model in 2016 once I realized apple has lost their marbles and has no near plans on making a usable laptop at the time.
Also getting a maxed out configuration really helps the longevity.
Is there some reason your current computer isn't working for you? If not, why upgrade? Use it as long as you can do so practically & easily.
On the other extreme, I knew someone who bought a new MBP with maximum RAM specs each year. She'd sell the old one for a few hundred less than she paid, then she always had new hardware with applecare. It was basically like leasing a machine for $400/yr.
Because it made the esc key useless for touch typists and because, as a vi user, I hit esc approximately a bazillion times per day I mapped caps lock to esc.
Now my fingers don't travel as far to hit esc.
I still use that mapping even on my regular keyboards and my current non-touch-bar macs.
Thanks touchbar macs, rest in peace.
I really like the touchbar Macs because changing the volume to exactly what I want is really easy. All the others have increments that are too large, so I have to try to remember if Shift + Opt + Volume Up/Down is what I want, or some other combination.
Can't justify the constant upgrade when it doesn't make me money (work provides one). Very excited about the new one though.
But there is also another strategy: get a new Mac when they come out and sell it before/after the next model appears. There is a large market for used Macs. A friend of mine has been doing this for quite some time.
Not to mention that in the US the cell phone carriers artificially limit tethering speed or put data caps on it when you tether from your phone. You have to buy a dedicated data-only plan and modem.
Most cellular carriers offer unlimited on-device data plans, but they cap data for tethering. Integrating an LTE modem into a laptop essentially requires a mobile data plan with unlimited tethering - which, AFAIK, doesn’t exist at the moment. I’m not sure why.
I've always heard that patent disputes were at the root of the lack of a modem option. Apple had a prototype MacBook Pro back in the early Intel days IIRC but it was never released.
Maybe if Apple ever gets their in-house modems working, we'll see them on all of the product lines, but until then, it's a niche use case that likely isn't causing them to lose a ton of sales.
I understand that. My point is that I think an LTE modem in a laptop might reasonably use far more data than an LTE modem in a phone or tablet. Most people who download and/or upload very large files do so on their computer rather than their mobile devices.
There is no reason macOS cannot have some option for throttling usage by background updates when connected over LTE. iPads have an LTE option.
That carriers have not figured out how to charge me by the byte over all my devices instead of per device is really not a big issue to me. I would like to pay for an LTE modem and the necessary bandwidth.
My intuition is that when Apple has their own LTE modem and is not dependent on Qualcomm, a MacBook Pro will have an option similar to that for Dell power users.
Regarding LLMs, the hottest topic here nowadays, I plan to either use the cloud or return to a bare-metal PC.
I found a Mac to be just as simple and troublefree to support as a Chromebook, but more capable.
I was very pleasantly surprised, coming from mostly Windows (and a dash of Linux).
But when I gave my mom OSX, she used it and managed to survive. Even Ubuntu was ok for her.
And if she managed, I'm sure most people can.
It also avoids the trouble of a hosted LLM deciding to double its price overnight; costs are very predictable.
On a side note, anyone know what database software was shown during the announcement?
https://www.datensen.com/data-modeling/luna-modeler-for-rela...
MBP: Apple M4 Max chip with 16‑core CPU, 40‑core GPU and 16‑core Neural Engine
Mac mini: Apple M4 Pro chip with 14‑core CPU, 20‑core GPU, 16-core Neural Engine
What kind of workload would make me regret not having bought the MBP over the Mac mini, given the above? Thanks!
Workloads like:
- photo/video editing,
- games, or
- AI (training/inference)
that would benefit from the extra GPUs.
I think you need to pick the form factor that you need combined with the use case:
- Mobility and fast single core speeds: MacBook Air
- Mobility and multi-core: MacBook Pro with M4 Max
- Desktop with lots of cores: Mac Studio
- Desktop for single core: Mac mini
I really enjoy my MacBook Air M3 24GB for desktop + mobile use for webdev: https://news.ycombinator.com/item?id=41988340
The base model doesn't support Thunderbolt 5.
And the base model still doesn't support more than 2 external displays without the DisplayLink (not DisplayPort!) hardware+software.
"The display engine of the M4 family is enhanced to support two external displays in addition to a built-in display."
https://www.apple.com/newsroom/2024/10/apple-introduces-m4-p...
"M4 and M4 Pro
Simultaneously supports full native resolution on the built-in display at 1 billion colors and:
Up to two external displays with up to 6K resolution at 60Hz over Thunderbolt, or one external display with up to 6K resolution at 60Hz over Thunderbolt and one external display with up to 4K resolution at 144Hz over HDMI
One external display supported at 8K resolution at 60Hz or one external display at 4K resolution at 240Hz over HDMI"
> M4 Max supports up to 128GB of fast unified memory and up to 546GB/s of memory bandwidth, which is 4x the bandwidth of the latest AI PC chip. This allows developers to easily interact with large language models that have nearly 200 billion parameters.
Having more memory bandwidth is not directly helpful for running larger LLMs. A 200B-parameter model requires at least 200GB of RAM even when quantized down from the original precision (e.g. "bf16") to "q8" (8 bits per parameter), and these laptops don't even have the 200GB of RAM that would be required to run inference on that quantized version.
How can you "easily interact with" 200GB of data, in real-time, on a machine with 128GB of memory??
Edit: Actually you'd want q3 to fit a 200B model into 128GB of RAM. e.g. this one is about 140GB https://huggingface.co/lmstudio-community/DeepSeek-V2.5-GGUF...
(Isn't that kind of like saying you can do real-time 4k encoding when you actually mean it can do real-time 720p encoding and then interpolate the missing pixels?)
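For a rough sense of the arithmetic, here's a back-of-the-envelope sketch in Python (the 200B figure comes from Apple's claim above; the bits-per-parameter values are approximations for common quant levels, and it ignores KV cache and OS overhead, which push the real requirement higher):

    # Rough estimate of LLM weight memory at different quantization levels.
    # Ignores KV cache and runtime overhead, which add substantially on top.

    def weights_gb(params_billion: float, bits_per_param: float) -> float:
        """Approximate memory needed just to hold the weights, in GB."""
        return params_billion * bits_per_param / 8

    for name, bits in [("bf16", 16), ("q8", 8), ("q4", 4), ("q3", 3.5)]:
        gb = weights_gb(200, bits)
        verdict = "fits" if gb <= 128 else "does not fit"
        print(f"200B @ {name}: ~{gb:.0f} GB of weights -> {verdict} in 128 GB unified memory")

Even where the weights nominally fit, the OS, the app, and the KV cache all share that same 128GB, so "easily interact with" is doing a lot of work in that sentence.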
> Up to 4.6x faster build performance when compiling code in Xcode when compared to the 16‑inch MacBook Pro with Intel Core i9, and up to 2.2x faster when compared to the 16‑inch MacBook Pro with M1 Max.
OK, that's finally a reason to upgrade from my M1.
I genuinely want to use it as my primary machine, but with this Intel MacBook Pro I have, I absolutely dislike FaceTime, iMessage, the need to use the App Store, Apple always asking me to set up an Apple username and password (which I don't have and have zero intention of creating), and having to block Siri, block all the telemetry Apple has baked in, stop the machine calling home, etc.
This would mirror the tools available on Windows to disable and remove the built-in Microsoft bloatware and ad tracking.
Pretty much all the software I use is from brew.
Then Windows 11 came out.
My primary desktop & laptop are now both Macs because of all the malarkey in Win11. Reappearance of ads in Start and Windows Recall were the last straws. It's clear that Microsoft is actively trying to monetize Windows in ways that are inherently detrimental to UX.
I do have to say, though, that Win11 is still more customizable overall, even though it - amazingly! - regressed below macOS level in some respects (e.g. no vertical taskbar option anymore). Gaming is another major sticking point - the situation with non-casual games on macOS is dismal.
If that's your question, yes - various options exist like https://asahilinux.org
That, combined with the icloud and telemetry BS, I'd had enough.
Get a PC.
[0] https://au.store.asus.com/proart-p16-h7606wi-me124xs.html
At long last, I can safely recommend the base model macbook air to my friends and family again. At $1000 ($900 with edu pricing on the m2 model) it really is an amazing package overall.
The only downsides are that I see a kind of "burnt?" transparent spot on my screen, and that when connecting an HDMI cable, the sound does not output properly to the TV and the video I play becomes laggy. Wondering if going to the Apple Store would fix it?
Personal anecdote: don't get your hopes up. I've had my issues rejected as 'no fault found', but it's definitely worth spending a bit of time on.
A specced out Mac Studio (M2 being the latest model as of today) isn't cheap, but it can run 180B models, run them fast for the price, and use <300W of power doing it. It idles below 10W as well.
M3 Pro has a 192-bit-wide memory bus; GPU improvements mostly offset the decrease in memory bandwidth. This leads to memory options like 96GB.
M4 Pro has a 256-bit-wide memory bus, hence the factor-of-two memory options.
DRAM chips don't just come in power of two sizes anymore. You can even buy 24GB DDR5 DIMMs.
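To make the bus-width point concrete, here's a minimal sketch; the bus widths and LPDDR transfer rates are assumptions based on commonly reported figures (LPDDR5-6400 for M3 Pro, LPDDR5X-8533 for M4 Pro/Max), not official Apple specs:

    # Peak theoretical bandwidth = (bus width in bytes) x (transfer rate in MT/s).
    def peak_bandwidth_gbps(bus_width_bits: int, transfer_rate_mts: int) -> float:
        return bus_width_bits / 8 * transfer_rate_mts / 1000

    for chip, width, rate in [("M3 Pro", 192, 6400),
                              ("M4 Pro", 256, 8533),
                              ("M4 Max", 512, 8533)]:
        print(f"{chip}: ~{peak_bandwidth_gbps(width, rate):.0f} GB/s peak")
    # ~154, ~273, and ~546 GB/s -- the last matching the 546GB/s Apple quotes for M4 Max.

Capacity tends to scale the same way: a wider bus generally means more memory packages in parallel, which is part of why the capacity tiers track the bus width rather than moving in arbitrary steps.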
This is misleading:
https://news.ycombinator.com/item?id=25074959
"macOS sends hashes of every opened executable to some server of theirs"
To be fair, the link in this story is to a press release. Arguably there are probably many things in it that can be considered "misleading" in certain contexts.
for the sake of annual releases we get a new number, but besides increased silicon area, the major architectural changes seem to come every couple years.
about time 16gb was the default on something that costs four figures. the on-device ai craze in this lineup has finally pushed the company to give adequate memory.
If you look at the top-of-the-line Mac mini with 64GB of RAM, it seems to replace the Mac Studio for certain use cases...
I look at my local source vs the recording, and I am baffled.
After a decade of online meeting software, we still stream 480p quality it seems.
A huge part of group video chat is still "hacks" like downsampling non-speaking participants so the bandwidth doesn't kill the connection.
As we get fatter pipes and faster GPUs streaming will become better.
edit: I mean... I could see a future where realtime video feeds never get super high resolution and everything effectively becomes a relatively seamless AI recreation where only facial-movement data is transmitted, similar to how game engines work now.
I am asking for good 720p... With how static cam footage is, it would probably be less than 8 Mbps.
:/
Are they going to claim that 16GB RAM is equivalent to 32GB on Intel laptops? (/sarc)
> Up to 23.8x faster basecalling for DNA sequencing in Oxford Nanopore MinKNOW when compared to the 16-inch MacBook Pro with Core i9, and up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro.
It would indeed have been nice to see a faster response rate screen, even though I value picture quality more, and it also would have been nice to see even vaguely different colors like the iMac supposedly got, but it seems like a nice spec bump year anyway.
The most obvious view is that Apple price gouges on storage. But this seems too simplistic.
My conjecture is that there's an inescapable tension between supply (availability/cost), sales forecasts, technological churn, and roadmaps that leads them to want to somewhat subsidize the lowest end and place a bit of back-pressure on consumption at the high end. The trick is finding the tipping point on the curve between growth and overcommitment by suppliers, especially for tightly vertically integrated products.
The PC industry is more diffuse and horizontal and so more tolerant of fluctuations in supply and demand across a broader network of providers and consumers, leading to a lower, more even cost structure for components and modules.
In real terms, Apple's products keep costing less, just like all computer products. They seem to make a point of holding prices at a latest-tech price point that has stayed steady since the first Macs: about $2,500 for a unit that sits right behind the bleeding edge while being reliable, useful, and a vanguard of trends.
Looking at how long 8GB lasted, it's a pretty sure bet that you now won't need to upgrade for a good few years.
I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
I'd say the one incentive the MacBook Pro has over the Air is the better screen and better speakers. Not sure if it's worth the money.
> I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
If an HN user can get along with 16gb on their MacBook Air for the last X years, most users were able to get by with 8gb.
They are supposed to be "green" but they encourage obsolescence.
8GB is fine for most use cases. Part of my gig is managing a huge global enterprise with six figures of devices. Metrics demonstrate that the lower quartile is OK with 8GB, even now. Those devices are being retired as part of the normal lifecycle and replaced with 16GB, which is better.
Laptops are 2-6 year devices. Higher end devices always get replaced sooner - you buy a high end device because the productivity is worth spending $. Low end tend to live longer.
Or you could get a Framework and actually upgrade the parts that are worth upgrading, instead of "upgrading" by buying a whole new machine.
It's fine, but the issue is Linux sleep/hibernate and battery drain. To use the laptop after a few days, I have to plug it in and wait for it to charge a little because the battery has died. I have to shut it down (not just close the lid) before flying, or my backpack becomes a heater and the laptop dies. To use a MacBook that's been closed for months, I just open it and it works. I'll pay double for that experience. If I want a computer that needs to be plugged in to work, I already have a desktop for that. The battery life is not good either.
Maybe it's better now if I take the time to research what to upgrade, but I don't have the time to tinker with hardware/linux config like I did a few years ago.
That's what I loved about my 2013 MBP but didn't have the same experience with the newer models anymore either.
I think "months" or even weeks is pushing it anyways, though a good proper suspend would be much appreciated.
Personally I use my laptop every other day so don't really have the same battery issue there
I don't really see a world where this machine doesn't last me a few more years. If there's anything i'd service would be the battery, but eh. It lasts more than a few hours and I don't go out much.
Anyway, to each their own. I've also had things break, and repairability isn't really a thing with Apple hardware (while it easily could be if they wanted, even if it were difficult and handled by certified technicians instead of me).
Before the M4 models: omg, Apple only gives you 8GB RAM in the base model? Garbage!
After the M4 models: the previous laptops were so good, why would you upgrade?
APPARENTLY NOT TODAY.
C'mon mannnnn. The 90s/y2k are back in! People want the colorful consumer electronics! It doesn't have to be translucent plastic like it was back then but give us at least something that doesn't make me wonder if I live in the novel The Giver every time I walk into a meetup filled with MacBook Pros.
I'm sure the specs are great.
https://www.intel.com/content/www/us/en/developer/articles/t...
The difference between ARM and anything else is that ARM has successfully shipped all of its security features (PAC, BTI, MTE) and Intel has not.
The linked Apple Store page says "MacBook Pro blasts forward with the M3, M3 Pro, and M3 Max chips", so it seems like it's still the old version of the page?
If only they allowed their iPads to be used as a Mac screen natively, I might buy a Mini and an iPad and cover both use cases with that, but why would Apple want users to be able to do that without extra expense?
No space grey?!
A more charitable interpretation is that Apple only thinks that people with computers a few years old need to upgrade, and they aren't advertising to people with a <1 year old MacBook Pro.
yeah it's about time
don’t think it’s wise though, i bought a base m1 pro mbp when it launched and don’t feel a need to upgrade at all yet. i’m holding off for a few more years to grab one whenever the next major increase in local llm capability and battery life comes.
Another observation; I've travelled the world and rarely see people who could use robust, secure products the most (vulnerable people) using Apple products. They're all packing second-tier Samsung or LG Androids and old Windows notebooks (there are decent Samsung, LG, Android, Windows products, but that's not what they have access to).
If it affects your earning power to that extent, you should probably pony up and save in the long run, probably just a few years until you see returns.
Caste system usually can't be bypassed by paying a monthly subscription fee.
I will note that making it a subscription will tend to increase the overall costs, not decrease. In an environment with ready access to credit, I think offering on a subscription basis is worse for consumers?
If it matters that much to you just sell the old one and buy the new. That's your subscription.
I'm getting tired of everything else being updated while the product that's needed most is completely neglected, and has been for years.
And no, I don't wanna buy a separate tiny screen for thousands of dollars.
I'm also not interested in these tiny cubes you deem to be cool.
This is why Apple is comparing against M1: M1 owners are the potential buyers for this computer. (And yes, the marketing folks know the performance comparison graphs look nicer as well :)
The compile times for Swift, the gigabytes of RAM everything seems to eat up.
I closed all my apps and I'm at 10gb of RAM being used - I have nothing open.
Does this mean the Macbook Air 8gb model I had 10 years ago would basically be unable to just run the operating system alone?
It's disconcerting. Ozempic for terrible food and car-centric infrastructure we've created, cloud super-computing and 'AI' for coping with this frankenstein software stack.
The year of the Linux desktop is just around the corner to save the day, right? Right? :)
It tells me my computer is using 8gb of RAM after a restart and I haven't begun to open or close anything.
Yikes?
This also means that a cleanly booted machine with 16 GB will show more memory used than a machine with 8 GB.
Apple suggests you use the memory pressure graph instead to determine whether you're low on memory for this reason.
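For what it's worth, here's a minimal sketch of the distinction (assumes Python with the third-party psutil package; "available" is only a rough stand-in for what the memory pressure graph conveys):

    # On macOS, "used" includes file caches and other reclaimable memory, so a
    # machine with more RAM reports more "used" right after boot. "Available"
    # is the more meaningful number for deciding whether you're short on memory.
    import psutil

    vm = psutil.virtual_memory()
    print(f"total:     {vm.total / 2**30:.1f} GiB")
    print(f"used:      {vm.used / 2**30:.1f} GiB  (includes reclaimable caches)")
    print(f"available: {vm.available / 2**30:.1f} GiB (the number that actually matters)")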