It's just that me and other old-time switchers have stopped complaining about it and moved on (taoofmac.com, my blog, was started when I wrote a few very popular switcher guides, and even though I kept using the same domain name I see myself as a UNIX guy, not "just" a Mac user).
For me, Spotlight is no longer anywhere near as useful for finding files (it sometimes forgets app and shortcut names it found perfectly fine five minutes ago), and there is no longer any way to effectively prioritize the results I want (apps, not internet garbage).
Most of the other examples in the article also apply, but to be honest I've been using GNOME in parallel for years now and I consider it to be my "forever desktop" if PC hardware can ever match Apple Silicon (or, most likely, if I want something that is _just a computer_).
I'm there as well. I've been really enjoying desktop Linux lately, but I can't go back to a non-Apple laptop at this point. There's just nothing else on the market that comes close, they all make some tradeoff I'm not willing to make - either screen, speakers, keyboard, heat/battery life/fan noise, touchpad, etc. Apple is the only one that has the entire package.
There's Asahi, but no Thunderbolt yet, and I'm not sure about the future of that project with the lead burning out and quitting. I just want an Apple Silicon-esque laptop, with no tradeoffs on components, that runs Linux, and there's no OEM out there offering that experience.
So, until that happens I'm staying on mac, and even with declining quality, it's not all that bad compared to the alternatives yet. I've learned to mostly work around/ignore the odd bugs.
Meanwhile, the 7-year-old laptop with Fedora that I'm typing this on is wonderfully snappy and stable. I started to get tempted to actually switch to a Mac just to get some predictability and stability, but I have avoided Macs for years. (And: never having to deal with constant line-ending issues.)
All I hear from other co-workers is how their perfectly specced laptops lag with Windows. It's freaking Stockholm Syndrome here!
First is the endless badgering to log in, LOG IN, LOGGGG INNNNN with an asinine Microsoft account. If you can tolerate that and actually get the OS running, you're wading through a wonderland of UI regressions and defects.
The default hiding and disabling of options is infuriating. Try showing content from your Windows computer on a TV, for example. You plug your HDMI cable in, and you can select the TV as an external monitor in a reasonably logical manner. Great.
But wait... the sound is still coming from the laptop speakers. So you go to Sound in the system settings. Click on the drop-down for available devices. NOPE; the only device is the laptop speakers.
So you start hunting through "advanced settings" or some such BS. And buried in there you find the TV, detected all along, but DISABLED BY DEFAULT. WHY??? Not auto-selecting it for output is one thing, but why is it DISABLED and HIDDEN?
This is the kind of shit I have to talk my parents through over the phone so they can watch their PBS subscription on their TV. The sheer stupidity of today's Windows UI isn't just annoying, but it's demoralizing to everyday people who blame THEMSELVES for not being "computer-savvy" or slow learners. NO; it's Microsoft's monumental design incompetence and user-hostile behavior.
Microsoft doesn't get the relentless excoriation it deserves for its miserable user experience. There's no excuse for it.
I had a problem with Docker sockets while installing onto Bazzite, and didn't care enough to look further into it.
There is a mention on the Arch wiki about enabling multi-threaded shader compiles, but I have also read that you perhaps don't even need to precompile them anymore, and may get better performance as the driver JIT-compiles via a different Vulkan extension (VK_EXT_graphics_pipeline_library).
I disabled pre-caching (which affects the compile too, afaict) and never noticed any stuttering; possibly past some level of compute it's inconsequential. I also noticed that sometimes the launcher would say "pre-compiling" but actually be downloading the cache, which was a lot slower on my internet.
Certainly on my (very) old Intel system with a GTX 1060, Sekiro would try to recompile shaders on every launch, pegging my system at 99% and running for an hour. I just started skipping it and never really felt it was an issue; Sekiro still ran fine.
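For anyone who wants to experiment with the JIT path mentioned above: on Mesa's RADV driver, the VK_EXT_graphics_pipeline_library code path was historically opt-in via an environment variable (recent Mesa releases enable it by default, so treat the exact toggle as an assumption for your driver version). A hedged sketch as a per-game Steam launch option:

```shell
# Hypothetical Steam launch options (right-click game -> Properties -> Launch Options):
# enable RADV's graphics-pipeline-library path on older Mesa releases,
# then launch the game itself.
RADV_PERFTEST=gpl %command%
```

Disabling "Shader Pre-Caching" itself is done in Steam's settings UI, not via launch options.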
That said, I think anything with kernel-level anti-cheat either does not run or runs poorly.
> Bazzite is named after the mineral, as Fedora Atomic Desktops once had a mineral naming scheme.
More: https://en.wikipedia.org/wiki/Bazzite
> Bazzite is a beryllium scandium cyclosilicate mineral with chemical formula Be3Sc2Si6O18.
The real problem is that, just like the grandparent post pointed out, Apple's software quality has been declining. The Tiger to Snow Leopard epoch was incredible. Apps were simple, skeuomorphic, and robust.
Right now, the whole system feels a lot less coherent and robustness has declined. IMHO, there are not many extra features worth adding. They should focus on making all software robust and secure. Robustness should come from better languages that are safe by construction. Apple can afford to invest in this due to their vertical integration.
It's related to why companies with great marketing and fundraising but mediocre or off-the-shelf technology often win over companies with deeper, genuinely innovative tech. Innovation and polishing take work that subtracts from the time available for fundraising and marketing.
Perhaps the real challenge isn't balancing innovation and marketing—it's creating a culture that genuinely rewards bold ideas and meaningful risk-taking.
Imho, this is the wrong takeaway from parent's point.
Bureaucracy rewards many things that are actual work and take time. (Networking, politicking, min/max'ing OKRs)
Creativity and innovation are rarely part of the list, because by definition they're less tangible and riskier.
A couple effective methods I've seen to fight the overall trend are (a) instill a culture where people succeed but processes fail (if a risky bet fails then the process goes under the spotlight, not the person) and (b) tie rewards to results that are less min/maxable (10x vs +5%).
Because right now it’s clearly so far down, beneath dozens of other priorities, that expecting it to just happen one day seems futile.
And that hardware needs to be coupled with solid software to hook and keep people on this computer. So they can take more time to create more compelling upgrades and sand off more edges.
I think they need to desync all their OSes and focus on providing better releases. There really is no benefit to spending the day updating your Mac, phone, tablet, Apple TV, and HomePod, especially when there are no good reasons to update. I feel like Apple became far too addicted to habit and routine, to the point that keeping the cadence has become more important than delivering product. Apple Intelligence is a good example of that.
The things it does may not seem important today, but back then even just my bandwidth costs were a significant percentage of my shareware revenue.
ObjC with manual reference counting wasn't much fun either; while we can blame Apple for choosing ObjC in the first place, they definitely improved things.
Now that the platform is cemented, they don't have an incentive to cater to developers.
The reality is that there was a long period of time where Apple built up lots of goodwill with a developer ecosystem that exceeded by many orders of magnitude the pre-iPhone OS X indie Mac developer scene.
There were many, many developers that hadn’t even touched a Mac before the iPhone came out, and were happy with Apple, and now are certainly not.
Another way to see it is that people who programmed for Mac OS already had reasons to be annoyed by Apple (e.g. 64bit Carbon). The iPhone let it get new people, who eventually found out why the pre-iPhone scene felt that way.
Unfortunately:
1) AVP is about 10x too pricy for games.
2) It's not clear if it can beat even the cheapest headsets for anything important for telepresence (higher resolution isn't always important, but can be sometimes).
Regardless, you need the associated telepresence robot, and despite the obvious name, the closest Apple gets to iRobot is if someone buys a vacuum cleaner; Apple doesn't even have the trademark.
3) (a) is creepy, and modern AI assistants are the SOTA for (b), yet still only "neat" rather than actually achieving the AR vision dating back at least to Microsoft's HoloLens. And because AI assistants are free apps on your phone, they can't justify a €4k headset; someone would need a fantastic proprietary AI breakthrough to justify it.
Unrelenting bad press. People talking about nothing else but the decline of their software quality. We can already see that with the recent debacle which caused executive shuffling at the top of the company.
"Bad press" for their declining software quality is like people complaining there's no iPhone mini/SE anymore. Apple just doesn't give a fuck. They've joined the rest of the flock in chasing fads and quarterly bottom lines.
It was already the same story with AirPower (the wireless charging mat). They pre-announced it, and even tried to upsell it by advertising it on the AirPods packaging. It just turned out physics is ruthless.
TBH I've been increasingly sceptical about voice assistants in the "pre-AI" era. I sold my HomePods and unsubscribed from Apple Music because Siri couldn't even find things in my library.
No idea what changed, but it sucks.
I have almost the opposite problem this year. I tell the HomePod to turn the office lights on, it sometimes interprets this as a request to play music even though my library is actually empty, and the response is therefore to tell me that rather than turn on the lights.
Back in the pandemic, same problem with Alexa, except it was in the kitchen, so it said (the German equivalent of) "I can't find 'Kitchen' in your Spotify playlist" even though we didn't even have Spotify.
This is a solved problem since ~1970 -- they're just not spending enough time on it.
So you get rid of removable batteries so customers have to toss their phones more often, you gimp other features, you spend more money on advertising than you did actually developing the product (read that bit several times until it sinks in how crazy it is, yet that's how it goes with every major phone, every major movie, etc.), and so on.
I don't doubt that after 2020 the advertising budgets far outstripped the production budgets, multiple times over; I am curious whether that trend continues now that production isn't hamstrung by covid restrictions.
There are also entire "industries" designed to shield people who want to find quality content from big 'A' advertising.
https://www.youtube.com/watch?v=tGKsbt5wii0
For context: John Sculley said "Apple was the marketing company of the decade" in the '80s, and kicked Jobs out of Apple.
Today engineers have to put up a fight to do anything resembling craftsmanship.
Capitalism works this way because its customers, the investors, want it to work this way, because growth is how you get compound interest. Investors include anyone with an interest bearing bank deposit, a 401k, stocks, bonds, etc.
No growth means it would no longer be possible for an investment to appreciate.
I think of a similar thing when I see people complaining about how companies don't want to pay good wages. When you go shopping do you buy the $10 product or the $5 essentially equivalent alternative? Most people will buy the $5 one. If you do that, you're putting downward pressure on wages.
It's in your (purely economic) best interest for your wages to be high but everyone else's to be low. That's because when you're a worker you are a seller of labor, while when you're a customer you are an (indirect) buyer of labor.
Everything in economics is like this. Everything is a paradox. Everything is a feedback loop. Every transaction has two parties, and in some cases you are both parties depending on what "hat" you are wearing at the moment.
Equity returns ultimately come from risk premiums. (Which are small now in US equities BTW).
I’m invested in a microcap private equity fund that has returned >20-25% for years. They have high returns because they buy firms at 3-4x cashflow. You will get the high returns even with no growth. And with no increase in valuation. The returns are a function of an illiquidity premium.
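The arithmetic behind that claim is simple: the purchase multiple alone pins down the cash yield, growth or not. A minimal sketch of the numbers from the comment above:

```python
# Cash-on-cash yield implied by buying a firm at a multiple of its annual cash flow.
def cash_yield(purchase_multiple: float) -> float:
    """Annual cash flow as a fraction of the purchase price."""
    return 1.0 / purchase_multiple

# Buying at 3-4x cash flow implies a 25-33% yield even with zero growth
# and zero change in valuation.
for multiple in (3.0, 3.5, 4.0):
    print(f"{multiple}x cash flow -> {cash_yield(multiple):.1%} annual yield")
```

That range brackets the >20-25% returns mentioned above without assuming any multiple expansion; the excess over public-market yields is the illiquidity premium.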
With Apple explicitly, growth is expected given the valuation level. If it doesn’t grow, the share price will decline. So yes, in their case, firm is certainly under pressure to grow.
I also don’t agree with your “best interest for your wages to be high and everyone else’s lower”. That is one aspect; it is more complicated. Consider the Baumol effect for starters.
Things like retirement, 401ks, etc., are society-wide institutions subject to macroeconomic rules.
Expecting users to change their daily habits in order to marginally improve the operating system of a trillion dollar company feels naive and a bit disrespectful to people who actually use these machines for work.
Even developers… the vast majority of developers ignored Apple for decades (and Apple was also hostile) and it managed to grow despite that.
Might as well ask people to contribute to Gnome or whatever so in the future everyone can go somewhere better. Feels way more feasible.
A sentiment which famously led Steve Jobs to respond that he doesn't understand this, because "people pay us to make that decision for them" and "If people like our products they will buy them; if they don't, they won't" [0]
So according to Steve Jobs himself, the only Apple-acknowledged way to disagree with Apple is to NOT buy their products, and by extension into the services world of today, it means STOP USING their products.
Now Steve Jobs doesn't officially run this company anymore, but I don't see any indication that this philosophy has changed in any way.
Most people are not going to migrate to Android, Windows, Linux or whatever else just to make macOS marginally better.
And it's fine: marginal quality improvements of a product are not the "responsibility" of consumers.
Nobody is saying “gosh, macOS is so damned unstable, but I’ve gotta use it, because… blue bubbles on my iPhone?”
You’ve just read some story about a company you already hate and are parroting it.
It's not just "blue bubbles," but "blue bubbles" seems like a good shorthand to me. It's also things like Handoff, or Universal Control, or getting Messages on both iPhone and Mac seamlessly, or being on the same WiFi network allowing your iPhone/Watch to work as a TV remote for the Apple TV even if you're just visiting a friend. These are features that any platform can and does enable, but due to Apple's vertical integration they work seamlessly out of the box, across all the product lines, while securing network access in the ways most users will want, creating a continuous buy-in loop: the more Apple products you buy, the more incentive there is to buy exclusively Apple.
And it's a collective "you." If your entire family uses exclusively Apple products, then you'll be the only person who can't easily use eg the Apple TV in the living room, or the person "messing up" the group chats with "User reacted with Emoji Heart to [3 paragraph text message]," or the one trying to decide between competing network KVM software platforms so that you can use your tablet when your 12-yo can just set their tablet next to their laptop and get a second screen without any setup. Nevermind that these are all social engineering techniques that only exist BECAUSE Apple chose not to play nice with others, they still socially reinforce a deeper commitment to Apple products with each additional Apple product in the ecosystem.
I say this as someone "stuck in the blue bubble" with eyes open about what's going on. I'll keep picking Apple as long as they're a hardware-oriented company, because their incentives are best aligned with mine for the consumer features they are delivering (for now): consumer integration that sells hardware. It's insidious in its own way, but not like "hardware that sells eyeballs" (Google/Meta) or "business integration that sells compliance" (Microsoft).
That their own products depend on it, because they develop their products on Macs. And that the professional people they pretend to cater to depend on Macs, and are steadily moving away.
Nahh, robustness comes from the time you can spend refining the product, not from some magic property of a language. That can help, but just a bit. There was no Swift in Snow Leopard. Nor is there much Rust in Linux (often none), and even less (none) in one of the most stable OSes available, FreeBSD.
They should just release a new version when the product is ready and not when the marketing says to release it.
> They should just release a new version when the product is ready and not when the marketing says to release it.
Bingo, the yearly WWDC-turned-marketing-extravaganza cycle has kind of ruined it for Apple, I think.
> defaults favor desktop and server performance
Desktops are in S3 half the day consuming ~0 power. During use, electricity costs are so much lower than hardware costs that approximately nobody cares about or even measures the former. Servers have background tasks running at idle priority all day so the power consumption is effectively constant. Laptop and phone are the only platforms where the concept of "Linux power management" makes any sense.
"Idle" x86-64 SOHO servers still eat ~30W with carefully selected parts and when correctly tuned, >60W if you just put together random junk. "Cloud" works because of economies of scale. If there's a future where people own their stuff, minimising power draw is a key step.
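To put numbers on those idle figures: at a constant draw, the yearly energy is just watts times hours. A quick sketch (the €0.30/kWh rate is an assumption; substitute your local tariff):

```python
# Annual energy and cost for an always-on box at a constant power draw.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts: float) -> float:
    """kWh consumed per year at a constant draw of `watts`."""
    return watts * HOURS_PER_YEAR / 1000.0

def annual_cost(watts: float, price_per_kwh: float = 0.30) -> float:
    """Yearly electricity cost; the default price is an assumed €0.30/kWh."""
    return annual_kwh(watts) * price_per_kwh

for w in (30, 60):
    print(f"{w} W idle -> {annual_kwh(w):.0f} kWh/yr, ~€{annual_cost(w):.0f}/yr")
```

So a carefully tuned 30 W box versus a 60 W "random junk" build is roughly a factor-of-two difference in running cost, year after year.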
If I already need a powerful machine for a desktop, why would I need a second one just so it can stay up 24/7 to run Miniflux or Syncthing? Less is more.
https://www.bee-link.com/products/beelink-ser9-ai-9-hx-370
I have the ser-8 model, and can confirm everything works under Linux. This one has an 80 TOPS AI thing, since you asked about llms.
I want Mac hardware but Linux software. The other makers' build quality is horrendous, especially in the 13-inch segment, which is my favorite. I'm using a pretty old laptop because there is no replacement right now.
The new Ryzen AI looks really interesting! Sadly there is no Framework shop for me to look at it, and they don't ship to Japan…
I have a P1 Gen 7 and it’s fantastic. It feels premium, and it’s thin, light, powerful, has good connectivity and 4K OLED touch screen. I’d take it over Mac hardware any day.
But it is baffling how 1920x1080 (or 1200p) are still the "standard" elsewhere. If I want an X1 carbon, the best screen you can get at 14" right now is 2880x1800 (2.8k). Spec it with 32GB of RAM and it's clocking in at $2700, for a laptop that still has a worse trackpad, worse sound, and worse screen than a 14" MBP at $2399. And the Ultra7 in the thinkpad still doesn't beat the Mac, and it'll be loud with worse battery life.
There truly is nothing else out there with the same experience as an Apple Silicon MBP or Air.
So, my only options for the foreseeable future is wait for Asahi Linux, or suck it up and deal with macOS because at this rate I don't think there will ever be a laptop with the same quality (across all components) of the mac that can run Linux. The only one that came remotely close is the Surface Laptop 7 with the Snapdragon elite, but no Linux on that.
The one snag I ran into was that when it was new, supporting the power modes properly needed a mainline kernel rather than the distro default. But in the grand scheme of things that's relatively trivial.
I have an M1 Macbook Pro from work and honestly I'm not tempted to get one for myself. I am tempted by the M3 and M4 beasts as AI machines, but as form factors go I'm just not sold.
Jokes aside, I had to wait years for Framework to finally allow shipping via a friend in Berlin. I think they ship to Sweden now; they seemed to have an unfortunate misunderstanding that they needed to produce a Swedish keyboard and translate their website before shipping here, which of course is poppycock.
Out of curiosity, what are you basing this on? From having spoken to people who manage IT fleets, and being the person regular people ask for advice for what device to get, with the occasional exception (which Apple also had plenty of, cf. the butterfly keyboard), you get what you pay for. A 1k-1.5k+ Asus/Dell/HP/Lenovo will get you decent and good build quality.
The cheapest $500 Acer won't.
And it still won't be on par with a $999 apple silicon air, or a MBP.
I've deployed latitudes, precisions, and thinkpads. They all still make tradeoffs that you don't have to deal with on the mac.
The X1 carbon is probably the "best" but, even with that - you are still getting a 1920x1200 screen unless you spend more than a MBP for the 2.8k display (which is still less than the 14" MBP, and costs more than an equivalent specced M4 pro). The trackpad is worse, the speakers are worse, battery life is worse, and they're loud under load.
They're all fine for a fleet where the end user isn't the purchaser, which is why they exist, but for an individual that doesn't want tradeoffs (outside of the tradeoff of having to use macOS), there's no other option on the market that comes remotely close to the mac. For someone that wants Apple silicon MBP level hardware but wants to run Linux, there are zero options.
The screen is the most egregious tradeoff, though; the PC world is still averse to HiDPI displays, and even on high-end models 1080p or 1200p is still the standard. I can excuse poor speakers, it is a laptop after all, and if I really had to I could deal with fan noise, but I shouldn't have to spend more than a MBP to get a decent 120Hz HiDPI screen with sufficient brightness and color accuracy.
My work machine is an M2 Pro MBP, and except for the shitty input HW (compared to the golden era of Thinkpads/Latitudes without chiclet keyboards) and macOS being quite bad compared to Linux, it completely trounces the neighbouring Dells that constantly need repairs (mostly the USB-C ports and wireless cards failing).
Got two "2k" Lenovos at 4 year intervals.
The first one worked fine but that model was known to have a weak hinge. Had to replace it three times.
The second one had a known problem that some units simply stop working with the internal display and the only solution is replacing the motherboard. My unit worked about a week for me. Seller refunded me instead of repairing because it was end of the line and they didn't have replacements.
Got a "2k" Asus ordered now, let's see how that goes :)
Compared to that, even the one emoji keyboard macbook pro that i had worked for years. The keyboard on those models is defective by design and kept degrading, and I still think Cook should take his dried frog pills more regularly, but the rest of the laptop is still working. Not to mention my other, older apple laptops that are still just fine(tm), just obsolete.
Where's a Thinkpad that can run Maya comfortably for a student? AFAIK their only capable models come with Quadros, at anything but student prices.
So I'm stuck with "gaming" models.
Besides my daughter likes the bling :) If only they could sell me something that doesn't die in a week...
1. Display output from USB-C didn't work
2. Couldn't run Zotero
3. Couldn't compile Java bioinformatics tools
4. Container architecture mismatches led to catastrophic and hard-to-diagnose bugs
There were things that worked better, too (better task management apps, and working gamepad support come to mind). Overall, even though I only needed those things once or twice a week, the blockers added up and I erased my Asahi partition in the end.
I really appreciate the strides the Asahi project has made (no really, it's tremendous!), and while I would love to say that Linux lets me be most productive, features like Rosetta 2 are simply integrated that much better into macOS, so I can't help but feel that Asahi is getting the worst of both worlds right now. I'll probably try again this summer and see what has developed.
It would be kind of funny, but also very sad, if Apple guys mistook the copying of Apple's worst behaviour (producing throwaway devices) for a sign of quality. Though I think we've been there for years now with phones, I wouldn't expect such thinking here.
My point is that this system is not integrated in the way apple fans usually define the word. I'd claim it is not integrated at all. It is a regular PC (but with soldered ram), which is exactly like framework announced it.
There should be no need to sprinkle some apple marketing bs on that to make it attractive.
It’s absurd.
As someone who actually studied human-computer interaction, who has had to work with borderline unusable Macs multiple times in my career, and who has watched relatives utterly fail at just using an iPhone (bought since "it is so much easier", now not even able to call from the car system since it is so buggy), I'd say Apple's popularity is absolutely a case where you have to look at external factors like social status. And if that translates to "the users are dummies" to you, then that's your interpretation. Plus yes, translating marketing/status concepts like a bogus "integrated" status absolutely is interesting, hence my intent to clarify whether that is really happening here (plus some criticism, admittedly).
Probably not worth it going further into this though, it will only derail.
* Battery life is a lie, especially since it drains almost as much battery closed as it does open.
...
Overall, I think I am probably going to switch back to a macbook after this, not being able to go a day without charging and your laptop always being on low battery is a bit anxiety inducing.
https://liliputing.com/argon40-is-making-a-raspberry-pi-cm5-...
I am perfectly happy to use last-gen hardware from Ebay if it runs an OS that isn't begging me to pay for subscriptions and "developer fees" annually. My dignity as a human is well worth it.
Metal isn't really on par with Vulkan and DirectX in terms of relevance for graphics programming, and the M chips aren't up to the NVIDIA ecosystem or SYCL, the two major compute stacks for any kind of relevant GPGPU workload, and thus don't really matter.
And gaming, well, even though all major engines support Metal, there is a reason DirectX porting kit is now a thing.
So why pay more for a lesser experience? And then there is the whole issue that macOS doesn't have native support for containers, like Windows does (its own ones), and WSL is better integrated and easier to use than the Virtualization framework.
Proton is the acknowledgment of Valve's failure to entice game studios, which already target Vulkan/OpenGL ES/OpenSL on the Android NDK, the Switch (which has OpenGL 4.6/Vulkan support), and the PlayStation (Orbis OS being a FreeBSD fork), to also target GNU/Linux.
I'd rather have the real deal, not translations.
There’s no such thing as “native”: all the things you’re talking about are translation layers for hardware instructions themselves, and the overhead of software-based translation is significantly less than that of hardware-accelerated virtual machines, and we as an industry love those.
The reason is that the translations are very cache-friendly and happen in userland, so the performance impact is negligible, and the scheduler on Windows is so poor compared to Linux's that it's even common for games to perform better on Linux than on Windows, which is crazy when you consider the difference in quality of the GPU drivers.
I understand that you want it to “just work”, but that tends to be the experience anyway.
You can do what you want, it’s your life, but this is not a terribly good excuse. Valve’s “failure” has essentially been rectified.
I will add, though, that it’s actually Stadia that made linux gaming the most feasible: many game engines (all of the ones I worked on) were ported to Linux to run on Stadia, and those ports changed essential elements of the engine that would have been slow or difficult to translate, so when Proton came around quite a lot of the heavy lifting was already done. I only say this because Valve gets some of the credit for a lot of work our engine programmers did to make Linux viable.
I play most of my games in a window and switch away a lot. A million years ago when I was still playing world of warcraft, the system overall was much more responsive on the same hardware with wow on wine on linux than with wow natively running on windows :)
> it’s actually Stadia that made linux gaming the most feasible
Stadia was the most predatory gaming offering aside from IAP games, sorry. Buy your games again on top of the subscription? Lose them when Google cancels the service? No thanks.
Nvidia's GeForce Now was a lot more honest. Pay for the GPU and streaming, access your owned games from Steam. I'm not using it any more so I don't know how honest they still are, but I did for like a year and it was fine(tm).
The fact that Stadia advanced wine compatibility is great, but technical reasons aren't the only reasons that make a service useful to your customers.
There is certainly such a thing as native: one platform is where the APIs were originally designed and battle-tested, and the other is a platform emulating/translating them, by reverse-engineering their behaviour with varying degrees of success.
Valve's luck is that so far Microsoft/XBox Gaming has decided to close an eye on Proton, and it will run out when Microsoft decides it has gone long enough.
Not sure, Unreal Engine is pretty popular though and Snowdrop is increasingly common for Ubisoft titles.
https://www.protondb.com/app/2842040
https://www.protondb.com/app/2840770/
https://www.protondb.com/app/365590
Star Wars Outlaws
Natively Supports: Windows only
> https://www.protondb.com/app/2840770/
Avatar: Frontiers of Pandora
Natively Supports: Windows only
> https://www.protondb.com/app/365590
Tom Clancy’s The Division
Natively Supports: Windows only
----
You were saying?
Specifically, I worked on those games so I know what they natively support and how things transpired behind the scenes.
Proton has absolutely no hope of working without the changes we made because of stadia, the code we wrote was deeply hooked into Windows and we made more generic variants of many things.
The Division 1 PS4 release was significantly shimmed underneath compared to the Win32 and Xbox releases. This became much less true over time, as porting the renderer to Linux (specifically Debian) made us genericise issues across the OSes, and when Div 2 shipped we had a lot more in common across the releases; we didn't rely on deep hooks into Microsoft APIs as much.
Strange how you ported the renderer to Debian, and yet you couldn't even find a link to a game that has native Linux support.
Was there ever a port?
> Proton has absolutely no hope of working without the changes
You keep saying this as the absolute truth, and yet at the time when Stadia launched Proton already had 5k working games under its belt.
Strange how Stadia is this monumental achievement without which Linux gaming wouldn't have happened, according to you... and yet no one ever mentions Stadia contributing any code to any of the constituent parts of what makes Proton tick, apart from the changes that engines supposedly made to work on yet another game-streaming platform.
There is a functioning version of The Division 1, Division 2, Avatar and Star Wars outlaws that run on Linux internally at Ubisoft.
Nobody will release it because it can’t be reasonably QA’d. (Stadia was also very hard to QA, but possible, as it was a stable target and development was essentially funded).
I’m not sure what your problem is; I said, as clearly as I can, that architectural changes to the engine were necessary for Proton.
I know this for an absolute fact, because Proton was a topic when I worked on those games, and it was not until Stadia (codename Yeti) was on the roadmap, and our rendering architect had lost all his hair working on it, that Proton even started to function slightly.
I’m not shilling for Stadia - there’s nothing to shill for, it is dead.
Get over yourself, if you don’t like the truth then don’t start going in on me because my reality does not match your fantasy. Sometimes corporations do things accidentally that push other things forward unintentionally.
I just want to share my thanks to Stadia because I know for a concrete fucking fact that at least some of the AAA games I worked on would not function at all on Linux without Stadia's commercial interference.
All I'm saying is that the statement "it’s actually Stadia that made Linux gaming the most feasible" is at best contentious, because in reality gaming on Linux had already been made (more) feasible by the time Stadia had only just launched.
And Stadia used the same tech without ever giving back to Proton at all (at least nothing I can quickly discover). So the absolute vast majority of work on Proton was done by Valve, which you dismissed as "when Proton came around" (it came around before Stadia) and "quite a lot of heavy lifting had gone away" (Valve did most of the heavy lifting).
That's the extent of my "problem".
> at least some of the AAA games I worked on would not function at all on Linux without Stadias commercial interference.
So, not "actually Stadia that made gaming feasible on Linux" but "because Stadia used all the same tech, and there were possible commercial incentives early on until Google completely dropped the ball, bigger studios also invested in compatibility with the tech stack"
Stadia did a lot to help by being a stable target and by being seen as commercially viable. Google also helped a lot to aid developers, not just financially.
That they didn’t contribute code to Proton doesn’t factor in at all; I just hate to see people not get their dues for their part in the proliferation of Linux gaming, because I saw it first hand.
You are labouring under the delusion that I’ve implied Proton did nothing. No: they leveraged a lot of existing technology and put in a lot of polish. They were helped by Stadia, by Wine, by DXVK and others.
They didn’t do it alone, that doesn’t minimise their contribution, it contextualises them.
Also: Stadia ports of games were native, they did not use Proton. It was architecture changes to the games themselves that made Proton work better, not Google making Proton itself function better.
That Proton was running some games is a weird revisionist take: very few AAA games ran at all, those that did were super old, and there were always some crazy weird bugs. Proton got better, but AAA games also coalesced into conforming better to Linux-y paradigms underneath, so support got better much quicker than expected. You can even see this if you track the “gold”-rated games over the years: some of the worst-supported games for Proton are from 2015-16, before Stadia but after game complexity started rocketing up with the new game engines of the day.
Hope that helps, because honestly this conversation is like talking to a brick wall.
Oh, you very much minimised their contribution: from "when Proton came" (again, Proton came before Stadia) to "Stadia made gaming feasible on Linux" (when Proton made it feasible before Stadia).
> Also: Stadia ports of games were native, they did not use proton- it was architecture changes of the games themselves that made proton work better- not Google making proton itself function better.
So, Stadia games were Linux ports. But as a result of this there are still literally no public Linux ports. None of the tech behind Stadia ever made it back into the software behind Proton. And "native Stadia ports" are somehow responsible for games that target Windows and DirectX running better via Proton.
> That proton was running some games is a weird revisionist take
Funny to hear this coming from a revisionist. I literally provided you with links you carefully ignored
--- start quote ---
A look over the ProtonDB reports for June 2019, over 5.5K games reported to work with Steam Play
https://www.gamingonlinux.com/2019/07/a-look-over-the-proton...
--- end quote ---
> You can even see this if you track the “gold” released games over years, some of the worst supported games for Proton are from 2015-16; before stadia but after game complexity started rocketing up with next game engines of the day.
Or because the actual heavy lifting that Valve did with Proton paid off, and not the nebulous "native ports" and code that never saw the light of day.
> because honestly this conversation is like talking to a brick wall.
Indeed it is.
Unreal is almost worse because their first-party tools (UGS, Horde) will not work on Linux, so you have to treat Linux as a console, and honestly the market share isn't there to justify it.
I worked closely with productions using proprietary game engines, I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.
That you don’t see it as an end user, is exactly my point.
You don't have to be a chef to judge what's coming out of the kitchen.
What is the objective impact of Stadia which at its height had a whopping 307 titles [1]? At the time of writing ProtonDB lists 6806 titles as "platinum, works perfectly out of the box" and 4839 games as "gold, works perfectly after tweaks". Steam Deck alone has almost 5x the number of games with "verified" status [2].
What games are being made for Linux thanks to Stadia, and don't just target DirectX and run through Proton? How many Stadia games were ported to Linux thanks to Stadia?
Also, to put things into perspective. Proton was launched in 2018. Stadia was launched in 2019.
In 2019 there were already over 5000 games that worked on Proton. [3]
In 2022 there already were more games with verified status for Steam Deck than there were games for Stadia, and 8 times more games verified to work by users [4]. Stadia shutdown was announced half a year after the article at [4].
Stadia had zero impact on gaming in general and on gaming on Linux in particular as judged by the results and objective reality. Even the games you showed as examples don't support Linux, only target Windows, and are only playable on Linux through Proton [5]
> I feel qualified in stating that Stadia had an outsized impact on our development process in a way that helped proton succeed.
> That you don’t see it as an end user, is exactly my point.
It's strange to claim things like "when Proton came along" when Proton was there before Stadia and already had over 5k games working in the year when Stadia only just launched.
It's strange to claim outsized impact on development process when there are no outcomes targeting anything even remotely close to Linux development, with studios targeting Windows as they have always done.
It's strange to claim Stadia had outsized impact when none of the work translated into any games outside Stadia. When Stadia did not contribute any significant work to the tech that is running Proton. In 2022 they even started work on their own emulation layer that went nowhere and AFAIK never contributed to anything [6]
It's strange to claim that "it's actually Stadia that made Linux gaming feasible" when there's literally no visible or measurable impact anywhere for any claim you make. Beyond "just trust me".
[1] According to https://www.mobygames.com/platform/stadia/ According to wikipedia, at the time of shutting down it had 280 games, https://en.wikipedia.org/wiki/List_of_Stadia_games
[2] https://www.protondb.com/dashboard
[3] https://www.gamingonlinux.com/2019/07/a-look-over-the-proton...
[4] https://www.protondb.com/news/how-many-games-work-on-linux-a...
[5] https://news.ycombinator.com/item?id=43503018
[6] https://www.gamingonlinux.com/2022/03/google-talk-about-thei...
You don’t know how the sausage is made just because you ate a hotdog.
Maybe you should consider things more carefully before making yourself look like an idiot on the internet and simultaneously raising my blood pressure.
If you're playing the likes of Fromsoft/Resident Evil/Kojima games on a PC, be it Windows or Linux, you're not playing on the platform those games were designed for.
"Technical issues" has many meanings.
Unless you play benchmarks instead of games, and care about 8k/1200 fps of course.
Proton is a lesser implementation of Windows API, sure, but Windows itself is a lesser implementation of an operating system for power users.
The initial approach of runtimes did help, but it still has its limitations.
If a studio now just needs to test their game under a runtime+Proton, the same way they would test a version of Windows, to ensure it works under Linux, it's a win/win situation. Proton becomes the abstraction over the complex and diverse ecosystem of Linux, which is both its strength and its weakness.
Another solution would have been everybody using the exact same distribution which would have been way worse in my opinion.
And who knows, maybe one day Proton/Wine would be the Windows userland reference and Windows would just be an implementation of it :D
I thought only Apple had a distortion field.
Most of HN seems to think using a web browser as a translation layer is a good idea, yet they complain when games use a translation layer.
Yes?
I miss the days of native apps with Internet protocols, and USENET discussions.
A web site makes for a crap application and the reverse.
Anyway, Linux is liberating. Fedora Desktop is great: no ads in the OS, and a software store/installer I actually like to use, curated by usefulness instead of scam apps. All the Windows Steam games I frequently use just worked; I have to log in to X11 for 1 title (MK11), but everything else runs in the default Wayland desktop. Although I'll still check protondb.com before purchasing new games to make sure there'll be no issues. Thanks to Docker, JetBrains IDEs, and most daily apps I use being cross-platform desktop web apps (e.g. VS Code, Discord, Obsidian, etc.), I was able to run everything I wanted to.
The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh that's enhanced with productivity tools like fzf, eza, bat, zoxide and starship. There are also awesome tools like lazydocker, lazygit, btop and neovim pushing the limits of what's possible in a terminal UI, and distrobox, which lets me easily run Ubuntu containers to install experimental software without impacting my Fedora Desktop.
Image editing is the one area still lacking on Linux. On Windows I used Affinity Designer/Photo and Paint.NET for quick edits. On macOS I use Affinity & Pixelmator. On Linux we have to choose between Pinta (a Paint.NET port), Krita and GIMP, which are weaker and less intuitive alternatives. But with the new major release of GIMP 3, and having just discovered photopea.com, things are starting to look up.
Xerox PARC is the future many of us want to be in, not PDP-11 clones.
Sure you can happily avoid the command-line with a Linux Desktop and GUI Apps, although as a developer I don't see how I could avoid using the terminal. Even on Windows I was using WSL a lot, it's just uncanny valley and slow compared to a real Linux terminal.
It's not a weird flex. Weird flex is this: "The command-line is also super charged in Linux starting with a GPU-accelerated Gnome terminal/ptyxis and Ghostty running Oh My Zsh" and then listing a bunch of obscure personal preference tools that follow trends du jour.
And you can just alias them, so you keep using the core utility names.
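For instance, a few guarded lines in ~/.zshrc keep the classic names while substituting the newer tools when they happen to be installed (eza, bat and zoxide are the tools named upthread; the `command -v` guards are just a defensive sketch):

```shell
# ~/.zshrc fragment: keep classic command names, swap in modern tools when present
command -v eza >/dev/null 2>&1 && alias ls='eza'
command -v bat >/dev/null 2>&1 && alias cat='bat --paging=never'
command -v zoxide >/dev/null 2>&1 && eval "$(zoxide init zsh)"  # smarter directory jumping
```

On a machine without the tools, the guards make the fragment a no-op, so the same dotfile works everywhere.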
It is like praising Ratatui for what Turbo Vision, Clipper and curses were doing in the 1990s; if I wanted that, I would have kept using Xenix and MS-DOS.
GUIs are very useful but they are not clearly better (or worse) than CLIs.
But since I'm not gaming I cannot imagine going back to Windows. On the other hand I'm quite enjoying Linux...
> So why pay more for a lesser experience
...however, with few exceptions, I haven't used a mouse in a decade... and I haven't found anything like the MBP's touchpad yet. Maybe I just need to do better research.
As if Vulkan had relevance to graphics programming.
> and WSL is better integrated and easier to use than Virtualization Framework.
you don't need WSL on MacOS because, well, MacOS is already a *nix environment.
It surely has on 80% of a mobile platform, and on a small handset from this little Japanese games company.
> Metal isn't really on pair with Vulkan and DirectX in terms of relevance for graphics programming
As if Vulkan had relevance to graphics programming.
> you don't need WSL on MacOS because, well, MacOS is already a *nix environment.
Agreed, if everything one wants out of it is the classical UNIX experience; that breaks down when having to work with containers and Kubernetes locally.
And which platform brings in more money?
> and on a small handset from this little japanese games company.
And not on PS, not on XBox, not on PC (that is, no first-party support).
Right up until you need Linux syscalls. If you're doing anything with containers it's an annoyance.
Have a look at this sample code: https://developer.apple.com/documentation/virtualization/cre...
I think the decline of software went hand-in-hand with the decline of the native indie Mac app. They still exist, but when I started with the Mac (2007), there was a very rich ecosystem of native Mac apps. Most stood head and shoulders above their Linux and Windows counterparts.
Apple has nearly destroyed that ecosystem with: race-to-the-bottom pricing incited by the App Store; general neglect of the Mac platform (especially between ~2016 and Apple Silicon); and a messy, reactionary toolkit story with Catalyst, SwiftUI, etc. The new toolkits seem to signal the end of AppKit, but most SwiftUI applications are noticeably worse.
With their messy toolkit story and general neglect, developers have started using Electron more and more. Sure, part of the popularity is cost savings, since Electron apps can be used on multiple platforms. But part of it is also that a Catalyst or SwiftUI app is not going to provide much more over an Electron app. They will also feel weirdly out of place and you become dependent on Apple working out quirks in SwiftUI. E.g. 1Password tried SwiftUI for their Mac app, but decided in the end that it was an uphill battle and switched to Electron on Mac instead.
I recently bought a ThinkPad to use besides my MacBook. Switching is much easier than 10 or 15 years ago, since 80% of the apps that I use most frequently (Slack, Obsidian, 1Password, etc.) are Electron anyway. Even fingerprint unlocking works in 1Password. I was vehemently anti-electron and still don't like it a lot, but I am happy that it makes moving to a non-Apple platform much easier.
Now iOS gets the executive attention, and it will generally get the best developers assigned to it, while the Mac has to live with the scraps.
I'm in the same boat here. Something is driving me away from my MacBook M1 (Pro? I don't even know). I have a gut feeling that it's macOS, but I can't really put a finger on it yet.
Bought a heavily used ThinkPad T480s (from 2018) and replaced almost every replaceable part of it, including the screen. Being able to replace many parts easily is a nice touch, since I had been using MacBooks exclusively since 2007. Guess that's why I somehow overdid it here. Slammed Pop!_OS 22.04 on it and I'm very pleased with the result. The first Linux desktop I've actually enjoyed since trying SuSE 5-something. Pain points are Teams (running in the browser), bad audio quality with AirPods when using the microphone, and CPU speed and heat. I guess one has to stop using Apple Silicon in laptops to realize how amazing those processors are.
Intel CPUs from that era were quite bad, and everyone has upped their game since then. I was thinking about getting a second-hand machine from ~2021-2022, but my wife convinced me to get a new one, so I got a Gen 5 T14 AMD. It has a Ryzen 7 Pro 8840U, and I rarely hear the fans, mostly only when Nix has to rebuild some large packages (running NixOS unstable-small).
1Password had a beautiful native Mac app that works to this day. Even assuming SwiftUI is actually bad, why did they have to migrate at all? What was wrong with the existing app?
I'm not disagreeing with the opinions on Apple software quality, but I think the 1Password case is more down to their taking of VC money and having to give (JS) devs some busywork to rebuild something that worked perfectly well.
While it's true for 1Password, there are other password managers. KeePass is great for local password database files if that's what you're after.
It didn't work on Windows and Linux desktops.
I could be wrong, but apparently Spotlight is the service that drives this kind of file system watching. I think macOS has a lower-level inotify-style file system event API, which should be unaffected, but Finder and these other apps apparently use Spotlight. I really wish I had a fix, because it's just crazy having to constantly "refresh" things.
You went through the effort to show some UI when something I am looking for may not be there because indexing is paused... but you didn't think to just unpause the indexing so that I can find it? I feel like I am being spit on, "Yeah, you not finding what you are looking for? I know, I'm not even trying"
Probably not.
Who the hell thought integrating internet search was a good idea? Because "aösldkfjalsdkfjalsdkfj", just like everything else, is a valid search in Spotlight now, showing me "Search for aölsdkfjöalsdfjasdlfkj in Firefox"...
So... when the hits include six identically-named files, you can't eliminate ones that you know are wrong (on a backup volume or whatever). The level of stupidity here is just mind-boggling.
I also just tried it in Spotlight and Finder, and it did nothing. Which I consider a relief, because undiscoverable bullshit is worse than the feature not existing.
Special mention to all text input fields in macOS having Emacs-style shortcuts.
https://support.apple.com/en-gb/guide/mac-help/mchlp1008/mac
I agree that discoverability could be better, but macOS has pretty consistently had hidden power user shortcuts and modifiers, to keep the basic workflow streamlined/simple for those who don't need it.
And I don't buy the "keeping things simple" excuse for secret hotkeys in other areas. Falling back on gimmicks like undisplayed hotkeys and "long presses" and "gestures" is lazy abandonment of the design task.
I hate this "saving the user from complexity" lie. It's hypocritical: The "non-power" user isn't going to go looking for these options in the first place.
Finder search is a great example. A "non-power" user isn't going to right-click on the column headings in the results and try to add "path" as a column. So how does it help that user to deny everyone else the ability to add it?
Apple mocked IBM for needing a thick user manual back in the day. To suggest that anyone (especially anyone on this site) should have to read documentation to perform a basic file search (in a GUI, no less) is apologism to the extreme.
I guess you do know the path is shown at the bottom of the window if you select the filename in the list of results?
It also doesn't allow you to sort results by location, as you could if it were a column.
Absurd.
To select, just press on the item.
To hover, press and hold for at least 2 seconds.
To get a list of options, press and hold for at least 2.5 seconds, but not more than 3.5 seconds.
To delete, press and hold for 3.6 seconds, but not longer than 3.9 seconds.
To save, press and hold for 4.1 seconds. Pressing and holding for exactly 4.0 seconds activates the archive action. Pressing and holding for 4.2 or more seconds sends the item to the blocked list.
To retrieve the list of items in the blocked list, press and hold and simultaneously press the volume up and volume down key.
To delete all items in the block list, press and hold and simultaneously press the volume up key only.
To completely reset your device, press and hold and simultaneously press the volume down key only, whilst holding the device in a completely vertical plane, and rotating clock-wise and counter-clockwise, smoothly, at precise 2.35619 radians every 30 seconds.
To trigger the emergency call feature, drop the device at an acceleration of no less than 9.6 m/s² and no more than 9.7 m/s².
/s (kind of)
The whole point is that secret hotkeys are design dereliction.
That’s a bit of a problem when discussing problems of normal users with power users, because they don’t even realise how what they’re doing is actually not what normies do.
I’m inclined to agree that hotkeys in MacOS are hard to discover, but cluttering the interface with stuff many users simply do not need cannot be the correct answer.
The thing that most frustrates me about Macs is that they've violated the never-spoken but always-expected "it just works" in so many ways. Things like re-connection handling between a MacBook and an Apple-certified Thunderbolt Display containing a USB hub should "just work", but require fiddling every time. That's just one of numerous examples I could come up with.
Apple historically was probably the best company in the world in understanding the full depth of what "User Experience" means, and it seems like they've really retreated from this position and are regressing to the mean.
I struggle to imagine the software design that works so poorly.
And it makes no sense whatsoever. If "foo" matches "foobar", so should "foob". I honestly don't know how the hell can they still f up such a simple piece of technology in 2025.
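The behavior being asked for here is plain prefix matching, which is only a few lines to get right. A toy sketch (`spotlight_match` and the item list are hypothetical, purely for illustration):

```python
def spotlight_match(query: str, items: list[str]) -> list[str]:
    """Naive, case-insensitive prefix match: every item whose name starts with the query."""
    q = query.lower()
    return [item for item in items if item.lower().startswith(q)]

items = ["foobar", "foobaz", "other"]
assert spotlight_match("foo", items) == ["foobar", "foobaz"]
# The complaint above: extending the query should never drop still-valid hits.
assert spotlight_match("foob", items) == ["foobar", "foobaz"]
```

The invariant is simple: the result set for "foob" must be a subset of the result set for "foo", so a longer query can only narrow the hits, never lose matches it just had.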
Then you do a search for .jpg, and get NOTHING. But only sometimes. Other times it'll work.
People don't want animojis, and they don't want other trite new features that only seem to exist because Apple feels it needs to demo something new every year.
What they want is something that just works without annoyances, distractions, failures, or complications.
Give them that and they'll break down the doors trying to get their hands on it, because it's so far from how most tech works today.
Yeah and they succeeded in that so now it's about selling subscriptions on top of that.
Then when it does show the results, they’re usually in some terribly unhelpful order. It took me ages to go through the CUJ of “this app isn’t sending me notifications because I turned them off, and now I want them back on”.
You can only cycle windows in one direction even if you try to do some remapping
Choosing keyboard languages hides a lot of options. Once you understand that you need to click on English (US) to see more detailed options, you get them all: UK, Canadian... Then it's unclear which keyboard layout is currently selected and how to select one from the list you made.
I can't fathom how a DE whose whole identity is built on human-machine interface guidelines, and which is supposed to be the epitome of UX, can't figure out basic stuff about discoverability and clarity.
Keyboard layouts are a pain, but there are some solid extensions that clean the flow up and may be upstreamed into GNOME at some point.
It's all opinions, but boy, compared to the mess that is macOS and iOS regarding discoverability ... I'll take GNOME any. day.
The volume and power icons at the top right are actually one button, which hides other options like screen brightness, volume and Wi-Fi. If at least they had used three vertical dots/stacked bars, as is the convention for hamburger menus...
From what I've heard, GNOME devs do not like change, and it sucks to be a GNOME extension developer. A quick Google search seems to confirm that, so it casts some doubt on them upstreaming any of these extensions, but maybe you know better. Has it ever happened to other extensions?
https://discourse.gnome.org/t/developing-gnome-shell-extensi... https://www.reddit.com/r/gnome/comments/pvvku5/why_do_extens...
Haven't really used MacOS or iOS more that five minutes so I can only trust you on that.
On the other hand, for example, it is very easy to remap CapsLock to Escape on macOS: just go to Settings --> Keyboard and you easily find the option. GNOME? No, not in Settings. Wait, I have to use an app called GNOME Tweaks? OK, it's in "Advanced keyboard options" --> opens a big list of loosely classified options. Oh well, it was in the miscellaneous category.
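For what it's worth, the same remap is also one command away on GNOME, no Tweaks app needed (`caps:escape` is the standard XKB option name; this is a sketch of the usual approach, not the only one):

```shell
# Remap CapsLock to Escape on GNOME (persists across sessions)
gsettings set org.gnome.desktop.input-sources xkb-options "['caps:escape']"

# To undo, restore the default (empty) option list:
#   gsettings reset org.gnome.desktop.input-sources xkb-options
```

Which of course proves the point about discoverability: the capability is there, but nothing in the Settings UI hints at it.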
I don't know if the CapsLock -> Escape switch is on a roadmap somewhere, but that is a little bananas. That said, my partner comfortably uses GNOME every day to browse the web and manage some files. Has she EVER wondered how to remap CapsLock? No. The people who do want to? Google can give you the answer pretty quickly. I'm not saying it's good UX, but GNOME balances a lot of use cases, and as this thread suggests, I think they've actually (with a LOT of complaining from engineers and power users) kept that balance pretty damn well, to the point where I haven't been surprised by GNOME in a long time, and it seems to slowly and progressively get better.
And yes, whoever jumps in here with their own papercut story, I know there is pain in not being the primary audience for software. But honestly, at least I'm in the same Venn diagram with my partner. The primary audience for macOS or iOS now appears to be ... I don't even know anymore. Used to be content creators, now it seems like even Apple doesn't actually know who uses their computers.
And so many tiny thumbnails wedged into the too-narrow System Settings window.
I still have some 10.6.8 install media for both server and client. Truly loved them both.
Another highlight of that job was selling a green iPod Nano to "John Locke" from LOST
The most ridiculous thing that happened to me was in the early days of the Apple Store in SoHo, when I stopped in to see if I could just buy RAM.
The music was loud, so I was speaking loudly to be heard when I asked for RAM, and they thought I was asking if I could buy a gram.
Loved Snow Leopard too, & was shocked by how bad Lion was in comparison. Glad they got back on track after that.
Wasn’t 7.5.3 the worst of the string of terrible releases between 7.5 and 7.6? In my memory 7.5.5 was much better, but I still preferred 8.1.
OMG this one drives me bonkers. If anyone out there knows how to turn off internet results, please share!
Similar for me but started in system 7.
It’s lucky for Apple that Windows has got worse faster.
I wish Apple would just fix Spotlight. They don't seem to think it's worth fixing.
Settings -> Spotlight -> Websites, UNCHECK
Also, I vaguely remember there being a way to _order_ results, not just disable them.
My C drive was super full for some reason I couldn't understand, and Explorer couldn't tell me where the data was. There was about 100GB just unaccounted for.
I don't even use the search index.
Here are some data points I collected at the time:
https://blog.rongarret.info/2009/08/snow-leopard-is-disaster...
https://blog.rongarret.info/2009/09/esata-on-snow-leopard.ht...
In retrospect Snow Leopard deserves the love it eventually got, but at the time it was not entirely clear.
> Apple's software quality (either in terms of polish or just plain QA) has steadily decreased.
Amen to that.
Feelings shared. If only GNOME would provide the column-based file navigation that I miss so much.
I'm stuck on a MBP because it's the only laptop with a great screen, speakers, and battery life. Meanwhile my keyboard keys keep getting stuck after a year of usage, and OSX is garbage. Soon as there is similar hardware I can load Linux on, I'll be insta-switching.
I would love to finally get out of Apple ecosystem, I just don't have any decent alternatives right now. Hopefully next year.
However I'm done with Apple. I think it's a decision - not "reasoning". That decision takes time and is painful. It's also a decision specifically against "the best" ecosystem available in favor of something "ok".
Not only have they repeatedly disappointed my expectations; they just suck as a company (in my opinion). It's not about being less innovative or about decreasing software quality; they have done so much for the market that I think GNOME wouldn't even exist as it is without them. It's about sealing off every inch of their software and hardware they can. No repair without paying. Making RAM and SSD upgrades ridiculously expensive; you cannot even put standard NVMe drives into a Mac mini, everything is proprietary. Even their sensors have serial numbers to prevent hibernating if you swap them out without "hacking" the firmware.
Hardware-wise, I have high hopes for Framework working with AMD. Although they did not address the issues I'd suggested (speakers, LPCAMM2), they're constantly improving without breaking their promises. Hopefully that won't change as they get bigger.
OS-wise I'll stay on Linux. After a long journey going from Ubuntu to Debian to Fedora using GNOME, KDE and even NixOS with Hyprland for a short period, I gained enough knowledge required to really enjoy Linux. System76 is working on COSMIC, which could be pretty amazing, once it is released.
In case anyone would like to try my current Linux config, I'm constantly working on an "install everything" script (pretty early stage):
https://github.com/sandreas/zarch
HF ;)
You might not want one though.
You don't get nearly as much compute as you would with 6 GPUs, but it also uses less power than a single GPU.
This issue with Spotlight is so bad. I use the switcher to pull up my Downloads or Documents directories, and half the time it can’t even find them!
> if PC hardware can ever match Apple Silicon
What is wrong with an AMD Ryzen 9 with 16 physical cores? If you need more and you have a virtually unlimited budget, then Ryzen Threadripper is even better. Also: is Asahi Linux an option for you?

FWIW you can massively improve things by just disabling the internet results. It's easily done in System Preferences.
IIRC some competitors are starting to offer a few laptops with ARM processors, I think Samsung has a few. How do you feel about those?
At least there's quicksilver
Most websites have an element that won't load on the first try, or a button that sometimes needs to be clicked twice because the first click did nothing.
The Amazon shopping app needs two clicks every now and then, because the first one didn't do what it was supposed to do. It has been like this for at least 3 years.
Spotify randomly stops syncing play status with its TV app. Been true for at least a year.
HBO app has subtitles for one of my shows out of sync and it has been for more than a year.
Games, including AAA titles, need a few months of post-release fixing before they stabilize and stop having things jerk themselves into the sky or something.
My robot vacuum app just hangs up forever once in a while and needs to be killed to work again, takes 10+ seconds after start to begin responding to taps, and it has been like that for over 2 years of owning the device.
Safari has had a bug where, when you open a new tab and type "search term" too quickly, it opens the URL http://search%20term instead of doing a Google search. 8 years ago I opened a bug for that, which was closed as a duplicate, and I just recently experienced it again.
It really seems that the bar for "ready for production" is way lower now. At my first job 13+ years ago, if any QA noticed any of the above, the next version wouldn't go out until it was fixed. Today, if a "Refresh" button or restarting the app fixes it: approved, green light, release it.
There were a lot of smart people, very interested in fixing things— not only because engineers tend to like fixing things, but also because we, and everyone around us, were users too.
For example, many things related to text input were broken on the site. Korean was apparently quite unusable. I wanted to fix it. A Korean manager in a core web team wanted to fix it. But we couldn't because the incentive structures dictated we should focus on other things.
It was only after a couple years, and developing a metric that linked text-input work with top-level (read, revenue-linked) metrics, that we were able to work on fixing these issues.
I find a lot of value in the effort to make incentives objective, but at a company that was already worth half a trillion dollars at the time, I just always felt there could be more room for caring about users and the product beyond the effects on the bottom-line.
The only solution I know of is to have a business that’s small enough and controlled by internal forces (e.g. a founder who cares) to pay attention to craftsmanship.
Our use of Microsoft 365 is a pretty good example of that. I moved our company to Microsoft 365 because it had some features we wanted. Then I moved the company off Microsoft 365 because it turned out to be too buggy to be useful.
I realise that the actual users of software are not necessarily the same people making the purchasing decisions. But if productivity suffers and support costs rise then the consequences of choosing low quality software eventually filters through to purchasing decisions.
The problem is that managers, those who determine priorities, don't get the numbers; they don't see a measurable impact of buggy software. There are only two signals for that: one is error reporting, which depends on an error actually being generated (that is, a software bug), and the other is user reports, but only a small fraction of users will actually bother to make them.
I think this is a benefit of open source software, as developers are more likely to provide feedback. But even then you have some software packages that are so complex and convoluted that bugs emerge as combinations of many different factors (I'm thinking of VS Code with its plugins as an example) that the bug report itself is a huge effort.
I don't believe that. IT departments have to support users. Users complain and request support. It costs money and it affects productivity and everybody knows it.
But that's not enough. You would also have to believe that there are significantly less buggy alternatives and that the difference justifies the cost of switching. For big companies that is an incredibly high bar.
But small companies do dump software providers like my company dumped Microsoft.
[Edit] Ah, I think I misunderstood. You're looking at it from the software provider's perspective rather than the user organisation's. Got it.
The problem is that very little competition exists for computer operating systems. Apple, Google, and Microsoft collectively control nearly all of the consumer OS market share on both desktop and mobile. Thus, macOS just needs to be "better than Windows", and iOS just needs to be "better than Android".
> Then I moved the company off Microsoft 365 because it turned out to be too buggy to be useful.
What did you move to?
In general, Microsoft 365 is extremely successful, despite any bugs. There doesn't appear to be any imminent danger of financial failure.
Software vendors also face tradeoffs, engineering hours spent on fixing bugs vs. writing new features. From a bean counter's perspective, they can often live with the bugs.
That's because of some very hard monopolistic anti-consumer behavior from Microsoft in their ecosystem.
I'm not implying that, and I don't think my manager was implying that either. I think rather there were 2 things going on:
1. It's often hard to connect bug-fixing to metrics.
A specific feature change can easily be linked with an increase in sales, or an increase in usage. It's much harder to measure the impact of a bugfix. How can you measure how many people are _not_ churning thanks to a change you pushed? How can you claim an increase in sales is due to a bugfix?
In your case, I'm sure some team at Microsoft has a dashboard that was updated the minute you used one of these features you bought Microsoft 365 for. How could you build something similar for a bugfix?
Bugfixes don't tend to make the line go up quickly. If they make the line go up at all, it's often a slow recovery of regained users that's hard to attribute to the bugfixes alone. Usually you're trying to measure not an increase but a "not decrease", which is tricky at best, if it's possible at all. The impact is intuitively clear to anyone who uses the software, but hard to show on a graph.
2. A ruthless prioritization of the most clearly impactful work.
I wouldn't have minded working on something less clearly measurable which I nonetheless thought was important. But my manager does care, because their performance is an aggregate of all those measurable things the team has worked on. And their manager cares, and so on and so forth.
So at the end of the day, in broad strokes, unless the very top (which tends to be much more disconnected from triage and edge-cases) "doesn't mind" spending time on less measurable things like bugfixing, said bugfixing will be incentivized against.
I think we all know this impacts the bottom-line. Everyone knows people prefer to use software that is not buggy. But a combination of "knowing is not enough, you have to show it" and "don't work on what you know, you have to prioritize work on what is shown", makes for active disincentivizing of bug-fixing work.
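One common way teams do try to quantify that "not decrease" is a holdback comparison: ship the fix to most users, keep a small control group on the buggy build, and compare the change in retention between the two cohorts. A minimal sketch (the numbers are illustrative, not real data):

```python
def diff_in_diff(treated_before, treated_after, control_before, control_after):
    # Effect of the fix = (change in the fixed cohort) minus
    # (change in the holdback cohort that still has the bug).
    return (treated_after - treated_before) - (control_after - control_before)

# Weekly retention: both lines went down, but the fixed cohort's went
# down less. The fix "prevented losing" ~4 points of retention.
effect = diff_in_diff(0.80, 0.79, 0.80, 0.75)   # -> ~0.04
```

The point is that the line for the fixed cohort never goes up; the value only shows relative to a counterfactual, which is exactly why this work is so hard to sell on a dashboard.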
Such QA jobs no longer exist. Ever since the software dev world moved to developers doing their own QA during development, software quality has been consistently worse. Maybe there's a correlation there!
Companies abuse Agile so they don't have to plan or think about stuff anymore. In the past decade, I haven't worked in (or seen) a single team that had more than 2 weeks of work prepared and designed. This leads to something built 4 weeks ago needing a massive refactor, because we only just realized we would be building something conflicting.
That refactor never happens though, because it takes too much time, so we just find a way to slap the new feature on top of the old one. That then leads to a spaghetti mess and every small change introduces a ton of (un)expected issues.
Sometimes I wish we could just think about stuff for a couple of months with a team of designers before actually starting a multi-year project.
Of course, this way of working is great when you don't know what you'll be building, in an innovative start-up that might pivot 8 times before finding product-market fit. But that's not what many of us in big corp and gov are doing, yet we're using the same process.
OS work is somewhere in between, but definitely more towards the latter category.
Not even architecture is being discussed properly, under the guise of being agile; it'll supposedly come by itself.
Absolute insanity.
Since then, Google and Apple products have become just as bad as Microsoft's. I think this is because the industry has moved towards an oligopoly where no one is really challenging the big players anymore, just like Microsoft in the late 1990s. The big companies compete with each other, but in oblique ways that go after revenue not users.
Preloading selected results in background tabs and then closing the main tab, so that I can iterate through the results of each clicked item per tab is simply so much more efficient than entering a page, hitting back, entering the next, hitting back, ...
Like the items in Twitter's Explore page.
Which you notice because your page scrolls up wildly as you move to click on what should be the new tab
I actually got penalized in my last performance review because something I shipped “wasn’t that technically complicated”. I was flabbergasted because I consider it my job to make things simpler, not harder to reason about. But you don’t get promotions for simple.
It was suddenly completely broken and stopped working a few years ago. I tried every setting to try to get it working but couldn't.
I feel like a stone age caveman having to manually type everything into my Google calendar.
There are a lot of people raising the same issue in Google forums, but it's not fixed yet.
Ironically, they are adding new Gemini AI features to Gmail, which can't do this either.
As much as I dislike systemd, if this is the reason, then I retract everything negative I ever said.
Like, at least we had a central place to vent about the exact same stuff you just listed, and who knows, in the best case, at least some companies might feel shamed into picking up issues with the most upvotes or see it as a chance to engage with their userbase more directly.
Or I‘m naïve and the most likely outcome is getting sued?
What do you think?
While WebKit might have had some much-needed improvements in the past few years, it is still behind Blink and Gecko. Safari, the browser itself, has been awful for the past 10 years, at least on desktop. And some of these are not WebKit issues, because other WebKit browsers do better.
The address bar is by far the worst compared to Chrome's (the Omnibox) and Firefox's (I believe it used to be called the Awesome Bar). I have experienced the same bug you mentioned, and I believe I filed it even earlier.
Opening Bookmarks with too many items has continued to pause and jank for 11 years now.
Tab Overview continues to re-render all the tabs, causing paging and kernel_task CPU spikes. My kernel_task writes are currently at 80TB after 240 days of uptime. That is 333GB of writes per day, which is simply killing the SSD.
And no Tab Sleeping.
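The write-rate arithmetic in that comment checks out; as a quick sanity check (assuming decimal units, 1 TB = 1000 GB):

```python
total_written_tb = 80    # lifetime writes attributed to kernel_task
uptime_days = 240

# average daily write volume in GB
gb_per_day = total_written_tb * 1000 / uptime_days
print(round(gb_per_day))   # 333
```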
Apple just doesn't give a fuck any more about their software.
Basically the whole thing with Sync is very fickle.
On another note, Safari somehow doesn't work well when you have over 128 Tabs.
I think this is just the result of an optimizing game placing profit above all else (including quality and user satisfaction) which is indeed the norm in this late stage of capitalism. You want to opt out of that? Good thing the GPL opened the way placing human freedoms front and center, and not-for-profit software stacks like KDE (for instance) keep getting better and better over time.
I use commercial OSes at work by obligation, and the turning point from which my experience as a user became better served by free software happened many years ago.
Many of my appliances (dishwasher, coffee maker, ...) work just fine for weeks before an annoyance pops up ("deep clean", for example). Many of my applications do not. For most I could measure the mean time between annoyances in minutes. Definitely with Spotlight.
macOS, on the other hand, is getting worse. I can definitely concur that Spotlight is getting more and more useless. Time Machine as well; it mostly doesn't work for me, always breaking, hanging…
So when you already start feeling like the operating system is preventing you from doing the things you need to do, all the small cosmetic flaws feel even more in your face.
When Sequoia eliminated the ability to override Gatekeeper by control-clicking, it became clear to me that Apple is now employing a frog boiling strategy towards their ultimate goal -- more control of the software you can run on their hardware.
Trying to get the program to work with our Mac users has become harder and harder. These are all internal developers.
Enabling developer mode and allowing Terminal execution isn't enough. Disabling the quarantine bit works - sometimes - but now we're getting automated nastygrams from corporate IT threatening to kick the laptops off the network. I'm exhausted. The emergency workaround, which I tell nobody about, is way less secure than if they just let us run our own software on our own computer.
I once really urgently needed `nmap` to do some production debugging ASAP. Unfortunately, the security tools would flag this immediately on my machine, as I knew this from previous experiments. Solution - compile my own binary from sources, then quickly rename it. I assume that this "workaround" was totally fine for sec department. At least production got fixed and money kept flowing.
You were denied the tools to get your job done. You've put yourself at risk by applying an unapproved workaround.
Never ever do this (unless you hold substantial shares). Let the company's bottom line take the fall. If that's the only thing they care about, that's your only way to make the problem visible.
But the underlying SRE culture here is that, if you know what you are doing and have a functioning brain of a responsible person, you'd be forgiven a jump over the fence, if it means putting out a fire on the other side of it. We aren't kids.
I don’t get this at all.
I’d much prefer a team of highly empowered and highly responsible engineers than impotent engineers who need hand holding in case they make a mistake.
Engineers _should_ have leeway in how they resolve issues. As I read, though, you have a company policy which explicitly disallows the action you needed to take to fix the problem (if I misread, my apologies). Getting the stakeholders involved is the responsible thing to do when policies need to be broken.
Ideally, the way this kind of situation gets handled should be documented as part of a break-glass policy, so there’s no ambiguity. If that’s not the case, though, the business should get to decide, alongside the policy maker (e.g.: security), whether that policy should be broken as part of an emergency fix, and how to remediate the policy drift after the crisis.
If you’re all tight enough that you’re allowed to make these kinds of decisions in the heat of the moment, that’s great, but it should be agreed upon, and documented, beforehand.
By the way, I'm still burnt out. This work is stressful. Don't let it take away what's already scarce for you.
For binary patching: codesign --force --deep -s - <file> (no Developer ID required; "ad-hoc signing" just updates a few hashes here and there). Note that you should otherwise not use codesign, as it is the linker's job to do it.
Last time we did this I had to spend a week explaining to management that Macs could actually run software other than PowerPoint and it was necessary for our job.
The local workaround that we use is to just spin up a Linux VM and program devices from there. The less legal workaround is using WebUSB and I'm afraid to even tell the necessary people how I did it, because it's sitting out on a public-facing server.
https://developer.apple.com/documentation/security/code-sign...
https://developer.apple.com/documentation/security/notarizin...
There are dedicated sections of the developer web forums:
https://developer.apple.com/forums/topics/code-signing-topic
https://developer.apple.com/forums/topics/code-signing-topic...
...and there's an apple developer support person, Quinn, who appears to be heavily if not solely dedicated to helping developers do binary signing/notarization/stapling correctly.
They have written a slew of Tech Notes about signing and notarization. Main TN is at https://developer.apple.com/documentation/technotes/tn3125-i...
Quinn also has their email address in their sig so people can just reach out via email without even needing an Apple account, or if they prefer more confidentiality.
I mean, come on.
Maybe it's because I'm using the Electron framework, which makes things more complicated, but I don't really understand why there is a difference between the different types of certificates (Developer ID, Apple Distribution, macOS distribution), and I had to guess which one to use every time I set it up.
Also, why is notarization a completely separate process from code signing, and why does it require a completely different set of credentials? Seems odd to me.
Because they do completely different things. Signing is proof that you were the one to write and package that software; notarisation is an online security check for malware. If I recall correctly, you still sign but do not notarise when distributing to the Mac App Store.
Or, maybe simpler: why can't Apple just do code signing and notarization with one single CLI call, with one set of credentials?
Google Play does this under the hood; I don't even think about it. iOS is similar, the Transporter app does everything in one go.
Edit: I haven't tested it yet, but it does seem that you can sign an executable with your own certificate (self-signed or internal CA-issued) however you can't notarize it. Right now, notarization is only required for certain kinds of Apple-issued developer certificates, but that may change in the future.
btw, for those who don’t want to search, Quinn’s signature states:
Quinn "The Eskimo!" @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
Developers pay exorbitant amounts of money for much less value, and the idea of putting your teammates at risk to stick it to Apple is kind of sad, bordering on negligence from a business POV.
It's about setting a higher floor for malicious actors than "random botnet residential IP + a captcha solving service". It's about proving some semblance of identity through a card number and a transaction that goes through without a chargeback.
As the case upthread shows, there's plenty to dislike about a system that inhibits running code built for personal use. And it's obviously neither foolproof nor without collateral damage. Reasonable people can debate if it's worth it. But it still ought to be acknowledged that the motivations are closer to the reason you have to identify yourself and pay a nominal fee to drive a vehicle on public roads.
In particular, the security boundaries are nonsensical. The whole model of "notarization" is that the developer of some software has convinced Apple that the software as a whole (not a specific running instance) is worthy of doing a specific thing to the system as a whole.
But this is almost useless. Should Facebook be allowed to do various things that can violate privacy and steal data? What if the app has a valid reason to sometimes do those things?
Or, more egregiously, consider something like VSCode. I run it, and the fancy Apple sandbox helpfully asks me if I want to grant access to "Documents." The answer is really "no! -- I want to grant access to the specific folders that I want this workspace to access", but MacOS isn't even close to being able to understand that. So instead, one needs to grant permission, at which point, the user is completely pwned, as VSCode is wildly insecure.
So no, I really don't believe that MacOS's security model makes its users meaningfully more secure. At best, the code signing scheme has some value for attribution after an attack occurs, but most attacks seem to involve stolen credentials, and I bet a bunch just hijack validly-notarized-but-insecure software a la the VSCode example.
Notarization does some minimal checks, but is mostly about attaching a real identity so that maliciousness has at least some real-world consequences. The most obvious being that you lose the ability to get more apps notarized.
Since this isn't true, no acknowledgement is required; it doesn't need to be a "major" profit center to magically become a benevolent feature.
For your personal needs, you do not need to pay anything for building and using apps locally.
The cost is far far higher than the price.
I develop and distribute a few free apps for macOS, and building / notarising is never a problem.
Yes, you need to put keys on the build server for the "Developer ID Application" (which is what you need to distribute apps outside of AppStore) signature to work.
You do not need to give any special access to anything else beyond that.
Anyway, it is indeed more difficult than cross-building for Darwin from Linux and calling it a day.
In most cases, just involving account management makes the corporate case 10x more of a PITA. Doing things in a corporate environment is a different game altogether.
It is ugly: https://hearsum.ca/posts/history-of-code-signing-at-mozilla/
I do this professionally, I maintain macOS CI workers for my employer. Apple doesn't make it easy.
Last time I tried setting up an Apple developer license inside a large corporation, one that they paid for and not tied to me or my credit card, it was also a nightmare.
And yes, it's also on principle.
Permission that can be revoked for any reason, including being compelled by someone with more power than Apple.
Once signed, binary will work forever, you only need active subscription when you need to re-sign / re-notarise.
Unfortunately, I still have to deal with macOS for work due to corporate policies.
I spent a day or so hacking around with kanata[0], a low-level keyboard remapping tool that lets you define keyboard mapping layers in a similar way you might with QMK firmware. When I press the 'super/win/cmd' key, it activates a layer which maps certain sequences to their control equivalents, so I can create tabs, close windows, copy and paste (and many more) like my macOS muscle memory wants to do. Other super-key sequences (like Super-L for lock desktop or Super-Tab for window cycling) are unchanged. Furthermore, when I hit the control or meta/alt/option key, it activates a layer where Emacs editing keys are emulated using the GNOME equivalents. For example, C-a and C-e are mapped to Home/End, etc.
The only problem is, this is not the behavior I want in terminals or in GNU/Emacs itself. So I installed a Gnome shell extension[1] that exports information about the active window state to a DBUS endpoint. That let me write a small python daemon (managed by a systemd user service) which wakes up whenever the active window changes. Based on this info, I send a message to the TCP server that kanata (also managed by a systemd user service) provides for remote control to switch to the appropriate layer.
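The window-triggered layer switch described above might look roughly like the sketch below. The window-class names, layer names, and port are assumptions, and the exact JSON message shape for kanata's TCP remote-control server should be checked against your kanata version:

```python
import json
import socket

# Apps that should keep their native keybindings (assumed WM_CLASS values)
PASSTHROUGH_CLASSES = {"kitty", "Emacs", "org.gnome.Terminal"}

def layer_for(wm_class):
    # Terminals and Emacs get the passthrough layer; everything else
    # gets the macOS-style remapping layer.
    return "passthrough" if wm_class in PASSTHROUGH_CLASSES else "macos"

def switch_layer(wm_class, host="127.0.0.1", port=5829):
    # kanata exposes a TCP server for remote control; the "ChangeLayer"
    # message shape here is an assumption based on its docs.
    msg = json.dumps({"ChangeLayer": {"new": layer_for(wm_class)}})
    with socket.create_connection((host, port)) as conn:
        conn.sendall(msg.encode() + b"\n")
```

The daemon's job then reduces to calling `switch_layer()` with the active window's class whenever the DBus endpoint reports a focus change.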
After doing this, and tweaking my Gnome setup for another day or so, I am just as comfortable on my Linux machine as I was on my Mac. My main applications are Emacs, Firefox, Mattermost, Slack, ChatGPT, Discord, Kitty, and Steam. My Linux box was previously my Windows gaming box (don't get me started about frog boiling on Windows) and I'm amazed that I can play all my favorite titles (Manor Lords, Hell Let Loose, Foundation, Arma Reforger) on Linux with Proton.
I know it's mostly muscle memory, but macOS shortcuts just seem sane and consistent and that has been one of the biggest frustrations when trying to switch. I found toshy[0] which does something similar - did you try that? The goal is purely macOS key remappings in Linux, so a much smaller scope than kanata.
[0]: https://toshy.app
I have a Kinesis 360 keyboard, and my config[0] probably won't work for other keyboards, but it can give you a starting point for your own config.
[0]: https://gitlab.com/spudlyo/dotfiles/-/blob/master/kanata/.co...
But here's my (unpopular) take as a GNOME user and using Fedora immutable distros + flatpaks -- I suspect Linux is going to go in a broadly similar direction. Maybe not soon (even flatpaks aren't universally acclaimed), but sometime.
I don't even mind that they've introduced a level on the totem pole that's above root. But on my computer, -I- should be the one at that level, not Apple.
the issue seems to be that you still believe this?
However, less corporate distros that mostly just ship built upstream software as-is since they don't have to support it for long periods (think Arch, Fedora, Void, etc) don't have that problem, so I expect we'll continue seeing them use traditional packages.
Ubuntu does the exact same thing with their snap repository, the Firefox apt package from Ubuntu is fake. At least Flatpak is a community-led project unlike snap.
You can limit the file system permissions of the app, like giving only access to downloads, so that if/when there’s a sandbox leak you’re fine. You can also disable various things, like webcam or mic, this way.
In addition, you can get perpetual updates to the latest version of your browser even on old, stable distros like Debian.
Linux is pretty diverse, there are still distributions out there that haven't adopted systemd.
It’s like criticism of the quality of Google search dropping. It has absolutely tanked, but it’s not because the algorithm is worse, it’s because the internet has grown orders of magnitude and most of it uses the same hyper aggressive SEO optimisation, such that the signal to noise ratio is far worse than ever before.
Kagi lets me completely block specific domains. If Google cared about quality they’d let you do the same.
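That kind of per-user domain blocking is conceptually simple; a rough sketch of the filtering step (example domains and naive host parsing, purely illustrative):

```python
BLOCKED = {"example-seo-farm.com", "listicle.example"}   # example domains

def host_of(url):
    # naive host extraction, good enough for a sketch
    return url.split("//", 1)[-1].split("/", 1)[0].lower()

def filter_results(urls, blocked=BLOCKED):
    # drop any result whose host is a blocked domain or a subdomain of one
    def is_blocked(host):
        return any(host == d or host.endswith("." + d) for d in blocked)
    return [u for u in urls if not is_blocked(host_of(u))]
```

The hard part for a search engine isn't this filter; it's being willing to let users suppress domains that the ranking (or the ad business) would otherwise surface.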
"Those who refuse to give up essential Liberty to purchase temporary Safety deserve to have to deal with the GNOME desktop user experience."
I miss macOS sometimes.
We know this because the emails came out in discovery for one of the antitrust suits.
Suddenly the user's file hierarchy started wherever the Home folder was located, and it became an island of user-controlled environment surrounded by the complexity of a computer operating system.
The result I found overall well thought out, but when the desktop became just a folder I felt the Mac moved away from its simplicity, embracing the complexity that was offered by Windows.
It's amazing the rose tinted glasses people have about the original Macintosh environment. It was insanely janky and (unless you were ruthlessly conservative) insanely unstable by today's standards. By version 10.5 (Leopard) the modern UNIX-based MacOS was unequivocally superior to Classic MacOS in every metric other than nostalgia.
I also believe that the simple system could have been just as performant at security. The real advantage of the Unix layer is the compatibility that the Macintosh was missing.
Are you trying to say that it’s possible for a system to be both simple and secure? Absolutely that’s the case, but with a trade-off — either it needs to restrict the user’s freedom, or be fully disconnected from the outside world.
The threats in the world are real and the internet doesn't help. I 100% agree that a network connection needs to be kept at a distance to make things simpler.
I think the power of language used to describe a system is where simplicity begins.
What I'm working on is creating a crisp line of delineation between "local" and "public" networks.
If, by default, everything is on the "local" network, auto-discovery is secure. If things are explicitly needed, a user can publish them to the outside world through physical manipulation.
The outside world can now be described using classic Users and Groups, which is culturally easy to understand.
I'm trying to create an environment that focuses on making those 2 things plus a third element simple to understand and physically manipulatable.
The freedom I'm looking for is available on the "local" network. The "public" network is where our data is interchanged with the outside world based on our publishing. I don't expect people to interact with this layer much. I expect people to configure it for whatever institution/organization/government.
Most of the complexity I see in computing these days is market-driven demand for eyeballs/clicks/...
Actively depleting the goodwill they accumulated over the years definitely makes it worse. It's that much harder to give the benefit of the doubt to a company that's also showing the middle finger to their devs.
Giving priority to AdSense sites, fucking around with content lengths (famously penalising short-stay sites), killing advanced search options. That's just from thinking about it for 10 seconds, but to me most of it is entirely of Google's making.
I can't believe I even have to say this out loud. Look up enshittification.
sudo spctl --master-disable
Hard disagree. Audio mixing is not difficult. The Linux kernel guys were right: it does not belong in the kernel. The userspace story, however, has been a complete shitshow for decades. I think PipeWire mostly fixed that? Not sure; sometimes I still have to log out and back in to fix audio.
The funniest part? It's been working in the BSDs all along. I recommend reading the source of sndiod[1].
[1]: <https://cvsweb.openbsd.org/src/usr.bin/sndiod/>
What's even worse? Probably systemd. I try not to hold a strong opinion - I tolerate it, the way I tolerate traffic when taking a walk. The technical issue however is several orders of magnitude simpler - again, the BSDs come to mind, but you can also write a complete PID1 program in about 20 lines of straightforward C[2]. I don't mind the commands being unfamiliar (they're already all different in almost every OS family); it's that the entire package is dreadfully large in scope, opaque, and I find it more difficult to interact with than anything else in this landscape.
[2]: <https://ewontfix.com/14/>
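The gist of the linked "20 lines" argument is that PID 1's irreducible job is just spawning the boot process and then reaping orphaned zombies. A toy Python translation of that idea (a demo that exits when no children remain; a real init would exec the boot script and loop forever):

```python
import os

def spawn_and_reap(n_children=3):
    # Spawn a few children (stand-ins for the boot script), then reap
    # until none remain. PID 1 inherits every orphaned process on the
    # system, which is why this reaping loop is its one essential duty.
    for _ in range(n_children):
        if os.fork() == 0:
            os._exit(0)          # child exits immediately
    reaped = 0
    while True:
        try:
            os.wait()
            reaped += 1
        except ChildProcessError:
            return reaped        # a real init never returns
```

Everything else systemd does (service management, logging, device events, and so on) is policy layered on top of that tiny mandatory core, which is exactly the scope complaint being made.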
However, it's worth noting that audio experts doing high grade mixing in production are using these systems quite effectively and have been for a long time. It's similar to Blender in that regard with it always having the "guts" of doing great things, but only the experts that knew the correct spells to cast were able to use it effectively before the UI/UX was improved with 2.x and later I believe.
Are most people better off with Apple defaults?
And it’s not because the problem is “difficult”. It’s because for 20 years it has been claimed that this will be the “year of Linux on the Desktop” and it’s never been good enough for most people.
The problem with Linux is that, while it’s very good, it’s different.
Nobody actually cares how intuitive something is, at least not in absolute terms. People will still say Windows is intuitive. Pretty much nothing in Windows, from the registry to COM to IIS to Settings/Control Panel/Computer Management, is intuitive. But they know how to use it and are used to that particular brand of buggy inconsistency.
Linux desktops have been high quality for a long time now. The reality is you, and others, measure quality as “how much is it like windows” or “how much of it is like macOS”. When that’s your metric, Linux will always come up short, just by definition.
Can I get a high performance Linux laptop with good battery life, fast graphics, that runs cool and silent?
And yes, everything works. On bleeding edge 2 month old hardware.
I even use thunderbolt 4 to connect my external displays and peripherals. Not only does it work, but it’s pleasant. KDE has a settings panel for thunderbolt. I can even change my monitor brightness in KDE settings. No OSD required!
But wait, there’s more! I’m running 2 1440p monitors at 240hz and the system never even hiccups.
But wait, there’s more more! The battery settings are really advanced so I can change the power profile, maximum charge, everything.
The only thing I’m unsure about in your comment is “low latency audio”. It seems low latency to me, but I’m not an audio engineer.
What's high performance for you?
I can certainly get a Framework (Fedora and Ubuntu officially supported), throw my preferred Bluefin-Framework image on it, and get working.
Battery life around 7 hours is the average I see reported. Fast/silent will depend on the model, but I don't see the issue really. Upgradability and ease of battery replacement are a plus.
I just picked framework because they were first to come to mind, but I think Dell has a nice Linux story, Tuxedo also comes to mind
These are the typical reviews I see around the Framework
https://community.frame.work/t/fw-16-review-the-good-the-bad...
Poor battery life, heavy, runs hot, poor build quality, bad speakers, and decent but not great graphics.
This should not be an issue. I have hardware that varies a lot and I literally buy random wifi dongles for $1, $4, $5, Amazon, AliExpress, etc. and they have all just worked on first plugin. I can easily take my phone and tether it to my PC using USB-C and it appears in my Gnome network list and just starts using it for Internet.
> how well will it handle low latency audio
Pretty well; you can use OBS to verify this. There are plenty of settings if you want to tune it.
> My graphics hardware?
Just ignore Nvidia and move on. Sure they might figure it out one day, I gave up a decade ago and I use Intel integrated or AMD dedicated for GPUs. Nvidia does "work" for most purposes but it will cause you a headache eventually and those are not worth $400 to me.
> How well will it handle power management?
I enjoy the basic controls that Gnome provides, which give me a simple way to say "go all out" or "save some battery" etc. There are finer-grained controls available, and I have used commands in the past to push crappy hardware to its limits before I chucked it (old Intel iGPUs).
> Can I get a high performance Linux laptop with good battery life, fast graphics, that runs cool and silent?
You can get ones that are specifically marketed for this purpose. Tuxedo is one that specializes in this, and obviously System76 does too. These have a higher price point than a regular Dell system, which IMO is the better option in some ways: Dell sells more systems, has more users, and it will "just work". They have sold Linux systems for years and still do, I believe.
Regarding "running silent" this is a gripe I have, not that it runs loud but some laptops have custom RGB crap and sometimes in Linux I don't have access to the extra functionality to customize my lighting or manually set my fans to ramp up etc. There are projects that aim to do this, but I have not looked into them beyond the most basic `fancontrol` built in command.
I think once you expand the scope to "most people" it might become impossible to say what the correct answer for that large of a group is. In the past their value add might have been more compelling and their feature lock not as draconian. It appears some people think that has changed over time.
The second part of your post is incoherent to me, I can't tell what you're trying to say.
To run a non-notarized app requires you to open a separate app, navigate to the security section, and select that you want to authorize the app to run.
Apple does not have any desire to make distribution of non-notarized binaries commercially viable.
And we've seen this change across all browsers. There is no longer a "continue" prompt for TLS issues. The result is that far fewer maintained sites go months with an expired certificate.
The default option in current macOS's Gatekeeper dialog is "Move to Bin" instead of OK. The other option is Done, which cancels the opening action. If you want to bypass that, you need to go to System Settings > Privacy & Security and manually allow the particular app there.
Who knows what later updates will bring.
Hard agree with this. I sometimes have to boot up a windows laptop to play Minecraft with the kiddo, and it never stops reminding me how little I know about Windows now, how counter-intuitive everything is, how everything feels designed for a user whose mind I cannot comprehend.
It blows my mind that when right-clicking on a file in file explorer, the 'delete' option is hidden in a sub-menu under 'more options'.
Probably helps that I installed the IoT LTSC version, but still, apart from the task bar being stupidly in the middle (thankfully there's an option to move it to the left), I've had zero issues.
I even added a network printer and it found it quickly, and added it quickly and successfully, which is a feat I don't think I've seen happen on any OS ever.
The context menu is a clear improvement on the old one (which you can still get to with one click).
I agree that having "more options" to begin with was a jarring experience coming from windows 10 though.
Out of old habit I always use shift + DEL key and did not notice it's in the top row now.
It also makes way more sense.
How many "control panels"? How many places are there to adjust audio device properties?
And for reasons I don't understand, why is the window itself not resizable?
It would be fine if the settings available were actually useful, or could at least point me to some tool that does it better. I get no meaningful report of what's eating my battery and why my MacBook is dead every time I open it. And if I want to change the actual resolution of my display, I'm given just a list of scaling options pretending to be resolutions. Oh, you want to set a specific resolution or refresh rate? You have to do some stupid finger kung fu of Option-Control-something _before_ you click on this dialog. I get the criticism about the Windows settings app and legacy power tools (I think this has largely been solved anyway), but at least they exist and allow me some iota of control over my computer.
You can access most settings by Windows + "yourquery".
It is indicative of a failure, not a solution in and of itself.
FWIW, search as a UI isn't a bad thing, Cmd + Space is the main way I launch apps on macOS (or Win + "type whatever").
To be fair, it's hard to say whether the Settings app is more broken in Windows or macOS these days. I think I'd have to give the crown to macOS here on account of search itself being more broken.
For example:
I prefer keeping my hands on the keyboard, and typing cmd+space followed by mouse is so much faster than finding the right pixels to click through in menu trees when I want to adjust my mouse sensitivity.
The UI fail is if search is required to find the setting every time you need it, because categorization and/or navigation is broken otherwise.
As to keeping your hands on the keyboard, that's an argument for having proper keyboard support in any UI, complete with hotkeys and shortcuts. The big difference between these and search is that the former is (if properly done), consistent and predictable. So e.g. when the app adds new things in the future, your existing key sequences will still do the same thing they did before.
To take your specific examples, if I do Cmd+Space, "mouse", Enter on my system, it will bring up LinearMouse, not system mouse settings.
On my iphone, I have one page of apps, everything else in the app drawer, and use the search all the time. It often gets what I want in one or two chars.
It also has the benefit of being roughly bilingual (English + the installed language) and being there even on machines not set up for you.
I can get on my mom's computer, fully set up in Spanish, and Win + "query" my way into settings, programs, and tools to set up whatever she needs.
I admit I honestly have no idea where the system settings are located as I haven't pressed the start button in ages, but the same applies to MacOS as I would use spotlight there as well.
The search doesn't even work all the time. Sometimes it won't do fuzzy search, sometimes typing "bluetooth settings" will do a Bing search, some other time it will open a PDF, and so on.
I figured that I make a six-figure salary as a software developer, I can afford $2/month so that I don't have to fucking become a sysadmin for a game server my child depends on.
There are two editions, Java and Bedrock. Java is the original, available on PC and Mac, and supports programming-like technical play and mods. Bedrock is Microsoft’s reimplementation, available on all devices except Mac, and supports emotes and microtransactions. Other than that they’re largely the same game, and buying either gives you both versions. Realms supports both, but a given server is one or the other, not both. There are also other managed hosting providers for Minecraft (both versions), but Realms is probably easier and cheaper for you. The Java version has performance problems, mostly because Microsoft’s code is inefficient, but there are a few mods (also written in Java) that everybody uses to fix performance without affecting gameplay.
(... speaking as another dad just trying to play with my kid.)
It's the rule lest someone think you made a bad decision and you're regretting it. Even though it's an OS targeted for your grandmother, you must not let them see weakness.
At this point it's a joke. Either critique Apple or admit you can't without also bringing up some other OS. It's weird.
Another thing I dislike is that it stores the whole message history on the device. It's nice to have at times, but I send a lot of photos, which adds up in storage over time. I pay for iCloud, and store my messages there. Why does my Mac need to hold every single photo I have ever sent?
Thankfully, that is also somehow the future of UI frameworks on all of their platforms!
Oh but you forgot about the “catch up” button they added 2 releases ago that takes you to the last unread message! …
… but only if said last message is within the N most recent messages, in the messages which are already “fetched” from local storage. If it’s more unread messages than that, the button is nowhere to be found.
Like they said “ok we can implement a catch up button but it’ll be hard to solve due to how we do paging.” “Ok we just won’t put the button on screen if we have to page then. Save the hard problem for the next release.” Then they just forgot about it.
I recently had to do a full reinstall of macOS on my Mac Studio due to some intermittent networking issue that, for the life of me, I could not pin down. Post-reinstall, everything's fine.
I've explained in another thread how this kind of thing happens. It may be the same at other large companies.
Bugs come in (via Radar) and are routed to the team responsible. Ever since Jobs came back (and Apple became valuable again) it has also become very much top-down with the engineers, for better or worse, not calling the shots.
Just an obvious example — there are of course no engineers in the decision to make a "Snow Leopard" release or not. That is a "marketing" decision (well, probably Federighi). But further, even for an engineering team, they're probably not going to be able to make that decision even for their own component(s) either. Again, marketing.
So meetings are held, and as it gets close to time to think about the NMOS (next major OS), the team is told what features they will implement. Do you think "fix bugs" is a feature? How about "pay down technical debt"? Nope, never.
Fixing bugs is just expected, like breathing I guess. And technical debt ... do what you can given your workload and deliverables. Trust me, many engineers (perhaps especially the older ones) want to both fix bugs and refactor code to get rid of technical debt. But there are simply not the cycles to do so.
And then, what is even more insidious: the day the OS ships, every single bug in Radar still assigned to a team, still in Analyze, becomes a much, much harder sell for the next OS. Because, you know, you already shipped with it ... must not be that bad.
I'd love to see a bug-fix-only Mac OS release. But I suspect that every time the possibility has come up, something like, I don't know, LLMs burst on the scene and there's a scramble.
It's unclear how much explanatory value this has, because the Snow Leopard that everyone is pining for was during the Jobs era. After all, an Apple that goes bankrupt and out of business isn't going to make any software updates.
I find a stark difference between the Jobs era and the Cook era. Under Jobs, the early Mac OS X updates (Puma and Jaguar) came fast and furious, but then the schedule slowed considerably. Panther was 14 months, Tiger 18, Leopard 30 (delayed due to iPhone), Snow Leopard 22 months, Lion 23. Mountain Lion was the first release after the death of Jobs and came only 12 months after Lion. Thereafter, every Mac OS update came yearly, give or take a few months. That's a drastic change in release schedule.
Aqua, the new UI, came down from above soon enough. Drawers, toolbars were new UI elements that arrived. In time Jobs' designers were going through the shipping apps with these new UI elements with changes for the engineers to implement.
Certainly by the time the iPhone had arrived the transition to marketing (and design) calling the shots was complete.
"It just works"
I think it is more that the decision to SAY Snow Leopard was a bug fix-only release was a marketing one. The reality is that release also sported things like 64-bit Intel ports of all apps, added Grand Central Dispatch (e.g. an entirely new code concurrency system) and included a from-scratch Finder rewrite.
I always saw these releases (I bundle Mountain Lion in) as being all about trying to rein in excessively long release cycles. Short release cycles tend not to have enough time to introduce new bugs, while extended release cycles create a sense of urgency to get code in under the wire.
Now, release cycles have moved to be staged across a fairly predictable annual calendar. If there's an issue where features are getting pushed out 6 months or a year earlier than they should, that is a management and incentives problem.
This hits right in the feels of any engineer at any company.
Short of it being a requirement to use the latest version of Xcode (once they bump the minimum the following February), and security updates stopping, there's been very little reason to actually upgrade.
Oh, thank you so much. In 2013 I was already questioning some of the features it kept adding that were useless. Yosemite with Continuity was the only useful feature of the past 10 years.
Yes, the relentless focus on big flashy features for the next yearly release cycle is exactly what it felt like. And that is the big reason why I dislike Craig Federighi.
Edit: Thinking about it more, being a former Apple employee who worked during 2005-2010 probably carries a lot more prestige than post-2015.
- Ever since I updated to the latest iOS 18, my watch complications (weather doodad) stop working randomly because they just lose the location services permission. Then, in Settings, the location services permission list acts like the weather app isn't installed.
- The new Mail app now automatically classifies your email, but still gives you the "All Mail" option. But the unread count badge on the app only works off of what they classify as your "Priority" mail. There's a setting to change that, so that it shows you the unread count of ALL mail, not just priority mail, but when you change that setting nothing changes. This is my biggest problem with new iOS.
- The keyboard sometimes doesn't get out of the way anymore when it should.
These are just off the top of my head. It used to be such a nice, polished experience. Their competition was just outclassed. Now, when my phone dies I'm going to have a good look at all the other options.
Depends on where you were seeing this of course, but this could very well be an app problem instead of a system problem.
Native UIKit/SwiftUI do a little bit of keyboard management for “free”, but there are many circumstances where it falls on the developer’s shoulders to do this. For cross platform frameworks, some do keyboard management others don’t even try. For web apps it’s a coin toss and depends on which of the gazillion ways the dev built their app.
It’s not actually that hard; usually it’s just a matter of making sure that your scrolling content either resizes to match the keyboard-shrunken viewport, or adds bottom padding equivalent to the height of the keyboard and then adjusts the scroll position accordingly. But it’s not unusual to see this partially or fully absent, especially on poorly built cheapest-bidder-contracted apps.
Of course SwiftUI gives you almost none of this control, forcing you to hope the magic automatic support works how you expect.
But then neither help you with any of the other interactions, like any background dimming you may want, or tapping away from the keyboard to dismiss. That has to be done manually.
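For what it’s worth, the padding approach described above can be sketched in UIKit; this is a minimal illustration under my own assumptions (a hypothetical `scrollView` property standing in for whatever scrolling view the screen uses, and screen-to-view coordinate conversion skipped for brevity), not a drop-in implementation:

```swift
import UIKit

final class FormViewController: UIViewController {
    let scrollView = UIScrollView()  // placeholder for your actual scrolling content

    override func viewDidLoad() {
        super.viewDidLoad()
        let center = NotificationCenter.default
        // When the keyboard appears, pad the bottom of the scroll view by the
        // keyboard's on-screen height so content can scroll clear of it.
        center.addObserver(forName: UIResponder.keyboardWillShowNotification,
                           object: nil, queue: .main) { [weak self] note in
            guard let self,
                  let frame = (note.userInfo?[UIResponder.keyboardFrameEndUserInfoKey]
                               as? NSValue)?.cgRectValue else { return }
            let overlap = frame.height  // simplification; real code converts coordinates
            self.scrollView.contentInset.bottom = overlap
            self.scrollView.verticalScrollIndicatorInsets.bottom = overlap
        }
        // Remove the padding again when the keyboard hides.
        center.addObserver(forName: UIResponder.keyboardWillHideNotification,
                           object: nil, queue: .main) { [weak self] _ in
            self?.scrollView.contentInset.bottom = 0
            self?.scrollView.verticalScrollIndicatorInsets.bottom = 0
        }
    }
}
```

On iOS 15+ the `view.keyboardLayoutGuide` constraint API can handle much of this, but the notification dance above is still the common cross-version pattern, which is part of why so many apps get it wrong.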
I think they should just throw in the towel and duplicate settings. Meaning, we can turn off Siri learning from an app or from the Siri page. Or we can turn off banners from the app or the notifications page.
There is no silver bullet, just a lot of lead ones and the answer to Apple's quality problem is to begin baking QA back into the process in a meaningful way after letting it atrophy for the last decade or so.
Hire more humans and rely less on automation. Trust your developers, QA, and user support folks and the feedback they push up the chain of command. Fix bugs as they arise instead of assigning them to "future" or whatever. Don't release features until they're sufficiently stable.
This is all basic stuff for a software company, stuff that Apple seems to have forgotten under the leadership of that glorified accountant, Cook.
As a former Apple employee of 13 years: Apple knows about the bugs. QA isn’t the problem.
A lot of people complain that their radar for some obvious bug isn’t getting noticed, and conclude that Apple must not be QA’ing, or not dogfooding their own product. This isn’t the case at all. I guarantee the bugs you care about are well known, and QA has already spotted them.
The reality is, they just don’t care. The train leaves the station in September. You’re either on it or you’re not. If you spent the year rewriting some subsystem, and it’s July and you have this huge list of bugs, there’s a go/no-go decision, and the answer is nearly always “go” (because no-go would mean reverting a ton of other stuff too, and that carries its own regression risk, etc.)
So instead there’s just an amount of bugginess that’s deemed acceptable. And so the software is released, everybody slaps high-fives, and the remaining bugs are punted to next year, where they will sit forever, because once we do one release with a known bug, it couldn’t be that important, right? After all, we shipped with it! Future/P2, never to be seen again.
An attempt was made to remedy this by pushing deadlines earlier in the cycle, to make room for more QA time, but that just introduced more perverse incentives: people started landing big features in later dot-releases where there’s less scrutiny, and even more tolerance for bugs.
The honest answer is that Apple needs to start giving a damn about the quality of what they’re pushing. As Steve once said at a pretty famous internal meeting, “you should be mad at your teammates for letting each other down like this”. And heads need to roll. I can only hope that they’re realizing this now, but I don’t feel like the culture under Tim works this way. People’s feelings are way too important, and necessary changes don't get made.
By all means people should complain on forums (why not?), but a forum post complaining about some years-old bug isn't going to be anywhere near as effective as contacting apple's support or filing a bug report.
I'm not a developer, I'm just a regular user - so if I can get all this special treatment, so can you.
(It feels similar to how those same podcasters absolutely blast Apple Intelligence, while non-tech users I've heard from seem to love it.)
In practice it's a challenge because the OS bundles a lot of separate things into releases, namely Safari changes are tied to OS changes which are tied to Apple Pay features which are tied to so on and so on.
It would require a lot of feature flagging and extra complexity, which may defeat the purpose.
Another way is to start un-bundling releases and fundamentally re-thinking how the dependency graph is structured.
But, in a sense, this still incorporates your idea, because the devs and QA must be given the mandate of finding these bugs, and also towards making the automated tests cover the bug's related test cases (as well as charged with improving the test code itself, which is often in a mediocre state in most code bases I've seen at least).
Apple changed how they tied OS updates to hardware sales in this era, and this left a lot of Macs on Snow Leopard for half a decade. So people remember that last point update – which was as close to a long-term-stability release as Apple has ever had.
But to get there, Snow Leopard received 15 updates over 2 years and it was really just point updates to Leopard so it was more like 29 updates over 4 years without a major user facing feature. And this was after Leopard itself took over 2 years to develop.
If Apple did nothing but polish features and reduce bugs for 6 years, people would proclaim them dead. And they might actually be dead since their entire sales model is tied to cycles of development, promotion and delivery. For those of us who remember Apple getting stuck on System 7 between 1990 and 1997 and how the company nearly collapsed in that era: it would be a delay almost on that scale.
Snow Leopard was notably cheaper than Leopard ($30 vs $130), Lion was $30 on the App Store, Mountain Lion was $20, then Mavericks and everything after have been free.
Snow Leopard did have a long life though, it was the last OS that could run PowerPC apps, also the last to run on the original 32-bit Core Duo Intel Macs.
Snow Leopard also introduced the Mac App Store (in a point release), which was a user facing feature.
I think the "zero new features" mostly meant "no flashy user facing features". It had a lot of new features for developers.
I bought an M1 Max that is now almost 4 years old and it still feels new to me. I can't really imagine a change that would happen in the next 2 years that would make this thing feel slow where an M3 would feel sufficient, so I'm curious to see if Apple really does just go hardcore on forced obsolescence going forward. I have a few M series devices now, from M1 to M3, and I honestly cannot tell the difference other than export times for video.
I can imagine some kind of architecture change that might come with an M6 or something that would force an upgrade path, but I can't see any reason other than just forcing upgrades to drop support between M1-M5. Maybe if there is a really hard push next year into 8K video? Never even tried to edit 8K, so I don't know. I'm guessing an M1 might feel sluggish?
In fairness, Apple does tend to continue releasing critical security patches for older versions.
I suspect that it will be AI features that push Apple into deprecating older hardware. But I also hope that the M series hardware will be supported a bit longer than the intel hardware was. Time will tell.
Using an x86 laptop in 2025 is like using a flip phone.
How are the keycaps doing? Mine looked awful after about 2 years of relatively light use, developing really obvious ugly shiny patches (particularly bad on the space bar), quite a letdown on an otherwise great machine.
(Realised that you can actually buy replacements and swap them yourself, via the self-service repair store, so have replaced them once, but am starting to notice shiny patches again on the new set)
Yes, they can theoretically perform better, but only when plugged into mains power, and creating so much heat and fan noise that the experience really isn't good.
Don't think there's anything out there that will outperform the GPU of an M-series Mac without consuming way more power and producing problematic levels of heat+noise.
I actually see progress in things that matter for me as a software dev, like virtualisation and Docker support. And with frameworks like MLX I can even run image generation tools like FLUX locally on my Mac (search for mflux). Amazing! And Apple Silicon is a screamer... I still cannot believe I have the fastest single-core PC on Earth in my laptop.
The only thing I use is the Calendar, to see my personal and work Google calendars aggregated in the same place.
So far I'm happy with macOS. If the whole graphics industry (Adobe etc.) supported Linux more, I would even switch to Linux, but because I'm dealing with photography, color correction, and a little video too, I will never switch (the graphics system quality in macOS is way too good). Windows is unfortunately a no-go too, because of the built-in spyware and ads in the OS (like, WTF).
I also consider Apple Intelligence a sort of spyware. I don't want to activate it ever (but it gets auto-activated after updates), and I don't want it to download its stuff and waste space. If people want to use it, fine, but if I personally opt out, I opt out fully, Apple!
When it works. Last time I typed “keyboard” in the system settings app, the keyboard settings weren’t part of the results. Ditto “mouse” or “trackpad”. Settings search has been utterly broken on around half of the dot releases for me. If it works, it’s only temporary and then it’s back to not working on the next update (or even reboot.)
Working for two companies, I see how in the small one people manually test their changes, try to break them, and even have in-code tests. At the big corpo, no one cares. Tests are green? Release to prod, close the ticket, and take another. Clients complain? There are 5-6 layers of people before such a complaint can come back to the team.
I wouldn't agree with "less glitchy" than Windows. Currently Win10 is the best one when it comes to stability, but Microsoft is already killing support for it. Windows 11 has problems even with typing into Start Menu search, basic functionality: it randomly takes input or not. So I think we are lowering the bar, and the market agrees on how low it should go.
Also, why does it take 10 seconds for activity monitor to show information? The list goes on.
If only Mac hardware officially supported Linux, I would never touch that macOS again.
That's not a bug, but a feature. Under View -> Update Frequency, you can change it.
I'm sick of the random Safari crashes.
I miss those. Unfortunately, since Apple doesn't do the whole space theme anymore, you'd probably get some really boring drone shots of California at best before a Setup Assistant faded into view from behind a Redwood or something.
I had to hear those goddamn songs so many times, often all at the same time.
I'm weird though, and never stopped liking it.
Doo-do-doo-doo-do...
For newer Apple apps, sometimes the keyboard shortcuts simply don't exist. I believe part of the problem here is the deprecation of AppleScript, which means there's no incentive to spend time on consistency, and the other part has to do with organizational indifference towards all the wonderful UX innovations from the past.
What Apple has successfully accomplished, in collaboration with other 'big tech' companies is drastically reducing user expectations from their software. I wouldn't completely blame the AppStore's forced race to the bottom for this alone. There is still a huge market for tasteful apps that cost more (even sometimes with obnoxious subscriptions), but if even Apple isn't leading by example, why waste time on it if you could just build another simple note-taking app.
- Using the iPhone to scan documents from Finder has recently stopped working on the second scan. I need to restart my phone to get it to work again.
- iPhone mirroring is terrible: laggy, UI glitches, drops click events, scrolling is a nightmare. This is when it actually even manages to connect.
- Often, with Airpods on, lowering the volume, shutting down the iPhone display and putting it in my pocket quickly enough will entirely turn off volume. If you happen to increase the volume instead, you'll get blasted with maximum volume in your ears.
- Use vertical tabs on Safari for one day. You'll see it actually crash a few times. Not to mention the UI glitches.
- Open the App Store on macOS. It first opens empty, then the UI controls show up, then it flickers the entire UI. I am convinced it's a Web app.
- In System Settings, most of the sections you click have a delay in rendering. Nothing feels snappy in that app. I can actually click 3 sections quick enough for the second to never even be rendered.
- Sometimes dragging an application from the Dock popup menu into the Trash does nothing, even though it appears to have worked. I often find that it wasn't deleted at all, that I have to open Applications folder in Finder and hit Cmd-Backspace to delete it.
- On iOS, the alarms app breaks down once you get to ~250 alarms. You can try to add/delete alarms and it’ll appear like they changed, but the change won’t be saved. I can’t use the alarms app now and can’t fix it, as I can’t delete alarms. By the way, it would be nice to reuse an existing alarm when creating one at the same time, so you don’t end up with 250+ alarms in the first place.
- On iOS, the notes app breaks down in long documents (~10 pages of text with bullet points). When writing beyond that, some text will sometimes disappear only to reappear when you type some more. Other times, the cursor disappears. This only happens in long documents. All English text, mainly bullet points, often with some text pasted in.
It’s shocking to me that my iPhone 11 Pro can play gorgeous 3D video games, but can’t handle 250 alarms or 10 pages of text..
https://arstechnica.com/gadgets/2009/10/apple-owns-up-to-odd...
Every MacOS update brings along this bloatware that is not easily removed.
This is also the reason that I don't mind the current version of Mac OS. Yes everything you mentioned is a bit meh. Which is part of why I don't use any of those applications. So I don't care. I've disabled Siri. Never used Facetime. Maps, Numbers, and all the other of the dozens of things they bundle: I never touch any of it. I don't need that stuff and when I do, I use alternatives. I have an Android phone so all of the IOS integration stuff is redundant to me as well. They've not locked me into their ecosystem. And I like it like that. I don't allow myself to be locked in.
As a workhorse for doing development, MacOS is still a fine OS. It does the job. Most updates of the last 10 years or so have been minor window dressing that you barely notice, some under-the-hood changes, and misc tweaks that mostly fall into the "whatever" category for me. For me the annoying thing is just having to sit through these lengthy updates. I keep postponing them because it's never convenient to take an hours-long break when it prompts me.
And I don't really get much out of these updates. To be honest, I can barely tell apart the different versions of their OS. The main notable visual change seems to be the desktop background. Which is usually hidden by applications. So I rarely look at it.
From an OS / software perspective:
Have a "core" macOS that has none of the apps / integrations are baked in at an OS level.
You install the things you want, how you want - eg iMessage, Mail, and then iCloud if you want to sync it, and Photos etc.
Have a slim, fast, stable OS that I can just turn on and get going with.
From the hardware perspective, I made this comment a little while ago but what I want to be able to choose is:
- Device: Watch, iPhone, iPad, MacBook, iMac, Mac
- Size: Mini, "Normal / Default" (Air), Max
- Processing Power: "Normal / Default", Pro, Ultra
- And maybe storage.
That way I can go and buy a MacBook Pro (13"?), or a MacBook Max Pro (15"), or a MacBook Mini (11"), or a normal iPad Mini Ultra, or an iMac Mini (21"?), or a Watch Pro, or a Mac Max Ultra etc.
Device + Size + Power.
It's kinda there, but not quite.
There's no need for a separate core version - just give control back to the user. But honestly, I don't know what would need to happen for us to get that - it feels like a lost cause against corporations. There's of course the Apple-EU situation, where you can remove applications, set defaults, and install additional app stores, but this is still limited to that market and happened way too late and too slowly.
For example, can you remove Chess from MacOS? Nope! Why? From what I found on Reddit, it seems it's an integral part of MacOS somehow, and I am a bad person for even asking, somehow.
Those are not bloat, those are core features of a computer for 99% of users who are not developers.
There already exists a platform which is unusable for normal people and great for developers, it's called Linux.
There already exists a platform which is great for corporate and hell for normal people, it's called Windows.
So why aren't we allowed to keep the only computing platform which is good for normal users?
Each part alone might not be large but together it starts to become an annoyance.
Also bloat is not just about disk space but also cognitive load and clean interface.
I'm hoping they're gathering usage analytics and will overhaul unused features.
Caveat, I'm probably not their average user, I do almost everything via Spotlight. I don't even use the bottom menu thing, it automatically hides and I only use it when I accidentally hid a window.
I wish that in the next version of macOS, they would strip away all those useless features and systems that they've shoehorned over the past two decades and have the OS look like how Panther or Tiger did, while taking up less than 10 GB of space on the puny SSDs that they ship their machines with.
Indeed, I remember three times when Apple went a bit overboard on the feature front, but dialed it back and made some of the most stable and useful OS versions:
OS 8.5/8.6 pushed a bunch of features and were the last big pushes pre-OSX, but then OS 9 fixed a TON of bugs, and added a few smaller quality of life improvements that made running 'Classic' Mac OS pretty good, for those who were stuck on it for the transitional years.
Mac OS X 10.0 rewrote _everything_ and was _dog_ slow, with all the new Quartz graphics stuff arriving in an era when GPU-accelerated display widgets weren't yet prevalent. 10.1 patched in a bunch of missing features (like DVD Player—it was still a pretty useful tool back then) and fixed a couple of the most glaring problems... but 10.4 Tiger was the first OS X release fast enough that OS X was a joy to use in the same way OS 9 was at the time. At least on newer Macs.
And then of course Snow Leopard, which is the subject of the OP.
macOS 13/14/15 have progressively added more little bugs I track in my https://github.com/geerlingguy/mac-dev-playbook project; anything from little networking bugs to weird preferences that can't be automated, or don't even work at all when you try toggling them.
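The kind of preference automation that tends to break between macOS releases is often as simple as a `defaults write` plus a restart of the owning process. A generic sketch (not taken from the playbook; the Dock `autohide` key is just a well-known, long-stable example - many other keys are not):

```shell
# Sketch: automating a macOS preference (macOS-only).
# The Dock's autohide key has been stable for years; plenty of other
# preference keys silently change or stop working between releases.
defaults write com.apple.dock autohide -bool true
killall Dock   # the Dock must restart to pick up the change
```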
That's besides the absolute _disaster_ that is modern System Preferences. Until the 'great iOSification' a few years back, Apple's System Preferences and its preference panes were actually a pleasure to use, and I could usually remember where to go visually, with a nice search assistant.
Now... it's hit or miss if I can even find a setting :(
That said, I expect things to get worse as they manage to converge their multiple platforms in exactly the wrong way (by dumbing them down across the board even as people keep hoping they'll make iPad OS more useful, etc.).
But at least we still have Safari, Apple Silicon is pretty amazing and I can survive inside Terminal and vim. For now.
There was no acceleration (even 2D!) until 10.2
Had a few issues with iCloud syncing and data loss as well, and what with being based in the UK and the general problems with geopolitics and the cloud, I figured I'd try to get as much stuff out of iCloud as possible. Well, there's not much advantage now. Most of it is in the ecosystem tie-in, not the hardware. And on top of that, provisioned services such as Apple Music are just a pain for me on a daily basis. My entire music catalogue disappeared in a puff of smoke when I was offline for nearly a week. The one thing I wanted it for!
So back to the PC. I ran out of disk space on the (soldered-in SSD) Mac. I can't delete anything, and macOS has suddenly leaked out about 20 GB. I don't know what this is, other than that about 5 GB of it is Apple Intelligence, despite telling it to fuck off. So it's late Friday afternoon and I need to get something done so I can have a clear weekend. I dig in the junk cupboard and find a couple of hard disks but no way of connecting them to the USB-C-only Mac. Amazon solutions aren't available for delivery until Sunday. Thereupon I discovered the kids' "covid work PC" from when they were home studying. Despite the acceptable 16 GB of RAM, it only had a meagre 256 GB disk in it. No worries. Opened it up and there's a slot for an SSD. It now has a +500 GB SSD. Brilliant. On goes Windows 11 LTSC. I'm back up running R in under an hour and have transferred all the data over.
I never went back. It feels better here. This thing is a swiss army knife. An extension of me. Not the other way round, like on the Mac. The Mac feels like it feeds off me: both cash and energy. Apple needs to fix that.
That would be medium-term user. Long-term would be people like me that have been using it since 1984.
I have a collection of Macs going all the way back to 1984. Even the newest one hasn't been turned on in three years.
My daily driver is Windows Server 2016. But it has VMware Workstation so there are lots of virtual machines for my work, including Linux.
I am so tempted by the new M4s. Amazing piece of technology. So sad about the operating system though. Every year I say I'll wait for a quality Linux port.
time machine?
And maybe I'm in a minority, but the latest macOS is not worse than previous editions. For instance, I use Sequoia on an M1 Mac but also 10.4 Tiger and OS 9.2.2 on a PowerMac G4 (MDD, 2x 1.2 GHz with 2 GB of RAM), and stability is no worse on Sequoia than on Tiger or 9.2.2. In fact, I have encountered more crashes in 9.2.2 and Tiger than in Sequoia and all of macOS 11+ (except Big Sur, which had rough edges at the beginning on M1 devices).
TL;DR What people remember fondly is not Mac OS X 10.6.0, which was in fact very buggy, and buggier than 10.5.8, but rather later versions of Snow Leopard after almost 2 full years of bug fixes.
See also "Snow Leopard bug deletes all user data": https://www.reuters.com/article/lifestyle/snow-leopard-bug-d...
The yearly release cycle is the problem. Apple needs "another Snow Leopard" only in the sense that I mentioned above, "almost 2 full years of bug fixes", although at this point, Apple has more than 2 years of technical debt.
My recommendation for people who don't absolutely need the latest features: Upgrade to the previous version of macOS when the new version is released. Sequoia is incredibly reliable 7 (soon to be 8) updates in.
I disagree with that part. ;-)
We wouldn't even be having this discussion right now if today's updates were incredibly reliable.
This is what’s being asked for in the article.
Whether that's intentional or not (I believe it is), Apple should focus more on delivering a stable experience, on both new and old devices.
I echo the sentiment a lot of people have already expressed. That is, using Apple products is like being a junkie. You need to use their products because there is no real alternative, but you feel kind of dirty because of their practices. To me, that sounds like it should be a huge red flag for Apple execs.
---
I'd been holding out, running 10.5 on my 2019 iMac, but at the beginning of the year I had to upgrade to Sequoia (due to software dependencies).
Of course this is just correlation, not necessarily causation, but within a month the iMac's internal SSD was corrupted to the point of being unrecoverable, and my 40 GB of RAM was corrupted too.
So, yeah, at the very least I'm not sure how much testing went into Sequoia for non-Apple Silicon Macs.
Quite disappointing considering how long a normal Mac's lifetime used to be, which also justified its high initial hardware price.
It's all WhatsApp, Telegram and Signal here, nowadays. I.e. they wouldn't know about bugs in iMessage as they never open it.
I'd be curious to hear about other regions of the world. Do people there use iMessage?
Just from the top of my head: no E2EE by default. Gifies are restricted (and censored). Reactions are clumsy (there are two rows of different kinds of emojis to choose from now). Adding photos or sharing location is complicated compared to Signal or Whatsapp.
Search is ... well, I hope you don't really need to find anything. Delayed notifications on macOS for no apparent reason, and in 2025 you can still end up with multiple entries for the same contact...
While the last 10 years have brought work on security, performance, drivers, the file system, and refactoring, most of the user-facing features were useless.
And I spend 90% of my time inside Safari, and yet Desktop Safari is still shit after all these years.
I am not excited about 99% of new macOS user features. Most of them are features for features sake. Just continue the macOS engineering work, and for once pour more resources into Safari and allow Safari support on older Mac system.
[1] https://en.wikipedia.org/wiki/MacOS_version_history#Releases
For the majority of American Apple users, sure. But I myself hardly ever remember that this app exists.
The thing that drove me nuts in particular in Sonoma though, is their "improved" text fields. Where it would show the stupid little popup with the active keyboard layout icon next to the cursor. Clearly made by someone who doesn't actually need to use multiple keyboard layouts (gosh do I envy those people). But at least I could disable it with a defaults write command.
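For anyone hunting for it: the `defaults write` command that circulates online for disabling that popup is reportedly something like the following. I'm treating the key name as an unofficial, unverified assumption - it isn't documented by Apple and may change between releases:

```shell
# Purported fix for the input-source indicator popup next to the
# text cursor (macOS-only). The domain and key here are the ones
# commonly shared online, not an official Apple API -- verify on
# your own release before relying on it. Log out/in to apply.
defaults write kCFPreferencesAnyApplication TSMLanguageIndicatorEnabled -bool NO
```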
Oh and Mail, yes, it would sometimes stubbornly refuse to load new messages, or delay them by minutes. It worked fine the previous 10 years. It would've been free to just leave it alone.
Oh man, Mail is almost comically bad, to the point that I occasionally miss messages from people since they're drowned in crap. A native version of Google Inbox that is not Google-owned would be enough for me. (or whatever version/implementation that integrates nicely with my devices)
As a counterpoint, I myself use it everyday. I’m not American and most people I know don’t have iMessage. I still prefer it to using SMS from the phone. And yes, I do agree with the author that the app is buggy.
But I also never chat over SMS with actual people. It's just not done any more by anyone I know. The last time I sent an SMS was probably several years ago. It's 99.9% various confirmation codes and other notifications for me.
If the new wave ushers in some real innovation and vision, then it was worth the gamble.
That said, the hardware and the absence of Windows' user-hostile nonsense bring me endless joy. I don't think I'll go back to a PC (the Mac feels like a different class of quality) but to be honest, I expected more.
Pfft. Nothing works, and a patronizing, laggy OS that actively tries to fight me at every step because it knows better than me.
What a joy. I'm sticking with Ubuntu/Fedora and having to figure out a driver issue every once in a while.
I can't predict whether or not they will get past this, but I'll keep hanging on, anyway.
The code quality (the bits they let us see), however, seems to be going downhill, as is the quality of the documentation. These are things that always held up, in the past.
It's fairly discouraging. I suspect the quality of their hires has been going down. I'm not sure what it is they want, but it doesn't seem to be quality.
That sounds like the "shake to locate" behavior, and I would be totally lost (heh) without it, since I have two 4K monitors plus the onboard display, and that black cursor gets lost very easily. Shake the mouse, get big cursor, find cursor, be happy.
It appears that one can disable it if it bothers you enough to comment about it: https://support.apple.com/guide/mac-help/make-the-pointer-ea...
And while there, I learned that I can change the pointer color, so hopefully everyone has learned something valuable today :-D
Love how you can't find a critique of Apple without the person feeling the need to throw shade at Windows. They need to constantly reassure themselves and other fanboys it was the right decision.
And for an OS that's geared to your grandmother it sure does seem to shit the bed often.
Basic OS features have fallen way behind in terms of UX - and of vision. Managing files and searching for information have become a chore compared to most internet- or LLM-based services. Even a bug-free Finder or a faster Spotlight would not bridge that gap.
All the apps listed in the article feel similarly left behind - Mail, Messages, Photos. The only exception is System Settings, which definitely does need a Snow Leopard-style release.
This is obviously true for other platforms as well.
We are possibly lacking a leap forward. Not faster horses, electric cars.
An obvious root cause of this is the lack of newcomers to the OS market. It's an oligopoly that has no interest in making things much better.
No, and I would have been too young to purchase it.
But I'd be surprised at the idea of massive demand for an upgrade to Windows 95. What we did was buy a new computer that had Windows 95 on it. Computers used to go out of date very quickly.
We kept our older computer that ran DOS. (It had Windows 3.1 installed, but the only reason you'd start that was if you wanted to play Solitaire.) It continued to run DOS just fine.
The same sort of late night excitement existed around each early Mac OS X release, incidentally.
Most notably, that computer with Windows 95 on it also had a CD drive.
https://www.svenbit.com/2011/02/install-hackintosh-on-eeepc-...
My headphones will cut out and when I go to pause the video I’ll be clicking frantically because the remote isn’t working either. Or I’ll be in the menu and the remote will pin to the left or right and scroll to the end of some massive YouTube list.
Reboots, resets, nothing fixes it.
My Apple Watch regularly has a glitched Home Screen too.
I defended Apple’s quality recently, right before everything started breaking for me.
Mine doesn't really sleep. It's always warmish despite all my best efforts to make it actually sleep. It's always plugged in, so no biggie, but it's annoying as hell.
Reddit wisdom says it's because of my usb peripherals, but it's just a webcam, mouse, keyboard, and a yubikey.
ERR_ADDRESS_UNREACHABLE it says.
Yes, I said Yes to the new permission. Yes the check mark is on in Privacy, I mean all 20 of them that say "Google Chrome". Yes I toggled it off and on. Yes I rebooted. Still have to use a different browser to access my own local server because there is a new privacy feature that... doesn't work.
Four times kinda — maybe five if you want to count PPC32 and PPC64 separately but I usually don't since the Intel transition happened so soon afterward that there is really no PPC64 lineage to speak of.
I definitely count 32-bit and 64-bit Intel separately though due to the number of years taken to transition, all of the annoying early-Intel-Mac 32-bit EFI issues, and the need to manually opt in to the 64-bit kernel on many machines. In fact Snow Leopard was the first OS to let you do so! The “no new features” tagline was snappy but it's really not true at all :p
https://apple.stackexchange.com/questions/261749/in-which-ve... sez —
“Mac OS X Snow Leopard and above could only be installed on Macs with Intel processors, and introduced a number of fully 64-bit Cocoa applications (e.g. QuickTime X), and most applications were recreated to use the 64-bit x86-64 architecture (although iTunes was a notable exception to this!) This meant these applications could run in 32-bit mode on machines with 32-bit processors, and in 64-bit mode on machines with 64-bit processors. And yes, the kernel was updated so that it could run in 64-bit mode on some limited hardware, and only by default on Mac Pros, while other newer Macs were capable of running the kernel bit did not do so by default.”
Relevant articles:
- “Mac OS X v10.6: Macs that use the 64-bit kernel” https://web.archive.org/web/20121024223751/https://support.a...
- “OS X: Starting up with the 32-bit or 64-bit kernel” https://web.archive.org/web/20121024194635/http://support.ap...
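For reference, the opt-in those archived articles describe was either a one-time keyboard chord at boot or a persistent NVRAM boot argument - roughly (historical, Snow Leopard-era Intel Macs with 64-bit-capable EFI only):

```shell
# Snow Leopard-era opt-in to the 64-bit kernel (historical).
# One-time alternative: hold the "6" and "4" keys during boot.
# Revert with: sudo nvram boot-args=""
sudo nvram boot-args="arch=x86_64"
```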
I would have to agree here (and Apple also doesn't seem to assess feedback for their GUI changes), but unfortunately this thread is already on a software-quality meta tangent rather than listing individual annoyances, so here's my short list in the hope actual bugs can be discussed:
- window focus management broken: when you minimize or close a window, another random window of that app you're closing the window of is put into front even when that window is minimized; or other completely unrelated apps get focus
- index/Spotlight not showing file locations (full paths) after searching; the fsck?
- gestures being introduced that do stuff that you hit inadvertently and leave you in a state where you don't know how to undo its effects such as the super-annoying "fullscreen" mode when dragging windows around or pressing Command-F since Sequoia. Requires you to fscking research how to leave fullscreen mode (while not as cringe as Windows help "communities", the level of talking past another is getting there, options being discussed that don't exist in Sequoia's Dock/Desktop settings)
- update or feature nagging (I don't care I could use my iPhone as a webcam right now, go away)
- sometimes difficult to find mouse pointer on large screens
- older problem but I know at least one person on the verge of leaving Mac OS because of it after 20+ years of loyalty (or outright fanboyism tbh): in a German locale, you can't switch off PC gender-neutral language which is not only pushy and annoying but also space-inefficient as fsck
- How tf does it take upwards of 5 seconds to take a screenshot with modern hardware on a fully updated OS. How.
- And why do screen recordings sometimes randomly disappear into the bowels of the OS, where even Support struggle to find them?
Although I do not mind the way that window management works on macOS, recently I had a mildly infuriating situation. I was doing Cmd+Z to undo something - not sure in which app - and it didn't work, so I pressed it a couple of times instinctively. But although my target app was visible and on top, it was Finder that was actually in focus, so I accidentally triggered undo in Finder. I think I managed to undelete a file and something else, but I'm not sure. I don't know of a way to find a log of actions. That's something I would love to see in all desktop systems: a history of user actions. Also, having undo/redo shortcuts in Finder is potentially destructive - what if I move some files off an SD card, reformat it in the camera, and then accidentally hit undo in Finder?
I don't understand why macOS has so many issues. I still encounter memory leaks and have to kill Finder or Dock every few days lest it eat all my memory.
I have to agree with this, System Settings seems very inconsistent (design) and has terrible information architecture / organization.
No idea if it’s related to the new double tapping of alt to open the siri text input.
Our memory is a lot rosier than the reality.
Do we really need macOS, iOS, iPadOS, watchOS, tvOS, and visionOS? Does maintaining all of this make the product better long-term?
But Sequoia has made some M1 Pros run poorly in my environment. It's unacceptable the amount of resources it takes to do basic stuff that we got right 30 years ago.
BTW, there is an (earlier) example of Snow Leopard in the Microsoft ecosystem -- that would be Windows XP, which similarly avoided major new subsystems and new applications built-into-the-OS, but was remarkably fast and stable for its time.
It was perceived as bloated because it struggled on the hardware of the time.
Then it needed a near total rewrite with SP2 because it was riddled with security issues.
For me, service packs were largely a question of "do we want to tie up the phone line for however many hours just because Microsoft wants to rearrange a UI layout?"
Apple could conceivably abandon intel Mac Pro systems sold in 2023 by releasing an Apple Silicon-only macOS in 2026, but three years still seems a bit aggressive.
Nevertheless, he is probably right. Only people who have been through working on Windows and Linux, on both cheap and expensive machines, dealing with all the "baggage" those environments bring, can tolerate macOS with leniency. I will never go back to anything else until I see a competitive offer from anyone, because what Apple offers is:
* Fast, silent, extremely energy efficient devices with excellent screens and audio.
* The font rendering. I honestly can't believe people who have worked professionally with text all their lives never mention it here. macOS had, and continues to have, the best fonts and font rendering there is.
* Solid build that lasts (I own MacBook Pro and MS Surface Book 2 both from 2019 so I see how they age).
* A device that is ready to work when you open a lid or touch a keyboard button without any "waking up from sleep/hibernation" or freezing due to buggy video drivers and inability to work with GPU in hybrid mode OUT-OF-THE-BOX in 2025.
The above-mentioned is more than enough for me to tolerate any MacOS issues and the ones mentioned in the article are just laughable.
Apple offers you the full package that allows cross-device integration while Win/Linux users still rely on the Google stack or other third party "workarounds". Yes, no surprises here -- owning the hardware and software stack is a massive advantage.
Yeah, until the flex cable breaks (a 5 USD part) and you're forced to replace the entire screen (for 1000 USD).
I'm 100% supportive of Framework laptops, especially if it can be an open standard with aftermarket parts.
Apple products have good quality, but I'd prefer to upgrade just the CPU and keep my old display. Hopefully Framework will work towards that.
The Apple model is wasteful and profit-engineered.
Linux has significantly better font rendering than macOS these days if you're on a screen of 140 PPI or lower. Linux still does subpixel AA, and text looks razor sharp, while Apple pretends very large monitors like my 140 PPI 57" don't exist.
Now iOS and MacOS feel sluggish and slothlike, waiting on IO, typically from a remote call. The webdevs have taken over.
Yes they need to remove cruft, and also re-hire the ruthless UI Nazis who would enforce 120hz responsiveness at all cost.
As a user since System 7 it's so sad to see.
Of course today it would be insecure, missing security patches etc. SSL...
I entirely forgot it existed! They still sell that?
My initial wish for Apple was to make macOS as bulletproof, lightweight, and bug-free as possible. But now I just want to use Linux on my M1 MacBook because of all the bullshit that's going on in the US right now. It's only a matter of time until the Trump administration starts to dismantle the American technology sector, beginning with the softening of encryption and the death of the Advanced Data Protection I currently rely on in iCloud. Mark my words.
Like I've said in a couple of comments before in other threads, I'd love to switch to Asahi, but without native disk encryption I just can't. If my laptop gets stolen, all my files would be visible to the thief, and that's a risk I'm not willing to take.
There are three main choices and they are all compromised in their own way. You just need to figure out what is important to you and what isn't.
What you shouldn't do is take too much notice of posts like these, I've read through the whole thing and haven't had any of the issues mentioned. I've also not seen a mention of the issues I do have. HN has a negative tone, it seems we like to whinge.
"Just get it out somehow."
"Fixing bugs is not a KPI for our promotions and salary increases."
Old stuff is practically abandoned. No one knows how to fix it anymore and it’s replaced instead, at best. Disdain for legacy. The only thing management gets excited about is the next shiny thing, currently tacking AI onto everything.
Can you name big companies where this did not happen?
So far Apple has kept it as a toggle in the settings, but it's easy bloat for it to keep spreading. Does anyone need AI in a text editor? No.
In the general sense, Notepad and TextEdit should just be less nerdy versions of nano. They always have been, and that's what they were meant for.
If you need something to write reports, a book, etc., then you use Microsoft Word, Google Docs, or whatever Apple provides. Those are the tools where adding AI might be useful as a feature, like the ribbon in Office.
Let purpose made tools just be that.
The article's specific gripes with macOS are Mail, Messages, and System Settings. Fixing those does not require a 'no new features' (which was always BS) major release.
I recently emailed Tim expressing the same concerns as the article and regarding specific issues with Messages and Mail resource usage and was surprised to get a response from Craig requesting more information and sysdiagnose files, but this is where feedback ended unfortunately.
The current state of the macOS UI is atrocious. Devices don't all need the same button shapes or menu UX flows: a Mac isn't an iPad, so why force the same rounded buttons and simplified menus on both? They're interacted with differently - keyboard and mouse versus touch. I have no idea why this is so difficult for execs to understand, or why changing it is so important to them. The software teams at Apple are lucky to have the Apple Silicon innovation on the hardware side; Intel Macs would catch fire on boot running any of the latest releases, given how atrocious the resource usage is.
While I'm here whinging: the iOS swipe keyboard is garbage (almost totally unusable now), where before it was perfect, with the innovative predictive hit-box expansion pioneered by Ken Kocienda. I think that's now been replaced with AI prediction, and in 2025 I don't understand how it can be so embarrassingly bad. I had to upgrade to the iPhone Max recently just to hit the letters properly. Also, Apple, I never want to tell someone to "duck off".
Initially I was understanding, but quite frankly now I'm just pissed that it has gotten to this stage, and there is no indication of resolution from execs about these issues.
I'm starting to worry that Apple could go off the deep end - the way of Microsoft - coasting on hardware sales while letting software quality slide (albeit seemingly intentional on Microsoft's side of the fence). I get it - software isn't where the money is, hardware drives the business - but the two are inseparable BY DESIGN. When macOS struggles with basic functionality, it undermines the value of the Mac itself.
Topping the shitlist has to be the inexplicable splitting of group threads for random people in the group, even when everyone is using an iPhone. Suddenly someone in the group gets the messages by him or herself and can't reply to the group. And this occasionally also happens in one-on-one threads: I've had years-old (maybe decades-old) threads suddenly split off into a new one with a friend of mine for no apparent reason.
There's some fundamental incompetence in Message's design, and I'm sure that the addition of RCS has made it worse because it was slapped onto a rotten core.
Oh yeah, then there's the way Messages (or, to be fair, iOS) loses all of your contacts' names if you travel outside the country. This is another brain-dead defect: Just because you're in a new country code, your iPhone suddenly can't associate U.S. numbers with your contacts. How the hell does this go unfixed for one major iOS revision, let alone 15+ years?
Oh yeah, then there's the way Calendar "helpfully" changes the times on your appointments when you travel... meaning that you'll miss all of them if you travel east, because your phone will move them hours later. I mean... who lives like that? If you're going to London on business and the next day you have a meeting at 10 a.m., your iPhone will "helpfully" change that meeting to, say, 5 p.m.
So when the author muses about whether Apple developers ever actually use this stuff in the real world, the only logical answer is no. Or they just take so little interest in the functional quality of their product that they just check in some grossly defective trash and call it a day... and refuse to fix it year after year.
Or... they're not given time and resources to fix it. I'm pretty gentle when filing bugs about Xcode, because I'm sure they are understaffed. But at this point, the neglect has (or should have) exhausted every developer's patience.
Which brings us to a bit of hypocrisy in the post: "Apple is clearly behind on the AI arms race"
NO. Apple's sad capitulation to armchair "industry observers" and "analysts" has contributed greatly to the very defects the guy complains about. Apple should not have jumped on the "AI" hype in the first place. It does not serve Apple's product line or market. They are not a search company or gatekeeper to huge swaths of the Internet. If they wanted to quietly improve Siri and get it RIGHT, fine. But now they're embarrassed, and resources that should have been spent on QA have been squandered on bullshit "AI" that failed.
Now there's no such change, but instead AI, this weird new cross-cutting but fuzzy function touching everything that no one has ever used reliably at the scale of Apple devices. AI is impossible to test reliably, and it's all too easy to get embarrassing results. I'm glad Apple recently tamped down expectations.
The relatively loose concurrency model in Apple's ARM has made it rival the network in introducing new failure modes. Many of the quality issues cited have their root causes in those two sources of indeterminacy.
Amplifying these are the organizational boundaries driving software flaws. Siri as a separate organization with its own network-dependent stack is just not viable for scattering AI. Boosting revenue with iCloud services makes all roads run through the servers in Rome, amplifying network and backend reliability issues. I also suspect outsourcing quality and the maintenance of legacy software has reduced the internal quality signal and cemented technical boundaries, as the delegates protect their work streams and play quality theater. The yearly on-schedule cadence makes things worse because they can always play for time and wait for the next train.
And frankly (to borrow a concept from Java land), Apple might be reaching peak complexity. With hundreds of apps sporting tens of settings, there is simply no way to have a fast-path to the few things different people need. Deep linking is a workaround, but it's up to the app or user to figure that out. (And it makes me livid: I can't count how many important calls I've missed by failing to turn off "Silence unknown callers", with the Phone app settings buried 3 layers deep ON MY PHONE)
A short-term solution I think is not a rewrite but concierge UI setup: come to the store, tell the "geniuses" exactly what you need, and make shortcuts + myUI or whatever is necessary to enable them to make it happen. Then automate that process with AI.
That's something they can deliver continuously. Their geniuses can drive feature-development, and it can be rolled out to stores weekly and -- heavens! -- rolled back just as quickly. Customers and employees get the excitement of seeing their feature in action.
The model of sensitive form-factor designers working in quiet respectful collaboration to produce new curves every year is just wrong for today's needs. All those people standing around at Apple stores should instead be spending an hour or more with each existing customer designing new features, and they should be rewarded for features that take, and especially for features that AI can incorporate.
On the development side, any one should be able to contribute to any new feature, and be rewarded for it. At least for this work, there would be no more silos, and no massive work streams creating moral hazards.
The goal is to make software and a software development process that scales and adapts. It may start at 5% of new UI features, but I hope it infects and challenges the entire organization and stack.
Granted, it will take a famously hub organization and turn it into a web of hubs, but that in itself may be necessary for Apple to build the next generation of managers.
Look for how today's challenges can help you build tomorrow's organizations.
b) it might well be that the author really wishes they could draw a cartoon but can't, and resorted to AI to convey their aesthetic choices
I think your pessimism is unwarranted--AI illustrations do have their use.