(I'm aware that Battlefield series and League of Legends won't work due to draconian anti-cheat -- but nobody in my group cares to play those I guess.)
One annoying thing is that Linux can't run many different GPU drivers at the same time, so you have to make sure the cards all work with the same driver.
Proprietary third-party multi-seat solutions also exist for Windows, but Linux has built-in support and it's free.
https://en.wikipedia.org/wiki/Multiseat_configuration
I am super curious about your setup. I played with MS years ago, but I lost the need. It is a super cool tech that I'd love to see its efficiencies embraced in some way.
I suggest using Arch Linux, although loginctl should be available in all distributions using systemd now.
If you don't have enough USB ports you can use a USB hub; some monitors come with a built-in USB hub, and some with built-in sound, or you can use a wireless headset.
My main issue was that driver support was dropped for my oldest GPU, so one day when I upgraded the OS it just stopped working. To be on the safe side, get another GPU like the one you already have.
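For reference, assigning hardware to a second seat is done with loginctl; the sysfs paths below are purely illustrative, so substitute whatever `loginctl seat-status seat0` shows for your second GPU and USB controller:

```shell
# Show devices currently on the default seat
loginctl seat-status seat0

# Attach a GPU and a USB controller to a new seat (example paths only);
# systemd persists these assignments across reboots
loginctl attach seat1 /sys/devices/pci0000:00/0000:00:02.0/drm/card1
loginctl attach seat1 /sys/devices/pci0000:00/0000:00:14.0/usb1
```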
For some reason, the Lenovo Legion S's Windows still comes with a lot of baggage and background services etc.
It's available now, but nobody's been impressed yet. Gives you a gamepad-compatible launcher (although the gamepad PIN to login is buggy). Doesn't seem to actually save resources.
Edit: though, I'd have to give up my Fallout 4 with mods. (Don't judge me I know it sucks but it makes the dopamine go brrr)
The games market today seems more similar to music in its fragmentation.
The addressable consumer market is just a lot bigger and more diverse than it used to be. Go back to the late 90s and it's a market dominated by teenage boys. Look at some of the 00s and early 10s E3 presentations from the big three and it's cringe-inducing how focused they are on a teenage-boy demographic, how hard they try to appear edgy, and how blatantly sexist they are in their language. For example, at the E3 conference where MS announced Xbox Live (2004?) they explicitly said that girls don't play games (there were actually plenty of girls who did play games at that time), but that they might want to use Xbox Live to design t-shirts to sell to boys on their online marketplace. This was also still the era of booth babes, with barely dressed women trying to pull men into booths. Nearly every game ad was just a wall of explosions and violence, or just the latest NFL game.
Today you have fully grown adults in their 30s-50s with very different tastes, and you have a lot more women and girls playing.
On top of that, we have a lot of diversity in who creates games and in the kinds of games they can create while still being commercially successful: lots of interesting narratively focused games, puzzle games, platformers, and more artsy games. But if you want your multiplayer shooter, Battlefield and CS2 are still there for you.
https://www.mindlessmag.com/post/a-boy-s-hobby-gender-and-ma...
The reason some of the most popular games are popular isn't because they are fun, it's because they've built an esports industry. Those popular games get spectators which in turn makes the games more popular.
I don't understand this - and I'm not being a Windows defender here, I use Linux when I can (and promote its use).
But my Windows 11 installation has zero ads and zero "crapware". And it's a Dell!
Everything that I didn't want on the machine was removed when I purchased it (two years ago). I see no ads. If I did, this can be fixed easily by even non-technical users with OOShutUp10 or similar - or just edited with a registry change.
I've been using Windows since 3.1 and there were some ugly years but that is not the current state-of-the-state. I'm just calling it like I see it at this point.
It's probably the case that I could turn all of these off by hunting down the right config options, and if I used Windows as my primary desktop I'm sure I would. But it's just on my game machines which I don't want to spend a lot of time maintaining, and new crap keeps popping up in updates. It's exhausting.
A Debian Linux desktop, in comparison, is not trying to push you to anything. It's a breath of fresh air (not a term I use often but really fits here).
Note: I never made it to Windows 11, only Windows 10. But my understanding is that these things are getting worse, not better. And while not exactly the same thing, there has been a lot of talk lately about how the file explorer has become so bloated and slow that they have to preload it into memory at startup so that it can respond quickly when you click it... omg, I do not want that.
Yes, you can use cleanup software to fix the symptoms, but that's not the real issue here.
Edit: further research revealed my original first point was a false assumption.
The "current" state does not matter. What matters is that MS can shittify your experience at any time. Your machine can stop working if you don't agree to MS "updates". On Linux you have the assurance that the state of your machine can be preserved and you know exactly what's being installed on it.
FTFY: Windows is spyware. The fact that you paid for spyware or it came on your computer or it has useful properties (like Bonzi Buddy) doesn't make it not spyware.
I did a clean Windows 11 install a few months ago. I expected to be bombarded with ads and all of the other things I kept reading about in comments here, but it’s been fine.
I do find it interesting that so many of the comments about how bad Windows 11 is are coming from comments that also admit they aren’t using Windows 11. Not everything in Windows 11 is my favorite design choice, but the anti Windows 11 comments have taken on a life of their own that isn’t always based in reality.
They don't have annoying bundleware with Windows 11?
I am just using dm-snapshot for this -- block device level, no fancy filesystems.
All the things you’re used to without the corporate “sugarcoating”.
https://gitlab.com/scripts94/kubuntu-get-rid-of-snap
Up until now I didn't care how my software was installed, but snaps REALLY don't play nice, so it's time to retire them. Canonical has lost this battle, and the sooner they accept it and move on, the sooner they can recover their reputation and put this madness behind them.
..edit.. I installed a dummy package that displaces the nagware about the pro version too so I never get those messages during apt update any more.
Taking a quick, definitely incomplete look, I see at least:
/etc/apt/preferences.d/mozilla.pref:

Package: firefox*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 501

Package: thunderbird*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 501

/etc/apt/preferences.d/nosnap.pref:

Package: snapd
Pin: release a=*
Pin-Priority: -10
and removed ubuntu-pro-esm-apps and ubuntu-pro-esm-infra from that same dir. But also there is a mozillateam PPA in sources.list.d, and I don't see any installed package name that looks like it might be that dummy ubuntu-pro-esm thing, so maybe it got removed during a version upgrade and I never noticed, because Ubuntu stopped that nonsense and it isn't needed any more? Or there is some other config somewhere I'm forgetting that is keeping that hole plugged.
Anyway, it WAS a little bit of fiddling around one day, but at least it was only a one and done thing so far.
I kind of expected to be off of ubuntu by now because once someone starts doing anything like that, it doesn't matter if you can work around it, the real problem is that they want to do things like that at all in the first place. Well they still want what they want and that problem is never going away. They will just keep trying some other thing and then some other thing. So rather that fight them forever, it's better to find someone else who you don't want to fight. I mean that's why we're on Linux at all in the first place right? But so far it's been a few version bumps since then and still more or less fine.
I've had effectively zero issues avoiding snaps.
If you don’t want what makes Ubuntu Ubuntu, why not just run vanilla Debian instead?
Also, amazing house, my friend is enamored of the cat-transit. I used to live not too far from you :)
You cannot say such things without more info. I envision cats sitting on small trollies.
Realistically, that's only going to happen if Valve, specifically, decides to do it. There isn't really another player in the Linux gaming business with enough skin in the game to make it worthwhile.
Those that don't like it can simply not play the games, or can have a dedicated "compromised" machine/vm for gaming.
In some ways, the minor barrier is almost beneficial, in terms of clearly separating work-time and play-time.
I've also said it here before but I will just give up on PC gaming wholesale before I go back to Windows. It's crazy how much gaming on Linux has improved in just the past couple years.
I consider this a feature, not a bug
As I've said elsewhere, Battlefield 6 has got a far better user experience on Linux than Windows and I would recommend it to anyone.
If you want to help Debian test the next release and actually report issues choose Debian testing.
I wonder how many potential users have been scared off by the name... Maybe Debian devs like it that way, less annoying desktop users to support.
What changed?
I've run testing on my home server, though since it's a bit old now I've switched it over to stable when that testing release became stable.
The testing happens in Debian Sid.
No, just because the Steamdeck's distro is built on Arch, and so you can piggyback on what they are doing.
I don't see why 'piggyback on what [Steam deck is] doing' wouldn't work just as well on any distro, you'd just have a load of extra stuff you're not using too.
That's nothing against Arch, it's what I use, I'm just saying really the only magic is in doing less.
I download the nvidia drivers directly from nvidia. Their installer script is actually pretty decent and then I don't have to worry about whether the distro packages are up-to-date.
Also, I have a pretty unusual setup: my machines netboot from a shared iSCSI volume, setting up a local copy-on-write overlay on each machine.
SteamOS is based on Arch, so I'm sure it would be possible to make it do anything Arch can do. But I don't know Arch -- I know Debian. So I was a lot more comfortable installing Debian and tweaking it the way I needed, then installing Steam on top.
Pretty horrible technology, and unfortunately a good majority of the gaming industry by revenue relies on it.
It is not really a roadblock, more like a bump, and it is not the only bump by far. Some games just don't run on Linux, or run quite terribly, and they don't have a big enough community for people to care. Sometimes one of your pieces of hardware, maybe an exotic controller, doesn't like Linux. Sometimes it is not the fault of the game at all: you want to do something else with that PC that isn't supported on Linux, and you don't want to dual boot. Overall, you will have fewer problems with gaming on Windows, especially if you don't really enjoy a trip to Stack Overflow and the command line. But except maybe for anti-cheat, there are no "big" reasons, just a lot of small ones.
And sure, it is improving.
Outrageous that a ubiquitous connection protocol is allowed to be encumbered in this way.
DP1.4 though, so you're still going to need compression.
DP however can't transfer audio, which doesn't matter for a desktop but matters a lot for a TV.
No, it's not, the protocol is completely different (DP is packet-based while HDMI traditionally was not, though AFAIK HDMI 2.1 copied DP's approach for its higher speed modes). When you use a passive DP-HDMI cable (which AFAIK is not fully passive, it has level shifters since the voltages are different), it works only because the graphics card detects it and switches to using the HDMI protocol on that port; if it's not a dual-mode port (aka "DP++" port) it won't work and you'll need an active DP-HDMI adapter.
> DP however can't transfer audio, which doesn't matter for a desktop but matters a lot for a TV.
On the desktop I'm using to type this message, I use the speakers built into the DP-connected monitor (a Dell E2222HS). So yes, DP can and does transfer audio just fine. If it couldn't, then active DP to HDMI adapters wouldn't be able to transfer audio too.
The only thing DP doesn't have AFAIK is ARC, which might matter for a few more exotic TV use cases, and HEC, which AFAIK nobody uses.
This is my case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240hz 10-bit HDR signal @ 30 Gbps.
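The arithmetic roughly checks out: raw pixel data alone for that mode already exceeds DP 1.4's usable payload. A back-of-the-envelope sketch (ignoring blanking intervals, which add further overhead):

```python
# Uncompressed data rate for a video mode, in Gbps (pixel data only, no blanking).
def data_rate_gbps(h, v, hz, bits_per_channel, channels=3):
    return h * v * hz * bits_per_channel * channels / 1e9

rate = data_rate_gbps(2560, 1440, 240, 10)  # 1440p, 240 Hz, 10-bit RGB
dp14_payload = 32.4 * 8 / 10                # DP 1.4 HBR3: 32.4 Gbps raw, 8b/10b coding

print(round(rate, 1))          # ~26.5 Gbps of pixel data
print(round(dp14_payload, 2))  # 25.92 Gbps usable on DP 1.4
```

So even before blanking overhead, the mode doesn't fit in DP 1.4 uncompressed, which is why DSC or HDMI 2.1's higher link rate comes into play.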
I finally got the 240hz 4K uncompressed but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult, Microcenter didn’t have them in April and the only one that worked was the one that came with the monitor.
https://www.techpowerup.com/335152/china-develops-hdmi-alter...
(Some games support 120, but it's also used to present a 40hz image in a 120hz container to improve input latency for games that can't hit 60 at high graphics quality.)
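The 40-in-120 trick works because 40 divides 120 evenly, so each frame is shown for exactly three refreshes with no judder, while per-frame latency drops versus a 30 fps cap. A quick sketch of the numbers:

```python
# Frame time (and thus a floor on input latency) at a given frame rate.
def frame_time_ms(fps):
    return 1000.0 / fps

# Both caps divide their container evenly, so both present without judder...
assert 120 % 40 == 0 and 60 % 30 == 0

# ...but 40 fps shaves ~8 ms off every frame compared to 30 fps.
print(round(frame_time_ms(30), 1))  # 33.3 ms per frame
print(round(frame_time_ms(40), 1))  # 25.0 ms per frame
```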
I thought audio might be the reason, for as far as I can tell, displayport supports that too.
It took a long time to move from the old component input over to HDMI. The main thing that drove it was the SD to HD change. You needed HDMI to do 1080p (I believe, IDK that component ever supported that high of a resolution).
Moving from HDMI to DisplayPort is going to be the same issue. People already have all their favorite HDMI devices plugged in and set up for their TVs.
You need a feature that people want which HDMI isn't or can't provide in order to incentivize a switch.
For example, perhaps DisplayPort could offer something like power delivery. That could allow things like media sticks to be powered solely by the TV, eliminating some cable management.
For the same sorts of reasons that made it so for decades nearly every prebuilt PC shipped with an Intel CPU and Windows preinstalled: dirty backroom dealings. But in this case, the consortium that controls HDMI are the ones doing the dealings, rather than Intel and Microsoft.
"But Displayport doesn't implement the TV-control protocols that I use!", you say. That's totally correct, but DisplayPort has the out-of-band control channel needed to implement that stuff. If there had been any real chance of getting DisplayPort on mainstream TVs, then you'd see those protocols in the DisplayPort standard, too. As it stands now, why bother supporting something that will never, ever get used?
Also, DP -> HDMI active adapters exist. HDR is said to work all the time, and VRR often works, but it depends on the specifics of the display.
In my case I have an HTPC running Linux and a Radeon 6600 connected via HDMI to a 4k @ 120hz capable TV, and honestly, at that sitting distance/TV size and using 2x DPI scaling you just can't tell any chroma subsampling is happening. It is of course a ginormous problem in a desktop setting, and even worse if you try using 1x DPI scaling.
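The bandwidth win from subsampling is easy to see per 2x2 pixel block: luma is always sampled at full rate, and the J:a:b notation determines how many chroma samples survive. A small sketch:

```python
# Samples per 2x2 pixel block: 4 luma samples, plus chroma samples per mode.
modes = {
    "4:4:4": 4 + 4 + 4,  # full chroma resolution
    "4:2:2": 4 + 2 + 2,  # chroma halved horizontally
    "4:2:0": 4 + 1 + 1,  # chroma halved in both dimensions
}
for mode, samples in modes.items():
    print(mode, samples / modes["4:4:4"])  # fraction of full-rate data
```

So 4:2:0 carries only half the data of 4:4:4, which is what lets a link without enough bandwidth for full chroma still drive the mode — at the cost of fringing on fine text, hence the desktop problem.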
What you will lose however is the newer forms of VRR, and it may be unstable with lots of dropouts.
(I assume VRR = Variable Refresh Rate)
Most single player games (Spider-Man, God of War, Assassin's Creed etc) will offer a balanced graphics/performance mode which does 40 fps in a 120hz refresh.
The best Valve could do is offer a special locked down kernel with perhaps some anticheat capabilities and lock down the hardware with attestation. If they offer the sources and do verified builds it might even be accepted by some.
Doubt it would be popular or even successful on non-Valve machines. But I'm not an online gamer and couldn't care less about anticheats.
For competitive gaming, I think attested hardware & software actually is the right way to go. Don’t force kernel-level malware on everyone.
Cheats were a problem. Not even a nascent problem, but already established. Bad enough that VAC was released in 2002, Punkbuster in 2000...
In competitive gaming you cannot just find a stable friend group to play against: you need competition, and a diverse one. We somewhat palliated this by physically playing on LANs, but that still limits you to a radius around you, and it's cumbersome when you can instead just find an opponent online (we had manual matchmaking on IRC before modern matchmaking existed).
The problem is that cheating can be very subtle if done correctly. The difference between "that guy is better than me" and "that guy can see through walls" is pretty much undetectable through non-technical means if the cheater is not an idiot. This poisons the competitive scene.
Competitive gaming is huge. It was big back in the day but now it's a monster. Just check the largest categories on Twitch: LoL, TFT, WoW, CS, Valorant...
"Competitive tennis cannot possibly be huge"
"Competitive coding cannot possibly be huge"
People play competitive sports. They expect no, or minimal amounts of, cheating. Your personal feelings about it don't matter. The kid who plays basketball with 12-year-olds on Saturday mornings has the right not to have to deal with cheaters, and it doesn't matter if he's in the top .0001% or a shitty player who cannot distinguish his hands from his ears.
Have a quick look at the ladder on Counter-Strike, or Faceit, or ranked play on League of Legends/Valorant/whatever: it's not a niche. These games requiring kernel AC no matter the type of play is another subject, but people play to compare themselves to others, massively.
Who said anything about meaning? Does people being shit at the game invalidate the fact that the game's ruleset is competitive?
This is like saying we need to institute drug testing at all parks to play football. Cheating in sports is a problem that very few players are concerned with. Caring about who wins isn't even common. Most are just kicking a ball around with their mates.
Yeah I'm the one in a bubble because I think players that play competitive games expect competitive integrity, regardless of their skill level.
And they don't even need it all the time either. I did once participate in a CS:S tournament, so I guess I was "competitive", but half the time I was on gun game or ice world or surf maps. My friends and I played normal Warcraft 3 against each other, but otherwise I pretty much only played custom maps, which were apparently popular enough to spawn an entire new genre. I never ran into problems queueing for something like preschool wars or wintermaul. When we did queue for ladder sometimes it was like 10 minutes to find a match.
To your earlier point about e.g. Valorant: my mom invited me to play on weekends with her and my sister. I know my mom is 0% competitive. This was not some serious thing. I couldn't play with them because I'm not going to buy another computer just to run it. That's the absurdity here.
The International (a DOTA 2 competition) has like $40m in prizes. EWC in 2025 was $70m. 99.6 million people watched the League of Legends World Championship final. And we're not even talking about the millions of dollars of sponsorship involved.
That's great your mom isn't competitive in Valorant, but massively irrelevant. It's like me saying "I play flag football with friends, there is no competitive football."
Anti-cheat is important because this is how the best players are discovered, this is how they're recruited. If a game is 50%+ cheaters, the game will die... DOTA2 would cease to exist today as a big deal. Same with Valorant.
Aside from competitive gaming, GTA V online makes $1 BILLION in ARR. That would be $0 if the game was flooded with cheaters.
Now this isn't me defending kernel level anti-cheat, I think there are better ways to do it and some games do a great job here.
But man, calling GTA V online and competitive e-sports a "tiny bubble" is like calling the NFL a "tiny bubble".
This led to legit players that were just good being banned by salty mods, or cheaters that were subtle enough to only gain a slight edge not being banned.
It's a bit like complaining that these days people just want to watch TV, instead of writing and performing their own plays.
Other games did similarly. Quake 3 Arena added Punkbuster in a patch. Competitive 3rd party Starcraft 1 server ICCUP had an "anti-hack client" as a requirement.
I could almost get on board with the idea of invasive kernel anti-cheat software if it actually was effective, but these games still have cheaters. So you get the worst of both worlds--you have to accept the security and portability problems as a condition for playing the game AND there are still cheaters!
The bloggers/journalists calling it malware are doing the conversation a disservice. The real problem is the risk of bugs or defects in kernel-level anti-cheat, which _could_ be exploited in the worst case, and in the best case cause outages.
The classic recent example is the CrowdStrike-triggered outage of computers worldwide due to kernel-level antivirus/malware scanning. Anti-cheat could potentially have the exact same outcome (though perhaps smaller in scale, since only gamers would have it).
If Windows provided a better framework, it is feasible that such errors would be recoverable and fixable without outages.
I'm already salty about the binary blobs required by various pieces of firmware.
FPSes can just say "the console is the competitive ranked machine", add mouse + keyboard support, and call it a day. But in those games cheaters can really ruin things with aimbots, so maybe it is necessary for the ecosystem, I dunno.
Nobody plays RTSs competitively anymore and low-twitch MMOs need better data hiding for what they send clients so 'cheating' is not relevant.
We are at the point where camera + modded input devices are cheap and easy enough that I dunno if anti-cheat matters anymore.
Competition vs other human beings is the entire point of that genre, and the intensity when you’re in the top .1% of the playerbase in Overwatch/Valorant/CSGO is really unmatched.
Case in point from a few years back - Fall Guys. Silly fun, sloppy controls, a laugh. And then you get people literally flying around because they've installed a hack, so other players can't progress as they can't make the top X players in a round.
So to throw it back - it is just a game, it's so sad that a minority think winning is more important than just enjoying things, or think their own enjoyment is more important than everyone else's.
As an old-timer myself, we thought it was despicable when people replaced downloaded skins in QuakeWorld with all-fullbright versions in their local client, so they could get an advantage spotting other players... I suppose that does show us that multiplayer cheating is almost as old as internet gaming.
Also, for more casual play, don't players have rankings so that you play with others at about your level? Cheaters would all end up just playing with other cheaters in that case, wouldn't they?
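There's some intuition for that in how rating systems behave: under an Elo-style update (a simplified sketch below; real matchmakers use fancier models like TrueSkill), winning inflates your rating until you face opponents who beat you as often as you beat them — which, for a cheater, ends up being mostly other cheaters.

```python
# Minimal Elo-style rating update (illustrative sketch, not any game's real system).
def expected_score(ra, rb):
    # Probability-like expected score for player A against player B.
    return 1 / (1 + 10 ** ((rb - ra) / 400))

def update(ra, rb, score_a, k=32):
    # score_a is 1 for an A win, 0 for a loss, 0.5 for a draw.
    ea = expected_score(ra, rb)
    return ra + k * (score_a - ea), rb + k * ((1 - score_a) - (1 - ea))

print(update(1000, 1000, 1))  # evenly matched, A wins -> (1016.0, 984.0)
```

In practice it's leakier than that: a cheater's climb through the ladder still ruins every match along the way, which is part of why publishers don't rely on matchmaking alone.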
That would require essentially turning it into a console or Android.
Hardware support would inevitably be somewhat limited but that's still better than the situation with either consoles or kernel anticheat.
Making a Valve-only Linux solution would take a lot of the joy of this moment away for many. But it would also help Valve significantly. It's very uncomfortable to consider, imo.
The problem is more the audience. Console players generally expect to be able to just connect the console to the TV, sit on the sofa, and play with the official controller. That's all the games are required to support to be published on the platform.
Even if you were willing to play at a desk, you’d be matchmaking into a special (and small) mouse pool on the console game. Anyone willing to go through so much faff will accept the extra annoyances of a PC, even with kernel anti cheat.
I'm far from an authority on this topic, but from my understanding both Sony and MS have introduced mouse-and-keyboard support; so far it looks to be an opt-in kind of thing, and it's still relatively new.
But even then, when everyone is trying out a new indie game there’s a chance it won’t work on non-Windows. It’s happened to me.
I am very pro-Linux and pro-privacy, and hope that the situation improves so I don’t have to continue to compromise.
Sure, except that anyone can just compile a Linux kernel that doesn't allow that.
Anti-cheat systems on Windows work because Windows is hard(er) to tamper with.
This isn't complicated.
Even the CrowdStrike Falcon agent has switched to eBPF because it lowers the risk that a kernel driver will brick downstream systems, like what happened with Windows that one time. I recently configured a corporate single sign-on to simply not work if the BPF component was disabled.
Anticheat and antivirus are two similar but different games. It's very complicated.
Although even then I'd still have qualms about paying for the creation of something that might pave the path for hardware vendors to work with authoritarian governments to restrict users to approved kernel builds. The potential harms are just not in the same league as whatever problems it might solve for gamers.
- Want to play these adversarial games
- Don't care about compromising control of hypervisor
- Don't simply have a dedicated gaming box
A hypervisor that protects against this already exists for Linux with Android's pKVM. Android properly enforces isolation between all guests.
Desktop Linux distros are way behind in terms of security compared to Android. If desktop Linux users ever want L1 DRM to work to get access to high resolution movies and such they are going to need such a hypervisor. This is not a niche use case.
I would never use a computer I don't have full control over as my main desktop, especially not to satisfy an external party's desire for control. It seems a lot more convenient to just use a separate machine.
Even mainstream consumers are getting tired of DRM crap ruining their games and movies. I doubt a significant number of Linux users would actually want to compromise their ownership of the computer just to watch movies or play games.
I do agree that Linux userland security is lackluster, though. Flatpak seems to be a neat advancement, at least in regard to stopping things from basically uploading your filesystem. There are already a lot of kernel interfaces that can do this, like user namespaces. I wish someone would come up with something like QubesOS, but making use of containers instead of VMs, and Wayland proxies, for better performance.
I honestly think you would be content as long as the computer offered the ability to host an arbitrary operating system just like has always been possible. Just because there may be an optional guest running that you can't fully control that doesn't take away from the ability to have an arbitrary guest you can fully customize.
>to satisfy an external party's desire for control.
The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
>It seems a lot more convenient to just use a separate machine.
It really isn't. It's much more convenient to launch a game on the computer you are already using than going to a separate one.
It's a little funny that the two interests of adtech are colliding a bit here: They want maximum control and data collection, but implementing control in a palatable way (like you describe) would limit their data collection abilities.
My answer to your question: No, I don't like it at all, even if I fully trust the hypervisor. It would reduce the barrier for implementing all kinds of anti-user technologies. If that became possible, it would quickly be required to interact with everything, and your arbitrary guest would soon be pretty useless, just like the "integrity" bullshit on Android. Yeah, you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc. That's still a net minus compared to the status quo.
In general, I dislike any methods that try to apply an arbitrary set of criteria to entitle you to a "free" service to prevent "abuse", be it captchas, play integrity, or Altman's worldcoin. That "abuse" is just rational behavior from misaligned incentives, because non-market mechanisms like this are fundamentally flawed and there is always a large incentive to exploit it. They want to have their cake and eat it too, by eating your cake. I don't want to let them have their way.
> The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
Pretty sure we already have enough technology to fully automate many games with robotics. If there is a will, there is a way. As with everything else on the internet, everyone you don't know will be considered untrusted by default. Not the happiest outcome, but I prefer it to losing general purpose computing.
I'm talking about the entire chip. You are unable to implement a new instruction for the CPU for example. Only Intel or AMD can do so. You already don't have full control over the CPU. You only have as much control as the documentation for the computer gives you. The idea of full control is not a real thing and it is not necessary for a computer to be useful or accomplish what you want.
>and your arbitrary guest will soon be pretty useless
If software doesn't want to support insecure guests, the option is between being unable to use it, or being able to use it in a secure guest. Your entire computer will become useless without the secure guest.
>Yeah you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc.
This could be handled by also running another guest that was supported by those app developers that provide the required security requirements compared to your arbitrary one.
>That "abuse" is just rational behavior from misaligned incentives
Often these can't be fixed, or fixing them would result in a poor user experience for everyone due to a few bad actors. If your answer is to just not build the app in the first place, that is not a satisfying answer. It's a net positive to be able to do things like watch movies for free on YouTube. It's beneficial for all parties. I don't think it is in anyone's best interest to forgo such a thing just because there isn't a proper market incentive in place to stop people from ripping the movie.
>If there is a will, there is a way.
The goal of anticheat is to minimize customer frustration caused due to cheaters. It can still be successful even if it technically does not stop every possible cheat.
>general purpose computing
General purpose computing will always be possible. It just will no longer be the wild west, where there was no security and every program could mess with every other program. Within a program's own context it is still able to do whatever it wants; you can implement a Turing machine (barring the infinite memory).
They certainly aren't perfect, but they don't seem to be hell-bent on spying on or shoving crap into my face every waking hour for the time being.
> insecure guests
"Insecure" for the program against the user. It's such a dystopian idea that I don't know what to respond with.
> required security requirements
I don't believe any external party has the right to require me to use my own property in a certain way. This ends freedom as we know it. The most immediate consequence is that we'd be subject to more ads with no way to opt out, but that would just be the beginning.
> stop people from ripping the movie
This is physically impossible anyway. There's always the analog hole, recording screens, etc, and I'm sure AI denoising will close the gap in quality.
> it technically does not stop every possible cheat
The bar gets lower by the day with locally deployable AI. We'd lose all this freedom for nothing at the end of the day. If you don't want cheating, the game needs to be played in a supervised context, just like how students take exams or sports competitions have referees.
And these are my concerns with your ideal "hypervisor" provided by a benevolent party. In this world we live in, the hypervisor is provided by the same people who don't want you to have any control whatsoever, and would probably inject ads/backdoors/telemetry into your "free" guest anyway. After all, they've gotten away with worse.
We already tried out trusting the users and it turns out that a few bad apples can spoil the bunch.
>It's such a dystopian idea that I don't know what to respond with.
Plenty of other devices are designed so that you can only use it in safe ways the designer intends. For example a microwave won't function while the door is open. This is not dystopia despite potentially going against what the user wants to be able to do.
>I don't believe any external party has the right to require me to use my own property in a certain way.
And companies are not obligated to support running on your custom modified property.
>The bar gets lower by the day with locally deployable AI.
The bar at least can be raised from searching "free hacks" and double clicking the cheat exe.
>who don't want you to have any control whatsoever
This isn't true. These systems offer plenty of control, but they are just designed in a way that security actually exists and can't be easily bypassed.
>and would probably inject ads/backdoors/telemetry into your "free" guest anyway.
This is very unlikely. It is unsupported speculation.
You say this as if they're a guest on your machine and not the other way around.
It's not a symmetrical relationship. If companies don't trust me, they don't get my money. And if I don't trust them, they don't get my money.
The only direction that gets them paid is if I trust them. For that to happen they don't have to go out of their way to support my use cases, but they can't be going out of their way to limit them either.
> designed in a way that security actually exists
When some remote party has placed countermeasures against how you want to use your computer, that's the opposite of security. That's malware.
Is it possible to do this in a relatively hardware-agnostic, but reliable manner? Probably not.
That way you could use an official kernel from Fedora, Ubuntu, Debian, Arch etc. A custom one wouldn't be supported but that's significantly better than blocking things universally.
I'm not aware that a TPM is capable of hiding a key without the OS being able to access/unseal it at some point. It can display a signed boot chain but what would it be signed with?
If it's not signed with a key out of the reach of the system, you can always implement a fake driver pretty easily to spoof it.
Basically, the TPM contains a key that is itself signed with a manufacturer key. You can't just extract it, and the signature ensures that this key is "trusted". When asked, the TPM will return the boot chain (including the bootloader or UKI hash), signed by its own key, which you can present to the remote party. The whole protocol is more complicated and includes a challenge.
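A toy model of that quote/challenge flow can make it concrete. Everything here is a simplification: a real TPM uses an asymmetric attestation key certified by the manufacturer, while this sketch uses HMAC with a shared secret just so it's self-contained and runnable.

```python
import hashlib
import hmac
import os

# Stand-in for the TPM's attestation key. A real TPM holds an asymmetric
# key it never releases; HMAC with a shared secret is used here only so
# the sketch runs without a TPM.
TPM_KEY = b"sealed-inside-the-tpm"

def tpm_quote(pcr_digest: bytes, nonce: bytes) -> bytes:
    """Simulate a TPM quote: sign the measured boot chain plus the
    verifier's nonce, so the quote cannot be replayed later."""
    return hmac.new(TPM_KEY, pcr_digest + nonce, hashlib.sha256).digest()

def verifier_check(pcr_digest: bytes, nonce: bytes, quote: bytes,
                   expected_pcr: bytes) -> bool:
    """The remote party checks both the signature and that the reported
    measurement matches a known-good kernel/bootloader hash."""
    expected_sig = hmac.new(TPM_KEY, pcr_digest + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(quote, expected_sig) and pcr_digest == expected_pcr

clean = hashlib.sha256(b"vmlinuz-official").digest()
nonce = os.urandom(16)
assert verifier_check(clean, nonce, tpm_quote(clean, nonce), clean)

# A patched kernel measures differently, so its (validly signed) quote fails.
patched = hashlib.sha256(b"vmlinuz-patched").digest()
assert not verifier_check(patched, nonce, tpm_quote(patched, nonce), clean)
```

The point the thread circles around: a fake driver can fabricate the measurement, but it can't fabricate a signature from a key it doesn't hold, which is why spoofing only works if the verifier's set of trusted vendor keys is too permissive.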
These attestation methods would probably work well enough if you pin a specific key like for a hardened anti-evil-maid setup in a colo, but I doubt it'd work if it trusts a large number of vendor keys by default.
It also means that if you do get banned for any reason (obvious cheating) they then ban the EK and you need to go source more hardware.
It's not perfect but it raises the bar significantly for cheaters to the point that they don't bother.
The idea is you implement a fake driver to sign whatever message you want and totally faking your hardware list too. As long as they are relatively similar models I doubt there's a good way to tell.
Yeah, I think there are much easier ways to cheat at this point, like robotics/special hardware, so it probably does raise the bar.
I don't really care about games, but i do care about messing up people and companies that do such heinous crimes against humanity (kernel-level anti-cheat).
I feel like this is way overstated. It's not that easy to do, and could conceptually be done on Windows too via hardware simulation/virtual machines. Both would require significant investment in development to pull off.
And then you have BasicallyHomeless on YouTube who is stimulating nerves and using actuators to "cheat." With the likes of the RP2040, even something like an aim-correcting mouse becomes completely cheap and trivial. There is a sweet-spot for AC and I feel like kernel-level might be a bit too far.
Assuming that cheats work by reading (and modifying) the memory of the game process, you can attach a kprobe to the sys_ptrace syscall. Every time any process uses it, your eBPF program triggers. You can then capture the PID and UID of the requester and compare it against a whitelist (e.g. only the game engine can mess with the memory of that process). If the requester is unauthorized, the eBPF program can even override the return value to deny access before the kernel finishes the request.
Of course there are other attack vectors (like spoofing PID/process name), but eBPF covers them also.
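The decision logic such a hook would apply can be sketched in user space. The PIDs and the whitelist here are hypothetical; in the real thing this logic runs as an eBPF program in kernel context and denies the call via bpf_override_return():

```python
import errno
from dataclasses import dataclass

@dataclass(frozen=True)
class Caller:
    pid: int
    uid: int

GAME_PID = 4242                                # hypothetical game process
WHITELIST = {Caller(pid=GAME_PID, uid=1000)}   # only the engine itself

def ptrace_hook(caller: Caller, target_pid: int) -> int:
    """Return 0 to let sys_ptrace proceed, -EPERM to short-circuit it,
    mirroring what bpf_override_return() would do in the kernel."""
    if target_pid != GAME_PID:
        return 0               # not our process: don't interfere
    if caller in WHITELIST:
        return 0               # the engine may touch its own memory
    return -errno.EPERM        # everyone else, including root, is denied

assert ptrace_hook(Caller(pid=GAME_PID, uid=1000), GAME_PID) == 0
assert ptrace_hook(Caller(pid=9999, uid=0), GAME_PID) == -errno.EPERM
assert ptrace_hook(Caller(pid=9999, uid=0), 1234) == 0
```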
All of this to say that Linux already has sane primitives to allow that, but that, as long as devs don't prioritize Linux, we won't see this happening.
but how does the anti-cheat know that the kernel is not modified such that it disables certain eBPF programs (or misreports cheats/spoofs data etc)?
This is the problem with anti-cheat in general (and the same exists with DRM) - the machine is (supposedly) under the user's total control and therefore, unless your anti-cheat is running at the lowest level, outside of the control of the user's tampering, it is not trustworthy. This leads to TPM requirements and other anti-user measures that are dressed as pro-user in windows.
There's no such thing in linux, which makes it inoperable as one of these anti-cheat platforms imho.
(The following was refined by an LLM because I didn't remember the details of when I was pondering this a while back)
All your anti cheats are eBPF programs hooked to the bpf() syscall itself.
Whenever any process tries to call BPF_PROG_DETACH or BPF_LINK_DETACH, your monitors check if the target is one of the anti cheats in your cluster of anti-cheats.
If an unauthorized process (even Root) tries to detach any of your anti-cheat processes, the eBPF program uses bpf_override_return to send an EPERM (Permission Denied) error back to the cheat.
(End LLM part)
Of course, you can always circumvent this by modifying and compiling the kernel so that those syscalls when targeting a specific PID/process name/UID aren't triggered. But this raises the difficulty of cheating a lot as you can't simply download a script, but you need to install and boot a custom kernel.
So this would solve the random user cheating in an online match. Pro users with enough motivation can and will cheat anyway, but that is also true on Windows. Finally, at top gaming events there is so much scrutiny, since you play on stage on vetted PCs, that this is a non-issue.
But given Linux kernel is monolithic and you can enforce signing of kernel modules too, using TPM to make sure the Kernel isn't tampered with is honestly the way to go.
I believe the goal is to make it so uncomfortable and painful that 99.999% of users will say fuck it and won't do it. In this case users would need to boot a custom kernel that they download from the internet, which might contain key-loggers and other nasty things. It is no longer just downloading a script and executing it.
For cheat developers, instead, this implies doing the modifications to allow those sys-calls to fly under the radar while keeping the system bootable and usable. This might not be trivial.
Modern games already employ a bunch of VM-like techniques for tamper protection.
This has effectively killed PC game piracy.
- https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207/432
- https://indico.freedesktop.org/event/10/contributions/402/attachments/243/327/2025-09-29%20-%20XDC%202025%20-%20Descriptors%20are%20Hard.pdf
- https://www.youtube.com/watch?v=TpwjJdkg2RE
The problem is on multiple levels, so everything has to work in conjunction to be fixed properly. At the same time, Vulkan support is also getting pretty widespread; notably, idTech games prefer Vulkan as the API.
Id Software do prefer Vulkan but they are an outlier.
DX12 worked decently better than OpenGL before, all the gamedevs had Windows, and it was required for Xbox… but now those things are less and less true.
The playstation was always “odd-man-out” when it came to graphics processing, and we used a lot of shims, but then Stadia came along and was a proper linux, so we rewrote a huge amount of our render to be better behaved for Vulkan.
All subsequent games on that engine have thus had a Vulkan-friendly renderer by default, one that is implemented more cleanly than the DX12 one and works natively pretty much everywhere. So it's the new default.
https://godotengine.org/article/dev-snapshot-godot-4-6-dev-5...
Valve doesn't employ kernel AC but in practice others have taken that into their own hands - the prevalence of cheating on the official CS servers has driven the adoption of third-party matchmaking providers like FACEIT, which layer their own kernel AC on top of the game. The bulk of casual play happens on the former, but serious competitive play mostly happens on the latter.
And for what it's worth, I'm pretty sure Valorant is the most played competitive shooter at the moment.
First, let's ask ourselves how many PC users play games with anti-cheat frameworks. I'm absolutely no expert, but if it's more than, what, 10%? Let's even say 20%? I'd be surprised.
> and unfortunately a good majority of the gaming industry by revenue relies on it.
Well, it used to be the case that game makers relied on copy protection in floppy discs, and movie distributors on DVD/BluRay copy protection. Conditions changed and they adapted.
I firmly believe that Nvidia doesn't want the general public to ever have better hardware than what is current, as people could just run their own local models and take away from the ridiculous money they're making from data centers.
In step with this, they're now renting their gaming GPUs to players with their GeForce Now offering.
Nvidia's revenue from gamers is a rounding error now against AI datacenter orders. I won't hold my breath for them revisiting their established drivers for Linux.
You're underestimating them. They don't even want rich professional users to own hardware that could compete with their datacenter cash cow.
Take RTX 6000 Pro, a $10k USD GPU. They say in their marketing materials that these have fifth-generation tensor cores. This is a lie, as you can't really use any 5th-gen specific features.
Take a look at their PTX docs[1]. The RTX 6000 Pro is sm_120 in that table, while their datacenter GPUs are sm_100/sm_110. See the 'tcgen05' instructions in the table? It's called 'tcgen05' because it stands for "Tensor Core GEN 05", and they're all unsupported on sm_120.
[1] - https://docs.nvidia.com/cuda/parallel-thread-execution/#rele...
EAC has the support for Linux, you just have to enable it as a developer.
I know this, I worked on games that used this. EAC was used on Stadia (which was a debian box) for the division, because the server had to detect that EAC was actually running on the client.
I feel like I bring this up all the time here but people don’t believe me for some reason.
This does not mean it supports the full feature set of EAC on Windows. As an analogy, it's like saying Microsoft Excel supports the iPad. It's true, but without VBA support there aren't going to be many serious attempts to port more complicated spreadsheets to the iPad.
https://github.com/JacKeTUs/linux-steering-wheels
Hopefully vr headset support will get better
I haven’t found a tool that can access all the extra settings of my Logitech mouse, nor of my Logitech speakers.
OpenRGB is amazing but I’m stuck on a version that constantly crashes; this should be fixed in the recent versions but nixpkgs doesn’t seem to have it (last I checked).
On the other hand I did manage to get SteamVR somewhat working with ALVR on the Quest 3, but performance wasn’t great or consistent at all from what I remember (RTX 3070, Wayland KDE).
Alternatively, given you’re running NixOS, you can just override the `src` of the derivation with a newer version. This is part of the point of running NixOS: making small modifications to packages on the fly.
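For the OpenRGB case above, an overlay along these lines would do it; the version and tag here are illustrative placeholders, and you'd paste in the real hash after the first build reports a mismatch:

```nix
# In configuration.nix or an overlay file. Version/rev are illustrative.
nixpkgs.overlays = [
  (final: prev: {
    openrgb = prev.openrgb.overrideAttrs (old: rec {
      version = "0.9";
      src = prev.fetchFromGitLab {
        owner = "CalcProgrammer1";
        repo = "OpenRGB";
        rev = "release_${version}";
        # Build once with lib.fakeHash, then paste the hash Nix reports.
        hash = prev.lib.fakeHash;
      };
    });
  })
];
```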
You don’t want a vendor you have to publicly shame to get them to do the right thing. And that’s MS, if any single sentence has ever described them without using curse words.
I’m not really into this AI shenanigans, but it seems to me that if you want people to use /your/ bot, you gotta give it to people in the most seamless and efficient way possible, and that does not translate well to a desktop OS.
I don’t think they would have dethroned iOS or even Android had they stayed their ground, but they probably would’ve had a stronger base to build upon for their Copilot nonsense. Those that used Windows Phone used it because they loved it, Copilot could’ve garnered some good rep from those already sold on Microsoft’s platform; instead, they’re trying to shove it down people’s throats even though very few people actually use Windows because they actively like it, most use it because it’s the “default” OS and they do not (and care not to) know any better.
That resulted in Windows 8.
More recently they've freaked out about ads, app stores, and SaaS revenue, which has resulted in lots of dark patterns in the OS.
Stock price growth is their core business because that is how large firms operate.
MS used to embrace games etc because the whole point was all PCs should run Windows. Now the plan is to get you onto a subscription to their cloud. The PC bit is largely immaterial in that model. Enterprises get the rather horrible Intune bollocks to play with but the goal is to lock everyone into subs.
I thought all of them more or less have operated under Ponzinomics ever since Jack Welch showed that that worked in the short term.
The strength of Linux and Free software in general is not in that it's completely built by unpaid labor. It's built by a lot of paid, full-time labor. But the results are shared with everyone. The strength of Free software is that it fosters and enforces cooperation of all interested parties, and provides a guarantee that defection is an unprofitable move.
This is one of the reasons you see Linux everywhere, and *BSD, rarely.
I doubt it's a large reason. I'd put more weight on eg Linus being a great project lead and he happens to work on Linux. And a lot of other historical contingencies.
This flow is basically the bread and butter for the OSS community and the only way high effort projects get done.
Game studios will keep buying Windows and Visual Studio licenses, target DirectX, and let Valve do whatever they need for game content.
This is a far better user experience for Battlefield players than in Windows.
Have you ever actually attempted to play that half-assed buggy piece of shit?
The one thing I haven’t been able to get working reliably is steam remote play with the Linux machine as host. Most games work fine, others will only capture black screens.
Granted, I don't play online games, so that might change things, but for years I used to have to make a concession that "yeah Windows is better for games...", but in the last couple years that simply has not been true. Games seem to run better on Linux than Windows, and I don't have to deal with a bunch of Microsoft advertising bullshit.
Hell, even the Microsoft Xbox One controllers work perfectly fine with xpad and the SteamOS/tenfoot interface recognizes it as an Xbox pad immediately, and this is with the official Microsoft Xbox dongle.
At this point, the only valid excuses to stay on Windows, in my opinion, are online games and Microsoft Office. I don't use Office since I've been on Unixey things so long that I've more or less just gotten used to its options, but I've been wholly unable to convince my parents to change.
I love my parents, but sometimes I want to kick their ass, because they can be a bit stuck in their ways; I am the one who is expected to fix their computer every time Windows decides to brick their computer, and they act like it's weird for me to ask them to install Linux. If I'm the one who has to perform unpaid maintenance on this I don't think it's weird for me to try and get them to use an operating system that has diagnostic tools that actually work.
As far as I can tell, the diagnostic and repair tools in Windows have never worked for any human in history, and they certainly have never worked for me. I don't see why anyone puts up with it when macOS and Linux have had tools that actually work for a very long time.
I didn’t see a performance increase moving to Linux for the vast majority of titles tested. Certainly not enough to outweigh the fact that I want EVERY game to work out of the box, and to never have to think if it will or won’t. And not all of my games did, and a not insignificant number needed serious tweaking to get working right.
I troubleshoot Linux issues all day long, I’ve zero interest in ever having to do it in my recreation time.
That’s a good enough reason for me to keep my windows box around.
I use Linux and OSX for everything that isn’t games, but windows functions just fine for me as a dumb console and I don’t seem to suffer any of these extreme and constant issues HN users seem to have with it from either a performance or reliability standpoint.
(cue superiority complex) I've been using Linux Desktop for over 10 years. It's great for literally everything. Gaming admittedly is like 8/10 for compatibility, but I just use a VM with PCIe passthrough to pass in a gpu and to load up a game for windows or use CAD, etc. Seriously, ez.
Never had issues with NVIDIA GFX with any of the desktop cards. Laptops... sure they glitch out.
Originally Wine, then Proton, and now Bazzite make it super easy to game natively. The only issues I ever had with games were from the bundled kernel-level anti-cheats. The anti-cheats just weren't available for Linux, so the games didn't start. Anyone familiar with those knows it's not a Linux thing, it's a publisher/anti-cheat mechanism thing. Just lazy devs really.
(cue opinionated anti-corporate ideology) I like to keep Microsoft chained up in a VM where it belongs so it can't do its shady crap. Also, with a VM you can do shared folders and clipboard. Super handy actually.
Weirdly enough, MacOS in a VM is a huge pita, and doesn't work well.
For a server there is no better choice than Linux, but for my desktop/laptop, I find other alternatives better. Perhaps I haven’t found «the right distro», if so let me know, but until Linux is as low maintenance as windows or macos, it will be for those with an interest in doing that maintenance.
I realize I have a love-hate relationship with Linux. It is perfect, but flawed.
I think it was Jorge Castro, the creator of Universal Blue, who called it the sysadmin culture. Most Linux distros are made by sysadmins for sysadmins, and you're expected to change and configure your system. I was a sysadmin myself for a long time. I used Slackware; switched from the 2.4 kernel to 2.6; tweaked CFLAGS on Gentoo; replaced SysV init with systemd; used PipeWire from the earliest versions - you name it, I did it.
Nowadays I use https://aeondesktop.github.io/ - an immutable system with Btrfs snapshots. Everything is installed from Flathub. The major roadblock is that much of the Linux world expects you to modify the system one way or another, so your mileage may vary. I replaced my printer because I did not want to install binary blobs from HP/Samsung.
> Perhaps I haven’t found «the right distro»
I’d look at immutable or image-based offerings, which aim at low or no maintenance: Aeon Desktop, Universal Blue, Endless OS. There are reviews on sites like LWN.net.
Compared to Windows-land where nuking and reinstalling the entire OS is a routine maintenance task, checking arch news to see if there's any manual intervention and running `pacman -Syu` is all I really ever think about.
I've been using Linux at work and at home every day for 15 years and I think in that whole time I've only ever had to reinstall the OS due to system issues once.
(I ran an Ubuntu system update on my laptop while on low battery, and it died. The APT database was irrevocably fucked afterwards. I'm not even sure it's fair to blame the OS for this, it was a dumb thing for me to do. I would also not be at all surprised if it's possible to fuck up a Windows installation in a similar manner).
Nowadays I run NixOS and yes that requires quite regular attention. But I've also used Ubuntu, Fedora and Debian extensively and all of them are just completely stable all the time.
(Only exception I can think of: Ubuntu used to have issues with /boot getting full which was a PITA).
I don't know what distribution you're using, but something sounds very broken if you need to do this.
My read is they don't really give a shit about it anymore because the revenue comes from mobile/tablets. Same reason Microsoft is comfortable trashing windows... the revenue is coming from O365 & Azure now. The OS is a loss leader to sell those, and it definitely feels like it these days.
Once a company eats from the fruit of the "ads" tree... they tend to degrade into "awful" from the user side, because the user stops being the primary customer - the conflict of interest there is unavoidable.
Apple is tucking in... https://ads.apple.com/
From my experience, some examples: for Gentoo you are much more than a janitor, you must be everything all the time; for RedHat-based distros you can get a major headache with some version upgrades; for Arch (currently using, same install for 7 years) update monthly and I've had very few and minor issues.
I tried running various Linux distros on my desktop some years ago and definitely agree on the crap-out experience and having to reinstall. Eventually settled on macOS and it's been okay.
The game changer for me has been Nix. It works on macOS. I have had coworkers use it on Ubuntu. I am soon planning to switch to NixOS.
People complain about the syntax but honestly AI gets you around that. You will still do janitorial work, but you mostly only need to do it once.
{
nix.gc = {
automatic = true;
dates = "weekly";
options = "--delete-older-than 30d";
};
}
Pin all their apps in favorites and they will persist through updates. Updates don't overwrite desktop shortcuts either (although, as on other OSes, a couple might be added that need to be removed). This might be more difficult in GNOME; I wouldn't know, since I am firmly in the KDE camp.
To stay as up to date as possible, use the mozilla apt repo:
https://support.mozilla.org/en-US/kb/install-firefox-linux#w...
Perfect analogy. I'm using Debian for a few months now on my main laptop, and everything is flawed. Seriously, everything.
- Hybrid graphics simply doesn't work. The exception is when it works. Don't even try Wayland with it.
- Graphics card handling is still full with race conditions. It's random when everything works as intended without manual intervention.
- Switching monitors is pain. Sometimes works, sometimes doesn't. Waking up my laptop with a new monitor plugged in is a gamble.
- Energy efficiency was bad with hybrid graphics, but since I had to turn it off, I don't even try to optimize it since.
- It was a pain to make my laptop speakers work. A lot of searching, and applying random fixes until one worked (in reality two fixes together).
- My main bluetooth headset has a feature to mute itself, or stop the music, when it's not on my head. Guess which is the only device I have that has a problem with this? The funny thing is that it's random again. The sound comes back fully about 10% of the time. Another 10% of the time, the sound from some apps comes back and from others doesn't. The other 80% of the time, I have to reconnect it.
- Don't even talk about printers. It's a gamble, again. Some printers worked at some point in time, some simply don't work, and never will, because nobody cares about them anymore enough.
- Game performance is simply worse than on Windows. First of all, it wasn't trivial to force some games to use my GPU when I had hybrid graphics; the internet is full of outdated information. But even after that, my FPS is consistently worse. I've heard others report the opposite experience, but this tells me again that the whole thing is a gamble, probably also depending on the game.
- When I press the power button to put it to sleep, or initiate a normal shutdown, I need to force shutdown the whole laptop. Sometimes I get a notification that the text editor is preventing shutdown, asking whether I want to force quit it, but it doesn't matter which option I click, and the "it will be force quit in 60 seconds if I don't select something" is a lie: the whole X session gets killed after a few seconds, and the laptop remains powered on, with the lie "the computer will be shut down now" in the terminal. This happens even when I don't get a notification that something would prevent power off. Initiating shutdown from the OS menu works, and closing the lid puts it to sleep.
And this is my current laptop. I simply couldn't use my previous one with Linux because of some stupid problem with the video card, which I couldn't solve in months. Even installation was a challenge.
I've used Linux on and off over the past 25 years. It's getting better, but there's still a long way to go. You need some janitorial work with Windows too, especially nowadays, but it's still a way better experience to click "leave me alone" once a month than this constant tinkering and daily annoyance. I want to build things, not fix things that should just work.
That said, I am running Debian Trixie with Wayland / KDE / CUPS / NVIDIA / etc. and do not have any of your problems: my graphics work, my printers work, my Bluetooth works, sleep works. They all required a lot more configuration than the last several versions of Ubuntu required (which shouldn't be the case when a better example is right next door), but none of the issues are persistent.
Fully open source drivers using AMD video cards. It just works (minus the early x11/wayland debacle, I had to switch back to x11 for a while).
What does that even mean though? Under what circumstances? In what way?
> working professionally on Linux for many years.
Not enough to say which distribution... or do you mean you do kernel development work?
Well said, and in the tech community that's predominantly Apple. We need to change this.
(I'm typing this on my Linux desktop right now... but also have a separate Windows PC for running the games I want to run that don't work on Linux yet. When they work, I'll be thrilled to put Linux on that machine or its successor.)
For many people, this is called "barely working at all." And as I get older I am becoming one of those people so quickly.
That said, tech folk routinely underestimate how much they rely on their own technical skill. Try using Linux for a week without ever opening a terminal. Terminal is a "f this I'm going back to Windows" button for most people.
Plus I've never been that FOSS religious; any system with some POSIX compatibility is good enough for me.
Many games refuse to run in VM, even if that VM is windows one. I bet there is a trick to bypass, but then you are at risk of being banned or can't receive support when needed.
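For what it's worth, on libvirt/QEMU the usual trick is hiding the hypervisor's signature from the guest. The elements below are real libvirt options, but whether they defeat any given game's detection is not guaranteed, and using them with anti-cheat-protected games may well breach the game's terms of service:

```xml
<!-- Fragment of a libvirt domain definition (edit with `virsh edit`). -->
<features>
  <hyperv mode='custom'>
    <!-- Report a custom vendor id instead of the KVM default
         (max 12 characters). -->
    <vendor_id state='on' value='NotKVMHere'/>
  </hyperv>
  <kvm>
    <!-- Hide the KVM hypervisor signature from CPUID. -->
    <hidden state='on'/>
  </kvm>
</features>
```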
That isn't weird. It's by design. MacOS is only designed to run on Apple hardware, and a VM, even if the host is Apple hardware isn't really Apple hardware.
The amount of hate spewed at FOSS is astounding really. People are literally giving you shit for free. Chill out.
Linux would be the desktop of choice years ago if anything from Adobe or Office actually worked on it, the two things that make the world go round. Valve has done their part to develop Proton, but there is no equal push for things people can't do work without.
- F1 2024 didn't load due to anti-cheat
- Dragon's Dogma 2 and Resident Evil 4 had non-functional raytracing
- Cyberpunk 2077 with raytracing on consistently crashes when reloading a save game
- Dying Light 2 occasionally freezes for a whole minute
- Starfield takes 25 minutes to compile shaders on first run, and framerates for Nvidia are halved compared to Windows
- Black Myth: Wukong judders badly on Nvidia cards
- Baldur's Gate 3 Linux build is a slideshow on Nvidia cards, and the Windows build fails for some AMD cards
If you research these games in discussion forums, you can find some configuration tweaks which might fix the issues. ProtonDB's rating is not a perfect indicator (BM:W is rated "platinum").
And while Steve says measurements from Linux and Windows are not directly comparable, I did so anyway and saw that Linux suffers a 10-30% drop in average FPS across the board when compared to Windows, depending on the game and video card.
Honestly, considering where we came from, a 10-30% perf drop is good and is a reasonable tradeoff to consider. Especially for all the people that don't want to touch Windows 11 with a 11-foot pole (which I am), it's a more than decent path. I can reboot into my unsupported Win10 install if I really need the frames.
Really, Linux benchmarks need to be split between AMD and NVIDIA. Both are useful, as the "just buy an amd card lol" crowd is ignoring the actually large NVIDIA install base, and it's not like I'm gonna swap out my RTX 3090 to go Linux.
Thanks for the comparison! Would you have an apples to apples, or rather an NVIDIA to NVIDIA comparison instead of "across the board"? I'd suspect the numbers are worse for the pure NVIDIA comparison, for what I mentioned above.
You are either trolling or completely out of your mind. You simply cannot be serious when saying stuff like this.
I still have the windows install. And with an RTX 3090, framerate is not that much of a consideration for most games, especially since my main monitor is "only" 1440p, albeit a 144Hz one.
Couple that with G-Sync and framerate fluctuations aren't really noticeable. Gone are the days where dipping below 60Hz is a no-no. The most important metrics are stutter and 1% lows; those will really affect the feel of your game. My TV is 120Hz with G-Sync too, and couch games with a controller are much less sensitive to framerate.
Do I leave performance on the table? Surely. Do I care? In the short term, no. The last GPU-intensive games I played are Hogwarts Legacy and Satisfactory, both of which can take a hit (Satisfactory does not max the GPU, and Hogwarts can suffer DLSS). The next intensive game I plan on playing is GTA VI, and by that time I'd fully expect the perf gap to have closed, and the game to play fine, given how much care Rockstar puts into the performance of their games, more so with the Gabe Cube being an actual target.
In the long run, I agree this is not a "happy" compromise. I paid for that hardware dammit. But the NVIDIA situation will be solved by the time I buy a new GPU: either they completely drop out of the gaming business to focus on AI, or they fix their shit because Linux will be an actual gaming market and they can't keep giving the finger to the penguin.
The Start menu works great with no lag, even immediately after booting.
The only thing that I consider annoying would be the 'Setup' screens that sometimes show up after bigger updates.
---
Would I trade it all to get on Bazzite DX:
- lower game compatibility and potential bugs
- subpar NVIDIA drivers with the risk of performance degradation
- being restricted to development in dev containers via VS Code Remote
- Loss of the Backblaze Unlimited plan
+ system rollbacks if an update fails
---
That does not seem worth it to me.
The very fact that this has to be explicitly mentioned is laughable.
Even $100 Chinese phones can achieve the same; this is the bare minimum for a modern system capable of driving a 240Hz monitor (I assume it can do so with most games).
I'm testing daily-driving my main rig (high-end from a few years ago, 5900x + 3090), and honestly I'm rediscovering my computer. A combination of less fluff, fewer animations, better fs performance (NTFS on NVMe is suboptimal), etc. I was getting fed up with a few Windows quirks: weird updates breaking stuff, weird audio issues (e.g. the audio subsystem getting ~10s latency for any interaction like playing a new media file or opening the output switcher), weird display issues (computer locking up when powering on/off my 4k TV), and whatnot. I'm still keeping the w10 install around, as having an unsupported OS is less of a problem for the occasional game, especially since I mostly play offline games.
As for the dev env, you're not limited to Bazzite; I run Arch. Well, I've been running it for two weeks on the rig. But you really do get the best devex with Linux.
That's on a 7950X3D with 64 GB RAM and a Samsung 990 Pro SSD. Maybe it performs worse on slower hardware.
I have 14 TB of SSDs connected, so it's not like there is no content on my PC.
Notably I don't have any HDDs connected, maybe that plays a role here.
I'm curious, did you clean up what's in the start menu by default? Stuff like "recommended", "candy crush", and the like? On the win11 I tested, those parts loaded slower than the rest; I wonder if the start menu has a "load then open" timeout.
Had I switched to win11 I'd have slapped Classic Shell on it, as I did on win10. It's a reimplementation of the win7 start menu with windows-version-appropriate design, but with win7 reactivity (it opens literally the next frame, in no small part thanks to the absence of animation).
I don't think it made a difference, it was already lag free before.
It's annoying they put Office Copilot and Instagram there, but it uninstalled with just two clicks per item, taking a minute or so to get rid of everything.
I played Baldur's Gate 3 on Linux on a GeForce GTX 1060 (which is almost 10 years old!) without a fan (I found out later that it was broken) and I generally did not have issues (a couple of times in the whole game it slowed down for a couple of seconds, but nothing major).
Which applies to basically all games. Nowadays I make sure to select Proton before even running a game for the first time, in case it has a Linux build -- that will invariably be the buggier experience, so I want to avoid it.
That's not even limited to Linux or gaming. A few weeks ago I tried to apply the latest Windows update to my 2018 Lenovo ThinkPad. It complained about insufficient space (I had 20GB free). I then used a USB stick as swap (required by Windows) and tried to install the update. Gave up after 1 hour without progress...
Hardware+OS really seems unfixable in some cases. I'm 100% getting a macbook next time. At least with Apple I can schedule a support appointment.
Not at all my experience, which makes me question the rest. Also, per https://www.protondb.com/app/1086940 most people seem quite happy with it, so it's not a "me" problem.
Finally, the "10-30% drop in average FPS across the board" might be correct, but so what? I understand a LOT of gamers want "the best" performance for the good money they paid, but pretty much NO game becomes less fun with even a 30% FPS drop; you just adjust the settings and go play. I think a lot of gamers get confused and treat maximizing performance as a game in itself. That can be fun, and that's 100% OK, but it's also NOT what playing an actual game is about.
There's a few reports there for the native version of the game: https://www.protondb.com/app/1086940#9GT638Fuyx , with similar Nvidia GPU issues and a fix.
I mostly play fighting games. A 7% drop in FPS is more than enough to break the whole game experience, as combos rely on frame data. For example, Street Fighter 6 is locked at 60 fps. A low punch takes 4 frames to come out and leaves a 4-frame window to land another hit. With a 7% drop in FPS, you would miss your combo. Even the tiniest drop in FPS makes the game unplayable.
It's the same for almost every fighting game. I know it's a niche genre, but I'm quite sure it matters for other genres too. It's a complete dealbreaker for competitive play.
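A back-of-the-envelope sketch of why even a small drop matters at a fixed 60fps logic rate (hypothetical helper, not from any game engine; the 4-frame window matches the Street Fighter 6 example above):

```python
import math

def frames_rendered(base_fps, window_frames, fps_drop):
    """How many frames actually get shown inside a combo input window.

    base_fps: the game's fixed logic rate (60 for Street Fighter 6)
    window_frames: size of the link window, in game frames
    fps_drop: fractional rendering shortfall (0.07 for a 7% drop)
    """
    window_s = window_frames / base_fps          # 4 frames @ 60fps ~= 66.7ms
    effective_fps = base_fps * (1.0 - fps_drop)  # what the GPU delivers
    return math.floor(window_s * effective_fps + 1e-9)

# At a locked 60fps, all 4 frames of the window are shown...
print(frames_rendered(60, 4, 0.00))  # 4
# ...but with a 7% shortfall only 3 render, so a tight link can slip.
print(frames_rendered(60, 4, 0.07))  # 3
```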
Very true, and this is the biggest issue for me when it comes to gaming on Linux. And it's not just raw FPS count. You can usually brute force your way around that with better hardware. (I'm guessing you could probably get a locked 60 in Street Fighter 6 even with a 30% performance loss?). It's things like input lag and stutter, which in my experience is almost impossible to resolve.
If it weren't for competitive shooters, I could probably go all Linux. But for now I still need to switch over to Windows for that.
Tried running Worms: instant crash, no error message.
Tried running Among Us: instant crash, had to add cryptic arguments to the command line to get it to run.
Tried running Parkitect: crashes after 5 minutes.
These three games are extremely simple, graphically speaking. They don't use any complicated anti-cheat measures. This shouldn't be complicated, yet it is.
Oh and I'm using Arch (BTW), the exact distro SteamOS is based on.
And of course, as always, those for whom it works will tell you you're doing-it-wrong™.
Hard to say what might be going wrong for you without more details. I would guess there's something wrong with your video driver. Maybe you have an nvidia card and the OS has installed the nouveau drivers by default? Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things. This is indeed a sore spot for Linux gaming, though to be fair graphics driver problems are not exactly unheard of on Windows either.
Personally I have a bunch of machines dedicated to gaming in my house (https://lanparty.house) which have proven to be much more stable running Linux than they were with Windows. I think this is because the particular NIC in these machines just has terrible Windows drivers, but decent Linux drivers (and I am netbooting, so network driver stability is pretty critical to the whole system).
BeamNG (before a very recent native Linux beta) was gold despite a serious fps drop and also a memleak to crash any time there's traffic.
So I don't trust the ratings.
Interesting. I saw somewhere else that you're using Debian. Is that as opposed to Nouveau, or to the proprietary drivers from the Debian repos?
I'm currently testing daily-driving my desktop with Linux on an NVIDIA GPU, and the Arch wiki explicitly recommends drivers from their repos. Then again, Arch is rolling and the repo drivers are supposedly much more up to date than Debian's. I'll keep your comment in mind if I run into anything.
But I have a lot of experience on Debian and Ubuntu trying to use the packages that handle the nvidia driver installation for you. It works OK. But one day on a lark I tried downloading the blob directly from nvidia and installing that way, and I was surprised to find it was quite smooth and thorough, so I've been doing it that way ever since.
Woah, that is extremely cool. Very nice work, sir.
Crazy—it used to be that nvidia drivers were by far the least stable parts of an install, and nouveau was a giant leap forward. Good to know their software reputation has improved somewhat
Whereas, AMD just works and is thus standard recommendation.
1. I needed to install a bleeding-edge kernel version in order to get support for the very new AMD card I had purchased, which was a bit of a pain on Debian. (With NVidia, the latest drivers will support the latest hardware on older kernels just fine.)
2. AMD can't support HDMI 2.1 in their open source drivers. Not their fault -- it's a shitty decision by the HDMI forum to ban open source implementations. But I was trying to drive an 8k monitor and for other reasons I had to use HDMI, so this was a deal-breaker for me. (This is actually now solvable using a DP->HDMI dongle, but I didn't discover that solution at the time.)
But every time I've tried to use AMD the problems have been different. This is just the most recent example.
Obviously I'm using the open source drivers, since the entire point of everyone's argument for AMD on Linux is the open source part.
The root problem may just be that I'm deeply familiar with the nvidia linux experience after 25 years of using it whereas the AMD experience is unfamiliar whenever I try it, so I'm more likely to get stuck on basic issues.
I think people are still clinging to old "wisdom" that hasn't been true for decades, like "updating breaks Arch", go figure.
If you figure out how to reliably do this, you're a rich man
SteamOS is based on Arch, but customized and aimed at specific hardware configurations. It’d be interesting to know what hardware you’re using and if any of your components are not well supported.
FWIW, I’ve used Steam on Linux (mostly PopOS until this year, then Bazzite) for years and years without many problems. ISTR having to do something to make Quake III work a few years ago, but it ran fine after and I’ve recently reinstalled it and didn’t have to fuss with anything.
Granted, I don’t run a huge variety of games, but I’ve finished several or played for many hours without crashes, etc.
I've been gaming on linux exclusively for about 8 years now and have had very few issues running windows games. Sometimes the windows version, run through proton, runs better than the native port. I don't tend to be playing AAA games right after launch day, though. So it could be taste is affecting my experience.
This sounds like you are rejecting help because you have made up your mind in frustration already.
Because you are doing it wrong. If you want an OS that just works, you should use Ubuntu or Fedora. Why is SteamOS based on Arch then? Because Valve wants to tweak things in it and tinker with it themselves to get it how they like.
You don't.
So use an OS that requires less from you and that tries to just work out of the box, not one that is notorious for being something you break and tinker with constantly (Arch).
But when something crashes with no error message whatsoever, it makes it a tiny bit harder to troubleshoot.
Especially when so many people answer, just like I had predicted, "works on my machine". Which would only be a gotcha if I had implied it worked on no machine whatsoever. Which I didn't.
I'll tinker some more and I'll be sure to post my findings if I get these games to work.
One thing that I do though is get most games at least one year after release, when probably many issues are fixed. I had tons of issues many years ago, with buggy games bought immediately after release (on Windows back then), so now I changed strategy...
I'm not saying "you're doing it wrong", because obviously if you're having trouble then that is, if nothing else, bad UX design, but I actually am kind of curious as to what you're doing different than me. I have an extremely vanilla NixOS setup that boots into GameScope + Tenfoot and I drive everything with a gamepad and it works about as easily as a console does for me.
That probably includes anything that isn't a PC in a time capsule from when the game originally released, so any OS/driver changes since then, and I don't think we've reached the point where we can emulate specific hardware models to plug into a VM. One of the reasons the GeForce/Radeon drivers (e.g. the GeForce "game ready" branding) are so big is that they carry a whole catalogue of quirk workarounds for when a game's renderer is coded badly, or to better fit the hardware, which lets them advertise +15% performance in a new version. Part of the work for wine/proton/dxvk is replicating that instead of doing a blunt, strictly-to-the-standards translation.
With regards to Linux I generally just focus on hardware from brands that have historically had good Linux support, but that's just a rule of thumb, certainly not perfect.
Arch won't hold your hand to ensure everything required is installed, because many dependencies are either optional (you have to read the pacman logs) or just hidden (because it's in the game itself). Valve actually does a great job providing a "works everywhere" runtime, as their games are distributed in a flatpak-like fashion, but things can slip through the cracks.
The compositor can have an effect. The desktop settings. The GPU drivers. What's installed as far as e.g. fonts go. RAM setup, with or without swap.
As for SteamOS, the real difference is that despite being Arch-based, you're not installing Arch, but SteamOS: a pre-packaged, pre-configured Arch Linux with a set of opinionated software and pre-made config files, for a small set of (1) devices. It's not really Arch you're installing, but a full-blown distro that happens to be Arch-based.
That said, I understand your frustration as I've hit this many times on a laptop with dual graphics. Getting PRIME to run with the very first drivers that supported it was fun. Oh and I'm likely to hit the same walls as you since I just switched my gaming rig to Arch. GLHF!
But it is also true that many games still require minor tweaks. For example, just last week, I found out that I had to enable hardware acceleration for the webview within Steam, just to be able to log in to Halo Infinite. It was just clicking a checkbox, but otherwise, the game would not have been playable.
But I am always surprised when you find out you have those kinds of issues with Windows as well.
All three games work perfectly well both on SteamOS and on my kid's PC running CachyOS, without any intervention.
There are people who make stripped-down versions of windows. Is it fair to say that because these releases exist that windows isn't "just works" either?
I'd say it pretty much "just works" except less popular apps are a bit more work to install. On occasion you have to compile apps from source, but it's usually relatively straightforward and on the upside you get the latest version :)
For anyone who is a developer professionally I'd say the pros outweigh the cons at this point for your work machine.
Interesting, I've had to switch off from Gnome after the new release changed the choices for HiDPI fractional scaling. Now, for my display, they only support "perfect vision" and "legally blind" scaling options.
Now whether or not this feature should have remained experimental is a different debate. I personally find that similar to the fact that Gmail has labeled itself beta for many years.
So on my Framework 13, I no longer have the 150% option. I can pick 133%, double, or triple. 160% would be great, but that requires a denominator of 5, which Gnome doesn't evaluate. And you can't define your own values in monitors.xml anymore.
For reference, the relevant dconf key: org/gnome/mutter/experimental-features, with the values scale-monitor-framebuffer and xwayland-native-scaling.
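To make the denominator point concrete, the scale percentages reduce like this (plain fraction arithmetic; the claim that GNOME only evaluates small denominators is from the comment above, not something this snippet verifies):

```python
from fractions import Fraction

def scale_denominator(percent):
    # 150% = 3/2 (denominator 2), 160% = 8/5 (denominator 5).
    # Per the comment above, a denominator of 5 is not evaluated by Gnome.
    return Fraction(percent, 100).denominator

print(scale_denominator(150))  # 2
print(scale_denominator(160))  # 5
print(scale_denominator(200))  # 1 -- integer scale, always fine
```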
Although it was to BSDi then, and then FreeBSD and then OpenBSD for 5 years or so. I can't remember why I switched to Debian but I've been there ever since.
I'm sat here now playing Oxygen Not Included.
Have that desktop be reachable with SSH for all your CLI and sys admin needs, use sunshine/moonlight for the remote streaming and tailscale for securing and making sunshine globally available.
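A rough setup sketch for the streaming part. Package and service names are assumptions (they vary by distro; the `sunshine` user service in particular depends on how your package installs it), but the Sunshine web UI really does live on port 47990:

```shell
# On the desktop (host):
sudo tailscale up                          # join the machine to your tailnet
systemctl --user enable --now sunshine     # start the Sunshine streaming host
# On the laptop: install Moonlight, then pair against the desktop's
# tailnet hostname via Sunshine's web UI (https://<desktop-host>:47990).
```

SSH access for admin work then goes over the same tailnet, so nothing needs to be exposed to the public internet.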
Latency is another problem; a recent LTT video showed that even as little as 5-10ms of added latency can negatively impact your gaming performance, even if you don't notice it. You begin to notice at around 20ms.
Regarding latency, this solution is meant as a way to use your notebook for any task, not just gaming. You can still play and enjoy most fps games with a mouse even at 20ms of extra latency, and you can tolerate much more when playing games with a gamepad. If you need to perform your best on a competitive match of cs2 you obviously should be on a wired connection, in front of a nice desktop pc (the very same you were using to stream to your notebook perhaps) and with a nice high refresh rate monitor. Notebooks are usually garbage for that anyways.
HP releases firmware updates on LVFS for both the ZBook and its companion Thunderbolt 4 dock(!). They also got it Ubuntu certified, like most of their business laptops.
Updated Mesa to the latest and the kernel too.
I've had Linux running on a variety of laptops since the noughties. I've had no more issues than with Windows. ndiswrapper was a bit shit but did work back in the day.
What issues have you had?
Beyond that, Lunar Lake chips are evidently really really good. The Dell XPS line in particular shows a lot of promise for becoming a strict upgrade or sidegrade to the M2 line within a few years, assuming the haptic touchpad works as well as claimed. In the meantime, I'm sure the XPS is still great if you can live with some compromises, and it even has official Linux support.
I don’t exactly understand this setup. What’s the vm tech?
Most VM software (at least all of it that I've tried) doesn't properly emulate this. Instead, after you've moved your fingers some distance, it's translated to one discrete "tick" of a mouse scroll wheel, which causes the document to scroll a few lines.
The VM software I use is UTM, which is a frontend to QEMU or Apple Virtualization framework depending on which setting you pick when setting up the VM.
This is an understatement. It is completely impossible to even attempt to install Linux at all on an M3 or M4, and AFAIK there have been no public reports of any progress or anyone working on it. (Maybe there are people working on it, I don’t know).
Sounds like the GPU architecture changed significantly with M3. With M4 and M5, the technique for efficiently reverse-engineering drivers using a hypervisor no longer works.
Thanks, I guess I stand corrected.
> There are screenshots of an M3 machine running Linux and playing DOOM at around 31:34 here
That is encouraging! Still, there is no way for a normal user to try to install it, unless something changed very recently.
Not working with Linux is a function of Apple, not Linux. There is a crew who have wasted the last half decade trying to make Asahi Linux, a distro to run on ARM macbooks. The result is after all that time, getting an almost reasonably working OS on old hardware, Apple released the M4 and crippled the whole effort. There's been a lot of drama around the core team who have tried to cast blame, but it's clear they are frustrated by the fact that the OEM would rather Asahi didn't exist.
I can't personally consider a laptop which can't run linux "top notch." But I gave up on macbooks around 10 years ago. You can call me biased.
That's fine for people on hn, but it instantly wipes out any chance of non technical users on Windows and Mac. It's a total deal breaker.
Amazing that high dpi still doesn’t work. I tried to run linux on 4k in around 2016-2017 and the experience was so bad I gave up.
Experience is slowly getting better. There is nothing I haven't been able to get to work, but with tricks or adjustments.
I think the best bonus is using LLMs in deep research mode to wade through all the blog posts, Reddit posts, etc. to discover the aforementioned tricks. Before, you had to do that yourself and it sucked. Now I get 3 good ideas from Claude, ranked by how likely they are to work => 99% of games I get running in 5 minutes with a shell command or two. Lutris is also pretty good.
Omarchy on my laptop has finally made computers fun for me again, it's so great and nostalgic. Happy to be back after my brief work-mandated adventure into MacOS.
(My customer demographic is seniors & casual users).
In fact when I read threads like this complaining about Windows, I have to remind myself that most people aren't running LTSC.
Enterprise likes to layer on multiple invasive security products, though, which do a lot worse than Defender.
Loading Teams can take minutes. I'm often late to meetings waiting for the damn thing to load.
Feels like early 90s computing, as if Moore's Law were an excuse for bad coding practices and for pushing newer hardware so that "shit you don't care about but is 'part of the system'" can do more monitoring and have more control of 'your' computer.
It’s super annoying!
On my laptop I use to write blog posts, that never ever gets plugged into a second screen? Sure, Wayland's great. On a computer that I expect normal people to be able to use without dumb problems? Hell no!
Unfortunately, Wayland inherently can't be like Pipewire, which instantly solved basically 90% of audio issues on Linux through its compatibility with Pulseaudio, while having (in my experience) zero drawbacks. If someone could make the equivalent of Pipewire for X11, that'd be nice. Probably far-fetched though.
Well you see, you are actually just silly for wanting this or asking for this, because it's actually just a security flaw...or something. I will not elaborate further.
I just installed hyprland yesterday and outside of having to switch back to i3 once to install what they had set for a terminal in their default config(kitty), I haven’t had to leave again.
I'm sure I'm not getting everything I could out of it.
1. 10bpp color depth is not supported on RGB monitors, which are the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (See: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...).
2. Common browsers like Chrome and Firefox have no real support for HDR video playback on nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can be displayed with an nVidia GPU.
Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.
The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.
My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means they're limited to 8bpp even with HDR enabled on Wayland. There's another commenter below who uses a Dell monitor and manages to get BGR input working and full HDR on nVidia/Linux.
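One way to check what your driver actually advertises, assuming the drm_info utility is installed (it dumps each plane's supported pixel formats by their DRM fourcc names):

```shell
# List the 10-bit formats exposed on your planes. If only the BGR-ordered
# variants (ABGR2101010/XBGR2101010) show up, you're hitting the
# nVidia limitation described above.
drm_info | grep -oE '[AX](RGB|BGR)2101010' | sort -u
```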
PC has Manjaro Linux with RTX 3060 12GB
Graphic card driver: Nvidia 580.119.02
KDE Plasma Version 6.5.4
KDE Frameworks Version: 6.21.0
Qt Version: 6.10.1
Kernel Version 6.12.63-1-MANJARO
Graphics Platform: Wayland
Display Configuration High Dynamic Range: Enable HDR is checked
There is a button for brightness calibration that I used for adjustment.
Color accuracy: Prefer color accuracy
sRGB color intensity: This seems to do nothing (even after apply). I've set it to 0%.
Brightness: 100%
TV is reporting an HDR signal. AVR is reporting...
Resolution: 4K VRR
HDR: HDR10
Color Space: RGB / BT.2020
Pixel Depth: 10 bits
FRL Rate: 24Gbps
I compared Interstellar in three different ways on Linux (19s into the YouTube video) against the Blu-ray (2:07:26). In Firefox 146.0.1 there is no HDR option on YouTube by default, and the 4K video clearly doesn't have HDR. I enabled HDR in Firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled. Colors then look completely washed out.
For Chromium 143.0.7499.169, HDR is enabled by default. This looks like HDR.
I downloaded the HDR video from YouTube and played it using MPV v0.40.0-dirty with the settings --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright, like the Chromium playback. This was the best of the three playbacks on Linux.
On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting...
Resolution: 4k24
HDR: Dolby Vision
Color Space: RGB
Pixel Depth 8bits
FRL Rate: no info
...I looked into this and apparently Dolby Vision uses RGB tunneling for its high-bit-depth (12-bit) YCbCr 4:2:2 data.
The Blu-ray looks like it has the same brightness range, but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s). I would say the colors overall look better on the Blu-ray.
I might be able to calibrate it better if the sRGB color setting worked in the display configuration. Also I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB color setting is fixed.
*Edit: Sorry Hacker News has completely changed the format of my text.
Also, go to YouTube and play this video: https://www.youtube.com/watch?v=onVhbeY7nLM
Do it once on "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.
The brightness you see on Plasma or Mutter is indeed related to the HDR support in the driver. But - it's not really useful for the most common HDR tasks at the moment.
Your Display Configuration
Both monitors are outputting 10-bit color using the ABGR2101010 pixel format.
| Monitor | Connector | Format | Color Depth | HDR | Colorspace |
|------------------------|-----------|-------------|-------------|--------------|------------|
| Dell U2725QE (XXXXXXX) | HDMI-A-1 | ABGR2101010 | 10-bit | Enabled (PQ) | BT2020_RGB |
| Dell U2725QE (XXXXXXX) | HDMI-A-2 | ABGR2101010 | 10-bit | Disabled | Default |
* Changed the serial numbers to XXXXXXX. I am on Wayland and outputting via HDMI 2.1, if that helps.
EDIT: Claude explained how it determined this with drm_info, and manually verified it:
> Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.
EDIT: Also note that I am slowbanned on this site, so may not be able to respond for a bit.
EDIT: You should try connecting with HDMI 2.1 (you will need a 8k HDMI cable or it will fall back to older standards instead of FRL).
EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only 1 of my monitors and I can see a big difference in the flames between them in this scene: https://www.youtube.com/watch?v=WjJWvAhNq34
Here's what I'm getting on both monitors, with HDR enabled on Gnome 49: https://imgur.com/a/SCyyZWt
Maybe you're lucky with the Dell. But as I understand, HDR playback on Chrome is still broken.
I'm actually surprised that YouTube HDR works on your side - perhaps it's tied to the ABGR2101010 output mode being available.
That's still pretty crappy. Monitor spec sheets don't say whether they accept BGR input signals as opposed to RGB.
The GPU and monitor combination does full 10-bit HDR in Windows, but on Linux it's stuck at 8bpp because the nVidia driver has no 10-bit RGB output.
EDIT: See my sibling comment.
Here's what I'm getting on an RTX 4090 / InnoCN 27M2V and Cooler Master Tempest GP27U.
Prior to that, Windows was better on laptops thanks to proprietary drivers and working ACPI. But it was pretty poor in terms of reliability, the included software was incredibly bare-bones, and the experience of finding and installing software was awful (especially if you didn't have an unlimited credit card for "big professional solutions").
Every time the year of the Linux desktop arrives, I'm baffled, since not much has changed on this end.
It's Critic's Disease: When a band moves to a major label, they "suddenly" put out their critically acclaimed masterpiece, when before nobody would review a thing they did and mocked their fanatic fans.
"Now, they've matured."
Let them have it, though. People need to rationalize their past hostility to the right thing in some way in order to progress. If you want people to say that they were ever wrong, you'll die waiting. The situation became completely intolerable where they were insisting on staying no matter what because they weren't stuck-up nerds who care about stupid stuff that no one cares about. They were finally humiliated enough to move.
They'll end up moving to weird semi-commercial distributions that market specifically to them, too, and ridicule people who criticize those distributions for being stuck-up nerds who care about stupid stuff no one cares about. As long as it doesn't break Debian, I'm cool.
Also, I was basically a child and had no idea what I was doing (I still don't, but that's beside the point). Things have definitely gotten better.
In the Red Hat 4.2 days, it was something that I was able to use because I was a giant nerd, but I'd never ever ever have recommended it to a normal person. By Ubuntu 12.04, 15 years later, it was good enough that I'd recommend it to someone who didn't do any gaming and didn't need to use any of the desktop apps that were still then semi-common. In 2026, it's fine for just about anyone unless you are playing particular (albeit very popular) games.
It's MY system and I do whatever the heck I want, from play boring stuff to weird prototyping. I get no, like literally 0, anti-feature. I'm not "scared" that an update will limit my agency. I'm just zen and that is priceless.
Also, quite importantly, it works wonderfully with all my other devices and peripherals. I switch between Bluetooth headsets easily, I switch monitors, video projectors, XR devices, CV camera inputs, I share files with KDE Connect, I receive SMS notifications from my (deGoogled) Android phone and reply from my desktop, I get a notification when my Steam Deck is low on battery, etc. ALL my devices play nicely with each other.
So yes, Linux is good now. It's been for a while but it's been even better for the last few years.
Bazzite is rough in the way that all distributions are, but I imagine Windows 11 is rougher.
In Fedora Atomic it should be trivially easy to set up a system account, with access to specific USB devices via a group, and attach a volume that can easily be written to by a non-root user inside of the container.
Oh, and also anti-cheat games forcing me to use Windows. Makes me sick to my stomach booting into Windows 11 every couple of months and having to watch my PC performance tank while it's downloading updates, Windows Defender scans, etc. for 30 minutes
Mesa, the kernel drivers and Proton have all seen a lot of growth this past year combined with a bunch of garbage decisions MS has doubled down on... not to mention, enough Linux users in tech combined with Valve/Steam's efforts have made it visible enough that even normies are considering giving Linux a try.
Debian is a breath of fresh air in comparison. Totally quiet and snappy.
People who don't use Debian misunderstand Stable. It's released every two years, and a subset of the software is kept up to date in Backports. For anything not included in Backports, it's trivial to run Debian Testing or Unstable in a chroot on your Stable machine.
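A rough sketch of that chroot setup on a Stable host. The commands are standard Debian tooling; the chroot path is an arbitrary choice, and for day-to-day use you'd likely wrap it with schroot or systemd-nspawn instead of bare chroot:

```shell
sudo apt install debootstrap
# Bootstrap a minimal Testing system into a directory of your choosing:
sudo debootstrap testing /srv/chroot/testing http://deb.debian.org/debian
# Enter it and apt install whatever newer software you need:
sudo chroot /srv/chroot/testing /bin/bash
```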
I moved to Debian Stable ~20 years ago because constant updates in other distros always screwed up CUPS printing (among other things). Curiously, I was using Ubuntu earlier this year and the same thing happened. Never going back.
> The drivers included are just too old.
This can usually be fixed by enabling Debian Backports. In some cases, it doesn't even need fixing, because userland drivers like Mesa can be included in the runtimes provided by Steam, Flatpak, etc.
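Enabling Backports is a one-line sources entry plus an explicit target release at install time. This assumes the current stable codename is "trixie"; substitute whatever release you're on, and the package name is just an example:

```shell
echo 'deb http://deb.debian.org/debian trixie-backports main' \
  | sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update
# Backports packages are never pulled in automatically; you opt in per package:
sudo apt install -t trixie-backports linux-image-amd64
```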
Once set up, Debian is a very low-maintenance system that respects my time, and I love it for that.
Ubuntu seems to be slowly getting worse.
- Firefox seems to be able to freeze both itself and, sometimes, the whole system. Usually while typing text into a large text box.
- Recently, printing didn't work for two days. Some pushed update installed a version of the CUPS daemon which reported a syntax error on the cupsd.conf file. A few days later, the problem went away, after much discussion on forums about workarounds.
- Can't use more than half of memory before the OOM killer kicks in. The default rule of the OOM killer daemon is that if a process has more than half of memory for a minute, kill it. Rust builds get killed. Firefox gets killed. This is a huge pain on the 8GB machine. Yes, I could edit some config file and stop this, but that tends to interfere with config file updates from Ubuntu and from the GUI tools.
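For reference, if it's systemd-oomd doing the killing, its thresholds live in /etc/systemd/oomd.conf and can be raised there rather than fighting the GUI tools. The values below are illustrative, not recommendations:

```ini
# /etc/systemd/oomd.conf
[OOM]
SwapUsedLimit=90%
DefaultMemoryPressureLimit=80%
DefaultMemoryPressureDurationSec=60s
```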
None of these problems existed a year ago.
But you can adjust your own system. It'd be unhelpful of me to suggest to an unhappy Windows user that they should switch to another operating system, as that demands a drastic change of environment. On the other hand, you're already familiar with Linux, so the switching cost to a different Linux distribution is significantly lower. Thus I can fairly say that "Ubuntu getting worse" is less of a problem than "Windows getting worse." You have many convenient options. A Windows user has fewer.
The first time I switched to the asus kernel from the generic one it was magic. I know asus-linux exists and following its instructions probably would have ended up with a working system, but with Bazzite I ran only one command and everything worked. It still feels weird not to monkey around with package installations (which was a dangerous path that usually ended up making more work for me), but this is a tradeoff I can live with. The software I use, luckily, has already moved to Flatpak, so everything was a breeze. Also, the fact that I can switch back to a working state with one keypress is a stress reliever.
I agree. Linux is good now, for the common user. I still can't see how immutable distros could work for all scenarios, but for gaming and home use this is an approach I can easily recommend to friends and family who just want a computer that works without messing with a console.
E.g. three weeks ago Nvidia pushed bad drivers which broke my desktop after a reboot. I had to switch to a TTY (Ctrl-Alt-F3 etc.), since I never got into GNOME at all, and roll back to an earlier version. Automatic rollback of bad drivers would have saved me here.
Are Radeon drivers less shit?
Then again, Arch is one of those distros with the attitude that you need to be somewhat engaged in and responsible for the ongoing maintenance of your system, which is why I'm against blind "just use (distro)" recommendations unless the distro is very basic and makes few assumptions about the user.
[0] https://old.reddit.com/r/archlinux/comments/1prm8rl/archanno...
A couple of months ago I bought a second hand RX 7800 XT, and prepared myself for a painful experience, but I think it just worked. Like I got frustrated trying to find out how to download and install the driver, when I think it just came with Linux Mint already.
It doesn't feel real sometimes. My dotfiles are modularized, backed up in Github and versioned with UEFI rollback when I update. I might be using this for the rest of my life, now.
One just needs to make sure to use the proper _rsync_ options to preserve hard links, or files will be duplicated.
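A quick demo of the difference: rsync's `-H` flag is what preserves hard links; plain `-a` does not.

```shell
# Create a source tree where a and b are two names for the same inode:
src=$(mktemp -d); dst=$(mktemp -d)
echo data > "$src/a"
ln "$src/a" "$src/b"           # link count of the inode is now 2

# -a alone would copy a and b as two independent files;
# adding -H carries the hard link over:
rsync -aH "$src/" "$dst/"

stat -c %h "$dst/a"            # prints 2: the hard link survived
```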
It's not advisable to switch to one of these paranoid configurations outright, but they're a great introduction to the flexibility provided by the NixOS configuration system. I'd also recommend Xe's documentation of Nix Flakes, which can be used on any UNIX-like system including macOS: https://xeiaso.net/blog/nix-flakes-1-2022-02-21/
However, despite really, really wanting to switch (and having it installed on my laptop), I keep finding things that don't quite work right, which is preventing me from switching some of my machines. On my living room PC, which is what my TV is connected to, the DVR software that runs my TV tuner card doesn't quite work right (despite having a native Linux installer), and I couldn't get channels to come through as clearly or as easily. I spent a couple of hours troubleshooting and gave up.
My work PC needs the Dropbox app (which has a Linux installer), but it also needs the "online-only" functionality so that I can see and browse the entire (very large) Dropbox directory without having it all stored locally. This feature has been requested on the Linux version of the app for years, and Dropbox appears unlikely to add it anytime soon.
Both of these are pretty niche issues that I don't expect to affect the vast majority of users (and the dropbox one in particular shouldn't be an issue at all if my org didn't insist on using dropbox in a way that it is very much not intended to be used, and for which better solutions exist, but I have given up on that fight a long time ago), and like I said, I've had linux on my laptop for a couple of years so far without any issue, and I love it.
I am curious how many "edge cases" like mine exist out there though. Maybe there exists some such edge case for a lot of people even while almost no one has the same edge case issue.
But some of the drawbacks really aren't edge cases. Apparently there is still no way for me to have access to most creative apps (e.g. Adobe, Affinity) with GPU acceleration. It's irritating that so few Linux install processes are turnkey the way they are for Windows/Mac, with errors and caveats that cost less-than-expert users hours of experimenting and mucking with documentation.
I could go on, but it really feels like a bad time to be a casual PC user these days, because Windows is an inhospitable swamp, and Linux still has some sharp edges.
One big plus with Linux, it's more amenable to AI assistance - just copy & paste shell commands, rather than follow GUI step-by-steps. And Linux has been in the world long enough to be deeply in the LLM training corpuses.
Why are you holding computing to the same standard?
I'm curious as a non-gamer. The article seems to be entirely about Windows gaming.
Linux claims to be a general computing operating system, but had historically not prioritized gaming (or UI…). This has changed and the article notes as much. Windows OS is used for a benchmark because it is still the gold standard, and the OS that most games are intended to be executed with.
If it did, we would.
> Why are you holding computing to the same standard?
What?
> The article seems to be entirely about Windows gaming.
It is. It'd be weird if it wasn't.
The Linux world is amazing for its experimentation and collaboration. But the fragmentation makes it hard for even technical people like me who just want to get work done to embrace it for the desktop.
Ubuntu LTS is probably the right choice. But it's just one more thing I have to go research.
If using Ubuntu LTS for gaming, you might want to add a newer kernel: https://ubuntu.com/kernel/lifecycle
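Concretely, the supported way to do that is the HWE (hardware enablement) metapackage; a sketch assuming 24.04:

```shell
# Tracks the newer kernel series Ubuntu backports to the LTS release:
sudo apt install --install-recommends linux-generic-hwe-24.04
```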
Linux Mint would also be a reasonable pick.
I haven't tried Bazzite because I'm not into gaming but Linux Mint is working very well for a lot of people coming from Windows. It just works and has great defaults. Windows users seem to pick it up pretty easily.
Also, Linux Mint upgrades very well. I've had a lot of success upgrading to new versions without needing to reinstall everything. Ubuntu and other distros I've tried often have failed during upgrading and I had to reinstall.
Any reasonably popular distro will have enough other users that you can find resources for fixing hitches. The deciding factor that made me go with EndeavourOS was that their website had cool pictures of space on it. If you don't already care then the criteria don't need to be any deeper than that.
Once you use it enough to develop opinions, the huge list of options will thin itself out.
I think it will probably change at some point, but until then I just can't use it.
The ONLY thing I'm still having trouble with under Linux is Steam VR on the HTC Vive. It works. Barely.
So I installed Fedora on my work machine and find that I can still get all of my work done. Well except the parts that require testing accessibility on Windows screen readers or helping with Windows-related issues.
The only thing I miss now are the many addons made for NVDA, especially the ones for image descriptions. But if I can get something to work with Wayland, I could probably vibe code some of them. Thank goodness for Claude Code.
After a particularly busy OSS event, a non-programmer friend of mine asked me: why do the Linux people seem so needy for everyone to make the same choices they make? Trying to answer that question changed my perspective on the entire community. And here we are, after all these years, and the same question seems to still apply.
Why are we so needy for ALL users and use cases to be Linux-based and Linux-centric once we make that choice ourselves? What is it about Linux? The BSD people don't seem to suffer from this, and I've never heard anyone advocate for migrating to OSX despite it being superior for specific use cases (like music production).
IMO if you're a creator, operating systems are tools; use the tool that fits the task.
Examples:
- An important document is sent to me in a proprietary format
- A streaming service uses a DRM service owned by a tech giant that refuses to let it work with open source projects
- A video game developer thinks making games work on Linux isn't worth getting rid of rootkit anticheat
The downside is Windows users would have to live in a world without subscription-based office suites, locked down media, and letting the CCP into your ring 0.
I do understand the evangelism being obnoxious. I don’t advocate for people to switch if they have key use cases that ONLY windows or OS X can meet. Certainly not good to be pushy. But otherwise, people are really getting a better experience by switching to Linux.
The community aspect of free software both pushes for more people to participate and, often, for other groups to be excluded as "wrong" or "evil".
But that community only offers secondary benefits to those who are authors or painters or photographers rather than software developers - economic factors, risk aversion, functionality, and so on. The FLOSS communities are almost invariably driven toward hobbyists and developers rather than authors, artists, gamers, and the like - people whose interest lies outside of tinkering with and/or improving software.
The BSDs were never really a movement in that sense, and macOS is still just a product even if there are enthusiastic users of them both.
Similarly on the Linux side: Android, Steam Deck, and countless IoT devices are examples of successful products where the Linux aspect of them is not really even advertised.
Software freedom is A Good Thing.
And if you want my help, don't ask me to support the garbage that $OSCORP is foisting on you.
This is the sort of question an apolitical person would ask a liberal (I am aware liberalism has been tainted in recent times): why are you people so needy, constantly preaching about democracy?
Linux/x86 still is poor for battery life compared to Apple.
That’s my impression anyway.
It's a slow moving evergreen topic perfect for a scheduled release while the author is on holiday. This is just filler content that could have been written at any point in the last 10 years with minor changes.
I've not seen anything like the current level of momentum, ever, nor this level of mainstream exposure. Gaming has changed the equation, and 2026 will be wild.
On the other hand, on the Linux side, we had the release of COSMIC, which is an extremely user-friendly desktop. KDE, Gnome, and others are all at a point where they feel polished and stable.
1. 'office' cloud services - now you just need a browser for majority of docs/sheet/slides tasks
2. gaming - while it was possible back then, it was really hit or miss from game to game. Nowadays the vast majority of games work on Linux out of the box.
The level of momentum feels roughly equivalent to the era of Ubuntu coming around in the mid-2000s. We have been here before.
The bloat is astounding. This is especially egregious now that RAM costs a fortune.
To be honest, I always figured we'd make it in the long run. We're a thrifty bunch, we aim to set up sustainable organizations, we're more enshittification-resistant by nature. As long as we're reliable and stick around for long enough.
I opted to install Linux in a VM under Hyper-V on Windows to avoid hassles with the dual GPUs in my ThinkPad P52, but this comes with several other hassles I'd like to avoid. (Like no GPU access in Linux at all...)
https://learn.microsoft.com/en-us/windows/wsl/tutorials/gui-...
One thing that can be annoying is how quickly things have moved in the Linux gaming space over the past 5 years. I have been a part of conversations with coworkers who talk about how Linux gaming was in 2019 or 2020. I feel like anyone familiar with Linux will know the feeling of how quickly things can improve while documentation and public information cannot keep up.
Windows for gaming, Ubuntu as desktop default, Arch Linux on Laptop, MacOS on the other Laptop.
Games are not the problem, but that one game I want to play on a Saturday evening when I have time is the problem. The one I haven't tried out yet.
So far all the games I want to play run really well, with no noticeable performance difference. If anything, they feel faster, but it could be placebo because the DE is more responsive.
Wayland took a decade to become mostly usable, and it still has rough edges; the Flatpak sandbox is really rough, and most things are designed by amateurs.
Still, Windows destroying itself has made the gap closer than ever; right now is a great chance to gain market share, funding, and professionals.
Ubuntu’s default desktop felt unstable in a macOS VM. Dual-booting on a couple of HP laptops slowed to a crawl after installing a few desktop apps, apparently because they pulled in background services. What surprised me was how quickly the system became unpleasant to use without any obvious “you just broke X” moment.
My current guess: not Linux in general, but heavy defaults (GNOME, Snap, systemd timers), desktop apps dragging in daemons, and OEM firmware / power-management quirks that don’t play well with Linux. Server Linux holds up because everything stays explicit. Desktop distros hide complexity and don’t give much visibility when things start to rot.
Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
For certain timeperiods I have needed to switch to Fedora, or the Fedora KDE spin, to get access to more recent software if I'm using newer hardware. That has generally also been pretty stable but the constant stream of updates and short OS life are not really what I'm looking for in a desktop experience.
There are three issues that linux still has, which are across the board:
- Lack of commercial mechanical engineering software support (CAD & CAE software)
- Inability to reliably suspend or sleep for laptops
- Worse battery life on laptops
If you are using a desktop and don't care about CAD or CAE software I think it's probably a better experience overall than windows. Laptops are still more for advanced users imho but if you go with something that has good linux support from the factory (Dell XPS 13, Framework, etc.) it will be mostly frictionless. It just sucks on that one day where you install an update, close the laptop lid, put it in your backpack, and find it absolutely cooking and near 0% when you take it out.
I also have never found something that gave me the battery life I wanted with linux. I used two XPS 13's and they were the closest but still were only like 75% of what I would like. My current Framework 16 is like 50% of what I would like. That is with always going for a 1080p display but using a VPN which doesn't help battery life.
My experience with FOSS has mostly been that mature projects with any reasonable-sized userbase tend to break things in updates less often than proprietary software does, whether it's an OS or just some SaaS product. YMMV. However, I think the most potent way to avoid problems like this ever mattering is a combination of doing my updates manually (or at least on an opt-in basis) and being willing to go back a version if something breaks. Usually this isn't necessary for more than a week or so for well-maintained software, even in the worst case. I use Arch with downgrade (which lets you go back and choose an old version of any given package) and need to actually use it maybe once a year on average, less in the last five.
No, not really. A Linux desktop with a DE will always be slower and more brittle than a headless machine due to the sheer number of packages/components, but something like Arch + Plasma Shell (without the whole KDE ecosystem) should be very stable and snappy. The headaches caused by immutable distros and Flatpaks are not worth it IMO, but YMMV.
I've run Void Linux + Xmonad for many years without any such issues. I also recently installed CachyOS for my kid to game on (KDE Plasma) and it works super well.
Not really, no. What did you install that slowed things down?
> If yes, what actually works long-term?
Plain ordinary Ubuntu 24.04 LTS, running on an ancient Thinkpad T430 with a whopping 8GB of RAM and an SSD (which is failing, but that's not Linux's fault, it's been on its way out for about a year and I should probably stop compiling Haiku nightlies on it).
Can you give an example of which desktop apps are "dragging in daemons"?
If you think GIMP is terrible, you'll hate something like DaVinci Resolve.
Despite this, Linux as an ecosystem has numerous problems. The "wayland is the future" talk annoys me a lot. The Wayland protocol was released in 2008; that's almost 20 years ago now. I don't feel Wayland is ever going to win a "linux desktop of the year" award. Things that work on xorg-server still do not work on Wayland, and probably never will. I am not saying Wayland is useless; I ran it for a while on KDE (though I don't actually use KDE, I typically use icewm), but it is just annoying how important things like the GUI on Linux simply suck. In many ways Linux is a server operating system rather than a desktop one. I use it as a desktop, but the design philosophy caters much more to servers and compute workstations.
Also, GTK ... this thing keeps on getting worse and worse with every new release. I have no idea what they are doing, but I have an old GTK2-based editor and this one consistently works better than the GTK3 or GTK4 ported version. It was a huge mistake to downgrade and nerf GTK to a gnomey-toolkit only. Don't even get me started on GNOME ...
The success measurements are quite strange. How am I supposed to think Linux is finally good when 96.8% of users do not care to adopt it. I can't think of anything else with that high of a rejection rate. The vast majority do not consider it good enough to use over Windows.
Linux still suffers from the same fragmentation issue: you want to play games, use distro X; average web browsing and office work, use distro Y; programming, use distro Z. Of course each of them can do what the others can, but the community has decided that's the way it is.
Yesterday I read a reddit thread about a user sharing his issue with Pop!_OS, and most (if not all) comments said he was using the wrong distro. He was using the latest release (not the nightly build), which is a reasonable thing to do as a new user.
Not sure if Linux Mint has changed this, but I remember having to add the "non-free" repo to use the official Nvidia driver. Not a big deal for people who know what they are doing, but still, that is unnecessary friction.
Not the best, but works for me.
I put CachyOS on it. Using Steam, just run the game's installer after adding it as a non-Steam game to your library; you select which Proton you want (cachyos-proton) from a dropdown in the game's Properties in the Steam library. That's it.
it's lightweight, arch (I ditched manjaro), runs KDE and games perfectly, cursor IDE runs great, VMs run great.
first thing I did when I got it from fedex was remove Windows and put Linux on it. I thought 'maybe I'll just bite the bullet and sign up for a Microsoft cloud account to be able to access ..my desktop' and a quarter of the way through its install I held the power button and popped a flash drive in. just say no to windows and you'll all be happy, trust me.
the only effort it required was for me to say f this on using Lutris and just use Steam as the wrapper.
2026 is definitely the year for linux. every year is. valve heavily invested in Arch, proton, and is using Linux on their devices and honestly: Windows is spyware, and after their vibe coded jank 25H2 update that broke a ton of things and Windows 10 being EOL, I hope more people get to enjoy throwing Ventoy on a USB stick with a bunch of linux isos copied over to it and boot and play with what they love.
so I disagree, 2026 is the year for Linux, and Linux is love.
We've reached a point where Microsoft's greed and carelessness are degrading Windows from all angles. With the constant forced Copilot, forced sign-ups, annoying pop-ups, and ads, it is figuratively unusable; in the case of machines stuck on Windows 10 it is literally unusable.
They are now banking entirely on a captive market of Enterprise customers who have invested too much to quit. The enshittification is feature complete.
Using hardware at least 6-12 months old is a good way to get better compatibility.
Generally Linux drivers only start development after the hardware is available and in the hands of devs, while Windows drivers usually get a head start before release. Brand new hardware on a LTS (long term support) distro with an older kernel is usually the worst compatibility combo.
This is more about what you choose as your operating environment, not what your work imposes as your working environment.
Most places of work, mine included, run Microsoft services that lock them into the ecosystem incredibly tightly.
As per the article title, "if you want to feel like you actually own your PC", this is about your PC, not the one provided to you by your workplace (since it's likely owned by them).
One thing I'm worried about in my work environment is Microsoft enforcing the web versions of Office and deprecating the stand alone desktop applications. The web versions are a massive step down in terms of functionality and ease of use. Your mention of OWA makes me feel as if that is what Outlook will be sacrificed for at some point in the future anyway.
I have the same concern regarding the Outlook desktop client. I briefly used the web based one and it is way less convenient in a work setting.
But yes, this is a possibility, or accessing Windows via RDP. The loss would be the "always-handy" kind of setup, where Outlook is a click away and pops up its calendar reminders.
We need to address the problems rather than pretending it's already great
Shutting down your laptop and having to wait five minutes for systemd to shut down because of some timeout, when you need to catch your flight, is just one of those reasons you end up going back to Windows.
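If it helps anyone: that five-minute wait is usually one unit's stop timeout. Assuming systemd, a global cap can be set in /etc/systemd/system.conf (15s here is an arbitrary choice, not a recommendation):

```ini
# /etc/systemd/system.conf
[Manager]
DefaultTimeoutStopSec=15s
```

Per-unit TimeoutStopSec settings still override this, but it catches the common "a stop job is running" hang.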
Apart from systemd, tell me what isn't great. The whole benefit of the ecosystem is that you can pick and choose your system components, and if you don't like any of them, build your own. I am not trying to be dismissive; if I use Windows these days, it is for work or some errant VM to run a Windows-specific app (HR Block comes to mind; I am just not willing to spend time running it any other way).
IMO the next important unblocker for Linux adoption is the Adobe suite. In a post-mobile world one can use a tablet or phone for almost any media consumption. But production is still in the realm of the desktop UX and photo/video/creative work is the most common form of output. An Adobe CC Linux option would enable that set of "power users". And regardless of their actual percentage of desktop users, just about ever YouTuber or streamer talking about technology is by definition a content creator so opening Linux up to them would have a big effect on adoption.
And yes I've tried most of the Linux alternatives, like GIMP, Inkscape, DaVinci, RawTherapee, etc. They're mostly /fine/ but it's one of the weaker software categories in FOSS-alternatives IMO. It also adds an unnecessary learning curve. Gamers would laugh if they were told that Linux gaming was great, they just have to learn and play an entirely different set of games.
Yes, you can get this stuff working, but if you enjoy doing other things in life, have a job, and don't live alone, it is SSSOOOOO much easier to get a Mac mini. Or even Windows 11 if that's your thing.
I've used Mint in the past, loved it until I spent a day trying to get scanner drivers to work. Don't know if that's changed now, was 4 years ago
I am using Fedora on machines with new hardware and liking it as well. It has small pluses/minuses vs Mint.
And if you are running Chrome, and something starts taking a lot of memory, say goodbye to the entire app without any niceties.
(Yes, this is a mere pet peeve, but it has been causing me so much pain over the past year, and it's such an inferior way to deal with memory limits than what came before it. I don't know why anybody would have taken OOM logic meant for systemd services and applied it to user-launched processes.)
If anybody can help me out with a better solution on a modern distribution, that's about 75% of the reason I'm posting. But it's been a major pain, and all the GitHub issues I have found about it show big resistance to adopting better behavior like the defaults on macOS, Windows, or older Linux.
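Not a full answer, but assuming the culprit is systemd-oomd (which it usually is on recent Ubuntu and Fedora), you can inspect it and simply turn it off (root required):

```shell
# See what systemd-oomd is monitoring and its current pressure readings:
oomctl

# Disable it entirely:
sudo systemctl disable --now systemd-oomd
sudo systemctl mask systemd-oomd   # keep updates from re-enabling it
```

With it off you fall back to the kernel's own OOM killer, which only acts when memory is actually exhausted.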
Regardless, I believe EarlyOOM is pretty configurable, if you care to check it out.
If you want a distro that really cares about the desktop experience today, try Linux Mint. Windows users seem to adapt to it quite quickly and easily. It's familiar and has really good defaults that match what people expect.
Try doing less at once, or getting more memory.
If your solution is "don't ever run out of memory" my solution is "I won't ever use your OS unless forced to."
Every other OS handles this better, and my work literally requires pushing the bounds of memory on the box, whether it's 64GB or 1TB of RAM. Killing an entire cgroup is never an acceptable solution, except for the long-running servers that systemd is meant to run.
Windows is unstable even if you have more than enough memory but your swap is disabled, due to how its virtual memory works. It generally behaves much worse than others under heavy load and when various system resources are nearly exhausted.
There are several advanced and very flexible OOM killers available for Linux, you can use them if it really bothers you (honestly you're the first I've seen complaining about it). Some gaming/realtime distros are using them by default.
Of course, if it's absolutely not compatible with your work, you can just disable systemd-oomd. I'm wondering though, what sort of work are you doing where you can't tune stuff to use 95% of your 1TB of memory instead of 105% of it?
A bit hard to do now :(
https://www.phoronix.com/news/Linux-Mint-17-No-OOMD
Thanks for the suggestion!
I recently switched to using a thumb drive to transfer files to and from my phone/tablet; I became demoralized when faced with getting it all set up.
No, thank you! The "smooth, effortless [, compulsory, mandated, enforced] integration" between my Apple devices is the very worst thing about them.
If I have an issue with an application or if I want an application, I must use the terminal. I can't imagine a Mac user bothering to learn it. Linux is for people who want to maximize the use of their computer without being spied on and without weird background processes. Linux won't die, but it won't catch Windows or Mac in the next 5 decades. People are too lazy for it. Forget about learning. I bet you $100 that 99% of people on the street have never even seen Linux in their lives, or even heard of it. It is not because of marketing; it is because people who tried it returned to Windows or Mac after deciding that installing a driver or an application was too hard to learn.
Not up close due to the vast number of inconsistencies.
This could only be fixed by a user experience built from the ground up by a single company.
I get that you're making a Windows joke, but this describes Linux equally well.
The UX leaves a lot to be desired.
Even modern macs fall short of the UX Apple has traditionally been known for...
MacOS is highly consistent compared to Windows.
Perhaps Linux operating systems like Steam or ChromeOS might finally create a beautiful and consistent UI.
If I remember correctly, after the Crowdstrike BSOD-all-windows-instances update last year Microsoft wanted to make some changes to their kernel driver program and these anti-cheat measures on Windows might need to find a new mechanism soon anyway. That's a long way of saying, it's plausible that even that last barrier might come down sooner rather than later.
Some interesting reads on what modern anticheats do:
https://github.com/0avx/0avx.github.io/blob/main/article-3.m...
https://github.com/0avx/0avx.github.io/blob/main/article-5.m...
https://reversing.info/posts/guardedregions/
https://game-research.github.io/ (less in detail and less IDA pseudo)
I switched full time to Linux in 2022, when Elden Ring launched and had better performance in its first week on Linux than on Windows. I personally switched to KDE Plasma powered by Arch. The first thing I noticed was how well it worked immediately. So I'd push the title of the article one step further: Linux has been good for a while.
In the four years since switching over from a dual setup of Windows (for gaming) and Mac (for programming, web browsing, and everything else), I have had an almost entirely better experience in every single area. I still use macOS daily for work, and it constantly drives me mad. For every task I have thrown at it (from gaming to programming to game dev to photo editing), Linux just works.
On Mac it's more like, "there's an app for that". I have third party package managers on Mac. I use a third party app to display if my internet connection is using Ethernet. It yells at me to delete the CSV file that I created and requires an instruction manual with instructions for the Settings app that have changed three times in three years for how to open the file, add Bluetooth to the menu bar, etc. It even had a permanent red icon on the Settings about not being signed into an Apple ID. And once I signed in, the Settings app has a permanent red icon about paying for Apple Care. My parents have made comments about how they're worried as they get older that they won't be able to keep up with the constant updates and changes to macOS and iOS.
I don't have much to say about Windows besides good riddance. It was far less confusing to use than macOS but was filled with too much bloat and pop up notifications.
The final thing I'll mention is that the first time my girlfriend used my computer, she sat down, opened the browser, and completed her task. She thought that she was using Windows and was able to navigate the new interface without having to spend any time learning anything. For her regular use case of using the PC for an internet browser, Linux just worked. She even asked me afterwards to install it on her laptop to replace Windows! I can't believe we're in a world where that's asked by someone non-technical who just wants a computer to get out of their way so that they can perform their tasks.
I fully switched to GNU/Linux back then and have never looked back. Initially I was quite evangelical but got tired of it and gave up probably around 10 years ago thinking "oh well, their loss". But slowly more and more of the world has switched over, first servers, then developer workstations and now finally just "normal" users.
Similarly, I've always been hugely invested into my tools and have a strong distaste for manual labour. I often watch how others work and can't believe how slow and inefficient it is. Typing up repetitive syntax every time, copy/pasting huge blocks of code, performing repetitive actions when booting their PC etc. I simply haven't been doing this for my whole career, I've been writing scripts, using clever editors, using programming languages to their fullest etc.
I think this is why LLMs don't seem like such a huge breakthrough to me as they do to others. I wasn't doing this stuff manually before; that would be ridiculous. I don't need to generate globs of code because I already know how to get the computer to do it for me, and I know how to do that in a sustainable and maintainable way too. It's sad that LLMs are giving people their first real sense of control, when it's actually been available for a very long time now, and in a way that you can actually own, rather than paying for a service that might be taken away at any moment.
And it mostly works! At least for my games library. The only game I wasn't able to get to work so far is Space Marine 2, but on ProtonDB people report they got it to work.
As for the rest: I've been an exclusive Linux user on the desktop for ~20 years now, no regrets.
You know this is a psyop when they can't resist mentioning Microsoft's product
I've been using Linux full-time (no other OSes at all) for nearly 20 years. Went through all my university education using only Linux. It's problem free if you use it like a grandma would (don't mess with the base system) and even if you mess with it, most things are easily reversible.
That being said, I have noticed that the newfound interest in Linux seems to be a result of big tech being SO abusive towards its customers that even normies are suddenly into computing "freedom".
1. Look at commercial desktop OSes (Windows, macOS). They spend hundreds of millions to develop and maintain the OS, do updates, quality assurance testing, working with hundreds of thousands of hardware vendors and enterprises, etc, just to try to not break things constantly. This is with "an ecosystem" that is one stack developed by one company. And even they can't get it right. Several Linux-desktop companies have tried to mimic the commercial companies by not only custom-tailoring their own stack and doing QA, but sometimes even partnering on certified hardware. They're spending serious cash to try to do it right. But still there's plenty of bugs (go look at their issue trackers, community forums, package updates, etc) and no significant benefit over the competition.
2. There is no incentive for Linux to have consistency, quality, or a good UX. The incentive is to create free software that a developer wants. The entire ethos of the OSS community is, and has always been, I want the software, I make the software, you're welcome to use it too. And that's fine, for developers! But that's not how you make something regular people can use reliably and enjoyably. It's a hodge-podge of different solutions glued together. Which works up to a point, but then...
3. Eventually Linux desktop reaches a point where it doesn't work. The extra buttons on the new mouse you bought don't work. Or the expensive webcam you bought can't be controlled because it requires a custom app only shipped on Windows/Mac. Or your graphics card's vendor uses proprietary firmware blobs causing bugs on only Linux for unknown reasons. Or your speakers sound like crap because they need a custom firmware blob loaded by the commercial OSes. Or your touchscreen can't be enabled/disabled because Wayland doesn't support the X extensions that used to allow that to work with xrandr. Or you need to look up obscure bootloader flags, edit the bootloader, and restart, to enable/disable some obscure fix for your hardware (lcd low power settings, acpi, disk controller, or any of a thousand other issues). Or, quite simply, the software you try to install just doesn't work; random errors are popping up, things are not working, and you don't know why. In any of these cases, your only hope is... to go on Reddit and ask for help from strangers. There's no customer support line. Your ISP ain't gonna help you. The Geek Squad just shrugs. You are on your own.
And this is the most frustrating part... the extremely nerdy core fan-group, like those on HN or Reddit, who are lucky enough not to be experiencing the problems unique to Linux, gaslight you and tell you all your problems are imagined or your fault.
By your problem statements 1, 2, and 3, there is never likely to be a great desktop OS. The best that we will ever have is a compromise (no cyber pun intended).
1) every OS is buggy, 2) every OS is a hotch-potch, and 3) users end up yelling at the clouds then forced to upgrade to the next version of frustration.
> who are lucky enough not to be experiencing the problems unique to Linux, gaslight you
It isn't just luck; people use Linux every day to do their jobs and pursue their interests. But if no GNU/Linux distro works for your uses, you have whatever commercial OS you are currently using to meet your needs.
As for actual gaslighting: yikes, I hope that large groups of people are not conspiring to ruin your day. I personally react in a similar way when corporations tell me "please wait, your call is important to us, our menu options have changed".
People dual boot SSD OS for very good reasons, as kernel permutation is not necessarily progress in FOSS. Linux is usable by regular users these days, but "Good" is relative to your use case. YMMV =3
The only sane ways: either a 'correct' set of native elf/linux binaries, or proton = 0 bucks (namely only free-to-play, with 0 cent in any micro-transactions).
It's funny they would choose this phrasing.
This is exactly the way I described my decision to abandon windoze, and switch to linux, over 20 years ago...
I tried Cinnamon and while it was pleasantly customizable, the single-threadedness of the UI killed it for me. It was too easy to do the wrong thing and lock the UI thread, including with several desktop or tray Spices from the official repo.
I'm switching to KDE. Seems peppier.
Biggest hardware challenge I've faced is my Logitech mouse, which is a huge jump from the old days of fighting with Wi-Fi and sound support. Sound is a bit messy, exposing a plethora of audio devices that would be hidden under Windows (like digital and analog options for each device), and occasionally a game's compatibility with digital vs. analog output will be flaky, but I'll take it.
Biggest hassle imho is still installing non-repo software. So many packages offer a flatpak and a snap and build-from-source instructions where you have to figure out the local package names for each dependency, and they offer one .deb for each different version of Debian and its derivatives, and it's just so tedious to figure out which is the right one.
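As a small illustration of the triage involved, here's a sketch that just probes which of the common install routes a given system actually has before you go hunting for the right package format (tool names only; nothing is installed or modified):

```shell
# Check which packaging tools exist on this machine before picking a route.
# Purely a read-only probe -- nothing is installed or modified.
for tool in flatpak snap apt dnf pacman zypper; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: available"
  else
    echo "$tool: not found"
  fi
done
```

It doesn't solve the dependency-name guessing, but at least it tells you up front which of the vendor's offered formats are even candidates on your box.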
In case it helps:
Can I get a laptop to sleep after closing the lid yet?
Not that long ago the answer to these questions was mostly no (or sort of yes... but very painfully)
On Windows all of this just works.
> on windows all of this just works
Disagree on the sleep one - my work laptop doesn’t go to sleep properly. The only laptop I’ve ever used that behaves as expected with sleep is a macbook.
It’s more than fine for people to dislike Apple products but this is simply not an area where other platforms have them beat.
I'm also seeing results for "macbook pro doesn't go to sleep when lid closed", so other people see this problem too. You can't really claim that other platforms have them beat here if there isn't data to support the claim.
Your comment was written in a manner that echoes the same anti-Apple bias that's frequently found on HN. If that's not you, then it's just a misread on my part.
> You can't really claim that other platforms have them beat here if there isn't data to support the claim.
I can, because by and large those are still anecdotal experiences posted online. The deeper integration of OS/hardware due to Apple controlling the entire chain has made sleep mostly a non-issue; it's typically a misbehaving application that might prevent it. There are valid reasons an app might need to do that, so it's not like macOS is going to prevent it - but if sleep's not working right on macOS, it's typically a user error.
This is different from Linux (and Windows, to a lesser degree) where you have a crazy amount of moving parts with drivers/hardware/resources/etc.
All in all, I've given up on sleep entirely and default to suspend/hibernate now.
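For what it's worth, on systemd-based distributions the suspend-then-hibernate behaviour described above can be sketched with a config fragment like the following (the option name is systemd's; the delay value is just an example):

```ini
# /etc/systemd/sleep.conf -- a sketch; the delay value is illustrative
[Sleep]
# Suspend to RAM first, then wake briefly and hibernate after this delay
HibernateDelaySec=30min
```

You can then trigger it manually with `systemctl suspend-then-hibernate`, or wire it to the lid switch via logind, which gets you most of the "close the lid and forget it" behaviour without trusting plain suspend to hold.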
There are valid reasons why a program might need to block sleep, so it's not like macOS is going to hard-prevent it if a program does this. Most programs should not be doing that though.
Laptop sleep and suspend can still be finicky unfortunately.
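When the lid behaviour specifically is the finicky part, systemd-logind's lid handling is worth checking before blaming the kernel. A sketch of the relevant fragment (the values shown are illustrative choices, not the defaults on every distro):

```ini
# /etc/systemd/logind.conf -- a sketch; restart systemd-logind after editing
[Login]
HandleLidSwitch=suspend
HandleLidSwitchExternalPower=suspend
HandleLidSwitchDocked=ignore
```

Desktop environments can also override these settings with their own power-management daemons, which is a common source of "I configured it but nothing changed" confusion.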
I will say my experience using CAD or other CAE software on windows has gotten progressively worse over the years to the point that FEA is more stable on linux than on windows.
We do really need a Solidworks, Creo or NX on linux though. My hope has been that eventually something like Wine, Proton, or other efforts to bring windows games to linux will result in us getting the ability to run them. They are one of the last things holding me back from fully moving away from windows.
I added a couple VMs running windows, linux, and whatever else I need in proxmox w/ xrdp/rdp and remina, and it's really the best of both worlds. I travel a good deal and being able to remotely connect and pick up where I left off while also not dealing with windows nagware has been great.
On one hand we have Steam, which will make thousands of games available on an easy-to-use platform based on Arch.
For developers, we have Omarchy, which makes the experience much more streamlined, pleasant, and productive. I moved both my desktop and laptop to Omarchy and kept one Mac laptop. It's a really good experience; not everything is perfect, but when I switch back to the Mac after Omarchy, I often discover how hard the Mac is to use and how many clicks it takes to do something simple.
I think both Microsoft and Apple need some serious competition. And again, this is coming from Arch, which turned out to be more stable and serious than Ubuntu.
I also play a decent amount of Flight Simulator 2024 and losing that is almost a non-starter for switching.
turn on anticheat if you want to join no cheat sessions.
if you want a cheat game turn off anticheat and you join sessions with other cheat players.
the whole dilemma comes out of malignant users that enjoy destruction of other users ability to enjoy the game.
go nuclear on clients that manage to join anticheat sessions with cheats turned on.

As many have pointed out, the biggest factor is obviously the enshittification of Microsoft. Valve has crept up in gaming. And I think understated is how incredibly nice the tiling WMs are. They really do offer an experience which is impossible to replicate on Mac or Windows, both aesthetically and functionally.
Linux, I think, rewards the power user. Microsoft and Apple couldn't give a crap about their power users. Apple has seemed to devolve into "Name That Product Line" fanboy fantasy land and has lost all but the most diehard fans. Microsoft is just outright hostile.
I'm interested to see what direction app development goes in. I think TUIs will continue to rise in popularity. They are snappier and overall a much better experience. In addition, they work over SSH. There is now an entire overclass of power users who are very comfortable moving around different servers in a shell. I don't think people are going to want to settle for AI SaaS Cloudslop after they get a taste of local-first, and when they realize that running a homelab is basically just Linux, I think all bets are off as far as which direction "personal computing" goes. Also standing firmly in the way of total SSH app freedom are iPhone and Android, which keep pushing that almost tangible utopia of amazing software frustratingly far out of reach.
It doesn't seem like there is a clear winner in the noob-friendly distro category. It seems like they're all pretty good. The gaming distros seem really effective. I finally installed Omarchy, having thought "I didn't need it, I can rice my own Arch", etc., and I must say the experience has been wonderful.
I'm pretty comfortable at the "cutting edge" (read: with all my stuff being broken), so my own tastes in OS have moved from Arch to the systemd-free Artix or OpenBSD. I don't really see the more traditional "advanced" Linuxes like Slackware or Gentoo pulling much weight. I've heard interesting things about users building declarative Nix environments and I think that's an interesting path. Personally, I hope we see some new, non-Unix operating systems that are more data and database oriented than file oriented. For now, OpenBSD feels very comfortable; it feels like I have a prayer of understanding what's on my system and that I learn things by using it, the latter of which is a feature of Arch. The emphasis on clean and concise code is really quite good, and serves as a good reminder that for all the "memory safe" features of these new languages, it's tough to beat truly great C developers for code quality. If you're going to stick with Unix, you might as well go for the best.
More and more I find myself wanting to integrate "personal computing" into my workflow, whether that's apps made for me and me alone, Emacs lisp, custom vim plugins, or homelab stuff. I look with envy at the smalltalks of the world, like Marvelous Toolkit, the Forths, or the Clojure based Easel. I really crave fluency - the ability for code to just pour out - none of the hesitation or system knowledge gaps which come from Stack Overflow or LLM use. I want mastery. I've also become much more tactical on which code I want to maintain. I really have tried to curb "not invented here" syndrome because eventually you realize you aren't going to be able to maintain it all. Really I just want a fun programming environment where I can read Don Knuth and make wireframe graphics demos!
Instead of distro upgrades, spend 3 minutes disabling the newest AI feature using regedit.
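For instance, the Copilot toggle this alludes to can be expressed as a registry policy fragment like the one below (the key and value name reflect Microsoft's published Copilot policy as commonly documented, but verify it against your Windows version before trusting it; newer builds have moved or renamed such switches):

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\WindowsCopilot]
"TurnOffWindowsCopilot"=dword:00000001
```

Which rather proves the comment's point: opting out is a per-release scavenger hunt, not a setting.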
But, as the author rightly notes: It's more about a "feeling." Well then, good luck.
Please revert this submission to use the correct title.
Even with imperatively configured distros like Ubuntu, it's generally much easier to recover from a "screen of death" than in Windows because the former is less of a black box than the latter. This means it's easier to work out what the problem is and find a fix for it. With LLMs, that's now easier than ever.
And, in the worst case that you have to resort to reinstalling your system, it's far less unpleasant to do that in a Linux distro than in Windows. The modern Windows installer is painful to get through, and then you face hours or days of manually reinstalling and reconfiguring software which you can do with a small handful of commands in Linux to get back to a state that is reasonably similar to what you had before.
Incidentally, I can now honestly say I've had more driver issues with Windows than Linux.
The result was that from day 1 of using Linux I never looked back.
Fun aside: I had a hardware failure a few years ago on my old workstation where the first few sectors of every disk got erased. I had Linux up and running in 10 minutes. I just had to recreate the efi partition and regenerate a UKI after mounting my OS from a live USB. Didn't even miss a meeting I had 15 minutes later. I spent hours trying to recover my Windows install. I'm rather familiar with the (largely undocumented) Windows boot process but I just couldn't get it to boot after hours of work. I just gave up and reinstalled windows from scratch and recovered from a restic backup.
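A dry-run sketch of that kind of recovery, with hypothetical device names and an Arch-style toolchain assumed (the DRY_RUN guard only prints each command; drop it deliberately, and only after double-checking the device paths against `lsblk`):

```shell
#!/bin/sh
# Recovery sketch after a wiped ESP: device names and tools are assumptions.
# With DRY_RUN=1 each step is only echoed, never executed.
DRY_RUN=1
run() {
  if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run mkfs.fat -F32 /dev/nvme0n1p1     # recreate the erased EFI partition
run mount /dev/nvme0n1p2 /mnt        # mount the intact root filesystem
run mount /dev/nvme0n1p1 /mnt/boot   # mount the fresh ESP
run arch-chroot /mnt mkinitcpio -P   # regenerate initramfs/UKI for all presets
run arch-chroot /mnt bootctl install # reinstall the systemd-boot loader
```

The contrast with Windows holds precisely because every one of these steps is an ordinary, documented command; there's no opaque boot-repair wizard between you and the fix.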
Windows has recently been a complete shitshow - so even if Linux hasn't gotten any better (it has) it is now likely better than fiddling around with unfucking Windows, and Windows doing things like deleting all your files.
That's exactly my point.
There's an ever growing list of things to do in order to fix Windows, and that list is likely longer than Linux. This whole "your time is free" argument hinges on Windows not having exactly the same issue, or worse.
Linux is not good. Some hardware support is still reverse-engineering-based, or based on a few individuals best effort activity. Linux needs manufacturers' first hand commitment to quality opensource to be truly good.
Linux is not good. Some software is not on feature-parity among operating systems. With Linux being the software kingdom poor Cinderella. Linux needs software feature parity to be truly good.
Linux is not good. Too many mainstream new PCs come with some other operating system pre-installed (and paid for) even if you won't need it. Linux needs freedom of choice from the first PC power-on, like a stub to download whatever OS you want (and pay for), or the option to boot from removable media, to become truly good.
Linux is not good. There is still "stuff" that requires some specific non-Linux software, running under some specific non-Linux operating system, to do useful things. We need manufacturers to ditch this for Linux to become truly good.
I am a happy user of Linux on my primary PC since 20+ years now. But I still have to fight for my freedom every now and again because of one or more of the above points.
Now let's jump back to gaming.
Linux is not good because the game industry thinks proprietary platforms and operating systems are better for their business. There is only one platform fully supporting Linux, and too few titles. Gaming Linux barely hits 5% of the market share, basically the same as desktop Linux, while server Linux is beyond 75%.
I think reasons could be two-fold.
On one hand, Linux is not perceived by industry as attractive as other proprietary platforms. Maybe industry can squeeze much more money from the latter.
On the other hand, it could be that most of the development resources are NOT ORIENTED towards gaming and desktop, so these markets simply lag behind.
Of course, I could be totally wrong: these are my humble opinions with some facts backing them.
Live Long, Linux!
And this is a problem with Linux ?