It will never happen, but my dream is for the Asahi devs, Valve, and Apple to all get together to build out a cross-platform Proton to emulate and play games built for Windows on both x86 and ARM hardware running Linux.
A Steam Deck with the performance and power efficiency of an M-series ARM chip and the entire library of games that run on Proton is just...dreamy.
Her output is incredible.
I recognize it’s a hell of an ask, but I think Alyssa could pull it off.
The end game for Valve isn't Steam Deck 2 or 3 (which is statistically impossible for Valve to produce), but for Steam to be on everything.
Most of the studios that own those games, and that target POSIX-like OSes on mobile phones and game consoles, have yet to bother with GNU/Linux versions for SteamOS.
What Valve want is the dissolution of the boundary between platform/architecture and store. To my eye, it's the driving force of their efforts, more so than them selling hardware or being the open source good guys. Not to undervalue their work in helping make Linux a first class citizen for gaming, but the core of their business model is getting people to engage with their store, full stop, and being able to sell their games on Android (and elsewhere) would massively extend their reach.
This may go both ways too: there have also been indications that Valve have been tinkering with Waydroid, meaning Steam could also become a store for Android-native games.
I don't think Valve wants to be at the mercy of Microsoft and their policy & technical decisions.
In order for Microsoft to rug-pull the technology (which is quite different from rug-pulling the business model), they'd have to break compatibility on Windows itself. Video games remain a major reason for home users to run Windows. Making ABI-breaking changes to Win32 or DirectX is just not very likely to happen. And if it did happen, it would be a boon to Valve and not a harm.
The biggest risk (and this would be a classic Microsoft move, to be fair) I can foresee is aggressive API changes that make it hard for Valve/Wine/Proton to keep up but also make it hard for game developers not to. I'm not exactly sure what this would look like, and a lot of the core technologies are pretty stable by now, but it's a possibility. It's not, however, going to harm anything that already exists.
Making the Steam Deck use Windows wouldn't impact prices much; Microsoft is really friendly about OEMs preinstalling Windows. It could even run a modified version that acts like the current Steam Deck.
Instead, the Steam Deck is there to drive up testing on Proton, or outright porting to Linux, in a way that mere availability on Linux and the previous Steam Machine didn't.
If future versions of Proton break compatibility with older Windows apps, you can use different old versions of Proton for individual games. Steam makes this very easy on Linux, but rarely is it necessary.
I don't foresee many Linux distros breaking compatibility with Wine, which is good, as some devs argue Win32 is the only stable ABI on Linux. [1]
I don't foresee legal issues either, as Wine has been around for 31 years, and its corporate sponsors have included Google in the past. I've seen no indication that the project is on shaky legal grounds.
Microsoft could always create a new API that Wine doesn't yet support, but good luck getting developers to use it -- they've tried many times, but not much has stuck, and most devs just stick with Win32. [2]
The Steam Deck is 100% usable without leaving 'game mode' even a single time, something that is genuinely impossible with Windows as a base. That's the important part.
Also, when it comes to breaking Proton support (which does happen), Valve and GloriousEggroll give you access to plenty of older and special versions. Surely that's better than rolling back entire pieces of software?
My game doesn't work -> I go to protonDB -> Users saying use X Proton Version or Y ProtonGE version -> I switch the layer used in steam
Hard to imagine a simpler process than that
By supporting Proton, they are guaranteeing that modern and retro Windows games will be playable on Linux far into the future. Trying to get the next Call of Duty to support Linux natively is, quite literally, a waste of everyone's time that could possibly be involved in the process. I cannot see a single salient reason why Linux users would want developers to release a proprietary, undersupported and easily broken native build when translation can be updated and modified to support practically any runtime.
- CD Projekt Red: released Witcher 2 on Linux, didn't for Witcher 3.
- iD Software: released Doom 3 on Linux, didn't for Doom (2016) or Doom Eternal.
- Epic Games: released Unreal Tournament 2004 on Linux, but didn't for Unreal Tournament 3 or Fortnite. (A Linux port was being worked on for UT3, but it ended up getting cancelled.)
- Larian Studios: released Linux version of Divinity: Original Sin, didn't for Divinity: Original Sin 2 or Baldur's Gate 3
Many studios over the years have made native Linux versions, and many studios stopped because the cost of porting exceeded the revenue it generated. Proton didn't exist when Unreal Tournament 3, Witcher 3, Doom (2016), or Divinity: Original Sin 2 released, so Proton wasn't the reason those studios stopped developing Linux titles -- they stopped because it made no financial sense to continue to make them.
Now, with Proton, 79% of the top 1000 games on Steam are gold or platinum rated on ProtonDB. If you're fine with minor issues, 88% are silver rated or better. For the Steam Deck in particular, there are 5,500 verified games, and 16,526 verified or playable games. So I'd argue Proton is doing quite a lot for people gaming on GNU/Linux machines, since they now have access to a solid majority of the top 1000 games on Steam, both on a Linux desktop and on a handheld.
We aren't in the 90s anymore. Win32 has stalled, Microsoft has a regulatory gun to their head and Wine's compatibility (at least in the domain of games) is extremely good, good enough to allow for a commercial product to be a success while being entirely reliant on it. In what way is any of this comparable to what happened with OS/2 outside of "it runs Windows applications"?
- Automated Personnel Unit 3947
Half life 1, 2… hm.
Ok we’ll make three HL2 episodes to follow that up.
Ep I. Ep II. Uhhhh… and let’s stop there, just like forever I guess.
Portal 1. Portal 2.
Left4dead. L4D2.
Team Fortress. TF2.
Counterstrike. CS2. Oh shit we’re releasing a third one, we might have to use the number three finally, oh no… I’ve got it: CS:Go!
Regardless, point stands: they hate the number 3.
Then again, all kinds of companies take liberties with naming including numbers. Look at Windows 7 (12th major release), Windows 10 (successor to Windows 8), the game Battlefield 2 (third in the series), Battlefield 3 (three games after BF2), Battlefield 1 (after the release of BF4), etc.
I empathize if you don't like any version of Windows newer than 7 or XP, but it's time to let the dream of running them forever go. It's not weird when software doesn't support the 2009 version of an operating system anymore in 2024. If they never dropped support, it would be difficult to take advantage of improvements that occurred in the last 10 years, because we'd forever be stuck with that baggage.
Of course when it's feasible everybody loves software that really never does drop support, like 7-zip, which I think happily still works on Win9x without KernelEx... but I'd rather 7-zip stopped having serious security issues than continued to work on old Windows versions.
However, it is Microsoft more than anyone else that has decided to stop supporting those operating systems. Windows XP does not have support for any modern version of TLS (only TLS 1.0). There's no good way to support a browser-based app like Steam on a platform that cannot natively provide a secure connection to a modern web server.
There is not such a hard reason to drop Windows 7 support (again, except that Microsoft no longer supports it) but there are security-relevant APIs that are only available starting in Windows 10 which means special patches would have to be maintained just for Steam on Windows 7 to continue working securely.
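The TLS point is easy to see without any network access, using Python's ssl module version constants (a minimal sketch of the version floor modern servers enforce):

```python
import ssl

# Build a client context and pin the floor that modern web servers
# typically insist on: TLS 1.2 or newer.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# An XP-era client capped at TLS 1.0 falls below that floor, so it
# simply cannot complete a handshake with such a server.
assert ssl.TLSVersion.TLSv1 < ctx.minimum_version
```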
CodeWeavers released an announcement when the Game Porting Toolkit was announced.
https://www.codeweavers.com/blog/mjohnson/2023/6/6/wine-come...
You sure there was any kind of commercial agreement? Doesn’t look like it.
On top of that, Apple's DX to Metal code is not re-distributable. So yes, this sounds like more of the same from Apple.
And the root of the whole browser wars thing was microsoft making an absolute dog of a browser for Mac OS X when it came out and then refusing to support it. lmao.
It was the file manager as well as the browser and it was incredibly capable. By far the most advanced GUI file manager of its time. And a pretty fast and pleasant browser, although the compatibility was hit and miss. (Those were Flash and IE-dominated days as I recall them.)
A lot of what I loved about Konqueror is captured in Dolphin. I don't think I need my web browser to be a file manager... maybe that concept was just a 90s fever dream. But I miss Rekonq. Maybe I should revisit Konqueror.
'Default' KDE apps are often so well thought-out and complete that I never feel the need to deviate from them, and it's not unusual for me to install them on other operating systems when possible. I feel this way about Dolphin, Okular, Ark, Kate, Gwenview, Klipper, and Konsole/Yakuake, too (even though there are several great new terminal emulators out nowadays). And KWin! God, KWin's configurability is so good and it has some really killer compositor effects for productivity that are still unmatched.
Also, a quick correction: there was no IE 5.5 for OS X. That was for Windows and used a different rendering engine.
I also remember Safari on Windows, which was convenient for many reasons.
https://raw.githubusercontent.com/apple/homebrew-apple/refs/...
Zero chance they upstream anything. So in a way they are already benefiting from Valve's work.
The real reason Apple is ahead is that they're paying for more expensive, more advanced nodes for their CPUs. If you compare CPUs on similar node sizes, you'll see that AMD and Intel are basically caught up architecturally in perf/W metrics.
https://www.notebookcheck.net/Intel-Lunar-Lake-CPU-analysis-...
Intel was losing badly to one or the other at all TDPs. I don't get the impression that's changed much. (Even if it has, I can't remember the last time I encountered a non-Xeon Intel machine with working hardware and drivers for any OS, and I tried Windows, Linux, and Macs.)
But I guess there was never a time when an open graphics standard stood as the leader. Maybe during a brief stint in the Windows Vista era at best.
Sure it's a great design, but I believe x86-64 will catch up once again now with everyone using TSMC.
AIUI, if you want the most flops per die, you'll buy x86 - probably the 128-core Xeon for enterprise money. But that's not what's best for hand-held gaming.
AAA titles are typically GPU-bound anyway. More CPU flops may not offer much benefit.
Yes, but actually no. The Steam Deck is playing at extremely low resolutions. Rendering at 720p and 30fps is (on paper) 8x less demanding on the GPU than rendering a native 1440p60 experience. You can fully get by without having a powerful dGPU, which is why the Steam Deck is really able to play so many titles on a weak iGPU.
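The "8x on paper" figure is just pixel-throughput arithmetic (a fill-rate proxy only; real GPU load doesn't scale perfectly linearly):

```python
# Pixels pushed per second at each rendering target.
deck   = 1280 * 720  * 30   # 720p at 30 fps
native = 2560 * 1440 * 60   # native 1440p at 60 fps

# The ratio backs the "8x less demanding (on paper)" claim.
assert native / deck == 8.0
```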
The problem is translation. Cyberpunk 2077 runs fine on a 25 watt mobile chip that uses x86, which is why the Deck even costs less than $1000 in the first place. If you try to put a mobile ARM CPU in that same position and wattage, it's not going to translate game code fast enough unless you have custom silicon speeding it up. There's really no reason for Valve to charge extra for a custom ARM design when COTS x86 chips like AMD's would outperform it.
For x86 PC games (which pretty much all games are, today), ARM is at a substantial disadvantage, period. The IPC and efficiency advantages are entirely lost when you have to spend extra CPU cycles emulating AVX with NEON constantly. If there were ARM-native games on Windows then things might be different, but for today's landscape I just don't see how ISA translation is better than native.
1) everything standardized, like it or not (note: I do not), on the Windows API, and it has remained relatively stable, which is important because
2) Linux-native games I've had have become un-executable over time without semi-regular maintenance, while Windows games running on whatever version of Proton they best work with do not have that drawback
> Geometry shaders are an older, cruder method to generate geometry. Like tessellation, the M1 lacks geometry shader hardware so we emulate with compute.
Is this potentially a part of why Apple doesn't want to support Vulkan themselves? Because they don't want to implement common Vulkan features in hardware, which leads to less than ideal performance?
(I realize performance is still relatively fast in practice, which is awesome!)
Yes, it's a big reason.
I tried to port the yuzu switch emulator to macos a few years ago, and you end up having to write compute shaders that emulate the geometry shaders to make that work.
Even fairly modern games like Mario Odyssey use geometry shaders.
Needless to say, I was not enough of a wizard to make this happen!
If you are talking about Vulkan, that is much more complicated. My guess is that they want to maintain their independence as hardware and software innovator. Hard to do that if you are locked into a design by committee API. Apple has had some bad experience with these things in the past (e.g. they donated OpenCL to Kronos only to see it sabotaged by Nvidia). Also, Apple wanted a lean and easy to learn GPU API for their platform, and Vulkan is neither.
While their stance can be annoying to both developers and users, I think it can be understood at some level. My feelings about Vulkan are mixed at best. I don't think it is a very good API, and I think it makes too many unnecessary compromises. Compare, for example, VK_EXT_descriptor_buffer and Apple's argument buffers. Vulkan's approach is extremely convoluted: you are required to query descriptor sizes at runtime and perform manual offset computation. Apple's implementation is just 64-bit handles/pointers and memcpy, extremely lean and immediately understandable to anyone with basic C experience. I understand that Vulkan needs to support different types of hardware where these details can differ. However, I do not understand why they have to penalize developer experience in order to support some crazy hardware with 256-byte data descriptors.
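The "64-bit handles plus memcpy" layout described above can be modeled in a few lines (a toy sketch with made-up handle values; this is not the Metal or Vulkan API):

```python
import struct

def bind(argbuf, slot, handle):
    """Binding a resource is just writing one 8-byte handle at a
    fixed slot offset -- no runtime size queries, no layout math."""
    struct.pack_into("<Q", argbuf, slot * 8, handle)

argbuf = bytearray(4 * 8)        # room for four descriptors
bind(argbuf, 0, 0xDEADBEEF)      # e.g. a texture handle (hypothetical)
bind(argbuf, 2, 0xCAFEF00D)      # e.g. a uniform buffer handle (hypothetical)

assert struct.unpack_from("<Q", argbuf, 16)[0] == 0xCAFEF00D
```

The contrast with the descriptor-buffer extension is that here every slot has a known, uniform 8-byte size, so the offset computation is trivial.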
I honestly wonder how much the rallying around Vulkan is just that it is a) newer than OpenGL and b) not DirectX.
I understand it’s good to have a graphics API that isn’t owned by one company and is cross platform. But I get the impression that that’s kind of Vulkan‘s main strong suit. That technically there’s a lot of stuff people aren’t thrilled with, but it has points A and B above so that makes it their preference.
(This is only in regard to how it's talked about; I'm not suggesting people stop using it or switch away from it to something else.)
Tessellation falling short is just classic Apple, though. Shows how much they prioritize games in their decision making, despite every other year deciding they need a AAA game to showcase their hardware.
(apologies for the crude answer. I would genuinely be interested in a technical perspective defending the decision. My only conclusion is that the kind of software their customers need, like art or editing, does not need that much tessellation).
If you’re using geometry shaders, you’re almost always going to get better performance with compute shaders and indirect draws or mesh shaders.
A lot of hardware vendors will handle them in software which tanks performance. Metal decided to do away with them rather than carry the baggage of something that all vendors agree is bad.
It takes up valuable die space for very little benefit.
If the reason they don't support it in hardware is because they don't want to support it in software, then the logic gets a bit circular.
I'm interested in which came first, or if it's a little of both.
Geometry shaders are generally considered less necessary in modern graphics pipelines due to the rise of more flexible and efficient alternatives like mesh shaders, which can perform similar geometry manipulation tasks with often better performance and more streamlined workflows.
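The "emulate with compute" approach mentioned a few comments up boils down to amplification: each input primitive fans out into several output vertices. A toy CPU-side sketch of the data flow (nothing GPU-specific):

```python
def expand_points_to_quads(points, half=0.5):
    """Expand each input point into a screen-aligned quad (4 vertices),
    the kind of amplification geometry shader hardware would do."""
    verts = []
    for (x, y) in points:
        verts += [(x - half, y - half), (x + half, y - half),
                  (x - half, y + half), (x + half, y + half)]
    return verts

quads = expand_points_to_quads([(0.0, 0.0), (2.0, 3.0)])
assert len(quads) == 8   # 4 vertices emitted per input point
```

On a real GPU, a compute shader would do this per-thread and write the vertices to a buffer consumed by an indirect draw.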
We (CodeWeavers) are doing this in (a fork of) MoltenVK, and Apple’s D3DMetal is as well.
https://github.com/KhronosGroup/SPIRV-Cross/pull/2200 https://github.com/KhronosGroup/SPIRV-Cross/pull/2204
Interestingly, Apple was on the list of the initial Vulkan backers — but they pulled out at some point before the first version was released. I suppose they saw the API moving in the direction they were not interested in. So far, their strategy has been a mixed bag. They failed to attract substantial developer interest, at the same time they delivered what I consider to be the best general-purpose GPU API around.
Regarding programmable tessellation, Apple's approach is mesh shaders. As far as I am aware, they are the only platform that offers standard mesh shader functionality across all devices.
Although it's true that they are an optional feature (as is tessellation).
Which seems like an ineffective move when you have no market share.
So, wait, does this mean that gaming is better on Linux, on a Mac?
Wine is wonderful and with Valve's help it only got better.
But why would gaming on a mac be better? Maybe one day, but for now:
FTA: "While many games are playable, newer AAA titles don’t hit 60fps yet."
I think the answer might be yes, because it's possible to play so many more titles!
https://docs.getwhisky.app/game-support/index.html
I had assumed the lack of Vulkan on macOS was a major issue. Apparently not!
---
PC games use DirectX as their graphics API, so you need something that can translate from DirectX to the native graphics API your OS is running.
On MacOS you'd be translating from DirectX to Metal and Apple provides the emulation software (D3DMetal) as part of the Game Porting Toolkit.
On a Steam Deck, Proton uses Vulkan on Linux as the native graphics API, so in that case you are translating from DirectX to Vulkan.
> DXVK (which translates Direct3D 8, 9, 10 and 11 calls to Vulkan on the fly), vkd3d-proton (which translates Direct3D 12 to Vulkan)
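If you're curious which layer a given game is exercising, Proton and DXVK expose diagnostics through environment variables (PROTON_LOG and DXVK_HUD are real, documented variables; the exact combination below is just an illustrative Steam launch option):

```
# Steam > game Properties > Launch Options
PROTON_LOG=1 DXVK_HUD=api,fps %command%

# PROTON_LOG=1 dumps a per-game debug log (steam-$APPID.log)
# DXVK_HUD overlays the D3D feature level and framerate in-game
```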
You are forgetting the increasing number of titles targeting Vulkan directly.
You’re lucky to get 60fps playing a fairly undemanding game on MacOS, even on hardware that is otherwise a dream.
For example, Baldur’s Gate 3 is barely playable on my M3 MacBook Pro at well below native resolution with all settings turned down. It’s a brilliant game but hardly cutting edge graphically.
For instance, Alyssa mentions in this post that most emulated games will need at least 16 GB of RAM.
In addition, native ARM games on MacOS don't have the additional overhead of emulating a different CPU architecture and Graphics API.
However, that doesn't take away from this emulated support being an amazing achievement.
That's because the RAM is shared with the GPU and most of these games would require a GPU with at least 2-4GB on top of the normal system requirement to have at least 8GB. So, 8GB of RAM would be cutting it close on a mac since part of that would have to be sacrificed for the GPU.
They are running emulated games in their own separate virtual machine, because Intel games expect a 4k page size and the OS is running with a 16k page size.
Virtual machines require their own chunk of memory overhead, so the resource usage can't help being higher than a native macOS game's would be.
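The page-size mismatch is easy to put numbers on (a toy check, using the commonly cited 4 KiB x86 and 16 KiB Apple Silicon page sizes):

```python
PAGE_X86, PAGE_HOST = 4096, 16384   # x86 guest pages vs 16K host kernel pages

# Four guest pages fit in one host page, so an x86 binary's mappings and
# protections at 4 KiB granularity (guard pages, JIT W^X flips) can't be
# expressed with the host kernel's mmap/mprotect granularity -- hence the
# separate VM running a guest kernel with 4 KiB pages.
assert PAGE_HOST // PAGE_X86 == 4

guest_mapping = 5 * PAGE_X86        # 20 KiB: perfectly legal for the guest...
assert guest_mapping % PAGE_HOST != 0   # ...but not host-page aligned
```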
And the minimum is pretty minimal. A 16 GB ARM Mac will go into yellow memory pressure while running emulated games, I've noticed.
That includes raytracing support and heterogeneous paging support which are two things Alyssa calls out explicitly herself. Not to mention the VM overhead.
That’s not to say Alyssa’s work is not very impressive. It is. But GPTk is still ahead.
That’s not even including the other aspects of Mac support that Asahi still needs to get to. Again, very impressive work, but the answer to your question is No.
DS2 comes in both DX9 and DX11 flavours. The latter should work better with d3dmetal and is more comparable to what proton is doing.
My strong conviction is that From is pretty much technically inept when it comes to Windows ports so I just play their titles on console...
An emulated title that is in itself a not so great port will have trouble ofc...
As for how well FromSoft games run on Windows: you might have been right 12 years ago when Dark Souls 1 initially came out. It was a mess at the time, but Souls games have been running just fine on Windows (and Linux, for that matter) for years. It's only on macOS that it is a mess. This has nothing to do with FromSoft and everything to do with macOS.
When it initially came out for Windows, I had already done two playthroughs, so I just did ... not ... care. I just read in the news that it was a crap port.
> souls games have been running just fine on windows(and linux for that matter) for years
Maybe. For like 4 years I ran my PCs without a dedicated video card because crypto and chip crisis. The whole PS5 with an extra controller cost less than an equivalent PC video card at the time :)
[I do have a video card now, but only because someone paid me to write neural network code.]
Then there's a case for it, since you can run AAA games that Apple + macOS doesn't support/allow.
Apple and Wine provide the tools, and apps like Whisky make them easy to use.
> Essentially, this app combines multiple translation layers into a single translation tool. It uses Wine, a translation layer that allows Windows apps and games to run on POSIX-based operating systems, like macOS and Linux. It also uses Rosetta and the Game Porting Toolkit, which are two official Apple tools that allow x86 programs to run on Apple Silicon and serve as a framework for porting Windows games to macOS, respectively.
Normally, this sort of process would require users to manually port games to Mac. But by combining Wine, Rosetta, and the Game Porting Toolkit, this can all happen automatically.
https://www.xda-developers.com/hands-on-whisky-macos-gaming/
However, as always, running games under emulation has a performance cost.
I don't believe that's true. According to ProtonDB, 80% of the top-1000 most-played games on Steam are confirmed working on Linux: https://www.protondb.com/dashboard
I haven't seen any source documenting nearly similar success rates with Mac but I also haven't seriously tried gaming on Apple Silicon.
Most games are D3D. A very small minority are Vulkan from the get go.
Valve and open source devs have put a lot of effort over the years into projects like Proton, which is a translation layer for Windows games.
The correct answer is no, not yet anyway.
Linux running on x86 with proton is still the bee's knees for most games though.
Right. It sounds like the Asahi devs have implemented APIs which aren’t available under stock MacOS.
Back when I was actively developing for Freespace, we had a Linux port that had a better framerate than Windows (the game’s original platform).
I mean this is an incredible achievement either way. Everything is emulated, but they are still running AAA games. Wow.
Other than the page size issue, FEX and Rosetta are comparable technologies (both are emulators, despite what Apple marketing might have you believe). Both FEX and Rosetta use the unique Apple Silicon CPU feature that is most important for x86/x86_64 emulation performance: TSO mode. Thanks to this feature, FEX can offer fast and accurate x86/x86_64 emulation on Apple Silicon systems.
From: https://docs.fedoraproject.org/en-US/fedora-asahi-remix/x86-...
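A toy sketch of why the TSO bit matters (purely illustrative; real FEX/Rosetta translation is far more sophisticated): without hardware TSO, a translator has to conservatively insert barriers to preserve x86's stronger memory ordering, which bloats and slows the translated code.

```python
def translate(ops, hardware_tso):
    """Map a trace of abstract 'x86' ops to abstract 'ARM' ops.
    Without hardware TSO, each memory access is followed by an
    explicit barrier ('dmb') to preserve x86 ordering semantics."""
    out = []
    for op in ops:
        out.append(op)
        if op in ("load", "store") and not hardware_tso:
            out.append("dmb")
    return out

trace = ["load", "add", "store", "store"]
assert len(translate(trace, hardware_tso=True)) == 4    # 1:1 translation
assert len(translate(trace, hardware_tso=False)) == 7   # barrier overhead
```

With Apple's TSO mode, the hardware itself provides the ordering, so the translated code needs no extra barriers at all.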
https://fosstodon.org/@slp/113283657607783321
Sergio López has more info on his blog:
https://sinrega.org/2024-03-06-enabling-containers-gpu-macos...
https://sinrega.org/2023-10-06-using-microvms-for-gaming-on-...
Which is an attempt to collapse the stack so that fewer translation and virtualisation stages are needed.
Besides, the main reason Valve is investing so heavily in Linux and Proton is so their destiny isn't tied to someone else's platform. MacOS is just another someone else's platform like Windows is, with the same threat of getting rug-pulled by a first-party app store that spooked Gabe Newell[1] into investing in Linux in the first place.
The magic of Proton from a consumer point of view is that it just works for basically every game, sans those with kernel-level anticheat stuff. This means thousands of old games that haven't been updated in years will work, even games that don't have active developers.
So Apple's solution works for new games but isn't a practical option for compatibility with existing games.
Personally 99% of my Steam Deck usage is with it docked. I do mostly use a controller, but also have it hooked to the same USB switch as my PC so I can hit a button to move my keyboard and mouse over.
Baldur's Gate 3 is the first game I ever ran on my Deck that did not run very well, though. Most stuff I've played runs at 60fps at my external monitor's 1920x1200 resolution. That in addition to not liking the gameplay on BG3 much made me not continue with the game, though I may revisit it someday.
I think you picked as an example one of the games that actually has a native Mac version?
Or is it a well hidden wine package? I've played it start to finish on Macs only and it looked too smooth to be emulation to me.
https://www.theverge.com/2023/10/30/23938676/apple-m3-chip-g...
Alyssa said in her talk that they'll probably get it working in 6 months or so: https://www.youtube.com/watch?v=pDsksRBLXPk&t=2932s
I also run Asahi so will have to check this out to compare
https://box86.org/2022/03/box86-box64-vs-qemu-vs-fex-vs-rose...
Wine has beta support for 32-bit Windows applications on 64-bit-only wine, but it's not default.
They also address it: https://docs.fedoraproject.org/en-US/fedora-asahi-remix/x86-...
What an ingenious idea.
I would not consider the lack of activity in some Asahi Linux areas to be a conflict of priorities. It is in my opinion mostly a result of these lacking areas naturally attracting fewer developers capable of moving them forward.
God I wish I was smart enough to help out with Asahi Linux...
If you want more, you'll have to take it up with Tim Cook or God (both have a nasty habit of ignoring us little guys). Also an option: not using a laptop that treats Linux as a threat to its business model.
not saying have it write the code, but just recursively asking for explanations and resources can get you up to speed on tons of things
Ah yeah, here's the post: https://social.treehouse.systems/@marcan/112277289414246878
For the kind of person who wants to run NixOS on Apple Silicon or do Linux gaming on Apple Silicon in the first place, that's probably interesting and not too hard
but if you're allergic to that, you might be able to figure something else out with Box64, which is already packaged in Nixpkgs
x86_64 gaming on NixOS is of course well supported and has been for a long time. There's a 'native' package that I've always used and the Steam Flatpak is also available and works as well as it does anywhere
it was easier when Arch was a first-class citizen, but the advice you get nowadays upon encountering a problem on Arch is to switch to Fedora
The developers can use what they want. Marcan famously used Gentoo for many years
Is it fundamentally different from any other Nix packaging work in some way?
Is there a modern equivalent with FAANG, Microsoft, Sony, Valve, etc.?
In hell, Apple is in charge of gaming, Google does the customer service, Facebook is responsible for privacy, Microsoft does the UI, and everyone works at Amazon.
Sure this isn't Hell? Because customer service at Amazon is at best nonexistent, at worst actively against you...
My SIL bought a scanner from Amazon a few months back and never unboxed it because she was moving house. When she did, it was faulty. They took it back without much of a fight even though the return window was up. She just said "I unboxed it yesterday, it's broken".
Contacted customer support, explained what's the problem, the person on the other side said "wait a minute, sir" and removed ads from my Kindle without asking me to pay for it.
That was a good experience with Amazon.
Even if the core shopping/delivery service fails you, if you complain, they'll take the "customer is always right" position and make you whole. They'll refund or re-ship with no questions asked, without requiring you sending back anything or even so much as providing proof.
I'm sure some people must take advantage of that level of customer service, but it's a really pleasant experience.
I complained about a failed delivery (broken box, one item missing). They refunded me but then immediately put me on a watch list, threatening to ban me if I ever complain again. I will never buy from Amazon again.
But otherwise this is accurate.
I feel a little guilty, because it's all based on stereotypes, and I don't have enough firsthand experience to say which stereotypes are true.
I do like the implication that we're working in the warehouse and not AWS, but maybe it's too subtle.
You might also be able to do something with the surprise switch from Linux to Linus. In heaven code is reviewed on GitHub [...], in hell [...] and your code is reviewed by Linus.
The switcheroo idea sounds good.
My joke got more attention than I expected, given that it was just a quick first draft. I encourage everyone to improve on it, and share your version wherever you want, without attribution. Consider it a part of the public domain, just like the joke Carlin told.
Are you putting them in charge of UI because they've made React? (which is already a cardinal sin in my book)
But generally, webmasters have found it useful to know who caused their server to fall over^W^W^W^W^W^W is linking to their pages. This was even used as a predecessor to pingbacks once upon a time, but turned out to be too spammable (yes, even more so than pingbacks).
After the HN operators started adding rel=noreferrer to links to the Asahi Linux website, Marcan responded[2] by excluding anyone who has the HN submit form in their browser history, which feels like a legitimate attack on the browser’s security model—I don’t know how it’d be possible to do that. (Cross-origin isolation is supposed to prevent cross-site tracking of this exact kind, and concerns about such privacy violations are why SRI has not been turned into a caching mechanism along the lines of Want-Content-Digest, and so on and so forth.) ETA: This is no longer in place, it seems.
[1] https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Re...
[2] https://social.treehouse.systems/@marcan/110503331622393719
It isn't, at least not in the way you think.
Visited links have always looked different from unvisited ones, and the moment you could customize how links looked via CSS, browsers also had to implement styling for visited links specifically.
Modern browsers put a lot of care into making the changes to those styles observable to the user, but not to Javascript.
This is an extremely hard problem, and browsers have had a lot of security issues related to this behavior. Nowadays, you can only apply a very limited subset of CSS properties to those styles, to avoid side-channel timing attacks and such.
This means you can display a banner to anybody who has a certain URL in their browser history, but you can't observe whether that banner actually shows up with JS or transmit that information to your server.
<!doctype html>
<style>a { color: white; background-color: white; } a:visited { color: black; }</style>
<body><a href="https://example.com/abracadabra" onclick="return false">you are a bad person</a>
[1] https://developer.mozilla.org/en-US/docs/Web/CSS/:visited#pr...

How do they stop you from using Canvas to see the output and send it back?
The screen recording/screen sharing API can be used for this but security is the reason you have to give explicit permission to the site before it can do this.
There was also that asteroids game / captcha where links were white/black squares and your goal was to click all the black ones. Of course, clicking a square revealed that you knew the square was black, which meant the URL under it was in your history.
I think that using referer to try to deliver manifestos to users of another site is kinda childish, but so it goes. Every tool can be put to good or bad uses.
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Re...
Alyssa Rosenzweig
Asahi Lina
chaos_princess
Davide Cavalca
Dougall Johnson
Ella Stanforth
Faith Ekstrand
Janne Grunau
Karol Herbst
marcan
Mary Guillemard
Neal Gompa
Sergio López
TellowKrinkle
Teoh Han Hui
Rob Clark
Ryan Houdek
But when someone willingly posts (and keeps) this publicly https://www.youtube.com/watch?v=effHrj0qmwk
and then acts offended or claims doxxing (and starts using it to stir shit up for leverage) when people draw the obvious conclusion, that's behavior in bad faith and should be called out as such and dismissed.
Modern C++ with move semantics is a lot more easy to reason about and memory safe than C99, IMO.
Since it's a greenfield project, they didn't have to worry about the nasty baggage of legacy C++ spaghetti that kills most projects.
Just because you prefer "simple" C99 doesn't mean they do :)
About once a decade, someone inside Apple who is really passionate about games pushes some project through - you had GameSprockets in the 90s, you had someone convincing Valve to port Half-Life, you have the Game Porting Toolkit now - but it's just not in the company's culture to give game developers the long-term support they need.
Also, there are plenty of Windows-only games that aren't subject to those practices. Free games, itch.io games, GOG games, etc. There's a big world out there!
Those games are generally not AAA by definition, and often either already have a Linux build released, run acceptably under traditional emulation, or both.