This sounds so rough. I can't imagine pouring your heart into this labor of love and then having to keep facing something like this. Back in the early days of Quora, when it used to be good, there was a "be nice, be respectful" policy (they might still have it). I wonder if something like that would be helpful for open source community engagement.
Regardless, major props to Marcan for doing the great work that he did, our community is lucky to have people like him!
First of all, I wholeheartedly applaud Marcan for carrying the project this far. They, both as individuals and as a team proper, did great things. A rest is well deserved at this point, because he really poured his soul into this and wore himself down.
On the other hand, I need to say something, though not in bad faith. He needs to stop fighting the winds he can't control. Users gonna be users, and people gonna be people. Not everyone will ever be happy. Even when you integrate everything from applications down to the silicon, not everyone is happy with what Apple has accomplished technically. Even though Linux keeps the world running, we have seen friction now and then (tipping my hat to the other thing he just went through), so he needs to improve his soft skills.
Make no mistake, I'm not making this comment from high above. I was extremely bad at it, and I was bullied online and offline for a decade, and being on the right side of the argument didn't help, either. So I understand how it feels, and how he's heartbroken and fuming right now, and rightly so. However, humans are not an exact science, and learning to work with people on top of strong technical chops is a real superpower.
I wish Hector a speedy recovery, a good rest and a bright future. I want to finish with the opening page of Joel Spolsky's "Joel on Software":
Technical problems are easy, people are hard.
Godspeed Hector. I'm waiting for your return.
For the last few years, I've been saying the following regularly (to friends, family and coworkers): communication is the hardest thing humans will ever do. Period.
Going to the moon, launching rockets, building that amazing app... the hardest thing of all is communicating with other people to get it done.
As a founder (for 40+ years and counting) I manage a lot of different types of people, and communication failures are the largest common thread.
Humans have a very, very tough time assuming the point of view of another. That is the root of terrible communication, but assumptions are right up there as a big second.
On the Marcan thing... I just want to say, control what you can and forget the rest (yes, this is direct from stoicism). Users boldly asking for features and not being grateful? Just ignore them. Getting your ego wrapped up in these requests (because that's what it is, even if he doesn't want to admit it) is folly.
I contributed to Marcan for more than a year. I was sad to see the way it ended. I wish him well.
That's very true. I recommend that people read "The Four Agreements", because that thin book has real potential to improve people's lives through active and passive communication.
Spoiler, but approximately 66% of the adult population make do without being able to maintain their own perspective independently of what their social circle tells them it is. I imagine that would make it extremely challenging to determine what someone else's perspective is. Especially if that perspective is being formed based on empiricism rather than social signalling.
And if we're making book recommendations, Non-Violent Communication is a gem of an idea.
[0] https://medium.com/@NataliMorad/how-to-be-an-adult-kegans-th...
stoics don't write multi-paragraph goodbye letters
Marcus Aurelius wrote extensive personal reflections in his "Meditations". Seneca wrote detailed letters to friends and family discussing philosophy, life, and death. Epictetus discussed death extensively in his Discourses, but sure, they were philosophical teachings rather than personal goodbyes.
They focus on acceptance and equanimity rather than formal farewells.
That said, "control what you can and forget the rest" is indeed stoicism, albeit simplified.
Why you felt the need to add your comment is a more apt question.
Eh, not really - "multi-paragraph goodbye letters" here refers to the overly dramatic fad that internet denizens sometimes engage in when they leave communities, and they tend to have a lot of whining.
Those types of goodbye letters are not the types of goodbye letters stoics would write.
> Why you felt the need to add your comment, is a more apt question.
If you were able to pick up so swiftly what the person I replied to was implying, you too should be able to have picked up that I replied because I disagreed with that implication.
The alpha-male stoic caricatures, maybe. Real-world stoics have not been above such letters.
>you too should be able to have picked up that I replied because I disagreed with that implication.
You could then just say that you disagree and state your case, without rudely asking why they posted it.
I doubt this, but would be curious to see a source.
> You could then just say that you disagree and state your case, without rudely asking why they posted it.
I didn't find it rude at all, and your reply was far less productive than my IMO neutral question. You took offense on behalf of someone else and inserted yourself when it was unnecessary and entirely reliant on your interpretation and perception. Now we're discussing your perceived slight instead of anything of substance.
Right - but it kinda sounds like he's facing headwinds in a lot of different directions.
Headwinds from Apple, who are indifferent to the project, stingy with documentation, and not inclined to reduce their own rate of change.
Headwinds from users, because of the stripped down experience.
Headwinds from the kernel team, who are in the unenviable situation of having to accept and maintain code they can't test for hardware they don't own; and who apparently have some sort of schism over rust support?
Be a heck of a lot easier if at least one of them was on your side.
That is part of the challenge he chose to take on.
> Headwinds from users, because of the stripped down experience.
Users can be ignored. How much you let users get to you is your own choice.
> Headwinds from the kernel team, who are in the unenviable situation of having to accept and maintain code they can't test for hardware they don't own
You don't have to upstream. Again, it's not the kernel team that chose to add support for "hostile" hardware, so don't try to make this their problem.
> and who apparently have some sort of schism over rust support?
Resistance when trying to push an entirely different language into an established project is entirely expected. The maintainers in question did not ask for people to add Rust to the kernel. They have no obligation to be welcoming to it.
> Be a heck of a lot easier if at least one of them was on your side.
Except for the users, all the conflicts are a direct result of the chosen work. And the users are something you have to choose to listen to as well.
Their boss, however, did ask for it, so yes, they do have an obligation to be welcoming to it.
"did ask for it" - did he? Because from my perspective it looks more like he gave the bone for corporations so they will shut up for rust in kernel. After some time it will end up "Sorry but rust did not have enough support - maintainers left and there were issues with language - well back to C"
I addressed your second point here: https://news.ycombinator.com/item?id=43075508
With all the drama, I wouldn't be the least surprised if he soon withdraws that provisional acceptance.
> "A lot of people actually think we're somewhat too risk averse," said Torvalds. "So when it comes to Rust, it's been discussed for multiple years by now. It's getting to the point where real soon now, we will actually have it merged in the kernel. Maybe next release."…
> "Before the Rust people get all excited," the Linux kernel creator and chief said. "Right? You know who you are. To me, it's a trial run, right? We want to have [Rust's] memory safety. So there are real technical reasons why Rust is a good idea in the kernel…”
> “And hopefully, it works out, and people have been working on it a lot, so I really hope it works out…”
https://www.theregister.com/2022/06/23/linus_torvalds_rust_l...
Last September he was still insisting he thinks the project will not fail, and he was not exactly subtle in his criticism of maintainers who refuse to engage with it in good faith.
> "Clearly, there are people who just don't like the notion of Rust, and having Rust encroach on their area.
> "People have even been talking about the Rust integration being a failure … We've been doing this for a couple of years now so it's way too early to even say that, but I also think that even if it were to become a failure – and I don't think it will – that's how you learn," he said.
> "So I see the whole Rust thing as positive, even if the arguments are not necessarily always [so]."…
> With impressive diplomacy, considering his outbursts of years past, Torvalds went on, "There's a lot of people who are used to the C model, and they don't necessarily like the differences... and that's ok.
https://www.theregister.com/2024/09/19/torvalds_talks_rust_i...
But yeah, I still don't think it's all that inaccurate: He may not have wanted it to fail, and still not think it's a technical failure... But socially? Still seems possible he'd be starting to think that while the Rust language per se is a technical success, all the drama surrounding the integration of it into Linux means that that is turning out to be a social failure.
(Or maybe I'm just projecting because that is what it looks like to me.)
Many ARM SoCs are designed to run on battery only, so the wireless packages and low-power states are better; my AMD chip couldn't go below 400 MHz.
But yeah the "Apple M hardware is miles and leagues away" hypetrain was just a hypetrain. Impressive and genuinely great but not revolutionary, at best incremental.
I hope to be able to run ARM on an unlocked laptop soon. I run a Chromebook with a MediaTek 520 chip as an extra laptop and it gets two days of battery life; AMD isn't quite there yet.
It's more nuanced than that. Apple effectively pulled a "Sony A7-III" move: released something one generation ahead of everybody else, and disrupted everyone.
Sony called "A7-III" entry level mirrorless, but it had much more features even when compared to the higher-end SLRs of the era, and effectively pulled every other camera on the market one level down.
I don't think even they thought they'd keep that gap forever. I personally didn't think so either, but when it was released, it was leaps and bounds ahead, and it forced other manufacturers to do the same to stay relevant.
They pulled everyone upwards, and now they continue their move. If nothing else, they also showed that computers can be miniaturized much further. The Intel N100 and Raspberry Pi/Orange Pi 5 provide so much performance for daily tasks that things once unimaginable at that size are considered normal now.
It's just another "Apple integrating well" story.
Their SoC is huge compared to competitors' because Apple doesn't have to make a profit selling the SoC; they profit from selling a device plus services, so they can splurge on the SoC. Splurging on the SoC plus being one node ahead is just "being good"; if anything, the team implementing Rosetta are the real wizards doing "revolutionary cool shit".
...plus, they have a whole CPU/GPU design company as a department inside Apple.
Not dissimilar to Sony:
Sony Imaging (the camera division) designed a new sensor around the new capabilities of Sony Semiconductor (the fab), and used that exclusivity to launch a new camera built on top of the new sensor. Plus, we shall not forget that Sony is an audiovisual integration powerhouse. They're one of the very few companies which can design their own DSPs, the accompanying algorithms, and the software on top of it all, and integrate everything into a single product they manufacture themselves. They're on par with Apple's integration chops, if not better (Sony can also horizontally integrate from Venice II to Bravia, or mics to Hi-Fi systems, incl. everything in between).
The gap also didn't survive in Sony's case (and that's good). Nikon and Fuji use Sony's sensor fabs and co-design sensors with the fab side.
Canon had to launch the R series and scale up their sensor manufacturing chops, just because Sony "integrated well", to look at it from your perspective.
Sony is also not selling you the sensor. It's selling you the integrated package, from sensor to color accuracy to connectivity to reliability and service. The A7-III has integrated WiFi and an FTP client to transfer photos. The A9 adds an Ethernet jack for faster transfers. Again, integration within and between ecosystems.
Compared to the incremental changes we'd seen in the AMD/Intel space for the 10 years before it arrived, it was revolutionary.
What they did doesn't matter. Even if they had merely taken an Intel laptop chip and stuck chewing gum on it, the result was revolutionary.
So much so that it put a fire under Intel's ass and mobilized the whole industry to compete. For years after it came out, the goal was to copy it and beat it.
What did you expect to call "revolutionary"? Some novel architecture that uses ternary logic? Quantum chips?
It took them a while, but they finally offer boards based on AMD chips.
I don't need an upgrade now, but I feel a RISC-V framework is feasible once I do.
Humans are shaped by experience. This is both a boon and a curse. I have also been on the hot end of the stick and burned myself out, sometimes rightly, sometimes wrongly. Understanding that I didn't want to go through this anymore was the point where I started to change.
> Collaborating on software development is a social activity and stuff like convincing maintainers to trust you and your approach is just as important part of it (if not more important) as writing code.
Writing the code is at most 5% of software development IME. This is what I always say to people I work with. I absolutely love writing code, but there are so many other, more important activities around it that I can't just ignore them and churn out code.
This really depends on what you work on. And how good the managers are on your team. I talked to a manager at Google once about how he saw his job. He said he saw his entire job as getting all of that stuff out of the way of his team. His job was to handle the BS so his team could spend their time getting work done.
This has been my experience in small projects and in very well run projects. And in immature projects - where bugs are cheap and there’s no code review. In places like that, I’m programming more like 60% of the time. I love that.
But Linux will never be like that ever again. Each line of committed code matters too much, to too many people. It has to be hard to commit bad code to Linux. And that means you've gotta do a lot of talking to justify your code.
I did some work at the IETF a few years ago. It’s just the same there - specs that seem right on day 1 take years to become standards. Look at http2. But then, when that work is done, we have a standard.
As the old saying goes, if you want to go fast, go alone. If you want to go far, go together. Personally I like going fast. But I respect the hell out of people who work on projects like Linux and chrome. They let us go far.
Someone who is in a management position, has good political skills and good connections will be way more efficient at doing some of this non-programming work.
This is something that even C-levels forget. Something that takes a CTO 2 minutes to do can take several months for a regular developer to achieve, and I have plenty of experience with, and plenty of examples of, that.
People might also enjoy their hobby dev experience more if they were really coding for themselves without any expectation beyond pushing the code to a repo. As a hobby dev, you don't have to make packages, you don't have to have an issue tracker, you don't have to accept external contributions, and you don't have to support your users if you aren't willing to carry that on your shoulders. You don't even need a public git repo; you could just put a release tarball on your personal website when a release is ready.
Even before we started coding, there was an RFC written by us. We talked about it, discussed it, and ironed it out with the chief architects of the project. When everything made sense we started implementing it. The total coding hours are irrelevant, but they're small compared to all the planning, and it's almost finished now.
The code needs to tap and fit into a specific place in the pipeline. Finding and communicating that place was crucial. The code is not. You can write the most sophisticated code in the most elegant way, but if you don't design and implement it to fit into the correct place, that code is toast, and the effort is a waste.
So yes, code might be the most enjoyable (and sometimes voluminous) part, but it's 5% of the job, by weight, at most.
These are all engineering tasks, and the longer you spend on a team/in a company, the more likely it is you provide more value by doing this than by slinging code. You become a repository of institutional knowledge to dispense.
If you can communicate it can be 99% of the value. Getting someone to write something to back it up is trivial in comparison.
"Talk is cheap. Show me the code."
- Linus Torvalds
All the while: you are correct that being able to produce something that solves a problem is much more valuable than being able to talk about it. But unlocking that value (beyond solving your own problem) absolutely requires communication.
I have written plenty of code that's stuck on this first step in my life, including some that went to the very same LKML we're talking about here right now. Some of those things have already been independently written again by other people who actually managed to go further than that.
Perhaps "useless" was the wrong word the GP used. "valued" may be better.
It's fairly common for very useful/valuable code to be discarded because the engineer (or his management) failed to articulate that value to senior leaders as well as someone else who had inferior code.
Yeah, but FFS, using email for patches when there are so many better ways of doing development with git? The Linux Foundation could self-host a fucking GitLab instance, and even in the event of GitLab going down the route of enshittification or closed source, they could reasonably take over the maintenance of a fork.
I get that the Linux folks want to stay on email to gatekeep themselves from, let's be clear, utter morons who spam on any Github PR/issue they can find. But at the same time it makes finding new people to replace those who will literally die out in the next decade or two so much harder.
They're not micro kernel! They're not TDD! They're not C++! They're not CVS! Not SVN! Not SCRUM! Not Gitlab!
Yet the project marches on, with a nebulous vision of doing a really useful kernel for everyone. Had they latched onto any of the armchair-expert criticism of how they're doing it wrong all these years, we wouldn't be here.
The question is - how long will it march on? The lack of new developers for Linux has been a consistent topic for years now. Linus himself isn't getting younger, and the same goes for Greg KH, Ted Ts'o and other influential leads.
When the status quo scares off too many potential newcomers, eventually the project will either wither, or too-inexperienced people will drive it into a wall.
Why would he need to? He's already a young whippersnapper.
...which doesn't matter at all.
The people in charge decided on their preferred ways of communication. You may believe that there are better ways out there, and I may even agree with you, but ultimately it's completely irrelevant. People responsible decided that this is what works for them and, to be honest, they don't even owe you an explanation. You're being asked to collaborate in this specific way and if you're unable to do it, it's on you. If you want to change it, work your way to become a person who decides on this stuff in the project, or convince the people already responsible. Notice how neither of those are technical tasks and that they don't depend on technical superiority of your proposed methods either.
You are missing one point, namely that email is probably the only communication medium that's truly decentralized. I mean, on most email providers you can export your mailboxes and go to someone else. You can have a variety of email clients and ways to back up your mailboxes. No git clone, no specific mailbox or server is in any way special; I think Linus emphasized recently that they made efforts to ensure kernel.org itself is not special in any way.
Yes, I find Github's or Gitlab's UI, even with all enshittification by Microsoft and whatnot, better for doing code reviews than sight-reading patches in emails. And yet I cannot unsee a potential danger that choosing a service — any service! — to host kernel development would make it The Service, and make any migration way harder to do than what you have with email. Knowing life, I'd say pretty confidently that an outcome would be that there would be both mailing lists and The Service, both mandatory, with both sides grumbling about undue burdens.
Have you ever been in a project which had to migrate from, say, Atlassian's stack to Github, or from Github to Gitlab, or vice versa? Heck, from SourceForge + CVS/SVN to Github or similar? Those were usually grand endeavors for projects of medium size and up. Migrate all users, all issues, all PRs, all labels, test it all, and you still have to write code while it all is happening. Lots of back-and-forth about preserving some information which resists migration and deciding whether to just let it burn or spend time massaging it into a way the new system will accept it. Burnout pretty much guaranteed, even if everyone is cooperating and there is necessity.
But you could probably build tools on top of email to make your work more pleasant. The whippersnappers who like newer ways might like to run them.
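For anyone curious what that tooling actually looks like, the usual flow is nothing exotic - stock git plus the kernel's b4 helper. A minimal sketch (addresses are placeholders, and the exact flags vary by maintainer and subsystem):

  # sender side: turn your commits into mail-ready patches, then mail the series
  git format-patch -v2 --cover-letter -o outgoing/ origin/master
  git send-email --to="maintainer@example.org" --cc="list@example.org" outgoing/*.patch

  # reviewer side: b4 can pull a whole series from lore.kernel.org by message-id
  b4 am <message-id>   # writes an mbox you can feed straight to "git am"

The lowest common denominator really is just mail plus git; anything fancier is optional tooling layered on top.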
This is not about spam, server management, or a GitLab/Gitea/whatever issue. This is about catering to the most diverse work methods and removing bottlenecks and failure points from the pipeline. GitLab is down? Everybody is blocked. Your mail provider is failing? It'll be back up in 5 minutes tops, or your disk is probably full; go handle it yourself.
So Occam's razor rules out all the complex explanations for mail-based patch management. The answer is concise in my head:
> Mailing list is a great archive, it's infinitely simpler and way more robust than a single server, and keeps things neatly decentralized, and as designed.
This is a wind we can't control. I, for one, am not looking at kernel devs and saying "What a bunch of laggard Luddites, they still use e-mail for patch management". On the contrary, I applaud them for making this run for this many years, this smoothly. Also, is it something different from what I'm used to? Great! I'll learn something new. It's always good to learn something new.
Because, at the end of the day, all complex systems evolve from much simpler ones, over time. The opposite is impossible.
Well, until you deal with email deliverability issues, which are staggeringly widespread and random. Email was great for sending quick patches between friends, like you'd exchange a USB key for a group project. For a project the size of Linux? It doesn't scale at all. There is a reason why Google, Meta, Red Hat, and [insert any tech company here] don't collaborate by sending patches via email.
mail-based patch management is fine for smaller projects, but the Linux kernel is too big by now... it sure is amazing how they seem to make it work despite their scale, but it's kind of obvious by now that some patches can go unnoticed, unprioritized, unassigned...
and open source is all about getting as many developers as possible to contribute to the development. if I contribute something and wait months to get it reviewed, it will deter me from contributing anything more, and I don't care what the reason behind it is. the same goes for contributing something and getting an argument between two or more reviewers about whether it's the right direction, with no definitive answer from a supervisor of the project, and the situation dragging on for months...
[citation needed]
It's what "open source" enables, but it may not necessarily be a desired goal of a FLOSS project.
It's not really enough to state your case. You have to do the work.
On the surface, the kernel developers are productive enough. Feel free to do shadow work for a maintainer and keep your patch stack in GitLab. If it can be shown to be more effective, lots of maintainers are going to be interested. It's not like they all work the same way!
They just have a least common denominator, which is store-and-forward patch transport in the standard git email format.
Everyone still has at least the base branch they're working on and their working branch on their machine; that's the beauty of working with Git. Even if someone decides to pull a ragequit and perma-wipe the server, the work is restored once all the developers push their branches again. And issues can be backed up.
> Also, is it something different what I'm used to? Great! I'll learn something new.
The thing is, it's harder and more cumbersome at a time when better solutions exist. Routinely, kernel developers complain about being overworked and about the onboarding of new developers being lacking... part of the cause is certainly that the Linux kernel is a massive piece of technology, and another part is that the social conventions of the Linux kernel are very difficult, but the tooling is also very important - Ballmer had a point with "developers, developers, developers".
People work with highly modern tools in their day jobs, and then they see the state of Linux kernel tooling, and they say "WTF I'm not putting up with that if I'm not getting paid for it".
Or to use a better comparison... everyone is driving on the highway at the same speed, but one car decides to slow down, so everyone else overtakes it. The perpetual difficulty many open source projects have accommodating changing times and trends - partially because a lot of small FOSS is written by people for their individual use! - is IMHO one of the reasons why there is so much chaos in the FOSS world and many private users would rather go for the commercial option.
If you approach it from the viewpoint that you have the solution and they are Luddites, you will influence no one and have no effect.
Asahi Linux is similar, given how hostile and undocumented Apple Silicon is, but it carries a great weight of expectations of feature completeness, plus additional bureaucracy for code changes, that really destroys the free-wheeling hacker spirit.
What I found is that being able to have this "afterburner mode" alongside "advanced communications" capabilities gives you the real edge in real life. So this is why I wish he would build up his soft skills.
These skills occupy different slots. You don't have to sacrifice one for the other.
The BSDs. You can fork a BSD. Maybe he could try to mainline into a BSD, but he would probably face a similar battle there. Right, once again, the benefit of mainlining into Linux - and there is some (maybe limited) support for including Rust - is that you can narrow your scope. You don't need to worry as much about some things because they will just sorta work; I am thinking of things like the upper layers of the kernel. You have a CPU scheduler and some subsystems that may not be as optimized for the hardware, but at least it is something, and you can focus on other things before coming around to the CPU scheduler. You can fork a BSD, but most would probably consider it a hard fork. I also don't think any of the BSDs have developers who are that interested in bringing in Rust. Some people have mentioned it, but as far as I know, nothing is in the works to mainline any kind of Rust support in the BSD kernels. So he would probably meet similar resistance if he tried to work with FreeBSD. OpenBSD isn't really open to Rust at all.
If Rust is the reason you get out of bed in the morning, why not focus on Redox and make it the new Linux? Redox today is much more than Linux was in 1991, so it's not like you would be starting from scratch.
You're probably not as good as Linus in, well, anything related to this field, really. The only way to find out whether you actually are is to do the work. Note that he also spent a lot of time whining at the people who were perceived as powerful in the field. But in addition to whining, he went and did the work and proved those people wrong.
Mind you, I'm a PHP developer by day, so this Rust-vs-C debate and memory management stuff is not something I've had experience with personally, but the "Rust is magical" section towards the bottom seems like a good summary of why the developer chose to use Rust.
Discussion at the time: https://news.ycombinator.com/item?id=33789940
I personally fall a little more on the side of the Linux kernel C devs. Interop between languages and such does bring in a lot of complications. And the burden is on the Rust devs to prove it out over the long haul. And yes, that is an uphill battle, and it isn't the first time. Tons of organizations go through these pains. As someone who works in a .NET shop, slowly transitioning from .NET Framework to .NET Core is an uphill battle. And that's technically not even a language change!
But I do agree, Redox would probably be less friction and a better route if you want to get into OS dev on an already existing project and be able to go "balls to the wall" with Rust. But you also run into the fact that Redox just has a lot less of everything, simply because it's a small project.
«Undocumented» – yes, but «hostile» is an emotionally charged term that elicits a strong negative reaction; more significantly, though, it constitutes a flagrant misrepresentation of the veritable truth as stipulated within the resignation letter itself:
When Apple released the M1, I realized that making it run Linux was my dream project. The technical challenges were the same as my console homebrew projects of the past (in fact, much bigger), but this time, the platform was already open - there was no need for a jailbreak, and no drama and entitled users who want to pirate software to worry about.
Which is consistent with marcan's multiple previous blog posts and comments on here. Porting Linux (as well as NetBSD, OpenBSD) onto Apple Silicon has been no different from porting Linux/*BSD onto SPARC, MIPS, HP-PA and other platforms. Also, if you had a chance to reverse-engineer a closed source system, you would know that «hostile» has a very specific meaning in such a context, as it refers to a system that has been designed to resist reverse-engineering attempts. No such resistance has been observed on the Apple Silicon computing contraptions.
I think they even left a "direct boot from image" (or something similar) mode as a small door to allow Asahi Linux development, if not to accelerate it a little, without affecting their own roadmap. Even Hector tweeted about it himself!
I just wanted to also add that users will be users. Once it's out, there will be endless posts about "why X" and "why not Y". No matter what you do, lots of people are going to be displeased. It's just the way things go. I hope he will want to pick it up again after some time.
The secret is to have a healthy system for taking in those requests, queueing them by priority, and saying, "you are number 117 in the queue; you can make it go faster by contributing or by explaining why it's higher priority".
You can't let feature requests get to you; the moment you do, your users become your opponents. None of those requests are entitled; the author has clearly already reached a point where they are antagonistic towards requests.
I would tell them:
"I have 5 P1 tickets, 8 P2 tickets, and dozens of P3 tickets. Your ticket is a P3 ticket."
They would ask that I change it to a P1. I would. Then they would call me an hour later asking me about the ticket and I would tell them:
"I have 6 P1 tickets."
That's when they'd understand ;)
Otherwise he knows he's 6th in line.
The day came when, after prolonged hand wringing and with stern observations about great power and great responsibility, the priority could be set to P0. But like any bunch of junkies we came off this new high all too quickly and the P-1 classification arrived, the showstopper of showstoppers.
In hindsight what I most regret is that we stuck with an integer field; we were denied the expressive power of fractionally critical issues.
I got along great with the sales guys. They could understand that kind of thing.
Not when all the other P1 tickets are from other sales guys.
But at least now they're all fighting with each other, via your manager, so they're out of your hair.
This is when the underrated skill of saying NO pays off massive dividends. One long-term client once told me the thing he appreciated the most, compared to most other consultants, was that I wasn't afraid of pushing back on his requests and saying no (within reason). Probably the most valuable feedback I have ever received.
Obviously, there was some oversight from managers, but overall it worked pretty well.
Great idea about the priority queue.
It's quite difficult to ban someone from a public park, especially when they can just put on a new hat.
It's really easy to ban someone from a private park. Even if they do put on a new hat, when they get belligerent again you just revoke the renewal of their access pass.
My company's bug tracker is mostly internally-filed bugs, but accepts bugs from the public. The difference in tone and attitude is night and day. The public-filed bugs can be wild, varying across: Rude, entitled, arrogant, demanding, incoherent, insulting, irrelevant, impatient... They are also the worst when it comes to actually including enough information to investigate. Frequently filed without logs, without reproduction steps, sometimes without even saying what the filer thinks is wrong. We get bugs with titles "It doesn't work" and with a text description that reads like a fever dream from someone very unwell.
We do have strong personalities among employees, but bug reports tend to be professionally and competently written, contain enough information to debug, and always, always leave out insults and personal attacks. The general public (at least many of the ones technical enough to file bug reports) does not seem to have the emotional regulation required to communicate professionally and respectfully.
In projects where this is a problem, I've made an issue template that clearly requests all the stuff I think I'll need. There's a big note at the top of the template that says it's not optional and that if it isn't filled out fully, I'll close the issue without comment.
And then I do that, every time. Sometimes they fill it out and reopen, sometimes they don't. Either way, I don't end up wasting time trying to help people who don't respect my time.
Fair deals attract people with some money, but almost-free only attracts people who are forever broke, who live their lives feeling entitled to everything being handed to them.
this is always true with, at least, a great many people. it's related to choosy-beggar syndrome. it's a bug/glitch/feature in human psychology.
if you ever have the chance to be a property manager, never ever let someone move in a week early or pay a week later for free. never let your rent get drastically below market. when people aren't paying for something, it's incredibly common behavior to stop respecting it. it's like a switch flips and suddenly they are doing you the favor.
that's why in times past, offering or taking "charity" was considered impolite. but making a small excuse might be ok. say someone needs to stay an extra week after their lease is over, but is strapped for cash. instead of saying "sure, you can stay one more week", say "well, you'd really be doing me a small favor by staying in the place to watch it for the extra week, since it's empty anyway. how about i discount the rent by 50% for that week and amend the lease to take care of it."
Having been the person taking these meetings for a software vendor, I can say it can get really toxic quickly. I never had more than one meeting a quarter with really toxic people, and they were at least paying for the product and maintenance, so hearing them out was part of the job. It's unfortunate to get to the point where you view customer requests as antagonistic, but I can see how it happens. Some people really feel entitled, and some have a job to do and limited resources or control to do it with.
Does it have to be a meeting? Although it's about sales calls, I'm reminded of https://keygen.sh/blog/no-calls/ (HN discussion: https://news.ycombinator.com/item?id=42725385 )
That said, I sympathize very much with Marcan on this project: getting the basic infrastructure for Linux operational on new hardware inflames passions much more than a niche project like a DAW.
I've read your comments here (and elsewhere) for a long time, and I'm sure you'd have some great ideas or at least opinions about this, which is pretty relevant to what you just wrote: https://news.ycombinator.com/item?id=43037537
It's much easier to shrug off strong comments when the people who do support you are making it possible for you (and one other) to lead a pretty comfortable middle class life.
If you're supporting end users you need to be collecting money from them.
The mechanics of this system are entirely upside down. The corporations have bought into open source to regain control of computing and passionate developers are mired in the swamp of dumb user requests.
Something went very wrong here.
Simplest (works in enterprise too) is to say: pay for it to be faster, or even to be considered at all.
I say that also because I have gotten quite a few responses from people saying that I should use Asahi, while looking at what it supports it definitely would not make sense for me, and you cannot just present it as a macOS alternative right now.
25 years ago (huh, long time), when Windows ME pissed me off for good, Linux wasn't exactly known for being a daily driver, but I gave it a try and, unsurprisingly, it did become reliable over the years. Other than Gnome's propensity to make stupid changes to default settings, I can't remember the last time I had to even think about messing with the underlying system, and other than a simple Google search on the Linux compatibility of hardware before I buy, I just don't think about it. Actually, I take that back: when I first got my current laptop I was messing around to get the AMD Mesa drivers (or whatever) working because I wanted to mess around with this fancy GPGPU thing.
Personally, if I were to buy a macbook it would be for the OS and not dodgy linux support because I've walked that road before. If the Christmas sales were just a tiny bit better though...
Imo the modern Linux experience is much better than the situation you describe, at least as long as you use a certain type of hardware. In the past it was definitely harder. But wrt Asahi, I want the "luxury" of using an external monitor with my 13" MacBook Air, and sadly, while on past x86 machines I would put in some effort and get the AMD Mesa drivers to work, I cannot do that here. I respect the effort put into the Asahi project, but calling it suitable for a daily driver is misleading, unless you specify exactly what sort of daily driver you mean. Stuff like using an external monitor is pretty basic in my book of daily usage.
In hindsight.
> [Linux] did become reliable over the years.
Might have gone the other way. And if it had, nobody would be surprised at that either, now.
For some reason people feel that it is appropriate to throw barbs in their issue reports. Please, everyone out there: if you find an issue and want to report it (hurray, open source!), please be kind with your words. There are real people on the other side of the issue.
Always remember, you catch more flies with honey than vinegar.
That seems to be a general characteristic. I strive to be cheerful and helpful whenever I'm asking for something. I feel like (sadly) it sets me apart from the crowd and helps me get what I'm asking for. And IAC, with so little effort on my part I may brighten someone else's day, and that makes me happy.
Just last week I asked housekeeping at a hotel for an old style coffee pot since I had brought my own coffee and filters. I started with "Can I pester you a moment?" and the conversation went up from there. Housekeeping was extremely friendly and helpful. Later I guessed this might have been her way to disarm some of the typical hostile interchanges she's been the brunt of.
There's a broader topic of ... just be nice to people. It doesn't cost anything. It does reassure me that this universe has been struggling with this for decades upon decades--witness the Malvin and Jim scene in WarGames. "Remember when you told me to tell you when you were acting rudely and insensitively?"
That's certainly how I felt when trying to get my drawing tablet to work properly under Linux Mint, although in my case I skipped filing an issue and just gave up and went back to Windows.
> “Asahi is useless to me until I can use monitors over USB-C” “The battery life sucks compared to macOS”
These are not even requests. These are objective statements he can either take note of for prioritisation or ignore. I can also say Asahi is useless to me until it supports USB-C monitors, but that's just my situation - there's no bad faith or request there. Previously the same was true of WiFi support.
I wish there was some good model for maintainers of bigger projects to deal with this on a personal level. The bigger the project, the more people there will be with unmet requirements and that's just life. It literally can't be solved.
> I miss having free time where I can relax and not worry about the features we haven’t shipped yet. I miss making music. I miss attending jam sessions. I miss going out for dinner with my friends and family and not having to worry about how much we haven’t upstreamed. I miss being able to sit down and play a game or watch a movie without feeling guilty.
This is the big problem really. He should have just turned his work hours down to a regular 40 a week, asked for more donations to pay more people, and asked for more volunteer help. And honestly, probably therapy.
I don't know this person so this is completely baseless speculation but I assume they are "going through it" in some way and experiencing significant burnout, which based on my own experience in the past has a way of (negatively) amplifying all sorts of interactions that are related to the source of your burnout.
Basically, making Linux work on Apple hardware is a pretty hard task, including a shitload of reverse engineering.
When a user decides to try it, and finds a lot of features missing, they are completely unaware of the work required to get it into that state, and just think they should have the readily available features.
Or: he shouldn't steal people's time with false advertising :shrug:
Also, if he wants to create an operating system, then these aren't even requests, but bug reports. So the users ate up his false advertising, spent time trying out his system, then spent some more time filing bug reports, and then he calls them "entitled users".
I can't imagine then what his problem is. I don't get offended by people who can't even read. I don't normally call them people, let alone entitled :\ Set up a bot that links them to the device support page, and problem solved? I don't get it
I think that might be the problem.
It's comments like these that cause people to wear out.
No, it isn't. You - fundamentally - don't get to control what people say to you. You need to filter how you take it. And that's incredibly hard, especially in open source. You need to both be able to ignore (some version of "idiots who can't be bothered to read") and be open-minded enough to take weird requests, because they could be the starting point of a new major contributor. The second is optional, as long as you are happy just doing your thing, but then the former probably won't become a problem for you.
>You need to both be able to ignore
> and be openminded enough to ...
I know it's pretty pointless to argue because we see the world in different ways. But realize that the (quoted) requirements are ones you are putting on the open source developer.
A developer without these skills will burn out.
> A developer without these skills will burn out.
And I think that's something that should be said more directly. If you want to do open source (as in become the provider of load bearing infrastructure): Then you really need to realise what you are getting yourself into. Would I like that to be different? Sure. Would I bet on that changing? Absolutely not.
And yes, that absolutely means you can either do open source as a hobby, in which case nobody should ever be willing to rely on the thing you are building (because you can just say "I've got better things to do than fixing the security bug you found"), or you can attempt to get other people to use and rely on it, but then you have to find a way not to burn out.
You don't get negative feedback if you don't open communication channels for it.
This some next level philosophy pondering, thanks.
I'd expect the worst part for an Asahi project contributor to be the active sabotage some angry Linux kernel devs are trying to pull because they don't like Rust. Users being unreasonable is one thing, but your fellow maintainers are supposed to be allies at least.
I hope Marcan can find a new project to take on that doesn't involve all of this mess.
I don't think it's even just that; it seems to be something about the price.
I work on a piece of closed-source free software, and we consistently get support requests from unbelievably entitled assholes. The worst of them are the ones with some technical knowledge: they not only demand things be fixed or implemented, they make completely erroneous statements about how easy it would be to fix/implement, with the conviction that they are 100% correct and a level of arrogance that makes it impossible to fathom how they could have written their email with a straight face.
The support requests we receive for a paid offering from the same company are, 99% of the time, from much more pleasant people (of course there's the occasional "I PAID FOR THIS YOU MUST FIX IT!!!1!", but they're a definite minority).
When I want to give something away, I list it for some nominal fee like $10, then just tell them to keep it. Because when I used to list things for free, I got the dregs of society bothering me: asking for delivery, asking me to hold it for 3 months until they can find a truck, cussing at me for saying no to both of these, cussing me out because I sold it to someone else already, telling me long sob stories to guilt me. I've never had any of that happen when asking for money (except one guy who wanted me to deliver it for $20, which was a fair-ish offer).
I wonder if that same 'pay but you'll get it back under the table' model could work for software? At least until the word got out, I guess.
Sounds like a great time to give them a refund because they didn’t get the product they thought they were getting.
Too passive aggressive? :)
The only way to do that is to never collaborate with anyone else. I hope he'll be someday able to process what happened, why and reach appropriate conclusions. Software development is a social activity, especially with relatively high-visibility projects like Asahi, and it comes with just as usual burden of social troubles as any other kind of social activity.
Yes.
> The only way to do that is to never collaborate with anyone else.
Not necessarily. You can also treat project politics and social skills like any other technical skills that you need on your team like network engineering or database optimization.
If you can find trusted collaborators with those social and political skills, you can make a lot of things happen without necessarily being very good at it yourself.
Team building has a lot of parallels with building a full stack technology. Or building a sports team.
The real answer is to either learn these skills or, as you suggest, delegate them. Hoping to find something that doesn't involve "all this mess" at all will be fruitless.
In those days, I was part of a core development team for a project with a fairly large community. A few bad users and a few bad development team members is all it takes to poison something like that.
Now I barely even contribute to Open Source projects even when I fix them for my own uses.
Anyway, if your project involves convincing hundreds of maintainers to increase their cognitive/work load in order to include your fancy new foreign, workflow-breaking language in their project, you have to expect pushback.
This has not been my experience. Perhaps consider that the problem is not the users.
> the active sabotage some angry Linux kernel devs are trying to pull because they don't like Rust
On the other hand, users who demand you rewrite the project in their favorite language or otherwise accommodate their preferences over your own are pretty annoying.
Who's demanding a rewrite of Linux?
Oh no. I'm convinced the majority of burnouts are almost entirely caused by dealing with shitty people and/or shitty processes.
Shitty processes sometimes happen without shitty people; the people involved just let it happen.
Actually, if this distro is my primary/only one, I would like to be able to check CPU, GPU, etc. temperatures. It is important to know whether the cooling is adequate or requires cleaning/repair.
In any case, Marcan would be way better off having a thick skin. Users will always be assholes (well, the same is generally true of vendors).
"Heavily under development and not ready for prime-time use" should have been the first line in the README and the only reply to such feature requests.
So it sounds like they bit off more than they could chew.
People become vocal when they are pissed.
I think the best way to deal with this is to just confidently say what you are and are not ready to get done. The social dynamic will always be this way, so we may as well take whatever criticism is useful, leave the rest behind, and move on.
Selling ads? Using it as a gateway to a commercial product? Selling support? Have some genius business plan that allows you to make money in the future? Fine, give it away with no strings attached, but expecting that users will be grateful is a mistake developers keep repeating. The free users are just as entitled - even more entitled, as they don't have a price tag for your efforts and don't have a document specifying what your obligations are, so they can assume any scope of entitlement they wish.
Since you gave it away for free, you can't refund an unhappy customer to make them go away. If it looks like a product, you will be stuck with people who think they did their part by using your product and that you failed them. Some may make it a full-time job to take revenge for this injustice.
I'm not even sure that these users are at fault; you actually took something in exchange (like fame, street cred, etc.) and you are not delivering your part.
> we brought the platform from nothing to one of the smoothest Linux experiences you can get on a laptop.
Despite the accomplishment, this overselling irks me.
Wasn't always like this, I think. I've personally seen the same with other projects, and dealing with proprietary Apple APIs and their walled-in garden is hard enough.
It’s called a Code of Conduct. It exists and is in use by many organisations, including several open-source projects.
> I get that some people might not have liked my Mastodon posts. Yes, I can be abrasive sometimes, and that is a fault I own up to. But this is simply not okay. I cannot work with people who form cliques behind the scenes and lie about their intentions. I cannot work with those who place blame on the messenger, instead of those who are truly toxic in the community.
The abrasiveness, though, is the reason people react that way. Not everyone is going to respond with "hey, that was abrasive, that's not how we do things, here is a better way to phrase it". The majority will simply shut down or start forming cliques in the background. I can't completely blame them either. Here is Hector threatening to launch a shaming social media campaign against kernel devs:
> https://lore.kernel.org/rust-for-linux/208e1fc3-cfc3-4a26-98...
"If shaming on social media does not work, then tell me what does, because I'm out of ideas."
That's not ok. Even if he feels he is right and they are wrong. People will create cliques and talk behind your back if you act that way. People will look at the Rust community after this and say "Remember that time when _they_ were threatening kernel devs with social media drama?". It's not right, but that's the perception that will last.
Happened with actix, happened with serde, and now being threatened by kernel contributors. The perception seems at least somewhat based in reality.
1) Issue found by Shnatsel
2) Issue closed as harmless to users by fafhrd91
3) Issue proven harmful to users by Nemo157 and reopened by JohnTitor
4) Issue fixed and closed by fafhrd91
5) Issue proven unfixed and proposed new patch by Nemo157
6) New patch commented "this patch is boring" by fafhrd91
7) Issue is deleted
8) Fix is reversed by fafhrd91, issue still present
http://web.archive.org/web/20200116231317/https://github.com...
A maintainer that rejects a fix for an issue that was proven harmful to users on the basis that it was "boring" and then deletes the issue is a bad maintainer. Death threats and abuse were definitely not the right answer, but public criticism is not unreasonable in such a case. If it were just a hobby project and advertised as such, then that would be one thing, but he plastered info about how it was used in production by a bunch of big companies on the website. That is not how someone who calls their code "production-ready" acts.
It is unfortunately wrapped up in larger-scale outrage culture than just within tech/programming circles. Rust as a community is very gay and very trans:
https://blog.rust-lang.org/2024/02/19/2023-Rust-Annual-Surve...
https://blog.rust-lang.org/2025/02/13/2024-State-Of-Rust-Sur...
To be clear I am 111% down for that as one of the Alphabet People myself lol. We just can't pretend like it isn't a factor.
Disclaimer: I realize these numbers are probably skewed high due to self-selection of people who are willing to take diversity surveys. The actual percentages are probably somewhat lower, but Rust undoubtedly has the highest concentration of any programming-language community. Zero question.
I know which of the two languages was easier and more pleasant to hire for - which should be impossible as I kept getting told no one uses forth.
Out of 14.5% of the respondents. I wouldn’t call that a very anything community.
[0] https://www.statista.com/statistics/1270166/lgbt-identificat...
Being a minority does not make you underrepresented. Underrepresented means there are fewer than you'd expect, given population-level numbers. In the Rust community it certainly seems true that trans people are overrepresented, though "marginalized" almost certainly still applies. The same goes for LGB, which again does not seem underrepresented in the tech community compared to society writ large, and I think many LGB people probably don't see themselves as "marginalized" in 2025, but I could be wrong.
I don't see why that would be the case.
And for tech in particular I'd say women (half the population) are underrepresented and LGBT (a definite minority) are not. Marginalization is a bit more complicated but similar.
How can you know this? What other communities even have such surveys?
I would expect this to be similar in any language. Anecdotally, I see the % of gay/trans/neurodivergent to be much higher in the dev community than the general population, so the numbers don’t look strange to me.
Perhaps it’s more vocal or more visible, but that would require much more analysis to enquire about cause and effect.
If a tree is felled in a forest and the logger doesn't tell you she's trans, does it make a sound?
This is complete nonsense. We (LGBT) folk are pretty much equally represented in all programming communities. It's just that Rust presents as a very socially activist community, with all the attendant drama and culture war nonsense, including falsely claiming some sort of imprimatur from the LGBT folk to represent them. Cliquey hyper-online gays != the LGBT community.
Fortran, Erlang/OTP, any stack you can think of, will have LGBT devs. Common Lisp has some kickass trans devs. It's not a proliferation of rainbow flag emojis and obnoxious puerile cancel-culture politics that makes one community be 'more' LGBT than another. I won't stand for this kind of erasure of LGBT folk who don't take their assigned place in the culture war barricades.
Rust is a very neat language, but the biggest single barrier to its adoption is the Rust community, and I won't have them hijacking my identity to pretend some moral title to their constant - and deeply unpopular - online brigading, bullying, etc.
Is there any technicality in the language that benefits from extra info? I’m not asking in bad faith, I legit want to know.
Get an HN article about C++, and you can be certain the comment section is going to deteriorate at some point into a religious war mentioning Rust. Get an article about Rust, and there is going to be drama in the comments.
As a programmer who could potentially consider Rust, I find it off-putting.
So yeah, typical internet holier-than-thou reactions; I wouldn't read much into them.
Of course: You don't have to use Rust to see this very post here on HN, and quite a few other similar ones. Are you saying people just imagine there's a lot of drama around Rust, or what? (That the TFA here, or those in other similar posts, are all lies, or outright made up?) Because to me -- someone who never uses Rust -- it looks like a fact.
Which is a large contributing factor to why I probably never will, either.
They can't help but proselytize. It's like talking to my recently born-again Christian friend who can't help but steer every conversation to Christianity and recite scripture. It's infuriating.
Though TBH it very much feels like the cult of OOP that rocked the 90's. And look where that paradigm is now ...
It's alive and well. Sure, Java-style OOP might not be, but that's mainly because it was never sensible OOP to begin with.
A bit like "Agile is dead" and everybody hating "Agile". Sure, what they hate is what's been pushed as "Agile" for the last decade or more: ceremony-over-flexibility Scrum, rigid sprints, "user story" as a synonym for "ticket", etc, etc.
Let's hope that it's just "Faux-OOP" that, like Fauxgile, is about to be dead. ASAP.
Try asking a Rust zealot to give three different real world examples where someone should pick C++ over Rust, they can't do it. The zealots are literally incapable of viewing anything other than Rust as divine.
Every experience I've had with Rust people has been negative and worthless, so I view Rust as a major red flag on resumes when hiring :)
(The actix drama was stupid IMO and is fair to criticize the community over tho)
This is just an incredibly odd thing to say. It's so obviously out of line that it seems like someone's joking around.
The Rust community (generally-speaking) just can't see why people have a visceral reaction against them, independent of its technical qualities. In all my years, I've not seen anything like it.
Then I read the thread you linked and thought, “Oh. That.”
To be clear nobody deserves to be harassed or threatened, but Hector's messages make it clear he is astoundingly good at making himself into a victim of injustice. When his messages mentioned "cancer" I immediately thought that meant another kernel dev told someone to get cancer or die of cancer or something, which would be completely unacceptable. In fact, the word was being used metaphorically, to describe the way Rust is slowly making its way into the kernel, like a cancer growing.
How anyone (read: Hector) could think this requires CoC action is baffling to me. Insane language policing.
This was my same thought. And then at the end of his rant, he writes:
> If you are interested in hiring me...
No one who values a drama-free workplace would hire this person.
The symbol for Cancer is also a crab. The very word "cancer" itself comes from the Greek "καρκίνος" (karkínos), meaning "crab".
Rust literally is Cancer.
One group believes it is Rust (progressives), one group doesn't believe that and wants to continue with C (conservatives).
If they cannot find a way to live at peace with each other, I think the only solution is for the Rust folks to start building the kernel in Rust and not try to "convert" the existing kernel to Rust piece by piece.
The obstacle to living in peace seems to be finding a way for the C kernel folks not to have to deal with Rust code.
At the core, the story is not that different from introducing new languages to a project.
You are introducing a new tax on everyone to pay for the new goodies you like, and those who are going to be taxed and don't like the new goodies are resisting.
Then entertain his question and tell us what is. Bringing people's attention to the matter to finally somehow resolve the situation is his last resort, after spending years trying to upstream even trivial patches. You can't have your cake and eat it too - you can't say you want Rust in the kernel and then sabotage any upstreaming efforts.
When upstream won't work with you, the answer is to maintain a separate tree. Yes, it's a lot of work to maintain a separate tree. No, you won't get as much use if you're in a separate tree.
Also, the person rejecting the patch seems to have never claimed to want rust in the kernel.
I mean, that's the kind of abusive dynamic I'd expect from a horrible corporation: stringing along underpaid or unpaid interns for several years and refusing to hire them at the end of it without giving any actual feedback.
In this particular case, Hector himself hints at it in the blog post, but a lot of damage has already been done: "I am working on personal issues currently, I'd like to step back for a while and will not be contributing. Thank you, all".
> Bringing up people’s attention to the matter to finally somehow resolve the situation
Not everything has a clear and fast resolution. I think Hector's team were hoping the resolution would be "Shut up everyone, we're doing Rust now, this is all merging in and that's that!". But it could have been "Shut up everyone, we're not doing Rust any longer". They would have been even more upset, saying "this is a leadership failure, they're on the wrong side of history" and so on.
> you can’t say you want rust in the kernel and then sabotage any upstreaming efforts
Two wrongs don't make a right though. Call people out and ask them to explain their position, get others on your side. But threatening to drag their names all over Bluesky or X or Reddit or whatever the latest thing is, is not productive - if anything, it's counterproductive.
I'd argue that we're basically at the point where that _is_ the de facto policy, except without it being actually stated. There's a subsystem maintainer blocking any Rust code from being merged (even to be imported as a dependency from outside their subsystem) who said they will do "everything in their power" to stop Rust from being merged into any part of the kernel, and when people asked Linus to clarify whether he still thought it was viable to have Rust in the kernel, he said nothing. Hector made the infamous comment about social media, and _then_ Linus stepped in to say that we needed technical debate rather than social media brigading, which sets the not-so-great precedent that invoking social media was actually more effective at getting some sort of response than the technical debate he says he wants. So now, the status quo is that someone with the power to completely block any progress towards including any amount of Rust in the kernel will presumably continue to do so, but Linus is still sticking to the line that we can have "technical debate" about it even though the outcome is predetermined to end in failure.
You're right that not everything has a clear and fast resolution, but given that the only possible ways for this to end, other than just making the "no Rust in the kernel" policy explicit, are either for Linus to overrule the maintainer blocking any Rust code from being merged or for every single patch containing any Rust code to be blocked, it seems pretty clear to me that the way things are now is just a slower, less clear version of the negative outcome; a clear and fast resolution with an undesired outcome would be far better. This seems like the real cause of Hector's frustration; it's hard not to feel like this path to "resolution" was picked over just admitting that it's essentially official policy that Rust isn't allowed, for reasons that are ultimately purely social rather than technical. The correct resolution in my opinion would be if Linus said something like "regardless of my opinion on whether Rust should be allowed in the kernel, I'm not willing to overrule the decision of the subsystem maintainer in this case, so the current status quo will remain unless someone is able to convince people to merge things on their own". My best guess for why he didn't want to do that is that it would essentially paint a target on any maintainers refusing to merge Rust code, which is understandable but seems like it will just cause more frustration in the long run than simply acknowledging the reality of the current situation.
It's true, but it sort of assumes that Linus is an automaton, like a corporation: if you threaten social media drama, then he'll respond. The problem is that it feels he was forced to respond, and he didn't really like kernel devs being part of the social media drama. So he responded, and in a strange way he emerged as the calm voice of reason. And it left a long, unpleasant memory in the community.
> it will just cause more frustration in the long run than simply acknowledging the reality of the current situation.
Sadly, I think that's what will happen.
> I'd argue that we're basically at the point where that _is_ what the de facto policy is, except without it being actually stated.
It does seem that way, I agree with you, but I think this made it worse, as you highlighted already. So it was an uphill road, but now the hill got steeper and taller. Another way this could have played out is if Hector had written a blog post saying "I am having personal issues, I am frustrated, I am stepping down", and let people figure out more details. But getting into a public spat with Linux devs was not productive for his and his team's goals. He hurt his team (Rust + Asahi) more than he helped in the end.
It makes sense to be extremely adversarial about accepting code because they're on the hook for maintaining it after that. They have maximum leverage at review time, and 0 leverage after. It also makes sense to relax that attitude for someone in the old boys' network because you know they'll help maintain it in the future. So far so good. A really good look into his perspective.
And then he can't help himself. After being so reasonable, he throws shade on Rust. Shade that is, unfortunately, just false:
- "an upstream language community which refuses to make any kind of backwards compatibility guarantees" -> Rust has a stability guarantee since 1.0 in 2015. Any backwards incompatibilities are explicitly opt-in through the edition system, or fixing a compiler bug.
- "which is actively hostile to a second Rust compiler implementation" - except that isn't true? Here's the maintainer on the gccrs project (a second Rust compiler implementation), posting on the official Rust Blog -> "The amount of help we have received from Rust folks is great, and we think gccrs can be an interesting project for a wide range of users." (https://blog.rust-lang.org/2024/11/07/gccrs-an-alternative-c...)
This is par for the course I guess, and it's what exhausts folks like Marcan. I wouldn't want to work with someone like Ted Ts'o, who clearly has a penchant for flame wars and isn't interested in being truthful.
Many discussions online (and offline) suffer from a huge group of people who just can't stop themselves from making their knee-jerk reactions public, and then never giving it further thought.
I remember the "Filesystem in Rust" video (https://www.youtube.com/watch?v=WiPp9YEBV0Q&t=1529s) where there are people who misunderstand what the "plan" is, and argue against being forced to use Rust in the Kernel, while the speaker is literally standing in front of them and saying "no one will be forced to use Rust in the Kernel".
You can literally shove facts in someone's face, and they won't admit to being wrong or having misunderstood, and will instead continue to argue against some points whose premise isn't even true.
I personally don't know how to deal with this either, and tend to just leave/stop responding when it becomes clear people aren't looking to collaborate/learn together, but instead just wanna prove their point somehow and that's the most important part for them.
https://lore.kernel.org/rust-for-linux/20250131135421.GO5556...
> Then I think we need a clear statement from Linus how he will be working. If he is build testing rust or not.
> Without that I don't think the Rust team should be saying "any changes on the C side rests entirely on the Rust side's shoulders".
> It is clearly not the process if Linus is build testing rust and rejecting PRs that fail to build.
For clarity, tree-wide fixes for C in the kernel are automated via Coccinelle. Coccinelle for Rust is constantly unstable and broken, which is why manual fixes are required. Does this help to explain the burden that C developers are facing because of Rust, and how it is in addition to their existing workloads?
Yes actually, I really wish someone would bring that sort of thing to the forefront, because that's a great spot to welcome new contributors.
I'd be suspicious that these guys aren't in it for the long haul, and that they'll leave the Rust they shoved into the kernel to bit-rot if things don't go their way w.r.t. Rust adoption happening fast enough. "If you don't let us add even more Rust, we will resign from the project and leave you to maintain the Rust that's already there - the Rust we added, and that you said you didn't want to add because you didn't trust us not to resign."
Rust 4 Linux people just proving the points of the maintainers scared of abandonment.
The Rust 4 Linux people unfortunately give the impression of caring more about Rust than about the kernel, and many are willing to make that perfectly clear by abandoning the larger project.
The whole thing needs to be scrapped and rethought with better and more committed leadership. The past six months to a year have been embarrassing and have done nothing but confirm the fears of the anti-Rust people.
[1]: https://rust-for-linux.com/rust-kernel-policy
> Who is responsible if a C change breaks a build with Rust enabled?
> The usual kernel policy applies. So, by default, changes should not be introduced if they are known to break the build, including Rust.
> Didn't you promise Rust wouldn't be extra work for maintainers?
> No, we did not. Since the very beginning, we acknowledged the costs and risks a second language introduces.
> However, exceptionally, for Rust, a subsystem may allow to temporarily break Rust code. The intention is to facilitate friendly adoption of Rust in a subsystem without introducing a burden to existing maintainers who may be working on urgent fixes for the C side. The breakage should nevertheless be fixed as soon as possible, ideally before the breakage reaches Linus.
The exception is allowing the subsystem to break Rust code temporarily. If you accept a patch in C that breaks Rust code, and the Rust for Linux team doesn't fix it quickly enough, you either need to fix the Rust code yourself, remove it, or re-write it in C. All of this would take time and energy from all the non-R4L kernel devs.
This is why people are reluctant to accept too much mixing of the C and Rust codebases, because even the Rust for Linux team isn't promising to fix breakages in Rust for Linux code.
Have I gotten that right?
And then you're presenting this situation as "the Rust for Linux team isn't promising to fix breakages in Rust for Linux code". Somewhat disingenuous.
> The Rust maintainers have committed to handling all maintenance of Rust code, and handling all breakage of their code by changes on the C side. The only "burden" the C maintainers have to carry is to CC a couple of extra people on commits when APIs change.
But this is not actually true, it seems. Even the Rust for Linux policy doesn't say this. But because of the incorrect statement that keeps getting repeated, people are calling kernel devs unreasonable for being reluctant to accept Rust patches.
Well, firstly, "randos" aren't getting their patches easily accepted anyway.
And secondly, what's the problem with this? You want one of the following options:
1. Everyone who wants to submit a patch to also be proficient in Rust,
Or
2. The reviewer to also be proficient in Rust.
You don't think that's an unnecessary burden for the existing maintainers?
The burden should be on the people who want to introduce the second language.
What could they possibly "deliver" beyond a strong commitment to fix the code in a timely manner themselves?
Some kernel developers really do feel that any Rust in the kernel will eventually mean that Rust gets accepted as a kernel language, that they will eventually have to support it, and that the only way to prevent this is to stop any Rust development right now.
And yes, there's nothing that the R4L group can offer to get around that belief. There isn't any compromise on this. Either Rust is tried, then spreads, then is accepted, or it's snuffed out right now.
A big mistake by R4L people is seeing anti-Rust arguments as "unfair" and "nontechnical." But it is a highly technical argument about the health of the project (though sometimes wrapped in abusive language). Rust is very scary, and calling out scared people as being unfair is not effective.
There is nothing to deliver that would satisfy this argument. Pretending like the disagreement is about a failure of the R4L folks to do "enough" when in fact there is nothing they could do is toxic behavior.
If you go back digging in the LKML archives, Christoph's initial response to Rust was more of a "let's prove it can be useful first with some drivers"
https://lore.kernel.org/lkml/YOVNJuA0ojmeLvKa@infradead.org/
https://lore.kernel.org/lkml/YOW2auE24e888TBE@infradead.org/
That has now been done. People (particularly Marcan) spent thousands of hours writing complex and highly functional drivers in Rust and proved out the viability, and now the goalposts are being moved.
R4L people are allowed to get upset about people playing Lucy-with-the-football like this and wasting their f***ing time.
They could do exactly what Ted Ts'o suggested in his email [1] that Marcan cited: They could integrate more into the existing kernel-development community, contribute to Linux in general, not just in relation to their pet projects, and over time earn trust that, when they make promises with long time horizons, they can actually keep them. Because, if they can't keep those promises, whoever lets their code into the kernel ends up having to keep their promises for them.
[1] https://lore.kernel.org/lkml/20250208204416.GL1130956@mit.ed...
Many existing maintainers are not "general contributors"
It is unreasonable (and a recipe for long-term project failure) to expect every new contributor to spend years doing work they don't want to do (and are not paid to do) before trusting them to work on the things they do want (and are paid) to do.
Christoph refused to take onboard a new maintainer. The fight from last August was about subsystem devs refusing to document the precise semantics of their C APIs. These are signs of fief-building that would be equally dangerous to the long-term health of the project if Rust was not involved whatsoever.
That's just how programming on teams and trust and teamwork actually works in the real world. Especially on a deadly serious not-hobby project like the kernel.
Sometimes you are gonna have to do work that doesn't excite you. That's life doing professional programming.
Everything Ted Ts'o recommended is just common-sense teamwork 101 stuff, and it's generally good advice for programmers in their careers. The inability of Rust people to follow it will only hurt them and doom their desire to be accepted by larger, more important projects in the long run. Programming on a team is a social affair, and pretending you don't have to play by the rules because you have such great technical leadership is arrogant.
It is absolutely reasonable if the work they want to do is to refactor the entire project.
Sound absurd? Just replace "subsystems" above with C/Rust and the rest is the same.
Folks that maintain Rust are responsible for Rust code; if they don't deliver what is needed, their Rust subsystem will fail, not the C codebase, so it's in their own interest to keep things smooth.
My feeling is that some people think C is the elite language and Rust is just something kids like to play with nowadays; they do not want to learn why some folks like that language or what it is even about.
It's the same kind of discussion as when Linux people hate systemd: they usually have a single argument, that it's against the Unix spirit, and no others, without understanding why other people may like that init system.
No it's not. What you're missing is that if the Rust folks are unable, for whatever reasons, to keep their promises, it falls on the up-tree maintainers to maintain their code. Which, being Rust code, implies that the existing maintainers will have to know Rust. Which they don't. Which makes it very expensive for them to keep those broken promises.
To look at it another way, the existing maintainers probably have a little formula like this in their heads:
Expected(up-tree burden for accepting subsystem X) = Probability(X's advocates can't keep their long-term promises) * Expected(cost of maintaining X for existing up-tree maintainers).
For any subsystem X that's based on Rust, the second term on the right hand side of that equation will be unusually large because the existing up-tree maintainers aren't Rust programmers. Therefore, for any fixed level of burden that up-tree maintainers are willing to accept to take on a new subsystem, they must keep the first term correspondingly small and therefore will require stronger evidence that the subsystem's advocates can keep their promises if that subsystem is based on Rust.
In short, if you're advocating for a Rust subsystem to be included in Linux, you should expect a higher than usual evidence bar to be applied to your promises to soak up any toil generated by the inclusion of your subsystem. It’s completely sensible.
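To put rough numbers on that formula (all of them made up, purely to illustrate why the evidence bar ends up higher for Rust - nothing here is measured):

    fn main() {
        // Purely illustrative inputs.
        let p_broken = 0.2;   // perceived chance the subsystem's advocates can't keep their promises
        let cost_c = 10.0;    // person-weeks for up-tree maintainers to absorb an orphaned C subsystem
        let cost_rust = 50.0; // higher for Rust, since the up-tree maintainers aren't Rust programmers

        let expected_c = p_broken * cost_c;       // 2.0 person-weeks of expected burden
        let expected_rust = p_broken * cost_rust; // 10.0 person-weeks of expected burden

        // For the Rust subsystem to look as cheap as the C one, the perceived
        // probability of broken promises has to shrink by the same factor the
        // cost grew by - hence the demand for much stronger evidence up front.
        let p_needed = expected_c / cost_rust; // 0.04
        println!("{expected_c} {expected_rust} {p_needed}");
    }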
But that's the thing: the deal was that existing maintainers do not need to maintain that code.
Their role is just to forward issues/breaking changes to the Rust maintainers in case those were omitted from CC.
You are using the same argument that has been explained multiple times already in this thread: no one is forcing anybody to learn Rust.
What if, in years to come, the R4L effort peters out? Who will keep their promises then? And what will it cost those people to keep those broken promises?
The existing kernel maintainers mostly believe that the answers to the questions are “we will get stuck with the burden” and “it will be very expensive since we are not Rust programmers.”
Those are all in a similar situation, where there is no one to maintain them, as none of the maintainers have access to such hardware to even test whether it is working correctly.
From time to time such a thing is discovered to have not been working at all for a long time, with nobody noticing, and it is dropped from the kernel.
The same would happen to Rust if no one wanted to maintain it.
Rust for Linux is provided as an experimental thing, and if it doesn't gain traction it will be dropped, in the same way curl dropped it.
I think this sort of statement is what is setting the maintainers against the R4L campaigners.
In casual conversation, campaigners say "No one is being forced to learn Rust". In the official statements (see upthread where I made my previous reply) it's made very clear that the maintainers will be forced to learn Rust.
The official policy trumps any casual statement made while proselytising.
Repeating the casual statement while having a different policy comes across as very dishonest on the part of the campaigners when delivered to the maintainers.
Once it’s been around long enough, it has a much better chance of being merged to main.
The thing is, you do still have to present a light at the end of the tunnel. If, after years of time investment and proven commitment, you're still being fed a bunch of non-technical BS excuses and roadblocks, people are going to start getting real upset.
The general pushback against changes in Linux is against large, impactful changes. They want your code to be small fixes they can fully understand, or drivers that can be excluded from the build system if they start to crash or aren't updated to a new API change.
You can't take a years-maintained external codebase and necessarily convert it to an incremental stream of small patches and optional features for upstream maintainers, unless you knew to impose that sort of restriction on yourself as a downstream maintainer.
How have they "committed"? By saying they commit[1], I presume -- but what more? Anyone can say anything. I think what makes the "old guard" kernel maintainers nervous is the lack of a track record.
And yes, I know that's a kind of lifting-yourself-by-your-bootstraps problem. And no, I don't know of any solution to that. But I do know that, like Baron Münchhausen, you can't ride your high horse around the swamp before you've pulled yourself out of it.
___
[1]: And, as others in this thread have shown, that's apparently just out of one side of their collective mouth: The official "Rust kernel policy" says otherwise.
In the US, "thin blue line" is a colloquialism for police officers, who typically wear blue and "toe the line." You should not be downvoted/shadowbanned/abused for your post, IMHO.
Given the online temper tantrum thrown by Marcan, Ted Ts'o's comment seems totally reasonable, regardless of one's opinion of Rust in the Linux kernel.
You're trying to use Marcan's ragequit to ex post facto justify Ted Ts'o when it's literally the other way around.
> > Ted Ts'o accusing the speaker of wanting to convert people to the "religion promulgated by Rust"
> That seems totally reasonable. Putting aside the technical merits of the Rust language for the moment, the Rust community suffers from many of the same issues currently hobbling the Democratic Party in the United States. Namely, it often acts like a fundamentalist religion where anyone who dares dissent or question something is immediately accused of one or another moral failings. People are sick of this nonsense and are willing to say something about it.
It's really interesting that every time I open a thread like this, countless people come out swinging with the claim that Rust is totally a religion and a cult, while the rest of the thread is full of C evangelism and vague rhetoric about how nothing like this ever works, while actively contributing to making sure it won't work this time either.
In 99% of the insufferable Rust vs. C interactions I've come across, it was the C fella being the asshole. So sorry, but no, not very convincing or "totally reasonable" at all.
This has also been my observation as a C++ developer who finds themselves in a fair few C/C++-aligned spaces. There are exceptions, but in most of those spaces the amount of Rust Derangement Syndrome I've witnessed is honestly kind of tiresome at this point.
Thank you for proving me so right so readily.
Those two things are not mutually exclusive :-)
This is ultimately what this drama comes down to. Not whether Rust should or shouldn't be in the kernel, but kernel maintainers' broken promises and coyness about their intentions until there was no option left but honesty, with the reveal that whatever time and effort a contributor had put in was a waste from the start.
It seems like the folks who didn't want Rust in the kernel will be getting their way in the end, but I had better never hear another complaint about the kernel not being able to attract new talent.
When did he say that?
In any event, that could be true (Rust is Linux's future) while the statement "R4L is not in Linux's future" is also true.
IOW, in principle, I may agree with you on something. That doesn't mean I agree with your specific implementation.
I really wish people would stop throwing around the word "sabotaged". No one "sabotaged" anything. The opposition has been public from the beginning.
If I'm opposed to something, and someone asks my opinion about it in a private conversation, it is not "sabotage" to express my opinion. So far I haven't seen any evidence that those opposed to a mixed-language code base organized behind the scenes to hamper progress in any way. Instead, their opposition has been public and in most cases instant.
Are people not allowed to be opposed to things anymore?
This is an unfortunate typo because your meaning is completely lost.
If your claim is that individuals are being hypocritical then you may have a point. Especially if you can produce examples.
But if you mean community vs community then you have simply bought in to the religious debate which isn’t interesting.
It's just tiresome. And it boiled over here because, no matter how enthusiastic the Rust people can be, their youthful exuberance pales in influence in comparison with the talent and impact of the Linux kernel maintainers. And the resulting tantrum shows it.
You're attributing to "the Rust community" an imaginary offense that did not actually happen that way and couldn't be attributed that way even if it did. And then you make claims about how "the Rust community" is toxic. Right.
In contrast to the parent: yes, the presenter says "you don't have to use Rust, we are not forcing you", but he fails to address the concern that a change they introduce would cause errors downstream and someone else would have to clean up afterwards.
He did not fail to address that concern. And then Ted shouted him down for 2 minutes such that he couldn't get 2 syllables in to respond.
I'm not disagreeing with anything you said, just curious who the "we" you're referring to here, are you a kernel developer or something similar?
Why would we assume that Ted repeatedly using strawman fallacies, bleating appeals to emotion and acting like a victim...all the while shouting people down...evidence of "acting in good faith"?
When you shout over someone like that you're nothing but a bully.
> he fails to address the concern that a change they introduce would cause errors downstream and someone else would have to clean up afterwards.
Because that "concern" was a strawman. It demonstrated that Ted either did not understand what the presenters were asking for, or simply didn't like others asking him to do something, because he's very important and nobody tells him what to do.
As has been exhaustively explained by others in previous HN threads and elsewhere: the Rust developers were asking to be informed of changes so that Rust developers could update their code to accommodate the change.
Ted loses his shit and starts shouting nonsense about others forcing people to learn Rust, and so on.
> but most of all the moderator unable to prevent the situation from exploding
When someone is being abusive to others, the issue is never "the people on the receiving end are not handling it as best they can."
Further: did it occur to you that Ted's infamous short temper, and his "status" as a senior kernel developer, might be why the moderator was hesitating to respond?
Imagine how Ted would have reacted if he had been told to speak respectfully, lower his voice, and stop talking over others. And imagine how the army of nerds who think Ted's behavior was acceptable or understandable would have reacted.
> As has been exhaustively explained by others in previous HN threads and elsewhere: the Rust developers were asking to be informed of changes so that Rust developers could update their code to accommodate the change.
I don't understand why you don't see this as "a really big deal". The C developers make a breaking change. They fix all the C code, then they write an email to the Rust devs explaining the changes.
Then the process of making the change stops, and the C devs have to wait for a Rust dev to read the email, review the C changes, fix and test the resulting rust, and check in the update. (including any review process there is on the rust side.)
Is it hours, days, or weeks? Are there 2 people who know and can fix the code, or are there hundreds? Do the C devs have visibility into the Rust org to know it's being well run and risks are mitigated?
This is adding a hard dependency on a third party organization.
I would never dream of introducing this kind of dependency in my company or code.
A better analogy would be like an API inside of a monolithic app that has multiple consumers on different teams. One team consumes the API and wants to be notified of breaking changes. The other team says "Nah, too much work" and wants to be able to break the API without worrying about consequences.
If having multiple consumers of an API or interface is a goal, you make communication a priority.
Because he has done more for Linux than you ever will. Therefore, he gets all the benefit of the doubt, and you are assumed wrong.
Is this true, though? One reason for this altercation seems to be the basic circumstance that in Linux kernel development, if there is a dependency between two pieces of code A and B, the responsibility to keep B consistent with changes to A lies, in order, with anyone proposing patches to A, the subsystem maintainer for A, and finally the subsystem maintainer for B. If B is Rust code, such as a binding, then that's potentially up to 3 people who don't want to use Rust being forced to use Rust.
The same situation of course also arises between C-only subsystems, but then the natural solution is that you have to go and understand system B well enough yourself that you can make the necessary changes to it and submit them as part of your patch. In that situation you are "forced to use C", but that's a free square because you are always forced to use C to contribute to Linux code.
So if the maintainer of subsystem X can be forced to work with the rust developers of their own subsystem, then that rust developer just got promoted to co-maintainer with veto power. Effectively that's what they'd be, right? I can see why maintainers might not like that. Especially if they don't think the rust dev is enough of a subject matter expert on the subsystem.
That isn't "effective co-maintainership".
It was nothing like making changes that cut across into another developer's C++ code (hell, I would even update their python interfaces/consumers too). That was temporary coordination. The python part was much more frequent and required much more detailed understanding of the internal APIs, not just the surface.
Having someone else responsible for the python part would have come at a huge cost to velocity as the vast majority of my changes would be blocked on their portion. It's ridiculous to imply it's equivalent to coordinating changes with another subsystem.
This is like probably 80% of people and fundamentally why the world is a hellscape instead of a utopia.
The audience member points out that they shouldn't encode the semantics into the Rust type system because that would mean that refactoring the C code breaks Rust, which is not an acceptable situation. The speaker responds to this by saying essentially "tell me what the semantics are and I'll encode them in the Rust type system." That's maximally missing the point.
The proposal would cause large classes of changes to C to break the build, which would dramatically slow down kernel development, even if a small handful of Rust volunteers agree to eventually come in and fix the build.
> You can literally shove facts in someone's face, and they won't admit to being wrong or having misunderstood, and will instead continue to argue against some points whose premise isn't even true.
I have to say that I used to be excited about Rust, but the Rust community seems very toxic to me. I see a lot of anger, aggression, vindictiveness, public drama, etc. On HN you not infrequently see down voting to indicate disagreement. These clashes with the Linux maintainers look really bad for Rust to me. So bad that I'm pretty convinced Rust as a language is over if they're no longer banging on the technical merits and are instead banging on the table.
I'm sure there are great things about the community. But I would encourage the community to have higher standards of behavior if they want to be taken seriously. The Linux team seem like they're trying to look beyond the childishness because they are optimistic about the technical merits, but they must be so tired of the drama.
I had the same impression.
Why all this drama is 90% of the time around Rust people?
Also, for many of them, Rust is the first systems language they've ever touched. And that fact alone excites them. Because now they can "dream big" too.
But they have bought into the idea that C/C++ are insecure by default and therefore garbage. In their minds, no mortal could ever write so much as a single safe function in those languages. So their points of view are always going to be based on that premise.
What they fail to recognize is that an operating system kernel, by virtue of the tasks it has to perform - things like mapping and unmapping memory, reading/writing hardware registers, interacting with peripherals, initiating DMA transfers, context switching, etc. - has effects on the underlying hardware and the runtime environment; effects that neither the type system nor the temporal memory safety of Rust can model, because they happen at a level lower than the language itself. Rust's safety guarantees are helpful, but they are not infallible at that level. The kernel literally changes the machine out from under you.
They further fail to appreciate the significant impedance mismatch between C and Rust. When one language has concepts that are in fact constraints that another language simply does not have, there is going to be friction around the edges. Friction means more work. For everyone. From planning to coding to testing to rollout.
So you have well-intentioned, excited, but still self-righteous developers operating from what they perceive to be a position of superiority, who silently look down upon the C developers, and behave in a manner that (to outsiders at least) demonstrates that they really do believe they're better, even if they don't come right out and say it.
Just read the comments in any thread involving Rust. It is inconceivable to them that anybody would be so stupid or naive as to question the utility of the Rust language. To them, the language is unassailable.
The petty drama and social media brigading on top of it, along with the propensity to quit when the going gets tough, it's pretty easy to see why some people feel the way they do about the whole thing.
A programming language is not a religion. It is not a way of life. It is a tool. It's not like it's a text editor or something.
I really hope that last sentence was a joke.
No one thinks this except some strawman that you've devised. No point in reading anything else in this comment when this is so blatantly absurd and detached from reality.
All you have to do is read comments from members of the Rust community online, in every public forum where Rust is discussed in any way.
Understand, I am not trying to villainize an entire community of software developers; but for you to say something that's blatantly false is to just stick your head in the sand.
You should try and read the words people write. Opinions are not formed in a vacuum.
Edit: to be clear- I have no problems with Rust the language beyond some ergonomic concerns. I am not a Rust hater, nor am I a zealot. I do advocate for C# a lot for application code though. But I do not deride others' language preferences. You should not dismiss my observations because I used hyperbole. Obviously not every Rust dev thinks you can't write a secure C/C++ function; don't pick out the one hyperbolic statement to discredit my entire post. Bad form.
A Google search is all it takes to dismiss your claim of 90%, but you don't want to do it because you only believe what you want to believe.
You have to encode your API semantics somewhere.
Either you encode them in the type system and find out at compile time, or you encode them at runtime and find out when it crashes (or worse, fails silently).
There is more breakage in Rust due to the type-system-encoded semantics, but ideally a C dev would also want their system to break if the semantics aren't right. So is this a criticism of C..?
So following this argument, they don't want Rust because C falls short? Nonsense.
edit: The speaker did mention that they didn't want to force limited use on the base APIs, but that for a great deal of their usage they could have determined fixed semantics and made intermediary APIs for them. So this was not about limiting the basic APIs.
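For anyone who hasn't seen the pattern, here's a minimal sketch of what "encode it in the type system" can look like. All the names are made up for illustration; this is not the actual kernel binding code:

    use std::marker::PhantomData;

    // Runtime-checked style: misuse is only discovered when the code runs.
    struct Device { initialized: bool }

    impl Device {
        fn read_checked(&self) -> Result<u8, &'static str> {
            if !self.initialized { return Err("read before init"); }
            Ok(0)
        }
    }

    // Type-encoded style: the "init before read" rule is part of the API's
    // types, so misuse fails to compile instead of failing at runtime.
    struct Uninit;
    struct Ready;

    struct TypedDevice<State> { _state: PhantomData<State> }

    impl TypedDevice<Uninit> {
        fn new() -> Self { TypedDevice { _state: PhantomData } }
        fn init(self) -> TypedDevice<Ready> { TypedDevice { _state: PhantomData } }
    }

    impl TypedDevice<Ready> {
        fn read(&self) -> u8 { 0 }
    }

    fn main() {
        let dev = TypedDevice::<Uninit>::new().init();
        let _ = dev.read();
        // TypedDevice::<Uninit>::new().read(); // would not compile: `read` exists only in the Ready state

        let legacy = Device { initialized: false };
        let _ = legacy.read_checked(); // compiles fine; the mistake only shows up at runtime
    }

The flip side, and the concern raised earlier in the thread, is that when the underlying C semantics change, the typed version stops compiling until someone updates it.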
- (1) the C code will be refactored periodically
- (2) when refactored internally it can break C code, but the change author should fix any breaking in C
- (3) Rust must not break when (1) happens
It's the Rust devs' job to meet those requirements if they want to contribute. It looks in the video like they don't understand this, which is pretty basic.
I think that's part of the gag.
"These people are members of a community who care about where they live... So what I hear is people caring very loudly at me." -- Leslie Knope
that's a very healthy and - I feel - correct attitude towards this kind of criticism. I love when wisdom comes from stupid places.
Satisfied customers will tell you they think your stuff is great, but dissatisfied customers will be able to home in on exactly where the problem is.
You can even extend this to personal life: if someone tells you your shabby car doesn't fit with the nice suits you wear, you can either take it as a personal attack and get irritated, or take it as feedback and wash your car, spruce up the upholstery and replace the missing wheel cap. In effect they helped you take note of something.
Yes, this is a criticism. Hopefully it's twice as effective as being nice. 8)
When "literally" is used in a figurative way, it's an intensifier. It means "very much". It never means "figuratively".
By the way, I don't mind the nit at all! English is not my first language and I slip up occasionally, so refreshers are welcome :-)
This would be doubly ironic if you're a native English speaker. Are you?
This sounds like a truism, when it isn't. The client may know something is wrong, but good luck having them identify it. Sometimes the client will convince themselves that something is wrong when it isn't. There were people complaining about lag in WoW; the developers responded by cutting the latency number in half... except that it wasn't cut in half, it was just measured as time to the server rather than the round trip. The complaints died out immediately and they were hailed as "very savvy developers that listen to their customers".
It's called a strawman fallacy, and like all fallacies, it's used because the user is either intellectually lazy and can't be bothered to come up with a proper argument, or there isn't a proper argument and the person they're using it against is right.
An honest "no one will be forced to use Rust in the Kernel" would be exactly what it says. A paltering reading could be "we want to make Rust the only language used in the Kernel but you won't be forced to use it because you can quit". i.e. if you are "literally shoving facts in someone's face" and they don't change then they might think you are not telling the whole truth, or are simply lying about your goals.
Unfortunately OP has a valid point regarding Rust's lack of commitment to backwards compatibility. Rust has a number of things that can break you that are not considered breaking changes. For example, implementing a trait (like Drop) on a type is a breaking change[1] that Rust does not consider to be breaking.
[1]: https://users.rust-lang.org/t/til-removing-an-explicit-drop-...
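For anyone wondering how merely adding a Drop impl can break downstream code, here is one well-known case, sketched with made-up names (the linked thread discusses this class of problem):

    struct Guard {
        name: String,
    }

    // Downstream code that moves a field out of the struct:
    fn take_name(g: Guard) -> String {
        g.name // fine today
    }

    // If the upstream crate later adds the impl below, take_name stops
    // compiling with E0509 ("cannot move out of a type which implements the
    // `Drop` trait"), even though no breaking change was declared:
    //
    // impl Drop for Guard {
    //     fn drop(&mut self) {
    //         println!("dropping {}", self.name);
    //     }
    // }

    fn main() {
        let g = Guard { name: String::from("example") };
        println!("{}", take_name(g));
    }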
I've read and re-read this several times now and for the life of me I can't understand the hair you're trying to split here. The only reason to do semantic versioning is compatibility...
I'm not aware of semver breakage in the language.
Another important aspect is that Semver is a social contract, not a mechanical guarantee. The Semver spec dedicates a lot of place to clarify that it's about documented APIs and behaviors, not all visible behavior. Rust has a page where it documents its guarantees for libraries [0].
Although there are mechanical aids for it: https://crates.io/crates/cargo-semver-checks
> The only reason to do semantic versioning is compatibility
Sure. But "compatibility" needs to be defined precisely. The definition used by the Rust crate ecosystem might be slightly looser than others, but I think it's disingenuous to pretend that other ecosystems don't have footnotes on what "breaking change" means.
Compatibility is defined precisely! Your definition requires scare quotes. You want to define it "precisely" so that you can permit incompatible behavior. No one who cares about compatibility does that; it's just an excuse.
Look, other languages do this differently. Those of us using C99 booleans know we need to include a separate header to avoid colliding with the use of "bool" in pre-existing code, etc... And it sort of sucks, but it's a solved problem. I can build K&R code from 1979 on clang. Rust ignored the issue, steamrollered legacy code, and tried to sweep it under the rug with nonsense like this.
[1]: https://github.com/rust-lang/rust/issues/127343#issuecomment...
The way you word that makes it sound like "the maintainers" and "T-libs-api" do not consider this "okay". Reading just above the linked comment, however, gives a very different impression of the situation:
> We discussed this regression in today's @rust-lang/libs-api team meeting, and agree there's nothing to change on Rust's end. Those repos that have an old version of time in a lockfile will need to update that.
TBH that does not inspire confidence. I would expect that something claiming or aspiring to exhibit good engineering design would, as a matter of principle, avoid any breaking change of any magnitude in updates that are not intended to include breaking changes.
Not sure why people are trying to cover this up.
The change was useful, fixing an inconsistency in a commonly used type. The downside was that it broke code in 1 package out of 100,000, and only broke a bit of useless code that was accidentally left in and didn't do anything. One package just needed to delete 6 characters.
Once the new version of Rust was released, they couldn't revert it without risk of breaking new code that may have started relying on the new behavior, so it was reasonable to stick with the one known problem than potentially introduce a bunch of new ones.
Yes: adding a trait impl to an existing type can cause inference failures. The Into fallback - calling a.into() and getting back an a of the same type - is particularly prone to it, and I've been working on a lint for it.
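If it helps to picture it, this made-up snippet shows the kind of call I mean: an .into() whose receiver and target are the same type, which the lint would flag as removable:

    fn greet(name: String) {
        println!("hello, {name}");
    }

    fn main() {
        let name = String::from("world");
        // This `.into()` converts a String into... a String, via the blanket
        // identity impl. It does nothing useful, and (in less constrained
        // contexts than this one) calls like it are where newly added trait
        // impls can later introduce inference ambiguity.
        greet(name.into());
        // The lint would suggest the equivalent: greet(name);
    }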
It's about the overall stability and "contract" of the tooling/platform, not what the tooling can control under it. A great example was already given: It took clang 10 years to be "accepted."
It has nothing to do with the language or its overall characteristics, it's about stability.
There was an effort to write such a blog post. I pushed for it. Due to personal reasons (between being offline for a month and then quitting my job) I didn't have the bandwidth to follow up on it. It's on my plate.
> The top comment in the thread I point to basically thinks this is nothing.
I'm in that thread. There are tons of comments by members of the project in that thread making your case.
> It is probably too late do anything for this specific issue but it would be good to explain and highlight even minor compatibility issues through the official channel.
I've been working on a lint to preclude this specific kind of issue from ever happening again (by removing .into() calls that resolve to its receiver's type). I customized the diagnostic to tell people exactly what the solution is. Both of these things should have been in place before stabilization at the very least. That was a fuck up.
> This will give people more confidence.
Agreed.
Bugs happen, CI/CD pipelines are imperfect, we could always use more lint rules …
But there’s value in keeping the abstract language definition independent of any particular implementation.
Semver, or any compatibility scheme, really, is going to have to obey this:
> it is important that this API be clear and precise
—SemVer
Any detectable change being considered breaking is just Hyrum's Law.
(I don't want to speak to this particular instance. It may well be that "I don't feel that this is adequately documented or well-known that Drop isn't considered part of the API" is valid, or arguments that it should be, etc.)
Linux breaks internal compatibility far more often than people add or remove Drop implementations from types. There is no stability guarantee for anything other than user-mode ABI.
[0] AFAIK there is code that actually does this, but it's stuff like gc_arena using this in its derive macro to forbid you from putting Drop directly on garbage-collectable types.
And yet the operating mantra...the single policy that trumps all others in Linux kernel development...
is don't break user space.
GCC also occasionally breaks compatibility with the kernel, btw.
I think that's missing the point of the context though. When Linux breaks internal compatibility, that is something the maintainers have control over and can choose not to do. When it happens to the underlying infrastructure the kernel depends on, they don't have a choice in the matter.
This is the equivalent of a C user saying "I'm disappointed that replacing a function with a macro is a breaking change".
Rust has had actual changes that broke people's code. For example, any ambiguity in type inference is deliberately an error, because Rust doesn't want to silently change the meaning of users' code. At the same time, Rust doesn't promise it will never create a type inference ambiguity, because that would make almost any change to traits in the standard library impossible. It's a problem that happens rarely in practice, can be reliably detected, and is easy to fix when it happens, so Rust chose to exclude it from the stability promise. They've usually handled it well, except that recently they miscalculated - "only one package needed to change code, and they've already released a fix" - but forgot to give users enough time to update that package first.
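A tiny illustration of the "ambiguity is an error, and the fix is trivial" point (nothing kernel-specific, just the general mechanism):

    fn main() {
        let nums = [1, 2, 3];

        // Without an annotation this is rejected with "type annotations needed":
        // many containers implement FromIterator, and Rust refuses to guess.
        // let doubled = nums.iter().map(|n| n * 2).collect();

        // Pinning the type resolves the ambiguity; this is typically the whole
        // fix when a new standard-library impl introduces a fresh candidate.
        let doubled: Vec<i32> = nums.iter().map(|n| n * 2).collect();
        assert_eq!(doubled, vec![2, 4, 6]);
    }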
First is if you compile with `-Werror -Wall` or similar; new compiler diagnostics can result in a build failing. That's easy enough to work around.
Second, nearly any decent-sized C program has undefined behavior, and new compilers may change their handling of undefined behavior. (E.g., they may add new optimizations that detect and exploit undefined behavior that was previously benign.) See, e.g., this post by cryptologist Daniel J. Bernstein: https://groups.google.com/g/boring-crypto/c/48qa1kWignU/m/o8...
Here's one example of workarounds in ~100 packages that broke when upgrading to GCC 10: https://github.com/search?q=repo%3ANixOS%2Fnixpkgs%20fcommon...
Every language has breaking changes. The question is the frequency, not if it happens at all.
The C and C++ folks try very hard to minimize breakage, and so do the Rust folks. Rust is far closer to those two than other languages. I'm not willing to say that it's the same, because I do not know how to quantify it.
Rust 2015 can still evolve (either by language changes or by std/core changes) and packages can be broken by simply upgrading the compiler version even if they're still targeting Rust 2015. There's a whole RFC[2] on what is and isn't considered a breaking change.
[1]: https://gcc.godbolt.org/z/5jb1hMbrx
[2]: https://rust-lang.github.io/rfcs/1105-api-evolution.html
That's not what backwards compatibility means in this context. You're talking about how a compiler is backwards compatible. We're talking about the language itself, and upgrading from one versions of the language to the next.
Rust 2015 is not the same thing as C89, that is true.
> packages can be broken by simply upgrading the compiler version
This is theoretically true, but in practice, this rarely happens. Take the certainly-a-huge-mistake time issue discussed above. I actually got hit by that one, and it took me like ten minutes to even realize that it was the compiler's fault, because upgrading is generally so hassle free. The fix was also about five minutes worth of work. Yes, they should do better, but I find Rust upgrades to be the smoothest of any ecosystem I've ever dealt with, including C and C++ compilers.
> You're talking about how a compiler is backwards compatible. We're talking about the language itself, and upgrading from one versions of the language to the next.
That's part of the problem. Rust doesn't have a spec. The compiler is the spec. So I don't think we can separate the two in a meaningful way.
> So I don't think we can separate the two in a meaningful way.
I think that in that case, you'd compare like with like, upgrading both.
I do agree that gcc and clang supporting older specs with a flag is a great feature, and is something that Rust cannot do right now.
But the results of the annual survey have come out: https://blog.rust-lang.org/2025/02/13/2024-State-Of-Rust-Sur...
And 90% of users use the current stable version for development. 7.8% use a specific stable version released within the past year.
These numbers are only so high because it is such a small hassle to update even large Rust codebases between releases.
So yes, in theory, breakage can happen. But that's in theory. In practice, this isn't a thing that happens very much.
Decades-old codebases tend to work because the toolchain explicitly hard-codes support for the ways they make assumptions not provided by any standard.
While GCC with a few basic flags will, in general, produce a binary that cooperates with the kernel, kbuild loads all those flags for a reason.
Superset? Or subset? I'd have guessed the latter.
So it's a superset in terms of what's defined
Since Rust doesn't have a standard, the guarantee is "whatever the current version of the compiler can compile". To check if they broke anything they compile everything on crates.io (called a crater run).
But if you check the results of crater runs, almost every release some crates that compiled with the previous version stop compiling with the new version. But as long as the number of such breakages is not too large, they say "nothing is broken" and push the release.
And as others have noted, C99 is a standard and Rust lacks one.
That's an impossible standard to hold Rust to, did you mean it the other way around? A C89 compiler can't compile all of C99 either.
Historically the Rust community has been extremely hostile towards gccrs. Many have claimed that the work would be detrimental to Rust as a language since it would split the language in two (despite gccrs constantly claiming they're not trying to do that). I'm not sure if it was an opinion shared by the core team, but if you just browse Reddit and Twitter you would immediately see a bunch of people being outright hostile towards gccrs. I was very happy to see that blog post where the Rust leadership stepped up to endorse it properly.
Just one reference: In one of the monthly updates that got posted on Reddit (https://old.reddit.com/r/rust/comments/1g1343h/an_update_on_...) a moderator had to write this:
> Hi folks, because threads on gccrs have gotten detailed in the past, a reminder to please adhere to the subreddit rules by keeping criticism constructive and keeping things in perspective.
> opposed to the idea of multiple implementations, which is plainly false, as evidenced by the link to the official blog post celebrating gccrs. Ted Ts'o is speaking from ignorance here.
Why use so strong words? Yes, there's clearly a misunderstanding here, but why do we need to use equally negative words towards them? Isn't it more interesting to discuss why they have this impression? Maybe there's something with the communication from the upstream language developers which hasn't been clear enough? It's a blog post which is a few months old so if that's the only signal it's maybe not so strange that they've missed it?
Or maybe they are just actively lying because they have their own agenda. But I don't see how this kind of communication, assuming the worst of the other party, brings us any closer.
I'm not going to mince words here. Ted Ts'o should know better than to make these sorts of claims. Regardless of where he got the impression, his confident assertion is trivially refutable, it's not the job of the Rust project to police whatever incorrect source he's been reading, and they have demonstrably been supportive of the idea of multiple implementations. This wouldn't even be the first alternative compiler! Several Rust compiler contributors have their own compilers that they work on.
The kernel community should demand better from someone in such a position of utmost prominence.
Reddit... is reddit.
Here's guessing they meant "derailed".
I think it's important to be wary of simplistic narratives (such as "C vs Rust"). Maintaining a complex piece of software comes with tradeoffs and compromises, and the fewer languages you have to worry about the better. On the other hand, the Asahi Linux team have been quite explicit that without Rust, they wouldn't have achieved a fraction of what they have. So clearly there is a lot of value in RfL for Linux as a whole, if implemented well. And that value is reflected in the decision from Linus that RfL should be supported, at least for now.
This might be true, but do you have any actual quantifiable evidence for it? Because FWIW, from what I as an outsider see (mainly in threads like this), all the drama looks very much like "we are Rust users who want our code to be in Linux".
See, for example:
https://xcancel.com/linaasahi/status/1577667445719912450?s=4... https://vt.social/@lina/113056457969145576 https://asahilinux.org/2022/11/tales-of-the-m1-gpu/
In fairness, this is one team working on one project, but if they're attributing much of their success to Rust, it's probably worth listening to and understanding why, particularly as I don't believe they were particularly evangelistic about Rust before this project.
I have no idea about the Google funding, but Marcan's blog post is very explicit that they do not have any corporate sponsorship. If you believe that to be untrue, please explain your reasoning rather than spreading unsubstantiated rumours.
At some point there was a brief discussion about C++ in the kernel, and Linus killed it essentially immediately. And he was right to do so.
Bad example IMO. What is reasonable about this? http://radicalcartography.net/bayarea.html
This is a very persistent myth, but it’s wrong. Adding any public method to any impl can break BC (because its name might conflict with a user-defined method in a trait), and the Rust project adds methods to standard library impls all the time.
This is a rare situation, and std strives to prevent it. For example, in [1], a certain trait method was called extend_one instead of push for this reason. Crater runs are also used to make sure the breakage is as rare as T-libs-api expects. The Linux kernel in particular only uses core and not std, which makes this even more unlikely.
https://play.rust-lang.org/?version=stable&mode=debug&editio...
What I really meant is the case where a method is added to a standard struct impl that conflicts with a user-defined trait.
For example, you might have implemented some trait OptionExt on Option with a method called foo. If a method called foo is then added to Option in the standard library, it will conflict.
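A minimal sketch of that failure mode (the trait and method names here are made up for illustration):

    // Hypothetical extension trait; `foo` is an invented name.
    trait OptionExt<T> {
        fn foo(&self) -> bool;
    }

    impl<T> OptionExt<T> for Option<T> {
        fn foo(&self) -> bool {
            self.is_some()
        }
    }

    fn main() {
        let x = Some(1);
        // Today this resolves to OptionExt::foo. If a later std release added
        // an inherent Option::foo, inherent methods take priority, so this
        // call could change meaning (or trip a future-compatibility lint).
        println!("{}", x.foo());
    }

(Roughly speaking, inherent methods win over trait methods, so the call can quietly change meaning; this is exactly the kind of thing crater runs are meant to catch.)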
Which is hilarious since Linux itself was actively hostile to the idea of a second C compiler supporting it. Just getting Linux to support Clang instead of only GCC was a monumental task that almost certainly only happened because Android forced it to happen.
Rust, despite having Linus' blessing to be in the kernel, is still getting rejected just because it's Rust, completely unrelated to any technical merits of the code itself.
Directly, "the thin blue line" expresses the idea that the police are what separates society from chaos.
It doesn't inherently suggest police are justified in acting outside the law themselves, though, of course, various people have suggested this (interestingly, from both a pro-police and anti-police perspective).
It seems obvious to me that the post was using this phrase in the sense of being a thin shield from chaos.
Rowan Atkinson had a sitcom set in a London police station in the 90s called "The Thin Blue Line". Are you under the impression he was dogwhistling about extrajudicial violence?
I'd never heard of the extrajudicial punishment aspect of the phrase (though I had heard the phrase itself) and it didn't show up when I googled, but I'm not American, so maybe there's some cultural differences.
[0]: https://knowyourmeme.com/memes/punisher-skull
All in all, this could just be another instance of the "culture war" inflaming every other minor disagreement with Ted playfully using the phrase and Marcan misinterpreting it. Or it could be Ted slipping up with their politics. From what I know about Marcan and what can be inferred from his post, they do seem like someone the alt-right would persecute.
I had a look, and it seems that Ted Ts'o is American, so I guess we should assume he understands the cultural significance of the phrase (even though I didn't).
A more charitable interpretation would be “we’re the only line of defense protecting something good and valuable from the outside world, so people should give significant weight to our opinions and decisions”. Which, to be clear, I would still mostly disagree with WRT the police, but it at least doesn’t explicitly endorse corruption.
It's perfectly reasonable to assume he was aware of the implications of his words and chose to use them anyway.
And by the way, so does Wikipedia: https://en.wikipedia.org/wiki/Thin_blue_line doesn't mention this interpretation at all. The closest thing is this sentence, which is really not saying the same thing at all, and at any rate only presenting it as something "critics argue", rather than the settled meaning of the phrase.
> Critics argue that the "thin blue line" represents an "us versus them" mindset that heightens tensions between officers and citizens and negatively influences police-community interactions by setting police apart from society at large.
Well, now you have!
The most charitable interpretation I can imagine is that the Rust-in-Linux project needs specific nightly features, and those don't get stability guarantees. But I think this is still pretty unfair to complain about; my impression is there's a lot of appetite on the Rust side to get those stabilized.
I also think...
> we know, through very bitter experience, that 95+% of the time, once the code is accepted, the engineers which contribute the code will disappear, never to be seen again.
...that while there's truth in this, there's also a large extent to which it's a self-fulfilling prophecy. Someone might want to stick it out to get their work into mainline once, but then look back at the process once it's behind them and say never again.
...and:
> Instead of complaining about maintainers for who are unreasonably caring about these things, when they are desparately under-resourced to do as good of a job as they industry demands, how about meeting us half-way and helping us with these sort of long-term code health issues?
It's really hard for me to not see "let's actually write down the contract for these functions, ideally via the type system" as doing exactly that. Which seems to me to be the central idea Ted Ts'o was ranting about in that infamous video.
Very strange to see little to no empathy for kernel maintainers in this situation.
Saying people "compulsively downvote" the stuff above is already a strong claim that you have no way to substantiate. I think more broadly what you're claiming is that the people downvoting you and anonfordays are emotional and doing so out of political zealotry, and... again, that's a pretty strong claim.
People can downvote a post not because they strongly disagree with its claims, but because they strongly dislike its inflammatory tone ("fragile entryist", "Marcan and others like him leave long paths of destruction in their wake", etc).
People who strongly disagree with a post don't necessarily believe the exact opposite of its claims. They can disagree with some of the claims and agree with others, or disagree with the very framing of the post.
If I say "we should outlaw all guns because gun crimes are awful" and you disagree, that doesn't mean you think gun crimes are great.
The community is more than just the language and compiler vendor(s). It's everyone using the language, with particular emphasis on the developers of essential libraries and tools that those users use and on which they're reliant.
In this sense, based on every time I've attempted to use Rust (even after 1.0), Ts'o's remark ain't inaccurate from what I can tell. If I had a nickel for every Rust library I've seen that claims to only support Rust Nightly, I'd have... well, a lot of nickels. Same with Rust libraries not caring much about backward-compatibility; like yeah, I get it during pre-1.0, or while hardly anyone's using it, but at some point people are using it and you are signaling that your library's "released", and compatibility-breaking changes after that point make things painful for downstream users.
> Here's the maintainer on the gccrs project (a second Rust compiler implementation), posting on the official Rust Blog
Same deal here. The Rust developers might be welcoming of additional implementations, but the broader community might not be. I don't have enough information to assess whether the Rust community is "actively hostile" to a GCC-based Rust implementation, but from what I can tell there's little enthusiasm about it; the mainstream assumption seems to be that "Rust" and its LLVM-based reference compiler are one and the same. Maybe (hopefully) that'll change.
----
The bigger irony here, in any case, is that the Linux community has both of these very same problems:
- While the kernel itself has strict backwards-compatibility guarantees for applications, the libraries those applications use (including absolutely critical ones like glibc) very much do not. The ha-ha-only-serious observation in the Linux gaming community is that - thanks to Wine/Proton - the Windows API is the most stable ABI for Linux applications. Yeah, a lot of these issues are addressable with containerization, or by static compilation, but it's annoying that either are necessary for Linux-native applications to work on old and new distros alike.
- As marcan alludes to in the article, the Linux community is at least antipathetic (if not "actively hostile") to Linux-compatible kernels that are not Linux, be they forks of Linux (like Android) or independent projects that support running Linux applications (WSL 1/2, FreeBSD, some illumos distros, etc.). The expectation is that things be upstreamed into "the" Linux, and the norms around Linux development make out-of-tree modules less-than-practical. This is of course for good reason (namely: to encourage developers to contribute back to upstream Linux instead of working in silos), but it has its downsides - as marcan experienced firsthand.
Marcan also linked to this resignation of a Rust Maintainer:
https://lore.kernel.org/lkml/20240828211117.9422-1-wedsonaf@...
which references this fantastic exchange:
https://www.youtube.com/watch?v=WiPp9YEBV0Q&t=1529s
I am not a C person, or a kernel level person, I just watch this from the sideline to learn something every now and then (and for the drama). But this exchange is really stunning to me. It seems so blatantly obvious to me that systematically documenting (in code!) and automatically checking semantic information that is required to correctly use an API is a massive win. But I have encountered this type of resistance (by very smart developers building large systems) in my own much smaller and more trivial context. To some degree, the approach seems to be: "If I never write down what I mean precisely, I won't have to explain why I changed things." A more charitable reading of the resistance is: Adding a new place where the semantics are written down (code, documentation and now type system) gives one more way in which they can be out of sync or subtly inconsistent or overly restrictive.
But yeah, my intuitive reaction to the snippet above is just incredulity at the extreme resistance to precisely encoding your assumptions.
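To make the "encode the contract in the type system" point concrete, here's a hedged sketch (all names invented) of the kind of thing being argued about: moving a "you must call X before Y" rule out of a comment and into the types, so the compiler enforces it:

    use std::marker::PhantomData;

    // Invented names for illustration: a device must be powered on before reads.
    struct PoweredOff;
    struct PoweredOn;

    struct Device<State> {
        _state: PhantomData<State>,
    }

    impl Device<PoweredOff> {
        fn new() -> Self {
            Device { _state: PhantomData }
        }
        // Consuming `self` means the powered-off handle can't be reused.
        fn power_on(self) -> Device<PoweredOn> {
            Device { _state: PhantomData }
        }
    }

    impl Device<PoweredOn> {
        // `read` only exists in the powered-on state, so "power on before
        // reading" is checked by the compiler rather than by a doc comment.
        fn read(&self) -> u8 {
            42
        }
    }

    fn main() {
        let dev: Device<PoweredOff> = Device::new();
        // dev.read(); // would not compile: no `read` on Device<PoweredOff>
        let dev = dev.power_on();
        println!("{}", dev.read());
    }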
At least that's my understanding from the outside, someone please do correct me if wrong.
Rust developers were saying it would be their job to do this. But then someone said Linus rejected something because it broke Rust. GKH backed the Rust developers and said that was an exception not a rule, but didn't know Linus' stance for sure.
Then Linus chimed in because of one of Hector's replies, but as of my reading he had not clarified what his actual stance here is.
Whatever he says is guaranteed to piss off at least one side of the argument.
At the rate we're going here the existing kernel devs will alienate any capable new blood, and Linux will eventually become Google Linux(TM) as the old guard goes into retirement and the only possible way forward is through money.
So there will presumably be fewer and fewer programmers, young or old, that want to work in C.
C is one of the most entrenched and still-important languages in the world, so it probably has more staying power than Fortran, COBOL, etc. So the timeline is anybody's guess, but the trajectory is pretty clear.
There are a lot of languages that people prefer to C which aren't well-suited to OS programming (golang, Java) but Rust is one that can do the same job as C, and is increasingly popular, and famously well-loved by its users.
There's no guarantee that Rust will work out for Linux. Looks unlikely, to me, actually. But I think it's pretty clear that Linux will face a dwindling talent pool if the nebulous powers that actually control it collectively reject everything that is not C.
If I've interpreted it correctly (and probably not, given the arguments), Linus won't accept merge requests if they break the Rust code, so the maintainer would need to reach out to the Rust for Linux team (or someone else) to fix it if they didn't want to themselves.
And some lead maintainers don't want to have to do that, so said no Rust in their subsystem.
https://lore.kernel.org/rust-for-linux/20250131135421.GO5556...
> Then I think we need a clear statement from Linus how he will be working. If he is build testing rust or not.
> Without that I don't think the Rust team should be saying "any changes on the C side rests entirely on the Rust side's shoulders".
> It is clearly not the process if Linus is build testing rust and rejecting PRs that fail to build.
For clarity, tree-wide fixes for C in the kernel are automated via Coccinelle. Coccinelle for Rust is constantly unstable and broken which is why manual fixes are required. Does this help to explain the burden that C developers are facing because of Rust and how it is in addition to their existing workloads?
Yep, thanks!
So the argument that even semantics encoded in the Rust types can be out of date compared to the actual code is actually a real thing? I read that somewhere else here in the comments, but didn't understand how the types could ever be out of date; this would explain that argument.
I may be wrong, but that's how I understood it; who knows how Linus will handle any given situation. ¯\_(ツ)_/¯
Conversely some codepath might use * but that is not in the interface, so your generic code works for numbers but fails for other types that should work.
if you really need a number, why not use a type specifically aligned to that (something like f32|f64|i32|i64 etc...) instead of relying on + operator definition?
> Conversely some codepath might use * but that is not in the interface, so your generic code works for numbers but fails for other types that should work.
do we agree that if it's not in the interface you are not supposed to use it? conversely if you want to use it, the interface has to be extended?
For the first case you have it the wrong way around. My generic code would work on things that are not numbers but I prevent you from calling it because I didn't anticipate that there would be things you can add that are not numbers. (Better example: require an array when you really only need an iterable).
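A small sketch of that over-constraining trap, assuming Rust-style generics (the function names are made up):

    // Over-constrained: demands a slice even though it only ever iterates.
    fn sum_restrictive(values: &[i32]) -> i32 {
        values.iter().sum()
    }

    // Looser bound: anything iterable works, no collecting into an array first.
    fn sum_flexible<I: IntoIterator<Item = i32>>(values: I) -> i32 {
        values.into_iter().sum()
    }

    fn main() {
        let v = vec![1, 2, 3];
        assert_eq!(sum_restrictive(&v), 6);
        assert_eq!(sum_flexible(v), 6);
        // An iterator works directly here; the slice version can't accept this.
        assert_eq!(sum_flexible(1..=3), 6);
    }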
You have a choice between code that statically asserts all assumptions in the type system but doesn't exist, is slow, or a pain to work with, and code that is beautiful, obvious, performant, but does contain the occasional bug.
I am not against static safety, but there are trade offs. And types are often not the best way to achieve static safety.
That’s a sort of weird statement to make without reference to any particular programming language. Types are an amazing way to achieve static safety.
The question of how much safety you can reasonably achieve using types varies wildly between languages. C’s types are pretty useless for lots of reasons - like the fact that all C pointers are nullable. But moving from C to C++ to Rust to Haskell to Ada gives you ever more compile-time expressivity. That type expressivity directly translates into reduced bug density. I’ve been writing Rust for years, and I’m still blown away by how often my code works correctly the first time I run it. Yesterday the TypeScript compiler (technically esbuild) caught an infinite loop in my code at compile time. Wow!
I’d agree that every language has a sweet spot. Most languages let you do backflips in your code to get a little more control at compile time at the expense of readability. For example, C has an endless list of obscure __compiler_directives that do all sorts of things. Rust has types like NonZeroUsize - which seem like a good idea until you try it out. It’s a good idea, but the ergonomics are horrible.
But types can - and will - take you incredibly far. Structs are a large part of what separates C from assembler. And types are what separates rust from C. Like sum types. Just amazing.
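For what it's worth, here's roughly what the NonZeroUsize trade-off looks like in practice (the function names are made up): the invariant is enforced for free inside the function, but every caller pays for it.

    use std::num::NonZeroUsize;

    // The invariant "chunk size is never zero" lives in the type, so the body
    // needs no divide-by-zero check. (Names are invented for illustration.)
    fn chunks_of(len: usize, chunk: NonZeroUsize) -> usize {
        len / chunk.get()
    }

    fn main() {
        // The ergonomic tax: you can't just write `chunks_of(10, 3)`; every
        // call site has to construct the type and handle the zero case.
        let chunk = NonZeroUsize::new(3).expect("chunk size must be non-zero");
        assert_eq!(chunks_of(10, chunk), 3);
    }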
> [..]Attempting to do so results in code that is harder to read and write.
> You have a choice between code that statically asserts all assumptions in the type system but doesn't exist, is slow, or a pain to work with, and code that is beautiful, obvious, performant, but does contain the occasional bug.
I don't think you are expressing objective truth, this is all rather subjective. I find code that encodes many assumptions in the type system beautiful and obvious. In part this is due to familiarity, of course something like this will seem inscrutable to someone who doesn't know Rust, in the same way that C looks inscrutable to someone who doesn't know any programming.
Compared to, say, dependent type systems, Rust really isn't that far along. The Linux kernel has lots of static analyzers, and then auxiliary typedefs, Sparse, and sanitizers cover a significant area of checks in an ad-hoc way. All Rust does is formalize them and bring them together.
And getting Rust into the kernel slowly, subsystem by subsystem, means that the formalization process doesn't have to be disruptive and all-or-nothing.
I don't follow. The one with zero leverage is the contributor, no? They have to beg and plead with the maintainers to get anything done. Whereas the maintainers can yank code out at any time, at least before when the code makes it into an official stable release. (Which they can control - if they're not sure, they can disable the code to delay the release as long as they want.)
Here is the only thing that matters in the end (I learned this the even harder way: I really approached things the way the R4L people do, and was bitten by counter-examples left, right, and center): the Linux kernel has to work. This is even more important than knowing why it works. There is gray area, and you only move forward by rejecting anything that doesn’t have at least ten years of this kind of backwards-compatible commitment. All of it. Wholesale. (And yes, this blatantly and callously disregards many good efforts, and it sounds like the tenuous and entitled claim "not good enough".)
But it’s the only thing that has a good chance of working.
Saying that gravity is a thing is not the same attitude as liking that everyone is subject to gravity. But hoping that gravity just goes away this once is wishful thinking of the least productive kind.
Rust is not "sufficiently committed" to backwards compatibility. Firstly, too young to know for sure and the burden is solely on "the rust community" here. (Yes, that sucks. Been there.)
Secondly, there were changes (other posters mentioned "Drop") and decisions about how cargo is treated that indicate otherwise.
Rust can prove all the haters wrong. They will then be even more revered than Linux and Debian. But they have to prove this. That is a time-consuming slog. With destructive friction all the way.
This is the way.
Can I say that I was immediately put off by the author conflating the "thin blue line" quote with a political orientation?
The full quote (from the article) being: "Later in that thread, another major maintainer unironically stated “We are the ‘thin blue line’”, and nobody cared, which just further confirmed to me that I don’t want to have anything to do with them."
The way I read it, "thin blue line" is being used as a figure of speech. I get what they are referring to and I don't see an endorsement. It doesn't necessarily mean a right-wing affiliation or sympathy.
To me it seems like the author is projecting a right-wing affiliation and a political connotation where there is none (at least not officially, as far as I can see on https://thunk.org/tytso/) in order to discredit Theodore Ts'o. Which is a low point, because attacking Ts'o on a personal level means Martin is out of ammunition to back his arguments.
But then again, Hector Martin is the same person that thought that brigading and shaming on social media is an acceptable approach to collaboration in the open source space:
"If shaming on social media does not work, then tell me what does, because I'm out of ideas."
from https://lkml.org/lkml/2025/2/6/404
To me, from the outside, Hector Martin looks like a technically talented but otherwise toxic person who is trying to use public shaming on social media and ranting on his blog as tools and tactics to impose his will on the otherwise democratic process of developing the Linux kernel. And then, on top of everything, he's behaving like a victim.
It's a good thing they are resigning, in my opinion.
Jumping to conclusions about police brutality and so forth (as many here in the comments are doing) is very frustrating to see, because, in context, the intent of his phrasing is very clear to anyone who doesn't needlessly infer Contemporary Political Nonsense in literally everything they read.
It can be hard when solving your own acute issue - doing so doesn't mean it is the only fix or the one the project should accept.
Even if it's beneath someone's talent to have to do it, it is an exercise in community building.
I am acquainted with Ted via the open source community, we have each other on multiple social media networks, and I think he's a really great person. That said, I also recognize when he gets into flame wars with other people in the open source social circles, and sometimes those other people are also friends or acquaintances.
I can think of many times Ted was overly hyperbolic, but he was ultimately correct. Here is the part of the Linux project I don't like sometimes, which was recently described well in this recent thread. Being correct, or at least being subjectively correct by having extremely persuasive arguments, yet being toxic... is still toxic and unacceptable. There are a bazillion geniuses out there, and being smart is not good enough anymore in the open source world; one has to overcome those toxic "on the spectrum" tendencies or whatever, and be polite while making reasonable points. This policy extends to conduct as well as words written in email/chat threads. Ted is one of those, alongside Linus himself, who has in the past indulged in a bit of shady conduct or remarks, but their arguments are usually compelling.
I personally think of these threads in a way related to the calculus of infinitesimals, using the "Standard Parts" function to zero away hyperbolic remarks the same way the math function zeros away infinitesimals from real numbers, sorta leaving the real remarks. This is a problem, because it's people like me, arguably the reasonable people, who through our silence enable these kinds of behaviours.
I personally think Ted is more right than wrong, most of the time. We do disagree sometimes though; for example, Ted hates the new MiB/KiB system of base-2 units and for whatever reason likes the previous, more ambiguous system of confusingly mixed base-10/base-2 units of MB/Mb/mb/KB/Kb/kb... and I totally got his argument that a new standard makes something already confusing even more confusing, or something like that. Meh...
Here's my best argument for the binary prefixes: Say you have a cryptographic cipher algorithm that processes 1 byte per clock cycle. Your CPU is 4 GHz. At what rate can your algorithm process data? It's 4 GB/s, not 4 GiB/s.
This stuff happens in telecom all the time. You have DSL and coaxial network connections quantified in bits per second per hertz. If you have megahertz of bandwidth at your disposal, then you have megabits per second of data transfer - not mebibits per second.
Another one: You buy a 16 GB (real GB) flash drive. You have 16 GiB of RAM. Oops, you can't dump your RAM to flash to hibernate, because 16 GiB > 16 GB so it won't fit.
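A quick back-of-the-envelope sketch of those figures (the numbers below just restate the assumptions above, using the usual 10^9 vs 2^30 definitions):

    fn main() {
        let clock_hz: u64 = 4_000_000_000; // 4 GHz, one byte per cycle
        let gib = 1u64 << 30;              // 1 GiB = 2^30 bytes
        // 4e9 bytes/s is exactly 4 GB/s but only ~3.73 GiB/s.
        println!("{} GB/s = {:.3} GiB/s",
                 clock_hz / 1_000_000_000,
                 clock_hz as f64 / gib as f64);

        // The flash-drive example: 16 GB (decimal) of flash vs 16 GiB of RAM.
        let flash = 16u64 * 1_000_000_000;
        let ram = 16u64 * gib;
        println!("RAM exceeds flash by {} bytes", ram - flash); // ~1.18e9 bytes short
    }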
Clarity is important. The lack of clarity is how hundreds of years ago, every town had their own definition of a pound and a yard, and trade was filled with deception. Or even look at today with the multiple definitions of a ton, and also a US gallon versus a UK gallon. I stand by the fact that overloading kilo- to mean 1024 is the original sin.
Right but the problem here is that RAM is produced in different units than storage. It seems strictly worse if your 16GB of RAM doesn't fit in your 16GB of storage because you didn't study the historical marketing practices of these two industries, than if your 16 GiB of RAM doesn't fit in your 16 GB of storage because at least in the second case you have something to tip you off to the fact that they're not using the same units.
> I can think of many times Ted was overly hyperbolic, but he was ultimately correct. Here is the part of the Linux project I don't like sometimes, which was recently described well in this recent thread. Being correct, or at least being subjectively correct by having extremely persuasive arguments, yet being toxic... is still toxic and unacceptable.
I want to say that I am thankful in this world that I am a truly anonymous nobody who writes code for closed-source mega corp CRUD apps. Being a tech "public figure" (Bryan Cantrill calls it "nerd famous") sounds absolutely awful. Every little thing that you wrote on the Internet in the last 30 years is permanently recorded (!!!), then picked apart by every Tom, Dick, Harry, and Internet rando. My ego could never survive such a beating. And, yet, here we are in 2025, where Ted Ts'o continues to maintain a small mountain of file system code that makes the Linux world go "brrr". Hot take: Do you really think you could have done better over a 30 year period? I can only answer for myself: Absolutely fucking not.
I, for one, am deeply thankful for all of Ted's hard work on Linux file systems.
> I miss having free time where I can relax and not worry about the features we haven’t shipped yet. I miss making music. I miss attending jam sessions. I miss going out for dinner with my friends and family and not having to worry about how much we haven’t upstreamed. I miss being able to sit down and play a game or watch a movie without feeling guilty.
Honestly, I think working 10+ hour days and not doing other things that are less stressful and more enjoyable took its toll (people being their biggest stressor in this regard).
They likely have PTSD at this point.
Whatever you need Marcan. I hope you find it. I'm rooting for your health and happiness.
It seems like there's a balancing act between the benefits of writing drivers in Rust (easier, more maintainable), and getting those drivers mainlined (apparently soul-destroying, morale killing), I wonder if the Asahi team is considering simply abandoning linux in favor of something more rust friendly (redox being an obvious candidate, but maybe one of the BSDs?). Given the narrow set of hardware they're aiming to support and that they're writing many of their own drivers _anyway_ (and so are not relying as much on the large # of existing linux drivers), that approach might be more viable. I'd be surprised if the Asahi GPU work wasn't the largest problem by far that their team faces, and as such it would make sense to choose a kernel that lowers the difficulty on that aspect to the greatest degree possible.
The Asahi developers have repeatedly and publicly asserted that were it not for Rust they would not have been able to achieve the level of quality required for the project, at the speed they did, with as small of a team as they have. From the article:
> Rust is the entire reason our GPU driver was able to succeed in the time it did.
Nobody ever claimed that it's impossible to write these drivers in C -- C is "Rust-complete" in the sense that you could in theory write a compiler that translates any Rust program to C.
They're just claiming that Rust allowed them to write much higher-quality code, much faster, which seems plausible.
The existence of other ARM laptops is irrelevant; the reason MBPs are so good has little to do with ARM. Yes x86 makes the processor frontend more complicated but this doesn't make a big enough difference to come close to accounting for how much better the MBP is than its competitors. I would guess the biggest factors are Apple's ability to buy the entire run of TSMC's best process node, and the fact that they have a high level of competence at designing CPU cores and other hardware. The instruction set the core uses is just not that important in comparison.
Really?
What is so great about a locked down hardware, locked down software machine, that phones home to Apple all the time?
The only reason to get Macs is if you have a niche case of needing long battery life (most people don't, even if they say they do), but this is where the other ARM laptops are gonna also be good, without all the proprietary crap.
Even if you consider the hardware "tech jewelry", isn't it strictly better to have a way to run Linux on it instead of sending it to landfill? Seems silly to exclude a particular set of hardware from consideration for arbitrary reasons?
>isn't it strictly better to have a way to run Linux on it
In a perfect world, Apple would open source the firmware, which would let people just compile the Linux driver for it. While the Asahi project is cool in terms of figuring stuff out, ultimately it's a lost cause because Apple will never be on board.
They are relying heavily on mesa. I'd also assume that GNU stuff is also pretty essential.
Perhaps Android would be possible? It has a HAL that might be easier to work with than the raw linux kernel. The android devs have put in a lot of effort to make downstream driver development not painful. With android, they'd also still have GNU stuff available.
The big issue is non-linux will mean every single open source tool may have a compatibility problem. You also end up dumping a huge amount of capabilities (like running docker containers).
Isn't mesa portable? Or are there parts that are OS-specific?
> With android, they'd also still have GNU stuff available.
I don't follow; Android is a non-GNU Linux distro. Or do you mean that being on Linux makes GNU stuff easy? (But then, GNU runs happily on BSDs and other unix-likes)
Even the OS-specific parts are at least permissively-licensed. OpenBSD is about as religious about "all new code must be under an ISC-compatible license" as it gets, and even they pull in Linux DRM/Mesa code for hardware graphics acceleration: https://man.openbsd.org/drm.7
IDK. I'm not familiar enough with mesa to know how portable it is. That said, I do know that it's primarily deployed on Linux. An issue with portability is simply that when big projects like mesa are developed, non-Linux environments are rarely targeted (no clue, for example, whether you can build mesa for BSD).
> Or do you mean that being on Linux makes GNU stuff easy?
Mostly this. I don't think, for example, those GNU tools will port over to redox. Building them targeting android is a snap.
There are official freebsd packages of mesa: https://ports.freebsd.org/cgi/ports.cgi?query=mesa-&stype=al...
In fact, https://doc.redox-os.org/book/graphics-windowing.html seems to imply that redox is or plans to use mesa.
Android:
Okay, that's fair; termux already proved that GNU on Android is viable.
The entire point of Asahi is to run Linux on macOS (edit: on Mac hardware, not macOS). If they did what you’re suggesting it would be a completely different project.
At this point, it’s really about what trade-off you’re willing to make. Do you want a better graphical interface or better docker integration?
Because it runs a Linux VM with considerable overhead, and you hit serious issues if you want anything more detailed in networking than `-p 8080:8080`.
FreeBSD may be open to it? It's been a while, and I haven't kept up to date on it for a year or two. But once again, I think you'd have to start from scratch. So everything for R4L that was built before Asahi Linux would need to be done on the FreeBSD side.
NetBSD is probably a no-go. NetBSD supports architectures that Rust (due to LLVM) can't support, and NetBSD's schtick is that it can run on anything: they will do everything in their power to make sure NetBSD can run on any hardware and be maintained. Hardware portability matters to them.
The attitude I've seen from OpenBSD devs is that the answer is to 'git gud' at C, not replace C code with Rust. Or in other words, they have no interest in Rust in the OpenBSD kernel.
I don't really know where DragonFlyBSD falls in this. Its the BSD I know the least about.
Perhaps you’re confusing it with XNU? (Which is Mach merged with some BSD stuff).
Wishing I had donated before, I'll sign up for opencollective now. I can only imagine the anticlimactic nature of releasing the emulation stack for gaming [0] and not seeing any increase in interest financially. One wonders what funding might have made it more worthwhile than simply passing the hat.
[0] https://asahilinux.org/2024/10/aaa-gaming-on-asahi-linux/
This is still my position on Asahi Linux: that it is not something that I would use as a daily driver nor recommend to others for use as a daily driver.
> “When is Thunderbolt coming?” “Asahi is useless to me until I can use monitors over USB-C” “The battery life sucks compared to macOS” (nobody ever complained when compared to x86 laptops…) “I can’t even check my CPU temperature” (yes, I seriously got that one).
These would be dealbreakers for me, too. To be clear, I am not saying that it is anyone's job to fix these issues for me. And this isn't meant as an attack on the Asahi Linux team - I think it's incredible what they have been able to do.
But those comments, without any larger context to demonstrate harassment or anything like that, just don't seem too bad to me. The language could be softened a bit, sure, but the criticisms themselves resonate with me and would be valid reasons to not use Asahi Linux IMO.
what's out of line is incessant reporting (via issues, emails, whatever) of what you consider a dealbreaker. that's my impression of what he's complaining about. let the people work. no one likes to respond "not yet" a billion times.
Open source can be brutal, especially with larger and well established projects.
I contribute to several projects as a well recognized person in my field, not at their scale, but everything they say rings true.
Established developers often push back extremely hard on anything new, until and unless it aligns with their current goals. I’ve had maintainers shut me down without hearing out the merits, only to come back a year later when whatever company they work for suddenly sees it as important.
Project leads who will shift goalposts to avoid confronting the clear hostility their deputies show.
I’ve had OSS users call my personal number, or harass me over email for not having their pet interest prioritized over everything else. Often that’s because I’m blocked by the maintainers.
Open source can be extremely brutal and it’s a battle of stamina and politics as much as it’s one of technical merit.
And while I appreciate Marcan's work a lot, he is also partially responsible, because he himself often jumped on the bandwagon of attacking other people in exactly the same way.
With proprietary software you usually have a corporate mandate, a goal etc to achieve. Any new tech is achieved as part of that drive. You can get people on board or not based on that, and once you’ve decided, there is someone to answer to if you can’t deliver.
Open source doesn’t have that. A project can go in twenty different directions at once, you can say you all agree to something and then have people sabotage it without being answerable to anyone.
Does that make open source worse? No. It’s the trade off for being open, which is extremely valuable but it is a very different push in terms of a product.
So leading well known open source project is politics on a small scale and there will be a lot of people who want to hurt or manipulate you.
If you decide to become a public person and want to have fans and supporters then be ready to have haters as well.
I think anyone working on serious open source projects just needs to learn to ignore those users. I definitely would have the attitude of "I'm perfectly fine if no one uses my product" and have a lot of fun banning entitled users left and right.
Talk is cheap, send patches.
https://x.com/FFmpeg/status/1762805900035686805
I see a lot of FOSS maintainers continuing to engage and defend themselves against people who have demonstrated themselves as unwilling to contribute in any way, yet expect that free work be done for them. I wish more open source devs would keep in mind that FOSS work is a gift they're sending out into the world, and it's a common good that anyone can contribute to. That is not to say ignore all criticism or user requests, just that you hold absolutely no responsibility to placate emotionally draining people - the project is just as much their responsibility as yours.
I don't agree with Linus all the time (mostly because I don't have the technical knowledge to judge), but I 100% agree with his attitude. I hope other large FOSS project maintainers have the same mindset.
But again, maybe they can hire someone like me, whose sole job is to block the very worst entitled users.
It’s why things like CentOS being abandoned, terraform licensing, et. al. never bothered me. I’m not paying them, so :shrug:.
That said, I detect a lot of one sided thinking in his post. He took on an incredibly difficult challenge, faced huge opposition, made incredible technical accomplishments and he feels entitled to a standing ovation. When what he receives is criticism, entitlement and obstructionism he takes it personally. If he did all of this work hoping to get accolades, fame, clout, influence then he did it for the wrong reason. There is a mismatch between his expectations and the reality of the world.
In the best of worlds, we do the right thing because it is the right thing, not because we hope for a pat on the back once it is done. In dharmic religions (e.g. Buddhism), one of the principal mental states one aims for is detachment from the outcomes of our actions. Suffering is attachment, and Martin is clearly suffering. The other thing in Buddhism is recognizing the suffering in others, and I see a distinct lack of that recognition in Martin's post here. He acknowledges his own abrasiveness, but not once does he show compassion for the maintainers who have suffered everything he has suffered, perhaps even from actions Martin himself has taken.
Martin has several outcomes he wants, mostly his changes (including the inclusion of Rust) being welcomed in the Linux kernel. He is attached to those outcomes and therefore takes it personally when those outcomes are not achieved. Taking a step away from this attachment is a very good step. IMO, his desire to push for these outcomes has been a significant contribution to the toxicity.
You add these personalities together, where everybody believes they are right and everybody else is wrong, and it's a recipe for disaster.
People have to learn to adapt to change, or they get burnt out continually hitting the brick wall.
The linked "thin blue line" message[1] also says this:
> One of the things which gets very frustrating from the maintainer's perspective is development teams that are only interested in their pet feature, and we know, through very bitter experience, that 95+% of the time, once the code is accepted, the engineers which contribute the code will disappear, never to be seen again. As a result, a very common dynamic is that maintainers will exercise the one and only power which they have --- which is to refuse to accept code until it is pretty much perfect --- since once we accept the code, we instantly lose all leverge, and the contributors will be disappear, and we will be left with the responsibility of cleanig up the mess. (And once there are users, we can't even rip out the code, since that would be a user-visible regression.)
Which seems very reasonable. Maintainers shouldn't be expected to support a feature indefinitely just because a 3rd party is interested in upstreaming it. In the case of Rust for Linux and the Asahi project specifically, I imagine this would entail a much larger effort than any other contribution. So just based on this alone, the bar for entry for Asahi-related features should be much higher.
Perhaps this is ultimately a failure of leadership as TFA claims, but it would be foolish to blame the Linux maintainers or its development process, and take sides either way. Maybe what Asahi is trying to accomplish just isn't a good fit for mainline Linux, and they would be better served by maintaining a hard fork, or developing their own kernel.
[1]: https://lore.kernel.org/lkml/20250208204416.GL1130956@mit.ed...
One would hope it was just a particularly bad gaffe, but it could also be an insight into how he actually views himself as a maintainer, which is not great.
The expression itself is inherently quite generic; all the claims about it being "specifically about" anything are just people reading stuff into it.
Using phrases whose meaning you're unaware of is generally risky behavior. Trying to claim it meant something other than what the common lexicon says it means is just asinine.
Why this is being brought up in a discussion about Linux is beyond me. Context and nuance matter.
Tangentially: this is a problem with "cancel culture" in general. The mob is willing to lynch people at the mere mention of something they find objectionable. That's what's asinine.
In the context of a long-term good faith maintainer in what is clearly a constructive good faith email, assigning bad faith meaning to a simple phrase is in itself a bad faith action IMHO.
Is it really appropriating anything, though, in this case? Seems this is more people ascribing the expression to a particular political camp in order to taint Ts'o by association with said camp.
(Sorry, if you were being ironic or sarcastic in your wording, I failed to pick it up.)
Then 2024 happened. Last year was incredibly tumultuous for me due to personal reasons which I won’t go into detail about. Suffice it to say, I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so).
This is _not_ ok in any form, what the actual hell?
Unless he has an extensive posting history on social media platforms, I doubt he would garner much attention.
When I started getting harassed in 2022 over an ill advised post on a site, the only thing that stopped it was to sandbag everything and rethink how I interacted on the internet.
Are we supposed to not speak publicly about being harassed? I feel like a culture of silence only allows bullying to perpetuate, and puts the onus on the victim to change their behaviour, and not the harasser's.
Of course, ideally some authority should stop the harasser. But you probably don't want to wait until it gets severe enough for that to happen.
https://vt.social/@lina/112887550181123672
https://docs.google.com/document/d/1W2Vvwg0rwSVb5r4TQ_NmAF8S...
They pretty much straight up confirm the identity of Asahi Lina, so enough with the gaslighting that merely mentioning this (and having a totally reasonable discussion on contributor persona policies and sockpuppeting, which is what this is) is somehow unethical doxxing.
so anyway, i dug into the other party involved in this, i.e. the accused, and it looks like there is more to this story than what Lina lets on in her Google Doc linked here, perhaps there is another side. i am not going to go into details here, but if you are interested, it's not difficult to find her on twitter or bluesky. (and if you do choose to do so, please don't bother her, i don't want to cause her any harm by mentioning her here).
Dug up my old comment on the issue here: https://news.ycombinator.com/item?id=36972907
> The project itself is popular, but HN allows people to talk about the fairly compelling circumstantial evidence that Asahi Lina is marcan's alter ego. Per previous discussion, they have /home/marcan and /home/lina on the same box [0], have the same hostname [1], and have similar accents and speaking patterns [2]. Marcan is free to do this, but it's completely bizarre behavior acted out in public which is now impacting the actual Asahi project. Doing v-tubing under a pseudonym is one thing, but maintaining a sockpuppet contributor on a major open source project and pretending to interact with it is a giant red flag.
[0] - https://news.ycombinator.com/item?id=35242010
Shouldn't we be above that kind of childish telenovela shit?
Asahi Linux lead developer Hector Martin resigns from Linux kernel
https://news.ycombinator.com/item?id=42972062
New Apple Silicon Co-Maintainer Steps Up for the Linux Kernel
However, he's got no social skills nor does he have what it takes to man up and understand he won't get his way.
Additionally, I doubt that he really is dealing with stalkers to the degree that he is implying; real people don't talk about their stalkers so much. When I was stalked and harassed I kept the details light and didn't provide much in the way of actual community details because I went to the FBI and local police to deal with it. And yes, a few people not only got no-contact orders, but lost jobs, families, and more over their exposure.
Marcan is extremely talented, but talent doesn't equal "I get to get my way all the time." This idea that conflicts need to be resolved quickly and in the favor of a golden boy is a millennial/zoomer issue.
I think it's naive to suggest this would achieve anything when the platform is anonymous and he was based outside the US.
> Outside of that, just get used to be thrown abuse of all kinds.
This is far easier to say than to do. Particularly when you have a public profile as part of your work.
> If you get tagged by a group like that the more you talk about them the more you give them ammo.
Agreed but that still doesn't make it easy to ignore.
If you can't handle that, you need to sandbag. There's no other way around it. It is unlikely that Marcan will stop being the center of attention wherever he goes.
If he wants to stand up for what he believes to be right, he shouldn't have a problem with the consequences of dealing with people who disagree with him, sometimes virulently.
Trans issues are very contentious right now, and in my opinion the only way to win that type of situation is to not play. Doesn't mean I don't respect those people, but taking large public stances just puts a target on your back.
This sounds like victim blaming. I suspect very few people who take a stand are truly prepared for years of abuse, even if they think they are. No one has perfect knowledge of the future.
It's intended that you understand the consequences of your actions. If you're a first-order thinker who can't think past the first step, then you've got to be way more cautious with your life.
And if the other side is Kiwifarms and its associated offspring, the cops can't do anything at all. These guys are the utterly perfect storm - technically extremely competent but socially they're highly deranged narcissists.
You actually provided more details here than he did, so I guess that's not true.
>> Suffice it to say, I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so).
If anything, I am more doubtful you had stalkers since you are also in this thread saying this[1]. What an unsympathetic reply. You certainly don't come off like you're aware of the stress and fear that can bring if your answer to other people being stalked is "rethink your life choices."
> Sounds to me like you just need to take a break from the internet friend. If you're that high profile that you're getting hit over and over, maybe you need to rethink your life choices. If it bothers you that much especially.
Undiagnosed burnout possibly also played a role here.
This is a strawman with no support behind it in the actual blog post. Marcan's issue wasn't that he wasn't "getting his way", his issue (from his viewpoint) was with Linus and co. claiming to support Rust on Linux while doing nothing to aid in adoption and in the case of some maintainers, actively sabotaging its implementation.
If you're going to do something, then do something. Don't say you're going to do something, then do nothing while your underlings actually do the opposite.
They want to upstream drivers for a device whose creator clearly has no interest in allowing others to use it outside of their walled garden. The knowledge around it comes from a massive, albeit impressive, RE effort.
Who is going to support it? Where is the demand for it? It would be different if Apple were to provide patches and drivers for their own hardware, at least then you know there is a vested interest in supporting the hardware from the people who know it better than anyone else and have the means to continue supporting it.
I applaud Hector and everyone else that contributes to Asahi; it's genuinely a cool project, and the progress they have made is insanely impressive given the lack of any official documentation about the hardware they are working on, but it's one of those things that will remain in the realm of a cool curiosity, much like running Linux on a games console.
When the response to such a small contribution is just "No rust code in kernel/dma, please", with a complete shutdown of any attempt to discuss alternatives, it's kinda pointless. Even though Rust is supposed to be an allowed language in the kernel now, with blessing from Linus himself, there are apparently submaintainers of critical, highly shared infrastructure who outright refuse it.
So this has nothing to do with "who will own the Apple drivers?!" but just the rest of the kernel going "your integration layers are an affront to our Holy C, begone religious heretics!"
If you start introducing new languages that most maintainers are not anywhere near as familiar with, you create an unmaintainable mess.
Linux isn't a hobby project anymore, it's critical infrastructure. You can't just introduce changes because a few people think it's cool.
This may be controversial but you also don’t have a right to merge in code to the kernel. If the maintainers don’t want rust code then you should write your drivers in C. And if you don’t like that you can maintain your own kernel tree in rust and take on the maintenance burden.
Apple explicitly chose to provide a way to boot third-party operating systems when designing how the M-series SoC boots. Some components of their SoC designs, AFAIK, date back to the very first iPod SoCs.
I would understand that attitude if someone wished to, say, upstream code for PlayStations or other game consoles because that is a bunch of fights waiting to happen, but Apple hasn't made any move directly against FOSS OSes on their computers in the past and there is no reason to believe that will change.
Make that just "marcan and the others". Because judging from a few other threads here, marcan is apparently "lina".
> are damn geniuses
So that's one fewer genius than you may have thought.
And they explicitly chose against a UEFI interface like prior Macs, which would have actually enabled proper Linux support. Now you have poor people trying to reverse-engineer a Devicetree from scratch to get basic features to kinda work, emulating hardware features in software and working with no documentation from Apple. They "explicitly" chose to expose iBoot because otherwise you wouldn't be able to reinstall MacOS in a full data loss situation.
By comparison - reverse engineering an unsupported AMD or Intel CPU would at least give you some basis to work off of. You have UEFI as standard, ACPI tables provided by hardware, even CPU documentation and open source drivers to work off of most of the time. Asahi shot themselves in the foot by trying to support hardware that doesn't support them back. You can argue that Apple was conspiring to help, but we have no actual evidence of that.
> Their SoC stuff dates in some components AFAIK back to the very first iPod SoCs in its design.
And none of those platforms ever got proper Linux support either. I love Linux as much as the next nerd, but it doesn't seem wild to suggest that Apple Silicon will never have feature-complete Linux support. And for many people, maybe that's okay!
This doesn’t seem accurate.
> In macOS 12.1, Apple has added the ability to directly boot a raw image directly instead of a Mach-O, even though Apple has absolutely no use for this functionality. According to Hector Martin (Asahi Linux developer) making things easier for Linux developers is the only known reason Apple would have added this.
https://linustechtips.com/topic/1396740-apple-adds-feature-i...
There was at least one person who said[0]:
"I'd absolutely love to have one, if it just ran Linux.. [...] I've been waiting for an ARM laptop that can run Linux for a long time. The new Air would be almost perfect, except for the OS."
Seems like the same person has even used Asahi to make a Linux kernel release[1][2].
But Linux presumably doesn't have the resources to go chasing hardware platform support based on the whims of a singular Linux kernel developer/maintainer/creator.
----
[0] https://www.realworldtech.com/forum/?threadid=196533&curpost...
[1] https://lore.kernel.org/lkml/CAHk-=wgrz5BBk=rCz7W28Fj_o02s0X...
[2] via: https://arstechnica.com/gadgets/2022/08/linus-torvalds-uses-...
Linux runs on over 90% of all production servers on the planet; think about that. If you introduce a change it needs to work and be maintained, and if you add something but then later realise you don't have the resources to maintain it, you can't remove it. You have created more work for yourself. Relevant xkcd: https://xkcd.com/1172/
But Marcan clearly has true hacker spirit; I'd wager we'll see him again in the future with an equally cool project. It's often best if the visionaries just spend their efforts to get the ball rolling and then let the project evolve on its own as they move on to their next challenge.
There's an awful lot of money and power associated with operating systems and programming languages (obviously), and the resulting "realpolitik" of situations like these seem to get swallowed up in these discussions.
It makes sense for technical people to think that the technical debate is what essentially matters, but it usually isn't.
I've found the way Linux has approached Rust in the last couple of years to be a tad confusing. Torvalds has always cut a hard line, yet suddenly his opinion is quite wishy-washy. Oh, we'll try it, who knows, what's the worst that can happen, type thing? What induced this change, one wonders.
[0] Well-written blog posts on the subject are very welcome, please share if you know one!
Maybe I don't know the situation well enough, but I think he's not deciding because he has no idea. The "wrong" option may create far more consequences than the "right" one (so he can't, e.g., flip a coin), but he has no idea which is "right".
Torvalds has spent so long working with C that even if he went out of his way to learn Rust, he'd never have enough experience with both to form an unbiased view. Perhaps he's hoping that people who are younger and have more balanced experience, but who are still smart and seasoned, will shift the project towards the correct decision. Unfortunately that's not happening, because as a solid BDFL, he's the only one who can make a shift of that size. But either shift could be a huge stain on Linux's history, and he has no idea which, so he's stuck in a horrible conundrum.
If that's the case, keeping in mind that Linux is Torvalds's life's work, even if doing nothing is itself a bad choice, I don't blame him.
Regardless, since Torvalds can't decide, I think the community has to come together and 1) agree on a consensus mechanism then reach a consensus (e.g. vote on it), or 2) come up with a magic third option which Torvalds can accept*.
e.g. "integrate Rust into the kernel, but ensure that all future Rust compatibility issues are handled by Rust developers, even if they are caused by C code changes, through the commitment of a team of general-purpose Rust developers large enough relative to the number of C developers". I don't know if one can really ensure there are enough Rust developers to not block the C developers, but Rust is very popular and maybe growing faster than C, so maybe it's possible.
But how? One of these:
- The C developer needs to know enough Rust to know which changes will affect Rust, contact the Rust developers and explain the change and wait. Extra work and delay they do not want.
- The C developer does not need to know any Rust, but must be doing Rust builds, and if something breaks then contact the Rust developers and explain the change and wait. Extra work and delay they do not want.
- The C developer needs to explain every change to the Rust developers in case it might break, before they can do it. Extra work they do not want.
- The C developer ignores Rust, and the Rust developers must work with unpredictable shifting sands. This works for a while; the Rust codebase becomes larger and new Rust developers join. This is an unstable configuration: the larger the Rust codebase, the more disruptive any change could be, and the new Rust developers will feel the changes are malicious and capricious. They will demand this stops and that changes are planned, communicated, and the ground stabilises, or they get fed up and quit. That leads back to one of the above points: C developers either need to maintain the abandoned Rust code, or they need to do the extra work of tracking, coordinating, explaining, running changes through another team, and waiting for them (extra work they do not want), or they can't make certain changes in the C codebase (a limitation they don't want).
Rust developers saying "we will do all the work" isn't enough because of the meta-work of organising and communicating the work.
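To make that coordination cost concrete, here is a minimal sketch (purely hypothetical names, not actual kernel code, and it won't link without a C side; it's only meant to show the shape of the problem) of a hand-maintained Rust binding that a C-side signature change would quietly invalidate:

use core::ffi::{c_int, c_ulong, c_void};

extern "C" {
    // Hand-maintained declaration of a hypothetical C function,
    // currently: int frob_buffer(void *buf, unsigned long len);
    fn frob_buffer(buf: *mut c_void, len: c_ulong) -> c_int;
}

// Safe wrapper that hypothetical Rust drivers would call.
pub fn frob(buf: &mut [u8]) -> Result<(), i32> {
    // SAFETY: `buf` is a valid, exclusively borrowed buffer for the duration of the call.
    let ret = unsafe { frob_buffer(buf.as_mut_ptr().cast(), buf.len() as c_ulong) };
    if ret == 0 { Ok(()) } else { Err(ret) }
}

// If a C maintainer later changes the C side to, say,
//   int frob_buffer(struct frob_ctx *ctx, void *buf, unsigned long len);
// nothing in the C tree fails, but this declaration is now wrong until someone
// who understands both sides notices and fixes it.

With generated bindings the mismatch at least turns into a compile error on the Rust side, but somebody still has to own fixing it, and deciding who that somebody is, and when, is exactly the meta-work described above.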
I'm now thinking the solution is "no Rust in the kernel, but we promise to revisit in X years" (then if in X years the picture isn't clearer, revisit in X more years). As someone who greatly prefers Rust, it's unfortunate, but the alternative adds lots of complexity (IMO more than Rust's type system removes) and that's too big an issue.
Moreover, the would-be contributors who use Rust can (and IMO should) unite and fork the project, creating a "Rusty-Linux" which would be the base for many distros, just as Linux is the base for all distros today. If the fork ends up higher-quality than Linux, especially if Rust adoption keeps growing and C usage starts shrinking, then in X years (or 2X, or 3X, etc.) Rust will be so clearly beneficial that naysayers will be overpowered (or even convinced), and Rusty-Linux will become Linux.
By not answering these questions and saying he doesn't want to have anything to do with the arguments, Linus simply decided that he doesn't want to solve the problem that only he can solve. The result is clear: R4L will fail if Linus decides that any maintainer can stop the "cancer" from spreading and block Rust changes.
R4L implies that Rust will be present in the kernel and will need to be maintained. If Linus is ok with maintainers that have a deep/fundamental problem maintaining/coordinating the maintenance of Rust code, R4L will never happen.
except when he suddenly wasn't
he lost my respect the moment he went there
When asked if he feared competition for Linux, his answer was that few people liked writing device drivers, and as long as no one "young and hungry" came along who could write device drivers and liked it, he'd be safe.
There you have him: Hector Martin, young[2] and hungry, and he loves writing drivers. No surprise he clashes with the old guard.
[1] I don't remember the date but it was still on analog TV, so definitely more than 20 years.
[2] At least a different generation from Linus and could easily be his son. Young enough for generational conflict at any rate.
First, maybe Linux just is bound to always be a C-only project. Linus Torvalds infamously dislikes C++; it's sort of odd he didn't shut down Rust for Linux in the first place. Redox is on its way...
Second, there are multiple types of compensation. I think the author was probably looking to be compensated in validation from others. Maybe if Linus Torvalds had replied to his email, the author would be more inclined to continue.
However, I can't be mad at someone for deciding how they want to spend their time. You only have so many hours in the day.
Would be cool if Qualcomm hired Marcan and worked with an OEM to roll out a series of Arm Linux laptops. That's what we ultimately want.
Marcan had a whole rant[0] in the thread that started all of this about kernel people being paid by corporations instead of being freelance like him. I'm not sure he wants to work for a corporation.
[0] https://lore.kernel.org/lkml/c5a49bcb-45cf-4295-80e0-c4b0708...
I’m sure that Rust will continue to become more complex, and in 20–30 years another systems language will likely come along that is at least as safe while being easier to work with again.
Which is very logical. If you add Rust, why not Zig, Nim, and every other low level language?
A lot of pain and drama could have been prevented if he had put his foot down and said only C code is allowed. Instead he left it ambiguous, and a lot of well-meaning people have been damaged by this. I know I'd be upset if I had been a part of the Rust for Linux team and, when I actually wanted to get my code in, was told my contributions weren't welcome.
I wouldn't want to mix and match languages in a project that's so vital to really the entire world. It just seems like a good way for a funky rust bug to cause billions in issues...
I was about to write a question to ask why, if these downstreams are forked, that it is such a big deal to be gatekeeping the upstream and I think I got my answer from this:
"In fact, the Linux kernel development model is (perhaps paradoxically) designed to encourage upstreaming and punish downstream forks. While it is possible to just not care about upstream and maintain an outright hard fork, this is not a viable long-term solution (that’s how you get vendor Android kernel trees that die off in 2 years). The Asahi Linux downstream tree is continuously rebased on top of the latest upstream kernel, and that means that every extra patch we carry downstream increases our maintenance workload, sometimes significantly. "
Is it wrong for Linus to take the side of the kernel and not of the various distros? Serious question. I don't completely understand all of the background here.
"But it goes deeper than that: Kernel/Mesa policy states that upstream Mesa support for a GPU driver cannot be merged and enabled until the kernel side is ready for merge. This means that we also have to ship a Mesa fork to users. While our GPU driver is 99% upstreamed into Mesa, it is intentionally hard-disabled and we are not allowed to submit a change that would enable it until the kernel side lands. This, in practice, means that users cannot have GPU acceleration work together with container technologies (such as Docker/Podman, but also including things like Waydroid), since standard container images will ship upstream Mesa builds, which would not be compatible. We have a partial workaround for Flatpak, but all other container systems are out of luck. Due to all this and more, the difficulty of upstreaming to the Linux kernel is hurting our downstream users today."
I think we are dealing with a maintenance problem of downstream forks and trying to make their lives easier by convincing the kernel maintainers to accept the changes upstream.
Does Linux have a standards committee or some sort of design committee? I thought they had something to decide what goes in and what doesn't. If it doesn't, is it necessarily gatekeeping? It seems like someone has to make the hard technical choices about whether something becomes part of Linux or not, and that is what Linus is doing.
I am trying to understand the real issue here. It seems like the difficulty in upstreaming changes to help the downstream folks is the issue, not necessarily that the downstream folks are blocked.
Martin's position rests on his claim that the big maintenance burden he would be forced to bear is unfair. I mean, he is the one who chose Rust. No one forced that on him. It is kind of hard for me to see his claim that the maintenance burden of his choice is somehow the responsibility of the upstream maintainers who are all C developers, no matter how small he insists such maintenance would be. My own opinion is that he made his bed, he needs to sleep in it. Or wait for the tides to change in his favor over Rust inclusion in the kernel. All of this crying foul just doesn't sit well with me.
Linus is pro-R4L. He wants to see Rust drivers in the kernel upstream, and has expressed frustration that it has been such a slow process, and that he doesn't understand why some maintainers turn it into a religious issue.
The problem is that he hasn't done much in the way of actually standing up against arbitrary stonewalling by those maintainers. This ensures that everyone gets pissed off and creates a giant mess of arguments. Rust people get pissed off because they were told their contributions were welcome and feel like their time investment is being wasted on bullshit, and C maintainers because the lack of a clear policy leads to ruminating and imaginations running wild.
That person is someone called `Sima` and their posts on Mastodon are pure gaslighting. These are the worst abusers.
>And, of course, “When is M3/M4 support coming?”
This is awful framing. It isn't entitled to ask when something is happening or to say what makes something unsuitable for you. Marcan seems to take every single social media comment about Asahi Linux as a direct personal attack. No wonder he is burnt out; anyone with such a habit would be...
https://lore.kernel.org/lkml/a869236a-1d59-4524-a86b-be08a15...
1. I think the DMA maintainer is correct. Don't intertwine implementation languages; that is a bad idea and a maintenance hell.
2. Social media "hall of shame"
3. Torvalds is forced to make a statement because of 2. Not 1.
https://archive.is/uLiWX https://archive.is/rESxe
"Behold, a Linux maintainer openly admitting to attempting to sabotage the entire Rust for Linux project (...) Personally, I would consider this grounds for removal of Christoph from the Linux project on Code of Conduct violation grounds, but sadly I doubt much will happen other than draining a lot of people's energy and will to continue the project until Linus says "fuck you" or something. (...)"
"Thinking of literally starting a Linux maintainer hall of shame. Not for public consumption, but to help new kernel contributors know what to expect. Every experienced kernel submitter has this in their head, maybe it should be finally written down."
"Okay I literally just started this privately and the first 3 names involved are all people named variants on "Christ". Maybe there's a pattern here... religion secretly trying to sabotage the Linux kernel behind the scenes??? Edit: /s because apparently some people need it."
Then, the fanatical C developers openly sabotage and work against all the Rust developers' efforts. So, the last option for the Rust developers is to take it to social media. Otherwise, the C developers get away with creating a self-fulfilling prophecy: sabotage all Rust efforts, then claim the Rust experiment failed.
Linus didn't seem to ever have the time to actually take a stance, except of course on the social media issue. Fully ignoring all context. It's the equivalent of a school principal suspending the bullied victim for finally snapping and punching their bully.
Even now with Hector out of the picture, there’s still no suitable path forward for Rust in Linux. No wonder people are giving up (exactly what the blockers want).
The suitable path forward is to submit the patch series like normal to Linus, where it will be merged regardless of CH's NACK. CH isn't able to actually stop this process, he's just being a jerk.
However, I agree with you that it would have been nice to actually publicly clarify the situation rather than ignore it and deal with the whole thing behind closed doors. It shouldn't need to be explained that letting this sort of thing fester is a great way to kill motivation in a project and ensure it's less likely that new people will get involved.
I only see Danilo doing that in that thread. And admittedly Linus didn't respond (and Greg KH only minimally responded). But even CC probably means a lot of mail for top maintainers, and at that point I don't see anything that would've gotten in the way of "send a PR despite the Nacked-by", which has been done in the past.
Even if there was, I'm not sure I trust the word of such a drama-seeker directly, so it's reasonable to ask for evidence of on-mailing-list appeals adding CC (as Danilo did), and, if that fails, mention of contacting Linus off-list in that specific subthread.
His fellow R4L partners chewed him out for jumping in and spoiling their work. They even quietly but publicly disaffiliated R4L from him.
Social media is an amplifier of interpersonal problems, not a place to seek a resolution to them - unless your intended "resolution" is to beat down the other side, the people you have to work alongside by necessity, via potshots from random strangers who hardly ever bother to inform themselves fully of the situation. That is never going to be a true resolution, and I think Linus, for all his faults, recognizes that, and that's why he draws the line there.
The fact is, you need buy-in from other devs, and if a dev won't buy in you need to work out a way to avoid them or avoid conflict. It sucks, it slows things down, but frankly making it a "them vs us" is a surefire way to make them oppose any change you want to make.
Public shaming is even more disastrous, as there's no better way to entrench someone in a position.
It sounded to me like a list of "friends who want to get more involved, I'll let you know who to avoid". Then, I read the interactions that sparked that post, and I could totally understand the frustration from OP's part.
Linus being unwilling to take a real stand on maintainers blocking Rust "just because" doesn't really help.
> To back up Sima here, we don't need grandstanding, brigading, playing
> to the crowd, streamer drama creation or any of that in discussions
> around this.
Marcan replied (https://lore.kernel.org/rust-for-linux/208e1fc3-cfc3-4a26-98...):
> If shaming on social media does not work, then tell me what does, because I'm out of ideas.
Then Linus replied (https://lore.kernel.org/rust-for-linux/CAHk-=wi=ZmP2=TmHsFSU...):
> However, I will say that the social media brigading just makes me not want to have anything at all to do with your approach.
> Because if we have issues in the kernel development model, then social media sure as hell isn't the solution. The same way it sure as hell wasn't the solution to politics.
To me, it sure sounds like Marcan is making the case that they tried other venues, didn't feel like it worked, so they resolved to using their social media following to shame kernel developers if they didn't stop.
If the C developers make it a "Them vs Us" thing, there IS NO ALTERNATIVE for the Rust developers.
Linus' reaction is quite literally the equivalent of a parent only punishing the loudest child, not the child that's been silently bullying that kid for months.
In particular, the DMA maintainer didn't want rust code in their DMA subsystem. That sucks, but it means you need to relocate your dma bridge code out of their subsystem. It does mean your driver will be a second-class citizen in the kernel tree (which was always going to be the case for rust).
Linus' reaction was to someone who started a public campaign against another kernel developer and tried to use that following to pressure the maintainers of the kernel to bend to the will of the newcomer. I'm sorry, but I'd also have a pretty negative reaction to that.
The workplace equivalent is you publishing a whistle-blowing article against a team in your company because they wouldn't accept a pull request you worked very hard on. You don't do that. You handle things internally and privately, and sometimes you tell the boss "sorry, I can't get this done because another team is blocking the change and they are unwilling to work with me".
And do not mistake my post: I'm not siding with the C dev just because I'm critiquing the Rust dev. The guy sounds like he's too stuck in his ways. The problem is you don't get a big, well-working, long-running project like the kernel without having these sorts of long-term maintainers who make the calls and shots on what to reject.
The code was never in the DMA subsystem. At no point was there ever any Rust code in the DMA subsystem.
CH didn't even look at the patch before throwing the wall up. When it was pointed out that the patch already was the way he claimed he wanted it, he came up with a 2nd excuse, and then when that avenue was shut down he said he would do anything to stop Rust being put in the kernel, period, he wouldn't work with any Rust developers and he wouldn't accept adding a second maintainer for his subsystem that would do that engagement either.
From that point it's pretty clear that all previous engagement was just in bad faith.
The workplace equivalent is your CEO making a public statement that your work is to be supported, then not firing people who openly gloat about their intent to sabotage your work.
Not that it even makes sense to call it sabotage, considering that most people involved in the original debate (on the Rust for Linux side) didn't see it like that, that the normal kernel development processes were on their way to actually making the change happen anyway, and that Marcan's actions probably did more to sabotage actual support from other maintainers and Linus himself than the original NACK that started all of this ever did.
(Not that Linus ever even gave a blank check for rust on Linux, so I don't think that disagreements and even NACKs are somehow going against what Linus decided)
> Maintainers like Hellwig who do not want to integrate Rust do not have to. But they also cannot dictate the language or manner of code that touches their area of control but does not alter it. The pull request Hellwig objected to "DID NOT TOUCH THE DMA LAYER AT ALL," Torvalds writes (all-caps emphasis his), and was "literally just another user of it, in a completely separate subdirectory."
> "Honestly, what you have been doing is basically saying 'as a DMA maintainer I control what the DMA code is used for.' And that is not how any of this works," Torvalds writes.
> Torvalds writes Hellwig that "I respect you technically, and I like working with you," and that he likes when Hellwig "call[s] me out on my bullshit," as there "needs to be people who just stand up to me and tell me I'm full of shit." But, Torvalds writes, "Now I'm calling you out on YOURS."
> The leader goes on to state that maintainers who want to be involved in Rust can be, and can influence what Rust bindings look like. Those who "are taking the 'I don't want to deal with Rust' option," Torvalds writes, can do so—later describing it as a "wall of protection"—but also have no say on Rust code that builds on their C interfaces.
> "Put another way: the 'nobody is forced to deal with Rust' does not imply 'everybody is allowed to veto any Rust code.'" Maintainers might also find space in the middle, being aware of Rust bindings and working with Rust developers, but not actively involved, Torvalds writes.
https://arstechnica.com/gadgets/2025/02/linux-leaders-pave-a...
There is always an alternative. Exit the project quietly and gracefully if Linus won't show proper leadership. Don't engage in poor behavior back at the C developers, that is just as wrong.
However... this is the same man who made a sock-puppet V-Tuber account and acts in every way as if they are two people, even though he has accidentally shared the system username on-stream, revealed they have exactly the same kernel version, the same KDE configuration, the same login, the same time zone, and even (if I recall correctly) accidentally made GitHub commits as the other person once in a while. He also did this on the Linux kernel mailing lists, where he still maintains the charade.
Point out that's weird, or that it's weird for a maintainer to have a fake persona as a walking female stereotype; and you're the one he shreds and mocks - while simultaneously not denying it. For me, I caught on immediately when I saw the supposed "hijacking" of his stream on April Fool's day, which was her first appearance; and stopped donating. I don't pay people to support stereotypes about women in STEM.
I’ve been supporting Hector since week 1 of the Asahi project and I think it’s a shame he’s thrown in the towel but I can understand why.
I don’t know enough about kernel development to have an opinion about about the Kernel policy of “no aliases” for contributions.
I certainly don’t care that some people think it’s weird for a man to have a female alter ego.
Maybe those things matter to you.
* a brand new account suddenly appears, defending Marcan's behavior (the only comment/post ever of this account) with a very similar writing style
* Marcan immediately "notices" the new comment while doing a "random search" (how? he claims he doesn't browse HN, and even posted a screenshot of news.ycombinator.com being routed to 0.0.0.0 to block his own access to it the day before)
* Marcan highlights the comment in question on his media account [1], praising them "at least [this commenter] gets it"
Only circumstantial stuff, but sure smells very fishy to me.
That is an entirely different situation from facing inner circles in an open source project while contributing to a major port.
Sock puppets aren't taken seriously while the word of the inner circle is taken as gospel.
I also certainly wouldn't take any of his complaints about cliques or brigading with any seriousness or self-reflection afterwards.
This subject also gets inexplicably downvoted and flagged every time you bring it up on Hacker News (look at that, it just happened to my original post); but again, nobody can prove otherwise, and Marcan himself has never denied it, only thrown flames.
https://news.ycombinator.com/item?id=35234480
https://news.ycombinator.com/item?id=32947939
https://news.ycombinator.com/item?id=33792670
https://news.ycombinator.com/item?id=35251905
https://news.ycombinator.com/item?id=36107998
https://news.ycombinator.com/item?id=35238601
https://news.ycombinator.com/item?id=42990443
https://vt.social/@lina/112887550181123672
https://docs.google.com/document/d/1W2Vvwg0rwSVb5r4TQ_NmAF8S...
It's a magnitude more professional than the extremely over the top and public emails that Linus shares, which HN jerks off over. I too would be burnt out if people were picking apart what I said so closely but clapping when Linus says "this code is retarded"
The original message I read (https://lore.kernel.org/rust-for-linux/208e1fc3-cfc3-4a26-98...) they quite explicitly said (verbatim): "If shaming on social media does not work, then tell me what does, because I'm out of ideas."
This message brings up a lot of valid complaints about talented developers being stonewalled, and you're homing in on one word that is not being used the way you think. Again, there are dozens of emails from Linus that are vastly more unprofessional than this.
Aha, I thought it was referring to the same "event"/context but it clearly didn't. Thank you for the correction.
If you don't want a maintainer, that's fine, but to claim it has anything to do with professionalism is dumb when this is seen as communication to admire.
> It's a magnitude more professional than the extremely over the top and public emails that Linus shares
Since when do two wrongs make a right? I think it's perfectly fair to say Linus hasn't shown the best leadership here. But that doesn't excuse Marcan's behavior.
The fact remains: Rust doesn’t solve all of C’s problems. It trades them off for a whole lot of new problems, many of which are challenging to address in a kernel development setting (and much less of a problem for userspace software).
This makes the “C is obsolete” position even harder to defend and ignoring the concerns of long-term kernel maintainers is not going to get anywhere! I think these folks ought to learn the lesson of Chesterton’s Fence [1] before continuing on their journey to promote Rust, which does a lot of great things!
[1] https://en.wikipedia.org/wiki/G._K._Chesterton#Chesterton's_...
Agreed.
> There are some members of the Rust community who believe C is obsolete and that C programmers should either switch to Rust or get out of the way. This is an extremely toxic attitude that has no place in the Linux kernel!
Would you care to share some examples of the Rust for Linux community who have said this? I'm unaware of Hector or anyone else saying anything similar? Or is this just a fear of yours?
I think we should be very clear -- believing the future of systems programming is mostly memory safe isn't the same thing as saying "C programmers should...get out of the way".
The problem with the brigading (which has been done by the Rust for Linux community) is that it invites these zealots into the conversation. It's totally inappropriate and not at all constructive towards a compromise.
Plus the stated goal of Rust for Linux is to enable people to write drivers in Rust, not to rewrite the whole kernel in Rust. Yet there are countless people in the wider Rust community that believe Rust is the future and every line of C code still in use should be rewritten in Rust. It's gotten so prominent that "Rewrite it in Rust" has become a meme at this point [2]. There are now many developers in other languages (C and C++ especially) who reject Rust simply because they don't like the community.
[1] https://www.phoronix.com/forums/forum/software/general-linux...
So -- you're bothered by people on the internet, but not specifically the Rust for Linux people or the Rust project people? I guess -- I'm sorry people are saying mean things about a programming language on the internet?
There are also just as many (more!) anti-Rust partisans out there who say lots of crazy stuff too. I'm not sure there is much to be done about it.
> Yet there are countless people in the wider Rust community that believe Rust is the future and every line of C code still in use should be rewritten in Rust.
So what? Does your C code still run? I'm struggling to understand what the problem is. People are free to think whatever they want, and, if they want to rewrite things in Rust or Swift or Hylo or Zig or Java, that's how many of them learn!
Yes, they're free to rewrite their own projects in Rust. They aren't free to force others to do the same to their projects. That's what this is all about: a prominent R4L community leader tried to use brigading and shaming to force a Linux kernel maintainer into accepting and maintaining Rust code (along with the entire toolchain to support it). The maintainer refused, Linus got involved, and marcan stormed out of the room.
This isn't a debate about technical merits. It's a debate about maturity and what's appropriate for collaborating with others (and what's not). The Rust community has been going through a lot of growing pains over this issue for a while now.
Nobody tried to force Christoph into accepting or maintaining Rust code. This was stated repeatedly.
I don't see how you can possibly have actually read the discussion and come to this conclusion. At this point you're just making false accusations and contributing to the flamewar.
https://lore.kernel.org/rust-for-linux/2b9b75d1-eb8e-494a-b0...
I wish I knew of a less condescending analogy but I think it gets the point across. The list of former kernel maintainers is extremely long. Anyone who leaves the project, as marcan did, leaves all of their code for someone else to maintain. This is not a problem for drivers which can be left orphaned. For all other code it is a problem!
He expressed complete opposition to having Rust anywhere in the kernel at all, including places he doesn't maintain. He was opposed to any other maintainer dealing with Rust for him, even though Robin Murphy (who is already a reviewer on the DMA side) expressed willingness to do so. His initial replies were an exercise in goal-post moving.
https://lore.kernel.org/rust-for-linux/2b9b75d1-eb8e-494a-b0...
You're making excuses for stuff that does not really need to be excused.
Since Linus backed him up on this issue I’m left with the impression that Christoph is not a lone maintainer standing in the way of the inevitable march of progress; that his concerns are valid and shared by the founder and leader of the project and represent the views of other maintainers who preferred not to step into the ring on this debate.
Furthermore, the Rust code depends on his C dma code. That automatically makes it Christoph’s problem when something breaks, regardless of how many R4L maintainers come and go from the project.
Um, or any other they so choose?
> Yes, they're free to rewrite their own projects in Rust. They aren't free to force others to do the same to their projects.
Where is anyone forcing anyone else to do a rewrite in Rust?
> Where is anyone forcing anyone else to do a rewrite in Rust?
When hellwig likened the R4L project to a cancer, he was implying exactly this. He saw this one patch as a Trojan horse (in the original Greek sense, not in the computer virus sense) to get Rust into the main kernel tree. This brings all of the toolchain and language issues into it. By relegating Rust to drivers only, the kernel maintainers avoid the issue of having to maintain a cross-language codebase and toolchain, whether they like it or not.
Being a maintainer of a project that accepts patches from contributors is like operating an orphanage. Allowing anyone to just drop off their unwanted babies results in an unmaintainable nightmare. You can say that the Rust for Linux team have been acting in good faith but the very public actions of one of their (now former) leaders contradicts this. The stated goal of the project was to allow drivers to be written in Rust. Adding Rust bindings to the kernel oversteps that goal. It's a legitimate concern.
You are aware this patch introduced no code into the main kernel tree?
rust/bindings/bindings_helper.h | 1 +
rust/kernel/dma.rs | 271 ++++++++++++++++++++++++++++++++
rust/kernel/error.rs | 1 +
rust/kernel/lib.rs | 1 +
4 files changed, 274 insertions(+)
create mode 100644 rust/kernel/dma.rs
See: https://lkml.org/lkml/2025/1/8/801
> The stated goal of the project was to allow drivers to be written in Rust. Adding Rust bindings to the kernel oversteps that goal. It's a legitimate concern.
You do recognize that all drivers will need to bind to some C interfaces? So -- your argument (or the argument you suppose Hellwig has) is that it is better that each driver author recreate each such interface for themselves? Now, when these interfaces break as a result of a change in the underlying C code, instead of fixing that breakage at possibly a single place, that one true binding, now a maintainer might have to fix that breakage in a dozen such places? And this is preferable? This will cause less work for the overburdened maintainer?
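For what it's worth, here is a rough sketch of the trade-off in that question (names are hypothetical, not the actual kernel crate layout, and it won't link without a C implementation behind it): one shared abstraction module that every Rust driver goes through, instead of each driver keeping its own private extern block:

mod shared_binding {
    use core::ffi::{c_int, c_ulonglong};

    extern "C" {
        // The hypothetical C interface, declared exactly once on the Rust side.
        fn set_dma_mask(devid: u32, mask: c_ulonglong) -> c_int;
    }

    // The single safe entry point every Rust driver uses.
    pub fn dma_set_mask(devid: u32, mask: u64) -> Result<(), i32> {
        // SAFETY: the hypothetical C function only reads its two arguments.
        let ret = unsafe { set_dma_mask(devid, mask) };
        if ret == 0 { Ok(()) } else { Err(ret) }
    }
}

// Two drivers, one binding: when the C signature changes, only the module
// above needs fixing. The alternative, each driver carrying its own extern
// declaration, multiplies that fix (and the chance of getting it subtly
// wrong) across every driver in the tree.
fn driver_a_init() -> Result<(), i32> {
    shared_binding::dma_set_mask(0, u64::MAX)
}

fn driver_b_init() -> Result<(), i32> {
    shared_binding::dma_set_mask(1, (1u64 << 32) - 1)
}

As far as I can tell, the actual disagreement is less about whether such a module is useful and more about where it lives and who is on the hook when the C side changes underneath it.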
It doesn't have to. By becoming a single point of failure for all Rust drivers that depend on it, it becomes the responsibility of all maintainers of the kernel to avoid breaking it when they change the C interfaces. It's a foothold into a world where all kernel maintainers need to run and test Rust builds, something Christoph does not want the headache of dealing with.
When your teenager brings home a puppy and promises you he'll never let the puppy leave his room, you know that's not true and it won't be long before you're the one taking care of it.
Ultimately it's about motivations. Long-term kernel maintainers are motivated to protect and promote the kernel as a maintainable and successful project. R4L developers, on the other hand, seem more interested in promoting Rust than promoting Linux.
> It doesn't have to.
Ah, it's one of those other kinds of Trojan horses that don't enter the city walls.
> By becoming a single point of failure for all Rust drivers that depend on it, it becomes the responsibility of all maintainers of the kernel to avoid breaking it when they change the C interfaces.
So -- I'll ask what the Rust for Linux people asked Hellwig -- what is your suggested alternative? Where do we go from here? Is it that Rust drivers are never allowed to bind to common interfaces? In that case, what are the Rust for Linux team doing?
Or is it that you would like Linus to rethink his decision re: adding Rust to the kernel? And if so, why didn't Hellwig make that case directly to Linus? What's with all this performative bellyaching on the LKML?
The kind that have to be invited in, yes.
> So -- I'll ask what the Rust for Linux people asked Hellwig -- what is your suggested alternative? Where do we go from here? Is it that Rust drivers are never allowed to bind to common interfaces? In that case, what are the Rust for Linux team doing?
That's not the kernel team's problem. They provide a common C interface. The fact that there's an impedance mismatch with binding to them from Rust code is a Rust problem.
> Or is it that you would like Linus to rethink his decision re: adding Rust to the kernel? And if so, why didn't Hellwig make that case directly to Linus? What's with all this performative bellyaching on the LKML?
I don't know what Linus's goals are, apart from keeping his maintainers happy and keeping the kernel rolling along smoothly. That's not a small thing. From what I can see, Christoph has been a maintainer for over 25 years.
Does Linus want to have his cake and eat it too? Sure. But I think he earned that right by building Linux into what it is today. The R4L team hasn't paid their dues. As someone else mentioned, it took 10 years for Clang to become a supported compiler for the kernel.
In fact, he said that as his very first reply to that thread:
https://lore.kernel.org/lkml/2b9b75d1-eb8e-494a-b05f-59f75c9...
>Everything else is distractions orchestrated by a subset of saboteur maintainers who are trying to demoralize you until you give up, because they know they're going to be on the losing side of history sooner or later. No amount of sabotage from old entrenched maintainers is going to stop the world from moving forward towards memory-safe languages.
I think it's clear from the surrounding context that you are likely over-interpreting some of Hector's comments.
What is the losing side of history here? There is simply too much C code in the Linux project to say "stop this ride, I want to get off and only use Rust" right now. This is a fight about some new code: Rust drivers in the kernel and, perhaps in the future, Rust in other places where it makes sense. I believe Hector's arguing Rust drivers are inevitable, because they are already here!
What did I say above:
> I think we should be very clear -- believing the future of systems programming is mostly memory safe isn't the same thing as saying "C programmers should...get out of the way".
The thread was not about Rust drivers, it was about adding Rust code to the DMA module. I.e. about mixing two different languages in a single module, thus requiring being knowledgeable about both languages in order to maintain it, thus making the module less maintainable. In fact, a few developers were saying that they didn't mind Rust drivers, if they used the C ABI as-is. Someone wanted to expose new Rust-specific interfaces to support cleaner abstractions from Rust drivers.
AFAIK this is false. The patch was CCed to the maintainer as an FYI, but all the code was in a Rust module binding to the C DMA interface. If I'm wrong, show me the code.
See the discussion here: https://lkml.org/lkml/2025/1/9/398
I'm willing to grant that it is possible Christoph Hellwig simply misunderstood the patch and overreacted.
rust/bindings/bindings_helper.h | 1 +
rust/kernel/dma.rs | 271 ++++++++++++++++++++++++++++++++
rust/kernel/error.rs | 1 +
rust/kernel/lib.rs | 1 +
4 files changed, 274 insertions(+)
create mode 100644 rust/kernel/dma.rs
See: https://lkml.org/lkml/2025/1/8/801
See: https://lkml.org/lkml/2025/1/9/398
You've now discovered why this blew up in the first place. All of the excuses used to reject the code were not just petty but also outright false, and trivially so.
There are _already_ dozens of hobby OS projects and embedded groups doing systems work in Zig. Everyone knows Zig is a systems language. It doesn't have a chip on its shoulder.
Nothing. That's why this was said:
>> *There's a deeper issue* in open-source culture where harshness and gatekeeping drive away passionate contributors.
It's separate gatekeeping.
I entertained getting involved in the kernel for about 3 days, in college. The process is so complex, I just went on to do other fun things. The drama will turn off others. Organizational apathy is much worse, imo. I have quit jobs for this reason and that was when I got paid to stay.
Regardless of whether you think the project should be maintained differently, that's not your call, that's their call. Fork it if you want different policies.
Isn't that also what Linus is doing, but on a professional forum, which is even worse? The issue comes down to de-escalation, and there wasn't enough on either side. It's also not unreasonable to expect more from a figurehead who is a role model in open-source development in general.
Linus said no brigading. Hector resigns twice and, the second time, despite saying he wouldn’t elaborate on Rust vs Linux, proceeds to blame Linus and start another social media brigade.
This kind of rant is typical of the public behaviour of the (typically young and "woke") modern social-media "developer" crowd, and your behaviour here only illustrates why so many dislike them. If there are any "hive mind effects" here, they're in your mind.
It's hard enough in physical spaces to remove abusers (usually the abused just stop showing up), I can't imagine there's an answer for preventing this kind of behavior in online spaces
If you want your code merged in the kernel, you have to think about things from Linus' perspective. You cannot in any circumstances try to shame someone into adopting an enormous and unsustainable workload.
What the article quotes Linus complaining about is a process issue. Paragon apparently used GitHub's GUI to merge some of their branches rather than the git CLI. Linus would prefer they use the CLI to merge branches because the GitHub GUI reportedly omits important metadata from merge commits, such as the developer email, and encourages uninformative commit messages.
From the horse's mouth (lkml; Hellwig's headers chopped for brevity):
On Thu, Jan 16, 2025 at 02:17:24PM +0100, Danilo Krummrich wrote:
> Since there hasn't been a reply so far, I assume that we're good with
> maintaining the DMA Rust abstractions separately.
No, I'm not. This was an explicit:
Nacked-by: Christoph Hellwig <hch@lst.de>
And I also do not want another maintainer. If you want to make Linux impossible to maintain due to a cross-language codebase do that in your driver so that you have to do it instead of spreading this cancer to core subsystems. (where this cancer explicitly is a cross-language codebase and not rust itself, just to escape the flameware brigade).
---
Hellwig was abrasive and unreasonable. But there is no need to perpetuate, repeat, and repost absolutely one-sided, self-serving misrepresentations of the words he used.
You don't need to paraphrase. You don't need to guess. You don't need to distill or simplify.
He wrote English so we could read it; stop paraphrasing. It's unhelpful at best and nefarious at worst.
Edit: I think it's very telling that there is a crowd here that would literally downvote the actual quote. Actually it's more sad than anything.
That's an invitation to self-reflection.
We all should consider that, in every discussion.
The opposite is to think that everyone who disagrees is by definition wrong, which can never be productive.
It doesn't also imply something like open contributions.
The odds were set against the Asahi Linux project from the beginning.
Apple makes great hardware (I have an M1 laptop I use away from home), but if I'm intending to run Linux as my primary OS, I'm buying from a company that is more open to it.
The financial situation sucks. I just threw a small donation their way, but funding a project of this scale just from end users is rarely a viable long-term solution... I feel like they need to find some high-level corporate sponsors.
My best to Hector, what he managed to pull off with the other Asahi developers is remarkable.
A lot of what I’m reading seems to make me feel that drama was… not avoided by this person, putting it charitably.
And I'm someone who I believe still sponsors them! Asahi Linux is an awesome, and dare I say necessary, project.
There is value in learning how to relinquish a constant "defensive posture" mentally. (I have struggled with, and am still working through this, personally, btw.) Heading a project like this surely challenges everyone's stoicism, though.
While it's clear that marcan faced real headwinds, they're also definitely not somebody I'd want around me in any kind of leadership position.
There are a lot of toxic behaviors that have wormed their way into certain parts of the culture. I’m not as concerned about bright, brash men like Hector.
The amount of drama that this has created for Linux when the entire situation was being handled in a non-dramatic way is staggering.
At its root, nothing about the entire situation had anything to do with marcan anyway and yet somehow he has focused an enormous amount of attention on himself and negative opinions at those he disagreed with. He habitually does things that draw attention to himself (in situations that aren't about him specifically) and then points to _any_ form of criticism he receives and cries harassment.
Of course anyone reasonable would never want to work with him.
I agree with you but just a point of clarification. Martin was working on drivers that he wrote in Rust for the Asahi Linux project. Those drivers would utilize the rust API wrappers that were the subject of this debacle. If they were to be included (as he wanted) it would have made his life easier. If they were rejected then his own driver would be made more complicated. So, he had a legitimate stake in this and when he suspected it wasn't going the way he wanted he lashed out.
The irony is, if he hadn't blown this all up, the change could very well have snuck in under the radar.
Jesus, I wouldn't want to work with either of you...
That's the entire problem.
He threw a shitfit after the situation was already being handled (not ideally, but handled), got the slightest bit of pushback from Linus, and then threw all of his toys away, with maximum publicity.
So on that, are there any guides for FOSS maintainers out there about how to deal with the emotional toll of FOSS, with a focus on self-care, how to say "No", and generally just how to deal with the people/human part of FOSS, without focusing on the technical details?
We have a ton of guides from companies and individuals how to make the technical, infrastructure, product, software parts work, but I don't remember ever seeing a guide how to deal with emotions one might feel when doing this sort of work.
I think I'm lucky that it's relatively easy for me to tell people to fuck off when I find them not contributing or being toxic, but judging by the number of people who feel a real emotional toll, bordering on burnout, while working on their dream projects, this doesn't seem to come as easily to everyone, so having a guide/manual about this would be amazing.
https://news.ycombinator.com/item?id=35234480
https://news.ycombinator.com/item?id=32947939
https://news.ycombinator.com/item?id=33792670
https://news.ycombinator.com/item?id=35251905
https://news.ycombinator.com/item?id=36107998
https://news.ycombinator.com/item?id=35238601
https://news.ycombinator.com/item?id=42990443
Out of the links that one other person posted in the replies, these are the most convincing and the most damning to conclude this. [0] [1] [2] [3]
[0] https://news.ycombinator.com/item?id=35251905
[1] https://news.ycombinator.com/item?id=37550189
Few maintainers care about the platform in question (to them it's more a curiosity, like maybe 68k), and they don't have the hardware to test any submitted patches. It's painful to have to accept code that you can't test (though it may be common in certain parts of the kernel). It's painful to see a bunch of changes for just one random feature on a random platform. It's unclear how the code will affect other platforms, etc.
Now throw in some controversial stuff. The vendor of the platform is Apple and some patches are written in Rust... oh em gee!
You guys still plug three cables each time you sit at a desk?
This laptop came out 7 years ago, but I'm sure much older models can do this just fine too.
It's not ever coming to Apple Silicon on Linux since post-Thunderspy, Thunderbolt is dangerous to implement even in the best of circumstances. You'd have to reverse-engineer and update Apple's IOMMU, write software drivers for the port since it doesn't have firmware and test it across a variety of vulnerable devices to see how secure it is.
that sentence carries a lot of weight. How many millions of users are left in the dust? Last time I touched a PC, the USB-C port could charge the laptop, unless the battery was empty. Then only the barrel plug could be used. It. was. infuriating.
Once you're finished astroturfing, you're welcome to rejoin the greater discussion.
Yeah, from some very brief research I could only find a singular Linux kernel developer/maintainer/creator who said[0] "I'd absolutely love to have [the new 2020 Air], if it just ran Linux".
Who knows if that one person has even used Apple hardware before or has access to the necessary hardware to put toward a practical use such as a "development platform" while travelling, or, "doing test builds and boots and now the actual [Linux kernel] release tagging"[1][2], let alone be supportive of experimenting with Rust in the Linux kernel[3].
The history of Linux demonstrates the project doesn't have the resources to go chasing support for a hardware platform just because Linus cares about the platform in question...
...even if at least "173 people, and many more" "contributed both to the Linux kernel side but also to the upstream Rust side to support the kernel's needs" in the initial merge[3].
----
[0] https://www.realworldtech.com/forum/?threadid=196533&curpost...
[1] https://lore.kernel.org/lkml/CAHk-=wgrz5BBk=rCz7W28Fj_o02s0X...
[2] via: https://arstechnica.com/gadgets/2022/08/linus-torvalds-uses-...
[3] https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...
Are we so busy or egoistic or ignorant that we can't stop and say thanks? What is even worse is the entitlement. People who wouldn't lift a finger to help anyone (even their own families) are usually the loudest and the most entitled ones.
I don't know if this is the case around the world (probably is?) and I don't know what the solution is. It just sucks
This is a good analogy. Children are the people who they've been raised to become, so it stands to reason that people will give money and appreciation in the same ways that these things are originally given to them. These things are social constructs; we are inevitably taught how to use them by way of social dynamics. This is all to say that people love to support their darlings... but they've been socially conditioned to expect reciprocity in all transactions. That's how the sausage is made in content-based monetization -- you produce the actual product at a loss and then try to claw it back selling high-margin merchandise that nobody'd ever buy otherwise. The merch acts as social permission to finally do your part and pay the creators.
To risk stating the obvious: this is not a good thing and I think the majority of people would likewise agree. People should be fairly rewarded for their work and we should desire a culture which openly and freely encourages doing so. Culture, however, reflects society. The society we've created is transactional, so that's how people frame the spending of their money and efforts -- indeed, "spending" and "transaction" are practically interchangeable in our collective lexicon. Effort isn't strictly scarce in the same way eggs are, however, so we fail to value it.
> I don't know if this is the case around the world (probably is?) and I don't know what the solution is. It just sucks
It's not a total disaster... so we'll undoubtedly continue to ignore the cracks in the foundation. Martin still got paid good money for his efforts and it was good for him for a time. That podcast you like will sell enough t-shirts and get the rent paid on time, at least for a little while longer. This seems to be about as good as we've collectively agreed to make the world for the time being. A local maxima, so to speak: we've gotten stuck asking for more when less might do better. With a bit of luck and effort, however, we can still catch that pendulum when it eventually begins swinging in the other direction. That's my hope, anyway! For the time being I try to do the things I'd like to see become normal in a more decent world -- sharing generously, paying for the things I like, etc. -- because hopeless accelerationism is for chumps.
*maximum
I'm incredibly grateful for all the work Hector and the others have done on this project. The Air is my dream hardware (I'm a sucker for sleek fanless laptops) and getting to run Linux on it is quite amazing.
So we're now officially acknowledging that the ostensible "owners" of Apple products don't really own their machines at all; Apple does?
Second, what’s the drama? I read the blog, and I’m guessing that on top of being burned out, which sucks, Marcan didn’t like a kernel developer using the phrase “we are the thin blue line”, which implies he’s politically liberal, in the US sense. He then says he may have been toxic on Mastodon, which might have got him secretly canceled?
All that said, I found his assessment of downstream v. upstream economics (if you can’t successfully upstream you’re doomed to rebase off a massive patch list) pretty interesting. I think the way it is now is the only way that’s good for users — if downstream forks could somehow maintain viability longer term, we would be splitting, e.g. security budgets, performance budgets, etc. I get that it sucks for a group working to upstream, and I am in no way shocked to hear personal politics plays some role in success upstreaming — open source is largely a personal / social capital economy - I guess all that said, hopefully the new Asahi maintainers can work well across what seems like ideology bounds. Maybe?
Marcan watched it unfold on the mailing list and wanted Linus to step in and force the C dev to play nice. Since nothing happened, he went to social media and lashed out as a last resort. That's when Linus finally chimed in and pretty much said "you might be right, but this isn't the way to handle things".
He didn't want any Rust code at all touching his turf. He outright NACKed without any technical reason and refused any negotiation with the Rust team, even their offer that any Rust build failures caused by breaking C changes would be entirely their responsibility.
> This led to the C dev saying that while he likes Rust, he believes multi-language codebases are cancer and would stonewall all Rust code that touches his code.
I believe that was an attempt of damage control to save face and he actually meant to call Rust "cancer".
From what I saw on urbandictionary, it seems more likely to be something cops in high crime areas in the UK say.
Apparently it's political in the US, I have no idea, but as I understand it the maintainer just means 'I am here reviewing the change to keep the kernel in good order'.
It’s a poor choice of words for such (relatively) public communication.
I don't know his full biography (he seems to be Chinese-born and went to MIT), but he signs off with 'Cheers', and I think it's a reasonable possibility that he doesn't intend whatever politically charged US meaning the phrase carries.
It is not "at least some" and this isn't something to downplay.
You're, with your US perspective, saying 'hey words have meaning you know, don't downplay murdering homosexuals' while millions^ of people smoke fags in the UK every day.
(^probably? Maybe not any more, a lot of fag-smoking relative to murder at any rate.)
Definitely a shame. I wonder if it would be in Apple's interests to actively support Linux on Mac. It would make Macs more attractive as developer machines, and I don't see how it would disadvantage them.
Trust me, we would know by now if it was. It's not.
Apple is selling a custom CPU core that has no driver support for anything but XNU with a BSD userland. It doesn't support UEFI, it depends on Devicetree bindings and would demand constant updating and support to render a "first class" Linux experience. Once again, anyone with a protracted interest in staying supported by upstream Linux should not be using a Mac and praying the community cares enough to make it good. Apple knows it's a novelty, and they're not going to take it seriously because that's just what they do. MacOS and the App Store is profitable, Linux is not.
The only development they want is development inside XCode. Anything else is a hard no.
Define "a lot" and we can get closer to a common understanding.
You can't, however, claim the right and reason to refuse PRs and code changes in the name of maintenance while simultaneously claiming that the rust community is "actively hostile to a second rust compiler implementation"; you can't have it both ways.
The entire narrative is very indicative of the state of open source unfortunately - incredibly adept programmers getting chewed up by the externalities of maintaining code (and by extension an application). Sometimes it's unappreciative users asking for handouts and sometimes it's other developers (or an organization's development resources) causing contention for a project's trajectory.
I think the entirety of it can be summed up as: forking is there for a reason.
I'm not sure how much work has actually been put into it though.
I still think the best outcome is to fork and recruit some lieutenants for community management. To me the community is losing a lot with his departure. His complaints are legitimate, and hopefully the Linux kernel team can better accommodate his patches. Many distros and corporations deliver tremendous value from their forks, and it's a better solution than quitting.
I have had to do maintenance on a distributed filesystem driver at one point. This was outside the kernel. I can see why no kernel maintainer would have wanted to look after it even if it was open sourced because the file server was a mixture of C++ and an interpreted language and working out if you'd broken the driver somehow was a miserable job. You would need obscure expertise to understand it all.
Anyone with a good idea can still fork Linux. If their idea is so great it may end up that their branch gets adopted - if they bother to maintain it.
Supporting and developing Rust is a nice-to-have, but too often its proponents try to force it deep inside important stacks built on other languages, as with Linux, or with what is going on with major things in Python as well.
Here we can see a case that is almost blackmail: the claim that the Linux community is not nice and will die if it doesn't make Rust support core and mandatory.
My point is: if you like Rust and Rust is so nice, just go all-in, do your own kernel, do your own stuff, and we will see the result in the end. But don't ruin existing good things that were working well on their own.
I'm sorry that it burned him out. But I think it is intrinsic to putting anything in the public sphere that it won't match everyone's hopes/desires/needs, that you will get such feedback, and that you have to find a way to be OK with that. (Or step back from such roles, as the author did.)
People aren't going to change.
Loved following his various social feeds. I was sad when he stepped away from the fediverse. I hope he comes back as just a regular hacker without this massive weight on his back.
It seems impossible even for very talented folks to carry such a workload without corporate backing, that is, without monetary compensation and other resources like non-public documentation. What the Asahi team has achieved is very impressive, but the more the hardware advances, the less realistic it is to support it on enthusiasm alone.
Criticism hits incredibly hard. I'd watch a friend play for hundreds of people at a concert and he'd receive a standing ovation.
But he overheard a single disgruntled remark from someone which nullified the whole experience for him.
I know he was being overly sensitive about it, but I've heard similar stories from other people too.
That seems silly; just assume people know about the change in circumstances and are trying to give you money for whatever you are doing at the moment, or in gratitude for what you've done before. The person exists, why pause the personal support?
Sidenote: thank you also to all those project leads that put themselves out there. Not easy being the $h1t magnet.
<3
He even has defenders calling for yet more CoCs in order to better "deal with" people who "get in the way." It doesn't get much more toxic than that.
Well, they were proved right.
Sometimes I wish I was so passionate, whereas my philosophy in life towards strangers is a much simpler “fuck you, or pay me”. It allows me to sleep fairly well at night.
I have a few toy projects on GitHub, a couple of which gained a tiny bit of popularity, and I simply ignored every new feature request that I didn’t need, and especially those large PRs “here, I refactored your code to make it functional and it’s so much better now”. I simply said, with no regret: “I won’t merge this, feel free to fork the project, if it’s better I might even switch myself to your project!”. Some got mad, but I truly and genuinely couldn’t care less.
#2 is the cause of #1.
Running commercial games for free has always been the killer application for jailbreaks/custom firmware for game systems. And game system emulators, for that matter.
It's sort of a perfect storm of gaming enthusiasts (particularly those with more free time than money) who want to run commercial games for free vs. companies like Nintendo who have an affinity for (overly) vigorous copyright and trademark enforcement, with jailbreak/custom firmware and emulator developers caught in the crossfire.
Rinux
Otherwise it's going to draw accusations that it's a South Park thing.
Marcan wants to use social brigading to get his way, Marcan wants the entire Linux kernel dev flow to bend for him, and, when none of his poorly presented demands get him what he wants, he is - of course - the victim here.
Asahi is neat, but it clearly isn't a working daily driver yet, and it's not abusive to make feature requests and bug reports. Discussions around Rust in the kernel are not, and can never be, an 'injustice'. In Marcan's world, everything other than vehement agreement and immediate compliance is abusive, hostile, toxic, etc. But of course, the only toxic person here is the one threatening to wield the mob to get his way.
Honestly, I'd query whether the benefit is worth the cost. I'll take average code from well-adjusted anons over clever code from bullying, hyper-online micro-influencers any day of the week.
marcan had indicated donations are down, and it's hard to support one's livelihood from donations alone, especially when they are down.
https://lore.kernel.org/rust-for-linux/c5a49bcb-45cf-4295-80...
And I've been watching from the sidelines, waiting for Asahi Linux to become "stable" enough to consider buying a Macbook and putting Asahi Linux on it.
But then marcan told his supporters to fuck off unless they committed to supporting his political ideas, which I was not willing to do.
I guess this comment will be seen as abuse from the HN crowd. Oh well...
Out of curiosity what are his political views? It has been mentioned a couple of times here already and it seems to be part of the story.
As someone not in the know, would you mind elaborating?
Now, if you want me to explain what political ideas those were: I don't care. Whatever they are, I don't want to support them, even if I hold those same ideas myself. Yes, I do think that open source communities should move away from politics.
Is this about Marcan’s outspoken support for transgender people? If so, why not simply say that in your comment, rather than framing it in such vague terms?
Surely you see why this is, actually, directly relevant and important context for your statement. It’s not some general political leaning you’re talking about - lumping this (prejudice against a minority group) into the same category as something like banal disagreements over taxation policy amounts to deliberately obscuring what you’re saying behind innuendo.
If you’ve got something to say about his political views in a public forum like this, at least do the people around you the courtesy of being upfront about what you’re actually saying.
I support the freedom of people choosing their sex or gender. At the same time, I'm not willing to fight their wars. And if they force me to go to war, then I pass.
If the group in question were gay people, or a racial minority, would you still treat the issue this way?
Working/contributing in FOSS is already slave labor in itself (literally, billion dollar companies depend on FOSS and many do not contribute back to the ecosystem they depend on). Then the abuse from other FOSS developers and community is just cruel.
Hope the guy is able to recover mentally and physically.
"... I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so)."
Rust in the Linux kernel was always going to be a long game. You don't want to have that be a blocker when what you really want is to make larger kernel changes.
The problems cited are portrayed as sociological problems, but I really wish people could recognize that all of them can be mitigated, either substantially or entirely, with a single purely-technical solution: microkernels.
* Almost nobody needs to upstream code to the kernel
* Trusted codebase size becomes negligibly small
* Maintenance burden for drivers, subsystems, etc., falls on the users of the subsystems affected, and not the entire community
* Broad language compatibility by service interface instead of ABI compatibility. The need for a singular compiler is reduced in scope down to the size of the subsystem instead of the entire ecosystem.
The biggest problem that can't be solved purely technically is the entitled user problem, but even that is partially solved. This is because the barrier to contribution is substantially lower:
* I can write code in Rust, but I don't know C.
* I can easily write simple drivers for some hardware features like battery managers and fan controllers and temperature sensors (see the sketch after this list), but I don't know anything about kernels.
* I have a lower, but non-zero understanding of security, and would not feel comfortable writing code that runs on ring 0, but wouldn't feel inhibited writing code that benefits from process isolation.
Those attributes about myself inherently mean that for a microkernel OS, I can be a contributor, but for Linux, the best I can be is an entitled user.
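To make the fan/temperature bullet above concrete, here is a minimal, hypothetical sketch of what such a contribution could look like when drivers are ordinary programs talking to service interfaces. The trait names, thresholds, and mock servers are all invented for illustration; no real microkernel exposes exactly this API.

```rust
// Hedged sketch only: the traits, types, and thresholds below are invented for
// illustration and do not correspond to any real microkernel's API. The point is
// that the driver is ordinary userspace code written against a service interface,
// so a bug is contained to one process rather than ring 0.

/// Interface a temperature-sensor server might expose (over IPC in a real system).
trait ThermalService {
    fn read_millicelsius(&self) -> i32;
}

/// Interface a fan-control server might expose.
trait FanService {
    fn set_duty_percent(&mut self, duty: u8);
}

/// The "driver" itself: pure policy logic, no unsafe code, no kernel knowledge.
fn fan_policy(sensor: &dyn ThermalService, fan: &mut dyn FanService) {
    let temp = sensor.read_millicelsius();
    // Map 40 C..=90 C onto 20%..=100% duty, clamped at both ends.
    let duty: u8 = if temp <= 40_000 {
        20
    } else if temp >= 90_000 {
        100
    } else {
        20 + ((temp - 40_000) * 80 / 50_000) as u8
    };
    fan.set_duty_percent(duty);
}

// Mock servers so this sketch runs standalone; a microkernel would route these
// calls to separate processes that own the real hardware.
struct MockSensor(i32);
impl ThermalService for MockSensor {
    fn read_millicelsius(&self) -> i32 {
        self.0
    }
}

struct MockFan(u8);
impl FanService for MockFan {
    fn set_duty_percent(&mut self, duty: u8) {
        self.0 = duty;
    }
}

fn main() {
    let sensor = MockSensor(65_000); // pretend the SoC reports 65 C
    let mut fan = MockFan(0);
    fan_policy(&sensor, &mut fan);
    println!("fan duty set to {}%", fan.0); // prints 60%
}
```

The design point is that the policy code never needs ring 0, a particular in-kernel ABI, or even C; the same logic could sit behind whatever IPC mechanism the OS actually provides, and a crash takes down one process instead of the machine.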
I hope someone else can take up the lead and that the clever people involved can continue that work
Do the demigods of the Linux kernel - Linus and the core maintainers - personally want the kind of code Asahi is developing to be merged into the kernel? The author writes as if part of his drive was that Linus himself showed enthusiasm for getting Linux on Apple Silicon.
If there is interest in the work Asahi has done, then the Linux team needs to describe what they see as the gap between today's code quality and support model and what they want to see before upstreaming.
It sounds like the Linux team has been wishy-washy and needs to draw a line in the sand on their needs rather than handwaving about being part of the "community".
It would be fair to say "we don't like your attitude or trust you to work with us kindly over the years and don't want to deal with you", if that's the case. Just don't dance around it.
Perhaps the author jumped to conclusions after Linus himself started using Asahi Linux on his own laptop for Linux kernel development[0]. Note the praise for the Asahi team in the commit message.
It sounds like you agree with me though. The Linux team needs to clearly define the expectation they have for code maintenance from the team trying to upstream Rust code (edit) and the Asahi team needs to acknowledge how/if they can meet those expectations.
The challenge is not dictating some criteria from on high; the challenge is discovering the criteria that will let the Linux project continue development as well as can be arranged. This is why you'll hear Linus say it's a learning experience, and not just make proclamations of how things shall be (at this stage).
Would you say that the Asahi team wasn't receptive to the pace at which the needed criteria were being developed?
My point is that between these two groups there seems to be a misunderstanding of expectations. And being the upstream org, and not having read every mailing list thread, I would expect the kernel team to have built a framework for accepting this kind of code. Or a framework for building the framework.
It sounds like the implicit answer to "Does Linux want Asahi contributions" is "low priority". Which is fine if that's communicated.
I sense you have been involved in these discussions already and have a strong opinion about the specifics of this topic. I don't mean that in a bad way.
I have very little personal interest in Asahi, I am not really part of that "conversation", but I dislike outsiders coming in and expecting to dictate how something that predates them should work. Everyone is entitled to their opinion, but that doesn't mean anyone else has to listen to it. If you want to understand, read the linux-kernel mailing list and watch people like Al Viro work (minimum realistic time allotment: multiple months).
What I will say is that I get the impression that he is quite sensitive and takes things too personally, which is a death wish online. People are savage online, and if you don't have thick skin they will destroy you. Not justifying it, but it is what it is.
Either way, I hope all goes well for him. Sad to see this happen; I bought an M2 Mac because of Linux support :(
Why not fork Linux and call it something new, with hardware support exclusively for M1/M2/M3/M4 Macs?
It was hard to decide to stop the financial support. On the one hand, I want maintainers to be able to take breaks without worrying about their livelihood. On the other hand, it was very difficult to tell what was going on and whether Marcan would ever get back to working on the project.
Did you read the rest of it?
Sure, Apple isn't great to develop drivers for, but ~99% of the article is about displeasure with working with Linux maintainers.
EDIT: Here is the breakdown by paragraphs.
| Paragraph # | Tone |
| ----------- | -------------------------------------------------------------------------- |
| 1 | History recap |
| 2 | History recap |
| 3 | mentions Apple and M1 in positive light. |
| 4 | Mostly positive, slightly negative towards Apple not having good docs. |
| 5 | Negative Linux kernel development (upstreaming) |
| 6 | Negative user focused |
| 7 | Negative user focused, complaints about M3/M4 support. |
| 8 | Negative user reviews |
| 9 | Money troubles regarding support. |
| 10 | Linux maintainers, mostly negative |
| 11 | Unknown 2024 event |
| 12 | Negative about users (demanding more features and support) |
| 13 | Stress about Kernel development |
| 14 | Negative about Linux kernel development roadblock |
| 15 | Negative about Linux kernel development (Linus leadership failures) |
| 16 | Positive about Rust |
| 17 | Negative about Linux kernel development (Why Rust can't wait) |
| 18 | Negative about Linux kernel development (downstreaming) |
| 19 | Negative about Linux maintainer (thin blue line) |
| 20 | Negative about Linux kernel maintainers (two faced) |
| 21 | Negative about Linux kernel maintainers |
| 22 | Negative about Linux (disappointment in refusing invitation) and Linus |
| 23 | Negative about Linux maintainers (being corporate) |
| 24 | Negative about Asahi, and dreading to turn on Apple |
| 25 | Negative about burnout |
| 26 | Resignation |
| 27 | Positive about Asahi Linux team |
| 28 | Hiring proposition. |
18 negative paragraphs
2 about Apple (11%)
4 about users (22%)
12 about Linux/Linus/maintainers (67%)
I was wrong about it being 99% about LKM, but it's more accurate than saying 50% of the issues are about Apple.
I counted paragraphs topics myself: History (3), Proprietary Hardware Problems (8), Kernel/Rust problems (8), Other/Quitting (7). Could be off by one or two because I'm not a machine.
Also, I note that you did not disagree with or object to my initial response (the one made before your edit) that it is indeed halfway down the page before rust/kernel stuff is even mentioned.
I think it is safe to say that both Apple and the kernel/rust issues matter here and trying to derail any discussion of Apple's role into even more rust ragebait threads in a HN topic full of them is counterproductive.
No. Those are issues caused by users. You could have the most open hardware platform, and they would still persist. See any OSS maintainer complaining about unrealistic user expectations.
> I counted paragraphs topics myself: Proprietary Hardware Problems (8), Kernel/Rust problems (8)
How are you counting those? I made a table; point me to the exact paragraphs. Also, it's deceptive to pull Rust into this story. marcan had nothing but praise for it; without it, he wouldn't have been able to write those drivers.
The main complaint of marcan is the horrible experience you have as a hobbyist Linux contributor. You can't blame Rust or Apple for that. That's on Linus, and Linux maintainers.
One group believes it is Rust (progressives), one group doesn't believe that and wants to continue with C (conservatives).
If they cannot find a way to live at peace with each other, I think the only solution is for the Rust folks to start building the kernel in Rust and not try to "convert" the existing kernel to Rust piece by piece.
Why they cannot live in peace seems to come down to finding a way for the C kernel folks not to have to deal with Rust code.
At the core, the story is not that different from introducing new languages to a project.
You are introducing a new tax on everyone to pay for the new goodies you like, and those who are going to be taxed and don't like the new goodies are resisting.
I haven't believed in the Linux project for a few years now, especially as "the bearded ones" are not interested in moving the project toward a certain future, only in jerking around with their own old code.
Good luck for the future, Hector, and thanks for what you managed to do until now with your team.
I personally lost my confidence in it when they stopped properly triaging security issues and flooded everyone interested with just noise.
C is a simple language.
Rust is Lojban.
Trying to convert people who like C to Rust was a bad idea.
> I ended up burning out, primarily due to the very large fraction of entitled users.
I stopped reading after this sentence. The God complex is simply too much for me to digest. TL;DR: "I am a programming God; they are newbie peons who refused to bow at my altar of greatness." On other posts about this person, I read multiple comments from people that could be summarised as: "This person is an (amazeballs) amazing programmer, but also a total drama queen." I say, with respect, "You will be missed. Thank you for your contributions." <sigh of relief>

That of course was difficult in the corporate environment of 2014-2024. Perhaps he was forced to do it.
In many areas, sanity has returned, so perhaps we can get clearer messaging again in the future.
No clue what you could mean by this
>Perhaps he was forced to do it.
By whom? Linus is the one who decides these things.
> Hi! It looks like you might have come from Hacker News.
> Asahi Linux developers are frequent targets of abuse on Hacker News. Despite our pleas, the moderators have not taken effective action to improve the situation.
> Overtly hateful content is often flagged on HN and not immediately visible. Unfortunately, when a comment is flagged and killed, its child subthread is not. That preserves the 'clean' image of the website, but the reduced moderation activity enables abuse to continue. Although you don't see those threads, search engines do. HN uniquely has a high page rank and low moderation, making it a prime target for bad actors to poison search results with abuse, bigotry, and nastiness. This isn't low-level trolling, but an organized attempt to destroy lives, including of developers in our communities.
> Please demand change within your community.
This is an unfair and gross assessment. I've lost some respect for Asahi over this.
They're calling for extreme moderation of opinions they don't agree with, which is the opposite of open discourse.
Asahi: deal with it. You're Streisand Effecting this. Your inability to handle drama is actually causing more drama. Just turn the other cheek and ignore it.
saying things like "an organized attempt to destroy lives, including of developers in our communities" is patently not true. trolls get flagged. honest nice people who don't agree with you aren't trying to destroy anything and nor do they hate you.
* "bad actors" links to https://en.wikipedia.org/wiki/Kiwi_Farms
* "destroy lives" links to https://en.wikipedia.org/wiki/Kiwi_Farms#Suicides_of_harassm...
While you may be correct that initially "trolls get flagged", the statement on the Asahi site agrees that while the initial comment may be flagged and killed, the other comments in the subthread are still indexed, visible, and tend not to get moderated/flagged:
"Unfortunately, when a comment is flagged and killed, its child subthread is not. [...] but the reduced moderation activity enables abuse to continue. Although you don't see those threads, search engines do."
Based on other remarks about the content of such subthreads it seems surprising to claim that follow-on comments are made by "honest nice people".
I'm as much of a fan of adverbs as the next person but using words like "categorically", "extremely" & "patently" doesn't seem to leave much room for nuance of interpretation when written by someone who I'd have assumed was a third party observer?
While I could understand someone describing JWZ's HN-tailored "banner" (I wouldn't suggest researching this if you aren't already familiar) as gauche and immature, it feels like something of a stretch in relation to a plain-text message whose last sentence starts with "Please".
> the other comments in the subthread are still indexed, visible & tend not to get moderated/flagged
indexed: please complain to google.
visible: not unless you turn on show-dead. so don't do that.
don't get moderated: they are already dead.
> I'm as much of a fan of adverbs
i mean what i said. i'm extremely tired of seeing histrionics and exaggerations, misplaced blame, etc. turned into loud, unfair, criticism toward what is probably the best moderated group i can think of.
JWZ's banner is at least recognizable as satire, and his opinions are well known. i can disagree with him, but still find it a little bit funny (and immature). but if you do the same thing (yes, with a please), then you are just exactly as mature. and if you are serious, less grounded in reality and not nearly as funny.
What is the "it" that you're insisting they "deal with," here? What is the "drama?"
Also what value does bigotry, homophobia and transphobia have in open discourse that it must be preserved? None of that is on topic for Hacker News, why must it be on topic for the Asahi Linux community?
Turn the other cheek. Ignore it. It's 2025 and we're learning lessons from USENET all over again, having to rein in the over-sensitive, dysregulated behavior of some people.
I'm gay, on the spectrum, and my wife is trans. What certain people in "my" community do from places of relative comfort makes life for those of us in more moderate / conservative-leaning places worse. The screeching from our community [2] has turned our little demographic into a major culture war topic, and it's all because of the bad attention and friction you manufacture.
Conservatives let LGBT and trans issues slide for over two decades of my adult life. But by being loud and attempting to silence them -- by harassing them -- you've become the nail that sticks out and have now created a tidal wave of opinion against us.
It's easy for some European or SF trans person to call for universal outlawing and censoring of speech, but you have to realize your message is being read all over the world. It's interpreted by an overwhelming number of people as attempting to memory hole conservatives and flush away their culture.
Simultaneous to your harmful messages, folks are also being inundated with social media rage/engagement bait to make them think liberals are literally attempting to destroy and annihilate conservatives [3].
Your message adds weight to this perception, and all you accomplish here is making the majority of voters angry at us. It even turns moderates and would-be supporters sour.
I hate that you represent me by association and think that this is acceptable behavior.
As another anecdote, when I talk to my friends about Rust, the subject of "drama" frequently comes up. Why is that? Suddenly my work becomes harder for an entirely unrelated and unmerited reason. That's just me as an LGBT person - imagine how straight people feel.
We shouldn't have to keep reading about this over and over. It's orthogonal, childish, dysfunctional behavior.
Take one more look at that loud disgusting banner on the top of the Asahi page. That's neener-neenering in front of everyone. Even the moderates you hope to be your allies. Please, for god's sake, put yourself into different shoes. You're asking them to do it for you, but it's your turn.
I think you'll see that your behavior is also harassment.
Please calm down, slow down, and behave like adults. Not everything warrants a response or attention. Chances are, it'll just go away and get totally ignored. When you engage, you shift the conversation and bring yourselves down to their level. You create a firestorm of drama that everyone watches like a burning wreck.
Stand above that.
[1] I only wanted to talk about the very public, inflammatory resignation and the immature handling of this by certain parties.
[2] eg, folks whose entire personality is to harass people on social media: https://www.tiktok.com/@lillytino_/video/7295890626539687210
[3] Just look at this image and how religious people take it: https://danolinger.com/2018/11/01/responding-to-persecution-...
In the US, DADT was repealed in 2011. Obergefell was 2015. The idea that they let LGBT and trans issues slide for over 20 years is fundamentally wrong and not supported by history.
I reject the rest of your post and the defense of those that would take rights away from individuals and myself because they have to be coddled.
While my life experience has been different to yours, from what you've written about how you've been treated by others in your community, as a consequence of who you are, it seems understandable to me that you might experience those feelings--and, even if they didn't seem understandable to me, it is more important to me that you feel heard and your feelings acknowledged as valid and not dismissed.
I hope I have been able to communicate that intent effectively.
----
At the risk of falling into the stereotype traps of "straight white male thinks every rhetorical invitation is a literal invitation for him to say what he thinks" & "straight-splaining" I did want to provide an answer to the question in the last sentence here:
> "As another anecdote, when I talk to my friends about Rust, the subject of "drama" frequently comes up. Why is that? Suddenly my work becomes harder for an entirely unrelated and unmerited reason. That's just me as an LGBT person - imagine how straight people feel."
(I preface the following with an acknowledgement that it's bullshit that you have had to deal with the impact of this rather than the predominantly straight white males who don't want to be made to feel uncomfortable.)
TL;DR:
FWIW, from my perspective as a straight white male I feel the subject of "Public Interpersonal Conflict" attributed to Rust is directly related to values rightfully espoused/embodied by the Rust project/community/language that are at odds with values held by other groups.
Specifically, groups consisting of predominantly straight white males believe that the comfort of predominantly highly skilled straight white males should be prioritized over the physical well-being of other humans; and, also over the security and stability of the software other humans use.
They are also unlikely to agree with this characterization.
Unlike the above group however, rather than targeting resentment at the people whose physical well-being is at risk I choose to direct my resentment at the predominantly straight white males who choose to dismiss important issues as unimportant "drama" because they resent being "made" to think about issues that impact people other than themselves.
----
For anyone who disagrees with my characterization, I would point out that we do not know what else Alan Turing might have contributed beyond "Turing Completeness" and "the Turing Test" to current in-demand fields such as AI if he hadn't been persecuted for not being a straight white male.
I would also remind them that the ARM CPU attached to that unified memory on which they're running their latest AGI & LLM models exists thanks to another person whom some people in the present day think should be persecuted for daring to exist.
But equally people shouldn't have to trade advancements in the field of Computer Science for the right to exist without persecution.
----
I will acknowledge that it's entirely understandable to want to avoid the associated discomfort, because from personal experience it is very uncomfortable to have to re-evaluate one's place & responsibility in the world after a lifetime of being told something different.
----
The other ~2,500 words I wrote on the topic was certainly more nuanced but pretty much said the same thing with more beating around the bush with additional personal context.
For any straight white males who may be confused why someone might think as I do, all I can say is that time spent reading/listening to this (unfortunately, archived) resource is likely to be worthwhile, if temporarily uncomfortable: https://geekfeminism.fandom.com/wiki/Geek_Feminism_Wiki
"Everyone else is a bully, but not me - I'm just trying to raise awareness."
Also, I'll add that whenever I've seen an unflagged hateful comment I've emailed hn@ycombinator.com, and the success rate in getting the comment killed and people told off (or banned) is thus far exactly 100%. This usually happens if someone leaves a comment a few days after the discussion dies down, so few see (and flag) it.
I can dig up many such examples, but I suspect the response would be, "of course that's not moderated" because this community has a different set of values than some others.
Moderation is always an editorial action, and as such we tend to view it as strong when it aligns with our own values and weak when it doesn't.
IMO, if you're going to make charges like this, which would be serious if they were true, you should include links so readers can make up their own minds.
https://news.ycombinator.com/item?id=42907076
https://news.ycombinator.com/item?id=42783776
https://news.ycombinator.com/item?id=42780835
https://news.ycombinator.com/item?id=42718838
https://news.ycombinator.com/item?id=42708579
https://news.ycombinator.com/item?id=42700319
https://news.ycombinator.com/item?id=43034231
https://news.ycombinator.com/item?id=43031405
https://news.ycombinator.com/item?id=42959625
My methodology: Search for any of the following terms: woman, biological, Black, Latino, gay, trans, woke, dei, or virtue signal
Set to "Comments" and "30 days". You'll find plenty of people saying things that are pretty awful. Yes, they are not the majority of posts, this place isn't a cesspool, it's just a place that permits "just asking questions" or "it's up for debate" as a defense for behavior
I could find many, MANY more examples.
Of the 10 links you listed, 7 seem to me obviously to break the site guidelines and I've flagkilled them. One, incidentally, was from an account that we banned earlier today (https://news.ycombinator.com/item?id=43042278), and another was from an account that we banned a couple days later (https://news.ycombinator.com/item?id=42483610).
Of the remaining 3 of the 10, I disagree with you about saagarjha's comment: https://news.ycombinator.com/item?id=42907076. That one seems thoughtful and in keeping with the site guidelines. It does use a lot of sort-of trigger words (I counted "trans", "vegan", "left wing", "Democrat", "progressive", "conservatism", "Republican"), but surely we're not going to punish people just for using words like that.
The other two seemed borderline to me, although I confess that one was so long that I couldn't read it before becoming le tired.
> I could find many, MANY more examples.
I'd be interested in seeing them, and I hope it's clear that I mean that. I don't want to argue about this—I want to see what you're seeing.
I'll dig up some more. They tend to be a bit stochastic, and on various topics. (There was an article about pg, written by a trans woman, that made the front page a while back and was an absolute lightning rod for this, IIRC.)
There were a number of remarks on the prior thread by people making conspiracy claims, harassment, insults etc. Some of them get flag-killed, some just down voted but ultimately the users on the site still remain.
Of course I'm not one to be above such a thing in terms of insulting people occasionally but HN is really quite permissive in terms of what you can post and get away with. It takes consistent and repeated bad behavior to get a warning, and even more to get banned. And if you're an expert in being politely venomous you can get away with even more. That's why the outside perception of HN tends to be a lot worse than the inward one.
You have NFI what you're talking about. There were major architectural changes in M series chips between Avalanche/Blizzard (M2) and Everest/Sawtooth (M3).
This isn't up for debate. There's tons of evidence out there, including the stream where his VTuber software failed briefly and he "doxxed" himself. It's not a fake. I was there watching, and rooting for him to succeed.
Rather than further clutter up this thread with the same links, yet again, I refer you to:
https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
What if, after Linux and Git, Linus came up with his own memory-safe language suited for kernel development?
I hope the language industry will either make safer languages, or at least push hard to use static analysers.
The correct way would have been to maintain the Rust code out-of-tree, for as long as it would take, which would also somewhat prove that you are ready to maintain that code over a longer time period.
Sad that this led to him stepping down, but maybe others in the Asahi Linux circle are ready to keep maintaining the code out-of-tree until everyone is ready for it
That's the only other project I can think of that's out of tree besides the usual suspect (ZoL)