> a gigantic manual that lists every property of the system in excruciating detail, which is totally worthless for learning and barely usable as reference.

It's the only usable form of reference! I want all the details to be presented in a reference. Where else?

> low-level tools are terrible too

It seems to me the author is confusing lack of familiarity with lack of existence. There are lots of fantastic tools out there, you just need to learn them. They don't know them, so conclude they don't exist.

> We could have editor plugins and language servers to help beginners along

We already have all that.

> It's the only usable form of reference! I want all the details to be presented in a reference. Where else?

I guess it's like a dictionary: it's only useful if you know the word you want to look up, rather than reading through every definition until you find the function/library/ability that you want. I do agree though, when I need to look something up, I do want it in great detail - it just isn't a very good learning resource.

> It seems to me the author is confusing lack of familiarity with lack of existence. There are lots of fantastic tools out there, you just need to learn them. They don't know them, so conclude they don't exist.

Can you give some examples? The author made a compelling argument for how easy it is to use the browser debugger. It would be of great interest to see something similar elsewhere.

> We already have all that.

I've only seen these for simple Python applications or web development, never in any 'low level' space. And certainly not for doing anything interesting in the low-level space (something that is not just a C++ language tutorial).

Language servers with LSP for Rust and C++ are available and, I believe, widely used. At least I use them.

I think the point about tooling being the problem deserves more emphasis. I'm a firm believer that the right thing to do should be the easiest thing to do. Currently, the easiest place to innovate is at the top of the stack, using web technologies and languages like JavaScript.

You can see this with languages like Rust and Go—they're some of the first low-level programming languages with actually good tooling, and, as a result, they're becoming very popular. I can pull down some code, run `cargo build`, and not have to worry about the right libraries being installed on my system and whether I've generated a Makefile. Packages are easily searchable, hosted on popular platforms like GitHub, and I can file bugs and ask questions without having to join an obscure mailing list or deal with an unfriendly community.

If you want your language/library/framework/layer in the stack to become popular, make the tooling good and make it easy for folks to get their questions answered. Everything else will follow.

> Instead, I imagine a future where we have new “high-level” tools, built from lower in the stack.

This is exactly what I'm trying to build. I'm writing a library on top of Qt that makes writing native code as easy as writing React code! I would say it's even easier, since we throw all the constraints of browsers and web APIs out of the way.

I feel like the term 'engineering' implies bending the rules of reality toward your goal in a way that is both economical and effective at reaching the stated goal.

For example, if you're an automotive engineer, you can't just put in the thickest beams made of the strongest steel on hand, because the resulting car would weigh 20 tons and cost $300k. On top of that, it would probably drive like crap and wouldn't even protect the driver that well.

In engineering, even a 10% waste is frowned upon; the car example above is a 10x waste. And I don't think a Reddit comment taking 200ms to open or close is only a 10x waste, but a couple of orders of magnitude more.

Why is it that, despite tons of very highly paid software 'engineers' (more so than in any other field) working on websites like Reddit (hardly the only example of this trend), the result ends up like this?

It's not a lack of resources, nor constraints, nor pace of development (I don't remember the Reddit site changing all that much in the past decade).

I think if software engineers had the mindset of other engineers, this would be considered ridiculous and unimaginable, like the 20-ton car is for automotive engineers.

  • dijit
  • ·
  • 2 hours ago
  • ·
  • [ - ]
> I feel like the term 'engineering' implies bending the rules of reality

I'd phrase it differently: Civil engineering is fundamentally about understanding trade-offs within hard constraints. You have materials of known strength, known wear, and known properties (compression vs shear). It's boring by default because the physics don't budge.

A lot of software engineering, web and SaaS development in particular, hasn't had to confront comparable resource limitations. For decades we've had (for practical purposes) an infinitely fast calculator, a practically infinite supply of active working data, and an infinite number of those calculators to chain together. So, without constraints, people have just run wild.

But here's where it gets interesting from my perspective: when you point out the resulting bloat (200ms to open a Reddit comment), many developers will defend it not as a technical failure but as correct business prioritisation. "User hardware is cheap, developer time is expensive" or "users will upgrade their devices anyway"—essentially externalising the performance cost onto users rather than absorbing it as an engineering constraint.

That's the fundamental difference. An automotive engineer can't build a 20-tonne car and tell customers to buy stronger roads. But we absolutely can ship bloated software and tell users to buy faster computers, more bandwidth, better phones. And for a long time, we've got away with it.

The question is whether that's still sustainable, or whether we're approaching the limits of what users will tolerate.

I think the main issue is lack of competition. If company A makes a 10% worse car that costs 10% more than company B, they very quickly go out of business.

There is no 'Reddit 2' substitute product (or indeed for lots of software), and network effects tend to dominate, so your benchmark is 'is it bad enough so people would rather use nothing than your product', which is a very low bar to clear.

  • dijit
  • ·
  • 2 hours ago
  • ·
  • [ - ]
Spot on. And it's worse than that: you're not choosing between a 10% slower product and a faster alternative, you're choosing between a slow product and losing access to everyone still using it. That's not a market choice, it's a hostage situation.

We can see this working in reverse: in developer tools, CLIs, and local apps where network effects don't apply (ripgrep over grep, esbuild over webpack), performance actually matters and gets rewarded. Developers switch because they can switch without losing anything. But Instagram users can't switch to a lighter alternative without abandoning their social graph.

This is why the "developer time is expensive, user hardware is cheap" argument only works in the absence of competition. In genuinely competitive markets, efficient code becomes a competitive advantage worth investing in. The fact that it's "not worth optimising" is itself evidence of market power, not sound economics.

Your automotive analogy actually understates it: imagine if switching to a better car meant your old car's passengers couldn't ride with you anymore, and that's closer to what we're dealing with.

A major difference is what costs money.

A civil engineer might work on a major bridge that costs a billion dollars to build. An automotive engineer might work on a car with cumulative production costs in the billions of dollars. An aeronautical engineer might work on a plane with a $100 million price tag.

The engineer’s job there is to save money. Spend a week slimming down part of that bridge and you’ve substantially reduced costs, great! Figure out how to combine three different car parts into one and you’ve saved a couple of dollars on every car you make, well worth it.

Software doesn’t have construction costs. The “engineer” (I have the word in my job title but I hesitate to call us that) builds the whole thing. The operating costs are often cheap. Costs like slow rendering are paid by the customer, not the builder.

In that environment, it’s often not a positive ROI to spend a week making your product more efficient. If the major cost is the “engineers” then your focus is on saving them time. If you can save a week of their time at the cost of making your customers wait 50ms longer for every action, that is where you see your positive ROI.

When software contributes to the cost of a product, you tend to see better software work. Your headphones aren’t running bloated React frameworks because adding more memory and CPU is expensive. But with user-facing software, the people who pay the programmers are usually not the people who pay for the hardware or are impacted by performance.

At the big tech firms, there are engineers looking for software fixes that make tiny efficiency improvements that can save lots of money at scale.

Meanwhile, Google and Apple look for whatever ways they can to improve battery life on their phones.

But for many other developers, this isn’t going to save money or increase sales, so the incentives are more indirect.

  • cjfd
  • ·
  • 10 hours ago
  • ·
  • [ - ]
Sure, high level is the goal. But the question is whether the abstractions are the correct ones that fit the problem. Almost all software that I have encountered that was painful to work with chose a framework that did not apply to their situation.

E.g., develop a generic user interface framework which makes it very quick to produce a standard page with a series of standard fields but at the same time makes it very painful to produce a non-standard layout. After that is done it is 'discovered' that almost all pages are non-standard. But that 'discovery' could also have been made in five minutes by talking to any of the people already working for the company....

Another example: use an agent system where lots of agents do almost nothing, maybe translate one enum value to another enum value of another enum type. Then discover that you get performance problems because agent traffic is quite expensive. At the same time, typical Java endless typing occurs because of the enormous amount of agent boilerplate. Also, the agents that actually do something useful become god classes, because basically all non-trivial logic goes there....

> Sure, high level is the goal. But the question is whether the abstractions are the correct ones that fit the problem.

Not quite. The path to high level always involves abstractions that fit the problem. There is still room for a decision to replace high-level with low-level in some very specific bits of a hot path, but that decision also takes into consideration the tradeoffs of foregoing straightforward high-level solutions for low-level versions that are harder to maintain. The sales pitch to push code that is harder to maintain requires a case that goes way beyond performance arguments.

I really like the sentiment here, and Handmade Network has such a cool vibe, but I can't help but think that he/they would have a bigger impact by focusing more on illustrating to people how this mindset leads to value and less on teaching and learning the skills.

>Building it yourself might sound crazy, but it’s been done successfully many times before—for example, Figma famously built their app from scratch in WASM and WebGL, and it runs shockingly well on very large projects.

Yes, let's hear more about this. "Collapsing Reddit comments could have been like 180ms faster" isn't very convincing to smart, ambitious people deciding what they want to be about. Find more examples like Figma and get people to believe that there's still lots of room for up and comers to make a name for themselves by standing on their performance, and they'll take care of the learning and building themselves.

> "Collapsing Reddit comments could have been like 180ms faster" isn't very convincing to smart, ambitious people deciding what they want to be about

It's fairly compelling to an audience who spends a lot of time browsing reddit, however

Even 20ms is crazy; video games rendering and simulating entire worlds at 60fps do so in 16.6ms.

I think the real conclusion is: someone has to make a native cross-platform desktop UI framework that doesn't suck. (Yeah, Qt exists, but it really sucks...) Until then, everyone will default to just using the browser for a desktop app, and the beatings will continue.

Because of this, I'm really looking forward to PanGUI stepping up (https://www.pangui.io/). Their UI framework is very promising, and I would start using it in a heartbeat when the beta actually releases!

> someone has to make a native cross-platform desktop UI framework that doesn't suck

That's the browser. Native UI development failed because nobody wanted to lose money on cross-platform compatibility, security, or user onboarding experience.

The web is fast enough for 99% of UIs. The story is not about using the web; it's about using the web poorly. old.reddit is not Qt.

Why do you think Qt sucks? Other than the C++ focus, and the framework's decision to make you use everything they give you, which is controversial.

But apps made with Qt don't suck as end products. Qt is a fully featured, modern, high-quality framework.

I agree. Modern Qt is great: you get the performance of C++ and the ease of use (simpler than React!) of creating UIs in QML. I've built my note-taking app, with a from-scratch block editor[1] (like Notion's), using it. Now working on a mobile version[2]. Also working on an LLM client[3]. And working on a library that will simplify and help others build such apps easily, which is my solution to the original article author's problem.

[1] https://get-notes.com/

[2] https://rubymamistvalove.com/notes-mobile-swipe-stack-view.M...

[3] https://www.get-vox.com/

I have yet to find a cross-platform UI framework that really feels native on Gnome. The reality is that there is no shortcut to native UI on all platforms: you have to use the “official” framework (e.g. GTK for Gnome). In my opinion, the only software that should use a custom framework is professional-grade apps whose custom workflows can't fit the native UI (e.g. Blender). Despite all the noise about native cross-platform frameworks, it's absolutely not fooling anyone; we can spot it immediately on any platform.

> I have yet to find a cross-platform UI framework that really feels native on Gnome. The reality is that there is no shortcut to native UI on all platforms: you have to use the “official” framework (e.g. GTK for Gnome).

The alternative to cross platform frameworks that do not feel completely native on all platforms is to use browsers for desktop apps which do not feel native on any platform. They do not even have similar UIs to each other.

We would be better off using imperfect cross platform frameworks rather than sticking everything in the browser.

I think part of the reason this happens is that users accept it because they are used to web apps so do not expect consistency.

This is correct. What's actually needed is a modernized wxWidgets. The goal of the GUI framework should be to find an architecture that is maximally compatible with the native Windows, GTK, Qt, and macOS UI frameworks/libraries, plus a simple way to accommodate minor platform differences.

There's Lazarus / FPC. Fast, small. Extremely quick compiler. The language has its warts but is very readable and relatively simple yet powerful. Uncool and treated like a relic, though.
> someone has to make a native cross-platform desktop UI framework that doesn't suck

This is exactly what we're trying to do with Slint (https://github.com/slint-ui/slint ). It’s a native, cross-platform UI framework for desktop and embedded (Rust/C++/Python/JS), with no browser runtime

Has somebody written an analysis of why Qt really sucks? It would be great to have a spec for a GOOD cross-platform (desktop) UI framework. It might also be possible to create a reference implementation of that spec on top of Qt.

Never heard of PanGUI - glad to see it's C#, will have to try it out.
They're making the initial version with C#, but they have plans to make the library language-agnostic. Rather than creating bindings, they'll write it in a subset of C# and then transpile it to C++/Jai/Zig/etc, so you can get the best language integration without the hassle of also wrangling with your build system.
>Rather than creating bindings, they'll write it in a subset of C# and then transpile it to C++/Jai/Zig/etc, so you can get the best language integration without the hassle of also wrangling with your build system.

It sounds like a clever idea.

What about Flutter?

Not bad for mobile apps, but it still sucks a lot for desktop.

Also, I really wish they'd opted for a more general language like C# rather than Dart - but that was inevitable, since Google needed to make use of their Dart language after they failed to standardize it on the web (and I think they didn't want to use a language developed by Microsoft of all companies).

They picked Dart because it was the only language that could deliver small AOT binaries and a hot-reload-capable runtime without compromise, and, most importantly, because they could influence the development of the language.

C# is one of the worst choices they could make at the time.

Why would C# be the worst choice? Do you have any real arguments, or is it just your biased opinion?
  • ·
  • 7 hours ago
  • ·
  • [ - ]
Anything that forces a specific language is a no-no.

There was WxWidgets.

The main consensus in the native space is that Qt is still miles ahead of any other cross-platform desktop framework (including WxWidgets). That doesn't mean Qt is anywhere near good; it's just the least bad option of them all.

I hoped someday Flutter might be mature enough for desktop development, but so far they've focused most of their efforts on mobile and I don't think this will change in the future.

As one data point to support this, see Audacity moving from WxWidgets to Qt for 4.0.

Absolutely: they need Qt in order to design and theme a UI that actually doesn't look terrible. (They already had good experience porting MuseScore from vanilla C++ Qt5 to QML widgets, so I think they'll use a similar system for Audacity.)
> I hoped someday Flutter might be mature enough for desktop development

I really don't think there is any broad future for Flutter. Requiring adoption of a new programming language makes an already uphill battle even steeper, and the way they insist on rendering websites into a single giant canvas is... ugh.

> The main consensus in the native space is that Qt is still miles ahead of any other cross-platform desktop framework (including WxWidgets). Doesn't mean that Qt is anywhere good - it's just the least worst option out of all.

That's not consensus. I very much reject a "desktop framework". Qt has its own abstractions for everything from sockets to executing processes to loading images, and I don't want that. It forces one to build the entire app in C++, because, although open source, its design revolves around the needs of Trolltech's paying customers: companies building multi-platform paid apps.

I want a graphical toolkit: a simple library that can be started in a thread and allows me to use whatever language runtime I want to implement the rest of the application.

> I hoped someday Flutter might be mature enough for desktop development

Anything that forces a specific language and/or runtime is dead in the water.

> I very much reject a "desktop framwork". Qt has its own abstractions for everything from sockets to executing processes and loading images, and I don't want that.

Yes, that is the consensus of why Qt sucks - it's a massive framework that tries to do everything at the same time with a massive toolset of in-house libraries. This is inherently tied to the revenue model of the Qt Company - sell custom modules that work well with the Qt ecosystem at a high enterprise-level price. I also wish to just use the "good" parts of Qt but I can't, since it already has a massive QtCore as its dependency.

However, there is still no cross-platform framework except Qt that can actually do the most important things a desktop framework needs: an actual widget editor, styling and theming, internationalization, interop with native graphics APIs (though I have gripes with their RHI system), etc. That's why I'm rooting for PanGUI (https://www.pangui.io/) to succeed: it pretty much checks all the boxes you have, but it's still WIP and in closed alpha.

> I hoped someday Flutter might be mature enough for desktop development

>> Anything that forces a specific language and/or runtime is dead in the water.

Yeah, but at the time I thought this was at least better than wrangling with Qt / QML. You can write the core application logic ("engine" code) in C++ and bind it with Dart. I've already seen some companies go a similar route with C# / WPF.

As far as I can tell, PanGUI is a drawing library, not a graphical toolkit. Its primitives are geometrical, not widgets. Its showcase is an audio app, which is as far as possible from the boring productivity applications I'd like to build.

In my university days I was very much into GUIs, and I've written apps with wxWidgets, plain Gtk 1 and 2, GNOME 2, Qt, Tk, GNUstep, and even some fairly obscure ones like E17 and FLTK. For my tastes, the nicest ones were probably GNOME 2, Elementary, and wxWidgets. Especially GNOME 2, which had a simple builder that let me create the basic shell of an app, with some horizontal and vertical layout boxes that I could later "hydrate" with the application logic.

>That's why I'm rooting for PanGUI (https://www.pangui.io/) to succeed - it pretty much completes all the checkboxes you have, but it's still WIP and in closed alpha

They say it's in beta and it seems anyone can sign up for the beta.

> Anything that forces a specific language and/or runtime that I don't like is dead in the water.

Ftfy.

PanGUI seems interesting. However, being mobile-ready would matter a lot for adoption, and I couldn't see anything on their site about mobile platforms.

I don't think it makes sense to use the same framework for desktop and mobile apps. That gives you either terrible desktop apps or terrible mobile apps. Put the business/backend logic into a shared library, but build separate mobile and desktop apps on top of it.

IDK, it seems like old Reddit did just fine without even trying that hard. The DOM and its surrounding JS APIs already form a high-level GUI framework, and the post illustrates that it's perfectly capable of handling useful interactions at >60fps. I personally love working with native code, but the modern browser is capable of producing fast interfaces that saturate human senses without it. If you write JS like a C programmer would, it will usually be quite fast, even if it's not optimal. If you write native apps in C++ like a modern JS programmer (frameworks and deps with abandon), it will be a stuttery mess.
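To make "JS like a C programmer" concrete, here is a toy, purely illustrative comparison (the function names are made up): the same computation written over one flat, preallocated typed array versus an array of freshly allocated objects. Both are valid JS; the flat version avoids per-element allocation and GC churn, which is the habit being described.

```javascript
// "JS like a C programmer": one flat, preallocated Float64Array.
function sumFlat(n) {
  const xs = new Float64Array(n * 2);   // contiguous memory, no per-point objects
  for (let i = 0; i < n; i++) { xs[2 * i] = i; xs[2 * i + 1] = i; }
  let acc = 0;
  for (let i = 0; i < xs.length; i++) acc += xs[i];
  return acc;
}

// The idiomatic-framework style: a fresh object per point,
// which churns the allocator and the garbage collector.
function sumObjects(n) {
  const pts = [];
  for (let i = 0; i < n; i++) pts.push({ x: i, y: i });
  return pts.reduce((acc, p) => acc + p.x + p.y, 0);
}

// Both compute the same result; the flat version allocates a constant
// amount of memory and is typically several times faster on large n.
console.log(sumFlat(1000) === sumObjects(1000)); // true (both are 999000)
```

The point isn't that `reduce` is bad, but that thinking about memory layout, as a C programmer must, is available in JS too and usually pays off in hot paths.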

When the DOM is not enough, there's already WebGL and WASM. A vanishingly small sliver of use cases can't saturate human senses with these tools, and the slowest, jankiest websites tend to be the least deserving of them (ie: why is jira slow? It's literally a text box with a handful of buttons on the side!).

  • girvo
  • ·
  • 9 hours ago
  • ·
  • [ - ]
> (ie: why is jira slow? It's literally a text box with a handful of buttons on the side!).

Despite me agreeing with your overall point, this is such a ridiculous comment to make. You and I both know Jira is much much more than that. Reductive things like this just turn off people who would otherwise listen to you.

So from what I understand…

Someone needs to build Qt’s successor, probably with more beginner-friendly declarative semantics (akin to HCL or Cue) and probably with syntax closest to YAML or Python (based on learning curve, beginner readability etc).

The backend will probably have to be something written in Zig (likely) or Nim (capable, less likely) and will probably have to leverage OpenGL/Metal, WebGL and WASM.

Obviously a massive undertaking, which is why I think the industry has not reached consensus that this is what needs to happen. The less ideal options we have now often get the job done.

There is Slint (https://slint.dev/), a company founded by ex-Qt devs that seems to be trying to make a better alternative to Qt. The core engine is built with Rust, but they also provide C++ and JS bindings, and have a QML-like scripting language you can use to design UIs quickly. So far I think they've solidified their revenue model by catering to embedded devices, but I haven't seen a solid use case for desktop apps (yet...).

There are some desktop apps made with Slint. For example:

- WesAudio has a VST plugin for audio applications: https://slint.dev/success/wesaudio-daw

- LibrePCB 2.0 is migrating their code from Qt to Slint and should be released soon: https://librepcb.org/blog/2025-09-12_preview_of_next_gen_ui/

- krokiet: https://github.com/qarmin/czkawka/blob/master/krokiet/README...

QML isn't that different from what you describe anyway.

I think people overestimate the necessity of 'high level' conveniences, or the difficulty of writing C/C++ to the metal.

For example, take Dear ImGui, a C++ UI framework with a kind of data binding, which generates vertex buffers that can be directly uploaded to the GPU and rendered.

It supports most of the fancy layout stuff of CSS afaik (flexbox etc), yet it's almost as low-level as it gets.

The code is also not that much harder to write than React.
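For readers unfamiliar with the paradigm, the core immediate-mode idea can be sketched in a few lines. This is a toy in plain JavaScript, not real Dear ImGui (which is C++ and emits vertex buffers); the widget functions and the string "renderer" are invented for illustration. The key property: the UI is a plain function of application state, re-executed every frame, with no retained widget tree to keep in sync.

```javascript
let clickedThisFrame = null;         // set by the platform layer each frame

// Widgets are just functions: they append draw output and poll input.
function button(out, label) {
  out.push(`[ ${label} ]`);
  return clickedThisFrame === label; // input is polled, not event-driven
}

function label(out, text) { out.push(text); }

const state = { count: 0 };

// The whole UI, rebuilt from scratch every frame.
function frame() {
  const out = [];                    // stands in for a vertex buffer
  label(out, `count = ${state.count}`);
  if (button(out, "increment")) state.count++;
  return out.join("\n");
}

// Simulate three frames: no click, a click on "increment", no click.
frame();
clickedThisFrame = "increment";
frame();
clickedThisFrame = null;
console.log(frame());                // "count = 1\n[ increment ]"
```

Because the UI is regenerated each frame, there is no second copy of the state living inside a widget tree, which is exactly the synchronization problem the ryg quote below pokes fun at.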

I feel like I'm somewhere on that Venn diagram O:-)

The specific examples in the article are about UI.

I agree that the UI ecosystem is a big and slow mess, because there is actually a LOT of complexity in UIs. I would even argue that there is often more complexity to be found in UIs than in backends (unless you are working on distributed systems, or writing your own database). On the backend, you usually just need parallelism (95% of the job is just parallel-for, map-reduce kind of things).

But in UI, you need concurrency! You have tons of mutable STATE flying around that you need to synchronize - within UI, across threads or with the backend. This is /hard/ - and to come back to the point of the article - the only low-level language that I'm familiar with that can do it well and reliably is Rust.

In Rust, my absolute favorite UI framework is egui. It is based on immediate-mode rendering (maybe you're familiar with Dear ImGui), rather than the old, familiar-but-complex retained mode. It's really interesting stuff; I recommend studying it! Dear ImGui has a wonderful quote that summarizes this well:

> "Give someone state and they'll have a bug one day, but teach them how to represent state in two separate locations that have to be kept in sync and they'll have bugs for a lifetime." -ryg

We use egui in https://minfx.ai (a Neptune/Wandb alternative), and working with it is just a joy. Emilk did such a fantastic job bringing egui about! Of course it has its flaws (the most painful is layout), but other than that it's great!

This is why I think the maximally-from-source bootstrap that Guix did, and that Nixpkgs is about to do too (https://github.com/NixOS/nixpkgs/pull/479322), is so important.

These bootstraps essentially speedrun software history, and so they tell us a lot about how we got here and why we write the things we write. But they also create the perfect ground for writing greenfield alternative bootstraps. The shortest, most readable bootstrap is proof of the best abstractions, the best way of doing things.

It's a chance to finally put the sort of software stack / tech tree stuff on a more apples-to-apples basis.

  • Twey
  • ·
  • 1 hour ago
  • ·
  • [ - ]
When you can build down as well as up (https://ngnghm.github.io/blog/2015/08/24/chapter-4-turtling-...), or, equivalently, do away with the up/down orientation entirely, you can have your cake and eat it too.

And as a bonus if you control both slices of bread it's much easier to change the sandwich filling as well! (Though if the original sandwich-builder wasn't careful you might find some sticky residue left over on your bread… maybe someone should take this metaphor away before I do more damage.)

  • Sevii
  • ·
  • 10 hours ago
  • ·
  • [ - ]
He makes an interesting point that we are writing programs that run more slowly now than they did 10 years ago. JavaScript has only gotten faster over the last decade, and computers have faster CPUs and more RAM. The problem is that the frameworks and the programs have gotten slower.

What did we gain exactly? Reddit is better at displaying videos and images now. But it's slower despite faster hardware.

I hate the state of affairs. That said, my guess is that what we "gained" is tons of telemetry, tracking and the like; engineers not needing to think about performance to get a feature out, which absolutely lowers the bar to entry; high-level abstractions; and UX and visual bells and whistles of varying importance and quality (infinite scrolling, streaming updates, image blend modes, blur effects, scroll-timeline animations, etc.). The people creating Pokémon had to think about every bit in their texture atlas and carefully manage the hardware memory manually. Web devs now try not to forget to clean up event listeners in a useEffect that triggers on mouse move to generate data for an interaction heatmap for the marketing department, while 25MB of 3rd-party scripts make sure every data broker and their mother is well informed about your digital whereabouts.

Then go ahead and write vanilla JS and raw HTML! No one is asking you to use any of this bad "state of affairs". If you do build a 100+ page SaaS app without any framework, let me know; I'd love to see how it works.

My clients are. I do web dev for a living and I use these frameworks day in and day out. It's not even that I dislike the dev ex on most of them, and I've seen a lot of good code and bad code, and I don't even want to blame anyone in particular for the situation we are in. I think my comment was more of a dig at the world we live in than anything else.

The frameworks have NOT gotten slower! God, I hate Hacker News sometimes... What's really going on is that not a single dev reads the release notes / improvements / learns how to actually use the framework. If anything, the latest React release was almost purely a performance-improvement release (and has been for a while). Most people need to fully understand a tool they use before complaining about it.

Everyone always wants a frontend framework that "just works" - sounds a lot like a free lunch to me! You have to manage the state and updates of your application at some point; the underlying software can't just "guess" what you want. But I'm always like a broken record when these React-hate / <insert frontend framework here>-hate threads show up - most of the confusion is derived from a lack of basic understanding of what problems these frameworks solve in the first place.

> whats really going on is actually that not a single dev reads the release notes / improvements / learns how to actually use the framework.

If everyone fails to read framework release notes, then the problem is the frameworks. If you change so quickly and often that almost no developer bothers to keep up to date, then you are the problem, not the developer.

Something interesting happened: this is the first time I'd read him, and just after I finished the article and opened YouTube, it recommended me a video from the author with the same title.

  • zkmon
I would say 80% of software quality issues come from the context and constraints defined by requirements complexity, design choices, stack selection, infra decisions, org processes and so on. A lot of this context is tied to org policies and non-technical decisions. AI is not going to redefine this context. It can only increase code production and code-level quality. But the source of 80% of the issues remains intact.
While I am totally on board with the idea... the article doesn't really say what to actually do to help?

"we at Handmade community" - and no link to that community anywhere

blog itself? 2 posts a year, and 2025 posts aren't even on the blog itself (just redirects)

Yes, tooling and toolmaking should be promoted - but promotion itself should also be accessible somehow?

Yeah it's a call to action to improve the tooling but it's not the first one I've seen. Tooling really is under focused in many areas.

It would be nice if every language and library had a great working repl and a Jupyter Lab kernel and good mdn-like documentation and w3schools-like tutorials.

This is probably the community he was talking about:

https://handmade.network/

Here's the manifesto: https://handmade.network/manifesto

My exact complaint. What is the "handmade" community? At first I thought he was talking about woodworking or knitting.

Also the reddit comparison is great, but I wish he would have talked about why the slop is there in the first place.

I'm pretty sure new reddit isn't optimized for speed, it's optimized for analytics and datamining.

I bet they use all those backend calls to get really granular session info. When something is super slow, it's not that it's unoptimized, but rather it's optimized for money over user experience.

Really enjoyed reading this. Stuff like this is what inspires me to keep pursuing my logos language and theorem prover. Next on the roadmap: first-class inline ASM support, solid SPMD and auto-vectorization pipelines, and exploring making verifiable private computation a language primitive when you make things private. If interested, read about some of the planned upcoming enhancements here. :) https://github.com/Brahmastra-Labs/logicaffeine/issues
I enjoyed reading this article, but I think the author overlooked that "low-level" languages aren't just less supported, they're also more verbose. With a higher-level abstraction you can accomplish more with fewer characters. And if you choose to abstract your way through this problem, aren't you creating a high-level language?
If you read to the end, the author does advocate for creating more high-level foundations.
Side note, but this article reads like a Wes Anderson film, if that makes any sense.
I haven't seen his whole filmography, but I can see Asteroid City in this, yeah.
While directionally correct, the article spends a lot of time glorifying jQuery and not enough on what a horrible, no-good, unoptimized mess jQuery was, and by extension what kinds of websites were built back then. I remember those times well. The reason to use React wasn't that it was new, far from it. It won against Ember, Angular, et al. around 2014-2015 as the best abstraction because it was the easiest to reason about. It still wasn't great. In fact, it still isn't great. But it's the best blend of many leaky abstractions we use to code against the browser APIs.
jquery was an unoptimised mess? it's like 30k minimised and just bridged a bunch of functionality that browsers lacked as well as providing a generic api that let you (often) ignore per-browser implementation and testing of your code

there's no reason to blame it for the types of websites being made either, it doesn't really provide enough functionality to influence the type of site you use it on

Since when did we start using file size as a measure of efficiency or optimization?

Off the top of my head: $() CSS parsing and DOM traversal was way slower than querySelector or getElementById, both of which predate jquery by years. Every $('.my-class') created wrapped objects with overhead. Something like $('#myButton').click(fn) involved creating an intermediate object just to attach an event listener you could’ve done natively. The deeper the method chaining got the worse the performance penalty, and devs rarely cached the selectors even in tight loops. It was the PHP of Javascript, which is really saying something.
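The wrapper-allocation overhead described here can be sketched with a toy stand-in (purely illustrative; the fake `document` and `$()` below are made up for the example, not jQuery's real internals):

```javascript
// Minimal fake "document" so this sketch runs outside a browser.
const elements = new Map([['myButton', { handlers: [] }]]);
const doc = { getElementById: (id) => elements.get(id) };

// Native style: one lookup, one listener, no wrapper allocation.
const btn = doc.getElementById('myButton');
btn.handlers.push(() => 'clicked');

// jQuery-style: every $() call allocates a fresh wrapper object
// before a handler can be attached to the element underneath.
let wrappersAllocated = 0;
function $(selector) {
  wrappersAllocated += 1;
  return {
    el: doc.getElementById(selector.slice(1)),
    click(fn) { this.el.handlers.push(fn); return this; },
  };
}

$('#myButton').click(() => 'clicked'); // allocates wrapper #1
$('#myButton').click(() => 'again');   // uncached selector: wrapper #2
console.log(wrappersAllocated);        // 2
```

Caching the result of `$()` in a variable avoids repeating both the lookup and the allocation, which is exactly the "devs rarely cached the selectors" point above.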

By the early-2010s most of the library was dead weight since everyone started shipping polyfills but people kept plopping down jquery-calendar like it was 2006.

(I say this as someone who has fond memories of using Jquery in 2007 to win a national competition in high school, after which I became a regular contributor for years)

> $() CSS parsing and DOM traversal was way slower than querySelector or getElementById, both of which predate jquery by years.

You have that backwards – jQuery predates querySelector by years.

The reason why getElementById is fast is because it’s a simple key lookup.

> By the early-2010s most of the library was dead weight

absolutely correct this is because a lot of the shit jquery did was good and people built it into the browser because of that

putting jquery into a site now would be insane but at the time it pushed forward the web by quite a leap

Both querySelector and querySelectorAll came well after jquery. I remember it being a big deal when browsers added support for them.
  • ulbu
3 counts of “jquery” in the text. once again, which one of them glorifies it?
I created https://ideawell.fly.dev/ just for this particular goal: identifying pain points and project ideas from HN discussions.
I really agree with the point about React and Redux. It is crazy that collapsing a comment can take 200ms just because of the framework overhead. We have way too much power in our computers to be okay with this kind of lag in basic UI tasks.
We should start making low-level enjoyable
For those interested here's the talk that this is from:

https://www.youtube.com/watch?v=AmrBpxAtPrI

This is a good reminder that abstractions are supposed to help us solve problems rather than just hide the details. I feel like I spend too much time fighting against tools that try to prevent me from seeing how things really work.
> At the time, it took New Reddit almost 200 milliseconds to collapse a single comment. That is 200 milliseconds of pure JavaScript, with hardly any DOM work in sight. If you care about quality software, your jaw should be on the floor. It is a staggering amount of waste for what should have been a few DOM calls. And you feel it as a user: an ugly, intense hitch.

> New Reddit was a React app

Many such cases. React is basically synonymous with horrible lag and extreme bloat to me. Its name is the highest form of irony.

I'm really not sure why JS frameworks in general are so popular (except to facilitate easy corporate turnover), when the browser already gives you a pretty complete toolset that's the easiest to use out of any GUI library in existence. It's not low level by any means.

Granted something like an <include html component> feature is desperately missing from the html spec, but there are lightweight solutions for it.

> a Redux action, which would update the global Redux store, which would cause all Redux-connected components on the page to update, which would cause all their children to update as well. In other words, collapsing one comment triggered an update for nearly every React component on the page. No amount of caching, DOM-diffing, or shouldComponentUpdate can save you from this amount of waste.

yeah this is pretty much 1. an incorrect implementation and/or 2. an incorrect take

and easily solvable with a bit of 'render auditing' / debugging
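The "render auditing" fix can be made concrete with a toy store (a sketch with made-up names, not Redux's or react-redux's actual implementation): a store-wide subscription reruns on every action, while a narrow selector that compares only its own slice does not.

```javascript
// Toy Redux-like store: dispatch notifies every subscriber.
function createStore(reducer, initialState) {
  let state = initialState;
  const listeners = [];
  return {
    getState: () => state,
    subscribe: (fn) => listeners.push(fn),
    dispatch: (action) => {
      state = reducer(state, action);
      listeners.forEach((fn) => fn()); // every connected component is notified
    },
  };
}

const reducer = (state, action) =>
  action.type === 'COLLAPSE'
    ? { ...state, collapsed: { ...state.collapsed, [action.id]: true } }
    : state;

const store = createStore(reducer, { collapsed: {} });

let naiveRenders = 0;
let selectiveRenders = 0;

// Naive connected components: rerender on every store change.
for (let i = 0; i < 100; i++) {
  store.subscribe(() => { naiveRenders += 1; });
}

// Selector-based components: rerender only when their own slice changes,
// in the spirit of react-redux's useSelector equality check.
for (let i = 0; i < 100; i++) {
  const id = `comment-${i}`;
  let last = store.getState().collapsed[id];
  store.subscribe(() => {
    const next = store.getState().collapsed[id];
    if (next !== last) { last = next; selectiveRenders += 1; }
  });
}

store.dispatch({ type: 'COLLAPSE', id: 'comment-7' });
console.log(naiveRenders, selectiveRenders); // 100 1
```

One collapse action triggers 100 naive rerenders but only 1 selective one, which is the difference between the behavior described in the article and an audited implementation.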

The OS is to blame. There should be a way for the OS to tell the app "offload your state" like phones do. Paging is supposed to achieve this but doesn't.
The Reddit example is about two different design choices. The DOM is a tree of state that needs to stay in sync with your app state. So how do you make that happen without turning your code into a mess?

The old Reddit had to first construct the DOM and then, for every state change, determine what DOM nodes need to change, find them and update them. Knowing what needs to change gets ugly in a lot of apps.

The other alternative is to realize that constructing a DOM from any arbitrary state is pretty much the same as constructing it from the initial state. Now you don't have to track what DOM nodes must change on every state change. This is a massive reduction in code complexity. I will grant that there is something similar to the "expression problem": every time a new state element is introduced, it may affect the creation of every node in the DOM; conversely, every time a UI element is added, it may affect every state transition.

The first Reddit can be fast, but you have to manage all the updates. The second is slow, but easier to develop. I'm not sure going any lower solves any of that. The React version can be made more efficient through intelligent compilers that are better at detecting change and doing updates. The React model allows for tooling optimizations, and these might well beat hand-written changes.

The web also has the complexity of client/server with long delays, syncing client/server and DOM state, and the HTTP protocol. Desktop apps and game engines don't have these problems.
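The two strategies can be sketched with counters standing in for DOM work (purely illustrative; the names are made up for this example):

```javascript
// 1000 comments; renderNode() stands in for touching one DOM node.
const comments = Array.from({ length: 1000 }, (_, i) => ({ id: i, collapsed: false }));
let nodesTouched = 0;
const renderNode = () => { nodesTouched += 1; };

// Strategy 1 (imperative): know exactly which node changed and update it.
function collapseTargeted(id) {
  comments[id].collapsed = true;
  renderNode(); // one node updated
}

// Strategy 2 (declarative rebuild): re-derive every node from state.
function collapseRebuild(id) {
  comments[id].collapsed = true;
  comments.forEach(renderNode); // all 1000 nodes reconstructed
}

collapseTargeted(7);
const targetedCost = nodesTouched;               // 1 node
collapseRebuild(8);
const rebuildCost = nodesTouched - targetedCost; // 1000 nodes
console.log(targetedCost, rebuildCost);
```

The rebuild strategy pays a 1000x cost per interaction in exchange for never having to reason about which nodes depend on which state.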
The thing is that you can still have high-level abstractions without them needing to be as slow as React. React does a slow thing by default (rerendering every child component whenever state changes, so every component in the UI if top-level state is changing), and then requires careful optimisation to correct for that decision.

But you can also just... update the right DOM element directly, whenever a state changes that would cause it to be updated. You don't need to create mountains of VDOM only to throw it away, nor do you need to rerender entire components.

This is how SolidJS, Svelte, and more recently Vue work. They use signals and effects to track which state is used in which parts of the application, and update only the necessary parts of the DOM. The result is significantly more performant, especially for deeply nested component trees, because you're just doing way less work in total. But the kicker is that these frameworks aren't any less high-level or easy-to-use. SolidJS looks basically the same as React, just with some of the intermediate computations wrapped in functions. Vue is one of the most popular frameworks around. And yet all three perform at a similar level to if you'd built the application using optimal vanilla JavaScript.
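A minimal signal/effect sketch (a simplification with made-up names, not SolidJS's real internals) shows why the fine-grained model does less work: only effects that actually read a changed signal rerun.

```javascript
// Track which effect is currently running so reads can register it.
let currentEffect = null;

function createSignal(value) {
  const subscribers = new Set();
  const read = () => {
    if (currentEffect) subscribers.add(currentEffect); // auto-subscribe
    return value;
  };
  const write = (next) => {
    value = next;
    subscribers.forEach((fn) => fn()); // rerun only dependent effects
  };
  return [read, write];
}

function createEffect(fn) {
  currentEffect = fn;
  fn(); // first run registers this effect with every signal it reads
  currentEffect = null;
}

const [count, setCount] = createSignal(0);
const [name] = createSignal('reddit');

let countRuns = 0;
let nameRuns = 0;
createEffect(() => { count(); countRuns += 1; });
createEffect(() => { name(); nameRuns += 1; });

setCount(1);
setCount(2);
console.log(countRuns, nameRuns); // 3 1: the name effect never reran
```

Each effect stands in for one patch of DOM; updating `count` never touches the effect that only depends on `name`, which is the "just doing way less work in total" point.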

We measure computer performance in billions and trillions of ops per second. I'm sorry, but if an app takes 200ms to hide some comments, the app or the tech stack it's built on is badly made.

> The web has complexity also of client/server with long delays and syncing client/server and DOM state, and http protocol. Desktop apps and game engines don’t have these problems.

Massively multiplayer games consistently update in under 16ms.

> The web has complexity also of client/server with long delays and syncing client/server and DOM state, and http protocol. Desktop apps and game engines don’t have these problems.

What part of hiding a comment requires a HTTP round trip? In 200ms you could do 20 round trips.

  • highd
I'm fairly confident that the new Reddit React implementation could be made 3x to 10x faster. I would be interested to hear from others who have good reason to explain why not. I can certainly imagine React-like systems capable of statically determining DOM influence well enough to make comment-collapsing negligible.
It is blatantly obvious to anyone with just a little bit of experience that the reddit devs barely know what they are doing. This applies to their frontend as well as backend. For some reason, reddit is also the only major social network where downtime is expected. Reddit throwing 500 errors under load happens literally every week.
Reddit also puts the "eventually" in "eventually consistent". Not in the sense of consistency being driven by events, but in the colloquial sense of "someday". The new message indicator will go away ... eventually. Your comment will show up ... eventually.
Presumably the mobile apps work better; they don't care very much about the website because they want to push everyone to the app anyway.
A few thoughts:

* These articles always say that hardware is amazing but software sucks. Let's not forget that hardware has its problems. Intel's management engine is a pile of complexity: https://www.zdnet.com/article/minix-intels-hidden-in-chip-op.... The x86_64 instruction set is hardly inspiring, and I imagine we lose a pile of performance because it fails to adequately represent the underlying hardware. (E.g. there are hundreds of registers on modern CPUs, but you can't access them directly and just have to hope the hardware does a good job of register allocation.)

* Languages unlock performance for the masses. Javascript will never be truly fast because it doesn't represent the machine. E.g. it doesn't have distinct integer and floating point types. Rust represents the machine and is fast, but is not as ergonomic as it could be. OxCaml is inspiring me lately as it's an ergonomic high-level language that also represents the machine. (Scala 3 is also getting there with capture checking, but that is still experimental.) If we want more performance we have to give a way to efficiently write code that can be turned into efficient code.
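The "doesn't represent the machine" point about JavaScript is easy to demonstrate: there is a single Number type, an IEEE-754 double, so an engine cannot statically map values onto integer registers the way Rust's `i64` or OCaml's `int` allow.

```javascript
// All JavaScript numbers are 64-bit floats, so float rounding leaks
// into arithmetic and integer identity breaks past 2^53.
console.log(0.1 + 0.2 === 0.3);        // false: float rounding error
console.log(2 ** 53 === 2 ** 53 + 1);  // true: 2^53 + 1 is unrepresentable
console.log(typeof 5 === typeof 5.5);  // true: both are just "number"
console.log(Number.isInteger(5 / 1));  // true, but still stored as a double
```

JIT engines recover some of this by speculating that a number stays integral, but the language itself gives them no guarantee, which is the gap the comment describes.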

> hardware

Sure x86 is an absolute mess, but I don't think it's a primary bottleneck. High end x86 cpus still beat high end ARM cpus by a significant margin on raw performance. Even supposing x86/ARM are bottlenecks... yeah a bottleneck at double digit billion ops per second.

> Languages unlock performance for the masses. Javascript will never be truly fast because it doesn't represent the machine.

C# and Go are already really fast (https://github.com/ixy-languages/ixy-languages) languages for the masses and at this point you can compile most things to WASM to get them run in the browser.

You think collapsing a comment quickly would be difficult in x86_64 assembly?

I would expect thousands of frames of opens/closes per second, probably an order of magnitude or two more. The LCD's data bandwidth and our retinas' sensitivity would be the decisive bottlenecks at far slower speeds.

TL;DR: CPUs, which are not getting slower, are not the reason newer software implementations often get slower.

Collapsing a comment is uneconomic in assembly. That's why we have higher level languages and other abstractions.

I think you missed the point of what I'm saying.