The native app was by no means perfect, but it felt like a real productivity tool that was trying to be respectful of its environment.
I've come to the conclusion that native desktop apps are just not viable for large companies, even if there is headcount. The problem is coordination cost.
If you want to launch new features and experiments here, there and everywhere, then the coordination complexity increases nonlinearly with the number of platforms.
If you can sustain a more deliberate, low churn pace of development then it's workable. Features can be well defined and then implemented by the platform team as they see fit. But if you want a more fast-paced, "just in time" style of development, you need to coordinate with every team for every change... wouldn't it be nice to just write web code and be done?
Even Microsoft are building this way these days.
This is why, ironically, small companies seem better able to support native apps than large ones. The more "stuff" that's being worked on concurrently, the harder it is to support multiple platforms.
So you're saying that it is impossible for a large company to somehow use native toolkits to draw text bubbles and emojis?? Video and audio are another matter, but MSN Messenger managed 75% of this decades ago, natively.
One weird thing about software development is that there are plenty of things which motivated individual developers can achieve which large companies can't even write the requirements for, let alone achieve.
We can say that about the majority of companies with large teams that already have this. Everyone knows Meta is no different.
In this case, shipping low-quality software is a skill issue, and that's exactly what that team at Meta just did and knowingly approved.
> The engineers at Meta are world class but they're nerfed by organizational constraints.
That doesn't mean anything given that shipping regressions to billions of users is not of "world class" calibre.
In fact, that is amateur behaviour, way below what's expected of a multi-trillion-dollar company hiring the "best" engineers, which it can certainly afford.
But seriously folks, this is why there will always be room for startups.
Megacorps are slow lumbering beasts that suffer from entropy and signal loss at every edge on the graph.
The example here is that some marketing function gets the signal from Zuck to put Lego Duplo characters everywhere because metaverse. Of course 90% of the company knows it's dead on arrival, so they are going to drag their feet, try to work on their actual problems (or their local fiefdom mandate), and wait for the latest fad to blow over.
Super wasteful, super inefficient but hey ads make an insane amount of money so Zuck can do whatever for a loooong time.
This is why Warren Buffett advocated dividends. Companies and people have few good ideas. Management, however, uses the money from the good ideas to fund boondoggles (see: metaverse). The thinking is that the money would be better returned to investors to fund other good ideas.
True. I work for one and it's terrible. My work is 95% jumping through hoops imposed by other teams, restrictions put in for political reasons by substandard business 'leaders', and often even keeping vendors happy (really, why should we care? We are their customers, not the other way around :( )
5% is doing actual improvement. And meanwhile HR want me to show I'm making a difference to the company. Well yeah 5%.
I hate it a lot, but the pay is good and a permanent contract with 20 years of service is valuable (if I were made redundant I'd get more than a year's severance, so I'm definitely not leaving on my own for something that may not work out).
I'm not sure we should be critical of risk taking in large companies just because there were some notable failures. But maybe it is the sort of thing that doesn't scale well in a large org; they want big results immediately.
That's because he comes from a generation who has "the government and universities take care of R&D" as a mindset. A lot of the stuff we take for granted today - *nix, TCP, IP, the Internet itself, lasers, microwave, radar - came out of universities, government grants or the military.
The idea that the government is incapable of R&D is relatively new and originates in small-state / lean-state / starve-the-beast ideology.
And personally, I rather believe the wisdom of an old man who made a metric shitload of money by being a good honest citizen than others who got rich on advertising and stonk market shenanigans.
So what? NASA spent billions on the new space race, with everyone knowing the true aim was to kick Boeing's ass, and with Boeing widely known as a pork distributor first and foremost and a rocket/plane manufacturer second.
The job of politicians isn't to cater to the whims of reactionaries, the job of politicians is to do the right thing and sell it to the voter base.
I'm not defending any large company, they could if they wanted to, they just don't care. If this is "cheaper" and they can cut costs, this is what they'll do.
They know web, they can manage web - web is “less risky.”
In my experience, in a large software company [0] it's very common for new folks on a "team" to have to spend one or (often) more months learning both the software they're now working on and the various libraries used in its construction.
It's very nice if you can find an expert in everything you've used on a software project, but that's not going to happen often. [1] And -IMO- if whoever you've hired can't generally come up to speed on wxWidgets in a couple of weeks while also becoming familiar with the rest of the project they're now responsible for, they're not someone you want working on your project.
[0] Which is the sort of company we're talking about right now.
[1] In part because companies generally don't want to pay enough to hire such people.
I think what he is saying is that native platform apps get delegated to different teams, and coordinating among those teams becomes an additional cost. You don't want each team going off and doing their own thing.
Your 'answer' is "use a cross-platform GUI toolkit" but that has its own challenges. Not least that you typically build a native app because it delivers a native experience that users expect.
In general (and I accept there may be counter-examples) cross-platform tools fail to do this.
The issue is not that a native Windows app needs to run in a browser.
The issue is that a native Windows app has been replaced by yet another browser.
That's not really native either. Whether it's a web wrapper or Electron or wxWidgets, just because something runs "natively" doesn't mean it feels native.
In my manual labor occupation the office folk often try to get away with a horribly inefficient schedule designed to make their job easy. I often have to remind them that they work for me. I do the actual work, your job is to optimize it.
This to me is the same thing as replacing a rather decent desktop client with a web wrapper. This is their update, and it's obviously bad for the customer. The excuse is to make further updates easier. There is reason to think those will also be bad for the customer.
Someone once explained to me that the process of everything we build and create is very similar. If some young sector has a wildly different approach one should be very skeptical. It happens that an architect still has work to do after delivering the drawings. Sometimes it is necessary to recall thousands of cars.
Ideal would be to have a perfect construction drawing then build the machine. The architect moves on and designs the next garden. A plumber puts the pipes in, tests if everything works then goes to the next construction site. The difference with software is that, when done, it can last forever. The opposite of what the industry pretends to be true.
And the Start menu tile BS wasn't impactful, except for the narrowly avoided multibillion-dollar GDPR fine Facebook almost fell headfirst into: they declared "mission accomplished", I realized they had forgotten the apps existed, and I escalated just before the deadline.
I deserved a bonus for finding that, yet it didn't even register on my PSC.
Not a problem in waterfall, since you can set the targets beforehand and just have the teams work in parallel. In an "Agile" setting though (not the manifesto version, the current practice version)? It's a huge mess.
So then the question becomes, is this coordination cost worth it and how much are we willing to "spend". It seems the cost is worth it for Android and iOS since the native app experience on those platforms is so much better.
The macOS app is just a tweaked iOS app so it's also easy to justify.
But what about Windows? The Microsoft provided APIs are a disaster that keeps getting rewritten every half decade, and even Microsoft barely uses them. Windows users don't have a "feel" for a native app the way macOS and mobile users do, since there hasn't been a "native app experience" on Windows for a very long time.
Just look at the 2 context menus on Windows Explorer, they can't even get the sizes and colors between them right.
Meta would have been better off doing like Telegram and just using Qt.
What a wrong and misleading statement.
No they aren't. Otherwise Winamp 2.95 from 2003 wouldn't work flawlessly on my Windows 11 install 23 years later. Same with my Unreal Tournament 99 installation that I just copied over from a CD left over from my Windows 98 PC.
As long as you haven't used any undocumented APIs or made your app hack its way into the OS via hooks (like some games did back then for performance optimizations, or some anti-piracy software did), most Windows apps from 20+ years ago will still work today. It's also why WINE works so well at getting Windows apps running on Linux: precisely because Microsoft HASN'T changed the APIs.
(WinUI is also a hybrid: since it uses the DirectX APIs rather than the old GDI model, most of it is in a userspace package that's delivered as an .appx)
If you want to do basic UI on Windows today, you can use:
- Win32 UI
- MFC
- .NET WinForms
- WinFX
- .NET WPF
- WinRT
- UWP
And I'm sure there are many others that I'm forgetting. I think apps like Notepad++ and Sumatra PDF are still written in Win32.
[1] https://learn.microsoft.com/en-us/windows/apps/desktop/moder...
They also keep making boneheaded decisions, like the new DWriteCore, which is a reimplementation of DirectWrite. Some features are only available in DWriteCore, but DWriteCore is not a strict superset, and it doesn't even interoperate directly with Direct2D like DirectWrite did.
So once again: either stay on the "stable" version, as people like to call it, which is effectively frozen, or use the new one and rewrite half your app!
And then people wonder why nobody uses Microsoft APIs if they can avoid it.
Who cares if they keep working if the only way to get dark mode is to custom render the whole UI yourself? Might as well do something non-native at that point.
Microsoft does not provide a serious UI library for applications to use. It provides 10 half broken and deprecated ones.
Yeah it totally is and I explained why.
>Who cares if they keep working
A lot of people.
>if the only way to get dark mode is to custom render the whole UI yourself
Most of the apps I use support dark mode: Chrome, FF, Notepad++, Jellyfin, qBittorrent, SumatraPDF, Signal, WizTree, WinSCP, and others I'm forgetting. Seems like it's not an impossible task for the developers.
And on GitHub you can find FOSS apps[1] that have implemented light/dark mode switching for their UI, so you can just copy what they're doing if you don't know where to start.
Chrome is not native. FF is not native. qBittorrent is not native (it uses Qt, right there in the name). Signal is not native, it's a web page bundled with Chromium (aka an Electron app). WinSCP only this year actually started using the right font! But at least that one is native, I'll give you that.
The GitHub link you sent doesn't really have anything to do with the discussion.
Remember when you used to be able to make custom Windows color schemes? Where you could color each widget type independently? I miss that.
Assuming that your report is true, if Microsoft gave a shit, they'd come up with four color schemes (high-contrast light/dark mode and regular-contrast light/dark mode) for Win32 programs and sync them with the relevant "modern" settings. Alas.
What do you do if you're a Win32 program that refuses to use the system color scheme? Well, I guess you'll stand out no less than you would have in the Windows 95/98/ME days... stubbornly refusing to conform to the user's commanded colors and all that.
We also had pseudo-dark mode themes for Windows XP. But the new dark mode stuff in explorer doesn't use uxtheme anymore.
When I say it, it's not a joke! :D
> But the new dark mode stuff in explorer doesn't use uxtheme anymore.
I'm only familiar with Windows 10, and in that version Windows Explorer seemed to conform to the system color scheme just fine. Did Microsoft break that at some point in the past three, four months, [0] or did they royally fuck the dog in Windows 11?
For me, the *maddening* color-scheme-nonconformance was Task Manager. No matter what I did, it was *white*. It made it substantially unpleasant when you wanted to check a performance something-or-other in a dark room.
[0] Ever since I learned just how well video games work on Linux (via Steam), I've not booted into Windows. It's so nice. Folks report that there are launchers to run games in the Epic Game Store, but I've not yet bothered, so I can't provide first-hand info.
When you as a dev use a Windows provided control (be it a button, a toolbar, a context menu, etc.) the system grabs the theme information from something called uxtheme. This is how it worked in Windows XP and why you had like 3 different themes you could pick from and switch between at any time, and most apps would respect the selection.
But dark mode doesn't work like that, there's no uxtheme for dark mode. Windows Explorer is given special treatment by Windows with undocumented APIs. Other apps are not so lucky. If they want dark mode, they need to effectively draw the whole UI themselves. Even apps like Notepad++ have to do that these days.
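For what it's worth, the one shortcut that does exist (and which, I believe, apps like the ones above use for the title bar only) is the loosely documented `DWMWA_USE_IMMERSIVE_DARK_MODE` attribute of `DwmSetWindowAttribute`. A sketch in Python via ctypes; note the attribute value 20 is an assumption that holds on Windows 10 20H1 and later (older builds used 19), and the client area, menus and controls still need to be owner-drawn either way:

```python
# Sketch: ask DWM to paint a window's title bar dark. This is the
# commonly used workaround, not an officially blessed theming API.
# DWMWA_USE_IMMERSIVE_DARK_MODE = 20 on Win10 20H1+ (19 on older
# builds) -- treat that value as an assumption.
import ctypes
import sys

def set_dark_titlebar(hwnd: int) -> bool:
    """Returns True if DWM accepted the request; False (no-op) off Windows."""
    if sys.platform != "win32":
        return False
    DWMWA_USE_IMMERSIVE_DARK_MODE = 20
    value = ctypes.c_int(1)  # 1 = dark, 0 = light
    result = ctypes.windll.dwmapi.DwmSetWindowAttribute(
        hwnd,
        DWMWA_USE_IMMERSIVE_DARK_MODE,
        ctypes.byref(value),
        ctypes.sizeof(value),
    )
    return result == 0  # S_OK
```

Everything inside the window, buttons, lists, context menus, is untouched by this call, which is exactly the "draw the whole UI yourself" problem the comment describes.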
Microsoft: "We offer GREAT backwards compatibility! The best in the business!"
Also Microsoft: "Man, doing things the old way is so hard. What if we had a clean break with the past?!?!"
Microsoft, some time later: "Man, making a clean break from the past is so hard. Let's just ship what we've done so far and think about finishing up the rest later. What? Windows Explorer is broken, and you refuse to let us ship!? *Fine*, we'll hack something in and not tell anyone."
Qt would certainly be the better choice. However, since Meta already has a web version of the WhatsApp client, the WebView2 path was an easy and inexpensive option. After all, MS itself paved the way with Teams, Visual Studio Code, and Outlook.
What are you talking about? The Windows APIs have been stable for at least 20 years.
WPF (Windows Presentation Foundation) had been the recommendation a while back, but then Microsoft started pushing UWP (Universal Windows Platform). Both of those have been succeeded by WinUI 3. UWP has been deprecated. WPF is alive, but more in maintenance mode while WinUI 3 takes over the future. Oh, and WinForms was popular, but isn't now.
There's definitely been a lot of shifting and I think that's caused a lot of annoyance in the developer community - especially as Microsoft ships JS/WebView2 based apps instead of dogfooding their own stuff. If you hang out in the dotnet subreddit enough, you'll definitely see Windows devs annoyed at Microsoft's mercurial attitude toward their desktop frameworks and seeming lack of direction/interest - as their big new things are JS/WebView2.
You have a graveyard of frozen APIs you can use, with new features only available on the later ones.
Windows itself was always using its own custom stuff, not any of those. The closest thing to an established framework in Windows is React Native, which is sprinkled here and there, and Qt, which OneDrive uses.
Buying shares in a company doesn't benefit its operations, like making a product, directly. Hence, buying shares != supporting the company's products, however counterintuitive that feels.
Maybe during a never-ending bull market … but all bull markets end … look at the “lost decade”
Argument here is that buying shares (other than during specific events like an IPO) affects the shareholder, not the company's products.
So whether or not you buy shares has no relation to supporting the company's products. The latter happens when you buy or not buy their product.
Why not use a common library, where platform teams can update what features they use from it at their own pace?
Or why not use some other tech that allows multi-platform native shipping? The company I just started at, RemObjects, builds software for Mac and Windows (and iOS etc) from a common codebase. They just keep the UI separate, and all that is done when there's a new feature is _only_ UI layer coordination.
So an app has a C# (say) or Go codebase, and a UI layer in WPF and Cocoa.
I feel like coordination cost goes way down with both of these:
a) Library approach: central functionality, platform teams not tied to specifics
b) Shared codebase where all logic lives: platform teams do only UI
Edit: forgot to add, all this can be across languages, i.e. maybe you have a Go shared library and write your Cocoa UI layer in C#, that's fine, their tech makes it all interop.
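To make the split concrete, here's a toy sketch of the layering described above (all names here are hypothetical and it's Python purely for illustration; a real RemObjects-style setup would use C#/Go cores with WPF/Cocoa layers): one core codebase owns the logic, and each platform layer is a thin translation of core events into native widgets.

```python
# Illustrative "shared core, thin UI" split. Coordination cost shrinks
# because platform teams only touch the UI classes; ChatCore is owned
# by one team and shared by everyone.

class ChatCore:
    """Platform-agnostic business logic: one codebase, one team."""
    def __init__(self):
        self.messages = []

    def send(self, text: str) -> dict:
        msg = {"id": len(self.messages) + 1, "text": text}
        self.messages.append(msg)
        return msg


class WindowsUI:
    """Thin per-platform layer: rendering only, no business logic."""
    def __init__(self, core: ChatCore):
        self.core = core

    def on_send_clicked(self, text: str) -> str:
        msg = self.core.send(text)
        return f"[WPF bubble] {msg['text']}"


class MacUI:
    def __init__(self, core: ChatCore):
        self.core = core

    def on_send_clicked(self, text: str) -> str:
        msg = self.core.send(text)
        return f"[Cocoa bubble] {msg['text']}"
```

When a new feature lands in `ChatCore`, the only per-platform work left is deciding how each UI surfaces it.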
I do think this approach is super viable with a more waterfall-style product development approach, however, where a team owns the platform and its design.
Also, even the non-UI part is not trivial. I once worked on a very large codebase that was console-only. We eventually gave up on our Windows and Mac ports. Granted, it was C++ …
The developer makes a highly optimized native app that is a hit with the user. Now that it's a hit, the developer becomes a big company. And the big company needs telemetry, A/B experiments, fast iteration. They can't just sit around waiting for the original developer to spend more years crafting another masterpiece. Due to tech, sign-ins, or what have you, the app is also a kind of monopoly. Since it's a monopoly, quality doesn't matter. It can be a bloated Electron thing and nobody can do anything about it other than suck it up.
I have clients who regularly send me photos/videos to publish on websites. There are many usability issues around this:
1. There is no option to "download all", which means you need to click through every photo manually and hit the download icon.
2. When navigating through photos, the download icon is often hidden behind a submenu which means it takes two clicks to download a photo instead of one.
3. It's impossible to download videos without first having fully buffered them. This means you need to click through the full video to ensure it's streamed to your device before the download icon appears. This is super annoying especially with longer videos.
4. Bonus non-web annoyance: if a user sends you multiple photos, your phone goes insane with notifications and sounds like a rapid-fire pinball machine. DINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDINGDING
As a web developer I find it incredibly difficult to even think of releasing software with these kinds of basic inadequacies.
I would wager that if they wanted to simplify coordination, Flutter would easily kill the iOS and Android birds with one stone, and optionally desktop and web if they so wished, all without a 1GB web wrapper.
on the desktop client? nah
Calling it nonlinear paints some horrible exponential picture. It's just squared (all-to-all communication). We deal with squared problems all the time (that's literally what distributed consensus is all about.....)
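For the record, the "squared" claim is just the all-to-all channel count; a quick sanity check (my arithmetic, not the commenter's):

```python
# With n platform teams that all need to stay in sync, the number of
# pairwise communication channels is n*(n-1)/2 -- quadratic in n,
# not exponential.
def channels(n_teams: int) -> int:
    return n_teams * (n_teams - 1) // 2

for n in (2, 3, 5, 8):
    print(f"{n} teams -> {channels(n)} channels")
```

Going from 2 platforms to 5 takes you from 1 channel to 10; painful, but polynomial.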
Eventually, the effort required to actually manage doing the thing is at least as large/larger than just doing it - ex: every large organization.
You may not like that from a 'native look and feel' point of view, but the question 'what is a native Windows app these days anyway' is very much unanswerable, and you can actually implement stuff like this in a performant and offline-sensitive way.
But, yeah, by the time the resulting GPU worker process balloons up to 400MB, that pretty much goes out of the window. I'm actually sort-of impressed, in that I have no idea how I would even make that happen! But that's why I don't work at a powerhouse like Meta, I guess...
(in case anyone needs a reminder of Microsoft's org chart: https://www.globalnerdy.com/2011/07/03/org-charts-of-the-big...)
To be a little glib:
As someone who has worked for a few Big Software Companies, I guarantee that Microsoft's org chart has changed significantly at least once in the last fourteen years.
Re-organizations aren't referred to as "shuffling the deck chairs [on the Titanic]" by the rank and file for no reason, yanno?
But maybe that impression is wrong and they now cooperate better. After all, since some Windows 10 update, Windows Explorer can even create files and folders starting with a dot (which from a kernel, fs, and cmd perspective was always valid).
Based on my experience with Blasted Corporate Hellscapes, I find it very unlikely that they cooperate better. Middle-ish management lives to stab each other in the back, belly, and face.
> ...Windows Explorer can even create files and folders starting with a dot...
That's progress! Does Windows Explorer still shit the bed when you ask it to interact with a file whose name contains the '|' character? That's always been valid in NTFS, and I think is valid in at least a subset of the Windows programming interfaces.
Every place I've worked which did not use react had steady pushback from UI/UX to move to react. It took active resistance to not use react, even though it didn't make any sense to use.
Not any more. I kept Windows 11 around for gaming but I binned the partition. How they managed to make a 7950X3D/7900XTX feel "clunky" is astounding, given that I live in KDE, which has a reputation for being a "heavy" DE and yet feels instantaneously fast in every dimension compared to Windows 11.
Full disclosure: I use KDE almost exclusively.
macOS spoiled me.
I guess it's because they decided to make the web client first-class, and instead of maintaining a native client for each platform (windows, mac, linux...) they opted to just serialize all non-mobile uses (which probably aren't that important to them to begin with) to web.
https://engineering.fb.com/2014/10/31/ios/making-news-feed-n...
But to edit a large document or visualize a large corpus with side-by-side comparison, unless we plug our mobile into a large screen, a keyboard, and some pointer device, there is no real sane equivalent to work with on mobile.
Luckily for me, I have the ultimate power, so I can just say "Firefox doesn't support that. I don't use Chrome. Period."
But lately I had to start saying "Safari doesn't support that", because otherwise we'd lose all iPhones, or we can start investigating after we have a working solution. God damn React.
The advantage of the web app is that it just works, without installation, so there's no friction there. I'd very much prefer a native app, but the overhead is quite high, no?
And from a manager's point of view it seems wasteful to develop the same feature across multiple platforms. And if you look at the numbers it does, but numbers-driven development has been a huge issue for a long time now. They don't consider performance or memory usage a factor, and perceived performance is "good enough" for a web app.
Ever since UX and UIs started to be driven mainly by metrics and numbers, I felt something started going wrong already. Since then (the decades...), I've learned about "McNamara fallacy" which seems to perfectly fit a lot of "modern" software engineering and product management today:
> The McNamara fallacy (also known as the quantitative fallacy) [...] involves making a decision based solely on quantitative observations (or metrics) and ignoring all others. The reason given is often that these other observations cannot be proven.
Happy to learn otherwise, but might be a datapoint on user behaviour (which could also drive corporate choices).
When a developer/company decides not to implement things the local and proper way, and instead pushes it out and is done with it regardless of the resources the product uses on the user's system, I mark the company as lazy and cheap, actually.
Shoving the complexity and cost onto users is inconsiderate.
As much as I like super snappy and efficient native apps, we just gotta accept that no sane company is going to invest significant resources in something that isn’t used 99%+ of the time. WhatsApp (and the world) is almost exclusively mobile + web.
So it’s either feature lag, or something like this. And these days most users won’t even feel the 1GB waste.
I think we’re not far away from shipping native compiled to Wasm running on Electron in a Docker container inside a full blown VM with the virtualization software bundled in, all compiled once more to Wasm and running in the browser of your washing machine just to display the time. And honestly when things are cheap, who cares.
But for real, the average number of apps people download gets smaller year over year. When the most popular/essential apps take up more RAM, this effect will only be exacerbated. RAM prices have also doubled over the last 3 months, and I expect this to hold true for a couple more years.
It depends what metrics are considered. We can't keep transforming the earth into a wasteland just because, within the narrow window the system takes as its reference, rewards are disconnected from long-term effects.
Maybe what you're thinking of is a Wasm runtime like Wasmer.
JIT compiling, native graphics, quick and easy online deployment into sandboxes, support for desktop standards like keypresses, etc.
It feels like the web ate up the windows desktop experience instead of that experience spreading cross-platform and dominating.
I completely agree it would be better to rethink what we want and have markup/code/etc optimised to the task of rendering applications. I don't think it'll happen unfortunately.
They had to stop because native widgets aren't secure enough.
Let google do it on your behalf.
The new electron app does take more resources, but at the very least it works.
By only accepting ANSI input, not encrypting any messages, and not bothering to protect users from remote attacks.
Facebook's GUI stack for WhatsApp may be rather buggy, but on a technical level there's a lot more going on than back in the days of unencrypted TCP connections over plaintext protocols.
Meanwhile, Telegram has an excellent desktop app (despite their terrible protocol), so it's not like the knowledge was lost either.
I don't use the Telegram web app, but their native apps work excellently. The insertion of ads has been a major disappointment but the chat UX itself is still great, even on native Linux.
Indeed. I'm not actively using Telegram, but I tried the desktop application (made with Qt, if I remember correctly), and it's way ahead of what WhatsApp offers. Not to mention it's fast and relatively light.
Facebook could just take the app, change the colours to green, replace the messaging protocol with their WhatsApp library, and they'd get an actually usable chat client practically for free.
As a teenager, I thought we'd get better at making software over time. Not worse.
This is partly because MS became insanely complacent. The Windows team is very junior. Just ask anyone who has worked with them. They don't have the skills or resourcing that they did in the 90s.
When I switched from Windows, the thing I missed on Linux was the native WhatsApp app. Now they've killed it there too, so I feel even better about my switch!
Huge perf issues because of this.
Also had some serious bugs for a few weeks. I had to let WhatsApp Web sit and fully sync for ~15 minutes, or else it would just stop responding and crash everything.
Maybe this will make me try the desktop app again.
I don’t think we have. This is always what efficiency leads to, higher resource consumption. The phenomenon was described already in the 1800s: https://en.wikipedia.org/wiki/Jevons_paradox
JS and the web has seen performance improvements. They lead to more ads being served and more code being released faster to users.
That's not the same thing. If you make batteries more efficient then people build more devices that run on batteries and then you need more batteries than ever. But you also get a bunch of new devices you didn't used to have.
When computers get more efficient, miserly corporations cut staff or hire less competent programmers and force their customers to waste the efficiency gains on that, because they don't have enough competition or the customers are locked in by a network effect. The difference is whether you actually get something for it instead of having it taken from you so somebody else can cheap out.
Without any regulations companies will create software that costs more to the users, but saves pennies to the company.
So, we have regressed in efficiency.
They are not mutually exclusive but one follows from the other.
It’s company vs user not regression vs efficiency
We have more efficient hardware, so we should be seeing hardware everywhere. But actually we all use the same amount of hardware we did 20 years ago. We all have a desktop, a laptop, a smartphone, a modem, hell even a computer watch, like we did 20 years ago. But they're more efficient now.
Where we do see more hardware now, is in pre-existing appliances, like fridges, TVs. And why is there more hardware? Sometimes it's just a remote control ("turn off TV"). But more often, the increase in adoption follows a specific motive: like figuring out that they could sell ads or subscriptions through it. And the hardware itself is not what's making the ads work: it's the software, that collects the information, feeds it to companies over networks, lets them data-mine it and sell it continuously. Both of these are just serving a human interest to make more money through the creative use of surveillance and marketing. And honestly, most of this could've been done with hardware and software 20 years ago. But because it's in vogue now, we see more of the hardware and software now.
We are comforted by coming up with an explanation that makes logical sense, like the paradox. But the paradox applies most when it coincides with an unrelated human interest. What motivates us is not A+B=C, but a combination of different calculations that sometimes involve A and B, and incidentally add up to C.
That said, I do firmly agree with the parent: there is choice involved here, engineering decisions.
The Microsoft world is particularly bloated, as they insist on shoehorning in unwanted anti-features into their OS. Much more efficient operating systems (and ways of building a chat client) exist.
Jevons paradox may describe a general tendency, but it's no excuse for awful software.
Completely wrong and irrelevant analogy!
I see where you went sideways: you completely confused trigger with consequence. Here, efficiency for the very same application got incredibly, galactically worse, not better. The premise of the linked article is that the same application gets more efficient, and then use of the affected resource increases. Here the same application went to complete shit efficiency-wise, and it had no effect on memory manufacturing or prices; WhatsApp is not that significant in computing.
A better analogy would be that when technological (and tightly related economic) advances raise the availability of resources (here memory and CPU), things go dumb. If anything, the generalized (from time to any resource) Parkinson's law is relevant here: increasing available resources beyond reason leads to waste, overcomplication, and bad-quality outcomes.
The application is "business logic".
The engine is JS. The more efficient JS engines get, the more compute and memory JS will use to deliver business logic in the universe.
We often hear stories about the speed of development and the difficulty of maintaining native apps, and then there are these rewrites every few years. Don't they waste more resources than creating or fixing the gaps in the native app would? And this isn't some quick startup prototype app that can flop and have the effort be wasted.
Everybody is on telegram today, it's not like five years ago when people did not know what it was.
Matrix gives a similar experience with e2ee, though you have to save a recovery key in case you lose access to all your sessions.
Zuck, six months ago: “Within 12 to 18 months, most of the code will be written by AI. And I don’t mean autocomplete.”
Meta, today: "Maintaining this basic Windows app is just too much work."
Take Microsoft, for instance: they have been heavily pushing what they call the New Outlook, which is basically a web-based client mirroring Outlook on the web (OWA), packed into an EXE file (not sure if it is Electronized or not). Then they renamed the real native Outlook app to Classic Outlook to make it feel old-fashioned and outdated, and as a result we ended up losing some core features that made Outlook, Outlook: COM add-ins and VBA macros, MAPI support, Word as email editor, .PST file support... to name a few.
This would probably be one of the reasons contributing to the Collapse of Civilization (https://news.ycombinator.com/item?id=25788317)
But fear not! As we learnt from Microsoft, Meta and OpenAI, in early 2026, 100% of the code will be created by GenAI. Finally we can have native apps with working core-functionality. Can't wait for it!
There are far too few people who truly advocate for the user, and it is this dereliction that has fuelled the race to the bottom.
It's not a matter of native libraries versus x-platform solutions versus W3, it's about valuing UX over DX at EVERY TURN. It's about educating yourself as to the resource and performance consequence of the technologies you are advocating for.
There is essentially no way to tell if a JS app is using a lot of memory just by looking at what the process has reserved. There's loads of things that end up in that space - cached pages, cached compiled code, cached bitmaps of rendered pages, etc.
The task monitor tells you what Chrome or Chromium (e.g. Electron) is doing, not what the web app is doing.
There is a good argument to suggest Chromium is hogging more than it should. That's not really WhatsApp's fault though.
I mean, you could argue that you're at fault when you're choosing a platform that literally eats your RAM. They're not at fault that the platform is shit. But they could have chosen another platform. Or, even better, just use the existing, most of the time perfectly working app, and optimize it.
They had no resources to build the new features for every single platform that they offer native clients for? I could not imagine any other app for which this would be more hypocritical. Give me the iOS WhatsApp IPA from 2016, or even earlier, and the Windows Chromium wrapper from 2018, and I will tell you that neither I nor probably 98% of users will notice any difference in features, let alone design.
Yet I really don't understand why WhatsApp would need a desktop app at all, especially in the state described here (a basic wrapper).
There are no calls in the web app, but the modern web stack is more than enough to provide all the real functionality needed for it.
If it allowed me to do video calls from a laptop, that would be useful, but apparently that's not a feature they offer.
It does though?
According to this not any more.
Because just parsing html/xml/some declarative ui description to create a scenegraph and pushing it to the GPU is something I'm sure even native frameworks like Qt do.
Yet I can feel the latency in VS Code vs., say, Sublime Text.
It's tempting to blame the huge JS blobs trying to create an application abstraction over a document object model, but I feel like there should be more to it than just that...
Also think of all the web subsystems that have been added over the years. Every Electron app is bundling those in, even the inspector. It's shipping a whole VM with your code. Containers at least have the decency to strip things down to the bare minimum, but no one is thinning Chromium down to only what they need.
- HTML rendering - which is insanely complex to do efficiently for arbitrary web apps.
- Video conferencing software
- A graphics engine. Used for rendering web content, canvas2d, WebGL, WebGPU, and video encoding & decoding for a bunch of formats. It also has a bunch of backends (e.g. CPU, Metal, Vulkan, etc.)
- JS, and every feature ever added to JS.
- WASM compiler, optimizer & runtime
- Memory manager, process isolation per tab, and all the plumbing to make that work.
- The Inspector - which contains a debugger and most of an IDE for JS, WASM, CSS and HTML.
- So much interop. Like chromecast support, http1, http2, quic, websockets, webtransport, webrtc, javascript remote debugger protocol, support for lots of pre-unicode text formats, DoH, webdriver, and on and on.
- Extension support
- Gamepad drivers, web bluetooth, webserial, midi support
What am I missing? Probably lots of stuff. I have no idea how much each of these components contributes to browser bloat. But all up? It's a lot. Chrome is bigger than most complete operating systems from a decade or two ago. Browser vendors have to pick between fast, simple and fully featured. Chrome sacrifices simplicity every time.
(disclaimer: I help out a little with that plugin, amongst others, for Pidgin)
meanwhile, the web-based version in the browser was nearly instant.
I should try the new web-based app to perform benchmarks, but I think I'll just keep using the browser.
I'll also mention that the "whatsapp web" website is extremely resource intensive, and on my laptop it makes CPU usage ramp up to 100% very often; using WhatsApp while having any other programs or even browser tabs open becomes unbelievably annoying. The desktop app takes 700MB, the web browser takes 400MB, and WhatsApp web takes at least 1GB alone. This is on Arch Linux, with 4GB of DDR4 RAM.
- slim down to a more efficient, non-virtual-DOM web stack (e.g. hypermedia)
- move to a wrapper around native webview (Tauri)
- use one of the excellent cross-platform frameworks (Flutter, or that new one just open sourced by Snap)
For WhatsApp specifically, I don't understand why a company with the size and resources of Meta can't support native versions for Windows, Mac _and_ Linux. I think many people would accept non-feature-parity with the web/mobile versions, in exchange for a tight, reliable desktop messaging app...
WhatsApp now is using a native webview, at least they're using Webview2 on Windows.
It used to bundle MSN, ICQ, IRC and everything in one messenger - super resource efficient.
Unfortunately I cannot reach many contacts that way these days, but it showed me how inefficient applications have become by using more abstraction layers.
Same on my Mac - it uses almost 16GB of RAM in idle mode with some tabs, VS Code and Figma open - how did we get here?
edit: Just saw another comment mentioning there is a WhatsApp plugin for Pidgin. Awesome!
And now it seems that there is not even a benefit to installing the native version, and they will probably do the same for Mac now too.
I long for a simple, stable chat app that doesn't change every other day. Maybe we should all go back to IRC.
Well, sounds like a lot of useless work was being done, then. How does it gobble 100MB when idle? Are the protocols that complex?
Just do as I do and open web.whatsapp.com in your favorite browser
I wonder if they avoided that so they could use Electron and target MacOS / Linux too
That was... an interesting read. The seemingly perpetual need to default back to greed and authoritarianism never ceases to amaze me.
I was thinking about switching back to KDE. Well, maybe not after all. Cheerio, as they say.
This comment is just an excuse to do a hitjob on KDE donations. KDE is FOSS and doesn’t use your data to make money.
I was also running a mIRC client in rooms with hundreds to thousands of people at once.
And I was able to do both at the same time.
Now that computer can't even run 1v1 chat.
Meta makes more money than god and there's over a billion WhatsApp users. It's not like this thing is Blender or a AAA game, it's a chat frontend. Maintaining it has to be a rounding error in the budget.
For software companies it’s the same, in a way. They pay whatever the consumers deem appropriate. It’s just that consumers don’t really care for the most part, so the cost is basically $0. RAM is cheap, and if a company prioritizes shipping features over optimizing RAM, that’s almost always going to be the deciding factor.
It's zero, because most people cannot just not use WhatsApp, M$ Teams, etc. because WhatsApp has a monopoly on their centralized garden.
It's not like Matrix, where I can easily switch clients like I can switch refrigerators.
Even if someone were able to provide an alternative, Facebook would just "kill" them, like Apple killed Beeper's iMessage alternative.
My time isn't.
Regardless, we’re talking about the average user. As the article implies, on average, users’ time is in fact cheap compared to the alternative, or maybe the time cost is simply insignificant.
An additional $100k cost per year, vs. the time, energy and hardware of billions of users?
If your spikes in load time are random it could be the connection, the amount of free RAM on your computer in that very moment, or maybe how well it is syncing with your phone. However I don't know how it handles that. It used to require an online phone but it seems that it can do without it now.
I’m forced to be in parent group chats in this accursed ecosystem and give Zuck another DAU.
Even Telegram is better, and then there are Signal, Briar, and XMPP too, all fully featured. If you are fine with text only, then Ricochet Refresh is the best candidate as far as privacy goes.
For me this just sounds nonsensical. I am sure it makes sense somewhere in the world
My parents know how to use WhatsApp and have it installed. My sister and her family, too. My friends, their friends, the scammer trying to explain to me how to access the crypto I apparently lost track of on their platform, my former coworkers, my current coworkers, the people I interact with for my freelance work, the government's road safety alerts, my ex-girlfriend who still has a bunch of my clothes, my neighborhood group informing me when I need to put out the trash: they're all on WhatsApp.
It's the network effect. Even to get my family group on Signal, I'd have to get twenty people to install Signal, probably visit some of them to help them with the setup, do it again every time they switch phones, and so on. And they'd probably still default to WhatsApp because that's what they know.
I do not think I would be able to get my kid off the grid completely so might as well use a better alternative, and in many cases you simply just can't because you have to submit your homework through Facebook, and so forth. Thankfully it has settled down a bit post-COVID but there may still be cases of this crap. I loathe it.
In any case, a privacy-oriented alternative is better than the cancerous Meta. If you want no IMs at all, well, that may be tough. Kids find a way, trust me. I know I did. :D
terrible? no way
It's incredibly slow to propagate messages.
Initial set up is very confusing and offers far too many choices.
I've never worked out how to find rooms. Regularly someone will give me a room name and I can't find it or join it.
Their entire service went down (did someone say "decentralised"?) a few months ago.
Do you use Element Web on the phone?
> It's incredibly slow to propagate messages.
was
> Initial set up is very confusing and offers far too many choices.
a thing of the past; currently there are two choices, and in the future there will be only one
> I've never worked out how to find rooms. Regularly someone will give me a room name and I can't find it or join it.
let someone invite you to the room or share a link
> Their entire service went down (did someone say "decentralised"?) a few months ago.
afaik only matrix.org went down
correct; the rest of the network was unaffected - it was just one instance (albeit the biggest one). https://matrix.org/blog/2025/10/post-mortem.
At least look back at the color of the other grass!
No 1 GB or installation needed
Why is the desktop app even a thing?
the desktop app is considerably faster and more responsive.
the desktop app allows OS-level shortcut keys
the desktop app is easier to work with when applying parameters in other programs, like excluding it from my VPN, sandboxing it, or isolating its network traffic, or when looking at how much space it takes up on disk. (I'm not a web developer.) It doesn't cause any confusion or mistakes, as its logical separation in the OS is clear; this is also faster
the desktop app has better keyboard shortcuts that don't collide with your browser's, and the same goes for right-click menus
I can easily video call from various PCs while still not trusting my browser with camera/mic permissions
Pin tab, problem solved?
The ergonomics are significantly worse.
I want most of my browser windows full screen. I don't want my instant messenger full screen. Using it in a browser means I have to have one size, and resizing one changes the other.
The experience of using a native app is far superior.
Sure it's a little quirky at times (eg it closes if the browser restarts for update) and it doesn't have a system tray icon, but aside from that, it behaves like a separate app.
I guess if you're using Microsoft's ill-advised window grouping feature it would work less well (require more clicks), but breaking websites out into entirely separate programs just so we can have separate windows because Microsoft screwed up the window management functionality seems like a very inefficient workaround.
With a native app it's just alt+tab - or, if the app is pinned to the taskbar, Win+(1/2/3/4...)
/s
Telegram has its own faults and issues, but the native Windows app is incredibly good and fast.
You know, I heard something funny the other day: someone saying that now, thanks to Claude Code, companies might go back to doing actual native support, because the specialized knowledge and effort have both basically been automated.
Well, give it a few years!
> WhatsApp for Windows was originally an Electron app, and it was eventually replaced with UWP after years of investment. Four years later, WhatsApp is going back to WebView2, abandoning the original WinUI/UWP native idea.
> My understanding is that the recent layoffs at Mark Zuckerberg-headed Meta likely disbanded the entire team behind the native WhatsApp. I don’t see any other reason why Meta would abandon its native app for Windows. Meta will save costs by maintaining the web app codebase on Windows, but you’re going to hate the experience.
Giving them more time won't change much if they have let the devs go in order to save some money.
A company's goal is to make money by optimizing its resources. What benefits would Meta gain by maintaining native apps for WhatsApp across the three major operating systems? I can tell you: absolutely none, only negatives. Nobody except a negligible fraction of users would care about native performance or idle memory consumption. No one is going to switch to Signal or whatever the flavor-of-the-year messaging app is because of this.
It would be a different story if WhatsApp were to lose a significant portion of its user base due to the app becoming unusable or extremely slow. But for the vast majority, this change will go unnoticed or frankly won't matter at all. So, expect most companies to continue adopting Electron-like apps (for the few that still have native apps anyway) for exactly the same reasons.
Sorry to be blunt, but it's really tiresome to see these discussions going around in circles here. It’s pointless to keep debating this, it's not going to change. If one day a framework emerges that's comparable to Electron (or something similar) but requires fewer resources to develop against, I could see Meta and other companies considering it... provided the migration costs aren’t too high. But again, no end-user truly cares about this.
Another example of WinUI's anemic state.
All this "engineering" from Meta, and algorithm acrobatics with hundreds of optimization puzzles, and they accept a solution that performs far worse in both runtime and space complexity.
Whoever believes that this is the acceptable solution from a trillion dollar company would have failed their own interview.
No company would accept a huge regression like that and push it to billions of users but Meta.
Total amateurs.
Whatsapp screams antitrust. If you look in the dictionary for antitrust, you see Whatsapp
I was very happy with that situation - it was the last and good enough reason to uninstall all FB-related apps from my phone, and I never looked back. That company (and not only that one) is a cancer on the whole of society, by design.
Must be a tiny percentage, which is why this version is now a basic web wrapper now.
Anyway, I’d remind everyone that “using” RAM doesn’t mean “would not function with less RAM.”
Many applications just use a lot if it’s available.
RAM is not really something you explicitly ration.
I guess this modern attitude is how we are where we are.
RAM is absolutely a scarce precious resource that we optimize for. At least we used to, and some of us still do.
Oh and the browser (any browser, I tried many) just takes up 1GB per tab it seems. It’s insane. My old 8GB laptop is nearly unusable now and can barely be used to browse the internet and very little else. I can at least keep coding on emacs. Who would think emacs would one day be an example of a lean app!?
You can buy an entire complete mini PC including 16GB of modular RAM for $240 on AliExpress.
I’m not saying “don’t optimize.” I’m saying that watching your task manager, seeing a big number and freaking out isn’t really the definition of unacceptable performance.
It most certainly is. My old PC ran on 8MB of RAM. Modern ones need 16GB for a comfortable experience, and they do not do much more than I needed back then. I think it's reasonable to expect a simple chat app not to take up 128 times as much memory as my entire PC had when I was young.
Okay but I’m not trying to land on the moon, I’m trying to have an HD group video call and maybe play Cyberpunk in 4K later.
Let me ask you, how much did that 8MB of RAM cost you back in the day? I bet it was more than the $100 it costs to get 32GB of RAM or the $200 it costs to get 64GB. Before you apply inflation!
> Many applications just use a lot if it’s available.
Some of that memory isn't going to be touched again, and will eventually be moved to swap, but it still pushed things out of RAM to be there and is a troublemaker.
The rest of that memory will be needed again, so if it gets swapped out it'll lag badly when you switch back to the program.
Either way 99% of programs are not doing any kind of intelligent use of spare memory. If you see them doing something that looks wasteful, that's because they're being wasteful.
The one thing to remember is that at the OS level, disk cache pretty much qualifies as free memory. But that's unrelated to this issue.
Except when something really does need more RAM, and fails. LLVM for example having, somehow, become a bit chonky and now fails to compile on 32-bit OpenBSD systems because it wants more memory than is available. Less bloated software of course does not suffer from this problem, and continues to run on still functional 32-bit systems.
> Many applications just use a lot if it’s available.
Xorg is using 92M, irssi 21M (bloated, but I've been lazy about finding something leaner), xenodm 12M. That's the top three. Oh, Windows? Yeah. About that. Best you can hope for is not to catch too much of the splatter. (What passes for Mac OS X these days also seems fairly dismal.)
> RAM is not really something you explicitly ration.
Paperclips were hung on the rack doors to make it easier to poke the wee little red reset button when some poorly written software went all gibblesquik (as poorly written software is wont to do) and the OOM killer could not cope and, whelp, reset time. Elsewhere, RAM is explicitly rationed—perhaps certain aspects of timesharing have been somewhat forgotten in this benighted era of bloat?—and malloc will return NULL, something certain programmers often fail to check for, which is generally followed by the kernel taking the error-ridden code out back and shooting it.
Also, even in theory the issue isn't only with "wouldn't function", but "would function slower due to eg disk swaps / cause other apps to function slower".
> An app can use a lot of memory, and it does not necessarily mean it’s a performance nightmare, but the issue with the new WhatsApp is that it feels sluggish. You’re going to notice sluggish performance, long loading time, and other performance issues when browsing different conversations.
The issue, I think, is who the desktop users are. They're sales people; they're people who conduct business over WhatsApp. The buyers at a previous job of mine used whatever the sellers in Asia, Eastern Europe and the Middle East were using. A long time ago that was mostly Skype; now it's WhatsApp. There's a huge benefit to having WhatsApp on your desktop, with easy copy/paste, Excel and everything you need to make the deals.
Maybe Meta doesn't believe you should do business over WhatsApp and doesn't want to cater to that crowd.
I would love to see what a professional Windows application developer, if those are still around, could do with a native WhatsApp client. Using modern C++, or just C# and all the tooling provided by the platform, how small and functional could you actually make something like that?
Still, I think the experience reported is very similar to running Chrome, and I think any laptop with 8GB of RAM can handle the application plus Excel and a web browser (or just run WhatsApp in the browser) just fine.
A complete mini PC with 16GB RAM, 512GB storage, and a relatively modern processor goes for like $240 on AliExpress. And that’s before you consider used hardware.
It's much easier to locate an application that has its own process and presence in the operating system.
The author of the article would have nothing to complain about if this were Facebook.