https://www.anthropic.com/engineering/advanced-tool-use
They discussed how running generated code is better for context management in many cases. The AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, thus reducing context needs. Furthermore, if you can run the code right next to the server where the data is, it's all that much faster.
I see Bun like a Skynet: if it can run anywhere, the AI can run anywhere.
Java is possibly the safest bet on the future: it's open source both in spec and in the most common implementation (OpenJDK), and it's so widely used that multiple FAANG companies are critically dependent on Java working; any one of them alone could continue development of the platform were anything to happen.
Besides, Oracle has been a surprisingly good steward of the language.
Plus last time I checked Oracle lost that lawsuit.
This is only true for older .NET Framework applications.
<Alanna> Saying that Java is nice because it works on all OS's is like saying that anal sex is nice because it works on all genders
EDIT: someone has (much to my joy) made an archive of bash.org so here is a link[1], but I must say I’m quite jealous of today’s potential 1/10,000[2] who will discover bash.org from my comment!
So how did it work back in the day, people would just submit text and it would get upvoted? I always assumed like half of them were just made up.
As far as provenance, I assume a lot of them were made up too, but this one was real.
To everyone else: I acknowledge that this post is not adding value, but if you were one of the lucky 1/10000 you would understand that I have no choice.
If anyone ever requested/used an eggdrop(?) bot from #farmbots or #wildbots on quakenet then thanks to you too; that was certainly one of the next steps down the path I took. A (probably very injectable) PHP blog and a bunch of TCL scripts powering bots, man I wish I could review that code now.
you could also run java with js if you are brave enough https://kreijstal.github.io/java-tools/
For example, if I have a struct `PageEntity` with a field `Id`, and I am iterating over a slice of such IDs, I would prefer using `pid` instead of `pageEntityId` as the variable name. But Java APIs and conventions tend to use these longer names, so I find it takes more thinking to remember the different names instead of quickly seeing the behavior of code at a glance.
Java also tends to have a lot of inheritance which results in these long glued-together names and makes it harder to follow program flow because behaviors get introduced in multiple different places (i.e., it has the opposite of locality of behavior).
But those are just my opinions and experiences! I know many people love Java, and it is a versatile and powerful language.
By contrast `bun install` is about as good as it gets.
Please give me Java tools over C, C++, JavaScript or Python ones, any day of the week.
Only .NET and Rust compare equally in quality of DX.
AI tools value simplicity?!?
Check out the Python dependency management chaos: what's the proposal this month, and from which AI startup doing Python tools in Rust?
How many mass security incidents have there been with npm just the last few weeks?
If the build script being a DSL is the issue, they're even experimenting around declarative gradle scripts [0], which is going to be nice for people used to something like maven.
> The challenge
> Traditional tool calling creates two fundamental problems as workflows become more complex:
> Context pollution from intermediate results: When Claude analyzes a 10MB log file for error patterns, the entire file enters its context window, even though Claude only needs a summary of error frequencies. When fetching customer data across multiple tables, every record accumulates in context regardless of relevance. These intermediate results consume massive token budgets and can push important information out of the context window entirely.
> Inference overhead and manual synthesis: Each tool call requires a full model inference pass. After receiving results, Claude must "eyeball" the data to extract relevant information, reason about how pieces fit together, and decide what to do next—all through natural language processing. A five tool workflow means five inference passes plus Claude parsing each result, comparing values, and synthesizing conclusions. This is both slow and error-prone.
Basically, instead of Claude trying to, e.g., process data by using inference from its own context, it would offload to some program it specifically writes. Up until today we've seen Claude running user-written programs. This new paradigm allows it the freedom to create a program it finds suitable in order to perform the task, and then run it (within confines of a sandbox) and retrieve the result it needs.
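To make the pattern concrete, here's a hypothetical sketch of the kind of script the model might emit for the 10MB-log example above; the file name and error format are invented, and only the compact summary re-enters the model's context, not the raw log:

    // Hypothetical model-generated script: count error frequencies in a
    // large log so only a few hundred bytes of summary go back to the
    // model, not the 10MB of raw lines.
    import { readFileSync } from "node:fs";

    const counts = new Map<string, number>();
    for (const line of readFileSync("app.log", "utf8").split("\n")) {
      const m = line.match(/ERROR\s+(\w+)/); // e.g. "ERROR TimeoutError ..."
      if (m) counts.set(m[1], (counts.get(m[1]) ?? 0) + 1);
    }

    // Only this compact summary re-enters the model's context.
    console.log(JSON.stringify(Object.fromEntries(counts)));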
Cloudflare Workers had Kenton Varda, who had been looking at lightweight serverless architecture at Sandstorm years ago. Anthropic needs this too, for all the reasons above. Makes all the sense in the world.
To be honest, that sounds more like a pitch for deno than for bun, especially the “paranoidly sandboxed” part.
The acquisition makes more sense. A few observations:
- No acquisition amount was announced. That indicates some kind of share swap where investors exchange shares in one company for shares in another. Presumably the founder now has some shares in Anthropic and a nice salary and vesting structure that will keep him on board for a while.
- The main investor was Kleiner Perkins. They are also an investor in Anthropic. 100M in the last round, apparently.
Everything else is a loosely buzzword-compatible thingy for Anthropic's AI coding thingy and some fresh talent for their team. All good. But it's beside the point. This was an investor bailout. They put quite a bit of money into Bun with exactly zero remaining chance of it turning into the next unicorn. Whatever flaky plan there once might have been for revenue that caused them to invest clearly wasn't happening. So they liquidated their investment through an acquihire via one of their other investments.
Kind of shocking how easy it was to raise that kind of money with essentially no plan whatsoever for revenue. Where I live (Berlin), you get laughed away by investors (in a quite smug way typically) unless you have a solid plan for making them money. This wouldn't survive initial contact with due diligence. Apparently money still grows on trees in Silicon Valley.
I like Bun and have used it but from where I'm sitting there was no unicorn lurking there, ever.
Bun claims this feature is for running untrusted code (https://bun.com/reference/node/vm), while Node says "The node:vm module is not a security mechanism. Do not use it to run untrusted code." I'm not sure whom to believe.
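For what it's worth, Node's warning is easy to demonstrate. This is the classic, widely documented escape from a `node:vm` context; any "run untrusted code" claim has to explain how it blocks this whole class of trick:

    import vm from "node:vm";

    // Inside the "sandbox", `this` is backed by a host-realm object, so
    // its constructor chain reaches the host's Function constructor and,
    // from there, the host's `process` object.
    const leaked = vm.runInNewContext(
      "this.constructor.constructor('return process')().version"
    );
    console.log(leaked); // prints the host runtime's version: the sandbox leaked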
It looks like Bun also supports Shadow Realms which from my understanding was more intended for sandboxing (although I have no idea how resources are shared between a host environment and Shadow Realms, and how that might potentially differ from the node VM module).
I've heard that TypeScript is pretty rough on agentic coding loops because the idiomatic static type assertion code ends up requiring huge amounts of context to handle in a meaningful way. Is there any truth to it?
There was recently a conference which was themed around the idea that typescript monorepos are the best way to build with AI
My personal experience and anecdotal evidence are in line with this hypothesis. Using the likes of Microsoft's own Copilot with small, simple greenfield TypeScript 5 projects yields surprisingly poor results the minute you start leaning heavily on type safety and idiomatic techniques such as branded types.
> There was recently a conference which was themed around the idea that typescript monorepos are the best way to build with AI
There are also flat earth conferences.
The language syntax has nothing to do with it pairing well with agentic coding loops.
Considering how close TypeScript and C# are syntactically, and C#'s speed advantage over JS among many other things, C# should have become the main language for building agents. It is not, and that's because the early SDKs were JS and Python.
Kind of tangent but I used to think static types were a must-have for LLM generated code. But the most magical and impressively awesome thing I’ve seen for LLM code generation is “calva backseat driver”, a vscode extension that lets copilot evaluate clojure expressions and generally do REPL stuff.
It can write MUCH cleaner and more capable code, using all sorts of libraries that it’s unfamiliar with, because it can mess around and try stuff just like a human would. It’s mind blowingly cool!!
Makes me wonder what a theoretical “best possible language for vibe coding” would look like
Nobody cares about this; JS is plenty fast for LLM needs. If maximum performance was necessary, you'd be better off using Go because of its fast compiler and better performance.
And that was my point. The choice of using JS/TS for LLM stuff was made for us based on initial wave of SDK availabilities. Nothing to do with language merits.
In theory, quality software can be written in any programming language.
In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in the production-grade developer experience and the quality of packages on offer on PyPI and npm versus Maven Central and NuGet.
I've seen codebases of varying quality in nearly every language, "enterprise" and otherwise. I've worked at a C# shop and it was no better or worse than the java/kotlin/typescript ones I've worked at.
You can blame the "average" developer in a language for "not caring", but more likely than not you're just observing the friction imposed by older packaging systems. Modern languages usually ship with package managers that make it trivial to publish language artifacts to package hubs, whereas Gradle, for example, is its own brand of hell just to get your code to build.
do so because a boss told them "that's the way we deal with correctness and performance around here"
the fact that their boss made that one decision for them does not somehow transmit the values behind the one decision.
> Nonsense. Average Java/C# is an enterprise monkey who barely knows outside of their grotesque codebase.
Netflix is Java. Amazon is mostly Java. Some of the biggest open source projects in the world are Java. Unity and Godot both use C# for scripting. I don't know where you're getting the impression that Java and C# are somehow only for "enterprise monkey who barely knows outside of their grotesque codebase"
You can add Meta, Google, and Palantir to your list and it won't change that the average Java dev is from the Eastern hemisphere and knows nothing about Java outside of JavaEE/Spring.
See how generalizations work?
A typical backend developer using C#/Java is likely solving more complicated problems and has all the concerns of an enterprise system to worry about and maintain.
Dismissing a dev or a system because it is enterprisey is a weak argument to make against a language. A language being used heavily in enterprises to carry the weight of the business is a sign the language is actually great and reliable.
All of them explicitly don’t have to care about performance from the start because of VMs + GC, only when scale starts to matter you start to optimize.
The tooling argument is especially funny to me, given how shit the tooling ecosystem is. Sure, it is ol' reliable, but the average Java dev is so stuck in their ways that they've never even tried to venture out of their Java cave to see what's out there.
IntelliJ consuming ALL of the RAM, literally as much as it can get its hands on. Gradle taking what's left, rebuilds taking minutes to complete or requiring an elaborate setup to have proper hot reload. Both TS and Python have far more expressive and powerful type systems than even modern Java. "Production grade tooling" my ass.
Funny to see Java shmucks looking down at JS/Python folks, as if Java at the time wasn’t picked for literally same reasons as Python/JS nowadays.
what makes you think so?
I believe strong typing is very very useful for human coding,
I'm not convinced it's so "very very" for agents.
I've noticed LLMs just slap on "as any" to solve compile errors in TypeScript code, maybe this is common in the training data. I frequently have to call this out in code review, in many cases it wasn't even a necessary assertion, but it's now turned a variable into "any" which can cause downstream problems or future problems
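A hypothetical illustration of the pattern (all names invented): the "as any" fix compiles, but erases the type information downstream.

    interface User { id: string; name: string }

    function parseUser(raw: unknown): User {
      return raw as any; // silences the compile error, validates nothing
    }

    const u = parseUser(JSON.parse('{"id": 1}'));
    u.name.toUpperCase(); // type-checks fine, throws at runtime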
I tell the LLM to include typing on any new code.
The agent is running the test harness and checking results.
I'm not sure I understand why it's necessary to even couple this to a runtime, let alone own the runtime?
Can't you just do it as a library and train/instruct the LLM to prefer using that library?
> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.
100%. Even more robust if paired with an overlay network which provides identity-based S3 access (rather than IP address/network based). Otherwise the server may not have access to the S3/cloud resource, at least for many enterprises with S3 behind VPN/Direct Connect.
Ditto for cases where you want the agent/client side to hit S3 directly, bypassing the server, and the agent/client may not have a permitted IP in the FW ACL, or be on VPN/WAN.
ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.
[0] https://timesofindia.indiatimes.com/technology/tech-news/goo...
What the user you're replying to is saying is that the Bun acquisition looks silly as a dev tool for Node. However, if you look at their binding work for services like S3 [0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).
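For a sense of what that binding work looks like, here's a sketch against Bun's documented native S3 client; the exact API surface may differ between Bun versions, and the credentials/bucket/key are placeholders:

    import { S3Client } from "bun";

    // Bun can also pick these up from standard AWS environment variables.
    const s3 = new S3Client({
      accessKeyId: "AKIA...",
      secretAccessKey: "...",
      bucket: "my-bucket",
    });

    const file = s3.file("logs/2025-01-01.json"); // lazy handle, no network yet
    const data = await file.json(); // built into the runtime, no AWS SDK needed
    console.log(data);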
At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run JavaScripts from around the web. At least with a command-line web-capable JS runtime monstrosity the user could in theory exercise more control over what scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control)
1. One can already see an approach something like this being used in the case of
https://github.com/yt-dlp/yt-dlp/wiki/EJS
where a commandline JS runtime is used without the need for any graphics layer (advertising display layer)
I believe this completely. They didn't have to join, which means they got a solid valuation.
> Instead of putting our users & community through "Bun, the VC-backed startups tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.
I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.
Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.
Though if I'm not mistaken, Confluent did the same thing?
With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.
Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.
> They didn't have to join, which means they got a solid valuation.
This isn't really true. It's more about who wanted them to join. Maybe it was Anthropic who really wanted to take over Bun/hire Jarred, or it was Jarred who got sick of Bun and wanted to work on AI. I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".
Sounds like "monetizing Bun is a distraction, so we're letting a deep-pocketed buyer finance Bun moving forward".
And kept their fraudulent name.
They weren't acquired and paid just to keep building tooling as before, completely ignoring monetization until the end of time.
I'm a user of Bun and an Anthropic customer. Claude Code is great and it's definitely where their models shine. Outside of that, Anthropic sucks: their apps and web are complete crap, borderline unusable, and the models are just meh. I get it: CC's head probably got a power play here, given his department is carrying the company, and his secret sauce, according to marketing from Oven, was Bun. In fact, VSCode's Claude backend is distributed as a bun-compiled binary exe, and the guy has been featured on the front page of the Bun website for at least a week or so. So they bought the kid the toy he asked for.
Anthropic needs urgently, instead, to acquire a good team behind a good chatbot and make something minimally decent. Then make their models work for everything else as well as they do with code.
Anthropic are on track to reach $9BN in annualised revenue by the end of the year, and the six-month-old Claude Code already accounts for $1BN of that.
> Almost five years ago, I was building a Minecraft-y voxel game in the browser. The codebase got kind of large, and the iteration cycle time took 45 seconds to test if changes worked. Most of that time was spent waiting for the Next.js dev server to hot reload.
Why in the hell would anyone be using Next.js to make a 3D game... Jarred has always seemed pretty smart, but this makes no sense. He could've saved so much time and avoided building a whole new runtime by simply not using the completely wrong tool for the job.
True, but where is the fun in that?
Thanks for assuming I “read” about bundlers somewhere, though. I’ve been using (and configuring) them since they existed.
It's obvious why he didn't write the game in x86 assembly. It's also obvious why he didn't burn the game to CD-ROM and ship it to toy stores in big box format. Instead he developed it for the web, saving money and shortening the iteration time. The same question could be asked about next.js and especially about taking the time to develop Bun rather than just scrapping next.js for his game and going about his day. It's excellent for him that he did go this route of course, but in my opinion it was a strange path towards building this product.
Happy to answer any questions
I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.
But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).
Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?
Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.
I've had some fun trying to coax different LLMs into writing usable small throwaway apps. It's hilarious in a way to the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and you have enough patience you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.
In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.
I also consider them a vocal minority, because I don't think they represent the majority of LLM users.
putting everyone using the generated outputs into a sort of unofficial grey market: even when using first-party tools. Which is weird.
Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonable at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.
Do I want my next operating system vibe coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for things it's particularly well suited to.
Here's another one. An SM-24 Geophone + Raspberry PI 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using flask I need a web viewer + conversion on the geophone velocity figures for displacement and acceleration. Properly instructed, they'll create a highly functional version of that with some adjustments/iteration in 15-30 minutes. I basically had them recreate REW RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."
That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.
We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.
If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.
I'll finish with a quote from a blog post [2]:
> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful. However, it fails at novel problems and isn’t practical for my systems programming work.
All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!
> had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types
Folks at Bun are "Zig people" for obvious reasons, and a link was made with Handmade software. This happened multiple times before with Bun specifically, so my response is not a "pivot" of any kind. I've highlighted and contrasted our differences to prevent further associations inside a viral HN thread. That's not unreasonable.
I also explicitly congratulated them for the acquisition.
I'm willing to wager that 99.99% of readers do not associate "Handmade" with the org you're associated with, and that most didn't know it existed until this comment. So yes, "really": without OP replying, it's understandable that the poster you're replying to inferred it had nothing to do with you.
Thank you! I appreciated how you wrote up this clarification.
In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.
If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.
Most of all: They don't go around yelling about their LLM use (or anti-use) because they're not interested in the online LLM wars. They just want to build things with the tools available.
One of my favorite things is describing a bug to an LLM and asking it to find possible causes. It's helped track something down many times, even if I ultimately coded the fix.
For example: what’s in the middle for programming?
For me, 0 is writing 0s and 1s. For others, 0 is making the NAND gates.
And 100 is AI/LLM vibe coding.
So 50/middle would be what exactly? It all depends.
Same for anything, really. Some people I know keep saying "not 8, not 80" to mean the middle.
Like what’s in the middle for amount of coding per day? 12 h? 8h? 2h?
What’s middle for making money? 50k, 500k, 500m?
What's the middle for taking cyanide? 1g? 1kg?
What about water? What about food? What about anything?
As you can see, it's all relative, and whoever says it is trying to push their narrative as the "middle", aka correct, while whoever does more or less is "wrong".
You see how this "middle" concept makes no sense?
Something like that. What do you think?
This sounds so cringe. We are talking about computer code here lol
Bun genuinely made me doubt my understanding of what good software engineering is. Just take a look at their code, here are a few examples:
- this hand-rolled JS parser of 24k dense, memory-unsafe lines: https://github.com/oven-sh/bun/blob/c42539b0bf5c067e3d085646... (this is a version from quite a while ago to exclude LLM impact)
- hand-rolled re-implementation of S3 directory listing that includes "parsing" XML via hard-coded substrings https://github.com/oven-sh/bun/blob/main/src/s3/list_objects...
- MIME parsing https://github.com/oven-sh/bun/blob/main/src/http/MimeType.z...
It goes completely contrary to a lot of what I think is good software engineering. There is very little reuse, everything is ad-hoc, NIH-heavy, verbose, seemingly fragile (there's a lot of memory manipulation interwoven with business logic!), with relatively few tests or assurances.
And yet it works on many levels: as a piece of software, as a project, as a business. Therefore, how can it be anything but good engineering? It fulfils its purpose.
I can also see why it's a very good fit for LLM-heavy workflows.
[1] https://codeberg.org/ziglang/zig/src/commit/be9649f4ea5a32fd...
Is there anything I could do to improve this PR/get a review? I understand you are def very busy right now with the acquisition, but wanted to give my PR the best shot:
Do you think Anthropic might request you implement private APIs?
Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.
But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company are the main reason to the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.
If the answer is performance, how does Bun achieve things quicker than Node?
I contributed to Bun one time for SQLite. I've a question about the licensing. Will each contributor continue to retain their copyright, or will a CLA be introduced?
Thanks
No high-level self updater api is planned right now, but yes for at least the low level parts needed to make a good one
I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?
I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.
Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.
asking the real questions
Congratulations.
> Bun will ship faster.
That'll last until FY 2027. This is an old lie that acquirers encourage the old owner to say because they have no power to enforce it, and they didn't actually say it so they're not on the hook. It's practically a cheesy pickup line, and given the context, it kinda is.
This just isn't the hard part of the product.
Like if I was building a Claude Code competitor and I acquired bun, I wouldn't feel like I had an advantage because I could get more support with like fs.read?
Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.
Best of luck to the team and hopefully the new home will support them well.
How long before we hear about “Our Amazing Journey”?
On the other hand, I would rather see someone like Bun have a successful exit, where the founders started out with a passion project, got funding, built something they were excited about, and then exited, than yet another AI company from non-technical founders, built with the sole purpose of getting funding and exiting.
I don't think they're doing that.
Estimates I've seen have their inference margin at ~60% - there's one from Morgan Stanley in this article, for example: https://www.businessinsider.com/amazon-anthropic-billions-cl...
Not estimate, assumption.
The best one is from the Information, but they're behind a paywall so not useful to link to. https://www.theinformation.com/articles/anthropic-projects-7...
A large portion of the many tens of billions of dollars they have at their disposal (OpenAI alone raised 40 billion in April) is probably going toward this ambition—basically a huge science experiment. For example, when an AI lab offers an individual researcher a $250 million pay package, it can only be because they hope that the researcher can help them with something very ambitious: there's no need to pay that much for a single employee to help them reduce the costs of serving the paying customers they have now.
The point is that you can be right that Anthropic is making money on the marginal new user of Claude, but Anthropic's investors might still get soaked if the huge science experiment does not bear fruit.
Not really. If the technology stalls where it is, AI still takes a sizable chunk of the dollars previously paid to coders, transcribers, translators, and the like.
This is (I would have thought) obviously different from selling dollars for $0.50, which is a plan with zero probability of profit.
Edit: perhaps the question was meant to be about how Bun fits in? But the context of this sub-thread has veered to achieving a $7 billion revenue.
Devs can write code at a very fast rate with AI.
You still need to check it or at least be aware it's a translation. The problem of extra puns remains.
我不会说任何语言，我否认一切 ("I don't speak any language, and I deny everything.")
Incorrect - that was the fraudulent NAV.
An estimate for true cash inflow that was lost is about $20 billion (which is still an enormous number!)
anthropic's unit margins are fine, many lovable-like businesses are not.
"You buy me this, next time I save you on that", etc...
"Raised $19 million Series A led by Khosla Ventures + $7 million"
"Today, Bun makes $0 in revenue."
Everything is almost public domain (MIT) and can be forked without paying a single dollar.
Questionable to claim that the technology is the real reason this was bought.
An analogous example off the top of my head: Shopify hired Rafael Franca to work on Rails full-time.
You have no responsibility for an unrelated company's operations; if that was important to them they could have paid their talent more.
From an ecosystem perspective, acquihires trash the funding landscape. And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward. But that isn’t relevant if the individual pay-off is big.
Every employee is a flight risk if you don't pay them a competitive salary; that's just FUD from VC bros who are getting their playbook (sell the company to the highest bidder and let early employees get screwed) used against them.
Not relevant to acquihires, who typically aren’t hired away with promises of a salary but instead large signing bonuses, et cetera, and aren’t typically hired individually but as teams. (You can’t solve key man problems with compensation alone, despite what every CEO compensation committee will lead one to think.)
> that's just FUD
What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Now you're being nitpicky. Take the vesting period of the sign-on bonus, divide the bonus amount by it, and add that to the regular salary, and you get the effective salary.
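For example (made-up numbers): a $2M signing bonus vesting over 4 years on top of a $300k base works out to $300k + $2M/4 = $800k per year of effective salary.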
> aren’t typically hired individually but as teams.
So? VC bros seem to forget the labor market is also a free market as soon it hurts their cashout opportunity.
> What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future.
These aren't the same things, and nobody negotiating an acquisition or acquihire converts it this way. (I've done both.)
> Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future
It's a personal anecdote. There shouldn't be any uncertainty about what I personally believe. I've literally negotiated acquihires. If you're getting a multimillion dollar payout, you shouldn't be particularly concerned about your standing in the next founding team unless you're a serial entrepreneur.
Broader online comment: invoking FUD seems like shorthand for objecting to something without knowing (or wanting to say) why.
But since they own equity in the current company, you can give them a ton of money by buying out that equity/paying acquisition bonuses that are conditional on staying for specific amounts of time, etc. And your current staff doesn't feel left out because "it's an acquisition" the way they would if you just paid some engineers 10x or 100x what you pay them.
Reminds me of when Tron, the crypto company, bought BitTorrent.
If Bun embraces the sweet spot around edge computing, modern JS/TS and AI services, I think their future ahead looks bright.
Bun seems more alive than Deno, FWIW.
Does that mean anything at all?
OpenAI is a public benefit corporation.
Will this make it more or less likely for people to use Bun vs Deno?
And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
People who like Bun for what it is are probably still going to, and same goes for Deno.
That being said I don't see how Anthropic is really adding long term stability to Bun.
I started out with Deno, and when I discovered Bun, I pivoted. Personally I don't need the NodeJS/NPM compatibility. Wish there was a Bun-lite which was freed of the backward compatibility.
The number of people at big corps who care about their lawsuit, and would switch their IT guidelines from Node to Deno due to such heroic efforts?
Zero.
I use Hono, Zod, and Drizzle which AFAIK don't need Node compat.
IIRC I've only used Node compat once to delete a folder recursively with rm.
I'm not sure it will make much of a difference in the short term.
For those who were drawn to Bun by hype and/or some concerns around speed, they will continue to use Bun.
For me personally, I will continue to use Node for legacy projects and will continue using Deno for current projects.
I'm not interested in Bun for its hype (since hype is fleeting). I have a reserved interest in Bun's approach to speed, but I don't see it being a significant factor, since most JS speed concerns come from downloading dependencies (which is a once-off operation) and terrible JS framework practices (which aren't resolved by changing engines anyway).
----------------------------
The two largest problems I see in JS are:
1. Terrible security practices
2. A lack of a standard library which pushes people into dependency hell
Deno fixes both of those problems with a proper permission model and a standard library.
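For example, Deno's documented permission flags (a usage sketch; the host and paths are invented):

    # Network access only to one host, file reads only from ./config:
    deno run --allow-net=api.example.com --allow-read=./config main.ts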
----------------------------
> And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
I think any predictions between 1-10 years are going to be a little too chaotic. It all depends on how the AI bubble goes away.
But after 10 years, I can see runtimes switching from their current engines to one based on Boa, Kiesel or something similar.
It fades away as a direct to developer tool.
This is a good thing for Deno.
Elaborate? I believe Zig's donors don't get any influence and decision making power.
I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
Faster install and less disk space due to hardlinks? Not really all that important to me. Npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
The reason why you see so many GitHub issues about it is because that's where the development is. Deno is great. Bun is great. These two things can both be great and we don't have to choose sides. Deno has its use case. Bun has its. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs. the world is getting old. Rust isn't any "safer" when Deno can panic too.
We have yet to witness a segfault. Admittedly it's a bunch of microservices and not many requests/s (around 5k avg).
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC languages there's degrees to memory safety.
Despite the page title being "Fullstack dev server", it's also useful in production (Ctrl-F "Production Mode").
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations the only JS engine supposedly allowable in all of iOS.
It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
1. https://github.com/denoland/deno/issues?q=is%3Aissue%20state... 2. https://github.com/oven-sh/bun/issues?q=is%3Aissue%20state%3...
Node.js is a no-brainer for anyone shipping a TS/JS backend. I'd rather deal with poor DX and slightly worse performance than risk fighting runtime related issues on deployment.
Linux needs to be a first-class citizen for any runtime/language toolchain.
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also it has a nice standard library. Not a huge win because that stuff is also doable in Deno, but again, it's just a bit less painful.
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
I haven't had that experience with deno (or node)
Between that and the discord, I have gotten the distinct impression that deno is for "server javascript" first, rather than just "javascript" first. Which is understandable, but not very catering to me, a frontend-first dev.
I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
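For context, proper tail calls are part of ES2015 but, as far as I know, only JavaScriptCore (Bun's engine) ships them, so a tail-recursive function like this hypothetical one recurses safely on Bun and overflows the stack on Node/V8:

    // The recursive call is in tail position, so an engine with proper
    // tail calls reuses the stack frame. Requires strict mode (ES modules
    // are strict by default).
    function count(n: number, acc: number): number {
      if (n === 0) return acc;
      return count(n - 1, acc + 1); // tail call
    }

    console.log(count(1_000_000, 0)); // Bun/JSC: 1000000; Node/V8: RangeError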
johnfn@mac ~ % time deno eval 'console.log("hello world")'
hello world
deno eval 'console.log("hello world")' 0.04s user 0.02s system 87% cpu 0.074 total
johnfn@mac ~ % time bun -e 'console.log("hello world")'
hello world
bun -e 'console.log("hello world")' 0.01s user 0.00s system 84% cpu 0.013 total
That's about 5.7x faster. Yes, it's a microbenchmark. But you said "absolutely zero chance", not "a very small chance".

Lol, yeah, this person is running a performance test on Postgres and attributing the times to JS frameworks.
The tools that the language offers to handle use-after-free are hardly any different from using Purify or Insure++ back in 2000.
Who knows?
Besides, how are they going to get back the money spent on the acquisition?
Many times the answer to acquisitions has nothing to do with technology.
Anthropic chose to use Bun to build their tooling.
Profit in those products has to justify having now their own compiler team for a JavaScript runtime.
Something about moral and philosophical flexibility.
Why? Genuine question, sorry if it was said/implied in your original message and I missed it.
What gives you this impression?
I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.
Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads despite having no factual basis.
I don't mean to disparage you in particular, this is like the 1000th time I've seen this.
More broadly, I think the observation tends to get repeated because C and Zig share a certain elegance and simplicity (even if C's elegance has dated). C++ is many things, but it's hardly elegant or simple.
I don't think anyone denies that Zig can be a C++ replacement, but that's hardly unusual, so can many other languages (Rust, Swift, etc). What's noteworthy here is that Zig is almost unique in having the potential to be a genuine C replacement. To its (and your) great credit, I might add.
>> At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
@GP: This is not a great take. All four languages are oriented around manual memory management. C++ inherits all of the footguns of C, whereas Zig and Rust try to sand off the rough edges.
Manual memory management is and will always remain necessary. The only reason someone writing JS scripts doesn't need to worry about managing their memory is that someone has already done that work for them.
A lot of stuff related to older languages is lost in the sands of time, but the same thing isn’t true for current ones.
Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.
So it seems odd to say that Bun is less dependent on the npm library ecosystem.
[1] It’s possible to use jsr.io instead: https://jsr.io/docs/using-packages
From a long term design philosophy prospective, Bun seems to want to have a sufficiently large core and standard library where you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun specific features won't run on Node. It's the "embrace, extend, ..." approach.
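For example, Bun's documented `Bun.serve` API (a sketch): standard Request/Response types, but served through a Bun-only entry point, which is exactly the dynamic described above.

    // Runs on Bun but not Node: `Bun` is a global provided by the runtime.
    const server = Bun.serve({
      port: 3000,
      fetch(req) {
        return new Response(`Hello from ${new URL(req.url).pathname}`);
      },
    });
    console.log(`Listening on http://localhost:${server.port}`);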
Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
Here are the Bun APIs:
https://bun.com/docs/runtime/bun-apis
Here are the Deno APIs:
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
Telling prospective employees that if you're not ready to work 60-hour weeks, then what the fuck are you doing here? for one.
> Zig does force some safety with ReleaseSafe IIRC
which Bun doesn't use, choosing to go with `ReleaseFast` instead.
For deploys, running the attached Terraform script usually takes more time.
So while a speed increase is welcome, I don't feel it gives me much of a boost.
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun wasn't one of them.
https://github.com/aws/aws-cdk/issues/31753
This wasn't fixed till the end of 2024 and as you can see, only accidentally merged in but tolerated. It was promptly broken by a bun breaking change
https://github.com/aws/aws-cdk/issues/33464
but don't let Amazon's own incompetence be the confirmation bias you were looking for about using a different package manager in production
you can use SST to deploy cloud resources on AWS and any cloud, and that package works with bun
Maybe an easier first step would be to open source Claude Code...?
Codex has the opposite issue. It has an open client, which is relatively pointless, because it will accept one system prompt and one prompt only.
How would the payout split work? It wouldn’t seem fair to the investors if the founder profited X million while the investors get their original money returned. I understand VC has the expectation that 99 out of 100 of investments will net them no money. But what happens in the cases where money is made, it just isn’t profitable for the VC firm.
What’s to stop everyone from doing this? Besides integrity, why shouldn’t every founder just cash out when the payout is life-changing?
Is there usually some clause in the agreements like “if you do not return X% profit, the founder forfeits his or her equity back to the shareholders”?
Additionally, depending on the round, they also have multiples, like 2x, meaning they get at least 2x their investment back before anyone else gets anything.
I did end up fixing Node.js compatibility later but it was extra work. Felt like they just created busy-work. Node.js maintainers should stop deprecating perfectly good features and complicating their modules.
Investors must be happy because Bun never had to find out how to become profitable.
It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
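(The kind of throwaway script I mean, as a hypothetical sketch: prompted in a minute, run once, deleted.)

    // Ad-hoc micro-benchmark: compare two ways of summing an array.
    const data = Array.from({ length: 1_000_000 }, (_, i) => i);

    function bench(label: string, fn: () => number) {
      const start = performance.now();
      const result = fn();
      console.log(`${label}: ${(performance.now() - start).toFixed(1)}ms (sum=${result})`);
    }

    bench("for-loop", () => {
      let sum = 0;
      for (let i = 0; i < data.length; i++) sum += data[i];
      return sum;
    });
    bench("reduce", () => data.reduce((a, b) => a + b, 0));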
Put another way, at least in my workflow and at my workplace, the volume of code has increased, and most of that increase comes from new code that would not have been written if not for AI, and a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, it's uneven penetration, AI helps more with tasks that are well-described in the training set (webapps, data science, Linux admin...) compared to e.g. issues arising from quirky internal architecture, Rust, etc.
It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.
Sonnet 3.7 wasn't quite at this level, but we are now. You still have to know what you're doing mind you and there's a lot of ceremony in tweaking workflows, much like it had been for editors. It's not much different than instructing juniors.
> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.
You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.
Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!
They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.
and
Implementing the Decisions
are complementary, one of these is being commoditised.
And, in fact, decimated.
Personally I am benefitting almost beyond measure because I can spend my time as the architect rather than the builder.
> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced
(He then said it would continue improving, but this was not in the 12 month prediction.)
Source interview: https://www.youtube.com/live/esCSpbDPJik?si=kYt9oSD5bZxNE-Mn
I posted a link and transcription of the rest of his "three to six months" quote here: https://news.ycombinator.com/item?id=46126784
You can see my site here, if you'd like: https://chipscompo.com/
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
I think it's safe to say that people singularly focused on the business value of software are going to produce acceptable slop with AI.
You have to be set up with the right agentic coding tool, agent rules, agent tools (MCP servers), dynamic context acquisition, and a workflow (working with the agent from a plan rather than simply prompting and hoping for the best).
But if you're lazy and don't put the effort in to understand what you're working with and how to approach it with an engineering mindset, you'll be left on the outside complaining and telling people how it's all hype.
"For now, I’ll go dogfood my shiny new vibe-coded black box of a programming language on the Advent of Code problem (and as many of the 2025 puzzles as I can), and see what rough edges I can find. I expect them to be equal parts “not implemented yet” and “unexpected interactions of new PL features with the old ones”.
If you’re willing to jump through some Python project dependency hoops, you can try to use FAWK too at your own risk, at Janiczek/fawk on GitHub."
That doesn't sound like some great success. It mostly compiles and doesn't explode. Also I wouldn't call a toy "innovation" or "revolution".
My best writing on this topic is still this though (which doesn't include a video): https://simonwillison.net/2025/Mar/11/using-llms-for-code/
I was honestly baffled how fast Claude knocked this out.
On the opposite spectrum it's just that Claude and Bun are great technologies that joined forces.
Programming languages all are a balance between performance/etc and making it easy for a human to interact with. This balance is going to shit as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).
"the full platform"
there are more languages than ts though? Acquisition of Apple's Swift division incoming?
https://github.blog/news-insights/octoverse/octoverse-a-new-...
Claude will likely be bundled up nicely with Bun in the near future. I could see this being useful to let even a beginner use claude code.
Edit:
Lastly, what I meant originally is that most front-end work happens with tools like Node or Bun. At first I was thinking they could use it to speed up generating/pulling JS projects, but it seems more likely Claude Code and Bun will have a separate project where they integrate the two: Claude Code taking full advantage of Bun itself, and Bun focusing on tight coupling to ensure Claude Code runs optimally.
Frontend work by definition doesn't happen with either Node or Bun. Some frontend tooling might use a JS runtime, but the value add of that is minimal, and a lot of JS tooling is actually being rewritten in Rust for performance anyway.
I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being held hostage to VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.
(actually this reminds me of Harry giving Dobby a sock: on so many levels!)
>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.
>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:
> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe
> [32ms] bundle 60 modules
> [439ms] compile dist/myprogram.exe
It detects my dynamic imports of JSON assets (language files, default configuration) and bundles them accordingly into the executable. I don't need a separate file to declare assets or imports, or do anything other than run this command line. I don't need to survey the various bundlers to find one that works with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
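For illustration, here's a minimal sketch of the kind of layout I mean (the file and key names are made up); per the above, the bundler discovers both the static and the dynamic JSON imports and embeds them in the binary:

    // cli.ts -- assumes ./config/defaults.json and ./locales/{en,de}.json exist
    import defaults from "./config/defaults.json";

    const locale = process.env.LOCALE === "de" ? "de" : "en";
    // dynamic import of a language file, embedded at compile time
    const strings = await import(`./locales/${locale}.json`);

    console.log(strings.default.greeting ?? defaults.greeting);

A single `bun build ./cli.ts --compile --outfile dist/cli` then produces an executable that runs without the JSON files on disk.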
Node is death by a thousand cuts compared to the various experiences offered by Bun.
Node also adds quite a bit of startup latency over Bun, and is just not that pleasant for making CLI scripts.
Why wouldn't they consider their options for bundling that version into a single binary using Node.js tooling before adopting Bun?
I'm not sure if Joyent have any significant role in Node.js maintenance any more.
regardless, it's certainly not MS.
Claude Code is a $1B+ cash machine and Anthropic directly uses Bun for it.
Acquiring Bun lowers the risk of the software being unmaintained as Bun made $0 and relied on VC money.
Makes sense, but this is just another day in San Francisco of a $0 revenue startup being bought out.
Anything is greater than 0
In the article they write about the early days:
> We raised a $7 million seed round
Why do investors invest in people who build something that they give away for free? Why invest in a company that has the additional burden of developing Bun; why not in a company that does only the hosting?
There's also the trick Deno has been trying, where they can use their control of the core open source project to build features that uniquely benefit their cloud hosting: https://til.simonwillison.net/deno/deno-kv#user-content-the-...
Whether your userbase or the current CEO likes it or not.
> going to replace all software developers with AI
No?
> building a product that is supposed to make software cost 0 right
No?
Except Node's author already wrote its replacement: Deno.
That perspective following “in two-three years” makes me shudder, honestly.
Another option is that this was an equity deal where Bun shareholders believe there is still a large multiple of potential upside in the current Anthropic valuation.
Plus many other scenarios.
Why not something like C#: native, fast, cross-platform, strongly typed, great tooling, supports both scripting (i.e., single file-based) and compiling to a binary with no dependencies whatsoever (NativeAOT), great errors and error stacks; the list goes on.
All of which is great for helping an AI recover during its iterations of generating something useful.
Genuinely perplexed.
Might also be a context window thing. Idk how much boilerplate C# has, but others like Java spam it.
I dislike it also.
Like I’ve said: NativeAOT
https://learn.microsoft.com/en-us/dotnet/core/deploying/nati...
On Linux only with CGO_ENABLED=0, and good luck finding a non-web-related 3rd-party module that can be used with CGO disabled.
So many of the issues with it seem to be because ... they wrote the damn thing in JavaScript?
Claude is pretty good at a constrained task with tests -- couldn't you just port it to a different language? With Claude?
And then just ... the huge claude.json which gets written on every message, like ... SQLite exists! Please, please use it! The scrollback! The Keyboard handling! Just write a simple Rust or Go or whatever CLI app with an actual database and reasonable TUI toolkit? Why double down and buy a whole JavaScript runtime?
And if you're expressing hierarchical UI, the best way to do it is HTML and CSS. It has the richest ecosystem, and it is one of the most mature technologies in existence. JS / TS are the native languages for those tools. Everything is informed by this.
Of course, there are other options. You could jam HTML and CSS into (as you mention) Rust, or C, or whatever. But then the ecosystem is extremely lacking, and you're reinventing the wheel. You could use something simpler, like QML or handrolled. But then you lose the aforementioned breadth of features and compatibilities with all the browser code ever written.
TypeScript is genuinely, for my money, the best option. The big problem is that the terminal backends aren't mature (as you said, scrollback, etc). But, given time and money, that'll get sorted out. It's much easier to fix the terminal stuff than to rewrite all of the browser.
I don't know why it's even necessary for this.
https://github.com/atxtechbro/test-ink-flickering
Issue on Claude Code GitHub:
Stay in distribution and in the wave as much as possible.
Good devex is all you need. Claude code team iterates and ships fast, and these decisions make total sense when you realize that dev velocity is the point.
Rust, Go, whatever -- writing a good TUI isn't that hard of a problem. Buying an entire VC funded JS runtime company isn't how you solve it.
moreover, now they can make investments in order to make it an even more efficient and secure runtime for model workspaces.
At the very least there must be some part of the agent tasks that can be run in JS, such as REST APIs, fetching web results, parsing CSV into a table, etc.
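As a rough sketch of what that can look like (the URL and column layout here are invented), the agent can emit and run a small script so that only a one-line summary ever enters the model's context:

    // summarize.ts -- fetch a CSV and reduce it before it reaches the model
    const res = await fetch("https://api.example.com/orders.csv"); // hypothetical endpoint
    const [_header, ...rows] = (await res.text())
      .trim()
      .split("\n")
      .map((line) => line.split(","));

    // assume the third column holds an order amount
    const total = rows.reduce((sum, r) => sum + Number(r[2] ?? "0"), 0);
    console.log(`${rows.length} rows, total=${total}`); // the only output the model sees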
Being able to create an agent in any language to run on any hardware has always been possible, hasn't it?
I guess we should wait for some opt-out telemetry some time soon. It'll be nothing too crazy at first, but we'll see how hungry they are for the data.
Bun is the product that depends on providing that good, stable, cross-platform JS runtime and they were already doing a good job. Why would Anthropic's acquisition of them make them better at what they were already doing?
Because now the Bun team don't have to redirect their resources to implementing a sustainable business model.
No they don't.
https://www.wheresyoured.at/why-everybody-is-losing-money-on...
IOW look where the puck is going.
[1] https://ziglang.org/code-of-conduct/#strict-no-llm-no-ai-pol...
That's like saying GCC and NodeJS are culturally apart, as if that has significant bearing on either?
That's why I'm not personally too nervous about the strategic risk to the Python community of having such a significant piece of the ecosystem from a relatively young VC-backed company.
uv seems like a great tool, but I remember thinking the same about pipenv, too.
So, yeah, uv is nice, but for me didn't fundamentally change that much.
I get how it might not be as useful in a production deployment where the system/container will be setup just for that Python service, but for less structured use-cases, `uv` is a silver bullet.
#2, if you don't like uv, you can switch to something else.
uv probably has the least moat around it of anything. Truly a meritocracy: people use it because it's good, not because they're stuck with it.
Python is doing great, other than still doing baby steps into having a JIT in CPython.
How was Go involved there before Zig?
The first hints of what became Bun came when Jarred experimented with porting that to Zig.
> Being part of Anthropic gives Bun: Long-term stability.
Let's see. I don't want to always be the downer but the AI industry is in a state of rapid flux with some very strong economic headwinds. I wouldn't confidently say that hitching your wagon to AI gives you long term stability. But as long as the rest of us keep the ability to fork an open source project I won't complain too much.
(for those who are disappointed: this is why you stick with Node. Deno and Bun are both VC funded projects, there's only one way that goes. The only question is timeline)
Sure. But everything is relative. For instance, Node has much more likelihood of long term stability than Bun, given its ownership.
Given how many more dependencies you need to build/maintain a Node app, your Bun application has a better chance of long term stability.
With Node almost everything is third party (db driver, S3, router, etc) and the vast majority of NPM deps have dozens if not hundreds of deps.
I’m doubtful that alone motivated an acquisition; it was surely a confluence of factors, but Bun is definitely a significant dependency for Claude Code.
If they don't want to maintain it: a GitHub fork with more motivated people.
Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?
npm install -g @anthropic-ai/claude-code
I thought Claude Code just used Node.js? I didn't realise the recommended install used a different runtime.

    curl -fsSL https://claude.ai/install.sh | bash
That install script gives you a single binary which is created using Bun.
Feels like maybe AI companies are starting to feel the questions on their capital spending? They wanna show that this is a responsible acquisition.
Acquisition seems like a large overhead and maybe a slight pivot to me.
Put the Bun folks directly on that please and nothing else.
Congrats to Jarred and the team!
This is just completely insane. We went through more than a decade of performance competition in the JS VM space, and the _only_ justification that Google had for creating V8 was performance.
> The V8 engine was first introduced by Google in 2008, coinciding with the launch of the Google Chrome web browser. At the time, web applications were becoming increasingly complex, and there was a growing need for a faster, more efficient JavaScript engine. Google recognized this need and set out to create an engine that could significantly improve JavaScript performance.
I guess this is the time we live in. Vibe-coded projects get bought by vibe-coded companies and are congratulated in vibe-coded comments.
this is so far from the truth. Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
> a decade of performance competition in the JS VM space
this was a rising tide that lifted all boats, including Node, but Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves.
Sure, I definitely will not throw projects like Zig into that bucket, and I don't actually think Bun is vibe-coded. At least that _used_ to be true, we'll see I guess...
Don't read a snarky comment so literally ;)
That sounds like an implementation difference, not an architectural difference. If they wanted to, what would prevent Node or a third party from implementing parts of the stdlib in a faster language?
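For what it's worth, native interop is already cheap in both runtimes; Bun even exposes it directly to users via bun:ffi. A minimal sketch (the library and its symbol are hypothetical):

    import { dlopen, FFIType, suffix } from "bun:ffi";

    // bind one symbol from a native library (libfast.so / .dylib / .dll)
    const { symbols } = dlopen(`./libfast.${suffix}`, {
      add: { args: [FFIType.i32, FFIType.i32], returns: FFIType.i32 },
    });

    console.log(symbols.add(2, 3)); // calls straight into compiled code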
Early Node, f.ex., had a multi-process setup built in; Node initially was about pushing the async-IO model together with a fast JS runtime.
Why Bun (and partially Deno) exists is because TypeScript helps so damn much once projects get a tad larger, but using it with Node's hot-reloading was kinda slow: multiple seconds from saving a file until your application reloads. Even mainline Node nowadays has direct .ts file loading and type erasure to quicken the workflow.
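A minimal sketch of that last point (versions hedged: Node 22.6 added this behind a flag, and newer releases strip types by default):

    // greet.ts -- uses only erasable type syntax, so no transform step is needed
    function greet(name: string): string {
      return `Hello, ${name}!`;
    }
    console.log(greet("Bun"));

    // node --experimental-strip-types greet.ts   (Node 22.6+, flagged)
    // bun greet.ts                               (Bun, supported from the start)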
The reality is that the insane "JS ecosystem" will rally around whatever is the latest hotness.
v8 is one of the most advanced JIT runtimes in the world. A lot of people have spent a lot of time focusing on its performance.
Using bun on a side project reinvigorated my love of software development during a relatively dark time in my life, and part of me wonders if I would have taken the leap onto my current path if it weren't for the joy and feeling of speed that came from working with bun!
That's 100% what happened to Bun. It's useful (like really useful) and now they're getting rewarded
1. acquire talent.
2. control the future roadmap of bun.
i think it's really 1.
...but hey, things are different during a bubble.
And apparently the submission's source; it's the only org, as far as I can tell, that anticipated this: https://www.theinformation.com/articles/anthropic-advanced-t...
https://github.com/oven-sh/bun/pull/24578
So far, someone from the bun team has left a bunch of comments like
> Poor quality code
...and all the tests still seem to be failing. I looked through the code that the bot had generated and to me (who to be fair is not familiar with the bun codebase) it looks like total dogshit.
But hey, maybe it'll get there eventually. I don't envy "taylordotfish" and the other bot-herders working at Oven though, and I hope they get a nice payout as part of this sale.
> that the Bun Claude bot created a PR for about 3 weeks ago
The PR with bad code that's also been ignored was made by the bot that Bun made and brags about in their acquisition post.
...Did you miss the part where Bun used Claude to generate that PR? :)
1. User krig reports an issue against the Bun repo: https://github.com/oven-sh/bun/issues/24548
2. Bun's own automated "bunbot" filed a PR with a potential fix: https://github.com/oven-sh/bun/pull/24578
3. taylordotfish (not an employee of Bun as far as I can tell, but quite an active contributor to their repo) left a code review pointing out many flaws: https://github.com/oven-sh/bun/pull/24578#pullrequestreview-...
This will make sure Bun is around for many, many, years to come. Thanks Anthropic.
Why Bun?
Easy to setup and go. bun run <something.ts>
Bells and whistles. (SQL, Router, SPA, JSX, Bundling, Binaries, Streams, Sockets, S3)
Typescript Supported. (No need to tsc, bun can transpile for you)
Binary builds. (single executables for easy deployment)
Full Node.js Support. (The whole API)
Full NPM Support. (All the packages)
Native modules. (90% and getting better thanks to Zig's interop)
S3 File / SQL Builtin. (Blazingly Fast!)
You should try it. Yes, others do these things too, but we're talking about Bun.
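To make that concrete, here's a small sketch of a few of those bullets in one file (the table and route are made up):

    // app.ts -- HTTP server plus built-in SQLite; run with `bun app.ts`
    import { Database } from "bun:sqlite";

    const db = new Database(":memory:");
    db.run("CREATE TABLE hits (path TEXT)");
    const insert = db.query("INSERT INTO hits (path) VALUES (?)");

    Bun.serve({
      port: 3000,
      fetch(req) {
        const path = new URL(req.url).pathname;
        insert.run(path);
        return new Response(`hello from ${path}`);
      },
    });

    // and a single deployable binary: bun build app.ts --compile --outfile dist/app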
And even in packages with full support you can find many GitHub issues where Bun behaves differently, which leads to some bugs.
Well, until the bubble bursts and Anthropic fizzles out or gets acquired themselves.
* Get run by devs with filesystem permissions
* Get bundled into production
(1) Bun is what technical startups should be. Consistently excellent decisions, hyper focused on user experience, and a truly excellent technical product.
(2) We live in a world where TUIs are causing billion dollar acquisitions. Think about that. Obviously, Bun itself is largely orthogonal to the TUIs. Just another use case. But also obviously, they wouldn't have been acquired like this without this use case.
(3) There's been questions of whether startups like Bun can exist. How will they make money? When will they have to sell out one of the three principles in (1) to do so? The answer seems to be that they don't; at least, not like we expected, and in my opinion not in a sinister way.
A sinister or corrupting sell out would be e.g. like Conan. What started as an excellent tool became a bloated, versioned mess as they were forced to implement features to support the corporate customers that sustained them.
This feels different. Of course, there will be some selling out. But largely the interests of Anthropic seem aligned with "build the best JS runtime", since Anthropic themselves must be laser focused on user experience with Claude Code. And just look at Opencode [^1] if you want to see what leaning all the way into Bun gets you. Single file binary distribution, absurdly fast, gorgeous. Their backend, OpenTUI [^2], is a large part of this, and was built in close correspondence with the Bun folks. It's not something that could exist without Bun, in my opinion.
(4) Anthropic could have certainly let Bun be a third party to which they contributed. They did not have to purchase them. But they did. There is a strange not-quite altruism in this; at worst, a casting off of the exploitation of open source we often see from the biggest companies. Things change; what seems almost altruistic now could be revealed to be sinister, or could morph into such. But for now, at least, it feels good and right.
[^1]: https://github.com/sst/opencode [^2]: https://github.com/sst/opentui
This is promising for Astral et al., whom I really like but whose sustainability I've worried about. It does point to the importance of being as close to the user as possible.
Prior to that GitHub Copilot was either the VS Code IDE integration or the various AI features that popped up around the GitHub.com site itself.
and when this bubble pops, down goes Bun
> Long-term stability. a home and resources so people can safely bet their stack on Bun.
Isn't it the opposite? Now we've tied Bun to "AI" and if the AI bubble or hype or whatever bursts or dies down it'd impact Bun.
> We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
There's honestly a higher chance of Bun sticking out that runway than the current AI hype still being around.
Nothing against Anthropic but with the circular financing, all the debt, OpenAI's spending and over-valuations "AI" is the riskier bet than Bun and hosting.
I didn’t say it was definitely the end or definitely would end up worse, just that someone who’s followed tech news for a while is unlikely to take this as increasing the odds Bun survives mid-term. If the company was in trouble anyway, sure, maybe, but not if they still had fourish years in the bank.
“Acquired product thriving four years later” isn’t unheard of, but it’s not what you expect. The norm is the product’s dead or stagnant and dying by then.
Is there any historical precedent of someone doing that?
The effective demand for Opus 4.5 is bottomless; the models will only get better.
People will always want a code model as good as we have now, let alone better.
Bun securing default status in the best coding model is a win-win-win
It does matter. The public ultimately determines how much they get in funding if at all.
> The effective demand for Opus 4.5 is bottomless; the models will only get better.
The demand for the Internet is bottomless. Doesn't mean Dotcom didn't crash.
There are lots of scenarios in which this can play out, e.g. Anthropic fails to raise a round because money dried up, or OpenAI buys Anthropic but decides they don't need Bun and closes out the project.
To some degree, having “opinionated views on tech stacks” is unavoidable in LLMs, but this seems like it moves us towards a horrible future.
Imagine if Claude (or Gemini) let you, as a business, pay to “prefer” certain tech in generated code?
It's Google Ads all over again.
The thing is, if they own bun, and they want people to use bun, how can they justify not preferencing it on the server side?
…and once one team does it… game on!
It just seems like a sucky future, that is now going to be unavoidable.
Why?
[1] www.bunn.com
Regards.
> We’re hiring engineers.
Careers page:
> Sorry, no job openings at the moment.
/s
It’s wild what happens when a generation of programmers doesn’t know anything except webdev. How far from grace we have fallen.
That's quite a bit harder if your tool is built using a compiled language like Go.
Thank you for showing exactly why acquisitions like this will continue to happen.
If you don't support tools like Bun, don't be surprised to see them raise money from VCs and get bought out by large companies.