dts · 13 hours ago
A lot of people seem confused about this acquisition because they think of Bun as a Node.js-compatible bundler/runtime and just compare it to Deno/npm. But I think it's a really smart move if you consider where Bun has been pushing lately, which is a kind of cloud-native, self-contained runtime (S3 API, SQL, streaming, etc.). For an agent like Claude Code this trajectory is really interesting, because you are creating a runtime where your agent can work inside cloud services as fluently as it currently does with a local filesystem. Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases.
Yea, they just posted this a few days ago:

https://www.anthropic.com/engineering/advanced-tool-use

They discussed how running generated code is better for context management in many cases. The AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, thus reducing context needs. Furthermore, if you can run the code right next to the server where the data is, it's all that much faster.
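As a hypothetical sketch of that pattern (the log contents and summary shape here are invented, not from Anthropic's post): instead of a large log file entering the model's context, the agent writes and runs a small script and only the summary comes back.

```typescript
// Invented example log; in the real scenario this could be a 10MB file the
// agent never needs to see in full.
const log = [
  "INFO  started",
  "ERROR db timeout",
  "ERROR db timeout",
  "WARN  slow query",
  "ERROR auth failed",
].join("\n");

// Count error frequencies; only this tiny object would re-enter the context.
const errorCounts: Record<string, number> = {};
for (const line of log.split("\n")) {
  if (line.startsWith("ERROR")) {
    const msg = line.slice("ERROR".length).trim();
    errorCounts[msg] = (errorCounts[msg] ?? 0) + 1;
  }
}

console.log(errorCounts); // { "db timeout": 2, "auth failed": 1 }
```

The token cost of the result is a few dozen tokens regardless of how big the input was, which is the whole point of the approach.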

I see Bun like a Skynet: if it can run anywhere, the AI can run anywhere.

Java can run anywhere too
Java is owned by Oracle, and you sure don't want to do business with that company. There's a reason PostgreSQL is slowly eating their lunch.
This is FUD. Java has many open source implementations and nobody needs to deal with Oracle.
Even if we postulate that the fear is unwarranted and irrational, the fear is still real, based on Oracle's history of lawsuits, and so the explanation still holds.
gf000 · 1 minute ago
It explains nothing.

Java is possibly the safest bet on the future: it's open source both in spec and in the most common implementation (OpenJDK), and it is so widely used that there are multiple FAANG companies critically dependent on Java that alone could continue the development of the platform were anything to happen.

Besides, Oracle has been a surprisingly good steward of the language.

jve · 2 hours ago
Well, except Google, which got sued for US$8.8 billion because they decided to use specific API signatures but provide their own implementation...?!
· 45 minutes ago
Come on, that's a completely different story: Google made their own independent SDK using Java's APIs but incompatible with Java. Nobody's arguing you should do that.

Plus last time I checked Oracle lost that lawsuit.

... and Oracle lost
Anywhere the correct Java version is installed correctly, an important caveat.
Java’s cardinal sin was not owning the OS like Microsoft’s C# to force end-users to update the framework. Oracle really didn’t understand what they were sitting on with their Ubuntu competitor Solaris.
This has no longer been the case for C# for 10 years since the release of .NET Core and (now) .NET. The runtime is no longer bundled with the OS.

This is only true for older .NET Framework applications.

Topfi · 2 hours ago
Isn’t it still updated post-installation via Windows Update, as they said ("force end-users to update the framework")?
nly · 1 hour ago
Only patches, it doesn't automatically install new major versions
szundi · 48 minutes ago · [dead]
I ain’t got a horse in this race; I just put 2 and 2 together to get 4. I’m sure Java is fine, but they didn’t buy Java.
It’s relevant enough that I feel I can roll out this bash.org classic…

<Alanna> Saying that Java is nice because it works on all OS's is like saying that anal sex is nice because it works on all genders

EDIT: someone has (much to my joy) made an archive of bash.org so here is a link[1], but I must say I’m quite jealous of today’s potential 1/10,000[2] who will discover bash.org from my comment!

[1] https://bash-org-archive.com/?338364

[2] https://xkcd.com/1053

Perhaps my biggest claim to fame is being #11 on the bash.org top 100.
Hah, found it: https://bash-org-archive.com/?207373

So how did it work back in the day, people would just submit text and it would get upvoted? I always assumed like half of them were just made up.

Yep, exactly that. I recall that the voting was interesting because it was just ranked on absolute number of votes, no time decay or anything, so it would take quite some time for a new contender to accumulate votes to "compete" on the leaderboard. I don't remember if there were even accounts or if anyone could just vote repeatedly, modulo some IP or cookie-based limits.

As far as provenance goes, I assume a lot of them were made up too, but this one was real.

As one of the lucky 1/10000, holy shit that was amazing. Thank you.

To everyone else: I acknowledge that this post is not adding value, but if you were one of the lucky 1/10,000 you would understand that I have no choice.

Not discovered from scratch, but I was a big fan when it was alive and kicking. Went there from time to time to get some mood boosters, so I was very sad when I found that the original one is gone. Thanks a lot for sharing that bash-org-archive.com exists; what great fun going down this memory lane.
I’ve been browsing the archive since I left that comment, they really were the good old days weren’t they. IRC was my introduction to geekdom, and I don’t think it would be unreasonable to say it shaped my life. Here I am 30-ish years later, an old man yelling at clouds — and I wouldn’t change much!

If anyone ever requested/used an eggdrop(?) bot from #farmbots or #wildbots on quakenet then thanks to you too; that was certainly one of the next steps down the path I took. A (probably very injectable) PHP blog and a bunch of TCL scripts powering bots, man I wish I could review that code now.

That's hilarious. My comment is mostly a joke, but also trying to say that "runs everywhere" isn't that impressive anymore.
Yeah everyone proclaims to IANAL nowadays.
wait - how do you search the quotes??
I don’t think there is a search function, I got the exact wording from a web search (I think “bash Java anal”, arguably a dangerous search!) and then after submitting I wondered if there is an archive of the quotes.
I found another appropriate XKCD: https://xkcd.com/1682/
827a · 10 hours ago
Java is not for sale.
Java can be depended on without buying anything.
Oracle lawyers want you to think so.
Ahem, Temurin/OpenJDK disagree
Java's price is your time, which you will need tons of, as Java is highly verbose. The ultimate enterprise language.
try java 25, and update your priors :)
No amount of updates will wash away the stink of Oracle from Java.
This is HN, where unless something is written in Rust or Zig, people will usually hate on it. They would rather pump a CLI tool than any software of sizable scale.
Again, Temurin/OpenJDK disagree
ahoka · 1 hour ago
Lipstick on a pig.
sfn42 · 1 hour ago
This is such a crappy point. People say it's better now but even in java 8 it's just BS. Oh boo hoo I have to write a few extra words here and there. Woe is me. The IDE will autogenerate the boilerplate for you, you don't even have to write it yourself. And once it's there it's actually useful, there's a reason it exists.
Not in the browser, and no – webassembly doesn't count, otherwise you can say the same about Go and others.
Wasm does count, and you can say the same about Go and others.
Sure, they run, but they can't touch the DOM or do much that's very interesting without JavaScript.
Sammi · 10 hours ago
Js just runs as is. Atwood's Law and all that.
I remember a time ...
Why doesn’t wasm count?
Compile step makes things more complicated.
As opposed to minimized JS.
You don’t need to minimize JS to be able to run it.
why would the tool minify the script it generated?
Sammi · 10 hours ago
Same problem, different orders of magnitude.
· 10 hours ago
May I ask, what is this obsession with targeting the browser? I've also noticed a hatred of k8s here, and while I truly understand it, I'd take the complication of managing infrastructure over frontend fads any day.
HN has a hatred of K8s? That’s new to me
This is a site for startups. They have no business running k8s, in fact, many of the lessons learned get passed on from graybeards to the younger generation along those lines. Perhaps I'm wrong! I'd love to talk shop somewhere.
smt88 · 6 hours ago
K8s is used in many situations it shouldn't be, and a lot of HNers (including me) are bitter about having to deal with the resulting messes
Java did run in the browser once... it was embedded directly in the browser (there was also NPAPI).

You could also run Java with JS if you are brave enough: https://kreijstal.github.io/java-tools/

Java runs in the browser currently, after a transpilation step (same as .ts):

https://teavm.org/

run code anywhere hamstrung by 90s syntax and hidden code indirections
Haven’t checked in on Java in a while?
ozim · 10 hours ago
From what I gather everyone is still stuck on Java 8 so no need to check?
No, everyone isn’t. You really should check.
Where do you gather this from? We are a startup, on Java and on 25.
Why didn't you choose something more modern/sensible? Go/Kotlin/anything else on the planet?
foo4u · 9 hours ago
This is absolutely untrue. Code from JDK 8 runs fine on JDK 25 (just released LTS). It is true that if you did something silly that locks you into certain dependency versions, you may be stuck, but this is not the majority of applications.
I tried to check in on Java recently but got a NullPointerException when using the AbstractSingletonProxyFactoryBean !
I'll never understand people making fun of verbosity. So you really prefer short, ambiguous, opaque and unpronounceable abbreviations? Really?!
For me at least, I find it easier to see the shape of algorithms, control flow, and expressions when the variable names are concise. But this also might be because I have found Go to fit my use-cases and thinking style well, and Go programs tend to follow this naming convention.

For example, if I have a struct `PageEntity` with a field `Id`, and I am iterating over a slice of such IDs, I would prefer using `pid` instead of `pageEntityId` as the variable name. But Java APIs and conventions tend to use these longer names, so I find it takes more thinking to remember the different names instead of quickly seeing the behavior of code at a glance.

Java also tends to have a lot of inheritance which results in these long glued-together names and makes it harder to follow program flow because behaviors get introduced in multiple different places (i.e., it has the opposite of locality of behavior).

But those are just my opinions and experiences! I know many people love Java, and it is a versatile and powerful language.

i haven't. do people still use the "class" keyword?
Is that the issue people have with Java?
mythz · 7 hours ago
AI tools value simplicity, fast bootstrapping and fast iteration, which rules out the JVM, which has the worst build system and package repositories I've ever had the displeasure of needing to use. Checking in Gradle binaries in 2025? Having to wait days for packages to sync? Windows/Linux Gradle wrappers for every project? Broken builds and churn after every major upgrade. It's broken beyond repair.

By contrast `bun install` is about as good as it gets.

pjmlp · 2 hours ago
Gradle is something that only Android devs should be using, and only because Google imposes its use. Had it not been for Google and the Android Gradle plugin, almost no one would care.

Please give me Java tools over C, C++, JavaScript or Python ones, any day of the week.

Only .NET and Rust compare equally in quality of DX.

AI tools value simplicity?!?

Check out the Python dependency-management chaos: what is the proposal this month, and from which AI startup doing Python tools in Rust?

Apples and oranges. Maven is leagues beyond npm. Screw Gradle.

How many mass security incidents have there been with npm just the last few weeks?

sfn42 · 2 hours ago
It's just too bad bun is based on literally the worst programming language that's in actual use.
mythz · 1 hour ago
TypeScript's one of the best, and bun runs it natively.
sfn42 · 58 minutes ago
Typescript is a band aid on the gaping gushing wound that is JavaScript. It attempts to fix one problem JS has and it doesn't really succeed.
mythz · 38 minutes ago
Sounds like cope. Great type system, language server, IDE integration, compiler feedback, tooling ecosystem, hot-reload DX: all things that made it the most used programming language on GitHub.
By using Gradle you certainly didn't do yourself a favor.
I am unsure why people feel the need to say this about Gradle. If you aren't doing anything fancy, the most you will touch is the repositories and dependencies blocks of your build script; perhaps you'll add publishing or shadow plugins and configure them accordingly, but that has never been simpler than it is now. Gradle breaks when you feel the need to unnecessarily update things like the wrapper version or plugins without considering the implications. The wrapper is bundled in, so you don't have to try to make a build script work with whatever version you might have installed on your system, and toolchain resolution means you don't even need to install an appropriate JDK version, as it does that for you.

If the build script being a DSL is the issue, they're even experimenting around declarative gradle scripts [0], which is going to be nice for people used to something like maven.

0: https://declarative.gradle.org/

And yet. None of these issues exist in Maven to begin with.
What do you mean by "context" here?
Under "Programmatic Tool Calling"

> The challenge

> Traditional tool calling creates two fundamental problems as workflows become more complex:

> Context pollution from intermediate results: When Claude analyzes a 10MB log file for error patterns, the entire file enters its context window, even though Claude only needs a summary of error frequencies. When fetching customer data across multiple tables, every record accumulates in context regardless of relevance. These intermediate results consume massive token budgets and can push important information out of the context window entirely.

> Inference overhead and manual synthesis: Each tool call requires a full model inference pass. After receiving results, Claude must "eyeball" the data to extract relevant information, reason about how pieces fit together, and decide what to do next—all through natural language processing. A five tool workflow means five inference passes plus Claude parsing each result, comparing values, and synthesizing conclusions. This is both slow and error-prone.

Basically, instead of Claude trying to, e.g., process data by using inference from its own context, it would offload to some program it specifically writes. Up until today we've seen Claude running user-written programs. This new paradigm allows it the freedom to create a program it finds suitable in order to perform the task, and then run it (within confines of a sandbox) and retrieve the result it needs.

Thanks for the reply.
Jesus wept, for the nerds joyfully want Skynet.
btown · 11 hours ago
Yea - if you want a paranoidly-sandboxed, instant-start, high-concurrency environment, not just on beefy servers but on resource-constrained/client devices as well, you need experts in V8 integration shenanigans.

Cloudflare Workers had Kenton Varda, who had been looking at lightweight serverless architecture at Sandstorm years ago. Anthropic needs this too, for all the reasons above. Makes all the sense in the world.

Bun isn't based on V8, it's JavaScriptCore, but your point still stands.
hbbio · 10 hours ago
Who would have predicted KDE could become the foundation of both AI and gaming
jeeeb · 10 hours ago
Also the world's most popular web browsers.
Gaming = talking about the Steam Deck?
· 9 hours ago
You left out the best part... what happened to Kenton? He looked at lightweight serverless architecture... and then what?
I built Cloudflare Workers?
· 21 minutes ago
This is going to be a HN Classic.
This is how I found out about HN Classic! https://news.ycombinator.com/classic
fitzn · 3 hours ago
Boom
mic drop
> Yea - if you want a paranoidly-sandboxed, instant-start, high-concurrency environment, not just on beefy servers but on resource-constrained/client devices as well, you need experts in V8 integration shenanigans.

To be honest, that sounds more like a pitch for deno than for bun, especially the “paranoidly sandboxed” part.

I'm not confused about the acquisition but about the investment. What were the investors thinking? This is an open source development tool with (to date) $0 of revenue and not even the beginnings of a plan for getting any.

The acquisition makes more sense. A few observations:

- No acquisition amount was announced. That indicates some kind of share swap where the investors exchange shares in one company for shares in another. Presumably the founder now has some shares in Anthropic and a nice salary and vesting structure that will keep him on board for a while.

- The main investor was Kleiner Perkins. They are also an investor in Anthropic: $100M in the last round, apparently.

Everything else is a loosely buzzword-compatible thingy for Anthropic's AI coding thingy, plus some fresh talent for their team. All good, but it's beside the point. This was an investor bailout. They put quite a bit of money into Bun with exactly zero remaining chance of it turning into the next unicorn. Whatever flaky plan there once might have been for revenue that caused them to invest clearly wasn't happening. So they liquidated their investment through an acquihire via one of their other investments.

Kind of shocking how easy it was to raise that kind of money with essentially no plan whatsoever for revenue. Where I live (Berlin), you get laughed away by investors (in a quite smug way typically) unless you have a solid plan for making them money. This wouldn't survive initial contact with due diligence. Apparently money still grows on trees in Silicon Valley.

I like Bun and have used it but from where I'm sitting there was no unicorn lurking there, ever.

hoppp · 12 hours ago
It's fine, but why is JS a good language for agents? I mean, sure, it's faster than Python, but wouldn't something that compiles to native be much better?
JS has the fastest, most robust and widely deployed sandboxing engines (V8, followed closely by JavaScriptCore which is what Bun uses). It also has TypeScript which pairs well with agentic coding loops, and compiles to the aforementioned JavaScript which can run pretty much anywhere.
Note that "sandboxing" in this case is strictly runtime sandboxing - it's basically like having a separate process per event loop (as if you ran separate Node processes). It does not sandbox the machine context in which it runs (i.e. it's not VM-level containment).
When you say runtime sandboxing, are you referring to JavaScript agents? I haven't worked all that much with JavaScript execution environments outside of the browser so I'm not sure about what sandboxing mechanics are available.
https://nodejs.org/api/vm.html

Bun claims this feature is for running untrusted code (https://bun.com/reference/node/vm), while Node says "The node:vm module is not a security mechanism. Do not use it to run untrusted code." I'm not sure whom to believe.

It's interesting to see the difference in how both treat the module. It feels similar to a realm, which makes me lean by default toward not trusting it for untrusted code execution.

It looks like Bun also supports Shadow Realms which from my understanding was more intended for sandboxing (although I have no idea how resources are shared between a host environment and Shadow Realms, and how that might potentially differ from the node VM module).

The reference docs are auto generated from node’s TypeScript types. node:vm is better than using the same global object to run untrusted code, but it’s not really a sandbox
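A minimal sketch of what that means in practice, using Node's documented `node:vm` API. The evaluation happens against a separate global object, but the well-known constructor-chain escape below is exactly why Node's docs say it is not a security mechanism:

```typescript
import { createContext, runInContext } from "node:vm";

// Code runs against a separate, contextified global object...
const context = createContext({ a: 1 });
const result = runInContext("a + 1", context);
console.log(result); // 2

// ...but this is isolation of globals, not a security boundary. The classic
// escape reaches the host realm through the sandbox object's prototype chain:
const escaped = runInContext(
  "this.constructor.constructor('return typeof process')()",
  createContext({})
);
console.log(escaped); // "object": the host's `process` is reachable
```

This is why "runtime sandboxing" in the V8/JSC sense (separate isolates/contexts) is a different, stronger property than what `node:vm` gives you.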
Doesn’t Bun use JavaScriptCore though? Perhaps their emulation (rather, implementation) leans more towards security.
Running it in a chroot or a scoped-down namespace is all you need most of the time anyway.
> It also has TypeScript which pairs well with agentic coding loops, (...)

I've heard that TypeScript is pretty rough on agentic coding loops because the idiomatic static type assertion code ends up requiring huge amounts of context to handle in a meaningful way. Is there any truth to it?

Not sure where you heard this but general sentiment is the opposite.

There was recently a conference which was themed around the idea that typescript monorepos are the best way to build with AI

> Not sure where you heard this but general sentiment is the opposite.

My personal experience and anecdotal evidence is in line with this hypothesis. Using the likes of Microsoft's own Copilot with small simple greenfield TypeScript 5 projects results in surprisingly poor results the minute you start leaning heavily on type safety and idiomatic techniques such as branded types.
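For readers unfamiliar with the idiom: branded types add nominal checking on top of TypeScript's structural types. A minimal sketch, with illustrative names (not from any particular library):

```typescript
// A branded type: a plain string the compiler refuses to accept
// unless it has passed through the sanctioned constructor.
type UserId = string & { readonly __brand: "UserId" };

function asUserId(raw: string): UserId {
  // Real code would validate here; the cast is confined to this one boundary.
  return raw as UserId;
}

function fetchUser(id: UserId): string {
  return `user:${id}`;
}

console.log(fetchUser(asUserId("42"))); // "user:42"
// fetchUser("42"); // compile error: string is not assignable to UserId
```

The brand exists only at compile time; at runtime a `UserId` is just a string, which is part of why type-level idioms like this can be invisible to a model reasoning about runtime behavior.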

> There was recently a conference which was themed around the idea that typescript monorepos are the best way to build with AI

There are also flat earth conferences.

It's especially tricky since monorepos are an obvious antipattern to begin with. They're a de-separation of concerns: an encouragement to blur the unit boundaries, not write docs, create unstable APIs (updating all usages at once when they change), and generally to let complexity spread unchecked.
aizk · 3 hours ago
I think this is contingent on the skill of the human reviewing the AI's code.
· 7 hours ago
Not to mention the saturation of training data
[dead]
> It also has TypeScript which pairs well with agentic coding loops

The language syntax has nothing to do with it pairing well with agentic coding loops.

Considering how close TypeScript and C# are syntactically, C#'s speed advantage over JS (among many other things) would make C# the main language for building agents. It is not, and that's because the early SDKs were JS and Python.

TypeScript is probably a good LLM language in general because of static types and tons and tons of training data.

Kind of tangent but I used to think static types were a must-have for LLM generated code. But the most magical and impressively awesome thing I’ve seen for LLM code generation is “calva backseat driver”, a vscode extension that lets copilot evaluate clojure expressions and generally do REPL stuff.

It can write MUCH cleaner and more capable code, using all sorts of libraries that it’s unfamiliar with, because it can mess around and try stuff just like a human would. It’s mind blowingly cool!!

Clojure is such an underrated language for vibe coding for this very reason.

Makes me wonder what a theoretical “best possible language for vibe coding” would look like

whoa, instant upgrade. thanks!
> C#'s speed advantage over JS among many other things would make C# the main language

Nobody cares about this; JS is plenty fast for LLM needs. If maximum performance were necessary, you'd be better off using Go because of its fast compiler and better performance.

> Nobody cares about this

And that was my point. The choice of using JS/TS for LLM stuff was made for us based on initial wave of SDK availabilities. Nothing to do with language merits.

· 7 hours ago
It's widespread and good enough. The language just doesn't matter that much in most cases
This is one of those "in theory, there's no difference between theory and practice; in practice, there is" issues.

In theory, quality software can be written in any programming language.

In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in production-grade developer experience and the quality of packages on offer on PyPI and npm versus Maven Central and NuGet.

As a developer who switches between Java, Python and TypeScript every day, I think this is a fairly myopic opinion. Being siloed in one language for long enough tends to bring out our tribalistic tendencies; tread carefully.

I've seen codebases of varying quality in nearly every language, "enterprise" and otherwise. I've worked at a C# shop and it was no better or worse than the java/kotlin/typescript ones I've worked at.

You can blame the "average" developer in a language for "not caring", but more likely than not you're just observing the friction imposed by older packaging systems. Modern languages are usually coupled with package managers that make it trivial to publish language artifacts to package hubs, whereas Gradle, for example, is its own brand of hell just to get your code to build.

That's not a fair comparison. Your example is about the average developer in a language; this situation is about specific developers choosing between languages. Having the developers you already have choose language A or B makes no difference to their code quality (assuming they're proficient with both).
These are statements these developers will make themselves. They will say they don't like more strictly typed languages because they feel constrained and slowed down in development. They will argue that the performance hit is worth the trade offs.
Perhaps many of those "folks who use languages like Java or C#" do so because a boss told them "that's the way we deal with correctness and performance around here".

The fact that their boss made that one decision for them does not somehow transmit the values behind the decision.

[flagged]
Exactly! In the Java ecosystem, your intelligence is measured by how elaborate an interface hell you can conjure just to do CRUD.

    > Nonsense. Average Java/C# is an enterprise monkey who barely knows outside of their grotesque codebase.
Netflix is Java. Amazon is mostly Java. Some of the biggest open source projects in the world are Java. Unity and Godot both use C# for scripting.

I don't know where you're getting the impression that Java and C# are somehow only for "enterprise monkey who barely knows outside of their grotesque codebase"

> Netflix is Java. Amazon is mostly Java. Some of the biggest open source projects in the world are Java. Unity and Godot both use C# for scripting.

You can add Meta, Google and Palantir to your list and it won’t change that average Java dev is from an Eastern hemisphere and knows nothing about Java outside of JavaEE/Spring.

See how generalizations work?

Chill out buddy. You're going to pop a vein here.

A typical backend developer using C#/Java is likely solving more complicated problems and has all the concerns of an enterprise system to worry about and maintain.

Dismissing a dev or a system because it is enterprisy is a weak argument to make against a language. A language being used a lot in an enterprise to carry the weight of the business is a sign the language is actually great and reliable enough.

I’m not dismissing Java, I’ve spent decades writing it and know what it is capable of, but it is laughable to hear that average Java dev cares more about performance or correctness than Python/JS dev.

All of them explicitly don't have to care about performance from the start because of VMs + GC; only when scale starts to matter do you start to optimize.

The tooling argument is especially funny to me, given how shit the tooling ecosystem is. Sure, it's ol' reliable, but the average Java dev is so stuck in their ways that they've never even tried to venture out of their Java cave to see what's out there.

IntelliJ consuming ALL of the RAM, literally as much as it can get its hands on. Gradle taking what's left; rebuilds taking minutes to complete or requiring an elaborate setup to have proper hot reload. Both TS and Python have far more expressive and powerful type systems than even modern Java. "Production grade tooling" my ass.

Funny to see Java shmucks looking down at JS/Python folks, as if Java at the time wasn't picked for literally the same reasons as Python/JS nowadays.

aizk · 3 hours ago
TS is enormous, has endless training data, and can interact with virtually anything on the Internet these days. Also, strong typing is very very useful for AI coding context.
> strong typing is very very useful for AI coding context

what makes you think so?

I believe strong typing is very very useful for human coding;

I'm not convinced it's so "very very" for agents.

When I've used agents with TS, failing tests due to typing seem to help the agent get to the correct solution. Maybe it's not required though.
What do you mean by "failing tests", are you talking about runtime code? TypeScript erases all types at compile so these wouldn't affect tests. Unless you meant "compile errors" instead.

I've noticed LLMs just slap on "as any" to solve compile errors in TypeScript code; maybe this is common in the training data. I frequently have to call this out in code review. In many cases it wasn't even a necessary assertion, but it's turned a variable into "any", which can cause downstream or future problems.
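A small illustration of that failure mode, with hypothetical code: the cast compiles, but the type mismatch resurfaces at runtime.

```typescript
// The function's contract says numbers...
function lineTotal(price: number, fee: number): number {
  return price + fee;
}

// ...but JSON gave us a string, and `as any` silences the compiler
// instead of fixing the parse. The bug ships.
const data: unknown = JSON.parse('{"price": "100"}');
const total = lineTotal((data as any).price, 20);

console.log(total); // "10020": string concatenation, not 120
```

A validating parse at the boundary (or at least `as unknown` plus a type guard) would have surfaced the problem at compile time or immediately at runtime, instead of as a silently wrong value downstream.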

My code has tests and the LLM wrote more tests.

I tell the LLM to include typing on any new code.

The agent is running the test harness and checking results.

Isn't what you're describing just a set of APIs with native bindings that the LLM can call?

I'm not sure I understand why it's necessary to even couple this to a runtime, let alone own the runtime?

Can't you just do it as a library and train/instruct the LLM to prefer using that library?

Mostly, just Jarred Sumner makes it worth it for Anthropic.
Could also be a way to expand Claude Code's customer base from coding assistance to vibe coding, a la Replit creating a hosted app. CC working more closely with Bun could make all that happen much faster:

> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.

gz5 · 11 hours ago
>Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases

100%. Even more robust if paired with an overlay network which provides identity-based S3 access (rather than IP address/network based); otherwise the server may not have access to the S3/cloud resource, at least for many enterprises with S3 behind VPN/Direct Connect.

Ditto for cases where you want the agent/client side to hit S3 directly, bypassing the server, and the agent/client may not have a permitted IP in the firewall ACL, or may be on a VPN/WAN.

The writeup makes it sound like an acquihire, especially the "what changes" part.

ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.

[0] https://timesofindia.indiatimes.com/technology/tech-news/goo...

That's a really cool use case and seems super helpful. Working cloud-native is a chore sometimes: having to fiddle with internal APIs, ACL/permissions issues.
· 12 hours ago · [flagged]
This matches some previous comments around LLMs driving adoption of programming languages or frameworks. If you ask Claude to write a web app, why not have it use your own framework, that it was trained on, by default?
Users are far more likely to ask it about shadcn, or material, than about node/deno/bun. So, what is this about?
  • ojame
  • ·
  • 12 hours ago
  • ·
  • [ - ]
Currently Claude etc. can interact with services (including AWS) via MCPs.

What the user you're replying to is saying is that the Bun acquisition looks silly if you see Bun as just a dev tool for Node. However, if you look at their binding work for services like S3 [0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).

0: https://bun.com/docs/runtime/s3

That doesn't make sense either. Agents already have access to MCPs and Tools. Your example is solved by having an S3 wrapper as a set of tools.
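For illustration, the kind of tool wrapper being described might look like this sketch. All names and the schema shape here are hypothetical (real agent frameworks define their own tool formats), and the `run` body is a stub rather than a real S3 call:

```typescript
type Tool = {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the arguments
  run: (args: Record<string, string>) => Promise<string>;
};

// A hypothetical S3 "get object" tool an agent could invoke by name,
// without needing S3 bindings baked into the runtime itself.
const s3GetObject: Tool = {
  name: "s3_get_object",
  description: "Fetch an object from an S3 bucket and return its body as text.",
  inputSchema: {
    type: "object",
    properties: {
      bucket: { type: "string" },
      key: { type: "string" },
    },
    required: ["bucket", "key"],
  },
  // Stub body: a real implementation would sign and issue the HTTPS request.
  run: async ({ bucket, key }) => `would fetch s3://${bucket}/${key}`,
};
```

The runtime-integration argument is that native bindings skip this indirection; the counter-argument above is that a tool layer like this already works with any runtime.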
  • ·
  • 6 hours ago
  • ·
  • [ - ]
Being able to remove a layer of abstraction to get the thing done is usually good right?
An AI company scoops up frontend tech. Do you really think it was because of s3?
  • gedy
  • ·
  • 9 hours ago
  • ·
  • [ - ]
Bun is not really frontend tech
This is an insanely good take I never thought of.
As a commandline end user who prefers to retrieve data from the www as text-only, I see deno and bun as potential replacements (for me, not necessarily for anyone else) for the so-called "modern" browser in those rare cases where I need to interpret Javascript^1

At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run Javascripts from around the web. At least with a commandline web-capable JS runtime monstrosity the user could in theory exercise more control over what scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control)

1. One can already see an approach something like this being used in the case of

https://github.com/yt-dlp/yt-dlp/wiki/EJS

where a commandline JS runtime is used without the need for any graphics layer (advertising display layer)

Is this something I’d have to own a tv to understand?
> At the time of writing, Bun's monthly downloads grew 25% last month (October, 2025), passing 7.2 million monthly downloads. We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.

I believe this completely. They didn't have to join, which means they got a solid valuation.

> Instead of putting our users & community through "Bun, the VC-backed startup tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.

I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.

> They didn't have to join, which means they got a solid valuation.

Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.

Though if I'm not mistaken, Confluent did the same thing?

They had a second round that was $19m in late 2023. I don't doubt for a second that they had a long runway given the small team.
I don't like all of the decisions they made for the runtime, or some of the way they communicate over social media/company culture, but I do admire how well-run the operation seems to have been from the outside. They've done a lot with (relatively) little, which is refreshing in our industry. I don't doubt they had a long runway either.
Thanks I scrolled past that in the announcement page.

With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.

Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.

  • n2d4
  • ·
  • 13 hours ago
  • ·
  • [ - ]

    > They didn't have to join, which means they got a solid valuation.
This isn't really true. It's more about who wanted them to join. Maybe it was Anthropic who really wanted to take over Bun/hire Jarred, or it was Jarred who got sick of Bun and wanted to work on AI.

I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".

  • n2d4
  • ·
  • 8 hours ago
  • ·
  • [ - ]
Can't edit my comment anymore but Bun posted a pretty detailed explanation of their motivation here: https://bun.com/blog/bun-joins-anthropic

Sounds like "monetizing Bun is a distraction, so we're letting a deep-pocketed buyer finance Bun moving forward".

Isn’t Anthropic itself also burning investors’ money? I thought no AI company is making any profit.
Anthropic is still a new company and so far they seem "friendly". That being said, I still feel this can go either way.
Yep. Remember when "Open"AI took a bunch of grant money and then turned for-profit?

And kept their fraudulent name.

> I believe this a bit less.

They weren’t acquired and paid just to keep building tooling as before while completely ignoring monetization until the end of time.

Maybe they were though. Maybe Anthropic just wanted to bring a key piece of the stack in-house.
Good for them, could be bad for actual users.
Given the worries about LLM focused companies reaching profitability I have concerns that Bun's runway will be hijacked... I'd hate for them to go down with the ship when the bubble pops.
This is my fear. It's one thing to lose a major sponsor. It's another to get cut due to a focus on profitability later down the line.
Yeah, now they are part of Anthropic, who haven't figured out monetization themselves. Yikes!

I'm a user of Bun and an Anthropic customer. Claude Code is great and it's definitely where their models shine. Outside of that Anthropic sucks: their apps and web UI are complete crap, borderline unusable, and the models are just meh. I get it, CC's head probably got a power play here, given his department is carrying the company, and his secret sauce, according to Oven's marketing, was Bun. In fact the VS Code Claude backend is distributed as a bun-compiled binary executable, and the guy has been featured on the front page of the Bun website for at least a week or so. So they bought the kid the toy he asked for.

Anthropic needs urgently, instead, to acquire a good team behind a good chatbot and make something minimally decent. Then make their models work for everything else as well as they do with code.

> Yeah, now they are part of Anthropic, who haven't figured out monetization themselves.

Anthropic are on track to reach $9BN in annualised revenue by the end of the year, and the six-month-old Claude Code already accounts for $1BN of that.

Not sure if that counts as "figured out monetization" when no AI company is even close to being profitable -- being able to get some money for running far more expensive setups is not nothing, but also not success.
Monetisation is not profitability, it’s just the existence of a revenue stream. If a startup says they are pre-monetisation it doesn’t mean they are bringing in money but in the red, it means they haven’t created any revenue streams yet.
"We were maybe gonna fuck ya, but now we promise we won't"
I am more shocked about the origin story compared to the acquisition.

> Almost five years ago, I was building a Minecraft-y voxel game in the browser. The codebase got kind of large, and the iteration cycle time took 45 seconds to test if changes worked. Most of that time was spent waiting for the Next.js dev server to hot reload.

Why in the hell would anyone be using Next.js to make a 3D game... Jarred has always seemed pretty smart, but this makes no sense. He could've saved so much time and avoided building a whole new runtime by simply not using the completely wrong tool for the job.

Maybe the same goes for Anthropic: they could simply write the agent in Rust or Go. Instead they decided to buy and develop a JavaScript runtime.
  • nly
  • ·
  • 22 minutes ago
  • ·
  • [ - ]
If anything this seems to be a huge victory for Zig, since Bun is mostly written in Zig.
> He could've saved so much time and avoided building a whole new runtime by simply not using the completely wrong tool for the job.

True, but where is the fun in that?

First time I see it being a net positive that someone didn't know about Vite: Bun wouldn't exist otherwise.
He may have been serving a game in a canvas hosted in a Next.js app, but have done all the actual game (rendering, simulation, etc.) in something else. That’s a decent approach - Next can handle the header of the webpage and the marketing blog or whatever just fine.
But like... so can an index.html with a script tag? Am I missing something? Where did you read that there was a lot of work involving the header or an attached marketing blog?
My point isn’t that you absolutely need that, just that the negative effect on your game development is pretty minimal if you’re not leaning on the SPA framework for anything related to the game. If your game is going to be embedded into an otherwise normal-ish website, this isn’t a terrible way to go (I’ve done it personally with a game mostly written in Rust and compiled to WASM). You can get gains by splitting your game and web site bundles and loading the former from the latter explicitly, but they’re not massive if your bundler was already reasonably incremental (or was already esbuild).

Thanks for assuming I “read” about bundlers somewhere, though. I’ve been using (and configuring) them since they existed.

I meant specifically was there something I was missing about the Bun developer's game that required a complicated header and thus next.js.
index.html with script files would still benefit from a bundler. You can have a very minimal react footprint and still want to use react build tools just for bundling.
Sure, but I'm more confused about the next.js usage than I am about the bundler. The bundler makes sense.
What effect do you imagine Next.js has on a bunch of code manipulating an HTML canvas? For vanilla code directly using browser APIs it’s basically just a bundler configuration, and while it’s not optimally configured for that use case (and annoying for other reasons) it’s probably better than what someone who has never configured webpack before would get doing it themselves.
Well for one, it ships next.js and react.js bundled in with the code manipulating an HTML canvas.
Okay, but it’s a web game. Those will make up less than 0.1% of the downloaded bytes required to render the first frame of the game. One image asset will dwarf the entire gzip/brotli Next.js/React framework.
What is the use case for bundling next.js with the web game? Just the layout of the page surrounding the game canvas? It just seems unnecessary, that's all. Traditionally, software development in general and game development in particular has tried to avoid unnecessary overhead if it doesn't provide enough value to the finished product.

It's obvious why he didn't write the game in x86 assembly. It's also obvious why he didn't burn the game to CD-ROM and ship it to toy stores in big box format. Instead he developed it for the web, saving money and shortening the iteration time. The same question could be asked about next.js and especially about taking the time to develop Bun rather than just scrapping next.js for his game and going about his day. It's excellent for him that he did go this route of course, but in my opinion it was a strange path towards building this product.

Most people use what they know. You start out that way, and if it turns out to be good, you can always do a v2
Yes, but there are obvious limits to that. This is like someone who knows how to bake wanting to build a car, so they start making it out of dough.
Because he wanted to? Do you also berate the choices of people in the 4K demo scene for using too little memory?
I work on Bun.

Happy to answer any questions

I'm sort of surprised to see that you used Claude Code so much. I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc. And I know Bun started with an extreme attention to detail around performance.

I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.

But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).

Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?

I am not your target with this question (I don't write Zig) but there is a spectrum of LLM usage for coding. It is possible to use LLMs extensively but almost never ship LLM generated code, except for tiny trivial functions. One can use them for ideation, quick research, or prototypes/starting places, and then build on that. That is how I use them, anyway

Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV

> Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV

Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.

I've had some fun trying to coax different LLMs into writing usable small throwaway apps. In a way, it's hilarious to see the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and have enough patience, you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.

In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.

I also consider them a vocal minority, because I don't think they represent the majority of LLM users.

  • dijit
  • ·
  • 12 hours ago
  • ·
  • [ - ]
fwiw, Copilot's license only explicitly permits using its suggestions in the way you describe.

That puts everyone using the generated outputs into a sort of unofficial grey market, even when using first-party tools. Which is weird.

  • ·
  • 5 hours ago
  • ·
  • [ - ]
Can you link to more info about this?
I'll give you a basic example where vibe coding instead of doing it myself saved me a ton of time, and I believe it would hold true for anyone.

Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonably good at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.

Do I want my next operating system vibe coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for things it's particularly well suited.

Here's another one. An SM-24 geophone + Raspberry Pi 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using Flask I need a web viewer + conversion of the geophone velocity figures to displacement and acceleration. Properly instructed, they'll create a highly functional version of that with some adjustments/iteration in 15-30 minutes. I basically had them recreate REW's RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
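As a sketch of the numeric core of that task (the function name and structure are my own, not the commenter's actual code): given velocity samples at a fixed sample rate, displacement falls out of trapezoidal integration and acceleration out of central differences.

```typescript
// Convert geophone velocity samples (m/s) at a fixed sample rate into
// displacement (m, via trapezoidal integration) and acceleration
// (m/s^2, via central differences). Assumes at least two samples.
function velocityToMotion(velocity: number[], sampleRateHz: number) {
  const dt = 1 / sampleRateHz;
  const n = velocity.length;

  // Trapezoidal integration, starting the displacement trace at zero.
  const displacement = [0];
  for (let i = 1; i < n; i++) {
    displacement.push(displacement[i - 1] + ((velocity[i] + velocity[i - 1]) / 2) * dt);
  }

  // Central differences in the interior, one-sided at the edges.
  const acceleration = velocity.map((_, i) => {
    if (i === 0) return (velocity[1] - velocity[0]) / dt;
    if (i === n - 1) return (velocity[i] - velocity[i - 1]) / dt;
    return (velocity[i + 1] - velocity[i - 1]) / (2 * dt);
  });

  return { displacement, acceleration };
}
```

For a constant 1 m/s velocity trace, displacement grows linearly and acceleration stays at zero, which is a quick sanity check for this kind of pipeline.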

Yeah, I had OpenAI crank out 100 different FizzBuzz implementations in a dozen seconds, and many of them worked! No chance a developer would have done it that fast, and for anyone who needs to crank out FizzBuzz implementations at scale, this is the tool to beat. The haters don’t know what they’re talking about.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.

I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."

That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.

Handmade Cities founder here.

We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.

If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.

I'll finish with a quote from a blog post [2]:

> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful. However, it fails at novel problems and isn’t practical for my systems programming work.

All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!

[0] https://filepilot.tech

[1] https://terminal.click

[2] https://handmadecities.com/news/summer-update-2025/

Finding this comment interesting; the parent comment didn't suggest any past association, but it seemingly uses the project reference as a pivot point to do various outgroup counter-signaling / neg Bun?
I understand the concern, but really? I found this quote enough to offer proper comments:

> had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types

Folks at Bun are "Zig people" for obvious reasons, and a link was made with Handmade software. This has happened multiple times before with Bun specifically, so my response is not a "pivot" of any kind. I've highlighted and contrasted our differences to prevent further associations inside a viral HN thread. That's not unreasonable.

I also explicitly congratulated them for the acquisition.

> and a link was made with Handmade software

I'm willing to wager that 99.99% of readers do not associate "Handmade" with the org you're associated with, and that most didn't know it existed until this comment. So yes, "really": without OP replying, it's understandable that the poster you're replying to inferred it had nothing to do with you.

Indeed, you cleared up exactly the misconception I had. Thanks for chiming in to clarify
  • ·
  • 8 hours ago
  • ·
  • [ - ]
I think you may have confused the parent commenter's "Handmade Software Movement" comment with Handmade Cities, which doesn't seem related to me other than the common word "handmade"
I might be missing some context. Just to check my understanding: HMC and Bun aren't a good match anymore because Bun devs use LLM/AI tooling more than HMC does? Basically, really leveling up a system is incompatible with these tools? (IYHO)

Thank you! I appreciated this clarifying write-up.

I like that the filepilot download is 2.1MB. That really illustrates the difference between handmade style stuff and well, most other stuff.
back in my day we used to write code on punch cards.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.

In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.

If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.

Most of all: they don't go around yelling about their LLM use (or anti-use) because they're not interested in the online LLM wars. They just want to build things with the tools available.

Yep. And there's a lot of people making use of LLMs in both coding and learning/searching doing exactly that.

One of my favorite things is describing a bug to an LLM and asking it to find possible causes. It's helped track something down many times, even if I ultimately coded the fix.

More people should have such a healthy approach, not only to LLMs but to life in general. Same reason I partake less and less in online discourse: it's so tribal and filled with anger that it's just not worth contributing anymore. Learning how to be in the middle did wonders for me as a programmer, and I think as a person as well.
Personally I hate this “in the middle” as it’s so relative you can shape to fit your narrative.

For example: what’s in the middle for programming?

For me 0 is writing 0 and 1. For others 0 is making the nand ports.

And 100 is ai llm vibe.

So 50/middle would be what exactly? It all depends.

Same for anything really. Some people I know keep saying not 8 not 80 to mean the middle.

Like what’s in the middle for amount of coding per day? 12 h? 8h? 2h?

What’s middle for making money? 50k, 500k, 500m?

What’s the middle for taking cyanide ? 1g? 1kg?

What about water? What about food? What about anything?

As you can see, it’s all relative, and whoever says it is trying to push their narrative as the “middle”, aka correct, while whoever does more or less is “wrong”.

I think both I and the person before me were commenting more on the fact that taking a reserved approach is just healthier and prevents the shitstorms that discussions devolve into in the current internet landscape. No offense, but creating a straw-man scenario about how much cyanide one can take and getting angry at it is exactly what I had in mind. I just code, I want to code, and sometimes I use an LLM, or Stack Overflow, or ask another person for advice about code. The approach "in the middle" means not going to the extremes: making use of whatever tools are available to do our work or hobby, just living life, and not being a target of hate (I received hateful messages and even one death threat over a comment where I said I asked Claude to explain some concept in Zig). I could say that "in the middle" is more of a metaphor for just being reserved about stuff, but I would probably be called out for "moving goalposts" and "backtracking on my own comment". Sorry if something is written weirdly, English is not my first language; I'm open to talking more though.
Maybe your “sometimes” is too much for me or others. How can you ensure it’s in the “middle”? Maybe I consider it extreme. Maybe others consider it not enough. Like driving every day: is that extreme driving, or moderate?

You see how this “in the middle” concept makes no sense?

Then what should I call my approach? I definitely wouldn't portray myself as either pro- or anti-LLM: moderate? In colloquial speech, "moderate" would also be little more than trying to stay at, as you pointed out, a "relative" point. Unless you want to say that everyone is a bit of an extremist.
Good question. I don’t know either. I just know “middle” is not the right term. It’s so easy to draw whatever narrative you wish; it’s self-fulfilling.
Well, then how would you describe your approach, if the vague middle isn't the right thing for you?
Maybe better to be fact-based? I use AI X hours per day. I use AI for Y% of my needs. My personal sense of AI's improvement to my life is +X%.

Something like that. What you think?

Sounds good, but I'm worried that for some other people it's not an improvement, since there will always be someone saying that X amount is too much or not enough. But yeah, I could say that 1/4 or 1/5 of the coding I do (or rather, the googling of stuff I simply don't know) is now delegated to an LLM. The question is whether another person would look at that statement and say something along the lines of "cool, I use it more/less, but I'm happy it helped you / sad it caused you trouble". I'm slowly starting to think we might be discussing the wrong thing. But yeah, a numerical, fact-based approach doesn't sound half bad, even though I have a feeling at the back of my head that it can also be kind of self-fulfilling. Nonetheless it conveys the message better than what I used before (the "I use it a lot / not much / I try to stay in the middle").
"exquisitely hand-written"

This sounds so cringe. We are talking about computer code here lol

Bespoke handcrafted ethically sourced all natural cruelty free source code
I'm not sure about exquisite and small.

Bun genuinely made me doubt my understanding of what good software engineering is. Just take a look at their code, here are a few examples:

- this hand-rolled JS parser of 24k dense, memory-unsafe lines: https://github.com/oven-sh/bun/blob/c42539b0bf5c067e3d085646... (this is a version from quite a while ago to exclude LLM impact)

- hand-rolled re-implementation of S3 directory listing that includes "parsing" XML via hard-coded substrings https://github.com/oven-sh/bun/blob/main/src/s3/list_objects...

- MIME parsing https://github.com/oven-sh/bun/blob/main/src/http/MimeType.z...

It goes completely contrary to a lot of what I think is good software engineering. There is very little reuse, everything is ad-hoc, NIH-heavy, verbose, seemingly fragile (there's a lot of memory manipulation interwoven with business logic!), with relatively few tests or assurances.

And yet it works on many levels: as a piece of software, as a project, as a business. Therefore, how can it be anything but good engineering? It fulfils its purpose.

I can also see why it's a very good fit for LLM-heavy workflows.
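To make the criticism concrete, here's a toy TypeScript version of the substring approach. The response shape mimics S3's ListObjectsV2 XML, but the code is purely illustrative, not Bun's actual implementation. It works on the happy path, yet it hands back raw text, so entity-escaped keys come out undecoded:

```typescript
// Extract <Key> values by scanning for hard-coded substrings,
// with no XML parsing, entity decoding, or nesting awareness.
function keysBySubstring(xml: string): string[] {
  const keys: string[] = [];
  let pos = 0;
  for (;;) {
    const start = xml.indexOf("<Key>", pos);
    if (start === -1) break;
    const end = xml.indexOf("</Key>", start);
    keys.push(xml.slice(start + "<Key>".length, end));
    pos = end;
  }
  return keys;
}

const resp =
  "<ListBucketResult><Contents><Key>a.txt</Key></Contents>" +
  "<Contents><Key>b&amp;c.txt</Key></Contents></ListBucketResult>";

// keysBySubstring(resp) → ["a.txt", "b&amp;c.txt"] — the second key
// should be "b&c.txt", but the entity is never decoded.
```

Whether this is "bad engineering" or a pragmatic bet that S3's response format is stable and simple is exactly the philosophical disagreement in this subthread.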

I can't speak as much about the last two examples, but writing a giant parser file is pretty common in Zig from what I've seen. Here's Zig's own parser, for example[1]. I'm also not sure what you mean by memory unsafe, since all slices have bounds checks. It also looks like this uses an arena allocator, so lifetime tracking is pretty simple (dump everything onto the allocator, and copy over the result at the end). Granted, I could be misunderstanding the code, but that's the read I get of it.

[1] https://codeberg.org/ziglang/zig/src/commit/be9649f4ea5a32fd...

It used to be arena-allocated but now it's using a different technique which I outlined in this talk: https://vimeo.com/649009599
As it happens, the commit I linked fixes a segfault, which shouldn't normally happen in memory-safe code.
Are you at liberty to divulge how much Anthropic paid for Bun?
Amazing news, congrats! Been using Bun for a long while now and I love it.

Is there anything I could do to improve this PR/get a review? I understand you are def very busy right now with the acquisition, but wanted to give my PR the best shot:

https://github.com/oven-sh/bun/pull/24514

Congrats on the payday :)

Do you think Anthropic might request you implement private APIs?

  • kyyol
  • ·
  • 4 hours ago
  • ·
  • [ - ]
This is an interesting question; not to be too naive, but are there examples in the wild about this scenario? First I’ve heard of private APIs for something open source like this and my interest is piqued!
VS Code had private APIs for Copilot.
Is this acquihiring?
No. Anthropic need Bun to be healthy because they use it for Claude Code.
Isn't that still "acqui-hiring" according to common usage of the term?

Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.

But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company is the main reason for the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.

Acquihiring usually means that the product the team is working on will be ended and the team members will be put to work on other parts of the acquiring company.
That is part of the definition given in the first paragraph of the Wikipedia article, but I think it’s a blurry line when the acquired company is essentially synonymous with a single open source project and the buyer wants the team of experts to continue developing that open source project.
  • dcre
  • ·
  • 7 hours ago
  • ·
  • [ - ]
No it isn’t. That’s not an acquihire. They’re keeping the product.
I think it’s an acquihire, and they also like Bun.
But it seems like that could happen faster internally than publicly?
I consider this more of a strategic acquisition.
Why can't you make CLI autocompletions work? It's so basic, but the ticket has languished for almost as long as bun has existed!
Because nobody (including you, apparently) cares enough to implement it?
Thanks, Jarred. Seeing what you built with Bun has been a real inspiration, the way one focused engineer can shift an entire ecosystem. It pushed me back into caring about the lower-level side of things again, and I’m grateful for that spark. Congrats on the acquisition, and excited to see what’s next
You said elsewhere that there were many suitors. What is the single most important thing about Anthropic that leads you to believe they will be dominant in the coming years?
No idea about his feelings but believing that they will be dominant wouldn't have to be the reason he chose them. I could easily imagine that someone would decide based on (1) they offered enough money and (2) values alignment.
Hi Jarred. Congratulations on the acquisition! Did (or will) your investors make any profit on what they put into Bun?
I've never personally used Bun. I use node.js I guess. What makes Bun fundamentally better at AI than, say, bundling a node.js app that can run anywhere?

If the answer is performance, how does Bun achieve things quicker than Node?

• 7 hours ago
Easier deployment: you can generate a single binary.
on Bun's website, the runtime section features HTTP, networking, storage -- all are very web-focused. any plans to start expanding into native ML support? (e.g. GPUs, RDMA-type networking, cluster management, NFS)
Probably not. When we add new APIs in Bun, we generally base the interface off of popular existing packages. The bar is very high for a runtime to include libraries because the expectation is to support those APIs ~forever. And I can’t think of popular existing JS libraries for these things.
How much of your day-to-day is spent contributing code to the Bun codebase and do you expect it to decrease as Anthropic assigns more people to work on Bun?
Hi Jarred,

I contributed to Bun one time for SQLite. I've a question about the licensing. Will each contributor continue to retain their copyright, or will a CLA be introduced?

Thanks

With Bun's existing OSS license and contribution model, all contributors retain their copyright and Bun retains the license to use those contributions. An acquisition of this kind cannot change the terms under which prior contributions were made without explicit agreement from all contributors. If Bun did switch to a CLA in the future, just like with any OSS project, that would only impact future contributions made after that CLA went into effect and it depends entirely on the terms established in that hypothetical CLA.
Hello, thank you, but that doesn't answer my question. I'm not asking for a definition, but for information about licensing decisions for the future of Bun.
Does this acquisition preclude implementing an s3 style integration for AWS bedrock? Also is IMDSv2 auth on the roadmap?
• 10 hours ago
Any chance there will be some kind of updating mechanism for 'compiled' bun executables?
I have a PR that’s been sitting for a while that exposes the extra options from the renameat2 and renameatx_np syscalls, which are a good way to implement self-updaters that work even when multiple processes are updating the same path on disk at the same time. These syscalls are supported on Linux & macOS, but I don’t think there’s an equivalent on Windows. We use them internally for `bun install` to make adding packages to the global install cache work when multiple `bun install` processes are running simultaneously.

No high-level self updater api is planned right now, but yes for at least the low level parts needed to make a good one

Hi Jarred, thanks for all your work on Bun.

I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?

I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.

Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.

One more thing I hope doesn't change is the fun release videos :-) I really enjoy them. They're very Apple-y, and for just a programming tool.
What happens to Bun in a scenario where Anthropic goes under?
Yeah why are you not out on a boat somewhere enjoying this moment? Go have fun please.
Acq's typically have additional stips you have to follow - they probably have new deadlines and some temporary stress for the next few months.
yes, acquisitions rarely result in an immediate cash payout.
my wife and i call each other bun all the time, and it's really weird to see an article full of Buns
how the hell did you get that OG name here on HN

asking the real questions

"work on Bun." LOL.

Congratulations.

how can you sleep at night?
Any thoughts on the claude "soul document" that was leaked this week?
I wonder if this is a sign of AI companies trying to pivot?

> Bun will ship faster.

That'll last until FY 2027. This is an old lie that acquirers encourage the old owners to say, because the owners have no power to enforce it, and the acquirer didn't actually say it themselves, so they're not on the hook. It's practically a cheesy pickup line, and given the context, it kinda is one.

This is why we can't have nice things
I would like to clarify that I wish I weren't right but I probably am.
"Insanity Is Doing the Same Thing Over and Over Again and Expecting Different Results"
I hate this quote. I often get different results when I do the same thing over and over again. Turns out there’s a lot of non-determinism out there.
Unfortunately that is also the definition of “practice”.
Practice expects same results. Not different results?
Since when is a CLI tool like this a sufficiently demanding technical project that it needs to buy the runtime just to get sufficient support?

This just isn't the hard part of the product.

Like if I was building a Claude Code competitor and I acquired bun, I wouldn't feel like I had an advantage because I could get more support with like fs.read?

I find it a little sad that there is almost no pushback on what a few people with deep pockets are trying to sell here. Normally on HN an article on balcony gardening would be met with more critical thinking than this piece. Maybe instead of staring at the screen all day long, take a break and think about what people with lots of money care about. And I don't judge; making money is nothing illegal. But Anthropic would be absolutely NOTHING without OSS. And then to see this kind of effusive, submissive admiration and gratitude for their JS wrapper thing makes me sick to my stomach.
I think this acquisition in reality has more to do with developer goodwill? And a little to do with the shell game of making these AI companies hard to value because they collect assets like this.
I’ll be honest, while I have my doubts about the match of interests and cohesion between an AI company and a JS runtime company I have to say this is the single best acquisition announcement blog post I’ve seen in 20 years or so.

Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.

Best of luck to the team and hopefully the new home will support them well.

But how is another company that is also VC backed and losing money providing stability for Bun?

How long before we hear about “Our Amazing Journey”?

On the other hand, I would rather see someone like Bun have a successful exit where the founders seem to have started out with a passion project, got funding, built something out they were excited about and then exit than yet another AI company by non technical founders who were built with the sole purpose of getting funding and then exit.

Anthropic may be losing money, but a company with $7bn revenue run rate (https://www.anthropic.com/news/statement-dario-amodei-americ...) is a whole lot healthier than a company with a revenue of 0.
If I had the cash, I could sell dollar bills for 50 cents and do a $7b run rate :)
If that was genuinely happening here - Anthropic were selling inference for less than the power and data center costs needed to serve those tokens - it would indeed be a very bad sign for their health.

I don't think they're doing that.

Estimates I've seen have their inference margin at ~60% - there's one from Morgan Stanley in this article, for example: https://www.businessinsider.com/amazon-anthropic-billions-cl...

>The bank's analysts then assumed Anthropic gross profit margins of 60%, and estimated that 75% of related costs are spent on AWS cloud services.

Not estimate, assumption.

Those are estimates. Notice they didn’t assume 0% or a million %. They chose numbers that are a plausible approximation of the true unknown values, also known as an estimate.
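As a back-of-envelope illustration of what those assumed numbers would imply (the 60% margin is the analysts' assumption, not a verified figure):

```javascript
// Illustrative arithmetic only: a $7B annualized run rate with an assumed
// 60% gross margin on inference implies serving costs of revenue * 40%.
const runRate = 7e9;       // $7B/year reported run rate
const marginPct = 60;      // the analysts' assumed gross margin
const servingCost = runRate * (100 - marginPct) / 100;
const grossProfit = runRate - servingCost;
console.log(servingCost); // 2800000000  ($2.8B/year serving tokens)
console.log(grossProfit); // 4200000000  ($4.2B/year gross profit)
```

Whether that margin assumption holds is exactly what's being debated here.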
If Morgan Stanley are willing to stake their credibility on an assumption I'm going to take that assumption seriously.
This is pretty silly thing to say. Investment banks suffer zero reputational damage when their analysts get this sort of thing wrong. They don’t even have to care about accuracy because there will never be a way to even check this number, if anyone even wanted to go back and rate their assumptions, which also never happens.
Fair enough. I was looking for a shortcut way of saying "I find this guess credible", see also: https://news.ycombinator.com/item?id=46126597
Calling this unmotivated assumption an "estimate" is just plain lying though, regardless of the faith you have in the source of the assumption.
• 12 hours ago
I've seen a bunch of other estimates / claims of a 50-60% margin for Anthropic on serving. This was just the first one with a credible-looking link I could drop into this discussion.

The best one is from the Information, but they're behind a paywall so not useful to link to. https://www.theinformation.com/articles/anthropic-projects-7...

They had pretty drastic price cuts on Opus 4.5. It's possible they're now selling inference at a loss to gain market share, or at least that their margins are much lower. Dario claims that all their previous models were profitable (even after accounting for research costs), but it's unclear that there's a path to keeping their previous margins and expanding revenue as fast or faster than their costs (each model has been substantially more expensive than the previous model).
It wouldn't surprise me if they found ways to reduce the cost of serving Opus 4.5. All of the model vendors have been consistently finding new optimizations over the last few years.
I sure hope serving Opus 4.5 at the current cost is sustainable. It’s the first model I can actually use for serious work.
I've been wondering about this generally... Are the per-request API prices I'm paying at a profit or a loss? My billing would suggest they are not making a profit on the monthly fees (unless there are a bunch of enterprise accounts in group deals not being used, I am one of those I think)
But those AI/ML researchers, aka LLM optimization staff, are not cheap. Their salaries have skyrocketed, and some are being fought over like top-tier soccer stars and actors/actresses.
The leaders of Anthropic, OpenAI and DeepMind all hope to create models that are much more powerful than the ones they have now.

A large portion of the many tens of billions of dollars they have at their disposal (OpenAI alone raised 40 billion in April) is probably going toward this ambition—basically a huge science experiment. For example, when an AI lab offers an individual researcher a $250 million pay package, it can only be because they hope that the researcher can help them with something very ambitious: there's no need to pay that much for a single employee to help them reduce the costs of serving the paying customers they have now.

The point is that you can be right that Anthropic is making money on the marginal new user of Claude, but Anthropic's investors might still get soaked if the huge science experiment does not bear fruit.

> their investors might still take a bath if the very-ambitious aspect of their operations do not bear fruit

Not really. If the technology stalls where it is, AI still takes a sizable chunk of the dollars previously paid to coders, transcribers, translators and the like.

Surely you understand the bet Anthropic is making, and why it's a bit different than selling dollars at a discount
• myhf · 14 hours ago
Because discounted dollar bills are still a tangible asset, but churning language models are intangible?
Maybe for those of us not-too-clever ones, what is the bet? Why is it different? Would be pretty great to have like a clear articulation of this!
• shwaj · 12 hours ago
The bet, (I would have thought) obviously, is that AI will be a huge part of humanity’s future, and that Anthropic will be able to get a big piece of that pie.

This is (I would have thought) obviously different from selling dollars for $0.50, which is a plan with zero probability of profit.

Edit: perhaps the question was meant to be about how Bun fits in? But the context of this sub-thread has veered to achieving a $7 billion revenue.

The question is/was about how they intend to obtain that big piece of pie, what that looks like.
Do you know any translators? They pretty much all lost most of their clients.

Devs can write at a very fast rate with AI.

• econ · 5 hours ago
Machine translations are really good now. Early on I would translate the same sentence back and forwards while "engineering" the "prompt".

You still need to check it or at least be aware it's a translation. The problem of extra puns remains.

I don't speak any language, I deny everything.

You are saying that you can raise $7b of debt at a double-digit interest rate. I am doubtful. While $7b is not a big number, the Madoff scam was only ~$70b in total over many years.
> the Madoff scam is only ~$70b in total

Incorrect - that was the fraudulent NAV.

An estimate for true cash inflow that was lost is about $20 billion (which is still an enormous number!)

No, I'm scamming myself. Halving my fortune because I believe karma will somehow repay me tenfold some time later.
Somehow? I've been keeping an eye on my inbox, waiting to get a karma vesting plan from HN, for ages. What's this talk of somehow?
you have anthropic confused with something like lovable.

anthropic's unit margins are fine, many lovable-like businesses are not.

Or I'm just saying revenue numbers alone don't prove anything useful when you have deep pockets.
I am fairly skeptical about many AI companies, but as someone else pointed out, Anthropic has 10x'ed their revenue for each of the past 3 years: 100m->1b->10b. While past performance is no predictor of future results, their product is solid and to me it looks like they have found PMF.
Idk, I’m no business expert by any means, but I’m a hell of a lot more _scared_ of a company burning so much that it’s still losing money at $7b of revenue.
• Sephr · 9 hours ago
They don't need revenue, they need a community. I don't know how this acquisition will affect that.
Cope detected. Classification code: ANT
• rvnx · 15 hours ago
Often it happens that VCs buy out companies from funds belonging to a friend, because the selling fund wants to show performance to their investors until "the big one", or to move cash from one wealthy pocket to another.

"You buy me this, next time I save you on that", etc...

"Raised $19 million Series A led by Khosla Ventures + $7 million"

"Today, Bun makes $0 in revenue."

Everything is almost public domain (MIT) and can be forked without paying a single dollar.

Questionable to claim that the technology is the real reason this was bought.

It's an acquihire. If Anthropic is already spending significant resources to improve Bun internally, or sees that it will have to, it makes a lot of sense. No nefarious undertones required.

An analogous example off the top of my head is when Shopify hired Rafael Franca to work on Rails full-time.

If it was an acquihire, it's still a lot less slimy than just offering the employees they care about a large compensation package and leaving the company behind as a husk, like Amazon, Google and Microsoft have done recently.
Is it? What's wrong with hiring talent for a higher salary?

You have no responsibility for an unrelated company's operations; if that was important to them they could have paid their talent more.

From the acquirer’s perspective, you’re right. (Bonus: it diminishes your own employees’ ability to leave and fundraise to compete with you.)

From an ecosystem perspective, acquihires trash the funding landscape. And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward. But that isn’t relevant if the individual pay-off is big.

> And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward.

Every employee is a flight risk if you don't pay them a competitive salary; that's just FUD from VC bros who are getting their playbook (sell the company to the highest bidder and let early employees get screwed) used against them.

> Every employee is a flight risk if you don't pay them a competitive salary

Not relevant to acquihires, who typically aren’t hired away with promises of a salary but instead large signing bonuses, et cetera, and aren’t typically hired individually but as teams. (You can’t solve key man problems with compensation alone, despite what every CEO compensation committee will lead one to think.)

> that's just FUD

What does FUD mean in this context? I’m precisely relaying a personal anecdote.

> aren’t hired away with promises of a salary but instead large signing bonuses

Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary.

> aren’t typically hired individually but as teams.

So? VC bros seem to forget the labor market is also a free market as soon it hurts their cashout opportunity.

> What does FUD mean in this context? I’m precisely relaying a personal anecdote.

Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future.

> Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary

These aren't the same things, and nobody negotiating an acquisition or acquihire converts them this way. (I've done both.)

> Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future

It's a personal anecdote. There shouldn't be any uncertainty about what I personally believe. I've literally negotiated acquihires. If you're getting a multimillion dollar payout, you shouldn't be particularly concerned about your standing in the next founding team unless you're a serial entrepreneur.

As a broader online comment, invoking FUD seems like shorthand for objecting to something without knowing (or wanting to say) why.

And the secretaries, sales, project managers, etc. who get left behind because the founders and key people were taken away? In an acquisition, they may still be let go, but they would also make money from their equity.
You want those people specifically. To get them, you need to hire them for a lot more money than you pay your current folks. That causes a lot of resentment with folks and messes up things like salary bands, etc.

But since they own equity in the current company, you can give them a ton of money by buying out that equity/paying acquisition bonuses that are conditional on staying for specific amounts of time, etc. And your current staff doesn't feel left out because "it's an acquisition" the way they would if you just paid some engineers 10x or 100x what you pay them.

Who should be paying the founders more? The ones that made a deal with the VCs? They would be hired away from the company.
I left out the part that the acquirers' motivation was not to save money or to be slimy. It was the only way to get around overzealous government regulators making it harder to acquire companies.
The real risk is not that Anthropic will run out of money, but that they will change their strategy to something that isn't Bun-based, and supporting Bun won't make sense for them any more.
Is there anything you’d need from bun in the future that can’t be done by forking it?
> But how is another company that is also VC backed and losing money providing stability for Bun?

Reminds me of when Tron, the crypto company, bought BitTorrent.

• wmf · 14 hours ago
The difference is that Tron is a scam and BitTorrent Inc was nothing special either.
Match made in heaven considering BitTorrent Inc bundles crypto miners and other malware with μTorrent.
GIF of Pam from the office saying, “They’re the same picture.”
I misread it as Amazon, implying that Amazon might buy Anthropic, and I think that's what will end up happening.
In my three or four non chatbot related projects, I’ve found Amazon’s Nova models to be just as good as Anthropic’s.
Ditto, and I got to know Bun via HN. It seemed intriguing, but also "why another JS runtime" etc.

If Bun embraces the sweet spot around edge computing, modern JS/TS and AI services, I think their future ahead looks bright.

Bun seems more alive than Deno, FWIW.

I admit, it is a good acquisition announcement. I can’t remember the last acquisition announcement whose promises were kept for more than 1-2 years. Leadership changes, priorities shift…
• jjcm · 14 hours ago
One thing I like about this, besides it meaning Bun will be funded, is that Anthropic is a registered public benefit corporation. While this doesn't mean Anthropic can't fuck over the users of Bun, it at least puts in some roadblocks. The path of least resistance here should be to improve Bun for users, not to monetize it to the point where it's no longer valuable.
> Anthropic is a registered public benefit corporation

Does that mean anything at all?

OpenAI is a public benefit corporation.

I had the same impression: bottom line up front, didn’t bury the lede, no weasel language.
I wonder what this means for Deno.

Will this make it more or less likely for people to use Bun vs Deno?

And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?

Bun and Deno's goals seem quite different, I don't expect that to change. Bun is a one stop shop with an ever increasing number of built-in high-level APIs. Deno is focused on low level APIs, security, and building out a standard lib/ecosystem that (mostly) supports all JS environments.

People who like Bun for what it is are probably still going to, and same goes for Deno.

That being said I don't see how Anthropic is really adding long term stability to Bun.

I think Deno's management have been somewhat distracted by their ongoing lawsuits with Oracle over the release of the Javascript trademark.

I started out with Deno and when I discovered Bun, I pivoted. Personally I don't need the Node.js/npm compatibility. I wish there was a Bun-lite freed of the backward compatibility.

• pjmlp · 2 hours ago
Regarding Deno, to me that means their business is not really flying and they need this kind of distraction instead.

The number of people at big corps who care about their lawsuit, and would switch their IT guidelines from Node to Deno due to such heroic efforts?

Zero.

Ironically, this was early Deno - but then adoption required backwards compatibility.
I'm in a similar position.

I use Hono, Zod, and Drizzle which AFAIK don't need Node compat.

IIRC I've only used Node compat once to delete a folder recursively with rm.

What do you dislike about having node compatibility?
The bloat. I prefer lean designs with plug-in modules for additional functionality. Not only do unused sub-systems take up memory, but they also increase the potential attack surface.
> Will this make it more or less likely for people to use Bun vs Deno?

I'm not sure it will make much of a difference in the short term.

For those who were drawn to Bun by hype and/or some concerns around speed, they will continue to use Bun.

For me personally, I will continue to use Node for legacy projects and will continue using Deno for current projects.

I'm not interested in Bun for its hype (since hype is fleeting). I have a reserved interest in Bun's approach to speed, but I don't see it being a significant factor, since most JS speed concerns come from downloading dependencies (which is a once-off operation) and terrible JS framework practices (which aren't resolved by changing engines anyway).

----------------------------

The two largest problems I see in JS are:

1. Terrible security practices

2. A lack of a standard library which pushes people into dependency hell

Deno fixes both of those problems with a proper permission model and a standard library.

----------------------------

> And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?

I think any predictions between 1-10 years are going to be a little too chaotic. It all depends on how the AI bubble goes away.

But after 10 years, I can see runtimes switching from their current engines to one based on Boa, Kiesel or something similar.

Prediction: Bun is absorbed in-house and used by Anthropic as a faster/cheaper place for Claude to run code.

It fades away as a direct to developer tool.

This is a good thing for Deno.

Deno is dead. Seems like there haven't been very relevant or user-informed changes on their roadmap for year(s) now.
My first thought went to how OpenAI used Rust to build their CLI tool, and that Anthropic's CEO bought influence over Zig as a reaction.
• dkmar · 5 hours ago
Jarred just tweeted a few days ago about how little influence over zig he has, funnily enough.

https://x.com/jarredsumner/status/1994950394955665486?s=20

• 5 hours ago
• hu3 · 9 hours ago
That would require them to hire/buy the Zig team. Which is not the case.
> bought influence over Zig as a reaction

Elaborate? I believe Zig's donors don't get any influence and decision making power.

As someone who has been using Deno for the last few years, is there anything that Bun does better? Bun uses a different JavaScript engine (JSC), which is less battle-tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?). The last time I checked Bun's source code, it was... quite messy and spaghetti-like, plus Zig doesn't really offer many safety features, so it's not that hard to write incorrect code. Zig does force some safety with ReleaseSafe IIRC, but it's still not the same as even modern C++, let alone Rust.

I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.

I haven't used Deno, but I do use Bun purely as a replacement for npm. It does the hard-linking thing that seems to be increasingly common for package managers these days (i.e. it populates your local node_modules with a bunch of hard links to its systemwide cache), which makes it vastly quicker and more disk-efficient than npm for most usage.

Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.

I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?

As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.

Deno does all that. Hell, yarn does too, or pnpm as the sibling mentioned.
Sure, but pnpm is very slow compared to bun.
Deno does that. It also refrains from keeping a local node_modules at all until/unless you explicitly ask it to for whatever compatibility reason. There are plugins to things like esbuild to use the Deno resolver and not need a node_modules at all (if you aren't also using the Deno-provided bundler for whatever reason such as it disappeared for a couple versions and is still marked "experimental").
pnpm does all that on top of node. Also disables postinstall scripts by default, making the recent security incidents we've seen a non-issue.
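For reference, pnpm's allow-list for build scripts lives in package.json (a minimal sketch per pnpm's docs; `esbuild` and `sharp` are just examples of packages that legitimately need build steps):

```json
{
  "pnpm": {
    "onlyBuiltDependencies": ["esbuild", "sharp"]
  }
}
```

Packages not on the list have their preinstall/install/postinstall scripts skipped.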
• junon · 13 hours ago
As the victim of the larger pre-Shai-Hulud attack, unfortunately the install script validation wouldn't have protected you. Also, if you already have an infected package on the whitelist, a new infection in the install script will still affect you.
I’m not sure why but bun still feels snappier.
Aside from speed, what would the major selling points be on migrating from pnpm to bun?
A whitelist in package.json is only a partial assist
Are there any popular packages that require postinstall scripts that this hurts?
IIRC Bun's Zig codebase has a lot of fine-grained optimization too. I think the lead did a conference talk explaining his work. Or maybe I'm confused.
oh thanks yes, i couldn't find it, i was already lost thinking it was a conference talk by Andrew Kelley .. thanks a lot
I decided to stick with Node in general. I don't see any compelling reason to change it.

Faster install and less disk space due to hardlink? Not really all that important to me. Npm comes with a cache too, and I have the disk space. I don't need it to be faster.

With the old-school setup I can easily manually edit something in node_modules to quickly test a change.

No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.

Search for pointer exceptions or core dumps on Bun's GitHub issues and you'll see why people (should) use Deno over Bun, if only because Rust is a way more safe language than Zig.
This is a non sequitur. Both Rust and Zig, and any other language, have the ability to end in an exception state, whether it be a kernel exception, a pointer exception, or Rust's panic!.

The reason why you see so many GitHub issues about it is because that's where the development is. Deno is great. Bun is great. These two things can both be great and we don't have to choose sides. Deno has its use case; Bun has its. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs The World is getting old. Rust isn't any "safer" when Deno can panic too.

Don't make a false equivalence: how many times does one get a panic from Deno versus a segmentation fault in Bun? It's not a similar number, and it's simply wrong to say both are equally unsafe when that's plainly untrue.
Anecdotally? Zero segfaults with Bun since I started using it back in beta.
The only time I got a segfault in Bun is when I used bun:ffi to wrap glfw and wgpu-native so I could run three.js on the desktop. Ironically, the segfault was in wgpu, which is Rust. But to be fair, it was because the glfw surface had dirty flags for OpenGL and didn't have the Vulkan extensions, so anyone would have faulted.
• hu3 · 9 hours ago
I use Bun in production. Well, one of my clients.

We have yet to witness a segfault. Admittedly it's a bunch of microservices and not many requests/s (around 5k avg).

> This is a non sequitur. Both Rust and Zig and any other language has the ability to end in an exception state.

There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.

Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC languages there's degrees to memory safety.

1: https://go.dev/ref/mem

I'll take a small panic and unwind any day over a total burnout crash. Matters in code and life.
I agree. Pointing at Github issues is a strange metric to me. If we want to use that as a canary then you shouldn't use Deno (2.4k open issues) or Bun (4.5k open issues) at all.
• rvrb · 14 hours ago
I haven't verified this, but I would be willing to bet that most of Bun's issues here have more to do with interfacing with JavaScriptCore through the C FFI than with Zig itself. This is as much a problem in Rust as it is in Zig. In fact, it has been argued that writing unsafe Zig is safer than writing unsafe Rust: https://zackoverflow.dev/writing/unsafe-rust-vs-zig/
As someone who has researched the internals of Deno and Bun, your unverified vibe thoughts are flat out wrong. Bun is newer and buggier and that's just the way things go sometimes. You'll get over it.
[flagged]
  • rvrb
  • ·
  • 13 hours ago
  • ·
  • [ - ]
[flagged]
Easily bundling and serving frontend code from your backend code is very appealing: https://bun.com/docs/bundler/fullstack

Despite the page title being "Fullstack dev server", it's also useful in production (Ctrl-F "Production Mode").

I tried several times to port Node projects to Deno. Each time compatibility had "improved" but I still didn't have a working build after a few days of effort.

I don't know how Deno is today. I switched to Bun and porting went a lot smoother.

Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.

Which makes sense given that a big impetus for Deno's existence was the creator of Node/Deno (Ryan Dahl) wanting to correct things he viewed as design mistakes in Node.
My team has been using it in prod for about a year now. There were some minor bugs in the runtime's implementation of buffers in 1.22 (?), but that was about the only issue we ran into.

The nice things:

1. It's fast.

2. The standard library is great. (This may be less of an advantage over Deno.)

3. There's a ton of momentum behind it.

4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.

5. I don't have to think about JSR.

The warts:

1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.

Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.

> Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?).

JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations the only JS engine supposedly allowable in all of iOS.

It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)

It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.

I’ve been using Deno too. Although npm support has improved and it’s fine for me, I think Deno has more of a “rewrite the world” philosophy. For example, they created their own package registry [1] and their own web framework [2]. Bun seems much more focused on preexisting JavaScript projects.

[1] https://jsr.io/ [2] https://fresh.deno.dev/

It's interesting that people have directly opposite opinions on whether Deno or Bun are meant to be used with the existing ecosystem - https://news.ycombinator.com/item?id=46125049
I don’t think these are mutually exclusive takes. Bun is essentially taking Node and giving it a standard library and standard tooling. But you can still use regular node packages if you want. Whereas Deno def leaned into the clean break for a while
[dead]
At this stage I don't think either is better over the other. Deno has inexplicable high memory usage issues in Linux containers. Bun more or less suffers from the same with an added dose of segfaults.

1. https://github.com/denoland/deno/issues?q=is%3Aissue%20state... 2. https://github.com/oven-sh/bun/issues?q=is%3Aissue%20state%3...

Node.js is a no-brainer for anyone shipping a TS/JS backend. I'd rather deal with poor DX and slightly worse performance than risk fighting runtime related issues on deployment.

Linux needs to be a first-class citizen for any runtime/language toolchain.

It has wayyyyy better Node.js compatibility (a day-one goal)
As far as I know, modern Node compat in Deno is also quite great - I just import packages via 'npm:package' and they work, even install scripts work. Although I remember that in the past Deno's Node compat was worse, yes.
  • 0x457
  • ·
  • 13 hours ago
  • ·
  • [ - ]
Pretty sure one of the Deno day 1 goals was to correct mistakes made during the early days of Node.js.
  • ·
  • 12 hours ago
  • ·
  • [ - ]
I really want to like Deno and will likely try it again, but last time I did it was just a bit of a pain anytime I wanted to use something built for npm (which is most packages out there), whereas bun didn't have that problem.

There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.

Edit: Also it has a nice standard library. Not a huge win, because that stuff is also doable in Deno, but again, it's just a bit less painful

Looking at Bun's website (the comparison table under "What's different about Bun?") and what people have said here, the only significant benefit of Bun over Node.js seems to be that it's more batteries-included - a bigger standard library, more tools, some convenience features like compiling JSX and stripping TypeScript types on-the-fly, etc.

It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.

  • gre
  • ·
  • 15 hours ago
  • ·
  • [ - ]
I had memory leaks in bun and not in deno or node for the same code. ymmv
  • bcye
  • ·
  • 15 hours ago
  • ·
  • [ - ]
It just works. Whatever JavaScript/TypeScript file or dependencies I throw at it, it will run it without needing to figure out CJS or ESM, tsconfig, etc.

I haven't had that experience with deno (or node)

Same. I had a little library I wrote to wrap IndexedDB, and Deno wouldn't even compile it because it referenced those browser APIs. I'm sure it's a simple flag or config file property, or x, or y, or z, but the simple fact is, Bun didn't fail to compile.

Between that and the discord, I have gotten the distinct impression that deno is for "server javascript" first, rather than just "javascript" first. Which is understandable, but not very catering to me, a frontend-first dev.

  • bcye
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Even for server ~~java~~typescript, I almost always reach for Bun nowadays. Used to be because of typestripping, which node now has too, but it's very convenient to write a quick script, import libraries and not have to worry about what format they are in.
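For concreteness, the "what format they are in" friction is Node's CJS/ESM split. A minimal sketch of the workaround Bun makes unnecessary (node:os stands in here for a hypothetical ESM-only package):

```javascript
// From a CommonJS file, an ESM-only dependency can't be require()d;
// it has to be loaded with a dynamic import(). Bun lets you mix
// require() and import freely instead.
async function loadEsmOnlyDep() {
  const os = await import("node:os"); // dynamic import works from CJS and ESM alike
  return os.platform();
}

loadEsmOnlyDep().then((p) => console.log("platform:", p));
```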
Is JSC less tested? I thought it was used in Safari, which has some market share.

I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
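Context for the above: proper tail calls are an ES2015 feature that JavaScriptCore (Safari, Bun) ships but V8 (Node, Deno) never did. A tail call transform of the kind described rewrites self-recursion in tail position into a loop; a hypothetical sketch:

```javascript
// Tail-recursive form: JSC reuses the stack frame, so this uses O(1) stack;
// on V8 a large enough n overflows the stack instead.
function sumTo(n, acc = 0) {
  if (n === 0) return acc;
  return sumTo(n - 1, acc + n); // call in tail position
}

// After a tail-call transform: the self-call becomes a loop,
// so it runs on any engine without growing the stack.
function sumToLoop(n, acc = 0) {
  while (n !== 0) {
    acc += n;
    n -= 1;
  }
  return acc;
}

console.log(sumToLoop(1_000_000)); // 500000500000
```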

I've found it to be at least twice as fast with practically no compat issues.
Twice as fast at executing JavaScript? There's absolutely zero chance this is true. A JavaScript engine that's twice as fast as V8 in general doesn't exist. There may be 5 or 10 percent difference, but nothing really meaningful.
You might want to revise what you consider to be "absolutely zero chance". Bun has an insanely fast startup time, so it definitely can be true for small workloads. A classic example of this was on Bun's website for a while[1] - it was "Running 266 React SSR tests faster than Jest can print its version number".

[1]: https://x.com/jarredsumner/status/1542824445810642946

I only claimed there is absolutely zero chance that Bun is twice as fast at executing general JavaScript as Deno. The example doesn't give any insight into the relative speeds of Bun and Deno, as far as I can tell.

    johnfn@mac ~ % time  deno eval 'console.log("hello world")'
    hello world
    deno eval 'console.log("hello world")'  0.04s user 0.02s system 87% cpu 0.074 total
    johnfn@mac ~ % time   bun -e 'console.log("hello world")'
    hello world
    bun -e 'console.log("hello world")'  0.01s user 0.00s system 84% cpu 0.013 total
That's about 5.7x as fast (0.074s vs. 0.013s total). Yes, it's a microbenchmark. But you said "absolutely zero chance", not "a very small chance".
Keep in mind that it's not just a matter of comparing the JS engine. The runtime that is built around the engine can have a far greater impact on performance than the choice of v8 vs. JSC vs. anything else. In many microbenchmarks, Bun routinely outperforms Node.js and Deno in most tasks by a wide margin.
The claim I responded to is that Bun is "at least twice as fast" as Deno. This sounds a lot more general than Bun being twice as fast in cherry-picked microbenchmarks. I wasn't able to find any benchmark that found meaningful differences between the two runtimes for real-world workloads. (Example: https://hackernoon.com/myth-vs-reality-real-world-runtime-pe...)
Real world benchmarks include database queries and http requests? That’d quickly obviate any differences between runtimes.

Lol, yeah, this person is running a performance test on postgres, and attributing the times to JS frameworks.

It depends on what. Bun has some major optimisations. You’ll have to read into them if you don’t believe me. The graphs don’t come from nowhere
  • ·
  • 13 hours ago
  • ·
  • [ - ]
  • pjmlp
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Agreed; the language would have been interesting during the 1990s, nowadays not so much.

The tools the language offers to handle use-after-free are hardly any different from using Purify or Insure++ back in 2000.

  • defen
  • ·
  • 14 hours ago
  • ·
  • [ - ]
I find comments like this fascinating, because you're implicitly evaluating a counterfactual where Bun was built with Rust (or some other "interesting" language). Maybe Bun would be better if it were built in Rust. But maybe it would have been slower (either at runtime or development speed) and not gotten far enough along to be acquired by one of the hottest companies in the world. There's no way to know. Why did Anthropic choose Bun instead of Deno, if Deno is written in a better language?
  • pjmlp
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Because maybe they reached out to them, and they didn't take the money, while the Bun folks' business model wasn't working out?

Who knows?

Besides, how are they going to get back the money spent on the acquisition?

Many times the answer to acquisitions has nothing to do with technology.

  • defen
  • ·
  • 14 hours ago
  • ·
  • [ - ]
> Claude Code, FactoryAI, OpenCode, and others are all built with Bun.

Anthropic chose to use Bun to build their tooling.

  • pjmlp
  • ·
  • 13 hours ago
  • ·
  • [ - ]
We can imagine them making Bun an internal tool, pushing roadmap items that fit their internal products, whatever, but that doesn't answer how they make back the money spent on the acquisition.

Profit in those products has to justify now having their own compiler team for a JavaScript runtime.

> Why did Anthropic choose Bun instead of Deno, if Deno is written in a better language?

Something about moral and philosophical flexibility.

  • n42
  • ·
  • 12 hours ago
  • ·
  • [ - ]
Don't engage with this guy, he shows up in every one of these threads to pattern match back to his heyday without considering any of the nuance of what is actually different this time.
  • pjmlp
  • ·
  • 12 hours ago
  • ·
  • [ - ]
Look an admirer!
> I'll admit I'm somewhat biased against Bun?

Why? Genuine question, sorry if it was said/implied in your original message and I missed it.

Good question, hard to say, but I think it's mainly because of Zig. At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
> At its core Zig is marketed as a competitor to C, not C++/Rust/etc

What gives you this impression?

I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.

Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads despite any factual basis.

I don't mean to disparage you in particular, this is like the 1000th time I've seen this.

  • troad
  • ·
  • 4 hours ago
  • ·
  • [ - ]
You have pretty explicitly framed Zig as a C replacement yourself, e.g.: https://www.youtube.com/watch?v=Gv2I7qTux7g

More broadly, I think the observation tends to get repeated because C and Zig share a certain elegance and simplicity (even if C's elegance has dated). C++ is many things, but it's hardly elegant or simple.

I don't think anyone denies that Zig can be a C++ replacement, but that's hardly unusual, so can many other languages (Rust, Swift, etc). What's noteworthy here is that Zig is almost unique in having the potential to be a genuine C replacement. To its (and your) great credit, I might add.

>> At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.

@GP: This is not a great take. All four languages are oriented around manual memory management. C++ inherits all of the footguns of C, whereas Zig and Rust try to sand off the rough edges.

Manual memory management is and will always remain necessary. The only reason someone writing JS scripts doesn't need to worry about managing their memory is that someone has already done that work for them.

  • ·
  • 4 hours ago
  • ·
  • [ - ]
I've got to love that the author of the thing can show up and say “Why?! I never said any of that!”

A lot of stuff related to older languages is lost in the sands of time, but the same thing isn’t true for current ones.

Rust is more of a competitor to C++ than C. Manual memory management is sometimes really helpful and necessary. Zig has a lot of safety features.
I mean, they said they looked at the source code and thought it was gross, so there’s a justification for their concern, at least.
I always figured Bun was the "enterprise software" choice, where you'd want to use Bun tools and libraries for everything and not need to bring in much from the broader NPM library ecosystem.

Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.

If you want to download open source libraries to be used in your Bun project then they will come from npm, at least by default. [1].

So it seems odd to say that Bun is less dependent on the npm library ecosystem.

[1] It’s possible to use jsr.io instead: https://jsr.io/docs/using-packages

Yes, both can pull in open source libraries and I can't imagine either dropping that ability. Though they do seem to have different levels of eagerness and competence around Node compatibility, and Bun seems better on that front.

From a long-term design philosophy perspective, Bun seems to want a sufficiently large core and standard library that you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun-specific features won't run on Node. It's the "embrace, extend, ..." approach.
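As a hypothetical illustration of that divergence (Bun.file is a real Bun API, but the shim below is just a sketch): code that wants Bun's fast path while staying runnable on Node has to feature-detect the runtime.

```javascript
// Hypothetical portability shim: Bun.file() exists only in Bun,
// so detect the runtime and fall back to node:fs on Node.
const isBun = typeof Bun !== "undefined";

async function readText(path) {
  if (isBun) {
    return Bun.file(path).text(); // Bun-only fast path
  }
  const { readFile } = await import("node:fs/promises");
  return readFile(path, "utf8"); // portable Node fallback
}
```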

Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries, rather than replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.

That’s true of some parts of Deno’s standard libraries, but major functionality like Deno.test and Deno.serve are Deno-specific API’s.

Here are the Bun API’s:

https://bun.com/docs/runtime/bun-apis

Here are the Deno API’s:

https://docs.deno.com/api/deno/

Stopped following Deno while they were rejecting the need for a package management solution. Used Bun instead.
  • croes
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Isn’t that because packages are one of the problems Deno tried to fix?
They tried to realign package management with web standards and tools that browsers can share (URLs and importmaps and "cache, don't install"). They didn't offer compatibility with existing package managers (notably and notoriously npm) until late in that game and took multiple swings at URL-based package repositories (deno.land/x/ and JSR), with JSR eventually realizing it needed stronger npm compatibility.
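Concretely, the standards-based approach means dependencies are declared in an import map rather than installed; a sketch of a deno.json (the packages and versions here are illustrative):

```json
{
  "imports": {
    "@std/path": "jsr:@std/path@^1.0.0",
    "lodash": "npm:lodash@^4.17.21"
  }
}
```

On first run Deno fetches and caches these globally instead of creating a node_modules directory; the jsr: and npm: specifiers are the compatibility layers that arrived later.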

Bun did prioritize npm compatibility earlier.

Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.

  • dmit
  • ·
  • 14 hours ago
  • ·
  • [ - ]
> is there anything that Bun does better?

Telling prospective employees that "if you're not ready to work 60-hour weeks, then what the fuck are you doing here?", for one.

> Zig does force some safety with ReleaseSafe IIRC

which Bun doesn't use, choosing to go with `ReleaseFast` instead.

Is it just me, or is npm not that slow? Sure, it's not a speed demon, but I rarely need to run npm install anyway, so it's not a bottleneck for me.

For deploy, usually running the attached terraform script takes more time.

So while a speed increase is welcome, I don't feel it gives me much of a boost.

The speed shows up for large projects. Especially if you end up with multiple node_modules directories in your dev sandbox.
I've been using Bun since 2022 just to be trendy for recruitment (it worked, and still works despite it almost being 2026)

Bun is fast, and its worked as a drop in replacement for npm in large legacy projects too.

I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun wasn't one of them

https://github.com/aws/aws-cdk/issues/31753

This wasn't fixed until the end of 2024 and, as you can see, the fix was only accidentally merged in but tolerated. It was promptly broken by a Bun breaking change

https://github.com/aws/aws-cdk/issues/33464

but don't let Amazon's own incompetency be the confirmation bias you were looking for about using a different package manager in production

you can use SST to deploy cloud resources on AWS and any cloud, and that package works with bun

  • qjack
  • ·
  • 11 hours ago
  • ·
  • [ - ]
Anthropic has been trying to win the developer marketshare, and has been quite successful with Claude Code. While I understand the argument that this acquisition is to protect their usage in CC or even just to acquire the team, I do hope that part of their goal is to use this to strengthen their brand. Being good stewards of open source projects is a huge part of how positively I view a company.
> Being good stewards of open source projects is a huge part of how positively I view a company.

Maybe an easier first step would be to open source Claude Code...?

I think because their models are open (e.g. CC can send any instruction and it’ll use your max plan), they need to keep the code obfuscated to prevent people from sending everybody and their mother through that API.

Codex has the opposite issue. It has an open client, which is relatively pointless, because it will accept only one system prompt and one prompt only.

  • mcdow
  • ·
  • 10 hours ago
  • ·
  • [ - ]
From the comments here it sounds like most people think the amount Anthropic paid for the company was probably not much more than the VC funding which Bun raised.

How would the payout split work? It wouldn’t seem fair to the investors if the founder profited X million while the investors get their original money returned. I understand VC has the expectation that 99 out of 100 investments will net them no money. But what happens in the cases where money is made, but it just isn't profitable for the VC firm?

What’s to stop everyone from doing this? Besides integrity, why shouldn’t every founder just cash out when the payout is life-changing?

Is there usually some clause in the agreements like “if you do not return X% profit, the founder forfeits his or her equity back to the shareholders”?

All VCs have preferred shares, meaning that in a liquidation like this one, they get their investment back, and then the remainder gets shared.

Additionally, depending on the round, they may also have multiples, like 2x, meaning they get at least 2x their investment before anyone else gets anything
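A worked example of how those terms combine, with entirely made-up numbers:

```javascript
// Illustrative only: $20M invested as participating preferred with a 1x
// liquidation preference; the company sells for $50M; investors hold 30%.
const invested = 20e6;
const exitPrice = 50e6;
const stake = 0.30;

// The 1x preference comes off the top first (capped by the sale price)...
const preference = Math.min(invested, exitPrice);

// ...then the remainder is split pro rata with the common shareholders.
const proRata = (exitPrice - preference) * stake;

const investorTotal = preference + proRata;    // 20M + 9M = 29M
const commonTotal = exitPrice - investorTotal; // 21M left for founders/employees
console.log(investorTotal, commonTotal); // 29000000 21000000
// With a 2x multiple the preference alone would be $40M,
// leaving only $10M to share with common.
```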

Probably not much more than their valuation, which is the key difference since the investor will still get a net return.
Bun is pretty cool. I maintain a Node.js library and updated my Node.js engine version and my library just didn't work on the latest version... In frustration, I decided to try Bun for the first time... I had never used it before but my library worked straight away, no warnings, no errors. I have never seen that level of compatibility before when a library works better with an alternative engine than the one it was designed for.

I did end up fixing Node.js compatibility later but it was extra work. Felt like they just created busy-work. Node.js maintainers should stop deprecating perfectly good features and complicating their modules.

This acquisition makes no sense.

Investors must be happy because Bun never had to find out how to become profitable.

  • baq
  • ·
  • 15 hours ago
  • ·
  • [ - ]
It’s enough that Anthropic finds it profitable to run Claude Code on it.
Hard to say it makes no sense when you don't know how much they were acquired for. I would guess it is a trivial amount relative to Anthropic's war chest.
> This acquisition makes no sense.

except this sense:

> Investors must be happy because Bun never had to find out how to become profitable.

But what is the upside for anthropic?
I've seen a few of these seemingly random acquisitions lately, and I congratulate the companies and individuals that are acquired during this gold rush, but it definitely feels awkwardly artificial.
Quote from the CEO of Anthropic in March 2025: "I think we'll be there in three to six months where AI is writing 90% of the code and then in 12 months we may be in a world where AI is writing essentially all of the code"
I think this wound up being close enough to true, it's just that it actually says less than what people assumed at the time.

It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
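For a sense of scale, the kind of disposable script meant here is a few lines; a hypothetical example using the `performance` global that Node, Deno, and Bun all provide:

```javascript
// Ad-hoc micro-benchmark of the throwaway sort described above:
// compare two string-building approaches, no framework needed.
function bench(label, fn, iters = 100_000) {
  const t0 = performance.now();
  for (let i = 0; i < iters; i++) fn(i);
  const ms = performance.now() - t0;
  console.log(`${label}: ${ms.toFixed(2)}ms for ${iters} iterations`);
  return ms;
}

bench("array join", (i) => ["id", i].join("-"));
bench("template literal", (i) => `id-${i}`);
```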

Put another way, at least in my workflow and at my workplace, the volume of code has increased, and most of that increase comes from new code that would not have been written if not for AI, and a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, it's uneven penetration, AI helps more with tasks that are well-described in the training set (webapps, data science, Linux admin...) compared to e.g. issues arising from quirky internal architecture, Rust, etc.

That's ridiculous. No, it isn't even close.
At an individual level, I think it is for some people. Opus/Sonnet 4.5 can tackle pretty much any ticket I throw at it on a system I've worked on for nearly a decade. Struggles quite a bit with design, but I'm shit at that anyway.

It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.

Sonnet 3.7 wasn't quite at this level, but we are now. You still have to know what you're doing, mind you, and there's a lot of ceremony in tweaking workflows, much like there was for editors. It's not much different from instructing juniors.

  • mjr00
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Why didn't they just use AI to write their own Bun instead of wasting 8-9 figures on this company? Makes no sense.
From the article, Claude Code is being used extensively to develop Bun already.

> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.

You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.

  • mjr00
  • ·
  • 13 hours ago
  • ·
  • [ - ]
> You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.

Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.

Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!

They likely have other things to do.
"Wasting" is doing a lot of work in that sentence.

They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.

Let me refer you back to the GP, where the CEO of Anthropic says AI will be writing most code in 12 months. I think the parent comment you replied to was being somewhat facetious.
Because 90% is not 100%.
Deciding what to Implement

and

Implementing the Decisions

are complementary; one of these is being commoditised.

And, in fact, decimated.

Personally I am benefitting almost beyond measure because I can spend my time as the architect rather than the builder.

Same. I don’t understand how people aren’t getting this yet. I’m spending all day thinking, planning and engineering while spending very little time typing code. My productivity is through the roof. All the code in my commits is of equal quality to what I would produce myself, why wouldn’t it be? Sure one can just ask AI to do stuff and not review it and iterate, but why on earth would one do that? I’m starting to feel that anyone who’s not getting this positive experience simply isn’t good at development to begin with.
Maybe he was correct in the extremely literal sense of AI producing more new lines of code than humans, because AI is no doubt very good at producing huge volumes of Stuff very quickly, but how much of that Stuff actually justifies its existence is another question entirely.
Why do people always stop this quote at the breath? The rest of it says that he still thinks they need tech employees.

> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced

(He then said it would continue improving, but this was not in the 12 month prediction.)

Source interview: https://www.youtube.com/live/esCSpbDPJik?si=kYt9oSD5bZxNE-Mn

I actually like Claude Code, but that was always a risky thing to say (actually I recall him saying their software is 90% AI produced) considering their CLI tool is literally infested with bugs. (Or at least it was last time I used it heavily. Maybe they've improved it since.)
Do you have a source for the quote?
Is this why everyone only seems to know the first half of Dario's quote? The guy in that video is commenting on a 40 second clip from twitter, not the original interview.

I posted a link and transcription of the rest of his "three to six months" quote here: https://news.ycombinator.com/item?id=46126784

  • ·
  • 11 hours ago
  • ·
  • [ - ]
Thank you.
I'm curious what people think of quotes like these. Obviously it makes an explicit, falsifiable prediction. That prediction is false. There are so many reasons why someone could predict that it would be false. Is it just optimistic marketing speech, or do they really believe it themselves?
Everybody knows that marketing speech is optimistic. Which means if you give realistic estimates, then people are going to assume those are also optimistic.
Why didn't they have the AI write a JS runtime instead of this acquisition?
The big picture of “build a runtime” is an easier idea than “what would make this runtime better and how should the parts interact”.
Given the horrible stability of Windows this year, it seems like Microsoft went all in on that
Accurate for me. Accurate for basically every startup from the past 12 months. Prob not for legacy codebases, though.
It’s writing 90% of my code now but it’s 100% reliant on me to do that effectively.
AI writes about 90% of my code.
What languages and frameworks? What is the domain space you're operating in? I use Cursor to help with some tasks, but mainly only use the autocomplete. It's great; no complaints. I just don't ever see being able to turn over anywhere close to 90% with the stuff we work on.
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI.

You can see my site here, if you'd like: https://chipscompo.com/

  • pjmlp
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Only 10% to go for a full replacement.
Probably about 95% of mine now. Much better than I could for the most part.
Weird, AI writes terrible code for me that would never pass a code review. I guess people have different standards for good code.
Hah. It can’t be “I need to spend more time to figure out how to use these tools better.” It is always “I’m just smarter than other people and have a higher standard.”
Show us your repos.
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI.

It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/

  • ·
  • 11 hours ago
  • ·
  • [ - ]
Spot on.
The tools produce mediocre code, usually working in the most technical sense of the word, and most developers are pretty shit at writing code that doesn't suck (myself included).

I think it's safe to say that people singularly focused on the business value of software are going to produce acceptable slop with AI.

  • sulam
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Or maybe he's working in a space that is less out of distribution than the work you're doing?
You’re right, I’m not making a nextjs/shadcn/clerk/vercel ai wrapper startup.
I don't remember saying I worked with nextjs, shadcn, clerk (I don't even know what that one is), vercel or even JS/TS so I'm not sure how you can be right but I should know better than to feed the trolls.
  • ·
  • 11 hours ago
  • ·
  • [ - ]
  • ·
  • 7 hours ago
  • ·
  • [ - ]
I suspect you do not know how to use AI for writing code. No offence intended - it is a journey for everyone.

You have to be set up with the right agentic coding tool, agent rules, agent tools (MCP servers), dynamic context acquisition, and a workflow (working with the agent from a plan rather than simply prompting and hoping for the best).

But if you're lazy and don't put in the effort to understand what you're working with and how to approach it with an engineering mindset, you'll be left on the outside complaining and telling people how it's all hype.

Always the same answer: it's the user's fault, not the AI being blown out of proportion. Tell me, where are all those great, amazing applications that were coded 95-100% by AI? Where are the great progress, the great new algorithms, the great new innovations hiding?
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI. Probably 90-95% of it is AI driven.

It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/

From the link:

"For now, I’ll go dogfood my shiny new vibe-coded black box of a programming language on the Advent of Code problem (and as many of the 2025 puzzles as I can), and see what rough edges I can find. I expect them to be equal parts “not implemented yet” and “unexpected interactions of new PL features with the old ones”.

If you’re willing to jump through some Python project dependency hoops, you can try to use FAWK too at your own risk, at Janiczek/fawk on GitHub."

That doesn't sound like some great success. It mostly compiles and doesn't explode. Also I wouldn't call a toy "innovation" or "revolution".

  • ·
  • 12 hours ago
  • ·
  • [ - ]
How many agents, tools, MCP & ACP servers, claude hooks, and workflows do I need to set up before English becomes a good programming language?
  • ·
  • 13 hours ago
  • ·
  • [ - ]
One agent, a few sub-agents, 1 MCP server, no "ACP" (never seen that used), no hooks, one workflow that I usually follow.
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Do you know of any YouTube videos where you would say they do a very good job of showing off this style of coding?
I made this one recently: https://www.youtube.com/watch?v=qy4ci7AoF9Y - notes here: https://simonwillison.net/2025/Nov/6/upgrading-datasette-plu...

My best writing on this topic is still this though (which doesn't include a video): https://simonwillison.net/2025/Mar/11/using-llms-for-code/

Thanks for this! I've been looking for a good guide to an LLM based workflow, but the modern style of YouTube coding videos really grates on me. I think I might even like this :D
Always enjoy reading your blog Simon!
This one is a bit old now so a number of things have changed (I mostly use Claude Code now, Dynamic context (Skills) etc...) but here's a brief TLDR I did early this year https://www.youtube.com/watch?v=dDSLw-6vR4o
Post a repo
https://github.com/Aeolun/cool-rust-terminal

I was honestly baffled how fast Claude knocked this out.

How much time do you think you saved versus writing it yourself if you factored in the time you spent setting up your AI tooling, writing prompts, contexts etc?
Your best example of something you made with AI is another AI code generator… definitely not beating the AI bubble allegations anytime soon.
1. I didn't say it was a best example, I replied to a comment asking me to "Post a repo" - I posted a repo. 2. Straw man argument. I was asked for a repo, I posted a repo and clearly you didn't look at the code as it's not an "AI code generator".
1. I didn’t ask for a repo. 2. Still wasn’t me. Maybe an AI agent can help you check usernames? 3. Sorry, a plugin for an AI code generator, which is even worse of an example.
it boils down to: we didn't have full conviction that over the long run we'd prove superior to Node.js; however, an AI company burning a lot of cash has invested in us by basing their toolchain on us, so they had no option but to acqui-hire us.
  • hu3
  • ·
  • 9 hours ago
  • ·
  • [ - ]
quite the uncharitable take.

On the opposite end of the spectrum, it's just that Claude and Bun are great technologies that joined forces.

I don't really see how Bun fits as an acquisition for an AI company. This seems more like "we have tons of capital and we want to buy something great" than "Bun is essential to our core business model".
If Anthropic wants to own code development in the future, owning the full platform (including the runtime) makes sense.

Programming languages all are a balance between performance/etc and making it easy for a human to interact with. This balance is going to shit as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).

Even outside of code development, Anthropic seems to be leaning very strongly into code interpreters over native tool calling for advancing agentic LLM abilities (e.g. their "skills" approach). Given that those necessitate a runtime of sorts, and that owning a runtime like Bun could let them integrate that functionality into their products very seamlessly, this acquisition doesn't seem like the worst idea.
They will own it, and then what? Will Claude Code end every response with "by the way, did you know that you can switch to bun for 21.37x faster builds?"
They're baking the LORA as we speak, and it'll default to `bun install` too

   "the full platform"
there are more languages than ts though?

Acquisition of Apple Swift division incoming?

TypeScript is the most popular programming language on the most popular software hosting platform though, owning the best runtime for that seems like it would fit Pareto's rule well enough:

https://github.blog/news-insights/octoverse/octoverse-a-new-...

I think there's a potential argument to be made that Anthropic isn't trying to make it easier to write TS code, but rather that their goal is a level higher and the average person wouldn't even know what "language" is running it (in the same way most TS devs don't need to care the many layers their TS code is compiled via).
  • ·
  • 15 hours ago
  • ·
  • [ - ]
According to a JetBrains dev survey (I forget the year) roughly 58% of devs deploy to the web. That's a big money pie right there.
Bun isn’t on the web. It’s a server runtime.
It's a JS runtime, not specifically for servers though? They can essentially bundle Claude Code with this, instead of ever relying on someone installing NodeJS and then running npm install.

Claude will likely be bundled up nicely with Bun in the near future. I could see this being useful to let even a beginner use claude code.

Edit:

Lastly, what I meant originally is that most front-end work happens with tools like Node or Bun. At first I was thinking they could use it to speed up generating / pulling JS projects, but it seems more likely Claude Code and Bun will have a joint project where they integrate the two: Claude Code taking full advantage of Bun itself, and Bun focusing on tight coupling to ensure Claude Code runs optimally.

They could do that already, nothing in the license prohibited them from doing so.
Sure, but Bun was funded by VCs and needed to figure out how to monetize. What Anthropic did is ensure it stays maintained, and now they have fresh talent to improve Claude Code.
Server here I used loosely - it obviously runs on any machine (eg if you wanted to deploy an application with it as a runtime). But it’s not useful for web dev itself which was my point.

Frontend work by definition doesn't happen with either Node or Bun. Some frontend tooling might use a JS runtime, but the value add of that is minimal, and a lot of JS tooling is actually being rewritten in Rust for performance anyway.

  • ·
  • 15 hours ago
  • ·
  • [ - ]
Why acquire Swift when you can write iOS apps in Typescript instead?
Which would use something like Bun ;)
It doesn't make sense, and you definitely didn't say why it'd make sense... but enough people are happy enough to see the Bun team reach an exit (especially one that doesn't kill Bun) that I think the narrative that it makes sense will win out.

I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being hostage to VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.

(actually this reminds me of Harry giving Dobby a sock: on so many levels!)

  • logsr
  • ·
  • 13 hours ago
  • ·
  • [ - ]
Claude Code running on Bun is an obvious justification, but Bun's features (high-performance runtime, fast starts, native TS) are also important for training and inference. For instance, in inference you develop a logical model in code that maps to a reasoning sequence, then execute the code to validate and refine the model, then use this to inform further reasoning. Bun, which is highly integrated and highly focused on performance, is an ideal fit for this. Having Bun in house means you can use the feedback from all that automation-driven execution of Bun to drive improvements to its core.
Looks like they are acquiring the team rather than the product
No, they're clearly acquiring the technology. They're betting Claude Code on Bun; they have a vested interest in the health of Bun.
Why would they want to bet on nascent technology when Node.js has existed for a good 15 years?
Because they needed something that could produce a single binary that works on every platform. They started shipping Claude Code with Bun back in July: https://x.com/jarredsumner/status/1943492457506697482
Every time I see people mention things like this in node vs bun or deno conversations I wonder if they even tried them.

>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.

>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.

Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:

> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe

> [32ms] bundle 60 modules

> [439ms] compile dist/myprogram.exe

it detects my dynamic imports of JSON assets (language files, default configuration) and bundles them accordingly into the executable. I don't need a separate file to declare assets, declare imports, or do anything other than run this command line. I don't need to survey the various bundlers to find one that works with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.

Node is death by a thousand cuts compared to the various experiences offered by Bun.

Node adds quite a bit of startup latency over Bun too, and is just not pleasant for making CLI scripts.
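For reference, the Node.js SEA flow quoted above is driven by a separate config file along these lines (field names are from the Node docs; the paths are illustrative). That file then has to be turned into a blob with `node --experimental-sea-config`, injected into a copy of the `node` binary with `postject`, and kept in sync with a CJS bundle you produce with a separate bundler:

```json
{
  "main": "dist/bundled.cjs",
  "output": "dist/sea-prep.blob",
  "disableExperimentalSEAWarning": true,
  "assets": {
    "lang-en.json": "./assets/lang-en.json",
    "default-config.json": "./assets/default-config.json"
  }
}
```

Each asset then has to be retrieved explicitly via `sea.getAsset()` at runtime, versus Bun's automatic detection of dynamic imports.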

I agree, they seem to have never tried it at all! Bun DX is the best, and Bun is the trend setter. Others are just catching up!
They evidently evaluated Node.js in comparison to Bun (and Deno) earlier this year and came to a technical decision about which one worked best for their product.
I highly doubt that the JS ecosystem is driven mostly by hype, so I highly doubt the Node.js solution was even put on the table in an internal issue tracker.
Claude Code shipped on top of Node.js for the first four months of its existence.

Why wouldn't they consider their options for bundling that version into a single binary using Node.js tooling before adopting Bun?

  • jitl
  • ·
  • 15 hours ago
  • ·
  • [ - ]
it starts fast and does a better job than Node.js for their product
Because Microsoft already owns that.
Are you referring to node? MS doesn't own that. It's maintained by Joyent, who in turn is owned by Samsung.
Joyent handed Node.js over to a foundation in 2015, and that foundation merged into the JS Foundation to become the OpenJS Foundation in 2019.

I'm not sure if Joyent have any significant role in Node.js maintenance any more.

Oops, thank you :)

regardless, it's certainly not MS.

Microsoft owns npm outright and controls every aspect of the infrastructure that node.js relies on. It also sits on the board (and is one of the few platinum members) of the Linux Foundation, which controls openjs. It is certainly MS.
  • ·
  • 15 hours ago
  • ·
  • [ - ]
That was my thinking: this would be useful for Claude Code.
  • rvz
  • ·
  • 15 hours ago
  • ·
  • [ - ]
It does actually.

Claude Code is a 1B+ cash machine and Anthropic directly uses Bun for it.

Acquiring Bun lowers the risk of the software being unmaintained as Bun made $0 and relied on VC money.

Makes sense, but this is just another day in San Francisco of a $0 revenue startup being bought out.

Does this acquisition mean Claude Code the CLI is more valuable than the entirety of Bun?
Claude Code has an annual run rate of $1bn. Bun currently has an annual run rate of $0.
It certainly generated more revenue, so this is not surprising?
> It certainly generated more revenue, so this is not surprising?

Anything is greater than 0

except for losing money?
No, just that the people who lent Bun 7 million dollars want some of it back...
  • ·
  • 15 hours ago
  • ·
  • [ - ]
What is the business model behind open source projects like Bun? How can a company "acquire" it, and why would it do that?

In the article they write about the early days

    We raised a $7 million seed round
Why do investors invest into people who build something that they give away for free?
The post mentions why - Bun eventually wanted to provide some sort of cloud-hosting saas product.
Everyone could offer a cloud-hosted saas product that involves bun, right?

Why invest in a company that has the additional burden of developing Bun? Why not in a company that does only the hosting?

The standard argument here is that the maintainers of the core technology are likely to do a better job of hosting it because they have deeper understanding of how it all works.

There's also the trick Deno has been trying, where they can use their control of the core open source project to build features that uniquely benefit their cloud hosting: https://til.simonwillison.net/deno/deno-kv#user-content-the-...

Hosting is a commodity. Runtimes are too. In this case, the strategy is to make a better runtime, attract developers, and eventually give them a super easy way to run their project in the cloud, e.g. `bun deploy`, which is a reserved no-op command. I really like Bun's DX.
Yep. This strategy can work, and it has also backfired before, like with Docker trying to monetize something they gave away for free.
Except Amazon would beat them to it
Free now isn't free forever. If something has inherent value then folks will be willing to pay for it.
  • aizk
  • ·
  • 3 hours ago
  • ·
  • [ - ]
Well, if they suddenly changed the license, we'd get a new Redis --> Valkey situation. Or even more recently, look at minio no longer maintaining their core open source project!
I mean if you're getting X number of users per day and you don't need to pay for bandwidth or anything, there's gotta be SOME way to monetize down the line.

Whether your userbase or the current CEO likes it or not.

Ads. Have you seen the dotenv JavaScript package?
Either for a modest return when it sells or as a tax write off when it fails.
  • bcye
  • ·
  • 14 hours ago
  • ·
  • [ - ]
VCs do not invest for a modest return.
No, but faced with either a loss or a modest return, they'll take the modest return (unless it's more beneficial to not come tax season). Unicorns are called unicorns for a reason.
  • bcye
  • ·
  • 10 hours ago
  • ·
  • [ - ]
The question was why do investors invest
Look, Bun is a great product, but there's something hilarious about the company that is "going to replace all software developers with AI" BUYING software. You are building a product that is supposed to make software cost 0, right? Why wouldn't you just "vibe" code yourself a Bun?
I think you’re confused.

> going to replace all software developers with AI

No?

> building a product that is supposed to make software cost 0 right

No?

Those aren't your customers. The people that want to build things with bun are. The problem with people who already know how to code is that they have opinions if they actually read the generated code. If you sell to people who don't (for whatever reason), you face less criticism.
Anyone know how much Anthropic paid for Bun? I assume it was at least $26M, so Bun could break even and pay back its own investors, but I didn't see a number in the announcements from Anthropic or Bun.
Claude said $187M
"Node.js compatibility & replacing Node.js as the default server-side runtime for JavaScript"

Except Node's author already wrote its replacement: Deno.

That's good news. I hope this will encourage the industry to use the Zig language (and its creators to release version 1.0).
>If most new code is going to be written, tested, and deployed by AI agents

That perspective following “in two-three years” makes me shudder, honestly.

I use Claude Code CLI daily - it's genuinely changed how I work. The $1B number sounds crazy but honestly tracks with how good the tool is. Curious how Bun integration will show up in practice beyond the native installer.
Doesn’t sound crazy at all? My Max subscription costs me more than all the other netflix/spotify etc combined, but I pay it happily, and spotify would go before Claude does.
  • iyn
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Curious about the deal value/price — any clues whether it was just to make existing investors even (so say up to $30M) or are we talking some multiple? But if it's a multiple, even 2x sounds a bit crazy.
One option is that the current Bun shareholders didn't see a profitable future, didn't much care about being made whole, and found a return of the remaining cash adequate.

Another option is that this was an equity deal where Bun shareholders believe there is still a large multiple's worth of potential upside in the current Anthropic valuation.

Plus many other scenarios.

i don’t get it either - bun being the foundation of tons of AI tools is like the best possible outcome; what were they hoping for when they raised the money? Or is this just an admission of “hey, that was silly, we need to land this however we can”? Or do they share major investors, and therefore this is just a consolidation? (Edit: indeed, KP invested $100M in Anthropic this year. I’m also confused - the article states Bun raised $26M but the KP seed round was $7M; did they do the A too but unannounced? Notably, the seed was summer 2022 and ChatGPT launched Nov 30, so the world is different; did the hypothesis change?)
  • asim
  • ·
  • 15 hours ago
  • ·
  • [ - ]
It's more honest than the Replicate answer, but I think inevitably, if you can't raise the next round and you get distracted by the shiny AI, this is the path taken by many teams. There is absolutely nothing wrong with that. There was an exuberant time when all the OSS things were getting funded, and now all the AI things get funded. For many engineer founders, it's a better fit to go build deep technical stuff inside a bigger company. If I had that chance I would probably have taken it too. Good luck to the Bun team!
Genuine question: why js?

Why not something like C#: native, fast, cross-platform, strongly typed, great tooling, supports both scripting (i.e. single file-based) and compiling to a binary with no dependency whatsoever (NativeAOT), great errors and error stacks, the list goes on.

All great for AI to recover during its iterations of generating something useful.

Genuinely perplexed.

AIs are good at JS basically because there is a ton of JS code available publicly without usage restrictions: the JS code published to be executed in your browser. Most JS code attached to web pages has no explicit license, but the implicit license is that anyone can download it and run it. Same for HTML and CSS. So using that public code to train models is a no-brainer.
If I was to pick a language, I'd pick the one all developers agree is the best.
Ahahahahhahahahahhahahahaahaha. Please tell me this is tongue-in-cheek and just more subtle than I give HN credit for. Please.
Not all devs, not even most, but I certainly think this
Sadly, this will be the trend moving forward. JS is perceived as a good language, and LLMs are meant to make it even easier to write. It is not about the merits of a language; it's about which languages LLMs are "good" at.
  • jitl
  • ·
  • 8 hours ago
  • ·
  • [ - ]
There’s like 100x more JS developers than C# developers. JS can also start running code very quickly, whereas with an AOT language you have to compile it first. For tool calls, eval-as-a-service, and running in the browser, JS is far ahead of C#.
Same reason AIs also use Python, and DBMSes easily offer JS or Python UDFs: interpreted languages take no build time and are more portable. JS is also very popular.

Might also be a context window thing. Idk how much boilerplate C# has, but others like Java spam it.

  • hoppp
  • ·
  • 12 hours ago
  • ·
  • [ - ]
Because JS became an everything language that everyone can write, and it's the only language you ever need.

I dislike it also..

One other angle not yet mentioned: JS is browser-native. No matter how slow it is, the browser is now the lowest common denominator. A shared server-client codebase, while ugly, is another plus.
You could make a better argument for Go (compiles to native for multiple targets, zero actual dependencies: no need for a platform or virtual machine on the target).
C# has AOT compilation producing native, single file assemblies. A bit behind on this compared to Go, but it's there.
C# no longer requires .net installed or bundled inside exe.

Like I’ve said: NativeAOT

https://learn.microsoft.com/en-us/dotnet/core/deploying/nati...

Go is the most portable compiled language out there and makes a lot of compromises with the interpreted lang world. But it's got its own issues.
>zero actual dependencies

on Linux only with CGO_ENABLED=0, and good luck using some non-web-related 3rd-party module with CGO disabled.

  • aizk
  • ·
  • 3 hours ago
  • ·
  • [ - ]
I thought c# was a dead language at this point?
Atwood’s Law
  • blixt
  • ·
  • 10 hours ago
  • ·
  • [ - ]
Extrapolating and wildly guessing: we could end up using all that mostly idle CPU/RAM (the non-VRAM) on the beefy GPU machines doing inference for agentic loops where the AI runs small JS scripts in a sandbox, essentially allowing multiple turns to happen before yielding to the API caller. Bun is the best fit for this, with its faster startup times, lower RAM use, and extensive native bindings that Node.js/V8 lack. It would also go well with the advanced tool use Anthropic recently announced. This would be a big competitive advantage in the age of agents.
I almost read this as anthropic will be using our idle CPU/GPU resources for their own training tasks ;)
This somewhat answers the question of "how on earth is a JS runtime company going to profit?"
Shopify should buy Ruby on Rails because they depends on it
  • hu3
  • ·
  • 9 hours ago
  • ·
  • [ - ]
didn't they try a hostile takeover of the ruby gems thing (forgot the name)?
  • ctoth
  • ·
  • 15 hours ago
  • ·
  • [ - ]
This decision is honestly very confusing to me as a constant user of Claude Code (I have 3 of them open at the moment.)

So many of the issues with it seem to be because ... they wrote the damn thing in JavaScript?

Claude is pretty good at a constrained task with tests -- couldn't you just port it to a different language? With Claude?

And then just ... the huge claude.json which gets written on every message, like ... SQLite exists! Please, please use it! The scrollback! The keyboard handling! Just write a simple Rust or Go or whatever CLI app with an actual database and a reasonable TUI toolkit. Why double down and buy a whole JavaScript runtime?

  • dboon
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Ink (and modern alternatives) is probably the best TUI toolkit. If you want to write a UI that's genuinely good, you need e.g. HTML, or some way to express divs and flexbox. There isn't really another way to build professional-grade UIs; I love immediate-mode UI for games, but the breadth of features handled by the browser UI ecosystem is astonishing. It is a genuinely hard problem.

And if you're expressing hierarchical UI, the best way to do it is HTML and CSS. It has the richest ecosystem, and it is one of the most mature technologies in existence. JS / TS are the native languages for those tools. Everything is informed by this.

Of course, there are other options. You could jam HTML and CSS into (as you mention) Rust, or C, or whatever. But then the ecosystem is extremely lacking, and you're reinventing the wheel. You could use something simpler, like QML or handrolled. But then you lose the aforementioned breadth of features and compatibilities with all the browser code ever written.

TypeScript is genuinely, for my money, the best option. The big problem is that the terminal backends aren't mature (as you said, scrollback, etc). But, given time and money, that'll get sorted out. It's much easier to fix the terminal stuff than to rewrite all of the browser.

I like JS for this use case, and React on web, but really not fond of the Ink usage. Idk if it's Ink itself or the way it gets used, but somehow people are making CLIs that lag and waste terminal space now.
Ink seems to be the root cause of a major issue with the Claude Code CLI where it flickers horribly when it needs to repeatedly clear the screen and redraw.

I don't know why it's even necessary for this.

https://github.com/atxtechbro/test-ink-flickering

Issue on Claude Code GitHub:

https://github.com/anthropics/claude-code/issues/769

The idea that you need or want HTML or CSS to write a TUI is missing the entire point of what made TUIs great in the first place. They were great precisely because they were clean, fast, simple, focused -- and didn’t require an entire web stack to draw colored boxes.
I'm not so sure about that. I've written some nontrivial TUIs in my time, the largest one being [1], and as the project got more complicated I did find myself often thinking "It sure would be nice if I could somehow just write this stuff with CSS instead of tiny state machines and control codes for coloration". There's no reason these languages couldn't compile down to a TUI as lean as hand-coloring everything yourself.

[1]: https://taskusanakirja.com/

I'm certainly not advocating for a return to C + ncurses, but there's a wide ocean of options between that and HTML+CSS+JS in the terminal.
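The "control codes for coloration" referred to above are raw ANSI escape sequences, exactly the kind of thing a CSS-like layer would abstract away. A minimal hand-rolled example (not taken from the linked project):

```javascript
// SGR escape codes: 31 = red foreground, 1 = bold, 0 = reset all.
const red = (s) => `\x1b[31m${s}\x1b[0m`;
const bold = (s) => `\x1b[1m${s}\x1b[0m`;

// Nesting already gets fiddly: the inner reset clears the outer style too,
// which is exactly the kind of state a tiny state machine ends up tracking.
console.log(bold(red("error:")) + " something failed");
```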
  • dboon
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Yes, for simple projects, absolutely. But when you're shipping something as widely adopted as CC, I disagree. At the end of the day, you're making a UI. It happens to be rendered via the terminal. You still need accessibility, consistent layouts, easy integration with your backend services, inputs, forms, and so on. If you don't need that stuff, there are lots of other, simpler options. But if you do, your other options begin to resemble a half baked, bug filled reimplementation of the web. So just use the web.
“Port it to a different language”? To a language that’s more out of distribution? Bad devex. Store data as an unreadable binary file? Bad devex.

Stay in distribution and in the wave as much as possible.

Good devex is all you need. Claude code team iterates and ships fast, and these decisions make total sense when you realize that dev velocity is the point.

I have to admit this was my first thought, too. I'm pretty obsessed with Claude Code, but the actual app is so incredibly poorly engineered for something that doesn't even do that much.

Rust, Go, whatever -- writing a good TUI isn't that hard of a problem. Buying an entire VC funded JS runtime company isn't how you solve it.

Boggles the mind.
  • a-dub
  • ·
  • 15 hours ago
  • ·
  • [ - ]
they acqui-hired the team and de-risked their investment in building claude code on top of bun. makes sense to me.

moreover, now they can invest in making it an even more efficient and secure runtime for model workspaces.

  • ·
  • 15 hours ago
  • ·
  • [ - ]
There's no reason to run agents on expensive AI platforms or on GPUs when you can have the AI create an agent in JS that then runs with very high performance and perfect repeatability on far less expensive CPUs.

At the very least there must be some part of the agent tasks that can be run in JS, such as REST APIs, fetching web results, parsing CSV into a table, etc.

Agents already do this exact thing, except that the go-to language for Claude to write one-off scripts in is usually Python.
Am I missing something - I thought that GPUs are for training the weights

Being able to create an agent in any language to run on any hardware has always been possible hasn't it?

  • tkel
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Oh no ... unfortunately this likely means a Bun.AI API in my JS runtime.
So many comments about the reasoning here, yet none about the very obvious one: it's not stability of the infrastructure, it's the future direction of a product like Claude Code. They need to know how to continue their optimisation machine to fit developers' needs the best way possible (for better or for worse).

I guess we should wait for some opt-out telemetry some time soon. It'll be nothing too crazy at first, but we'll see how hungry they are for the data.

  • sulam
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Don't they already have a ton of telemetry from Claude Code itself? I'd be shocked and expect an instant fork if Anthropic telemetry was added to Bun.
I don't get it. Why would Anthropic need to own a JS runtime?
Because they have a product that makes $1bn+ a year that depends on having a good, stable, cross-platform JS runtime.
I'm still confused. Why not just pour a ton of resources into it, since it's open source? I guess dev mindshare? It is a great product
Pouring a ton of resources into an open source project that raised $26m in VC doesn't guarantee that the project will stick around. Acquiring it does.
Buying Bun to ensure it sticks around doesn't pass the smell test unless they had very few months of runway left
Bun had four years of runway left.
You're describing Node.js which has existed for the last 15 years
And is owned by Microsoft. The theory is that by symmetry Anthropic should own a node competitor.
Microsoft doesn't own node.
but they are a company that burns billions every year in losses and this seems like a pretty random acquisition.

Bun is the project providing that good, stable, cross-platform JS runtime, and they were already doing a good job. Why would Anthropic's acquisition make them better at what they were already doing?

> Why would Anthropic's acquisition of them make them better at what they were already doing?

Because now the Bun team don't have to redirect their resources to implementing a sustainable business model.

It's Anthropic, not Microsoft. They already had a runway of 4 years, and honestly, that is preferable to hitching their wagon to a volatile startup like Anthropic.
>but they are a company that burns billions every year in losses

No they don't.

> As discussed previously, OpenAI lost $5 billion and Anthropic $5.3 billion in 2024, with OpenAI expecting to lose upwards of $8 billion and Anthropic — somehow — only losing $3 billion in 2025. I have severe doubts that these numbers are realistic, with OpenAI burning at least $3 billion in cash on salaries this year alone, and Anthropic somehow burning two billion dollars less on revenue that has, if you believe its leaks, increased 500% since the beginning of the year.

https://www.wheresyoured.at/why-everybody-is-losing-money-on...

  • pzo
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Ok, but Node is even more stable and mature: compare Node API parity in Bun, and the open issues of Bun vs. Node.
But they are not using node any more?
  • sneak
  • ·
  • 15 hours ago
  • ·
  • [ - ]
That doesn’t require or benefit from acquiring Bun. Node continues to exist and serve fine.
I'm wondering if Bun would be a good embedded runtime for Claude to think in. If it does sandboxing, or if they can add sandboxing, then they can standardize on a language and runtime for Claude Code and Claude Desktop and bake it into training like they do with other agentic things like tool calls. It'd be too risky to do unless they owned the runtime.
  • baq
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Why would Sun then Oracle own Java? Why would Microsoft own .net? Why would Apple own swift?

IOW look where the puck is going.

  • ·
  • 15 hours ago
  • ·
  • [ - ]
Considering that 1) Bun is written in Zig, 2) Zig has a strict no-AI policy [1], and 3) Bun has joined Claude, it seems that Bun and Zig are increasingly culturally apart.

[1] https://ziglang.org/code-of-conduct/#strict-no-llm-no-ai-pol...

You’re reading a code of conduct for contributing to the Zig project. I don’t think everything there is guidance for everything written in Zig; e.g. ‘English is encouraged’ is something one might not want for a project written in Zig by native French speakers, and I don’t think that’s something Zig would want to suggest to them. I read the AI part as being motivated much more by the asymmetries of open source contribution than by any statement about the language itself. Fly-by AI contributions are bad because they make particularly poor use of maintainer time. Similar to the rule on proposing language changes, which can suck up lots of reading/thinking/discussion time. When you have people regularly working together (e.g. those people at Anthropic working on Bun) the incentives are different, because there is a higher cost to wasting your colleagues’ time.
> Bun and Zig are increasingly culturally apart

That's like saying GCC and NodeJS are culturally apart, as if that has significant bearing on either?

  • M4v3R
  • ·
  • 12 hours ago
  • ·
  • [ - ]
Nothing I found says anything about Zig folks being inherently against AI. It just looks like they don’t want to deal with “AI Slop” in contributions to their project, which is very understandable.
It reminds me of the Arduino acquisition by Qualcomm. And that was not good news.
Has CC always used Bun? When I tried it out many months ago it was an npm install, not a bun install, in their instructions (although I did use bun install myself). It's just odd that, if they were using Bun, the installation wasn’t specifically a “bun install” (I suppose they were trying to keep it vanilla for the npm masses?)
Wondering to what degree this was done to support Anthropic’s web crawler. Would assume that having a whole JS runtime rather than just a HTTP client could be rather useful. Just hypothesising here, no clue what they use for their crawler.
Hope nobody buys Astral or Python is f*cked.
Then it would probably be back to Poetry. Or some other newcomer, or maybe a fork of uv.
uv is very forkable - dual-licensed under Apache and MIT, high quality codebase, it's Rust rather than Python but the Python community has an increasing amount of Rust experience these days.

That's why I'm not personally too nervous about the strategic risk to the Python community of having such a significant piece of the ecosystem come from a relatively young VC-backed company.

  • baq
  • ·
  • 15 hours ago
  • ·
  • [ - ]
If you froze uv today, it’d take years for anything else to get to a state where switching would be worth it.
Honestly, given the constant rollercoaster of version management and build tooling for Python, a move to something else would be expected rather than surprising.

uv seems like a great tool, but I remember thinking the same about pipenv, too.

  • baq
  • ·
  • 14 hours ago
  • ·
  • [ - ]
uv is a revolution in every possible positive sense of the word in the Python world, and I've been here since 1.5. It is imperative that bitter oldtimers like us try it. I did, and the only regret I've got is that I didn't do it sooner.
I also tried it and am now using it for new projects. But I was just fine with Poetry too. Yes, uv is faster and probably better code. But my use cases didn't require re-creating venvs frequently, so Poetry's slowness didn't matter much to me, and I am not using the "one-off script" kind of approach that uv enables (writing the dependencies in a comment in the script itself).

So, yeah, uv is nice, but for me didn't fundamentally change that much.

Our entire business runs on Python without a drop of Astral in the mix. No one would even notice.
you should try uv, really impressive tool
Honestly, that is an understatement. `uv run` has transformed how I use Python, since 99% of the time I don't need to set up or manage an environment and dependencies. I have tons of one-off Python scripts (with their dependencies in PEP 723 metadata at the top of the file) that just work with `uv run`.

I get how it might not be as useful in a production deployment where the system/container will be setup just for that Python service, but for less structured use-cases, `uv` is a silver bullet.
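For anyone who hasn't seen the PEP 723 style mentioned above: the dependency metadata lives in a structured comment block at the top of the script, and `uv run` reads it to build a throwaway environment before executing the file. A minimal sketch (the `greet` helper is just illustration; real one-off scripts would list third-party packages in `dependencies`):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []  # third-party packages would go here, e.g. ["requests"]
# ///
"""A self-contained one-off script.

Running `uv run hello.py` creates an ephemeral environment from the
metadata block above, installs any listed dependencies, and executes
the script with a matching Python.
"""


def greet(name: str) -> str:
    # Trivial stdlib-only logic so the empty dependency list is honest.
    return f"Hello, {name}!"


if __name__ == "__main__":
    print(greet("uv"))
```

The point is that the script carries its own environment description, so it works on any machine with uv installed, no venv or requirements file needed.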

I don't want to even think about it. uv has been a revelation!
#1, uv is open-source and it could easily be forked and kept up to date.

#2, if you don't like uv, you can switch to something else.

uv probably has the least moat around it of anything. Truly a meritocracy: people use it because it's good, not because they're stuck with it.

  • pjmlp
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Never used any of their tools.

Python is doing great, other than still taking baby steps toward having a JIT in CPython.

Finally, an event capable of killing the Python demon!
Congratulations to the bun team!
All vendors will have to implement test time code execution, solution exploration, etc. as it's a low hanging fruit with huge gains, so I see it as a great hire. Love Bun, happy for you guys!
When I saw the headline I was ready to be mad, but after reading the post, I'm cautiously on board with this.
  • s-mon
  • ·
  • 12 hours ago
  • ·
  • [ - ]
Congratulations to the team. Knowing some of the folks on the Bun team, I cannot say I am surprised. They are the top 0.001% of engineers, writing code out of love. I’m hugely bullish on Anthropic; this is a great first acquisition.
So, what if Claude Code starts using Bun in all applicable situations? What if model providers train their models to use a tech stack beneficial to their business interests?
> I started porting esbuild's JSX & TypeScript transpiler from Go to Zig

How was Go involved there before Zig?

esbuild is still a Go app today: https://github.com/evanw/esbuild

The first hints of what became Bun were when Jarred experimented with porting that to Zig.

To be honest, I never thought of Bun as something that someone would buy or invest in. What product do they sell?
What matters: it's staying open source and MIT licensed. I sincerely hope it stays that way. Congrats to the Bun team on making a great tool and getting the recognition they deserve.

> Being part of Anthropic gives Bun: Long-term stability.

Let's see. I don't want to always be the downer but the AI industry is in a state of rapid flux with some very strong economic headwinds. I wouldn't confidently say that hitching your wagon to AI gives you long term stability. But as long as the rest of us keep the ability to fork an open source project I won't complain too much.

(for those who are disappointed: this is why you stick with Node. Deno and Bun are both VC funded projects, there's only one way that goes. The only question is timeline)

Nothing gives you long term stability in tech. You have to constantly work at staying stable, and it isn't always up to anything the company is in control of, no matter what ownership they have.
> Nothing gives you long term stability in tech.

Sure. But everything is relative. For instance, Node has much more likelihood of long term stability than Bun, given its ownership.

> Node has much more likelihood of long term stability than Bun

Given how many more dependencies you need to build/maintain a Node app, your Bun application has a better chance of long term stability.

With Node almost everything is third party (db driver, S3, router, etc) and the vast majority of NPM deps have dozens if not hundreds of deps.

I’m talking about long term stability of the tool and ecosystem, not of any specific app.
Sure, that makes it a good backup strategy. But there’s little reason to use a worse tool until the time you need the backup comes.
This reads more like Anthropic wanted to hire Jarred, and Jarred wants to work with AI rather than build a SaaS product around Bun. I doubt it has anything to do with what is best for Bun the project. Considering Bun always seemed to value performance above all else, the only real way for them to continue pursuing that value would be to move into actual JS engine design. This seems like a good pivot for Jarred personally and likely a loss for Bun.
It doesn't read like that to me at all. This reads to me like Anthropic realizing that they have $1bn in annual revenue from Claude Code that's dependent on Bun, and acquiring Bun is a great and comparatively cheap way to remove any risk from that dependency.
I haven't had any issue moving projects between Node, Bun, and Deno for years. I don't agree that the risk of Bun failing as a company affects Anthropic at all. Bun has a permissive license that Anthropic could fork from, Anthropic likely knew that Oven had a long runway and wasn't in immediate danger, and switching to a new JS CLI tool is not the huge lift most people think it is in 2025. Why pay for something you are already getting for free, can expect to keep getting for free for at least four years, and could buy for less if it fails later?
  • _jab
  • ·
  • 13 hours ago
  • ·
  • [ - ]
This argument doesn’t make much sense to me. Claude Code, like any product, presumably has dozens of external dependencies. What’s so special about Bun specifically that motivated an acquisition?
A dependency that forms the foundation of your build process, distribution mechanisms, and management of other dependencies is a materially different risk than a dependency that, say, colorizes terminal output.

I’m doubtful that alone motivated an acquisition, it was surely a confluence of factors, but Bun is definitely a significant dependency for Claude Code.

  • rvnx
  • ·
  • 12 hours ago
  • ·
  • [ - ]
MIT code, let Bun continue develop it, once project is abandoned hire the developers.

If they don't want to maintain; GitHub fork with more motivated people.

> MIT code, let Bun continue develop it, once project is abandoned hire the developers.

Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?

If they found themselves pushing PRs to Bun that got ignored and wanted to raise the priority of things they needed, and the acquisition was cheap enough, this is the way to do it.
I'm also curious if Anthropic was worried about the funding situation for Bun. The easiest way to allay any concerns about longevity is to just acquire them outright.
  • ·
  • 13 hours ago
  • ·
  • [ - ]
Really? What risk is even there?
Except bun is OSS, so they could have just forked if something happened
It's not easy to "just" fork a huge project like Bun. You'll need to commit several devs to it, and they'll have to have Zig and JSC experience, a hard combo to hire for. In many ways, this is an acquihire.
Nah, it reads like the normal logic behind the consulting model for open source monetization, except that Bun was able to make it work with just one customer. Good for them, though it comes with some risks, especially when structured as an acquisition.
So Anthropic sees its CLI (in TypeScript) as the first-class product and is maybe planning to expand Claude Code with more JS-based agents and ecosystem? Owning the runtime especially gives a lot of control over developer experience.
I'm confused. I installed claude code with:

    npm install -g @anthropic-ai/claude-code
I thought claude code just used Nodejs? I didn't realise the recommended install used a different runtime.
They switched to recommending this as the installation method back in July:

  curl -fsSL https://claude.ai/install.sh | bash
That install script gives you a single binary which is created using Bun.
Maybe that's why I didn't have some bugs people were reporting on HN, or because I was using linux.
  • kgc
  • ·
  • 5 hours ago
  • ·
  • [ - ]
Should we be porting our Python projects over to Javascript?
I'm only surprised that it wasn't Vercel who bought them.
Interesting that this announcement is tied in with one for Claude Code revenue.

Feels like maybe AI companies are starting to feel the questions on their capital spending? They wanna show that this is a responsible acquisition.

Genuine question, why acquisition when anthropic could simply sponsor, contribute and influence instead?

Acquisition seems like a large overhead and maybe a slight pivot to me.

Godspeed. Seems like a good pairing. Bun is sort of the only part of the JS ecosystem I like, and Code has become such an important tool for my work, that I think good things will come out of this match. Go Bundler as well.
Aham, tx. Good to know - I'll switch my projects to Deno.
you know Deno is VC backed right
Neat. I just started using bun as my default "batteries included" JavaScript engine, so it's nice they're getting this boost.
I’m curious what the acquisition price was. Bun said they’ve raised $26 million, so I’m assuming the price tag has to be a lot higher than that for investors to agree to an acquisition.
This morning I found myself muttering something I won't repeat as a reaction to Claude Code's remarkably slow startup time.

Put the Bun folks directly on that please and nothing else.

Wouldn’t it make more sense to write the same functionality using a more performant, no-gc language? Aren’t competitors praised for their CLIs being faster for that reason?
With AI tooling, we are in the era where rapid iteration on product matters more than optimal runtime performance. Given that, implementing your AI tooling in a language that maximizes engineer productivity makes sense, and I believe GC does that.
  • logsr
  • ·
  • 13 hours ago
  • ·
  • [ - ]
JS/TS has a fundamental advantage, because there is more open source JS/TS than any other language, so LLMs training on JS/TS have more to work with. Combine that with having the largest developer community, which means you have more people using LLMs to write JS/TS than any other language, and people use it more because it works better, then the advantage compounds as you retrain on usage data.
One would expect that "AI tooling" is there for rapid iteration and one can use it with performant languages. We already had "rapid iteration" with GC languages.
If "AI tooling" makes developers more productive regardless of language, then it's still more productive to use a more productive language. If JS is more productive than C++, then "N% more productive JS" is still more productive than "N% more productive C++", for all positive N.
Codex is written in Rust
Bun has completely changed my outlook on the JS ecosystem. Prior to Bun, there was little focus on performance. Now the entire space rallies around it.

Congrats to Jarred and the team!

  • krig
  • ·
  • 15 hours ago
  • ·
  • [ - ]
> Prior to Bun, there was little focus on performance.

This is just completely insane. We went through more than a decade of performance competition in the JS VM space, and the _only_ justification that Google had for creating V8 was performance.

> The V8 engine was first introduced by Google in 2008, coinciding with the launch of the Google Chrome web browser. At the time, web applications were becoming increasingly complex, and there was a growing need for a faster, more efficient JavaScript engine. Google recognized this need and set out to create an engine that could significantly improve JavaScript performance.

I guess this is the time we live in. Vibe-coded projects get bought by vibe-coded companies and are congratulated in vibe-coded comments.

  • logsr
  • ·
  • 13 hours ago
  • ·
  • [ - ]
> Vibe-coded projects get bought by vibe-coded companies

this is so far from the truth. Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.

> a decade of performance competition in the JS VM space

this was a rising tide that lifted all boats, including Node, but Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves.

  • krig
  • ·
  • 12 hours ago
  • ·
  • [ - ]
> Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.

Sure, I definitely will not throw projects like Zig into that bucket, and I don't actually think Bun is vibe-coded. At least that _used_ to be true, we'll see I guess...

Don't read a snarky comment so literally ;)

> Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves

That sounds like an implementation difference, not an architectural difference. If they wanted to, what would prevent Node or a third party from implementing parts of the stdlib in a faster language?

That's because it's not written in JS at all but in a compiled systems language; no wonder it's fast.
Virtually all JavaScript engines are written in compiled languages. (Most runtimes, for that matter, not just JS.)
My mistake, I was thinking of the wider ecosystem, not the runtime, i.e. formatters, bundlers and linters like Biome, oxc, etc. being written in Rust or other compiled languages. That's where I saw the biggest speedup, because their developers chose to write them in a compiled language instead of JS on a JS runtime, where you're inherently limited even with a JIT.
One important original point of node was that v8 made JS very fast by compiling to machine code, plus it’s had multithreading built in for a decade.
Machine code yes (along with Spidermonkey, JSC and Nashorn), the timeframe around 2005-2010 saw the introduction of JIT'ed JS runtimes. Back then however JS was firmly single-threaded, it was only with the introduction of SharedArrayBuffer that JS really started to receive multithreading features (outside of SharedArrayBuffer and other shareable/sendable types, a runtime could opt to run stuff like WebWorkers/WebAudioWorkers in separate processes).

Early Node f.ex. had a multi-process setup built in, Node initially was about pushing the async-IO model together with a fast JS runtime.

Why Bun (and partially Deno) exists is because TypeScript helps so damn much once projects gets a tad larger, but usage with Node hot-reloading was kinda slow, multiple seconds from saving a file until your application reloads. Even mainline node nowadays has direct .ts file loading and type erasing to quicken the workflow.

That is the most absurd thing I've heard in 20 years. Chrome literally was launched on performance, for JS and beyond.

The reality is that the insane "JS ecosystem" will rally around whatever is the latest hotness.

> Prior to Bun, there was little focus on performance

v8 is one of the most advanced JIT runtimes in the world. A lot of people have spent a lot of time focusing on its performance.

I'm sure the Bun team will get Claude Code straightened out. Weird acquisition, but TBH Anthropic needed to fill this hole.
I use bun in a project but Claude Code always uses node to run throwaway scripts. Maybe they can persuade it to use bun as part of this acquisition?
I bet CC will become a binary with Bun included, and it'll use its internal JS engine to run most scripts.
Oddly I saw it try to use bun the other day, and was confused because everything in the project is in node.
I always tell it to use Bun and it works? Am I misunderstanding?
It seems the default is node (despite the project docs saying to use bun and all example script documentation using bun). It will use bun if told, but there’s definitely nothing saying to use node and it uses that anyway.
So, we can anticipate that the new Anthropic browser will now have the interpreter Ken Thompson previewed for us 41-odd years ago?
A single bun? Is that really newsworthy?
Oh you silly goose.
:(
But will they fix command line autocompletions?
  • klysm
  • ·
  • 15 hours ago
  • ·
  • [ - ]
This wasn’t very high up on my list for acquisitions but props to the bun team for cashing in on the AI hype somehow!
In the post they try to reassure readers on the question "If I bet my work project or company's tech stack on Bun, will it still be around in five or ten years?" The thing is, we don't know if Anthropic itself will be around in five to ten years.
My long-term bet on Node being "boring" and "stable" continues to pay major dividends. So glad I never invested any time and effort on this ecosystem…
  • pjmlp
  • ·
  • 13 hours ago
  • ·
  • [ - ]
That is the way: when one has been around a long time, one sees these alternatives come and go while the reference platforms keep going.
must be nice to have a 1gb node_modules folder for hello world
I love Bun, but for a CLI tool: why don’t they just write Claude Code in Go and call it a day?
Makes sense; I had no idea how else the investors would have made money on a JavaScript bundler / JSC frontend.
Sounds like the goal is to bundle up Bun with Claude Code insanely tightly, to the point where it doesn't matter if you have nodejs installed locally, but also they can optimize key things for Claude Code's Bun runtime as needed. It's a brilliant acquisition, and bun stays open source, which allows it to continue to grow, to Anthropics benefit and everyone else's.
  • mpeg
  • ·
  • 15 hours ago
  • ·
  • [ - ]
A nice start would probably be for Claude Code to stop trying to use npm when it detects a bun lockfile and vice versa...
I just ln bun to npm, npx, and node. This has the added benefit of letting ts_ls and various other tools work without requiring me to have both node and bun installed locally.
Yeah Claude is very good, but it definitely needs to get "smarter" in some nuanced areas.
Congratulations to Jarred. He and the team are Real Ziggers. Looking forward to a faster Claude Code!
why couldn't Anthropic simply use Claude Code to write Bun over the weekend??
It is open source (MIT license), Claude should have a pretty good start on it already.
Congrats Jarred and team! You have saved humanity many hours already, and I'm sure with Anthropic's backing, you will spare us many more. Farewell would-be headaches from Node & NPM tooling and waiting for builds and tests and package updates. Exciting times ahead!

Using bun on a side project reinvigorated my love of software development during a relatively dark time in my life, and part of me wonders if I would have taken the leap onto my current path if it weren't for the joy and feeling of speed that came from working with bun!

No strategic roadmap is ever going to tell you: "Build a $0-revenue JavaScript runtime and one day an AI company will acquire you"
It reminds me of hearing that music majors often do well in medical school. Want to go to medical school? Just major in music, duh.
Ha, Physics majors get the same talk about law school. It's just the selection bias of selecting for people willing to make hard pivots filtering out the under-achieving, go-with-the-flow types.
Lots of strategists will tell you something like: "Build something that's useful and then there will be money".

That's 100% what happened to Bun. It's useful (like really useful) and now they're getting rewarded

  • wmf
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Honestly that's probably the best play. Monetizing dev tools directly is a nightmare.
And you risk ending up like Postman or Insomnia, once beautiful software which is now widely hated by developers.
Countdown till Astral is acquired?
i really think this is part of the pitch deck for bun's funding. that a bigger company would acquire it for the technology. the only reason an AI company or any company for that matter would acquire it would be to:

1. acquire talent.

2. control the future roadmap of bun.

i think it's really 1.

I had the same thought when openai acquired rockset.
Well, that was the playbook in the 1999-2001 dotcom days.
Which is probably why no one's going to recommend it these days

...but hey, things are different during a bubble.

I hope Bun will finally start to support and work on WSL1.
Maybe they just like to work together *shrug*.
Looks like a good time to try learning Zig again
Who expects Anthropic to migrate all their code to Codeberg?
Maybe now Claude will not assume that I use npm, and actually start using bun?
Shouts out to the fellow who half-broke the news in this submission that was presumably killed because of the aggressive paywall: https://news.ycombinator.com/item?id=46123627

And apparently the submission's source for being the only org I can tell that anticipated this: https://www.theinformation.com/articles/anthropic-advanced-t...

  • krig
  • ·
  • 15 hours ago
  • ·
  • [ - ]
This announcement made me check in on the arbitrary code execution bug I reported that the Bun Claude bot created a PR for about 3 weeks ago:

https://github.com/oven-sh/bun/pull/24578

So far, someone from the bun team has left a bunch of comments like

> Poor quality code

...and all the tests still seem to be failing. I looked through the code that the bot had generated and to me (who to be fair is not familiar with the bun codebase) it looks like total dogshit.

But hey, maybe it'll get there eventually. I don't envy "taylordotfish" and the other bot-herders working at Oven though, and I hope they get a nice payout as part of this sale.

So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?
The OP directly says:

> that the Bun Claude bot created a PR for about 3 weeks ago

The PR with bad code that's also been ignored was made by the bot that Bun made, and brags about in their acquisition post.

  • krig
  • ·
  • 12 hours ago
  • ·
  • [ - ]
I just reported the bug, it was the bot that was proudly mentioned in the announcement which created the PR and the code...
  • ·
  • 14 hours ago
  • ·
  • [ - ]
> So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?

...Did you miss the part where Bun used Claude to generate that PR?:)

I misinterpreted that first comment too. To clarify:

1. User krig reports an issue against the Bun repo: https://github.com/oven-sh/bun/issues/24548

2. Bun's own automated "bunbot" filed a PR with a potential fix: https://github.com/oven-sh/bun/pull/24578

3. taylordotfish (not an employee of Bun as far as I can tell, but quite an active contributor to their repo) left a code review pointing out many flaws: https://github.com/oven-sh/bun/pull/24578#pullrequestreview-...

  • krig
  • ·
  • 12 hours ago
  • ·
  • [ - ]
Right, this is accurate. Except I thought taylordotfish worked for bun, so I guess no one at bun has looked at it at all then.
I did.
Bun is such a great runtime. If you haven't tried it, try it. It's got bells and whistles.

This will make sure Bun is around for many, many, years to come. Thanks Anthropic.

Why Bun?

Easy to setup and go. bun run <something.ts>

Bells and whistles. (SQL, Router, SPA, JSX, Bundling, Binaries, Streams, Sockets, S3)

Typescript Supported. (No need to tsc, bun can transpile for you)

Binary builds. (single executables for easy deployment)

Full Node.js Support. (The whole API)

Full NPM Support. (All the packages)

Native modules. (90% and getting better thanks to Zig's interop)

S3 File / SQL Builtin. (Blazingly Fast!)

You should try it. Yes, others do these things too, but we're talking about Bun.

It's not 100% Node.js compatible. I see enough non-green dots in their own official report: https://bun.com/docs/runtime/nodejs-compat

And even in packages with full support you can find many GitHub issues where Bun behaves differently, which leads to some bugs.

Not saying it’s 100%; the REPL is still missing, but all of Node’s API is available in the sense that it’s ABI compatible (or will be in the very near term).
> This will make sure Bun is around for many, many, years to come.

Well, until the bubble bursts and Anthropic fizzles out or gets acquired themselves.

If they keep it MIT licensed, then if/when things come crashing down, I think it's reasonable to expect Bun would continue on in some form, even if development slows without paid contributors.
...and then it's going to be time for an "incredible journey" post.
  • croes
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Does it have permission flags yet like deno has?
I’ve never understood the security utility of the Deno flags. What practical attack would they protect you from? Supply chain seems to be the idea, but how many npm packages do people use that neither:

* Get run by devs with filesystem permissions

* Get bundled into production

It'll be around until they realize it makes $0 and costs them millions per year in salaries/stock. Then it will quietly die.
You think they wouldn't have done that napkin math before deciding to acquire it?
Anthropic uses a lot of bun. In fact, they bet the farm on it.
  • ptak
  • ·
  • 15 hours ago
  • ·
  • [ - ]
What a trip. Love both, so all good I guess.
  • dboon
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Incredible news on so, so many levels!

(1) Bun is what technical startups should be. Consistently excellent decisions, hyper focused on user experience, and a truly excellent technical product.

(2) We live in a world where TUIs are causing billion dollar acquisitions. Think about that. Obviously, Bun itself is largely orthogonal to the TUIs. Just another use case. But also obviously, they wouldn't have been acquired like this without this use case.

(3) There's been questions of whether startups like Bun can exist. How will they make money? When will they have to sell out one of the three principles in (1) to do so? The answer seems to be that they don't; at least, not like we expected, and in my opinion not in a sinister way.

A sinister or corrupting sell out would be e.g. like Conan. What started as an excellent tool became a bloated, versioned mess as they were forced to implement features to support the corporate customers that sustained them.

This feels different. Of course, there will be some selling out. But largely the interests of Anthropic seem aligned with "build the best JS runtime", since Anthropic themselves must be laser focused on user experience with Claude Code. And just look at Opencode [^1] if you want to see what leaning all the way into Bun gets you. Single file binary distribution, absurdly fast, gorgeous. Their TUI framework, OpenTUI [^2], is a large part of this, and was built in close collaboration with the Bun folks. It's not something that could exist without Bun, in my opinion.

(4) Anthropic could have certainly let Bun be a third party to which they contributed. They did not have to purchase them. But they did. There is a strange not-quite altruism in this; at worst, a casting off of the exploitation of open source we often see from the biggest companies. Things change; what seems almost altruistic now could be revealed to be sinister, or could morph into such. But for now, at least, it feels good and right.

[^1]: https://github.com/sst/opencode [^2]: https://github.com/sst/opentui

Well, Bun is MIT-licensed. So once they change the license and/or kill the project, the community can fork it easily.
  • wmf
  • ·
  • 14 hours ago
  • ·
  • [ - ]
The point of this deal is that they do not need to change the license. Nobody will ever pay for Bun and now they don't have to force it.
  • taf2
  • ·
  • 9 hours ago
  • ·
  • [ - ]
okay so does that mean openai buys deno?
So this is the rug pull we were afraid of? Bun got me into the JavaScript ecosystem after years of hating on it. This sucks.
  • _pdp_
  • ·
  • 15 hours ago
  • ·
  • [ - ]
It makes total sense to me.
Hahaha congratulations. This is amazing. The most unlikely outcome for a devtools team. Fascinating stuff.

This is promising for Astral et al., whom I really like but whose sustainability I've worried about. It does suggest that staying as close to the user as possible matters.

The Bun team works hard, glad to see it pay off.
Is Claude Code the first CLI tool to have a $1BN ARR?
  • CSSer
  • ·
  • 15 hours ago
  • ·
  • [ - ]
I don't know for sure, but it's definitely the first tool of that value to have a persistent strobing (scroll position) bug so bad that passersby ask me if I'm okay when they see it.
Man, I had never even put words to that problem but you are right that it is beyond annoying. It seems to me like it worsens the longer the Claude instance has run - I don't seem to see it early in the session.
  • CSSer
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Yeah, issues have been open on GitHub for months. I've tried shortening my scrollback history and using other emulators but it doesn't seem to make a difference. It's pretty frustrating for a paid tool.
ha, I thought it was just a me thing and had accepted my fate.
This graph from the SemiAnalysis blog suggests that GitHub Copilot reached it earlier this year: https://substackcdn.com/image/fetch/$s_!BGEe!,f_auto,q_auto:...
"GitHub Copilot" encompasses so many different products now that it's hard to see it as a CLI tool.
It doesn't make a lot of sense that they'd compare Microsoft 365 Copilot with Claude Code, though. Like, it is a legit CLI tool, but we should ignore it because it shares a name with something else?
The GitHub Copilot CLI tool is brand new, they only launched that in September: https://github.blog/changelog/2025-09-25-github-copilot-cli-...

Prior to that GitHub Copilot was either the VS Code IDE integration or the various AI features that popped up around the GitHub.com site itself.

Yeah, but my point was it's pretty clear which Copilot the article was talking about, and pretending it could be anything because "everything's Copilot" is an L take, sorry.
Terraform gets to $600mm if you squint really hard and make stuff up. Kubectl, though: whatever you want to say about Kubernetes complexity, it does get a bunch of money run through it. We could also look at aws-cli, gcloud, and az; if we assign the cloud budgets that get run through them, I'm sure it's in the hundreds of millions. Then there's git. Across the whole ecosystem, there's probably a cool couple billion floating through there. gh is probably much smaller. Other tools like docker and ansible come to mind, though those are not quite as popular. CC only hits $1B ARR if you squint really hard in the first place, so in this handwavy realm I'd say aws-cli comes first, then kubectl, then git, with maybe docker and terraform in the mix as well. Nonetheless, Claude is a really awesome CLI tool that I use most days.
Can anyone provide some color around this: "I started porting esbuild's JSX & TypeScript transpiler from Go to Zig"? Hypothetical benefits include monolanguage for development, better interoperability with C and C++, no garbage collection, and better performance. What turned out to be realized and relevant here? Please, no speculation or language flames or wars.
  • slig
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Love bun! Congratulations!
Interesting. Looking through a strategic lens, I feel like this is related to the $1,000 free credit for Claude Code Web (I used a few hundred). What the heck are they aiming for? CodeAct? (https://arxiv.org/abs/2402.01030)
  • m00dy
  • ·
  • 4 hours ago
  • ·
  • [ - ]
who's going to buy deno ?
Curious, how did he pay the bills when spending these years developing Bun?
Bun was VC funded.
I thought it said he was building a voxel game in the browser?
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Wow.
bullish for js, bearish for python?
Good luck. I'm always worried about stuff like this because it's happened so many times and the product eventually got worse. At the same time, I understand how much effort went into building something like Bun, and people need to fund their lives somehow, so there's that.
In other news - Amp Code is a separate company now - https://ampcode.com/news/amp-inc
Reminds me of Atlassian buying an AI browser.
First major success story for Zig language? (Not trying to diminish Bun's team success)
I'd say Ghostty is a pretty big success story as well.
Let's not forget about TigerBeetle either. They weren't bought (as far as I'm aware), but they seem to have some pretty good backing from customers.
Congrats. This is the first time I remember reading a genuine, authentic story about a sale. Much preferred over “this is about continuing the mission until my earn-out is complete.”
> If Bun breaks, Claude Code breaks. Anthropic has direct incentive to keep Bun excellent.

and when this bubble pops down goes bun

  • qsort
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Anthropic? The AI people?
Look, if a terminal emulator can raise $67 million by riding the AI hype wave, then a JavaScript runtime can do the same. Nobody ever said that AI investments and acquisitions have to make any sense.
Well this just created a lot of work for me. Everything’s turning to shit at an alarming rate.
wow !
Congrats...

> Long-term stability. a home and resources so people can safely bet their stack on Bun.

Isn't it the opposite? Now we've tied Bun to "AI" and if the AI bubble or hype or whatever bursts or dies down it'd impact Bun.

> We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.

There's honestly a higher chance of Bun sticking out that runway than the current AI hype still being around.

Nothing against Anthropic but with the circular financing, all the debt, OpenAI's spending and over-valuations "AI" is the riskier bet than Bun and hosting.

Yeah that’s the main part that puzzled me, super happy for the team that they got a successful exit, but I wouldn’t really consider Anthropic’s situation to be stable…
Yeah, no reader of tech news will take an acquisition of a company with four years of runway as anything but decreasing the odds their product will still be around (and useful to the same audience…) in four years. Even without being tied to a company with lots of exposure to a probable bubble.
How so? Presumably Jarred got a nice enough payout that if Anthropic failed, he would not need to work. At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.
History?

I didn’t say it was definitely the end or definitely would end up worse, just that someone who’s followed tech news for a while is unlikely to take this as increasing the odds Bun survives mid-term. If the company was in trouble anyway, sure, maybe, but not if they still had fourish years in the bank.

“Acquired product thriving four years later” isn’t unheard of, but it’s not what you expect. The norm is the product’s dead or stagnant and dying by then.

> At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.

Is there any historical precedent of someone doing that?

I say don't muddy the water with the public panic over "will it won't it" bubble burst predictions.

The effective demand for Opus 4.5 is bottomless; the models will only get better.

People will always want a code model as good as we have now, let alone better.

Bun securing default status in the best coding model is a win-win-win

  • pzo
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Opus 4.5 is not living in a vacuum. It's the most expensive of the models for coders, and there's Gemini 3 Pro with many discounts, plus DeepSeek 3.2, which is 50x cheaper and not far behind.
> I say don't muddy the water with the public panic over "will it won't it" bubble burst predictions.

It does matter. The public ultimately determines how much they get in funding if at all.

> The effective demand for Opus 4.5 is bottomless; the models will only get better.

The demand for the Internet is bottomless. Doesn't mean Dotcom didn't crash.

There are lots of scenarios this can play out, e.g. Anthropic fails to raise a certain round because money dried up. OpenAI buys Anthropic but decides they don't need Bun and closes out the project.

  • ·
  • 13 hours ago
  • ·
  • [ - ]
If claude code starts having ads for bun in the code it generates, I am never using it again.

To some degree, having "opinionated views on tech stacks" is unavoidable in LLMs, but this seems like it moves us toward a horrible future.

Imagine if claude (or gemini) let you as a business pay to “prefer” certain tech in generated code?

It's Google Ads all over again.

The thing is, if they own Bun, and they want people to use Bun, how can they justify not preferring it on the server side?

…and once one team does it… game on!

It just seems like a sucky future, that is now going to be unavoidable.

What?

Why?

Not to be confused with Bunn [1], the coffee maker makers.

[1] www.bunn.com

Hahahahahhaahhahahahahahhahahahahahhahahaha.

Regards.

Classic - brand new blog post:

> We’re hiring engineers.

Careers page:

> Sorry, no job openings at the moment.

It's the Anthropic careers page that you're likely looking for now:

https://www.anthropic.com/jobs?team=4050633008

Is it just me or does this page keep jumping back to the top when I try to scroll?
Same on iOS. It was probably vibe coded.
It's doing that for me as well (desktop Safari).
It's doing it to me as well in Brave on macOS.
Maybe the engineers are Claude agents.
why doesn't Anthropic just fork or make a clone of Bun themselves????

/s

deno won, rust won
Why the hell is a CLI coding agent built in JavaScript?

It’s wild what happens when a generation of programmers doesn’t know anything except webdev. How far from grace we have fallen.

The big advantage of a language like JavaScript or Python for a CLI tool of this nature is that they naturally support adding extensions or plugins at runtime.

That's quite a bit harder if your tool is built with a compiled language like Go.

Ehhhh. In either case you have to define a neat clean plugin API. Whether it loads a DLL/SO or just some scripts isn’t that huge of a difference.
  • zwnow
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Well not gonna use Bun anymore I guess
  • jjice
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Why not?
  • zwnow
  • ·
  • 15 hours ago
  • ·
  • [ - ]
Because I avoid all major AI players with everything I got as all of them are thieves.
  • jekrb
  • ·
  • 14 hours ago
  • ·
  • [ - ]
...you do know that YC has backed several AI companies, right?
  • zwnow
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Does that make it a big AI player? I only read shit on here.
  • rvz
  • ·
  • 14 hours ago
  • ·
  • [ - ]
Did you donate money or time to Bun?
  • zwnow
  • ·
  • 13 hours ago
  • ·
  • [ - ]
Why would I
  • rvz
  • ·
  • 13 hours ago
  • ·
  • [ - ]
There you go.

Thank you for showing exactly why acquisitions like this will continue to happen.

If you don't support tools like Bun, don't be surprised to see them raise money from VCs and get bought out by large companies.

  • zwnow
  • ·
  • 12 hours ago
  • ·
  • [ - ]
I make 2k a month; I don't have the financial freedom to support JavaScript runtimes.
Anthropic won't win, and will just get bought out by an IBM or Oracle in the end... time to migrate from Bun now
If Bun ends up at either IBM or Oracle, then it's a pretty safe platform, it could stay around for 50 years.
oh well. it was cool while it lasted! I guess I'll figure out how to make deno do what I want, now.