
The Bun Story: From 45-Second Frustration to Billion-Dollar AI Infrastructure
How one developer's impatience with slow builds created the runtime that powers the next generation of AI coding tools
In the summer of 2020, Jarred Sumner was building something fun: a Minecraft-like voxel game in the browser. He was using Next.js, React, and all the modern tooling that developers swear by. There was just one problem.
Every time he changed a line of code, he had to wait 45 seconds for the dev server to reload.
Forty-five seconds. Long enough to check Hacker News. Long enough to lose your train of thought. Long enough, apparently, to change the entire JavaScript ecosystem.
<> "This was frustrating, and I got really distracted trying to fix it."/>
That distraction would consume the next five years of his life—and culminate in December 2025 when Anthropic, the AI company behind Claude, acquired his creation for an undisclosed sum. It was Anthropic's first-ever acquisition, a move that signaled something profound: in the age of AI agents writing code, the infrastructure layer matters more than ever.
This is the story of Bun.
Part I: The Cramped Apartment in Oakland
To understand Bun, you have to understand the pain that birthed it.
By 2020, the JavaScript ecosystem had become a Rube Goldberg machine. Want to build a modern web app? You need Node.js for the runtime. npm (or Yarn, or pnpm) for packages. Webpack or Vite for bundling. Babel or SWC for transpiling. Jest for testing. TypeScript for types. Each tool with its own configuration file, its own mental model, its own dependency hell.
For Jarred, who had received a Thiel Fellowship in 2014 and spent years working on developer tools at various startups, this felt fundamentally wrong.
<> "I just wanted an instrument that doesn't get in the way of my work. In the end, I had to write it myself."/>
In May 2021, he started with the part that hurt most: the transpiler. JavaScript tools like Babel were written in JavaScript—interpreted code processing interpreted code. The newer tools like esbuild were written in Go, which was faster. But Jarred had a different idea.
He would write it in Zig.
Why Zig?
This is where Jarred's story diverges from conventional wisdom. In 2021, if you were writing performance-critical systems software, the "smart" choice was Rust. It was the hot language. It had the ecosystem. It had the safety guarantees.
Jarred chose Zig anyway.
<> "Zig is sort of similar to writing C, but with better memory safety features in debug mode and modern features likedefer... It has very few keywords so it's a lot easier to learn than, for example, C++ or Rust."/>
Within three weeks, he had a working transpiler. The benchmarks were absurd:
- 3x faster than esbuild (the previous speed king, written in Go)
- 94x faster than swc (written in Rust)
- 197x faster than Babel (written in JavaScript)
This wasn't marginal improvement. This was a different universe.
But a transpiler wasn't enough. To make Next.js server-side rendering work, he needed a JavaScript runtime. And that meant he needed a JavaScript engine.
Part II: The JavaScriptCore Gambit
Here's where the story gets technical—and fascinating.
The JavaScript ecosystem is dominated by one engine: Google's V8. It powers Chrome, Node.js, and Deno. It's fast, battle-tested, and has a massive team behind it.
Jarred looked at V8 and said: "No."
Instead, he chose JavaScriptCore—Apple's JavaScript engine that powers Safari. It's the engine almost nobody uses for server-side JavaScript.
Why? One word: startup time.
<> "JavaScriptCore seems to start around 4x faster than V8."/>
For long-running servers, this doesn't matter much. But for CLI tools, serverless functions, and—crucially—AI agents that spawn processes thousands of times per hour, cold start time is everything.
Jarred spent a month reading WebKit's source code. Then he built a runtime around JavaScriptCore that could do what Node.js does, but with a fraction of the overhead.
He did this alone. In a cramped apartment in Oakland. Just coding and tweeting about Bun.
Part III: The Explosion
On July 5, 2022, Jarred released Bun v0.1.0.
It was a single binary that contained:
- A JavaScript/TypeScript runtime
- A package manager (replacing npm)
- A bundler (replacing Webpack)
- A transpiler (replacing Babel)
- A test runner (replacing Jest)
One download. Zero configuration. Everything just worked—and everything was fast.
The response was immediate and overwhelming.
20,000 GitHub stars in the first week.
<> "Those first two weeks after the release were one of the craziest weeks of my life. My job switched from writing code all day to replying to people all day."/>
By August, he had formed a company called Oven.sh, raised $7 million from Kleiner Perkins (with Vercel's Guillermo Rauch as an investor), and rented his first office in San Francisco.
The Bun era had begun.
Part IV: The Technical Magic
Let's pause the narrative to understand what makes Bun actually fast. Because "it's written in Zig" is only part of the story.
The Package Manager: Copy-on-Write
When you run npm install, the package manager downloads tarballs, extracts them, and copies files into node_modules. Lots of copying. Lots of I/O.
Bun does something smarter. On supported file systems (APFS on Mac, Btrfs on Linux), it uses Copy-on-Write. Instead of physically copying files, it asks the operating system to create a reference to the existing data. The actual copy only happens if you modify the file—which almost never happens in node_modules.
Result? Installing a gigabyte of dependencies takes fractions of a second.
The Lockfile: Binary, Not JSON
npm's package-lock.json and Yarn's yarn.lock are text files. Human-readable, but slow to parse.
Bun's bun.lockb is a binary format that gets memory-mapped directly. No parsing. No JSON overhead. Just raw speed.
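The contrast is easy to demonstrate with a toy example. The field layout below is invented for illustration (bun.lockb's real format is undocumented and internal); the point is that a binary format with fixed offsets lets you read fields directly from bytes, with no tokenizer in the loop.

```javascript
// A toy contrast between a text lockfile (parse the whole string)
// and a binary one (read fixed-width fields straight from a buffer).
// The 16-byte layout here is invented for illustration.
const entry = { name: "left-pad", versionMajor: 1, versionMinor: 3 };

// Text route: serialize, then parse — every byte is re-tokenized.
const json = JSON.stringify(entry);
const fromJson = JSON.parse(json);

// Binary route: fixed offsets, no tokenizing at all.
const buf = Buffer.alloc(16);
buf.write(entry.name, 0, 8, "utf8");       // bytes 0-7:  name
buf.writeUInt32LE(entry.versionMajor, 8);  // bytes 8-11: major version
buf.writeUInt32LE(entry.versionMinor, 12); // bytes 12-15: minor version

const fromBinary = {
  name: buf.toString("utf8", 0, 8).replace(/\0+$/, ""),
  versionMajor: buf.readUInt32LE(8),
  versionMinor: buf.readUInt32LE(12),
};

console.log(fromJson.name === fromBinary.name); // true
```

Memory-mapping takes this one step further: the OS pages the file into memory on demand, so a large lockfile can be "loaded" without reading most of it at all.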
The Tradeoffs
Here's something the Bun marketing doesn't emphasize: some of this speed comes from doing less.
When you run bun install, it doesn't verify that your cached packages match what's on the npm registry. npm and pnpm do this check—and it adds seconds of latency. Bun skips it.
Is this the right tradeoff? For most developers, probably yes. For enterprise environments with strict reproducibility requirements? Maybe not.
Every engineering decision has costs. Bun chooses speed and trusts you to know what you're giving up.
Part V: The Skeptics Have a Point
Not everyone was impressed.
In September 2023, developer Jared Wilcurt published a piece titled "Bun hype. How we learned nothing from Yarn." It became one of the most-discussed critiques in the JavaScript community.
His central argument was damning: we've seen this movie before.
<> "I see a lot of parallels between Yarn and Bun. Both targeted existing open source systems, and instead of actually contributing to them, just went off to create their own competing technology. Both sold themselves on being way faster. Both announced they were v1.0 and ready for production... while not actually supporting Windows."/>
The Windows criticism stung because it was true. When Bun announced its 1.0 "production-ready" release in September 2023, Windows support was experimental at best. The package manager, bundler, and test runner were disabled on Windows entirely.
<> "The Windows build is highly experimental and not production-ready."/>
For a tool claiming to replace your entire development stack, this was a significant gap.
The Compatibility Question
There was also the matter of Node.js compatibility. Bun marketed itself as a "drop-in replacement," but developer David Fekke tested this claim:
<> "While the makers of Bun claim that it is a drop in replacement for Node.js, I did not find that to be the case."/>
React Server Components didn't work. The crypto library wasn't fully compatible. Certain npm packages expected Node-specific APIs that Bun didn't implement.
The Yarn Parallel
Wilcurt's historical analogy was uncomfortable because it was accurate. Yarn launched in 2016 promising to be 10x faster than npm. It succeeded—for about a year. Then npm improved. Then pnpm arrived. By 2023, Yarn was effectively dead.
Would Bun follow the same trajectory?
Part VI: The Claude Code Pivot
Here's where the story takes an unexpected turn.
In late 2024, something changed in the software industry. AI coding assistants stopped being toys and started being tools. Claude Code, Anthropic's AI coding agent, was leading the charge—and it was growing faster than anyone expected.
By December 2025, Claude Code had reached a $1 billion annual run-rate in less than a year.
And inside Anthropic's infrastructure, something interesting was happening: they were using Bun.
Not just using it—depending on it. Bun's ability to compile JavaScript into a single static binary (bun build --compile) was perfect for distributing AI agents. No runtime installation required. No dependency conflicts. Just one file that runs anywhere.
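The workflow the article describes boils down to a single command. A minimal sketch (the `agent.ts` entry point is a placeholder for your own code):

```shell
# Compile a TypeScript entry point into one self-contained executable.
# The output embeds the Bun runtime and all bundled dependencies.
bun build ./agent.ts --compile --outfile agent

# Ship just this file — the target machine needs no runtime,
# no node_modules, no package manager.
./agent
```

For a fleet of AI agents, that collapses "install a runtime, resolve dependencies, hope versions match" into "copy one file and run it."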
<> "Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot."/>
Jarred started using Claude Code himself. He got, in his words, "kind of obsessed with it."
Through a series of long walks around San Francisco with engineers from Claude Code's team—one of them lasted four hours—an idea began to form.
What if Bun's future wasn't building a cloud hosting product?
What if it was becoming the operating system for AI agents?
Part VII: The Anthropic Deal
On December 1, 2025, Jarred published a post titled "Bun is joining Anthropic."
The acquisition marked several firsts:
- Anthropic's first-ever M&A deal
- The largest exit in the JavaScript tooling space in years
- A validation of the "single-developer creates infrastructure" model
The terms were not disclosed, but the context tells a story. At the time of acquisition:
- Bun had $0 in revenue
- It had 4+ years of runway from $26M in total funding
- Monthly downloads had just crossed 7.2 million (up 25% in a single month)
Jarred didn't need to sell. But he saw something:
<> "I think Anthropic is going to win. Betting on Anthropic sounded like a more interesting path. To be in the center of things. To work alongside the team building the best AI coding product."/>
The deal came with promises:
- Bun would remain MIT-licensed and fully open source
- The same team would continue development
- The GitHub repository would stay public
But make no mistake: this was a bet on a specific future. A future where most new code is written by AI agents, and the infrastructure that runs that code matters more than ever.
Part VIII: What This Means
Let's step back and consider what Bun's journey tells us.
For Developers
Bun proves that individual developers can still move mountains. Jarred wasn't backed by a tech giant. He didn't have a team of hundreds. He had Zig, a cramped apartment, and an obsession with performance.
The tools we use daily—npm, Webpack, Jest—were written by committees over decades. Bun showed that one person, with the right technical insight and enough stubbornness, can reimagine everything from scratch.
For the JavaScript Ecosystem
Bun's existence forced the old guard to improve. npm 10 is faster than npm 5. Deno has gotten more Node-compatible. Competition, even from brash newcomers, drives progress.
But it also raised uncomfortable questions. Why did it take a single developer with a Zig transpiler to show that JavaScript tooling could be 100x faster? What does that say about the innovation happening (or not happening) in established projects?
For AI
This might be the most important implication.
When Anthropic acquired Bun, they weren't just buying a JavaScript runtime. They were buying the infrastructure layer for an agent-first future.
AI coding assistants don't care about npm's history or Webpack's plugin ecosystem. They care about three things:
- Speed (agents spawn thousands of processes)
- Simplicity (fewer failure points)
- Portability (single-binary deployment)
Bun was accidentally optimized for all three. What started as one developer's frustration with slow builds became the foundation for the next generation of software development.
Epilogue: The Open Questions
Bun's story isn't over. In many ways, it's just beginning.
Will Node.js survive? Probably. It has too much momentum and enterprise adoption to disappear. But it will increasingly become the "COBOL of JavaScript"—maintained, but not where the innovation happens.
Will Bun's competitors catch up? They will try. Deno is already shipping more Node-compatible features. The JavaScript ecosystem is nothing if not competitive.
Will being owned by an AI company change Bun? This is the real question. Anthropic says Bun will remain open source and community-driven. But priorities inevitably shift. Features that serve AI agents may take precedence over features that serve human developers.
Will we remember Bun in five years, or will it follow Yarn into obsolescence? The Yarn parallel haunts this story. Every tool is faster—until the old tool catches up. Every new framework is better—until developers realize they traded one set of problems for another.
What makes Bun different is the Anthropic backing. It's no longer a VC-funded startup racing to find a business model. It's infrastructure for a company valued at over $60 billion, building products that are already generating billions in revenue.
That's a different kind of staying power.
A Final Thought
In his December 2025 blog post, Jarred Sumner shared a prediction he made in 2023:
<> "I think there's a future where programming languages are designed for language models instead of language models adapting to existing programming languages."/>
At the time, it seemed like a bold speculation. Now, with Claude Code writing more code in Bun's repository than any human contributor, it looks like prophecy.
Jarred started this journey because he was tired of waiting 45 seconds for his code to reload. Five years later, he had created the runtime that AI agents use to write code faster than any human ever could.
Sometimes the best things come from the simplest frustrations.
Jarred Sumner and his team continue to develop Bun as part of Anthropic. The project remains open source under the MIT license at github.com/oven-sh/bun.
Key Timeline
| Date | Event |
|---|---|
| 2014 | Jarred receives Thiel Fellowship, leaves college |
| May 2021 | First Bun transpiler benchmarks (197x faster than Babel) |
| July 5, 2022 | Bun v0.1.0 released, 20k GitHub stars in first week |
| August 2022 | Oven.sh founded, $7M seed round |
| September 2023 | Bun v1.0.0, $19M Series A from Khosla Ventures |
| October 2025 | 7.2M monthly downloads, 25% month-over-month growth |
| December 1, 2025 | Anthropic acquires Bun |
By the Numbers
- $0 — Bun's revenue at time of acquisition
- $26M — Total funding raised (seed + Series A)
- ~14 — Size of Bun team
- 7.2M — Monthly downloads (October 2025)
- $1B — Claude Code's annual run-rate
- 4x — JavaScriptCore's cold start advantage over V8
- 45 — Seconds that started it all
Sources: Bun official blog, InfoWorld interview (June 2023), Pragmatic Engineer newsletter, Reuters, Dev.to community discussions, FEK.IO analysis

