Reco.ai's $400 JSONata Rewrite Just Saved Them $500k Annually
Everyone says AI can't replace real engineering work. That complex language implementations require months of careful human craftsmanship. That $400 in tokens won't get you much beyond a decent chatbot conversation.
Reco.ai just proved all of that spectacularly wrong.
Seven hours. That's how long it took Nir Barak to completely reimplement JSONata—a complex query and transformation language comparable to jq with lambdas—using Claude. The total cost? $400 in tokens. The annual savings? A staggering $500,000.
Let me paint you the before picture. Reco.ai processes billions of events across thousands of distinct expressions in their SaaS security pipeline. They needed JSONata for their policy engine, but there was one problem: their entire stack runs on Go, while JSONata only had a JavaScript reference implementation.
So they did what any reasonable engineering team would do—they spun up a Kubernetes fleet of jsonata-js Node.js pods to handle RPC calls. Classic "JavaScript tax" in action. That fleet alone was costing them $300K annually.
> "The fact that this only took $400 of Claude tokens to completely rewrite makes it even more baffling. I can make $400 of Claude tokens disappear quickly in a large codebase."
That's from the Hacker News discussion (196 points, 185 comments), and it perfectly captures the absurdity of the situation. We're living in an era where $400 can eliminate $300K of infrastructure costs.
What gnata Actually Delivers
Barak didn't just create a toy implementation. gnata is a full parser/evaluator for JSONata 2.x semantics with some serious optimizations:
- 1000x speedup in evaluation
- Parsing of only the needed JSON subtrees, rather than the entire document
- A streaming layer called StreamEvaluator for batching
- Complete elimination of the RPC overhead
The performance gains come from more than just ditching Node.js. JSONata was designed for single evaluations, but Reco needed to run thousands of expressions against similar events. Their Go implementation batches these operations efficiently—something the original couldn't do.
The Elephant in the Room
Look, I'm as skeptical as anyone about AI magic bullets. Seven hours feels impossibly fast for reimplementing a complex language. But the numbers don't lie, and Reco is betting their production pipeline on this code.
The real question isn't whether Claude can write decent Go code—it can. It's whether this approach scales beyond one brilliant engineer working on a well-defined problem. Barak clearly knew JSONata inside and out before starting. He understood the performance bottlenecks. He had specific optimization targets.
This isn't about AI replacing engineers. It's about exceptional engineers using AI to eliminate months of tedious implementation work.
The timing matters too. This effort was directly inspired by Cloudflare's recent "How we rebuilt Next.js with AI in one week" post, where they reimplemented Next.js APIs on Vite for ~$1,100 in tokens. We're seeing a pattern emerge: companies using LLMs to escape the "JavaScript tax" in performance-critical paths.
For SaaS security firms like Reco, this kind of optimization isn't just about cost savings. When you're processing billions of audit logs from Salesforce and other platforms, detecting risky OAuth grants and anomalous integrations, every millisecond matters. Their AI governance policies depend on real-time analysis.
The broader implications are wild. If $400 can eliminate $500K in annual costs, how many other companies are sitting on similar optimization opportunities? How many JavaScript dependencies could be rewritten in Rust, Go, or Zig for the price of a decent dinner?
Maybe the future isn't AI replacing developers. Maybe it's developers using AI to finally build the infrastructure they always wanted, without the compromises they thought were permanent.

