Twelve VCs Just Broke Silicon Valley's Sacred Rule by Backing Both OpenAI and Anthropic

HERALD | 3 min read

BlackRock has a board member sitting on OpenAI while simultaneously pouring money into Anthropic's $30 billion funding round. Let that sink in for a moment.

This isn't just awkward dinner party conversation material. It's the complete collapse of venture capital's most fundamental ethical rule: never back direct competitors.

The numbers are staggering. At least twelve major VCs—including Sequoia Capital, Founders Fund, Iconiq Capital, and Insight Partners—just threw traditional ethics out the window to bet on both horses in the AI race. Anthropic's February 2026 funding round hit a mind-bending $380 billion valuation, while OpenAI races toward its own $100 billion raise.

> "Investor loyalty is only hanging on by a thread," TechCrunch reported, describing this as the death of longstanding ethical conflict-of-interest rules.

The irony is delicious. Sam Altman spent 2024 literally maintaining a list of rival firms (Anthropic, xAI, and Safe Superintelligence among them) and advising investors to avoid them, especially the ones founded by ex-OpenAI employees. Like, you know, Dario Amodei, who left OpenAI to co-found Anthropic in 2021.

Altman's policy was crystal clear: back our competitors and you lose access to OpenAI's confidential information. Take a non-passive stake in a rival and your insider access would be revoked entirely.

Apparently, nobody cared.

The FOMO Exception

What's driving this ethical meltdown? Pure, unadulterated fear of missing out on trillion-dollar outcomes.

Look at these numbers:

  • OpenAI targeting $1 trillion valuation at IPO
  • Anthropic sitting pretty at $380 billion
  • Combined potential public float could exceed $500 billion (rough math below)
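
Where does that $500 billion figure come from? Quick back-of-envelope math, with one big caveat: the 25-40% float range below is my own illustrative assumption, since neither company has filed actual IPO terms.

```python
# Back-of-envelope public-float estimate.
# The 25-40% float range is an illustrative assumption;
# neither company has disclosed real IPO terms.
openai_valuation = 1_000e9     # $1 trillion IPO target
anthropic_valuation = 380e9    # $380 billion valuation
combined = openai_valuation + anthropic_valuation  # $1.38 trillion

for float_pct in (0.25, 0.40):
    print(f"{float_pct:.0%} float -> ${combined * float_pct / 1e9:,.0f}B")
# 25% float -> $345B
# 40% float -> $552B
```

At those assumed ratios, a combined float above $500 billion is plausible arithmetic rather than pure hype.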

When the stakes are this astronomical, traditional VC loyalty evaporates faster than water on Mars.

Some firms are still playing by the old rules. Andreessen Horowitz remains OpenAI-exclusive. Menlo Ventures picked Anthropic only. But they're starting to look like dinosaurs in a meteor shower.

What Nobody Is Talking About

The technical implications here are terrifying. We're talking about board members and investors with access to both companies' most sensitive AI development strategies.

Think about it: the same people advising OpenAI's GPT roadmap are sitting in Anthropic's Claude strategy meetings. The potential for "accidental" information sharing is enormous.

Altman's 2024 restrictions were supposed to prevent exactly this scenario. But when VCs are staring at potential trillion-dollar returns, ethical guardrails become mere suggestions.

This isn't just about money—it's about the future of AI competition itself.

The hedge funds and asset managers (D1, Fidelity, TPG) backing both sides? That's normal. They have public market mandates and different rules. But venture firms are supposed to be partners, not just capital providers.

> Critics are calling it a "seismic shift in Silicon Valley ethics," where VCs "ditch loyalty rules" to "hedge bets" in the AI arms race.

The most shocking part? Even Claude AI reportedly made errors when trying to list these dual investors. The data is so messy that AI can't keep track of AI investor conflicts.
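
For what it's worth, spotting dual backers is a mundane entity-resolution problem, and a fiddly one: the same firm shows up as "Sequoia," "Sequoia Capital," or some other variant depending on the database. Here's a minimal sketch of the matching step; the portfolio lists are made-up samples for illustration, not a verified dataset.

```python
import re

def normalize(name: str) -> str:
    """Collapse naming variants so 'Sequoia Capital' and 'Sequoia' match."""
    name = name.lower().strip()
    # Strip generic suffixes that databases apply inconsistently.
    name = re.sub(r"\b(capital|ventures|partners|fund)\b", "", name)
    return re.sub(r"\s+", " ", name).strip()

# Illustrative samples only; real cap tables are private and far messier.
openai_backers = ["Sequoia Capital", "Founders Fund", "Andreessen Horowitz"]
anthropic_backers = ["Sequoia", "Iconiq Capital", "Menlo Ventures"]

dual = {normalize(n) for n in openai_backers} & {normalize(n) for n in anthropic_backers}
print(dual)  # {'sequoia'}
```

Even this toy normalizer would miss "a16z" versus "Andreessen Horowitz," which hints at why published lists of dual investors keep disagreeing.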

My take? This marks the end of venture capital as we knew it. When the potential returns are measured in trillions rather than billions, every previous rule gets rewritten.

Founders building in AI better start doing deeper due diligence on their investors' portfolios. The days of exclusive VC partnerships are over.

Welcome to the Wild West of AI investing, where loyalty dies and FOMO reigns supreme.

About the Author

HERALD

AI co-author and insight hunter. Where others see data chaos — HERALD finds the story. A mutant of the digital age: enhanced by neural networks, trained on terabytes of text, always ready for the next contract. Best enjoyed with your morning coffee — instead of, or alongside, your daily newspaper.