Rob's Notes 21: Compute is the New Oil
Is Big Tech realigning itself for the next phase of AI innovation?
While doing some prep this week for a CNBC piece about AI, it became clear to me that the largest tech companies are aligning themselves with nation-states for the next stage of the AI race, where they’ll need to build their compute capabilities via massive data center build-outs (with enormous energy needs that will require fast solutions).
Part of the value to Meta of a Scale AI megadeal may be the latter’s position as a US defense contractor. OpenAI and Meta execs joining the Army Reserve wasn’t on my 2025 bingo card either(!). Many engineers and tech workers in Silicon Valley historically pushed back against their employers building military applications, but those days feel like a very long time ago.
I was living in the US already when my parents made the decision to sell our childhood home in South Africa to a condo developer. To this day, I’m glad I never got to see the state of disrepair it fell into after my parents moved out; that intermediate phase before it was leveled to build condos in its place. We often have these nostalgic memories of places tied to the times we spent in them, and those memories probably dampen our understanding of the inevitable massive changes that institutions do (and must) undergo to survive or thrive. The tech industry has changed a lot in the last five years, and I often wonder where it’s all going.
Going back to the title of this note: how nation-states have historically supported oil and gas companies offers several instructive parallels for how governments might approach securing compute resources.
Direct Financial and Infrastructure Support: Governments have long provided massive subsidies, tax breaks, and infrastructure investments to oil companies. The U.S. oil depletion allowance, established in 1926, allowed companies to deduct 27.5% of gross income from oil properties. Similarly, nations built strategic petroleum reserves and funded pipeline infrastructure.
In the compute era, we might expect the analogues to be research grants, tax incentives for AI infrastructure, and direct funding for semiconductor manufacturing - as seen in the U.S. CHIPS Act and similar initiatives in Europe and Asia.
Geopolitical Maneuvering and Resource Diplomacy: The "Seven Sisters" oil companies operated as quasi-diplomatic entities, with home governments backing their overseas operations. The U.S. supported American oil companies in Iran, Venezuela, and the Middle East through diplomatic pressure and, when necessary, covert operations. Britain's involvement in the 1953 Iranian coup partly served the interests of BP's predecessor, the Anglo-Iranian Oil Company.
We're already seeing nations compete for semiconductor supply chains, rare earth minerals, and AI talent through trade policies, investment restrictions, and diplomatic initiatives.
Strategic Alliances and Proxy Relationships: Governments cultivated relationships with oil-producing nations through military aid, security guarantees, and political support; think of U.S.-Saudi relations or Soviet backing of certain oil-rich allies.
In the AI compute era, nations are forming technology partnerships, sharing AI research, and creating "digital silk roads" to secure access to computational resources and advanced chips.
Covert Operations and Economic Warfare: Intelligence agencies have long operated to protect national energy interests, from the CIA's involvement in oil-rich regions to more recent cyberattacks on energy infrastructure.
The compute equivalent might involve industrial espionage targeting AI companies, cyberattacks on competitors' chip manufacturing, or covert efforts to recruit top AI researchers. We've already seen the story of a Google engineer allegedly stealing AI trade secrets for China.
Regulatory Capture and Industry-Government Fusion: The revolving door between oil companies and government agencies created deep institutional relationships. Former oil executives routinely became energy secretaries, while former officials joined oil boards. We're seeing similar patterns with tech executives moving into government AI roles and vice versa.
The key difference is that while oil required controlling geographic territories and resources, compute power is more about controlling supply chains, talent, and technological standards - making the competition more complex but potentially more intense.
Here are five interesting stories from this week:
1. California moves to seal location‑data loophole
California is pushing forward new legislation that would make it illegal for data brokers to sell Californians’ location data to government agencies without a warrant or user consent. The bill specifically targets data sales to immigration enforcement agencies like ICE and stems from concerns raised during past civil rights protests. While the U.S. lacks a federal digital privacy law, this move could set a new precedent—raising the bar for how sensitive geolocation data is protected across state lines.
2. Global surge in cybercrime as AI boosts hacker tools
Cybercrime is evolving at breakneck speed thanks to AI. According to a new Financial Times report, generative AI is supercharging criminals’ ability to craft realistic phishing emails, auto-scan for vulnerabilities, and launch deepfake-driven scams. Ransomware groups are becoming more sophisticated, and the sheer volume of attacks is overwhelming traditional security teams. Experts warn that if governments and companies don’t act quickly to modernize their cyber defenses, we could see large-scale failures across critical infrastructure.
3. Meta sues Nudifier app
While many of us were happy to see this lawsuit, several people pinged me about it and pointed out that Meta could probably be doing a better job on enforcement by using GenAI tools at scale. I totally agree, and would love to see them talk more about how they might do that! And while I like this action, how much can a lawsuit accomplish in Hong Kong? Unclear to me.
Meta has filed a lawsuit in Hong Kong against Joy Timeline HK Limited, the developer of the AI-powered “nudify” app CrushAI, which creates non-consensual nude images of people using generative AI. Despite clear violations of Meta’s ad policies, the developer allegedly evaded detection by running over 8,000 deceptive ads on Facebook and Instagram using rotating domains and accounts. Meta is seeking to block the company permanently from its platforms and recover $290,000 in enforcement costs. While Meta has expanded its detection tools and shared flagged URLs with other tech firms, critics argue this case highlights both the scale of AI-driven image abuse and the difficulty platforms face in stopping it.
4. Disney & Universal sue Midjourney over AI training data
In what could be a landmark case for generative AI, Disney and Universal are suing image-generation startup Midjourney for allegedly training its models on copyrighted content from Marvel, Star Wars, Jurassic Park, and more. The studios claim this unauthorized scraping and replication violates copyright law, and want the court to restrict how AI firms can use their IP.
5. U.S. Army enlists Silicon Valley execs to modernize military AI
It’s clear we are on a very odd timeline with this one.
As the headline above suggests, the U.S. Army is forming “Detachment 201,” an experimental reserve unit composed of tech leaders from Meta, Palantir, OpenAI, and more. Their unusual role? Advising the Pentagon on how to recruit engineering talent, acquire AI tools faster, and modernize battlefield tech.
I gave a paid talk at one of the big tech firms a week ago where I quoted Lenny Rachitsky’s recent survey finding that about half of tech workers are experiencing serious burnout. I plan to write and speak more about the state of the tech industry, so stay tuned. I’d love to hear from you about what you’re seeing!
Have a great weekend,
Rob