#34 - The power of three
Thoughts on Big Tech, Big AI, and the return of empires
Part of this article references Technofeudalism by Yanis Varoufakis, which I recently finished reading as part of a book club with CPO Connect, a community for product leaders at Head of level and above. We held the book discussion yesterday, so credit for some of the ideas in this article must also go to fellow colleagues in the book club: Harry, John, Lilli and Ashwin. Thank you.
It feels like this week the rhetoric around AI has cranked up a notch or two. In various communities and WhatsApp groups I’m part of, the same three articles have been shared and thoughts invited:
Something big is happening, by Matt Shumer
The AI Vampire, by Steve Yegge
HBR published a study looking at how using AI leads to burnout (this article discusses it without the HBR paywall)
Those firmly in the pro-AI camp call this the ‘doomsday narrative’, pushing back on the idea that AI will be the end of humanity and arguing that it is instead a great optimiser at work and home. Framing it this way makes it easy to dismiss the genuine concerns of the sceptics.
But is AI itself really ‘good’ or ‘bad’? As John pointed out in our book club discussion, we are quick to put labels on things we do not properly understand, to help us relate to them and give meaning to our experiences of them. The technology itself may be neutral; it is the actors — people and their intentions — that define its moral qualities.
Yet stepping back, I see a broader convergence of forces forming in tech and beyond that understandably makes us worried about our collective future. Power is consolidating across three critical domains: economic, political, and informational, all driven by tech. A handful of companies now control the infrastructure that underlies our economy. Democratic institutions are weakening as autocrats rise and great powers carve up spheres of influence. Billionaires own the platforms that shape what we see, read, and believe. And now AI is the vehicle accelerating all three forms of consolidation simultaneously. In this article, I'm going to explore each of these domains in turn, showing how they reinforce each other and where this convergence is taking us.
Technofeudalism (economic power)
Having read Varoufakis’ book, I can see the same pattern emerging with the AI companies that played out with cloud computing in the 2010s. But let me first describe the main premise of Technofeudalism:
The use of computers in finance made it possible to create complex financial instruments like derivatives and collateralised debt obligations. These were so complex that even those trading them often didn't fully understand the underlying risks or assets they contained.
These financial instruments packaged together large quantities of risky sub-prime mortgages (loans to borrowers with poor credit histories). By slicing and repackaging these loans, they were presented to markets as low-risk investments, disguising their actual exposure.
When borrowers began defaulting on sub-prime mortgages at scale, the interconnected nature of these financial products meant losses cascaded through the global financial system, triggering the 2008 financial crisis. Major institutions including Lehman Brothers collapsed.
However, governments deemed many failing banks ‘too big to fail’ due to the systemic risk their collapse posed. Central banks and governments (particularly in the US and UK) provided massive bailouts using public funds to prevent total financial system collapse.
This led to a period where central banks implemented quantitative easing, creating new money to purchase government bonds and other assets. This flooded financial markets with cheap capital. Banks and investors, facing low interest rates, sought profitable investment opportunities.
Big Tech companies (Google, Amazon, Apple, Facebook) appeared as attractive investment options, having demonstrated rapid growth and market dominance. They could access virtually free capital through ultra-low interest rates.
Big Tech used this cheap capital to build what Varoufakis calls cloud capital: the digital infrastructure (data centres, cloud platforms, algorithms, networks) that now underpins much of the economy. Crucially, this capital doesn't just produce goods but controls access to digital markets and services.
Varoufakis argues profit has become detached from traditional capitalist relations. Big Tech doesn't primarily profit by selling products at a markup on production costs. Instead, they extract rent by controlling access to their platforms (through subscription fees, commission on transactions, advertising access fees).
This rent-seeking model resembles feudalism more than capitalism. Just as feudal lords extracted rent from serfs for access to land, Big Tech extracts rent from users and businesses for access to cloud capital. We've become cloud serfs, paying dues to our digital landlords for access to essential infrastructure they control.
So we can see this same pattern now playing out with the likes of OpenAI, Anthropic and Google — the AI ‘Big 3’. They’re in a race to grab as much AI-land as possible: lock in customers and make it exceptionally hard to switch, whilst charging a rent (monthly subscription) to access their models. This has been possible because they have the most advanced frontier models, coupled with the deepest pockets based on their backers:
OpenAI: backed by Microsoft ($13 billion)
Anthropic: backed by Google ($2.3 billion) and Amazon ($4 billion)
Google: self-funded (unclear, but expected to be tens of billions of dollars based on overall R&D expenditure)
A note on Microsoft and Meta, who are conspicuously absent from the Big 3 above. Microsoft has made a substantial investment in OpenAI, so whilst it has its own specialised AI models (Phi-3 and Phi-4), its Copilot consumer product actually uses OpenAI’s models under the hood. Meanwhile, Meta has launched its own LLM, Llama, but chosen a different route: open-sourcing it. Meta’s strategy rests on a belief that the real competition isn’t at the model layer but at the application or platform layer above it — which is where its strength lies, via its other products. And regardless of how that bet turns out, Meta then has its own model powering Facebook, Instagram, Threads and WhatsApp, and is not beholden to the Big 3.
It also calls into question the assumption that we are living in an AI bubble. Varoufakis would argue we’re not. Bubbles are predicated on the old rules of capitalism, and that’s not the underlying economic mechanism at play here. The likes of Microsoft and SoftBank ($41 billion total investment) are ploughing capital into OpenAI without expecting a short-term return (OpenAI lost $14 billion last year alone), so that it can grow fast enough to become the dominant player in the market. Once that’s achieved, they can lock in consumers in perpetuity and extract whatever rent they want — just as feudal lords needed no return on investment in their land, only the ability to collect rent from those who had no choice but to use it.
Democracy in retreat (political power)
As part of the research for my TEDx talk I’ve been watching others on YouTube, and last week came across Sarah Wilson’s How to Respond to Societal Collapse (well worth a watch if you have 14 minutes).
One of the stats she shared stood out to me: more than 70% of the world’s population now lives under autocracy, and even in the West, democratic norms are weakening. As Harry neatly put it: democracy is precious. We've lived in a short sliver of time where it was able to flourish — since WWII social democracy has bloomed, with a strong welfare state, worker rights and protections, and regulated capitalism. But all that is now ending.
We saw global power consolidating into elite hands at Davos this year. Trump announced his new Board of Peace: a private members’ club for world leaders with a $1 billion admission fee for permanent membership. Early signatories included the autocratic regimes of Hungary, Azerbaijan and Turkey. Notably, major European democracies — the UK, France, Germany, Italy, Norway, Spain, Sweden and Slovenia — all declined to join. Trump’s approach appears aimed at undermining supranational democratic institutions like the United Nations; the US itself currently owes $1.5 billion in unpaid UN dues.
Mark Carney’s speech at Davos warned of "a rupture in the world order", as great powers abandon even the pretence of rules for the unhindered pursuit of their interests. The old international order, where countries could be held accountable for their actions, no longer exists. Instead, new alliances must be forged, one by one, with allies willing to do the work to preserve relations and democracy. It's another stark illustration of how power consolidates faster than ever: autocrats can consolidate power in weeks, whilst defending democracy remains painstakingly slow.
The world is fracturing along the lines of three great powers. The US is asserting dominance in its hemisphere through force — demanding Greenland from Denmark, detaining Venezuela's president. Russia's invasion of Ukraine signals broader territorial ambitions in Eastern Europe. China continues expanding its economic influence across Southeast Asia and Africa through infrastructure investment and debt diplomacy. We're returning to a world of empires and spheres of influence, where might makes right and smaller nations become vassals to their regional hegemon. As our book club noted, Britain is not blame-free here — we pioneered this model during our colonial era.
So what is the role of AI in this political power consolidation? Autocratic regimes control their populations through mass surveillance, and AI supercharges this capability. Ring's recent Super Bowl advert neatly illustrates how this works: under the feel-good guise of searching for missing pets, the commercial explained how Ring doorbells can send data through a network to a central security apparatus (currently provided by Flock). The facial recognition tech that can be used to look for a lost Labrador can also be used to identify potential deportees and women who have abortions.
Controlling the narrative (informational power)
Bezos purchased The Washington Post in 2013 for $250 million in cash. For years this wasn’t especially problematic, as he refrained from meddling in its editorial decisions. In October 2024, however, just before the presidential election, Bezos intervened to block the paper’s editorial board from endorsing Kamala Harris for president — breaking its tradition of endorsing presidential candidates, which it had done since 1976. Critics noted that the timing was suspicious, coming shortly after Bezos had met with Trump about his other company, Blue Origin.
When Musk bought Twitter in 2022, he claimed it was to defend free speech. What he actually bought was algorithmic control of the platform’s information flow. He can amplify the voices that serve his interests, suppress those that don't, and flood users' feeds with content that reinforces his preferred ideology. This isn't traditional media ownership. Murdoch could influence what his newspapers published, but readers chose whether to buy The Sun. Musk controls the algorithm that determines what 500 million users see before they've made any choice at all (except, of course, whether to open the app in the first place).
The consolidation of media ownership in the hands of a few billionaires isn’t coincidental. Many of them share what Curtis Yarvin calls the Cathedral worldview: the belief that media and academia (along with the civil service) form an informal power structure that shapes acceptable discourse. Yarvin, a Silicon Valley reactionary whose ideas have influenced figures like Thiel and Andreessen, argues this network enforces progressive orthodoxy and must therefore be dismantled. The solution, in their view, isn't to reform these institutions but to capture them. Billionaires buy newspapers and social platforms to deliberately dismantle the institutional checks that might hold them accountable.
When the same actors control economic infrastructure, shape political power, and own information channels, democratic accountability becomes nearly impossible. Which brings us to an uncomfortable question: what's our role in this?
Complicity and resistance
Reading this article has probably made you feel quite uncomfortable — it made me uncomfortable writing it. It feels dark, heavy and depressing to think about how different forms of power are consolidating into the hands of the few. Nobody wants to believe they are part of the problem, especially when the problems feel too big to solve individually — and especially not when that involvement results in harm to others. Yet in so many ways we are complicit in this process: using social media to stay connected with family and friends; using Big Tech products at work and home; investing our pensions in Big Tech; relying on AI tools that entrench these power structures further.
An underlying point in these articles, particularly Shumer's, relates to a fundamental aspect of these systems: power can be consolidated because it is taken from elsewhere. Shumer calls this out explicitly: when we give more hours than we are paid for, we are being exploited. AI enables us to do more in the same amount of time — but we’re not paid for doing more. That extra value goes up the chain to the technofeudalists.
For those of us working in tech, the cognitive dissonance cuts deeper. Building AI features, optimising platforms, creating subscription models — these are the skills that advance our careers and make our companies valuable. What looks like innovation and commercial opportunity is also the mechanism through which power consolidates into fewer hands.
So what is the answer here?
We can see the pushback happening. Amazon cutting ties with Flock; users leaving X for Bluesky; internal resistance at Google forcing it to end its contract with the Pentagon over Project Maven; Germany becoming the first member state to implement the EU’s AI Act, signalling how democracies can still exert their power to constrain Big AI.
It’s not at all easy to do this. No doubt the loss of subscribers at WaPo is what has led to the substantial layoffs announced last month. Those are real people losing their jobs, real consequences of resistance. And that is what those with power are banking on — we won’t blame them, we will blame each other.
The power consolidation I’ve described only works if we remain atomised: individual consumers making individual choices, individual workers absorbing individual consequences. But as Malcolm Gladwell argues in The Tipping Point, dramatic social change often happens suddenly once critical mass is reached. Erica Chenoweth’s research on civil resistance movements puts that critical mass remarkably low: historically, active participation from just 3.5% of a population has been enough to bring about systemic change.
Varoufakis ends Technofeudalism with a call for precisely this kind of collective action: coordinated resistance from those who build, regulate, and fund these systems. The tipping point he describes isn’t about everyone opting out. It’s about enough people in positions of influence refusing to participate in the consolidation of power into feudal structures. My call to action is to recognise that innovation and power consolidation have become dangerously entangled. For every AI feature, every platform optimisation, every subscription model we create and use, we must ask ourselves whether it is genuinely useful or just another way to extract rent from infrastructure people will have no choice but to use.