Collective Sovereignty
AI can either scale extraction or help people reclaim power together

In This Post: AI is often discussed in terms of business sovereignty, data sovereignty, national sovereignty, and who controls the intelligence layer. This post argues for another kind of sovereignty: collective sovereignty.
Collective sovereignty means using AI to help ordinary people see clearly together, coordinate choices, understand leverage, and reclaim power over systems that increasingly shape daily life. AI can either deepen the old pattern of extraction, where scale serves a powerful few, or it can help people organize buying power, voting power, taxpayer power, community leverage, and steadier decision-making from the ground up.
The central question is not whether AI will organize human behavior. It will. The question is whether it organizes us for extraction or for maturity.
The Missing Sovereignty Conversation
A lot of people in technology are talking about sovereignty now. Business sovereignty. Data sovereignty. National sovereignty. The concern is that a few large AI companies could control too much of the intelligence layer. That concern is real.
But there is another kind of sovereignty we need to talk about: collective sovereignty.
By collective sovereignty, I mean the ability of ordinary people to see clearly together, coordinate choices, understand leverage, and reclaim some power over the systems that shape daily life. That may sound abstract at first, but it is actually very practical. It includes buying power, voting power, taxpayer power, community leverage, and the ability to understand what is happening before decisions are already made for us.
The Pattern I Was Trying to Name in 2012
This is not a new concern for me. In 2012, when I wrote It’s Just Commerce: Returning Balance to Business, I was trying to name a pattern that troubled me. We had built a system of commerce that depended on people but increasingly did not serve people. Our purchases, labor, taxes, attention, trust, and communities powered the system, yet more and more of the benefit and decision-making concentrated upward.
Commerce itself was not the problem. The problem was imbalance. Scale became the great advantage of modern business, but too often that scale served a handful of powerful people who could use size, market dominance, and government relationships to keep their business models protected. The larger some companies became, the more they could shape the marketplace, influence government, and make ordinary people feel smaller inside a system they were still powering every day.
Now I see the same pattern moving into AI.
They Are Doing It Again
The old model says scale wins. Bigger companies, bigger platforms, bigger data centers, bigger capital, bigger government access. The assumption is that the future will be shaped by those large enough to dominate the intelligence layer.
Maybe that is where the current path leads. But I do not believe it has to be the only path.
As I have watched a handful of billionaires and large technology companies race for AI dominance, something clicked in me: they are doing it again. The same old pattern, only more refined this time. Scale is being built first, the public cost is being absorbed quietly, and the benefits will likely concentrate among the few people and companies positioned to own the next layer of the economy.
This is not about one billionaire, one company, or one political party. It is about a pattern of scale, power, government access, and public cost.
The Public Powers the System
We can already see pieces of this pattern in the energy buildout around AI. The data centers needed to power artificial intelligence are creating enormous new demand for electricity. Reuters recently reported that U.S. residential electricity prices are expected to rise in 2026 and 2027, with AI and data-center expansion contributing to power-demand growth. Texas is projected to see some of the largest electricity-demand increases.
That does not mean AI data centers are the only reason bills rise. Energy markets are complicated. Aging infrastructure, gas prices, transmission upgrades, weather, policy choices, and population growth all matter. But the pattern is still important: ordinary people may help absorb the cost of building the next layer of private technological power before they have any meaningful say in what that power should serve.
Texas already gives us another glimpse of the same pattern through cryptocurrency mining. Large flexible users, including bitcoin miners, can participate in demand-response or curtailment arrangements, reducing power use during peak demand and receiving market compensation or energy credits. Riot Platforms, for example, reported earning $31.7 million in power and demand-response credits during August 2023.
That may make sense inside the technical logic of grid management. But from the public’s point of view, it raises a larger question: why are ordinary households increasingly asked to live inside the cost and complexity of systems built around private scale?
What Angers Me Is the Shrug
What angers me is the shrug. The idea that massive job disruption, higher energy demand, public infrastructure pressure, and deeper concentration of power are simply the cost of progress. I do not accept that. Commerce was designed by people. Business models are designed by people. AI systems are designed by people. If the current path treats human beings as costs, inputs, or collateral damage, then we should not pretend that was inevitable. It was a choice.
And if it was a choice, then another choice is possible.
Exhaustion Is Part of the Pattern
If many Americans do not see this pattern clearly, it is not because they do not care. I think many people are exhausted. They work hard, often for long hours, with too few real breaks and too little space to study how power is moving around them. They are trying to pay bills, raise families, manage health issues, care for parents, keep jobs, and make it through the week. Meanwhile, the systems that depend on their labor, taxes, purchases, attention, and trust keep growing more complex.
That exhaustion is not separate from the pattern. It is part of what makes the pattern so hard to see. The people powering the system often have the least time to understand how the system is being redesigned around them.
AI Can Make This Worse, or Reverse the Direction
This is where AI could either make the problem worse or help relieve it. AI built from an extraction worldview will use that exhaustion to predict, persuade, segment, and manage people more efficiently. It will help institutions understand people better than people understand the institutions shaping them.
But AI built for collective sovereignty would move in the opposite direction. It would help people understand systems that have become too complex to track alone. It could help communities compare prices, coordinate buying power, follow legislation, understand public money, challenge bad incentives, reduce emotional reactivity, and make steadier decisions together.
In that sense, AI could become something like a lobbying firm for ordinary people. Not a partisan machine. Not another outrage engine. A practical coordination tool. A way for people to understand leverage and act together with more clarity.
When Voting Feels Trapped
Voting still matters. I want to be clear about that. But many Americans can feel how trapped the political system has become. Gerrymandered districts, partisan media, donor influence, and the constant pull of us-vs.-them conflict keep people fighting inside a structure that often benefits from the fight.
That does not mean we give up on democracy. It means we need additional leverage while democracy repairs itself.
One of the most immediate forms of leverage we still have is where we place our money, attention, subscriptions, purchases, and trust. If voting at the ballot box has been weakened by division and structural distortion, then voting collectively with our dollars becomes even more important.
That kind of collective economic choice could eventually influence the political system too, because government responds to organized power. Right now, too much organized power sits with large institutions, lobbyists, donors, and corporations. AI could help ordinary people organize a different kind of power from the ground up.
Scale Is Not the Same as Strength
This is also where scale needs to be reconsidered. The largest AI companies may have enormous advantages, but scale is not always the same as strength. AI may expose where scale is actually complexity pretending to be strength. The larger a system becomes, the harder it can be to keep its incentives, data, people, workflows, products, and decisions aligned.
So maybe the goal is not to chase everything the largest AI companies chase. Maybe the goal is to build AI clear enough to serve a real human need well.
For me, that lane is collective sovereignty.
AI should not merely make individuals more productive. It should make communities more capable.
Spherical Thinking Is Practical Systems Intelligence
That is also what I mean by spherical thinking. Spherical thinking is not moral superiority. It is practical systems intelligence. It asks who or what is being extracted in an unbalanced way, not only because harm matters, but because unbalanced extraction does not sustain. It eventually weakens the system it feeds on.
This is the opposite of the narrow extraction mindset. It does not ask only, “Can this scale?” It also asks, “What happens when it does?”
The Sweetness of Seeing Clearly Together
We do not yet have much lived experience of collective sovereignty. Most of us know what it feels like to be managed, marketed to, divided, priced, tracked, exhausted, and nudged. We know what it feels like to climb alone, carrying our own gear, trying to make sense of systems that were not built for us to understand.
But we have not had much experience seeing clearly together. Buying together. Voting with better information. Understanding public money. Negotiating from shared intelligence. Making decisions with enough steadiness and trust that the climb does not feel so lonely.
That kind of shared clarity could create joy. And joy matters. Fear may wake people up, but joy is what helps people keep climbing.
It’s Not Too Late
The fear that a handful of billionaire-controlled companies could shape AI in destructive ways is real. But government intervention may not be the only answer. There may also be a market answer, a civic answer, and a human answer.
What if people had the choice to use AI built for their needs? AI that helps them live better, understand more clearly, reduce emotional reactivity, coordinate with others, and reclaim power together?
I believe many people would choose that. They would vote with their dollars, because a tool that helps people become steadier, wiser, and more capable together is not a side project. It may be one of the most practical uses of AI we could build.
That does not mean AI can prevent every disruption or solve every human problem. But it can be aimed differently. It can treat people as participants in the future, not debris from progress.
That is what I mean by collective sovereignty.
And it is not too late.
About David
David Barnes is the co-founder of Peace of Mind Overtures and co-creator of the Alignment Movie Process, a practice that uses film, intention, and carefully developed resonance statements to help people notice emotional patterns, release old reactions, and find their way back to balance.
His work explores emotional maturity, coherence, collective focus, and how AI might help humanity find the signal instead of amplifying the static.