By Mark Keenan
March 23, 2026
Over the past two decades, the structure of power in society has begun to change in ways that are easy to feel but harder to describe. Authority is no longer exercised primarily through visible institutions (parliaments, courts, or even public debate), though such mechanisms have never been the whole story. Increasingly, it is embedded in systems: financial, digital, and informational.
What appear to be separate developments (expanding debt, digital currencies, artificial intelligence, cultural transformation, and the moderation of public discourse) may be better understood as parts of a broader shift. Whether or not there is a single coordinated plan, a convergence is becoming clear: a set of systems evolving in ways that can shape human behavior, perception, and choice.
At stake is not simply policy, but the architecture of modern life itself.
Debt and the Control of Time
Modern economies run on debt. Governments, corporations, and households all rely on borrowed money to function. This is not new. What has changed is the scale and centrality of debt as an organizing principle.
Debt does more than finance activity; it structures the future. When income must service prior obligations, choices become constrained. Economic life begins to follow paths shaped by interest payments, refinancing cycles, and credit conditions.
In this sense, debt is not only a financial instrument; it is a claim on time. It reaches forward, shaping future labor and limiting flexibility. When entire systems depend on the continuous expansion of credit, the space for independent decision-making narrows accordingly.
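The claim that debt reaches forward in time can be made concrete with simple amortization arithmetic. The sketch below uses the standard annuity formula; all figures (loan size, rate, income) are invented for illustration only.

```python
# Illustrative arithmetic only: how a fixed debt obligation claims a share
# of future income. All figures below are invented for this example.

principal = 300_000.0   # outstanding loan
annual_rate = 0.05      # 5% interest
years = 25              # repayment horizon
income = 60_000.0       # annual income of the borrower

# Standard annuity formula: the fixed yearly payment on an amortizing loan.
r = annual_rate
payment = principal * r / (1 - (1 + r) ** -years)

# The fraction of each year's income already spoken for, decades in advance.
share_of_income = payment / income
print(f"Annual payment: {payment:,.0f}")
print(f"Share of income committed for {years} years: {share_of_income:.0%}")
```

With these (hypothetical) numbers, roughly a third of the borrower's income is committed for a quarter of a century before any other choice is made, which is the sense in which debt "structures the future."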
These dynamics are explored in more detail in my book The Debt Machine, which examines how modern credit systems shape economic behavior over time.
Digital Currency and the Control of Behavior
The gradual shift toward digital payment systems introduces a further layer of possibility. Unlike cash, digital money can be tracked, analyzed, and, at least in principle, conditioned.
Central bank digital currencies (CBDCs), now openly discussed in policy circles, raise important questions. If money becomes programmable, it may be possible to embed rules into transactions themselves: where money can be spent, when it expires, or under what conditions it is released.
Whether or not such capabilities are widely implemented, the direction is clear. Financial systems are becoming more integrated with data systems, and data systems are becoming more capable of influencing behavior.
Money, once a neutral medium of exchange, risks becoming a channel through which policy is quietly enforced.
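What "programmable" means here can be sketched in a few lines of code. The following is a toy model only, not a description of any actual or proposed CBDC design; the class, rule names, and categories are all hypothetical, chosen to mirror the three rule types mentioned above (where money can be spent, when it expires, and under what conditions it is released).

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical illustration: a toy "programmable balance" in which rules
# are embedded in the money itself rather than enforced externally.

@dataclass
class ProgrammableBalance:
    amount: float
    expires: date                     # funds become unusable after this date
    allowed_categories: set = field(default_factory=set)  # e.g. {"food", "transit"}

    def can_spend(self, value: float, category: str, today: date) -> bool:
        """Every embedded rule is checked before a transaction is authorized."""
        if today > self.expires:
            return False              # expiry rule
        if category not in self.allowed_categories:
            return False              # spending-category rule
        return value <= self.amount   # sufficient funds

    def spend(self, value: float, category: str, today: date) -> bool:
        if self.can_spend(value, category, today):
            self.amount -= value
            return True
        return False

wallet = ProgrammableBalance(100.0, date(2026, 12, 31), {"food", "transit"})
print(wallet.spend(20.0, "food", date(2026, 6, 1)))    # authorized: True
print(wallet.spend(20.0, "travel", date(2026, 6, 1)))  # blocked by category rule: False
```

The point of the sketch is structural: once rules live inside the transaction logic, changing policy is a code change, not a legal or public process.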
Artificial Intelligence and the Shaping of Thought
Artificial intelligence represents another layer of this transformation. It is often described as a tool for efficiency, prediction, and optimization. In practice, its most immediate impact is on information.
Search engines, recommendation systems, and AI-generated responses increasingly mediate what people see and read across digital platforms. Information is not simply accessed; it is filtered, ranked, and presented through algorithmic systems.
As explored in my book The AI Illusion, these systems do not operate in a vacuum. They are trained on curated datasets, shaped by institutional priorities, and bounded by design choices about what counts as "reliable" or "safe."
The result is subtle but significant. Control is no longer exercised by telling people what to think, but by shaping the environment in which thinking occurs. Some ideas become more visible, others less so. Over time, this influences not only opinions, but the range of questions people feel able to ask.
This is not direct coercion. It is cognitive framing.
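The mechanism of "shaping the environment" can be shown with a minimal ranking sketch. Everything here is invented (the items, the sources, the weights); the point is only that a single design parameter, a per-source "reliability" weight set by the system's operators, silently reorders what users see.

```python
# Hypothetical illustration: how one design choice (a per-source
# "reliability" weight) changes what a ranking system surfaces first.
# Items, sources, and weights are invented for this example.

items = [
    {"title": "A", "relevance": 0.90, "source": "independent_blog"},
    {"title": "B", "relevance": 0.75, "source": "major_outlet"},
    {"title": "C", "relevance": 0.60, "source": "major_outlet"},
]

# The weight table is chosen by the system's designers, not by users,
# and is typically invisible to them.
reliability = {"major_outlet": 1.0, "independent_blog": 0.5}

def rank(items):
    # Final score = raw relevance x designer-assigned reliability weight.
    return sorted(
        items,
        key=lambda it: it["relevance"] * reliability[it["source"]],
        reverse=True,
    )

for it in rank(items):
    print(it["title"])  # item A, most relevant by raw score, is demoted to last
```

No result is removed and no claim is contradicted; visibility is simply rearranged, which is exactly the kind of framing the passage above describes.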
Identity and the Reconfiguration of the Self
Digital identities, profiles, and data records are increasingly becoming the interface through which individuals interact with institutions. As these systems expand, the distinction between lived identity and administered identity may begin to blur.
The question is not whether identity can evolve (it always has) but how much of that evolution is shaped organically, and how much is mediated by systems designed for classification and control.
Climate Policy and the Management of Movement
Climate policy provides another example of this convergence. Environmental concerns are increasingly being integrated into financial regulation, investment frameworks, and long-term planning.
Carbon accounting, emissions tracking, and sustainability metrics are becoming embedded within economic systems. In principle, this allows for better management of environmental impact. In practice, it also creates mechanisms through which access to resources and mobility could be influenced.
While the role of carbon emissions in climate change remains the subject of ongoing debate, as I discuss in my book Climate CO2 Hoax, the structural direction of policy is clear.
If financial systems, identity systems, and environmental metrics converge, it becomes technically possible to link economic participation with behavioral criteria: how one travels, consumes, or produces.
Again, this is not necessarily how these systems will be used. But it is what they make possible.
Censorship and the Boundaries of Truth
Finally, the governance of information itself has shifted. Digital platforms now play a central role in determining what content is visible, shareable, or monetizable.
Content moderation, once limited in scope, has expanded into a complex system involving automated filters, policy frameworks, fact-checking bodies, and algorithmic ranking. The stated goal is often safety or accuracy, yet in practice the power to define what counts as "fact" is concentrated in a relatively small number of institutions and actors with little transparency or accountability.
The effect is the establishment of boundaries, sometimes explicit and sometimes implicit, around acceptable discourse.
When combined with AI systems that generate and prioritize information, this creates a feedback loop. What is visible becomes what is considered real; what is considered real shapes what is produced and reinforced.
In this environment, truth risks becoming less a matter of open inquiry and more a function of system design: of what is made visible, what is suppressed, and what is treated as authoritative.
Convergence: Toward a Programmable Society?
Each of these developments (debt, digital currency, AI, identity systems, climate frameworks, and information governance) can be analyzed on its own terms. But their deeper significance emerges when viewed together.
They form an interlocking architecture:
- Debt structures the future
- Digital money shapes behavior
- AI filters perception
- Identity systems define participation
- Climate frameworks influence movement
- Information systems bound discourse
What emerges is not a single mechanism of control, but an ecosystem: a society in which many aspects of life are mediated through systems that can, in principle, be programmed.
As noted in The AI Illusion, the modern technocratic order is best understood not as isolated policies, but as a set of reinforcing structures that shape both external conditions and internal perception.
Importantly, this process does not require overt coercion. It operates through incentives, defaults, and design. It makes certain choices easier, others harder, and gradually channels behavior without the need for direct enforcement.
The Question of Choice
None of this means that a "programmable society" is inevitable. Systems are built, maintained, and adjusted by human beings. They reflect decisions, priorities, and trade-offs.
The question, therefore, is not whether these technologies and structures will exist; they already do. The question is how they will be used, and under what constraints.
Will financial systems remain tools of exchange, or become instruments of behavioral policy?
Will AI expand access to knowledge, or narrow it through unseen filters?
Will identity systems empower individuals, or reduce them to profiles within databases?
These are not technical questions alone. They are political, ethical, and ultimately human.
Conclusion
History has often been shaped by systems that, once established, seemed to take on a life of their own. Today, we may be entering a phase in which those systems are increasingly digital, interconnected, and responsive.
It is tempting to see this as an "algorithm of history", a process unfolding according to its own internal logic. But history is not code. It is the result of human choices shaped by real-world constraints.
The emergence of programmable systems does not eliminate agency. It makes the conditions under which agency is exercised more complex. Economic and legal systems have long shaped the boundaries within which those choices are made.
Understanding these systems is the first step. The next is deciding where human judgment, discretion, and freedom must remain outside their reach. In practice, such decisions are rarely made collectively or in time to redirect emerging systems. They emerge through the actions and choices of individuals, each responding within the constraints of their own circumstances.
There is no single future. The outcome, programmable or otherwise, will not unfold in the same way for everyone. It will be shaped by the choices individuals make.