The Agency Shift: Power, AI, and the End of Optimization
For most of the past decade, we talked about AI as an optimization story.
Faster models.
Lower costs.
Better predictions.
Higher margins.
That framing is now obsolete.
We have crossed a threshold. AI is no longer just improving tasks — it is reallocating agency.
And agency is power.
From Efficiency to Authority
In its first phase, AI optimized at the edge of business processes: ads, recommendations, workflow automation. It was incremental. Technical. Contained.
In its current phase, AI influences decisions that shape markets and institutions: hiring, credit allocation, healthcare triage, legal review, military targeting, public discourse.
Once systems begin shaping outcomes rather than assisting tasks, the problem stops being technical.
It becomes political. Economic. Institutional.
Optimization asks: How do we do this better?
Governance asks: Who decides? Who is accountable? Who bears the risk?
We are now in the second question.
The SaaS Selloff Is a Signal
Recent turbulence in the SaaS market is not just a valuation adjustment. It reflects a structural repricing of intelligence.
If agentic AI can perform managerial, analytical, or compliance tasks at marginal cost, then capabilities that were once core competencies migrate downward into infrastructure.
Just as cloud collapsed capital barriers for software, AI collapses labor barriers for knowledge work.
Average costs drop.
Margins compress in some layers.
They expand in others.
The value shifts from application interfaces to compute, orchestration, and governance.
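The repricing logic above rests on simple unit economics. A minimal sketch, using hypothetical figures chosen only to illustrate the mechanism (the costs and volumes below are assumptions, not data from the essay):

```python
# Toy unit-economics sketch of the "labor barrier collapse" claim.
# All figures are hypothetical, chosen only to illustrate the mechanism.

human_cost_per_review = 50.00    # assumed fully loaded cost of one analyst review
ai_cost_per_review = 0.50        # assumed marginal compute cost of one agentic review
reviews_per_year = 100_000       # assumed annual review volume at one firm

human_total = human_cost_per_review * reviews_per_year   # $5,000,000
ai_total = ai_cost_per_review * reviews_per_year         # $50,000

print(f"Human: ${human_total:,.0f}  AI: ${ai_total:,.0f}  "
      f"Cost ratio: {human_total / ai_total:.0f}x")
# → Human: $5,000,000  AI: $50,000  Cost ratio: 100x
```

Under these assumptions, the 100x gap is why margins compress in the application layer and why pricing power migrates to whoever owns the compute underneath.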
The question is no longer “Which SaaS company wins?”
It is “Who controls the intelligence layer?”
Power, Energy, and the Infrastructure Layer
The AI economy is capital intensive. Data centers consume vast amounts of electricity and water. Compute density matters. Industrial policy matters.
This is where geopolitics converges with economics.
China is advancing open-source models at scale.
The United States is doubling down on precision, frontier research, and semiconductor leadership.
Europe debates fragmentation versus industrial consolidation.
Compute has become strategic capital.
Energy has become economic leverage.
Semiconductors are no longer components — they are sovereign instruments.
AI is not just software. It is industrial policy in silicon form.
Post-Order Fragmentation
At the Chicago Booth Economic Outlook 2026, the tone was measured — even somber.
Not because of a recession forecast.
But because of institutional uncertainty.
The post–World War II rules-based order was never self-enforcing. It was underwritten by power — economic, military, industrial, institutional.
Today, enforcement feels fragmented.
Middle powers explore coalitions.
Global supply chains are being reconsidered.
National capitalism resurfaces.
When power diffuses, institutions lag.
When institutions lag, volatility rises.
AI accelerates this gap.
Technology scales exponentially.
Governance adapts incrementally.
That mismatch is the real risk.
Scale Changes the Stakes
A human error affects dozens.
An algorithmic error affects millions.
Scale transforms mistakes into systemic risk.
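The scale argument is, at bottom, arithmetic: the same error rate produces categorically different harm at algorithmic volume. A minimal illustration, where the error rate and decision volumes are assumptions picked for clarity:

```python
# Illustrative scale arithmetic: the same error rate applied at human scale
# versus deployed-system scale. Rates and volumes are assumed, not sourced.

error_rate = 0.001             # assumed 0.1% error rate, identical for both
human_decisions = 50           # decisions one person might make in a day
system_decisions = 10_000_000  # decisions one deployed model might make in a day

human_errors = error_rate * human_decisions     # 0.05 — effectively zero people
system_errors = error_rate * system_decisions   # 10,000 people

print(f"Human-scale errors/day:  {human_errors:.2f}")
print(f"System-scale errors/day: {system_errors:,.0f}")
# → Human-scale errors/day:  0.05
# → System-scale errors/day: 10,000
```

Nothing about the model got worse between the two rows; only the deployment surface grew. That is what turns an individual mistake into systemic risk.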
Which means the critical questions are no longer about model accuracy alone:
- Who audits these systems?
- Who overrides them?
- What happens when incentives diverge?
- Who pulls the plug?
Incentives matter. Firms optimize for profit. States optimize for control. Individuals optimize for convenience. Societies optimize for legitimacy.
AI amplifies whichever incentive dominates system design.
Without governance, the strongest incentive wins — not necessarily the best one.
Is There an AI Bubble?
Perhaps.
But bubbles form when capital outruns utility.
We are still in the infrastructure build phase.
The world is preparing not just for large language models, but for more advanced world-model systems and heterogeneous AI agents integrated across industries — healthcare, finance, logistics, education.
The analogy is not dot-com 1999.
It may be closer to electrification.
Overbuild may occur.
Valuations may swing.
But the underlying transformation is structural.
The market is repricing layers — not collapsing the thesis.
The Leadership Question
The deeper issue is not technological.
It is institutional.
All my life, I believed humans are inherently decent — that given the opportunity, they act in good faith.
Experience has refined that belief.
Leadership is not about assuming goodness.
It is about designing systems where goodness survives pressure.
The AI era forces this discipline.
Institutions cannot rely on inertia.
Markets cannot rely on optimism.
Governance cannot rely on sentiment.
Rules endure only when backed by credible structure — whether economic, technological, or political.
The Agency Shift
We are no longer debating tools.
We are debating who governs the systems that shape outcomes.
AI reallocates decision leverage from individuals to algorithms — and from dispersed actors to those who control compute, data, and energy.
That is the agency shift.
And that is why this moment feels different.
The next decade will not be defined by whether AI is powerful.
It already is.
It will be defined by whether institutions, markets, and leaders adapt fast enough to govern that power — without fracturing the economic order that sustains it.
The optimization era is ending.
The governance era has begun.
The Macro Current
Power flows. Institutions strain. Technology accelerates.
The question is never whether systems change — only who designs what comes next.
