How AI Will Transform Capital


The lazy way to write about AI is to start with jobs. The more revealing place to start now is capital.

A lot of recent commentary still treats AI as a software story with a labor twist. Smarter assistants. Faster coding. Fewer junior analysts. Some of that is real. But it misses where the weight is moving.

In 2024, generative AI drew $33.9 billion in private investment globally, and 78% of organizations reported using AI in some form. That is not just a sign of interest. It is a sign that firms are rebuilding parts of their operating base around a new technical layer.

The real shift is not simply that more work can be assisted by a model. It is that AI is changing what counts as scarce, defensible, and expensive. Compute access. Power availability. Proprietary data. Workflow control. The systems that make those things usable at scale.

Cheap intelligence does not mean cheap deployment.

Old AI frame → What matters more now

  • AI mainly changes labor costs → AI changes the hierarchy of capital inside the firm
  • Models are the main moat → Physical bottlenecks and proprietary workflows often hold longer
  • Software scales cleanly → Power, data centers, integration work, and organizational redesign slow the picture down
  • Returns should appear quickly in productivity numbers → Much of the buildout behaves like intangible capital and is still measured badly

The old AI story was labor. The new one is capital.

Labor still matters. But if you start there, you end up with a distorted picture. You see substitution before you see dependency.

One reason is that the buildout is now large enough to show up in the financing layer. According to the OECD’s 2025 review of AI venture funding, AI firms accounted for 61% of global VC value in 2025. Even more telling, IT infrastructure and hosting pulled in $109.3 billion that year. The money is not only chasing apps. It is chasing the stack beneath them. CV3 has already looked at that balance-sheet side in The $212 Billion Bet: AI’s CapEx Gold Rush.

This is why the phrase capital deepening matters here. AI is not just a layer of clever software spread thinly across existing firms. It is pulling new spending into data centers, networking, chips, storage, cooling, grid access, internal data systems, evaluation loops, and integration work that rarely makes a keynote slide.

That changes who captures value. If model access gets cheaper over time, the firms that own bottlenecks or control sticky workflows may do better than the firms that merely talk most loudly about “using AI.”

If model access gets cheaper, what stays scarce?

Why compute and power now matter more than software mythology

There is a reason Nvidia keeps sitting near the center of so many conversations. It is not because chips are glamorous. It is because they sit where software ambition meets physical constraint.

The IEA projects that global electricity use from data centers could reach around 945 TWh by 2030. That is the part of the AI story many people still wave away as plumbing. It is not plumbing. It is the cost structure.
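That projection is easier to feel as a power figure than as an energy figure. A quick back-of-envelope conversion, assuming nothing beyond the 945 TWh number and the hours in a year, turns annual consumption into average continuous draw:

```python
# Back-of-envelope: what does 945 TWh per year mean as continuous power draw?
# The only inputs are the IEA's projected figure and the hours in a year.

ANNUAL_TWH = 945          # IEA projection for global data-center use by 2030
HOURS_PER_YEAR = 8760     # 365 days * 24 hours

# Convert TWh to GWh, then divide by hours to get average gigawatts.
avg_power_gw = ANNUAL_TWH * 1000 / HOURS_PER_YEAR

print(f"Average continuous draw: {avg_power_gw:.0f} GW")  # about 108 GW
```

Roughly 108 gigawatts of average draw, on the order of a hundred large power plants running flat out. That is the scale of physical commitment hiding inside a single energy statistic.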

That has a few consequences:

  • Compute access is not evenly distributed, even if model interfaces are.
  • Power availability becomes part of AI strategy, not an afterthought.
  • Deployment speed depends on physical and organizational readiness, not only on model quality.

That also changes how moats should be read. If strong model performance becomes easier to reproduce than many assumed, the harder advantages may sit lower down: in power, data, workflow ownership, and execution capacity. The same logic appears in physical systems too, which is why Physical AI in 2026: Why the Moat Is Data, Not the Robot Body sits close to this argument rather than off in a separate corner of the site.

The model may get cheaper. The substation does not.

That sounds obvious once stated plainly. Still, a surprising amount of AI writing behaves as if better models float free from physical limits.

Proprietary data and workflow control are the stickier layer of value

This is the part that tends to get flattened into slogans about data. The phrase is tired. The underlying point is not.

A firm with access to a frontier model is not in the same position as a firm that also owns the data exhaust, the internal decision rules, the historical edge cases, the evaluation criteria, and the place in the workflow where a recommendation actually turns into action. Those things are messy. They are often buried inside operations. They do not photograph well. They are still where much of the durable control sits.

The World Intellectual Property Organization’s 2025 review of intangible investment found that it has grown 3.7 times as fast as tangible investment since 2008, with software and data the fastest-growing category. That matters because AI does not arrive in the economy as a neat box labeled “productivity.” It often arrives as more data work, more model orchestration, more internal tooling, and more hard-to-value service capacity.

Put more simply, not every AI asset sits on a balance sheet in a way that feels satisfying to outsiders. Some of the important ones still look like scattered expense until they start throwing off better service, faster decisions, or tighter customer capture. CV3 has already approached that from two angles in Why Data Pipelines Are the New Oil Rigs of AI and Private Wealth Intelligence.

There is a named tension here that does not resolve neatly: cheap intelligence, expensive infrastructure.

That tension also explains why broad labor forecasts so often sound cleaner than reality. Anthropic’s January 2026 Economic Index report found that API usage was 74% work-related and that three-quarters of those interactions were classified as automation. That is a serious signal. But it still does not tell you that organizations can simply strip out labor on demand. Real workflows are not built from benchmark tasks. They are built from handoffs, exceptions, trust, and internal politics.
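Those two Anthropic figures compound, and it is easy to misread either one as the headline share. A one-line calculation, using only the percentages quoted above, shows what the automation fraction of all usage actually is:

```python
# The two reported figures compound: 74% of API usage is work-related, and
# three-quarters of THAT subset is classified as automation. The automation
# share of ALL usage is therefore the product, not either number alone.

work_related = 0.74            # share of API usage that is work-related
automation_within_work = 0.75  # share of work-related usage classed as automation

automation_share_of_all = work_related * automation_within_work

print(f"{automation_share_of_all:.1%} of all usage")  # prints "55.5% of all usage"
```

So a little over half of all measured usage is automation-classified, not three-quarters. The distinction matters when those numbers get quoted as labor forecasts.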

That is why the more useful question is often not “Will AI do this job?” It is “Who owns the system around the task, and who gets paid when the task becomes cheaper?”

Why the payoff still looks blurry in the numbers

There is another reason the picture feels strange. A lot of AI investment is still being counted badly.

The Brookings blueprint on counting AI makes the point cleanly: AI often behaves like expensed intangible capital. In plain language, firms may be building something durable while accounting systems still make much of the spending look like ordinary current cost. Brookings also points to the usual lag from complementary organizational changes. You buy the tools first. You restructure around them later. The economic payoff rarely arrives on the same schedule as the invoice.
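The expensing-versus-capitalizing distinction can be made concrete with a toy example. All the numbers below are hypothetical, chosen only to show the mechanism: identical cash flows produce different reported-profit paths depending on whether the spend is treated as current cost or as a depreciating asset.

```python
# Toy illustration (hypothetical numbers): the same annual AI spend, expensed
# versus capitalized, yields different reported-profit paths even though the
# underlying cash outlays are identical.

SPEND_PER_YEAR = 100      # hypothetical annual AI buildout spend
ASSET_LIFE = 5            # assumed useful life of the asset being built, in years
OPERATING_PROFIT = 300    # hypothetical profit before AI spend, held constant

def expensed_profit(year):
    # Expensed: the full spend hits the income statement immediately.
    return OPERATING_PROFIT - SPEND_PER_YEAR

def capitalized_profit(year):
    # Capitalized with straight-line depreciation: each past year's spend
    # contributes SPEND_PER_YEAR / ASSET_LIFE until fully written off.
    live_vintages = min(year + 1, ASSET_LIFE)
    return OPERATING_PROFIT - live_vintages * SPEND_PER_YEAR / ASSET_LIFE

for year in range(6):
    print(year, expensed_profit(year), capitalized_profit(year))
```

In year zero the expensed firm reports 200 while the capitalizing firm reports 280; by year four the two paths converge at 200. Early-stage AI spending under expensing therefore looks like pure cost drag even when a durable asset is being assembled, which is exactly the measurement blur Brookings describes.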

That helps explain why the current moment feels split in two. Capital markets see the buildout. Operators feel the pressure. Official numbers still look oddly incomplete.

What gets paid for now → What may actually be getting built

  • Model subscriptions and cloud bills → Longer-lived service flows inside the firm
  • Integration teams and data cleaning → Proprietary workflow memory and better internal control
  • Evaluation, oversight, and model tuning → Higher-quality decision systems that scale unevenly but persist
  • Organizational redesign → The organizational complements that allow AI spending to matter at all

That is the dense part. Here is the plain version. Some of what looks like ordinary AI spending today is really the early cost of building a longer-lived internal asset. And some of the gains will not show up cleanly until the organization around the tool changes too.

There is an awkward edge case worth keeping in view. Not every deployment speeds work up. Some still create drag through verification, mismatch, or overconfidence. That is not a counterargument to the broader shift. It is a reminder that the path from model capability to economic value is uneven, and often slower than the demos suggest.

For long-horizon capital, that may be the cleanest way to read the moment. Not as a story of instant automation, and not as a simple software boom, but as a repricing of bottlenecks and control points. Some assets are getting easier to access. Others are becoming harder to replace. CV3 has looked at the wider macro side of that in AI’s Economic Revolution and at the longer arc in AI 2027: Analysis.

Jonathan Haskel and Stian Westlake’s Capitalism without Capital still helps here. Not because it predicted this exact moment, but because it trained readers to look for value where ordinary accounting and ordinary industrial habits tend to miss it.

This is an analytical piece, not portfolio, legal, or tax guidance.


Is AI mainly a labor story or a capital story?

It is both, but the capital side explains more of the current shift. Labor effects sit inside a larger move toward compute, power, proprietary data, workflow control, and the systems that make those things usable at scale. For a related CV3 angle, see The Silent Restructuring Brought On by AI.

Why do power and data centers matter so much in AI?

Because model quality does not float free from physical supply. Training, inference, storage, networking, and cooling all depend on infrastructure that is slow to build and expensive to expand. That turns power and site readiness into part of the AI cost structure. For a related CV3 angle, see The $212 Billion Bet: AI’s CapEx Gold Rush.

What makes proprietary data valuable if strong models are becoming easier to access?

General model access narrows one part of the gap, but it does not erase the value of internal data, local edge cases, evaluation systems, and the point inside a workflow where model output turns into a real decision. That is where generic capability meets firm-specific control. For a related CV3 angle, see Why Data Pipelines Are the New Oil Rigs of AI.

Why do AI gains still look hard to see in the numbers?

A good share of the spending behaves like long-lived intangible investment while still being recorded in ways that blur the asset being built. Firms also need complementary organizational change before gains show up cleanly. For a related CV3 angle, see AI’s Economic Revolution.
