Perspectives by Moneta – Agentic AI and the Hard Power Behind Autonomy

In a previous newsletter, we explored how agentic AI – a class of autonomous, decision-making systems – is establishing itself as a transformational force across many economic sectors. But as investors seek to position themselves on this tech wave, the question of the physical infrastructure needed to support these agents comes into focus, along with the energy and resource implications of scaling this autonomy across global economies.

The story of agentic AI isn’t just about software. Autonomy also requires silicon, grid power, bandwidth, and physical system coordination. The strategic edge will not belong solely to those with the best agentic offerings, but also to those who own or enable the infrastructure required to run them.

Compute Isn’t Free: Silicon Scarcity and Edge Requirements

Agentic AI workloads differ fundamentally from traditional SaaS or even inference-heavy generative AI applications. They are continuous, multi-modal, and often require real-time reasoning over large streams of structured and unstructured data. This brings two major infrastructure constraints into play:

1. Specialized Compute Architecture:

General-purpose CPUs are insufficient for real-time autonomy. Agentic systems often require a blend of GPUs, TPUs, FPGAs, or emerging architectures like neuromorphic and photonic chips – each optimized for high-throughput reasoning and low-latency decision loops.

Companies like Nvidia, AMD and others are central to this buildout, but global supply chains for advanced chips remain constrained. The CHIPS Act in the U.S. and similar legislation in the EU and East Asia reflect growing national urgency to localize and secure compute manufacturing.

2. Edge and Distributed Compute:

Not all decisions can – or should – be made in centralized cloud environments. Latency-sensitive use cases like robotic control, autonomous vehicles, and grid routing require edge inference – pushing AI to the “last mile” of infrastructure.

This has given rise to a new class of players: edge GPU node operators, low-power chip designers, and providers of intelligent sensor networks. Companies enabling federated learning, low-bandwidth optimization, and embedded inference are an integral part of the infrastructure required.
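To see concretely why latency forces inference out to the edge, below is a minimal latency-budget sketch in Python. Every figure in it – the 50 ms control-loop deadline, the network round-trip times, the inference times – is an illustrative assumption rather than a measurement from any real deployment.

```python
# Stylized latency budget for a latency-sensitive control loop.
# All numbers are illustrative assumptions, not measurements.

CONTROL_LOOP_DEADLINE_MS = 50.0  # assumed actuator update cycle (e.g., robotics)

def total_latency_ms(network_rtt_ms: float, inference_ms: float,
                     sensor_io_ms: float = 5.0) -> float:
    """Sensor read + model inference + any network round trip, in milliseconds."""
    return sensor_io_ms + inference_ms + network_rtt_ms

# Cloud path: a larger model on big accelerators, but a WAN round trip every cycle.
cloud = total_latency_ms(network_rtt_ms=60.0, inference_ms=15.0)

# Edge path: a smaller model on a local low-power accelerator, negligible network hop.
edge = total_latency_ms(network_rtt_ms=0.5, inference_ms=25.0)

for name, latency in [("cloud", cloud), ("edge", edge)]:
    verdict = "meets" if latency <= CONTROL_LOOP_DEADLINE_MS else "misses"
    print(f"{name}: {latency:.1f} ms -> {verdict} the {CONTROL_LOOP_DEADLINE_MS:.0f} ms deadline")
```

On these assumed numbers, the cloud path misses the deadline on the network hop alone, while the edge path clears it despite a slower local chip – which is the economic case for the “last mile” players described above.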

The Grid Becomes a Competitive Advantage

Training large AI models may grab headlines, but running agentic systems day-to-day is the real energy draw. Unlike periodic inference from GPT models, autonomous agents operate continuously – scheduling, negotiating, simulating, and acting across workflows. This introduces two additional key challenges:

1. AI’s Baseline Power Load Is Non-Trivial

Some estimates suggest data centers already consume 1-2% of global electricity. With agentic systems embedded across the logistics, manufacturing, finance, and utilities sectors alone, that number will almost certainly increase dramatically over the next decade. Autonomous operations in real-time industrial settings require uptime guarantees and redundant systems, further compounding the energy draw. This data center build-out will drive increased demand not only for the “workhorse metals” of data centers – copper, aluminum, steel, and lithium – but also for tin, which is indispensable because, as the main constituent of solder, it quite literally holds the electronics together.
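To put rough orders of magnitude on this, the sketch below works through one hypothetical scenario in Python. The only input drawn from the text is the 1-2% share of global electricity; the global consumption figure, the agentic share of data center capacity, and the utilization rates are illustrative assumptions, not forecasts.

```python
# Back-of-envelope sketch (illustrative assumptions only) of how continuously
# running agentic workloads could move the data center power baseline.

GLOBAL_ELECTRICITY_TWH = 29_000   # assumed annual global electricity use, TWh
DC_SHARE_TODAY = 0.015            # ~1.5%, midpoint of the 1-2% estimate above

# Assumed scenario inputs, not forecasts:
AGENTIC_SHARE_OF_DC = 0.25        # fraction of data center capacity serving agents
CONTINUOUS_UTILIZATION = 0.9      # agents run close to 24/7
BURSTY_UTILIZATION = 0.3          # assumed utilization of conventional workloads

dc_today_twh = GLOBAL_ELECTRICITY_TWH * DC_SHARE_TODAY
# Extra energy from keeping the agentic slice of capacity hot around the clock:
uplift = AGENTIC_SHARE_OF_DC * (CONTINUOUS_UTILIZATION / BURSTY_UTILIZATION - 1)
dc_with_agents_twh = dc_today_twh * (1 + uplift)

print(f"Data centers today: ~{dc_today_twh:,.0f} TWh/yr")
print(f"With continuous agentic workloads (scenario): ~{dc_with_agents_twh:,.0f} TWh/yr")
print(f"Implied share of global electricity: {dc_with_agents_twh / GLOBAL_ELECTRICITY_TWH:.1%}")
```

Under these assumptions, simply keeping a quarter of existing data center capacity busy around the clock lifts the sector’s share of global electricity from roughly 1.5% to well over 2% – before counting any new capacity built specifically for agents.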

2. Clean, Local Energy Sources Create Strategic Moats

Companies with direct access to low-cost, renewable power will be structurally advantaged in deploying agentic AI at scale. Think of this as the new oil: “compute zones” such as geothermal plants powering data centers in Iceland, solar-plus-battery microgrids co-located with AI training facilities in sun-drenched geographies, hydropower-backed compute zones in Scandinavia, or natural gas-powered projects supporting data centers in Canada, such as Kalina (case study to follow).

As such, investors should watch for vertical integration between energy producers and AI infrastructure providers – echoing the early days of the cloud, when Amazon built data centers adjacent to hydroelectric dams.

Networks, Bandwidth, and AI Latency

The next wave of AI agents will not simply run on-premise or in the cloud – they will communicate constantly with other agents, humans, and IoT systems. This demands:

  • High-reliability, low-latency bandwidth between agents and systems.
  • 5G/6G and satellite coverage for remote deployments in sectors like agriculture, shipping, and mining.
  • Edge-to-core data synchronization that preserves consistency without bottlenecks.

In practical terms, that means the winners in agentic AI will not only be those developing new agentic offerings, but also telcos upgrading last-mile fiber, satellite networks enabling real-time agent coordination in under-connected regions, and hardware firms embedding ultra-fast communication modules in industrial systems.

Data Gravity, Storage, and Cooling

Agentic AI systems are voracious consumers of context: weather data, machine telemetry, transaction logs, traffic flows, supplier signals, and more. With this comes a cascade of physical needs:

  • Massive storage capacity, including real-time streams and time-series logs.
  • Geographic co-location of data, compute, and physical operations to minimize latency.
  • Thermal management innovations – cooling AI systems in dense deployments is now both a technical and environmental challenge. Liquid cooling, immersion, and climate-sensitive data center placement (e.g., Arctic zones) are growing fast for this reason.
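The last point can be made concrete with the industry’s standard efficiency metric, PUE (power usage effectiveness: total facility power divided by IT power). The minimal sketch below assumes a 50 MW IT load and illustrative PUE values for each cooling approach – none of these figures refer to any specific operator.

```python
# Illustrative sketch of how cooling choices scale total facility power via PUE
# (power usage effectiveness = total facility power / IT power).
# The IT load and PUE values are assumptions for illustration only.

IT_LOAD_MW = 50.0  # assumed IT load (compute + storage + network) for one site

scenarios = [
    ("conventional air cooling, warm climate", 1.6),
    ("liquid / immersion cooling", 1.2),
    ("cold-climate siting plus liquid cooling", 1.1),
]

for label, pue in scenarios:
    facility_mw = IT_LOAD_MW * pue           # total draw, including cooling and power delivery
    overhead_mw = facility_mw - IT_LOAD_MW   # everything that is not useful compute
    print(f"{label}: {facility_mw:.0f} MW total, {overhead_mw:.0f} MW of overhead")
```

On these assumed figures, the gap between the first and last scenario is 25 MW of continuous load per site – which is why thermal engineering and siting have become investment themes in their own right.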

Sovereign AI and the Infrastructure Race

As autonomy becomes a sovereign asset class, national governments are beginning to treat AI infrastructure like critical public utilities. Expect to see:

  • Strategic reserves of compute, similar to energy reserves.
  • Government-funded energy infrastructure tied to national AI strategies (e.g., India’s compute zones, China’s multi-modal AI parks, and Europe’s data localization mandates).
  • Public-private partnerships around smart grids, energy-routing AI, and industrial optimization agents.

Countries without resilient grid infrastructure or compute manufacturing capacity risk becoming dependent on external suppliers – not just for chips, but for decision-making intelligence itself.

Investment Signals and Strategic Implications

For capital allocators, this moment demands a key pivot. Pure software multiples will not always apply in an agentic AI world governed by physical throughput, grid capacity, and network bandwidth.

Instead, look for:

Domain                      Investment Signal
Energy + AI                 Clean energy sources tied to AI operations
Edge Compute                Real-time processing infrastructure and low-power chips
Telco Infrastructure        High-bandwidth and low-latency networks with AI capacity
Cooling & Data Storage      Thermal innovation and storage at the edge
Chip Manufacturing          Sovereign-aligned fabrication and packaging assets
Smart Grid Operators        AI-coordinated, resilient, and adaptive utilities


Final Takeaway: Intelligence Demands Infrastructure

Agentic AI is not a floating layer of intelligence. It is grounded in the material world – powered by electrons, transmitted through fiber, cooled by water, and housed in steel. As it reshapes industries, the real bottlenecks will not only be AI model quality or prompt engineering – they will also include kilowatt-hours, chip access, grid uptime, and transmission latency.

This is the hard tech backbone that makes autonomy possible. Just as oil pipelines enabled the last industrial age, energy and compute infrastructure will play a critical role in determining who leads in the agentic one.


Moneta is a boutique investment banking firm that specializes in advising growth-stage companies through transformational changes, including major transactions such as mergers and acquisitions, private placements, public offerings, debt financing, structure optimization, and other capital markets and divestiture / liquidity events. Additionally, and on a selective basis, we support pre-cash-flow companies in fulfilling their project finance needs.

We are proud to be a female-founded and led Canadian firm. Our head office is located in Vancouver, and we have a presence in Calgary, Edmonton, and Toronto, as well as representation in Europe and the Middle East. Our partners bring decades of experience across a wide variety of sectors, which enables us to deliver exceptional results for our clients in realizing their capital markets and strategic goals. Our partners are supported by a team of some of Canada’s most qualified associates, analysts, and admin personnel.

Disclaimer:

This newsletter is for informational purposes only. Its contents should not be construed as investment, financial, tax, or other advice. Nothing contained herein is intended to constitute a solicitation, recommendation, endorsement, or offer to buy or sell any security, financial product, or instrument. Please consult a qualified investment professional who is familiar with your particular circumstances before making any financial or investment decisions. Views expressed here do not necessarily reflect those held by every member of our organization or by our clients.

Stay Up To Date

Follow Perspectives by Moneta on LinkedIn to receive our latest insights and updates on capital markets.
Subscribe on LinkedIn