Blog
03.2026

Why Thermodynamics is the New Frontier of AI Infrastructure

Everyone’s racing to build faster AI chips. Nobody’s talking about what’s actually slowing it down: heat.

AI infrastructure is scaling at an extraordinary speed. Data center compute demand is projected to grow more than 3x by 2030. Modern AI, from large language models to image and video generators, consumes enormous amounts of electricity to both train and run. Training a single frontier model can require millions of kilowatt-hours. But the bigger shift is inference: serving billions of prompts, searches, and interactions in real time turns AI into a continuous, 24/7 energy load.
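To make "millions of kilowatt-hours" concrete, here is a back-of-envelope sketch. The cluster size, per-GPU draw, and training duration are illustrative assumptions, not figures for any specific model:

```python
# Back-of-envelope training energy. Every input below is an
# illustrative assumption, not a figure for any specific model.
gpus = 10_000              # assumed cluster size
power_per_gpu_kw = 1.0     # assumed draw per GPU incl. cooling overhead
hours = 30 * 24            # assumed one month of continuous training
energy_kwh = gpus * power_per_gpu_kw * hours
print(f"{energy_kwh:,.0f} kWh")  # 7,200,000 kWh -- millions, as claimed
```

Even with conservative assumptions, a single training run lands in the millions of kilowatt-hours; scaling any one input by 2-3x moves it into the tens of millions.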

Performance is no longer just a function of compute. It is a function of thermodynamics.

Five Forces Driving AI Into a Power Wall

The thermal constraint does not exist in isolation. It is the convergence of five structural pressures that have been building simultaneously.

1. Training is massively energy-intensive. Frontier models require enormous GPU clusters running for weeks or months. Each generation is larger than the last, not smaller.

2. Inference is becoming the real energy sink. Serving billions of prompts, searches, images, and video requests turns AI into a continuous, always-on power load. As AI embeds itself in everyday software and devices, energy demand becomes increasingly persistent.

3. Data centers are hitting power and cooling limits. AI facilities now require tens to hundreds of megawatts of power. Grid capacity and advanced cooling systems are becoming gating factors for deployment.

4. Sustainability pressure is rising. AI workloads run 24/7. Renewable energy is intermittent. The tension between AI expansion and climate commitments is real.

5. Energy is becoming a material cost driver. Electricity is no longer a rounding error. It meaningfully affects model economics, pricing, and competitive advantage.
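The arithmetic behind points 3 through 5 is simple but unforgiving. The sketch below uses assumed (not sourced) figures for facility power, per-rack draw, and cooling overhead to show why grid and cooling capacity, rather than chip supply, gate deployment:

```python
# Why power and cooling gate AI deployment.
# All figures below are assumptions for illustration only.
facility_mw = 100.0   # assumed grid allocation for an AI campus
pue = 1.2             # assumed power usage effectiveness (cooling/overhead)
rack_kw = 120.0       # assumed draw of a dense liquid-cooled GPU rack
it_mw = facility_mw / pue              # power actually available to compute
racks = int(it_mw * 1000 // rack_kw)   # racks the facility can support
print(racks)  # 694 -- a ceiling set by the grid, not by chip supply
```

Note that the cooling overhead term (PUE) directly subtracts from compute: every point of cooling efficiency recovered converts into more racks under the same grid allocation.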

As AI scales, the binding constraint shifts from compute to thermodynamics.

The Shift: AI Has Run Into the Laws of Physics

NVIDIA CEO Jensen Huang recently described AI as a five-layer cake: Energy, Chips, Infrastructure, Models, and Applications.

AI: A Five-Layer Cake

That framing starts at the right place – energy.

Jensen’s point is direct: “At the foundation of AI is power. Intelligence generated in real time requires power generated in real time. Every token produced is the result of electrons moving, heat being managed, and energy being converted into computation. There is no abstraction layer beneath physics. Energy is the first principle of AI infrastructure – and the binding constraint on how much intelligence a system can produce.”

Cooling sits inside the Infrastructure layer, directly above the chips. As power density rises, thermal management determines how much compute can actually run, how efficiently it runs, and how densely it can be deployed.

That framing captures exactly what Frore Systems is building toward. For decades, the technology industry has talked about two foundational layers: the software stack and the hardware compute stack. A third layer has now become equally foundational: the Thermal Stack.

Today, Frore is introducing the Thermal Stack – an integrated architecture that manages heat from chip to atmosphere, and one that has become as foundational as silicon itself.

The AI Thermal Stack is the integrated cooling architecture required to solve the heat problem at scale. It encompasses heat extraction from compute and networking hardware and heat rejection into the atmosphere, spanning both data centers and edge platforms. As AI workloads scale exponentially, the thermal stack directly determines compute density, energy efficiency, and performance.

Frore Systems is redefining the Thermal Stack across three markets:

  1. AI data centers, enabling higher compute density, reduced weight, and improved power and water efficiency
  2. Industrial edge AI gateways, supporting intense AI workloads in compact, rugged, dustproof, and water-resistant enclosures
  3. Consumer AI devices, delivering high-performance AI computing in ultra-thin, silent devices

The brain of AI computing moved from CPUs to GPUs to TPUs. The next critical unit isn’t about processing speed. It’s about managing the heat generated by all that processing. In the AI era, a new class of CPUs, or what I’m calling “Cooling Processing Units,” is emerging. As AI scales, cooling becomes the performance frontier.

What Frore Saw Early: Solve the Edge First

When Frore was founded in 2018, the AI data center boom was not yet visible. The company started with a precise mission: solving the thermal problem in computing.

The first wedge was edge devices — ultra-thin platforms where traditional fans simply cannot operate. AirJet, the world’s first solid-state active air-cooling chip, was built to address that constraint. At just 2.65mm thick, AirJet delivers active cooling in devices that are silent, dustproof, and water-resistant, eliminating thermal throttling and doubling performance in the most constrained form factors.

AirJet has been in the market for eight years and is now shipping in volume across industrial edge AI gateways, ultra-compact consumer AI devices, and IoT platforms.

The edge wedge proved something that mattered for what came next: thermodynamics could be re-architected.

Cooling was not a fixed constraint to be managed. It was a design problem to be solved at the materials and manufacturing level.

That conviction translated directly into Frore’s data center approach.

Redesigning the Thermal Architecture of AI Data Centers

As AI moved into hyperscale data centers, liquid cooling became essential. But traditional liquid cooling is heavy, built through slow and expensive machining processes, and designed for an era of much lower compute density.

Frore reimagined it from the substrate up. LiquidJet, its data center solution, is a 3D direct liquid-cooling coldplate that delivers 75% higher heat-transfer efficiency, runs GPUs 8°C cooler, and reduces coldplate weight by 55%, all as a drop-in upgrade to existing infrastructure.

Frore Systems LiquidJet Coldplate
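The "8°C cooler" figure follows directly from thermal resistance: chip temperature is inlet temperature plus resistance times power (ΔT = R·Q). The sketch below uses assumed values for GPU power, inlet temperature, and baseline coldplate resistance; only the 75% efficiency improvement comes from the text:

```python
# How a 75% heat-transfer improvement maps to degrees.
# Power, inlet temperature, and baseline resistance are assumptions.
power_w = 1000.0       # assumed GPU thermal design power
inlet_c = 40.0         # assumed coolant inlet temperature
r_baseline = 0.020     # assumed baseline coldplate resistance, degC/W
r_liquidjet = r_baseline / 1.75   # 75% higher heat-transfer efficiency
t_baseline = inlet_c + r_baseline * power_w    # 60.0 degC
t_liquidjet = inlet_c + r_liquidjet * power_w  # ~51.4 degC
print(round(t_baseline - t_liquidjet, 1))      # ~8.6 degC cooler
```

Under these assumed numbers, a 75% efficiency gain yields roughly the 8°C improvement the text cites; the same margin can instead be spent on higher power or warmer inlet water.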

LiquidJet Nexus extends the platform further: 2x compute density per rack, 65% reduction in thermal stack weight, support for 53°C inlet temperatures, and the elimination of connectors, hoses, and manifolds.

LiquidJet Nexus

This is not an incremental improvement in cooling. It is a redesign of the thermal architecture that governs how much AI computing a given facility can run, at what cost, and with what energy footprint.

Building Thermodynamics at Scale

The moat goes beyond Frore's products. Its structural advantage is not just technical; it operates across three reinforcing layers.

1. Manufacturing architecture. Frore applies semiconductor-style manufacturing to cooling, using scalable process technology applied to metal wafers. This is not a design that can be handed off to a contract manufacturer.

Frore is a fab company, not a fabless design shop: vertically integrated from materials science through chip design, fabrication, and supply chain. The supply chain infrastructure itself, built with manufacturing operations in Taiwan and engineering in California, is not replicable quickly by a new entrant or an incumbent trying to bolt on thermal capabilities.

2. Operational depth at lean scale. At full production scale, Frore will operate with ~300 people, including ~100 engineers, generating revenue at a scale that reflects the economics of a semiconductor company rather than a traditional hardware vendor. That ratio reflects the leverage that comes from IP-intensive, vertically integrated manufacturing.

3. Organizational durability. As Seshu Madhavapeddy shared in our conversation, it starts with people. As teams scale, culture becomes either a multiplier or a fracture point. The founding team is intact after eight years, with near-zero attrition in an environment where retention is one of the hardest problems in the industry. A culture that compounds over time is a moat of a different kind.

The Thermal Stack as Foundational Infrastructure

Frore is defining the Thermal Stack category in the AI era.

Today, Frore Systems is announcing a $143 million Series D, valuing the company at $1.64 billion. The capital will accelerate the global scale-up of AirJet, LiquidJet, and LiquidJet Nexus across both edge and data center markets.

We’re proud to have backed Seshu Madhavapeddy, Suryaprakash Ganti, and the Frore team from their founding in 2018 through this Series D.

Thermals are no longer background infrastructure. They define what AI systems can do.

Originally published on LinkedIn.
