Blog
01.2026

Capital Is Buying Scale – Issue #15

This week’s theme is Capital Is Buying Scale, reflecting how AI capital is concentrating around platforms that can pair growing usage with real, visible revenue.

As AI systems move from pilots into production, monetization is becoming as important as capability. Advertising, subscriptions, and long-term enterprise contracts are no longer downstream considerations. They are now underwriting the next wave of AI infrastructure and product expansion.

We see this clearly across recent signals. Meta’s up to $6 billion partnership with Corning secures fiber and expands domestic manufacturing, funded by resilient ad revenue and an expanding push into premium AI subscriptions across its core apps. Microsoft continues to vertically integrate with Maia 200 as OpenAI-related workloads drive a sharp rise in commercial backlog and long-dated cloud commitments. NVIDIA’s deepening relationship with CoreWeave reflects the same dynamic: large infrastructure investments increasingly backed by committed demand.

New monetization models are forming alongside this buildout. OpenAI is testing advertising inside ChatGPT, reportedly targeting premium CPMs as it positions conversational AI alongside high-end media. Anthropic has significantly raised its revenue forecast, signaling that enterprise customers are scaling AI use in core workflows rather than treating it as an experiment. IBM’s expanding generative AI book further reinforces that paid deployments are moving into production environments.

Across the stack, capital is flowing toward AI platforms that can fund growth internally through revenue rather than externally through fundraising. Distribution, pricing power, and forecastable demand are increasingly determining who can scale fastest and sustain it the longest.

Here is your Saturday guide to the signals shaping the future of AI: 

Infrastructure

  • Corning and Meta sign a multi-year deal valued at up to $6 billion to scale U.S. data center infrastructure. The partnership positions Corning as a core supplier of advanced fiber for Meta’s AI data centers and includes a new North Carolina manufacturing facility, reinforcing the push toward domestically built AI infrastructure. Click here
  • Microsoft unveils Maia 200, a new in-house AI inference chip designed to power the next wave of Azure workloads. Built specifically for large-scale AI inference, Maia 200 boosts performance for production AI models across Azure services, including Microsoft 365 Copilot and Azure AI Foundry, strengthening Microsoft’s vertically integrated approach across chips, cloud, models, and applications. Click here
  • Tesla plans to spend more than $20 billion this year as it reallocates capital from traditional EVs toward autonomous vehicles, humanoid robots, and AI-driven manufacturing. The record investment underscores Tesla’s push to reinvent itself as an AI and robotics company, with most spending tied to Cybercab robotaxis, Optimus robots, and battery and lithium production. Click here
  • Apple warns that rising memory chip prices are starting to squeeze margins as suppliers shift capacity toward AI workloads. Samsung and SK Hynix are prioritizing high-bandwidth memory for AI servers, tightening supply of conventional chips and raising costs for smartphones and PCs, with analysts now forecasting weaker global device sales. Click here

Enterprise

  • Tesla discontinues its Model S and X as it shifts more aggressively toward AI and robotics. The company will repurpose its California plant to produce Optimus humanoid robots, invest heavily in AI and xAI, and double down on robotaxis, marking a strategic pivot away from low-volume EVs as revenues and profits decline. Click here
  • OpenAI is seeking premium pricing in its early push of ChatGPT ads, reportedly targeting CPMs of around $60, well above typical digital ad rates. The strategy positions ChatGPT ads alongside high-end media, such as premium video, with an initial focus on large brands and limited performance metrics, as OpenAI begins testing advertising as a new revenue stream. Click here
  • Meta will test new premium subscriptions across Instagram, Facebook, and WhatsApp, adding paid features and expanded AI capabilities to its core apps. The company plans to bundle productivity tools, creativity features, and its Manus AI agent into subscription offerings, signaling a push to diversify revenue beyond ads while keeping core experiences free. Click here
  • Anthropic lifts its 2026 revenue forecast by 20%, projecting sales to surge to as much as $18 billion this year and $55 billion next year. The revised outlook signals accelerating enterprise demand for AI models as customers scale their use beyond experimentation into core business workflows. Click here
  • Meta sharply raises capital spending to accelerate its push toward superintelligence. The company plans to boost 2026 capex by roughly 73% to $115–$135 billion to fund massive AI data center buildouts and talent hiring, with strong ad revenue cushioning the investment surge and sending shares higher as investors back Zuckerberg’s long-term AI strategy. Click here
  • Anthropic and JPMorgan signal that AI has not yet disrupted core enterprise software. Leaders from both organizations suggest that while AI is improving productivity and workflows, it is still augmenting existing enterprise applications rather than replacing them, reinforcing that large-scale system change inside companies will take time. Click here
  • Microsoft’s cloud growth slowed in the latest quarter, but demand tied to OpenAI drove a sharp jump in future revenue. The company reported a roughly 60% surge in its commercial revenue backlog, highlighting how OpenAI-related workloads and long-term AI contracts are increasingly offsetting near-term cloud deceleration. Click here
  • Microsoft signs a $750 million cloud deal with Perplexity, expanding its role as a core platform for emerging AI companies. The three-year agreement moves Perplexity beyond its prior reliance on Amazon and lets it deploy models through Microsoft Azure Foundry, including systems from OpenAI, Anthropic, and xAI, underscoring Azure’s growing pull as an enterprise AI backbone. Click here
  • IBM reports strong enterprise momentum as AI demand drove a 12% revenue increase and expanded its generative AI book of business to $12.5 billion. The results indicate that large customers are moving beyond pilots to scaled AI deployments across core operations, reinforcing IBM’s position as a key enterprise AI vendor as adoption accelerates. Click here

Capital Flows

  • OpenAI is reportedly seeking up to $100 billion in new funding at a $750 billion valuation, potentially marking the largest private capital raise in tech history. NVIDIA, Microsoft, and Amazon are expected to contribute a significant portion of the round, underscoring how AI infrastructure providers are increasingly financing the very companies driving demand for their compute and data center capacity. Click here
  • NVIDIA is investing an additional $2 billion in CoreWeave to deepen its AI data center partnership. The companies plan to build large-scale AI factories powered by NVIDIA’s computing platform, targeting up to 5 gigawatts of capacity by 2030. Click here
  • Apple buys Israeli startup Q.AI for nearly $2 billion to boost its AI device strategy. The acquisition strengthens Apple’s push into AI-powered wearables, adding silent-speech and facial-analysis technology as it races to catch up with Meta, Google, and OpenAI. Click here
  • Tesla commits $2 billion to xAI, deepening Elon Musk’s push to build a major OpenAI rival. The investment ties Tesla more closely to xAI’s Grok platform and signals growing integration between Musk’s AI ambitions and Tesla’s products, even as xAI faces mounting regulatory scrutiny worldwide. Click here

Research

  • OpenAI launches Prism, a free AI-native workspace for scientific writing and collaboration. The tool embeds GPT-5.2 directly into research workflows to help scientists draft, revise, and manage papers more efficiently, indicating that AI is becoming a core productivity layer in modern research. Click here
  • Google DeepMind has launched Project Genie, an experimental tool that lets users create and explore AI-generated interactive worlds. Available to Google AI Ultra subscribers in the U.S., Project Genie showcases advances in world-model research, enabling real-time environment generation for use cases spanning gaming, simulation, and creative media. Click here
  • Alibaba launches Qwen3-Max-Thinking, a new AI model focused on advanced reasoning and efficiency. The model leads major benchmarks, surpassing GPT-5.2 and Gemini 3 Pro, and cements Qwen’s position as the most widely used open-source AI foundation globally. Click here

Policy

  • China has conditionally approved DeepSeek to buy NVIDIA’s H200 AI chips, marking a cautious step toward easing advanced chip access. The move highlights Beijing’s effort to boost domestic AI while navigating ongoing U.S.-China tech tensions. Click here
  • Music publishers sue Anthropic for up to $3 billion, alleging the AI company illegally downloaded more than 20,000 copyrighted songs to train its models. The lawsuit, led by Universal Music Group and Concord, escalates scrutiny on how AI companies source training data and could become one of the largest copyright cases in U.S. history. Click here
  • The EU moves to force Google to open Gemini AI services and search data to competitors under its Digital Markets Act. Brussels launched formal compliance proceedings to define how rival AI companies and search engines gain equal access to Google’s AI features and data, signaling tighter enforcement to prevent Big Tech from locking up core AI infrastructure. Click here

Global AI Strategy

  • SK Hynix will establish a new U.S.-based “AI Company,” committing at least $10 billion to expand its artificial intelligence business. The entity will serve as a hub for SK Group’s AI strategy, building on SK Hynix’s leadership in high-bandwidth memory used in AI systems and supporting the company’s growing investments across the U.S. AI and semiconductor ecosystem. Click here
  • Micron is committing $24 billion to expand its Singapore manufacturing footprint amid AI-driven demand for memory. The investment will boost production of high-bandwidth memory to support data centers and AI workloads, reinforcing Singapore’s role as a key hub in the global semiconductor supply chain. Click here
  • Elon Musk is exploring space-based AI data centers to secure cheaper energy and scale compute beyond Earth. The idea would use solar-powered satellites to run AI workloads, with SpaceX positioned to lead, as rivals in the U.S. and China test similar orbital compute concepts. Click here

Talent Signals

Each week, we spotlight key roles tied to the themes shaping this week’s AI headlines, connecting talent to the companies driving the news.

  • Decagon builds conversational AI agents that automate customer support across chat, email, and voice. The company recently raised $250 million to scale its AI concierge platform and expand enterprise use cases. As organizations shift from pilot AI projects to always-on systems that execute core workflows, Decagon’s work supports automated, reliable engagement at scale. Open roles are listed on its careers page. Click here
  • CoreWeave operates large-scale AI data center infrastructure optimized for high-performance training and inference. As capital and compute concentrate around AI factories and vertically integrated stacks, CoreWeave sits at the center of the push to scale reliable, production AI infrastructure. The company lists open roles across teams on its careers page. Click here
  • Airtable is expanding beyond no-code tools with Superagent, a standalone system that coordinates multiple AI agents for complex business work. The move reflects the broader shift toward agent-native enterprise software that operates within real workflows. Open roles are listed on Airtable’s careers page. Click here

You can see all the opportunities at Mayfield-backed AI companies here, and across the broader ecosystem here.

Social Signals

The most important conversations in AI are unfolding across social media, where top voices are shaping the next wave of signals and strategy. Here are some of those voices and their takes from the past week.

  • Gavin Baker (Click here) — “Revenue per employee up 75% for the top decile of AI/software companies in 2025. Probably doesn’t slow down in 2026, given the December revolution in AI coding agents. Nothing to see here, move along. No evidence of AI productivity anywhere.” Baker points to data showing ARR per employee accelerating sharply at the top end of AI and software companies, with gains concentrated in the highest-performing decile. The charts highlight widening dispersion across both percentiles and company sizes, suggesting productivity gains are uneven and skewed toward leading firms, reinforcing the view that AI-driven efficiency is already materializing in financial performance rather than remaining theoretical.
  • Alec Coughlin (Click here) — “Demand for AI fluency surged by 7x, ‘faster than for any other skill.’ Exponential change is here. Humans + Machines, not Humans vs Machines.” Coughlin points to new McKinsey Global Institute data showing a sharp rise in demand for AI fluency and AI-related skills across US occupations between 2023 and 2025. The charts highlight growth not just in technical AI roles, but in broader AI fluency across both STEM and non-STEM jobs, signaling that the shift is about widespread tool adoption and human–AI collaboration rather than narrow specialization.
  • Balaji Srinivasan (Click here) — “PERSONAL PRIVATE PROGRAMMABLE. I’ve been thinking more about the intersection of Claude Code and Obsidian. There is an upcoming tech stack here that I’m calling personal private programmable.” Balaji sketches a near-term stack in which open-weight local models, rich personal file formats such as Markdown, Git repositories, and email archives, and crypto-native identity and encryption come together. He argues that with local AI and local wallets, personal data becomes more programmable while remaining private, enabling encrypted collaboration and computation without centralized platforms. The core claim is that the Claude and Obsidian workflow points toward re-decentralization, where keeping data local is not just safer, but more powerful, because it is easier to encrypt with crypto and compute on with local AI.

To go deeper, subscribe to my monthly Founder Insights newsletter, where I share lessons from the frontlines of company building, perspectives on AI’s future, and our industry’s road ahead: https://www.linkedin.com/newsletters/founder-insights-7274531066957217793/

↓ Drop a note in the comments with the areas of AI you want us to explore next.

Originally published on LinkedIn.
