Tech Earnings Season Takeaways
What Q1 2026 calls revealed about compute, memory, ads, agents, and the next phase of software.
I’m Tanay Jaipuria, a partner at Wing, and this is a weekly newsletter about the business of the technology industry.
Hi friends,
We’re past the thick of the Q1 2026 earnings season, and as in prior quarters, I’m going to round up some of the key themes I took away across earnings calls in AI and software land.
I. Capex Keeps Going Up
The headline from every hyperscaler call this quarter: capex is accelerating, not just for 2026 but for 2027 too. Just four companies combined will be spending over $700B: Amazon at $200B, Microsoft at $190B, Google at $180-190B, and Meta at $125-145B.
Google CFO Anat Ashkenazi flagged that even that number keeps moving:
“We expect our 2027 CapEx to significantly increase compared to 2026” - Anat Ashkenazi, CFO, Google
Sundar Pichai shared the demand signal behind it:
“We are compute constrained in the near term. And as an example, our cloud revenue would have been higher if we were able to meet the demand.”
Microsoft guided to $190B, noting that “strong customer demand across workloads, customer segments, and geographic regions continues to exceed available capacity.”
II. The Memory Chokepoint
There was a lot of discussion around memory in particular as a key constraint on the supply side, specifically HBM (high-bandwidth memory) and DRAM.
Apple flagged it directly in its quarterly outlook:
“For the June quarter, we expect significantly higher memory costs...beyond the June quarter, we believe memory costs will drive an increasing impact on our business” - Tim Cook, CEO, Apple
Meta’s CFO called it out as the primary driver of their own raised capex number:
“Most of that CapEx increase is due to higher component costs, particularly memory pricing.” - Susan Li, CFO, Meta
Andy Jassy flagged the same thing:
“Everybody knows that the cost of components, particularly memory, has skyrocketed. We are in a stage where there is just not enough capacity for the amount of demand.” - Andy Jassy, CEO, Amazon
Intel CFO David Zinsner flagged the downstream risk:
“Constraints and rising prices around key components like memory, wafers, and substrates are driving higher costs that could impact demand for our product at some point in the year.” - David Zinsner, CFO, Intel
One underreported side effect: the memory shortage is accelerating enterprise cloud migration. Suppliers are prioritizing their largest customers, which means hyperscalers. Enterprises trying to procure on-prem can’t get equivalent access. Jassy on the call:
“A meaningful part of these suppliers are prioritizing their very largest customers, which cloud providers are. We have seen a number of conversations we have been having with enterprises for many months — where it has just been slower in getting the transformation plan to move to the cloud — accelerate rapidly just because we have a lot more supply than what others have.” - Andy Jassy, CEO, Amazon
III. The CPU is Back
For the past few years, every infrastructure conversation was a GPU conversation. That’s shifting. Intel, AMD, and Amazon all flagged that CPUs are re-entering the picture, specifically for agentic workloads.
Andy Jassy at Amazon discussed this trend:
“AI is commonly seen as a GPU story, but the rise of agentic workloads, real-time reasoning, code generation, learning, and multi-step task orchestration is driving massive CPU demand as well.”
Intel CEO Lip-Bu Tan also noted similarly:
“In recent months, we have seen clear signs that the CPU is reinserting itself as the indispensable foundation of the AI era. The CPU now serves as the orchestration layer and critical control plane for the entire AI stack.”
AMD updated its server CPU TAM forecast dramatically:
“We now expect the server CPU TAM to grow at greater than 35% annually, reaching over $120 billion by 2030” - Lisa Su, CEO, AMD
Both AMD and Intel shared similar perspectives on how the ratio may evolve. For training, you see roughly 1 CPU per 7-8 GPUs. For inference, 1 per 3-4. For agentic workloads: approaching 1 CPU per GPU.
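To make those ratios concrete, here’s a minimal back-of-the-envelope sketch. The fleet size is hypothetical; the pairing ratios are the rough figures AMD and Intel cited:

```python
# Hypothetical illustration: CPU demand implied by a GPU fleet under the
# rough CPU:GPU pairing ratios cited on the AMD and Intel calls.

def implied_cpus(gpus: int, gpus_per_cpu: float) -> int:
    """Approximate CPU count implied by a given GPUs-per-CPU pairing ratio."""
    return round(gpus / gpus_per_cpu)

fleet = 100_000  # hypothetical GPU fleet size, not a company figure
for workload, ratio in [("training", 7.5), ("inference", 3.5), ("agentic", 1.0)]:
    print(f"{workload}: ~{implied_cpus(fleet, ratio):,} CPUs per {fleet:,} GPUs")
```

Under these assumptions, the same fleet shifting from training-heavy to agentic workloads implies roughly 7-8x more CPUs, which is the dynamic behind AMD’s raised server CPU TAM.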
IV. AI Improving Core Ads Businesses
One of the more underdiscussed stories is just how effective compute is at driving improvements in recommendation and ads systems, and it’s benefiting the core businesses of many companies.
Meta’s ads business is doing ~$56B in quarterly revenue, still growing 32% y/y, and Google Search ads is doing ~$60B in quarterly revenue, still growing 19% y/y.
For Google, search queries are at an all-time high, the opposite of what narratives from a year ago predicted. At the same time, cost per click is likely up significantly, though the exact percentage wasn’t shared. Google also noted that only about 20% of queries get monetized today, and with AI Mode and better ad recommendations, that percentage could move higher.
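The monetization-share lever is worth making concrete. A minimal sensitivity sketch, with all inputs normalized and hypothetical except the ~20% monetized share cited on the call:

```python
# Back-of-the-envelope: search ads revenue ~ queries x monetized share x
# revenue per monetized query. All inputs are normalized/hypothetical except
# the ~20% monetized share Google cited.
def search_ads_revenue(queries: float, monetized_share: float,
                       rev_per_monetized_query: float) -> float:
    return queries * monetized_share * rev_per_monetized_query

base = search_ads_revenue(1.0, 0.20, 1.0)  # today's ~20% monetized share
up = search_ads_revenue(1.0, 0.25, 1.0)    # if AI Mode lifts it to 25%
print(f"revenue lift from a 5pt share gain: {up / base - 1:.0%}")  # prints 25%
```

Holding query volume and pricing constant, each point of monetized share is worth ~5% of revenue off today’s base, which is why that 20% figure drew attention.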
Meta saw a 10% lift in Reels engagement from ranking changes alone:
“On Instagram, the ranking improvements that we made in Q1 drove a 10% lift in Reels time spent. On Facebook, total video time increased more than 8% globally in Q1” - Susan Li, CFO, Meta
They also shared how their efforts around their Generative Ads Model for ranking are seeing ROI:
“In Q1, enhancements we made to Lattice’s modeling and learning techniques along with advances in our GEM model architecture drove a more than 6% increase in conversion rate for landing page view ads”
Pinterest and Reddit also posted strong quarters, with robust growth in their ads businesses.
V. The AI-Generated Code Proxy
Multiple CEOs disclosed the share of their engineering output now coming from AI. While it’s not the best metric, I think it shows two things:
Coding and coding agents have been the hero use case of AI, mass adopted across pretty much every enterprise
Companies are using this metric, at least externally with Wall Street, as a proxy for where they stand on AI adoption
DoorDash’s Tony Xu gave the most direct quote:
“Well north of half of our code—probably closer to two-thirds of our code—is written by AI today.”
HubSpot’s Yamini Rangan shared both adoption and output data:
“100% of our engineers now use AI tools, and we have seen a 73% increase in lines of code updated per engineer.” - Yamini Rangan, CEO, HubSpot
Uber’s Dara Khosrowshahi specifically called out the rise of autonomous coding agents:
“About 10% of our code now that is committed is built by agents, autonomous agents out there.”
Spotify said they are “spending more compute per employee” and seeing “tremendous return in terms of productivity,” with internal metrics showing output doubling.
For those curious, here’s a quick summary of some of the disclosures around what percent of new code committed is AI-generated:
Chime: 84% (as of May '26)
Google: 75% (Apr '26)
Uber: ~70% (Apr '26)
DoorDash: ~65% (May '26)
Atlassian: >50% (Apr '26)
Shopify: >50% (May '26)
Snap: ~40% (Feb '26)
VI. SaaS Cos Layering on Agents
Among SaaS companies, the questions around a SaaSpocalypse and their ability to adapt to AI boil down to a few things: with the shift to agents, what happens to their core businesses built on seat-based models, and will they (or the labs and other upstarts) provide the agentic version of their offering that customers will want over time, which often looks like headless versions and MCP servers compatible with existing AI agents?
SaaS companies touched on their agentic offerings and also the seat compression piece. Atlassian discussed how their Rovo AI credit usage grew 20%+ month over month and that they aren’t seeing seat compression yet:
“We are not seeing any signal of seat compression from customers. If anything, we are seeing the opposite. We are seeing strong expansion numbers, strong cross-sell numbers between Collections, strong usage of AI.” - Michael Cannon-Brookes, CEO, Atlassian
HubSpot’s Dharmesh Shah touched on the agentic experience and the importance of going headless:
“We are big believers in the idea of headless - not big believers in the notion of humanless. The right platform for go-to-market for our customer base is going to be a combination of serving humans with a very personalized modern experience, and supplementing that with a really good agentic experience.” - Dharmesh Shah, CTO, HubSpot
Datadog framed their AI strategy in two explicit buckets: “AI for Datadog” (AI making the Datadog platform better through various agents) and “Datadog for AI” (Datadog providing observability and security for AI workloads).
“In March, we launched our MCP server for general availability. With the MCP server, developers access live production data to debug their applications directly in their AI coding agent or IDE. We delivered the Bits AI security agent, which autonomously triages Datadog Cloud SIEM signals, conducts in-depth investigations of potential threats, and delivers actionable recommendations. We’ve seen the Bits AI security agent reduce investigations that could take hours to as little as 30 seconds” - Olivier Pomel, CEO, Datadog
VII. SaaS Leaning on Context
SaaS companies are also leaning on their data and context as differentiation and a big part of their value add vs AI labs and other products.
Atlassian’s CEO:
“In a world where humans will run teams of agents, context is the only anchor to avoid chaos, and we believe that companies who prioritize context will become truly AI-native.” - Michael Cannon-Brookes, CEO, Atlassian
HubSpot’s Yamini Rangan and Dharmesh Shah talked about context as their common foundation:
“AI without the right context produces output; AI with the right context produces outcome.” - Yamini Rangan, HubSpot
“We are essentially ambivalent as usage shifts from human usage to agentic usage — whether it runs on our runtime agents that we have built or if there are third-party apps and agents that have been built. All of those agents are going to need a common foundation.” - Dharmesh Shah, CTO, HubSpot
VIII. The Roadblocks in Agentic Commerce
There was also a lot of discussion about agentic commerce and the changing nature of distribution, which is still very early and arguably has been a letdown so far.
Amazon’s Andy Jassy gave the clearest explanation of why horizontal agents have struggled as shopping tools:
“The experience just has not gotten great with these third-party horizontal agents yet. They are not often able to get the pricing right or the product information right. They do not have any personalization data or any shopping history... I happen to think that if you are going to a particular retailer that you would like to do business with, you would like to shop from, if they have a great agentic shopping assistant, you are going to often start there because it is where you are doing your shopping. That is what we are aiming to make Rufus be — we are aiming to have it be the best shopping assistant anywhere.” - Andy Jassy, CEO, Amazon
Pinterest CEO William Ready also had a view on why third-party agentic commerce efforts have stalled:
“On agentic commerce more broadly, you have also seen meaningful strategic pivots from some of the platforms that were most aggressively pursuing that space. That validates our view that the barriers to progress in agentic were likely not technical, but around user behavior and ecosystem incentives.”
At the same time, it’s clear that AI is being used for top-of-funnel and mid-funnel search and is driving traffic to merchants (in commerce and beyond). Shopify’s was the more optimistic view:
"The early signals on AI channels are really compelling. In the first quarter, AI-driven traffic to Shopify stores has grown 8x year-over-year, while orders from AI-powered searches have increased nearly 13x. And within this, new buyer orders are occurring at nearly twice the rate of other channels." - Harley Finkelstein, President, Shopify