Big Tech x Generative AI Q3 '24 Update (Part 2)
How Meta and Microsoft's Generative AI investments are going so far
This is a weekly newsletter about the business of the technology industry. To receive Tanay’s Newsletter in your inbox, subscribe here for free:
Hi friends,
Last week, I rounded up what Amazon, Apple and Alphabet are doing in Generative AI and the early results they're seeing based on their earnings calls. This week, I'll discuss what we can learn from Meta's and Microsoft's investments in Generative AI so far, based on their Q3 earnings calls.
Meta
Meta's investments in AI fall along four key areas: i) Llama to power the AI features in their products, ii) Meta AI, iii) Core Recommendations and Ads, and iv) AI in Reality Labs Devices. Let's take them one by one.
Llama
Meta continues to invest in open source, with the launch of the Llama 3.2 models this past quarter, including the best-performing open-source multimodal models. Zuckerberg believes that open source will continue to offer the best performance-to-cost ratio for developers.
“It seems pretty clear to me that open source will be the most cost-effective, customizable, trustworthy performance and easiest to use option that is available to developers. And I am proud that Llama is leading the way on this.”
Llama 4 is in the works, with Meta continuing to ramp up the compute and scale of these models to improve performance. We may see them launch as early as Q1 next year.
“We're training the Llama 4 models on a cluster that is bigger than 100,000 H100s or bigger than anything that I've seen reported for what others are doing. I expect that the smaller Llama 4 models will be ready first, and they'll be ready -- we expect sometime early next year. And I think that they're going to be a big deal on several fronts, new modalities, capabilities, stronger reasoning and much faster.”
Meta AI
Meta AI is Meta's flagship consumer product built around their Llama models, and it is available across all their key applications.
Meta AI is already at 500 million monthly active users, and Meta believes it is on track to be the most used AI assistant in the world by end of year.
Meta AI added voice mode, better image generation, and a host of features that have improved engagement, and Zuckerberg hinted at more improvements coming, particularly in line with Llama 4 (perhaps agentic actions?).
“I'm sort of intentionally now not saying too much about the new capabilities and modalities that we're launching with Llama 4 that are coming to Meta AI. I mean I noted in the comments upfront that there -- with each major generational update, I expect that there will be large new capacities that get added. But I think that, that's just going to be -- that's partially what I'm excited about, and we'll talk more about that next year when we're ready to.”
Recommendations and Ads
In terms of direct, short-term ROI on better AI, improving the feed recommendation system and the ads engine is the bread-and-butter move for Meta, and they're already starting to see improvements from these advances in AI.
On the recommendations side, they’re already seeing an increase in time spent from better ranking work, which directly translates into more ad impression opportunities.
“Improvements to our AI-driven feed and video recommendations have led to an 8% increase in time spent on Facebook and a 6% increase on Instagram this year alone.”
Perhaps even more interestingly, Meta has observed that scaling laws apply to recommendation models, leading them to rearchitect their recommendation models and see better results.
"Previously, we operated separate ranking and recommendation systems for each of our products because we found that performance did not scale if we expanded the model size and compute power beyond a certain point. However, inspired by the scaling laws we were observing with our large language models, last year, we developed new ranking model architectures capable of learning more effectively from significantly larger data sets. To start, we have been deploying these new architectures to our Facebook ranking video ranking models, which has enabled us to deliver more relevant recommendations and unlock meaningful gains in launch time. Now we're exploring whether these new models can unlock similar improvements to recommendations on other services.”
On the ads side, improved systems and AI applied to more data are making ads more efficient, delivering better results for advertisers (and enabling Meta to increase prices per ad impression).
“We are finding opportunities to achieve meaningful ads performance gains by adopting new approaches to modeling. For example, we recently deployed new learning and modeling techniques that enable our ad systems to consider the sequence of actions a person takes before and after seeing an ad…Since we adopted the new models in the first half of this year, we've already seen a 2% to 4% increase in conversions based on testing within selected segments.”
Lastly, tools to help advertisers create better ads are also getting adopted, and advertisers that use them are seeing better ad results.
“More than 1 million advertisers used our Gen AI tools to create more than 15 million ads in the last month. And we estimate that businesses using image generation are seeing a 7% increase in conversions and we believe that there's a lot more upside here.”
Reality Labs Devices
Finally, another area where AI is helping Meta see momentum is their Ray-Ban Meta glasses. Demand for the glasses, in large part thanks to improved AI features and a deeper Meta AI integration, continues to be strong, with even the $1,000 clear glasses selling out almost immediately.
Ray-Ban Meta glasses are the prime example here (of integration between AI and wearables). They're great-looking glasses that let you take photos and videos, listen to music and take calls.
But what makes them really special is the Meta AI integration. With our new updates, it will be able to not only answer your questions throughout the day, but also help you remember things, give you suggestions as you're doing things using real-time multi-modal AI and even translate other languages right in your ear for you.
I continue to think that glasses are the ideal form factor for AI because you can let your AI see what you see, hear what you hear and talk to you.
Microsoft
Microsoft has positioned itself as a leader in the AI wave, and is already starting to see strong results, particularly via Azure, with their aggregate AI business expected to surpass $10B ARR next quarter.
Our AI business is on track to surpass an annual revenue run rate of $10 billion next quarter, which will make it the fastest business in our history to reach this milestone.
They are infusing AI across their business, but let's talk about it in a few key areas: i) Infrastructure, including Azure AI, ii) Developers, including GitHub Copilot, iii) Enterprise Applications, and iv) Consumer Applications.
Infrastructure
At the silicon and accelerator layer, Microsoft continues to have a broad range of offerings, including their own Cobalt 100 VMs and Maia 100 accelerator.
We offer the broadest selection of AI accelerators, including our first-party accelerator, Maia 100 as well as the latest GPUs from AMD and NVIDIA. In fact, we are the first cloud to bring up NVIDIA's Blackwell system with GB200-powered AI servers
Azure continues to take share for Microsoft, and their partnership with OpenAI continues to bear fruit for them, with Azure OpenAI usage doubling in the past six months.
Our partnership with OpenAI also continues to deliver results. We have an economic interest in a company that has grown significantly in value, and we have built differentiated IP and are driving revenue momentum. More broadly with Azure AI, we are building an end-to-end app platform to help customers build their own Copilots and agents. Azure OpenAI usage more than doubled over the past 6 months as both digital natives like Grammarly and Harvey as well as established enterprises like Bajaj Finance, Hitachi, KT, and LG move apps from test to production.
Microsoft also noted that most of their Azure AI revenue comes from inference, because they're turning away some of the raw GPU leasing demand for training given capacity constraints.
If you sort of think about the point we even made that this is going to be the fastest growth to $10 billion of any business in our history, it's all inference, right? One of the things that may not be as evident is that we're not actually selling raw GPUs for other people to train. In fact, that's sort of a business we turn away because we have so much demand on inference that we are not taking what I would -- in fact, there's a huge adverse selection problem today where people -- it's just a bunch of tech companies still using VC money to buy a bunch of GPUs.
We kind of really are not even participating in most of that because we are literally going to the real demand, which is in the enterprise space or our own products like GitHub Copilot or M365 Copilot. So I feel the quality of our revenue is also pretty superior in that context.
Developers
GitHub Copilot continues to have strong momentum, with enterprise customers increasing 55% quarter-over-quarter. In addition, Copilot continues to get more agentic across the developer workflow, with launches including GitHub Copilot Workspace.
We are introducing the next phase of AI code generation, making GitHub Copilot agentic across the developer workflow. GitHub Copilot Workspace is a developer environment which leverages agents from start to finish so developers can go from spec to plan to code all in natural language.
AI is also helping grow the low-code / no-code movement and momentum for Microsoft's Power Platform offering, with adoption growing 4x year-over-year.
“We have brought generative AI to Power Platform to help customers use low-code, no-code tools to cut costs and development time. To date, nearly 600,000 organizations have used AI-powered capabilities in Power Platform, up 4x year-over-year.”
Enterprise Applications
Microsoft continues to infuse AI in every application they have, from their core office suite to Dynamics 365 to Industry-specific products.
While Microsoft didn’t share too much by way of revenue numbers and uptake, they did give us some indications of how it’s going, also noting that some companies are saving 3 hours per week per employee by adopting it.
“Nearly 70% of the Fortune 500 now use Microsoft 365 Copilot, and customers continue to adopt it at a faster rate than any other new Microsoft 365 suite.”
Revenue impact from it is expected to continue to grow over time as adoption increases:
We continue to see growth in M365 Copilot seats, and we expect the related revenue to continue to grow gradually over time.
In their CRM and ERP products, Microsoft is going beyond copilots with specialized agents:
“Just last week, we added 10 out-of-the-box autonomous agents to Dynamics 365 that helps customers automatically qualify sales leads, track suppliers and work hand-in-hand with service reps to resolve issues.”
Lastly, their industry-specific AI products are also doing well, with DAX Copilot for healthcare ramping revenue faster than GitHub Copilot did.
One year in, DAX Copilot is now documenting over 1.3 million physician patient encounters each month at over 500 health care organizations like Baptist Medical Group, Baylor Scott & White, Baltimore Medical Center, Novant Health, and Overlake Medical Center. It is showing faster revenue growth than GitHub Copilot did in this first year. And new features extend DAX beyond notes helping physicians automatically draft referrals after visit instructions and diagnostic evidence.
Consumer Applications
Microsoft also continues to leverage AI across their consumer properties such as LinkedIn, Browsers and Search.
On LinkedIn, Microsoft launched AI features to benefit sellers, recruiters, and job seekers, and those who use them are finding that the features make them more effective.
Our AI-powered tools also continue to transform how people sell, learn and hire. In sales, new AI features help every team member perform at the level of top sellers and drive more profitable growth. LinkedIn's first agent hiring assistant will help hirers find qualified candidates faster by tackling the most time-consuming task. Already hirers who use AI Assistant messages see a 44% higher acceptance rate compared to those who don't. And our hiring business continues to take share.
Lastly, Microsoft continues to take share (albeit at a relatively small scale) with Bing and Edge in the search and browser markets, in large part because of their Copilot and AI features.