NVIDIA and its GPU customers are now a large driver of equity market returns, earnings growth, earnings revisions, industrial production and capital spending. NVIDIA’s financial results are extraordinary (it beat on revenues and earnings again last week, and announced a $50 billion stock buyback); NVIDIA has also experienced the fastest road to being the market’s largest stock in the post-war era.
But for investors, the more important questions look past the economics of selling GPUs and focus on the ability of hyperscalers (Google, Amazon, Microsoft, Meta, etc) and other AI infrastructure users/providers to earn adequate returns on hundreds of billions in AI-related capital spending. The level of this spending now rivals the mainframe era of the late 1960’s and the fiber optic deployment of the late 1990’s. For adequate returns on AI infrastructure to materialize, within the next 12-18 months we will need to see a greater shift in favor of “inference” tasks (AI used to run production models for corporate customers) rather than GPU capacity primarily being used to train foundational models and chatbots. In this Eye on the Market, we take a closer look.
Good morning, everybody. This is Michael Cembalest with the September 2024 Eye on the Market podcast. This one's called "A Severe Case of COVIDIA," which everybody has. It's the prognosis for a U.S. equity market that's increasingly driven by what's going on with Nvidia and its GPU customers. Nvidia reported last week that it beat on revenues, it beat on earnings, and it announced a $50 billion stock buyback. The numbers look great, but I thought it was worth taking a look at what needs to happen for this to be sustained, given how much of the U.S. equity markets, the economy, capital spending and industrial production is being driven by the GPU/AI phenomenon.
First, I do have some show and tell. This is a picture of me in August in Trinidad, where I made my annual pilgrimage to go tarpon fishing. That's a nice 40- to 50-pound tarpon that I caught from my kayak. I took some clients with me; we had a great time and plan on going back again next year. Hope everybody had a nice August.
So where are we? COVIDIA is defined by a handful of stocks really dominating a lot of aspects of U.S. financial markets and economic activity. And you've all seen this chart before. The Mag 7 now represents some crazy number, like 70% of all of the large-cap U.S. stock market, way above the peak in 2000, which was about 18%, and way above the normal levels of, like, 10% or 12%. So this is a large-cap-driven market.
Increasingly, the largest stocks are driving both market returns and earnings revisions. As a matter of fact, if you look at earnings revisions in 2024, the only quintile with positive earnings revisions was the one containing the biggest companies; the other four quintiles had falling earnings revisions. And if you look at a chart on industrial production: excluding high tech, industrial production is basically flat across industry, but it's rising sharply for semiconductors, computer equipment and communications equipment.
And then this is really, in some ways, the most amazing chart. If you look at capital spending and R&D, the Mag 7 stocks are growing at an enormous rate compared to the overall stock market, excluding those seven stocks, something like a 25%, 30% increase in capital spending just from January of this year. So this is an unbelievably capital-intensive, earnings-intensive, growth-intensive domination by the largest stocks.
And obviously, a lot of this revolves around Nvidia, its GPUs and the AI transformation that's taking place. Now, to be clear, concentration of assets and revenues in the U.S. has been going on for a long time. We have a chart in here showing that since 1930, so basically for the last 100 years, there's been a steady creep of concentration in terms of assets and revenues of the largest 1% of companies. But we've now reached some new extremes. If you look at the percentage of stocks that are outperforming the S&P 500, it's only about a third. And that's about as low as it gets over the last 30 years or so.
Nvidia's margins: I think they reported margins of 75%, and even before that, their margins of 60% were well ahead of the other Mag 7 companies. Nvidia has also had the fastest road to the top. We have a chart here that starts in the 1960s and looks at all the stocks that eventually became the largest in the market at the time, and what their road to the top looked like.
And if you look back, nothing has ever looked like Nvidia. The closest comparison would be Cisco, whose market cap grew really sharply in the two to three years before the year 2000. But the speed and magnitude of Nvidia's climb over the last six to eight quarters is unmatched looking back over the last 50 or 60 years in U.S. capital markets.
And the other amazing thing is the speed with which revenues and earnings are being revised higher. We have a chart here showing what the projections for 2024 revenues were just over a year ago, and how those projections have since roughly tripled. So again, the speed with which the company is growing its revenues and earnings is remarkable.
And that's really the big difference when everybody brings up parallels to the dot-com boom. If you look at the share price of Nvidia versus its earnings, they've gone up in lockstep since 2022. We have some charts in here that compare that, for example, to Cisco in 1998 and 1999, where that didn't happen: you had an enormous surge in the stock price but a very, very small increase in earnings. So obviously that wasn't sustainable, and the whole thing fell apart.
The markets right now are pretty optimistic on the AI transformation. The way I'd make that judgment is to create two buckets. The first is the AI suppliers: things like Nvidia, AMD and a couple of other smaller companies. The second is the AI beneficiaries: the companies buying those GPUs and using them for business purposes, or renting out compute to people doing AI transformations. That's the hyperscalers like Microsoft, Google, Amazon and Meta, and also companies like Salesforce, Adobe and Intuit. And the P/E multiples for the AI suppliers are more or less in the same ZIP code as the P/E multiples for the AI beneficiaries.
There was a time last year when the AI supplier multiples were meaningfully higher. So the AI beneficiaries have caught up, which is one way of thinking about the stages of the transformation and that the markets are now more aggressively pricing in not just that companies like Nvidia are going to make a lot of money selling the GPUs, but the markets are pricing in the notion that the beneficiaries are going to make a lot of money using those GPUs. And that's what we're going to dive into here in a minute.
In terms of competition, there are different ways of measuring Nvidia's dominance. You can look at data center revenues, or you can look at advanced chip revenues. Most of the numbers you'll find show data center or advanced AI chip GPU market share of somewhere around 90%. That leaves 10% for everybody else, whether it's AMD, Intel, ARM (which is owned by SoftBank), Amazon, Google, Microsoft, Apple, Cerebras, Groq, Meta or Tesla. So this is as dominant a market as you'll find anywhere in terms of one company's market share; maybe ASML in lithography machines comes close.
And so there's a lot of effort being put into developing competitive GPUs to Nvidia's. We'll see what happens. Some of the rumored partnerships between OpenAI and other companies don't appear to be gathering much steam. And some of the chips being developed will probably be used for inference, which has lower computational intensity and complexity than model training. So it does look like Nvidia is going to control the lion's share of the GPU market for probably at least the next two to three years.
So what do we make of this condition called COVIDIA and how durable this is and what the long-term prognosis is for the U.S. equity markets? I think it's less about Nvidia's financials, which are spectacular, and more related to whether or not the AI transformation that a lot of people are talking about is actually going to take place. Because it's one thing for Nvidia to make a lot of money selling the GPUs. Eventually, the people buying or renting those GPUs need to make a lot of money in order to pay for them and for this whole ecosystem to be sustainable.
So if you make a list of all the people that will tell you that the AI transformation is really the most remarkable thing they've ever seen, and it's gathering steam, you'll definitely find that from McKinsey reports. Every 15 minutes, McKinsey is talking about some incredible transformation. And McKinsey's projection is that generative AI is going to add around $8 trillion to the global economy. McKinsey is somewhat famous for making these projections against which they'll never be measured.
Sam Altman has talked about AI being the biggest and most important of all technology revolutions. Let's get more into the weeds of investing. Y Combinator, the VC accelerator, 70% of all their deals last year were AI. And that's more meaningful to me because if you look historically, the average annual return on Y Combinator deals since 2005 is about 175%. So the track record of Y Combinator is actually pretty impressive.
AI accounts for 40% or 50% of new unicorns created in 2024. The technology itself in terms of language models and transformers is improving so rapidly that the performance benchmarks that people use to assess them for reading comprehension and image classification and advanced math keep becoming obsolete, and people keep having to make new benchmarks to measure them because they keep getting better.
Now remember, a lot of those benchmarks are simply measuring memorization. And I haven't seen too many benchmarks that don't do that. They're just measuring the ability of a model to do a good job answering questions about information that it had already been trained on, which is called contamination. But still, the models are getting better, and the benchmarks are getting tougher. And we're starting to see that it's not just call centers and coders and professional writers that are affected, but a broader universe of jobs that are potentially impacted by this stuff.
As another sign of investor confidence, Elon Musk's new startup, xAI, raised $6 billion at a valuation of $24 billion just a few months ago and plans to build the world's largest supercomputer in Tennessee for model training. So that's all kind of amazing news, and yet you knew there was a but coming. And if you read the Eye on the Market, we'll go into a little bit more detail here.
I just have a bunch of questions. Is OpenAI really going to lose $5 billion this year? Because that's what they're rumored to be on track for, according to reporting from The Information. Is it really going to cost $100 billion to train a single AI model in two or three years, which is a figure that Anthropic's CEO cited?
And then there was this piece by an analyst at Sequoia Capital, and they know as much about AI as anybody. They estimate roughly $600 billion of data center spend, but can only come up with a current estimate of about $100 billion in revenues earned on that data center capacity. So they're asking: where is the missing $500 billion in revenues going to come from so that the hyperscalers can break even on their data center spend?
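The arithmetic behind that Sequoia-style gap is simple enough to write out. A minimal sketch, using the rough spend and revenue estimates cited above (these are approximations from the analysis, not precise figures):

```python
# Back-of-envelope version of the Sequoia-style "missing revenue" question.
# Both inputs are the rough estimates cited above, not precise data.

data_center_spend = 600e9    # estimated AI data center spend, USD
current_ai_revenue = 100e9   # estimated current AI revenue run rate, USD

# The gap the hyperscalers would need to close just to break even
revenue_gap = data_center_spend - current_ai_revenue
print(f"Revenue gap: ${revenue_gap / 1e9:.0f} billion")  # → Revenue gap: $500 billion
```

The point of the sketch is just that the breakeven question is a subtraction: the spend is known with reasonable confidence, and everything hinges on how fast the revenue side grows.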
There was some interesting analysis from Barclays that looked at the amount of compute infrastructure being built. They estimate that the amount that's been built could power 12,000 versions of ChatGPT. Now, they're making that assumption based on the current number of daily users and the number of queries per day. Let's assume that they're underestimating a killer AI app which would have a lot more users and more queries. But you get the point. There's a lot of infrastructure being built out to support a lot of potential AI applications that don't exist yet.
And why do most respondents to surveys about how much money they save by using AI applications cite cost decreases of just 10% or less? I wouldn't scoff at being able to chop 10% off my company's expenses; that's a meaningful addition to net income and operating margins. But still, I was expecting higher numbers. This was a New Street Research survey that looked across seven or eight different industries and asked practitioners how much money they were saving by using generative AI.
And then there was a study out of MIT from Daron Acemoglu. Obviously, you have to make a bunch of assumptions, but he looked at total factor productivity growth from AI and estimated a boost of just six basis points a year. In the Eye on the Market, we walk through the numbers and how he got there. That's only about a tenth of the annual total factor productivity growth of the last 20 years.
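To put six basis points in context, here's a hedged sketch of the comparison. The roughly 60-bps-per-year baseline is an assumption implied by the "about a tenth" relationship above, not a figure from the study itself:

```python
# Sketch: Acemoglu's estimated AI productivity boost vs. the recent TFP trend.
# The 60 bps/year baseline is an assumed figure implied by the "about a
# tenth" comparison above; treat both numbers as rough.

ai_boost = 0.0006    # 6 basis points per year
tfp_trend = 0.006    # ~60 bps per year, assumed trailing-20-year TFP growth

ratio = ai_boost / tfp_trend
cumulative_10yr = (1 + ai_boost) ** 10 - 1  # compounded over a decade

print(f"AI boost is {ratio:.0%} of the TFP trend")        # → 10%
print(f"Cumulative 10-year boost: {cumulative_10yr:.2%}")  # → 0.60%
```

Even compounded over a full decade, six basis points a year adds up to less than a single year of trend productivity growth, which is why the estimate reads as so pessimistic relative to the hype.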
So there are still a lot of questions out there, and we're trying to get into some of them. Where we end up is that over the next 12 to 18 months, there's going to have to be a handoff from model training to inference, which is model usage, in order for there to be a clear path to generating the revenue that pays for all the capital spending going into AI infrastructure. For those of you following the webcast on the screen, here's the chart from the survey of sales, marketing, legal, HR, supply chain, software and IT functions, asking about the highest and best uses of generative AI. The vast majority of respondents cited cost savings of 10% or less.
Then the other thing is: why do so many ChatGPT users use it infrequently? If you look at ChatGPT use by country, you'll see some high numbers, but the majority of users have used it only once or twice, or use it maybe monthly. Only a very small share of users use it on a daily or weekly basis.
And when you look at the costs of Nvidia's benchmark GPU server, the DGX H100, it's about 10 times the cost of a CPU server once you account for amortization of capital costs, electricity, maintenance, software and so on. So as a rough rule of thumb, the productivity benefits from GPUs have to be 10 times higher than from CPUs in order for all of this server expenditure to make sense.
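That 10x rule of thumb can be written as a simple breakeven condition. In the sketch below, the cost figures are hypothetical placeholders chosen to produce a roughly 10x all-in cost ratio; they are not actual DGX H100 or CPU-server prices:

```python
# Illustrative sketch of the GPU-vs-CPU breakeven rule of thumb.
# Capex and operating figures are hypothetical placeholders, not actual
# DGX H100 or CPU-server pricing; only the ~10x ratio matters here.

def annual_cost(capex: float, years: int, opex: float) -> float:
    """All-in annual cost: straight-line capex amortization plus
    electricity, maintenance, software and other operating costs."""
    return capex / years + opex

gpu_server = annual_cost(capex=300_000, years=4, opex=25_000)  # 100,000/yr
cpu_server = annual_cost(capex=30_000, years=4, opex=2_500)    # 10,000/yr

cost_ratio = gpu_server / cpu_server
print(f"GPU/CPU all-in cost ratio: {cost_ratio:.1f}x")  # → 10.0x

# Breakeven condition: the productivity benefit of running a workload on
# the GPU server must exceed this cost ratio for the spend to make sense.
required_productivity_multiple = cost_ratio
```

The design point is that the comparison has to be done on all-in annual cost, not sticker price: amortization period and operating costs can move the ratio materially in either direction.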
In terms of where we are in 2024, data center spend on model training and R&D is a little more than twice as much, around two-and-a-half times more, than the spend on inference. And again, inference refers to completed generative AI applications being used by end users: banks, insurance companies, accounting firms, pharmaceutical companies, retailers like Walmart. So right now, model training spend is around two-and-a-half times higher than the expenditures associated with applications that are actually running.
So that's going to have to flip, right? I think that ratio has to change substantially, maybe even inverting from two-and-a-half to one into one to two-and-a-half, over the next 12 to 18 months for the markets to sustain the multiples they're putting on some of these stocks, because you've got to see more evidence of the handoff. Now, what kind of AI adoption rates can you find? There are a number of different ways of looking at it.
The Census Bureau canvasses 15 different sectors and asks them about AI adoption rates. This is the best thing we found if you're looking for evidence that the AI transformation is happening. They look at professional writing, educational services, the financial industry, healthcare, real estate, software, arts and entertainment. And when you look at the survey responses from 2023 and 2024, projected into 2025, the share of firms using AI keeps going up. As you'd expect, in the software and infotech space it's almost a quarter.
So those numbers look good, and if you were just looking at this, you'd say the AI transformation is on track and everything looks good. But then there's some information from Bain. This is kind of like being in the dark and trying to feel your way through some very abstract information, because there's no one true answer. Bain looked at different enterprises, and their adoption numbers by industry were a little lower. They show a lot of projects in the development and pilot phase but not so much in production, and some of the actual production numbers declined in 2024 versus 2023.
So the Bain numbers show a lot of companies launching development and pilot programs, but the stickiness of AI in production is a little lower. And then the most dour take is from the Fed, which looks at AI in terms of job openings and the skills they require. When they looked back 15 years at postings requiring cloud computing and smart device skills, those were spreading across urban areas a lot faster than AI skill requirements are today. AI is only picking up at a pace slightly faster than 3D printing did.
So you kind of have a mixed bag here in terms of AI adoption speed. I might be a little too pessimistic, or maybe impatient is the right word. It took 20 years for e-commerce to reach 20% of sales. The iPhone came out in 2007; it took three years for the Uber app to show up, and then another five years or so for Uber to reach 20 or 30 million monthly active users. So it normally takes a while. I think I'm just responding to the hype around generative AI from some of its most vocal adherents, and I'm surprised that the pace of adoption doesn't quite match that enthusiasm.
So just to wrap up—and again, as usual, there's a lot more information in the written Eye on the Market, where we get into all of these things. Every computing cycle works like this. First you get infrastructure, then platforms get developed, and then applications. And if that's the case, maybe it's too soon to worry that there's no killer app like there was in the enterprise resource planning software of the '90s or the search and e-commerce applications of the 2000s.
But within the next couple of years, all of those corporate AI adoption trends that we show are going to have to move a lot higher—in other words, more inference activity and spending—to avoid a metaverse outcome for all the capital that's been deployed. And I would take a look at that Sequoia Capital analysis, because that's the one that's really interesting in terms of framing how much value and revenue has to come from the corporate sector to amortize the capital spending costs in terms of what's being put in the ground by the hyperscalers.
And on the last page, we have a chart showing that Nvidia's data center revenues are approaching around 15% of all market-wide capital spending. That's an unbelievable number: one company's data center revenues representing 15% of all the capital spending taking place in the market. That has happened twice before. It happened in 1969, at the peak of the mainframe era, with IBM, and then in the year 2000 with Cisco, Lucent and Nortel, measuring their revenues as a percentage of market-wide capital spending.
So I think we all just need to be mindful that there is a remarkable explosion of AI-related capital spending. And what investors should be doing, and what we're doing on a day-to-day basis, is monitoring as closely as we can, not just the phenomenal numbers at Nvidia, but let's see what's happening to the revenues and the margins of the people buying and renting those GPUs. And then let's see the most important question—what's happening to the margins and revenues of the companies that are the customers of all that AI infrastructure who are supposed to be using AI applications to make their lives easier, faster, cheaper and more productive?
So we're in the earlier stages of this transformation, but the markets are pricing it in pretty aggressively. This was an important piece for us to plow through so that we could understand the different stages, what the markets are pricing in, and what kind of barometers we need to watch going forward. So thank you for listening. Unless the election is canceled, we will have a piece on the election in October, and we'll talk to you again then. Thanks for participating in the webcast. See you next time.
Good morning, everybody. This is Michael Cembalest with the September 2024 Eye on the Market podcast. This one's called "A Severe Case of COVIDIA," which everybody has. It's the prognosis for an AI-driven U.S. equity market that's driven by what's going on with Nvidia and its GPU customers. Nvidia reported last week that it beat revenues, it beat earnings, and it announced a stock split. The numbers look great, but I thought it was interesting for us to take a look at what needs to be sustained for the U.S. equity markets and the economy, and capital spending and industrial production to be driven so much by the GPU AI phenomenon.
First, I do have some show and tell. This was a picture here of me in August in Trinidad. I'd make my annual pilgrimage to go tarpon fishing. That's a nice 40-, 50-pound tarpon that I catch on my kayak. I took some clients with me. We had a great time and plan on going back again next year. Hope everybody had a nice August.
So where are we? COVIDIA is defined by a handful of stocks really dominating a lot of aspects of U.S. financial markets and economic activity. And you've all seen this chart before. The Mag 7 now represents some crazy number, like 70% of all of the large-cap U.S. stock market, way above the peak in 2000, which was about 18%, and way above the normal levels of, like, 10% or 12%. So this is a large-cap-driven market.
Increasingly, the largest stocks are driving both market returns and earnings revisions. As a matter of fact, if you look at earnings revisions in 2024, the only quintile that had positive earnings revisions were the big companies. The other four quintiles had falling earnings revisions. If you look at a chart on industrial production, excluding high tech, industrial production is basically flat for all of industry, but is rising sharply for semiconductors, computer equipment and communications equipment.
And then this is really, in some ways, the most amazing chart. If you look at capital spending and R&D, the Mag 7 stocks are growing at an enormous rate compared to the overall stock market, excluding those seven stocks, something like a 25%, 30% increase in capital spending just from January of this year. So this is an unbelievably capital-intensive, earnings-intensive, growth-intensive domination by the largest stocks.
And obviously, a lot of this revolves around Nvidia, its GPUs and the AI transformation that's taking place. Now, to be clear, concentration of assets and revenues in the U.S. has been going on for a long time. We have a chart in here showing that since 1930, so basically for the last 100 years, there's been a steady creep of concentration in terms of assets and revenues of the largest 1% of companies. But we've now reached some new extremes. If you look at the percentage of stocks that are outperforming the S&P 500, it's only about a third. And that's about as low as it gets over the last 30 years or so.
Nvidia's margins—I think they reported margins of 75%, but even before that, their margins of 60% were well ahead of the rest of the other Mag 7 companies. And Nvidia's also had the fastest road to the top. And we have a chart here that starts in the 1960s and looks at all the stocks that eventually became the largest stocks in the market at the time and what their road to success was.
And if you look back, there's nothing that ever has looked like Nvidia. The closest thing that you could compare it to would be Cisco, where its market cap grew really sharply in the, in the two to three years before the year 2000. But the speed and magnitude of Nvidia's climb over the last six to eight quarters is unmatched looking back over the last 50, 60 years in U.S. capital markets.
And the other amazing thing is the speed with which revenues and earnings are being revised higher. And we have a chart here that shows, just a little over a year ago, what the projections were for revenues and how those projections have now tripled for 2024 compared to what the 2024 projections were just a few months ago. And so, again, the speed with which the company is growing its revenues and earnings are kind of remarkable.
And that's really—when everybody brings up parallels to dot-com boom, that's really the big difference. And if you look at the share price of Nvidia versus its earnings, they've gone up in lockstep since 2022. And we have some charts in here that compares that, for example, to Nvidia in 1988, 1999. That didn't happen. You had an enormous surge in the stock price, but you had a very, very small increase in earnings. And so obviously that wasn't sustainable, and the whole thing fell apart.
The markets right now are pretty optimistic on the AI transformation. And the way that I would make that judgment is—let's create two buckets of AI suppliers. So that would be things like Nvidia, AMD and a couple of other smaller companies. And then you can look at the AI beneficiaries, the companies that are buying those GPUs and then using them for business purposes, renting out compute space for people doing AI transformations. So that's going to be the hyperscalers like Microsoft, Google, Amazon and Meta, and then also companies like Salesforce and Adobe and Intuit. And the P/E multiples for the AI suppliers are more or less in the same ZIP code as the P/E multiples for the AI beneficiaries.
There was a time last year when the AI supplier multiples were meaningfully higher. So the AI beneficiaries have caught up, which is one way of thinking about the stages of the transformation and that the markets are now more aggressively pricing in not just that companies like Nvidia are going to make a lot of money selling the GPUs, but the markets are pricing in the notion that the beneficiaries are going to make a lot of money using those GPUs. And that's what we're going to dive into here in a minute.
In terms of competition, there's different ways of measuring Nvidia's dominance. You can look at data center revenues. You could look at advanced chip revenues. Most of the numbers that you'll find will show data center or advanced AI chip GPU market share of somewhere around 90%. And that leaves 10% for everybody else, whether it's AMD or Intel or ARM, which is owned by SoftBank, Amazon, Google, Microsoft, Apple, Cerebras, Groq, Meta, Tesla. So this is as dominant a market as you'll find anywhere in terms of Nvidia's market share. Maybe ASML and lithography machines is something close.
And so there's a lot of effort being put in to develop competitive GPUs to Nvidia. We'll see what happens. Some of the rumored partnerships between OpenAI and other companies don't appear to be gathering much steam. And then some of the chips that are being developed are probably be used for inference, which is lower computational intensity and complexity than model training. So it does look like Nvidia is going to control the lion's share of the GPU market for probably at least the next two to three years.
So what do we make of this condition called COVIDIA and how durable this is and what the long-term prognosis is for the U.S. equity markets? I think it's less about Nvidia's financials, which are spectacular, and more related to whether or not the AI transformation that a lot of people are talking about is actually going to take place. Because it's one thing for Nvidia to make a lot of money selling the GPUs. Eventually, the people buying or renting those GPUs need to make a lot of money in order to pay for them and for this whole ecosystem to be sustainable.
So if you make a list of all the people that will tell you that the AI transformation is really the most remarkable thing they've ever seen, and it's gathering steam, you'll definitely find that from McKinsey reports. Every 15 minutes, McKinsey is talking about some incredible transformation. And McKinsey's projection is that generative AI is going to add around $8 trillion to the global economy. McKinsey is somewhat famous for making these projections against which they'll never be measured.
Sam Altman has talked about AI being the biggest and most important of all technology revolutions. Let's get more into the weeds of investing. Y Combinator, the VC accelerator, 70% of all their deals last year were AI. And that's more meaningful to me because if you look historically, the average annual return on Y Combinator deals since 2005 is about 175%. So the track record of Y Combinator is actually pretty impressive.
AI accounts for 40% or 50% of new unicorns created in 2024. The technology itself in terms of language models and transformers is improving so rapidly that the performance benchmarks that people use to assess them for reading comprehension and image classification and advanced math keep becoming obsolete, and people keep having to make new benchmarks to measure them because they keep getting better.
Now remember, a lot of those benchmarks are simply measuring memorization. And I haven't seen too many benchmarks that don't do that. They're just measuring the ability of a model to do a good job answering questions about information that it had already been trained on, which is called contamination. But still, the models are getting better, and the benchmarks are getting tougher. And we're starting to see that it's not just call centers and coders and professional writers that are affected, but a broader universe of jobs that are potentially impacted by this stuff.
As another sign of investor confidence, Elon Musk's new startup, xAI, raised $6 billion at a valuation of $24 billion just a few months ago and plans to build the world's largest supercomputer in Tennessee for model training. So that's all kind of amazing news, and yet you knew there was a but coming. And if you read the Eye on the Market, we'll go into a little bit more detail here.
I just have a bunch of questions. Is OpenAI really going to lose $5 billion this year? Because that's what they're rumored to be on track, according to some information from a place called The Information. Is it really going to cost $100 billion to train a single AI model in two or three years, which is a figure that Anthropic's CEO cited?
And then there was this piece by an analyst at Sequoia Capital. They know as much about AI as anybody, and they have estimated $600 billion of data center spend, and can only come up with a current estimate of about $100 billion in revenues earned on that data center space. So they're asking, where's the missing $500 billion in revenues going to come from so that the hyperscalers can break even based on their data center spend?
There was some interesting analysis from Barclays that looked at the amount of compute infrastructure being built. They estimate that the amount that's been built could power 12,000 versions of ChatGPT. Now, they're making that assumption based on the current number of daily users and the number of queries per day. Let's assume that they're underestimating a killer AI app which would have a lot more users and more queries. But you get the point. There's a lot of infrastructure being built out to support a lot of potential AI applications that don't exist yet.
And why do most respondents to surveys of people who use AI applications cite cost decreases of just 10% or less? I wouldn't scoff at being able to chop 10% off my company's expenses; that's a meaningful addition to net income and operating margins. But still, I was expecting higher numbers. This was a new Street research survey that looked across seven or eight different industries and asked practitioners how much money they were saving by using generative AI.
And then there was a study from Daron Acemoglu at MIT. He looked at total factor productivity growth from AI and, after making a bunch of necessary assumptions, estimated a productivity boost of just six basis points a year. In the Eye on the Market, we walk through the numbers and how he got there. That's only about a tenth of the annual total factor productivity growth we've seen over the last 20 years.
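As a rough sketch of that comparison. The 6 basis point figure is from the study; the roughly 60 basis point historical average is implied by the "about a tenth" comparison and is an assumption here:

```python
# Sketch of the Acemoglu productivity comparison. The 6 bps figure is cited
# in the text; the ~60 bps historical average is an assumption implied by
# the "about a tenth" comparison, not an exact statistic.
ai_tfp_boost_bps = 6     # estimated annual TFP boost from AI
historical_tfp_bps = 60  # rough average annual TFP growth, last ~20 years
share_of_historical = ai_tfp_boost_bps / historical_tfp_bps
print(f"AI boost is roughly {share_of_historical:.0%} of historical TFP growth")  # 10%
```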
So there are still a lot of questions out there, and we're trying to get into some of them. Where we end up is that over the next 12 to 18 months, there's going to have to be a handoff from model training to inference, which is model usage, in order for there to be a clear path to generating the revenue to pay for all the capital spending going into this AI infrastructure. For those of you following the webcast on the screen, here's the chart where they surveyed sales, marketing, legal, HR, supply chain, software and IT, the highest and best uses of generative AI. The vast majority of respondents cited cost savings of 10% or less.
Then the other thing is, why do so many ChatGPT users use it infrequently? If you look at ChatGPT use by country, you'll see some high numbers. But the majority are using it only once or twice, or perhaps monthly. There's actually a very small number of users who use it on a daily or weekly basis.
And when you look at the cost of Nvidia's benchmark GPU server, the DGX H100, it's about 10 times the cost of a CPU server once you account for the amortization of capital costs, electricity, maintenance and software. So as a rough rule of thumb, the productivity benefits from GPUs have to be 10 times higher than from CPUs in order for all of this server expenditure to make sense.
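That rule of thumb can be made concrete with a small total-cost-of-ownership sketch. The values are normalized for illustration, not actual prices:

```python
# Normalized total-cost-of-ownership (TCO) sketch for the GPU-vs-CPU
# break-even rule of thumb. Values are illustrative, not actual prices.
cpu_server_tco = 1.0   # CPU server TCO normalized to 1 unit
gpu_server_tco = 10.0  # DGX H100-class server: roughly 10x, per the text
# For the GPU spend to pay off, productivity per dollar must at least match
# the CPU baseline, so the GPU's productivity benefit must be ~10x.
required_productivity_multiple = gpu_server_tco / cpu_server_tco
print(f"Required GPU productivity multiple: {required_productivity_multiple:.0f}x")  # 10x
```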
In terms of where we are in 2024, data center spend on model training and R&D was a little more than twice, around two-and-a-half times, the spend on inference. And again, inference refers to completed generative AI applications being used by end users like banks, insurance companies, accounting firms, pharmaceutical companies, or retailers like Walmart.
So that's going to have to flip, right? I think that ratio is going to have to change substantially, maybe even invert from two-and-a-half to one to one to two-and-a-half, over the next 12 to 18 months for the markets to sustain the multiples they're putting on some of these stocks, because you've got to see more evidence of the handoff. Now what kind of AI adoption rates can you find? There are a number of different ways of looking at it.
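That shift can be expressed as spending shares, using the approximate ratios from the text:

```python
# Sketch of the training-to-inference "handoff" using the rough ratios above.
# Today: training spend is about 2.5x inference spend.
training, inference = 2.5, 1.0
training_share_today = training / (training + inference)
# The handoff scenario inverts the ratio to roughly 1 : 2.5.
training_future, inference_future = 1.0, 2.5
inference_share_future = inference_future / (training_future + inference_future)
print(f"Training share of spend today: {training_share_today:.0%}")        # 71%
print(f"Inference share after handoff: {inference_share_future:.0%}")      # 71%
```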
The Census Bureau canvasses 15 different sectors and asks them about AI adoption rates. This is the best thing we found if you're looking for evidence that the AI transformation is happening. They're looking at professional writing, educational services, the financial industry, healthcare, real estate, software, arts and entertainment. And when you look at the survey responses from 2023 and 2024 and the projections for 2025, the share of firms using AI keeps going up. As you'd expect, in the software and infotech space it's almost a quarter.
So those numbers look good. If you were just looking at this, you'd say, okay, the AI transformation is on track; everything looks good. Then there's some information from Bain. This is a bit like being in the dark and trying to feel your way through some very abstract information, because there's no one true answer. Bain looked at different enterprises, and their adoption numbers for different industries were a little lower. They show a lot of projects in the development and pilot phases but not so much in production, and some of the actual production numbers declined in 2024 versus 2023.
So the Bain numbers show a lot of companies launching development and pilot programs, but the stickiness of AI in production is a little lower. And then I would say the most dour take is from the Fed, which looks at AI in terms of job openings: what job skills do you need to have. When they look back 15 years at the demand for cloud computing and smart device skills, job postings in urban areas requiring those skills were increasing a lot faster then than AI-related postings are increasing today. AI is only picking up at a pace a little faster than 3D printing did.
So you have a mixed bag here in terms of the speed of AI adoption. I might be a little too pessimistic, or maybe impatient is the right word. It took 20 years for e-commerce to reach 20% of sales. The iPhone came out in 2007; it took three years for the Uber app to show up, and then another five years or so, I think, for it to reach 20 or 30 million monthly active users. So it normally takes a while. I think I'm just responding to the hype around generative AI from some of its most vocal adherents, and being surprised that the pace of adoption doesn't appear to match that enthusiasm.
So just to wrap up, and as usual, there's a lot more information in the written Eye on the Market, where we get into all of these things. Every computing cycle works like this: first you get infrastructure, then platforms get developed, and then applications. If that's the case, maybe it's too soon to worry that there's no killer app like there was with enterprise resource planning software in the '90s or the search and e-commerce applications of the 2000s.
But within the next couple of years, all of those corporate AI adoption trends we show are going to have to move a lot higher, in other words, more inference activity and spending, to avoid a metaverse outcome for all the capital that's been deployed. And I would take a look at that Sequoia Capital analysis, because it's really interesting in framing how much value and revenue has to come from the corporate sector to amortize the cost of the capital being put in the ground by the hyperscalers.
And on the last page, we have a chart showing that Nvidia's data center revenues are approaching around 15% of all market-wide capital spending. That's an unbelievable number: one company's data center revenues representing 15% of all the capital spending taking place in the market. That has happened only twice before: once in 1969 at the peak of the mainframe era with IBM, and again in 2000 with Cisco, Lucent and Nortel, measuring their revenues as a percentage of market-wide capital spending.
So I think we all just need to be mindful that there is a remarkable explosion of AI-related capital spending. What investors should be doing, and what we're doing on a day-to-day basis, is monitoring as closely as we can not just the phenomenal numbers at Nvidia, but what's happening to the revenues and margins of the people buying and renting those GPUs. And then the most important question: what's happening to the margins and revenues of the companies that are the customers of all that AI infrastructure, who are supposed to be using AI applications to make their lives easier, faster, cheaper and more productive?
So we're in the earlier stages of this transformation, but the markets are pricing it in pretty aggressively. This was an important piece for us to plow through so that we could understand the different stages, what the markets are pricing in, and what kind of barometers we need to watch going forward. So thank you for listening. Unless the election is canceled, we will have a piece on the election in October, and we'll talk to you again then. Thanks for participating in the webcast. See you next time.
Since 2005, Michael has been the author of Eye on the Market, covering a wide range of topics across the markets, investments, economics, politics, energy, municipal finance and more.