The earnings whirlwind for technology and silicon companies has now enveloped Advanced Micro Devices Inc. (AMD) after it reported fourth-quarter and full-year 2023 results. I know that many other stories and analysts will dive into the dollars more deeply than I will, but it’s worth recounting a couple of the key points that are relevant to AMD’s ability to sustain its momentum into 2024.
For the fourth quarter of 2023, compared with a year-earlier quarter that was by all accounts pretty bad for AMD (and the chip market in general), revenue was up 10%, gross profit was up 10%, gross margin was essentially flat, and operating income jumped 12% to over $1.4 billion.
Drilling into AMD’s key business units, the data-center group saw a 38% increase in revenue and a 50% increase in operating income. The client segment, which covers the company’s chips for consumer desktops and laptops, saw revenue increase 62% and operating income climb 136%.
For the first time in any substantial way, the data-center business is now the largest component of AMD’s revenue.
The full-year 2023 results, compared with 2022, are a bit misleading because of just how bad the first half of 2023 was.
The best news for investors during the earnings call came from CEO Lisa Su, who announced that AMD was raising its 2024 revenue projection for the MI300 family of data-center AI accelerators to $3.5 billion from $2 billion, a massive 75% jump. This is based on better-than-expected progress in product validation with customers and the increased demand that has come with it. I wouldn’t be surprised to see that $3.5 billion number AMD is promoting for 2024 turn out to be an under-call; Su mentioned in the Q&A that the company had built up the supply chain to ship “substantially more” than the $3.5 billion mark if demand is there.
AMD continues to tout a projected market size of $400 billion for AI accelerators by 2027. Though details on this projection are still a bit light, if the company can manage to capture just 5% of that market by 2027, investors are looking at a revenue target of $20 billion, with a steep curve up from 2024 to hit that.
Investors are likely questioning whether the company can sustain this kind of growth and momentum. Does it have the expertise to compete with the likes of Nvidia (NVDA) and hold off the rise of chip startups and even in-house silicon expansion at cloud-service companies?
Su has built an execution machine at AMD, giving customers such as Microsoft (MSFT) and Lenovo (992) the confidence to commit their product lines and futures to AMD’s roadmap, something that 10 years ago would have been unthinkable. When the CEO mentioned casually on the earnings call that AMD is speeding up its AI-chip product-release schedule, similar to how Nvidia announced a faster cadence of new products last year, few should doubt that she can make it happen.
On the client side of things, the future is a bit murkier. AMD recently announced its Ryzen 8000 series of laptop chips, which includes a new, faster AI accelerator and makes it one of several new CPU lines coming to market for the AI PC. Later this year, AMD will release its Strix family of chips, which promises to triple AI performance while also introducing a new CPU architecture dubbed Zen 5.
While the product family AMD has in store for 2024 in the consumer space looks to offer high performance and compelling AI and graphics features, market growth in the PC space is projected to be much lower than in the data center. Even though most analysts see a “supercycle” coming in the PC space, thanks to the expansion of compelling AI use cases that will drive consumers to buy new systems, AMD is hedging a bit here.
Risks for AMD are bigger in the client space than in the data-center segment, in my view. For enterprise and cloud-service providers integrating AI accelerators and new chips, the ability to build and customize software for AMD products is a small portion of the overall cost of doing business. If AMD only has to worry about 10 to 20 key applications for its MI300 family of AI GPUs, it can focus its engineering efforts. But in the consumer space, where AI applications and workloads will number in the dozens or even hundreds, Intel (INTC) has a massive software-development team at its disposal that AMD does not. As a result, AI software may be more compatible with Intel AI chips sooner than with AMD’s.
Another part of the risk is that Intel’s latest chips, such as its Core Ultra family, are really good. In a market where OEMs and consumers aren’t starving for new processors (as is the case in the data center), AMD will have to compete more directly on its value proposition. Those Intel Core Ultra chips tend to target higher-priced systems, though, so elsewhere AMD will only have to be better than Intel’s last-generation products to win.
I still expect to see unit- and revenue-share increases for AMD in the client space through 2024. The company has continued to grow its footprint relative to Intel in the laptop space over the past two years, and as in the data-center business, AMD benefits from its record of continued execution. With AMD’s share of client chips edging up from about 15% in 2022 to almost 20% as of late 2023, it seems inevitable that it will continue that growth to upwards of 30% in short order.
Ryan Shrout is president of Signal65 and founder of Shrout Research. Follow him on X @ryanshrout. Shrout has provided consulting services for AMD, Qualcomm, Intel, Arm Holdings, Micron Technology, Nvidia and others. Shrout holds shares of Intel.
More: AMD comes up short on ‘impossible task,’ sending its stock lower