
Is inference AMD’s key to $1 trillion?

Welcome to AI Collision 💥,

In today’s collision between AI and our world:

  • AMD’s run at $1 trillion
  • 80% growth year on year…forever?
  • Amazon’s new Glorm

If that’s enough to get the AI thinking, read on…

AI Collision 💥

Nvidia might’ve trained the AI revolution, but perhaps it’s AMD that’s going to run it.

This past week at its Advancing AI event, AMD's CEO Lisa Su unveiled what might go down as the turning point in AMD's march towards the trillion-dollar club.

And it’s all about inference.

While Nvidia chases ever-larger training clusters and ever more powerful chips for whatever purposes its buyers see fit, AMD is betting big on the next phase of AI, where models don't just learn, they think.

In an interview at the event, Lisa Su said:

“What has really changed is the demand for inference has grown significantly,”

“It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made tremendous progress.”

While the market is fully aware of the demand for Nvidia chips, it's still not widely recognised how many of the big AI players are also going heavy on AMD chips.

xAI is continuing to buy up both Nvidia and AMD chips, while Meta, Microsoft, Oracle and even OpenAI are all aggressively buying AMD's chips and placing orders for its next generation of inference chips.

Sam Altman from OpenAI even called the specs of AMD's MI400 chips "totally crazy" at the event.

But what makes this inference play such a big deal? And how is it inference that could take AMD into the trillion dollar club?

AI as we know it is made up of two very different beasts…

Training is the part where you teach the model: a months-long, data-hungry process in which massive clusters of GPUs crunch billions of tokens to extract meaningful patterns. That's where Nvidia's H100s and Blackwells shine, and dominate the market.

Inference is what happens next. It’s the part that gets embedded into your phone, your Zoom call, your autonomous car, your fraud detection software. It’s ChatGPT replying, thinking, considering, predicting. It’s AI being used. And it’s what the world needs to take AI from a great new technology to a transformative platform for humanity.
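To make the split concrete, here's a minimal sketch of the two modes, assuming a PyTorch-style toy model (the layer sizes and data are purely illustrative): training runs a forward pass, a backward pass and a weight update over and over; inference is a single forward pass with gradients switched off.

import torch
import torch.nn as nn

# Toy model -- sizes are illustrative, not representative of a real LLM.
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))

# Training: forward pass, loss, backward pass, weight update -- repeated over huge datasets.
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
x, y = torch.randn(32, 16), torch.randint(0, 2, (32,))  # stand-in batch of data
model.train()
loss = loss_fn(model(x), y)
loss.backward()      # gradients: the compute- and memory-hungry part
optimiser.step()

# Inference: one forward pass, no gradients -- repeated billions of times in production.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 16)).argmax(dim=1)

Training happens once (or occasionally); inference happens every time anyone, anywhere, uses the model. That asymmetry is the whole point of AMD's bet.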

Inference is where the big money will be and where AMD is firmly setting its sights.

I should add, Nvidia is well aware of this too. But it seems its focus is still more on building the neural network that will serve as the platform for AI, through things I've written about before, like GR00T for Thor, and its push into real-world applications like robotics and autonomous systems.

AMD's playbook, though, is now more obvious. Let Nvidia keep the crown for training the monsters and building the networks. AMD will be the chipmaker for when those monsters need to really think.

AMD isn't catching up; they've always been thereabouts with their technology. We know it's great, we know it works, we know they've got decent AI chops, but Nvidia has stolen the bulk of the limelight.

But AMD’s timing here might be perfect. The first wave of the AI boom was about showcasing what’s possible. The second wave, the one we’re entering now, is about delivering that promise. And inference is the gateway to it all.

Su expects the demand for inference to grow 80% annually. So there’s plenty of market, capital and growth for Nvidia and AMD to make a truck tonne of cash.
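As a back-of-the-envelope sketch (not a forecast), here's what 80% a year compounds to:

# Back-of-the-envelope: 80% year-on-year growth compounds quickly.
growth = 1.80
for years in range(1, 6):
    print(f"After {years} year(s): {growth ** years:.1f}x today's inference demand")
# After 5 years that's roughly 18.9x today's demand -- which is why both AMD and
# Nvidia can keep growing without having to take the whole market from each other.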

Yes, there's an element of who has the one chip to rule them all, but it will also be about who delivers the best performance per watt, per dollar and per workload.
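As a rough illustration of what those metrics actually measure, with completely made-up numbers rather than real specs for any AMD or Nvidia part:

# Hypothetical comparison -- the figures below are invented purely to show the metrics.
chips = {
    "Chip A": {"tokens_per_sec": 12000, "watts": 700, "price_usd": 30000},
    "Chip B": {"tokens_per_sec": 10000, "watts": 500, "price_usd": 20000},
}
for name, spec in chips.items():
    per_watt = spec["tokens_per_sec"] / spec["watts"]
    per_dollar = spec["tokens_per_sec"] / spec["price_usd"]
    print(f"{name}: {per_watt:.1f} tokens/s per watt, {per_dollar:.2f} tokens/s per dollar")
# In this made-up example the slower chip wins on both efficiency metrics --
# the raw fastest chip isn't automatically the one that wins an inference fleet.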

That’s AMD’s playground, and where their fortune in inference could come from.

It’s no coincidence AMD shares are quietly up double-digits this week while everyone’s watching Nvidia. The market’s waking up to the fact that while AMD might have missed the first leg of the AI trade, they’re gearing up for the one that sticks, the one that embeds AI into everything.

AMD’s moment might finally be here. And the trillion-dollar club should probably make some room.

#AD

Trump’s Executive Order 14179 To Ignite AI Boom Phase 2? 

Just signed by Donald Trump, Executive Order 14179 could trigger a tidal wave of investment into “next-gen AI” – marking the start of what experts are calling Phase 2 of the AI boom. 

James Altucher – a former hedge fund manager and AI engineer who worked on IBM’s Deep Blue –  has just revealed his top 3 strategies to capitalise on this breakout moment.

Capital at risk.

Boomers & Busters 💰

AI and AI-related stocks moving and shaking up the markets this week. (All performance data below over the rolling week).


Boom 📈

  • Vertiv (NYSE:VRT) up 7%
  • BigBear.ai (NYSE:BBAI) up 7%
  • Micron (NASDAQ:MU) up 5%

Bust 📉

  • iRobot (NASDAQ:IRBT) down 17%
  • Xpeng (NYSE:XPEV) down 9%
  • C3.ai (NYSE:AI) down 6%

From the hive mind 🧠

  • Nothing instils confidence in a workforce like knowing someone else is coming for their job. Especially when that “someone” doesn’t sleep, eat, stop or get tired.
  • I think this will become common: AI as just a thing that’s always there, in every app, that you can’t get rid of. Some will be good… some will be the modern version of Clippy.
  • The AI and energy story is real and very important, but let’s not get too carried away with it. It will be because of AI that we discover the best ways to power and use AI. Things like this, I think, end up as more of a hindrance than a help.


Weirdest AI image of the day

The new Amazon Glorm

ChatGPT’s random quote of the day

“Part of what makes programming difficult is most of the time we’re not doing it — we’re debugging.”
— Bret Victor

Thanks for reading, and don’t forget to leave comments and questions below,

Sam Volkering

Editor-in-Chief
AI Collision