Newsflash: AMD Says Buy NVDA

Davis Wilson

Posted October 22, 2024

Here are a few quotes from a conversation I had this weekend:

“Nvidia is too expensive.”

“Nvidia is already up 190% this year… No way it can go any higher.”

“I’m going to buy AMD instead. They’re the next Nvidia.”

I’ve heard each of them plenty of times. Maybe you have as well.

My eyes roll back in my head each time I hear them.

That’s because it’s clear that AMD and Nvidia are in different leagues.

AMD’s “Advancing AI 2024” event pounded the final nail in the coffin – and the two companies’ stock prices are showing it.

AMD Launches the “Nvidia Killer” MI325X

AMD recently launched the Instinct MI325X, a chip many analysts deem a direct threat to Nvidia’s data center graphics processing units (GPUs).

Nvidia currently controls 90% of this market.

AMD’s CEO Lisa Su described the MI325X as having 1.8x more capacity and 1.3x more bandwidth than Nvidia’s H200, while also having 1.3x greater peak theoretical compute in certain situations.

All great specs.

The MI325X should start shipping at the end of the current quarter and will be in the field early next year, according to Su.

AMD’s stock price closed down 4% the day of this announcement, while NVDA closed 2.5% higher.

Since the announcement – a little over a week ago – AMD’s stock has fallen another 6% while Nvidia is up 7%.

The problem is that while AMD’s new chip may compete favorably with Nvidia’s H200, that chip was released in early 2024.

That’s an entire year before AMD’s MI325X will be in the field.

By that time, Nvidia’s Blackwell – the most advanced AI chip ever created – will be the new standard in GPUs.

[Note: AMD decided conveniently not to compare their MI325X to Blackwell.]

The real news from AMD’s event happened during the keynote speech by Su, where she said the total addressable market for data center AI acceleration “will grow at more than 60% annually to $500 billion in 2028,” driven by AI demand that has been growing beyond expectations.

That’s an 11x increase from $45 billion in 2023, and higher than Su’s previous estimate of $400 billion in 2027.
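Su’s numbers hold up to a quick back-of-the-envelope check. A minimal sketch, assuming exactly 60% compounded annually over the five years from 2023 to 2028 (she said “more than 60%,” so the real figure lands a bit above this):

```python
# Sanity check on Su's projection: ~60% annual growth from $45B (2023) to 2028.
base_2023 = 45e9          # 2023 market size in dollars, per the keynote
growth = 0.60             # "more than 60% annually" -- using exactly 60% here
years = 2028 - 2023       # five years of compounding

projected_2028 = base_2023 * (1 + growth) ** years
multiple = projected_2028 / base_2023

print(f"Projected 2028 market: ${projected_2028 / 1e9:.0f}B")  # ~$472B
print(f"Multiple vs. 2023: {multiple:.1f}x")                   # ~10.5x
```

At a flat 60%, the market compounds to roughly $472 billion – so growth only slightly above 60% gets you to the $500 billion (about 11x) figure Su cited.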

This is good news for AMD…

But this is even better news for Nvidia, the de facto leader with 90% market share in the space.

Remember last week when I said:

“If you’re looking to buy Nvidia, don’t just research Nvidia.

Research the whole industry.

  • Check its competitors like AMD and Qualcomm.
  • See what its supplier TSMC is saying.
  • Pay close attention to its customers like Meta Platforms, Microsoft, and Alphabet.”

AMD’s “Advancing AI 2024” event is a prime example of this.

AMD announced a chip that’s one full year behind Nvidia’s comparable model and predicted the market that Nvidia dominates will grow to $500 billion in 2028.

It’s no surprise, then, that AMD’s stock price has fallen this past week while Nvidia continues to trade higher.

AMD should have just named the event “Advancing NVDA 2024.”

Yet some people still want to buy AMD over NVDA…

*eye roll*
