How does Super Micro (SMCI) help us to pick AI stock winners?

Welcome to AI Collision 💥,

[Image: two comical robots, Groq and Grok, facing off across a crowded chessboard while smaller robots look on]

In today’s collision between AI and our world:

  • Up, down, nobody knows…

  • Groq vs Grok

  • Which AI have you used?

If that’s enough to get the Groqs (or Groks) fighting, read on…

AI Collision 💥: how do you tell if an AI stock is worth picking or not?

My morning routine usually starts somewhere between 5.30am and 6.00am.

And no, this isn’t one of those “life hack” posts where I tell you I’ve done six workouts and eaten 2kg of steak before 7am.

My day starts that early because at least one of my kids is standing beside my bed, like a silent serial killer, waiting for one eye to open so they can go and play.

With no other choice, I’m up, bleary-eyed, them smashing some milk and a banana and me checking in to see what’s been going down in the markets overnight.

Hence, my usual first port of call is a data aggregation site like Yahoo Finance, where I can catch up on closing prices and post-market trading in the US and what’s been moving in Australia and Asia, all while mainlining coffee right into my system.

As I did that yesterday morning, I checked in on a few “darling” stocks of the market.

One of those being Super Micro Computer Inc. (NASDAQ:SMCI).

When I checked in, it wasn’t the price action that caught my attention right away; it was some of the news feeds about SMCI that took my half-awake state into the realms of a dreamland.

Here’s the first thing I saw…

$1,300 per share would be 71% higher.

Then I saw this…

60% higher would be $1,212.

Then this…

And this came off the back of an intraday high of $1,077 on Friday 16 February falling to an intraday low of $692 on Tuesday 20 February. That’s a swing lower of roughly 36%.
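If you want to sanity-check that figure, here’s a quick back-of-the-envelope sketch (a minimal calculation using only the two intraday prices quoted above; the variable names are just for illustration):

```python
# Back-of-the-envelope check on the SMCI swing quoted above.
intraday_high = 1077  # intraday high, Friday 16 February ($)
intraday_low = 692    # intraday low, Tuesday 20 February ($)

swing = (intraday_high - intraday_low) / intraday_high
print(f"Swing lower: {swing:.1%}")  # -> Swing lower: 35.7%
```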

So, what is it? What does SMCI do next?

Well, the short answer is, short term, nobody bloody knows!

Will it retest over $1,000 in the next week, or will it sail 50% lower towards prices it was trading at only a month ago?

Every trader, broker and analyst will have a take on it. So how do you know who to listen to, who to trust…?

That includes me by the way!

I think the approach to take here is the one that has proven most effective for me over time. That is to simply form a distinctive world view about what our world looks like in a decade, or two decades, and work backwards from there.

If you think that technology like AI will only get better, that we’ll use more computationally intensive hardware and that we will be more connected, faster, with greater access to AI, then you just need to ask: what are the smart plays that build that world?

And then ask: is SMCI the kind of company that fits into that thesis?

But if you think AI is all hyped up, that an unforeseen competitor will come along and scoop up the market, or that the technology fundamentally isn’t going to have a big impact on our world, then you’ll find your answers there pretty quickly too.

I believe this is how you at least start figuring out which AI stocks are worth tipping capital into. Once you’ve built a thesis, or got some expert information and guidance to build your view (like the exceptionally great content here at AI Collision 💥), then you look at those companies and see if they actually make money from the tech they’ve got going.

This is a tried and tested blueprint for picking AI stocks. Some call it qualitative, top-down investing. But for me, I think it’s just a reverse jigsaw. When you know what the end puzzle looks like, you start to figure out which pieces make it so.

It’s not always right, but you don’t need to be right all the time. And when you see the pace and explosion of the AI companies involved, like SMCI, then I think it’s a puzzle worth figuring out.

AI gone wild 🤪

First, we had central processing units (CPUs).

Then we had graphics processing units (GPUs).

Now weā€™ve got language processing units (LPUs).

Well, we don’t “have” LPUs per se, but we might. And that’s enough to have sent the AI world into a bit of a tizz these last few days.

Have you ever heard of Groq?

No, that’s not a spelling mistake. We’re not talking about Grok, Elon Musk’s AI that he’s rolling out through X.com – the Grok that we wrote about in November last year.

I mean Groq, the same Groq that apparently existed well before Elon Musk’s ever did.

Here’s what Groq said when Elon made a song and dance about Grok:

Source: Groq

Now that we’ve got the whole “whose Groq (or Grok) came first” debate out of the way – clearly it was Groq – the question remains: why is this important today?

Well, it comes back to the CPU, GPU and now LPU evolution.

Groq is claiming that it’s developed an “AI chip” that’s dramatically faster than anything out there in the market right now. More specifically, faster than the AI chips that Nvidia is making and that companies like Microsoft are using.

As Groq explains:

An LPU Inference Engine, with LPU standing for Language Processing Unit™, is a new type of end-to-end processing unit system that provides the fastest inference for computationally intensive applications with a sequential component to them, such as AI language applications (LLMs).
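To make that “sequential component” point concrete: a language model produces its answer one token at a time, with each step depending on everything generated so far, so per-token latency stacks up. Here’s a toy loop purely for illustration (fake_next_token is made up; it isn’t Groq’s or anyone else’s API):

```python
# Toy illustration of why LLM inference is sequential: each new token depends
# on all the tokens before it, so the generation loop can't be parallelised away.
def fake_next_token(context: list[str]) -> str:
    # Stand-in for a real model call; this is the step faster hardware speeds up.
    return f"token{len(context)}"

prompt = ["Who", "invented", "the", "TPU", "?"]
output: list[str] = []
for _ in range(5):                       # generate five tokens, one at a time
    output.append(fake_next_token(prompt + output))
print(" ".join(output))                  # -> token5 token6 token7 token8 token9
```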

In practice, there are claims that computation on LPUs runs as much as 13 times faster.

A recent Gizmodo article put the speed comparison to the test, finding:

Groq produces 247 tokens/second compared to Microsoft’s 18 tokens/second, according to a third-party test from Artificial Analysis published last week. That means ChatGPT could run more than 13x as fast if it was running on Groq’s chips.

I also decided to do my own real-world test between Groq and OpenAI’s ChatGPT.

With ChatGPT (using GPT-4) I asked the question, “Who invented the TPU?”

26 seconds later, I got the following answer.

The Tensor Processing Unit (TPU) was invented by Google. TPUs were designed specifically for accelerating machine learning tasks. They are custom-designed chips optimized to run TensorFlow, Google’s open-source machine learning framework, and are intended to deliver higher performance and efficiency than conventional hardware such as CPUs and GPUs for certain types of computations common in deep learning algorithms. Google announced TPUs in 2016 as part of their efforts to accelerate machine learning workloads and improve the performance of their artificial intelligence services.

I then went on to Groq, using the Llama 2 70B-4K model, and asked the same question, “Who invented the TPU?”

1.46 seconds later, I got the following answer:

The Tensor Processing Unit (TPU) was invented by Google. The first TPU was developed in 2013 by the Google Brain team, led by Andrew Ng, a prominent researcher and entrepreneur in the field of artificial intelligence. The team recognized the need for specialized hardware to accelerate the training of deep neural networks, which were becoming increasingly complex and computationally intensive.

Google developed the TPU specifically to address the challenges of training deep neural networks, which were taking weeks or even months to train on traditional CPUs and GPUs. The company designed the TPU to optimize the performance of deep neural network computations, and the first TPU was deployed in Google’s data centres in 2015.

Since then, TPUs have become widely adopted in various industries, and Google has continued to develop and improve the technology. In 2018, Google announced the Cloud TPU, which allowed customers to rent TPU resources in the cloud, making it easier for organizations to access the technology without having to invest in their own hardware.

In summary, the Tensor Processing Unit (TPU) was invented by the Google Brain team, led by Andrew Ng, to address the need for specialized hardware to accelerate the training of deep neural networks.

Groq gives a more in-depth answer whilst also doing it 17.8 times faster.
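If you’re wondering where those multiples come from, here’s a minimal sketch using only the numbers quoted above (the two token rates from the Artificial Analysis test and my two response times); the variable names are just for illustration:

```python
# Where the "more than 13x" and "17.8 times" figures come from.
groq_tokens_per_sec = 247      # Groq, per the Artificial Analysis test
microsoft_tokens_per_sec = 18  # Microsoft-hosted ChatGPT, same test

chatgpt_seconds = 26.0         # my GPT-4 response time
groq_seconds = 1.46            # my Groq (Llama 2 70B) response time

print(f"Throughput: {groq_tokens_per_sec / microsoft_tokens_per_sec:.1f}x faster")  # ~13.7x
print(f"Response time: {chatgpt_seconds / groq_seconds:.1f}x faster")               # ~17.8x
```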

That’s quite something.

Now, for me, there’s still a big question that remains.

These new LPUs: are they the real deal? Are we talking about a true game-changing piece of hardware here, or is this all hype, and the reality is that it’s the models, not the hardware, making the difference?

And amidst all of it, what’s the investible angle?

Right now, I’m not sure. I’m trying to figure that out. Ideally, if Groq were investible then it’d be something worth considering, a big chunk of asymmetric risk on the table…

But it’s not. So, what impact does this have short term? My initial feeling is that it’s another platform from which the wider industry leaps higher.

That it’s further confirmation that 2024 is going to be the year of AI hardware. That big tech, mid-tech and little-tech are all scrambling to develop, launch and get game-changing AI hardware to market.

The demand for all this is not only strong but growing at an insatiable pace. I suspect we’ll be hearing a lot more about Groq (with a Q, remember) in the near future as the idea of LPUs starts to grab hold of the market.

Ed note: by the way, I mentioned on Tuesday that my new AI briefing was coming soon. It’s actually going live next Tuesday. If you want to check it out, you’ll need to pre-register for it.

Again, to see what it’s all about, just hit the button below and it’ll take you where you need to go to find out more.

Sam’s AI Alpha Event Registration

Boomers & Busters 💰

AI and AI-related stocks moving and shaking up the markets this week. (All performance data below over the rolling week).


Boom 📈

  • Appen (ASX:APX) up 35%

  • Brainchip (ASX:BRN) up 29%

  • Cyngn (NASDAQ:CYN) up 18%

Bust 📉

  • Vicarious Surgical (NASDAQ:RBOT) down 12%

  • C3.ai (NASDAQ:AI) down 5%

  • Nvidia (NASDAQ:NVDA) down 3%

From the hive mind 🧠

  • There are a lot of elections happening around the world in 2024. And it’s shaping up that AI is going to be the hottest story about how elections are run; how voters are influenced; and, when losers lose, how these elections were “corrupted”. AI and elections are something we’re going to hear a lot more about in 2024.

  • While we’re on the topic of AI and deepfakery, another “godfather” of AI has published an open letter about the need for greater regulation of deepfakes. While AI and elections will be a hot topic, so will the regulation of AI.

  • Often at home when I’m looking for something (that happens to be right in front of me) and can’t find it, my wife refers to it as me doing a “boy look”. I rebut that: it’s just how I’m wired. And I’m now hoping that AI is going to prove my point exactly.

Artificial Polltelligence 🗳️

This week’s poll is about using AI models. Mainly: have you used them and, if so, which ones…

If your answer happens to be “other”, drop us a comment to tell us which one(s) so we can dive in ourselves to find out more.

Weirdest AI image of the day

Communist countries’ propaganda posters starring famous fictional characters – r/weirddalle


ChatGPT’s random quote of the day


“Technology is best when it brings people together, but it is at its peak when it makes us forget it’s even there, seamlessly enhancing our lives without our conscious recognition.” – Tatsuo Horiuchi


Thanks for reading, see you on Tuesday. And if you’re enjoying our work, please like, share and leave comments below,

Sam Volkering

Editor-in-Chief
AI Collision
Although Southbank Investment Research Ltd, the publisher of AI Collision is regulated by the Financial Conduct Authority, the editorial content in AI Collision is not regulated by the Financial Conduct Authority. The editorial content is for general information only; it gives no advice on investments and is not intended to be relied upon by individual readers in making (or not making) specific investment decisions. Your capital is at risk when you invest. Any investment decisions should be considered in relation to your own circumstances, risk tolerance and investment objectives.
Occasionally we may tell you about other information services published by Southbank Investment Research Limited which do contain content which is regulated by the FCA. When viewing this regulated content, you should review the risk warnings accompanying it. 
You can unsubscribe from AI Collision at any time by clicking the link below.
ISSN 2977-0882
© 2024 Southbank Investment Research Ltd. Registered in England and Wales No 9539630. VAT No GB629 7287 94. Registered Office: 2nd Floor, Crowne House, 56-58 Southwark Street, London, SE1 1UN. Authorised and regulated by the Financial Conduct Authority. FCA No 706697.
https://register.fca.org.uk