
What does Napster tell us about the future of AI?

Welcome to AI Collision 💥,

[Image: an anthropomorphic Napster logo smashing a pile of CDs and CD players with a gigantic sledgehammer]

In today's collision between AI and our world:

  • Napster, iTunes, Spotify, ChatGPT

  • How often do you ChatGPT?

  • Mad Max, mad coppers

If that's enough to get the CDs smashing, read on…

AI Collision 💥

The New York Daily News, Chicago Tribune, Denver Post, Mercury News, Orange County Register and St. Paul Pioneer Press, along with Tribune Publishing's Orlando Sentinel and South Florida Sun Sentinel, are all in the process of suing OpenAI and Microsoft.

The crux of the case is the allegation that OpenAI and Microsoft have infringed copyright by using copyright-protected articles to train their AI models.

We covered this particular story at the start of the month, saying,

So, let's say Microsoft did let its AI access and read and learn from news articles. And that learning (amongst other avenues I'm sure) led to the AI becoming smarter, and then to helping Microsoft monetise that AI to provide services to other businesses.

What's the difference between that and my kid reading the paper, learning from these articles (amongst other avenues I'm sure) and then becoming smarter and deciding to create their own digital media company, which then grows and sells services and products to people and becomes a multi-billion-dollar success?

It feels like legacy media is trying to put a copyright claim on learning.

Well, it looks like this is a battle that's going to rage on for some time.

More recently, a report in Press Gazette explained,

Financial Times chief executive John Ridding has told other news publishers they "have leverage and should insist on payment" from AI companies.

Ridding went on to say,

The payment matters, for principle and for revenue of course, but also important is the opportunity to extend our reach and to understand how users will interact with AI.

As with the digital and mobile revolutions, pulling up the drawbridge or trying to hold back the tide is not going to be a strategy for success.

I don't know about you, but I remember the days when Napster launched and completely upended the music industry.

What's happening with AI and legacy media feels remarkably similar, particularly when you hear media moguls like Ridding saying they should get payment from AI companies.

If you don't know the Napster story, it's worth getting the context, because the way that played out is very likely how this news battle with AI is going to play out too…

Napster was launched in June 1999 by Shawn Fanning and Sean Parker. The core premise was the democratisation of music distribution.

They did this via peer-to-peer (P2P) file sharing. The service allowed users to share and download MP3 files from each other, circumventing traditional music distribution methods.

Previously, if you wanted music, you'd have bought a CD, maybe ripped it to your hard drive, or paid for the MP3 file.

This way, you could freely share your music with anyone, anywhere.

Napster enabled millions of people worldwide to obtain songs for free, which upended the established music market.

The reason Napster blew up so quickly was it was easy, intuitive and free. However, it quickly became a target of the music industry, which saw it as a direct threat to its revenue model.

And it was.

Record labels and artists were concerned about the loss of income due to unauthorised sharing – in short, breaches of copyright.

High-profile lawsuits soon followed, most notably from the Recording Industry Association of America (RIAA) and the world-famous metal band, Metallica.

The crux of those lawsuits was exactly the same as the ones legacy media are aiming at AI companies today: the collision of technology and innovation with intellectual property and copyright.

The courts eventually ruled against Napster in 2001, forcing it to shut down its free service.

The legacy industry won the battle, but technology won the war.

The rise of Napster proved that the legacy models were broken and ripe for disruption and innovation.

Napster forced the hand of big tech like Apple, which launched the iTunes Store in 2003. iTunes made it easy and intuitive for consumers to legally purchase and download music.

Then of course we ended up with Spotify in 2008, which took the way we consume music to another level again.

However, without Napster directly challenging the way we consumed music, there's an argument that iTunes or Spotify might not exist.

If the legacy model worked, there would be no need for disruption and a better way.

This is the exact situation legacy media is facing. The traditional model is no longer as relevant as it once was: revenue and business models are under direct threat, the way we consume and absorb media is changing, and AI is fast accelerating that change.

I expect that parts of legacy media may win their cases against AI companies. The legal system, remember, is deeply rooted in archaic laws, and it isn't particularly flexible when it comes to new innovation and technology.

But what will come from all this are new opportunities, and new ways of consuming and absorbing our news and media that we might not even consider possible now.

It's these kinds of awkward, disruptive collisions that enable something even better to emerge from the rubble. And that's why we can look to the past, like Napster, to learn what might come in the future with AI.

AI gone wild 🤪

What do people actually do with AI?

It's nearly impossible not to know about it. I'd argue that everyone has heard of it, seen it in the paper or on the news, or had an advert of some kind pop up in a social media feed.

However…

Who actually uses AI? Well, we might have a bit of an idea, and the answer is: not many.

Note: before I dive into this a bit more, be sure to vote in this week's poll below. We're running the same primary question about how often you use AI, so we can compare your answers with the data I'm about to explain – make sure to vote below!

Now, according to data compiled by Dr Richard Fletcher and Prof. Rasmus Kleis Nielsen and published recently by the Reuters Institute for the Study of Journalism at the University of Oxford, not many people use generative AI all that frequently.

Their data is drawn from a survey of around 2,000 people in each of six countries: the US, UK, Argentina, France, Japan and Denmark.

You can view the full report and analysis here, but to summarise: only around 7% of people in the US use ChatGPT daily. In the UK, it's 2%.

For Google's Gemini, it's 3% in the US and… 0% in the UK!

And for Copilot, itā€™s 2% in the US and 1% in the UK.

I will admit, those results kind of blew me away. The numbers do start to creep higher when weekly use is considered, and by the time the survey asks about monthly use, ChatGPT is closer to 20% in total, but Gemini and Copilot massively lag behind, barely crossing the 10% mark.

That means the bulk of people still haven't used it, don't really know what it is, or have never even tried it.

What's interesting to me is the Nanna test. What is the Nanna test, you might ask? Well, it's a theory I've had that harks back to the rise of cryptocurrencies. My theory is that when your Nanna starts using something, you know it's gone full-blown mainstream.

So, go ask your Nanna if she's using ChatGPT, Gemini or Copilot yet. My guess is: not likely. But I bet she has a smartphone now.

Maybe then she'll use AI and never even know it. And perhaps that's the thing about AI: it won't be a case of knowing whether you're using it on a daily basis.

No one thinks, "I'm going to use Google Search today." People just "Google" stuff.

Maybe that's where AI ends up: it stops being about ChatGPT and just becomes "I'll ask AI", and it won't matter which tool we use.

Boomers & Busters 💰

AI and AI-related stocks moving and shaking up the markets this week. (All performance data below is over the rolling week.)


Boom 📈

  • Predictive Oncology (NASDAQ:POAI) up 27%

  • Nvidia (NASDAQ:NVDA) up 19%

  • AMD (NASDAQ:AMD) up 5%

Bust 📉

  • C3.ai (NYSE:AI) down 8%

  • Datametrex (TSXV:DM) down 25%

  • Quantgate Systems (OTCPK:QGSI) down 24%

From the hive mind 🧠

Artificial Polltelligence 🗳️

Today's poll is directly related to the AI Gone Wild section above. It's wild (to me at least) how little people actually use AI.

But when you live and breathe this market day in, day out, it's easy to forget how nascent a stage it's all really at.

Now, we have a bit of an idea from the research report noted above as to how few people use AI, and today's poll supplements that data with your feedback.

Granted, as a captive AI audience, it's likely the results will skew a little higher. But I do want you to be as honest as you can here.

So…

We'll take a look at the results and compare them with the earlier data next week.

Weirdest AI image of the day

The State Troopers are always angry when you take your Mad Max rig on the highway – r/weirddalle


ChatGPT's random quote of the day


“Personal computers are the most empowering tool we’ve ever created. They’re tools of communication, they’re tools of creativity, and they can be shaped by their user.”

ā€” Bill Gates


Thanks for reading, and don't forget to leave comments and questions below,

Sam Volkering

Editor-in-Chief
AI Collision
Although Southbank Investment Research Ltd, the publisher of AI Collision, is regulated by the Financial Conduct Authority, the editorial content in AI Collision is not regulated by the Financial Conduct Authority. The editorial content is for general information only; it gives no advice on investments and is not intended to be relied upon by individual readers in making (or not making) specific investment decisions. Your capital is at risk when you invest. Any investment decisions should be considered in relation to your own circumstances, risk tolerance and investment objectives.
Occasionally we may tell you about other information services published by Southbank Investment Research Limited which do contain content which is regulated by the FCA. When viewing this regulated content, you should review the risk warnings accompanying it. 
ISSN 2977-0882
Ā© 2024 Southbank Investment Research Ltd. Registered in England and Wales No 9539630. VAT No GB629 7287 94. Registered Office: Basement, 95 Southwark Street, London SE1 0HX. Authorised and regulated by the Financial Conduct Authority. FCA No 706697. 
https://register.fca.org.uk
2 Comments
Jim Hollands

Sam, regarding your survey, I have answered "never", but my go-to search is Perplexity rather than Google. So perhaps I do use AI every day. Enjoying AI Collision.
Regards, Jim Hollands

David

Of all places, I use the AI built into eBay when it comes to listing a product for sale.

So, having uploaded some nice photos and completed the product description boxes (colour, size, etc.), it becomes time to describe the item for sale in the main body of the advert, and there's an option to use AI to compose the description. I now click the button to see what comes up, far quicker than having to type it myself, and then just modify a few phrases as I see fit, add a personal comment or some extra detail, or maybe delete an over-exuberant sentence or two.

Certainly saves time! And yes, my stuff got sold too.

I find it amusing and helpful at the same time.
