
Special Edition: Interview with Aligned AI

Welcome to AI Collision.

Today is another special edition, just like last Thursday’s AI Collision.

Today I’m speaking with Rebecca Gorman, CEO of Aligned AI, which you can read a little more about below.

You’ll also find our boomers and busters for the week, results from our poll, a new poll, and an AI image (plus more from the link) that had me in stitches!

If that’s enough to get your robot waiting tables, read on…

Note: these won’t be regular, weekly editions. But we had a couple of great interviews with Idan and now Rebecca that I wanted to get out to you right away.

PS if you’re in an AI-related field or know someone in an AI-related field, reach out and let me know. I’d love to speak on camera about what you (or they) do and the AI industry. Just hit me up in the comments.

AI Collision 💥 in conversation with: Aligned AI

When is AI safe? And is safer AI more or less humanlike?

There’s a lot of fear about AI. What it can become – if it can feel and behave like a human. The dystopians out there believe that AI may reach a “singularity” whereby it exceeds all human capacity, takes on a mind of its own, realises humans aren’t needed and goes on to terminate us all.

This is the existential threat many perceive will become reality with out-of-control AI.

On the other hand, what if AI that is more humanlike is smarter, safer, augments our capabilities, makes us better, lifts living standards, reduces mortality and makes the world a better place – a more profitable place?

Not many dare to put in the work to make safer AI, but that’s exactly what Aligned AI is doing. Safer AI is better for us all, and Aligned AI’s CEO, Rebecca Gorman, jumped onto Zoom with me recently to discuss AI, Aligned AI’s work and what the future could be like with safer AI in it.


Boomers & Busters 💰

AI and AI-related stocks moving and shaking up the markets this week. (All performance data below covers the rolling week.)


Boom 📈

  • Symbotic (NASDAQ:SYM) up 18%

  • Amesite (NASDAQ:AMST) up 11%

  • WiMi Hologram Cloud (NASDAQ:WIMI) up 9%

Bust 📉

  • Tesla (NASDAQ:TSLA) down 2%

  • Darktrace (LSE:DARK) down 8%

  • BigBear.ai Holdings (NYSE:BBAI) down 13%

From the hive mind 🧠

Artificial Pollteligence 🗳️ : The Results Show

Due to last Thursday’s poll not working properly, we set it again on Tuesday and the results are in…

The question was: would you be comfortable with a humanoid AI robot around your kids?

The options were:

  • Yes

  • No

  • Don’t have kids

  • Have robot kids, find this offensive

And the winner is…

A close one…but it’s NO!

I was expecting the “No’s” to win this one. It was close(ish) with a 24% (Yes) to 29% (No) split.

Actually, “Don’t have kids” technically won – but I’m just going to gloss over that.

I’m not surprised by the result, because the idea of a Boston Dynamics ATLAS with its immense strength towering over our precious little ones is a little bit scary.

Boston Dynamics ATLAS robot in action over an obstacle course

This is exactly why my conversation with Rebecca Gorman is so relevant. Yes, humanoid robots might still be a little weird, but they’re coming. And their safety and trustworthiness are paramount.

Still, 24% of the vote was “Yes”, so there are enough open-minded people willing to accept our robotic friends when they enter society en masse. That shows there will be a market for them.

Thanks for voting!

Now, to get our polling back on its regular schedule (poll on Thursday, results on Tuesday), I’m chucking another in here to see what you think about one of the things discussed in my interview with Rebecca Gorman from Aligned AI.

Thanks again for voting…results on Tuesday!

Weirdest AI image of the day

Assorted data visualisations from DALL·E.

P.S. hit that link above for loads more that are brilliant!

ChatGPT’s random quote of the day


“AI is the single most powerful technology force of our time. It is the automation of automation, where software writes software.”

– Jensen Huang, CEO of NVIDIA



Thanks for reading, see you on Tuesday. And if you’re enjoying our work, please like, share and leave comments below,

Sam Volkering

Editor-in-Chief
AI Collision
Although Southbank Investment Research Ltd, the publisher of AI Collision is regulated by the Financial Conduct Authority, the editorial content in AI Collision is not regulated by the Financial Conduct Authority. The editorial content is for general information only; it gives no advice on investments and is not intended to be relied upon by individual readers in making (or not making) specific investment decisions. Your capital is at risk when you invest. Any investment decisions should be considered in relation to your own circumstances, risk tolerance and investment objectives.
Occasionally we may tell you about other information services published by Southbank Investment Research Limited which do contain content which is regulated by the FCA. When viewing this regulated content, you should review the risk warnings accompanying it. 
You can unsubscribe from AI Collision at any time by clicking the link below.
ISSN 2977-0882
© 2023 Southbank Investment Research Ltd. Registered in England and Wales No 9539630. VAT No GB629 7287 94. Registered Office: 2nd Floor, Crowne House, 56-58 Southwark Street, London, SE1 1UN. Authorised and regulated by the Financial Conduct Authority. FCA No 706697. 
https://register.fca.org.uk
2 Comments
Sam Volkering

Looks like our polling isn’t working in emails again for some reason. Substack polls look glitchy at best. Seems to work on the web at least, so if you get a chance vote here!

Lesley

Loving this Sam, thanks. Great conversation. (The poll is still not loading in my browser (Brave).) Looking forward to your next videos.
