Brian Christian Transcript

Brian Christian is the author of The Most Human Human, which was named a Wall Street Journal bestseller, a New York Times Editors’ Choice, and a New Yorker favorite book of the year. He is also the author, with Tom Griffiths, of Algorithms to Live By, a #1 Audible bestseller, an Amazon best science book of the year, and an MIT Technology Review best book of the year. His newest book is The Alignment Problem: Machine Learning and Human Values.

AI has been a hot topic of discussion among business leaders for decades, and there are varying degrees of worry about it. Today Brian shares his view on AI and machine learning and whether we should be worried. He also explains why everyone should learn more about AI, even those who aren’t in a technical role.

In this episode of the podcast we explore:

  • The history of AI and machine learning
  • How questions from Elon Musk pushed Brian to write his book, The Alignment Problem
  • What supervised learning and reinforcement learning are in the context of AI
  • Potential problems we should look out for when it comes to AI
  • What an algorithm is and what goes into creating one
  • Advice for people who want to be more aware of this realm

Listen to the episode on Apple Podcasts, Spotify, Google Podcasts, or your favorite podcast platform.

If you want more content like this, you can subscribe to my YouTube channel.

If you enjoyed the article and want more content like this, here’s what you can do:

  1. Subscribe to The Future of Work Podcast where I interview business leaders around the world each week.
  2. Grab a copy of The Future Leader which has been endorsed by the CEOs of MasterCard, Best Buy, Oracle, Audi, Unilever, Domino’s Pizza, Ritz Carlton, Kaiser, and Marshall Goldsmith. It explores the most essential skills and mindsets for future leaders.
  3. If you are, or want to be, an entrepreneur, then check out the brand new podcast my wife and I just launched on how to Be Your Own Boss, called the BYOB Podcast, where we share what we did and how we did it. You can subscribe to it here.