Think Fast! AI and Truth Collapse

Omnichannel Podcast Episode 40


When both simple and sophisticated AIs become universal, how will we tell real from synthetic? And if we can’t, what do we do as individuals and societies to maintain coherence in our institutions, cultures, and our own sense of internal stability?

Today we’re starting a proactive discussion about Truth Collapse: the AI-accelerated meta-crisis that could make or break the world.

Official website for Truth Collapse:

“Culture is everything; it loads us with so much of us. And we’re unleashing these agents, these artificial beings, who are going to participate in our culture.” – Noz Urbina

Show Notes
00:00 – 00:17 Introduction to AI and human vulnerabilities
00:17 – 03:26 The evolution of AI awareness
03:26 – 07:41 Understanding the Skinner box and digital addiction
07:41 – 12:29 Behavioural economics and predictable irrationality
12:29 – 17:04 The impact of AI on culture and society
17:04 – 21:59 AI’s role in modern business models
21:59 – 25:13 The uncanny valley and generative AI
25:13 – 30:53 AI as active participants in culture
30:53 – 31:50 The beetle and the beer bottle: unexpected consequences
31:50 – 32:35 AI and human vulnerability: the slot machine analogy
32:35 – 33:07 Scott Galloway’s warning: AI and loneliness
33:07 – 34:37 A conversation with AI: the OpenAI interview
34:37 – 39:51 The Scarlett Johansson controversy
39:51 – 43:52 The threat of AI: truth collapse
43:52 – 55:01 Solutions and hope: reclaiming control
55:01 – End Call to action: join the conversation

Resources mentioned

  • Daniel Kahneman – Mentioned for his book “Thinking, Fast and Slow,” which explores the dual systems of fast and slow thinking and their implications for decision-making and behaviour. This concept is used to understand how AI can exploit these cognitive processes. Learn more about Daniel Kahneman and his works: Daniel Kahneman’s Official Website
  • Nir Eyal – Cited for his book “Hooked: How to Build Habit-Forming Products,” which discusses the creation of addictive technologies. This is related to how AI can manipulate user behavior through design. Discover Nir Eyal’s insights: Nir Eyal’s Official Website
  • B.F. Skinner – Referenced for the Skinner box, an apparatus from behavioural psychology used to study conditioning. It’s discussed in the context of how AI and tech companies apply these conditioning principles to create addictive user experiences. More on the Skinner box: The Science Behind Those Apps You Can’t Stop Using
  • Yuval Noah Harari – Mentioned for his discussions on hackable humans and the future impacts of AI. His works often delve into how technology will shape human behaviour and society. Learn more about Yuval Noah Harari and his works: Yuval Noah Harari’s Official Website
  • Dan Ariely – Known for his book “Predictably Irrational,” which explores the irrational ways humans behave and make decisions. His theories are used to explain how AI can exploit human cognitive biases. Discover Dan Ariely’s insights: Dan Ariely’s Official Website
  • Richard Thaler – Recognized for his contributions to behavioural economics, particularly nudge theory, for which he received the 2017 Nobel Memorial Prize in Economic Sciences. His work is discussed in the context of how subtle changes in choice architecture can significantly influence behaviour. Learn more about Richard Thaler and his works: Richard Thaler’s Nobel Prize
  • Cory Doctorow – Mentioned for his critique of platform degradation over time and its implications for content creators and consumers. Discover Cory Doctorow’s insights: Cory Doctorow’s Official Website
  • Scott Galloway – Referenced for his discussions on the impact of AI on human loneliness, particularly among young men, and his broader insights into technology and society. Learn more about Scott Galloway: Scott Galloway’s Official Website
  • Mo Gawdat – Known for his views on AI and its societal impacts, including his optimistic belief in collective goodwill to guide AI development. Learn more about Mo Gawdat: Mo Gawdat’s Official Website
What you’ll learn

  • The importance of humility and understanding our vulnerabilities to predict and prepare for the future.
  • The psychological manipulation techniques used by tech companies, such as Skinner boxes and addictive design.
  • The impact of behavioural economics and the theories of predictably irrational behaviour.
  • The role of AI in cultural participation and the associated risks.
  • The potential dangers of AI in terms of emotional manipulation and truth collapse.
  • Legislative measures that could mitigate the negative impacts of AI.

Also listen and download on…


Noz Urbina
Urbina Consulting
Larry Swanson
Elless Media