pod.link/1565088425
The Inside View
Michaël Trazzi

The goal of this podcast is to create a place where people discuss their inside views about existential risk from AI.

Listen now on

Apple Podcasts
Spotify
Overcast
Podcast Addict
Pocket Casts
Castbox
Podbean
iHeartRadio
Player FM
Podcast Republic
Castro
RSS

Episodes

Owain Evans - AI Situational Awareness, Out-of-Context Reasoning

Owain Evans is an AI alignment researcher, research associate at the Center for Human-Compatible AI at UC Berkeley, and now leading... more

23 Aug 2024 · 2 hours, 15 minutes
[Crosspost] Adam Gleave on Vulnerabilities in GPT-4 APIs (+ extra Nathan Labenz interview)

This is a special crosspost episode where Adam Gleave is interviewed by Nathan Labenz from the Cognitive Revolution. At the... more

17 May 2024 · 2 hours, 16 minutes
Ethan Perez on Selecting Alignment Research Projects (ft. Mikita Balesni & Henry Sleight)

Ethan Perez is a Research Scientist at Anthropic, where he leads a team working on developing model organisms of misalignment. Youtube:... more

09 Apr 2024 · 36 minutes
Emil Wallner on Sora, Generative AI Startups and AI optimism

Emil is the co-founder of palette.fm (colorizing B&W pictures with generative AI) and previously worked in deep learning for... more

20 Feb 2024 · 1 hour, 42 minutes
Evan Hubinger on Sleeper Agents, Deception and Responsible Scaling Policies

Evan Hubinger leads the Alignment Stress-Testing team at Anthropic and recently published "Sleeper Agents: Training Deceptive LLMs That Persist Through Safety... more

12 Feb 2024 · 52 minutes
[Jan 2023] Jeffrey Ladish on AI Augmented Cyberwarfare and compute monitoring

Jeffrey Ladish is the Executive Director of Palisade Research, which aims to "study the offensive capabilities of AI systems today... more

27 Jan 2024 · 33 minutes
Holly Elmore on pausing AI

Holly Elmore is an AI Pause Advocate who has organized two protests in the past few months (against Meta's open... more

22 Jan 2024 · 1 hour, 40 minutes
Podcast Retrospective and Next Steps

https://youtu.be/Fk2MrpuWinc

09 Jan 2024 · 1 hour, 3 minutes
Paul Christiano's views on "doom" (ft. Robert Miles)

Youtube: https://youtu.be/JXYcLQItZsk Paul Christiano's post: https://www.lesswrong.com/posts/xWMqsvHapP3nwdSW8/my-views-on-doom

29 Sep 2023 · 4 minutes
Neel Nanda on mechanistic interpretability, superposition and grokking

Neel Nanda is a researcher at Google DeepMind working on mechanistic interpretability. He is also known for his YouTube channel... more

21 Sep 2023 · 2 hours, 4 minutes