Alignment Newsletter Podcast

Rohin Shah et al.
The show is on a break or has finished.
The Alignment Newsletter is a weekly publication with recent content relevant to AI alignment. This podcast is an audio version, recorded by Robert Miles (http://robertskmiles.com). More information about the newsletter is available at https://rohinshah.com/alignment-newsletter/.
100 episodes
S1 E173 • Alignment Newsletter #173: Recent language model results from DeepMind
Jul 21, 2022
16 mins
S1 E172 • Alignment Newsletter #172: Sorry for the long hiatus!
Jul 5, 2022
5 mins
S1 E171 • Alignment Newsletter #171: Disagreements between alignment "optimists" and "pessimists"
Jan 23, 2022
14 mins
S1 E170 • Analyzing the argument for risk from power-seeking AI
Dec 8, 2021
13 mins
S1 E169 • Collaborating with humans without human data
Nov 24, 2021
15 mins
S1 E168 • Four technical topics for which Open Phil is soliciting grant proposals
Oct 28, 2021
16 mins
S1 E167 • Concrete ML safety problems and their relevance to x-risk
Oct 20, 2021
17 mins
S1 E166 • Is it crazy to claim we're in the most important century?
Oct 8, 2021
15 mins
S1 E165 • When large models are more likely to lie
Sep 22, 2021
16 mins