EA Forum Podcast (Curated & popular)

EA Forum Team

407 episodes

  • “My lover, effective altruism” by Natalie_Cargill

    2026/04/26 | 8 mins.
    Crossposted from Substack. This post is part of a 30-posts-in-30-days ordeal at Inkhaven; all suboptimalities are the result of that. This is part 2; here is part 1 of my EA mini-series!
    On my way to my tenth EAG in a decade, my brother-in-law explained effective altruism to me.
    At first, he couldn’t quite remember if he’d heard the phrase before. But he searched the corners of his mind until the definition made itself known: “yeah, it's just a bunch of wankers who pretend to have social impact, but all they do is go to conferences and raise money and they’ve never had any impact at all.”
    I have never had any chill. I did not develop it in that moment. If anyone is going to say EA is just a bunch of wankers, it's going to be me, newbie.
    “That's interesting, Ben, but I think it's a serious misconception — you might not be aware that EA has literally raised billions of dollars for global health charities, which very likely saved hundreds of thousands of children's lives (do you hate children, Ben?) They have literally stopped millions of hens being tortured in cages too small for them [...]
    ---

    First published:

    April 17th, 2026


    Source:

    https://forum.effectivealtruism.org/posts/sA5iFynuMJQAuJ6ku/my-lover-effective-altruism

    ---

    Narrated by TYPE III AUDIO.

  • “A Database of Near-Term Interventions for Wild Animals” by Bob Fischer

    2026/04/24 | 17 mins.
    The Animal Welfare Department (AWD) at Rethink Priorities supports high-impact strategies to help animals, especially where suffering is vast and largely neglected. Therefore, one of our focus areas is wild animal welfare (WAW), where uncertainty about tractability makes identifying cost-effective interventions particularly challenging. While much of the current WAW work rightly focuses on academic field-building (see Elmore & McAuliffe, 2024), it is worth determining whether there are viable, near-term interventions that are already available or close to implementation.
    With this goal in mind, we have developed the Wild Animal Welfare Intervention Database (WAWID). This project evaluates an array of interventions that may be promising for improving WAW in the (relatively) near term, assessing them against criteria of interest to funders, advocates, researchers, and potential implementers across the WAW space.
    The WAWID is available here: Wild Animal Welfare Intervention Database.
    The landing page includes a full list of the interventions and evaluation criteria. This report explains how we developed the WAWID, what we think you can learn from it, and suggests some future directions for this work (conditional on funding). A future report will provide some descriptive statistics.
    How we developed the WAWID
    We launched this project in [...]
    ---
    Outline:
    (01:34) How we developed the WAWID
    (08:14) Initial observations
    (13:13) Limitations
    (15:20) Future Directions
    (16:48) Acknowledgements
    ---

    First published:

    March 25th, 2026


    Source:

    https://forum.effectivealtruism.org/posts/pEbiEmeu2agEHJgyu/a-database-of-near-term-interventions-for-wild-animals

  • “The AI people have been right a lot” by Dylan Matthews

    2026/04/20 | 10 mins.
    This post was crossposted from Dylan Matthews's blog by the EA Forum team. The author may not see or reply to comments.
    Subtitle: Try to keep an open mind as the world gets increasingly wild.
    [Image: The crowd at EAG 2015 (Center for Effective Altruism)]
    In 2015, I went to my first EA (Effective Altruism) Global. The event was then on-the-record for journalists, a rule that was changed for all subsequent events because of my actions.
    My exposure to EA at that time was mostly through people who took high-paying careers in order to “earn to give” to global health charities, which I had written about in the Washington Post. I also knew the movement cared a lot about animal welfare. I was aware that there were people worried about catastrophic risks, and specifically about AI; this had come up in a profile I wrote of Open Philanthropy (my now-employer, albeit under a new name these days). But I still broadly thought of EA as the bednets and cage-free commitments people.
    I was really taken aback by how dominant discussions of AI risk were at the event. The marquee panel featured Superintelligence author Nick Bostrom, future If Anyone Builds It [...]
    ---
    Outline:
    (03:31) What should I learn from bungling this?
    (06:43) Listen to the people saying stuff will get weird
    ---

    First published:

    April 16th, 2026


    Source:

    https://forum.effectivealtruism.org/posts/9FPxMET3W4wewwSyf/the-ai-people-have-been-right-a-lot

  • [Linkpost] “The Anthropic IPO Is Coming. We Aren’t Ready for It.” by Sophie Kim

    2026/04/17 | 15 mins.
    This is a link post. More money is coming than AI safety has ever seen. The capacity to deploy it doesn't exist yet.
    [Image source: Fortune.com]
    This week, Anthropic announced Claude Mythos Preview, a model so capable at finding software vulnerabilities that the company decided to wait to release it publicly. Last month, Bloomberg reported that Anthropic's annualized revenue had hit $19 billion, doubling in under four months and, by some accounts, surpassing OpenAI. It is currently one of the most valuable private companies on the planet.
    Anthropic is also going to IPO, possibly as soon as October 2026. When it does, a few thousand people, including some of the wealthiest people on the planet, will become liquid, and a meaningful fraction of those people will want to give to AI safety.
    Tech liquidity events have created major philanthropists before. Dustin Moskovitz's Facebook shares became Good Ventures, which became Coefficient Giving, which became the single largest funder of AI safety. Vitalik Buterin donated $665 million to a fledgling FLI. Jed McCaleb's crypto wealth became a billion-dollar endowment to the Navigation Fund, which went from $4 million in grants its first year to over $60 million by 2025.
    None of these come [...]
    ---
    Outline:
    (00:13) More money is coming than AI safety has ever seen. The capacity to deploy it doesn't exist yet.
    (03:36) The constraint is talent, not funding
    (05:47) We don't just need more bets; we need decorrelated ones
    (08:39) Different structural positions
    (09:17) Independent grantmakers with different worldviews
    (10:12) The Pitch: Consider becoming a grantmaker or founding something new
    (12:52) The window is brief
    (13:48) Next Steps
    ---

    First published:

    April 12th, 2026


    Source:

    https://forum.effectivealtruism.org/posts/ychu2LAw54sNcocKH/the-anthropic-ipo-is-coming-we-aren-t-ready-for-it


    Linkpost URL:
    https://thecounterfactual.substack.com/p/the-anthropic-ipo-is-coming-we-arent

  • “AI Safety’s Biggest Talent Gap Isn’t Researchers. It’s Generalists.” by Topaz, Agustín Covarrubias 🔸, Alexandra Bates, Parv Mahajan, Kairos

    2026/04/16 | 13 mins.
    This post was crossposted to LessWrong.
    TL;DR: One of the largest talent gaps in AI safety is competent generalists: program managers, fieldbuilders, operators, org leaders, chiefs of staff, founders. Ambitious, competent junior people could develop the skills to fill these roles, but there are no good pathways for them to gain skills, experience, and credentials. Instead, they're incentivized to pursue legible technical and policy fellowships and then become full-time researchers, even if that's not a good fit for their skills. The ecosystem needs to make generalist careers more legible and accessible.
    Kairos and Constellation are announcing the Generator Residency as a first step. Apply here by April 27.
    Epistemic status: Fairly confident, based on 2 years running AI safety talent programs, direct hiring experience, and conversations with ~30 senior org leaders across the ecosystem in the past 6 months.
    The problem
    Over the past few years, AI safety has moved from a niche concern to a more mainstream issue, driven by pieces like Situational Awareness, AI 2027, and If Anyone Builds It, Everyone Dies, as well as by the rapidly increasing capabilities of the models themselves.
    During this period, over 20 research fellowships have launched, collectively training thousands of fellows, with 2,000-2,500 fellows [...]
    ---
    Outline:
    (01:18) The problem
    (03:41) Why the pipeline is broken
    (05:59) Why this matters now
    (07:31) Counter-Arguments
    (10:11) The Generator Residency
    ---

    First published:

    April 13th, 2026


    Source:

    https://forum.effectivealtruism.org/posts/k3nq7FxBCsrNFmAYi/ai-safety-s-biggest-talent-gap-isn-t-researchers-it-s-2


About EA Forum Podcast (Curated & popular)

Audio narrations from the Effective Altruism Forum, including curated posts and posts with 125+ karma. If you'd like more episodes, subscribe to the "EA Forum (All audio)" podcast instead.