
Timestamps are as accurate as they can be but may be slightly off. We encourage you to listen to the full context.
In this fascinating episode of Big Technology Podcast, host Alex Kantrowitz sits down with Bloomberg features writer Ellen Huet to explore how Silicon Valley—a place renowned for independent thinking—repeatedly falls into powerful forms of groupthink. Ellen, author of "Empire of Orgasm: Sex, Power, and the Downfall of a Wellness Cult," draws compelling parallels between cult dynamics and Silicon Valley culture. (01:22) The conversation examines how group houses, self-help programs, and "high agency" ideology create fertile ground for both innovation and manipulation, with particular focus on how these same psychological patterns appear in today's AI and AGI movements.
Alex Kantrowitz is the host of Big Technology Podcast and a seasoned technology journalist who has spent more than six years living in San Francisco. He provides insightful commentary on the tech industry and its cultural dynamics.
Ellen Huet is a features writer at Bloomberg News and the author of "Empire of Orgasm: Sex, Power, and the Downfall of a Wellness Cult." She has spent 12 to 13 years living in San Francisco and has extensively covered Silicon Valley culture, including investigations into OneTaste and startup incubators like HF0.
Silicon Valley's group houses serve as more than cost-effective living arrangements; they function as ideological incubators where beliefs and business ventures are born. (04:07) Ellen explains that these houses often organize around professional identities (AI researchers, founders) and shared belief systems like effective altruism or rationalism. The bonds formed through communal living create deep trust and connection, making it natural for housemates to launch companies together. Companies like Anthropic have emerged from these environments, demonstrating how ideology-first communities can spawn major tech ventures. This represents a fundamental shift from traditional business formation: ideology, rather than pure market opportunity, drives the initial connection.
The popular Silicon Valley concept of "high agency," the belief that you can radically control your circumstances, is often empowering but can be weaponized for manipulation. (13:03) Ellen reveals how this philosophy, prominent among rationalists and AI researchers, teaches that individuals are 100% responsible for their life experiences. While this mindset can drive innovation and personal growth, it becomes dangerous when taken to extremes. In cult-like environments, this belief system prevents people from recognizing when they are being exploited, because acknowledging victimhood is seen as having a "victim mentality." The pressure to maintain high agency can keep people trapped in harmful situations, believing their suffering is their own fault rather than recognizing systemic abuse.
Silicon Valley's ambitious, often displaced individuals gravitate toward intensive personal development programs that promise breakthrough performance. (09:35) Programs like Landmark, Hoffman Process, and Conscious Leadership Group attract tech workers seeking to optimize themselves for greater professional impact. While these programs can provide genuine benefits, they also create psychological dependency and vulnerability. The intense, immersive experiences can alter participants' worldviews and social connections, sometimes isolating them from previous relationships while binding them to new communities. This pattern of seeking transformation through intensive programs makes Silicon Valley residents particularly susceptible to groups that promise radical personal change, whether legitimate or exploitative.
Once individuals invest heavily in an ideology—especially one tied to their professional identity and social connections—changing course becomes psychologically and practically costly. (37:54) Ellen describes how people who dedicate years to a belief system, move into communal housing, change careers, and build their entire social network around an ideology face massive cognitive dissonance when questioning that system. The more someone has publicly committed to being "the founder fighting for X cause" or "the researcher working on AI safety," the more painful it becomes to admit the ideology might be flawed. This creates a stickiness that can trap people in situations that no longer serve them, from cult-like wellness companies to potentially misguided AGI pursuits.
The AI safety and AGI development communities exhibit psychological patterns similar to cult dynamics, according to former participants. (40:01) Ellen reports that people deeply involved in AI doomerism describe their experience in cult language: complete worldview takeover, social isolation from non-believers, and a sense of cosmic significance. Whether adherents believe AGI will destroy humanity or transform it, they experience the same psychological markers: an overarching ideology that shapes all life decisions, natural isolation from previous relationships, access to "special knowledge," and a sense of awe and wonder. This pattern appears regardless of whether someone is an AI optimist or pessimist, suggesting that the psychological structure matters more than the specific beliefs.