
Timestamps are as accurate as possible but may be slightly off. We encourage you to listen to the full episode for context.
In this episode of "The Interview," Lulu Garcia-Navarro speaks with Jimmy Wales, co-founder of Wikipedia, about his new book "The Seven Rules of Trust" and the mounting challenges facing the world's largest encyclopedia. Wales discusses how Wikipedia maintains its credibility through transparent editing processes, quality sourcing standards, and civil discourse among volunteer editors. (03:00) The conversation covers recent attacks from political figures and influencers, including Elon Musk's criticism and plans for Grokipedia, congressional investigations, and accusations of bias from both sides of the political spectrum.
Jimmy Wales is the co-founder of Wikipedia and author of "The Seven Rules of Trust: A Blueprint for Building Things That Last." He founded Wikipedia in 2001 alongside Larry Sanger, creating what has become one of the world's most trusted information sources. Wales continues to be involved with the Wikimedia Foundation, the nonprofit organization that operates Wikipedia, though he emphasizes the community's intellectual independence from any centralized control.
Lulu Garcia-Navarro is a journalist and host of "The Interview" podcast from The New York Times. She conducts in-depth conversations with notable figures across various fields, bringing a thoughtful and probing approach to complex topics around trust, technology, and the contemporary challenges facing institutions.
Wales emphasizes that Wikipedia's transparency mechanisms, including visible edit histories, talk pages showing editor discussions, and clear notices when content is disputed, are fundamental to building public trust. (09:29) Rather than presenting information as infallible, Wikipedia openly acknowledges uncertainties and shows readers the collaborative process behind each article. This approach allows people to evaluate the reliability of information themselves and understand different perspectives on controversial topics. For professionals, this demonstrates how admitting limitations and showing your work can actually enhance credibility rather than undermine it.
Wikipedia's approach to determining facts relies on identifying high-quality, peer-reviewed sources rather than trying to adjudicate truth directly. (07:03) Wales explains that editors prioritize established publications with track records of accuracy and correction processes over random social media posts or unverified claims. This creates a practical framework for navigating information overload: rather than trying to determine absolute truth, focus on the credibility and methodology of your sources. Professionals can apply this by developing consistent criteria for evaluating information sources in their decision-making processes.
Wales describes how Wikipedia's most contentious pages often end up being of higher quality, because the scrutiny and debate force editors to rely more heavily on evidence and reach genuine consensus. (10:45) The key is creating structures where people with opposing views can engage productively: clear rules, transparent processes, and a shared commitment to factual accuracy over winning arguments. Rather than avoiding disagreement, professionals can embrace it as a tool for improving decisions when proper frameworks for civil discourse are established.
Wales attributes Wikipedia's ability to resist political pressure and commercial influence to its nonprofit structure and long-term perspective. (38:00) By removing profit motives and designing systems for durability rather than short-term gains, Wikipedia maintains editorial independence even under attack. He emphasizes that being "not for sale" provides crucial protection against those who would compromise the platform's integrity. Professionals can apply this by building sustainable business models and governance structures that prioritize long-term value creation over short-term pressures.
Wales discusses how Wikipedia approaches AI and new technologies, embracing useful applications like fact-checking assistance while firmly rejecting wholesale automation of content creation. (25:25) The platform experiments with AI tools to support human editors but maintains human oversight for all content decisions. This balanced approach allows organizations to leverage technological advances without compromising their fundamental mission or quality standards. Leaders can apply this by clearly defining their core values and using them as filters for adopting new tools and processes.