If a user posts offensive content or hate speech online, can the company that hosts the platform be held liable? If so, what does that mean for free expression in a digital world?
Most Americans likely would struggle to define “intermediary liability,” a legal term for laws that hold internet service providers and social networks responsible when someone uses their services to engage in unlawful behavior online. But the term has gained relevance in recent years as policymakers expand the concept to include offensive speech and other online behavior that is deemed inappropriate.
To explore this shift, the Charles Koch Foundation (CKF) has provided Stanford Law’s Daphne Keller with a grant to study intermediary liability and its implications for civil liberties.
“Laws that regulate Internet platforms also regulate ordinary internet users, and can significantly impact their rights,” Keller said. “It is unfortunately becoming all too clear that many new proposals, especially in the area of liability for speech, threaten users’ ability to engage online. This grant will allow us to more deeply explore that impact.”
Keller, the director of the Program on Platform Regulation (PPR) at Stanford’s Cyber Policy Center (CPC), studies international approaches to intermediary liability and works with global platforms and free expression organizations to protect innovation, freedom of expression, privacy, and other user rights. Her work has been featured in several national media outlets, including The Atlantic, The New York Times, The Washington Post, and The Wall Street Journal, putting her at the center of the debate about how to preserve free speech and related civil liberties online.
Keller will use the grant from CKF to educate the public about the impact of internet content moderation, conduct research, hire a research fellow and graduate students, and facilitate conversations with experts in the debate over digital free speech.
Keller is frequently described as the world’s leading expert on intermediary liability and content moderation. Her recent work has focused on the troubling trend of governments pressuring social media platforms to enforce speech restrictions that would violate constitutional or human rights if those governments attempted to enforce them directly.
“It is more important than ever to understand the importance of preserving free expression online, especially in these challenging times,” said CKF Executive Director Ryan Stowers. “Daphne and her team will explore goals and trade-offs inherent in intermediary liability. We are proud to support this important work.”
Keller’s work at Stanford began at the Center for Internet and Society, where she directed the Intermediary Liability program. Before that, until 2015, she served as associate general counsel for Google, with primary responsibility for the company’s search products. She has testified before the European Parliament and the U.S. Senate, as well as legislatures, courts, and regulatory bodies around the world.