As venture capitalists and former tech execs speak out about social media addiction, designers are facing ethical questions about their work.
Journalism + Design is hosting a discussion about whether persuasive technology and behavioral design are colonizing our attention. We’ll ask: Where is the line between sharp UX and user manipulation? How can we design digital content to reach a large online audience while maintaining ethical rigor?
Join us on Monday, February 5, to hear from Nir Eyal, a behavioral design consultant for Fortune 500 companies and author of “Hooked: How to Build Habit-Forming Products”; James Williams, a researcher at the Oxford Internet Institute and a cofounder of Time Well Spent; and Alexis Lloyd, the chief design officer at Axios and previously the creative director of The New York Times R&D Lab. Nicholas Thompson, editor-in-chief of Wired, will moderate.
Admission is free, but you need to register via Eventbrite. We hope to see you there!
If you want to get up to speed before the event, here’s a reading list that inspired conversation among the J + D team and sparked the idea for this event.
Under the auspices of Time Well Spent, [Tristan] Harris is leading a movement to change the fundamentals of software design. He is rallying product designers to adopt a “Hippocratic oath” for software that, he explains, would check the practice of “exposing people’s psychological vulnerabilities” and restore “agency” to users. “There needs to be new ratings, new criteria, new design standards, new certification standards,” he says. “There is a way to design based not on addiction.”
[Chamath] Palihapitiya’s criticisms were aimed not only at Facebook but at the wider online ecosystem. “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” he said, referring to online interactions driven by “hearts, likes, thumbs-up.” “No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.”
[H]ow can a company like Facebook be retooled around “meaningful interactions” instead of engagement? The first step is to understand why it’s hard. Popular articles place blame in certain places: the advertising business model, centralization, tech bro culture, tech-giant monopolies, or just capitalism-as-usual. But I don’t think it’s that simple. I think that a difficulty with meaningful interactions starts with the nature of software itself. I believe even the most well-intentioned teams, operating in the best possible culture, would still struggle to write software that’s time well spent.
How to Fix Facebook — Before It Fixes Us by investor and adviser Roger McNamee (Washington Monthly)
Then came the U.S. election. The next day, I lost it. I told [longtime Facebook executive] Dan [Rose] there was a flaw in Facebook’s business model. The platform was being exploited by a range of bad actors, including supporters of extremism, yet management claimed the company was not responsible. Facebook’s users, I warned, might not always agree. The brand was at risk of becoming toxic. Over the course of many conversations, I urged Dan to protect the platform and its users.
How To Design Non-Addictive UX (It’s Really Not Hard) (Fast Company)
4 Steps to Bring Ethical Clarity to Native Advertising (Nieman Reports)