Now, more than ever, designers and media makers are facing down tough ethical questions about their complicity in the insidiously addictive new modes of technology. There has been no shortage of words devoted to interrogating the dark side of our seemingly indispensable tech: Smartphones are hijacking our minds. Social media’s dopamine-driven feedback loops are like seam rippers to our social fabric. Even the engineers of our attention economy are feeling something akin to guilt.
To address these concerns, Journalism + Design recently hosted “The Dark Side of Design,” a frank conversation investigating the space between effective user experience design and a habit-forming, manipulative product. Moderated by WIRED’s Editor in Chief, Nicholas Thompson, the event also dug into the ethical quagmires that plague designers and users in the modern tech age.
The panel comprised Alexis Lloyd, the Chief Design Officer at Axios and former Creative Director of The New York Times R&D Lab; Roger McNamee, a founding advisor of the Center for Humane Technology, venture capitalist and co-founder of Elevation Partners; James Williams, a researcher at the Oxford Internet Institute, a co-founder of Time Well Spent and former Google strategist; Nir Eyal, a behavioral design consultant and author of Hooked: How to Build Habit-Forming Products; and Tristan Harris, co-founder and executive director of the Center for Humane Technology and former Google employee.
Stream the talk in full or skim through some of the highlights, below.
7:43 – Nir Eyal: “There’s a big difference between addiction and habits. Habits are behaviors done with little or no conscious thought. We have good habits, we have bad habits. Addictions are always bad. When we use that word a lot, everything is addictive. If I like something, it’s addictive. If I use it a lot, it’s addictive. That’s not really the right way to use that term… creating products that are addictive is immoral because addictions are behaviors that, even when people want to stop doing, they can’t.”
10:36 – James Williams: “There is a fundamental moral and political problem we have now that we all live in this technological environment, this attention economy where all of these products, all of these services are competing to capture and keep and exploit our attention…”
11:16 – James Williams: “I think I came to realize that this is the moral and political problem of our time because in order to give attention to anything that matters, whether it’s climate change, or combating extremism, or anything, you know, we have to be able to give attention to what matters, and I think that’s exactly what I saw these technologies were undermining, using these various kinds of petty tricks, hacking our reward pathways, et cetera.”
16:03 – Roger McNamee: “I’m a person who’s here because I happened to be involved with Facebook at a moment in time, which gives me real credibility to say: This is the most important thing I’ve ever done and it scares the hell out of me. Not because they’re bad people. Not because they intended to do any of this stuff, but rather because the system got out of control and the profit motive makes it really hard to fix.”
21:21 – Alexis Lloyd: “I think that when you look at the ways that technology has become addictive in a negative way, it tends to be because they’re not really solving a problem for the everyday users of the technologies. And that’s either because they’re not considering the problem, they’re not considering what needs the users might have… The second is when you consider the needs of your users but you think they’re all like you. So that’s another problem that often comes up when you look at companies that have tended to be founded by young upper-middle-class white men, who are deciding, often, for people who they assume are just like that, with the same problems they have, but then get extended to a much, much wider set of participants and a much wider audience, and a lot of problems ensue because of that. Third is because your incentives are misaligned and you’re not actually designing for your user, you’re designing to achieve other things, like the participation of your user, which I think we’ve all clued into in one way or another.”
24:04 – Alexis Lloyd: “I think part of the solution to that is to continue working on ways that we can actually quantitatively measure those things, and the other piece is to keep talking to the users and to balance out the qualitative and quantitative ways of measuring, so that you have a richer understanding of what the success or failure of your product is.”
27:05 – Tristan Harris: “We just invented a new species of persuasion. Every time you open up Facebook—whether it’s the content story you see on a news feed or it’s the ad that you see—we think that, you know, the story at the top is the latest thing that one of my friends posted, but that’s not what actually happens. When you open that blue F icon, you just activated a supercomputer on the other side of the screen that’s trying to play chess against your mind and figure out: ‘What’s the move I can play 50 million steps ahead of what your mind has any clue is coming? What’s the perfect thing to show you that will keep you scrolling?’”
37:24 – Nir Eyal: “I think there’s a lot that technology companies can do to make products safer for children. However, we also have to be careful of saying, ‘You see these kids today with their rebelliousness, using Snapchat all the time,’ because if we don’t use it, it’s not meaningful to us, right… We’ve got to be very careful about judging people just based on how they spend their time, because my hobby, my passion, what’s meaningful for me is somebody else’s frivolity, particularly when it comes to people who spend time on social media and gaming. This rumor, this perception that if you’re a gamer, if you enjoy spending time on social media, that somehow you’re defective, you’re weird, you’re not having a real experience—that’s also harmful, you know. I think that’s putting yourself on a moral pedestal.”
41:04 – Nir Eyal: “We know from a study of alcoholics that… the number one factor in determining relapse among alcoholics is not the level of physical dependency, it’s their belief in their own powerlessness. Not the physical dependency. We’re not talking about booze here, something you ingest into your body. We’re not freebasing Facebook and injecting Instagram, remember. Right? Your level of powerlessness, your belief that you can’t do something, is more of a factor in how much you stay addicted.”
45:07 – Alexis Lloyd: “Every new technology has produced a backlash about how that technology is going to destroy society. Comic books were going to destroy society. Writing was going to destroy society because no one had to keep all the information in their heads anymore. The printing press was going to destroy society for various other reasons. Like, this goes back through human history, and so, first of all, how do we separate out what is our sort of perspective on that?”