Episode 3: Please Don’t Read the Comments

In this episode of Tricky, Heather and Emily talk to Sarah T. Roberts, the woman who coined the term “commercial content moderation,” about how elements of online discourse are governed by outsourced and unseen low-paid workers, who sift through “the grossness of humanity.”

And they ask Andrew Losowsky of the Coral Project whether newsrooms and journalists still have a part to play in fostering civil discourse, on and offline.

Plus: the bubonic plague, dance mania, Karen Carpenter, and pointy shoes. Read the full transcript below.

Subscribe to Tricky on iTunes

Heather Chaplin: Hello! This is Tricky, a podcast where we try to untangle the knotty questions that will determine the shape of the future of journalism. I’m Heather Chaplin, Director of the Journalism + Design program at The New School.

Emily Bell: And I’m Emily Bell, the Director of the Tow Center for Digital Journalism at Columbia University. We think about this stuff quite a lot in our day jobs, but still we find ourselves chatting on DMs, texting, Slacking furiously, about these big questions and the ways we’d like to explore them. Hence this podcast which—

Heather Chaplin: Emily, Emily—

Emily Bell: What?

Heather Chaplin: I got to stop you here.

Emily Bell: Oh really? Why?

Heather Chaplin: I don’t know how to say this nicely, but I think you’re violating some Tricky community guidelines.

Emily Bell: No, really?

Heather Chaplin: And I can’t be silenced about this.

Emily Bell: You’re curbing my freedom of expression.

Heather Chaplin: Don’t silence me please. I mean, Emily, you’re in the hot tub.

Emily Bell: Oh, what?

Heather Chaplin: Our content moderator in the Philippines has just alerted me to this fact. And yes, I had noticed it, but you know I just work here. I’m not in charge of these things and I’m not getting paid fifteen dollars an hour to pull videos of beheading humans from the internet.

Emily Bell: I was told as long as I didn’t behead anyone in the hot tub it would be fine. I’m sorry.

Heather Chaplin: It does get messy in there.

Emily Bell: That’s the handbrake turn into this week’s thorny problem. I’m not actually in a hot tub, our listeners will be relieved to hear. That’s fake news, Heather, which your content moderators in the Philippines are not trained to detect, incidentally. But are you maybe hinting at the issue at hand?

Heather Chaplin: It’s true we are talking about — god, please forgive me for uttering this phrase because I almost drove a steak knife into my heart when you first brought it up — content moderation.

Emily Bell: Content moderation. Which I promise is far from as horribly bland as it sounds. It really is key. It’s about the whole nature of how we keep our free society together. ‘Cause you know that we like to tackle tiny problems on this show: conversation, dialogue, the health of the public sphere, how we talk to each other, and how we manage all of these unprecedented online conversations and this wave of human creativity. I think we should call Sarah Roberts because she’s so smart and she studies this content moderation industry in depth. She’s also an assistant professor in the Department of Information Studies at UCLA and she’s been researching this area for years. Hi Sarah.

Sarah Roberts: Hi, how are you?

Emily Bell: I’m well. Can you just sort of connect the dots for us on this a little bit, Sarah? When we’ve seen news stories in the past couple of months about Facebook saying, we’re going to add another 10,000 people to look at this problem, you know, we’re going to clean up our platform, we’re going to get rid of fake news. You’ve been studying this for a very long time and you’ve got some pretty directive thoughts about how we could have avoided this situation but also sort of how it started.

Sarah Roberts: Yeah, for the past year and a half we’ve seen more and more of those statements. When platforms come under fire for content that has appeared on their platforms, or that has maybe been removed, and they’ve gotten criticism. And the response, almost across the board, has invoked at some level, this idea that there will be human beings who are already employed or who will be sought out and employed in sort of this gatekeeping capacity. Interestingly enough, none of the platforms have simply said, oh we’ve got AI tools that we’re going to employ that will resolve this issue. That may be a part of the picture, but each of these big platforms has come out and said, you know, we’re going to bring our content moderation teams up to 10,000. What’s interesting about those statements is for the first time, the platforms are going on record in a very public way — really thanks to the work of a lot of journalists who’ve put them on the spot with numbers. So for many years people have asked me, how many people do this work? And I’ve had to make my best guesses. But now we have, from some of the major firms, numbers in the tens of thousands for each one of their platforms. That having been said, what’s also interesting is what is not told about the circumstances of these workers. Where are they in the world? Are they actual employees of these firms? Or are they contractors or subcontractors? What are their working conditions? What is it like to do this job? Who gives them their directives? How much agency do they have to make decisions themselves? And so on. So that’s the kind of work that I’ve tried to unearth in my research, that has not been forthcoming from the platforms themselves, but has really been something that has been uncovered thanks to workers who’ve been willing to talk, and violate non-disclosure agreements, and so on over the years and share with me.

Emily Bell: Do you think that the application of those types of numbers and the kinds of contracts you’re talking about is actually going to improve the situation?

Sarah Roberts: You know, it’s such a complex set of issues and problems that is now being put at the feet of this sort of unnamed, unknown legion of globalized workers. And so certainly, in order to contend with the vast amounts of material that are uploaded to these platforms minute-by-minute, it’s not possible to attend to that simply with machines and computation, but 10,000 people is probably still not enough to do a really thorough job. So a number of questions still remain. Simply put, how much of the material on a given platform — on YouTube, on Instagram, on Facebook, et cetera — is actually ever reviewed at all? I would contend that it’s a very small portion. Given the changing political context around the world, given the seeming, and you know, still out for debate, role that these platforms have played in key political debates around the world in the last couple years, there’s sort of a natural moment of questioning of their roles. And it gets back to this fundamental point of: What are these platforms actually? Are they a public good? Are they the public square? Are they akin to going down and standing on the soapbox in the park and making a statement? Or is it more like you’re going to the shopping mall, where you can be there as long as you conform and behave, but you’re really there at the whim of the property owner, in other words?

Emily Bell: I thought of you, Sarah, when I was reading Jack Dorsey’s thread earlier in the week about how he wanted Twitter now to be, you know, a good actor in terms of creating better discussion and a better environment for discussion. On this thread, he was talking about the work of Cortico and Social Machines — which both, incidentally, come through MIT, but were initially Twitter-backed research projects — and he introduced the concept of measuring conversational health. Is measuring conversational health something that we either could or should do? And should we be allowing Jack Dorsey and his research outgrowths to do that?

Sarah Roberts: You know, again, I find this switch in tone and tenor fascinating. I certainly welcome the senior officials, execs, and thought leaders in Silicon Valley. I welcome their change of heart and change of tune. I should say that, you know, Twitter in particular has been fairly hostile to criticism from academics and journalists and others. And yet it has been at the nexus of a number of sort of terrible expressions of the kinds of problems that we’ve seen online, whether it’s gamergate and abuse towards women, alt-right harassment, the question of meddling, et cetera. And so, you know, actually the Twitter CEO and other parties at Twitter had been, up until recently, fairly intransigent in kind of rejecting the need to engage around these issues. So this, to me, was a fairly significant change in posture. That having been said, I think, as many rightly pointed out, this notion of you know, applying quantitative analytics and you know, ostensibly big data measures and so on, struck many as sounding like more of the same. So what we have here is a platform that sort of invented a problem and will apply tricks from the same bag in order to resolve the problem. There may not only be other answers to the questions that Jack Dorsey and others are asking, but there might be better questions that can be asked that could be informed by other ways of knowing, other disciplines, just other points of view. So you know, when Twitter kind of proposes to resolve its own problems that it can’t get a handle on, by bringing its closest friends into the room and applying some of the same kinds of, you know, computational solutions to what really are social problems, I am concerned about the likelihood of success there.

Heather Chaplin: I keep thinking, you know, when you have a hammer everything looks like a nail. And this came up, well, I think on both of our episodes so far. Geez, it’s almost as if we’re deeply interested in this! But this idea that you could solve a problem as complex as the state of dialogue and discourse today as an engineering problem just seems ludicrous to me. And I feel like we are so influenced by this one kind of personality, which is the tech startup guys who see the world as this sort of modular problem that they can solve in the same way that they built these platforms. It’s just not true.

Sarah Roberts: You know, the only metaphor that comes to mind right now is, you’ve hit the nail on the head, right? I mean that’s it. I’ve been in a lot of closed door sessions of late. You know, I’ve heard industry folks and others make some very curious statements to the effect of, ‘well we really don’t know much about social norms.’ That was something that came up recently in a closed door session.

Heather Chaplin: What does that mean? What are they saying?

Sarah Roberts: In terms of what things could be measured or how platforms might adequately contend with what, I think, we ought to agree are actually social phenomena and social problems. And they’ve sort of had a sense of these being issues that are intractable or otherwise unexplored. And you know, meanwhile there was a whole bunch of social scientists and others at that very table who raised an eyebrow at the statement, because there actually is a deep kind of tradition of looking at these issues from a number of perspectives. Certainly not to exclude the field of journalism, right? Which may not be completely one-for-one mappable onto online spaces, but certainly has something to say about, you know, the terms of engagement with the public on issues, or how we present information, or what it means to have a code of ethics — whether or not those are adhered to — at least we can gesture at them, and so on. So it’s sort of like there’s this sensibility that all of this stuff is brand new and no one’s ever thought about it before, and I’m not sure that’s the case.

Heather Chaplin: And I think one of the things we are very interested in here is sort of acknowledging that these problems are so complex and so difficult. It’s not about fixing them. Give us an intervention. What’s a little something we could do that might make a difference?

Sarah Roberts: Well one thing I like to point out, in terms of how we might collectively improve where we are today, is simply the fact that the way we have ceded our interpersonal and a lot of our public engagement to for-profit platforms was not a foregone conclusion, right? It didn’t have to be that way. Which means it doesn’t have to always be that way. We have to remember the collective imagination is very powerful. So is the ingenuity of people. And there are other reasons that we might invest or engage that could be other than, you know, sort of taking our thoughts, and our affect, and our interpersonal behavior, and monetizing that for someone. You know, there were online spaces that existed before the internet became commercialized and they looked different, and they had governance, and they had rules. And some of them were draconian spaces that were almost impossible to be on because of the rules. Some of them were laissez faire to the point of being anarchic, and there were things all the way in between. But those norms and rules were perceptible, tangible. There were human beings who were responsible for them. Those kinds of things could be identified and perceived. Whereas what we have now is an internet that has been built up — and built on — a notion of unfettered free expression and the free circulation of information that, I would argue, has never been the case on these platforms and yet has been sold to the public on those grounds.

Heather Chaplin: Well I was really struck on that Jack Dorsey thread that you guys were talking about before. That all of the comments to him seemed to be people like you know, ‘fuck off Jack’, like, ‘we have a right to free speech.’ And you’re just sort of like, whoa, what’s going on here?

Sarah Roberts: And the platforms really are largely responsible for those notions because that’s how they sold their utility to the public, right? Especially when we’re in a context of fewer and fewer public, or quasi public, outlets for expression. So I don’t think it’s a mistake, for example, that we’ve seen the rise of these platforms, and people using them as primary information sources, when we’ve seen the shuttering of, say, local dailies in communities, right?

Emily Bell: Right on.

Sarah Roberts: Or a shrinking, kind of, full-time professional staff on the state government beat, for example.

Heather Chaplin: Are there any other times in history where the the public square has kind of been handed over to for-profit ventures?

Sarah Roberts: I mean, I am not a historian or a legal scholar so I’ll put that—

Heather Chaplin: You don’t know all things?

Sarah Roberts: No, I’d put that caveat out there—

Heather Chaplin: I’m deeply disappointed.

Sarah Roberts: But I’ve certainly been someone who’s wondered about that and read about that. And you know, someone like Jamie Boyle, who’s a legal scholar at Duke University, has written on the enclosure of the English commons, for example.

Emily Bell: I’m keeping very quiet here because that’s exactly what happened with the enclosure of the commons.

Heather Chaplin: She’s like shrinking under the table.

Emily Bell: I come from a farming family so it was probably me who actually did it.

Sarah Roberts: Who actually enclosed the commons, who actually shot the foragers off of your land. You know, I’ll give you another example that’s kind of close to my heart and close to, you know, my everyday work, which is to work with and teach future librarians and information professionals. There’s been this ripple effect with the proliferation of online information sources, that are commercial and commercialized, that has actually become an impetus, or a sort of reasoning, for those who would like to see the public library, for example, receive less funding or even disappear altogether. In other words, I have students who are in training to go be public librarians, or otherwise be in this shrinking public sphere, whose own family members say to them, ‘why would we need librarians anymore when we have Google?’

Emily Bell: Right.

Heather Chaplin: Oh my god that literally just made me nauseous. Sorry, I’m having a moment.

Sarah Roberts: And I mean, if nothing else we know, through the work of people like Safiya Noble and others, the deeply problematic nature of Google Search as the primary source of information for all people. You know, that presents grave informational inequity problems, and other kinds of issues.

Heather Chaplin: Let me ask you, I sort of asked before, you know, what’s a small thing we could do? And I really loved your answer of just sort of remembering that this is a designed system and that it could be redesigned. That it didn’t, you know, spring full-formed from the earth. Let me ask you, even more specifically, are there any little concrete things that people who are concerned about this can actually do?

Sarah Roberts: I’m at odds sometimes with my closest colleagues and peers in these spaces who kind of continue to argue for the internet as a free speech site. I wonder, and I might suggest, that a better tack would be to acknowledge that, in fact, they might not be that. They might be something else. And given that, if we change our understanding about what these platforms are and are not, maybe we can assert other kinds of pressures that come from a different place. So, what I mean is, for example, in the case of Twitter or YouTube, you know, I would assert that these aren’t the public square. Despite the fact that these platforms may not be the public square, they can still be held accountable to the public. And certainly the platforms, as you’ve seen, are very receptive to public opinion because what is their ultimate fear? To lose their users because of what they do or don’t do. So that’s kind of the first thing that I would say. The second thing that I would say though, that is more in direct relationship to the research that I do, is around the workers themselves who are tasked with dealing with this information removal, cleaning, deletion, and so on. Which, frankly, is a miserable job. It’s a horrible job, as you know. It’s fraught with, sort of, the danger of being exposed to all sorts of things that are traumatizing and disturbing. We don’t know what the long-term ramifications are for these workers. And so people have often asked me over the years, you know, gosh, what could we possibly do? And I will just quote one worker who does this work on Amazon Mechanical Turk. Her name is Rochelle LaPlante, and she simply said when asked this recently, ‘pay me.’ You know what I mean? Pay me appropriately, pay me so I can get the psychological care I need. Give me that in the workplace, support me, give me the right kind of pay, so that if I need to take time off I can do it. Give me benefits, don’t make me a precarious worker.

Emily Bell: We may well drag you back onto Tricky at some point because there are so many things that you touched on there which I know we are going to want to talk about.

Sarah Roberts: I can’t help it. I’m here in southern California I find myself invoking Karen Carpenter all the time, you know. We’ve only just begun.

Heather Chaplin: Is that what happens when you move to California?

Sarah Roberts: I guess. You know, we’ve only just begun with this kind of rethinking.

Emily Bell: Thank you Sarah.

Heather Chaplin: Thank you Sarah.

Emily Bell: We have only just begun.

Sarah Roberts: Alright, thank you folks.

Heather Chaplin: You know, I’ve recently been very interested in moderation as a value. Mostly, I think, because it’s not one that we hold very highly right now. Like online, everybody is either crying or outraged all the time. And this whole conversation, it’s just really been reminding me of the bubonic plague. But seriously, if you were a person alive from 1347 to 1351, those were some rough years. There was the Black Plague that wiped out about a third of the population. It was a very difficult time for lots of reasons as well as that. But the thing that has always been so interesting to me about that time is the way people reacted, and the word you would have to use is excess. People were freaking out. Not surprisingly: a lot of dead bodies, a lot of crazy shit happening. But let’s start with fashion. So these—

Emily Bell: Fashion in 1347.

Heather Chaplin: So fashion in 1347, not joking here. These are all facts: pointed shoes, Emily. Now, I’m not just saying pointed shoes, I’m saying pointed shoes that were so long and pointed that the people wearing them could not walk. And these shoes became so controversial that clergymen would rage about them from the pulpit. Or take another example of excess: this was not a good time for the Jews. There was a rise in what we would now call pogroms. In Strasbourg, they murdered 2,000 Jews over one weekend. People couldn’t explain what was going on around them, so they were turning to all kinds of crazy things, like astrology or, you know, a conspiracy of Jews to poison all the wells in Europe. And then this is my absolute favorite, all-time example of immoderate reactions to stress, which was dancing mania.

Emily Bell: Dancing mania.

Heather Chaplin: Do you know about this?

Emily Bell: I think I’ve participated. Could have happened in 1347, that’s quite a long time ago.

Heather Chaplin: So, 1347 in the Rhineland. This phenomenon — the Germans are kind of behind everything, but that’s another show — so 1347, the Rhineland, people just started dancing like crazy. This is all true. Like hundreds of people, thousands of people, would just go from town to town to public squares, and they would dance and dance and dance until they dropped to the ground with exhaustion. At which point, they would start moaning and clutching their limbs and shouting about the devil and god and, I’m not kidding, the evils of pointed shoes. So doesn’t that kind of sound like how people act online?

Emily Bell: I would rather have dance mania and pointed shoes than Nazis and misogynists. But yes, it does sound a little bit like that.

Heather Chaplin: My point being, people react to fear and uncertainty and the sense of chaos with excess. You know, I think of Barbara Tuchman’s amazing book, A Distant Mirror, and the way she describes the 14th century as a period of anguish where there was no sense of an assured future.

Emily Bell: Right, so to make sure we don’t end up with plague, which I hear is very unpleasant, I’d like to talk to somebody about whether civil discourse is even possible and how journalism publishers can facilitate it. We have Andrew Losowsky, who is the Project Lead on the Coral Project at Mozilla, which was founded in collaboration with The New York Times and The Washington Post. Andrew welcome to Tricky.

Andrew Losowsky: Thank you. A longtime listener, first-time caller.

Emily Bell: This is amazing. We’ve been— Andrew was our listener for the first episode. So you may have heard, Heather was just ranting about the bubonic plague, and dancing mania, and pointed shoes, and some of the stuff I tuned out of. With this sort of return of tribalism and this point of change and uncertainty that we’ve seen online, are we actually capable of moderation, or will the extreme always dominate online conversation?

Andrew Losowsky: So I think it really depends on where online you are talking about and for what purpose. Right now, we’re in this phase where we have these enormous platforms, as you say, that have incredible scale, and that’s really the benefit for investors and also for the users. And yet, at the same time, there’s this question of: does scale actually make it more difficult to have more measured conversations or more respectful conversations? And I think one of the questions we really need to ask is, are these the right kind of paradigms, are these the right kind of platforms, to be able to have civil conversation, if that’s what you want. And there’s a lot of people who may not want it.

Emily Bell: Before we had platforms, we had gatekeepers that were journalists — maybe — a bunch of white male gatekeepers, and they probably felt that they were facilitating civilized discourse. So, you know, the Coral Project connects with newsrooms and you can tell us a little bit about that work. But what’s the role, now, of journalists in shaping or at least engaging in a civil way of public discourse? Because in a way newsrooms have backed away from this.

Heather Chaplin: And what do we even mean by civil?

Emily Bell: Right.

Heather Chaplin: Because do you mean polite?

Andrew Losowsky: So the question of civil is an important question. Because I worry about, are we tone-policing people who, in many situations quite reasonably, now have more extreme language or ways of behaving because they haven’t been listened to at all in the past. This question of white, male gatekeepers is very well-taken. The overall question of, ‘what is the role of journalism within civil discourse or within community?’ I think we need to step back a little bit and think, first of all, what is the role of journalism in society? Why should we have journalism? What is the benefit? Right, exactly: what is it for? How do our communities benefit? How do we as individuals benefit? What’s the mission of journalism? And that’s going to change from organization to organization. As part of the Coral Project, we’ve talked to more than 150 newsrooms in 30 countries. And one question that actually is very hard for most organizations to answer is the simple question: What is your mission? Because their general mission seems to be to survive and continue the legacy that they have. And the legacy isn’t a mission, it’s only a history. And so the idea is to say, well, what is it that we do? What makes us different from anyone else? So the work that we do at the Coral Project is to bring journalists closer to the communities they serve in order to make the communities healthier, in order to make journalism more trusted, in order to increase diversity, and so on.

Emily Bell: The iron core of journalism is really reporting. It’s the thing that we do that nobody else does. You know, we get up every day, and we try and find things out, and then we try and write them down or make a video of them and show it to people. Does the fact that, when you’re talking about conversational health, et cetera, you know, kind of that broader sort of benefit to the community— is that just becoming too attenuated from what the really, sort of core purpose, of journalism is?

Andrew Losowsky: I think that what we need to remind ourselves is that at the core of journalism is also listening. And that when you’re talking about, well, what we do is report, well, reporting is listening of a kind. The question is, who are we listening to? For whose benefit? And at what stage in the reporting? So much of the idea of conversation is: after the reporting is finished, the piece is published, then the community gets to have their say. And in the meantime, the journalist is over there doing something else. Rather than thinking about it as a continuum, a continued conversation, back and forth.

Heather Chaplin: I feel like, Emily, what you are kind of trying to get at is: guys, let’s not forget we’re actually supposed to be holding powerful people’s feet to the fire, and are we putting too much energy, maybe, into this idea that we’re supposed to be conversation facilitators? I don’t want to put words in your mouth, but I feel like maybe that’s a tension that we’re not necessarily acknowledging, and we’re just assuming that this is what we want?

Emily Bell: Right, and Andrew, you’re very close to newsrooms because you’re trying to implement this. I’d be very interested as well to hear, on the back of that, has the receptivity of newsrooms to the work that you do at the Coral Project changed, as we’ve seen things like the business models of journalism change? Where you actually need a more engaged audience, both to look at the things that you’re doing, but also to give you money.

Heather Chaplin: Right, right.

Andrew Losowsky: Has it changed? Not much, I would say, over the past three years that we’ve been in operation. There is some greater willingness to understand that there is something that the newsrooms have been missing. Certainly post-election, there was a sudden surge in interest, and then saying, wait, maybe we are not as close to the communities we serve as we thought we were. Maybe there is a distance. You’re absolutely right about holding people in power to account. But the question, again, comes back to: in what ways? And are we holding power to account on their own terms? Or are we doing it in a way that is actually addressing the real questions that people want answered in order to have the information they need to make their own lives better?

Emily Bell: What about this move away from commenting systems? So the Coral Project sort of came at a point when lots of newsrooms were just saying, we’re not going to do that anymore, because social media does it for us. And we’ve actually been hearing about how social media doesn’t really do it for them, that it doesn’t have that mission to shape conversation. But I know all too well the conversations that happen within news organizations, about the amount of effort they have to put into curating a community or moderating comments. And suddenly we have people just saying, we’re not going to do it, we’re just going to go back to publishing articles. Was that a misstep by the news industry?

Andrew Losowsky: I think it was. I think it was also a fundamental misunderstanding of what you need to have in order to have a sustained onsite community. It doesn’t have to be all or nothing. And this is one of the big things that I face, is that a lot of newsrooms have said to me, ‘look, we didn’t have the resources to do 24/7 moderation on every single article on our site, so we took off all comments and now we do nothing.’ All right, well, there’s a huge space in the middle where you could have a weekly discussion, where you open comments just on one page, or Q and A’s with journalists once a day, or, you know, there’s a whole range of other things you can do. And you can do that on your platform or on someone else’s. If you do it on someone else’s, then yes, by going primarily to Facebook and Twitter and other spaces, you do get the benefit that there are a lot more people on there and people can find you. But what you’re also doing is handing over the ownership of the direct relationship between you and the reader to a third party that will monetize that and sell it back to you. And that’s a really precarious position to be in.

Emily Bell: And that raises another thing about journalism, which I think bothers all of us, which is, are we to some extent just becoming very marginal in all of this? You know, that we can enjoy your work and look at, you know, kind of the great conversations that take place in threads on New York Times or whatever. But it’s not where the main event is. And that’s actually, you know, these kind of commercially moderated, vast platforms where again, you know the kind of governance structures don’t really have a journalistic mission or even a mission particularly to listen.

Heather Chaplin: Or a public good.

Emily Bell: Or a public good mission.

Andrew Losowsky: I think that we also need to think about: who are the audiences that we serve? And does that have to be as big an audience as possible to have the impact and the change that we want to have? So if your mission and your goals are to try to create policy change around a certain idea, or to create active engagement within a particular community, or to have impact in the courts, or to have certain kinds of reach, or something else, you can pursue that. There is an open question about: is it better to focus more keenly on a smaller, committed number of people? Or is it better to try to go for full scale? At which point you can end up in clickbait and other spaces too.

Heather Chaplin: When you say better, are you talking about the business model of the news organizations, or are you talking for, you know, democracy?

Andrew Losowsky: I’m talking about the business model, I’m talking about the mission that the news organization is trying to fulfill, and then the way in which that organization serves democracy.

Heather Chaplin: What role should journalism be playing in facilitating public dialogue? I’m trying to get at the question—

Emily Bell: It’s the public sphere question.

Heather Chaplin: Exactly. So what’s actually at stake? Like who cares about public dialogue? Why is it important? What’s at stake if it, you know, if the system that facilitates it is not healthy?

Andrew Losowsky: I mean I think that journalism ultimately needs to be making people’s lives better and helping empower people to make their own lives better. And doing it in a way that creates a healthy society to enable everyone to be able to do that. And so, as a result of that, the question becomes: well, what is the role of discourse and conversation in that? Well obviously, it needs to be at the core, because how do you know what makes people’s lives better? How do you know if you are making their lives better? How do you know if you’re able to be the fact-checker, the investigator, the person who they turn to to say, is this true, is this not? And if you’re not in dialogue, you are in danger of just creating the things that you think people want. And especially when we end up with a great lack of diversity in newsrooms, it’s very easy to fall into groupthink, into certain ways of seeing the world and certain ideas of who our audience is, because we think that we are representatives of our audience.

Emily Bell: We’ve seen this rash of initiatives to fix conversation and to fix the public sphere. Often coming from places that don’t, as far as I can see, have many credentials in terms of fixing anything. How are we meant to sort of think about all of these new initiatives? And I wonder what your perspective is, as somebody who’s been working on this for a long time, which actually even predates the Coral Project.

Andrew Losowsky: You’re absolutely right, there’s a lot of different initiatives. And on the one hand, this is an unprecedented moment in terms of digital communication and conversation. We can have conversation at a scale, a speed, and a reach that we’ve never had before. And so it churns and moves forward in ways that we’re still trying to understand. So I think it’s very important that there are a range of different approaches, and that these approaches are tried and studied and so on. I think that one of the issues that we also face is the question of, are we hitting conflicting incentives? And I think that’s something that actually was talked about in the first episode of Tricky.

Emily Bell: Oh, he is a listener.

Heather Chaplin: He’s good, this one.

Andrew Losowsky: It was some point after the small bottle of gin appeared from the box.

Emily Bell: It was actually a very large bottle of gin, but you’re not to know that, it’s an audio thing.

Andrew Losowsky: So this idea of, is it in the platforms’ incentives to have better discourse online? And there’s this moment now where suddenly they’re realizing, maybe it is from a civil perspective, but it hasn’t been from a monetary perspective.

Emily Bell: Yeah. And when you mention monetary perspectives, one of the things I remember from my Guardian days is that putting resources into curating good conversations online is, you know, pretty resource intensive. So at a time when news organizations are under a lot of cost pressure, how can you make the argument in newsrooms that it’s actually worth hosting conversations, or worth moderating? You know, really worth putting resources into moderation rather than into reporting?

Andrew Losowsky: So I see moderation as being a part of the reporting resource. And the way that I get around this is, I speak to a lot of very skeptical journalists, you would be amazed to hear.

Emily Bell: Oh yes, that’s astonishing.

Andrew Losowsky: Exactly. And I have heard many people say to me, ‘yeah it’s great what you’re doing but I’m never going to use any of your tools, I’m never going to listen to the readers, I’m too busy over here doing my thing.’ And I say that’s fine, but listen, I’ve got three questions for you. First of all, where do you get your ideas for what you’re writing, and where do you get your sources? Where do you find these people? And then the second question is, do you believe that, among your readership, there are people who are experts in the topic that you write about and know things that you don’t know yet that you might want to report on? And then the third question is, do you want to talk to those people and get that information? And at that moment they say, well yes, of course I would love to do that, how can I do that? And then it’s a question of OK, well how can we create a space in which this can happen? One thing I do tell newsrooms is don’t make promises you can’t keep. So if you can’t put the resources into moderating everywhere on the site, don’t put comments anywhere on the site. Find other ways to engage. And comment is only one part of what we do at Coral.

Emily Bell: What do you think the future for all of this is, Andrew?

Andrew Losowsky: I think the future is a range of different tools and a range of different approaches within newsrooms, where they think about engagement like they think about a photograph or a graphic. To say, what are we going to do with this story when we commission it? Are we going to send a photographer, are we going to get an illustrator? OK, what are we going to do about engagement around this story before we start reporting? Are we going to engage before we report, or during the reporting process? Is the engagement a key part of this? Or is it something that isn’t really necessary for this piece, and we’ll put more of the engagement resources over there? And so I think that engagement just becomes another tool, and it becomes one that actually helps power the journalism process.

Heather Chaplin: Wait, how— clarify how we went from dialogue to engagement? We’ve been talking about dialogue as sort of this, you know, sort of Greek ideal, you know, people’s ability to discuss things and whatnot. We’re going to get into Buber in a minute. Stick around for that. But now, I feel like you’re more talking about the relationship between news organizations and audiences, which is— is that the same thing? Is that a different thing? And I notice the words you were using shifted.

Andrew Losowsky: Yeah, so it is a connected thing, but it’s a different thing. For me, what I’m saying a lot to newsrooms is: if you want to have community, why do you want to have it? What is the value to you? What is the value to your mission? What is the connection to your mission? So if you’re going to host dialogue on your page, how does that actually create a community that is engaged with your work and helping support your work? Because if it’s not, there are other platforms that do just general dialogue and conversation, and you do not need to be involved in them. If they’re talking about you, you don’t even have to watch if you don’t want to. Whereas, if you are going to get into this business — and I think it is a very smart move for journalism, journalism needs to be in the listening and talking business — then you need to think of it as engagement around fulfilling your mission. Rather than general dialogue about the topic that you hope goes well.

Heather Chaplin: So the word community gets thrown around a lot. How do you define community? I mean what about, like, the child molesters and the embezzlers? Are they in the community? Or are they out of the community? Is community only the good people?

Andrew Losowsky: There may be a community of embezzlers out there.

Heather Chaplin: We serve the community of embezzlers. Now there’s a business model.

Emily Bell: We’ve already discussed the fact that all of our friends are the charlatans and mercenaries.

Heather Chaplin: Yeah, I serve the mercenary community mostly.

Andrew Losowsky: A community is a gathering of people, virtual or real world, or in any space, who agree that they have something in common, who have shared rituals and language, and who, if the community comes under attack, would look to defend it, because they understand the definition of who is in and who is out. So what that means is that a news organization probably doesn’t have one community, they have many. And they may have many that coexist in the same comments thread, for that matter. And this is one of the issues around huge platforms like Facebook: you get this overlap and clashing between very different communities who are pulled together by the algorithm to comment in each other’s spaces, where there are different rules, different concepts, different cultural expectations. And because all of the tools look the same, people end up acting the way that they expect, whereas actually they’re involved in a different community. So I think that we have to be very careful about constructing and managing communities in this way. And that’s one of the reasons why taking it off these social platforms can be a benefit, because you can set the rules and set the tools in a different way, and say: this space looks different, it acts different, so this is how we expect you to be in this space.

Heather Chaplin: That’s interesting, algorithmically created communities. Which perhaps lead to problems. Alright, I think that’s a good note to end this with. Andrew, thank you for joining us.

Emily Bell: Thank you, Andrew. Do come back and talk to us more about this and keep listening as you are a precious first listener.

Andrew Losowsky: Thank you both.

Heather Chaplin: So Emily, do you feel like you have a better sense of the shape of the problem?

Emily Bell: Yeah, I kind of do. Let’s go back to Sarah Roberts, who was saying there are so many facets to this because we are talking about such a huge scope of what we see or discuss or post into a sort of a public forum. I get a much clearer sense now of how journalism should connect to this and what its role should be. Which is as a kind of, I think, a sort of a standard setter in this area. Is that too wanky, to be a standard setter?

Heather Chaplin: Well I’m just picturing those standards that people use to carry out on the battlefield.

Emily Bell: With their pointy shoes.

Heather Chaplin: But seriously, do you have a sense of what a healthy system might look like?

Emily Bell: You know, there is an emerging body of scholarship which talks about the rights of the listener. And this is in law and communications theory as well. So we’ve had a pretty short, if you like, history of free speech. You know, American free speech. So Tim Wu would say it’s like the Super Bowl. We think it’s a very old American tradition, but it’s actually relatively new. The modern version of this free speech doctrine that we have was really established in the 50s and 60s. And actually what we have now is too much speech of all types. Some of it’s automated, some of it’s wrong, some of it’s deliberately misleading, some of it’s propaganda. And we have to think about, how do we drain that out of the system, first of all, without infringing people’s rights to free expression? We can only really do that if we start to think about the rights of the person receiving it. So in other words, do people have the right to hear valuable information versus lots of Russian bot chatter? And somehow I think that that’s, to me, that’s the shape of the problem that we need to think about. Which is, how do people get to the information that really matters to them? Or to have the conversations which are, ugh I hate this phrase, meaningful.

Heather Chaplin: Yeah yeah. Well I hate to get all Martin Buber on you.

Emily Bell: I don’t like it when that happens either. He was a beardy, Martin Buber.

Heather Chaplin: He was bearded but, which as you know is not my favorite look for a man, but he was smart. And I was poking around this morning and he has this definition of, what he called true dialogue—

Emily Bell: He is a philosopher, just to be clear.

Heather Chaplin: And his definition included openness, honesty, and mutual commitment. Which I thought were sort of interesting criteria. But I was also interested that he talks about dialogue as not just being about expressing your point of view, which I think is sort of how we commonly think of it, or even reaching a conclusion. Rather, he said, it is the prerequisite for any kind of authentic relationship. And he was talking about either between a set of humans or between human and God. And so I just thought—

Emily Bell: A woman and a fish. I’ve just seen Shape of Water.

Heather Chaplin: I haven’t seen it yet, don’t spoil it.

Emily Bell: Fish sex.

Heather Chaplin: And that was brought to you by the Wikipedia paraphrasing services, which I like to use whenever I can. But I just thought it was sort of a nice point to end on in terms of what’s at stake.

Emily Bell: Please improve our conversational health with your comments and suggestions. You can email us at trickpod at gmail dot com and subscribe to Tricky wherever you get your podcasts. We highly value your podcast reviews, and as you’ve heard this week, we will actually drag you on to the podcast if you are one of our regular listeners. Reviews help other listeners pick us out of the crowd and they warm our cold, cold, cynical hearts— even the ones from our relatives.

Heather Chaplin: And if you’ve already rated and reviewed us, which again is how the algorithms surface shows, please ask your friends to do so too.

Emily Bell: This has been a production of The New School’s Journalism + Design program with help from the Tow Center for Digital Journalism at Columbia. And with funding from the Knight Foundation.

Heather Chaplin: Our producer is Sara Burningham. With help from George Civeris. Thanks also to Kayla Christopherson. And if you want more, you’ll find links, background information, and show notes at trickypod dot com. That was Andrew Losowsky.

Andrew Losowsky: Losowsky.

Emily Bell: Losowsky.

Heather Chaplin: Wait, have I been mispronouncing your name the entire time that I’ve known you?

Emily Bell: Yes, she has. But he’s British so he’s never going to tell you.

Heather Chaplin: But Andrew, you never said anything?

Andrew Losowsky: Because—

Heather Chaplin: I’m not being sarcastic. I am mortified.

Andrew Losowsky: Well you never asked.

Heather Chaplin: Oh my god, oh my god, I wasn’t listening. That’s what happens when you don’t listen.

Emily Bell: So that was Andrew Losowsky.

Heather Chaplin: I want to die right now.

Andrew Losowsky: The “w” is like an “f,” so it’s just Losowsky.

Heather Chaplin: Is that a British thing?

Emily Bell: It is a British thing.

Andrew Losowsky: Oh yeah, not saying anything is definitely a British thing.

Heather Chaplin: No, I know that part is British but the— anyway, sorry. Take it Emily. I’m just going to drop dead over here. Go ahead.

Emily Bell: Thanks Andrew Losowsky.

Andrew Losowsky: Thanks Emily, thanks Hawther.