The business model for media and social media focuses on outrage, shock and anger. Now the writer, editor, and journalist Andrew Losowsky says media owners must ask themselves existential questions: What do they highlight? Why do they highlight it? What are the goals? He spoke to Ashley Norris about creating better comment platforms that make the reader feel part of the mission.
- Many publishers are using comment systems that are not fit for purpose.
- Intelligently curated community management can enhance relationships with readers and underpin membership and subscription initiatives.
- When it comes to community, culture trumps technology.
From a political perspective, the next twelve months are going to be, well, interesting. The UK, Russia, South Africa and Mexico are among the more than 50 countries heading to the polls, and come 2025, a certain Donald Trump could be back in the White House.
News organisations’ comment sections and social media channels will again be lively centres of debate. But how do media organisations ensure those debates are nuanced and informative rather than just toxic slanging matches between partisan individuals?
Andrew Losowsky, Head of Community Product at Vox Media, believes he has a solution. After a career in journalism in the UK, US and Spain, he worked for The Huffington Post at a time when the website was continually breaking stories and setting political agendas. Andrew created a team, a vision and a strategy to build a new kind of commenting platform for Mozilla, The New York Times and The Washington Post – one that was open and not solely designed to wind up readers in a bid to generate more traffic.
Nine years on, that product, now called Coral, is owned by Vox Media and used by media companies across the globe, including the websites of The Washington Post, Financial Times, Slate, Globo, Wired, The Hindu, and The Globe and Mail. Andrew works with media companies to develop responsible commenting and engagement strategies while integrating the open-source platform (so tech teams can tweak it however they want) into their systems.
* Andrew will be at Mx3 Barcelona on March 12-13, sharing insights in a session called “Bringing people together: Three principles for making communities better.” He will attend the whole event, so connect and meet with him there. The full Mx3 agenda is here. You can read more about the event here. *
Why the media lets users down
Andrew says he developed Coral because media companies, until fairly recently, used comment platforms that weren’t fit for purpose.
“The initial problem was that there was no easy way for news organisations to talk with their audiences and communities and to engage in a productive conversation. The platforms that existed were not built for journalism or with the needs of a consistent readership in mind.”
The platforms invariably came from social media and tech companies. Facebook, for example, had a plugin many newsrooms used.
“Remember that early on in the growth of the web for news organisations, the quickest way to publish content was a blog. Coding at that point was a very restrictive thing. Not many people knew how to do it. And the options were very limited. So you’d start a blog to make a website and then it had a button that said ‘allow comments’. So it really just came in the box, and people didn’t really have any strategy around that.”
Strategy trumps technology
In many ways, the issue is not technology as such, Andrew says. Media companies neglected to put together effective strategies for commenting and community management.
“A lot of times, people were waiting for technology to fix a strategy problem, and that never goes well. So a big part of the work we were doing was saying, ‘yeah, we’ll build the technology’, but you shouldn’t use it until you have the strategy in place. We’ll also build out the guides and the tutorials for the strategy first, and then you can customise what you need with the technology.”
According to Andrew, the disconnect arose because teams within media companies weren’t collaborating effectively: no one was steering them in the right direction.
“Digital teams were outsourced, or literally in different buildings, or, in the case of The Washington Post, their digital team was in a different state from the editorial team. And so it was like those people on the third floor, we don’t know what they do, but they make it appear. It was printing onto the internet versus having a digital strategy.
“But the other thing is that if you’re talking about community, newspapers, magazines and traditional media are just not used to having the audience talking back. People used to write letters to the editor, and then two would get into print, and that was the limit of their interaction with readers.
“Editorial teams were not used to the idea of people saying, ‘I don’t like that headline. This article is biased. We don’t like the way that you’re approaching this. You’re missing out on huge parts of the story. This is happening to me and you’re not reflecting my experience.’”
The internet changed the nature of interaction and conversation, Andrew explains. “And so it’s not only a technology question; it’s also how you interact with the audience, what’s your mission for your community, and the experience and the creation of what you’re trying to do.”
The media has taken a long time to recognise that nurturing a community has significant business benefits, especially as many large companies continue to pivot from largely advertising-based revenue strategies to membership and subscriptions.
“If community systems work really well, people feel like they are a part of the mission,” says Andrew. “They’re more than members and subscribers. They’re not buying a product to consume. They’re a part of a bigger mission, a bigger idea. And we’re seeing a lot of moves towards that now.
“The key is managing moderation, community engagement and community management through a first-class piece of software rather than as an afterthought, an unwelcome chore that had to be dealt with.”
He adds that they did a lot of work on building one of the best moderation experiences and creating ways for journalists to interact with the communities.
“Being able to give people warnings [if their content is inappropriate] and congratulations [if it is exemplary] makes sense for that media organisation.
“So, for us at Coral, the highlighting features are the most important. We don’t use an algorithm to say what people are getting most upset about right now. What we do is say to the journalists, here’s a selection of what you might want to highlight. Then the journalists or the community managers decide which comments to highlight.
“There’s an editorial curation perspective, because that really is playing on the highlights and strengths of journalism, which is seeing everything that’s out there and curating it.”
AI with a human touch
What role does AI play in community management – and in Coral in particular?
“We’ve been using AI for about six years. The main way is to spot toxicity, abuse, spammers, trolls, etc. And what was very clear from the beginning was that we don’t let AI make the final decisions. We flag things for moderator attention and sometimes hold things back so moderators can look at them before they’re published. We’ve been working with a couple of different companies to develop this,” Andrew explains.
If someone types anything that the AI identifies as likely toxic or abusive, they get a message asking if they are sure they want to post it. The message points out that the language in the comment might violate guidelines. The person is given the option to edit it or submit it anyway.
“First of all, we’re saying, ‘Are you sure?’ We’re not saying, ‘Hey, you, stop that’ because AI is not 100% guaranteed accurate.
“We are very careful to say the language that’s being used in the comment might violate our guidelines. You can still submit it if you think that’s right. And then if they submit it anyway, or they change something, but the AI still thinks it’s above a threshold, then we pass it to the moderators.”
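The flow Andrew describes can be sketched in a few lines. This is an illustrative sketch only, not Coral’s actual implementation: the scoring function and the thresholds are hypothetical stand-ins for the external AI classifier, and the outcome labels are invented for the example. The important property it mirrors is that the AI never rejects a comment outright; it warns the commenter, and if they submit anyway it routes the comment to human moderators.

```python
# Illustrative sketch of the pre-submission toxicity flow described in the
# interview. Not Coral's real code: toxicity_score(), the thresholds, and
# the outcome labels are all hypothetical stand-ins.

WARN_THRESHOLD = 0.7   # above this, ask the commenter "Are you sure?"
HOLD_THRESHOLD = 0.7   # above this on a confirmed submit, hold for a moderator


def toxicity_score(text: str) -> float:
    """Hypothetical stand-in for an external toxicity model.

    Returns a score in [0, 1]. A real system would call a hosted
    classifier here rather than matching a tiny word list.
    """
    hostile_words = {"idiot", "stupid", "hate"}
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in hostile_words)
    return min(1.0, hits / len(words) * 5)


def submit_comment(text: str, confirmed: bool = False) -> str:
    """Return the outcome of one submission attempt.

    'warned'    - likely toxic; user is asked to edit or confirm
    'held'      - confirmed anyway but still over threshold; moderator queue
    'published' - below threshold; goes straight through
    """
    score = toxicity_score(text)
    if score > WARN_THRESHOLD and not confirmed:
        # "The language in this comment might violate our guidelines..."
        return "warned"
    if score > HOLD_THRESHOLD:
        # The AI never makes the final call; a human moderator reviews it.
        return "held"
    return "published"
```

A civil comment publishes immediately; a hostile one first triggers the “Are you sure?” prompt, and only reaches the moderation queue if the commenter insists, which matches the two-step, human-in-the-loop design Andrew describes.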
Andrew says they did a study with the University of Texas to find out what happens when people see those types of messages.
“A third of the time, what was then posted was fine and not toxic at all, and moderators agreed. Or people decided not to post at all. They had been yelling into the comment box in frustration, but then they were like, ‘Oh, I get it. It’s not gonna go through. It’s not going to be allowed. I’ll just walk away and switch off.’
“This is a great result. We feel like it validates the use of the system. And hopefully it improves people’s behaviour over time.”
Bringing people together
What impact will all the elections have on community management?
Andrew argues that owners need to start asking themselves existential questions.
“The business model, both for media and social media, in many ways focuses on outrage, shock and anger. So I think that this issue is about what we highlight. Why do we highlight it? What are we looking for? What are the goals here?”
Andrew sees hope in communities and groups trying to have a productive conversation based on the needs of communities.
“There’s a group called Spaceship Media, who do wonderful work, bringing people together from different political persuasions to talk about things that matter to them. And what matters to them is not ‘my politician dunked on your politician, haha’. But actually what matters is issues around safety, guns, abortion, LGBTQ, etc.”
Spaceship Media brings smaller communities of people who have very different ideas across ideological spectrums into a closed space that is safer and better for conversation, Andrew says. It frames the conversation, and then allows community members to commission journalists to find out the facts.
“No one is going to challenge their beliefs by being yelled at and insulted by someone who thinks something different. We have to find ways to come together and engage, and that might mean doing it in smaller groups and not on giant platforms.
“That might also mean separating monetisation and making money and creating power from the ways that the platforms and media organisations are set up. So a lot of our work is trying to take out the heat and create a purpose and focus and energy around coming together around these questions and learning from each other and having conversations.”
To illustrate his point, Andrew has an analogy based on local food banks. He says you could put a hundred cardboard boxes on street corners, hoping people will donate food. But people will see an empty box, assume it’s for rubbish, and throw their rubbish in there.
Or you could say you are collecting food for the food bank, and ask where that will be successful. It will probably be outside a supermarket. So, instead of taking a hundred boxes, you can just take five and go to the five biggest supermarkets in the town.
If you want to be even more successful, you will ask what food the food bank wants, Andrew explains. If it’s tins, you’ll put a table next to the box and say, “Here’s the best of what we’ve got so far. Please give more tins.” As people go in, you say, “Hi, have you seen this box?”. As they give, you thank them. “This is great and means a lot to us. Really appreciate it.” It makes people feel good, and they then want to do it again and tell others they’re doing it.
“What we’ve done with the internet is we’ve said we’ve published a hundred articles today. Here’s an empty box at the bottom of each one, put whatever you like in it, and then go, ‘Oh, these comments are terrible, it is probably because the internet’s broken. Why is technology so bad? I hate technology.’
“No, the problem is because you’ve not told people what it’s for. So it really has to come down to what is the purpose. What’s the strategy? What are we trying to do here? And then what is the best tool? Sometimes that’s not comments. Sometimes that’s not text, but figuring out first what you are trying to do and why. And then helping people understand that and reward them for doing it instead of expecting them to magically understand and figure it out.”
I ask Andrew if the issues facing social media, in particular the difficulties with X/Twitter, might create opportunities for the media to shift conversations to their own websites.
“I do think this is an opportunity to find better communities,” he says. “I think that rebuilding X/Twitter is not the right move, but building better spaces is, and I’m really happy to see we’re heading into this moment of experimentation and rethinking, because we’ve been in this space for nearly a decade now and we can do this better.”