
Bystander Intervention on Social Media: Responding to Racist Hate Speech and Cyberbullying

Melissa Brown, assistant professor of communication at Santa Clara University, discusses research examining the ways people bully others online and what bystanders can do to decrease the harmful effects of cyberbullying.
[music]
Brian Lehrer: It's The Brian Lehrer Show on WNYC. Good morning again, everyone. Now, how to intervene on social media when you see racism or cyberbullying. There are better and worse ways to intervene, depending on your situation. What we know, starting out, is that bullying and expressions of racism on social media are both distressingly common, but what's your best and proper role if you're a bystander who sees something like that? How do you actually do it?
Well, here with some answers is Melissa Brown, Assistant Professor of Communication at Santa Clara University. In her academic work on race and sociology in digital spaces, she co-authored a recent study published by the Brookings Institution. It's titled Bystander Intervention on Social Media: Examining Cyberbullying and Reactions to Systemic Racism. Thanks so much for coming on the show, Professor Brown, welcome to WNYC.
Melissa Brown: Thank you for having me.
Brian Lehrer: Listeners, we're inviting your stories now, on having been either a victim of online bullying or racism and what people could have done in your situation. Or have you been an online bystander? Have you ever intervened when you've seen expressions of racism or cyberbullying of any kind, or have you thought about intervening, but not known how? 212-433-WNYC, 212-433-9692, or tweet @BrianLehrer. Professor Brown, by way of background first, how prominent is bullying online, and who is most at risk? How is this measured?
Melissa Brown: Bullying online is an increasing phenomenon. When we compare traditional bullying to online bullying, one of the things we've learned is that certain features of technology have made it a unique phenomenon. For example, a lot of offenders of bullying online are anonymous because they don't have their profile pictures or their true identities available online.
Also, harmful content and harmful bullying can go viral, so it's not only limited to specific platforms, but can cross platforms as well and lead people to become the story on news or other platforms and go on for days at a time. There's also a tendency for users to be emotionally detached when they're cyberbullying others.
Brian Lehrer: How do race and gender come into this? That's obviously a huge issue in this field.
Melissa Brown: One of the things that we've seen so far is that the study and research on cyberbullying, when it relates to identities, covers the topic of LGBTQIA bullying or gender bullying, particularly towards women. We saw less research about race, but one of the things that led us to ask about race was what we saw in our previous studies of the Black Lives Matter Movement: the ways that people were opposing that particular movement by using racism and bullying in order to counteract the motives of those social justice users. In particular, women, LGBTQ people, and racially and ethnically marginalized people are victims of bullying online.
Brian Lehrer: Let me move quickly past the four types of bullying on social media, or four types of racism on social media. I'll mention them; I think they'll be clear to people, and you can go into them a little if you want, but I want to skip quickly to the bystander aspect. The four types of racism you identify on social media are stereotyping, scapegoating, allegations of reverse racism, and echo chambers. I think that's probably a relatively clear list to most people as to what those refer to, or some of the things that those refer to. What are some ways that bystanders can intervene online?
Melissa Brown: We found about four ways that bystanders intervene online. Some of it depends on the features of the platforms as well, and in particular, we looked at Twitter and Reddit. The first way that people intervened was that they attempted to educate people they felt were being racially offensive by relying on some type of news story, or maybe a sociological or historical report, to correct the misinformation that's often deployed in racism.
We also saw that people might call out somebody for being racist, so explicitly name that user as being racist or a white supremacist. We also saw that sometimes people engaged in insults or mocking, so they might use crude language or vulgar humor in order to shame the person they felt was being racially offensive. Then finally, depending on what type of platform you're on, there was a lot of content moderation.
For example, Reddit allows people to upvote or downvote posts, and we found that when people didn't approve of a racist message, they might downvote it. Also on Reddit, you can do things like remove a post or have a bot regulate the type of content on the platform. Those were the four major types that we saw.
Brian Lehrer: Let's go through some of those types of bystander intervention. Attempting to educate, which you mentioned, seems like a graceful and well-intentioned strategy, and in the research, users either linked to relevant sources or questioned an abuser's logic at the level of logic. Does that ever work?
Melissa Brown: Actually, yes. We found that education and evidence were the best strategies. We had two camps of strategies. There were prosocial strategies, where engaging in that type of intervention keeps social cohesion going and promotes well-being among users online. Of course, there were anti-social strategies, such as insults or mocking, or callouts, that didn't have the effect of effectively resolving the tension that was happening. In general, education and evidence was one of the best approaches that we saw: people either eventually backed down, or they stopped engaging in the racist harassment once they were confronted with some type of factual evidence to contradict their points.
Brian Lehrer: What about online spaces being largely anonymous? You write, "Social media offers the opportunity for people to mask their identity similar to the KKK hoods of the past." Are people more abusive with the digital mask? I think all of our intuition from just being on social media is that yes, a lot more people feel a lot more free to do this stuff. In that context, how can you be a bystander and intervene?
Melissa Brown: I think being a bystander who intervenes is about being mindful of who you're engaging with and how they might respond. Of course, reacting immediately might not be the best response. The visceral emotional reaction that racism sometimes inspires in you might not be the most fruitful response, because it doesn't allow people to get disconfirming information. The goal of your bystander intervention should be to identify the misinformation or the stereotype or the harm that they're perpetrating directly, and address that.
Unfortunately, even if somebody is harassing you or responding negatively to you, the best way to respond back is to remain emotionally detached and focus on educating and informing to dispel the harm. The reason that's beneficial is because, even if you don't reach that offending user, there are multiple other users who are reading what you're doing online, and you're going to be able to educate and inform silent bystanders as well.
Brian Lehrer: Let's take a phone call. Ethan on Cape Cod, you're on WNYC. Hi, Ethan?
Ethan: Hey Brian, thanks for this subject. I just wanted to report a successful interaction last week, which actually happened on LinkedIn, which I thought was normally immune from all this stuff, but there were just some crazy posts about the benefits of arming more and more people, and more and more guns being the solution. I posted something, and then someone just attacked me in a personal way. Basically, a few people said, "Leave the country."
I responded, but I responded in a private message. I said, "Let me get this straight. You're saying because I disagree with you, I should leave the country. I think that that's probably not a feature of the democracy that you seem to want to defend." His very next message back was an apology. I think my takeaway from that was, it can be more fruitful to just go one-on-one, instead of continuing the argument in public, where I think people just get more puffed-up and more invested in the argument. I addressed the guy personally and directly and politely and asked him one pointed question, and he basically apologized. He said, "Yes, you're right. I wasn't being my best self there."
Brian Lehrer: Professor Brown, a model there for others from Ethan on Cape Cod?
Melissa Brown: I would say so. I think what Ethan did pointed to a lot of issues that we saw. For example, these conversations often take place in public, and people being aware that they have an audience shapes how they interact with you. If we think about the traditional schoolyard bully, they tend to bully people in front of a large audience to get the reaction of others. Ethan taking the step of directing that to a private message and then having a polite conversation allowed him to defuse the situation and make it a one-on-one interaction, as opposed to something that was going on for show or for an audience. I definitely think that if you feel comfortable reaching out to individuals and having one-on-one conversations, that's a fruitful strategy for mitigating racism online.
Brian Lehrer: Ethan, thank you so much for that story. Listeners, who else has a story of either being bullied online, finding yourself the victim of racist expressions online or sexist expressions online, or anything else related to identity, or if you've ever been a bystander and seen such things and did intervene or wondered how to intervene? We're talking about that with Melissa Brown, who's studied it and written about it for the Brookings Institution. She teaches at Santa Clara University in California. 212-433-WNYC, 212-433-9692, or you can tweet a story and subject yourself to the potential that somebody would not be nice to you, @BrianLehrer.
Ethan's story about going behind the scenes one-on-one in a direct message instead of publicly, and getting an apology, is a great story. In your four categories of intervention strategies, that's one: attempt to educate or provide evidence. He was coming from a logic place and a one-on-one connection place. Then you have content moderation, call-outs, and insults or mocking. On call-outs, I'm sure there are times when somebody feels it wouldn't even be the right thing to do, in the larger sense, to have a quiet interaction with a cyberbully or somebody who was being racist online; that person needs to be called out publicly, because you can't let a racist expression, for example, stand online without being called out, as if there are no consequences. Talk about the pros and cons of the call-out as you see them.
Melissa Brown: I think a call-out, like all the strategies, really depends on various factors. For example, in the context of a racism echo chamber, calling out might not necessarily be effective because you're likely the only person in that echo chamber calling it out, and you actually invite tons of people to direct their ire towards you. Whereas if you do a call-out in a space where you have other bystanders or other people affirming your perspective, that can be a strategy, in that you have a collective voice calling out the racism going on, as opposed to an individual talking back to one racist. I think asking, "Who is with me as I'm making this bystander intervention?" is also an important note to take, depending on what platform you're on and how you're engaging with these individuals.
Brian Lehrer: Depending on what platform you're on is an interesting distinction. I see you've documented different kinds of behaviors on different platforms; Twitter is different than Reddit, for example. Do you want to go into those a little bit?
Melissa Brown: Sure. One of the things that we found is that the features of the platforms shape the types of strategies, as well as the types of racist discourses that happen on those platforms. For example, we noticed that when people intervened on Reddit, there was way more content moderation happening than on Twitter. The reason is that Twitter doesn't really have those types of features to allow users to monitor or shape what types of interactions actually happen.
You can, say, block somebody or mute somebody on Twitter, but that doesn't actually prevent them from saying something, whereas on Reddit, because each of those particular spaces, the subforums called subreddits, has its own moderators, moderators can institute rules and strategies that say, "Hey, we don't allow racism here. If it happens, you're immediately going to get removed, you're immediately going to get blocked." Having a cultural norm about that digital space is something that goes a long way. Another thing that-- Go ahead.
Brian Lehrer: No, you go ahead.
Melissa Brown: Yes, so another thing that we saw, for example, is on Twitter. One of the things that was very popular there, as opposed to Reddit, and maybe this goes back to the moderation as well, is that about 60% of the intervention strategies we saw on Twitter were insults and mocking. On Twitter, people are quicker to address racism they see happening through some type of insult or mocking. That might just be the Twitter culture of cracking jokes and using quote tweets and replies to address these types of situations. Even though you can get some type of emotional release from the insults and mocking, they don't have the same effect of mitigating or eliminating these types of racist events the way something like content moderation does.
Brian Lehrer: Let me read a couple of tweets that have come in. One person writes, "A big problem I found with interfering with bullying on Facebook is that reporting offensive content rarely results in removal. I've reported comments, for example, comparing black people to apes, calling trans women rapists, and they're never found to violate community standards." Another one, somebody writes, "In the ADOS community, American Descendants of Slaves, the art of the insult is a friendly pastime. Detroiters label it capping. In Chicago, it's labeled playing the dozens. I'm told that in Virginia, it's called cracking. In the black gay Chicago community, it's called reading someone." Is any of that something you're familiar with, and something that can be productive, in your opinion? And can you respond to the other tweet as well, about Facebook not taking action when you report offensive content?
Melissa Brown: Yes. In response to the issue of Facebook, I think that brings up the question of how a lot of what you can do, intervention-wise, depends on the platform affordances, or the capacity for action that the features of a certain technology allow you. One of the things that we criticize Facebook for is that it doesn't really give affordances, either at the individual level or at the corporate level, for people to intervene on what they see.
For example, it's policy on Facebook that saying something a feminist might say, like the phrase "Men are trash" to criticize patriarchy, is going to get you blocked, just like if you said something racist. The features of Facebook in themselves don't necessarily allow for recognizing the ways social hierarchies operate and create the systemic inequalities that people are trying to address through these online voice and intervention strategies.
One of our goals, of course, as researchers and policymakers, is to confront these corporations and say, "Hey, you really need to look at your terms of service. You need to look at your algorithms, how they function, and how they prevent marginalized people from using their voices and actually do more harm than good." On the question of insults and humor, I should [unintelligible 00:16:56] say that the types of insults and humor that we saw weren't comparable to the ways that black people across the United States and across the diaspora use these types of strategies of humor, the reading and the playing the dozens; those weren't quite similar to what we saw.
I think that's another factor as well: we're not saying don't be funny, and we're not saying don't poke fun at what people are saying, but we are saying that being crude and being equally violent in your response to somebody's racist offense isn't necessarily going to negate that particular offensive behavior.
Brian Lehrer: What about the other comment, that sometimes the art of the insult, especially as pushback against racism, can be what the tweeter calls a friendly pastime?
Melissa Brown: Yes. I think even in the context of ethnically uniform spaces, or ethnic enclaves as they call them online, Black Twitter being one, for example, this is a pastime. This is a strategy that black people use to call out the ways that they are marginalized on digital platforms. That looks a lot different than the types of strategies that we saw. I think that if you did want to go the humor route, black social media users kind of have the market cornered on the best ways to implement that. Be mindful that those particular cultural strategies have a long history and don't necessarily look like the types of humor strategies that we see when people are confronting users online.
Brian Lehrer: Did anything resembling a white savior complex show up in your research? Did white bystanders intervene in cases of racist hate speech with misplaced intentions or expecting themselves to wind up the hero of the story?
Melissa Brown: That's a really interesting question, and I've had a thought about that. One thing that I did notice is that a lot of times, these conversations about race and racism aren't happening across racial lines. They tend to actually be happening within groups. Oftentimes, white people, white non-racists or anti-racists, are confronting white racists, so there's not necessarily an opportunity for white saviorism to occur within that dynamic. Outside of our reporting, of course, we do see on social media that when white saviorism does try to rear its head, social media users of color are pretty quick to point out, "Hey, this is our space. Let us have a voice. You can be a listener, but please don't talk over what you're seeing online."
I think that's a good strategy as well. It's like, "If you want to be observant of the struggles and strategies that people of color are using online, be aware that you are, once again, a bystander and have to give precedence to what people of color would like to see when they are confronting racism online, as opposed to talking over and stepping over people."
Brian Lehrer: We're almost out of time, but I want to take two calls real quick, and get a last thought from you, one on the most personal kind of cyberbullying or expressions of racism online, and one about the least personal. First, Tiffany in Richfield. Tiffany, you're on WNYC. Real quick.
Tiffany: Right, yes. I love everything you're saying, but I feel like, at this point, if we're to bring up something to try to kindly tell someone, "What you're saying is actually really racist and wrong," it's so confrontational. I'll continue to try some of the strategies that you've laid out because they're so great, but we're a mixed-race family and we get stuff all the time. We're constantly trying to kindly push back, but it's gotten to the point where we've realized if we're going to kindly push back and try to draw attention to it, we're probably going to blow up that relationship. We haven't found a way to make it work.
Brian Lehrer: That's with people you know and the concern about blowing up the relationship. All the way on the other side or the other end of that spectrum, Vicky in Tudor City, you're on WNYC. Hi, Vicky. Real quick.
Vicky: [crosstalk] on Facebook, and I'm watching something streaming like now politics. Are you there?
Brian Lehrer: Yes.
Vicky: Can you hear me?
Brian Lehrer: Yes.
Vicky: Okay. There'll be sort of a cyberbullying of the democratic process, particularly because all these bot-like comments will come through, just denigrating the President or Jen Psaki, who is eminently professional.
Brian Lehrer: The Press Secretary.
Vicky: I appreciate your expert thoughts on this because it's difficult.
Brian Lehrer: Vicky, thank you very much. In our last minute, Professor Brown, we've got the racism in the personal community. Tiffany in Richfield was dealing with people she knows and can't figure out how to do it the best way. Vicky in Tudor City is dealing with bots; there may not even be people, individuals, behind those comments.
Melissa Brown: Yes. In the case of dealing with people in your personal circle, I think it's important to set boundaries for yourself. If you're in the fight against racism, you're on the right side, and you have to be mindful that in this society, where racist ideals are the default of what we're taught, unfortunately, it's going to take a very long time to change people's hearts and minds. Being patient, taking a step away, and setting boundaries, especially in the digital space, is really key and important. If you feel comfortable, muting and blocking is the best strategy, and with your own personal circle, you might say, "I would prefer to have those conversations face-to-face."
In the context of dealing with bots, I think, unfortunately, that is, once again, a topic that stakeholders such as policymakers, researchers, and the technology corporations themselves have to get a hold on. As an individual user, you might not have the resources to address those beyond reporting that you think these bots exist. Unfortunately, bots are part of disinformation campaigns, and nefarious actors are using these types of technologies to propagate them. Being aware that they exist is good, and being able to dissect when something is not from a real human source is the best strategy for addressing that as well.
Brian Lehrer: Professor Brown, communication professor at Santa Clara University and author of a study for the Brookings Institution titled Bystander Intervention on Social Media: Examining Cyberbullying and Reactions to Systemic Racism. I think it helped a lot of people navigate what they're seeing and come up with some strategies of what they might do. Thank you so much.
Melissa Brown: Thank you.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.