How Do Social Media Algorithms Impact LGBTQ People?
At a recent School of Data Science event, four panelists discussed how social media can exacerbate transphobia, homophobia, and other types of discrimination.
On social media platforms like TikTok, users can easily disseminate information to millions of people daily. Thanks to that reach, transphobia, misinformation, and hate speech have spread rapidly in these spaces.
At the same time, extremism and discrimination aren’t new. At a recent panel organized by the School of Data Science, professor Jess Reia, who uses they/them pronouns, emphasized this. “Social media platforms reproduce inequalities that already exist in society,” they said.
The event, titled “Every Data Point Tells a Story,” brought together four data science researchers and practitioners to discuss the intersection between data science, social media, and the visibility and safety of LGBTQ individuals.
The speakers focused in particular on the algorithms behind features such as TikTok’s “For You” page, which might seem innocuous but in fact keep individuals in a bubble and often push them toward extremism.
“I think the misperception is that algorithms are neutral,” said Kelsey Campbell, founder of the LGBTQ data analytics organization Gayta Science. “But the reality is, the disinformation that’s being spread and the way [these algorithms are] funneling people to more extreme content is very deliberate.”
Disinformation researcher Kayla Gogarty described a study that she co-authored, which looked at how quickly TikTok directs users to hate speech and far-right content. The researchers found that when a person engaged with transphobic content, the platform recommended not only more transphobic content, but also content that was homophobic, misogynistic, and racist — and directed users to vaccine misinformation as well.
“We see people gravitate towards the same sort of content within echo chambers, once they’re started down that rabbit hole,” said Gogarty, who serves as a research director at Media Matters, an organization that monitors and combats right-wing misinformation.
Social platforms also thrive on sensational media, and many right-wing groups capitalize on that fact. This makes social media sites uniquely harmful to LGBTQ people, people of color, and other disadvantaged groups. Although we know that algorithms use our geographic location, said Francesca Tripodi, assistant professor in the School of Information and Library Science at the University of North Carolina, Chapel Hill, we don’t always consider that they also take in things like our search terms, which can reflect personal and political biases.
“Algorithms take into account those starting words,” she said. “What are the first things that you type in? And how does that then drive returns, as well as who else you're connected with online?”
Still, even with their nefarious potential, Reia noted that social media platforms can provide queer communities with “support they cannot have physically,” particularly in small towns. “These are spaces that can also be positive, despite all these other problems that we mentioned,” they added.
The speakers emphasized the need for more discussions about ways to improve social media. While de-platforming and demonetizing can be effective for putting pressure on sites that provide spaces for hate speech, such an approach likely won’t lead to major shifts when it comes to corporations like TikTok, Facebook, and Google, which are used by millions of people and have become deeply embedded in society.
Regardless of what the future brings, the speakers agreed that the data scientists who work on social media algorithms need to understand how those systems are shaping the world around them.
“I see a long way ahead, a bumpy road for sure,” Reia said. “Right now, it’s a lot of mitigation and damage control. But I really hope in a few years from now we can see a field that is more responsible and accountable to the public.”