
‘A really ugly playing field’: How tech is tackling online trolling of athletes


Abuse on social media, exemplified by reactions to Simone Biles' Olympic withdrawal, underscores a broader issue faced by athletes. Respondology's AI platform aims to combat this toxicity by discreetly removing abusive posts, offering hope for a safer online environment.

Simone Biles’ shock exit from the gymnastics team event at the Olympics last week led to the five-time Olympic medallist being both praised and pilloried on social media for prioritising her mental health.

“Choke! The great athletes do not choke and quit under pressure,” one person posted. “You wanna blame your failures on mental illness. Hahahaha you’re no GOAT f… off,” wrote another.

Biles cited the pressure of social media when she withdrew from the competition, saying “there were a couple of days when everybody tweets you and you feel the weight of the world”.

Abuse on social media is a growing issue for athletes, and not just at the Olympics. More than 12,500 racist tweets and posts were sent to three Black players on England’s football team after the side lost to Italy in the European Championship final in July.

The abuse is often racist. Australia’s eSafety commissioner, Julie Inman Grant, says her office’s work with the AFL, monitoring the abuse Australian football players cop online, found Aboriginal and Torres Strait Islander players were three times more likely to receive targeted online hate and harassment.

The research found women were much more likely to receive misogynistic abuse that was sexualised and violent. “It’s not all out there in the open,” Inman Grant says. “In fact, a lot of the abuse goes directly to the players’ [direct messages] on Instagram and other places.”

The eSafety Commission is “certainly watching” what happens at the Olympics.

“If you identify as LGBTQI, or you have a disability, you’re also three times as likely to be targeted with online harassment,” she says. “I think we can expect with the Special Olympics you will see that kind of targeting as well, or particular athletes who are openly gay, bi or trans.”

Inman Grant says social media has a lot of promise in promoting a variety of voices, “but to the extent that these voices aren’t protected, it actually results in suppression of speech and creates a really ugly playing field”.

US startup Respondology is trying to tackle the problem with a platform that uses artificial intelligence to automatically hide abusive posts containing flagged keywords or emojis.

“It removes it instantly from the public post for the troll that’s posted it,” says Matt Von der Muhll, director of Respondology for APAC. “We call it discreet removal. The trolls themselves actually still have it held on their account, so they think the post is still up, but it’s been removed for everyone else.”
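The mechanics described above can be sketched in a few lines of code. This is an illustration only: Respondology’s production system is a proprietary AI platform, and the blocklists and function names here are hypothetical stand-ins for whatever it actually uses.

```python
# Hypothetical sketch of keyword/emoji-based comment filtering and
# "discreet removal" -- not Respondology's actual implementation.

BLOCKLIST = {"choke", "quit"}       # hypothetical abusive keywords
EMOJI_BLOCKLIST = {"\U0001F40D"}    # e.g. the snake emoji, often used as abuse

def is_abusive(comment: str) -> bool:
    """Return True if the comment contains a blocked word or emoji."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    if words & BLOCKLIST:
        return True
    return any(ch in EMOJI_BLOCKLIST for ch in comment)

def public_feed(comments: list[str]) -> list[str]:
    """'Discreet removal': abusive comments are hidden from the public
    feed, while the author's own view (not modelled here) is unchanged,
    so the troll believes the post is still up."""
    return [c for c in comments if not is_abusive(c)]
```

The key design point the quote describes is that filtering happens on the *viewer* side of the feed: the offending comment is never deleted from the author’s account, only omitted from what everyone else sees.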

In the US, NASCAR uses the tool after Black drivers, in particular Bubba Wallace, were inundated with racist messages and hate speech for speaking out about displays of Confederate flags at racetracks.

Read the full article on The Sydney Morning Herald.
