Algorithms Are Fueling Misogyny, Which Fuels Sexual Violence.

Algorithms. Artificial intelligence. Cloud computing. Blockchain. In a world burgeoning with technological advancements, these terms get subsumed into mainstream vernacular. And fast. Oftentimes, it happens before the average person can get a grasp on what they actually mean, and perhaps most importantly, how they shape our lives.
The term “algorithm” has no universally accepted definition, but a common one comes from the computer scientist Harold Stone: “An algorithm is a set of rules that precisely define a sequence of operations.” Think of it like a series of instructions. In machine learning, this takes on a special spin: the algorithm is a set of directions computers follow to learn from data.
This can be relatively innocuous: Has this user looked at content of water bottles before?
Nope: Okay, wrap it up. Move on to something else. Quick. Just keep them on the platform.
Yes: Show them an ad for a shiny new water bottle. That’ll get their attention. Bam. Now a matching reusable coffee cup. Get that monay!
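To make that logic concrete, here is a toy sketch in Python. Everything in it – the function name, the content labels, the single if-statement – is hypothetical; real recommender systems run machine-learned models over thousands of signals, not one rule.

```python
# A toy, hypothetical version of the water-bottle rule above.
# Real recommenders learn from thousands of signals, not one rule.

def pick_next_content(viewing_history: set[str]) -> str:
    """Choose what to show next based on past engagement."""
    if "water_bottle" in viewing_history:
        # Signal found: serve a related ad, then line up the upsell.
        return "ad:shiny_new_water_bottle"
    # No signal: move on quickly and keep the user on the platform.
    return "other_content"

print(pick_next_content({"water_bottle", "hiking"}))  # ad:shiny_new_water_bottle
print(pick_next_content({"cat_reels"}))               # other_content
```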
Algorithms are the logic that companies rely on to predict, and profit from, our online habits. Social media platforms have them, and so do your favourite online fashion retailers, streaming services like Spotify and Netflix, job platforms, and even dating apps. These platforms need to gain, and maintain, our attention to sell advertising space. Content that catches and keeps our awareness is gold. By extension, content that provokes strong emotional reactions is too. Sure, sometimes this comes in the form of cute fluffy cat reels. But the algorithm also loves fear and rage, insecurity and desire.
That is the stuff that keeps you scrolling, bloodshot and bleary-eyed, huddled under the covers until your head pounds with the echo of hundreds of voices. Strangers from the internet, pixels and provocations, some selling dreams, others peddling nightmares.
Algorithms and Misogyny. What’s the Link?
We know algorithms control what we see. They’re clever. And they want us to stay on platforms as long as possible. To keep us scrolling, algorithms amplify and promote hateful content. Why? It’s engaging. It’s zesty. Provoke them. Polarize them. That’s the strategy. We’re not going to stay on TikTok very long watching videos of bean plants sprouting or people putting stickers on jars.
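What counts as “engaging” is, in practice, a number a ranking model produces. The sketch below is a deliberately simplified, hypothetical illustration of why outrage-heavy content can beat calmer material once a feed is sorted by predicted engagement; the fields and weights are invented for the example.

```python
# Hypothetical engagement-optimised ranking. Fields and weights are
# invented; the point is that provocative content outranks calm content.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_time: float  # seconds a viewer is expected to stay
    predicted_outrage: float     # 0..1 strong-emotion signal

def engagement_score(post: Post) -> float:
    # Outrage is weighted heavily because it reliably keeps users scrolling.
    return post.predicted_watch_time * (1 + 2 * post.predicted_outrage)

feed = [
    Post("Bean plants sprouting", predicted_watch_time=12.0, predicted_outrage=0.05),
    Post("Provocative rant", predicted_watch_time=10.0, predicted_outrage=0.9),
]
feed.sort(key=engagement_score, reverse=True)
print([p.title for p in feed])  # the rant ranks first despite less watch time
```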
The algorithm also knows we’re lonely. We’re craving connection. Some young men are feeling this particularly hard. Gallup data from 2023 and 2024 found that 25% of American men between 15 and 34 felt lonely “a lot of the previous day,” which was significantly higher than 18% of young women in the same bracket. The algorithm exploits this vulnerability by feeding boys content that claims to “fix” the emptiness they feel. And it offers them a simple explanation for their isolation and disenfranchisement: blame women and feminism.
This isn’t just a theory born from moral panic. There’s a growing body of research which explores the link between algorithms and misogynistic content.
- In a 2024 study, a group of researchers interviewed young people who were creating and interacting with extreme online content. Based on this, they made dummy TikTok accounts for archetypes representing teenage boys vulnerable to online radicalisation. Some of the personas were seeking content on masculinity. Others were lonely, looking for connection. TikTok worked its magic and began suggesting videos to the fake accounts. The researchers tracked the content it pushed for seven days. After five days, the algorithm was showing four times as many videos with misogynistic content, including objectification, sexual harassment and the discrediting of women.
- Reset Australia examined how YouTube’s algorithms promoted misogynistic and anti-feminist content to boys and young men. In a 2022 study, researchers created 10 experimental accounts and analysed their recommended content. Four accounts represented boys under 18; another four were modelled on men over 18. For these profiles, dummy users engaged with varied ideological content, ranging from mainstream to extreme. The last two avatars were controls that didn’t seek out any particular content – they just followed YouTube’s recommendations. The algorithm pushed videos with misogynistic views to all 10 accounts within hours of their creation, including content that criticised feminism and abortion.
- In 2024, researchers from Dublin City University made 10 experimental accounts mimicking 16 to 18-year-old boys: five on TikTok and five on YouTube Shorts. Within the first 23 minutes of the experiment, all accounts were pushed anti-feminist content, whether they were seeking male-supremacist views or not. Once an account spent time engaging with this kind of content, the amount shown to it increased rapidly. After two to three hours of viewing, the majority of recommended videos (76% on TikTok and 78% on YouTube Shorts) were anti-feminist. Creators argued against equality and encouraged the submission of women. Andrew Tate was the most prevalent manosphere influencer; his monologues popped up 582 times for the YouTube Shorts accounts, and 93 times on the TikTok ones.
Algorithms are powerful creatures. Even though most young men are not looking for anti-feminist content, the invisible code behind our screens seems hellbent on making sure it finds them. Sometimes within mere minutes. So the algorithm marches forwards, fulfilling its mission, muttering its mantra: more screen time.
The creators peddling misogynistic views online also have profit in mind. They want to enrage and engage their audiences. Build up cult followings. Solidify empires. Andrew Tate’s platform The Real World advertises memberships like CONQUER for $996 a year, so you can “join thousands crushing their competition and building enemies.” The platform, supposedly a space for 113,000+ “like-minded individuals” to “work towards personal growth,” offers courses in topics as varied as crypto investing and copywriting.
Anti-feminist creators often first capture the attention of young men through benign content about physical health, entrepreneurship and money. When users interact with these creators, the algorithm reads that engagement as a positive signal and promotes similar content from them and related accounts, creating echo chambers. The result is dangerous: users can get trapped in feedback loops where initial exposure to less harmful material leads to a push of increasingly extreme misogyny. Perhaps young men begin to think that these views are normal. That everyone shares these opinions: feminism is a disease of the modern age, rape jokes are funny, and women belong at home.
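That loop can be sketched as a tiny, purely hypothetical simulation: every time the user engages, the recommender treats it as approval and drifts toward more extreme material. Nothing below reflects any real platform’s code.

```python
# Hypothetical simulation of the feedback loop described above: each
# positive interaction nudges the recommender toward more extreme content.

import random

def recommend(extremity_bias: float) -> float:
    """Return a content item's 'extremity' (0 = benign, 1 = extreme)."""
    return min(1.0, max(0.0, random.gauss(extremity_bias, 0.1)))

bias = 0.1  # the user starts on benign fitness-and-finance content
for step in range(10):
    item = recommend(bias)
    engaged = random.random() < 0.5 + 0.4 * item  # provocative content engages more
    if engaged:
        # Engagement read as approval: drift the recommendation bias upward.
        bias = min(1.0, 0.8 * bias + 0.2 * item + 0.05)
    print(f"step {step}: shown extremity {item:.2f}, bias now {bias:.2f}")
```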
How Does This Promote Violence Against Women In "The Real World"?
People can gradually and subconsciously adopt the views they’ve consumed online. Last year, the ABC reported on the way misogynistic content has made its way off screens and into the classroom. A female teacher with decades of experience had begun to notice a shift in the behaviour of young boys over the last few years.
“I’ve been told to ‘f**k off, bitch’... I’ve been told to ‘shut up, bitch’. Imagine standing in front of a class full of high school students and having that kind of abuse hurled at you.”
While it’s hard to draw a direct line between these sexist outbursts and the rise of algorithms pushing anti-feminist videos, there is a growing body of research on how manosphere content and everyday misogyny are connected.
Dr Stephanie Wescott examines how creators from the manosphere shape behaviour in schoolyards. She spoke to 30 female teachers, and her findings emphasise the role of online content in galvanising sexist behaviour.
“I think many parents would be shocked not only about what their boys are seeing online, but also how it is informing their attitudes towards girls and women and how they’re expressing those things…They’re taught that feminism has taken power from men, and that women are now their oppressors… and so they actually feel slighted by women and angry at them.”
One student came up to a teacher and said, ‘Miss, your boobs look really good today.’ Another boy spat in his teacher’s water bottle. One teacher reported a student saying, ‘Why are the girls here? They don’t need an education. They can just make an OnlyFans account.’
Dr Wescott and her team found that the rise in such sexist behaviour coincided with the return from online classes following COVID restrictions. Their hypothesis is that the increased screen time young Australian boys had during the pandemic played a role in the surge of misogynistic outbursts at school once things went back to normal.
In Australia, there has been a rise in teen-on-teen sexual violence, despite our best prevention efforts. New statistics from the ABS show that in every state and territory there are, on average, 10% more victim-survivors of sexual assault than in previous years. This coincides with a decrease in the reporting rate, meaning actual rates are likely even higher.
The algorithm is just one force possibly contributing to the rise of sexual violence, yet it’s a powerful beast to contend with. Gazbiah Sans, a Preventing Violent Extremism expert at the European Commission, highlights cases like Elliot Rodger and Alek Minassian – they exemplify the sheer power and shattering consequences of unchecked online misogyny. Summing up the link between virtual rabbit holes and radicalisation, Sans concludes, “platform algorithms that optimize for engagement fuel exposure to this content and deepen ideological entrenchment.”
Algorithms Are Clever But So Are We. What Can We Do Together?
Regulating algorithms is one way to help curb the rise in misogyny. This is already being done in the EU. Adopted in 2022, the Digital Services Act (DSA) creates rules aimed at making online platforms safer. Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) – those with over 45 million average monthly users in the EU – must explain how their content-recommendation algorithms work and give users the option to turn off profiling-based recommendations. VLOPs must also assess and mitigate risks their algorithms may pose, such as disinformation or harmful impacts on health.
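As an illustration of what the profiling opt-out means mechanically, the sketch below shows a hypothetical feed service with a DSA-style toggle. The function and field names are invented; the DSA imposes legal obligations on platforms, not a concrete API.

```python
# Hypothetical profiling opt-out in a feed service. Names are invented;
# the DSA specifies obligations for platforms, not an API.

from datetime import datetime

def rank_feed(posts: list[dict], profiling_enabled: bool) -> list[dict]:
    if not profiling_enabled:
        # Non-profiled fallback: reverse-chronological, the same for everyone.
        return sorted(posts, key=lambda p: p["published_at"], reverse=True)
    # Profiled path: order by engagement predicted from the user's history.
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

feed = [
    {"title": "A", "published_at": datetime(2024, 5, 1), "predicted_engagement": 0.9},
    {"title": "B", "published_at": datetime(2024, 5, 2), "predicted_engagement": 0.2},
]
print([p["title"] for p in rank_feed(feed, profiling_enabled=False)])  # ['B', 'A']
print([p["title"] for p in rank_feed(feed, profiling_enabled=True)])   # ['A', 'B']
```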
We can’t sit by while invisible algorithms stir up hateful views and promote violence on and off our screens. Our approach must be preventative and proactive, combining the power of governments, tech firms and civil society. Victims of sexual assault and harassment deserve better. Young men being targeted and led down these spurious rabbit holes deserve better too.
Help us get these evidence-based, youth-led resources into high schools around Australia.