The social media feeds that once connected us are now driving us apart. Social media algorithms are flooding young men’s feeds with radical misogynistic content, inciting real-world harm.

We’re calling on the Australian Government to act and introduce an opt-in feature for social media algorithms, so we can bring affirmative consent to our screens and turn our feeds on and off at will.

Systematic radicalisation
It takes just 23 minutes for a social media account mimicking a 16-to-18-year-old boy to be fed misogynistic content, regardless of the account’s viewing preferences.
23 minutes
Misogynistic content is rife
73% of Gen Z social media users have seen misogynistic content online, and 70% believe misogynistic language and content are increasing, a figure that rises to 80% among women.
73%
Sexual violence is increasing
Reported sexual assaults in Australia increased by 10% in the past year, even as the overall reporting rate declined.
10% increase
Our Signatories
Dr Zac Seidler, Global Director of Movember
Jess Hill, Industry Professor at the University of Technology Sydney
Daniel Principe, Youth Advocate and Educator
Dr Joanne Gray, University of Sydney
Ben Vasiliou, CEO of The Man Cave
Professor Michael Flood, Queensland University of Technology
Jim Hungerford, CEO of the Butterfly Foundation
Dr Joy Townsend, CEO of Learning Consent
Benjamin Law, Writer and Broadcaster
Professor Ben Mathews, Queensland University of Technology 
Nicole Yade, CEO of Women’s and Girls’ Emergency Centre
Zahra Al Hilaly, UN Women’s Generation Equality Youth Task Force
Blake Pavey, Comedian and Content Creator
Zoë Foster Blake, Author and Entrepreneur
Gina Martin, Gender Equality Activist
Charlotte Mortlock, Executive Director of Hilma's Network
Yvonne Weldon AM, NSW Aboriginal Woman of the Year (2022)
Susanne Legena, CEO of Plan International Australia
Maree Crabbe, Director at It’s Time We Talked
Melissa Abu-Gazaleh, Founder and Managing Director of Top Blokes
Michelle Ryan, Director of the Global Institute for Women's Leadership
Wendy McCarthy AC, Businesswoman, Activist and Author
Jack Toohey, Author and Activist
Meredith Turnbull, Principal at VOX FEMINA

Got questions? We’ve got answers.

(Q)

What’s the problem with algorithms as they are now?

(A)

Many of us enjoy our feeds. They can take us to niche communities with people who like the same music, TV shows, sport, art and more. But alongside those positive connections, something far more dangerous is happening. The same algorithms that curate our favourite content are being weaponised. Research shows that boys and men are exposed to extreme misogynistic material within minutes of signing up to social media platforms, and that these algorithms are engineered to exploit their fears and insecurities. This isn’t a small problem: misogynistic attitudes are linked to support for violence against women.

(Q)

Is gender-based violence in Australia getting worse?

(A)

Yes, Australia’s national crisis of gender-based violence is worsening. Teenage boys are now the most likely perpetrators of child sexual abuse, the number of reported sexual assault victim-survivors has reached an all-time high, image-based abuse is at record levels, and sexism and harassment in schools are growing at alarming rates. To address the factors fuelling this crisis, the Government should better regulate the algorithms promoting extreme misogynistic ideology to young men and boys. At the very least, we should be given a choice: to opt in, and to turn our algorithms on and off at will.

(Q)

Why would social media platforms promote misogynistic content?

(A)

Social media platforms are designed to keep us online for as long as possible. The longer we scroll, the more advertisements we see and the more money the platforms make. To achieve this, platforms use algorithms that prioritise the content most likely to capture and hold our attention. The result? Content that sparks strong emotions (outrage, fear, excitement, anger) spreads the fastest. The more extreme the reaction, the more “successful” the post becomes. Extreme misogynistic content often provokes exactly those reactions. That’s why it’s amplified.
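
To make that incentive concrete, here is a minimal sketch of how an engagement-maximising ranker behaves. The posts and the predicted_engagement scores are invented for illustration; real platforms use complex machine-learned models, but the incentive is the same.

```python
# A toy engagement-driven feed: posts are ranked purely by how likely
# the platform predicts you are to react to them.

posts = [
    {"id": 1, "topic": "music",   "predicted_engagement": 0.12},
    {"id": 2, "topic": "outrage", "predicted_engagement": 0.87},  # provocative content scores highest
    {"id": 3, "topic": "sport",   "predicted_engagement": 0.35},
]

# The feed is simply the posts sorted by predicted engagement,
# so the most provocative post rises to the top.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
for post in feed:
    print(post["id"], post["topic"])
```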

(Q)

What if I don’t want to turn my algorithm off?

(A)

You can still support this policy and opt in to an algorithmically driven social media feed yourself. Our intent is to give Australians options.

(Q)

My feed isn’t misogynistic. Does that mean it’s not harmful?

(A)

Unfortunately, social media algorithms circulate many forms of harmful content. Your feed may not be misogynistic, but you may still want to turn off your algorithm to avoid exposure to eating-disorder or self-harm content, mis/disinformation and other forms of hateful content, or simply to give yourself a break from the doom-scroll. And if you love your feed, turning it off remains a helpful option whenever you want to spend less time on your phone.

(Q)

If I turn the algorithm on for one platform, does that mean I have to have it on for all?

(A)

No. This policy is all about choice, informed consent and safety. Some of us may experience more harm on some platforms than on others. So, if you find one of your feeds less harmful, you can opt in and scroll on. But if you’re seeing content you think is harmful, you can choose to turn it off.

(Q)

What will Instagram’s Explore page or TikTok’s For You page look like if I turn it off?

(A)

In the European Union, where similar policies are in place, Explore-style pages still show content as usual, but instead of being personalised to you, they show what is generally popular at the time.
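
As an illustration, “turning the algorithm off” could mean the feed falls back from personalised ranking to general popularity, along these lines. The data and the two ranking functions are invented for this sketch and don’t reflect any platform’s actual implementation.

```python
# A sketch of personalised vs non-personalised ranking.

def personalised_rank(posts, user_interests):
    # Personalised: posts matching your inferred interests first, then by views.
    return sorted(posts, key=lambda p: (p["topic"] in user_interests, p["views"]), reverse=True)

def popular_rank(posts):
    # Non-personalised: just what is generally popular right now.
    return sorted(posts, key=lambda p: p["views"], reverse=True)

posts = [
    {"topic": "gym", "views": 1000},
    {"topic": "cooking", "views": 9000},
    {"topic": "music", "views": 4000},
]

algorithm_on = False  # the user has opted out of personalisation
feed = personalised_rank(posts, {"gym"}) if algorithm_on else popular_rank(posts)
print([p["topic"] for p in feed])  # ['cooking', 'music', 'gym']
```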

(Q)

Can I change my mind about my algorithm at any time?

(A)

Yes. This policy is about choice, informed and ongoing consent and, ultimately, making social media safer by design.

(Q)

How will this help to reduce misogynistic radicalisation?

(A)

This policy will stop the “accidental” radicalisation of young men and boys. It acknowledges that young men and boys often aren’t actively seeking out misogynistic content: it’s being targeted at them. If the default when you download a platform is to have the algorithm turned off, we hope fewer people will be subjected to this content. Parents could also use this tool to create a softer landing for their children on social media and to reduce the risk of harm. It’s not a silver bullet, but it will make online platforms safer and reduce exposure to harmful content of all kinds.

(Q)

Is regulating this difficult to do?

(A)

Technically speaking, no. These are rules-driven, protocol-based businesses, so the usual implementation hurdles don’t apply: the rules can be changed easily. And before algorithms, there were chronological feeds, so this is really just returning to that original protocol.
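
For a sense of how simple that original protocol is, here is a sketch of a chronological feed, with invented posts: no engagement prediction and no profiling, just time order.

```python
# A chronological feed: the posts from accounts you follow, newest first.
from datetime import datetime

posts = [
    {"author": "friend_a", "posted_at": datetime(2024, 5, 1, 9, 30)},
    {"author": "friend_b", "posted_at": datetime(2024, 5, 1, 11, 15)},
    {"author": "friend_c", "posted_at": datetime(2024, 5, 1, 8, 5)},
]

feed = sorted(posts, key=lambda p: p["posted_at"], reverse=True)
for post in feed:
    print(post["posted_at"], post["author"])
```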

(Q)

How do algorithms and recommender systems work?

(A)

Recommender systems prioritise content or make personalised content suggestions to users. They’re driven by underlying algorithms.
Social media algorithms are collections of rules, signals and data that govern how content is filtered, ranked, selected and recommended to users on social media platforms. They determine the priority and display order of content for each user based on various factors, including user behaviour and engagement. Problems arise when these algorithms create concentrated neighbourhoods of harmful content.
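
As a rough illustration of that filter, rank and recommend pipeline, here is a toy sketch. The signals, weights and data are all invented; real systems use machine-learned models over thousands of signals, but the stages are the same.

```python
# A toy recommender pipeline: filter -> rank -> recommend.

candidate_posts = [
    {"id": 1, "likes": 50,  "shares": 5,  "watch_time": 12.0, "blocked_author": False},
    {"id": 2, "likes": 900, "shares": 80, "watch_time": 45.0, "blocked_author": False},
    {"id": 3, "likes": 10,  "shares": 1,  "watch_time": 3.0,  "blocked_author": True},
]

# 1. Filter: drop content the rules exclude (e.g. posts from blocked accounts).
eligible = [p for p in candidate_posts if not p["blocked_author"]]

# 2. Rank: score each post from engagement signals (weights are hypothetical).
def score(post):
    return 1.0 * post["likes"] + 5.0 * post["shares"] + 2.0 * post["watch_time"]

ranked = sorted(eligible, key=score, reverse=True)

# 3. Recommend: the top of the ranking becomes the user's feed.
print([p["id"] for p in ranked])  # [2, 1]
```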

(Q)

Can’t you already opt-out of personalised feeds on some social media apps?

(A)

Currently, some apps let you customise your feed in different ways, and some give you the option to turn off personalised feeds. But they don’t go all the way: personalised feeds remain the default, feeds often revert to personalisation once you leave the app, and there’s no legislation requiring platforms to offer an opt-in, so most don’t make it the default.

(Q)

Will this change how advertising works?

(A)

If you turn off your algorithm, you can still be advertised to. However, the ads will be less personalised and targeted.

#fixourfeeds

The scroll ends here. You’ve consumed all the factual information you need to know.