
AI’s war on consent: Women face coded misogyny

Evidence is growing from around the world that digital platforms increasingly enable gender-based violence, with authorities slow to catch up.

Arguably, one of the reasons why most AI in its current form doesn’t serve women is because women comprise only 22 per cent of the global AI workforce (Marco Piunti/Getty Images)

Sana Yousaf, a 17-year-old Pakistani TikTok creator, was shot and killed this month by a man whose advances she had reportedly declined. With nearly a million followers, Sana’s online presence made her visible, admired, and fatally accessible. The man accused of her murder had allegedly followed her, first through screens, then streets. Her refusal, offered repeatedly, was ultimately met with what some are calling tech-facilitated gender-based violence.

Closer to home, the details of another tragedy, from October 2023, were aired earlier this year at a coronial inquest in Sydney. Paul Thijssen, a former sports coach at a prestigious private school, murdered his former girlfriend, Lilie James. The inquest heard that after the break-up, Thijssen had used Snapchat’s location-sharing feature to monitor her movements in the lead-up to the killing.

These acts are not anomalous but part of an enduring pattern in which women’s refusal of attention, advances, and demands is treated as intolerable. What is newly disquieting, however, is the infrastructural landscape that accelerates and facilitates gender-based violence.

Demonstrators holding placards and a poster of TikTok star Sana Yousaf during a protest held to condemn violence against women after she was killed for rejecting a man's proposal in Islamabad (Farooq Naeem/AFP via Getty Images)

Digital platforms today do more than mediate relationships; they are recasting the moral grammar of those relationships. The space for women to say “no” to unwanted advances has long been constrained, but Artificial Intelligence (AI), algorithmic design, and platform cultures now threaten to foreclose it entirely.

Described by some as the final stage of digital addiction, AI companions present a telling case study. Marketed as a remedy for loneliness, these systems offer users programmed acquiescence and limitless attention. It is no surprise, then, that the global market for “AI girlfriends” was valued at $2.8 billion in 2024 and is projected to exceed $9.5 billion by 2028. But there is a growing danger in the normalisation of AI companions.

A recent commentary published by Harvard Kennedy School’s Carr Center for Human Rights Policy warns that such technologies may condition users, particularly young men, to expect constant deference, passivity, and unconditional emotional availability from their partners. One has to ask: in a society shaped by AI, will resistance by women on any relationship issue come to be seen as a flaw, an error in the script?


These aren’t abstract concerns. As AI tools grow more accessible and powerful, they are already being used to simulate and violate intimacy without consent. In New Zealand, MP Laura McClure recently demonstrated in parliament how easily deepfake images can be generated: in under five minutes, she created a nude image of herself using a publicly available AI platform. As a woman in a position of power, she wanted to sound the alarm. Yet even women with power are not immune. In 2024, Fiji’s Minister for Women, Lynda Tabuya, was dismissed after intimate footage reportedly stolen from her phone was circulated online. Though she was the victim of a privacy violation, it was she who lost her position.

This shrinking room for women to simply exist without appeasing misogyny is now extending beyond the symbolic and into the structural. A recent report from the International Labour Organisation found that women’s jobs are nearly three times more vulnerable to displacement by AI than men’s, particularly in high-income economies where women are overrepresented in clerical and administrative roles. Besides amplifying conditions of harm, AI is making women economically expendable.

What connects these seemingly disparate examples, from tech-facilitated murders to artificial girlfriends to structural job losses, is the role of AI in systematically eroding women’s ability to assert boundaries.

Arguably, one of the reasons most AI in its current form doesn’t serve women is that women comprise only 22 per cent of the global AI workforce. Their presence in leadership roles is even lower, at 14 per cent. This suggests that AI design reflects narrow perspectives, inattentive to structural or ethical asymmetries. When women’s voices are absent from design, their needs are absent from the output. The result is a technological environment in which gendered harms go unaddressed and are replicated at scale.

There is growing social anxiety around these developments in Australia, evident in petitions calling for an outright ban on companion AI for minors. Not all AI is harmful, of course. As the eSafety Commissioner suggests, some bots offer legitimate support for tutoring or mental health. But where bots are designed to simulate intimacy, automate consent, and reward dominance, we must proceed with particular caution.

Australia criminalised the non-consensual sharing of sexually explicit deepfakes in 2024, certainly a welcome development, but punitive measures alone cannot reconstitute cultural norms. AI systems, whether companion bots or image-generation tools, are not neutral. They encode and reproduce the asymmetries of the societies from which they emerge. They mirror demand and marginalise dissent, rendering women’s non-consent not only inconvenient but offensive.

Whether Australia should ban AI companions or contain the growth of an AI intimacy industry is ultimately a policy decision that should be informed by wide consultation with psychologists, educators, and rights advocates. But women’s voices must be central in that conversation. What is already clear is that we have not yet secured children’s and women’s safety in the physical world. What confidence do we have that we can do better in the virtual one?

There is still time to ask difficult questions about the kinds of systems we allow to become normal and whether our governance mechanisms are strong enough to withstand their consequences. And whether, in doing so, we can reclaim space for women to refuse what makes them uncomfortable – and to do so without fear.



