Last week, organizations from Australia’s online industries submitted to the eSafety commissioner a final draft of new industry codes aimed at protecting children from “age-inappropriate content.”
The commissioner will now decide if the codes are appropriate to be implemented under the Online Safety Act.
The codes aim to address young people’s access to pornography, high-impact violence, and material relating to self-harm, suicide and disordered eating.
However, the draft codes may have unintended consequences. There is a real risk they may further restrict access to materials about sex education, sexual health information, harm reduction and health promotion.
Social media can operate as a powerful medium for delivering sexual health information to teens and young people.
Social media campaigns (some government-funded) target rising rates of sexual violence. They also disseminate important sexual health information.
What are the industry codes?
The eSafety commissioner is in the process of introducing codes of practice for the online industry “to protect Australians from illegal and restricted online content.” The Phase 1 codes, aimed at illegal content such as child sexual exploitation material, came into effect last year.
Now the commissioner is looking at Phase 2. These are designed to prevent young people from accessing “inappropriate” but not illegal content. They will do this via age-assurance mechanisms and by filtering, deprioritizing, downranking and suppressing content.
The codes will apply to operating systems, various internet services, search engines and hardware, such as smartphones and tablets.
Tech companies will have more power (and responsibility) to remove content and suspend users. Companies that don’t follow the codes risk fines of up to A$49.5 million.
Suppression of sexual health content
The idea of using technology to restrict online content by age is problematic. The Australian government itself has deemed that age-assurance technologies are not ready to be used. State-of-the-art software has shown racial and gendered bias.
And digital platforms have a poor track record of governing sexual media.
International human rights organizations, including the United Nations, have warned that automated content moderation is being used to censor sex education and consensual sexual expression.
Research shows many platforms tend to remove or suppress content about drag queens, trans rights, sexual racism, body positivity and sex worker safety.
At the same time, they allow health misinformation and hate speech directed at LGBTQ+ people.
Sexual health organizations and educators already face challenges in using social media to communicate with key audiences, including LGBTQ+ communities. These include having their content made less visible (“shadowbanning”) or outright removed.
Unintended consequences
Content moderation policies are already very restrictive. To enforce them, platforms use nudity and pornography detection software that is often biased toward heteronormative standards.
For example, Google’s computer vision software has previously relied on word databases that link “bisexuality” with “pornography,” “sodomy” with “bestiality,” and “masturbation” with “self-abuse.”
Many users currently use “algospeak”: language designed to evade the algorithms that may flag content as inappropriate, often involving tweaks such as substituting emojis, or writing “seggs” or “s&x” instead of “sex.”
The government recognizes the power of social media. It has committed more than A$100 million towards Our Watch (a leading organization advocating against violence against women) and its teen-focused social media initiative The Line.
Another A$3.5 million has gone to the Teach Us Consent organization. This group creates social media content for teens and young people about consent, healthy relationships, pornography and sex.
Like the looming youth social media ban, the proposed industry codes may undermine the government’s own efforts to reduce gender-based violence.
Sex education and health promotion
Social media platforms try to separate health information from general sexual content. For example, they may aim to allow nudity in cases like childbirth, breastfeeding, medical care or protests.
However, evidence suggests these exceptions are currently almost impossible to moderate accurately. They rely on a distinction between sex education and sexual media that is blurry at best.
In reality, sexuality education is not simply technical information about infections, sexual dysfunction or medical care. Sexual imagery plays an important role in sexual health promotion. Young people respond well to visual methods of communication and learning.
Likewise, the importance of pleasure has long been recognized in HIV prevention, safer sex and violence prevention efforts. Industry codes should recognize sexual media as a potential medium for conducting sex education and promoting sexual and reproductive rights.
Governments in many countries are moving to restrict sexual information and health services. This includes efforts to criminalize abortion, limit access to trans health care and prevent comprehensive sex education.
In this context, access to online health promotion and sex education content is even more vital.
Ensuring access to sexual health material
The industry codes are intended to protect children and young people. However, they risk endangering the ability of Australians to access essential information.
This is especially important for the many young people who do not have access to comprehensive sexuality and reproductive health information at home or school.
To uphold sexual rights to information, privacy and expression, the codes must shift away from simply giving platforms an incentive to detect and suppress all sexual content.
Instead, the codes should ensure non-discriminatory access and require platforms to promote material that supports sexual health, rights and justice. In practice, this necessitates careful consideration of content in context.
This task might seem time-consuming, resource-heavy and difficult for regulators and platforms alike. But the implications of content suppression are too dire to overlook.
In our view, the codes should be paused until they are able to balance protection with rights to information.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Citation:
Sexual health info online is crucial for teens: Australia’s new tech codes may threaten their access (2025, May 29)
retrieved 29 May 2025
from https://techxplore.com/news/2025-05-sexual-health-info-online-crucial.html