The Online Safety Act 2021 was passed in June 2021. It contains an Online Content Scheme, which gives the eSafety Commissioner the power to issue ‘remedial notices’. Upon receiving such a notice, a service must either remove the content in question or ensure that it is placed behind a Restricted Access System (RAS).
You may remember that we, among other civil society and community groups, were critical of the Online Safety Act. We don’t believe that the powers included in the Act will actually achieve a safer online experience for adults and children alike. Need to catch up? You can read our explainer of the Act and why we’re concerned here, as well as our submission here.
The goal of a RAS is to restrict access to particular forms of content so that only people over the age of 18 can view it. Under the Act, the RAS will cover any content that has been, or is likely to be, classified as R18+. This includes “realistically simulated sexual activity between adults, high impact nudity, high impact violence, high impact drug use, high impact language.”
As a separate – but connected – issue, the eSafety Commissioner is also developing a ‘roadmap’ to mandatory age verification for online pornography. While the RAS will be somewhat limited to higher impact content, the scope for the age verification roadmap is much broader: it is intended to cover all forms of pornographic material.
The Office of the eSafety Commissioner is conducting a public consultation into its Restricted Access System (RAS) Declaration, as well as a call for evidence regarding the implementation of a roadmap to a mandatory age verification (AV) regime for online pornography.
What are our concerns?
- practically all approaches to implementing RAS/AV will require the provision of personal information, which creates significant privacy and security risks,
- mandatory AV may push people of all ages towards less safe and secure internet services in order to avoid providing personal information,
- the administrative burden upon smaller Australian content hosts is likely to be unreasonable,
- restrictions on sexually explicit material are likely to disproportionately harm young LGBTQ+ people, and
- most existing approaches to RAS/AV can be trivially bypassed, rendering them ineffective for the proposed objective.
The combination of these factors is likely to result in a system that is unduly invasive, creates new privacy and security risks, and is unlikely to be effective at preventing people under the age of 18 from accessing restricted content. We are concerned that the outcome will be a system that is not merely ineffective, but actively harmful.
Our recommendations:
- Rather than implementing an invasive regime for all adults in Australia, we would prefer an approach that prioritises requiring websites to label their content so that it can be easily filtered by parental control software (see the illustrative sketch after this list), as well as leveraging the ability of ISPs to filter content on specific devices used by children.
- Prioritise education and communication, rather than trying to find a technological solution. Using a technological approach to prevent young people from accessing online pornography is unlikely to mitigate the perceived harms without accompanying robust and inclusive sex education and resources for young people across all sexualities.
- The RAS should under no circumstance enable government agencies or private companies to track or link an individual’s identity with their online pornography viewing habits, or with any other ‘age inappropriate material’ they may access.
- The RAS should not be prescriptive about the measures used to limit children’s exposure to ‘age inappropriate material’; however, it should require safeguards against the use of certain approaches and technologies, including:
  - Prohibit the use of facial recognition technology or the collection of other biometric data
  - Prohibit the association of identity with online pornography viewing habits, for instance by prohibiting content providers from collecting identity documents and, conversely, prohibiting any third-party age verification service from collecting information about the content being accessed.
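To give a concrete sense of what labelling content for filtering can look like in practice, here is a minimal sketch (our own illustration, not something proposed in the consultation) of a page carrying the voluntary RTA (“Restricted To Adults”) label, which is widely recognised by parental control and safe-search software and requires no personal information from visitors. The web framework and route name are our own assumptions; the label string is the published RTA value.

```python
# Illustrative sketch only: serving a page that labels itself as adult
# content so parental control filters can block it, without collecting
# any personal information from visitors. Flask and the route name are
# our own choices; the label string is the published RTA
# ("Restricted To Adults") value recognised by common filtering software.
from flask import Flask, Response

app = Flask(__name__)

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

@app.route("/restricted")
def restricted_page() -> Response:
    # The RTA label is embedded as a meta tag in the page head, where
    # parental control software and safe-search features can read it.
    html = (
        "<!doctype html><html><head>"
        f'<meta name="RATING" content="{RTA_LABEL}">'
        "</head><body>Age-restricted content goes here.</body></html>"
    )
    return Response(html, mimetype="text/html")

if __name__ == "__main__":
    app.run()
```

The point of the sketch is that the filtering decision stays on the device configured by a parent or carer, rather than requiring every adult visitor to hand over identity documents or biometric data.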