Submission: Draft Restricted Access Systems Declaration


The Online Safety Act 2021 was passed in June 2021. Contained within it is an Online Content Scheme which includes the ability for the eSafety Commissioner to issue ‘remedial notices’. Upon receiving such a notice, a service must either remove the content or ensure that it is subject to a Restricted Access System (RAS).

The goal of a RAS is to restrict and control access to particular forms of content so that it is available only to those who are over the age of 18. Under the Act, the RAS will cover any ‘relevant class 2 material’, which generally means content that has been, or is likely to be, classified as R18+. This includes “realistically simulated sexual activity between adults, high impact nudity, high impact violence, high impact drug use, high impact language.”

You may remember that we, among other civil society and community groups, have been critical of the Online Safety Act. We don’t believe that the powers included in the Act will actually achieve a safer online experience for adults and children alike. Need to catch up? You can read our explainer of the Act and why we’re concerned here, as well as our submission here.

As a separate – but connected – issue, the eSafety Commissioner is also developing a ‘roadmap’ to mandatory age verification for online pornography. While the RAS will be somewhat limited to higher impact content, the scope for the age verification roadmap is much broader: it is intended to cover all forms of pornographic material.

This submission is the second we’ve made on the RAS. You can find our initial submission on the Discussion Paper here.

What are our key concerns?

We are primarily concerned with the following:

  • Practically all approaches to implementing a RAS will require the provision of personal information, which creates significant privacy and security risks. This includes the risk of linking individuals’ identity to their content consumption habits, and making services a target for malicious attacks.
  • Age verification requirements may drive people of all ages towards less safe and secure internet services in order to avoid providing personal information.
  • Most existing approaches to restricted access systems and age verification can be trivially bypassed, rendering them ineffective for the proposed objective.

The draft RAS declaration places an undefined responsibility upon industry to determine ‘reasonable steps’ to verify a user’s age. There is currently no widely accepted ‘good’ approach to implementing safe and effective age verification. In fact, most have been shown to create significant privacy and security risks.

Our Recommendations:

  1. Remove Section 7(b) from the declaration, as it is unlikely to be an effective means of harm reduction. The eSafety Commissioner, not providers, is best placed to provide such safety information to parents and guardians.
  2. As an alternative to Recommendation 1, alter the wording of Section 7(b) to clarify that providers are not required to provide safety information “to the applicant”, but to offer such information on their website for parents or guardians to access, should they seek it. The content of the safety information should be made accessible to industry in guidance developed by the eSafety Commissioner.
  3. Prioritise education and communication at meaningful and effective points in time, delivered in appropriate ways, rather than trying to find a technological solution. Using a technological approach to prevent young people from accessing online pornography is unlikely to mitigate the perceived harms without accompanying robust and inclusive sex education and resources for young people across all sexualities.
  4. Elevate the expectation that age confirmation methods should be privacy-preserving from the explanatory statement into the legislation itself. The legislation should also make explicit that age confirmation must not involve identity verification, as is currently stated only in the explanatory statement.
  5. It is appropriate for the RAS Declaration to be non-prescriptive about specific measures; however, it should require safeguards against the use of certain approaches and technologies, such as:
    • Prohibit the use of facial recognition technology or other collection of biometric data,
    • Prohibit the association of identity with online pornography viewing habits, for instance, by way of prohibiting content providers from collecting identity documents, and prohibiting third-party age verification services from collecting information regarding the content being accessed.
  6. Explicitly prohibit providers from developing a system which logs, records or otherwise documents individuals’ access to, and viewing habits of relevant class 2 material. Given the sensitive nature and value of such information, this kind of database would create significant privacy risk for individuals, and is likely to also create considerable digital security risk for providers by positioning them as targets for malicious attacks.
  7. Include explicit language in the Declaration that prohibits providers from collecting, using, storing, or disclosing personal information of visitors applying for access to relevant class 2 material that is not relevant or necessary for the purpose of providing access. 
  8. The RAS should also preclude any ability for government agencies or private companies to track or link an individual’s identity with their online pornography viewing habits, or any other ‘age inappropriate material’ they may access.

Read our full submission to the Draft RAS Declaration here.