Explainer: Basic Online Safety Expectations

The Online Safety Act 2021 passed in June this year and comes into effect on 23 January 2022. Under Part 4 of the legislation, the Minister has the power to determine something called the Basic Online Safety Expectations (BOSE).

Curious how this fits within the broader Online Safety Act? You can read our initial explainer here.

The draft BOSE declaration is currently under public consultation, and submissions are due on 12 November 2021. Here is an overview of the key issues we’ve identified and how we propose addressing them. Please use this page and our submission as your own cheat sheet if you are writing one! You can also forward it to your MP.

What’s included in the BOSE?

In essence, it outlines the government’s expectations for social media services, relevant electronic services, and designated internet services. 

Taken together, these categories cover essentially any service and any type of content on the internet. That of course includes social media, but it also extends to private messages and email; in effect, any electronic communication service is covered.

The expectations require services to:

  • Ensure that end-users are able to use the service in a safe manner (core expectation 6.1) 
  • Minimise the provision of any of the following: cyber-bullying material targeted at an Australian child, cyber-abuse material targeted at an Australian adult, non-consensually shared intimate images, Class 1 Material, and material that promotes, incites, instructs in, or depicts abhorrent violent conduct (core expectation 11)
  • Prevent children from accessing Class 2 Material (core expectation 12.1)
  • Proactively minimise material that may be unlawful or harmful (additional expectation 6.2) 
  • Prevent anonymous accounts being used to deal with material that is or may be unlawful or harmful (additional expectation 9.1) 
  • Develop and implement processes to detect and address material that is or may be unlawful or harmful if the service uses encryption (additional expectation 8) 

There are also expectations about providing mechanisms to report and make complaints about breaches of a service’s terms of use, having policies in place, and cooperating with providers of other services. We do not consider these harmful to digital freedoms, so we have not covered them here.

You can read our full joint submission with Global Partners Digital on the BOSE here.

What are our key concerns?

There are aspects of the draft BOSE that, if adopted in their current form, are likely to pose risks to individuals’ privacy, security, and freedom of expression, and to breach Australia’s international human rights obligations.

Based on previous experience and research, we fear that the BOSE will:

  1. incentivise proactive, automated, and blanket removal of content,
  2. push services to prevent people interacting anonymously, and 
  3. force services to undermine encryption. 

Proactive, automated monitoring and removal of content 

Additional expectation – “The provider of the service will take reasonable steps to proactively minimise the extent to which material or activity on the service is or may be unlawful or harmful.”

Given the scale and scope of content that is generated and shared online, platforms generally turn to automated processes, including AI, in order to perform some level of content moderation. We anticipate that this expectation will incentivise platforms to increase their use of such tools, to proactively monitor and remove content. 

Unfortunately, automated content moderation has been shown time and time again to be ineffective and to disproportionately impact some groups over others, penalising Black, Indigenous, fat, and LGBTQ+ people. Automated tools have also proven poor at dealing with hate speech: they tend to work better on visual content than on text, making them less effective at identifying forms of abuse and bullying expressed in words. As Zuckerberg said in 2018, it’s ‘easier to detect a nipple than hate speech with AI’, and anyone who has spent much time online has likely seen this in practice.

We are concerned that this expectation is likely to result in discriminatory implementation and an overall increase in blanket removal of content that is neither unlawful nor harmful, but is swept up in services’ efforts to meet this expectation.
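
To make the over-blocking concern concrete, here is a minimal, entirely hypothetical sketch in Python of the kind of naive keyword filtering that blanket automated moderation can reduce to. The blocklist terms and example posts are made up for illustration; no real platform’s system is this simple, but the context-blindness is the same failure mode at any scale.

```python
# A hypothetical, naive keyword-based moderation filter (illustration only,
# not any platform's real system).
BLOCKLIST = {"attack", "kill"}  # made-up terms a filter might target

def flag(post: str) -> bool:
    """Flag a post if any blocklisted term appears, with no regard for context."""
    words = set(post.lower().split())
    return bool(words & BLOCKLIST)  # non-empty intersection means a match

print(flag("I will attack you tonight"))             # True: the intended catch
print(flag("heart attack survivors support group"))  # True: harmless, but removed
print(flag("y0u deserve to d1e"))                    # False: abuse trivially evades it
```

The same context-blindness that removes a support-group post also misses obvious evasions, which is why scaling this approach up tends to produce exactly the over-broad, discriminatory removal described above.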

Threatening anonymity 

Additional expectation – “If the service permits the use of anonymous accounts, the provider of the service will take reasonable steps to prevent those accounts being used to deal with material, or for activity, that is or may be unlawful or harmful.”

This expectation is supplemented with two examples of ‘reasonable steps’: (a) preventing people from repeatedly using anonymous accounts, and (b) having processes that require verification of identity.

This expectation, in combination with these examples of reasonable steps, threatens people’s ability to be anonymous online. We won’t go into more detail here, but anonymity online is extremely important for a range of reasons; we wrote about it here.

Undermining encryption 

Additional expectation – “If the service uses encryption, the provider of the service will take reasonable steps to develop and implement processes to detect and address material or activity on the service that is or may be unlawful or harmful.”

We are extremely concerned by the inclusion of this expectation, as it stands to push services to undermine encryption, which is essential for our individual and collective digital security. The expectation frames encryption as an inhibitor of safety, which runs counter to the broad consensus in the cybersecurity industry that encryption is vital to safety.

Encryption supports the security of our online activities: it protects data from cybercriminals, enables secure online transactions, and maintains the privacy and security of our online communications, including those of children. For example, encryption plays a crucial role in preventing malicious actors from accessing networked devices, such as tapping into users’ webcams or baby monitors. This additional expectation would potentially undermine the security of Australians’ encrypted services, jeopardising the safety of the millions of people who rely on them each day.
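
To see why ‘detecting material’ on an encrypted service is in tension with encryption itself, here is a minimal sketch in Python using the open-source cryptography library. The key and message are made up for illustration.

```python
# A minimal sketch of symmetric encryption with the `cryptography` library
# (pip install cryptography). The key and message are illustrative only.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()  # secret held by the communicating parties
ciphertext = Fernet(key).encrypt(b"running late, see you at 3pm")

print(ciphertext)  # opaque bytes: reveals nothing about the content

# Anyone without the key (the service provider, an attacker) cannot read it:
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("unreadable without the correct key")

# Only the key holder recovers the plaintext:
print(Fernet(key).decrypt(ciphertext))  # b'running late, see you at 3pm'
```

For a provider to ‘detect and address’ the message above, it would need access to the key or the plaintext, which is precisely what strong encryption is designed to withhold.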

Want to know more about why encryption is so important? Check out this expert session we hosted for Encryption Day.

Want to write your own submission? Here are some tips! 

  • If you’re feeling overwhelmed or not sure where to start, or would like a hype team to get you inspired to write a submission, check out this online workshop we hosted with Electronic Frontiers Australia on how to write your own policy submission. 
  • Remember that your submission doesn’t need to be long or complicated. It’s okay to make a short, to-the-point submission; in fact, it’s great! Even one page is enough!
  • Note that the Online Safety Act prescribes the core expectations included in the BOSE Determination, and these cannot be amended. But you can still suggest that additional expectations be included to add context or safeguards around the core expectations. 
  • Feel free to take inspiration from our submission (that’s why we’re sharing it!) but remember that your submission will be more powerful if you don’t copy-paste. Take the ideas, and make them your own! Including your own personal experience or personal concerns will have the most impact.
  • You can reference our submission in your own to add additional support. For example, you can include a sentence like: “I would like to extend my support to the submission made by Digital Rights Watch”. You can also reference other organisations.
  • Really stuck? Get in touch with us, we’d love to help as best we can. 

Looking for more? Here is a handful of our work in this space…