The Online Safety Act 2021 (Cth) ('the Act') was passed on 23 June 2021 and commenced on 23 January 2022. The Federal Government introduced the legislation in a bid to strengthen industry standards and resolve the gaps in Australia's existing online safety system. As a result of the COVID-19 pandemic, online interactions have become an increasingly relied-upon part of everyday life, particularly in the way people socialise, work, learn and enjoy entertainment. This increased usage of online platforms exposes users to a range of privacy and safety issues of which they are often unaware.

The Act allows the eSafety Commissioner ('Commissioner'), Julie Inman Grant, to assess complaints relating to cyber abuse, image-based abuse and cyberbullying. While the Act attempts to eliminate the harm caused to users, it has drawn criticism for its wide scope of application and the consequences of misuse that may arise as a result.

What does the Act enforce?

The Act has a broad scope, allowing the Commissioner to deal with the removal of harmful online material relating to Australian children and adults, the sharing of intimate images without consent, abhorrent violent material and harmful online content. Harmful online content is divided into Class 1 material, which covers content that offends against the standards of morality, decency and propriety (broadly, material that would be refused classification), and Class 2 material, which covers content that would be classified R 18+ or X 18+.

The Act upholds a high threshold for cyber-abuse. Under s 7, the material must be intended to cause 'serious harm' and be 'menacing, harassing or offensive' in all the circumstances. Examples of serious harm include material that sets out realistic threats, places individuals in imminent danger, is excessively malicious, or forms part of relentless abuse. Importantly, if a situation does not meet this threshold, the Commissioner is still able to offer support, information and advice to help the individual avoid harm.

Industry Standards

The Act enforces industry standards known as the Basic Online Safety Expectations (BOSE), which require online service providers to take reasonable steps to minimise the risk of harm. For example, the Act requires online service providers to create a safer online environment through:

  • Ensuring technological or other measures are in effect to prevent access by children;
  • Guaranteeing the service has clear and readily identifiable mechanisms that enable end-users to report and make complaints about cyberbullying material and breaches of the service's terms of use (see the sketch after this list); and
  • Providing a written statement of complaints and removal notices to the Commissioner within 30 days when required.
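
By way of illustration only, a provider might implement the reporting and written-statement expectations with a simple complaints register. The sketch below is hypothetical: the class names, fields and summary format are assumptions, not anything prescribed by the Act or the BOSE.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch of a BOSE-style complaints register; the class
# names, fields and summary format are illustrative assumptions, not
# anything prescribed by the Act.
@dataclass
class Complaint:
    complaint_id: str
    reporter: str
    material_url: str
    category: str            # e.g. "cyberbullying", "terms-of-use breach"
    received_at: datetime
    resolved: bool = False

class ComplaintsRegister:
    def __init__(self) -> None:
        self._complaints: list[Complaint] = []

    def lodge(self, complaint: Complaint) -> None:
        """Record a complaint made through the end-user reporting mechanism."""
        self._complaints.append(complaint)

    def statement_for_commissioner(self) -> list[dict]:
        """Summarise complaints for a written statement to the Commissioner."""
        return [
            {"id": c.complaint_id, "category": c.category,
             "received": c.received_at.isoformat(), "resolved": c.resolved}
            for c in self._complaints
        ]
```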

Who does the Act apply to?

The Act applies to the following services:

  • Designated internet service providers
  • Social media service providers
  • Electronic service providers (such as Outlook and WhatsApp)
  • Hosting service providers
  • App distribution service providers
  • Internet service providers (such as Telstra and Optus)
  • Internet search engine service providers
  • Ancillary service providers to the online industry

What powers does the Commissioner have?

The Commissioner will impose updated BOSE industry standards and technical requirements upon the digital platforms to which the Act applies. This will be done through the online content scheme and the regime tackling abhorrent violent material set out within the Act.

One criticism of the Act's enforcement is the large role placed on the Commissioner, who is the sole decision-maker on complaints and breaches and has been given substantial investigative and enforcement powers.

The Commissioner has the power to issue the following notices:

  1. Removal Notice: Social media, electronic, designated internet and hosting service providers may be given a removal notice requiring them to remove, or take all reasonable steps to remove, the material from their service, or cease hosting it, within 24 hours.
  2. Blocking Notice: Internet service providers will be requested or required to block access to material that depicts, incites or instructs abhorrent violent conduct. The Commissioner must be satisfied that the material is likely to cause significant harm to the Australian community.
  3. App Removal Notice: App distribution service providers may be given a notice requiring them, within 24 hours, to cease enabling end-users to download an app that facilitates the posting of certain material.
  4. Link Deletion Notice: Internet search engine providers may be given a notice requiring them, within 24 hours, to cease offering a link to certain material.

Where an individual or body corporate does not comply with a notice, the Commissioner may impose formal warnings, infringement notices, enforceable undertakings, injunctions and civil penalties. The civil penalties for individuals range from $22,200 (100 penalty units) to $111,000 (500 penalty units), and up to $555,000 (2,500 penalty units) for corporations.
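
These dollar amounts follow directly from the Commonwealth penalty unit value of $222 that applied when the Act commenced (the value is indexed and changes over time), with the maximum for a body corporate set at five times the maximum for an individual. A minimal sketch of the conversion:

```python
# Commonwealth penalty unit value when the Act commenced ($222);
# this figure is indexed and changes over time.
PENALTY_UNIT_AUD = 222

def civil_penalty(units: int, body_corporate: bool = False) -> int:
    """Convert penalty units to a maximum dollar penalty.

    The maximum for a body corporate is five times the maximum
    for an individual.
    """
    multiplier = 5 if body_corporate else 1
    return units * multiplier * PENALTY_UNIT_AUD

print(civil_penalty(100))                        # 22200  -> $22,200
print(civil_penalty(500))                        # 111000 -> $111,000
print(civil_penalty(500, body_corporate=True))   # 555000 -> $555,000
```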

Individuals are able to make a report to the Commissioner where they believe a platform has failed to take action to ensure the safety of its users, or where they believe a breach of safety has occurred. The Commissioner has discretion whether to act upon the complaint or to issue a formal warning requiring the online service provider to take reasonable steps to correct the breach.

The Commissioner does not have the power to investigate most online frauds and scams, spam, defamation or privacy breaches. In these situations, the Commissioner will refer affected individuals to seek legal advice.

What does this mean for your business?

If your business falls within the scope of the Act, it will be essential to review and update your current online safety procedures and policies to ensure compliance with the BOSE as well as the relevant sections of the Act. If you are required to remove content at the Commissioner's request, you should have effective mechanisms in place that allow this to be done within the stipulated 24-hour time frame.
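
As a purely hypothetical illustration of the kind of mechanism that could support this, the sketch below tracks each removal notice against a 24-hour deadline; the names and structure are assumptions, not anything the Act prescribes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical illustration only: tracking removal notices against the
# 24-hour statutory window. Names and structure are assumptions, not
# anything the Act prescribes.
REMOVAL_WINDOW = timedelta(hours=24)

@dataclass
class RemovalNotice:
    notice_id: str
    content_url: str
    received_at: datetime

    @property
    def deadline(self) -> datetime:
        return self.received_at + REMOVAL_WINDOW

    def is_overdue(self, now: datetime) -> bool:
        return now > self.deadline

notice = RemovalNotice("RN-001", "https://example.com/post/123",
                       datetime(2022, 3, 1, 9, 0))
print(notice.deadline)                                  # 2022-03-02 09:00:00
print(notice.is_overdue(datetime(2022, 3, 1, 20, 0)))   # False
```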

It is important that a business has up-to-date terms of use, safety policies and procedures (particularly those that deal with end-users), standards of conduct, and policies for the control and enforcement of those standards. The BOSE also require businesses to have a mechanism that allows Australian residents to report and make complaints about breaches of the terms of use and of the service provided on their platform. The Commissioner may direct minor complaints back to the business and give it the chance to rectify the breach.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.