Australia's New Online Safety Act

Today (23 June 2021) the Online Safety Act[1] passed the Australian Senate. Since by constitutional convention Royal Assent is automatic once a bill has passed both Houses of Parliament, it might as well be law even though the Governor-General hasn't put his name on it yet. Because it comes into effect at most 6 months after its passage (2)[2], and its reach is, to say the least, sweeping, it is important not only for Australians but also for Canadians, Britons, and even Americans and citizens of non-English-speaking nations to know about it, given that its authors intend it to be a model for similar legislation worldwide[3].

Material Covered by the Act

This Act's definition of "material" includes text, data, sounds, pictures and videos. This Act covers five types of material:

  • "cyber‑bullying material targeted at an Australian child";
  • "cyber‑abuse material targeted at an Australian adult";
  • "non‑consensual intimate image of a person";
  • "material that depicts abhorrent violent conduct";
  • "Class 1 material" and "Class 2 material".

Cyber-Bullying Material Targeted at an Australian Child

If "an ordinary reasonable person would conclude that" it is likely that material is meant for a particular Australian child, and "would be likely to have the effect on the Australian child of seriously threatening, seriously intimidating, seriously harassing or seriously humiliating the Australian child", then the material falls into this category. However, if the material was posted by a person that is "in a position of authority over an Australian child" "in the lawful exercise of that authority" as "reasonable action taken in a reasonable manner", then it doesn't fall into this category (6).

Cyber-Abuse Material Targeted at an Australian Adult

If "an ordinary reasonable person would conclude that it is likely that the material was intended to have an effect of causing serious harm to a particular Australian adult" and "an ordinary reasonable person in the position of the Australian adult would regard the material as being, in all the circumstances, menacing, harassing or offensive", then it falls into this category (7). "Serious harm" is defined as "serious physical harm", "serious psychological harm", or "serious distress" "whether temporary or permanent"(3).

Non-Consensual Intimate Image of a Person

An intimate image includes a picture or video of the following "in circumstances in which an ordinary reasonable person would reasonably expect to be afforded privacy":

  • the genital or anal region (whether bare or covered by underwear) of any person, or either or both of the breasts of a female, transgender, or intersex person,
  • a person "in a state of undress, using the toilet, showering, having a bath, engaged in a sexual act of a kind not ordinarily done in public, or engaged in any other like activity",
  • a person without "particular attire of religious or cultural significance" that they wear "whenever the person is in public" "because of the person’s religious or cultural background" (15).

A picture or video falls into this category if it is posted without the consent of the person it depicts, unless it is posted under an "exempt provision", in which case it doesn't fall into this category (16). "Exempt provisions" include postings for enforcing the law, postings for "a genuine medical or scientific purpose", and postings where "an ordinary reasonable person would consider the provision of the intimate image on the service acceptable", among others (86).

Material That Depicts Abhorrent Violent Conduct

"Abhorrent violent conduct" means terrorist acts, murder, attempted murder, torture, rape, or kidnapping. Unlike with the other categories, there is no reasonable person test; merely that material records or streams such conduct puts it into this category (9).

Class 1/2 Material

"Class 1 material" means films, video games, publications, or the contents thereof that are, or have not been but would likely be, classified as RC (banned), or other material that would be RC if it was a film (106).

"Class 2 material" means films, video games, publications, or the contents thereof that are, or have not been but would likely be, classified as X18+ or R18+ (or as Category 1 or 2 restricted for publications), or other material that would be classified there if it was a film (107).

Services Covered by the Act

The Act's definition of "electronic service" includes all services transmitting material through electromagnetic waves. The Act covers three types of electronic services:

  • "social media service";
  • "relevant electronic service";
  • "designated internet service".

The Minister[4] can declare a service that doesn't meet a category's criteria to be in that category, but whether they can declare a service that does meet a category's criteria to be outside it depends on the category. "Exempt services", where material posted is inaccessible and undelivered to all Australian end-users, are not covered by this Act.

Social Media Service

A service falls into this category if the "sole or primary purpose of the service is to enable online social interaction between 2 or more end‑users" or to let them "share material for social purposes", and the service "allows end‑users to link to, or interact with, some or all of the other end‑users" and to "post material on the service".

The Minister can declare services that would otherwise fall into this category to be exempt services (13).

Relevant Electronic Service

A service falls into this category if it is an email, instant messaging, SMS, MMS, chat, or online game service.

The Minister cannot declare services that would otherwise fall into this category to be exempt services (13A).

Designated Internet Service

A service falls into this category if it is an "internet carriage service" (or in English, an ISP) (14), but not if it is an "on-demand program service", which is a service that transmits commercial or subscription TV, ABC, or SBS broadcasts over the internet (18).

The Minister can declare services that would otherwise fall into this category to be exempt services.

Remedies for Targets

Targets of any of the described content can (depending on the category of the content):

  • file a complaint or objection notice;
  • sue the poster.

Complaints and Objections

A complaint only tells the Commissioner about the material complained of; they may investigate the matter but don't have to. An objection notice explicitly demands the removal of the material objected to (but is not available for all kinds of material) (33).

About Cyber-Bullying Material

An Australian child, their parent, or someone they've authorized to act on their behalf can file a complaint. A person who has turned 18 can also file a complaint about cyber-bullying material targeted at them while they were a child (30).

About Cyber-Abuse Material

An Australian adult or someone they've authorized to act on their behalf can file a complaint (36).

About Class 1 or Class 2 Material

Anyone with reason to believe that Class 1 material or Class 2 material at the X18+ level can be accessed in Australia can file a complaint. So can anyone with reason to believe that Class 2 material at the R18+ level is accessible in Australia and not behind a restricted access system (38).

About Intimate Images

A person depicted by an intimate image, a person authorized by them, or their parent or guardian if they're under 16 or incapable of managing their own affairs, may file a complaint or objection. They may file a complaint or objection even if they previously consented to the posting of the intimate image (32).

Private Rights of Action

The Act does not explicitly provide a private cause of action for cyber-bullying or cyber-abuse material, and Australia does not recognize the tort of intentional or negligent infliction of emotional distress[5], so a person cannot be sued for posting those types of material.

Over Intimate Images

A person who posts, or threatens to post, an intimate image of another person without their consent can be sued by that person, or by the Commissioner, for up to 500 penalty units (75)[6]. The exception is where the image is intimate only because of the absence of religious or cultural apparel and the poster didn't know that the depicted person consistently wore that apparel in public.
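
For a sense of scale, here is a back-of-the-envelope conversion of that maximum into dollars, using the penalty unit value given in note 6 (the figure is indexed and will change over time, so this is indicative only):

  # Indicative only: the dollar value of a penalty unit is indexed and
  # changes over time; $222 is the figure given in note 6.
  PENALTY_UNIT_AUD = 222     # current value per note 6
  MAX_PENALTY_UNITS = 500    # maximum under section 75

  print(f"Maximum civil penalty: A${MAX_PENALTY_UNITS * PENALTY_UNIT_AUD:,}")
  # -> Maximum civil penalty: A$111,000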

Consequences for End-Users

If you post material covered by this Act, you may be subject to the following consequences:

  • being sued (see above);
  • "end-user notice";
  • "removal notice".

Notices have to identify the material in question "in a way that is sufficient to enable the end‑user to comply with the notice". If you don't comply with a notice, you can receive an injunction from the Federal Court (71), or a formal warning from the Commissioner (72).

End-User Notice

You can get an end-user notice for posting cyber-bullying material. If you do, you have to:

  • "take all reasonable steps" to get the material removed within the time the notice requires;
  • not post any more cyber-bullying material with the same target;
  • apologize to the complainant in the manner, and within the time, the notice requires (70).

Removal Notice

You can get a removal notice for posting non-consensual intimate images. If you do, you have to "take all reasonable steps" to get the material removed within 24 hours or a longer period the Commissioner allows (78).

You can also get a removal notice for posting cyber-abuse material. The terms of the removal notice are the same as if you posted non-consensual intimate images (89).
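
To make the deadline concrete, here is a minimal sketch of the compliance window. It assumes the clock starts when the notice is given; the function name and the treatment of a Commissioner-allowed longer period as extra hours are this page's own illustration, not wording from the Act.

  from datetime import datetime, timedelta

  def removal_deadline(notice_given: datetime, extra_hours: int = 0) -> datetime:
      """Deadline to comply with a removal notice: 24 hours from when the
      notice is given, plus any additional time the Commissioner allows
      (illustrative reading only)."""
      return notice_given + timedelta(hours=24 + extra_hours)

  # A notice given at 09:00 on 1 July 2021 must be complied with by 09:00
  # on 2 July 2021, unless the Commissioner allows more time.
  print(removal_deadline(datetime(2021, 7, 1, 9, 0)))   # 2021-07-02 09:00:00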

Obligations for Services

Services covered by this Act are subject to the following obligations:

  • "basic online safety expectations";
  • reporting;
  • compliance with "removal notices";
  • obedience to "blocking requests" and "blocking notices";
  • compliance with "remedial notices";
  • compliance with "service provider determinations";
  • compliance with disclosure notices;
  • probably some other bullshit they've hidden somewhere deep in the law...

Basic Online Safety Expectations

The Minister will say what they are (45). At a minimum, they include the expectations that:

  • "the provider of the service will take reasonable steps to ensure that end‑users are able to use the service in a safe manner";
  • "in determining what are such reasonable steps, the provider will consult the Commissioner";
  • "the provider of the service will take reasonable steps to minimize the extent" that material covered under this Act is accessible on the service;
  • "the provider of the service will take reasonable steps to ensure that technological or other measures are in effect to prevent access by children to class 2 material provided on the service";
  • the provider will have "clear and readily identifiable mechanisms that enable end‑users to report, and make complaints about," material covered under this Act or breaches of its terms of use;
  • if the Commissioner asks how many complaints were filed during a specified period of 6 months or more, how long it took the service to comply with each removal notice given during that period, or what the service has done to make itself safer for end-users, the provider will comply within 30 days (46).

Notes

  1. https://www.legislation.gov.au/Details/C2021B00018
  2. Numbers in brackets refer to section numbers of the Act.
  3. https://www.smh.com.au/national/we-need-to-ensure-online-safety-before-big-tech-profits-20210615-p58123.html
  4. To figure out which Minister, go to https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/Browse_by_Topic/law/adminarrangements and find the latest Administrative Arrangements Order, then find the "Online Safety Act 2021" under the section for "Legislation administered by the Minister".
  5. Magill v. Magill [2006] HCA 51
  6. A penalty unit is a means for ensuring the amount of fines keeps pace with inflation, by stating fines in terms of penalty units instead of sums of money. A penalty unit is currently $222.