Australia's New Online Safety Act

Update 20/1/22: The basic online safety expectations have been made (see #Basic Online Safety Expectations).

Update 23/7/21: The Online Safety Act has received the Royal Assent as Act No. 76 of 2021. It will come into effect no later than 6 months after Royal Assent (23 January 2022). Draft regulations are being made and the Bibliotheca Anonoma will keep you updated on new developments.

Today (23 June 2021) the Online Safety Bill passed the Australian Senate. Since, by constitutional convention, Royal Assent is automatic once a bill has passed both Houses of the Parliament, it might as well be law even though the Governor-General hasn't put his name on it yet. Because it comes into effect at most 6 months after it receives Royal Assent (2)[1], and its reach is, to say the least, sweeping, it is important not only for Australians but also for Canadians, Britons, and even Americans and citizens of non-English-speaking nations to know about it, given that its authors intend it to be a model for similar legislation worldwide[2].

Table

The table below shows where sections of the Act that are not specifically cited in this analysis, but are still discussed, are explained.

  Section number(s)   Heading under which it is discussed
  31                  #Complaints and Objections
  33                  #Complaints and Objections
  37                  #Complaints and Objections
  42                  #Complaints and Objections
  49-62               #Reporting
  65, 66              #Removal Notices
  88, 90              #Removal Notices

Material Covered by the Act[edit]

This Act's definition of "material" includes text, data, sounds, pictures and videos. This Act covers five types of material:

  • "cyber‑bullying material targeted at an Australian child";
  • "cyber‑abuse material targeted at an Australian adult";
  • "non‑consensual intimate image of a person";
  • "material that depicts abhorrent violent conduct";
  • "Class 1 material" and "Class 2 material".

Cyber-Bullying Material Targeted at an Australian Child[edit]

If "an ordinary reasonable person would conclude that" it is likely that material is meant for a particular Australian child, and "would be likely to have the effect on the Australian child of seriously threatening, seriously intimidating, seriously harassing or seriously humiliating the Australian child", then the material falls into this category. However, if the material was posted by a person that is "in a position of authority over an Australian child" "in the lawful exercise of that authority" as "reasonable action taken in a reasonable manner", then it doesn't fall into this category (6).

Cyber-Abuse Material Targeted at an Australian Adult[edit]

If "an ordinary reasonable person would conclude that it is likely that the material was intended to have an effect of causing serious harm to a particular Australian adult" and "an ordinary reasonable person in the position of the Australian adult would regard the material as being, in all the circumstances, menacing, harassing or offensive", then it falls into this category (7). "Serious harm" is defined as "serious physical harm", "serious psychological harm", or "serious distress" "whether temporary or permanent" (3).

Non-Consensual Intimate Image of a Person[edit]

An intimate image includes a picture or video of the following "in circumstances in which an ordinary reasonable person would reasonably expect to be afforded privacy":

  • the genital or anal region (whether bare or covered by underwear) of any person, or either or both of the breasts of a female, transgender, or intersex person,
  • a person "in a state of undress, using the toilet, showering, having a bath, engaged in a sexual act of a kind not ordinarily done in public, or engaged in any other like activity",
  • a person without "particular attire of religious or cultural significance" that they wear "whenever the person is in public" "because of the person’s religious or cultural background" (15).

A picture or video falls into this category if it is posted without the consent of the person it depicts, but not if the posting is an "exempt provision" (16). A posting is an "exempt provision" if, among other things, it is for enforcing the law, it is for "a genuine medical or scientific purpose", or "an ordinary reasonable person would consider the provision of the intimate image on the service acceptable" (86).

Material That Depicts Abhorrent Violent Conduct[edit]

"Abhorrent violent conduct" means terrorist acts, murder, attempted murder, torture, rape, or kidnapping. Unlike with the other categories, there is no reasonable person test; merely that material records or streams such conduct puts it into this category (9).

Class 1/2 Material[edit]

"Class 1 material" means films, video games, publications, or the contents thereof that are, or have not been but would likely be, classified as RC (banned), or other material that would be RC if it was a film (106).

"Class 2 material" means films, video games, publications, or the contents thereof that are, or have not been but would likely be, classified as X18+ or R18+ (or as Category 1 or 2 restricted for publications), or other material that would be classified there if it was a film (107).

Services Covered by the Act[edit]

The Act's definition of "electronic service" includes all services transmitting material through electromagnetic waves. The Act covers three types of electronic services:

  • "social media service";
  • "relevant electronic service";
  • "designated internet service".

The Minister[3] can declare a service that doesn't meet the criteria for one of these categories to be in it anyway; whether they can declare a service that does meet the criteria for a category to be outside it depends on the category. "Exempt services", where material posted is inaccessible and undelivered to all Australian end-users, are not covered by this Act.

Social Media Service[edit]

A service falls into this category if its "sole or primary purpose...is to enable online social interaction between 2 or more end‑users" or to let them "share material for social purposes", and it "allows end‑users to link to, or interact with, some or all of the other end‑users" and to "post material on the service".

The Minister can declare services that would otherwise fall into this category to be exempt services (13).

Relevant Electronic Service[edit]

A service falls into this category if it is an email, instant messaging, SMS, MMS, chat, or online game service.

The Minister cannot declare services that would otherwise fall into this category to be exempt services (13A).

Designated Internet Service[edit]

A service falls into this category if it is an "internet carriage service" (in plain English, an ISP) (14), but not if it is an "on-demand program service", which is a service that transmits commercial or subscription TV, ABC, or SBS broadcasts over the internet (18).

The Minister can declare services that would otherwise fall into this category to be exempt services.

Remedies for Targets[edit]

Targets of any of the described content can (depending on the category of the content):

  • file a complaint or objection notice;
  • sue the poster.

Complaints and Objections[edit]

A complaint only tells the Commissioner about the material complained of; they may investigate the matter but don't have to. An objection notice explicitly demands the removal of the material objected to (but is not available for all kinds of material) (33).

About Cyber-Bullying Material[edit]

An Australian child, their parent, or someone they've authorized to act on their behalf, can file a complaint. A person who has turned 18 can also file a complaint about cyber-bullying material that targeted them when they were a child (30).

About Cyber-Abuse Material[edit]

An Australian adult or someone they've authorized to act on their behalf can file a complaint (36).

About Class 1 or Class 2 Material[edit]

Anyone with reason to believe that Class 1 material or Class 2 material at the X18+ level can be accessed in Australia can file a complaint. So can anyone with reason to believe that Class 2 material at the R18+ level is accessible in Australia by people under the age of 18 (38).

About Intimate Images[edit]

A person depicted by an intimate image, a person authorized by them, or their parent or guardian if they're under 16 or incapable of managing their own affairs, may file a complaint or objection. They may file a complaint or objection even if they previously consented to the posting of the intimate image (32).

Private Rights of Action[edit]

The Act does not explicitly provide a private cause of action for cyber-bullying or cyber-abuse material, and Australia does not recognize the tort of intentional or negligent infliction of emotional distress[4], so a person cannot be sued for posting those types of material.

Over Intimate Images[edit]

A person who posts or threatens to post an intimate image of another person without their consent (unless the image is intimate only because of the absence of religious or cultural apparel and the poster didn't know that the person consistently wore that apparel in public) can be sued by that person, or by the Commissioner, for up to 500 penalty units (75)[5].
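
To give a sense of the dollar amounts at stake, here is a minimal sketch in Python that converts penalty units into Australian dollars, assuming the $222-per-unit figure from note 5 (the figure is indexed and changes over time):

  # Convert penalty units into Australian dollars.
  # Assumes the $222-per-unit value cited in note 5; this value is indexed
  # and is revised periodically.
  PENALTY_UNIT_AUD = 222

  def penalty_in_aud(units: int) -> int:
      """Return the dollar value of a penalty expressed in penalty units."""
      return units * PENALTY_UNIT_AUD

  print(penalty_in_aud(500))  # 111000 -- the 500-unit penalty that recurs throughout the Act
  print(penalty_in_aud(100))  # 22200 -- e.g. failing to comply with a disclosure notice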

Consequences for End-Users[edit]

If you post material covered by this Act, you may be subject to the following consequences:

  • being sued (see above);
  • "end-user notice";
  • "removal notice".

Notices have to identify the material in question "in a way that is sufficient to enable the end‑user to comply with the notice". If you don't comply with a notice, you can be subject to an injunction from the Federal Court (71) or a formal warning from the Commissioner (72). Your only recourse if you get a notice is to appeal to the Administrative Appeals Tribunal (220).

End-User Notice[edit]

You can get an end-user notice for posting cyber-bullying material. If you do, you have to:

  • "take all reasonable steps" to get the material removed within the time the notice requires;
  • not post any more cyber-bullying material with the same target;
  • apologize to the complainant in the manner, and within the time, the notice requires (70).

Removal Notice[edit]

You can get a removal notice for posting non-consensual intimate images. If you do, you have to "take all reasonable steps" to get the material removed within 24 hours or a longer period the Commissioner allows (78).

You can also get a removal notice for posting cyber-abuse material. The terms of the removal notice are the same as if you posted non-consensual intimate images (89).

Obligations for Services[edit]

Services covered by this Act are subject to the following obligations:

  • "basic online safety expectations";
  • reporting;
  • compliance with "removal notices";
  • compliance with "blocking requests" and "blocking notices";
  • compliance with "remedial notices";
  • compliance with "industry codes" and "industry standards";
  • compliance with "service provider determinations";
  • Federal Court orders;
  • compliance with disclosure notices.

Basic Online Safety Expectations[edit]

The Minister will say what they are (45). At a minimum, they include the expectations that the provider of the service will:

  • "take reasonable steps to ensure that end‑users are able to use the service in a safe manner";
  • consult the Commissioner "in determining what are such reasonable steps";
  • "take reasonable steps to minimize the extent" that material covered under this Act is accessible on the service;
  • "take reasonable steps to ensure that technological or other measures are in effect to prevent access by children to class 2 material provided on the service";
  • have "clear and readily identifiable mechanisms that enable end‑users to report, and make complaints about," material covered under this Act or breaches of its terms of use;
  • comply within 30 days when the Commissioner asks how many complaints were filed, how long it took to comply with each removal notice over a given period of 6 months or more, or what the service has done to make itself safer for end-users (46).

Reporting[edit]

The Minister may require a service or a class of services to report on how well it obeyed the basic online safety expectations for a specified period between 6 and 24 months long. The requirement can be recurring or one-off. The Commissioner can sue a service that doesn't comply for up to 500 penalty units, and can also publish a notification on their website stating that the service didn't comply. You have no right to remain silent under a reporting requirement, but nothing you say can be used against you in a criminal or civil proceeding, other than one for failing to report or for giving false or misleading information (63).

Removal Notices[edit]

Removal notices are issued for cyber-bullying material, but only if the person already complained to the service about it and it wasn't removed within 48 hours of the complaint (65). They can also be issued to hosting services that host the material. They require the service receiving them to remove the material within 24 hours. If the Commissioner refuses to issue a removal notice, they must tell the person who asked for it (66).

Removal notices are issued for non-consensual intimate images. If the notice is based on a complaint, the Commissioner needs to ascertain that the person depicted did not consent; if it is based on an objection, they don't. The recipient has to "take all reasonable steps" to remove the material within 24 hours (77).

Removal notices are issued for cyber-abuse material to services covered by this Act and hosting services under like conditions and with like requirements (90), except that they only require the service receiving them to "take all reasonable steps" to ensure the material is gone, instead of removing the material outright (88).

Removal notices are issued for class 1 material. All class 1 material can be the subject of a removal notice, except Parliamentary, court, or official inquiry proceedings (109).

Removal notices are also issued for class 2 material at the X18+ level with the same exception for official proceedings (114). Unlike all the other kinds of removal notices, they can only be issued "if the service is provided from Australia".

Blocking Requests and Blocking Notices[edit]

All material promoting, inciting, instructing in, or depicting abhorrent violent conduct whose availability online "is likely to cause significant harm to the Australian community" can be the subject of a blocking request. A blocking request is addressed to an ISP. It can block IP addresses, URLs, and domain names. "The Commissioner is not required to observe any requirements of procedural fairness in relation to the giving of the blocking request." (this is a direct quote) (95). A blocking request lasts for 3 months but can be renewed for further 3-month periods indefinitely.
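
As a rough, purely illustrative sketch of what such a request carries, consider the following Python data structure; the field names are ours, not the Act's, and the three-month duration is approximated as 90 days.

  from dataclasses import dataclass, field
  from datetime import date, timedelta

  # Hypothetical representation of a blocking request; the Act does not
  # prescribe any particular format, so every field name here is illustrative.
  @dataclass
  class BlockingRequest:
      isp: str  # the internet service provider the request is addressed to
      domain_names: list[str] = field(default_factory=list)
      urls: list[str] = field(default_factory=list)
      ip_addresses: list[str] = field(default_factory=list)
      issued: date = field(default_factory=date.today)

      def expires(self) -> date:
          # A blocking request lasts 3 months (approximated here as 90 days)
          # and can be renewed for further 3-month periods indefinitely.
          return self.issued + timedelta(days=90)

  request = BlockingRequest(isp="Example ISP", domain_names=["example.org"])
  print(request.expires())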

A blocking notice is the same as a blocking request except a person can be sued for up to 500 penalty units for not complying with it (103). "The Commissioner is not required to observe any requirements of procedural fairness in relation to the giving of the blocking notice." (99).

The only material exempt from a blocking notice or request is that which is necessary for enforcing the law, investigating law violations, or court proceedings; for research (and is "reasonable in the circumstances for the purpose of conducting that scientific, medical, academic or historical research"); for reporting the news (and is made by a professional journalist); for performing, or assisting the performance of, a public official's duties and functions (and is "reasonable in the circumstances" for those purposes); for advocating lawful change to a matter established by law; or for "the development, performance, exhibition or distribution, in good faith, of an artistic work" (104).

Remedial Notices[edit]

A remedial notice is akin to a removal notice. It is issued for class 2 material at the R18+ level. It gives the recipient two options: remove it, or put it behind a restricted access system (119). A restricted access system is one that is declared to be one by the Commissioner (108).

Industry Codes and Industry Standards[edit]

The Act declares all those who provide services covered under this Act, or equipment for using those services, to Australians to be "sections of the online industry" (135). Any body or association that the Commissioner believes represents a "section of the online industry" may make an "industry code" binding the "online activities" of all "participants" - that is to say, providing services covered under this Act, or "manufacturing, supplying, maintaining or installing" equipment for using those services, to Australians (134). If the Commissioner "is satisfied that" an industry code provides "appropriate community safeguards", and provided at least 30 days for members of the public and participants in that section of the online industry to comment on a draft of that code, they may register the code in an online Register (140). Thereupon, the Commissioner may direct providers of services to comply with industry codes, and sue them for 500 penalty units if they don't (143).

"Industry standards" are like "industry codes", except the Commissioner can make them themselves without requiring the participation of any participant in the sections of the online industry concerned (145). The Commissioner just has to publish a draft for at least 30 days online and invite comment by "interested persons" (148). Industry standards prevail over industry codes where they conflict (150).

Service Provider Determinations[edit]

Service provider determinations are made by the Commissioner and bind all services covered by this Act, hosts, and ISPs (151). The Minister can exempt any service, host, or ISP from these determinations in general, or from one or more specific determinations (152). A provider can be sued for up to 500 penalty units for failing to comply with a service provider determination (154).

Federal Court Orders[edit]

If the provider of a service has been sued twice or more within the past 12 months for failing to comply with any provision of this Act, and because of that its "continued operation" "represents a significant community safety risk", the Commissioner can apply to the Federal Court for an order to shut the service down. Presumably, violation of this order would subject one to the usual penalties for contempt of court, where the court is judge, jury, and prosecutor.

Disclosure Notices[edit]

If the Commissioner believes "on reasonable grounds" that a provider of a service covered under this Act has "information about the identity" or "contact details" of an end-user of it, and that this information is "relevant to the operation of this Act", they may issue a written notice demanding the provider give it to the Commissioner (194). If the provider fails to do so they can be sued for 100 penalty units (195). There is no right against self-incrimination, but information submitted cannot be used against the provider in a civil or criminal action except one for giving false or misleading information (196).

Your Rights[edit]

You have:

  • no right against self-incrimination;
  • no right to defend against a notice as expeditiously as one can be filed;
  • no right to sue the Commissioner or any other "protected person";
  • a right to "a reasonable amount of compensation" if any property is taken;
  • a right to "implied freedom of political communication";
  • a right to preemption of inconsistent State and Territory laws.

No Right Against Self-Incrimination[edit]

See #Disclosure Notices and #Reporting.

No Right to Expeditious Self-Defense[edit]

If you're the target of cyber-bullying or cyber-abuse material, it's easy to file a complaint and the Commissioner does all the legal work for you. By contrast, if you're the recipient of a removal notice, there is no way to explain or defend yourself as easily as a complaint can be filed, though the whole power of the Commonwealth be marshaled against you, except by appeal to the Administrative Appeals Tribunal, which places the burden on you. Although the Tribunal aims to "make our review process accessible, fair, just, economical, informal and quick"[6], will it be quick enough for the 24-hour deadline of a removal notice?

No Right to Sue the Commissioner[edit]

You cannot sue any person for anything done in "good faith" in making a complaint, or filing a removal notice (221). You cannot sue the Commissioner or their delegate for any "act in good faith done" under this Act (222). The Commissioner and other "protected persons" including the Classification Board are not subject to criminal proceedings for collecting, possessing, distributing, delivering, copying, or doing anything else to material pursuant to this Act (223).

Right to Just Compensation[edit]

Section 51(xxxi) of the Australian Constitution provides that the Commonwealth Parliament has power to make laws about "the acquisition of property on just terms from any State or person for any purpose in respect of which the Parliament has power to make laws". Accordingly, if property is not acquired on "just terms", the Act requires the Commonwealth to pay "a reasonable amount of compensation" to any person whose property is taken, and gives the Federal Court and the Supreme Courts of the States and Territories jurisdiction to decide what that amount is if the owner and the Commonwealth are unable to agree (224).

As this section is one of the sections of the Constitution most fruitful of litigation, even a simplified outline of what "just terms" would be is beyond the scope of this analysis. The reader is referred to Quick & Garran, widely regarded as the premier commentary on the Australian Constitution.

Right to Freedom of Political Communication[edit]

The High Court of Australia (the federal supreme court) has held that sections 7 and 24 of the Constitution, which require that the "people" elect Senators and Representatives, imply a right to freedom of political communication[7]. If a provision of this Act violates that right, then that provision is void (233).

Right to Preemption of Inconsistent State Law[edit]

Hosting services and ISPs are not liable, civilly or criminally, under State and Territory laws and common-law rules for content they host or carry, respectively, that they did not know about. Crucially, this provision says nothing about services covered under this Act, and nothing about Federal law. The Minister is also free to withdraw this exemption from liability at any time (235). Although its section number is very close to that of Section 230 of Title 47 of the US Code[8], it is in everything else highly different. Section 230 covers all "interactive computer services", that is, all services that provide access by multiple users to a computer server or the Internet, and provides that "No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section"[9]; its only question is "did you post this?". This section also asks "did you know about it?", thereby requiring hosts and ISPs to prove a negative. It is limited in scope, benefit, and certainty, and will not be of much use.

Notes[edit]

  1. Numbers in brackets refer to section numbers of the Act.
  2. https://www.smh.com.au/national/we-need-to-ensure-online-safety-before-big-tech-profits-20210615-p58123.html
  3. To figure out which Minister, go to https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/Browse_by_Topic/law/adminarrangements and find the latest Administrative Arrangements Order, then find the "Online Safety Act 2021" under the section for "Legislation administered by the Minister".
  4. Magill v. Magill [2006] HCA 51
  5. A penalty unit is a means for ensuring the amount of fines keeps pace with inflation, by stating fines in terms of penalty units instead of sums of money. A penalty unit is currently $222.
  6. https://aat.gov.au
  7. Nationwide News Pty Ltd v Wills [1992] HCA 46
  8. See my open letter on It's Not Section 230 of the Communications Decency Act.
  9. 47 USC 230