Modernising Communications Offences A final report [2020] EWLC 399 (July 2021)






The Law Commission


URL: http://www.bailii.org/ew/other/EWLC/2021/LC399.html
Cite as: [2020] EWLC 399



(Law Com No 399)

Modernising Communications

Offences

Presented to Parliament pursuant to section 3(2) of the Law Commissions Act 1965

Ordered by the House of Commons to be printed on 20th July 2021

HC 547

 

© Crown copyright 2021

This publication is licensed under the terms of the Open Government Licence v3.0 except where otherwise stated. To view this licence, visit nationalarchives.gov.uk/doc/open-government-licence/version/3 or write to the Information Policy Team, The National Archives, Kew, London TW9 4DU, or email: psi@nationalarchives.gsi.gov.uk.

Where we have identified any third party copyright information you will need to obtain permission from the copyright holders concerned.

This publication is available at www.gov.uk/government/publications.

ISBN 978-1-5286-2771-9

CCS0621857840 07/21

Printed on paper containing 75% recycled fibre content minimum

Printed in the UK by the APS Group on behalf of the Controller of Her Majesty's Stationery Office

The Law Commission

The Law Commission was set up by the Law Commissions Act 1965 for the purpose of promoting the reform of the law.

The Law Commissioners are:

The Right Honourable Lord Justice Green, Chairman

Professor Nick Hopkins

Nicholas Paines QC

Professor Sarah Green

Professor Penney Lewis

The Chief Executive of the Law Commission is Phil Golding.

The Law Commission is located at 1st Floor, Tower, 52 Queen Anne's Gate, London SW1H 9AG.

The terms of this report were agreed on 30th June 2021.

The text of this report is available on the Law Commission's website at

http://www.lawcom.gov.uk.

Contents


The problem

The nature of online harms

Freedom of expression

This report

Origins of this report

Terms of reference

Consultation

Recommendations for reform

Evidence and problems of proof

Children and young adults

Acknowledgements

The project team

Introduction

Repeal of the existing communications offences

Consultation responses (repeal of existing offences)

Analysis (repeal of existing offences)

A new communications offence

Harm

Consultation question 5 - definition of “harm”

Consultation question 4 - likely harm

Likely audience

Consultation question 3 - likely to cause harm to likely audience

Consultation question 6 - context and characteristics of audience

The mental element - intention

Consultation question 8 - awareness of a risk of harm and intention

Without reasonable excuse

Consultation question 11 - reasonable excuse

Consultation question 12 - public interest

“Sent or posted” a communication

Recommendation 1.

Jurisdiction

Consultation question 16 - extra-territorial application

Recommendation 2.

FALSE, PERSISTENT AND THREATENING COMMUNICATIONS, AND FLASHING IMAGES

Introduction

Section 127(2) of the Communications Act 2003

Part 1: Knowingly false communications

The proposed new offence: knowingly false communications

Recommendation 3.

Part 2: Persistent use

Responses and analysis

Recommendation 4.

Part 3: Threatening communications

Threats and the law

Victim impact and prevalence of threatening communications

Use of communications offences

Analysis

Recommendation 5.

Part 4: Flashing Images

Recommendation 6.

Consultation question and response

Analysis

Recommendation 7.

GROUP HARASSMENT, GLORIFICATION OF VIOLENCE AND VIOLENT CRIME, BODY MODIFICATION CONTENT

Introduction

Part 1: Group harassment

Existing law: group harassment

Options for reform

Incitement or encouragement of group harassment

Knowing participation in “pile-on” harassment

Part 2: Glorification of violence and violent crime

Existing law, and the recommended harm-based offence

Justification and consultation responses

Conclusion

Part 3: Body modification content

Body modification: existing law

Consultation question and responses

Introduction

The rationale for reform

Harm

Prevalence

Current law

Cyberflashing: a new offence

A sexual offence

The conduct element

The fault element - additional intent

Conclusion

Ancillary orders and special measures

Recommendation 8.

Recommendation 9.

Recommendation 10.

Recommendation 11.

Recommendation 12.

Recommendation 13.

Introduction

Part 1: Existing position

Policy

Law

Part 2: Consultee responses - glorification or encouragement of self-harm

Preliminary issue: online self-harm content and suicide “challenges”

Non-suicide self-harm content and the harm-based offence

Encouragement of self-harm: should there be a specific offence?

Part 3: Encouragement or assistance of self-harm: a new offence

Threshold of harm

Fault element

Maximum penalty

Practical mechanisms: prosecutorial discretion and enforcement

Recommendation 14.

Recommendation 15.

Recommendation 16.

Recommendation 1.

Recommendation 2.

Recommendation 3.

Recommendation 4.

Recommendation 5.

Recommendation 6.

Recommendation 7.

Recommendation 8.

Recommendation 9.

Recommendation 10.

Recommendation 11.

Recommendation 12.

Recommendation 13.

Recommendation 14.

Recommendation 15.

Recommendation 16.

CONSULTEES

GLOSSARY

The problem

When we published our report in 1985,3 leading to the enactment of the Malicious Communications Act 1988, only 12.6% of UK households owned a home computer - and the internet would not be invented for another four years.4 Even in 2003, when the Communications Act 2003 was passed, only 46% of UK households had internet access.5 It was another year before Facebook was released (and, even then, available only to select universities and colleges). Twitter was released in 2006. The first iPhone was not released until 2007, four years after enactment of the Communications Act. Nearly two decades later, in every single second that passes, nearly three million emails are sent,1 nearly ten thousand Tweets are posted,2 and over three quarters of a million WhatsApp messages are sent [which by late 2020 was actually 1.15 million per second],3 amongst the other c.95 terabytes of internet traffic per second.4

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381; Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 1.1.

Law Commission, Poison-Pen Letters (1985) Law Com No 147.

J Schmitt and J Wadsworth, ‘Give PC’s a Chance: Personal Computer Ownership and the Digital Divide in the United States and Great Britain’ (2002) Centre for Economic Performance, London School of Economics, p 17.

Office for National Statistics, Internet Access - households and individuals, Great Britain: 2018 (2018), p 3.

The nature of online harms

online in the last 12 months.11 Forty-seven per cent had experienced potentially harmful content or interactions with others, and 29 per cent had experienced something they rated as “harmful” (annoying, upsetting, or frustrating).12

COVID-19 has not only inflicted massive health costs, it has also amplified myriad social hazards, from online grooming to gambling addiction and domestic abuse. In particular, the United Nations High Commissioner for Human Rights has warned that the pandemic may drive more discrimination, calling for nations to combat all forms of prejudice and drawing attention to the ‘tsunami of hate’ that has emerged.

Freedom of expression

universal standards; there is a sense in which the rejection of universal standards necessitates context-specificity. For example, an intimate image may be entirely harmless in one context, but clearly very harmful in another: using that image to threaten someone is clearly very different from partners consensually sharing intimate images between themselves, yet the content of the communication may be identical in each case. A similar point may be made about two friends sharing jokes that they find amusing but that, in another context, may be very distressing. So, while we might not be able to say as a generality that a particular form of words or image will always, or never, be criminal (because it either is or is not, say, “indecent” or “offensive”), that does not mean that we will be unable to discern, on a given set of specific facts, that harm was or was not likely. That, of course, is rather the point of the form of our recommended offence: we should only criminalise those communications with the real potential for harm.

THIS REPORT

Origins of this report

internet services by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.20

Terms of reference

Topics not in scope
Related projects

Consultation

Recommendations for reform

The new “harm-based” communications offence
Cyberflashing
Encouraging or assisting self-harm
False, persistent and threatening communications

EVIDENCE AND PROBLEMS OF PROOF

CHILDREN AND YOUNG ADULTS

The Alan Turing Institute found that young people are more likely to experience abuse. Specifically, they found that:

41.2% of 18-30 year olds had seen cruel/hateful content online, compared with 7.4% of 76+ year olds. Age also impacted whether respondents had received obscene/abusive emails but the relationship was far weaker, ranging only from 13.1% of 18-30 year olds to 6.77% of 76+ year olds.

We have heard from some stakeholders that, amongst young people - especially university students - there is a culture of normalisation or dismissiveness about the harmful impacts of online abuse, with both perpetrators and victims sometimes using the excuse of ‘banter’ or ‘humour’ to defuse the seriousness of this behaviour.

However, we also heard that this is consistent with an awareness of a risk of harm: the perpetrator may realise the seriousness of their actions, but attempt to minimise this by using humour as an excuse.

Moreover, normalisation or dismissiveness can exacerbate the harmful impact of online abuse. It may lead to inappropriate strategies being adopted or recommended - that victims should come off social media, for example. It may also cause victims to suppress their emotional response, potentially contributing to more serious psychological harms further down the line.

ACKNOWLEDGEMENTS

THE PROJECT TEAM

INTRODUCTION

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 3.113 to 3.124.

numerous consultees have described genuinely harmful communications sent to them that do not comfortably fit within these categories. Two striking examples of the latter point would include the dissemination of private information or sending illicit surveillance photos of a person to that person.35

REPEAL OF THE EXISTING COMMUNICATIONS OFFENCES

Consultation responses (repeal of existing offences)

[The] current legislation is too vague and makes the allegations difficult for police officers to keep to a standard nationally, this then causes confusion when one police service in one part of the country takes a case to court and another police service may consider the criminal threshold of abuse/harm isn't met and therefore take no further action... confusion reigns for the police service and for the complainant.

The current model of offences has significant overlaps which leads to inconsistency during investigations and during the provision of charging advice.

The simplification to one primary offence would enable training to be delivered in a focussed manner which will lead to more effective investigation.

We agree the existing offences identified under the Communications Act and Malicious Communications Act should be repealed and agree with the concerns about basing criminal liability on what is grossly offensive or indecent, including the ECHR concerns.

We agree with the detailed arguments put forward in the consultation paper and can see the benefits of creating a new offence to replace section 127(1) of the Communications Act 2003 and section 1 of the Malicious Communications Act 1988. We acknowledge that there is some lack of clarity in respect to the existing situation - especially where certain cases could engage both sections due to overlap, as well as the fact some communications may not be covered by either section. In addition, the lack of consistency between the two sections, especially in relation to technological mode of communication, does currently mean a possible lacuna in respect of some cases.

We agree that there is a pressing need for reform of [section] 127(1) Communications Act 2003 and [section] 1 Malicious Communications Act 1988 so that they are effective, fit for purpose, and adequately capture the nature - and impact - of problematic communications in 2020 and beyond. It is a matter of significant concern that there are a number of gaping holes in the current communications provisions, especially given that this framework does not adequately address online violence, online text-based abuse, nor in particular, online violence against women & girls.

We support the modernisation of the Communications Act 2003 and the Malicious Communications Act 1988, in view of the fact that the world has moved online. It is right that legislation keeps up with technological change and modern forms of communication.

In our experience the current criminal offences concerning harmful online communications are not fit for purpose as they do not cover many forms of harmful communications commonly used by perpetrators of domestic abuse and other forms of violence against women and girls. They are also vaguely defined and sometimes poorly understood by police officers and others involved in their enforcement. We support a large number of women whose perpetrators have abused them in ways which we believe fall under the terms of section 127(1) of the Communications Act 2003 and/or section 1 of the Malicious Communications Act 1988, however we see very few investigations and prosecutions for these offences when women report the abuse they’ve experienced to the police. We therefore strongly support reform of this area of law.

Stonewall welcomes the reform and consolidation of the current communications offences, which are piecemeal, inconsistent and complex, into a new communications offence.

ARTICLE 19 agrees that section 127 (1) of the Communications Act 2003 and section 1 of the Malicious Communications Act 1988 should be repealed. Indeed, we have been calling for the repeal of these provisions for several years: we intervened in the DPP v Chambers case and subsequently took part in a number of consultations about social media offences and CPS consultations on their guidance on the topic. Specifically, we have long argued for the removal of the terms “grossly offensive”, which in our view is highly subjective and therefore open to broad interpretation with disproportionate consequences for human rights.

We support the repeal of the two offences, for the reasons set out in the consultation paper. In particular, that ‘offensive’ communications should not be criminalised... and that communications offences should be ‘technology neutral’. The law should be somewhat ‘future proof’ and what is legal offline should be legal online.

We think this is the correct way forward. We would also point out that [section] 127(1) is a hangover from a previous age, having begun life in the 1930s as a means of dealing with the very specific practice of young men in telephone boxes calling up the (then entirely female) operators for free, and then making obscene remarks to them. It is time it was retired.49

Carnegie welcomes the proposal to reformulate the communications offences currently found in [section] 1 Malicious Communications Act [1988] and [section] 127 Communications Act [2003]: it is a parlous state of affairs where provisions of the criminal law manage to be both under- and over- inclusive. In this regard, however, we note also the role of the police, the CPS and the judiciary in recognising the harms caused by speech and flag the need for more training in this area to improve consistency in approach.

The Law Commission expresses concern that the two existing offences under section 1 of the Malicious Communications Act 1988 and section 127 of the Communications Act 2003 are overlapping, ambiguous and so broad as to create a real threat to freedom of expression, particularly as they are adapted to apply to new and unforeseen forms of communications and behaviours. It says that they overcriminalise in some situations, and under-criminalise in others. We agree with these statements.

I agree that [section] 127 and [section] 1 should be repealed and replaced with a single offence. The overlap between the two existing offences is difficult to justify and causes confusion.

There have been concerns (most notably surrounding the Chambers case) that the previous offences are too wide. Thus, the offences are in need of reform...

... a major problem with the current laws [is] that they use incorrect terminology, framing what should rightly be called abuse as a question of offence. Whereas ‘offence’ should not be addressed through the law, victims of abuse require protection and the perpetration of abusive acts should, in principle, be punishable.

Analysis (repeal of existing offences)

indecent and grossly offensive communications, and the potential for a prosecution to constitute an interference in Article 10 ECHR.

A NEW COMMUNICATIONS OFFENCE

HARM

Consultation question 5 - definition of “harm”

“Harm” for the purposes of the offence should be defined as emotional or psychological harm, amounting to at least serious emotional distress. Do consultees agree?

If consultees agree that “harm” should be defined as emotional or psychological harm, amounting to at least serious emotional distress, should the offence include a list of factors to indicate what is meant by “serious emotional distress”?

emotional and psychological harm is the “common denominator” of online abuse. Therefore, a potential new offence would be capable of addressing a wide range of harms flowing from abusive online communications, even if the definition of harm were limited to emotional and psychological harm: the offence would nonetheless cover abusive behaviour that indirectly causes harms...54

is guilty of an offence...

Consultation responses (definition of harm)

We agree with the Commission that for the purposes of this offence, harm should be defined as emotional or psychological harm, given that there are existing offences which cover other harmful online communications, e.g., fraud and harassment.

We believe that the threshold of the psychological and emotional harm should be “serious emotional distress”, and recognise the precedent for this in existing legislation, e.g., section 76 within the Serious Crime Act 2015 which criminalises coercive or controlling behaviour providing that the defendant’s conduct has a “serious” effect on the victim.

We agree that it is important any new offence is not overly broad, and that harms dealt with elsewhere (such as fraud, as explained in the consultation paper) do not need to be covered, and in fact it is beneficial if they are not covered to reduce unnecessary overlap.

The Carnegie proposal has recognised that the law in England and Wales (whether civil or criminal) does not adequately recognise the impact of speech and the harm that can be caused, especially where the speech could be deemed to fall within the category of political speech. While the Report makes the point that it is important that the proposed offence is not too broad, we emphasise that there is another potentially beneficial side-effect of limiting the harms to emotional and psychological harm - of labelling clearly that words can cause profound hurt.

A remedy to the over-broadness of the proposed offences, additionally or alternatively to specifying that communications are 'abusive', would be to define harm as significant negative impact on daily life. An interference in freedom of expression could be better justified on the basis that it was intended to, or believed to be likely that, the communication would have a significant negative impact on the daily life of a likely audience.

As stated above (Question 3) we support the introduction of a ‘harm’ element into the law, as an appropriate justification for curbing Article 10 rights. We accept that for communications offences this must be a form of psychological harm to an individual. Incitement to physical violence is dealt with elsewhere in the criminal law, and ‘harm’ to society is dealt with in hate crime laws (the subject of a parallel Law Commission project).

We are concerned that the harm threshold is defined as ‘emotional or psychological harm, amounting to at least serious emotional distress.’ This is lower than the ‘recognised medical condition’ standard found in: civil claims for negligence; the tort established in Wilkinson v Downton [1897] 2 QB 57; and for criminal offences against the person.

We would recommend the same threshold be applied to the proposed new offence, for two reasons.

If a ‘likely to’ approach is to be adopted (see Question 4), we are unsure how a Court would be able to assess ‘distress’ without recourse to accepted clinical standards. We fear that in such cases, the ‘serious emotional distress’ standard will become indistinguishable from ‘grossly offensive’ or evidenced by unrepresentative outrage observed in the tabloids or on social media.

There is some merit in the new harmful communications offence proposed by the Law Commission. However, we would recommend that the term ‘emotional or psychological harm, amounting to at least serious emotional distress’ should be more clearly defined. We therefore suggest that further thought needs to be given to this definition to avoid prolonged arguments at court. Further consideration should also be given to the threshold of serious harm and potential impact on lower-level offences.

If the threshold is to be B v New Zealand Police (2017), as set out in paragraph 5.106, then this offence will be used rarely. There, the complainant was unfit for work and upset for a long time. That is a high threshold.

We wonder whether the test should also include physical harm caused by shock or alarm. A person may send a communication knowing that the recipient is particularly vulnerable or fragile with intent to cause a harmful physical effect - for example where a victim is known to have a heart condition, or severe asthma. The requirement that the defendant intends harm or knows of the risk of harm to the likely audience will exclude remote or unexpected injury. Otherwise we agree with the minimum threshold of harm suggested.

Consultation responses (harm: list of factors)

... it is useful to focus investigators’ minds when they are collecting evidence, not everyone understands why another person may be so offended by a particular post.

‘Harm’ is a subjective term and for the legislation to carry public confidence and be applied consistently, its legal definition needs to be set out in clear, comprehensible terms.

Yes, we agree that a list of factors would be useful to guide courts and to ensure that the offence is not applied too broadly. It may be useful to consider including both a list of factors that are likely to support a finding of serious emotional distress, as well as factors which would mitigate such a finding. The list of factors in the NZ offence are helpful. It will be important in these factors to reflect recent case-law which has thrown doubt on whether even considerably offensive, irritating, provocative, contentious or abusive communications should be subject to criminalisation (Scottow [2020] EWHC 3421 (Admin); Miller [2020] EWHC 225 (Admin)).

Yes. A list of factors may be useful, but we consider that may be best left to guidance than the wording of the offence...

Without further definition of the meaning of serious emotional distress, there is a risk that the sense that the Law Commission wish to communicate of this being a ‘big sizeable harm’, as explained in the consultation paper, may be lost. A non-exhaustive list of factors to consider would assist - the Court should have regard to the intensity and duration of the likely distress and how it would likely manifest itself - for example, whether it would have an impact on day to day activities/sleeping-patterns/appetite etc - before concluding that serious emotional distress was likely. This would [aid] the Court to make more objective decisions in cases which may involve inferring the extent of likely distress. A list would also be useful in reducing the risk of unnecessary or inappropriate arrest or intervention by the Police and may enable a more robust and arguably more objective view to be taken regarding a complaint, even in the face of strong pressure by a complainant.

We support the option presented in 5.114, whereby additional guidance produced could include some of the factors cited in New Zealand jurisprudence (such as intensity, duration and manifestation), to indicate what is meant by ‘serious emotional distress’, while also explicitly stating that this list is indicative and non-exhaustive.

We understand the argument of introducing a higher threshold than those set out for offences under the Protection from Harassment Act 1997. It seems a sensible level to set for any new offence. As mentioned in the consultation paper, we have previously pointed out that it is common for courts to judge “seriousness” in different contexts, and we are confident they could do so in respect of this new offence. It may be helpful for factors to be included in any guidance associated with the introduction of the new offence, including sentencing guidelines.

I would not support a list of factors as they tend to limit the scope of responsibility (even if they are brought as indicators). Moreover, if it is accepted that emotional harm should suffice - at least as an alternative prong of actual emotional harm - including indicators of ‘serious emotional harm’ will muddle things up.

Having said this, the list of indicators in 5.111 seems reasonable, and in the area of my current research of image-based abuse is typically easily satisfied.

Analysis (definition of harm & factors)

Consultation question 4 - likely harm

We provisionally propose that the offence should require that the communication was likely to cause harm. It should not require proof of actual harm. Do consultees agree?

Consultation responses

Restricting the offence to that where actual harm took place would be problematic where there is the ‘resilient victim’, i.e. while D intended to cause harm, V actually just shrugged off the attack. As noted in paragraph 5.84(1), it would also mean that the police do not need to find proof of actual harm. That could be problematic where, for example, a public tweet was sent.

We agree. This is a helpful proposal, and we note that the current communications offences under [section] 127 CA 2003 and [section] 1 MCA 1988 do not require proof of actual harm either. This is in order to include scenarios where no actual harm occurred or could be shown. For example, this may apply when posts are not targeted at a specific recipient and in cases where no victim comes forward. As the Law Commission outlines, harm to the information subject is addressed in the civil law of defamation.

If, for example, a person deliberately sends an antisemitic communication to a Jewish recipient, intending to cause them fear and distress, they should not be protected from prosecution if that Jewish recipient anticipates their intention and does not read the offending communication.

The APCC agrees with this proposed approach. As the Commission rightly assesses, proof of harm may be especially difficult to obtain in the case of public posts, where it is difficult to establish who has seen the communication.

Equally, as the commissioners of victims services locally, we believe that victims should not have to go through the potentially re-traumatising process of producing evidence that they were harmed, although evidence of actual harm could count towards proving that a communication was likely to cause harm.

Stonewall supports the removal of the requirement to prove actual harm in law, in line with the Commission’s rationale that ‘Proof of harm may be especially difficult to obtain in the case of public posts, where it is difficult to establish who has seen the communication. Further, given the culture of dismissiveness and normalisation around harmful online communications, victims may be especially unlikely to come forward’ (5.84 - 85).

Refuge strongly agrees with this element of the proposed offence. Requiring proof of actual harm would create an unnecessarily high evidential barrier to prosecution and is likely to result in limited use of the offence. As the Law Commission acknowledges in the consultation, proving harm was caused by a particular communication can be very challenging, particularly in the context of domestic abuse and other forms of VAWG where harmful communications are highly likely to be only one of many ways a perpetrator is abusing a survivor as a pattern of coercion and control.

...That said, it will be challenging to prove the likelihood of harm. In para 5.90 you indicate that [evidence that] an actual victim is harmed would help, but if you have harm suffered by an actual victim then you don’t need to prove likely harm. Factors may help (discussed later), but it is still going to be difficult to prove. The Public Order Act 1986 is different because it is easier to prove that someone is likely to be harassed or alarmed. Showing the likelihood of serious distress appears quite challenging.

2.104 The Magistrates Association discussed proof in the context of the Public Order Act 1986:83

We agree with the proposal, for the reasons stated in the consultation paper. We note the query as to how a prosecution would prove a communication was likely to cause harm. As stated in the consultation paper, evidence of a victim being harmed may be determinative in providing proof, but also note that a similar test is currently used in relation to offences under section 5 of the Public Order Act 1986.

2.105 The Law Society of England and Wales questioned how the offence would operate in practice:84

When deciding whether the communication was likely to cause harm to a likely audience, the court must have regard to the context in which the communication was sent or posted, including the characteristics of a likely audience.

This therefore tries to put the communication into context, practically, for the judge or magistrates and asks the court to examine the context of the sending of the communication and the characteristics of the likely audience; but it is still an objective test and there is a risk of inconsistency in the decision-making.

We consider that it will be necessary for there to be very clear guidance and references to examples so that the judiciary understand the context of messages, to understand what was intended and what harm was really caused.

Analysis (likely harm)

2.106 We take seriously all of the concerns regarding proof. Consultees raised convincing arguments that both the actual harm test and the likely harm test would present difficulties of proof.

2.107 However, we are satisfied that the appropriate test is one of likely harm. We reiterate the point above that it is important the defendant be able to predict at the point of sending whether they are about to commit a criminal offence. Further, the mere fact that someone was harmed does not imply that harm was likely (which has bearing on the culpability of the defendant). The jury or magistrate will have to determine as a matter of fact that, at the point of sending, harm was likely. If a person has an extreme and entirely unforeseeable reaction, the element of likely harm will not be satisfied. This, combined with the problems associated with requiring proof of actual harm in certain commonplace circumstances (such as public posts in public fora), as well as the difficulties in proving causation (identified by Jacob Rowbottom), fortifies our view that the benefits of such an approach outweigh the difficulties associated with proving likely harm.

2.108 We further think that those difficulties can be mitigated by clear and appropriate CPS guidance. It is perhaps somewhat difficult to consider in the abstract, but part of the rationale for the harm-based offence is context-specificity and particular scrutiny of the facts.

2.109 For the reasons stated above, we are not convinced that the test of likely harm raises problems for freedom of expression or increases the chances of vexatious complainants.

2.110 As to the question of what “likely” means, our view is that it is not enough that there is a mere risk or possibility of harm. We consider that this would invite too great a degree of interference in expression by the criminal law.85 Conversely, requiring proof that harm was “more likely than not” would be an inappropriately high bar; this sets the threshold of culpability very high and would present real problems of proof. Instead, our view is that there needs to be a “real or substantial risk of harm.” This was the test adopted by the House of Lords in Re H for the purposes of defining the Children Act 1989 standard of “likely to suffer significant harm”.86

Reasonableness

2.111 Finally, a number of consultees have suggested variations on a theme of the “reasonable person” as a way to mitigate what they regard as excessive subjectivity in the harm test. This argument was raised in response to a number of questions, but it is as well to raise it here. There is a particular difficulty with this argument.

2.112 The reasonable person test requires a notion of what harm a reasonable person might have been caused when faced with a particular communication. Who is the reasonable person? The reasonable person is not simply the “majority”; it is a more abstract notion than that (the majority may well be unreasonable). One of the chief problems with the reasonable person test in this context is that it is not clear which attributes the reasonable person must share with the victim. If they share only the fact of being human, and maybe being adult, and maybe speaking the same language, then it is not clear what the test is doing other than replicating the “universal standards” inherent in the categories in the existing offences (the very thing that consultees accepted was unascertainable and an undesirable basis for the criminal law). What if, in the alternative, the reasonable person shares some more of the characteristics of the victim: their age, gender, disability, socio-economic background? All are characteristics that might have legitimate bearing on how someone might respond to a communication. In that case, it is not clear what the reasonable person test achieves that the test of likely harm does not: if the reasonable person is, in effect, the victim, we are instead just asking whether harm was foreseeable, or likely, given the nature of the communication and the victim.

2.113 This works both ways, of course. If a person knows the likely audience of their communication shares their dark sense of humour, there is no reason why that person should be found guilty because a more “reasonable” person would likely have been harmed. Alternatively, if the sender knows the likely audience is unusually susceptible to a particular type of harm and exploits that to their serious disadvantage, the sender should not be able to argue their innocence by appeals to the reasonable person.

2.114 It is also important to stress that there are other constraints within the offence: the sender will have to have intended harm, and also be proven to lack a reasonable excuse. If a person intends harm, and has no reasonable excuse for that, it is not clear why that person should not have to “take their victim as they find them” (which is the basis of quantifying damage in civil law). If, however, a recipient were harmed in a way that a sender did not foresee, then it is very unlikely that the sender intended that harm. We therefore do not think that a reasonable person test would add anything to the existing offence.

LIKELY AUDIENCE

Consultation question 3 - likely to cause harm to likely audience

2.115 Consultation question 3 asked:87

We provisionally propose that the offence should require that the communication was likely to cause harm to someone likely to see, hear, or otherwise encounter it. Do consultees agree?

2.116 We should begin by noting one important mitigating factor that has arisen as a result of our removal of the “awareness of a risk” limb of the offence. Whilst a notable majority of consultees supported this proposal, a number disagreed. One of the reasons for this was that, evidently, the net is cast somewhat wider than under the MCA 1988 (which requires a communication be sent to a person); a person might be harmed by encountering a communication (or by seeking it out) even though the defendant did not intend that person to see it. Concerns were particularly acute in the realm of online political debate. However, although that person may still be in the “likely audience” for the purposes of this element, to be guilty the defendant must have intended harm to the likely audience. Necessarily, then, the defendant must have foreseen, at least in broad terms, who that audience would be and have intended them harm. For the avoidance of doubt, it is not enough that, say, an old email was discovered that someone now considers might likely cause harm; the defendant must have intended it be harmful when they sent it.

2.117 As far as the “likely audience” test is concerned, it is necessary to make two preliminary points. As formulated in the consultation paper and in this report, the offence is complete at the point the communication is sent. Therefore, the assessment of who was likely to see the communication is to be assessed “as at” the point of sending. This is not the same as asking who the immediate recipient was - it is broader than that - but the mere fact that the original message was shared, or the old Tweet sought out some years later, would not imply that that was likely.

Consultation responses

It should not be restricted to those the communication is sent, as that would cause difficulties, particularly in the online environment. It should not be a defence to use an ‘open’ form of communication and say, “ah, but I only intended X to read it”.

...Of course, as you note, this does mean that private conversations are (now) exempt. Currently, if two people engage in an obscene conversation online then, at the very least, it constitutes an offence under [section] 127. That will not be the case now. The argument in favour of such an approach is that two people in a pub having the same conversation would arguably not be prosecuted. The courts have previously justified the distinction where it involves the abuse of a publicly-funded communication network (DPP v Collins (2006)). That argument is arguably no longer tenable, and so the proposal to de facto decriminalise conversations save where the recipient suffers, or is likely to suffer, harm is perhaps more justifiable.

I agree with the suggestion in paragraph 5.80 that who is likely to receive the messages must be an objective test. The nature of the internet means that a subjective test would be problematic. Where a person is sending a message that is capable of harming another, it is not unreasonable for him to understand the potential audience of his [message].

We certainly agree that this requirement must form the outer boundary of any offence. The New Zealand provision concerning information distressing to the subject of it seems to us to go much too far in creating a general crime of making distressing comments about other people.

Refuge welcomes the proposal that the offence should apply when the communication was likely to cause harm to someone likely to see, hear or encounter it. It is a real strength of the model that it is not restricted to communications directly aimed at the victim-survivor. If the offence were to be limited in this way, it would fail to include many of the ways in which perpetrators of domestic abuse and other forms of VAWG perpetrate abuse through communicating publicly on a particular forum or platform, or communicating to a friend, family member or colleague.

We agree. This is dependent on the qualifying second element, namely that the defendant “intended to harm, or was aware of a risk of harming, a likely audience”. Without that intention or knowledge it would be too wide and uncertain.

The APCC agrees with this proposed approach, and the Commission’s assessment that the harm of online communications can extend beyond those towards whom the communication is directed.

...the legislation should explicitly include victims (information subjects)96 who have been affected by the communication, but may have not likely seen, heard or otherwise encountered it.

2.127 By contrast, Dr Elizabeth Tiarks and Dr Marion Oswald discussed alternative means of protecting information subjects:97

The removal of the words “or otherwise encounter it” would narrow the offence to a more reasonable scope. This does not leave indirect victims without redress, as there are alternative methods of tackling indirect harms to “information subjects” through civil law, and specifically the tort of misuse of private information and data protection remedies, which may well be more appropriate to example 1 on page 113 of the consultation document.

Analysis

2.128 Consultees have fortified our provisional view that restricting the scope of the offence to “recipients” would, in the context of modern communications, be too narrow. As we note above, it is our view that the requirement to prove that the defendant intended to harm the likely audience constrains the scope of the offence to justifiable limits (a point echoed by the Bar Council in their response). Equally, it would seem unusual to allow a person to escape liability for onward sharing that was entirely foreseeable to them (they may, for example, have even wished or directed it).

2.129 One issue that arose in the consultation responses was whether “likely audience” should be defined to include “information subjects”. What this means is, if a communication is about someone, but they are not otherwise likely to see, hear or encounter it, whether that person should still be defined as falling within the “likely audience”.

2.130 We do see the force in that argument but, as we noted at para 5.79 in our consultation paper, this would constitute a dramatic expansion of the scope of criminal law. It would almost certainly, for example, re-criminalise defamation. There is undoubtedly harm that flows from the sharing of personal information, but the gravamen of the offence is different from a communications offence (which is directed at the harm to those who witness a communication). These two may overlap - and consultees have agreed that, in some instances such as “doxing”, the communications offence may be applicable - but alternative means of criminalising the disclosure of private information, or criminal defamation, are distinct and discrete legal questions that do not lie within the scope of this report.

2.131 Nonetheless, by contextualising the harm to the likely audience, the offence allows the assessment of likely harm to be set against the particular characteristics of that audience. We consider this a vital element of the proposed offence, and it is the matter to which we now turn.

Consultation question 6 - context and characteristics of audience

2.132 Consultation question 6 asked:98

We provisionally propose that the offence should specify that, when considering whether the communication was likely to cause harm, the court must have regard to the context in which the communication was sent or posted, including the characteristics of a likely audience. Do consultees agree?

2.133 Ultimately, this question reflected our view that harm is inherently context dependent - that what harms one person may be entirely harmless to another. Indecent messages may, in one context, be genuinely harmful (cyberflashing is a good example of this) whereas the same messages passing privately and consensually between two people are clearly not criminally wrongful conduct (quite apart from questions of intention).

2.134 There was strong support for this proposal. By and large, consultees supported our analysis in the consultation paper that set out the importance of having an offence that can take the context of communications into account. Consultees emphasised the importance of considering communications in their proper context rather than in isolation. Many consultees also noted that being able to recognise and address context is central to ensuring that any offence is effective in addressing the harms that victims experience.

2.135 Some consultees’ responses seemed to refer to “protected characteristics” as found in hate crime legislation. This confusion may have arisen from the use of the phrase “characteristics of a likely audience.” We did not mean “characteristics” to have so prescriptive a definition.

2.136 Regarding freedom of expression, there was some divergence among consultees. Some saw this proposal as central to protecting freedom of expression, others saw it as exacerbating the problems they had already raised.

Diversity of communications and victims’ experience

2.137 Many consultees stressed the importance of having an offence that can properly consider the context of harmful communications and the impact they have on victims. Stakeholder groups provided detailed evidence about the impact of online abuse on the people they work with and emphasised how context-driven proposals would better address this than the existing offences.

2.138 A number of consultees noted that this proposal will better allow the intersectional experiences of those who are subject to online abuse to be recognised.

2.139 Demos emphasised the benefits of a context-specific approach in the context of online communication:99

We agree that the offence should make these specifications and be attentive to the context of the communication, including the characteristics of a likely audience. The harms of online communication can be complex and highly specific to a context, meaning that those unfamiliar with the context may overlook how a harm is being perpetrated.

2.140 Fix the Glitch set out their view that any offence must look both at the content and context of communications:100

We agree that the court should examine the context in which the communication was sent or posted, and the characteristics of a likely audience. Online abuse and violence is not solely dependent on the content of the communication made; in some cases, harmful communications can be deemed so because of the context in which they were made.

2.141 Refuge gave examples of communications whose harmful nature is not objectively evident on the face of the communication, but highly context-dependent:101

Refuge strongly agrees that the context of the communication must be considered when establishing whether the communication was likely to cause harm. This is particularly crucial in the context of domestic abuse and other forms of violence against women and girls, where a particular communication might not seem obviously aggressive or threatening to an observer, but when looked at in light of the pattern of abuse carried out, it is highly distressing and harmful.

Examples of communications made to women supported in our services include perpetrators tracking down survivors after they have fled their abuse to a secret location and sending them a picture of their front door or street sign on their road. Whilst a picture of a front door isn’t harmful in most contexts, in the case of domestic abuse it is deliberately sent to cause intense fear and distress. Such communication also has significant implications for women, who will often need to move again to a different area in order to be safe.

2.142 Women’s Aid also noted the context of violence against women and girls (VAWG):102

Experiencing harmful online communications in the context of domestic abuse is also likely to have a significant impact on victims. As the Sentencing Council Guidance on Domestic Abuse makes clear, “the domestic context of the offending behaviour makes the offending more serious because it represents a violation of the trust and security that normally exists between people in an intimate or family relationship. Additionally, there may be a continuing threat to the victim’s safety, and in the worst cases a threat to their life or the lives of others around them.”103 Similarly, victims who are targeted by harmful communications in the context of sexual violence, harassment and other forms of VAWG will experience particularly severe impacts. For these reasons, the context of domestic abuse, sexual violence and other forms of VAWG must be specified as particularly harmful contexts for the court to consider, and data collected on the offence should be routinely disaggregated by offender and audience characteristics, and their relationship.

2.143 The Crown Prosecution Service agreed and referenced the approach in New Zealand to contextual factors:104

We agree. We note New Zealand's Harmful Digital Communications Act 2015 includes a non-exhaustive list of contextual factors including the extremity of the language used, the age and characteristics of the victim, anonymity, repetition and the extent of circulation. Although we recognise that there is a risk that including a list of contextual factors might not capture all relevant factors in every scenario, we would welcome the inclusion of such a non-exhaustive list in the proposed statute.

2.144 The Association of Police and Crime Commissioners agreed with both the proposal and our analysis that prompted it in the consultation paper:105

We agree that the court must have regard to the context in which the communication was sent or posted, including the characteristics of a likely audience, and agree with the Commission’s recognition that “context is often key to understanding how and why a communication is harmful”.

2.145 The Magistrates Association agreed with our characterisation of this proposal as reflecting the “established approach” of courts. They noted further:106

... that other jurisdictions list relevant factors in similar legislation, but agree with the Law Commission’s judgement that this is not necessary in this case, as courts should have discretion to respond appropriately by taking all relevant factors into account.

2.146 English PEN agreed that this proposal is central to protecting freedom of expression:107

Context is crucial and must be considered by the court. The argument in the consultation paper (at para 5.119) is that a simple requirement on the Courts, to have regard to the context in which the communication was sent, should allow for sensible decisions. Coupled with the special protections for freedom of expression set out at section 12 of the Human Rights Act 1998, c.42, it is not unreasonable to assume that the Courts will deliver adequate protection for human rights, when presented with a case at trial.

However, as discussed above we are wary about the undue sensitivity of the audience being a critical factor in a prosecution.

2.147 The Free Speech Union stressed the importance of a context-specific approach in safeguarding freedom of expression:108

We entirely agree. In our respectful view, one of the objectionable features of the case of Markus Meechan, mentioned in connection with Question 5 above, was the apparent holding of the Sheriff, upheld in the Sheriff Appeal Court, that, in deciding whether exhibiting on YouTube a video of a dog doing a Nazi salute was “grossly offensive”, the overall context - clearly that of an intentionally bad-taste joke - was irrelevant.

2.148 Some consultees noted their concerns about the extent to which this proposal would allow any offence to be “weaponised” by a person who claims to be the victim of online abuse. Many of these consultees expressed their concerns in reference to advocating for women’s organisations and single-sex spaces.

2.149 The LGB Alliance disagreed. While they noted that the proposal may seem reasonable, they argued that the practical reality of online mass communication makes it unwieldy:109

While this may seem reasonable, it creates major difficulties on such platforms as Twitter which are open to all. The likely audience is everyone in the world. Given the ease of mass communication, it is a simple matter for a lobby group, such as those asserting the importance of gender identity theory, to organise as part of any audience - making them a ‘likely’ part of that audience. If they so desire, they can assert the right of transwomen to be potential partners for lesbians and claim that those who disagree are harming them. This is not a fictitious example: it occurs on a regular basis.

2.150 Sex Matters disagreed. They noted that a context-driven approach would “give too much discretion to police and CPS”.110 They provided a detailed response discussing at length the various impacts that existing offences have had on gender-critical academics and others who seek to advocate in favour of single-sex spaces.

Analysis

2.151 We take seriously the concerns of those who believe the context-specificity of the offence could be used in a way that runs contrary to the legitimate exercise of Article 10 rights. However, in demonstrating the profoundly divergent emotional responses to communications, the arguments serve primarily to show that any liability based on universal standards will be unworkable.

2.152 However, contentious political discussion is one of the reasons why we think the “without reasonable excuse” element is so fundamental (to which we will turn shortly). Notably, the court must consider whether a communication was or was intended as a contribution to a matter of public interest.

2.153 The other important factor that will provide protection to freedom of expression is the mental element of the offence: where no harm was intended (such as, for example, might be the case in political argument), the offence cannot be made out. Therefore, while we accept arguments - such as those made by the LGB Alliance - that people could “organise themselves” to form part of the likely audience, there are other constraints within the offence that will prevent the criminalisation of ordinary (even if highly contentious) political discourse.

2.154 Some of the examples provided by consultees demonstrating the context-specificity of harm were striking - such as the example of the photograph of the front door provided by Refuge - and echo some of the examples of which stakeholders have spoken throughout this project. These have fortified our view that the model of offence we have proposed is appropriate.

THE MENTAL ELEMENT - INTENTION

2.155 As noted at the beginning of this Chapter, the offence we recommend should require proof that the defendant sent a communication intending to cause harm to the likely audience. In the context of criminal law, where it is a result that must be intended, intention means that a person did an act “in order to bring about” a particular result.111 It is to be distinguished from recklessness (which requires merely an awareness of a risk of that result). This was not what we proposed in the consultation paper: the offence that we provisionally proposed could be made out by proving either an intention to cause harm or awareness of the risk of causing harm.

2.156 We considered that this dual form of mental element was necessary because we had seen - and have continued to see - many examples of harmful behaviour where it is not clear that the defendant intends harm. Importantly, not all of this is wrongful behaviour - hence the requirement that the prosecution also prove a lack of reasonable excuse. However, some of the behaviour clearly is wrongful, such as the people whom we have heard send flashing images to people suffering from epilepsy “for a laugh”.

2.157 Nonetheless, the context of the criminal law is important, and that context has shifted since we published our consultation paper. The Government is likely to introduce an Online Safety Bill112 that would impose a duty of care on organisations hosting user-generated content. That duty of care will obviously not cover communications that take place offline (or even off the respective platforms), but the nature of communication is such that it will cover a very significant proportion of harmful communications. It is also our view that platforms are in a far better position to prevent harmful content than the criminal law; criminal justice is an expensive tool with limited resources to deploy.

2.158 However, and this is crucial, our recommendation that the offence should require intention only and not awareness of risk in the alternative is based on the existence of a regulatory regime that covers some form of legal but harmful content. If the duty of care is limited to the scope of the criminal law (ie platforms only have a duty to prevent criminal offences online), or simply doesn’t exist, then the criminal law would have to step into the void.

Consultation question 8 - awareness of a risk of harm and intention

2.159 Consultation question 8 asked:113

We provisionally propose that the mental element of the offence should include subjective awareness of a risk of harm, as well as intention to cause harm. Do consultees agree?

2.160 One thing worth noting is that the responses occasionally describe the two forms of the offence as the “intentional” form and the “reckless” form. For the avoidance of doubt, recklessness requires proof that the defendant was aware of the risk (in this case, risk of a result, that result being harm) and that, in the circumstances as known to the defendant, it was unreasonable to take the risk.114 This is a common law definition. We separated these elements in the offence - rather than including the word “reckless” - because of the very particular role played by reasonableness in the context of this offence and how careful we needed to be in defining it. We therefore considered it important not to leave the scope of the offence to be affected by any change in the definition of recklessness.

Consultation responses

2.161 Victim and law-enforcement stakeholders generally expressed strong support for the proposal, arguing that the more flexible an offence the better it can ensure that all culpable behaviour is addressed. Consultees also by and large agreed with our analysis that a “reckless” form of the offence would address behaviour not currently addressed by the existing communications offences.

2.162 A range of concerns were raised, primarily concerning freedom of expression and potential over-criminalisation. Stakeholder groups with particular interests in freedom of expression noted their concern that including awareness of a risk as a fault element would have too great an impact on communications and would likely “chill” expression to a substantial degree. In particular, they argued that, in combination with the “likely harm” and “likely audience” criteria and the threshold of “serious emotional distress”, the lesser fault element would criminalise large swathes of speech that are not currently criminal.

2.163 Further, legal stakeholders (for example the Law Society of England and Wales) noted that the potential breadth of an offence with awareness of a risk as a fault element may criminalise unintentional behaviour and have a disproportionate impact on young people. Their concerns were heightened given the nature of online communications, where the impact of a communication is more difficult to gauge, and the effort required to send or post it is minimal. The potential consequences of a criminal conviction might significantly outweigh the culpability.

2.164 In contrast, Professor Tsachi Keren-Paz outlined his view that awareness of the risk of harm would be a preferable threshold for the mental element:115

Absolutely. A major weakness of [section] 33 of CJCA 2015 [“revenge porn”] is its insistence on intention to cause distress. The analysis in the consultation paper is robust.

2.165 Young Epilepsy outlined some of their own experience of receiving malicious communications and their preference for a mental element requiring awareness of a risk of harm: “The malicious communications we have received imply that the perpetrators have (at the very least) a subjective awareness of the risks associated with sharing flashing images with the epilepsy community.”116

Including recklessness is important if it is going to be a harm-based offence. Proving that a person knowingly sent a communication intending it to cause serious emotional distress could be difficult. Recklessness is common with harm-based offences. There will still be the requirement that a person knowingly or intentionally sends a communication, and so the recklessness only applies to the harm element.

Yes, Refuge agrees that any new offence should include subjective awareness of risk of harm as well as intention to cause harm. Such an approach would lead to a more workable offence that would capture circumstances where evidence of intention is challenging, but where it was clear that a perpetrator was aware their communication could cause harm but were reckless or disinterested in this risk.

We agree with this approach, and the Commission’s assessment that including “subjective awareness of a risk of harm” within the offence would be consistent with previous recognition within law of culpability when clear intention to harm is not present, e.g., in the legal concept of “recklessness”.

We also agree that including a subjective awareness of a risk of harm could help to prosecute instances of cyber-flashing, where the perpetrator’s primary intention may not be to cause harm. We believe that focusing too narrowly on the perpetrator’s intention will draw focus away from the risk of harm to a likely audience, and therefore dilute the proposals’ overall “harm-based” approach.

We agree. While [section] 1 MCA 1988 and [section] 127 CA 2003 require intent, many other existing criminal offences already incorporate degrees of culpability from intent to recklessness to correspond with a scale of sentencing options. This approach has worked well in other offences such as those under the Protection from Harassment Act 1997.

The Law Commission considers that in some circumstances, abusive communications should be criminalised even if the motivation is not to cause harm. The anecdotal experiences of CPS prosecutors tend to support this insight. This approach may help to avoid the technical issues that arise for the prosecution where the defendant’s motivation may include other factors such as ‘having a laugh’.

We support the Commission’s proposal that the mental element includes not only an intention to cause harm, but an awareness of the risk of harm (para 5.148). This will help to ensure that a wider range of perpetrators are held responsible for their abusive acts. It will also make prosecutions more likely as demonstrating proof of a direct intention to cause harm is particularly challenging in the online environment where there is often little additional material or context other than the abusive communication.

2.171 The Magistrates Association accepted our analysis in the consultation paper and supported the proposed lower fault element:122

We appreciate that including the second test of the mental element that the defendant should be aware of a risk of harm does widen the scope of the offence. However we note the Law Commission’s intention not to criminalise harmful but not culpable behaviour, and support their argument that introducing this test will ensure those who were aware of the likely harm of their communication, even where it cannot be proven they intended harm, are potentially culpable. It is also important that those who are fully aware of the likely harm due to a communication, but where intended harm is not the primary driver for the behaviour, are held to account.

2.172 The Criminal Bar Association agreed that a scope beyond full intention was appropriate, but argued that it would be more appropriate to express this as “recklessness” rather than as a mere “risk”:123

Yes, we agree that some subjective element is necessary and we understand the offence as currently drafted requires D to be subjectively aware of a risk of harm as defined in the section i.e. serious emotional distress, not just a risk of any harm. We consider for the reasons set out above (see paragraphs 29-38) that the subjective awareness of risk of harm necessary for an offence to be committed should be expressed as ‘recklessness’ rather than mere ‘risk’.

2.173 However, a number of consultees were concerned about the potential for over-criminalisation. For example, the Law Society of England and Wales set out their concerns about the proposed “lower” fault element:

It has been suggested by practitioners that immature teenagers, in particular, would not see the risk of harm. There is a considerable danger to young people who, isolated in their rooms and among their group of friends, may say things that are ill-thought out or are, in fact, intended to shock. Many of these young people will never have been in trouble in their lives and their futures might be blighted by a criminal conviction for something they will never have realised was an offence.

There is a danger of imposing a current moral norm on someone and for that it is suggested there would need to be an objective test - the ordinary reasonable person on the Clapham Omnibus. The question would become whether or not a reasonable man would see the risk of harm.

2.174 Dr Elizabeth Tiarks & Dr Marion Oswald noted that the proposal may be overly broad and referenced their recommendation to require a “significant risk” of harm in any lower fault element.124

2.175 The Justices’ Legal Advisers and Court Officers’ Service (formerly the Justices’ Clerks’ Society) argued that “awareness of risk” of harm was a disproportionately low fault element for a relatively serious criminal offence:125

We can see why you would wish to create something wider than intention, for the reasons you give. However, we think “awareness of a risk” of harm is too low a threshold for criminal intention for such an offence with such high consequences.

For comparison, and appreciating the different purposes behind the legislation, it certainly wouldn’t be considered appropriate to add it into section 4A Public Order Act 1986 for example.

2.176 Certain consultees also had concerns about the implications for freedom of expression flowing from a relatively low mental threshold. Kingsley Napley noted their concerns: “We believe that subjective awareness may go too far. To preserve freedom of speech, we recommend that the offence is limited to the intended harm.”126

2.177 The Free Speech Union disagreed and set out their preferred approach:127

... Most people engaging in robust speech are likely to be aware that what they say may cause severe distress to at least some people. In the absence of any intent to cause distress, we are not convinced that that feature alone should be sufficient potentially to criminalise their speech.

2.178 ARTICLE 19 set out their preference for requiring “full” intention rather than awareness of a risk of harm:128

...it would have a significant chilling effect on public conversations on contentious topics since the risk of harm and therefore awareness of that risk would be inherent to the topic. To give an example, an ordinary user sharing disturbing footage about a terrorist attack for journalistic purposes would almost inevitably upset [victims’] families. The truth can often be seriously distressing.

2.179 English PEN accepted that some form of recklessness could appropriately be captured, but argued that it must not disproportionately impact freedom of expression:129

We agree that there should be a mental element to the proposed new law. When a message has been targeted or addressed to an individual, then it would be appropriate to include a ‘recklessness’ limb as an alternative to direct intent.

However, when a message is broadcast to the world on a blog or social media, an ‘awareness of risk’ or ‘recklessness’ limb would be inappropriate. Due to the high likelihood that controversial or offensive messages can cause ‘serious emotional harm’ to someone on the internet, a ‘risk of harm’ limb would become trivial to prove, and effectively turn the offence into one of strict liability.

Analysis

2.180 Under Article 10 of the European Convention on Human Rights (ECHR), interferences in freedom of expression should be necessary. We are bound to consider the ECHR implications of laws that we recommend. We acknowledge that an offence based on awareness of a risk constitutes a greater (arguably, a quite considerably greater) interference in freedom of expression than does the intentional offence. In one important sense, however, we have built Article 10 protection into the offence, in requiring proof that the defendant lacked reasonable excuse (and it is this test which imports consideration of political debate). Nevertheless, we have to consider whether there is a lesser interference that could achieve the legitimate aim, especially considering the weighty burden of justification for imposing criminal sanction. To this end, we cannot ignore the regulatory regime that the Government is planning to introduce in respect of online harms.

2.181 We have sympathy with those consultees who argued that there was clearly scope for an offence based on awareness of risk: in one sense, we entirely agree. Where we might differ is whether the most effective or appropriate response is a criminal one. Given the significant work being done by the Government to improve safety in the context of online communications, and given the scale of online communications as a medium of communication, we are of the view that a criminal law response to deal with merely reckless communications, rather than intentionally harmful ones, is not necessary (and, by extension, not proportionate).

2.182 We equally have sympathy with those consultees who questioned the implications for children and young adults, and for freedom of expression, of having an offence based on awareness of a risk of harm. (We also address these matters in the introduction of this report, at paras 1.50 to 1.57).

2.183 The chief benefit of requiring proof of intention is that it helps to constrain the scope of the offence to the genuinely culpable, relieving the “without reasonable excuse” element of much of its burden. For example, a doctor delivering bad news to a patient - where serious distress may well be likely - is absolutely not in the scope of the offence; quite apart from the fact that they obviously have a reasonable excuse, they have also not acted in order to bring about any harm (indeed, they almost certainly regret the harm that they are causing). A similar argument could be made in response to ARTICLE 19’s example concerning images of a terrorist attack shared for journalistic purposes.

Proving intention

2.184 We recognise that it will be more difficult to prove intention than it would have been to prove that the defendant was merely aware of a risk. However, we do not think that it will be an impossible burden: proof of intention is a very common requirement in criminal offences. Often, the content and context of a communication may well be enough for the jury to infer that the defendant must have intended harm (even if the defendant claims otherwise). Consider, for example, a person sending images of corpses in a concentration camp to an elderly Jewish person: the defendant in that case would struggle to marshal any credible defence to the contention that they intended harm.

2.185 The jury can also consider the defendant’s foresight of the natural and probable consequence of a communication as evidence that it was intended. However, and importantly, it is absolutely not the case that intention can be established “so long as” the result was a natural and probable consequence of the defendant’s conduct, or even so long as the defendant foresaw the result as a natural and probable consequence. Evidence of the natural and probable consequence (or the defendant’s foresight thereof) may be evidence of intention, but this is not the same as saying that such evidence alone will inevitably suffice to establish intention. As we note below, for the natural and probable consequence to suffice as evidence of intention, the likelihood of that consequence (ie the degree of probability) would have to be overwhelming.

Direct and indirect/oblique intention

2.186 In order to understand how evidence of the natural and probable consequence of conduct informs the question of intention, it is first necessary to distinguish between direct intention and indirect (or oblique) intention. These are the two “processes” for establishing intention.

2.187 A person will normally be taken to have intended a result if they acted in order to bring that result about.130 This is direct intention, to be contrasted with indirect intention. For the latter, a person may be taken to have intended to bring about a result if that person thought that the result was a virtually certain consequence of their action.131 However, even if the result were a virtually certain consequence, the jury does not have to find that it was intended; they may nonetheless decide that a result clearly was not intended (as would be the case in the example of the doctor above).

2.188 Either definition of intention can suffice for a finding that a defendant had the necessary intent to be guilty of an offence. However, it is rare that a direction to the jury concerning foresight and indirect intention would need to be given: it is only necessary in those situations where the evidence would suggest that the defendant’s motive was not to bring about the relevant result.132

Natural and probable consequences

2.189 Evidence of the defendant’s foresight of the consequences may form part of the evidence that they acted in order to bring about the result (ie evidence of direct intention), but it will only suffice to establish intention (in this case, indirect intention) if the defendant foresaw the consequence to a very high degree of probability.

2.190 It is never enough to point to the natural and probable consequences of conduct and, on that basis alone, prove intention. This principle was clearly established by the House of Lords in Moloney.133 Juries, when considering whether there is evidence of intention, may draw inferences from the natural and probable consequences of the defendant’s actions, but this is not the same as saying that intention can be established so long as the result was the natural and probable consequence of the defendant’s conduct. The question for the jury is whether the defendant intended the outcome (or whatever matter to which the intention need be directed) based on an assessment of the evidence. In establishing direct intention, the defendant’s foresight of consequences might be evidence that they intended them, and the natural and probable consequences of their conduct might be evidence of that foresight, but “foresight and foreseeability are not the same thing as intention”; they are matters of evidence and not legal presumption.134

2.191 In his speech in Hancock, Lord Scarman remarked:

... the greater the probability of a consequence the more likely it is that the consequence was foreseen and that if that consequence was foreseen the greater the probability is that that consequence was also intended.135

2.192 However, for such foresight to suffice in establishing intention, the probability must be high indeed. On this point, Lord Scarman cited with approval the part of Lord Bridge’s speech that considered probability:

... the probability of the consequence taken to have been foreseen must be little short of overwhelming before it will suffice to establish the necessary intent.136

2.193 This was confirmed subsequently in Nedrick137 and Woollin.138 In Nedrick, concerning the intention element of murder (the defendant must intend death or serious injury), Lord Lane CJ held that:

... the jury should be directed that they are not entitled to infer the necessary intention, unless they feel sure that death or serious bodily harm was a virtual certainty (barring some unforeseen intervention) as a result of the defendant’s actions and that the defendant appreciated that such was the case.139

2.194 This approach was confirmed by Lord Steyn in Woollin, who was at pains to draw the fundamental distinction between intention and recklessness. In Woollin, the trial judge had directed the jury that they could infer intention so long as they were sure that the defendant foresaw a “substantial risk” of serious injury. Lord Steyn held that this was a material misdirection: substantial risk was clearly a wider formulation than virtual certainty, and thus unacceptably blurred the line between intention and recklessness.140

2.195 It is therefore not correct to suppose that a finding that a consequence was merely probable would lead inexorably to a finding that it was intended. Further, we must emphasise that even a virtually certain consequence does not lead inexorably to a finding of intention: the jury may nonetheless find that there was no intention. So a doctor delivering bad news to a patient, or a manager making an employee redundant, or a professor telling a student that they had failed a module, may know that their communications will lead to serious distress, but that does not mean that a jury would find that they intended that serious distress (and, in any event, they are all outside the scope of the offence because none of them lacks a reasonable excuse).

Purpose

2.196 We also considered whether framing the fault element in terms of the defendant’s purpose was preferable. We concluded that it was not.

2.197 “Purpose” is more restrictive than intention: it speaks of a person’s motive, their reason for action, the outcome they desire. Proof that a person foresaw a result as probable, even highly probable, will not be enough to sustain a finding that they desired the outcome. The terrorist might place a bomb in a car park and provide fair warning to help ensure that the car park is evacuated: their purpose is not to kill (albeit that, when the bomb detonates and kills another, we would have no problem saying that they intended to kill or cause serious injury).

2.198 However, purpose is not a particularly common way of defining the fault element in crimes. Sometimes it is entirely appropriate, and there are certainly examples.141 Two would include: section 1 of the Official Secrets Act 1911 (which makes it an offence for a person to do certain acts for a “purpose prejudicial to the safety or interests of the State...”); and section 67A of the Sexual Offences Act 2003 (which prohibits “upskirting” for the purpose of obtaining sexual gratification or causing alarm, distress or humiliation). Nonetheless, defining the scope of a person’s purpose is not always an exact science. For example, in Chandler v DPP,142 several persons organised a “ban-the-bomb” demonstration at a Royal Air Force base and were convicted of conspiring to commit a breach of section 1 of the Official Secrets Act 1911. The House of Lords upheld their conviction on the basis that their relevant - or immediate - purpose was not to get rid of nuclear weapons (albeit that that was their object), but rather to obstruct aircraft. Whilst this inexactitude will likely present few problems in respect of defining a purpose for sexual offences - the range of ulterior purposes beyond the sexual or abusive for taking non-consensual private, sexual images of another may be somewhat limited - a purpose does become increasingly difficult to establish where people’s reasons for acting may be as many and varied as they are in the field of communications.

2.199 Our concern with purpose in the context of the harm-based offence is not only that it may be evidentially difficult to establish, but also that it is likely to be too restrictive. It is not difficult to imagine a most abhorrent and genuinely distressing communication where a defendant might claim that their motivation was not to cause serious distress, but where serious distress was so overwhelmingly likely that the culpability was still high. If, for example, a person sent flyers to nursing homes suggesting that their residents be forcibly euthanised - and did so with a view to saving the state money (or at least a stated view to that effect) - it seems that their culpability is not so far removed from that of the person who wrote the same messages desiring serious distress as the outcome. Purpose, then, would fail to capture some genuinely harmful and - importantly - culpable communications.

2.200 It is for a similar reason that we should be slow to adopt for our offence the wording from section 47(7)(b) of the Serious Crime Act 2007, which (as with section 44(2) of the same Act) was intended to preclude a finding of indirect intention:143

D is not to be taken to have intended that an act would be done in particular circumstances or with particular consequences merely because its being done in those circumstances or with those consequences was a foreseeable consequence of his act of encouragement or assistance.

2.201 On its face, of course, section 47(7)(b) says nothing that is not already trite law.144 As we note above, evidence of foresight might strongly favour a finding of intention (the “irresistible inference”145), but the mere fact that consequences were foreseen does not lead to a legal presumption of intention. The wording of section 47(7) - “taken to” - serves only to preclude a presumption of intention based merely on foresight; it does not preclude the use of foresight as evidence of intention. Even indirect intention requires proof of more than merely foreseeable consequences: the consequences must be foreseen and virtually certain (and, of course, foreseen as such).

2.202 However, even if this section is interpreted in such a way as to preclude a finding of indirect intention, we do not recommend adopting it for the purposes of the harm-based offence. To exclude a finding of intention based on virtually certain outcomes, where evidence of the defendant’s motive may otherwise be in short supply, would make this offence very difficult to prosecute, and would fail to capture culpable behaviour.

2.203 We are therefore satisfied that intention is the most appropriate formulation for the fault element of the offence. It is sufficiently narrow in scope that it addresses concerns about an overly broad offence, without presenting an insurmountable evidential hurdle for the prosecution.

WITHOUT REASONABLE EXCUSE

2.204 This was one of the most widely misunderstood parts of our consultation paper, though also one of the most important. Many consultees thought that we had proposed a defence of “reasonable excuse”, ie the burden would fall on the defendant to prove that they had a reasonable excuse, lest they be found guilty of the offence. This was not what we proposed. Instead, our proposal was that the prosecution should be required to prove as part of the offence that the defendant lacked a reasonable excuse - and that needed to be proven beyond reasonable doubt.

2.205 It was for this reason primarily that we considered the offence complied well with Article 10 ECHR. Unless the court was sure that the defendant lacked a reasonable excuse, considering also whether the communication was or was intended as a contribution to the public interest, the defendant would not be found guilty.

2.206 It is important also to recall that we are recommending a different and narrower fault element for this offence than we had proposed in the consultation paper. We originally proposed that the offence could be made out if the defendant were aware of the risk of harm or intended harm. We now recommend that the offence can only be made out where the defendant intends harm. The range of communications for which there might be a reasonable excuse is far smaller where harm is intended than where harm is foreseen as a risk. For this reason, many of the concerns raised by consultees (below) in relation to the reasonable excuse element are greatly diminished or simply do not arise.

2.207 Alisdair Gillespie’s response articulated the tension between a safeguard forming part of the conduct element of an offence (which is important) and terming the safeguard an “excuse”: the latter appears to frame the issue in a way that puts the burden of proof on the communicator, even though our analysis in the consultation paper and reasoning for including the safeguard seek to ensure that the onus is on the prosecution. However, the term “reasonable excuse” does have some advantages over simply asking whether the communication was (or was not) “reasonable”. Whilst reasonable excuse allows some scope for assessing the content of the message as well as the reason for its being sent, the assessment of whether a communication was reasonable necessitates an assessment of the content. It requires assessment of whether the communication was itself reasonable - ie does the content meet a standard of reasonableness - quite apart from whether the excuse for sending the communication was reasonable.

2.208 Take the example of a satirical piece of writing on a matter of intense political interest. Under our recommendations, this is very unlikely to fall within the scope of the offence (not least because of the requirement that the sender intend harm): the prosecution would have to prove that the defendant lacked a reasonable excuse; the defendant would argue (successfully, we submit) that political satire is a perfectly reasonable excuse for sending a communication that may cause a person harm.146 However, if the question were instead whether the piece of satire was reasonable, it is not at all clear by what metric we could answer this question. Of course, we could consider such matters as who was likely to see it or the likely harm, but these are already elements of the offence, so it becomes an exercise in circular reasoning. Many communications are simply not amenable to a “reasonableness standard”. We therefore remain of the view that “lack of reasonable excuse” is the better formulation, despite the risk that it be viewed incorrectly as a defence.

2.209 We were also aware that “reasonable excuse” outside of a specific context might be vague. It is hard to impose prescriptive factors that will apply in all situations; the circumstances in which the issue might arise are infinite. Indeed, “reasonable excuse” does not have any precise definition in law. In R v AY,147 the Court of Appeal held that “[t]he concept of ‘reasonable excuse’ is par excellence a concept for decision by the jury on the individual facts of each case.”148

2.210 However, one category seemed to call for particular consideration when determining whether a communication was reasonable, and that was whether it was or was intended as a contribution to a matter of public interest. If an exercise of freedom of expression bears on a matter of public interest, the ECtHR will be less ready to find that an interference - especially a serious interference like a criminal conviction - will be compatible with Article 10.149

2.211 Consultees expressed very strong support for this proposal. A minority were concerned that such a consideration could be used by defendants as an attempt to “justify” otherwise clearly harmful content, though we should note that the public interest does not override the operation of the reasonable excuse element; it is simply a matter that must be considered. A person may be communicating on a matter of public interest, but that does not lead inexorably to a finding that the person had a reasonable excuse. Given the nature of many of the responses on this point, which did not appear to acknowledge this feature of the proposal, it is clear that we could have expressed this more explicitly in the consultation paper.

2.212 Other consultees noted the imprecise nature of the public interest. However, assessment of the public interest in the field of communications is not unknown to the law. For example, under section 4(1) of the Defamation Act 2013, it is a defence to an action for defamation for the defendant to show that the statement complained of was, or formed part of, a statement on a matter of public interest. Courts have nonetheless been reluctant to constrain the concept in precise ways; as noted by Warby LJ in Sivier v Riley, “it is common ground that [the notion of public interest] is necessarily a broad concept.”150 In any case, we reiterate that the public interest consideration does not mandate a finding one way or the other as to the reasonable excuse element; this provision is simply intended to ensure that public interest matters are at the forefront of the court’s mind. This reflects the importance that the European Court of Human Rights attaches to speech concerning matters of public interest.

2.213 We will first note the responses concerning the reasonable excuse element, before then considering the responses relating to the public interest part of that element. We will present a combined analysis at the end of this section.

Consultation question 11 - reasonable excuse

2.214 Consultation question 11 asked:151

We provisionally propose that the offence should include a requirement that the communication was sent or posted without reasonable excuse, applying both where the mental element is intention to cause harm and where the mental element is awareness of a risk of harm. Do consultees agree?

2.215 Consultees expressed strong support for this provisional proposal. Even those who opposed the provisionally-proposed offences noted that, should they be taken forward, they should include a requirement that the communication be sent without reasonable excuse as a “crucial safeguard”.

2.216 A substantial number of legal stakeholders agreed with our analysis in the consultation paper and set out various ways in which the concept of “reasonable excuse” was well-known to the law. Further, many noted the greater protection afforded by ensuring that “reasonable excuse” is not a defence, but rather something to be disproved to the criminal standard.

2.217 The Justices’ Legal Advisers and Court Officers’ Service (formerly the Justices’ Clerks’ Society) noted their preference for a number of separate offences arranged in a hierarchy, similar to the scheme in the OAPA 1861. Chara Bakalis made a similar observation in her general response. Each notes the difficulties that arise with our provisionally-proposed offences in relation to what constitutes a sufficient “harm” and what may amount to a “reasonable excuse”. They argue that the provisional proposals try to act as “catch-all” offences, and that it may be more appropriate to adopt instead narrower, more targeted offences to address the varying types of harm.

2.218 Despite supporting the safeguard itself, consultees expressed concern with the potential vagueness of the concept “reasonable excuse”. A number suggested that a non-exhaustive list of factors could be used to assist in giving more content to the concept. However, as noted by the legal stakeholders, “reasonable excuse” is a relatively familiar concept to the law and to courts. To that end, it may well be useful to have some form of guidance as to what may constitute a “reasonable excuse” that is not in legislation: possibly in specific police material or CPS guidance. Consultees commended the guidance under the New Zealand model in this context.

2.219 On the topic of “reasonable excuse” being known to the law, the APCC’s response referencing the use of the concept in coronavirus restrictions and regulations152 is worth noting. The concept has been used extensively as a way to ensure there were adequate safeguards in place when the coronavirus restrictions were applied. However, there have been numerous examples of inconsistencies in approach by police.153 To that end, the potential formulation of police guidance or similar resources to assist in consistent application may be useful.

2.220 The Justices’ Legal Advisers and Court Officers’ Service (formerly the Justices’ Clerks’ Society) agreed and discussed a number of the examples used in the consultation paper:154

We agree that reasonable excuse is a necessary requirement in the proposed offence. Your examples are interesting.

Take the example of the ending of a relationship. People end relationships in sometimes very traumatic ways. If someone sends an electronic message saying “it’s over,” clearly that is not criminal in the proposed offence. They have a reasonable excuse (i.e. autonomy to live their life how they see fit).

However, consider if they said: “it’s over, I hate you. I’ve hated you for a long time. You were useless at a, b and c, and I only stayed with you because of x, y and z” (i.e. they deliberately said things to hurt the other person). Perhaps accompanied by obscenities or reference to very personal things. If those words were spoken in the home to the partner, none of that would likely be criminal.

Yet in the proposed offence, it is sailing close to the wind; the hurtful comments are arguably not said “without reasonable excuse” (there being no good reason to say them other than to cause upset), they could cause emotional distress, but surely we do not wish to criminalise such comments when put in messages, yet not when spoken?

This is one of the issues with the harm based approach we feel. The New Zealand list of guiding considerations is therefore useful.

2.221 Fair Cop agreed:155

We support this, as presumably it will go to protect Article 10 rights in that contribution to a genuine and necessary debate will presumably be seen as a 'reasonable excuse'.

2.222 Alisdair Gillespie agreed with both the importance of a “reasonable excuse” safeguard and also its inclusion as part of the offence itself as opposed to a defence. He discussed some of the possible issues that may arise in the context of communications that are not inherently “wrongful”:156

Stating that the communication must have been sent without reasonable excuse could be seen as an essential safeguard. If adopted, I also agree that it should constitute part of the actus reus of the offence and not a defence. This is important not only because it clarifies that the prosecution need to prove the absence of a reasonable excuse, but also because of the logic of offences.

A defence is only required where the prosecution have proven the actus reus and mens rea. In other words, a person is prima facie guilty unless they have a defence. That would be wrong where a person had a legitimate reason for sending the communication. By including it within the offence, it is clear that there is no presumption that they have acted wrongly, and need to prove that they did not act inappropriately.

However, the terminology is arguably problematic. ‘Reasonable excuse’ still carries the suggestion that a person should not have said what they did. It is an excuse, a way of showing that the wrong is not culpable. Yet will there be a wrong?

I know that my student has had the maximum number of condoned fails permissible. If she fails my 5,000 word essay then she cannot be awarded a degree.

Unfortunately, the essay is rubbish. I send the feedback and the low mark to her. This causes severe emotional distress because she realises that she has no degree after four years of work.

Why do I need an excuse to send this feedback? Surely, it is a legitimate thing for me to do? Does it form a matter of public interest (see next response)?

2.223 The Association of Police and Crime Commissioners agreed that this aspect of the offence will ensure that both “legitimate” communications and freedom of expression more broadly are protected:157

We agree that the offence should include a requirement that the communication was sent or posted without reasonable excuse. This will help to ensure that legitimate communications which could risk causing emotional distress are not criminalised (e.g., sending a formal communication regarding a change in legal status, such as refusal of housing, etc.).

We also believe that including the reasonable excuse element will help to ensure that freedom of expression is protected, and recognise the previous examples in law where the reasonable excuse concept appears, the most recent example being the Health Protection (Coronavirus, Restrictions) (England) Regulations 2020.

2.224 English PEN emphasised that the “reasonable excuse” aspect of the proposals is an essential safeguard in ensuring an appropriate scope. They argued that examples of what a “reasonable excuse” might be should be set out either in the offence itself or in a schedule:158

The requirement is essential for limiting the scope of the law. If sub-section (3) were absent, the proposed law would amount to a disproportional interference with Article 10.

We recommend that examples of ‘reasonable’ excuse be set out within the offence, or within a schedule.

The Law Commission’s parallel consultation on hate crime analyses the explicit protections for freedom of expression afforded by the Public Order Act 1986, at sections 29J and 29JA. The proposed new communications offence should include an analogous provision.

We enthusiastically agree. The criminal law has no business whatever sanctioning people who say things which they have a reasonable excuse to say.

We would add that in our view the absence of reasonable excuse should be a substantive element in any offence, rather than its presence a specific defence. Hence it is our view that any burden on the defendant should be merely evidential: provided the defendant raises some evidence of reasonable excuse, it should then be up to the prosecution to prove its absence beyond a reasonable doubt.

We agree that any new offence should include a requirement that the communication was sent or posted without reasonable excuse. Especially if the second mental element test is that the defendant was aware of a risk of harm, rather than a likelihood of harm. Even if the second test is that the defendant was aware of a likelihood of harm, it is still important that behaviour that is likely to cause harm is not criminalised where there is good reason for someone carrying out that behaviour. The examples given in the consultation paper show that there are occasions where someone will send or post a communication that they are aware is likely to cause harm, but where there should be no criminal liability (for example when breaking up with someone or informing them of bad news). We agree that the phrase “without reasonable excuse” is straightforward, and therefore arguably preferable to a recklessness element.

We agree. The circumstances should make this obvious in most cases. Where a potentially reasonable excuse exists (or is raised in evidence by the defence) it must be for the prosecution to disprove it. The Law Commission point out in §5.163 the type of communication which foreseeably causes distress but is a necessary part of social life, such as a doctor’s assistant notifying a person that their child has been diagnosed with a serious illness. By making a reasonable excuse part of the offence to be excluded by the prosecution, the proposed definition avoids causing unnecessary invasion into personal lives (and so avoids offending against Art.8 ECHR).

If the offence is limited to an intention to cause harm then this may not be necessary as an individual would not be guilty unless they intended harm to be caused. If they did intend to cause harm to a ‘likely’ (or in our preference, ‘intended’) audience then it is difficult to comprehend what the reasonable excuse would be...

In criminal law generally, the law has two approaches to reasonable excuse, one where it is raised as an issue by the defence - the prosecution then has to disprove it, e.g. the example you give.

The alternative is that the defence has the burden of proving on the balance of probabilities that he had a reasonable excuse and relies on that defence, e.g. having an offensive weapon in a public place. We prefer the second approach: where online communication is liable to cause serious harm to the likely audience, the defendant should have the burden of proving that he has a reasonable excuse.

While we understand the intention behind the inclusion of a without reasonable excuse requirement, we are concerned that, unless tightly defined, this requirement could be manipulated by perpetrators of abuse to evade criminal responsibility for their actions.

Perpetrators of domestic abuse and other forms of VAWG will often try and justify and excuse their abuse as ‘showing they care’, for example the specialist tech abuse team at Refuge often hear perpetrators try and justify the constant surveillance, monitoring and control of women by saying that they love and care about them so need to know what they are doing and where they are at any time.

2.231 Jacob Rowbottom cautioned that “once the threshold of the offence has been met, the expression is more likely to be seen as ‘low value’ and the reasonable excuse clause will be applied sparingly.”135 Further, Rowbottom suggested that many cases would be challenging on their facts:

For example, a parent with strong religious beliefs may attempt to persuade his or her daughter not to have an abortion by sending a message that includes images of an aborted foetus. If the court concludes that the communication is likely to cause severe emotional distress to the recipient, would such an attempt at persuasion be reasonable or not?

Consultation question 12 - public interest

2.232 Consultation question 12 asked:136

We provisionally propose that the offence should specify that, when considering whether the communication was sent or posted without reasonable excuse, the court must have regard to whether the communication was or was meant as a contribution to a matter of public interest. Do consultees agree?

This aims to strengthen protection for freedom of expression under Art. 10 ECHR (in addition to reasonable excuse). The Law Commission considers that in matters of public interest, many of which may be sensitive, there should not be a criminal sanction for communications even if they may cause serious emotional distress.

This fits with the proposed new offence seeking to exclude news media, broadcast media and cinema. Broadcast media were also excluded from s.127(4) CA 2003 and criminal libel was abolished in 2010.

We note there is a range of case law on matters of public interest, particularly with regard to defamation and libel cases against newspapers. It is also set out as a defence in the Defamation Act 2013 s.4.

We agree that the proposed new offence should specify that, when considering whether the communication was sent or posted without reasonable excuse, the court must have regard to whether the communication was or was meant as a contribution to a matter of public interest, as it is vitally important to protect freedom of expression in a democratic society... We would support the case-by-case nature of this proposed provision in that the extent of the ‘likely harm’ caused should be balanced against the right of freedom of expression, freedom from discrimination and right to a private life.

We broadly agree with the position as set out. Genuine efforts, such as those described in the consultation paper, to engage on matters of public interest should not be criminalised. The test should, however, be an objective one. It should not be a defence for a person to have a subjective but unreasonable belief that they are contributing to a debate of public interest.

We agree that the offence should make these specifications, and we believe the court should be given extensive guidance in this regard to reflect the complexity of considering public interest here. An area of our recent research has highlighted, for example, how harms can occur even in a context of apparent public interest that is addressed in the Law Commission’s consultation paper.

[I]t is important for the Law Commission to be cognisant that efforts to identify people as engaging in such communications could also run the risk of discriminating, for example, on a class or socioeconomic basis, if certain ways of speaking are deemed unsuitable for ‘civil’ public debate by people in positions of power; a risk that exists too when relying on algorithmic detection of harmful communications. It is vital that we do not create a two-tier system, punishing those whose speech is not aligned with an algorithmic or computational definition of ‘civility’ rather than those whose speech indeed causes significant harm.

[S]ome guidance for the court would be essential in this domain as the nature of communication online continues to present new challenges to judgments of when something is or is not a genuine contribution to a matter of public interest.

[A]s Knowles J commented in Miller v College of Policing, Mr Miller's tweets may have been in the main 'unsophisticated' but they were nevertheless part of a genuine and necessary political debate. This defence should not be restricted only to those who are articulate or refined in their language.

...such a provision would need to be drafted carefully to ensure that 'contribution to a matter of public interest' would not be interpreted as definitive of a reasonable excuse, but rather as merely one possible consideration in that determination.

We agree [with the proposal], but would suggest adding words specifically mentioning the need for the court to have regard to the ECHR in considering the question of reasonable excuse. We are mindful of the Anti-social Behaviour, Crime and Policing Act 2014 pursuant to which, when a local authority is making a Public Spaces Protection Order, s 72 thereof governs the decision-making process and specifically directs the decision-maker to consider the rights contained in the ECHR.

ARTICLE 19 understands that the proposal to include a requirement for the courts to have regard to the question whether the communication was meant as a contribution to a matter of public interest is aimed at protecting freedom of expression. This proposal goes some way towards achieving that aim. In our view, however, it is insufficient...

Freedom of speech should not be understood as the freedom to say what judges or prosecutors believe to be in the category of ‘public interest’ or things that have a ‘reasonable excuse’. People often express ideas in ways which are thoughtless, callous or indifferent to people’s feelings. The Law Commission should be very wary of proposing a new offence that would criminalise everyday expression almost by default.

Analysis

“SENT OR POSTED” A COMMUNICATION

Under section 1 MCA 1988, the conduct element covers both electronic communications (such as communications sent via the internet) and non-electronic communications. In our provisional view, the proposed new offence should be aligned with the section 1 MCA offence in this regard. The proposed new offence thereby avoids the pitfalls of the section 127 CA offence in criminalising only certain forms of electronic communication. It also differs from the section 22 HDCA 2015 offence in that it is not limited to digital communications and avoids the problems of a “technologically specific” approach.

However, in another respect, the proposed offence expands on the forms of communication covered under section 1 MCA 1988. As we explain in Chapter 2, the offence under section 1 MCA 1988 requires that the defendant sends a communication of the proscribed character to an intended recipient. Our proposed offence has no such requirement. It would cover posts on social media, such as a publicly available Tweet or a comment below an online newspaper article, both of which have no intended recipient and are accessible to the world at large. Such communications may technically be covered by the word “sent” but, for the sake of clarity, we have added the word “posted”.

Recommendation 1.

2.257 We recommend that section 127(1) of the Communications Act 2003 and the Malicious Communications Act 1988 be replaced with a new offence with the following elements:

2.258 We recommend that “likely” be defined as “a real or substantial risk”.

2.259 We recommend that the offence be an either-way offence, with a maximum sentence comparable to that under the existing offence in the Malicious Communications Act 1988.

JURISDICTION

2.260 As we set out in the consultation paper,173 given the relative seriousness of the recommended communications offence (which is low when compared with other offences normally justifying extradition to the UK), we were of the view that a provision penalising acts committed outside of the jurisdiction of England and Wales would be disproportionate. The offence is also one that criminalises conduct, rather than result.

To that end, the conduct that should be captured ought to be wrongful conduct committed within the jurisdiction. We remain of the view that wholesale extension of jurisdiction is not appropriate. However, as we detail below, we believe that a modest extension of jurisdiction could combat a failing that arises with the existing communications offences, which we do not wish to replicate with our recommended offence.

Consultation question 16 - extra-territorial application

2.261 We asked a consultation question concerning jurisdiction and extra-territoriality:174

Do consultees agree that the offence should not be of extra-territorial application?

2.262 There was strong support for our view that any offence should be limited to conduct within the jurisdiction.

2.263 The Crown Prosecution Service noted the potentially disproportionate work that could arise if the offence were of wholesale extraterritorial application:175

This is a reasonable proposal, given that the extradition of people from overseas responsible for harmful communications received or viewed in England and Wales could involve a disproportionate scope and amount of procedural work. There could be diplomatic implications. For example, the USA has very strong protections for freedom of expression under the First Amendment.

2.264 ARTICLE 19 noted concerns with the principle and practical application of extraterritoriality:176

In our view, the extra-territorial application of such an offence would lead to other countries applying their unduly vague laws beyond their borders, which would have a significant chilling effect on freedom of expression. Moreover, it would be practically very difficult to prosecute individuals based overseas.

2.265 However, a number of consultees expressed their concern with our proposal to limit the offence’s territorial application.177 They outlined the various types of online abuse that are perpetrated by people outside of the jurisdiction and questioned the practical impact that limited jurisdiction would have on any recommended offence.

2.266 A number of consultees conceded that given the practical concerns with any extraterritorial application of an offence, some compromise position could be sought whereby people who are only briefly outside of the jurisdiction are not able to escape responsibility.178 They pointed to various recent examples of this type of behaviour, including examples where a person habitually resident in England and Wales sent tweets while on holiday in Australia and, due to the territorial limitations of the communications offences, had not committed an offence in England and Wales.179 Consultees noted that measures to address similar behaviour form part of the recent Domestic Abuse Act 2021.180

Analysis

2.267 We accept that any attempt to extend the offence to communications sent from outside the jurisdiction with a likely audience within the jurisdiction would present both practical and principled difficulties. However, we appreciate that there may be limited scope to expand the application of the offence to the extent that any person ordinarily subject to it cannot evade liability by sending a communication while briefly outside the jurisdiction. It is worth considering the decision of R v Smith (No 4).181 In that case, the offence of obtaining services by deception was committed in circumstances where the obtaining had taken place abroad, but a substantial part of the deception had occurred in England. The Court held that not to extend the application of the offence in the circumstances would be an unduly strict interpretation of the doctrine of extraterritoriality. However, central to this reasoning was that different elements of the offence had been committed in different jurisdictions. Given that our recommended offence is complete at the point of sending or posting, the test articulated in Smith does not apply.

2.268 Our recommended offence, similar to the current offences in the MCA 1988 and CA 2003, does not require the recipient (or in the terms of the offence, the “likely audience”) of a communication to be within the jurisdiction. However, any extension of jurisdiction requires consideration of the potential application of an offence where both defendant and likely audience are entirely out of the jurisdiction of England and Wales.

2.269 At this juncture, it is important to note that there is a wide array of relatively serious offences in the criminal law of England and Wales with extraterritorial application.182 Many of these are more serious offences than a communications offence: for example, the offences of murder and manslaughter pursuant to the Suppression of Terrorism Act 1978, a number of terrorism-related offences under the Terrorism Act 2000 and Terrorism Act 2006, and the offences in sections 1 to 3A of the Female Genital Mutilation Act 2003 may all be prosecuted in England and Wales against United Kingdom nationals or habitual residents where the conduct occurs outside of the jurisdiction without any requirement that the conduct is an offence in the jurisdiction where it takes place. Some forms of extraterritoriality use a “local law” requirement - that the impugned act is an offence under the law of the country where it takes place.183

2.270 Offences more analogous to communications offences that are subject to extraterritorial application are the offences of putting people in fear of violence and stalking involving fear of violence or serious alarm or distress in the Protection from Harassment Act 1997184 and the offence of controlling or coercive behaviour in an intimate or family relationship contrary to section 76 of the Serious Crime Act 2015.185 We note that the nature of these offences, similar to our recommended offence, is such that it may be difficult to establish that the impugned conduct matched an existing offence in a different jurisdiction, and that to impose a “local law” requirement may render any extraterritoriality practically unenforceable.

2.271 Further, in line with the observations made by the Government in relation to the Domestic Abuse Bill, it is generally appropriate for criminal offending to be dealt with by the criminal justice system of the state where an offence occurs.186 A prosecution within England and Wales would only take place where the accused person is present in the jurisdiction, there is sufficient evidence to provide a realistic prospect of conviction and it is in the public interest to prosecute.187 Similarly, the Code for Crown Prosecutors notes that decisions in prosecutions involving conduct across jurisdictions will require careful balancing of a range of factors and sets out that the final decision in each case will depend on the circumstances of each case which will: “...weigh heavily on whether there is enough evidence to prosecute and whether it would be in the public interest to do so.”188

2.272 In our view, to ensure that the offence cannot be avoided by those to whom it would ordinarily apply (namely, people habitually resident in England and Wales) by sending or posting communications while temporarily outside the jurisdiction, the offence should apply to communications sent or posted either within England and Wales or outside the jurisdiction by a person habitually resident in the jurisdiction. For completeness, we reiterate that any prosecution arising from a communication sent or posted abroad by a person habitually resident in England and Wales would only take place where there is sufficient evidence to provide a realistic prospect of conviction and it is in the public interest to prosecute.

2.273 This approach ensures that the concerns raised in the consultation paper about proportionality,189 and those noted by consultees about freedom of expression and comity, are respected: the offence will never apply to those who are not either present in, or habitually resident in England and Wales. The offence will extend only to those who are ordinarily subject to the criminal law of England and Wales. We believe that this modest extension of the application of the offence, in line with the approach adopted for the offences in the Protection from Harassment Act 1997 and Serious Crime Act 2015 in the Domestic Abuse Act 2021, will ensure that a current gap in enforcement that exists with the communications offences will not be replicated with our recommended offence.

Recommendation 2.

2.274 We recommend that the harm-based communications offence applies to communications that are sent or posted in England and Wales, or that are sent or posted by a person habitually resident in England and Wales.

Chapter 3: Knowingly false, persistent and threatening communications, and flashing images

INTRODUCTION

SECTION 127(2) OF THE COMMUNICATIONS ACT 2003

A person is guilty of an offence if, for the purpose of causing annoyance, inconvenience or needless anxiety to another, he—

For a discussion of the potential application of the Offences Against the Person Act 1861 see Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 4.123 to 4.131.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.7.

Communications Act 2003, s 127(3).

The proposed new offence: knowingly false communications

The conduct element

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 18.

Consultation Responses: Bar Council of England and Wales, Alan Turing Institute, Magistrates Association, Women’s Aid, Justices’ Legal Advisers and Court Officers’ Service (formerly the Justices’ Clerks’ Society), S Rowe, English PEN.

communications, which are no less problematic in virtue of being sent by Bluetooth or over a work intranet than over the internet, for example.193

We provisionally propose that the conduct element of the false communications offence should be that the defendant sent a false communication, where a communication is a letter, electronic communication, or article (of any description).

Do consultees agree?

Consultation responses and analysis

We agree with this proposed approach in terms of not replicating the “public electronic communications network” in the relevant subsection, and agree that false and persistent messages sent for malign purposes are not necessarily any less problematic for being sent via non-public communication networks, such as Bluetooth or a work intranet.

In terms of the act of sending, consideration should be given to the recent case of R (Chabloz) v Crown Prosecution Service. In that case, the High Court held that the offence under section 127(1)(a) was complete with the insertion of a hyperlink that linked to a communication of the proscribed character. The conduct element under section 127(1)(a) and section 127(2)(a) and (b) are indistinct as regards the act of sending. The decision in Chabloz has been criticised on the basis that it misapprehends the nature of hyperlinking and has caused already widely cast offences to become even wider. I suggest that the Law Commission considers carefully the ambit of 'sending' an electronic communication, so as to prevent liability arising where an individual has sent a message that includes a hyperlink to impugnable material but isn't necessarily the subject matter of the communication.

Similarly, consideration should be had as to whether creating a webpage or social network group constitutes 'sending' in the digital context. In my view, it should not, as the proposed offence aims to limit the spread of potentially harmful information, and to do otherwise would likely constitute a disproportionate interference with individuals' Article 10 ECHR rights. The offence should therefore not capture information that is published with no intended recipient or recipients.

We note that what is ‘false’ is often difficult to ascertain. The Commission will be well aware of the complex precedents and provisions in defamation law regarding ‘substantial truth’ and ‘justification,’ how the Courts determine whether something is fact or opinion, and whether an innuendo is an assertion of fact.

However, we also note that a version of this offence is already in operation at s.127(2) and that prosecutions under this law are for palpably false claims that can be proven to be so. We assume a narrow definition of falsity will continue to be employed by police and the Courts, and the offence will not be used as a way to litigate what should properly be settled through defamation law.

In many online settings, individuals share false information unknowingly because they believe that it is true or have given it insufficient scrutiny to understand its harmful effects. This is unfortunate, and the unintentional spread of false content should be addressed through other initiatives such as media literacy programmes. However, we do not think that the law is an appropriate way of handling such behaviour.

This point highlights that the nature of false communications online is not understood with sufficient granularity in the proposal so far, as it overlooks, for example, another related category of harmful disinformation: misleading but true information used with malign intent. There are many examples of this, such as the sharing online of true reports of e.g. someone who is a migrant engaging in violence, but which is shared in a context or a manner in which it intends to mislead others into viewing migrants as a whole as a threat. As Demos’ report, Warring Songs, into the variety of information operations that occur online highlighted:

[...] focusing on the distinction between true and false content misses that true facts can be presented in ways which are misleading, or in a context where they will be misinterpreted in a particular way that serves the aim of the information operative.

The presumption that a true/false binary is sufficient to capture many of the abuses of false communication in an offence is, therefore, mistaken. Again, however, this point is meant to illustrate the broader point that consideration of truth and falsity is not the only dimension by which this proposal could be made more granular and thus more effective.

The fault element

We provisionally propose that the mental element of the false communications offence should be:

Do consultees agree?

Consultation responses and analysis

Agreed

We agree. The proposed false communications offence would require intent, as required by existing law under section 1 [Malicious Communications Act] 1988 [(“MCA 1988”)] and section 127 CA 2003. The new offence would not cover 'belief' that the communication was false (as under section 1 MCA 1988). Section 127 CA 2003 also requires that the defendant knew the communication was false, rather than believed it to be so.

We agree that the mental element of the false communications offence should include the two elements above, covering both the deliberate sharing of disinformation and intention to cause harm. We would also like to draw attention to the ways in which misinformation can cause harm and the ways in which disinformation and misinformation have a gendered and racialised component, affecting certain users more than others.

Disagreed

The consultation explains in point 6.60 that 'We do not, however, propose to cover communications that the defendant believes to be true - no matter how dangerous those communications may be. We recognise that misinformation and ‘fake news’ are serious social problems, but they lie beyond our Terms of Reference'.

This has the potential to leave a loophole whereby common and accepted forms of antisemitic discourse will not be considered as offences if they otherwise fall under the terms of the proposed changes to online communications law, whereby Jewish individuals or organisations are being targeted with antisemitic abuse.

Examples include common antisemitic canards and conspiratorial narratives around 'Jewish Power', which claim that Jews somehow rule and manipulate the world via assorted mechanisms of global control.

Though this is not illegal per se, CST has published evidence of many examples whereby Jewish individuals or organisations are harassed and targeted online in an antisemitic manner by offenders employing these themes - whether dressed up as a 'Rothschild Conspiracy Theory' or not.

Another example is the denial or distortion of the Holocaust. Though Holocaust denial is not illegal in the UK, it can certainly be employed to harass and target members of the UK Jewish community in an antisemitic manner.

The offender may truly believe in their misguided notions, but the onus shouldn't be on whether or not they believe in what they are saying or not. It should be on the impact that their 'false communication' has on the victim, especially when employed in a hateful manner that targets a specific (or several) protected characteristics.

I disagree that the defendant should have to know the communication to be false, applying a wholly subjective test to the defendant's knowledge.

In DPP v Collins, Lord Bingham held that the mental element of section 127(2) requires "proof of an unlawful purpose and a degree of knowledge" (at p 10). In my view, the proposed offence should capture both actual knowledge and wilful blindness. This is because the law should be flexible enough to punish the most severe cases of recklessness and because establishing actual knowledge will likely be extremely difficult to establish to the requisite evidential burden.

However, I do not go as far as to recommend that constructive knowledge be sufficient to establish the mental element of the proposed offence. Constructive knowledge will very rarely establish criminal liability, and this offence should be no different.

Other

We question the need for this offence... However, if a new offence is to be introduced, we support the proposed wording, for the reasons set out in paragraphs 6.42 and 6.46.

Knowledge of falsity is a crucial addition that will limit the scope of the offence in a way that will protect individual rights to express opinions and share information.

I can see the logic of this, but it is challenging in cases of further dissemination (e.g. retweets). Knowledge is a high threshold to clear, and some offences do have an equivalent of recklessness ('ought to have known'). Extending it to include recklessness would be controversial, but there is a danger that harmful content could be sent through retweets. An alternative (and possible half-way house) is to say that one of the purposes must be to cause non-trivial emotional, psychological or physical harm. That is used elsewhere in the law.

consequences that the making of malicious complaints can have and questioned whether the proposals could adequately address this:217

Non-trivial is hard to define and we wonder whether a clearer definition could be sought.

We are also concerned that the definition of harm is too restricted and would decriminalise some undesirable behaviour which is currently criminal under the Malicious Communications Act and the Communications Act - e.g. making malicious complaints with a view to causing purely material or reputational harm, such as public calumny, loss of position (paid or voluntary), or exposing the victim to criminal investigation. While some offences could be prosecuted as fraud, frequently such offences will not satisfy the requirement of intent to cause loss of property. With such offences the emotional harm is often secondary to the material harm, and the material harm is often what the offender intended.

Analysis

Misinformation and the criminal law

Types of harm

However, in order to ensure that our proposals do not disproportionately expand the scope and application of communication offences, it is important to ensure that any harms captured have a sufficient causal connection with the defendant’s conduct.230

Knowledge of falsity and intention to cause harm

Fault element: conclusion

Reasonable excuse

We provisionally propose that the false communications offence should include a requirement that the communication was sent without reasonable excuse. Do consultees agree?

Consultation responses

I agree as it is not hard to comprehend circumstances where a defendant will be able to establish a reasonable excuse.

Therefore, in order to prevent over-criminalisation, as well as unlawful interferences with individuals' Article 10 ECHR rights, it should be possible for individuals to have a reasonable excuse for sharing harmful, false information.

I also agree that the reasonable excuse should not form a defence, since the evidential burden for proving the message was sent without a reasonable excuse should rest with the prosecution.

The notion that communications that may upset someone must have a ‘reasonable excuse’ amounts to censorship.

If a new offence is to be introduced on these terms, then a ‘reasonable excuse’ requirement is essential for limiting the scope of the law.

If sub-section (3) were absent, the proposed law would amount to a disproportionate interference with Article 10.

...whilst we do not oppose the inclusion of a without reasonable excuse requirement in its entirety, we are concerned that unless very tightly defined, this requirement could be manipulated by perpetrators who attempt to justify their abuse and control as an act of love and care.

Analysis
Knowingly false communications: a new offence

Recommendation 3.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.11.

criminal sanction.254 As was noted by Warby J in a judgment delivered after the publication of the consultation paper, the object of section 127(2)(c) was:255

...to prohibit the abuse of facilities afforded by a publicly funded network by repeatedly exploiting those facilities to communicate with another for no other purpose than to annoy them, or cause them inconvenience, or needless anxiety.

We provisionally propose that section 127(2)(c) should be repealed and replaced with a specific offence to address hoax calls to the emergency services. Do consultees agree?

Responses and analysis

Yes, we agree that a bespoke offence with a proportionate maximum penalty would be appropriate.

We support this proposed approach, and agree with the Commission’s concerns that the expansion of the definition of the word “network” has led to the offence now being very broad, at least in theory. We support the move to update the law to reflect the nature of how people communicate now in online spaces.

In principle, ARTICLE 19 would be supportive of a specific offence to address hoax calls to the emergency services as long as it is sufficiently clearly and narrowly defined so as to protect freedom of expression.

This seems a sensible proposal. The consultation sets out a convincing argument that section 127(2)(c) currently risks criminalising certain behaviour that is not intended to be covered by any offence. However, it is obviously important that if section 127(2)(c) were repealed, there was legislation to cover specific offences, including hoax calls to the emergency services. Other examples given in the consultation paper can be dealt with elsewhere, including via legislation covering domestic abuse.

We understand the rationale for the proposal and there is some value to it. However, it is important not to underestimate the impact that other low-level behaviour can have when it is persistent. The views of victim groups could also add valuable insight to consideration of this proposal.

We note that where the harm is more serious such as domestic abuse then targeted laws such as the Domestic Abuse Bill 2020 and [section] 76 of the Serious Crime Act 2015 may be more appropriate.

As regards section 127(2) Communications Act, the Report notes that the main purpose has been to deal with hoax calls to the emergency services. While we agree that this is a distinct objective that could be dealt with via a specific offence, we also note that under section 128 Ofcom may bring enforcement action against those who persistently misuse electronic networks and services. The majority of these have been related to silent calls. While many of these may arise from automated dialling systems and cold-calling, not all will fall into this category, but could be from, for example, ex-partners or people trying to elicit a reaction for their own amusement. Ofcom notes that silent calls are more annoying and more likely to cause anxiety. The circumstances in which people could persistently misuse public networks using modern communications software and techniques are many, varied and unforeseeable in the future. We invite the Law Commission to consider whether such behaviours would be adequately dealt with under the proposed new regime and emphasise that a focus on new services (e.g. Skype, WhatsApp) should not mean that older services that are still much used are forgotten.

This proposed law reform is motivated by what the Law Commission identifies in this consultation as the need to legislate for harm-based offences (§6.3). Existing legislation is intended to cater adequately - and comprehensively - for the mischief and misconduct of a person making a hoax call to police. The offence of wasting police time, contrary to section 5(2) of the Criminal Law Act 1967, is a type of public justice offence for which the Crown Prosecution Service has published specific and current guidance. We doubt that the offence specified by section 5(2) is a complete answer to the conduct discussed in the consultation:

We invite the Law Commission to consider this. If a new communications offence is necessary, the Bar Council agrees with the Law Commission’s proposed approach.

Recommendation 4.

Tweet was not capable of being properly construed as “menacing”.280 In our view, requiring that the defendant intend or be reckless as to whether the victim of the threat would fear that the defendant would carry out the threat will ensure that only “genuine” threats will be within the scope of the offence. Communications sent without the requisite fault will not be within the scope of the offence.

...Matthew Wain was recently convicted for posting a YouTube video in which he said that he hoped the National Health Service (“NHS”) staff at Birmingham City Hospital would “all die of the coronavirus because they would deserve it." He also said, "Not being funny... after what I had done to me yesterday I would bomb the place, to be honest.” As District Judge Briony Clarke observed, part of the wrongful nature of Mr Wain’s conduct was that it was threatening.

In addition to our proposed new communications offence, should there be a specific offence covering threatening communications?

Threats and the law

On the one hand, threatening messages would seem ordinarily to come within the proposed offence. It is easy to imagine how someone threatened could suffer serious emotional distress. However, they won’t automatically suffer such harm. Indeed, they may not personally feel threatened even though it is undoubtedly a threatening email.

Making threats is something the law should concern itself with. Threats in person can constitute a (technical) assault, although it would be difficult for them to do so online (as the person would not apprehend immediate unlawful violence). Public order law also criminalises threatening behaviour in various circumstances.

It would be perfectly appropriate to define an offence of sending a threatening communication. Chambers may be cited as a reason why not to, but it isn't such a reason. That was an example of bad cases making bad law. The case should never have been prosecuted in the first place, as it was evident from the outset that it was not a true threat. Its importance should not be overstated in this arena.

Whilst we recognise that existing communications offences explicitly cover threats, and that threatening and menacing communications will also be covered by this proposed offence, we do believe that in cases where a threat is only covered by the proposed new communications offence, there may be a labelling problem.

As commissioners of victims services, we recognise the impact that a communication that contains threatening or menacing language may have on victims, and would argue that in order for the offence to adequately capture and describe the nature of wrongdoing, there is sufficient justification for a specific offence relating to threats to be created.

Refuge... supports specific threats being criminalised where there is clear need. For example, we are campaigning for threatening to share intimate images without consent to be criminalised through the extension of section 33 of Criminal Justice and Courts Act 2015 to include threats as well as the sharing of an intimate image or film. This issue falls within the scope of a separate Law Commission project and therefore we do not go into detail here. However, the full details of our proposal can be found in our report The Naked Threat.

Many existing offences cover various threats, for example threatening to kill, harassment offences and the coercive and controlling behaviour offence. In our experience these are underutilised by the police and CPS. Whilst prosecutions for tech abuse related offences are relatively rare, where the police do take action, they usually favour malicious communications offences, even when the communication was a threat to kill or a threat forming part of a pattern of stalking and harassment. It would be our preference for the existing offences to be used more widely, as they better reflect the context and seriousness of a threat compared to a communications offence and therefore attract a more severe sentence.

So long as measures are proportionate, we believe that preventing threats to individuals is an appropriate reason to curb Article 10 rights — threats of violence represent a significant interference with an individual’s Article 5 right to security and the Article 8 right to a private and family life.

Such an offence would need to be drafted to take into account the infamous ‘Twitter joke trial’ (Chambers v Director of Public Prosecutions [2013] 1 All ER 149), which the Court of Appeal acknowledged should not have been prosecuted. This could be addressed with a strong mens rea element that rules out jokes and unfocused rants that happen to include threatening language.

Once more we note how a message ‘broadcast’ over social media to no-one in particular is conceptually different to a message addressed to a particular recipient. This is what distinguishes the Chambers case from the prosecution of Matthew Wain (discussed at paragraphs 5.207-5.208).

Although some threats are covered in other acts (such as the threat to kill under section 16 of the Offences Against the Person Act 1861) there are several threats that may not be covered. Namely those that are likely to cause harm within the context of abuse at work, by members of the public or colleagues (threats to share intimate images, personal information, etc); and likewise within the context of stalking where the behaviours are so varied and not listed as offences within themselves (threat to visit victim’s house, threat to contact their family member, etc).

Victim impact and prevalence of threatening communications

We believe the proposal should include a specific offence covering threatening communications. As a charity working to end online abuse, we recognise that online violence takes on several forms, ranging from harmful to illegal and threatening content. Threatening content - including through images and video - poses a direct danger to a person’s integrity and/or life, or their ability to exercise their rights freely. Research has shown, for instance, that politicians who have faced threats online in the UK have been forced to step down from some of their activities. While abusive content has a long-term psychological and emotional impact on victims - and can lead to threats further down the line - threats pose an immediate danger to the victim and need to be addressed separately in legislation.

Threatening communication was one of the contributing factors to reporting to the police when we examined the patterns of reporting in police records. We examined disability-related cyber incidents in UK police force records over 18 months, and identified three overarching themes as influencers of reporting or documentation of cyber incidents by the police, these were: psychological impact, fear for safety, and the type of disability.

In my research threatening communication was commonly reported with fear, which is significantly associated with cyber-victimisation.

I think it should be wider and include threats because of the issues with judges. Threats are a problem. It would be easier if it were more specific.

This is the only addition that we feel would be a helpful step forwards. People in high-profile positions, especially parliamentarians and academics and especially women, receive rape and death threats on a daily basis. Although these are already covered by existing legislation, we do think it is worthwhile considering adding a specific offence covering such threats.

For example, we recently supported a survivor whose perpetrator made repeated threats to kill her. She reported this to the police on several occasions and no further action was taken. The final time she reported the threats to the police, a case of malicious communication was pursued, and he was charged. The perpetrator went on to attack the survivor and left her with life changing injuries.

In another case a perpetrator shared a sexual image of a survivor without her consent. The survivor reported this to the police and the perpetrator was charged under the Malicious Communications Act. Refuge staff questioned why a charge under section 33 of the Criminal Justice and Courts Act was not pursued instead, and received the response that the police try to avoid using the more serious offences as these charges are often rejected by the CPS.

Based on our experience we think addressing gaps in the law regarding specific threats and focusing on increasing the use of existing offences covering threats would lead to better outcomes for survivors than a general threatening communications offence.

3.111 Women’s Aid stressed the importance of addressing threats:296

We would argue that it is important for there to be a specific offence covering threatening communications. Threatening communications should not be downplayed or dismissed, particular when raised by women, survivors of VAWG or other marginalised groups. The response to threatening communications must reflect the reality of the impact that they have and the seriousness of the intention behind them.

3.112 The Antisemitism Policy Trust referenced the prevalence of threats to parliamentarians and their staff in supporting the proposal:297

Yes. The Trust’s experience of working to support parliamentarians and their staff, in particular, leads us to believe threatening communications have a particular, significant harm which should be separated from a general offence. Threatening communications can and do impact people’s way of life, their fundamental freedoms.

3.113 Suzy Lamplugh Trust expressed their support and noted the increased incidence of threats during the COVID-19 pandemic, as well as the potentially narrow scope of the section 16 OAPA 1861 offence:298

Suzy Lamplugh Trust welcomes the proposal for a specific offence covering threatening communications. Since threatening messages inflict a comparable degree of psychological and emotional harm to the offences at the centre of this consultation, we argue that it is equally important to try and convict those sending or posting such messages online. Especially within the context of the COVID-19 pandemic and the consequent increase in online communication, threats among frontline workers such as healthcare and retail workers have risen significantly.

3.114 ARTICLE 19 agreed that, to the extent it is not already criminalised, an offence addressing credible threats of bodily harm could be worth investigating.299

Use of communications offences

3.115 The Criminal Bar Association disagreed, arguing the general communications offence would be sufficient:300

We consider that [the general harm-based communications] offence would be sufficient to obviate the need for a further offence. Should there be significant feedback to suggest that such an offence is desirable, it would be possible to add a further subsection to provide for an aggravated form of the offence.

3.116 Fair Cop answered “No”:301

We do not see a need for this to be a separate offence. If a malicious communication is directly threatening to an individual then this makes it more likely that it has genuinely caused significant distress and less likely that the person publishing it could have any good reason. We consider that communications that directly threaten an identifiable individual are very likely to justify the intervention of the criminal law.

3.117 The Free Speech Union disagreed, indicating their preference to limit the number of speech offences:302

No. The Commission itself points out at paragraph 5.210 that there seems no particular difficulty arising from the lack of any offence of this kind. Given this fact, we for our part see no need whatever to add to the catalogue of speech offences.

3.118 The Crown Prosecution Service noted that external communications about prosecutions for general offences could address the labelling issue to an extent, and that other specific offences do exist that could be used to address threats:303

As a matter of law, threatening communications may be sufficiently covered by the proposed harmful communications offence.

We note the Law Commission welcomes our views on whether the labelling issue is a sufficient justification for a specific offence relating to threats. This is something that the CPS would seek to assist in addressing through effective external communications following successful prosecution outcomes.

Existing offences also cover threats e.g. [section]16 of the Offences Against the Person Act 1861 [threats to kill] and [section] 51 of the Criminal Law Act 1977 [bomb threats].

Analysis

3.119 There was extremely strong support from consultees for a separate offence that specifically covers threatening communications. This support came both from those who were in favour of the provisionally proposed harm-based offences and from some who opposed them. In our view, this reflects the fact that genuinely threatening communications reflect a high degree of criminal culpability.

3.120 Some consultees noted that there may be a gap created via the harm-based offences wherein a communication is sent or posted that may be properly characterised as threatening but would not meet the threshold of harm. None provided examples, and it seems unlikely that a genuine threat would not meet the threshold of harm in the provisionally proposed offence.

3.121 A recurring comment in consultee responses was that deliberate threats constitute a particular harm that would not be sufficiently addressed via a generic communications offence. Instead, in their view, they should be addressed by a fairly labelled offence. Similarly, the maximum penalty of two years imprisonment would apply for a credible threat of an offence other than murder (for example, of rape or of some other serious violent offence).304 Consultees also raised the potential links between online abuse and offline violence in the context of threatening communications as a reason to address this matter explicitly.

3.122 While the matter of Chambers305 was raised by some consultees, none thought it posed an insurmountable problem, and noted that, provided the offence targeted genuine threats, it should not stand in the way of addressing the harm. We agree with Professor Gillespie’s characterisation of it as involving no true threat.

3.123 Refuge noted the poor application of existing offences that target threats. They disagreed with the need for a specific offence to tackle threatening communications and instead urged for greater focus on the communications offences more broadly. A similar point was raised by the Suzy Lamplugh Trust, who argued that certain threatening behaviours manifest in otherwise innocuous behaviours. We appreciate the complexity of addressing these threatening behaviours, and that any specific offence may not explicitly address these concerns. It is worth re-iterating the availability of harassment offences for courses of conduct (that may otherwise seem innocuous).306 Further, as discussed in Part 2 above in relation to persistent communications, the various specific protections contained in the Domestic Abuse Bill will be vital to ensuring there is a comprehensive strategy to address violence against women and girls.307

We believe that the threats within scope should be threats to cause serious harm, where “serious harm” includes threats of causing serious injury (encompassing physical and psychiatric injury within the definition of “grievous bodily harm” for the purposes of sections 18 and 20 of the Offences Against the Person Act 1861),315 rape or serious financial harm.

Recommendation 5.

3.138 It was our view in the consultation paper that such behaviour ought ideally to be prosecuted as an offence under section 18 of the Offences Against the Person Act 1861 (maliciously causing grievous bodily harm or wounding, with intent to cause grievous bodily harm), or section 20 of the same Act (maliciously inflicting grievous bodily harm or wounding).325 However, we do acknowledge that this involves two prosecutorial hurdles in particular. First, it might be difficult in all cases to establish that the harm met the requisite threshold of “really serious harm”. A seizure, without more, does not automatically meet that threshold (though clearly a seizure brings with it substantial risk of a multitude of serious harms).326 Secondly, establishing causation beyond reasonable doubt might prove difficult, especially if the person is known to suffer from regular seizures. Clearly it will not always be difficult - in some cases it may be very obvious that the flashing image caused the seizure - but we cannot guarantee that there could never be reasonable doubt as to causation.

3.139 The further difficulty with reliance on the Offences Against the Person Act 1861 is that, as a result of the wording of the offences, the only offences that could be charged are the most serious, namely sections 18 and 20, because these require only that grievous bodily harm or wounding be caused or inflicted. The lesser offence, under section 47, is assault occasioning actual bodily harm: as the name suggests, the offence requires proof of an assault or battery. Despite the lower threshold of harm, proving that the flashing image (without more) amounted to an assault or battery will be very difficult.

3.140 Notably, this shortcoming of section 47 is addressed in our earlier recommendations for reform of offences against the person. We recommended replacing section 47 with an offence of intentionally or recklessly causing injury (whether or not by assault).327 The physical harm threshold is sufficiently low in that offence that seizures would certainly satisfy that element of the offence. This does not, of course, address the difficulties of proving causation.

3.141 However, in the absence of such an offence in statute, and recognising the difficulties of proving causation in some cases, we therefore consider that there is real scope for an offence specifically targeted at the intentional sending of flashing images to those with epilepsy. Having not proposed such an offence in the consultation paper, and therefore having not consulted on its terms, we do not recommend a detailed offence. We do, however, recommend that an offence should be created.

Recommendation 6.

3.142 We recommend that the intentional sending of flashing images to a person with epilepsy with the intention to cause that person to have a seizure should be made an offence.

Consultation question and response

Opposition to exemption

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 5.66.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 5.67. As we noted at footnote 386 of the consultation paper: “Newspapers and magazines are primarily regulated by IMPRESS and the Independent Press Standards Organisation. Television and radio are regulated by Ofcom. Cinema is regulated by the British Board of Film Classification.”

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 2.

that any new laws should apply without differentiation. We extract a number of their key points:328

It is our view that news media should be covered by the new offence. We believe that the arguments given by the Commission in favour of exempting news media from the offence are flawed. Further, there are good reasons for the news media to be covered.

The first argument given in favour of exempting the news media is that it is “regulated” (paragraph 5.66). This is not correct in relation to the large majority of print and online news media. Of national titles, some rely on a “readers’ editor” model and others rely on IPSO.

The second argument given is that “a free and independent press is crucial in a democratic society” (paragraph 5.67). It is undoubtedly true that freedom of expression as exercised by some media outlets is crucial in a democratic society. But, as is stated in Article 10(2) of the European Convention on Human Rights, the exercise of this freedom carries with it duties and responsibilities including those which protect the rights of others. The news media’s right to freedom of expression is subject to those duties and responsibilities in the same way as the freedom of expression of any citizen...

The third argument is that “there is a risk that mere existence of the offence would have a chilling effect on the exercise of freedom of expression by the press and media outlets” (paragraph 5.67)... A “chilling effect” will, of course, be felt equally by ordinary citizens, a result which is implicitly accepted as an appropriate one. There is no reason why the news media should be in any different position. Unregulated news media outlets should not be treated as if they have privileged access to the right of free expression. There is no basis for such special treatment.

I question why the news and broadcast media are not covered.

Much of the abuse, harassment and hate I received occurred in comment sections under online news articles.

Several complaints made on my [behalf] to IPSO regarding comments made by the public that contained threats of violence and abuse failed as the IPSO code stipulates that such comments are outside their regulatory scope. This is despite the fact that IPSO acknowledged these comments caused me distress.

I would argue that, now that newspapers are online and the Government intends OFCOM to be the regulator for online harm, newspapers and broadcast media should have to moderate and exercise a duty of care over comments posted by their readers/audience. The issue should not be the medium but the messages.

Support for exemption

We agree that news and broadcasting media should be expressly exempted from the proposed new offence. We concur with the Law Commission’s explanation at paragraph 5.67 of why this is so important.

It is correct to state, and we emphasise, that “the mere existence of the offence would have a chilling effect on the exercise of freedom of expression by the press and media outlets”. And it is correct to note that, “the existing regulator[y] framework within which news media and broadcast media groups operate render the imposition of additional criminal liability superfluous”. There is no need for further regulation of an already regulated industry.

In short, a clear and broad exemption for journalistic material is required and the Law Commission identifies considerations which are critical in determining the proper scope of such an exemption.

Reference is made in the consultation to the exclusion of news media etc being achieved ‘by way of a carve out if necessary’. We would suggest that it is necessary for a term excluding these forms of communications from being covered by the offence to be in the Act itself.

We would also suggest that the exclusion should be broad enough to cover communications that are not necessarily intended for broadcast in the form in which they have been sent: that is, a journalist who sends to a news corporation via email some working document/reports or a series of videos designed to provide an editor with background material, but not intended for broadcast in the form sent, should not be caught by this offence. We would suggest that an explicit statement of the type of materials/situations the new offence is not intended to cover should be made, rather than causing journalists to have to consider if they have a reasonable excuse in relation to briefing materials in each individual case.

There is merit to the Law Commission’s proposal to exclude news media, broadcast media and cinema from the proposed harmful communications offence. We understand that the offence would deal with individuals and not with regulated news media. Broadcast media were also excluded by section 127(4) of the Communications Act 2003, and criminal libel was abolished in 2010.

We agree that these offences should not cover regulated media such as the news media, broadcast media or cinema, and that, as the Commission state, this could potentially have a chilling effect on the exercise of freedom of expression by the press and media outlets. We also note that regulatory schemes already exist for these media.

We agree that this offence ... should exclude broadcast media, news media or cinema. This allows the law to consider electronic communications in their specificity and the impact of harmful communications made on social media, the Internet and online platforms, while keeping up with the technological advances and changes made to platforms on a regular basis. Harmful content on the Internet and social media is characterised by its growing scale and the ease with which it is disseminated and can be shared. Since its launch in 2017, Glitch has documented the scale and nature of online abuse targeting women and minoritised communities. We have found that a significant proportion of women have faced online abuse. During the COVID-19 pandemic, our research showed that one in two women and non-binary social media user[s] experienced online abuse. Given the sheer scale of online abuse and its many manifestations, current legislation is not adapted to respond to the consequences of online harms.

We agree that freedom of expression should be protected for news media, broadcast media and cinema. However, we are concerned that this approach gives freedom of expression to those who can present their views in newspapers or on television whilst denying it to members of the public who lack such a platform and instead use YouTube or a personal blog. Likewise, it would protect film studios but not artists who create low-budget productions and share them via social media. This is an elitist approach and could lead to a situation where a news outlet can lawfully publish an article but a person who shares it on social media and echoes the sentiments expressed therein could face prosecution.

We wholeheartedly agree that the news media, broadcast media and cinema need to be protected here, and would support a carve-out.

We have, however, one comment. We note that the Consultation Paper refers to “regulated” media (para.5.66). If this means that the media carve-out should be limited to those media that are in fact regulated, we regard this as problematic. In particular, it has to be remembered that not all newspapers are regulated (membership of an organisation such as IPSO or IMPRESS is in law voluntary), and IP broadcasts originating from outside Europe are entirely unregulated (since Ofcom’s writ does not run there). Since we regard it as important that free speech protection extend to unregulated as much as regulated speech, we would press for any protection to apply whether or not the relevant publication is otherwise regulated in any other way.

We agree and echo the Law Commission’s hesitancy to ‘dramatically expand the scope of the existing communications offences’, and strongly agree that a ‘free and independent press is crucial in a democratic society’, and that the ‘burden of justification for imposing criminal liability...is especially weighty’ (paras 5.66 to 5.67 of the consultation paper).

We support the Law Commission’s proposal that the new offence would cover any abusive comment posted in response to an article. We see first-hand at Mermaids that replies and comments to posts often contain the very harmful communication.

Difficulties with definition

We consider that the 'news media' and 'broadcast media' should be further defined with further consideration of newer forms of news media online such as blogs, user generated content, and comments on articles. Although ‘broadcast media’ is defined in some statute, it may be less straightforward to define ‘news media’. To that end, further consideration of the definition of 'news media' and 'broadcast media' would be welcomed.

We agree, subject to identification of what is meant by the “media”. Would the exclusion include a communication in the form of a newsletter from a malicious group intent on disseminating false, alarming and foreseeably harmful information?

...it appears that the Law Commission is proposing that its exemption will be limited to articles published on news sites. This is far too narrow and fails to provide the necessary protection for freedom of expression which the Law Commission aims to achieve through its reforms. It is also inconsistent with the approach taken by the Government in its recent Online Harms White Paper: Full Government Response to the Consultation.

An exemption must include all content on news sites and must extend to that content when it is shared and when it appears on third party sites including social media platforms. It must also include user comments and social media. It must extend to communications sent for journalistic purposes, including, for example, communications offering a right to reply.

This approach has been accepted by the Government in its recent Online Harms White Paper: Full Government Response (“Online Harms: Full Response”). In the Executive Summary, the Government states (our emphasis added):

Journalistic content

And in Part 1 the Government states its Final Policy Position as follow[s]:

“Content and articles produced and published by news websites on their own sites, and below-the-line comments published on these sites, will not be in scope of legislation. In order to protect media freedom, legislation will include robust protections for journalistic content shared on in-scope services. The government is committed to defending the invaluable role of a free media and is clear that online safety measures must do this. The government will continue to engage with a range of stakeholders to develop these proposals.”

We agree with this approach and submit that the exemption from any reform communications offence(s) must be equally broad.

They went on to describe existing provisions that exempt journalism and journalistic activity:

...we simply note that there are templates in existing legislation including, by way of example, exemptions and defences referring to “the purposes of journalism” and “journalistic material” contained in the Data Protection Act 2018, the Police and Criminal Evidence Act 1984 and the Investigatory Powers Act 2016.

Though IMPRESS and IPSO as regulatory organisations are important in regulating their members, there are many online 'news media' outlets, some of which may have print operations, that are not regulated.

Examples include:

antisemitic-hate/ and

https://twitter.com/sffakenews/status/1157994330646339584?lang=en).

Any legislation also needs to cover news media that has an online output, not covered by existing regulators.

We agree that the media deserves special consideration because of its role in informing the public and holding those in power to account. This process may give rise to considerable discomfort or distress to those the subject of reporting, but it would rarely be in the public interest to criminalise this. The report notes in passing the possibility of a ‘carve out’. In this context we note that, given the development of online distribution channels, it is difficult to define the different elements of the media clearly. Referring to regulated media would not suffice. While there are regulatory regimes, not all the traditional media are members of such a regime (see e.g. The Guardian, the Financial Times). Choices made by the print media to remain self-regulated and outside a Leveson framework mean that there is not a simple definition. Moreover, the democratisation of public speech means that actors beyond the traditional media actors may speak to large audiences in the public interest. This can be seen most clearly in prominent social media accounts that have more followers than major newspapers have readers. Rather than seek to define an exception relating to the media, which would quickly run into definitional difficulties and potentially exclude other forms of public interest speech, our view is that the position of public interest speech, as usually found in the media, could be best dealt with through a public interest element to the offence, as proposed in 5.51(6).

Analysis

Recommendation 7.

Chapter 5: Group harassment, glorification of violence and violent crime, body modification content

INTRODUCTION

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, paras 8.160 to 8.222.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, Chapters 3, 4 and 6.

Inchoate offences target conduct where a substantive offence may not be complete, but where what has occurred is worthy of a different type of criminal sanction. These include standalone offences of encouraging or assisting crime. For a more detailed discussion see paras 7.15 to 7.16.

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, Chapter 12; Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.152 to 6.174.

this as we were unconvinced that an offence targeting glorification of violent crime was a proportionate or appropriate response.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.200 to 6.207.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, Chapter 3; para 6.69ff; Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, para 3.78.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.71, citing M Oppenheim, Labour MP Jess Phillips receives ‘600 rape threats in one night’ Independent (31 May 2016), available at https://www.independent.co.uk/news/people/labour-mp-jess-phillips-receives-600-rape-threats-in-one-night-a7058041.html (last visited 13 July 2021).

See eg S Hughes, “Trolls on ‘dragging sites’ can ruin lives. It’s time they answered for their actions.” The Guardian (5 October 2020), available at https://www.theguardian.com/commentisfree/2020/oct/05/trolls-dragging-sites (last visited 13 July 2021).

See eg BBC Sport “Facebook ‘horrified’ by online abuse of Premier League footballers” (10 February 2021), available at https://www.bbc.co.uk/sport/football/56007601 (last visited 13 July 2021). We also received a Consultation Response from the Premier League which described in detail the abuse experienced by footballers and referees and their families.

given the “persistence and escalation” that would be difficult for an individual to replicate.

The persistent abuse women face on [Twitter] undermines their right to express themselves equally, freely and without fear. This abuse is highly intersectional and women from ethnic or religious minorities, marginalised castes, lesbian, bisexual or transgender women - as well as non-binary individuals - and women with disabilities are disproportionately impacted by abuse on the platform.

Existing law: group harassment

A person’s conduct on any occasion shall be taken, if aided, abetted, counselled or procured by another—

Example 4: coordinated group harassment

A female MP, Ruby, recently made a statement calling for the gender pay gap to be closed. Within 48 hours of making the statement, Ruby received thousands of emails, Tweets, and comments on her Instagram page. Some criticised her position on the gender pay gap. Others used misogynistic slurs, calling her a “stupid slut” or a “dumb bitch”. Some used the hashtag “#IwouldnotevenrapeRuby”. Ruby has been vocal on her own social media pages and in the press about the abuse she has received.

As it turns out, many of the Tweets were sent by people who were part of a 4chan thread created by Oscar. Oscar wrote: “We need to teach Ruby a lesson. The world should know what a dumb slut she is. Say it loud, say it clear! #dumbbitch #IwouldnotevenrapeRuby... You know what to do.” Using an alias, he then Tweeted Ruby using the hashtags #dumbbitch and #IwouldnotevenrapeRuby.

Example 5: uncoordinated group harassment

Naomi, a young woman who follows Ruby on Twitter and Instagram, sees the harassing messages being sent to her and decides to join in. Underneath one of Ruby’s Instagram posts, she comments: “You stupid slut. She-devils like you need a good raping... but I wouldn’t want to be the one to do it #IwouldnotevenrapeRuby”.

Example 6: political commentary

A political commentator, Mohammed, is not convinced by Ruby’s analysis of the gender pay gap. He is aware of the abuse she has received. He writes a Tweet directed to Ruby saying, “Good effort @RubyMP, but we need smarter thinking on this important issue” with a link to an article he wrote about the causes of gendered income inequality.

Options for reform

Incitement or encouragement of group harassment

Consultation responses and analysis

Should there be a specific offence of inciting or encouraging group harassment?

A new offence: ease of prosecution and fair labelling

Co-ordinated group attacks can heighten the harm that is inflicted on victims by creating a sense of near-universal approbation. This often happens because, unfortunately, sophisticated actors exploit and manipulate the algorithmic design of platforms to amplify their activities. For instance, if 50 trolls worked together in concert then they could each send one original message to a victim and also share and reply to the others. This would create close to 2,500 messages/replies. On most platforms, this would appear like an avalanche of abuse, yet it is the work of just 50 accounts, likely working for only a few minutes. Put simply, the ease of interaction and communication enabled by online platforms can be manipulated in this way to inflict substantial emotional distress on victims - far beyond what is inflicted in offline settings where such group behaviour is far harder and more costly to coordinate. Furthermore, in most cases there are far fewer individuals than accounts. In the example just given, 50 separate individuals might not be involved because multiple accounts could be controlled by a far smaller number of people (potentially even just one person), some of which might be bots or heavily automated.

As outlined in the Law Commission’s consultation report (pp. 155-161) there is a risk that the perpetrator/organiser of pile-ons may not be a message sender, and as such their actions would potentially not be covered by the existing law. This is problematic as we would argue that the harm is clearly being inflicted on the end recipient by the actions of the organiser, as much as by the message senders. In some cases, the causal link between the organiser and the message senders may be quite weak and such cases should not come under the purview of the law - we would strongly advocate that it is retained only for cases where an organiser knowingly and explicitly incited group harassment. This caveat aside, it is important that organisation of harassing behaviour is addressed by a new offence. We note that there is a clear precedent for this in how hate speech is tackled, where the Public Order Act 1986 criminalises actions to ‘stir up’ racial hatred. A similar view could be adopted here: the organisers are stirring up malicious communications.

The impact on the victim, and the wider community impact, is acute. Therefore, there is a need for a specific offence of this kind in order to reduce the risk of this type of harassment going unprosecuted under the older and more obscure legislation that could otherwise be used.

Refuge strongly supports the creation of a specific offence of inciting or encouraging group harassment. ‘Pile on’ or other forms of group harassment can cause significant fear, harm and distress and often lead to survivors reducing their online presence, and therefore interferes with their rights to freedom of expression. The absence of such an offence is a significant gap in the law.

Coordinated collective harassment is a growing problem online, and affects women and minoritised communities disproportionately. Research has shown that malign groups and individuals are using social media to launch coordinated harassment campaigns against public figures and social media users with no public profile alike. An investigation by BBC Newsnight in April 2019 showed that self-identified female politicians across Europe were targeted with threatening and misogynist content and coordinated harassment campaigns ahead of the European Parliamentary election. Research has consistently shown that women are more likely to be targeted by coordinated or uncoordinated harassment. In 2017, the European Institute for Gender Equality described cyber harassment against women and girls as a growing problem. Legislation in the UK has not kept up with this growing threat. Coordinated or collective harassment needs to be addressed as a specific offence.

Stonewall supports the introduction of a specific offence of inciting or encouraging group harassment. Stonewall agrees with the Commission’s assertion that group harassment - whether coordinated or not - can have a significant impact on the victim(s) involved, both at the level of individual emotional and psychological harm, and a broader societal level, such as having a chilling effect on marginalised groups’ ability to participate in public life (6.73).

LGBT+ people often face harassment online and offline, including instances that mirror the Commission’s description of ‘pile-on’ harassment. These events can have a significant impact on LGBT+ people’s ability to move through public life safely and free from harassment, causing many to retreat from online spaces.

Epilepsy Society agrees that there should be a specific offence of inciting or encouraging group harassment. The attacks Epilepsy Society, other epilepsy organisations, people with epilepsy, and the family and friends of people with epilepsy have experienced have often been the direct result of inciting or encouraging group harassment.

During the attack on Epilepsy Society’s Twitter account in May 2020 users on Twitter stated ‘Let’s spam this to the epilepsy community’ and ‘thread of strobe light gifs to send to epileptix’. It was clear that this was a co-ordinated attack by a group of trolls deliberately setting out to harm people with epilepsy. Targeting people with disabilities should be considered reprehensible and Epilepsy Society would support this being reflected in the law and by further regulations on online platforms such as social media.

Uncoordinated group harassment happens often in cases where the victim is in the public eye, such as in the case of MP Jess Phillips or cases of Twitter trolling as seen within our Cyber Abuse at Work report. These cases are not currently adequately covered by law (notably where the perpetrator did not start the communication, such as adding a comment on a Tweet that causes harm) and it is vital that this changes. Our report put a spotlight on various forms of online abuse perpetrated in the workplace to devastating psychological effect. Among the victims of cyber abuse at work, a shocking 23% experienced trolling. These victims should be protected adequately under the law.

In cases of so-called ‘honour’ based violence it is common to see one member of the family encourage or incite others to perpetrate abusive and harmful behaviour towards another. Stalking can also be perpetrated in a ‘group’ setting, where stalkers encourage other individuals to stalk by proxy. This would amount to coordinated group harassment as defined in the consultation.

Again, it should be noted that where the behaviours demonstrated constitute stalking, it is the stalking offence that should be charged first and foremost, with the communications offence only in addition and never replacing it.

Unnecessary duplication and availability of existing offences

I think this can be dealt with by the proposed offence, with the pile-on harassment being an aggravating factor.

We do not encourage the creation of such a specific offence: existing offences and powers of sentence upon conviction are sufficiently comprehensive and current to cover a form or forms of inciting and encouraging harassment where offending entails (whether or not targeting) a group or groups.

Intention to cause harm or knowledge that harm was likely to be caused and reference to the context would enable the new proposed offence to be used to combat group harassment. We do consider that guidance accompanying the offence should specifically address this form of conduct, as such behaviour is increasingly observed on social media platforms, but the new proposed offence and the panoply of existing inchoate offences should provide sufficient protection.

We strongly disagree with these proposals. We do not think it is practical or proportionate to criminalise “being part of a pile-on”, which can simply be lots of people disagreeing with one person or an organisation. Nor do we think it practical or proportionate to criminalise “inciting or encouraging group harassment”, which can simply be quoting something someone has said and criticising it. We believe such a law would be an unacceptable infringement on free speech and impossible to police consistently, and that it would be used by organisations and powerful people receiving criticism to criminalise their detractors.

The Criminal Law Solicitors’ Association do not see that there is a specific need for inciting or encouraging group harassment. The difficulties which would be encountered in trying to prove such an action would be immense. The CLSA believe that this proposal is unnecessary particularly in light of the earlier proposals.

It seems very unlikely there could ever be a successful prosecution of such an offence, given the inherent difficulty in establishing what had actually happened and who was inciting or harassing who.

In the absence of a clearly defined offence of group harassment (see our response to the next question), ARTICLE 19 is concerned that putting forward a related incitement offence is premature. We also query whether the issues raised by group harassment could not be better addressed through legislation or other policy measures on anti-discrimination or harassment in the workplace. In general, we would recommend any incitement offence to take due account of the treaty language of Article 20 (2) of the ICCPR and the Rabat Plan of Action to determine whether such incitement has indeed taken place.

We see no compelling reason to create yet a further offence of inciting or encouraging group harassment here. As the Commission point out in paras 6.87 to 6.98, in most cases there is a possible charge already: the fact that prosecutors apparently do not use the available charges is in our view no reason to create a new criminal offence.

We acknowledge the phenomenon of social media harassment and note that these activities involve, by definition, messages targeted at an individual who is likely to see them.

Although such harassment might be described as a form of expression (the participants are using social media, after all) we note that they also engage and interfere with the free speech rights of the target/victim. The ‘pile-ons’ render the social media platform unusable by the individual, and many are forced to avoid using the platform altogether.

However, we are wary about introducing a new offence to criminalise something which is very often impromptu and where the boundary between outrage and harassment is indistinct.

The consultation paper itemises a list of existing laws under which the organisers of group harassment might already be prosecuted. We suggest that these methods continue to be used to address the most serious offences.

Where a course of action is not already caught by an existing offence, we suggest that the law err on the side of freedom of expression, rather than introduce a measure that could encompass legitimate political organising. In general, we counsel against the introduction of new laws in order to capture ‘edge cases’ as seems to be the case here.

We note that technological innovation could offer sufficient protections to deal with the problem of online harassment. For example, Twitter introduced new features in 2020 to give users more control over who can reply to, or target a message at, their account; and email software can be set up to filter abuse or unsolicited messages.

Terms of service and the forthcoming online harms regulations could also be deployed to deal with such issues, without involving the criminal law.

The existing law addresses incitement to group harassment under s.8 of the Accessories and Abettors Act 1861, s.7(3A) of the Protection from Harassment Act 1997 and ss.44 to 46 of the Serious Crime Act 2007...from a CPS viewpoint, we are not aware of any cases or situation where the police have referred the matter for a prosecution decision and where the unavailability of an offence of this nature has prevented a prosecution from taking place.

Pile-ons are ... harmful events that are fundamentally enabled by the prevalence of one to many and many to many services, as opposed to one to one POTS [“Plain Old Telephone Service”] at the time of the [1988] Act. The question remains whether they are something that should be dealt with in civil regulation biting on platform design (such as the Online Harms regime) or as an offence relating to an individual. An individual who organises or triggers one (as it can be one comment that becomes algorithmically promoted to a wider audience) might have some awareness of the harm that can be caused. In some instances, however, it may be that the platform - rather than being a neutral intermediary - constitutes an essential motivating element in online speech and behaviours. An example of this can be seen in the context of gossip forums. The operators of some of these platforms have taken the decision to structure the forum threads by reference to particular named individuals. Depending, of course, on the content of the posts and their quantity, this could be harassment but what is central to this is the fact that the posts will, through the decision of the service operator, be focussed on a specific individual. Even if the person the subject of the thread does not see the posts (though this is unlikely), that person would still be a victim as it is likely that information concerning private aspects of the victim’s life has been shared (we assume here that sexual images or deepfakes are not involved). It may be that the offences under the Data Protection Act 2018 could be used, but this sort of environment could be seen as facilitating if not encouraging pile-ons, though the platform operator may not choose the individuals the subject of each thread.

Analysis

(3A) A person’s conduct on any occasion shall be taken, if aided, abetted, counselled or procured by another—

Conclusion: incitement or encouragement of group harassment

Knowing participation in “pile-on” harassment

The various challenges in creating an offence to meet these circumstances were the subject of our analysis in the scoping report and consultation paper. These include evidential challenges relating to each of the above conditions, and the lack of applicability of the PHA 1997 to uncoordinated conduct.

Should there be a specific offence criminalising knowing participation in uncoordinated group (“pile-on”) harassment?

Consultation responses and analysis
A distinct form of abuse worthy of criminalisation

Group harassment is a particular form of online violence whose seriousness and coordinated nature needs to be acknowledged in legislation. Sometimes, however, collective harassment can grow organically, fuelled by social media’s and online platforms’ amplifying effect. Recent high-profile cases of collective harassment online have shed light on the amplifying effect of social media in fuelling harassment. In March 2020, a video of a hospitalised COVID-19 patient originally shared via the messaging application WhatsApp was shared to social media and fuelled large-scale collective abuse, with harassment spreading organically rather than in a coordinated way.

The creation of a specific offence of ‘pile-on’ harassment would be welcome. Such legislative development would allow for further conceptualisations of online harms, which take into consideration the reality of online abuse, its scale (including cross-jurisdictional spread), the speed with which it is committed, and - most importantly - the impact it has on the victim(s). In our research, we have criticised the current efforts to reform the law on online harms as representative of an incredibly fragmented and selective approach whilst advocating for a conceptually broader understanding of online harms. We would strongly support the amendment of the law in this direction.

The stakes are not enormously high here, since I agree that many of these communications would be caught by the general suggested offence as likely to cause serious emotional distress. However, to the extent that each alone is not likely to cause harm (especially if the bar to the offence remains serious harm) but the defendant is aware (and I would add or ought to be aware) that it is likely that many others will send similar messages and that the effect on the victim would hence be more serious, there is a strong justification to impose criminal responsibility: once it is accepted that pile-on abuse is additive and hence is more serious, knowing participation in it amounts to antisocial behaviour which ought to be caught by the criminal responsibility net.

Ability of “harm-based” offence to address behaviour

...if such activity is charged under the new proposed offence the court should take into account when considering the facts of such a case that there may be in fact greater [culpability] attached to those who send messages later in time in a pile-on as they had greater opportunity to be aware of the harm likely to be caused in the circumstances of them adding to the number of messages/posts already sent/posted; a pile-on by definition becomes more harmful the longer it goes on and the more messages (/people) are involved.

The courts could find that the relationship between the defendants’ messages was ‘similarity of impact’ or ‘similarity of [general] intent’ - e.g. defendant 1’s message may be homophobic, defendant 2’s message may be fat-phobic, in the context of a pile-on both messages are capable of being harmful (similarity of impact) and are being sent with the general intent to cause distress/cause harm etc (similarity of intent).

We note the explanation set out in the consultation about complications in evidencing a separate offence for “pile-on” harassment. We agree that even though the harm caused by such actions can be great, it may be difficult to link culpability of one individual sending one message to that greater harm, unless they were involved in inciting, encouraging or otherwise coordinating the group action.

However, we do note that if someone was aware that there was a “pile-on” occurring, and chose to add to it by sending a message that was otherwise covered by the new proposed harm-based criminal offence, then possibly this could be regarded as an aggravating factor. If there were evidence that someone was responding to a message from the victim (for example) indicating they were receiving hundreds of abusive messages, possibly linked to a specific hashtag, then this might indicate greater culpability for more serious harm caused.

Overreach and unclear routes to culpability

We recognise the positive and proactive motivation behind this proposal, and the need to adopt a victim-centric approach which accounts for the total impact of malicious online behaviours on those who are targeted. However, our concern is that if this offence is addressed at “uncoordinated groups” then it may end up by criminalising the actions of individuals which, by themselves, would not be criminal. Instead, actions of one individual could become criminal if, in concert with many others, they lead to an accumulation of harmful behaviours. Whilst the impact on the targeted person may be highly harmful - and some form of appropriate intervention should be taken, potentially by the host platforms - it seems unfair to assign blame to each of the perpetrators for the actions of other individuals.

Participating in an uncoordinated ‘pile-on’ may be undertaken without awareness of the pile-on taking place; an individual may unwittingly have contributed to a pile-on which they did not know about, or want to be associated with. It seems particularly unfair for them to receive a heightened sentence in such a situation. Relatedly, we also caution that the definition of a ‘pile-on’ seems unclear and may be liable to misinterpretation and, most worryingly, to an overly broad reading of what constitutes a pile-on. The law could easily be used in an overly draconian way to criminalise nearly any interpersonal aggression if it formed part of a (potentially quite loosely defined) pile-on.

Finally, we note that whilst we are concerned about uncoordinated group harassment being treated specifically under law, we support efforts to address organised networks of abuse and harassment.

We support introducing the new proposed offence considered in Chapter Five [to replace section 127(1)]. Knowingly participating in uncoordinated group (“pile-on”) harassment is overly inchoate in comparison to the preferred new offence proposed in Chapter Five.

We note the Law Commission’s observations that such an offence may be difficult to enforce for the practical reasons outlined. There may be evidential challenges in demonstrating that individuals were aware of communications sent or posted by others, in distinguishing the culpability of the first instigating communication from subsequent communications, or in linking them. There may also be numerous communications from different individuals, making it resource intensive to enforce.

The phenomenon of pile-on or uncoordinated group harassment certainly deserves attention, due to the significant negative impact it can have on its targets as well as on onlookers. We welcome the examination by the Law Commission of action which can be taken to reduce group harassment. As highlighted by Glitch, a UK charity which raises awareness of online abuse, group harassment is a significant concern: it combines attacks on individuals, which can cause significant emotional distress, with communication to onlookers as another audience who may also suffer distress. For instance, women and members of other marginalised groups who observe others being attacked by groups online may themselves feel less able to engage in online spaces for fear of suffering the same targeting.

However, understanding and debate over pile-on harassment is currently impoverished. Some of the complicating factors for straightforwardly addressing pile-on harassment are as follows.

Firstly, pile-ons can include both coordinated and uncoordinated efforts, and many cases can lie somewhere in the middle with a small group privately and deliberately planning coordinated harassment elsewhere, often with a view to presenting the initial activity as authentically uncoordinated and with a view to encouraging others uninvolved in the initial plans to organically participate. This porousness means distinguishing coordinated from uncoordinated group harassment online can be extremely difficult: it is not always clear to someone engaging in a pile-on that they are doing so.

This has a bearing too on the standards by which we judge someone’s ‘knowing’ participation. The degree to which a participant can be said to know that the harassment is coordinated or not is similarly difficult to establish for the same reasons. Indeed, the degree to which they can be said to know that their particular online communication is part of a broader group activity is difficult to ascertain too. Not least because, given the rapidity with which many users engage with content online and thus give little consideration to context, it may not occur to them that they are contributing to an activity with a mass of participants. This argument, of course, could be used disingenuously as a defence, but therein again lies the difficulty of establishing facts about pile-on harassment.

Two further complicating factors stem from the fact that it can occur over extended periods of time. On the one hand, this further complicates all of the above: for example, the sheer amount of data that can accrue around a hashtag used specifically in a pile-on can quickly exacerbate the difficulties of establishing its origins, who was involved at the start, the awareness of its un/coordinated status to participants and so on. On the other hand, the longer a pile-on continues, the more attention it is likely to attract, and so the more it is likely to be subverted by those who oppose it. In June 2020 many members of the online community of ‘K-pop stans’ (fans of a genre of South Korean pop music) co-opted the hashtag #whitelivesmatter (a hashtag developed deliberately in an online far-right community to undermine the anti-racist hashtag #blacklivesmatter and as a vehicle for far-right propagandising), in response to its increased use by those opposing the protests against the killing of George Floyd in the United States. The K-pop stan users deliberately rejected this use, instead accompanying it with expressions of support for those protesting. A similar example can be found in the hijacking by the LGBT community of the #ProudBoys hashtag in rejecting white nationalism. These examples add a further difficulty to those discussed above: delineating when mass group activity online constitutes pile-on harassment.

It is not clear how an individual knowingly participates in an uncoordinated group. This might lead to the investigation of someone who is unaware of third party communications and/or is not quick enough to realise that their communications could be considered part of a ‘pile-on’.

Freedom of Expression

ARTICLE 19 recognises that ‘pile-on’ harassment is a significant issue for public debate online and that it tends to target particularly vulnerable or marginalised groups or individuals. However, we are concerned that such an offence would criminalise individuals for making a one-off offensive comment that would otherwise fall below the threshold for prosecution of a harassment charge. In particular, it is unclear to us how an individual would be supposed to acquire knowledge of participation in ‘uncoordinated group harassment’ and the point at which the threshold for such knowledge would be reached, i.e. how many comments would be necessary for ‘uncoordinated harassment’ to take place? It is also unclear what kinds of tweets, and how many, would be sufficient to amount to ‘harassment’. What should be the relationship between the tweet or comment that sparked a ‘backlash’ and the tweets or comments in response, and how much should the various tweets in response correlate with one another to amount to ‘uncoordinated harassment’? More generally, we are concerned that this offence could have a chilling effect on matters of public debate where people may strongly disagree, for fear that one comment, however minor, may be taken to add on to ongoing harassment of someone online.

ARTICLE 19 reiterates that the criminal law should only be used as a matter of last resort over other policy responses. In the case of “pile-on” harassment, other measures, such as improved content moderation practices or restrictions on certain accounts or features, would likely be more proportionate and appropriate.

We do not think it is practical or proportionate to criminalise “being part of a pile-on”, which can simply be lots of people disagreeing with one person or an organisation. Nor do we think it practical or proportionate to criminalise “inciting or encouraging group harassment”, which can simply be quoting something someone has said and criticising it. We believe such a law would be an unacceptable infringement on free speech and impossible to police consistently, and that it would be used by organisations and powerful people receiving criticism to criminalise their detractors.

Analysis
Conclusion: knowing participation in “pile-on” harassment

PART 2: GLORIFICATION OF VIOLENCE AND VIOLENT CRIME

used as evidence of bad character, has on Black men and boys.426 Of course, where directly relevant to a specific crime, a piece of Drill music may rightly be admitted as evidence.427 The central concern of those who urge caution in the treatment of drill music is the “...corrosive effect of portraying a genre of music so closely connected to Black communities as innately illegal, dangerous and problematic.”428

Existing law, and the recommended harm-based offence

However, without an intention or belief that a criminal offence will be committed, the offences under the SCA 2007 will not apply.

Example 9: glorification of violence causing serious emotional distress

Freddie sends video clips of a woman being violently attacked to an acquaintance of his, Sara. Both Sara and the woman in the video clips are black. Along with the clips, Freddie sends an accompanying message “Could have been you, Sara! They really got her good.” The message includes a string of “thumbs up” emojis.

6.164 Recall that, under existing legislation, “glorification” means any form of praise or celebration. Given that Freddie’s accompanying message said, “They really got her good” and included “thumbs up” emojis, his conduct may amount to “glorification” of violence.

6.165 This “glorification” of violence would likely be caught by our proposed offence. A court would probably find that Freddie’s communications were likely to cause Sara at least serious emotional distress. Further, the prosecution could probably prove that Freddie intended to cause harm or was aware of a risk of causing harm. There is nothing to suggest that he had a reasonable excuse.

Justification and consultation responses

Should there be a specific offence of glorification of violence or violent crime? Can consultees provide evidence to support the creation of such an offence?

Support for a specific offence

We welcome this suggestion. Over the last few years, we have seen an increase in calls to violence online, which can overlap with threatening communications. Research by the HateLab at Cardiff University in 2019 recorded a significant increase in overt hate speech online in the UK. Politically or ideologically motivated groups are increasingly using social media platforms to issue calls for violence against specific communities or groups of individuals, or to promote gang violence. Researchers have, in particular, shown that recent months and the COVID-19 pandemic have led to an increase in calls for violence online and fuelled online extremism. The UK Commission for Countering Extremism showed that extremists across the political spectrum have exploited the pandemic and increased time spent online to sow division and encourage extremist agendas with a potential to fuel violence.

There was general support for this, although not a significant amount of evidence of individual crimes. Given the diverse views, we have selected below those responses that raise distinct issues:

The court has to be satisfied on the balance of probabilities that the respondent has engaged in or has encouraged or assisted—

So we have legislation that allows us to target person(s) encouraging gang violence, i.e. drill music. But the lines are very close in terms of the cross-over to films and other forms of media that also portray and glorify violence. It would need some careful consideration to deconflict this. But I would be supportive of some stand-alone legislation wider than just Gang Violence.

Existing offences

5.104 Kingsley Napley noted that the existing offences, and provisionally proposed communications offences, likely provide sufficient protection:445

Glorification is probably sufficiently covered already but this area should be kept under review. Consideration would need to be given to the following aspects:

5.105 Dr Jen Neller argued an alternative way to address the behaviour would be by amending the existing encouragement offences:446

This could be dealt with more effectively by amending the Serious Crime Act 2007 to clarify the encouragement offences and extend them to encouragement of the public or any sector of the public to commit a crime, rather than only encompassing encouragement of a particular individual. Encouragement to commit a crime would be a higher threshold than glorification, and would avoid the risk of disproportionately criminalising black cultural expressions or of otherwise interfering with the freedom of expression of those who have grown up in or live in violent contexts (see Kubrin, Charis E. and Erik Nielson. “Rap on Trial.” Race and Justice 4, no. 3 (2014): 185-211). The further criminalisation of such marginalised populations would be at risk of causing greater harm than it remedied. Additionally, the criminalisation of glorifying violence would seem to inappropriately place a greater restriction on the expression of individuals than on the media (e.g. violent films and video games).

Freedom of expression

5.106 The Magistrates Association agreed with the views expressed in the consultation paper, particularly the concerns about Article 10:447

The arguments put forward in the consultation paper against any such new offence seem strong. We agree that any definition of glorification would have to be very tightly drafted, so that it did not infringe Article 10 rights. For example, we would be concerned about any offence that covered all drill music by default, for the reasons put forward by the Law Commission. It also seems clear that glorification that reaches the level of threats would be covered in existing legislation.

5.107 ARTICLE 19 argued that a “glorification” offence would be “very dangerous” for freedom of expression. They noted that no matter how an offence is drafted, the inherent ambiguity of “glorification” means that disproportionate “chilling” of expression is, in their view, inevitable:448

In our view, such an offence would be very dangerous for freedom of expression. Under international standards on freedom of expression, ‘glorification’ is generally considered too broad a term in relation to incitement to commit acts of terrorism or incitement to discrimination, hostility or violence (Article 20 ICCPR). Moreover, the glorification of ‘violent crime’ could potentially cover a very wide range of conduct that most people would not know is criminalised. It is unclear how the proposed offence would apply in several scenarios. For instance, would supporting violent police action fall within scope? What about the actions of the military, which are highly likely to involve violence? Slavery is clearly an affront to human dignity. Some may argue it is a violent crime. Would those who seek to justify it fall within scope? We also note that a specific offence of glorification of violence could have an impact on minority groups, for instance those individuals whose sexual preferences include bondage and other sado-masochistic practices. Finally, we draw attention to a French case where a mother and an uncle were fined several thousand euros for a similar offence in circumstances where the mother’s son, named Jihad and born on 11 September, had been wearing a T-shirt saying ‘I am a bomb’ at the front and ‘born on 09/11’ at the back, given by his uncle for his 3rd birthday. In our view, the prosecution and conviction in this case were clearly disproportionate and demonstrate how such offences have a chilling effect on freedom of speech.

5.108 The Criminal Bar Association argued against any offence of glorification of violent crime. They discussed the potential disproportionate impact any offence may have on communities or sub-cultures:449

No; we believe that an attempt at criminalising behaviour under this banner will disproportionately affect some communities or sub-cultures (as per the e.g. of [Black] youth who listen to or themselves create “drill” style rap). We do feel that the association of drill music with the glorification of physical violence is misguided and racially biased and we not[e] that there are other genres of music which also have lyrics speaking of violence which are not viewed in the same way (e.g. hard rock).

In the field of art, music and artistic expression, the question of what in fact amounts to “glorification” of violence is often determined on subjective factors (e.g. the perception that drill music is fuelling violence in inner cities or is more lyrically graphic than other forms of music); this subjectivity, we feel, can lead to inequality in the way the law is applied to some artforms / music, leading to further disproportionate criminalisation of some communities / cultures.

5.109 English PEN discussed their concerns about Criminal Behaviour Orders being used to prevent the performance of certain music. They agreed with our analysis that “street illiteracy” can lead to disproportionate responses to certain types of music and expression:450

We are extremely concerned by the recent prosecutions of rappers for producing ‘Drill’ music, and the Criminal Behaviour Orders preventing others from performing such music. We consider these interventions a disproportionate interference with freedom of expression rights.

We welcome the Commission’s acknowledgment in the consultation paper that such music and associated videos are susceptible to misunderstanding and ‘street illiteracy’ and that such content “represents, rather than endorses violent crime.” Such representation is a window into a segment of British society that is often marginalised, and is therefore in the public interest. It should be contextualised, not be criminalised.

We support the Commission’s implied suggestion a[t] paragraph 6.159 that only communications/videos that meet the threshold of incitement to commission of criminal conduct should be criminalised. Where no intent to incite a specific offence can be shown, and where the communication is not captured by the proposed harm-based offence (paragraph 6.163), such messages should not be prosecuted.

We support the points made in defence of freedom of expression at paragraphs 6.170-6.174 of the consultation paper, questioning the efficacy and proportionality of a ‘glorification’ offence.

Vagueness of “glorification” and comparison with terrorism legislation

5.110 The Crown Prosecution Service agreed that an offence of glorification or incitement of violence or violent crime may lack clarity, distinguishing it from the offences under the Terrorism Act 2006. They discussed those offences, the “very specific” nature of terrorism, and why the general concepts of “violence and violent crime” are not amenable to similar regulation:451

The Law Commission considers that a specific new offence of glorification or incitement of violence or violent crime may be too broad and lacking in clarity. We consider that there is merit to that analysis. This proposal can be distinguished from incitement offences under the Terrorism Act 2006.

The Terrorism Act 2006 includes references to glorification, defined as ‘direct or indirect encouragement or other inducement to some or all of the members of the public’. The Counter-Terrorism Division of the CPS has considerable experience of dealing with criminal glorification in the context of terrorism, in particular by way of the offences under Part 1 of the Terrorism Act 2006. The offences, which criminalise the encouragement of terrorist acts, have been an extremely useful tool in the fight against terrorism and have allowed investigators and prosecutors to successfully target those involved in the radicalisation of others.

Indirect encouragement under the 2006 Terrorism Act can form the basis for a criminal prosecution provided that other criteria are met. The additional criteria have the important effect of narrowing the scope of the offences and hence addressing some of the concerns, highlighted in the Commission report, relating to freedom of speech. One of the key criteria is that members of the public should reasonably be expected [to] understand that the terrorist acts are being glorified as ‘conduct that should be emulated by them in existing circumstances’ (Part 1 s.3 Terrorism Act 2006).

However, Terrorism is a very specific area of international and national concern and one that, in the UK, has a statutory definition in s.1 of Terrorism Act 2000. The concept of violence or violent crime is much broader. Criminalising the glorification of violence or violent crime has the potential to have a much wider reach, including material that some may view as mainstream. This could pose issues of proportionality.

5.111 The Association of Police and Crime Commissioners noted their concerns about the breadth of the term “glorification” in the context of violence and violent crime:452

We do have concerns about the broadness of the term “glorification” in this context, and regarding what harms such an offence would seek to redress that is not covered by other offences, e.g., those relating to incitement to violence. We are also concerned about how such a law could conflict with Article 10 Freedom of Expression in the European Convention on Human Rights.

5.112 The Bar Council of England and Wales argued that the existing offences under the Terrorism Act cater for “some types” of conduct glorifying violence or violent crime. They also noted the absence of evidence of need for a new specific offence:453

Existing anti-terrorism legislation (Terrorism Act 2006, ss.1 & 2 - encouragement of terrorism and dissemination of terrorist publications) caters for some types of misconduct involving glorification of violence or violent crime. We recognise, though, that the Terrorism Act 2006 does not cover all the forms of glorification cited by the Law Commission in this consultation. The most serious forms are covered by the Terrorism Act 2006. We think it important to point out that there does not appear to be an overwhelming evidential basis for creating an additional offence. In any event, the Bar Council is not in a position to inquire into, or provide, such evidence. We do register our concern as to the breadth of any additional offence.

Analysis

5.113 A number of consultees, particularly legal consultees, noted the problematic nature of the glorification of terrorism offences. Further, they highlighted the risk that any new glorification of violence or violent crime offence may serve to alienate further already marginalised parts of the community in ways that are ultimately counterproductive.

5.114 Consultees that responded positively included those who actively deal with and combat violence and prejudice, for example Community Security Trust and the LGBT Fed. Their responses noted the prevalence of violence against members of their community, and the connection between online abuse/glorification and offline violence. These concerns are valid, and any connection between online “hate” or abuse and offline violence is worrying. However, most of these responses do not directly engage with the concerns we expressed in the consultation paper about expression and the potential for over-criminalisation, nor the ability of existing laws to grapple with these issues. We also believe, as we set out above and in the consultation paper, that the recommended harm-based offence will have some application in relation to harmful communications that may also glorify violence or violent crime.

5.115 Further, this is an area where a specific criminal response criminalising “glorification” may be difficult to target, and risks being a disproportionate response. Instead, the online harms framework and platform community guidelines may well be a more appropriate avenue. We note the detailed consideration given to this matter in the February 2021 report of the Commission for Countering Extremism.454 It is also important to note that communications containing threats of violence may also be addressed to a degree by our recommendations on threatening communications and our existing, yet-to-be-implemented recommendations regarding threats and the OAPA 1861.455

5.116 A further refrain in consultee responses (again along the lines of concern we had already covered in the consultation paper) was that any offence of glorification of violence or violent crime would likely represent an unjustifiable, disproportionate interference with Article 10 and freedom of expression. A section of consultees noted the existing inchoate offences that may well be able to address culpable behaviour without having the “chilling” effect that any new glorification offence is likely to have.

5.117 Our concerns in the consultation paper about Drill music and “street illiteracy” are also reflected in a number of the responses from legal stakeholders. However, responses including those of English PEN and the Crown Prosecution Service echo our concerns and note that criminalisation would be a radical response to a problem that appears thus far to be poorly understood. As English PEN pithily put it, this provides an opportunity to contextualise rather than criminalise the behaviour.

Conclusion

5.118 After considering the detailed and thoughtful responses we received from consultees, we believe our provisional conclusion456 to have been correct: there is not sufficient justification for a new offence of glorification of violence or violent crime. There are two central aspects to this conclusion. First, the inherent vagueness in the concept of “glorification”, which we do not believe in this context would result in an appropriately confined offence. Secondly, the potential for a disproportionate and counterproductive impact on certain marginalised parts of the community.

PART 3: BODY MODIFICATION CONTENT

5.119 Content relating to body modification overlaps with two other discrete topics in this report, encouragement or assistance of self-harm and glorification of violence or violent crime. As we discussed in the consultation paper, any broad offences of glorification of either self-harm or violent crime could be relevant to communications relating to body modification.457

5.120 However, as we discuss in detail in Part 2 above, we are of the view that there is insufficient justification for an offence of glorification of violence or violent crime. Further, as we set out in Chapter 7, our recommendations regarding encouragement or assistance of self-harm are tightly constrained to avoid their inadvertent application to vulnerable people who are not acting culpably.

Body modification: existing law

5.121 The judgment of the Court of Appeal in R v BM458 confirmed that body modification procedures such as ear-removal, nipple-removal, and tongue-splitting can constitute the offence of causing grievous bodily harm with intent contrary to section 18 of the OAPA 1861, regardless of whether they were carried out with consent. This will be the case provided that the procedure in question does not fall within the “medical exception”, which would cover, for example, nipple-removal as part of a mastectomy.459 Following his unsuccessful appeal in R v BM, body modification practitioner Brendan McCarthy pleaded guilty to three counts of causing grievous bodily harm with intent and on 21 March 2019 was sentenced to 40 months imprisonment.460

5.122 As we set out in the consultation paper, the significance of the judgment in BM is that carrying out body modification procedures (provided they do not fall within the medical exception) can amount to a violent crime; therefore, communications promoting body modification would likely be covered by one or more of the existing inchoate offences of inciting, encouraging, aiding, or abetting violent crime.461

5.123 These offences are serious. For example, for the offences of encouraging or assisting an offence under sections 44, 45 and 46 of the SCA 2007, the maximum penalty is (as a general rule) the same as the maximum available on conviction for the relevant “anticipated or reference offence”.462 So, if a communication regarding body modification is found to amount to encouragement of causing grievous bodily harm with intent, this could, in theory, carry a maximum sentence of life imprisonment (the same as the maximum sentence for the reference offence, that is, causing grievous bodily harm with intent) - though we note that Brendan McCarthy’s sentence was significantly lower than this maximum.

5.124 This is our understanding of the position on communications relating to body modification under the existing law. It is not within our Terms of Reference to make proposals in relation to the existing law covering offences against the person (such as causing grievous bodily harm, with or without intent) or the existing inchoate offences.

Consultation question and responses

5.125 We sought consultees’ views on the implications of possible offences of glorification of violence or violent crime, and glorification or encouragement of self-harm for body modification content:463

We welcome consultees’ views on the implications for body modification content of the possible offences of:

Responses

5.126 The Justices’ Legal Advisers and Court Officers’ Service, formerly the Justices’ Clerks’ Society, agreed that where an existing offence is being encouraged, the inchoate offences that are already available can be used:464

We agree with your comments. We would only note that encouragement of criminal body modification is not an offence we see charged. However, in an appropriate case, the existing offence of encouraging an offence is already available, which renders specific legislation unnecessary.

5.127 ARTICLE 19 agreed with the cautious approach we took in the consultation paper: 465

As the Law Commission notes, it is entirely possible that some body modifications pertaining to one’s self-identity may inadvertently get caught in any new proposed offence of glorification of violence or encouragement of self-harm. We would strongly discourage the Law Commission from introducing any such new offence, that in our view would inevitably be unduly broad.

5.128 The Criminal Bar Association noted the problematic nature of “glorification” in criminalising violence or violent crime:466

We do not think the offence should be further modified. ‘Glorification’ is a problematic concept to tackle with the criminal law.

It is our view that any creation of offences of either glorification of violence or violent crime or glorification or encouragement of self-harm would necessarily include and thus criminalise body modification content especially in view of the case of R v BM. Therefore, unless there is a strong imperative and sufficient background evidence to justify a need to further amend the criminal law (terrorism matters aside, which are in any event outside the scope of this consultation), there should be no further offence created.

Analysis

5.129 Many responses we received were from individuals who seemed unaware of the current legal position with respect to body modification. These responses stressed that if a person modifies their body of their own free will,467 or consents to another person modifying their body, the law should not intervene. As we set out in the consultation paper and re-iterated above, the decision of R v BM468 confirmed that body modification procedures such as ear-removal, nipple-removal, and tongue-splitting can constitute the offence of causing grievous bodily harm with intent contrary to section 18 of the OAPA 1861, regardless of whether they were carried out with consent.

5.130 Other consultees noted the parallels between self-harm and body modification as they related to consent. As we set out in the consultation paper, the distinction between glorification and encouragement is worth noting. Our recommendation for an offence of encouragement of serious self-harm469 could potentially capture someone who posts material promoting self-administered body modification. It would only cover body modification that amounts to grievous bodily harm.

5.131 A number of consultees stressed that any new offence addressing “glorification” of self-harm was likely to be too broad. They noted that an offence predicated on “glorification” could bring non-culpable behaviour within its scope. We share these concerns and note that we make no recommendations for an offence of glorification of violence or violent crime. Regarding self-harm, our recommendations are carefully crafted to ensure they do not disproportionately apply to vulnerable people. They also only target encouragement and assistance of serious self-harm. This means many of the concerns raised by consultees about the potential over-reach of an offence predicated on “glorification” do not arise.

5.132 To the extent that a communication promotes self-administered body modification, this may be caught by a potential new offence of encouragement of serious self-harm, which we recommend in Chapter 7. However, as we set out at length in that chapter, we are concerned that any such offence should not catch vulnerable people - such as people who are suffering from mental health conditions - who share content for the purposes of self-expression or seeking support. This concern also applies in the case of body modification.

5.133 The safeguards built into the recommended offence of encouraging or assisting serious self-harm include the fault element, requiring intention to encourage or assist self-harm to the level of grievous bodily harm; the high harm threshold of grievous bodily harm; and the requirement of (non-personal) DPP consent. These are all designed to ensure that vulnerable people are not inadvertently caught by our recommended offence of encouraging or assisting self-harm. Those safeguards will also apply in respect of communications encouraging body modification. In light of the decision in R v BM,128 we believe this is justified.


128 [2018] EWCA Crim 560; [2018] Crim LR 847.


INTRODUCTION

Taking, making and sharing intimate images without consent project webpage:

https://www.lawcom.gov.uk/project/taking-making-and-sharing-intimate-images-without-consent/.

Mobile applications through which users can chat with other people, often (though not exclusively) with the intention of meeting or dating that person.

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 6.116. See also Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, para 6.144.

See, for example, R v Alderton [2014] EWCA Crim 2204.

proportion of the offending behaviour constituting cyberflashing is not live, instead involving photographs or pre-recorded video.

THE RATIONALE FOR REFORM

Harm

Some victim-survivors have described their experiences of cyberflashing in terms of violation, describing how they felt: ‘utterly violated’; ‘really violated’; ‘incredibly violated’; ‘at its core, it’s very invasive’; ‘I just felt totally violated’. Marcotte et al (2020) found that almost one third of women reported feeling ‘violated’ after being sent unsolicited penis images...

Victim-survivors also report experiencing being embarrassed, disturbed, shocked, utterly horrified and ashamed, with one describing the ‘heatwave of embarrassment’ she felt:

‘The truth is, no matter how strong I thought I was, he turned me, with a picture, into a weak person, feeling humiliated and with no ability to stand up for myself ... the incident still repeats in my mind’ (Boulos, 2019).

Combined, these experiences underpin a sense of humiliation, understood as infringing the dignity of the person (Gillespie 2019). The person is dishonoured and humiliated through a failure to show respect and through treatment of others as less than deserving of respect, and as means rather than ends.471

I interviewed many women over a period of years and for victims who have experienced both cyber flashing and traditional “anorak” in-person flashing - they say the impact was the same/if not worse for cyber flashing because of the anonymity given by the phone as opposed to being able to see who is flashing in person/other people around could also see and protect the victim...

... it is hard to see impacts of physical flashing as worse than digital flashing because it will not have a black and white scale for all victims. Some may find one format worse than the other because of specific triggers/touch points within their own experience.478

Women have frequently connected their experiences to physical sexual exposure: ‘it’s the same thing as flashing in public’. For many, the harm stems from the ‘well-founded fear’ of what might happen next, particularly in contexts where unsolicited penis images are sent in public from strangers. Women may not be harmed per se by being sent a penis image, but what it represents and what it might mean in practice; the implicit or explicit threat of further sexual assault.

Women recount, for example, feeling immediately ‘frightened’, ‘terrified’, ‘vulnerable’ and ‘exposed’ by acts of cyberflashing. They fear escalation of the actions, with women reporting feeling scared as to what might happen next. One victim-survivor stated: ‘with cyberflashing, because you don’t know who’s sent it, and you’re in a public space, that threat is never really eliminated’. Another said: ‘I was singled out, I was being targeted, and it felt very personal’.479

Prevalence

Organisations fighting sexual harassment and violence believe these numbers to be only the tip of the iceberg as many more cases remain unreported... Campaigners, journalists and activists have documented the concerning increase in cyber-flashing...483

(47%) (YouGov 2018). One study found 76% of girls aged 12-18 had been sent unsolicited nude images of boys or men (Ringrose, 2020).484

Cyberflashing is experienced as routine by many using dating apps, including from strangers, acquaintances and potential daters. It is commonly experienced out of nowhere; other times following rejecting the man’s advances...

Cyberflashing is regularly experienced by women engaging in social media and other online technologies, in personal and professional capacities, by strangers, colleagues, acquaintances, family friends. It is also now taking place in online video conferencing, with terms such as ‘zoomflashing’ and ‘zoombombing’ now [coming] into language reflecting a rise [in] various forms of online abuse, including online exposure and distribution of sexually explicit texts online.485

Current law

CYBERFLASHING: A NEW OFFENCE

A sexual offence

Consultation question 24

We provisionally propose that section 66 of the Sexual Offences Act 2003 should be amended to include explicitly the sending of images or video recordings of one’s genitals. Do consultees agree?

...Cyber-flashing is, as a matter of common sense, conduct of a sexual nature. Further... those who have been subjected to cyber-flashing compare its impact to that of other sexual offences: for example, it can cause similar feelings of violation and sexual intrusion. Our proposed [communications] offence does not fully reflect this specifically sexual behaviour and harm. It covers a wide range of abusive communications, some of which may be sexual, and some of which are not. In our provisional view, there should be a clear option to prosecute cyber-flashing as a sexual offence in order to ensure that the nature of the offending conduct is more accurately labelled by the offence.

The second reason concerns the additional legal protections that apply in respect of sexual offences. Under section 103A of the SOA 2003, inserted by the Anti-social Behaviour, Crime and Policing Act 2014, these protections include Sexual Harm Prevention Orders: orders made by the court to protect the public or members of the public from the risk of sexual harm presented by a defendant. Under Section 103A, a court may make a Sexual Harm Prevention Order in respect of offences listed in Schedules 3 and 5, if it is satisfied that this is necessary for the purpose of:

Responses

We welcome the clarity that will be introduced with enactment of the newly proposed offence as outlined in Chapter Five [section 27(1)]. Alderton491 related to live exposure, but the legal position at present is that it is unclear whether all forms of cyber flashing are contemplated by section 66 of the 2003 Act. We agree that this uncertainty is most directly and straightforwardly addressed by amending the statute.492

...be helpful for cyber-flashing to be dealt with as a sexual offence, rather than under the proposed new harmful communications offence. This would facilitate correct labelling, appropriate sentencing and remedies such as Sexual Harm Prevention Orders. It could also facilitate automatic anonymity for victims if added to the Sexual Offences Act (SOA) 2003 and to section 2 of Sexual Offences (Amendment) Act 1992.493

Given the distress and feelings of violation that the sending of images or video recordings of the defendant’s genitals can cause..., we agree in principle that the Sexual Offences Act 2003 should be amended to include this, so that offences should carry with them the appropriate range of sentencing options (including Sexual Harm Prevention Orders).494

This is a form of sexual violence and the impact of this on victim/survivors can be far reaching and thus the law should be reflective of this. In consultations completed with victim/survivors of racialised sexual harassment perpetrated online through explicit images, the impact can mirror that of violence already included within the Sexual Offences Act, primarily women feeling unsafe and violated. Many of the black and minoritised women and girls the Angelou Centre support who disclose experiencing sexual violence are blamed and viewed to have behaved shamefully/dishonourably and thus the receipt of sexualised images can lead to wider forms of violence, including domestic violence perpetrated within honour-based violence contexts. The law needs to reflect this level of risk.495

Suzy Lamplugh Trust agrees that section 66 of the Sexual Offences Act 2003 should be amended to include explicitly the sending of images or video recordings of one’s genitals. Considering the detrimental effects of cyber-flashing on the victim (and the fact that it can be equal to or more aggravating than offline sexual harassment), we believe it vital for the victims to have the choice496 to be protected by a Sexual Harm Prevention Order in such cases. By introducing this offence, the law would be able to send a clear message that online crime is as harmful and as punishable as offline crime.497

This seems a sensible proposal, especially as it can ensure Sexual Harm Prevention Orders can be used in response to offences.498

Singapore and a number of US states have recently adopted specific laws criminalising cyber-flashing and each are characterised as sexual offences. Scots law covers cyber-flashing as a sexual offence. Therefore, this approach follows international best practice...

Cyberflashing is a sexual intrusion which infringes victim-survivors’ rights to sexual autonomy and privacy. It is also experienced by some as a form of sexual assault. It is vital therefore that any new criminal law is framed as a sexual offence, ensuring appropriate recognition of the nature and harms of cyberflashing, and granting anonymity rights, special protections in court and suitable sentencing options. The specific cyberflashing laws that have been adopted in some US states and in Singapore have been enacted as sexual offences.499

ought to be recognised as sexual offences and incur the sentencing and other consequences associated with sexual offences, given their specific and heightened risk.501

We are not convinced of the need for this. If the general communications offence is limited to communications likely to cause harm (or as we would prefer intended to do so), we do not see why anything less, such as mere distress, should apply merely because the picture is of the sender’s or someone else’s genitals.502

Analysis

The conduct element

Consultation question 25

Assuming that section 66 of the Sexual Offences Act 2003 is amended to include explicitly the sending of images or video recordings of one’s genitals, should there be an additional cyber-flashing offence, where the conduct element includes sending images or video recordings of the genitals of another?

To do so would broaden the scope of the offence significantly and encompass behaviours that, in one sense at least, are quite different. For example, sending a publicly available image of a naked person (that includes their genitalia) to an acquaintance (who knows that the image is not of the sender) would seem to be a different order of threat from that posed by a stranger sending the same image, or where it wasn’t otherwise clear that the sender was not the person in the image. (For the avoidance of doubt, this is not to say that the former act would be harmless).

This would seem to suggest that the harm is not merely a function of whether it was the sender’s genitalia, but instead a more nuanced question of context and the apprehension of the recipient. Sending someone unwanted pornographic images may be harmful akin to forms of sexual harassment, but this is not to say that such behaviours should be governed by a law primarily focused on exposure.507

Responses

...we are concerned that the proposals draw a distinction between cases of cyber flashing where the perpetrator sends images or videos of his own genitals and those where he sends images or videos of another person’s genitals. In terms of impact on the recipient of the images or videos, this distinction is unlikely to make a difference. Having separate offences could make the law challenging to use in practice and require investigation and evidence about whether the genitals in question were the perpetrator’s or those of another person, which seems to us an unnecessary evidential hurdle.508

I think cyber-flashing should be dealt with as an offence either under s.66 or as a stand-alone offence (s.66A?), but I don’t see the point in differentiating between one’s own and another’s genitals and I suspect it would be quite difficult to prove in courts.509

From the perspective of the victim, if they receive an unsolicited image or video recording of a person’s genitals from a defendant, we believe that the impact on them would be the same, regardless of whether the genitals in question belong to the defendant or to a third party.

Additionally, defendants who share unsolicited images or videos of other people’s genitals are displaying similar - if not identical - sexually offensive behaviour, as defendants who have shared unsolicited images or videos of their own genitals.

Therefore, if the Sexual Offences Act 2003 is amended specifically to include the sending of images or videos of one’s own genitals, we would lean in principle towards suggesting that the sending of images or video recordings of other people’s genitals should be included in this. This would ensure that the law properly recognises the impact on victims, and that convicted offenders can be issued with Sexual Harm Prevention Orders, to prevent them from committing similar behaviour in future.510

We note the argument put forward that cyber-flashing which involved sending unsolicited or unwanted images or video recordings of the genitals of another would likely be covered by the harm-based offence proposed earlier in the consultation. However, we also note the arguments set out in respect of the previous question, explaining the importan[ce] of ensuring Sexual Harm Prevention Orders can be used in response to these types of offences. This suggests that it may be beneficial to have an additional cyber-flashing offence for this conduct that would not be covered by Section 66 of the Sexual Offences Act 2003.511

risks becoming quickly outdated (again) as technology evolves, specifically as advances in ‘deepfake’ technology make it easier to create images where it is almost impossible to tell whether they are real or faked.512

We reject the proposal to offer a two-part legal solution to cyber-flashing [as this] creates a hierarchy between different forms of cyber-flashing which is not justified on the evidence. There is no evidence that cyber-flashing is experienced as ‘worse’ or more harmful if it involves an image of the perpetrator’s own penis. In creating a hierarchy, there is a risk [that] not all cases of cyber-flashing will be taken seriously.513

...the violation and intrusion, and possible fear and threat, experienced by the victim-survivor are not dependent on the knowledge that the penis in the unsolicited image is that of the perpetrator. To require proof that the image belongs to the perpetrator risks misunderstanding the nature of the experience and its attendant harms; it does not fully recognise the experience of victim-survivors.

It may be that proposals limiting a sexual offence to images of a perpetrator’s own penis assumes offenders are driven by motives similar to those of some physical ‘flashers’, namely the sexual dysfunction of exhibitionism, with often predatory consequences and a potential pre-cursor to other forms of sexual offending. Viewed from this perspective, the problem to be addressed in legislation is that of the individual exposing his own penis, with the fear of escalating sexual offending.

However, such an understanding neither captures the full range and extent of motivations of physical ‘flashing’, and even less so the wide-ranging purposes of cyberflashing, including inducing fear, alarm, humiliation and shame. It is vital that the multiplicity of motivations for cyberflashing are recognised and that the scope of any sexual offence is not unduly limited.514

...in my case studies I have found less evidence that perpetrators are sending pictures/videos of other people (by this I mean that the pictures/videos look like home recordings/images/selfies as opposed to professional porn, for example).

When speaking to some men about why they do it the sexual satisfaction many get is often as a result of people seeing their genitals specifically, and the indirect humiliation that brings upon them, which they find gratifying. Although not all men do it for this reason - others do it for attention (any type), or to provoke reaction.515

...even if the analysis in the consultation paper is correct that sending an image of genitalia known to likely recipients not to be the sender’s [is] likely to cause less harm or threat (which is possible, but its normative implication might be overstated), an equal threat will exist if the likely recipient believes the genitalia is of the sender while it was not. The advantage of the proposed additional offence is that it could cover cases of victim’s mistaken belief that the genitalia was of the sender. Of course, this could be addressed by amending section 66 to include the sending of images or video recordings of or reasonably believed by the audience to be of one’s genitals.516

Without the additional cyber flashing offence proposed above, it would be necessary for the prosecution to prove that the genitals shown are those of the suspect themselves unless the suspect has made an admission of this. There may be evidential challenges if the images do not contain the suspect's face (or some other distinctive feature which strongly links the image to the suspect) but depict the genitals only. This may in appropriate cases be overcome through the preferring of alternative charges if the suggested additional cyber flashing offence is enacted.517

Legislation which only criminalises the sharing o[f] pictures of the perpetrator’s genitals fails to take into account the fact that such material - received without consent - causes distress regardless of whose genitals are depicted in the material. We therefore suggest that section 66 of the Sexual Offences Act 2003 be amended to include both the non-consensual sending of pictures or videos of genitals regardless of who the picture belongs to.518

Analysis

...if the amendment were accepted, it would prevent the offence being used against females who expose themselves. While that is uncommon, as my noble friend said, it is not unheard of. The amendment would unduly limit the law. It would be saying that the man and woman who paraded naked together in public could not be convicted of the same offence. Given that the Government are anxious that the Bill should be non-discriminatory, it would be strange to accept an amendment that discriminates between men and women.

The primary concern should be the protection of the public.521

The fault element - additional intent

Consultation question 26(1)

Assuming that section 66 of the Sexual Offences Act 2003 is amended to include explicitly the intentional sending of images or video recordings of one’s genitals, should there be an additional cyber-flashing offence, where a mental or fault element includes other intended consequences or motivations, beyond causing alarm or distress?

Responses

This is creating an unjustified hierarchy of cyber flashing offences and [the] mental element threshold is too limited. Causing alarm or distress does not take into account the reality of the situation - the evidence shows that men cyber flash victims for a host of other reasons such as humiliation. In any case, the law should focus on non-consent not the motivation of the perpetrator.525

Cyberflashing is problematic because it is non-consensual conduct of a sexual nature. Distributing penis images is not per se wrongful, but doing so without the consent of the recipient is. The non-consensual act breaches the individual’s rights to sexual autonomy, regardless of the motive of the perpetrator. A focus on nonconsent as the core wrong is the approach of US states which have adopted specific cyberflashing offences.

While there are real challenges with proving consent in sexual offence cases, a major impetus for a cyberflashing law is to raise awareness, challenge the normalisation of the practice, aid prevention and education initiatives and to let victims know that their experiences are understood and recognised. These aims are met by focussing on the core wrong of non-consent and therefore justify this focus, in preference to a law requiring proof of specific motives.

Motive requirements (such as in the laws on image-based sexual abuse) invariably mean that only some forms of abuse are covered and create a hierarchy of abuses which does not reflect victim’s experiences. For example, a law requiring proof of intention to harm (or awareness of risk) will likely exclude some forms of cyberflashing, as well as making prosecutions more difficult because of the threshold being introduced.526

by broadening the offence to cover other intended consequences it allows the prosecution of perpetrators that have committed the offence not to cause distress or alarm but for example, for sexual gratification, power or to humiliate the victim/survivor - which is often present in the perpetration of sexual violence. For example, the use of humiliation as a motivation is used in the Upskirting Legislation. This is particularly relevant for black and minoritised women and girls, who have disclosed that sexualised images have been sent by perpetrators with the intention of undermining their ‘honour’ within their families and wider communities - by the law encompassing other intended consequences or motivations, the lived experiences of black and minoritised victim/survivors can be reflected in the law.527

...any motivations required to be proven be extended to include humiliation, as discussed in previous questions, which is used in similar legislation, including in Scotland and Singapore and in the English ‘upskirting’ legislation.528

As set out in our response to Question 24, we agree that the offence should include other intended consequences or motivation beyond causing alarm or distress to cover the example provided in paragraph 6.117 of the Consultation Paper where the defendant saw their behaviour as a sexual advance rather than to cause ‘alarm or distress’. In the anecdotal experience of CPS prosecutors, defendants expose themselves for a range of reasons including for the purpose of sexual gratification and ‘having a laugh’.529

In principle, we believe that if a mental or fault element were to be included in the amends to section 66 of the Sexual Offences Act 2003, then it should go beyond an intention to cause alarm or distress: regardless of what the defendant’s intention for exposing an image or recording of their genitals may have been, this is unrelated to the potential impact that may be felt by the victim.530

The mental [or] fault element should include sexual gratification as a motive (as per the Singaporean sexual exposure law), as well as intent to cause humiliation (as per the current upskirting law). But this should be an amendment to section 66, not an additional offence.531

...given the wide ranging circumstances of the sending and receipt of such images in this digital age we think for any cyber-flashing offence, the mental element of the offence could / should be expanded beyond a defendant intending to cause alarm or distress.532

A person’s motive for cyber-flashing should be distinguished from their intent. It is difficult to conceive of circumstances of cyber-flashing, other than to an intimate, where the intention is other than to cause at the bare minimum alarm or distress. If it was a communication to an intimate, then, as suggested in the consultation, it would seem not to cross the threshold of criminality, even in the absence of express consent.533

The legislation must be tightly worded to ensure that the criminal behaviour captured reflects the actions of individuals who have engaged in obviously culpable behaviour and not those acting in a careless, non-criminal way.

Identical behaviour can carry varying levels of culpability in this context depending on the circumstances in which communications are passed. Reasonable behaviour for individuals involved in a relationship for example, would be different to behaviour expected between two strangers...

There is a real risk that the definition of the offence could be broadened without thorough consideration being given to the repercussions, which could lead to the criminalisation of poorly judged, innocently sent communications.536

Consideration must be given in relation to the intention of the act, particularly where the elements of the offence are committed by children.

Criminalising acts which were intended to be non-malicious, albeit distasteful jokes would hugely broaden the criminalised group and would likely result in convicting individuals who have acted in a certain way due to immaturity and poor judgement, rather than with criminal intent.537

Different maximum sentences between children and adults should be considered to provide proportionate consequences to the careless rather than criminal acts of children.

Sentencing differentiation is employed elsewhere in relation to similar offences committed by children, for example section 13 of the Sexual Offences Act 2003 provides for a different maximum sentence for under 18s who have committed various sexual offences set out in sections 9 to 12 of the act.538

Consultation question 26(2)

6.98 Consultation question 26(2) asked:539

Further, should the defendant’s awareness of the risk of causing harm (whether alarm or distress, or otherwise) be sufficient to establish this mental or fault element of the cyber-flashing offence?

Responses

6.99 The Justices’ Legal Advisers and Court Officers’ Service (formerly the Justices’ Clerks’ Society) agreed and noted:

Consent is a part of it in our view, but not all of it as you say. Perhaps the new offence could include “without consent”, but also “with intent to cause alarm or distress or awareness of a risk of doing so”.540

6.100 Alisdair Gillespie agreed, commenting that “...awareness of risk should suffice.”541

6.101 The Angelou Centre supported this approach:

This would allow for the perpetrator to be prosecuted when their main intention is not to cause alarm, distress or any other required motivation, but they were aware that their behaviour would likely cause alarm or distress. It also continues to ensure that the focus is on the behaviour of the perpetrator, rather than the victim/survivor and how she/he reacted or did not react.542

6.102 The Association of Police and Crime Commissioners agreed:

We think that the defendant’s awareness of the risk of causing harm should be sufficient to establish the mental or fault element of the cyber-flashing offence; this will ensure that consensual sharing of images or recordings of genitals between adults is not unduly criminalised.543

6.103 The Magistrates Association agreed and added the following:

This seems a sensible approach to be taken in respect of the mental element of a cyber flashing offence. The discussion in the paper around whether express consent should be required to receive such an image is an interesting one. We agree that you would not want to criminalise actions between partners or friends, where there is an expectation based on their relationship that such an image would be welcomed.544

6.104 Fair Cop disagreed: “We think that it is important to show intent to cause harm if the criminal law is to get involved.”545

6.105 The Criminal Law Solicitors’ Association also disagreed, commenting that this approach was “far too subjective.”546

This is an area fraught with potential difficulties. As an example, the extension of this offence as suggested may lead to the increased criminalisation of young people who may misjudge their actions and their audience. The consequences of doing so may be severe and disproportionate. In particular, the automatic placement on the sex [offenders’] register may hold back, divert or marginalise young people who may in the particular circumstances have acted without malice and having caused relatively little or no harm. Neither should adults be faced with criminality when they intend no harm but merely misjudge at the relevant moment of sending an[] image believing that it may be well received. In order to temper this we feel that where the defendant’s awareness of the risk of causing harm (whether alarm or distress, or otherwise) is deemed sufficient to establish the mental element, the court should also have to take into account among other things, the context of the sending and any specific features of the defendant (and recipient).547

Analysis

6.116 Further, a number of consultees noted that obtaining sexual gratification was frequently one of the purposes underlying cyberflashing. While we have no reason to doubt this - indeed, it almost seems self-evidently true - it cannot just be tagged on to the list of “intentions”. Obtaining sexual gratification differs from the malicious intentions (causing alarm, distress, and humiliation) in this context because sending a person an image of genitals for one’s own sexual gratification is, on its own, not wrong. Indeed, it may be welcome. The harmful outcome is not embedded within the intent. However, in the right circumstances, an offence that criminalises cyberflashing done for a sexual purpose can better recognise the harm inflicted by the invasion of a victim’s autonomy. For this reason, we believe a different fault element is appropriate in certain circumstances where the defendant is acting for the purpose of obtaining sexual gratification.550

CONCLUSION

6.124 The offence we recommend would be technologically neutral; it does not matter over or through what medium you send the image, nor should the offence be constrained to images of the sender’s genitals. It is perhaps interesting to note in passing that the conduct element of the “cyberflashing” offence thus involves nothing inherently related to computer networks nor anything inherently related to public exposure.

6.125 However, this broad conduct element is balanced by a fault element that focusses on specifically wrongful behaviour, whilst at the same time having greater scope than exists under the current exposure offence.

6.126 Given the range of harms that have been described to us that result from cyberflashing and the attendant motivations, it seems that there may be reason to revisit the relatively restricted fault element of the original exposure offence (though, given that we have proposed a separate cyberflashing offence, that now lies outside the scope of this Report).

6.127 Finally, we also note that in our view, the harm as described to us by stakeholders, and the conduct and fault elements as we outline above, seem to align with the offence having the same maximum penalty as the existing offence in section 66 of the Sexual Offences Act 2003. That offence is an “either-way” offence, liable to up to six months’ imprisonment on summary conviction or two years’ imprisonment on conviction on indictment.

Ancillary orders and special measures

6.128 Finally, part of the reasoning underlying our proposal that cyberflashing be considered a sexual offence was the availability of ancillary orders were it included in Schedule 3 of the Sexual Offences Act 2003. As we noted above, consultees were overwhelmingly in support of this, citing the availability of Sexual Harm Prevention Orders and notification specifically. As our proposal was to amend the existing section 66 offence, which carries with it existing triggers under Schedule 3, we did not consult on what specific elements of a new offence should trigger an SHPO or notification requirement. However, there would seem to be no compelling reason for excluding a cyberflashing offence from the scope of Schedule 3 of the SOA 2003, with a threshold similar to that applicable to exposure.554 The two separate fault elements in our recommended offence are similar to the offence in section 67A of the Sexual Offences Act 2003, which only triggers notification requirements under Schedule 3 when the defendant’s purpose was obtaining sexual gratification.555 However, in our view, the recommended offence is more similar in both substance and form to the offence of exposure in section 66 of the Sexual Offences Act 2003. As such, we think it more appropriate that notification requirements mirror those for the offence of exposure.

6.129 We also consider that, in line with other sexual offences, automatic lifetime anonymity and special measures for victims556 should be available in respect of offences of cyberflashing. These issues were considered in detail in our project on intimate image abuse.557

Automatic lifetime anonymity

6.130 Objections can be made to imposing restrictions on the reporting of a complainant’s identity. The principle of open justice is fundamental and is not to be departed from without good reason. One practical concern is that, in the case of a false allegation, a defendant may lose the opportunity to trace witnesses who could provide important exculpatory evidence about the complainant relevant to their case. More commonly it is argued that without anonymity for the defendant there is no parity of treatment, and that this conflicts with the principles of open justice and fairness. However, automatic anonymity in relation to sexual offending has its origins in the 1975 Report of the Heilbron Committee,558 and a range of statutory exceptions to the principle of open justice make specific provision for anonymity.559 The question, then, is not whether anonymity can ever be justified but whether the public interest requires automatic (as opposed to discretionary) anonymity to be extended to victims of cyberflashing.

6.131 As we set out in detail above, in our view it is essential that cyberflashing be classified as a sexual offence in the SOA 2003. The other offences in that Act attract automatic lifetime anonymity, including the “in-person” exposure offence in section 66.560 In our view, not to extend automatic lifetime anonymity to an offence of cyberflashing would create unacceptable confusion and inconsistency. Automatic anonymity would ensure that victims of cyberflashing feel confident to come forward and report the offending, and that they do not see any lack of anonymity as a “barrier” to reporting.

Special measures

6.132 Further, as a sexual offence, it is our view that complainants in cyberflashing matters ought to be automatically eligible for special measures at trial unless witnesses inform the court that they do not wish to be eligible.561 Another important safeguard for victims of cyberflashing at trial is the restrictions on cross-examination that apply in trials involving sexual offences.562 Individually and in combination, we believe these measures will go some way to ensuring victims’ confidence in reporting cyberflashing, as well as limiting distress experienced by complainants in the trial process. This would similarly mirror the protections currently afforded to complainants of offences of “in-person” exposure.563

Recommendation 8.

6.133 We recommend that the Sexual Offences Act 2003 be amended to include an offence of cyberflashing with the following elements:

Recommendation 9.

6.134 We recommend that Schedule 3 of the Sexual Offences Act 2003 be amended to include the offence of cyberflashing so that notification requirements under Part 2 of the Sexual Offences Act 2003 are imposed when an appropriate seriousness threshold is met.

Recommendation 10.

6.135 We recommend that Sexual Harm Prevention Orders be available for all cyberflashing offences.

Recommendation 11.

6.136 We recommend that victims of the new cyberflashing offence should have automatic lifetime anonymity.

Recommendation 12.

6.137 We recommend that victims of the new cyberflashing offence should automatically be eligible for special measures at trial.

Recommendation 13.

6.138 We recommend that restrictions on the cross-examination of victims of sexual offences should extend to victims of the new cyberflashing offence.

Chapter 7: Glorification or encouragement of self-harm

INTRODUCTION

online and offline. While we raised concerns about how the criminal law might best tackle encouragement of self-harm in our consultation paper, we received detailed, well-evidenced responses from consultees that have been of great assistance in considering the best way forward. While a specific offence addresses the equivalent behaviour in the context of suicide, there is currently no offence that adequately addresses the encouragement of serious self-harm.

Policy

3.153 The existence of online content glorifying, encouraging, or promoting self-harm and suicide has attracted significant media attention and has been linked with the deaths of children and young people. For example, the so-called “Blue Whale Challenge” - an online “suicide game” which sets daily “challenges” for “players” - has been well-documented. Daily “challenges” start with, for example, “wake up in the middle of the night”, then escalate to “cut a blue whale into your arm”, and finally, to suicide.4 This is an extreme example of content promoting self-harm. At the more insidious end of the scale are websites and social media pages promoting strict diets that may amount to eating disorders or “orthorexia”.5

3.154 While encouraging or assisting suicide is a specific offence, criminalised under the Suicide Act 1961, encouraging or assisting self-harm is not. As we note in the Scoping Report, there is an argument that glorifying self-harm may be an inchoate offence.6 We discuss this in detail in Chapter 6. Here, we simply note that, unless a communication glorifying, encouraging, or promoting self-harm crosses the threshold of “obscene, indecent, or grossly offensive”, it cannot be prosecuted under section 127 CA 2003. For similar reasons, it may not be caught by section 1 MCA 1988. It is, therefore, another example of potentially harmful communication that is arguably under-criminalised by the existing law.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 3.153 to 3.154. Citations in original. The issues surrounding the “Blue Whale Challenge” and other “digital ghost stories” are explored in more detail in Part 2 below.

See, for example, A Adeane, Blue Whale: What is the truth behind an online 'suicide challenge'? (13 January 2019), available at: https://www.bbc.co.uk/news/blogs-trending-46505722 (last visited 13 July 2021).

See, for example, S Marsh, Instagram urged to crack down on eating disorder images (8 February 2019), available at: https://www.theguardian.com/technology/2019/feb/08/instagram-urged-to-crack-down-on-eating-disorder-images (last visited 13 July 2021). As defined in the Oxford English Dictionary (2021), “orthorexia” means: an excessive concern with consuming a diet considered to be correct in some respect, often involving the elimination of foods or food groups supposed to be harmful to health.

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, para 12.94.

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.177 to 6.182.

defendant pleaded guilty to a total of 137 offences all either of a sexual nature or sexually motivated. The offences related to his manipulation of numerous victims over the internet. It is also worth noting that his guilty pleas included numerous offences under the Malicious Communications Act 1988 (though given the severity of his conduct, what was caught under the MCA would likely fall under our provisionally proposed “harm-based” offence). Falder’s offending can be summarised as follows:572

Falder was a university academic who is now serving a sentence of 25 years’ imprisonment after admitting 137 charges. His conviction followed an investigation by the NCA into horrific online offending which included encouraging the rape of a four year old boy. Falder approached more than 300 people worldwide and would trick vulnerable victims - from young teenagers to adults - into sending him naked or partially-clothed images of themselves. He would then blackmail his victims to self-harm or abuse others, threatening to send the compromising images to their friends and family if they did not comply. He traded the abuse material on ‘hurt core’ forums on the Dark Web dedicated to the discussion, filming and image sharing of rape, murder, sadism, paedophilia and degradation.

The Revenge Porn Helpline have been supporting the National Crime Agency on a significant case for over 18 months. One main offender groomed, bribed and blackmailed both children and vulnerable young women to share intimate content. After they had shared relatively mild images, they were then threatened and coerced into sharing ever more extreme images. What started with topless images, moved through masturbation; self-harm; degrading words written on bodies; hitting and hurting themselves, urinating, defecating and even eating their own faeces. The content was all recorded to be sold and shared further online. The NCA believe there are approximately 200 victims and the Helpline is supporting nearly half of these. The Helpline have so far removed 68,441 individual images, currently a 92% takedown rate.

The offender has pleaded guilty to over 100 charges relating to the grooming, coercion and blackmail of the victims.

Law

Inchoate offences
Harming oneself

1861 could apply to self-harm. The offence criminalises harm (of the relevant severity) “to any person”, in contrast to section 20 of the same Act that only criminalises harm to “any other person”.581

A further difference is that section 18 speaks of wounding or causing grievous bodily harm to “any person”, while section 20 speaks of “any other person”. In this respect section 18 appears to follow the common law offence of mayhem, which forbids the mutilation of either another person or oneself, as in either case this would make the mutilated person unfit to fight for the country.586

Encouraging self-harm

In Marlow,588 for example, D was convicted on the basis of ‘encouraging’ others by his publication of a book on cannabis cultivation. Encouragement can be by hostile threats or pressure as well as by friendly persuasion. It may be implied as well as express. It was held to be an incitement to advertise an article for sale, stating its potential to be used to do an act which is an offence. It was an act capable of encouraging P to commit that offence589—even when accompanied by a warning that the act is an offence. By contrast, it has been held that merely intending to manufacture and sell, wholesale, a device—which has no function other than one involving the commission of an offence—is not to incite the commission of that offence.590 Arguably, there may be circumstances in which offering such devices for sale would be capable of constituting ‘encouragement’ for the purposes of Part 2 of the 2007 Act.

Encouraging or assisting suicide

(1A) The person referred to in subsection (1)(a) need not be a specific person (or class of persons) known to, or identified by, D.

(1B) D may commit an offence under this section whether or not a suicide, or an attempt at suicide, occurs.

(1C) An offence under this section is triable on indictment and a person convicted of such an offence is liable to imprisonment for a term not exceeding 14 years.

2A Acts capable of encouraging or assisting

2B Course of conduct

A reference in this Act to an act includes a reference to a course of conduct, and a reference to doing an act is to be read accordingly.

A gap in the law: culpable conduct falling short of encouraging suicide

PART 2: CONSULTEE RESPONSES - GLORIFICATION OR ENCOURAGEMENT OF SELF-HARM

Preliminary issue: online self-harm content and suicide “challenges”

Non-suicide self-harm content and the harm-based offence

Can consultees suggest ways to ensure that vulnerable people who post non-suicide self-harm content will not be caught by our proposed harm-based offence?

Responses

As we have iterated throughout our response, to avoid the criminalisation of individuals who post self-harm or suicide content, the intent of the user will need to be the primary consideration in prosecutions. Malicious intent without reasonable excuse should be demonstrated in order to meet the threshold for the harm-based intent.

Samaritans also question the Law Commission’s decision to differentiate between non-suicidal self-harm content and suicidal self-harm content - it can be difficult to determine intent to die in regard to self-harm, and it can change over time: self-harm may precede suicide in young people and is considered a strong risk factor in future suicide attempts and suicidal behaviour.

Furthermore, it may be difficult to distinguish between non-suicidal self-harm and a suicide attempt based on online communication alone. To make this distinction may be confusing for those enforcing the offence: an image of self-harm and a suicide attempt using the same method are unlikely to be materially different in appearance and an individual posting content of this nature is likely to be in need of support.

Targeted approaches to individuals may be indicative of malicious intent. For example, a user says they are feeling low on their public social media account, and are then bombarded by unsolicited images of self-harm and information about methods from others, even when asked to stop. The creation of new accounts to bypass blocks to target individuals may indicate malicious intent in some cases, although it should be noted that users may also create multiple new accounts without malicious intent: graphic suicide and self-harm content goes against the community standards of the largest social networks and users who regularly post this content may re-register to avoid detection or evade suspension, neither of which are criminal issues.597

Some survivors of domestic abuse and other forms of VAWG post descriptions and pictures of the abuse they have suffered as a way of sharing their story and expressing themselves. We have also supported women who have posted pictures of injuries and claim that they were the result of self-harm when in fact they were caused by a domestic abuse perpetrator, in order to discourage others from thinking they are being abused. It is crucial that survivors of domestic abuse are not arrested or prosecuted for communicating about their experience of abuse online.598

We would suggest that the CPS produces guidance that specifically states that vulnerable people who post non-suicide self-harm content should not be prosecuted under this law. Additionally, College of Policing could produce subsequent guidance encouraging the police not to treat such behaviour as criminal conduct, but when made aware of such behaviour, to refer vulnerable individuals to services that can offer them appropriate support.

As well as the above - or potentially as an alternative - a vulnerable person who posts non suicide self-harm material that is not specifically encouraging others to commit similar behaviour, could arguably be doing so as a form of self-expression, or as an attempt to illicit [sic] emotional support from others to stop the behaviour. Potentially, this could be viewed within the law as proposed as a “reasonable excuse” for the post.

Either way, we believe social media companies have an important role to play in terms of ensuring that users who post self-harm content are signposted to services that can support them.599

As suggested in the consultation, ensuring people who were sharing their own story or struggle, should have a reasonable excuse to the proposed harm-based offence. The challenge is ensuring people can discuss problems (not just relating to themselves, but people may share their experiences relating to supporting friends of family) without providing an automatic defence for people. It may be that communications relating to the issue of non-suicide self-harm should be more tightly defined to ensure only the most grievous offences are caught by the legislation.600

One possible way we suggest would be to include an explicit exclusionary subsection excluding vulnerable people but, as we also say in answer to Question 29 below, we feel there would be inherent difficulties involved both in defining and determining who fell into the category of a “vulnerable person”.

We are conscious of the fact though that whilst a vulnerable person may share NSSH there is still a risk that a person from the likely audience may view such material and go on to self-harm to suicide, that is to say that even where a vulnerable person posts NSSH material there is no way of controlling how that material or post may be used or viewed by other vulnerable people who may be severely adversely affected by it e.g. to the point of self-harm to suicide. This consideration should not be addressed in the offence itself. The public interest test in any decision to prosecute has proven capable of protecting those individuals in respect of whom there is sufficient evidence to secure a conviction but it would be inappropriate or unfair to prosecute them. The offence itself should not distinguish between who may or may be capable of committing an offence in particular circumstances. A sufficient protection exists in the two part prosecution test.603

Analysis

Encouragement of self-harm: should there be a specific offence?

Should there be a specific offence of encouragement of self-harm, with a sufficiently robust mental element to exclude content shared by vulnerable people for the purposes of self-expression or seeking support? Can consultees provide evidence to support the creation of such an offence?

Agreed

As the Law Commission has identified, there are complexities in ensuring people engaging in self-harm are not caught within the proposed offence of encouragement of self-harm. Recent research by Samaritans in England with people who had self-harmed suggested that current provision and support after self-harm was largely unhelpful, and participants reported facing ongoing stigma, meaning that people who have self-harmed may go to informal online spaces to seek peer support without fear of judgement from people they know face to face. Some people who have self-harmed consider professional services to be too basic and want what they consider to be more responsive help in user-led and run online spaces. Capturing users of self-harm communities within this offence would serve to further stigmatise individuals who self-harm and may remove an invaluable source of support, without providing an alternative.

It is important to note that peer support may not always be helpful: communities with a focus on self-harm may offer advice on methods of self-harm, guidance on concealing evidence and injuries, and may inculcate the belief that self-harm is an effective coping strategy, which could deter seeking support from other avenues. Furthermore, self-harm peer support groups, helplines and charities may provide guidance on harm minimisation for individuals who are intent on self-harming. An example of this is Self-Injury Support’s resource on Harm Minimisation, which provides guidance on the lowest risk areas to self-harm and the importance of clean implements. This resource is aimed at individuals who are intent on self-harming regardless of intervention and should not be within the scope of the offence: the inadvertent criminalisation of harm reduction approaches from professional organisations would deter the provision of such guidance due to fear of reputational damage and prosecution, potentially leading vulnerable users towards harmful material online. It will be necessary to have the ability to identify bad actors breaking policy within supportive services while also ensuring that service policies designed to keep people safe aren’t inadvertently caught by the law.

Some social media platforms attract a high number of users who share graphic, glamorising images of their self-harm: a recent example seen by Samaritans was of a user sharing an image of a bucket of blood and images of recent self-harm. This attracted responses from other users, remarking on how ‘cool’ this was. Under the same hashtag, users comment on the neatness of others’ injuries, asking about the method used. Many of these users are likely to be young people: self-harm posts on social media often include reference to being caught by their parents, or having to hide paraphernalia relating to self-harm from their parents. This behaviour shows the complexity around assisting self-harm: the content is likely to be harmful to others and shows that users are experiencing feelings of distress, the communication is taking place between minors who are likely to have shared experiences of self-harm and intention may be difficult to understand.

In recent years, multiple online suicide and self-harm challenges, such as Blue Whale, have risen to prominence, and indicate that the overt encouragement of self-harm by bad actors is becoming more of an issue. Media reports and discussions on social media about these challenges can inadvertently raise their profile, encouraging individuals to actively seek them out and potentially cause themselves harm. In the case of Jonathan Galindo, a challenge that emerged in the summer of 2020 and was widely described in the media as the ‘new Blue Whale’, multiple accounts on social media purporting to be this character were created, suggesting that the high level of coverage led to copycat behaviour. Many of these accounts broke the community guidelines of the platforms on which they were created, sharing graphic images of self-harm and injury, and sending private messages to users with the intent of causing them harm.

Another ‘suicide challenge’, the Momo Challenge, received worldwide media attention in 2018. There is no evidence to suggest that the Momo Challenge resulted in actual harm and it is widely considered to be a hoax. However, it spread rapidly via social media, including by high profile celebrities trying to ‘raise awareness’ of the dangers it presented. Samaritans advise that specific details about suicide challenges should be limited to avoid inadvertently drawing attention to potentially harmful content. However, individuals who aren’t vulnerable who create content encouraging self-harm and suicide with malicious intent, whether a hoax or not, should be held accountable for these actions.

Our understanding is that there is no precedent in the UK for prosecuting individuals involved in the creation and spread of online suicide challenges, and we support the prosecution of bad actors who maliciously and deliberately encourage self-harm and suicide. The scale of encouragement to self-harm online is not widely known, perhaps due to the lack of legislation in this area, and while this offence would not be applicable to most individuals speaking about self-harm online, the existence of suicide challenges indicates that there are cases where it would be appropriate. As there is currently little recourse for those encouraging self-harm, the offence may go some way to deter users from creating online challenges in future.608

Disagreed

No. In our view, there should not be a specific offence of encouragement of self-harm. We feel there would be inherent difficulties involved in both defining and determining who fell into the category of a vulnerable person sharing or who had shared content for the purpose of self-expression or seeking support. We also are of the view that these difficulties may lead to inconsistent charging decisions and or outcomes of trials.

This is a complex area and it is unlikely to be dealt with consistently, fairly and properly by reference to the terms of an offence. Each case will turn on its own facts and where a line is crossed or not will be one of judgment and so we advise against over complicating the offence (or any specific additional offence) with prescriptive drafting, which would most likely be incapable of universally fair application.

The proposed offence specifically requires regard to be had of the context. This, in our view, along with the discretionary prosecution test, availability of inchoate offences and any accompanying guidance offers sufficient protection.610

Other

The APCC would need to see evidence to support the creation of such an offence before forming a view. We can see arguments that creating a specific offence of encouragement of self-harm may help to deter people from specifically posting content of this nature (as opposed to content which expresses their own experiences with self-harm, or is an attempt to illicit [sic] emotional support from others to stop the behaviour).

However, we believe that conduct such as this will likely come under the proposed harm-based offence as detailed above.

Additionally, we share the Commission’s concerns detailed at paragraph 6.197 in the consultation document, that there is a significant overlap between “victims” and “perpetrators” in this respect: many of those who communicate online about self-harm tend to post content about their own practice and experiences.

Ultimately, policing will not prevent people from engaging in self-harming behaviours. Primarily, a public health approach should be taken to prevent people from harming themselves, or from encouraging others to do the same.611

On the one hand we accept that the encouragement of self-harm over the internet is a significant social problem that the law needs to address, and there may well be a case for criminalising it. However, as we raised in a meeting with Commission officials, we are concerned that doing so may criminalise people suffering themselves with mental ill-health. Those considering self-harm and those encouraging them are all likely to be suffering from the same mental health problems. It appears there is a fine line here between people who might inappropriately support each other with such ideas and the malicious person who ‘eggs’ a mentally disordered person to self-harm. Great care must be exercised before criminalising this conduct.612

We think it is difficult to create a specific offence of encouragement of self-harm in such a way as to exclude content shared by vulnerable people for the purposes of self-expression or seeking support. In practice, were such a specific offence to be introduced, the Crown Prosecution Service would have to apply its guidance for Crown prosecutors when considering whether the charging standard is met. The Bar Council is not in a position to provide evidence.613

ARTICLE 19 would urge extreme caution in this area. To begin with, it is unclear that self-harm is an offence in and of itself. As the Law Commission rightly notes, criminalisation in this area could result in preventing vulnerable people from seeking help or sharing experiences with others who suffer from the same issues. It is highly unclear that it would help from a medical or mental health perspective.614

... there are strong reasons against any new offence around encouragement of self-harm. It may be difficult to clearly delineate between supportive messages, and a communication that encourages self-harm with no good intent. It would also be difficult to ensure vulnerable people, especially those sharing their own experiences, would not be captured by any such offence.615

Analysis

education could also accompany any legislative reform. Further, as raised by the NPCC in the context of encouraging or assisting suicide, it is also worth considering whether prosecutions should require the consent of the Director of Public Prosecutions (“DPP”). Similarly, the DPP may consider that the sensitivities of this area could warrant the formulation of a Policy for Prosecutors similar to that issued in respect of cases of encouraging or assisting suicide (although this is of course a matter for the DPP).620 While this issue may be problematic when raised in the context of the communications offences (given the volume of offences and potential impact on resources), any offence directed to encouragement of self-harm would be consciously narrow. DPP consent could be a useful way to ensure that any offence not only has a narrow scope of liability but is also accompanied with a heightened level of prosecutorial oversight and discretion.

PART 3: ENCOURAGEMENT OR ASSISTANCE OF SELF-HARM: A NEW OFFENCE

We have particular concerns about a broad offence of “glorification” of self-harm, rather than a narrower offence of encouragement, akin to the offences under the SCA 2007...

Further, there are arguments against the creation of an encouragement offence. First, even if we set aside the possibility of prosecutions that rely on self-harm being a crime under the OAPA 1861, some of this kind of behaviour would be covered by our proposed harm-based offence. Therefore, there may not be a pressing need for a specific offence, in addition to the proposed harm-based offence.

Second, we are conscious that, in relation to this specific kind of communication, there is a significant overlap between “victims” and “perpetrators”. The aforementioned research suggests that many of those who communicate online about self-harm tend to post content about their own practice of and experiences with self-harm. This suggests that it may be more appropriate to treat NSSI content as a public health issue, using strategies other than the criminal law.

That being said, there may be a case for a narrow offence of encouragement or incitement of self-harm, with a sufficiently robust mental element to rule out the kind of NSSI content shared by vulnerable people for the purposes of self-expression or seeking support.

Each is expanded upon below.

Threshold of harm

As well as the more obvious and commonplace types of injury such as bruises and grazes, the definition captures a huge range of harms including cutting off a person’s hair and causing a temporary loss of consciousness. It has also been held to include causing a recognised psychiatric condition.625

Fault element

Maximum penalty

Practical mechanisms: prosecutorial discretion and enforcement

A prosecution is less likely to be required if:

Recommendation 14.

Recommendation 15.

Recommendation 16.

Recommendation 1.

sentence comparable to the existing Malicious Communications Act 1988.

Paragraph 2.257

Recommendation 2.

Paragraph 2.274

Recommendation 3.

Paragraph 3.71

Recommendation 4.

Paragraph 3.94

Recommendation 5.

Paragraph 3.135

Recommendation 6.

Paragraph 3.142

Recommendation 7.

Paragraph 4.34

Recommendation 8.

Paragraph 6.133

Recommendation 9.

Paragraph 6.134

Recommendation 10.

Paragraph 6.135

Recommendation 11.

8.14 We recommend that victims of the new cyberflashing offence should have automatic lifetime anonymity.

Paragraph 6.136

Recommendation 12.

8.15 We recommend that victims of the new cyberflashing offence should automatically be eligible for special measures at trial.

Paragraph 6.137

Recommendation 13.

Paragraph 6.138

Recommendation 14.

Paragraph 7.97

Recommendation 15.

Paragraph 7.98

Recommendation 16.

Paragraph 7.99

Consultees

Legal Advisers’ and Court Officers’ Service, Derbyshire Constabulary, Townswomen’s Guilds, Community Security Trust, Leonard Cheshire, LGBT Fed, Proverbs 31.

Glossary

4chan

4chan is a website to which images and discussion can be posted anonymously by internet users. The website contains a number of sub-categories - or “boards” - such as, notably, the “Politically Incorrect” board and the “Random” board. The website has proved controversial, and has at times been temporarily banned by various internet service providers.

App

Short for “application”, this is software that can be installed on a mobile device, such as a tablet or mobile phone, or a desktop computer.

AirDrop

This is an Apple service that allows users to transfer files (including photographs) between Apple devices using a peer-to-peer wireless connection (ie they are not sent over the internet or mobile network).

Blog

An online journal, or “web log”, usually maintained by an individual or business and with regular entries of content on a specific topic, descriptions of events, or other resources such as graphics or videos. To “blog” something is also a verb, meaning to add content to a blog, and a person responsible for writing blog entries is called a “blogger”. Microblogging refers to blogging where the content is typically restricted in file size; microbloggers share short messages such as sentences, video links or other forms of content. Twitter is an example of a microblog.

Body dysmorphia

This is a chronic mental health condition characterised by extreme anxiety or obsession over perceived physical flaws.

Chatroom

A feature of a website where individuals can come together to communicate with one another.

Chatrooms can often be dedicated to users with an interest in a particular topic. Chatrooms can have restricted access or be open to all.

Comment

A response to another person’s message - such as a blog post, or tweet - often over a social media platform.

Cyberbullying

The use of internet-enabled forms of communication to bully a person, typically by sending messages of an intimidating or threatening nature.

Cyberflashing

The term “cyberflashing” is used to refer to a range of behaviours, but most commonly involves a man sending an unsolicited picture of his genitals to a woman.

Cyberstalking

A form of stalking that takes place over the internet.

Dragging

Also known as “Trashing” - the practice of repeatedly publicly posting about a person in order to humiliate or shame them, often on a chatroom or similar social media site.

Defendant

The person accused of an offence.

Dick pic

Strictly speaking, this is a photograph that a person has taken of their penis. The term more commonly relates to these photographs being sent to another or posted publicly.

Doxing

Searching for and publishing private or identifying information about a particular individual on the web, typically with malicious intent.

Either-way offence

An offence that can be tried either in the Crown Court or in a magistrates’ court.

Facebook

A social media platform which connects users from all over the world and enables them to post, share, and engage with a variety of content such as photos and status updates.

Fake news

False, often sensational, information disseminated under the guise of news reporting.

Fault element

A culpable state of mind that needs to be proven beyond reasonable doubt in order to establish criminal liability. Also referred to as “mens rea” or “mental element”.

Friend

The term used on social media services such as Facebook to refer to an individual who is added to a user’s social network on the platform. A person may allow this “friend” to view their profile, or particular parts of it (for example, certain posts or messages). It is also used as a verb, for example, to “friend” a person, means to add them to your social network. Facebook “friends” may not actually be “friends” in the conventional understanding of the term. Someone could “friend” a complete stranger.

Follow

“Following” another user of certain social media platforms (for example, Twitter or Instagram) means that you will receive updates from that user, which will appear in your newsfeed.

GIF

A GIF (“graphics interchange format”) is a moving or “animated” digital image that plays back (or “loops”) continuously. They are mostly soundless, and can include short clips of video or film as well as cartoons.

Hashtag

A hashtag is a tag usually used on social networks such as Twitter or Facebook. Social networks use hashtags to categorise information and make it easily searchable for users. It is presented as a word or phrase preceded by a #. For example, a current well-known hashtag is #MeToo.

Hate Crime

There is no statutory definition of “hate crime”. When used as a legal term in England and Wales, “hate crime” refers to two distinct sets of provisions:

Aggravated offences under the Crime and Disorder Act 1998 (“CDA 1998”), which are offences where the defendant demonstrated, or the offence was motivated by, racial or religious hostility;

Enhanced sentencing provisions under the Criminal Justice Act 2003 (“CJA 2003”), which apply to offences where the defendant demonstrated, or the offence was motivated by, hostility on the grounds of race, religion, sexual orientation, disability or transgender identity.

A different definition is used by the police, Crown Prosecution Service and National Offender Management Service for the purposes of identifying and flagging hate crime. The focus of this definition is on victim perception:

Any criminal offence which is perceived by the victim or any other person, to be motivated by a hostility or prejudice based on a person’s race or perceived race; religion or perceived religion; sexual orientation or perceived sexual orientation; disability or perceived disability and any crime motivated by a hostility or prejudice against a person who is transgender or perceived to be transgender.

The term hate crime is sometimes also used to describe “hate speech” offences, such as offences of stirring up hatred under the Public Order Act 1986, and the offence of “indecent or racialist chanting” under the Football (Offences) Act 1991.

Inchoate offence

An offence relating to a criminal act which has not, or not yet, been committed. The main inchoate offences are attempting, encouraging or assisting crime.

Indictable offence

An offence triable in the Crown Court (whether or not it can also be tried in a magistrates’ court); contrasted with a summary offence.

Instagram

A photo sharing app that allows users to take photos, apply filters to their images, and share the photos instantly on the Instagram network and other social networks such as Facebook or Twitter.

Internet Access Provider

A company that provides subscribers with access to the internet.

Internet Service Provider

A broader term than Internet Access Provider referring to anything from a hosting provider to an app creator.

IP address

An “internet protocol” address is a numerical label which identifies each device on the internet, including personal computers, tablets and smartphones.

Liking

Showing approval of a message posted on social media by another user, such as his or her Facebook post, by clicking on a particular icon.

Meme

A thought, idea, joke or concept that has been widely shared online, often humorous in nature - typically an image with text above and below it, but sometimes in video and link form.

Non-binary

An umbrella term for people whose gender identity does not sit comfortably with “man” or “woman”. It can include people who identify with some aspects of binary gender identities, and others who completely reject binary gender identities. Non-binary people may also identify under the transgender umbrella.

Offline communication

Communication that does not use the internet (for example, having a face-to-face conversation or sending a letter).

Online abuse

For the purposes of this report, we adopt the following working definition of “online abuse”. Online abuse includes but is not limited to: online harassment and stalking; harmful one-off communications, including threats; discriminatory or hateful communications, including misogynistic communications (“online hate”); doxing and outing; impersonation.

Online communication

Communication via the internet between individuals and/or computers with other individuals and/or computers.

Online hate

By “online hate” we mean a hostile online communication that targets someone on the basis of an aspect of their identity (including but not limited to protected characteristics). Such communications will not necessarily amount to a hate crime. We note that the College of Policing’s Hate Crime Operational Guidance (2014), stipulates that police should record “hate incidents” using a perception-based approach. Again, such incidents may not amount to a hate crime.

Photoshop

A software application for editing or retouching photographs and images.

Pile-on harassment

Harassment that occurs where many individuals send communications that are harassing in nature to a victim. This can involve many hundreds or thousands of individual messages, and is closely connected to the instantaneous and geographically unbounded nature of the online environment.

Post or posting (on social media)

A comment, image or video that is sent so as to be visible on a user’s social media page or timeline (whether the poster’s own or another’s).

Private message

A private communication between two people on a given platform which is not visible or accessible to others.

Protected characteristics

In the context of hate crime this refers to characteristics that are specified in hate crime laws in England and Wales, namely: race, religion, sexual orientation, disability and transgender status. The term is also sometimes used in the context of the Equality Act 2010, which specifies nine protected characteristics. There is some overlap between the two, but in this report we are referring to the hate crime characteristics unless we specify otherwise.

Replying

An action on, for example, Twitter that allows a user to respond to a Tweet through a separate Tweet that begins with the other user’s @username.

Retweeting

The re-sharing (forwarding) on Twitter by a person (B) of a message received from another person (A), using the re-tweet button and attributing the message to A.

Sharing

The broadcasting by a social media user of web content to their own social media page, or to the page of a third party.

Skype

A free program that allows for text, audio and video chats between users; it also allows users to place phone calls through their Skype account.

Social media

Websites and apps that enable users to create and share content or to participate in social networking.

Social media platform

Refers to the underlying technology which facilitates the creation of social media websites and applications. From a user’s perspective, it enables blogging and microblogging (such as Twitter), photo and video sharing (such as Instagram and YouTube), and the ability to maintain social networks of friends and contacts. Some platforms enable all of these in one service (through a website and/or an application for a desktop computer or mobile phone) as well as the ability for third-party applications to integrate with the service.

Social Networking Service

A service provided by an internet company which facilitates the building of social networks or social relations with other people, through the sharing of information. Each service may differ and target different uses and users: for example, facilitating connections between business contacts only, or sharing only particular types of content, such as photos.

Summary or summary-only offence

An offence triable only in a magistrates’ court; in contrast to an indictable or either-way offence.

Tag

A social media function used commonly on Facebook, Instagram and Twitter, which places a link in a posted photograph or message to the profile of the person shown in the picture or targeted by the update. The person that is “tagged” will receive an update that this has occurred.

Troll

A person who creates controversy in an online setting (typically on a social networking website, forum, comment section, or chatroom), disrupting conversation about a piece of content by providing commentary intended to provoke an adverse reaction.

Tweet

A post on the social networking service Twitter. Tweets can contain plain text messages (not more than 280 characters in the English version of the service), or images, videos, or polls. Users can Tweet to another person (@mention tweets) so as to ensure they will be notified of the Tweet, or can also message them directly. Other users can retweet the Tweets of others amongst their connections on the platform.

Twitter

A social network that allows users to send “Tweets” to their followers and/or the public at large.

VoIP

“Voice over Internet Protocol” refers to the group of technologies that allow users to speak to each other over an Internet Protocol network (such as the internet) rather than over traditional networked telephone services.

WhatsApp

An encrypted instant messaging service for one-to-one or group chat on mobile devices.

YouTube

A video-sharing website that allows registered users to upload and share videos, and for any users to watch videos posted by others.

Zoom

Zoom is cloud-based videoconferencing and messaging software that allows for video meetings with multiple participants (up to 1,000 in some cases).


1

Internet Live Stats, https://www.internetlivestats.com/one-second/#traffic-band (last visited 13 July 2021).

2

The Number of Tweets per Day in 2020, https://www.dsayce.com/social-media/tweets-day/ (last visited 13 July 2021).

3

WhatsApp Revenue and Usage Statistics (2020), https://www.businessofapps.com/data/whatsapp-statistics/ (last visited 13 July 2021).

4

Internet Live Stats, https://www.internetlivestats.com/one-second/#traffic-band (last visited 13 July 2021).

5

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381.

6

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, especially Chapter 4.

7

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 4.9.

8

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, paras 2.155 to 2.161.

9

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 4.12.

10

Aged 16 or over.

11

That is, 61% of respondents selected at least one of the things listed when asked “Which if any of these things have you come across on the internet in the last 12 months?”. The things listed were potential online harms, divided into three categories: (1) data; (2) hacking/security; and (3) content/contact with others.

Under (3), content/contact with others, the following things were listed: (i) fake news; (ii) offensive language; (iii) violent/disturbing content; (iv) unwelcome friend/follow requests/unwelcome contact or messages from strangers; (v) offensive videos/pictures; (vi) harmful/misleading advertising; (vii) hate speech/inciting violence; (viii) bullying, abusive behaviour or threats; (ix) trolling (a person who deliberately says something controversial); (x) people pretending to be another person; (xi) sexual/pornographic content; (xii) spending too much time online; (xiii) encouraging self-harm e.g. cutting, anorexia, suicide; (xiv) encouraging terrorism/radicalisation; (xv) cyberstalking (harassment from other internet users); (xvi) material showing child sexual abuse.

12

Jigsaw Research, Internet users’ experience of harm online: summary of survey research (Ofcom 2019), https://www.ofcom.org.uk/__data/assets/pdf_file/0028/149068/online-harms-chart-pack.pdf (last visited 13 July 2021).

13

Aged 12 to 15.

14

H Bentley and others, How safe are our children? 2019: an overview of data on child abuse online (NSPCC 2019), https://learning.nspcc.org.uk/media/1747/how-safe-are-our-children-2019.pdf (last visited 13 July 2021), at p 10.

15

Glitch and End Violence Against Women Coalition, “The Ripple Effect: COVID 19 and the Epidemic of Online Abuse” (2020) available at: https://www.endviolenceagainstwomen.org.uk/wp-content/uploads/Glitch-and-EVAW-The-Ripple-Effect-Online-abuse-during-COVID-19-Sept-2020.pdf (last visited 13 July 2021).

16

Alan Turing Institute, “Detecting East Asian prejudice on social media”, available at: https://www.turing.ac.uk/research/research-projects/hate-speech-measures-and-counter-measures/detecting-east-asian-prejudice-social-media (last visited 13 July 2021).

17

This figure comes from internal data generated by the National Online Hate Crime Hub, provided to us by Paul Giannasi.

18

See for example, S Stone, “Premier League plans social media blackout in response to online abuse” BBC News 18 April 2021 available at: https://www.bbc.co.uk/sport/football/56792443 (last visited 13 July 2021).

19

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 1.21.

20

Draft Online Safety Bill 2021, available at:

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/985033/Draft_Online_Safety_Bill_Bookmarked.pdf (last visited 13 July 2021).

21

Hate Crime: A Consultation Paper (2020) Law Commission Consultation Paper No 250; Intimate Image Abuse: A Consultation Paper (2021) Law Commission Consultation Paper No 253.

22

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, paras 5.184-5.188.

23

This is more relevant to the non-criminal law response to online harms, as acknowledged in the Government response to the Online Harms White Paper: “Devolution” paras 6.15 to 17, available at: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021). Justice and policing are transferred matters on which the Northern Ireland Assembly has full legislative powers, by virtue of The Northern Ireland Act 1998 (Amendment of Schedule 3) Order 2010. See also: Guidance, Devolution settlement: Northern Ireland, 20 February 2013, updated 23 September 2019. Available at: https://www.gov.uk/guidance/devolution-settlement-northern-ireland (last visited 13 July 2021).

24

Hate Crime Legislation in Northern Ireland: Independent Review (November 2020), available at: https://www.justice-ni.gov.uk/sites/default/files/publications/justice/hate-crime-review.pdf (last visited 13 July 2021).

25

Department of Justice Northern Ireland, Consultation Response.

26

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 4.62.

27

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, paras 4.63 to 66. (Citations omitted).

28

Crime and Disorder Act 1998, s 37(1).

29

Intimate Image Abuse: A Consultation Paper (2021), Law Com Consultation Paper No 253, para 1.75.

30

For example, Chapter 6, paras 6.112 and following.

31

Crown Prosecution Service, “Youth Offenders”, updated 28 April 2020, available at: https://www.cps.gov.uk/legal-guidance/youth-offenders (last visited 13 July 2021).

32

Crown Prosecution Service, “Social Media - Guidelines on prosecuting cases involving communications sent via social media” revised 21 August 2018, available at: https://www.cps.gov.uk/legal-guidance/social-media-guidelines-prosecuting-cases-involving-communications-sent-social-media (last visited 13 July 2021).

33

Sentencing Council, “Sentencing Children and Young People”, effective from 1 June 2017, available at: https://www.sentencingcouncil.org.uk/overarching-guides/magistrates-court/item/sentencing-children-and-young-people/ (last visited 13 July 2021).

34

Online Harms White Paper: Full UK government response to the consultation, “Interim measures from the government” para 2.46 and following; “Role of the government in media literacy” paras 5.23 to 28: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021).

35

See Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, at para 1.50.

36

A draft Online Safety Bill was published on 12 May 2021, available at:

https://www.gov.uk/government/publications/draft-online-safety-bill (last visited 13 July 2021).

37

Redmond-Bate v DPP [1999] EWHC Admin 733 [20]. It is worth also recalling the words of the European Court of Human Rights in Handyside v UK [1990] ECHR 32, 49, that the right to freedom of expression extends to those communications “that offend, shock or disturb the State or any sector of the population. Such are the demands of that pluralism, tolerance and broadmindedness without which there is no ‘democratic society’.”

38

Parliamentary Liaison and Investigation Team, Consultation Response.

39

National Police Chiefs’ Council, Consultation Response.

40

Criminal Bar Association, Consultation Response.

41

Magistrates Association, Consultation Response.

42

K Barker & O Jurasz, Consultation Response.

43

Zoom, Consultation Response.

44

Refuge, Consultation Response.

45

  Stonewall, Consultation Response.

46

 ARTICLE 19, Consultation Response.

47

  English PEN, Consultation Response.

48

  Free Speech Union, Consultation Response.

49

It appears that this is a reference to the Post Office (Amendment) Act 1935. Lord Templemore is recorded in Hansard: “... I come now to what is probably the most important clause of the whole Bill, Clause 10. Subsection (1) of that clause extends the penalties imposed by Section 67 of the Post Office Act, 1908, for obstructing officers of the Post Office to other forms of molestation and is designed to give the Post Office staff protection in cases where, for example, people have indulged in improper or obscene language over the telephone to female telephonists.” Hansard (HL) 19 March 1935, vol 96, col 165.

50

An either-way offence is one that can be tried either in the magistrates’ courts or in the Crown Court. If the magistrates decide their sentencing powers are sufficient to deal with the offence, the accused may choose to have it dealt with summarily in the magistrates’ court or on indictment (trial by jury) in the Crown Court.

51

An offence triable only in a magistrates’ court.

52

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 5.1.

53

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 5.2.

54

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 4.121.

55

Association of Police and Crime Commissioners, Consultation Response.

56

Magistrates Association, Consultation Response.

57

Carnegie UK Trust, Consultation Response.

58

J Neller, Consultation Response.

59

English PEN, Consultation Response.

60

Crown Prosecution Service, Consultation Response.

61

A Gillespie, Consultation Response.

62

Bar Council, Consultation Response.

63

Parliamentary Liaison and Investigation Team, Metropolitan Police, Consultation Response.

64

Community Security Trust, Consultation Response.

65

E Tiarks & M Oswald, Consultation Response.

66

Criminal Bar Association, Consultation Response.

67

Stonewall, Consultation Response.

68

Magistrates Association, Consultation Response.

69

T Keren-Paz, Consultation Response.

70

WhatsApp Revenue and Usage Statistics (2020), available at:

https://www.businessofapps.com/data/whatsapp-statistics/ (last visited 13 July 2021).

71

Homicide Act 1957, s 2(1)(a).

72

Public Order Act 1986, s 5.

73

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 4.

74

A Gillespie, Consultation Response.

75

J Rowbottom, Consultation Response, para 7.

76

Criminal Law Solicitors’ Association, Consultation Response.

77

Parliamentary Liaison and Investigation Team, Metropolitan Police, Consultation Response.

78

Community Security Trust, Consultation Response.

79

Association of Police and Crime Commissioners, Consultation Response.

80

Stonewall, Consultation Response.

81

Refuge, Consultation Response.

82

A Gillespie, Consultation Response. Respectfully, we do not agree that, in every case where actual harm occurs, likely harm would not need to be proven. We deal with this further at para 2.107 below.

83

Magistrates Association, Consultation Response.

84

Law Society of England and Wales, Consultation Response.

85

This is a view fortified by the Divisional Court in Parkin v Norman [1983] QB 92, 100, in which the court held that, for the purposes of section 5 of the Public Order Act 1936, “likely to” indicated a higher degree of probability than “liable to”: “This is a penal measure and the courts must take care to see that the former expression is not treated as if it were the latter.” The offence in section 5 criminalised “threatening, abusive or insulting words or behaviour with intent to provoke a breach of the peace or whereby a breach of the peace is likely to be occasioned” (broadly the offence now contained in section 4 of the Public Order Act 1986).

86

Re H (Minors) [1995] UKHL 16, [1996] 1 AC 563, 584.

87

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 3.

88

See para 2.110 above.

89

A Gillespie, Consultation Response.

90

Free Speech Union, Consultation Response.

91

Refuge, Consultation Response.

92

Bar Council, Consultation Response.

93

Judicial Security Committee, Consultation Response.

94

Association of Police and Crime Commissioners, Consultation Response.

95

Suzy Lamplugh Trust, Consultation Response.

96

The topic of “information subjects” is discussed further below.

97

E Tiarks & M Oswald, Consultation Response.

98

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 6.

99

Demos, Consultation Response.

100

Fix the Glitch, Consultation Response.

101

Refuge, Consultation Response.

102

Women’s Aid, Consultation Response.

103

Sentencing Council, Overarching principles: domestic abuse, 24 May 2018. Available at:

https://www.sentencingcouncil.org.uk/overarching-guides/magistrates-court/item/domestic-abuse/ (last visited 13 July 2021).

104

Crown Prosecution Service, Consultation Response.

105

Association of Police and Crime Commissioners, Consultation Response.

106

Magistrates Association, Consultation Response.

107

  English PEN, Consultation Response.

108

  Free Speech Union, Consultation Response.

109

LGB Alliance, Consultation Response.

110

Sex Matters, Consultation Response.

111

See Murder, Manslaughter and Infanticide (2006), Law Commission Report No 304, paras 3.9-3.27. We discuss this in detail later in this section at para 2.186 onwards.

112

A draft Online Safety Bill was published on 12 May 2021, available at:

https://www.gov.uk/government/publications/draft-online-safety-bill (last visited 13 July 2021).

113

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 8.

114

R v G and another [2003] UKHL 50; [2004] 1 AC 1034.

115

T Keren-Paz, Consultation Response.

116

Young Epilepsy, Consultation Response.

117

A Gillespie, Consultation Response.

118

Refuge, Consultation Response.

119

Association of Police and Crime Commissioners, Consultation Response.

120

Crown Prosecution Service, Consultation Response.

121

End Violence Against Women Coalition, Consultation Response.

122

Magistrates Association, Consultation Response.

123

Criminal Bar Association, Consultation Response.

124

E Tiarks & M Oswald, Consultation Response.

125

Justices’ Legal Advisers and Court Officers’ Service, Consultation Response.

126

Kingsley Napley, Consultation Response.

127

Free Speech Union, Consultation Response.

128

ARTICLE 19, Consultation Response.

129

English PEN, Consultation Response.

130

See Murder, Manslaughter and Infanticide (2006), Law Commission Report No 304, para 3.27, and cited in Blackstone’s Criminal Practice at B1.15.

131

Murder, Manslaughter and Infanticide (2006), Law Commission Report No 304, para 3.27, and cited in Blackstone’s Criminal Practice at B1.15.

132

See Blackstone’s Criminal Practice at B1.14.

133

R v Moloney [1983] AC 905.

134

R v Moloney [1983] AC 905, 913E-F (Lord Hailsham of St Marylebone LC), affirmed by Lord Scarman in R v Hancock [1985] UKHL 9; [1986] 1 AC 455, 472B.

135

R v Hancock [1985] UKHL 9; [1986] 1 AC 455, 473F (Lord Scarman).

136

R v Moloney [1983] AC 905, 929H (Lord Bridge of Harwich), cited in R v Hancock [1985] UKHL 9; [1986] 1 AC 455, 472g and again referred to at 473d.

137

R v Nedrick [1986] 3 All ER 1; [1986] 1 WLR 1025.

138

R v Woollin [1999] 1 AC 82.

139

R v Nedrick [1986] 3 All ER 1; [1986] 1 WLR 1025, 1027-1028 (Lord Lane CJ).

140

R v Woollin [1999] 1 AC 82, 95F (Lord Steyn).

141

Indeed, we make a recommendation in Chapter 6: that purpose (of obtaining sexual gratification) should constitute one of the fault elements for the recommended offence of cyberflashing.

142

Chandler v DPP [1964] AC 763.

143

Hansard, HC Public Bill Committee, 6th Sitting, 3 July 2007, col 211.

144

See Blackstone’s Criminal Practice A5.12.

145

R v Moloney [1983] AC 905, 913E-F (Lord Hailsham of St Marylebone LC).

146

Of course, the excuse does still have to be reasonable: it is not enough that it is simply “an” excuse. It is certainly open to a court to find that there was no reasonable excuse for sending, say, Nazi propaganda dressed up as satire.

147

R v AY [2010] EWCA Crim 762.

148

R v AY [2010] EWCA Crim 762 at [25].

149

See our discussion of Perinçek v Switzerland (2016) 63 EHRR 6 (App No 27510/08) in Chapter 2 of our Consultation Paper.

150

Sivier v Riley [2021] EWCA Civ 713, [20] (Warby LJ).

151

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 11.

152

Health Protection (Coronavirus, Restrictions) (England) Regulations 2020.

153

For example: BBC News, 11 January 2021, “Covid: women fined for going for a walk receive police apology”, available at: https://www.bbc.co.uk/news/uk-england-derbyshire-55625062 (last visited 13 July 2021). See also Joint Committee on Human Rights, “The Government response to covid-19: fixed penalty notices” 27 April 2021, available at: https://committees.parliament.uk/publications/5621/documents/55581/default/ (last visited 13 July 2021).

154

Justices’ Legal Advisers and Court Officers’ Service, Consultation Response.

155

Fair Cop, Consultation Response.

156

A Gillespie, Consultation Response.

157

Association of Police and Crime Commissioners, Consultation Response.

158

English PEN, Consultation Response.

159

Magistrates Association, Consultation Response.

160

Bar Council, Consultation Response.

161

Kingsley Napley, Consultation Response.

162

Judicial Security Committee, Consultation Response.

163

Mermaids, Consultation Response.

164

Hacked Off, Consultation Response.

165

Demos, Consultation Response.

166

Fair Cop, Consultation Response.

167

J Neller, Consultation Response.

168

English PEN, Consultation Response.

169

Law Society of England and Wales, Consultation Response.

170

ARTICLE 19, Consultation Response.

171

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 5.175.

172

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 5.63 to 64.

173

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 5.212 to 214.

174

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 16.

175

Crown Prosecution Service, Consultation Response.

176

ARTICLE 19, Consultation Response.

177

For example, Refuge, Samaritans, Epilepsy Society, Community Security Trust, Consultation Responses.

178

A Gillespie, Community Security Trust, Consultation Responses.

179

See for example: A Mistlin, “Man who sent antisemitic tweets on holiday avoids UK prosecution” The Guardian, 23 February 2021. Available at: https://www.theguardian.com/news/2021/feb/23/man-who-sent-antisemitic-tweets-on-holiday-avoids-uk-prosecution (last visited 13 July 2021). See also L Harpin, “Police to drop Wiley antisemitism probe after learning he was abroad at time of alleged offences” Jewish Chronicle, 25 September 2020. Available at: https://www.thejc.com/news/uk/police-drop-wiley-antisemitism-probe-after-learning-he-was-abroad-at-time-of-alleged-offences-1.507042 (last visited 13 July 2021).

180

Domestic Abuse Act 2021, ss 72 to 74, sch 3.

181

[2004] EWCA Crim 631.

182

See for example: Domestic Abuse Act 2021, ss 72 to 74, sch 3; Sexual Offences Act 2003, s 72, sch 2; Suppression of Terrorism Act 1978, s 4, sch 1; Terrorism Act 2000, ss 59, 62 and 63; Terrorism Act 2006, s 17; Bribery Act 2010, s 12; Female Genital Mutilation Act 2003, s 4; Serious Crime Act 2015, s 70; Criminal Justice Act 1993, ss 1 to 3.

183

For example, Domestic Abuse Act 2021, s 72(1)(b).

184

Protection from Harassment Act 1997, ss 4 and 4A; Domestic Abuse Act 2021, sch 3.

185

Serious Crime Act 2015, s 76; Domestic Abuse Act 2021, sch 3.

186

Domestic Abuse Bill 2020, Extraterritorial jurisdiction factsheet, updated 18 May 2021. Available at: https://www.gov.uk/government/publications/domestic-abuse-bill-2020-factsheets/extraterritorial-jurisdiction-factsheet (last visited 13 July 2021).

187

Domestic Abuse Bill 2020, Extraterritorial jurisdiction factsheet, updated 18 May 2021. Available at: https://www.gov.uk/government/publications/domestic-abuse-bill-2020-factsheets/extraterritorial-jurisdiction-factsheet (last visited 13 July 2021).

188

Crown Prosecution Service, Jurisdiction, updated 2 September 2020. Available at: https://www.cps.gov.uk/legal-guidance/jurisdiction (last visited 13 July 2021).

189

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 5.212 to 214.

190

The current offence is found in the Fraud Act 2006, s 2.

191

Online Harms White Paper: Full UK government response to the consultation, “Disinformation and misinformation” at paras 2.75 and following, available at: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021). We also differentiated “disinformation” and “misinformation” in our consultation paper at paragraph 6.43: Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248.

192

As we discussed in Example 2 of Chapter 6 of the consultation paper: Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.20.

193

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, Chapter 3, paras 6.12, 6.36.

194

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 19.

195

Association of Police and Crime Commissioners, Consultation Response.

196

 [2019] EWHC 3094 (Admin).

197

S Rowe, Consultation Response.

198

English PEN, Consultation Response.

199

Alan Turing Institute, Consultation Response.

200

Demos, Consultation Response.

201

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 2.20 to 59.

202

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 3.125 to 3.139.

203

English PEN, Consultation Response.

204

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.28.

205

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.39 and following.

206

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.41.

207

Ministry of Justice and Cabinet Office, Advice on introducing or amending criminal offences. Available at https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/481126/creating-new-criminal-offences.pdf (last visited 13 July 2021); Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.42.

208

Online Harms White Paper: Full government response to the consultation, “Disinformation and misinformation” 2.75ff, available at: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021).

209

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.47 to 6.61.

210

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 20.

211

Crown Prosecution Service, Consultation Response.

212

Fix the Glitch, Consultation Response.

213

Community Security Trust, Consultation Response.

214

English PEN, Consultation Response.

215

A Gillespie, Consultation Response.

216

Full Fact, Consultation Response.

217

Justices’ Legal Advisers and Court Officers’ Service, Consultation Response.

218

Community Security Trust, Consultation Response.

219

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.45 to 6.46.

220

See paras 3.48 to 3.50 below.

221

Hate Crime Laws: a Consultation Paper (2020) Law Commission Consultation Paper No 250, Chapter 18.

222

We note that the offences of stirring up hatred are not limited to stirring up racial hatred.

223

Hate Crime Laws: a Consultation Paper (2020) Law Commission Consultation Paper No 250, paras 18.142 to 152.

224

Hate Crime Laws: a Consultation Paper (2020) Law Commission Consultation Paper No 250, para 18.144. “The ‘blood libel’ is an antisemitic trope that Jews murder children in order to use their blood in religious rituals. Originating in Medieval England, where it was responsible for several massacres, the trope continues to circulate today. The Protocols were a hoax text purporting to document a Jewish conspiracy to control the world by controlling the press and finance. The Protocols continue to be cited by modern anti-Semites and conspiracy theorists.”

225

Criminal Law Act 1977, s 51.

226

Public Order Act 1986, s 38.

227

Magistrates Association, National AIDS Trust, Consultation Responses.

228

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 4.104 to 4.132.

229

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 4.113. Further, at paras 6.47 to 6.61 we acknowledge that in the context of knowingly false communications, there is a justification for including physical harm.

230

We also set out that, where appropriate, harms with a more diffuse causal connection to communications may well be better captured by other alternative offences that better reflect the nature and gravity of the harmful conduct, including offences against the person or sexual offences: Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 4.131.

231

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 3.125 to 130.

232

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 3.159 to 3.169.

233

National AIDS Trust, S Rowe, Demos, Carnegie UK Trust, Full Fact, Consultation Responses.

234

For example: S Rowe, A Gillespie, Consultation Responses.

235

The formulation “ought to have known” is also variously referred to as “wilful blindness”. We note the discussion of this in our report A Criminal Code for England and Wales (1989) Law Com No 177, vol 2, para 8.10. Even where a fault element requires intention, evidence of wilful blindness, or “shutting one’s eyes to the truth” may be used to infer knowledge or a belief: G Williams, Glanville Williams Textbook of Criminal Law (4th ed), para 6-017.

236

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.41 to 6.46.

237

Big Brother Watch, Consultation Response.

238

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.45.

239

See paras 2.59 to 2.60, 2.80.

240

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 21.

241

Criminal Law Solicitors’ Association, Consultation Response.

242

S Rowe, Consultation Response.

243

LGB Alliance, Consultation Response.

244

English PEN, Consultation Response.

245

Refuge, Consultation Response.

246

See eg paras 1.15 to 1.17 and 2.180 to 2.183.

247

Communications Act 2003, s 127(3).

248

Justices’ Legal Advisers and Court Officers’ Service, Consultation Response.

249

Magistrates’ Courts Act 1980, s 127(1).

250

  Criminal Justice and Courts Act 2015, s 51.

251

Criminal Justice and Courts Act 2015, Explanatory Notes to s 51, available at: https://www.legislation.gov.uk/ukpga/2015/2/notes/division/3/3/2/1 (last visited 29 April 2021).

252

Criminal Justice and Courts Bill Fact Sheet: section 127 of the Communications Act 2003, available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/363924/fact-sheet-s127-comms-act.pdf (last visited 13 July 2021).

253

Criminal Justice and Courts Bill Fact Sheet: section 127 of the Communications Act 2003, para 6.

254

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.13.

255

Scottow v Director of Public Prosecutions [2020] EWHC 3421 (Admin); [2021] Crim LR 315, [32].

256

Domestic Abuse Act 2021, s 39.

257

Domestic Abuse Act 2021, s 68; Serious Crime Act 2015, s 76.

258

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.15; Communications Act 2003, s 128 (“Notification of misuse of networks and services”).

259

Namely “annoyance, inconvenience or anxiety”: Communications Act 2003, s 128(5).

260

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 17.

261

Protection from Harassment Act 1997, ss 1, 2 and 7.

262

For example, S Thomas, N Djaba, Consultation Responses.

263

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381.

264

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248.

265

Justices’ Legal Advisers and Court Officers’ Service, Consultation Response.

266

National Police Chiefs’ Council, Consultation Response.

267

Association of Police and Crime Commissioners, Consultation Response.

268

ARTICLE 19, Consultation Response.

269

  [2020] EWHC 3421 (Admin); [2021] Crim LR 315.

270

Free Speech Union, Consultation Response.

271

Magistrates Association, Consultation Response.

272

This may include traditional fixed-line phone and mobile telephone networks, but also Voice over Internet Protocol (VoIP) or similar broadband-based phone services.

273

For example, Criminal Law Act 1977, s 51 “Bomb hoaxes”; Anti-terrorism, Crime and Security Act 2001, s 114 “Hoaxes involving noxious substances or things”.

274

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.11 to 6.17.

275

Communications Act 2003, s 127(3).

276

For example, Offences Against the Person Act 1861, s 16 “Threats to kill”; bomb hoaxes are covered by the Criminal Law Act 1977, s 51. We described the existing position regarding threats as lacking coherence and amounting to a “patchwork of different statutes”: Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, Chapter 7.

277

Reform of Offences Against the Person (2015) Law Com No 361, para 8.18.

278

Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, paras 12.1 and following.

279

Chambers v Director of Public Prosecutions [2012] EWHC 2157 (Admin); [2013] 1 WLR 1833.

280

Chambers v Director of Public Prosecutions [2012] EWHC 2157 (Admin); [2013] 1 WLR 1833, paras [26] to [34].

281

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 5.207, citing: BBC News, YouTuber jailed for Birmingham hospital bomb threat (18 June 2020), https://www.bbc.co.uk/news/uk-england-birmingham-53092117 (last visited 13 July 2021).

282

UK Government National Action Plan for the Safety of Journalists, 9 March 2021, available at: https://www.gov.uk/government/publications/national-action-plan-for-the-safety-of-journalists/national-action-plan-for-the-safety-of-journalists (last visited 13 July 2021).

283

R Syal, “UK launches action plan to prevent harassment and abuse of journalists” The Guardian, 9 March 2021, available at: https://www.theguardian.com/media/2021/mar/09/uk-launches-action-plan-prevent-harassment-abuse-journalists (last visited 13 July 2021).

284

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 5.210.

285

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 15.

286

A Gillespie, Consultation Response.

287

Association of Police and Crime Commissioners, Consultation Response.

288

Refuge, Consultation Response.

289

English PEN, Consultation Response.

290

Suzy Lamplugh Trust, Consultation Response.

291

Fix the Glitch, Consultation Response.

292

Z Alhaboby, Consultation Response.

293

Judicial Security Committee, Consultation Response.

294

LGB Alliance, Consultation Response.

295

Refuge, Consultation Response.

296

Women’s Aid, Consultation Response.

297

Antisemitism Policy Trust, Consultation Response.

298

Suzy Lamplugh Trust, Consultation Response.

299

ARTICLE 19, Consultation Response.

300

Criminal Bar Association, Consultation Response.

301

Fair Cop, Consultation Response.

302

Free Speech Union, Consultation Response.

303

Crown Prosecution Service, Consultation Response.

304

This can be contrasted with the maximum penalty of 10 years’ imprisonment for the offence of threats to kill: Offences Against the Person Act 1861, s 16.

305

Chambers v Director of Public Prosecutions [2012] EWHC 2157 (Admin); [2013] 1 WLR 1833.

306

Protection from Harassment Act 1997, s 1.

307

For example: Domestic Abuse Act 2021, ss 39 and 68; Serious Crime Act 2015, s 76.

308

Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, Chapter 12.

309

Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, paras 12.84 to 12.108.

310

Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, paras 12.113 to 114.

311

Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, paras 12.144 to 145.

312

See the discussion in Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, Chapter 14.

313

Reform of Offences against the Person (2015) Law Com No 361, para 8.18.

314

Misconduct in Public Office (2020) Law Com No 397, paras 6.88 to 6.89.

315

As discussed in R v Ireland [1998] AC 147, this will incorporate “recognisable psychiatric illness”.

316

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 5.200.

317

A Gillespie, Consultation Response.

318

Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, para 12.138.

319

For example, Malicious Communications Act 1988, s 1(2); Theft Act 1968, s 21(1)(a) and (b).

320

For further discussion, see for example “Defences involving other excuses and justifications” Blackstone’s Criminal Practice para A3.34 and following.

321

Offences Against the Person Act 1861, s 16 “Threats to kill”.

322

Reform of Offences against the Person (2015) Law Com No 361, para 8.18.

323

Epilepsy Society, Zach’s Law: protecting people with epilepsy from online harms (July 2020). See also S Elvin, Trolls attack boy with epilepsy, 8, by sending him hundreds of flashing images (29 May 2020), https://metro.co.uk/2020/05/29/trolls-attack-boy-epilepsy-8-sending-hundreds-flashing-images-12777199/ (last visited 13 July 2021).

324

Epilepsy Society, Consultation Response, p 5.

325

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 4.123 to 131.

326

See for example, Epilepsy, “Symptoms”, NHS (last reviewed 18 September 2020), available at: https://www.nhs.uk/conditions/epilepsy/symptoms/ (last visited 13 July 2021).

327

Reform of Offences Against the Person (2015) Law Com No 361, paras 4.13 to 33.

328

Hacked Off, Consultation Response.

329

Gina Miller, Consultation Response.

330

Media Lawyers’ Association, Consultation Response.

331

News Media Association, DMG Media, Consultation Responses.

332

Criminal Bar Association, Consultation Response.

333

Crown Prosecution Service, Consultation Response.

334

IMPRESS, Consultation Response.

335

Association of Police and Crime Commissioners, Consultation Response.

336

Fix the Glitch, Consultation Response.

337

National Secular Society, Consultation Response.

338

Free Speech Union, Consultation Response.

339

Mermaids, Consultation Response.

340

Crown Prosecution Service, Consultation Response.

341

  Bar Council, Consultation Response.

342

  News Media Association, Consultation Response.

343

Media Lawyers’ Association, Consultation Response.

344

Media Lawyers’ Association, Consultation Response.

345

Community Security Trust, Consultation Response.

346

See also, “News website accused of antisemitism joins press regulator” The JC, 21 March 2021, available at: https://www.thejc.com/news/uk/news-website-accused-of-antisemitism-joins-press-regulator-1.513341 (last visited 13 July 2021).

347

Carnegie UK Trust, Consultation Response.

348

In law, “intention” is not the same as “purpose”. A person can be (but does not have to be) deemed to have intended those outcomes which he or she foresaw as a virtually certain consequence of his or her actions. Purpose, by contrast, is the desired outcome. So a person who blows up an aircraft full of people hoping to collect on the insurance money has, as his or her purpose, an insurance windfall, but clearly intends the death of those he or she kills. See also the discussion of intention in Chapter 2.

349

Online Harms White Paper: Full government response to the consultation, 15 December 2020, “Journalism”, available at: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021).

350

Draft Online Safety Bill, published 12 May 2021, available at: https://www.gov.uk/government/publications/draft-online-safety-bill (last visited 13 July 2021).

351

We note that should a different approach be taken in relation to the Online Harms framework, existing definitions of journalism could be used to provide effective levels of protection, for example the protections afforded when things are done “for the purposes of journalism” in the Investigatory Powers Act 2016, s 264.

352

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.72; Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, para 3.78.

353

Amnesty International, UK: Twitter still failing women over online violence and abuse - new analysis (20 September 2020), available at: https://www.amnesty.org.uk/press-releases/twitter-still-failing-women-over-online-violence-and-abuse-new-analysis (last visited 13 July 2021); see also Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.74.

354

See E Vogels, “The State of Online Harassment” Pew Research Center, 13 January 2021, available at: https://www.pewresearch.org/internet/2021/01/13/the-state-of-online-harassment/ (last visited 13 July 2021): “Overall, men are somewhat more likely than women to say they have experienced any form of harassment online (43% vs. 38%), but similar shares of men and women have faced more severe forms of this kind of abuse. There are also differences across individual types of online harassment in the types of negative incidents they have personally encountered online. Some 35% of men say they have been called an offensive name versus 26% of women and being physically threatened online is more common occurrence for men rather than women (16% vs. 11%). Women, on the other hand, are more likely than men to report having been sexually harassed online (16% vs. 5%) or stalked (13% vs. 9%). Young women are particularly likely to have experienced sexual harassment online. Fully 33% of women under 35 say they have been sexually harassed online, while 11% of men under 35 say the same. Lesbian, gay or bisexual adults are particularly likely to face harassment online. Roughly seven-in-ten have encountered any harassment online and fully 51% have been targeted for more severe forms of online abuse.”

355

Fix the Glitch UK and End Violence Against Women Coalition, “The Ripple Effect: COVID-19 and the Epidemic of Online Abuse”, available at: https://www.endviolenceagainstwomen.org.uk/wp-content/uploads/Glitch-and-EVAW-The-Ripple-Effect-Online-abuse-during-COVID-19-Sept-2020.pdf (last visited 13 July 2021).

356

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 3.36. See also the discussion by Warby J in Hourani v Thompson [2017] EWHC 432 (QB) at [131]-[135], a civil case concerning an orchestrated “campaign” of harassment, where a defendant’s argument that their single instance of conduct was insufficient to amount to a “course of conduct” was rejected. It was held that the conduct of the co-defendants could be attributed to the defendant by virtue of s 7(3A).

357

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 3.63 to 66.

358

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.78 to 84. We note that the reference to the “explicit protection for political speech” is to the “contribution to a matter of public interest” aspect of the without reasonable excuse element of the offence discussed in both the consultation paper and Chapter 2 above.

359

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.86.

360

The following paragraphs (5.21 to 5.29) are only very slightly modified from the Consultation Paper: Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.87 to 99.

361

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, paras 8.46 to 8.47.

362

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.87ff.

363

Crown Prosecution Service, Guidelines on prosecuting cases involving communications sent via social media (last revised 21 August 2018) available at https://www.cps.gov.uk/legal-guidance/social-media-guidelines-prosecuting-cases-involving-communications-sent-social-media (last visited 13 July 2021).

364

An either-way offence is one that can be tried either in the magistrates’ courts or in the Crown Court. If the magistrates decide their sentencing powers are sufficient to deal with the offence, the accused may choose to have it dealt with summarily in the magistrates’ court or on indictment (trial by jury) in the Crown Court.

365

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, paras 8.206 to 8.207.

366

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 22.

367

Magistrates Association, Consultation Response.

368

Alan Turing Institute, Consultation Response.

369

  Community Security Trust, Consultation Response.

370

Refuge, Consultation Response.

371

Fix the Glitch, Consultation Response.

372

Stonewall, Consultation Response.

373

Epilepsy Society, Consultation Response.

374

Suzy Lamplugh Trust, Consultation Response.

375

A Gillespie, Consultation Response.

376

Bar Council of England and Wales, Consultation Response.

377

  Criminal Bar Association, Consultation Response.

378

  Sex Matters, Consultation Response.

379

  SWGfL, Consultation Response.

380

  Adam Smith Institute, Consultation Response.

381

  Criminal Law Solicitors’ Association, Consultation Response.

382

  Fair Cop, Consultation Response.

383

  ARTICLE 19, Consultation Response.

384

  Free Speech Union, Consultation Response.

385

  English PEN, Consultation Response.

386

Crown Prosecution Service, Consultation Response.

387

Carnegie UK Trust, Consultation Response.

388

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.87 to 98.

389

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.87.

390

Protection from Harassment Act 1997, s 7(3A).

391

There is a separate, but related, question as to whether Oscar’s post encouraging the “pile-on” could itself be an instance of conduct for the purpose of the PHA 1997, such that if only one other person responds to Oscar’s call to “teach Ruby a lesson” there would be a “course of conduct”. In our view, even though it does not directly contact Ruby, Oscar’s message could be included in a “course of conduct”. Further, as we set out in the scoping report and consultation paper, and as our consultees pointed out in their responses on this matter, most instances of “pile-on” harassment involve a very large number of communications. To that end, the culpable instances of inciting or encouraging “pile-on” harassment will likely involve many more than two individual communications.

392

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.89.

393

Protection from Harassment Act 1997, s 4.

394

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.90 to 6.98.

395

Crown Prosecution Service, Social Media: Guidelines on prosecuting cases involving communications sent via social media (last revised 21 August 2018), para 15. Available at: https://www.cps.gov.uk/legal-guidance/social-media-guidelines-prosecuting-cases-involving-communications-sent-social-media (last visited 13 July 2021).

396

For example, Criminal Bar Association, Bar Council of England and Wales, Criminal Law Solicitors’ Association, English PEN, Consultation Responses. Further, where a person repeatedly attempts, unsuccessfully, to incite or encourage a “pile-on” against a person, the ordinary provisions of the PHA 1997 may well apply, as those repeated unsuccessful attempts may constitute a “course of conduct” that “amounts to harassment”.

397

Online Harms White Paper: Full UK government response to the consultation, paras 2.19 to 2.27, available at: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021).

398

Online Harms White Paper: Full UK government response to the consultation, paras 2.28 to 2.34, available at: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021).

399

Protection from Harassment Act 1997, s 1.

400

Protection from Harassment Act 1997, s 7.

401

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, para 8.207.

402

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.100 to 6.104.

403

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 23.

404

Fix the Glitch, Consultation Response.

405

K Barker & O Jurasz, Consultation Response.

406

T Keren-Paz, Consultation Response.

407

Criminal Bar Association, Consultation Response.

408

Association of Police and Crime Commissioners, Consultation Response.

409

Magistrates Association, Consultation Response.

410

Alan Turing Institute, Consultation Response.

411

Bar Council of England and Wales, Consultation Response.

412

Crown Prosecution Service, Consultation Response. A similar point was made regarding the investigation of any potential new offence by the Association of Police and Crime Commissioners in their Consultation Response.

413

Demos, Consultation Response.

414

Refuge, Consultation Response.

415

Kingsley Napley, Consultation Response.

416

ARTICLE 19, Consultation Response.

417

Sex Matters, Consultation Response.

418

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.104.

419

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, Chapter 12.

420

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.152.

421

J Ilan, ‘Digital Street Culture Decoded: Why Criminalizing Drill Music Is Street Illiterate and Counterproductive’ (2020) British Journal of Criminology, No 4, pp 994 to 1013.

422

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.153.

423

See eg Criminal Behaviour Orders under the Anti-Social Behaviour, Crime and Policing Act 2014.

424

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.154 to 158.

425

JUSTICE, “Tackling Racial Injustice: Children and the Youth Justice System” (2021), 33-34. Available at: https://justice.org.uk/wp-content/uploads/flipbook/46/book.html (last visited 13 July 2021).

426

JUSTICE, “Tackling Racial Injustice: Children and the Youth Justice System” (2021), 40; citing S Swan, “Drill and rap music on trial” BBC News, 13 January 2021, available at: https://www.bbc.co.uk/news/uk-55617706 (last visited 13 July 2021).

427

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.153; JUSTICE, “Tackling Racial Injustice: Children and the Youth Justice System” (2021), 49.

428

JUSTICE, “Tackling Racial Injustice: Children and the Youth Justice System” (2021), 101; M Keenan, Youth Justice Centre, “JUSTICE report: report finds misunderstanding of Drill music is leading to unfair convictions” 11 March 2021, available at: https://yjlc.uk/resources/legal-updates/justice-report-report-finds-misunderstanding-drill-music-leading-unfair (last visited 13 July 2021).

429

Derek Chauvin, a former police officer, was convicted on 21 April 2021 of charges of second-degree murder, third-degree murder and manslaughter for the death of Mr Floyd, which occurred on 25 May 2020. See BBC News, “George Floyd: Jury finds Derek Chauvin guilty of murder”, available at: https://www.bbc.co.uk/news/world-us-canada-56818766 (last visited 13 July 2021).

430

See, for example, A Hern, “Twitter hides Donald Trump tweet for 'glorifying violence'” The Guardian (29 May 2020), available at: https://www.theguardian.com/technology/2020/may/29/twitter-hides-donald-trump-tweet-glorifying-violence (last visited 13 July 2021).

431

Twitter, “Permanent suspension of @realDonaldTrump” 8 January 2021, available at: https://blog.twitter.com/en_us/topics/company/2020/suspension.html (last visited 13 July 2021).

432

Twitter, “Glorification of violence policy” (March 2019): “We define glorification to include praising, celebrating, or condoning statements, such as “I’m glad this happened”, “This person is my hero”, “I wish more people did things like this”, or “I hope this inspires others to act”.” Available at: https://help.twitter.com/en/rules-and-policies/glorification-of-violence (last visited 13 July 2021).

433

See the discussion in Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, para 12.88 and Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.159 to 6.165.

434

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.161.

435

Terrorism Act 2000, s 1 and s 2.

436

See Reform of Offences Against the Person (2015) Law Com No 361. The government has not yet responded to the report.

437

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.164 to 6.165.

438

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.166 to 174.

439

See, for example, Index on Censorship, ‘A guide to the legal framework impacting on artistic freedom of expression’, available at: https://www.indexoncensorship.org/wp-content/uploads/2015/07/Counter-Terrorism_210715.pdf (last visited 13 July 2021).

440

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 27.

441

Community Security Trust, Antisemitism Policy Trust, Consultation Responses.

442

LGB Alliance, LGBT Fed, Consultation Responses.

443

Fix the Glitch, Consultation Response.

444

National Police Chiefs’ Council, Consultation Response.

445

Kingsley Napley, Consultation Response.

446

J Neller, Consultation Response. We note that such an amendment would likely have wide-ranging effects and be outside the scope of this project.

447

Magistrates Association, Consultation Response.

448

ARTICLE 19, Consultation Response.

449

Criminal Bar Association, Consultation Response.

450

English PEN, Consultation Response.

451

Crown Prosecution Service, Consultation Response.

452

Association of Police and Crime Commissioners, Consultation Response.

453

Bar Council of England and Wales, Consultation Response.

454

Commission for Countering Extremism, “Operating with Impunity: Hateful extremism: the need for a legal framework” (February 2021), available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/963156/CCE_Operating_with_Impunity_Accessible.pdf (last visited 13 July 2021), especially paras 6.1 to 6.39.

455

Contained in our Report: Reform of Offences Against the Person (2015) Law Com No 361.

456

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.173.

457

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.200 to 206.

458

[2018] EWCA Crim 560; [2018] Crim LR 847.

459

For discussion of the medical exception, see P Lewis, ‘The Medical Exception’ (2012) 65 Current Legal Problems 355.

460

See The Guardian, 21 March 2019, available at: https://www.theguardian.com/uk-news/2019/mar/21/tattooist-dr-evil-jailed-for-performing-ear-and-nipple-removals (last visited 13 July 2021).

461

See Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, Chapter 12.

462

Serious Crime Act 2007, s 58(3). “Anticipated or reference offence” means the offence that is encouraged or assisted.

463

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, consultation question 30.

464

Justices’ Legal Advisers and Court Officers’ Service, Consultation Response.

465

ARTICLE 19, Consultation Response.

466

Criminal Bar Association, Consultation Response.

467

See paras 7.17 to 7.19 in Chapter 7 for a discussion of the possible application of s 18 of the Offences Against the Person Act 1861 where a person harms themselves.

468

[2018] EWCA Crim 560; [2018] Crim LR 847.

469

See Chapter 7.

470

See, for example, Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 4.82.

471

C McGlynn and K Johnson, Consultation Response, p 5, citing A Marcotte, A Gesselman, H Fisher & J Garcia, “Women’s and Men’s Reactions to Receiving Unsolicited Genital Images from Men” (2020) The Journal of Sex Research; J Boulos “Cyber Flashing: ‘I froze when penis picture dropped on to my phone’” BBC News 26 April 2019, available at: https://www.bbc.co.uk/news/uk-48054893 (last visited 13 July 2021); A Gillespie, “Tackling Voyeurism: Is The Voyeurism (Offences) Act 2019 A Wasted Opportunity?” (2019) 82 Modern Law Review 1107.

472

The Association of Police and Crime Commissioners, Consultation Response.

473

S Gallagher, Consultation Response.

474

https://angelou-centre.org.uk/?page_id=92 (last visited 13 July 2021).

475

The Angelou Centre, Consultation Response.

476

National Police Chiefs’ Council, Consultation Response.

477

Suzy Lamplugh Trust, Consultation Response.

478

S Gallagher, Consultation Response.

479

C McGlynn and K Johnson, Consultation Response, p 6.

480

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 6.106.

481

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 4.82.

482

National Police Chiefs’ Council, Consultation Response.

483

Fix the Glitch, Consultation Response.

484

C McGlynn and K Johnson, Consultation Response, p 3, citing J Ringrose, “Is there hidden sexual abuse going on in your school?” TES, 29 October 2020. Available at: https://www.tes.com/news/there-hidden-sexual-abuse-going-your-school (last visited 13 July 2021).

485

C McGlynn and K Johnson, Consultation Response, p 3.

486

See, for example, Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 3.156.

487

See Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, paras 3.9 to 3.12.

488

Crown Prosecution Service, Consultation Response.

489

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, consultation question 24.

490

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, paras 6.124 to 6.125. It is worth noting that an offence does not have to be contained in the Sexual Offences Act 2003 itself in order to be included in Schedules 3 and 5.

491

R v Alderton [2014] EWCA Crim 2204.

492

Bar Council of England & Wales, Consultation Response.

493

Crown Prosecution Service, Consultation Response.

494

Association of Police and Crime Commissioners, Consultation Response.

495

Angelou Centre, Consultation Response.

496

It is important to clarify that it is not up to the victim whether a Sexual Harm Prevention Order is made.

497

Suzy Lamplugh Trust, Consultation Response.

498

Magistrates Association, Consultation Response.

499

C McGlynn and K Johnson, Consultation Response, p 7.

500

End Violence Against Women Coalition, Consultation Response.

501

Big Brother Watch, Consultation Response.

502

Free Speech Union, Consultation Response.

503

Sexual Offences Act 2003, s 3.

504

As we set out in Intimate Image Abuse: A consultation paper (2021) Law Commission Consultation Paper No 253, paras 6.5 to 6.12, other similar offences capture photographs and video. Further, our consultation questions all specifically addressed images and video recordings. While we may refer at times simply to an “image”, this is simply for the sake of brevity, and is not intended to reflect a narrower scope.

505

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, para 1.21.

506

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, consultation question 25.

507

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, paras 6.139 to 6.140.

508

Refuge, Consultation Response.

509

A Gillespie, Consultation Response.

510

Association of Police and Crime Commissioners, Consultation Response.

511

Magistrates Association, Consultation Response.

512

L Thompson, Consultation Response.

513

End Violence Against Women Coalition, Consultation Response.

514

C McGlynn and K Johnson, Consultation Response, p 9.

515

S Gallagher, Consultation Response.

516

T Keren-Paz, Consultation Response.

517

Crown Prosecution Service, Consultation Response.

518

Fix the Glitch, Consultation Response.

519

S Thomas, Consultation Response.

520

See, for example, the Office for National Statistics’ Crime Survey for England and Wales, “Sexual Offences in England and Wales: year ending March 2017”, available at: https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/articles/sexualoffencesinenglandandwales/yearendingmarch2017 (last visited 13 July 2021).

521

Hansard (HL), 19 May 2003, vol 648, col 556.

522

It is for this reason that we do not take forward the suggestion made by Professor Keren-Paz at 6.68, above, to limit the offence to images of the sender or images that the recipient reasonably believed were of the sender.

523

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, paras 6.146 to 6.148.

524

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, consultation question 26(1).

525

#NotYourPorn, Consultation Response.

526

C McGlynn and K Johnson, Consultation Response, p 8.

527

Angelou Centre, Consultation Response.

528

End Violence Against Women Coalition, Consultation Response.

529

Crown Prosecution Service, Consultation Response.

530

Association of Police and Crime Commissioners, Consultation Response.

531

  L Thompson, Consultation Response.

532

  Criminal Bar Association, Consultation Response.

533

Bar Council of England and Wales, Consultation Response.

534

  A Gillespie, Consultation Response.

535

  Fair Cop, Consultation Response.

536

  Kingsley Napley, Consultation Response.

537

  Kingsley Napley, Consultation Response.

538

  Kingsley Napley, Consultation Response.

539

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, consultation question 26(2).

540

Justices’ Legal Advisers and Court Officers’ Service, Consultation Response.

541

A Gillespie, Consultation Response.

542

Angelou Centre, Consultation Response.

543

Association of Police and Crime Commissioners, Consultation Response.

544

Magistrates Association, Consultation Response.

545

Fair Cop, Consultation Response.

546

Criminal Law Solicitors’ Association, Consultation Response.

547

Criminal Bar Association, Consultation Response.

548

G Miller, Consultation Response.

549

Intimate Image Abuse (2021) Law Commission Consultation Paper No 253, para 8.20.

550

We note that criminal offences targeting defendants acting for this purpose already exist. For example, sections 12, 33 and 67 of the Sexual Offences Act 2003.

551

R v G [2003] UKHL 50. See also the discussions of recklessness in our reports, Criminal Law: A Criminal Code for England and Wales (1989) Law Com No 177, vol 1, and Legislating the Criminal Code: Offences Against the Person and General Principles (1993) Law Com No 218.

552

For example, campaigns and resources directed toward better education on consent: Personal, Social Health and Economic Association, Guidance on teaching about consent: https://www.pshe-association.org.uk/curriculum-and-resources/resources/guidance-teaching-about-consent-pshe-education-key; Schools Consent Project https://www.schoolsconsentproject.com/; Department for Education Guidance on Relationships and sex education (RSE) and health education, last updated 9 July 2020, available at: https://www.gov.uk/government/publications/relationships-education-relationships-and-sex-education-rse-and-health-education (links last visited 13 July 2021).

553

Crown Prosecution Service, Youth Offenders guidance, “Section 1 of the Voyeurism (Offences) Act 2019 -‘upskirting’” 28 April 2020, available at: https://www.cps.gov.uk/legal-guidance/youth-offenders (last visited 13 July 2021).

554

The threshold for exposure (for adult offenders) is met when either the victim was under 18, or the offender has been sentenced to a term of imprisonment, detained in a hospital, or made the subject of a community sentence of at least 12 months. Sexual Offences Act 2003, Schedule 3, paragraph 33.

555

Sexual Offences Act 2003, Schedule 3, paragraph 34A.

556

For example, various provisions of the Youth Justice and Criminal Evidence Act 1999 provide for: giving evidence behind a screen to shield the witness from the defendant (s 23), giving evidence via live link to allow evidence to be given from outside the courtroom (s 24), giving evidence in private (s 25), removal of wigs and gowns (s 26), and video-recorded evidence in chief (s 22A).

557

Intimate Image Abuse: A consultation paper (2021) Law Com Consultation Paper No 253, paras 14.68 to 14.84.

558

Report of the Advisory Group on the Law of Rape (1975) Cmnd 6352.

559

For example, s 122A of the Antisocial Behaviour, Crime and Policing Act 2014 (for victims of forced marriages), and s 4A of the Female Genital Mutilation Act 2003 (for victims of female genital mutilation).

560

Sexual Offences (Amendment) Act 1992, s 2(1)(da).

561

Youth Justice and Criminal Evidence Act 1999, s 17.

562

Youth Justice and Criminal Evidence Act 1999, sections 34 & 35 (prohibiting self-represented defendants in trials of sexual offences cross-examining complainants and child complainants and child witnesses); s 41 (restrictions on adducing evidence of or questioning the complainant about their sexual behaviour).

563

Youth Justice and Criminal Evidence Act 1999, s 62.

564

We note that the offence of suicide was abolished by the Suicide Act 1961.

565

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381, para 12.91.

566

See, for example, N Shanahan and others, ‘Self-harm and social media: thematic analysis of images posted on three social media sites’ (2019) 9(2) BMJ Open, available at: https://bmjopen.bmj.com/content/9/2/e027006#ref-12 (last visited 13 July 2021). See also R Brown and others, ‘#cutting: non-suicidal self-injury (NSSI) on Instagram’ (2018) 48(2) Psychological Medicine 337.

567

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.178.

568

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.178.

569

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.180 to 6.181.

570

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.182.

571

  [2019] 1 Cr App R (S) 46.

572

Government Serious and Organised Crime Strategy Paper, p 27, available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/752850/SOC-2018-web.pdf (last visited 13 July 2021). Whether this conduct provides a “paradigm” example of encouragement of self-harm is potentially open to debate. However, it demonstrates the potential for malign actors to manipulate people who are at risk, and that self-harm content is not solely created and shared by people who are themselves vulnerable.

573

A charity “dedicated to empowering the safe and secure use of technology” whose services include the support helplines Report Harmful Content and the Revenge Porn Helpline: https://swgfl.org.uk/about/ (last visited 13 July 2021).

574

Abusive and Offensive Online Communications: A Scoping Report (2018) Law Com No 381.

575

  Inchoate Liability for Assisting and Encouraging Crime (2006) Law Com No 300, para 1.9.

576

  Inchoate Liability for Assisting and Encouraging Crime (2006) Law Com No 300, para 1.9.

577

Serious Crime Act 2007 section 44 “intentionally encouraging or assisting an offence”, section 45 “encouraging or assisting an offence, believing it will be committed”, and section 46 “encouraging or assisting offences, believing one or more will be committed”.

578

  Blackstone’s Criminal Practice 2021 Chapter A5 Inchoate Offences A5.4.

579

  Blackstone’s Criminal Practice 2021 Chapter A5 Inchoate Offences A5.6.

580

  Blackstone’s Criminal Practice 2021 Chapter A5 Inchoate Offences A5.32.

581

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.185.

582

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.185.

583

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.185 to 6.189.

584

See our separate report, Reform of Offences against the Person (2015) Law Com No 361.

585

Reform of Offences against the Person (2014) Law Commission Consultation Paper 217, para 2.122.

586

W Blackstone, Commentaries on the Laws of England (1st ed 1765-1769) vol 4, p 205.

587

Smith, Hogan & Ormerod’s Criminal Law (15th ed) Chapter 11 “Inchoate Crime” 11.4.4, 481 (citations in original).

588

[1998] 1 Cr App R (S) 273, [1997] Crim LR 897, CA (a statutory offence of incitement, contrary to the Misuse of Drugs Act 1971, s 19).

589

Invicta Plastics Ltd [1976] RTR 251, [1976] Crim LR 131, DC (indication that ‘Radatex’ may be used to detect police radar traps was incitement to an offence under s 1(1) of the Wireless Telegraphy Act 1949; note that the licensing requirement was removed by SI 1989/123, and that no offence of ‘obtaining information’ is committed by the user of the apparatus: Knightsbridge Crown Court, ex parte Foot [1999] RTR 21). Cf the reports in The Times, 6 August 1998, that a student was convicted of ‘inciting speeding offences’ by the sale of a ‘speed trap jammer’. In Parr-Moore [2002] EWCA Crim 1907, [2003] 1 Cr App R (S) 425, the court described the appellants’ publication of a disclaimer on such a device as serving only to illustrate their realisation that the trade was illegal, at [3].

590

James and Ashford (1985) 82 Cr App R 226 at 232, [1986] Crim LR 118, CA, distinguishing Invicta Plastics Ltd. See also Maxwell-King [2001] 2 Cr App R (S) 28, (2001) The Times, 2 January, CA: incitement to commit an offence contrary to the Computer Misuse Act 1990, s 3, by supply of a device to allow unauthorised access to satellite TV channels.

591

Blackstone’s Criminal Practice 2021 Chapter B1 Homicide and Related Offences, B1.145.

592

See s 2(1)(b) of the Suicide Act 1961, extracted above. The previous formulation is available at: https://www.legislation.gov.uk/ukpga/Eliz2/9-10/60/1991-02-01 (last visited 13 July 2021) and is referred to in A-G v Able [1984] 1 QB 795.

593

S [2005] EWCA Crim 819.

594

SWGfL & A Phippen, Consultation Response.

595

Facebook, Consultation Response.

596

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, consultation question 28.

597

Samaritans, Consultation Response.

598

Refuge, Consultation Response.

599

Association of Police and Crime Commissioners, Consultation Response.

600

Magistrates Association, Consultation Response.

601

Big Brother Watch, Consultation Response.

602

For more information on the “public interest test” applied by prosecutors, see: Crown Prosecution Service, Suicide: Policy for Prosecutors in Respect of Cases of Encouraging or Assisting Suicide, February 2010, updated October 2014, available at: https://www.cps.gov.uk/legal-guidance/suicide-policy-prosecutors-respect-cases-encouraging-or-assisting-suicide (last visited 13 July 2021).

603

Criminal Bar Association, Consultation Response.

604

Confusingly, the term “reasonable excuse” is frequently used to refer to justifications or permissions. Ashworth and Horder, Principles of Criminal Law (7th ed 2013) p 116.

605

 [2018] EWCA Crim 1803.

606

This is considered in some detail in the Full government response to the consultation on the Online Harms White Paper, for example at paras 2.28 to 2.45, available at: https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response (last visited 13 July 2021).

607

Harmful Online Communications: A Consultation Paper (2020) Law Commission Consultation Paper No 248, consultation question 29.

608

Samaritans, Consultation Response.

609

Her Majesty’s Government, Consultation Response.

610

Criminal Bar Association, Consultation Response.

611

Association of Police and Crime Commissioners, Consultation Response.

612

  Law Society of England and Wales, Consultation Response.

613

  Bar Council of England and Wales, Consultation Response.

614

ARTICLE 19, Consultation Response.

615

Magistrates Association, Consultation Response.

616

See Chapter 2.

617

R v Natasha Gordon [2018] EWCA Crim 1803.

618

  SWGfL’s general response contained similar observations about the risks that attend “digital ghost stories”.

619

Samaritans, confidential “accompanying note” and “Rapid Review of Evidence”. These were provided confidentially and any inclusion in a final published paper or report would be subject to Samaritans’ review and agreement.

620

Director of Public Prosecutions, “Suicide: Policy for Prosecutors in Respect of Cases of Encouraging or Assisting Suicide” February 2010, updated October 2014. Available at: https://www.cps.gov.uk/legal-guidance/suicide-policy-prosecutors-respect-cases-encouraging-or-assisting-suicide (last visited 13 July 2021).

621

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.195 to 6.198.

622

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, paras 6.195 to 6.198. We note that Brazil recently extended its incitement to suicide offence to criminalise “inducement or instigation” of self-harm: https://www.loc.gov/law/foreign-news/article/brazil-new-law-criminalizes-instigation-to-self-harm/ (last visited 13 July 2021).

623

For a detailed overview of the Act, and our own proposals for reform, see Reform of Offences Against the Person (2015) Law Com Report No 361.

624

Miller [1954] 2 QB 282, [1954] 2 All ER 529.

625

Reform of Offences Against the Person (2015) Law Com Report No 361, para 2.19.

626

See R v Brown [1993] UKHL 19, [1994] 1 AC 212.

627

See also BBC News, “’Concerning’ rise in pre-teens self-injuring” (16 February 2021), available at: https://www.bbc.co.uk/news/uk-55730999 (last visited 13 July 2021).

628

Discussed at length in Re A (Children) (Conjoined Twins: Medical Treatment) (No 1) [2001] Fam 147, and in our separate Report: Defences of General Application (1977) Law Com Report No 83.

629

Harmful Online Communications: The Criminal Offences (2020) Law Commission Consultation Paper No 248, para 6.197.

630

For example, the Consultation Responses of Samaritans, SWGfL, B Thain, C Wilton-King, and the Association of Police and Crime Commissioners all referred to the importance of accessible mental health resources.

631

Consultation Responses that noted this included those from the Crown Prosecution Service, A Gillespie, Association of Police and Crime Commissioners, National Police Chiefs’ Council, Bar Council of England and Wales, Criminal Bar Association.

632

Director of Public Prosecutions, “Suicide: Policy for Prosecutors in Respect of Cases of Encouraging or Assisting Suicide” February 2010, updated October 2014. Available at: https://www.cps.gov.uk/legal-guidance/suicide-policy-prosecutors-respect-cases-encouraging-or-assisting-suicide (last visited 13 July 2021).

633

Director of Public Prosecutions, “Suicide: Policy for Prosecutors in Respect of Cases of Encouraging or Assisting Suicide” February 2010, updated October 2014, “Public interest factors tending against prosecution”. Available at: https://www.cps.gov.uk/legal-guidance/suicide-policy-prosecutors-respect-cases-encouraging-or-assisting-suicide (last visited 13 July 2021).

634

As for the Suicide Act 1961, we would not recommend that this must be the personal consent of the Director of Public Prosecutions. See Crown Prosecution Service, Consents to Prosecute. 31 October 2018, updated 11 December 2018. Available at: https://www.cps.gov.uk/legal-guidance/consents-prosecute (last visited 13 July 2021).

