Efforts in Inefficacy: Reflections on the Digital Economy Act and the looming spectre of the Online Harms Bill

The current landscape of pornography legislation in the UK is an ever-shifting entity, struggling both to accurately represent the plethora of sexualities that exist in our society and to effectively and fairly govern the ways in which these sexualities are portrayed. Since the advent of the internet, a flourishing and diverse online pornography industry has offered the public a wealth of portrayals and insights. This has aided individuals in accessing a range of pornography and alternative forms of sexual expression.

However, this has not been without its complications. Under the banner of pornography, several sites have shared content depicting ‘scenes of gore, violence, rape, snuff, hanging, dead bodies, genital mutilation, aborted babies, suicides, necrophilia, dead celebrities, murder victims’ (Presdee, 2000: 84), and other content which is far removed from the playful and exploratory transgressions of the adult film industry and the range of sexual expression present currently.

The distribution of this sort of material has set the government on a crusade to regulate online pornography in the interests of protecting the public from harm, positioning some acts (such as sadomasochistic and BDSM play) as obscene stepping stones towards the aforementioned snuff material. Yet real-life depictions of torture and violence can be viewed freely (for example, the photographs of the torture and humiliation experienced by prisoners at Abu Ghraib prison) under the guise of educating the public about world events (Attwood and Smith, 2010: 181-182). Pornography is not afforded the same freedoms, as it is made for consumption within a sexual context.

Following this line of thought, the government models its interventions on the idea that governing pornography is justified to protect the public from harm and to minimise the perceived contribution of pornography to ‘social problems of sexual dysfunction, the continually rising rates of sexually transmitted infections…and the annually rising sexual crime rate’ (Mediawatch UK, in McGlynn and Ward, 2009: 347). Where children are concerned, pornography is often blamed for corroding innocence and teaching unrealistic depictions of sex, whilst it is associated with trafficking and even internalised patriarchy when considering the positionality of women watching or performing in pornography. Whilst it is broadly agreed that pornography should be regulated to some extent (to address issues of consent and to protect vulnerable groups), the government’s interventions can be seen as misdirected and unjustified in most cases.

It has been argued that pornography legislation is not sufficiently informed, as pornography producers/performers and the wider general public are often disregarded in the drafting of such policy and excluded from conversations about new legislation and its implementation. As well as the most knowledgeable voices being silenced, pornography itself is notoriously difficult to define, as definitions are highly dependent on personal sexual preferences and context. What is considered pornographic by one group may in fact constitute the norm for another.

This article will explore the implications for wider society of prior and current pornography and obscenity legislation in the UK, with a particular focus on the Digital Economy Act (DEA) (2017) and the proposed Online Harms Bill (OHB). Perhaps the most archaic of these laws was the Obscene Publications Act (OPA) of 1959, which set out a duty to ‘penalise purveyors of obscene material’, deeming the publication of an article considered to be obscene a criminal offence (Crown Prosecution Service [CPS], 2018: OPA, 1959). Acts commonly prosecuted under this legislation included ‘sexual act with an animal…torture with instruments…dismemberment or graphic mutilation’ (CPS, 2018: OPA, 1959), all of which are widely considered abhorrent and prosecutable.

However, regardless of the consent of the participants, the OPA repeatedly allowed for the prosecution of acts associated with sadomasochistic play, bondage (particularly where gags were incorporated) and material caught by the subjective clause of ‘activities involving perversion or degradation’ (CPS, 2018: OPA, 1959). Examples include R v Brown [1993], which saw the prosecution of a group of gay men for filming sadomasochistic acts conducted in private, and R v Peacock [2012], in which the defendant was tried for distributing gay pornography with elements of BDSM. This illustrates the pervasiveness of the OPA and the instrumental use of its vague guidelines to set and ‘determine standards of sexual decency, morality and taste’ (Attwood and Walters, 2013: 976).

Following arguments that the OPA was outdated, and a decade of campaigning efforts, the Crown Prosecution Service agreed to remove all specific examples from its guidance and replace them with tests which aim to determine whether content can be deemed obscene (BBC News, 2019 [1]). Under these new guidelines, it is unlikely that material will be prosecuted if it features consenting adults (where consent is made clear), no serious harm is caused, and the likely audience is not under 18 (BBC News, 2019 [1]).

The Criminal Justice and Immigration Act (CJIA, 2008) built upon the foundations laid by the OPA, setting out the parameters of what constitutes extreme pornography. Section 63 of the act states that for an image to be considered extreme pornography, it must be seen as ‘grossly offensive, disgusting and otherwise of an obscene character’ (Legislation.Gov.UK: CJIA, 2008, part 5, s63: 7) in the opinion of the majority of the public. Images are considered extreme pornography if they portray non-consensual penetration of an individual’s orifices, an act ‘which threatens a person’s life, an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals’ (Legislation.Gov.UK: CJIA, 2008, part 5, s63: 7), or depictions of necrophilia and zoophilia. The extreme pornography guidelines set out in the CJIA (2008) are still in effect, despite numerous campaigns to have them amended.

The most recent precursor to the DEA was the Audiovisual Media Services Regulations (AVMS) (2014), which imposed restrictions on the creation, production and distribution of pornography in the UK. These regulations did not set out explicit guidelines on what could and could not be portrayed; rather, pornography had to adhere to R18 video standards enforced by the regulator ATVOD in partnership with Ofcom, and follow the same guidelines set out for DVD pornography by the BBFC. These requirements meant that acts including impact play (spanking, caning and whipping), physical restraint, humiliation, urination and female ejaculation could no longer be depicted (The Independent, 2014). The regulations also banned fisting, face-sitting and breath play, dubbing them life-threatening. This created a confusing parallel: acts excluded from film on the grounds of danger were acts that individuals were lawfully able to perform in their own sexual encounters.

The DEA was passed into law in April 2017, after the Department for Digital, Culture, Media and Sport (DCMS) had ‘opted to forgo much of the usual “ping-pong” process of debating amendments and wording in order to get the thing passed’ before progress was slowed by the snap election (Rigg, 2017). Part 3 of the act ‘requires that companies delivering adult content in the UK act responsibly by having robust age verification controls in place to prevent children accessing explicit material’ (Gov.UK, 2017). Each individual visiting a site hosting adult content would be prompted to make an account, using their credit card, mobile phone details, passport or driving licence to verify their age (IS Preview, 2018).

Rather than developing software to meet its aims, the UK government entrusted this task to the age verification (AV) software market on a competitive basis. The main contender was the AgeID software produced by the giant pornography conglomerate MindGeek (encompassing YouPorn, PornHub, RedTube, Tube8, Spankwire and Brazzers, among other sites). AgeID proposed to offer an aggregated service, allowing customers to instantly pass the age verification process across the MindGeek network. In conjunction with this, the government appointed the BBFC as the regulator of adult film content, allowing it to decide what constituted extreme pornographic material, to instruct internet service providers to prevent customers from accessing sites hosting non-compliant content, and to issue fines of up to £250,000 (The Independent, 2018) where organisations did not adhere to the guidelines.

An array of predictions about the inefficacy of the Digital Economy Act was put forward. Firstly, it was suspected that overzealous regulation would push the pornography industry underground, resulting in environments where performers would be exploited, underpaid and abused. As well as this, it was noted that if consumers could not view the material they desired via mainstream channels, they would turn to VPN software or even the dark web to circumvent the extreme pornography guidelines set out by the BBFC and the age verification software demanding their personal data. This could increase the likelihood of people viewing material on the extreme pornography prohibited list, and of vulnerable people being groomed.

Secondly, content creators and producers of pornography in the UK were likely either to be shut down for failing to restrict access to their content through age verification, or to be priced out of their livelihoods by the proposed age verification systems. Smaller and independent producers were unlikely to be able to build age verification software of their own, and could not afford MindGeek’s AgeID at a rate of £300 per day (roughly £9,000 per month), whilst most in this position have an income of approximately £1,000 per month (Brown, 2017).

Additionally, if the types of content most profitable to produce and distribute became illegal, performers and producers would have to relocate to a less restrictive jurisdiction or shut down entirely. This gap in the market, together with MindGeek’s affiliation with the UK government, raised the further prospect of MindGeek monopolising the market and facilitating a homogenisation of pornography that would exclude several forms of sexual expression.

Finally, the most pressing concern surrounding the potential introduction of age verification software was data security. As age verification systems require the individual to enter their ‘name, postal address, mobile phone numbers and demographic information’ (Sky News, 2018), as well as collecting information about their online activity, including what types of porn they watch, these companies stand to obtain masses of sensitive data on their consumers. Warehousing such data poses great threats to privacy and an increased likelihood of breaches and leaks, sometimes with fatal consequences (as seen during the Landslide Productions investigation of 1999 and the Ashley Madison data leak of 2015).

Historically, MindGeek had struggled to protect user data, with the websites in its network subject to periodic hacks. In 2012, YouPorn customers faced a leak of their addresses and passwords, due to a coding error of which MindGeek had been aware for some time (ORG, 2018). In 2015, RedTube, PornHub and YouPorn users were subjected to mass malware advertising hacks (ORG, 2018). In 2016, 800,000 paying users were left vulnerable to online extortion after a cyber-attack against MindGeek went undetected for three years (ORG, 2018). In 2017, millions of PornHub users across the globe had the security of their data compromised after the website experienced a year-long malware advertising campaign (ORG, 2018).

The most ungraceful deflation of momentum behind the Digital Economy Act (2017) came when the government realised that it would not, in fact, be able to implement the unworkable legislation it had hastily passed. This realisation was gradual, it seems, with the date of implementation being pushed further into the future without convincing explanation or rationale.

One of the key issues in implementing the legislation was that the pornography industry has always managed to adjust to the strict impositions placed upon it. This has become particularly evident with the emergence of webcam pornography, for which all a performer needs ‘is a laptop, a web camera, a platform and a bank account’ (Klein, 2016: 222-223). Such performers are also able to bypass regulations and legislation by posting their content on sites based outside the UK.

Another point of concern was that previous attempts at blocking sites containing adult material have resulted in ‘tens of thousands of websites being blocked, despite their content being perfectly legal’ (New Statesman, 2017). Although websites hosting pornography can be blocked, huge swathes of the internet host adult content which would be difficult to regulate. Sites such as Reddit, 4chan, Twitter and Tumblr have come under attack due to their mix of content, diminishing creative licence and freedom of expression by policing what can and cannot be posted.

As previously mentioned, the age verification measures could easily be circumvented using VPNs, Tor and other onion browsers, or even the dark web, rendering individuals and the content they view untraceable in spaces already devoid of monitoring and effective regulation. This increases the risk of children and other vulnerable groups being subjected to grooming, bullying, abuse and, more recently, radicalisation (The Telegraph, 2018).

With these concerns in mind, the proposed efficacy of the legislation started to unravel. The first sign that the age verification measures would be curtailed at the last hurdle came two weeks before their proposed implementation across the spectrum of adult entertainment websites in April 2018, with the government sheepishly appending a statement to the end of an entirely unrelated press release on the introduction of 5G networks in the UK (The Independent, 2018). This delay was attributed to allowing the BBFC more time to best achieve its aim of making the internet safer for children. A second delay, in July 2019, was attributed to a failure to comply with European law (The Guardian, 2019). These delays hinted at deeper problems, culminating in the DCMS stating in October 2019 that it would not be going ahead with the age verification clause of the Digital Economy Act (BBC News, 2019 [2]). This statement came as the department was forced to acknowledge the aforementioned risks, which had been raised since the conception of the bill. Rather than abandoning the idea altogether, the DCMS stated that the objectives of the Digital Economy Act would be more effectively delivered via the Online Harms Regulations (Sex Tech Guide, 2019).

The Online Harms Bill (OHB) is currently a government white paper which outlines measures to ‘make the UK the safest place in the world to be online’ (Info Security Group, 2020). The paper puts forward plans to move away from the age of self-regulation on the internet and towards ‘a new system of accountability’ for the content we post online (GOV.UK: Online Harms White Paper, 2020).

The Online Harms Bill aims to resolve a range of issues, including the sharing of terrorist content and content considered to incite violent crime, online abuse and cyberbullying, the sale of opioids online, online child exploitation/abuse and the underage sharing of sexual imagery. The bill’s proposals even extend to content illegally uploaded from prisons (as it undermines public confidence in the prison service) (GOV.UK: Online Harms White Paper, 2020).

The belief is that all of these issues can be resolved by imposing ‘a statutory duty of care on online services to protect users from “online Harms”’ (Stewarts Law, 2020). This duty would apply to services where users can interact with each other and which facilitate ‘user-generated content’ (Stewarts Law, 2020).

Most worryingly, the bill’s proposals include the aim of tackling ‘online disinformation’ (GOV.UK: Online Harms White Paper, 2020). In the age of “fake news”, this may seem like a step in the right direction. However, the bill frames this as a response to ‘a real danger that hostile actors use online disinformation’ (GOV.UK: Online Harms White Paper, 2020), further stating a desire to quash the ‘echo chambers’ and ‘filter bubbles’ which it claims have led users to ‘perceive a story to be far more widely believed than it really is’ (GOV.UK: Online Harms White Paper, 2020). This is considered part of a wider strategy to ‘develop norms and rules for the internet’ (GOV.UK: Online Harms White Paper, 2020).

Despite this, the paper claims that the ‘UK is committed to a free, open and secure internet, and will continue to protect freedom of expression online’ (GOV.UK: Online Harms White Paper, 2020). The proposed new regulatory framework has promised to ‘ensure the safety of users while protecting freedom of expression, especially in the context of harmful content or activity that may not cross the criminal threshold but can be particularly damaging’ to vulnerable groups (GOV.UK: Online Harms White Paper, 2020).

Existing measures to challenge online harms, such as the Global Internet Forum to Counter Terrorism (which aims to reduce the availability of terrorist propaganda online) and Project Arachnid (which trawls the web to identify webpages hosting suspected child sexual abuse material) (GOV.UK: Online Harms White Paper, 2020), show that such interventions are an afterthought in the process of protecting vulnerable people online. There would be less need for these measures if the government placed more emphasis on educating children and their parents on how to stay safe online. The government taking this matter into its own hands, whilst a seemingly noble cause, takes away the responsibility of a parent or guardian to monitor their child’s internet usage and to remain accountable for harm that comes to the child through the guardian’s lack of attention.

The stated rationale for the Online Harms Bill is ‘a programme of action to tackle content or activity that harms individual users, particularly children, or threatens our way of life in the UK, either by undermining national security, or by undermining our shared rights, responsibilities and opportunities to foster integration’ (GOV.UK: Online Harms White Paper, 2020). It is apparent that, much like the rationales presented for the DEA, the cornerstone of the bill is the protection of children. However, there are also echoes of the morally paternalistic governing of expression and the morality politics that were present in the formation of the DEA.

The Online Harms White Paper outlines that use of the internet ‘can be a hugely positive experience for children and young people’, citing web usage as ‘essential for their children’s learning development’ (GOV.UK: Online Harms White Paper, 2020), but is quick to stress that the internet is a place which can be used to ‘spread terrorist and other illegal or harmful content, undermine civil discourse, and abuse or bully other people’ (GOV.UK: Online Harms White Paper, 2020). According to the Chief Executive of the NSPCC, ‘lockdown has created the perfect storm for online abuse. We don’t yet know the true scale, but we do know young people spent longer on platforms with fewer moderators’ (The Telegraph, 2020).

The white paper asserts that the UK government ‘has an established reputation for global leadership in advancing shared efforts to improve online safety’ and is ‘a world-leader in emerging technologies and innovative regulation’, with the objectives of ‘protecting personal data…and promoting responsible digital design’ (GOV.UK: Online Harms White Paper, 2020). As set out previously in this article, the fate of the DEA, from conception to failed implementation, does not align with these claims. The proposal further states that the measures outlined in the white paper are ‘novel and ambitious’, when in fact the DCMS had admitted that where the DEA had failed, the OHB would pick up the slack. As a result, many of the same issues are presenting themselves as the OHB is developed.

As with the Digital Economy Act, the biggest threat posed by the Online Harms Bill is the threat to freedom of expression. The Open Rights Group commented that the ‘new internet regulations targeting “harmful content” risk curtailing free expression’ (ORG, 2019). The white paper states that our ‘reputation and influence across the globe is founded upon our values and principles’, following this with the caveat that our democratic freedoms and global status rest upon the public’s ability to ‘peacefully contribute to public discourse’ (GOV.UK: Online Harms White Paper, 2020), thus implying that the general public should contribute to government-authorised discourses rather than challenge them or create alternatives.

Possibly the most troubling aspect of the OHB is the appointment of Ofcom as the “independent” regulator, overseeing the issuing of sanctions to companies and content producers deemed to be offending. Ofcom cannot be considered an independent body fit for this job, as its compliance with government ideals has been demonstrated historically, most notably in the censorship of expression demanded by the AVMS regulations in 2014. This is further evidenced by the white paper’s statement that any codes of practice to tackle the online harms set out in the bill ‘must be signed off by the Home Secretary’ (GOV.UK: Online Harms White Paper, 2020).

As well as the appointed regulator being far from independent, it has been promised a series of destructive powers to ensure that its aims are met. These are reported to include ‘the power to issue fines, impose liability on senior managers, mount raids and seize materials, and order ISPs to block access’ (Stewarts Law, 2020). The additional power to demand ‘annual transparency reports from companies…outlining the prevalence of harmful content on their platforms’ has been stated as part of Ofcom’s remit, as well as insight into companies’ inner workings, including the ‘algorithms [used] in selecting content for users’ (GOV.UK: Online Harms White Paper, 2020).

Disquietingly, for a piece of legislation which aims to tackle online harms, some surprising online issues are to be excluded from its scope. The white paper states that ‘all harms suffered by individuals on the dark web rather than the open internet’ will be excluded from the remit of the legislation (GOV.UK: Online Harms White Paper, 2020). This is an oversight of gargantuan proportions, as we know that the dark web facilitates much of the harm which the legislation claims to be against.

A prickly issue arises when we consider private communication channels. Whilst it is essential that our private interactions are kept away from the public gaze in the interests of freedom of expression, events in recent years (such as the radicalisation and incitement of terrorist activity via WhatsApp) demonstrate that the legislation should take into account the capacity of private communication channels to cause harm. Despite this, the proposal states that ‘any requirements to scan or monitor content for tightly defined categories of illegal content will not apply to private channels’ (GOV.UK: Online Harms White Paper, 2020).

There is also little consideration of the buffers which should be in place to protect against the fallout of the legislation’s proposed implementation. Harms to organisations will not be included, particularly in cases of intellectual property violations and unlawful cessation of their services by government-appointed powers, as these are considered less important than ‘harms suffered by individuals’ (GOV.UK: Online Harms White Paper, 2020). Additionally, harms ‘suffered by individuals that result directly from a breach of data protection legislation’ are to be excluded (GOV.UK: Online Harms White Paper, 2020). Under this clause, the paper lists that people will not be protected from the ‘distress arising from intrusion…unfair processing, and any financial losses…a breach of cyber security or hacking’ (GOV.UK: Online Harms White Paper, 2020).

The progress of the bill has been stunted by the emergence of COVID-19 and its associated issues, with the bill remaining at white paper stage (BBC News, 2020). This delay persists despite a paving bill, introduced in January 2020 out of frustration at the slow progress, requiring the DCMS and Ofcom to publish a draft bill by January 2021 (Stewarts Law, 2020). On 13 May 2020, the head of the DCMS promised to issue a draft of the bill in the autumn of 2020 (Stewarts Law, 2020); however, the DCMS has since admitted that it ‘could not commit to bringing a draft bill to parliament until the end of 2021’ (Info Security Group, 2020). This has raised concerns, with the chair of the House of Lords Democracy and Digital Technologies Committee, Lord Puttnam, predicting that the bill may not be implemented until 2024. Lord Puttnam highlighted that, from the conception of the bill, this would amount to two lifetimes in the technology world (BBC News, 2020). Such a delay would render the bill obsolete and powerless to effectively challenge the rapidly changing landscape of the online world.

From this, we can conclude that whilst any attempt at implementing the Online Harms Bill seems to be years away, it remains a cause for concern when considering the infringement of personal freedoms and civil liberties. There is, however, an increasing likelihood that the Online Harms Bill will follow the fate of the Digital Economy Act (2017).

The Online Harms White Paper presents yet another vague “Christmas tree bill” aiming to censor the whole web, in a bid to divert attention away from the ineptitude of the government in dealing with longstanding and intrinsic problems within UK society. It marks an observable shift towards a paradigm of punishment for deviation from state-imposed norms and approved behaviours based on perceived harms, even where criminality is not present. Even if the law is passed, the complications and challenges of implementation are likely to prove too much for its ill-conceived ideas, and it is likely to result in more harm than good. This legislation would only add to an array of piecemeal and poorly thought-out approaches which simply do not work.

Notes

This article was derived from the author’s MSc thesis (2018), titled ‘Protection or Puritanism: A commentary of the implications of current pornography legislation for the adult film industry and contemporary society in the UK?’. The objectives of that study were to collect detailed data on the lived experience of both producers and performers of adult content in the UK, to collect detailed data on the concerns for the general population from experts directly and indirectly involved with the adult film industry, and to determine whether government strategies and legislation on pornography are devised to protect the public or to impose puritanical restrictions based on ideas of morality. This information was combined with recently released information on the Online Harms Bill, found mostly in news media and government documents, as there is very little scholarship on this topic.

References

Attwood, F. and Smith, C. (2010) ‘Extreme concern: Regulating ‘Dangerous Pictures’ in the United Kingdom’, Journal of Law and Society, Volume 37, Issue 1, pp. 171-188, JSTOR Journals. Available at:

http://www.jstor.org/stable/25622013

Attwood, F. and Walters, C. (2013) ‘Fifty Shades and the Law: Regulating sex and sex media in the UK’, Sexualities, Volume 16, Issue 8, pp. 974-979, Sage Publications Journals. Available at:

http://journals.sagepub.com/doi/abs/10.1177/1363460713508880

BBC News (2019 [1]) ‘Obscene porn rules relaxed in England and Wales’. Available at:

https://www.bbc.co.uk/news/technology-47069414

BBC News (2019 [2]) ‘UK’s controversial ‘porn blocker’ plan dropped’. Available at:

https://www.bbc.co.uk/news/technology-50073102

BBC News (2020) ‘Online Harms Bill: Warning over ‘unacceptable’ delay’. Available at:

https://www.bbc.co.uk/news/technology-53222665

Brown, M. (2017) ‘Pornhub’s Owner is About to Card Everybody in the UK’, Inverse Innovation. Available at:

https://www.inverse.com/article/31314-pornhub-mindgeek-ageid-user-verification-system-myles-jackman-pandora-blake

Crown Prosecution Service (2018) ‘Obscene Publications’. Available at:

https://www.cps.gov.uk/legal-guidance/obscene-publications

GOV.UK (2017) ‘BBFC proposed to enforce age verification of online pornography’. Available at:

https://www.gov.uk/government/news/bbfc-proposed-to-enforce-age-verification-of-online-pornography

GOV.UK (2020) ‘Online Harms White Paper’. Available at:

https://www.gov.uk/government/consultations/online-harms-white-paper/online-harms-white-paper

Info Security Group (2020) ‘The Online Harms Bill cannot Wait’. Available at:

https://www.infosecurity-magazine.com/opinions/online-harms-bill-wait/

IS Preview (2018) ‘Age Verification and UK ISP Internet Porn Ban Quietly Delayed’. Available at:

Klein, A. (2016) ‘Regulating online erotica – ethnographic observations of a UK-based adult entertainment industry’, Drugs and Alcohol Today, Volume 16, Issue 3, pp. 222-227, ProQuest Journals. Available at:

https://search-proquest-com.gold.idm.oclc.org/docview/1823088634?OpenUrlRefId=info:xri/sid:primo&accountid=11149

Legislation.Gov.UK (2018) ‘Criminal Justice and Immigration Act 2008 – Section 63’. Available at:

https://www.legislation.gov.uk/ukpga/2008/4/section/63

McGlynn, C. and Ward, I. (2009) ‘Pornography, pragmatism and proscription’, Journal of Law and Society, Volume 36, Issue 3, pp. 327-351, JSTOR Journals. Available at:

http://www.jstor.org/stable/25621977

New Statesman (2017) ‘The UK has now entered a draconian era of porn prohibition’. Available at:

https://www.newstatesman.com/science-tech/privacy/2017/05/uk-has-now-entered-draconian-era-porn-prohibition

Open Rights Group Wiki (2018) ‘MindGeek/ List of MindGeek data breaches’. Available at:

https://wiki.openrightsgroup.org/wiki/MindGeek/List_of_MindGeek_data_breaches

Open Rights Group (2019) ‘A New Wave Of Internet Censorship May Be On The Horizon’. Available at:

Presdee, M. (2000) Cultural Criminology and the Carnival of Crime. London: Routledge.

Rigg, J. (2017) ‘How the Digital Economy Act will come between you and porn’, Engadget.com. Available at:

https://www.engadget.com/2017/05/03/digital-economy-act-explainer/

Sex Tech Guide (2019) ‘UK drops beleaguered porn block, but hints at ‘chilling’ potential future plans’. Available at:

Sky News (2018) ‘Porn giant MindGeek denies planning to snoop on UK Viewers’. Available at:

https://news.sky.com/story/porn-watchers-will-have-to-prove-they-are-over-18-under-new-laws-11224093

Stewarts Law (2020) ‘Online Harms: What progress in the UK and EU?’. Available at:

The Guardian (2019) ‘UK age-verification system for porn delayed by six months’. Available at:

https://www.theguardian.com/technology/2019/jun/20/uks-porn-age-verification-system-to-be-delayed-indefinitely

The Independent (2014) ‘UK porn legislation: What is now banned under new government laws’. Available at:

http://www.independent.co.uk/news/uk/home-news/uk-porn-legislation-what-is-now-banned-under-new-government-laws-9898541.html

The Independent (2018) ‘Porn age-verification laws delayed by UK government amid widespread confusion about how they will actually work’. Available at:

https://www.independent.co.uk/life-style/gadgets-and-tech/news/porn-age-verification-laws-ageid-youporn-pornhub-mindgeek-uk-government-a8251791.html

The Telegraph (2018) ‘Warning that age-checks on porn sites risks pushing children to dark web’. Available at:

https://www.telegraph.co.uk/news/2018/01/05/warning-age-checks-porn-sites-risks-pushing-children-dark-web/

The Telegraph (2020) ‘Self-regulation of the internet must come to an end through an online harms law that delivers meaningful and lasting change’. Available at:

https://www.telegraph.co.uk/politics/2020/09/30/self-regulation-internet-must-come-end-online-harms-law-delivers/