Resurrecting Part 3 Digital Economy Act 2017 is not the answer

According to “Christian Today”, the High Court has granted permission for a pastor and someone else to bring a judicial review against the government, for its decision not to implement Part 3 of the Digital Economy Act 2017 - the bit of the Act dealing with online pornography - pending its work on the Online Safety Bill.

This decision is not an unduly significant one (although, inevitably, some are portraying it as a “loss” for the government), and it simply means that those bringing the claim may now proceed with it. It’s no guarantee that their claim will succeed.

For what it’s worth, even though I think that Part 3 was so flawed that not taking it forward was absolutely the right outcome, I think there is a genuinely interesting constitutional issue in this case, in terms of the government’s power to decline to bring into force provisions of an Act passed by Parliament.

Why flawed? Here are three reasons, off the top of my head.

(There was also a disagreement between what the statute says and what the proposed age verification regulator said about the definition of “pornographic material”, in the context of audio porn, but I can’t find the references for that right now.)

(Some of) The flaws of Part 3

Part 3 lacked any privacy safeguards.

Despite the potential for the processing of very sensitive personal data, the Act has absolutely no privacy safeguards.

Zero.

Nada.

The usual riposte is that this is not necessary because of the GDPR. But this is problematic for (at least) two reasons:

First, the Act purports to have extraterritorial effect, applying to porn sites wherever they are in the world, if they make porn available to people in the UK on a commercial basis. (Whether or not the extraterritorial effects provisions would be effective is a different, but pertinent, matter.)

While the GDPR does have extraterritorial effect in some situations - see Article 3(2) - it does not say “if a site is accessible to people in the UK, the GDPR applies”. It is much more limited than that (and for good reason).

In other words, it would mean people in the UK being compelled to hand over, or give access to, documents or information proving their age, and possibly their identity, to sites located outside the UK/EU, and not subject to the GDPR. Who knows what data protection and privacy safeguards those countries have?

Second, the GDPR is a general purpose framework. It establishes a baseline level of protection. There’s no reason why its existence should mean that additional, specific controls, tackling specific high-risk threats, cannot co-exist with it.

For example, some argue that additional, specific controls are needed to deal with artificial intelligence, and for the use of facial scanning in the context of policing, saying that the GDPR is insufficient for the risks.

Ironically, facial scanning is one of the proposed techniques for age estimation for access to pornography. If you think that requiring people to turn on their cameras when they’re accessing a porn site is in any way a good idea, I suspect you’re on your own…

(And, no, PAS 1296 - a code of practice sponsored by The Age Verification Group of the Digital Policy Alliance - is not any sort of an answer to this!)

The framework for administrative blocking orders is unlawful (IMHO)

Recognising the limited likelihood of porn sites all around the world choosing to comply with English law, the Act included a backstop power “to require internet service providers to block access to material”: in effect, administrative blocking orders (“administrative” in the sense that they would be issued by the age-verification regulator, not a court).

This power was, IMHO, unlawful, for two reasons:

First, there is no inherent limitation requiring that these measures be imposed only on ISPs capable of implementing them.

Not all ISPs have the ability to implement this kind of order (although many do), and requiring ISPs to design, test, implement, and operate systems specifically for this purpose, without public funding*, would be disproportionate.

The plan was that ISPs in the UK which provided Internet access to more than 10,000 residential customers could be served with these orders, with the blocking carried out through (ab)use of their DNS systems. But this was not enshrined in statute, nor was there any express provision limiting the scope of the obligation to what is reasonably practicable.

In other words, the proposal was more onerous than the obligations which can be imposed by interception or equipment interference warrants, or communications data acquisition notices, under the Investigatory Powers Act 2016!

*Even with public funding, I’d be sceptical, leaving aside the inevitable “slippery slope” that these systems would engender, since it is likely that the copyright industry would seek to use the capabilities to protect their commercial interests.

Second, the framework expressly permitted overblocking.

Section 23(3) of the Act says:

The steps that may be specified or arrangements that may be put in place under subsection (2)(c) include steps or arrangements that will or may also have the effect of preventing persons in the United Kingdom from being able to access material other than the offending material using the service provided by the internet service provider.

I cannot see how a statute expressly permitting overblocking - inhibiting people’s access to lawful material - can be consistent with fundamental rights.

Nor do I see how an appeals framework can rescue a statute which is, in my opinion at least, fundamentally unlawful.
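
To see why overblocking is baked in, it helps to look at how DNS-level blocking actually works: the resolver sees only the domain being looked up, never the specific page or its content. Here’s a minimal sketch in Python (all domain names and addresses are invented for illustration; real ISP systems are, of course, more elaborate):

```python
# Toy model of resolver-level blocking of the kind the 10,000-customer
# plan contemplated. All domains and addresses below are hypothetical.

BLOCKLIST = {"offending-site.example"}  # domains the regulator orders blocked

def resolve(qname: str):
    """Return an address for qname, or None (i.e. NXDOMAIN) if blocked."""
    domain = qname.lower().rstrip(".")
    # DNS is domain-granular: the resolver never sees the URL path or
    # the page content, only the name being looked up.
    if domain in BLOCKLIST or any(domain.endswith("." + b) for b in BLOCKLIST):
        return None  # the whole domain, and every page on it, disappears
    return "192.0.2.1"  # placeholder answer for anything not listed

print(resolve("offending-site.example"))        # None - blocked
print(resolve("forum.offending-site.example"))  # None - subdomain swept up too
print(resolve("unrelated-site.example"))        # "192.0.2.1"
```

Block the domain and you block every page on it, lawful or not; section 23(3) simply makes that collateral damage lawful in advance.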

Would it be possible to circumvent these orders anyway? Yes, trivially, and the government’s impact assessment recognises that:

There is also a risk that both adults and children may be pushed towards Tor where they could be exposed to illegal activities and more extreme material.

“Out of the frying pan, into the fire”, as a regulatory approach. Great.
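
And, to labour the point about trivial circumvention: an ISP-level DNS block only bites if your device actually uses the ISP’s resolver. Switching to any public resolver steps around it entirely, no Tor required. A sketch using the dnspython library (the nameserver shown is Google’s public resolver; the domain queried is just a placeholder):

```python
# pip install dnspython
import dns.resolver

# Ignore the system (ISP-supplied) resolver configuration entirely
# and ask a public resolver instead.
resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["8.8.8.8"]  # Google's public DNS, not the ISP's

# The ISP's resolver would return NXDOMAIN for a blocked name;
# the public resolver just answers as normal.
answer = resolver.resolve("example.com", "A")
for record in answer:
    print(record.address)
```

That’s the entire “attack”. Changing DNS settings is a couple of clicks in any operating system’s network preferences.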

The “commercial basis” rules were a mess

The main principle of Part 3 was:

A person contravenes this subsection if the person makes pornographic material available on the internet to persons in the United Kingdom on a commercial basis other than in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18.

What does it mean to make something available “on a commercial basis”? If a law is going to penalise someone with significant fines, it needs to be clear and precise, and this phrase is neither.

Never mind: enter The Online Pornography (Commercial Basis) Regulations 2019, which say that the framework applies, unless it doesn’t, unless it does.

Regulation 2(4), for example, says:

Subject to paragraph (5), paragraph (3) does not apply in a case where it is reasonable for the age-verification regulator to assume that pornographic material makes up less than one-third of the content of the material made available on or via the internet site or other means (such as an application program) of accessing the internet by means of which the pornographic material is made available.

One-third of what? The total filesize of the content on the site? The total running time of the content on the site? The total number of individual pieces of content on the site?
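
To make the ambiguity concrete, here’s a toy calculation for a hypothetical site with two large pornographic videos and eight small non-pornographic pages (all figures invented). Whether the site crosses the one-third threshold depends entirely on which measure you pick:

```python
# A hypothetical catalogue: 2 pornographic videos, 8 other pages.
# All sizes and durations are invented for illustration.
items = (
    [{"porn": True, "size_mb": 800, "minutes": 45},
     {"porn": True, "size_mb": 700, "minutes": 40}]
    + [{"porn": False, "size_mb": 2, "minutes": 1} for _ in range(8)]
)

def porn_fraction(key):
    """Fraction of the site that is pornographic, measured by `key`."""
    total = sum(item[key] for item in items)
    return sum(item[key] for item in items if item["porn"]) / total

readings = {
    "number of items": sum(item["porn"] for item in items) / len(items),
    "total filesize": porn_fraction("size_mb"),
    "total runtime": porn_fraction("minutes"),
}

for measure, fraction in readings.items():
    verdict = "in scope" if fraction >= 1 / 3 else "out of scope"
    print(f"by {measure}: {fraction:.0%} -> {verdict}")

# by number of items: 20% -> out of scope
# by total filesize: 99% -> in scope
# by total runtime: 91% -> in scope
```

The same site is out of scope on one reading and comfortably in scope on the other two. The Regulations don’t say which reading is right.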

If the rules governing when the framework applied were this unclear, was it really fit for purpose?

Of course, some people will say that it was “good enough”, which should be a warning in itself.