Increasingly, stories about filtering out illegal content such as child pornography; blocking infringing downloads of copyright material by deep packet inspection and disconnection; and filtering to fight the "war on terror" are converging. For all of these, the same issues come up again and again: privacy; proof, transparency and other aspects of due process; and scope creep. These three stories illustrate this well. For my own recent take on the issue of Net filtering, as I said before, see my Internet pornography chapter on SSRN, which suggests the need for a Free Speech Impact Assessment before non-transparent state Net filtering schemes are introduced, for whatever purpose.
Filtering of illegal content in France
Thanks to @clarinette on Twitter (whose real name I am not absolutely sure of!!) for pointing me to another important European move towards non-transparent Internet filtering - this time in France. From La Quadrature du Net:
Paris, February 11th, 2010 - During the debate over the French security bill (LOPPSI), the government opposed all the amendments seeking to minimize the risks attached to filtering Internet sites. The refusal to make this measure experimental and temporary shows that the executive could not care less about its effectiveness in tackling online child pornography or about its disastrous consequences. This measure will allow the French government to take control of the Internet, as the door is now open to the extension of Net filtering.
The refusal to enact Net filtering as an experimental measure is proof of the ill-intended objective of the government. Making Net filtering a temporary measure would have shown that it is ineffective in fighting child pornography.
As the recent move of the German government shows, only measures tackling the problem at its roots (by deleting the incriminated content from the servers; by attacking financial flows) and the reinforcement of the means of police investigators can combat child pornography.
Moreover, whereas the effectiveness of the Net filtering provision cannot be proven, the French government refuses to take into account the fact that over-blocking - i.e. the "collateral censorship" of perfectly lawful websites - is inevitable. Net filtering can now be extended to other areas, as President Sarkozy promised to the pro-HADOPI ("Three-Strikes" law) industries."
LQDN are never exactly ones to mince their words :-) so the strength of this statement should perhaps be treated with some care - but Pangloss intends to go investigate this story further.
Ireland, Eircom, disconnection and DP
Meanwhile, in a surprising twist, Eircom have apparently pulled out of the negotiated settlement they reached in January 2009 to disconnect subscribers "repeatedly" using P2P for (alleged) illicit downloading. This was the result of the Irish court case brought against them by various parts of the music industry for hosting illegal downloads, and appeared to open up a route to "voluntary" notice and disconnection schemes on the part of the ISP industry; a worrying trend for advocates of free speech, privacy, due process, ISP immunity and net neutrality.
Now however according to the Times:
As part of the agreement, Irma said it would use piracy-tracking software to trace IP addresses, which can identify the location of an internet user, and pass this information to Eircom. The company would then use the details to identify its customer, and take action.
But the office of the Data Protection Commissioner (DPC) has indicated that using customers’ IP addresses to cut off their internet connection as a punishment for illegal downloading does not constitute “fair use” of personal information. Irma and Eircom have asked the High Court to rule on whether these data-protection concerns mean the 2009 settlement cannot be enforced.
This is very, very interesting. A court case on this might settle a number of outstanding DP legal issues: whether IP addresses are "always" personal data (on which see also a recent EU study demonstrating the disharmony across Europe on this) and if not, when; what the scope of the exemptions for preventing and investigating crime is; and what "fair" means in the whole context of the DP principles, purpose limitation and notice for processing.
Not only that but, as the Times indicates, the human rights issues which have been repeatedly aired in the debate around "three strikes" generally would also come into play, alongside the straight DP law. Is use of a customer's personal data to cut them off from the Internet a proportionate response to a minor civil infringement? Does it breach a fundamental right of freedom of expression or association? Does it breach due process? This could be the DP case of the decade. Pangloss is geekily excited. If anyone out there is involved in this case, do let me know.
UK cops don't terrorise the IWF?
Finally, as widely reported, the UK Home Office has introduced a website hotline for the public to report suspected terrorist or hate speech sites. Reports are then vetted by ACPO, the Association of Chief Police Officers, who it appears can then take action, not only by investigating in the normal way, but also by asking the relevant host site to take the material down. The official press release notes: "If a website meets the threshold for illegal content, officers can exercise powers under section 3 of the Terrorism Act 2006 to take it down." Indeed, on being served such a notice, the host has only two days to take down or lose immunity under the UK ECD Regs.
As TJ McIntyre also notes, this is a rather significant development, not just in itself but for sidestepping use of the Internet Watch Foundation (IWF). There have been persistent rumours, both before and since then-Home Sec Jacqui Smith's famous speech in Jan 2008, that the UK government was attempting to pressurise the IWF into adding reports of hate speech/terror to its block- or black-list; and that the IWF was strongly resisting this, hate speech being a somewhat more ambiguous and controversial matter than adjudicating on child sexual imagery.
It seems then that the IWF has held fast and the Home Office has backed off and created its own scheme, which embraces only take down in the UK, not access blocking to sites abroad (?). Whether this is ideal remains to be seen. The IWF, at least until recently, had the services of esteemed law prof Ian Walden as well as a lot of accumulated experience, and may have been a better informal legal tribunal than a bunch of chief constables to decide on the illegality of sites under terror legislation. Who knows. On the other hand, adding alleged terror URLs to an invisible, encrypted, non-public blocklist defeats every concept of transparency and public debate regarding restrictions on freedom of political speech, and Pangloss is glad to see it avoided.
Pangloss's view remains that such difficult non-objective issues are best decided by the body long set up to deal with questions of hazy legal interpretation: namely, the courts. The definition of "terrorist" material for the purposes of s 3 of the 2006 Act is as follows (s 3(7)):
"(a) something that is likely to be understood, by any one or more of the persons to whom it has or may become available, as a direct or indirect encouragement or other inducement to the commission, preparation or instigation of acts of terrorism or Convention offences; or
(b) information which—
(i) is likely to be useful to any one or more of those persons in the commission or preparation of such acts; and
(ii) is in a form or context in which it is likely to be understood by any one or more of those persons as being wholly or mainly for the purpose of being so useful."
Well, I hope that clears everything up :-) Still confused? Try s 3(8).
"(8) The reference in subsection (7) to something that is likely to be understood as an indirect encouragement to the commission or preparation of acts of terrorism or Convention offences includes anything which is likely to be understood as—
(a) the glorification of the commission or preparation (whether in the past, in the future or generally) of such acts or such offences; and
(b) a suggestion that what is being glorified is being glorified as conduct that should be emulated in existing circumstances."
Er, give me that last line again?
As with previous contested IWF rulings, the same questions come up again: what is the appeal from a take down notice under s 3 to the regular courts? What notice, if any, is given to the site owner and the public of the fact of, and reasons for, take down? What safeguards are there for freedom of speech? None of these are mentioned in ss 1-4 of the 2006 Act. Nor does there seem to be a general provision for appeals or review in Part 1 or the rest of the 2006 Act. Since the police are a public body, however, one imagines that judicial review might be competent. EDIT: However, I am helpfully informed that ACPO is a company limited by guarantee and regards itself as not a public body, at least for the purpose of FOI requests. Clarity on this would be very desirable. And as noted above, record keeping of take down for terror reasons seems to be poor due to reliance on voluntary compliance by ISPs.