Cogent post by Technollama on the insatiable hunger of the UK press for scare stories about the horrors of the Internet, especially re Facebook, MySpace, chatrooms, child porn, etcetera.
All this furore has of course been partly whipped up most recently by the publication of the much-awaited Byron Report. Pangloss has not had time to read the Byron Report in full yet, but was initially relieved that it seemed to have concentrated on "having a national strategy for child internet safety which involves better self-regulation and better provision of information and education for children and families", and not on further extension of the invisible upstream censorship model pioneered by the IWF and BT Cleanfeed to, eg, social networking sites (SNSs) or online games, or to types of content which are arguably harmful to children but not illegal, such as adult sexual content (although read on for discussion of existing upstream filtering in schools and local libraries, and of the consideration given to extending a "child-safe" Internet to everyone, children and adults alike).
The main features of the Byron Report, beyond the usual calls for parental involvement, understanding that children know more about the net than their parents, integration of e-safety into the school curriculum, and consumer and teacher education, seem to be:
(a) "better" ie more granular, classification of video, console and on-line games;
(b) refinement of our understanding of how offline laws apply to online content (eg, are suicide websites illegal?); and
(c) the creation of a one-stop shop for the regulation of child safety on the Internet, to be named the (slightly Orwellian) UK Council on Child Internet Safety, run by the Home Office and DCSF with help from DCMS, which will "lead the development of a strategy with two core elements: better regulation – in the form, wherever possible, of voluntary codes of practice that industry can sign up to – and better information and education, where the role of government, law enforcement, schools and children’s services will be key".
Reading further gives us some idea of the key tasklist the Council is meant to undertake. It is a long and interesting list, but a few items stood out to me:
- making sure home computers are sold pre-loaded with kitemarked parental control software (though not switched on and fully functional by default - see 4.72)
- making sure search engines offer a clear indication of whether safe search is on, and that it can be "locked on" by parents
- making sure 100% of schools and local services to children (eg computers in libraries and museums) have Becta-accredited filtering services
- working with user generated content hosts (eg Facebook) to establish an independently monitored voluntary code of practice for the moderation of user generated content.
Despite all this, the executive summary concludes with the following quote:
"“Kids don’t need protection we need guidance. If you protect us you are making us
weaker we don’t go through all the trial and error necessary to learn what we need
to survive on our own…don’t fight our battles for us just give us assistance when we
need it.”
I feel, in my slightly confused position as a former specialist in child law and nowadays a specialist in Internet law, that we are getting mixed messages here. How are children going to go through "trial and error to learn" when they inhabit a world where parents can defer any parenting discussions on adult content to a kitemarked filter they don't understand well enough to alter? Where school, library and museum access is 100% filtered? (And I have an acquaintance who runs the filters for a certain Scottish local authority's schools - I was mildly appalled by how far they filter beyond what is legally proscribed content.) Where their own version of their own real life on their own UGC sites is potentially censored? (As if they need to go to the Internet anyway to see teens engaged in nudity, sex, drugs and unsafe behaviour - they can just watch Skins.)
Less controversially, there is an interesting suggestion at 4.19 about how UGC or social networking sites might handle the tricky issue of moderation of content and legal liability. Many SNSs, hosts and ISPs have long argued that they cannot monitor or moderate content and remove some of it, because they are then "on notice" as to the whole site's contents, and will be liable for any illegal content they have let slip past (see Art 14 of the Electronic Commerce Directive and the ghost of the Prodigy doctrine). Byron rather smartly observes that such risks might be minimised if a third party were used to audit the site and give notice to the host site only about material which definitely breaches the law, and which could then be removed, and adds a recommendation that "the Council explores the possibility of developing such arrangements to minimise the risks of liability for companies that take steps to make their products safer for children". Who PAYS for such third-party auditing is not discussed :)
Byron also recommends that sites be encouraged to sign up to specific public commitments on take-down times, which sites currently tend to avoid for fear of being deemed in breach of contract if they do not take down in time; Facebook, eg, has already publicly guaranteed to take down, on complaint, content containing "nudity, pornography, harassment or unwelcome contact" within 24 hours. This Pangloss approves of, having seen in her own empirical research the very wide variation in take-down times from hosts and ISPs according to variables such as size of organisation, type of content and type of organisation, and the uncertainty this can cause both hosts and users (MumsNet were reportedly forced into settlement re liability for allegedly libellous UGC by not being sure if they had taken down "expeditiously").
Overall though, despite the odd mention (and I emphasise again that I have not yet read the whole report fully), there is a definite air about the report, as Jonathan Zittrain once put it, of being "so 2005". What use will filtering requirements on schools, and parental control software, be when, as will be true in about five Internet minutes, every child routinely accesses Facebook or Bebo on the way to school via a smart mobile phone? The report itself admits that 37% of 11-16 year-olds already have access to the internet via a mobile (ChildWise 2008). Even if mobile phone operators are corralled as upstream supervisors as well (a voluntary code of conduct for mobile operators has existed since 2004, but Byron admits "it is difficult to establish the effectiveness of work in this area" - 4.109), what about wi-fi accessed via their smartphones, iPod Touch or equivalent, on the school bus, in cafes, at friends' houses and at clubs? These issues are, praiseworthily, raised, with research commissioned to examine access outside the home (4.69), but in the end there is no solid recommendation of any serious way to deal with these impossibly difficult problems (4.106, 4.116, 4.117).
There also seems to be a rather worrying supposition that SNSs are the domain solely of children. Bebo may be, but many are not. Recent research showed, rather amazingly, that in the UK as of September 07, the median age of a Facebook user was 34! (Pangloss herself is an FB user and, er, rather over that age :( ) Should a 34-year-old be subject to a UGC moderation code which refuses to let him publish a tasteful, non-illegal erotic picture of his girlfriend? I am not really sure. We are getting dangerously close to the famous ACLU v Reno No 1 case, which asserted that, even in the interests of children, the whole of the Internet should not be reduced to the level of a "children's reading room".
Putting the job of censorship onto ISPs, hosts and SNSs, rather than having it directly exercised by the state, does not make it any less censorship - it just makes it less transparent and less accountable. There is a slightly chilling discussion at 4.54ff of the idea of network (ISP) level blocking of all unwelcome content - ie blocking non-illegal but non-child-friendly content for ALL USERS, by all UK ISPs - with the onus on, or choice by, over-18s to opt out of this blocking. The Report chooses not to go down this route for a large number of very sensible reasons, but adds somewhat worryingly that "this may need to be reviewed if the other measures recommended in this report fail to have an impact on the number and frequency of children coming across harmful or inappropriate content online" (4.60). This puts Technollama's suggestion that next we will see regulation of social networking sites positively in the shade.
In short, the Byron Report is a brave and largely non-tabloid-scare-oriented attempt to deal with a difficult problem. Much of the child developmental information in the first two chapters is excellent, and it is very valuable to have it in one place in front of policy makers' and lawyers' noses. But as far as solutions go, one does have a feeling that it is perhaps not looking far enough ahead; because "far" on the Internet is usually not that far at all.