Bill Thompson of the BBC's Going Digital has written a very sensible column on how trusted computing, rather than being a smokescreen for All That Is Evil (or Microsofty), might actually be the way forward to defend computers against spyware, adware and virus-ridden CDs of the infamous Sony "root kit" type.
However, the tone changes in the second para:
"Unless we are careful the tools which could make us a lot safer and give us more power over what we do with the hardware we own and the software we license - few programs are actually "sold", not even free software - will instead be used to take control away from us.
At the moment the companies behind trusted computing do not trust their customers at all.
They want to use digital rights management to control what we can do with content we have purchased, they want to make sure we don't install programs or new hardware that they haven't approved, and they want to be able to monitor our use of the expensive computers we own."
Bill argues that being asked to trust the people who supply "trusted" software - people like Sony - is akin to owning a car where you can't look under the bonnet.
"I have a very nice car, and I try to take good care of it. It runs on petrol, but I want the freedom to fill it up with diesel and destroy the engine. It's my engine, after all.
The same goes for my computer. I want the freedom to write, compile and run my own code, take risks with dodgy software I've downloaded from the net and even break the law and risk prosecution by playing unlicensed music or running cracked software. "
What this apparently appealing metaphor obscures is two things. One: "trusted computing" in the strict sense is about hardware, not software. I'll come back to this. Trusted computing means that the (metaphorical) box your computer comes in has to be a "black box", unopenable by the user - otherwise the user can do all the stupid things users do right now, like open emails from strangers, accept downloads with payloads of virus executables, and click on URLs that take them to phishing websites.
This means you do indeed have to trust the people who supply you with trusted computing hardware, and I agree with Bill that there should be serious legal obligations, with decent compliance mechanisms, placed on those who sell "trusted computing", so they do not sell us, as we Glaswegians say, a pig in a poke (or a root kit in a DRM).
But the Internet is not going to be any more secure if we sell people trusted computing hardware and let them, as Bill wants to, tinker and fiddle. It would be like selling my mum a Ferrari and suggesting that if she's bored one Sunday she tune the engine up a bit. She would destroy a very expensive engine, and she would also endanger people on the road if she took the car out and it ran out of control as a result of her unskilled modifications.
Security of hardware sold to consumers, and consequently the security of the entire Internet (see posts on bots, zombies etc, passim), is simply no longer compatible with open tinkering. Once upon a time anyone who bought a car was allowed to simply take delivery and drive it. Then, when the density of cars increased, we realised we needed driving tests for public safety. Maybe people like Bill, who are well equipped to safely tinker with/fine-tune their computers (unlike my mum), should have to pass a test too before they're allowed to drive away a non-black-box computer? Radical in the libertarian world of computer culture; but not very odd at all when you look at everyday attitudes to owning other potentially dangerous objects.
What about the software that trusted computing hardware is willing to accept and execute, the so-called "signed" software? Here I completely agree with Bill that defining what counts as acceptable software cannot safely be left to the diktat of the software/hardware vendors. Microsoft, e.g. (just as an example!), has absolutely no incentive to let me, a consumer, run open source software on the trusted platform they've just sold me. Without needing to imply any malice at all, simple competitive strategy would dictate that they allow access to Microsoft software products and nothing else, if they can get away with it. So, as Bill says:
"The second thing we need is diversity when it comes to code signing. If my computer is set to run only signed software or read only signed documents, then who can sign what becomes far more than a matter of technology, it becomes a political issue.
We must not settle for a closed platform which allows the hardware vendor or the operating system supplier to decide, so it is time for governments to intervene and to ensure that we have an open marketplace for code signing.
The simplest way to do this is to give the process a statutory backing and then issue licences, just like we do for many professional and financial services. "
It's the last para I can't see happening, for the simple reason that a lot of hardware and software comes from the US, and the US is not prone to extending government regulation of industry. The UK can impose local regulation on hardware, at least in theory, by stopping it at ports; it simply can't impose licensing control on software downloaded from the States. How can you download that "dodgy software" you have your eye on if the country it originates from hasn't bought into a licensing model? Do you simply accept any software with no licence? Then bang goes security.
Plus the national model of licensing financial and professional services has already proven to be a nightmare of possible restrictive practices which the EU, the most harmonised region of nations in the world, is only slowly getting over. How tempting would it be for a faltering French software industry (say) to refuse to sign off on US or even Chinese software products?
A better candidate for a certification authority for signing or licensing software as safe might be the existing international standard-setting bodies. If an ISO standard, available online and revised on application by new entrants into the software market, said which programmes my black box should (or could) accept and execute and which it definitely shouldn't, both I and my technophobe mother might feel a lot safer on the Net.
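To make the mechanics a little more concrete, here is a minimal sketch (in Python, using the third-party cryptography package) of what "run only signed software" could mean in practice: the platform keeps a registry of signer keys it trusts - whoever the certification authority turns out to be - and refuses to execute anything whose signature doesn't check out. The signer name and helper functions below are purely illustrative assumptions, not the actual TPM/Trusted Computing Group mechanism.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # Hypothetical registry of signers the platform trusts - in the model
    # suggested above, maintained under an open standard rather than by
    # one hardware or OS vendor.
    trusted_signers = {}

    def register_signer(name, public_key):
        """Add a signer's public key to the platform's trust list."""
        trusted_signers[name] = public_key

    def may_execute(program_bytes, signature):
        """Return True only if some trusted signer vouches for this exact program."""
        for name, key in trusted_signers.items():
            try:
                key.verify(signature, program_bytes)
                return True
            except InvalidSignature:
                continue
        return False

    # Usage: a (hypothetical) certification body signs a program; the
    # platform checks the signature before agreeing to run it.
    body_key = ed25519.Ed25519PrivateKey.generate()
    register_signer("iso-style-certification-body", body_key.public_key())

    program = b"print('hello world')"
    signature = body_key.sign(program)

    print(may_execute(program, signature))              # True - approved and unmodified
    print(may_execute(b"tampered program", signature))  # False - signature no longer matches

The point of the sketch is simply that the trust decision lives entirely in who gets to put keys into that registry - which is exactly why it is a political question, not just a technical one.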