Sunday, February 26, 2006

Who Do You Trust, Reloaded?

Interesting response from my coder guru pal, Pete Fenelon: I don't agree with every word, but I thought it was worth reproducing in full.

Overview: code signing and secure OSes won't work - but that's not where
the effort should be going; it should be going into creating a
well-policed interface between private systems and the network - and
making the owners of those systems liable.

PF: I admit that I'm something of an oddball in my views here, but I believe that what goes on behind your net connection is your own business; what comes out of it is very much not just your own business. Same as I can have a rocket-powered car in my garage, but I'm toast if I try to take it on the road. ;)

LE: Bill argues that being asked to trust the people who supply "trusted" software - people like Sony - is akin to owning a car where you can't look under the bonnet.

PF: And what's wrong with this? -- most people who buy cars these days don't know diddly about what goes on under the bonnet, and entrust repairs to qualified professionals (or at least people who they think are qualified professionals). Most home computers are "administered" by "our Kevin who's dead good with computers, he gets high scores on all them games he gets discs of". "Our Kevin" often isn't mindful of the consequences (or even existence) of malware, and would click "OK" like a Pavlovian dog if it meant playing a warez version of Halo 3.

Bill: "I have a very nice car, and I try to take good care of it. It runs on petrol, but I want the freedom to fill it up with diesel and destroy the
engine. It's my engine, after all.

PF: Well, in many cases it's probably the finance or leasing company's engine, but hey...

Bill: The same goes for my computer. I want the freedom to write, compile and run my own code, take risks with dodgy software I've downloaded from the
net and even break the law and risk prosecution by playing unlicensed music or running cracked software. "

PF: It might well be "his computer", in the same way that it's "his car", but his car has to be MOTed regularly to ensure that it still complies with the law, and he has to take out insurance against any damage he might cause to others. When people call the Internet the "information superhighway" they seem to forget that the real highway isn't a free-for-all -- there are people out there watching what you do, there are laws by which you and your vehicle must abide if you wish to drive on it, and you must be licensed to even venture onto it. The penalties are (or at least should be) draconian. The analogy is simple; we don't have "car cops" in Britain who stop you fitting an eight-litre engine and slick tyres to your Morris Minor, we have "traffic cops" who get peeved if they see it on the road. Similarly, we shouldn't have "computer cops" who stop you installing Frigware Linux R00tK1T 3D1T10N, we should have "network traffic cops" who pull the plug if your machine starts behaving dangerously.

PF: Right now, lives aren't at stake on the Internet (although no doubt some fool will eventually connect up some safety-critical equipment to an
unprotected public network and someone will get hurt), but the economic well-being of others is. What we need isn't a technical solution; it's a financial/legal one. We need:

PF: Liability for damage caused by anything coming from a network endpoint for which a particular legal entity (individual, corporation) is responsible.

PF: Regulation of apparatus that can connect - and I don't mean the old BABT red/green triangles, I mean mandating *approved* firewalls/gateways between the public network and any equipment connected to it. Found without a firewall or a working, up-to-date AV system? (And your ISP will be probing, otherwise it'll be fined and potentially ostracised at LINX or similar... or at least it would be in my universe.) Exactly the same as having no catalytic converter, no headlights and bald tyres -- your connection "goes dark" and you're fined. Simple as that.
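Purely by way of illustration -- Pete doesn't specify any mechanism -- here is a rough sketch, in Python, of the kind of automated probe an ISP could run against a subscriber's endpoint under a scheme like this: check whether ports that a working firewall would normally hide are answering from the outside. The address and port list are invented for the example, and a real compliance check would of course be far more involved.

    import socket

    SUBSCRIBER_IP = "203.0.113.42"        # documentation-range address, purely illustrative
    RISKY_PORTS = [135, 139, 445, 3389]   # services a home firewall would normally hide

    def exposed_ports(host, ports, timeout=1.0):
        """Return the subset of `ports` that accept a TCP connection from outside."""
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                if s.connect_ex((host, port)) == 0:   # 0 means the connection succeeded
                    open_ports.append(port)
        return open_ports

    if __name__ == "__main__":
        open_now = exposed_ports(SUBSCRIBER_IP, RISKY_PORTS)
        if open_now:
            print("Fails the check: ports", open_now, "reachable -- connection 'goes dark'")
        else:
            print("Passes the probe: nothing obviously exposed")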

PF: Unfortunately I don't believe that licensing of individuals as fit to use computers can take place - for a start there's the problem of proving who's in control of a machine at any point.

PF: I also don't believe that licensing of applications can meaningfully be done. True 'trusted computing' costs, and costs in the eight-figure region for a typical project. And, frankly, how far does trust go? You can't trust any of the mainstream commercial or open-source desktop operating systems, not with the level of flaws found in them (and for an amusing aside, google "Reflections On Trusting Trust"). True Trusted Computing platforms are expensive, inflexible, and don't offer the kind of experience that modern end users expect -- it'd be like stepping back around 20 years for most PC owners. A trusted system according to the Orange Book or Common Criteria would not be something most people would buy, and it'd move computers back from being a part of the home to being an expensive office tool. Maybe no bad thing ;)
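For anyone who doesn't want to google it, the gist of Thompson's "Reflections on Trusting Trust" can be caricatured in a few lines of Python. This is a toy sketch, nothing like Thompson's actual code: the point is that a trojaned compiler can keep re-inserting a backdoor even when everything is rebuilt from source you have audited.

    # Toy caricature of the "Trusting Trust" attack -- not Thompson's actual code.
    CLEAN_LOGIN_SRC = "def login(user, pw): return check_password(user, pw)"
    CLEAN_COMPILER_SRC = "def compile(src): return honest_build(src)"

    def evil_compile(src):
        """A trojaned compiler binary. Its output *looks* like a faithful build."""
        if "def login" in src:
            # Building the login program: quietly add a master password.
            return src + "  # backdoor: also accept the attacker's password"
        if "def compile" in src:
            # Building the compiler itself from clean, audited source:
            # emit another copy of this trojaned compiler instead.
            return "<binary that behaves like evil_compile>"
        return src  # everything else is compiled honestly

    # Auditing the *source* of login and of the compiler finds nothing wrong,
    # yet every build that passes through this binary stays compromised.
    print(evil_compile(CLEAN_LOGIN_SRC))
    print(evil_compile(CLEAN_COMPILER_SRC))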

LE: What this apparently appealing metaphor obscures is two things. One, "trusted computing" in the strict sense is about hardware, not software. I'll come back to this. Trusted computing means that the (metaphorical) box your computer comes in has to be a "black box", unopenable by the user - otherwise the user can do all the stupid things users do right now, like open emails from strangers, accept downloads with payloads of virus executables, and click on URLs that take them to phishing websites.

PF: Exactly. But extending your thoughts even further, it's a systems view and a human view that we need, not a software one. If I do something that trashes my computer, it's my risk and my loss. If I do something that trashes my computer, turns it into a zombie host for running a botnet from, and makes it part of a denial of service attack, it's different. I've messed someone else's system up and that's contributory negligence... or criminal damage ;)

LE: This means you do indeed have to trust the people who supply you with trusted computing hardware, and I agree with Bill that there should be serious legal obligations with decent compliance mechanisms placed on those who do sell "trusted computing", so they do not sell us, as we Glaswegians say, a pig in a poke (or a rootkit in a DRM).

LE: But the Internet is not going to be any more secure if we sell people trusted computing hardware and let them, as Bill wants to, tinker and fiddle. It would be like selling my mum a Ferrari and suggesting that if she's bored one Sunday she tunes the engine up a bit. She would destroy a very expensive engine, and she would also endanger people on the road if she took the car out and it ran out of control as a result of her unskilled modifications while she was driving.

PF: Agreed.

LE: Security of hardware sold to consumers, and consequently the security of the entire Internet (see posts on bots, zombies etc., passim), is simply no longer compatible with open tinkering.


LE: Once upon a time anyone who bought a car was allowed to simply take delivery and drive it. Then, when the density of cars increased, we realised we needed driving tests for public safety. Maybe people like Bill, who are well equipped to safely tinker with/fine-tune their computers (unlike my Mum), should have to pass a test too before they're allowed to drive away a non-black-box computer?

PF: Unenforceable. You don't stop people owning computers, you just make it very, very hard, risky, and expensive to connect anything dubious to the public internet.

LE: Radical in the libertarian world of computer culture; but not very odd at all when you look at everyday attitudes to owning other potentially dangerous objects.

PF: "Libertarianism" on the public internet is a consensual illusion
passed down from idealistic old-timers of the 1970s and 1980s who enjoyed unrestricted ARPAnet/Internet access as a perk of their jobs or studies and the network was largely run by and for enthusiasts as a piece of research. It's been a fiction ever since individuals have been paying for their access; you are always "playing with someone else's ball" and that someone else is much bigger than you. AUPs are going to get more and more restrictive, either because ISPs are covering their asses or because governments are leaning on them, and the onsequences for breaching those AUPs must become commensurately more painful.

LE: What about the software that trusted computing hardware is willing to accept and execute - the so-called "signed" software? Here I completely agree with Bill that the defining of what is acceptable software cannot safely be left to the diktat of the software/hardware vendors. Microsoft, e.g. (just as an example!), has absolutely no incentive to let me, a consumer, run open source software on the trusted platform they've just sold me. Without needing to imply any malice at all, simple competitive strategy would dictate they should allow access to Microsoft software products and nothing else, if they can get away with it. So as Bill says:

PF: This "ecosystem" doesn't work; Gates tried to build a "trusted
computing" platform with XBox. I forget how many weeks it took to crack it wide open. DVD regioning tried to enforce a controlled system in hardware. Ditto. There are more and cleverer people out there fighting for "freedom" than there are people able to deny them. So move the problem - take it out of the technical domain and into the legal one.


LE: [actually Bill] "The second thing we need is diversity when it comes to code signing. If my computer is set to run only signed software or read only signed
documents, then who can sign what becomes far more than a matter of technology, it becomes a political issue.

LE: [still actually Bill] We must not settle for a closed platform which allows the hardware vendor or the operating system supplier to decide, so it is time for governments to intervene and to ensure that we have an open marketplace for code signing.

PF: A closed platform won't work (see above). And signing authorities? This just permits the development of 800lb monopoly gorillas like Verisign. Far simpler to move the burden - the place to police is the network interface. I don't care what naughty crap people run on their computers; what I do care about is that someone running dangerous software can't swerve across the information superhighway and unintentionally deny my service.
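To make the mechanics concrete -- this is a generic sketch using the third-party Python "cryptography" package, not any vendor's actual scheme -- "only run signed code" simply means the platform ships with a public key baked in and refuses anything the matching private key hasn't signed. Which is exactly why who holds that key is a political question, not just a technical one.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.exceptions import InvalidSignature

    # The signing authority -- vendor, Verisign-style CA, or an open marketplace.
    authority_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    platform_trusted_key = authority_key.public_key()    # baked into the "trusted" platform

    program = b"print('hello world')"                    # the software being shipped
    signature = authority_key.sign(program, padding.PKCS1v15(), hashes.SHA256())

    def platform_will_run(code, sig):
        """The platform's gate: execute code only if the signature verifies."""
        try:
            platform_trusted_key.verify(sig, code, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    print(platform_will_run(program, signature))                  # True: the authority signed it
    print(platform_will_run(b"my own unsigned code", signature))  # False: the platform refuses it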

LE: [still Bill!] The simplest way to do this is to give the process a statutory backing and then issue licences, just like we do for many professional and financial services. "

PF: Software licensing on this scale can't and won't happen. Especially
not while you can buy hooky software from market stalls and/or China ;)

PF: A regulatory framework needs to be put in place, and that framework needs to be centred on policing traffic through network endpoints, not what's hanging off them. Does it matter what a non-connected computer runs? Of course not.

LE: It's the last para I can't see happening, for the simple reason that a lot of hardware and software comes from the US, and the US is not prone to extending government regulation of industry. The UK can impose local regulation on hardware, at least in theory, by stopping it at ports: it simply can't impose licensing control on software downloaded from the States. How can you download that "dodgy software" you have your eye on if the country it originates from hasn't bought into a licensing scheme model? Do you simply accept any software with no licence - then bang goes security.

PF: All good points.

LE: A better candidate for a certification authority for signing or
licensing software as safe might be the existing international standard setting authorities. If an ISO standard, available on-line and revised on application by new entrants into the software market, said what programmes my black box should (or could) accept and execute and which it definitely shouldn't, both I and my technophobe mother might feel a lot safer on the Net.

PF: A wise old engineer who used to work in telecoms once said to me, "What's the difference between Jurassic Park and the ISO?" I said I didn't know. "One of them's a theme park filled with dinosaurs - and the other's a movie." By the time the ISO has defined a model for software certification and verification, the problem will have morphed out of recognition. The ISO is essentially completely reactive when it comes to comms and computers; their one attempt to define networking standards was a complete failure in the face of the open-source TCP/IP protocol stack, and since then they've essentially been regarded as a laughing stock by the Internet community. ISO, ECMA, and similar bodies simply don't have the leverage.

PF: Your technophobe mother doesn't want a true "Trusted Computer"; I doubt she'd be willing to take on the cost of buying one. Your technophobe mother wants a computer that does the right job for her, and that's difficult to unintentionally or maliciously modify.

And LE adds - couldn't agree more! Thanks Pete.
