Gates' War on Terrorism
Week of August 31, 2004
Things are slowly becoming clear. Maybe.
The forerunner of Windows XP, Windows NT, was evaluated as adhering to Orange Book C2 Classification security standards on 15 December 1999.
- The Orange Book was created in 1985, several years before the invention of the World Wide Web.
- Windows NT's C2 predates the first outbreak of the big worms of the new millennium by half a year.
- A C2 classification assumes no 'interlopers' can reach the local area network, and requires that machines running Windows NT not be accessible through the floppy drive, SCSI tape, or CD drive.
- Both Service Pack 6a and a special 'C2 Update' module not part of the shrink wrapped product were necessary to achieve the evaluation.
An Orange Book C2 Classification is a good thing. It's not as high as A1, but only three systems in the world have ever reached that level - Boeing's MLS LAN, the Gemini Trusted Network Processor, and Honeywell SCOMP.
The basic tenet of C2 is discretionary access control: people who own resources on a system 'may' control how others use those resources - but nothing forces them to. If they had to, that would be mandatory access control, a higher security rating.
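The distinction can be sketched in a few lines. This is an illustrative toy model only - the class and method names are invented for the example and correspond to no real Windows NT API:

```python
# Toy model of discretionary access control (DAC).
# Under DAC the owner *may* attach a policy to a resource,
# but nothing obliges them to.

class Resource:
    def __init__(self, owner):
        self.owner = owner
        self.acl = None          # None: the owner has set no policy at all

    def grant(self, user, allowed):
        # Entirely at the owner's discretion - hence 'discretionary'.
        if self.acl is None:
            self.acl = {}
        self.acl[user] = allowed

    def can_read(self, user):
        if user == self.owner:
            return True
        if self.acl is None:
            return True          # no policy set: nothing stops other users
        return self.acl.get(user, False)

doc = Resource(owner="alice")
print(doc.can_read("bob"))       # True - alice never set a policy
doc.grant("bob", False)
print(doc.can_read("bob"))       # False - alice chose to restrict access
```

Under mandatory access control, by contrast, a system-wide policy would apply to every resource whether or not its owner ever lifted a finger - which is exactly why MAC rates higher than C2's DAC.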
David Cutler is a good programmer. He did a good job for Digital Equipment Corporation: he led the design and implementation of their VMS operating system, and aficionados world-wide still regard VMS as virtually bulletproof. [VMS has also been C2 certified.]
Dave always wanted to rewrite VMS in C. He hated Unix but loved C. As soon as he'd finished VMS he suggested the rewrite. He was turned down flat. Several years later he found himself in Seattle, essentially doing the rewrite in C, when word came that DEC were tired of him.
Word came from a few Microsofties.
Dave took his engineers and his new operating system - Mica, built for DEC's cancelled PRISM project - across town to the Redmond campus. Mica was retrofitted to the world of Windows. It is instantly recognisable to any VMS engineer, but it's not VMS. It adheres to a Windows reality, not a VMS one.
Besides, VMS predates the connectivity of the Internet. [For that matter, so does Windows NT.]
Windows NT was supposed to get an Orange Book C2 Classification. And it finally did - with Service Pack 6a. (Windows 2000 and Windows XP are not certified.)
Certification takes a long time. Most often the product is obsolete before the testing process is completed. Microsoft invested a lot of money in attorneys to get this classification for their Windows NT.
In the case of Windows NT it took until Service Pack 6a and a bit of additional software known as the 'C2 Update' - by which time David Cutler and The Tribe were far away from Redmond.
In the end they got it - but to get it they had to disconnect the test machine from the Internet, and by then - December 1999 - David Cutler was long gone - four whole years in fact.
Secure only if 'not connected'.
But then again C2, a standard devised in 1985 (long before the connectivity of today), doesn't test vulnerability to attacks from the Internet anyway.
The C2 test was performed on Compaq professional workstations (5100, 8000) and Proliant server class machines (6500, 7000). The test required that Windows NT be set up to boot only from the fixed disk drive where Windows NT had been loaded - normally not the case with the shrink-wrapped product. No floppies, no SCSI tapes, no CDs...
To quote the SAIC report:
'The evaluation configuration assumes that the physical network infrastructure (eg Ethernet) is protected and controlled by a single security administrative authority. This assumption represents the fact that a C2 evaluation does not generally address cryptography and other means of protection against hostile interlopers that are able to gain physical access to the network media. Instead, the assumed scope of the C2 network evaluation is the 'system security architecture', assessing the ability of Windows NT security architecture to protect resources from inappropriate access via the untrusted user and programming interfaces available within each Windows NT system node on the network.'
Not much has been done or done well with Windows NT since David Cutler left. And if you missed that, yes David Cutler left. He could hardly stand Microsoft while he was there, and they eventually drove him round the bend. He's off racing P1 today and for a while Microsoft was his big sponsor. Microsoft paid the bills so Dave didn't tell the world what he knew about Microsoft.
'It kept me from pissing all over them', was how he put it.
Dave wasn't the only good engineer at Microsoft at this time. All Dave's DEC engineers (both hardware and software - that was part of the deal) were along for the ride. Almost all the crucial code for Windows NT was written not by Microsofties but by the DEC engineers who'd followed Dave and who, out of pride, wanted to finish their product.
They were known for their reliability, their dependability, their keeping regular working hours, and their solid code. They were known as 'The Tribe'.
When Dave Cutler left Microsoft, The Tribe left too. Even the little old lady from Intel who'd done the device driver documentation all those years had had enough. 'If Dave's gone I'm not staying,' she said, emptied her desk, and walked out.
Without Dave around it wasn't fun anymore - all that was left were Microsofties.
Microsoft had to rewrite Windows NT for the followup release. They got into a bit of a bind with Digital Equipment over Windows NT. Cutler had taken the OS and made no pretensions about it being anything but stolen from DEC. DEC found out and sued. Microsoft had to settle. Dave wrote a 'HAL' for the DEC Alpha processor and for the next release everything - all 16 million lines of it - would have to be rewritten.
[It would turn into an estimated 50 million lines instead. So much for Microsoft engineering efficiency.]
Lots of fun. What's worse: Dave was gone now. Microsoft had no one of Dave's caliber. They had his old code there to learn from, but they didn't have his savvy or his experience. They didn't have the dependability of The Tribe.
Worse still, they were Microsofties: zombies susceptible to the attacks of Steve Ballmer, always trying to whip up more frenzy (and drive David Cutler insane). With people like Steve Ballmer hanging around, breathing down your neck, how can you be expected to think clearly, design well, or write one good line of code?
What the world suffers through today is the direct legacy of the above. It affects everyone, not just Windows users. The Internet is a shambles today because of what Microsoft have done and keep doing.
And their Windows XP Service Pack 2 will change none of that.
On the contrary, the bandwidth needed for this idiotic 'automatic update' system could cripple us all at any time. Just imagine: 300 million times 300 megabloats - how much bandwidth is that? Add it up.
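Adding it up is easy enough. Back-of-envelope, worst case - assuming every one of the 300 million machines pulls the full 300 MB:

```python
# Worst-case arithmetic for the SP2 rollout described above:
# 300 million machines, ~300 MB ('megabloats') each.
machines = 300_000_000
megabytes_each = 300

total_mb = machines * megabytes_each      # 90,000,000,000 MB
total_pb = total_mb / 1_000_000_000       # convert to decimal petabytes
print(total_pb)                           # 90.0 - ninety petabytes
```

Ninety petabytes. Not all of it at once, and not all machines will take the whole thing, but that's the order of magnitude of the jam.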
That's more computers going after a single download than ever before - more than can be expected to want Napster or Kazaa or WinZip or any other product: 51% of all the Windows machines out there. No other product has a 51% demographic.
What's a Debian bittorrent to this?
And the size of the download is enormous. It's not 10 MB or something innocent. Three hundred MEGABLOATS. True, not all machines will need it all, but there will be traffic to determine just how much they need. Suffice it to say there will be a LOT of traffic clogging the Internet to take care of Bill Gates' latest jam.
If you check the full list of fixes for Windows XP Service Pack 2 you'll run into an HTML file that's almost 200 kilobytes long. This coming from a vendor who in the autumn of 2001 proclaimed Windows XP 'The Best Program Ever Written We Really Love This Program™' says a lot. It says things about customer expectations and a possible faulty contact with reality on the part of the vendor.
Each of the vulnerabilities listed in the monster document represents a way malfeasant software can take control of a local machine. Each was a hole in the on-disk security.
The Microsoft plan with Service Pack 2 seems to be the following:
- Plug all the holes you find (there are thousands).
- Establish a perimeter control (the firewall).
- Encourage patrols within the perimeter (antivirus software).
- Make sure new holes can be plugged as soon as they're found (Windows update).
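The four-point plan above can be sketched as code. This is a hypothetical illustration, not any real Microsoft component - every name and list here is invented - but it shows the shape of the scheme, and its weak spot: every layer is a blacklist.

```python
# Hypothetical model of the SP2 layered defence: perimeter (firewall),
# patrol (AV signatures), and plugged holes (updates). Illustrative only.

KNOWN_SIGNATURES = {"blaster", "sasser"}     # what the AV patrol can spot
PATCHED_HOLES = {"MS03-026", "MS04-011"}     # holes already plugged
OPEN_PORTS = {80, 443}                       # what the perimeter lets in

def firewall_allows(packet):
    # Perimeter control: only traffic to explicitly opened ports gets in.
    return packet["port"] in OPEN_PORTS

def av_detects(payload):
    # Patrols within the perimeter catch only what's on the list.
    return payload in KNOWN_SIGNATURES

def exploit_succeeds(hole):
    # Automatic update: a plugged hole no longer works; a new one still does.
    return hole not in PATCHED_HOLES

# A novel attack over an open port, unknown payload, unpatched hole:
attack = {"port": 80, "payload": "novel-worm", "hole": "MS04-9999"}
if firewall_allows(attack) and not av_detects(attack["payload"]):
    print(exploit_succeeds(attack["hole"]))  # True - every layer passed
```

Each layer blocks only what has already been catalogued; anything genuinely new walks through all three.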
It makes sense, and it may be the only chance Bill Gates has to save Windows. It also explains why he has been so adamant about people accepting automatic updates. If Microsoft can rely on being able to patch their hundreds of millions of systems at the flick of a switch, attacks can be contained (though not eliminated). If AV data is kept up to date, patrols within the perimeter will be more effective at finding the latest attack vectors. If the firewall is on, breaking into the system becomes more difficult.
Many have wondered why Microsoft's firewall does not check outgoing traffic. The reason may be that Microsoft see the game as over if malfeasant software is already on the machine.
Microsoft seem to be banking on controlling the perimeter (firewall), conducting regular patrols within the perimeter (AV software) and patching holes automatically as fast as they can to contain collective damage.
This would seem to be Bill Gates' bet. The only way he can keep his product and repel the invasion. The only chance he has to win his own war on terrorism. And it might conceivably work.
There are however several things which speak against such a success.
The Microsoft plan is limited because what is being protected is inherently insecure. The beefed-up security in Windows comes not from the firewall or the antivirus software - both are external products. It comes from the iterative process of plugging the holes as they're found.
That the list of holes has grown so long in only three years - for a product once touted as the most secure ever - says reams about expectations and about what is to come. If this many holes were found in three years, what does this say about quality in Redmond? What does it say about Redmond's ability to write secure code? Above all, what does it say about Redmond's ability to design secure systems?
Bill Joy remarked that Windows on the Internet is a big mistake because it's not a secure system, because it wasn't built with security in mind, because it wasn't designed as a multiuser system, and because any system on the Internet is multiuser whether you like it or not - that trojan knocking at your door is your 'multiuser'.
Windows lacks a viable authentication system. True, Microsoft are trying to retrofit such a scheme on things, but Windows was not designed from the ground up to be secure or even proprietary in a multiuser environment.
Once the bad code gets onto the system - once the bad code has found a place to hide on the local machine - it's basically game over. The code will wait for its opportunity to completely take over the machine - install keyloggers, open the machine as a spam relay, dig itself so deep into the underbody that no one can get it out. Antivirus only handles so much of this attack; products such as No Adware and Ad-aware are supposed to do the rest, and security experts have already seen there are vectors which no such program can detect or thwart.
Bill has talked about widening the perimeter: he'd like to see security beefed up on the Internet backbone. But the Internet backbone is already reasonably secure; why should it do the job Bill Gates cannot do? Worse: would it ever succeed?
Inside the Internet Connection Firewall, weaving past the intermittent antivirus patrols, Windows has no security. This is why the system always crashes and hangs: it lacks the defences it's supposed to have. The system was not designed from the ground up to be secure.
A secure system doesn't worry about where the attack comes from; a secure system sees that 'bad' code never has a chance to execute and harm the machine. This bad code might come from anywhere; where it comes from is immaterial. A well designed system looks at itself for the clues, not at virus signature lists.
A well designed system is built on a solid foundation. It's never a question, as with Windows, of first allowing everything and then successively after years and years of abuse, turning all those opportunities off; a well designed system will by definition allow nothing except specific and well controlled operations.
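The two philosophies fit in a few lines. A hypothetical sketch - the names are invented for illustration:

```python
# Two security philosophies, side by side.

def default_allow(operation, blacklist):
    # The model described above: everything runs unless someone
    # has already found it, catalogued it, and blocked it.
    return operation not in blacklist

def default_deny(operation, whitelist):
    # A well designed system: nothing runs unless explicitly permitted.
    return operation in whitelist

blacklist = {"known_worm"}       # grows only after each new round of abuse
whitelist = {"read_own_files"}   # specific, well controlled operations

print(default_allow("brand_new_worm", blacklist))  # True - walks right in
print(default_deny("brand_new_worm", whitelist))   # False - never runs
```

The blacklist is always a step behind the attacker; the whitelist never is. That is the difference between patching a system and designing one.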
Microsoft have never had a secure connected system and most likely never will. No system of Microsoft's has been designed to function securely in a connected world.
You can patrol the perimeter all you want; you can conduct routine checks inside the perimeter; you can get security updates out to people as soon as you can find out about the holes; but this iterative process conveniently - some would say deliberately - ignores the fundamental issue: the system itself is not secure and never will be.