To Bug Or Not To Bug 2
Week of February 25, 2001
Is bug-free programming possible?
There is growing antipathy toward the bugginess and bloat of commercial PC software; yet is it possible to produce a completely bug-free application?
In a word: 'No' - and the gap between what end users see as bugs and what computer scientists see as bugs remains, for now, unbridgeable.
For the end user, a bug is an indication that the application itself has failed. For the computer scientist, a bug is an indication that something somewhere has gone wrong. For the end user, if something has gone wrong, the application must be at fault, and it's time to call customer support; for the computer scientist, if something has gone wrong, it's time to call out the bloodhounds.
Between the silicon itself and the keyboard exist so many levels of abstraction that it is worth enumerating a few of them:
- The silicon itself. Silicon is not perfect, nor are the engineers who alchemize it into something seemingly intelligent and programmable. Two glorious examples of errors on this level are the Intel i860 RISC processor and the Pentium FDIV bug.
- The hardware. Although modern application programming can catch many hardware errors through exception handling, errors on this level can still seep through.
- The underlying assembler. Many systems translate the code of higher-level languages into native assembly and let the system assembler take it from there. Assemblers, too, are programs and may contain errors.
- The compiler. A compiler is impossible to test in any complete fashion; one can only hope for the best and be extremely meticulous in one's development work. Worse, knowing - or even suspecting - that one is dealing with a 'buggy' compiler is one of the most unsettling experiences in the life of a computer scientist. Yet buggy compilers have existed and continue to exist.
- The linker. Like assemblers and compilers, linkers are themselves programs. And whether they are used to create dynamic link libraries, object libraries or final executables, a flawed linker can produce applications that fail despite proper coding.
- The application programming interface. Represented in modern windowed systems by the DLL, this interface can (and surely does) contain hundreds - even thousands - of minor and major errors. Security advisories provide a never-ending stream of information about such errors.
- The 'moth'. Grace Hopper's famous specimen is the perfect example: An error occurred not because of faulty programming but because a moth decided to set up house in a relay of the Harvard Mark II.
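The Pentium FDIV entry in the list above can still be demonstrated today with Thomas Nicely's famous 1994 test values - a minimal sketch (the values are the well-known test case; the arithmetic itself is ordinary floating point):

```python
# Nicely's test case for the Pentium FDIV bug: a few entries were
# missing from the FPU's division lookup table, so on a flawed Pentium
# x - (x / y) * y came out as roughly 256 for this pair of values.
x = 4195835.0
y = 3145727.0
residue = x - (x / y) * y
# On a correct FPU the residue is zero, or at worst on the order of
# a rounding error - nowhere near the 256 the flawed chip produced.
print(residue)
```

The same expression is often used as a one-line sanity check precisely because it turns a single wrong table entry into an error eleven orders of magnitude larger than ordinary rounding noise.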
The list can go on and on. What is important to understand here is that there are thousands - or more like millions - of variables involved in the proper execution of any application - and that things can (and do) go wrong - and on different levels as well - all the time.
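As a small illustration of the point above - that failures at lower levels surface inside application code - here is a hedged sketch in Python (the low-level failure is simulated with a path that cannot be opened; a genuine disk or driver error would arrive through the same channel):

```python
# Failures in the layers beneath an application - disks, drivers,
# the operating system - typically reach the application as exceptions.
# Here a low-level failure (simulated with an unopenable path) arrives
# in application code as an OSError.
def read_device(path):
    try:
        with open(path, "rb") as device:
            return device.read()
    except OSError as exc:
        # The application sees only the exception, not which layer failed.
        print(f"lower-level failure surfaced as {type(exc).__name__}")
        return None

read_device("/nonexistent/device")
```

The application can catch the exception, but it cannot tell from here whether the cause was its own mistake, the operating system, the driver or the hardware - which is exactly why end users and computer scientists assign blame so differently.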
That being said, it is time to address the claim, heard more and more often today, that it is possible to create bug-free programs. It is not possible, and any contention otherwise is both pretentious and naive.
However, it is possible to create relatively bug-free software - to devote more time to perfecting one's code than to piling on new (buggy) features. Robustness is a reality, and if the proponents of bug-free software were to use this term instead, there would be no argument. For robust code is (and should be) attainable - and sloppy programming can and should be shunned.
All too many programming temples of today couldn't care less how much slop they churn out and put on computer store shelves. They're not worried about the vagaries of client CPUs and peripheral devices; they're not worried about their assemblers, compilers and linkers; they're not worried about bugs in the application programming interfaces; and they're certainly not worried about their own code either.
So while we cannot hope to attain perfection at every level of computer production, we can certainly insist on better performance from our software vendors. And if the complainers were to use a vocabulary more in tune with reality, they might find it easier to have their voices heard.