On the other hand, relatively ``simple'' living organisms such as bacteria are capable of incorporating ``stray'' bits of external code from their environment into their ``operating systems''---as when a gene coding for drug resistance is observed to ``jump species''---to the dismay, at least, of the complex organisms that produce the drug. Such promiscuity may be explained by the relatively low cost of producing a system combined with the potential gains to be had by ``stealing code.''
Is a computer system more like a collie or an E. coli? On the one hand, even a personal computer is an expensive investment, and if it is used productively its value grows far beyond its capital cost. On the other hand, personal computers today are a motley patchwork of code from dozens of sources, with essentially no ``sense of self'' (for an exception, see, e.g., Forrest, 1996), and only the most rudimentary and shallow of defenses.
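To make the cited ``sense of self'' concrete, here is a minimal sketch of the idea behind the Forrest (1996) line of work: summarize normal behavior as the set of short system-call sequences (n-grams) seen during training, and flag sequences never seen before as anomalous. This is a deliberate simplification of that work, and the call traces below are invented for illustration.

```python
def build_self(traces, n=3):
    """Collect every length-n window appearing in the normal traces."""
    self_db = set()
    for trace in traces:
        for i in range(len(trace) - n + 1):
            self_db.add(tuple(trace[i:i + n]))
    return self_db

def anomaly_count(trace, self_db, n=3):
    """Count windows in a new trace never seen during training."""
    return sum(
        1
        for i in range(len(trace) - n + 1)
        if tuple(trace[i:i + n]) not in self_db
    )

# Invented traces standing in for observed system-call sequences.
normal = [["open", "read", "write", "close"],
          ["open", "read", "read", "close"]]
db = build_self(normal)

print(anomaly_count(["open", "read", "write", "close"], db))   # prints 0
print(anomaly_count(["open", "exec", "socket", "close"], db))  # prints 2
```

Even this toy version shows the distinguishing property the essay is after: the detector is defined by what this particular system normally does, rather than by a vendor-supplied list of known attacks.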
In the long run, we can be quite confident that this embarrassing combination of traits will not persist---in the language of evolutionary biology, it is not an ``evolutionarily stable strategy'' (Maynard Smith, 1982). The essentially immediate explosion of computer viruses following the rise of personal computers is testament to this. Viewed in evolutionary terms, it makes no more sense to rail at the immorality of the virus writers who exploited the situation than it does to blame the system developers who created it. In both cases the protagonists were simply exploiting an opportunity that existed in the environment of the time. Personal computer manufacturers could cut costs by eliminating even the most basic immunologic mechanisms, such as a protected kernel mode, and rationalize the decision by observing that computer viruses were not then much of a problem. At the time, it was plausible to claim that these were ``personal'' computers that would not be networked like more expensive systems. On the flip side, virus writers and other attackers immediately exploited the results, rationalizing their behavior, where they felt the need to, by observing that they were just revealing obvious design problems that could easily be exploited for gain, and that the cheapskate manufacturers were papering over rather than fixing.
Loyalty and ``personal'' computing
Today, the appellation ``personal computer'' is in important ways a
misnomer. Although individually owned, the personal computer does not
``know'' its owner in any significant way, and that's just as well
because it is fundamentally unable to distinguish between what is
``inside itself,'' and therefore to be trusted with sensitive
information (beginning merely with passwords), and what is not. If
future personal
computers do not make a strong and credible case for loyalty
to their owners, all the graphical interface and ease-of-use
improvements in the world will not get people to use them for serious
work. On the other hand, if a system demonstrates that it is manifestly
watching out for its owner's best interests, first, last, and always,
from hardware to software to data and communications, people would be
willing to clap rocks together in Morse code to interact with it.
References