Establishing and maintaining those boundaries, today, is a disaster. Modern computer security amounts to a porous hodge-podge of corporate firewalls, third-party virus scanners, hastily written and sporadically applied operating system patches, and a myriad of woefully inadequate password authentication schemes. Dozens or hundreds of break-ins occur daily, computer virus infections are a fact of everyday life for millions of computer users, and new viruses are detected at a booming rate, even as more and more businesses and business transactions move to the internet, and as software companies race with each other to deliver new ways of embedding code inside data.
Though this situation is far from what one would expect of a thoughtfully engineered, deployed, and maintained system-of-systems, it is precisely what one expects of systems produced by blindly reactive evolutionary processes. It is too easy, for example, simply to blame computer viruses on the early mass-market computer designers, even given the body of knowledge accumulated during the time-sharing era. The tale of the PC and the virus is, rather, one of evolution in action: when the machine was designed, there were essentially no viruses in the wild (indeed, there was no "wild" to speak of), and code exchange took place either in large, system-administrator-managed mainframe environments or within the tiny computer hobbyist community. Why would anybody waste design and manufacturing resources, greatly increase costs, and sacrifice time-to-market just to defend against a non-existent problem?
Having humans in the loop, with all our marvelous cognitive and predictive abilities, and all our philosophical capacity to frame intentions, need not change the qualitative nature of the evolutionary process in the least. Market forces are, in effect, regulated evolutionary forces; in any sufficiently large and distributed system, nobody is in charge, and evolutionary forces are constantly at work.