ccr Home Page

The Von Neumann machine is tapped out

Adapted from Ackley (1996)
The biggest problem facing computer science today is how to go parallel. Broadly construed, this includes not only parallel and distributed computing, networks, distributed AI, and so on, but also the evolution of protocols and standards, and cryptography and computer security generally. The classical Von Neumann architecture has delivered miracles, but as a conceptual organization it is largely tapped out.

To design and build future computation and communication networks, we will either learn from or be forced to rediscover evolutionary biology, which, from the beginning, has dealt with parallel and distributed processing under resource and reliability limits, conflicting individual and collective goals, and limited trust. Artificial life research can and should inform and unify these efforts, but to date it has not.

The central tenet of the ccr project is that "living systems" and "information systems" refer to the same class of systems. The research goals are both theoretical and practical: to understand better the connections between "information" and "life", and to design, build, deploy, and evolve experimental systems that explore and extend the "living computation" framework.


Ackley, D. H. (1996, to appear). ccr: A Network of Worlds for Research. In Artificial Life V. MIT Press.