Artificial Life and Catastrophe Soup

24 August, 2008

The Telegraph reports an observation from Prof Mark Bedau at a recent Artificial Life conference that developments in simulated evolution over the last twenty years have not been as fruitful as originally hoped. Whilst insisting this did not in any way support an intelligent design hypothesis (the remark nonetheless prompted a flurry of related blog posts), he did say we need a better understanding of the self-organisation processes involved.

These simulated evolution experiments are finding equilibrium states which are not conducive to evolving higher organisms. Complex life-forms never emerge because very simple parasitic organisms, akin to viruses, come to dominate. One imagines genetic algorithms starting from a random state, exploring the local fitness landscape, and then settling into fairly stable configurations in which only minor mutations survive. Punctuated-equilibrium theory holds that evolution happens at a variable rate, and that this rate is related to extinction cascades within the ecosystem. The magnitudes of these cascades follow a power-law distribution, and the largest produce mass extinctions, which in turn provide rich opportunities for rapid evolution by vacating niches previously occupied by dominant species.
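
To make the local-optimum trap concrete, here is a minimal sketch in Python (my own illustration, not from the conference report): a toy genetic algorithm on an invented, rugged one-dimensional fitness function. With only small mutations, the population tends to park on whichever peak it reaches first, which is the kind of homeostasis described above.

```python
# A toy illustration (not from the original post): a minimal genetic algorithm
# on an invented, rugged 1-D fitness landscape. With only small mutations the
# population tends to settle on the nearest local peak and stay there.
import math
import random

def fitness(x):
    # Rugged landscape: local ripples superimposed on a broad peak near x = 5.
    return math.sin(3 * x) + 0.5 * math.sin(17 * x) - 0.05 * (x - 5) ** 2

def evolve(pop_size=50, generations=200, mutation_sigma=0.02):
    # Random initial population on [0, 10].
    population = [random.uniform(0, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fitter half, then refill with slightly mutated copies.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        offspring = [x + random.gauss(0, mutation_sigma) for x in survivors]
        population = survivors + offspring
    return max(population, key=fitness)

print("best genotype found:", evolve())
```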

However, the extinction cascades that result from mutation alone may not be enough to avoid homeostasis. The Gaia Hypothesis treats the biosphere as a single system comprising both living organisms and their environment. Organisms cannot be decoupled from the carbon and water cycles. They shape the environment by changing its structure and components, but, more importantly, environmental changes also instigate massive extinction cascades. If this were not the case, perhaps the Earth would still be ruled by dinosaurs or blue-green algae.

If homeostasis occurs in an evolutionary model then perhaps expecting a killer virus to provide wiggle-room for creativity is not enough. Volcanoes, asteroids, ice ages and other environmental impacts must be included in the system. These events change the topology of the fitness landscape, so that what was previously a good adaptation may now be a disadvantage. This massively relaxes the niche pressure on the survivors, which allows new maxima to be reached. Global optimisation problems can be addressed by an approach like Simulated Annealing, which is something akin to shaking a bucket of stones to bring the large ones to the surface: add some randomness to break the order, so that new structures may form. However, the magnitude of extinction events needs to be distributed properly – there have to be some huge ones. Even the human population has veered close to extinction on several occasions; most notably, we were apparently reduced to something like 10,000 individuals when a supervolcano at Lake Toba erupted about 70,000 years ago.
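
By way of contrast, here is an equally rough sketch of simulated annealing on the same invented landscape, with perturbation sizes drawn from a heavy-tailed Pareto distribution so that rare, very large “catastrophic” jumps occur alongside the usual small moves. The parameters and the choice of distribution are assumptions made purely for illustration.

```python
# An equally rough sketch: simulated annealing on the same invented landscape,
# with step sizes drawn from a heavy-tailed Pareto distribution so that rare,
# very large "catastrophic" jumps occur alongside the usual small moves.
import math
import random

def fitness(x):
    return math.sin(3 * x) + 0.5 * math.sin(17 * x) - 0.05 * (x - 5) ** 2

def anneal(steps=20000, start_temp=2.0, alpha=1.5):
    x = best = random.uniform(0, 10)
    for i in range(steps):
        temp = start_temp * (1 - i / steps) + 1e-3  # cool down gradually
        # Heavy-tailed step: mostly tiny moves, occasionally a huge jump.
        step = random.choice([-1, 1]) * 0.01 * random.paretovariate(alpha)
        candidate = min(10.0, max(0.0, x + step))
        delta = fitness(candidate) - fitness(x)
        # Accept improvements; accept worse moves with probability exp(delta/temp).
        if delta > 0 or random.random() < math.exp(delta / temp):
            x = candidate
        if fitness(x) > fitness(best):
            best = x
    return best

print("best found by annealing:", anneal())
```

The heavy tail plays the role of the occasional supervolcano: most steps are tiny adjustments, but every so often the system is thrown far across the landscape.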

When a forest is felled there is much room for growth, but biodiversity is hugely reduced since species provide each other’s environment. The ecosystem must bootstrap itself. Storms and forest fires provide catastrophes on the intermediate scale. Minor adjustments are made when a single tree falls and creates a clearing, giving a small window of opportunity to other plants. Meanwhile, a garden sometimes requires drastic tree surgery in order to reduce shade, but this can usually be avoided with regular attention to pruning. Without catastrophes we might still be soup… the question is, can the effects of major impacts on a system ever be simulated with directed minor adjustments? In the context of social media or organisational knowledge management, can revolutions ever be achieved through incremental change? And can Google be beaten by a head-on assault, or does the playing field have to first be redefined?

Web Science and the Novelty Engine

17 July, 2008

Web Science
There is a call from Sir Tim Berners-Lee and Wendy Hall for a new academic discipline that studies the Web.

Although this may focus on the Web and its success stories by incorporating lessons learnt in computer and social science, I would hope it also draws heavily on work done over the past few decades in the natural sciences, given that pretty much every interesting system studied these days yields knowledge about complex networks. Web Science could be a real destination for generalists!

Social Semantic Web
Glad to see people thinking about the crossover between social networking and the semantic web. It will be interesting to see the spread of papers at the Stanford University symposium.

Novelty Engine
According to Nicholas Carr, internet technology is bombarding us with so much information that we are all rapidly losing our attention spans. This not only forces us to speed-read everything, but also prevents us from contemplating the deeper issues. One reason social network tools like Digg and Twitter help is that we trust others to read and filter on our behalf. However, there is still a great deal to sift through to find the knowledge nuggets.

What if we were also able to trust technology to do the reading and filtering? We are already seeing micro-blogging bots at the broadcast end. Social web browsers would also benefit from some intelligence, enabling messages to be organised and filtered on receipt. Again, perhaps the focus would be on identifying posts which are “related but novel”, or at least on routing them into personalised semantic buckets. Are there tools already out there? For example, how much of this can you do with Flock? What would it take for us to build trust in personalised blog-reading bots?
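
As a thought experiment, here is a hand-rolled sketch of a crude “related but novel” filter using simple word-overlap (Jaccard) similarity. The thresholds and helper names are hypothetical, and a real blog-reading bot would presumably use something richer than bag-of-words.

```python
# A hand-rolled, purely hypothetical "related but novel" filter: score an
# incoming post against posts already read using word-overlap (Jaccard)
# similarity. The thresholds are invented for illustration.
def tokens(text):
    return {word.lower().strip(".,!?") for word in text.split()}

def similarity(a, b):
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb) if (ta | tb) else 0.0

def related_but_novel(post, seen_posts, related_min=0.15, novel_max=0.6):
    # "Related": similar enough to at least one post already read.
    # "Novel": not nearly identical to anything already read.
    scores = [similarity(post, seen) for seen in seen_posts]
    return any(s >= related_min for s in scores) and all(s <= novel_max for s in scores)

already_read = [
    "Notes on genetic algorithms and fitness landscapes",
    "Simulated annealing for global optimisation",
]
print(related_but_novel("A new paper on annealing schedules and landscapes", already_read))
```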