ARTIFICIAL LIFE
Gourab Mukhopadhyay
What is Life?
There is no generally accepted definition of life.
In general, it can be said that the conditions that distinguish living organisms
from inorganic objects or dead organisms are growth through metabolism, a means
of reproduction, and internal regulation in response to the environment.
Even though the ability to reproduce is considered essential to life, this may be
more true for species than for individual organisms. Some animals are incapable
of reproducing, e.g. mules, soldier ants and bees, or simply infertile
organisms. Does this mean they are not alive?
What is Artificial Life?
Artificial life researchers have often been divided into two main groups:

The strong alife position states that life is a process which can be abstracted
away from any particular medium.

The weak alife position denies the possibility of generating a "living process"
outside of a carbon-based chemical solution. Its researchers try instead to
mimic life processes to understand the appearance of individual phenomena.

The study of man-made systems that exhibit behaviors characteristic of
natural living systems.

It came into being at the end of the ’80s, when Christopher G. Langton
organized the first workshop on the subject at Los Alamos National
Laboratory in 1987, with the title "International Conference on the Synthesis
and Simulation of Living Systems".
Basics of Artificial Life

Artificial Life tries to transcend the limitation to Earth-bound, carbon-chain
life, on the assumption that life is a property of the organization of matter,
rather than a property of the matter itself.

If we capture the essential spirit of ant behavior in the rules for virtual ants,
the virtual ants in a simulated ant colony should behave as real ants do in a
real ant colony.

Synthesis of complex systems from many simple interacting entities.

Emergence: a property of a system as a whole that is not contained in any of its
parts. Such emergent behavior results from the interaction of the elements
of the system, which act following local, low-level rules.
The Goal of Artificial Life

The goal of Artificial Life is not only to provide biological models but also to
investigate general principles of Life.

These principles can be investigated in their own right, without necessarily
having a direct natural equivalent.
Linear vs. Non-Linear Models

Linear models are unable to describe many natural phenomena.

In a linear model, the whole is the sum of its parts, and small changes in
model parameters have proportionally small effects on the behavior of the
model.

Many phenomena, such as weather, growth of plants, traffic jams, flocking of
birds, stock market crashes, development of multi-cellular organisms, pattern
formation in nature (for example on sea shells and butterflies), evolution,
intelligence, and so forth, have resisted any linearization; that is, no
satisfying linear model was ever found.


Non-linear models can exhibit a number of features not known from linear
ones:

Chaos: Small changes in parameters or initial conditions can lead to
qualitatively different outcomes (see the sketch after this list).

Emergent phenomena: Occurrence of higher level features that weren’t
explicitly modelled.

As a main disadvantage, non-linear models typically cannot be solved
analytically, in contrast with linear models. Non-linear modeling became
manageable only when fast computers became available.
Models used in Artificial Life are always non-linear.
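
A minimal sketch of the chaos point above (an illustration added here; the
logistic map itself is not discussed in the text): the logistic map
x' = r * x * (1 - x) is a one-line non-linear model, and two almost identical
initial conditions diverge completely in its chaotic regime (r = 4).

def logistic(x, r=4.0):
    # one step of the non-linear logistic map x' = r * x * (1 - x)
    return r * x * (1.0 - x)

a, b = 0.300000, 0.300001   # two almost identical initial conditions
for _ in range(60):
    a, b = logistic(a), logistic(b)

print(a, b)                 # after 60 steps the trajectories are unrelated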
Lindenmayer Systems

Lindenmayer Systems or L-systems are a mathematical formalism proposed in
1968 by biologist Aristid Lindenmayer as a basis for an axiomatic theory on
biological development.

The basic idea underlying L-systems is rewriting: components of a single
object are replaced using predefined rewriting rules.

Its main application field is the realistic modelling of plants and fractals.

They are based on symbolic rules that define the generation of a graphic
structure, starting from a sequence of characters.

Only a small amount of information is needed to represent very complex
models.
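
A minimal L-system rewriter as a sketch, using Lindenmayer's original "algae"
system (axiom A, rules A -> AB and B -> A) as the example; every symbol of the
current word is rewritten in parallel at each generation.

def rewrite(word, rules):
    # rewrite every symbol of the word in parallel using the rule table
    return "".join(rules.get(symbol, symbol) for symbol in word)

rules = {"A": "AB", "B": "A"}   # Lindenmayer's algae system
word = "A"                      # the axiom
for generation in range(6):
    print(generation, word)
    word = rewrite(word, rules)
# prints A, AB, ABA, ABAAB, ... -- word lengths follow the Fibonacci numbers

For plant modelling, the symbols of the final string are then interpreted as
turtle-graphics drawing commands (draw a segment, turn, branch).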
Diffusion Limited Aggregation (DLA)

Introduced by T. A. Witten and L. M. Sander in "Diffusion-Limited Aggregation,
a Kinetic Critical Phenomenon", Physical Review Letters, vol. 47, 1981.

It reproduces the growth of organisms like mosses, seaweed or lichen,
and chemical processes such as electrolysis or the crystallization of certain
products.

A number of moving particles are freed inside an enclosure where one or more
particles are already fixed.

Free particles keep moving with Brownian motion until they come next to a fixed
particle; at that point they become fixed too.
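
A minimal sketch of this process on a square grid (grid size, walker count and
the wrap-around boundary are assumptions of the demo, not part of the model):

import random

N = 101                                   # grid size (demo assumption)
fixed = {(N // 2, N // 2)}                # the initial seed particle

def neighbors(x, y):
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

for _ in range(300):                      # release 300 free particles
    x, y = random.randrange(N), random.randrange(N)
    while (x, y) in fixed:                # re-roll starts inside the cluster
        x, y = random.randrange(N), random.randrange(N)
    while True:
        if any(n in fixed for n in neighbors(x, y)):
            fixed.add((x, y))             # touched the cluster: stick here
            break
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = (x + dx) % N, (y + dy) % N # one Brownian step (wrapping walls)

print("cluster size:", len(fixed))        # the familiar branched DLA cluster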
Agent-based Modelling

A computational model based on the analysis of specific individuals situated in
an environment, used for the study of complex systems.

The model was conceptually developed at the end of the ’40s, but had to
wait for the arrival of computers to develop fully.

The idea is to build the agents, or computational devices, and simulate them
in parallel in order to model the real phenomenon being analysed.

The resulting process is the emergence from the lower levels of the social
system (micro) towards the upper levels (macro).

Agent-based simulations have two essential components:

Agents

Environment

The environment has a certain autonomy from the actions of the agents,
although it can be modified by their behaviour.

The interaction between the agents is simulated, as well as the interaction
between the agents and their surrounding environment.
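
A minimal agent/environment skeleton (an illustrative sketch, not a model from
the text; the "move toward the nearest other agent" rule is an arbitrary local
rule, chosen so that a macro pattern, clustering, emerges from micro behavior):

import random

class Agent:
    def __init__(self, pos):
        self.pos = pos

    def step(self, env):
        # local, low-level rule: take one step toward the nearest other agent
        others = [a.pos for a in env.agents if a is not self]
        target = min(others, key=lambda p: abs(p - self.pos))
        self.pos += (target > self.pos) - (target < self.pos)

class Environment:
    def __init__(self, n_agents, size=100):
        self.size = size
        self.agents = [Agent(random.randrange(size)) for _ in range(n_agents)]

    def step(self):
        for agent in self.agents:          # update every agent each tick
            agent.step(self)

env = Environment(10)
for _ in range(50):
    env.step()
print(sorted(a.pos for a in env.agents))   # agents end up clustered (macro)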
Distributed Intelligence

Complex behaviour patterns of a group, in which there is no central
command.

It arises from “emergent behaviour”.

It appears in the group as a whole, but is not explicitly programmed into any
of the individual members of the group.

Simple behaviour rules in the individual members of the group can cause a
complex behaviour pattern of the group as a whole.

The group is able to solve complex problems using only local information.

Examples: social insects, the immune system, neural network processing
Didabots

An experiment carried out in 1996, studying the collective behaviour of simple
robots, called Didabots.

The main idea is to verify that apparently complex behaviour patterns can
be a consequence of very simple rules that guide the interactions between
the entities and the environment.

This idea has been successfully applied for example to the study of social
insects.

Infrared sensors can be used to detect proximity up to about 5 cm.

The robots are programmed exclusively for avoiding obstacles.

Stimulation of the left sensor makes the bot turn a bit to the right, and vice
versa, as sketched below.
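
The control rule is simple enough to state as code. A sketch of it (the sensor
scale, threshold and turn angle are invented for the illustration):

def didabot_control(left_ir, right_ir, threshold=0.5, turn=15):
    """Return a steering angle in degrees: positive = turn right."""
    if left_ir > threshold:      # obstacle sensed on the left
        return +turn             # turn a bit to the right
    if right_ir > threshold:     # obstacle sensed on the right
        return -turn             # turn a bit to the left
    return 0                     # no obstacle: keep driving straight

print(didabot_control(0.8, 0.1))   # left sensor stimulated -> turn right (15)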
Social Insects

The main quality of the so-called social insects, such as ants and bees, is that
they form part of a self-organised group, whose key aspect is "simplicity".

These insects solve their complex problems through the sum of simple
interactions of every individual insect.
Bees

The distribution of brood and nourishment in the comb of honey bees is
not random, but forms a regular pattern.

The central brooding region is close to a region containing pollen and one
containing nectar (providing protein and carbohydrates for the brood).

Due to the intake and removal of pollen and nectar, the pattern changes all
the time on a local scale, but it stays stable when observed on a more global
scale.

Ants

Ants are able to find the shortest path between a food source and their
anthill without using visual references.

They are also able to find a new path, the shortest one, when a new
obstacle appears and the old path cannot be used any more.

Even though an isolated ant moves randomly, it prefers to follow a
pheromone-rich path. In a group, then, ants are able to create and maintain a
path through the pheromones they leave as they walk.

Ants that select the shortest path reach their destination sooner, so the
shortest path receives a higher amount of pheromone per unit of time. As a
consequence, a higher number of ants will follow this shorter path, as the
sketch below shows.
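
A minimal sketch of this positive-feedback loop, using the classic two-path
("double bridge") idealization; the path lengths, evaporation rate and deposit
rule are assumptions of the demo:

import random

length = {"short": 1.0, "long": 2.0}
pheromone = {"short": 1.0, "long": 1.0}    # start undifferentiated
evaporation = 0.02

for _ in range(1000):                      # 1000 ant trips
    total = pheromone["short"] + pheromone["long"]
    # an ant prefers the pheromone-rich path, proportionally
    path = "short" if random.random() < pheromone["short"] / total else "long"
    pheromone[path] += 1.0 / length[path]  # shorter trip: more pheromone/time
    for p in pheromone:                    # pheromone slowly evaporates
        pheromone[p] *= 1.0 - evaporation

print(pheromone)   # almost all pheromone ends up on the short path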
Self Replication

Self Replication is the process in which something makes copies of itself.

Biological cells, in an adequate environment, replicate themselves through
cellular division.

Biological viruses reproduce themselves by using the reproductive
mechanisms of the cells they infect.

Computer viruses reproduce themselves by using the hardware and software
already present in computers.

Memes reproduce themselves using human minds as their reproductive
machinery.
Self-Replicating Cellular Automata

In 1948, the mathematician John von Neumann approached the topic of
self-replication from an abstract point of view. He used cellular automata and
pointed out for the first time that it was necessary to distinguish between
hardware and software.

Unfortunately, von Neumann’s self-reproducing automaton was too big
(80x400 cells) and too complex (29 states) to be implemented.

In 1968, E. F. Codd lowered the number of needed states from 29 to 8,
introducing the concept of ‘sheaths’: two layers of a particular state
enclosing a single ‘wire’ of information flow.

In 1979, C. Langton developed an automaton with self-reproductive capacity. He
realised that such a structure need not be capable of universal construction
like those of von Neumann and Codd; it just needs to be able to reproduce
its own structure.
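
Langton's loop itself is too intricate to reproduce here, but a far simpler
cellular automaton already shows trivial self-replication: under the parity
rule attributed to Edward Fredkin, where every cell becomes the XOR of its four
von Neumann neighbours, any small pattern reappears as four shifted copies
after 2^k steps. A sketch:

def step(cells):
    """One parity-rule update: cell := XOR of its 4 von Neumann neighbours."""
    deltas = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    # only cells adjacent to a live cell can become alive
    candidates = {(x + dx, y + dy) for (x, y) in cells
                  for dx, dy in deltas + [(0, 0)]}
    return {(x, y) for (x, y) in candidates
            if sum((x + dx, y + dy) in cells for dx, dy in deltas) % 2 == 1}

pattern = {(0, 0), (1, 0), (0, 1)}    # an arbitrary small pattern
cells = set(pattern)
for _ in range(4):                    # 2^2 = 4 steps
    cells = step(cells)
# four copies of the pattern, shifted by 4 cells in each direction
print(len(cells) == 4 * len(pattern))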
Biomorphs

Created by Richard Dawkins in the third chapter of his book "The Blind
Watchmaker".

The program is able to show the power of micromutations and cumulative
selection.

Biomorph Viewer lets the user move through the genetic space (of 9 dimensions
in this case), selecting the desired shape at each step.

The user’s eye takes the role of natural selection.
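
A sketch of the selection loop (the 9-gene vector follows Dawkins' description;
the numeric target score is a toy stand-in for the user's eye, which is the
real selector in Biomorph Viewer):

TARGET = [3, -5, 2, 0, 7, -1, 4, 4, -2]       # illustrative "desired shape"

def offspring(genes):
    """All single-gene micromutations (+1 or -1 in one of the 9 genes)."""
    kids = []
    for i in range(len(genes)):
        for delta in (-1, +1):
            kid = list(genes)
            kid[i] += delta
            kids.append(kid)
    return kids

def score(genes):                              # proxy for the selecting eye
    return -sum(abs(g - t) for g, t in zip(genes, TARGET))

genes = [0] * 9                                # origin of the 9-D gene space
for generation in range(40):
    genes = max(offspring(genes), key=score)   # pick the "best-looking" child

print(genes)   # micromutations have accumulated all the way to TARGET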
Evolutionary Algorithms

Genetic Algorithms: the most common form of evolutionary algorithms. The
solution to a problem is searched for as a string of numbers (usually binary),
applying mutation and recombination operators and performing selection on the
candidate solutions.

Genetic Programming: solutions in this case are computer programs, and their
fitness is determined by their ability to solve a computational problem.
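
A minimal genetic algorithm sketch on the classic OneMax toy problem,
maximizing the number of 1s in a binary string (the problem, population size
and operator choices are illustrative assumptions), using the operators named
above: selection, recombination (one-point crossover) and mutation.

import random

LENGTH, POP, GENERATIONS = 32, 40, 60

def fitness(bits):
    return sum(bits)                     # OneMax: count the 1s

def crossover(a, b):
    cut = random.randrange(1, LENGTH)    # one-point recombination
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]             # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

print(max(fitness(ind) for ind in pop), "/", LENGTH)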
Artificial Chemistry

Artificial Chemistry is the computer simulation of chemical processes, in a
way similar to those found in the real world.

It can be the foundation of an artificial life program, in which case some
kind of organic chemistry is usually simulated.
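
A minimal abstract artificial chemistry as an illustration (this "number
chemistry", in which one molecule divides another on collision, is a standard
toy example and not from the text): integers collide in a well-stirred
reactor, composites are broken down, and prime "molecules" tend to accumulate.

import random

reactor = [random.randrange(2, 1000) for _ in range(200)]

for _ in range(100000):                    # random pairwise collisions
    i, j = random.sample(range(len(reactor)), 2)
    a, b = reactor[i], reactor[j]
    if a != b and b % a == 0:              # a "reacts" with b: b -> b / a
        reactor[j] = b // a

print(sorted(set(reactor))[:10])           # mostly small primes survive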