Scientific theories and methods modelled on natural selection.


Manfred Eigen's theory of inorganic natural selection.

The most basic difference between life forms on earth is not between plants and animals but between cells with or without a nucleus. There is a fundamental difference of emphasis in the ways in which the two kinds evolve. Cells without a nucleus, such as bacteria, evolve partly by random mutations of genes, some of which are naturally selected over others. This is the traditional conception of evolution since Darwin. It is an important factor for bacteria because they reproduce and multiply so rapidly that small mutations to their heredity soon take effect over their whole gene pool.

The other important factor in the evolution of bacteria is DNA recombination, or the global trading of genes. This is rather as if humans had a natural capacity for genetic engineering, being able to splice bits of each other's genetic inheritance.

Creatures with a cell nucleus, however, originated as separate creatures that permanently combined. Lynn Margulis suggested that this incorporated symbiosis, or 'symbiogenesis', is the main way higher organisms have evolved. The claim is that life achieved its global empire not by combat but by networking.


It has long been questioned whether life arose simply out of the natural selection of chance genetic variations. Likewise, natural selection from a menu of randomly combined chemical soups has seemed wildly improbable. This is despite Miller's famous experiment, which produced, in a flask, some basic constituents of life by electric discharge, simulating lightning acting on a few chemicals present on earth in its primeval state.

Manfred Eigen suggested that life-like, but not living, stably reproducing systems of chemicals evolved thru natural selection, before evolving into life itself. The metabolism of living cells depends on catalysts, mainly enzymes. Catalytic reactions form complex networks, including closed loops or catalytic cycles. Enzymes produced in one cycle act as catalysts for a further cycle, and a closed loop of such linked cycles is a so-called hypercycle.

Eigen proposed this mechanism for self-organising chemical systems, with inorganic hypercycles in competition for natural selection. These systems are far from equilibrium, that is, 'fed' by an energy flow. This may force instabilities in the system. But, like the Chinese word for 'crisis', an instability can also mean an 'opportunity' for new development, which may be amplified thru positive feed-back loops.
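
The dynamics Eigen had in mind can be made concrete with a few lines of code. The following is a minimal numerical sketch, not Eigen's own mathematics: a small closed loop of chemical species, each catalysed by its neighbour, competing for a fixed total concentration. The rate constants and starting concentrations are invented for the sake of the example.

```python
# A minimal sketch of hypercycle dynamics (an illustration, not Eigen's own model).
# Species x[i] is catalysed by its predecessor x[i-1] in a closed loop; the
# outflow term phi keeps the total concentration constant, so the members of
# the cycle compete for a fixed resource, as in natural selection.

def hypercycle_step(x, k, dt=0.01):
    """One Euler step of the elementary hypercycle equations."""
    n = len(x)
    phi = sum(k[i] * x[i] * x[i - 1] for i in range(n))   # total catalysed production
    growth = [k[i] * x[i] * x[i - 1] - x[i] * phi for i in range(n)]
    return [x[i] + dt * growth[i] for i in range(n)]

x = [0.5, 0.3, 0.2]        # invented starting concentrations (summing to 1)
k = [1.0, 1.5, 0.8]        # invented catalytic rate constants

for step in range(20000):
    x = hypercycle_step(x, k)

print([round(xi, 3) for xi in x])   # all three members persist: the cycle survives as a whole
```

No member of the loop drives the others extinct. What is selected is the cycle as a whole, which is the sense in which the hypercycle, rather than any single chemical, is the unit of competition.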


Environmental selection in memory, connectionist computation and neural networks.


Western thought has depended mostly on analysis, but there are revivals of the holistic approach. There was Kurt Lewin's gestalt psychology as an alternative interpretation to that offered by behaviorism. Behaviorism, itself, has been likened to a Darwinism of the environment. ( Talcott Parsons' The Structure of Social Action ).

The brain has been compared to a computer. Computers or thinking machines have been based on linear thinking, starting with a few basic principles and then following these to their logical conclusions.

In the 1980s, a different kind of computer was becoming fashionable as a model for the working of the brain, seen as a neural network. Connectionist computation is a descendant of nineteenth-century associationist psychology, which comes from the British philosophy of empiricism, rather than from the rationalist philosophy that seeks to derive knowledge from first principles.

I can best describe the difference between the two approaches from my personal experience as a child. When learning to read, I didnt much bother to consult a dictionary. It will define the meaning of a word you dont know. But that is a bit of an interruption of an adventure story.
For that matter, when adults use an obscure word, it would be a very scholarly child who went off to consult a dictionary. One would lose the rest of the conversation and be worse off than staying put.

Similarly, this child stayed put at his book. Funnily enough, I learned the meaning of words I didnt know, tho I had never found out their meaning from a dictionary. I believe this was because I gradually picked up the meaning of a strange word from coming across it several times in books, building up a sense of what it meant from the slightly different contexts in which it was used, without ever having been actually told its meaning.

To some extent, I was selecting the books I read. But the chances of which books I happened upon were naturally selecting for me the meanings of words in the particular environment of books where I found them. These chances, more or less, reinforce certain contexts of word-usage and discriminate against inapplicable usages.

Passing from one book to another, the reader inherits an accumulation of slightly varying meanings for any given word. These meanings will vary about a norm, with some tolerance for deviation from that most usual meaning of the word in question. This tolerance of variation allows words to adapt their meanings appropriately to a changing society.

In a living language, no hard and fast definitions are laid down by some text with a monopoly of authority. Instead, all the authors one reads are competing authorities, by whom a consensus of meaning is reached. Even dictionaries compete as guidelines.
Just as shared speech is a democratic usage, so is the shared authority of many authors.

Words derive a species-specific meaning from their contexts in a semantic ecology of words. As language evolves with a changing society, the meanings of words may be adapted to new situations. It has been claimed that the evolution of language and of tools are the two traits of human society most akin to biotic evolution.

As a child, learning a new word was a bit like pegging down a tent. Each time one came across it was like adding a peg in one's memory, until one had its full meaning securely anchored in one's mind. One may move the pegs about a bit, in relation to each other, to straighten the tent's construction. Likewise, one's idea of a new word may shift about a bit, as one builds up a knowledge of its varying usage into a settled pattern.

Margaret Boden compares the mechanics of connectionist computers to a class of children. ( 'Computers' used to be the name for people who could do extraordinary calculations in their heads, without even knowing how. ) Some of the children know bits of information but not the whole 'story': they are like input units. The output units are the children who announce 'the story so far'. Other children are the 'hidden units', which have neither input nor output to offer but mediate between those that have.

This so-called 'parallel distributed processing' is like a guessing game, in which some children have clues, which they pass on to others. Guesses may be reinforced by repetition of the same clues or thru associations. Other guesses are dismissed as they depart from the pattern building up. A consensus or equilibrium is approached, on the balance of probability, as to what the object to be guessed is.

An object is not defined by explicit rules from which you deduce examples. Rather, glimpses or bits of knowledge, exemplifying an object, are amassed, till regularities are implicitly learned in a build-up of sensory associations. Thus, the nature of one's environment acts as a selector on the rules one takes in.
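
The contrast can be put into code. The sketch below is my own toy illustration, not Boden's: a single artificial 'neuron' is never given the rule it is learning. It only sees examples drawn at random from its environment, with an invented hidden regularity (here, when a fruit counts as ripe), and its connection weights are nudged whenever it guesses wrongly.

```python
import random

# A toy illustration (not from Boden): the rule is hidden in the examples,
# never stated. The invented regularity is that a fruit is 'ripe' whenever
# softness + colour exceeds 1. The neuron only ever sees examples and the
# verdict of its environment on its guesses.

def target(softness, colour):
    return 1 if softness + colour > 1.0 else 0      # the hidden regularity

weights = [0.0, 0.0]
bias = 0.0
rate = 0.1

random.seed(1)
for _ in range(5000):
    s, c = random.random(), random.random()         # a chance encounter with an example
    guess = 1 if weights[0] * s + weights[1] * c + bias > 0 else 0
    error = target(s, c) - guess                    # the environment selects against bad guesses
    weights[0] += rate * error * s                  # nudge the connections toward what works
    weights[1] += rate * error * c
    bias += rate * error

print(weights, bias)   # the learned weights embody the rule, though it was never stated
```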

It is like the difference between formal academic education and 'learning by doing'. And, indeed, the traditional class-room has a poverty of environment to experience, because one is, or was, not meant to learn from one's surroundings but from a syllabus. In this respect, the class-room restricts learning rather than promotes it. A classic criticism of formal education is that it is irrelevant or not adaptive to learning the rules of survival in the real world.


Mathematicians tend to see fields of study in terms of 'fields', in the sense of abstract spaces. These may be visualised in three dimensions, say, as a memory 'landscape', tho the actual mathematics, as used by John Hopfield, generalises the idea to many dimensions of memory associations, perhaps simulating the neural networks of the brain.

Trying to remember a name is likened to setting a marble rolling on a hilly terrain. In one of the valleys of memory is the correct name but there are closely associated, over-riding memory basins that the marble insists on rolling into, instead. For example, the Scottish inventor of television was not called Yogi Bear but it was something like that. ( Logie Baird, in fact. )
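
The marble-and-valley picture corresponds quite closely to what a Hopfield-style network does. Here is a minimal sketch, assuming the numpy library and not taken from Hopfield's own papers: a few random patterns are stored as basins in the network's 'landscape', and a partly corrupted cue is then left to roll downhill until it settles.

```python
import numpy as np

# A minimal Hopfield-style associative memory (an illustrative sketch).
# Stored patterns become valleys in an energy landscape; a noisy cue is the
# rolling marble, settling into the nearest basin, usually the right memory.

rng = np.random.default_rng(0)
N = 100                                       # number of 'neurons'
patterns = rng.choice([-1, 1], size=(3, N))   # three memories to store

# Hebbian learning: strengthen connections between units that fire together
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)

def recall(cue, steps=20):
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(W @ state)            # each unit follows the local slope
        state[state == 0] = 1
    return state

# corrupt a stored memory in a quarter of its units, then let it settle
cue = patterns[0].copy()
cue[rng.choice(N, size=25, replace=False)] *= -1

retrieved = recall(cue)
print(np.array_equal(retrieved, patterns[0]))   # almost always True: the right valley is found
```

Store too many patterns, or make them too similar, and the basins interfere: the marble rolls into a near neighbour instead, which is the Yogi Bear effect.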

Like a scratch on an old record, a deeper memory cuts across a closely related memory's impression, and causes the needle of recollection to slip in its tracks and go round and round the wrong way, till one finds a way of putting it back on course to where one wants to be.

The memory landscape that experience builds from life's chances may be a fairly reliable map of the past. But a confusion of tracks on the mind, left by the senses' past traffic, can play us false. The point is that the nature of our experience builds a memory landscape that selects our memories for us. Memory becomes subject to our experience's environmental selection.

The world, as a whole, does not stamp a formally correct impression, like a record, on our minds. Rather, it is a huge mix-up that we have to more or less make sense of, as far as possible, and with doubtful success, judging by the current state of the planet.


Evolutionary reform of English spelling.


A dead language is one in which meanings are fixed, because they approximate to a form of society that no longer exists. The success of English depends on the extent of its democratic evolution. English is a hybrid of languages that welcomes new worlds of experience by freely adopting the words of other languages. Critics of English as a possible world language protest that it isnt a language at all, because it doesnt conform to their standard of a classical language, that is, a dead language with no new input.

Indeed, the idea of a world language is the misconception that would impose some rigid form on the whole world. The reason English most looks like a potential world language is precisely that it has abandoned the idea of some fixed linguistic form and is evolving out of itself all the time.

By contrast, the failure of English is precisely its failure to evolve a more rational spelling that would enfranchise the twenty per cent or more of its speakers who are functionally illiterate. This failure is due to the fallacy of 'correct' spelling usage, the dogmatic insistence on some monopoly of authority, such as Dr Johnson's dictionary. In fact, different English speaking countries have typical spelling variations from each other. The Oxford English Dictionary has adopted both British and American spellings.

The problem of English spelling reform is two-fold. Partly it depends on tolerating freedom of spelling, so that anyone not spelling the same way as Dr Johnson or Teddy Roosevelt is not looked down on. Partly, it depends on the need for an education in how to spell English as it sounds, with a tolerance of accents. We dont insist that Australians should drop their species of Cockney twang. We shouldnt insist that all English speaking people should conform to one spelling 'accent'. But to be able to spell in any accent depends on an education in basic English fonetics. Besides the orthografic pedants, the other obstacle to saving English literacy for all English speakers would be the fonetics pedants.

Rational English spelling, for general literacy, needs not only to ensure a close approximation to one letter, one sound. It also needs to ration the number of letters in the reformed alfabet to about the same number as the existing English-Roman alfabet of 26 letters. ( This is dealt with on another of my web pages. )

Suppose educators achieved a short rational English alfabet tolerating accented variations in spelling. This would not lead to anarchy, any more than Dickens' novels, with their phonetic transcriptions of accents, are anarchic. Moreover, people who wanted to be understood as widely as possible would spell in the most universal English accent, 'mid-atlantic' or whatever.
A sort of natural selection would operate, where writings in the most standard accents would be the most widely understood and therefore the most widely read.

English could evolve in its spelling, as in its grammar, which arose from a 'natural selection' of the simplest syntax between ordinary people of diverse backgrounds with limited understanding in common. This contrasts with the highly inflected languages elaborated by scholars, or monitored by a language academy that justifies its existence by imposing a supposed classical excellence and purging popular introductions of foreign words.

The section below, on evolved computer programs, has a lesson for language reform. 'Correctness' of a program can only be achieved by total control over its writing by the programmer, who knows every logical step of the way. But for complex programming problems, this 'correct' approach ceases to be practical. The reason is analogous to why the 'command economy' has to give way to the market economy. Economic relationships are too complex for totally top-down management. Dictatorial control has to be relinquished for individual freedom of initiative.


Lee Smolin's theory of cosmological natural selection.


Classical physics would suggest that black holes draw matter into infinitely dense points or 'singularities'. But Smolin first assumes that the modern rules of quantum fluctuations would over-ride this notion. Black holes gravitationally attract matter beyond a bound of no return. But the mathematics of this process was studied in reverse, as akin to the explosive origin of the universe in a 'big bang'.

It is one step further to Smolin's second assumption that quantum uncertainties might bounce back matter from a black hole to explode a new universe. Stephen Hawking's 'Black Holes and Baby Universes' is the title essay of a popular book he wrote after 'A Brief History of Time'.

Hawking comments on the well known paradox of Schrodinger's Cat. He says the paradox depends on assuming the viewpoint of classical physics in a situation where the principles of quantum physics apply. The classical notion is that an object has a single definite history: for example, is the cat in the box alive or dead? But the whole point of quantum mechanics is a different view of reality, in which an object has all possible histories. The probabilities of very slightly differing histories, in most cases, cancel out. But, in certain cases, the probabilities of neighboring histories reinforce each other, one such being observed as the history of the object.
Schrodinger's cat has two histories reinforced: one as alive, the other as dead.
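
A toy calculation can show what 'cancel' and 'reinforce' mean here. The sketch below is my own illustration, not Hawking's mathematics: each history contributes a complex amplitude, and the size of the total depends on how fast the phase varies between neighbouring histories.

```python
import cmath

# A toy illustration (not Hawking's own mathematics) of how histories add up.
# Each 'history' contributes a complex amplitude exp(i * phase). Where the
# phase varies rapidly between neighbouring histories, the contributions
# cancel out; where it varies slowly, they reinforce one another.

def total_amplitude(phases):
    return sum(cmath.exp(1j * p) for p in phases)

rapid = [n * 2.0 for n in range(1000)]     # phase jumps by 2 radians between neighbours
slow = [n * 0.001 for n in range(1000)]    # phase barely changes between neighbours

print(abs(total_amplitude(rapid)))   # about 1: a thousand contributions nearly all cancel
print(abs(total_amplitude(slow)))    # nearly 1000: the contributions reinforce each other
```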

Our universe may be considered as only one of many universes, making up a 'multiverse'.
To continue the analogy of baby universes, the great majority of universes are likely to be still-born. This is assumed because the existence of this universe depends on an incredibly delicate adjustment of physical conditions. ( One of Paul Davies' popular physics books, 'The Runaway Universe', is about this. )
Therefore, most universes born out of black holes are likely to get little further than the sub-atomic scale of initial quantum fluctuations. And most of the remainder are likely to be largely unstructured and so incapable of meeting the conditions that make life possible in this universe.

Smolin's third assumption is that the baby universes inherit the same laws of physics from their parent universe but for small random changes in the physical constants or parameters, such as those that are found to obtain in this universe.

A kind of multiversal memory landscape, or 'mind of God', as Stephen Hawking would put it, is implicit in this assumption that a baby universe remembers, with high probability of accuracy, the physical laws of its parent.
Smolin uses the same sort of mathematical landscape for cosmological natural selection, as is used for memory's selection from environmental experience ( mentioned in the above section on memory ).

The current best theory of our universe, on which most physicists are agreed, is called the standard model. This cannot explain everything from first principles. There are about twenty parameters, or arbitrary constants, which cannot be deduced but are measured by experiment and plugged into the theory, to fill in the gaps in its reasoning.

The cosmological theory of natural selection would only conflict with a totally unified theory of physics that had no free parameters. Einstein attempted a unified theory, and he assumed quantum theory was incomplete because of its statistical nature. Natural selection is a statistical theory, which Smolin has adapted to cosmology, on the assumption that the multiverse is not absolutely determined but contains one or more free parameters.

After the fashion of Darwinian theory, in which it takes many generations to much alter a species, many generations of universes would be needed to make big changes to their initial conditions. The range of parameters, or conditions, for which atomic nuclei, and therefore stars, can exist, along with the black holes that form from some of those stars, is highly restricted. But universes with many black holes are likely to have many progeny. Given that this is a black-hole-rich universe, its existence is explained as having the conditions of a typical universe.

Out of the twenty or so physical parameters, Smolin gives a simplified model of his theory with just two parameters, say, proton mass and electron mass. He treats these as length and breadth co-ordinates, like the two sides of a hilly field. The heights of the hills are proportional to the number of black holes that any given combination of values for these two parameters is likely to produce. In turn, the number of progeny from each universe is proportional to the height of the parametric 'landscape' where it is situated.
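
Selection on such a two-parameter landscape can be imitated with a toy simulation. The sketch below is my own illustration, not Smolin's mathematics: the black-hole count is an assumed hill-shaped function of two invented parameters, the number of progeny follows that count, and baby universes inherit their parent's parameters with small random changes.

```python
import math
import random

# A toy sketch of cosmological natural selection (not Smolin's actual model).
# Each universe has two invented 'parameters'; its expected number of black
# holes, and hence of baby universes, is an assumed hill-shaped function of
# them; babies inherit the parent's parameters with small random changes.

random.seed(3)

def black_holes(p, q):
    """Assumed fitness hill: most black holes when (p, q) is near (1.0, 0.5)."""
    return math.exp(-((p - 1.0) ** 2 + (q - 0.5) ** 2) / 0.02)

# start with universes scattered over the parameter landscape
population = [(random.uniform(0, 2), random.uniform(0, 1)) for _ in range(200)]

for generation in range(40):
    babies = []
    for p, q in population:
        for _ in range(int(10 * black_holes(p, q))):      # progeny in proportion to black holes
            babies.append((p + random.gauss(0, 0.02),     # small 'mutations' of the
                           q + random.gauss(0, 0.02)))    # inherited physical constants
    if not babies:                                        # all still-born: scatter afresh
        babies = [(random.uniform(0, 2), random.uniform(0, 1)) for _ in range(200)]
    population = random.sample(babies, min(200, len(babies)))

avg_p = sum(p for p, q in population) / len(population)
avg_q = sum(q for p, q in population) / len(population)
print(round(avg_p, 2), round(avg_q, 2))   # drifts toward the black-hole-rich peak near (1.0, 0.5)
```

Over the generations, the surviving family of universes clusters around the peak of the landscape, which is the sense in which a black-hole-rich universe becomes typical.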

Lee Smolin's theory makes an analogy of physics parameters to biological genes. The space of parameters ( which would be of a very high dimension ) is compared to the collection of all possible sequences of DNA.

The average number of universes produced by a universe with a particular set of parameters is compared to the average number of offspring from creatures with a particular set of genes. This 'fitness' of creatures depends on their situation on a 'fitness landscape' in an abstract space of genes. The rate of reproduction is measured for how strongly it varies with variations in possible gene combinations.
Smolin qualifies his analogy as only approximating to the simplest or crudest of biological scenarios, a single species evolving in a fixed environment.


Evolving robots and programs.


A robotics researcher's work was shown on tv. This appeared to consist of a flock of little metal toys moving around the floor. Someone thought this was a waste of time and the funds should be cut. But I had read Kevin Kelly's Out Of Control, so I knew what this apparently childish researcher was about.

Robby the Robot, in Forbidden Planet, could do humdrum things like produce a pile of bootleg vintage, besides all the out-of-this-world effects. The old movies have made us imagine robots as large, if not human-size, machines that pioneer engineers can at least begin to make do useful jobs, like hoovering the house. There are robots, coming into commercial use, that can do routine tasks like cleaning a hospital floor.

A more recent approach to robot development is to copy how life has evolved, from the simplest to more complex organisms. As robots are essentially mobile computers, this also involves a radically different way to create programs. Traditionally, programs have been written for a specific purpose, achieved by following a set of rigidly controlled instructions that can only work in a definite environment.

The alternative is not to define the environment for one complex machine's operations, but let the environment define lots of simple machines' operations. This is akin to natural selection. Gradually, more complex jobs can be done, when the simple jobs have been learned. And, like insects, many small robots can do big jobs.

The evolutionary approach to robots and to programs compares to the toy construction game, Lego. As in object-oriented programming, many small existing programs, like Lego bricks, can be used to put together a program for any new purpose, instead of having to write a whole new program from scratch to fulfil a given goal.

Also, problems may be too complex to solve easily, or at all, by rational design. But they may yield to massive random trials. For instance, a molecular form must be found for a drug needed to neutralise a disease mechanism. By the evolutionary method, billions of random molecules are tried till one fits against its lock, the rest being washed away.

Karl Sims' 'equation-genes' are small logical units of a computer language ( LISP ). Each module is a mathematical command, like add, multiply, cosine, etc. Logical units are evolved and randomly flipped to create new functions, which the computer drew as often stunning patterns. Sims selected and bred variations of these by swopping branches of their logic trees, analogous to the sexual exchange of genes.
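
The breeding of formula trees that Sims describes can be roughly sketched as follows, in Python rather than LISP, and as a generic illustration of the technique rather than Sims' own code: a formula is a nested tree of commands, and a child is bred by grafting a randomly chosen branch of one parent onto a copy of the other.

```python
import copy
import math
import random

# A rough sketch of breeding formula trees by swopping branches (a generic
# illustration of the technique, not Karl Sims' code). A formula is a nested
# list: ['add', left, right], ['cos', child], or a leaf ('x' or a constant).

random.seed(7)
OPS = {'add': 2, 'mul': 2, 'cos': 1, 'sin': 1}

def random_tree(depth=3):
    """Grow a random formula tree of limited depth."""
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.7 else random.uniform(-1, 1)
    op = random.choice(list(OPS))
    return [op] + [random_tree(depth - 1) for _ in range(OPS[op])]

def evaluate(tree, x):
    """Work out the value of a formula tree at a given x."""
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, args = tree[0], [evaluate(a, x) for a in tree[1:]]
    if op == 'add': return args[0] + args[1]
    if op == 'mul': return args[0] * args[1]
    if op == 'cos': return math.cos(args[0])
    if op == 'sin': return math.sin(args[0])

def subtrees(tree):
    """Collect every node of the formula tree, branches and leaves alike."""
    nodes = [tree]
    if isinstance(tree, list):
        for child in tree[1:]:
            nodes.extend(subtrees(child))
    return nodes

def crossover(mum, dad):
    """Breed a child by grafting a random branch of dad onto a copy of mum."""
    child = copy.deepcopy(mum)
    graft = copy.deepcopy(random.choice(subtrees(dad)))
    spots = [n for n in subtrees(child) if isinstance(n, list)]
    if not spots:
        return graft
    spot = random.choice(spots)
    spot[random.randrange(1, len(spot))] = graft   # swop one branch for the graft
    return child

mum, dad = random_tree(), random_tree()
child = crossover(mum, dad)
print(child, '->', round(evaluate(child, 0.5), 3))
```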

Dr Robert Smith, at the University of the West of England, uses such genetic algorithms, or assortments of code, set to the problem of test-flying planes. They are selected and bred like 'animals' until they become species-specific to their problematic new 'environment'.
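
The general shape of a genetic algorithm (not Dr Smith's actual flight-test system, just a generic sketch) goes something like this: a population of bit-string 'genes' is scored against an invented toy task, the fitter ones breed with crossover and occasional mutation, and over the generations the population becomes fitted to its problem 'environment'.

```python
import random

# The bare bones of a genetic algorithm (a generic sketch, not Dr Smith's
# flight-test system). The 'environment' here is an invented toy task: evolve
# a 20-bit string to match a hidden target. Fitter strings leave more offspring,
# with crossover and occasional mutation, until the population fits its task.

random.seed(42)
TARGET = [random.randint(0, 1) for _ in range(20)]      # the hidden 'environment'

def fitness(genes):
    return sum(g == t for g, t in zip(genes, TARGET))   # how well it survives its tests

def breed(mum, dad):
    cut = random.randrange(1, len(mum))                 # one-point crossover
    child = mum[:cut] + dad[cut:]
    if random.random() < 0.1:                           # occasional mutation
        i = random.randrange(len(child))
        child[i] = 1 - child[i]
    return child

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]

for generation in range(60):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                           # selection of the fittest
    population = [breed(random.choice(parents), random.choice(parents))
                  for _ in range(30)]

best = max(population, key=fitness)
print(fitness(best), 'out of', len(TARGET))             # typically reaches 20 out of 20
```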

Evolved solutions involve a great deal of trial and error. The criterion for such complex solutions is not that they are error-free, the property of small systems, but that they work or that they are flexible enough to survive the tests they have been set.

Like the genetic code itself, evolved computer codes contain much redundant material. One reaches the solution by giving up total control of how one arrives at it, and without knowing how one got there. God is believed to have given free-will to his creations, according to their evolving complexity. So, evolved problem-solving gives up complete control of one's logical creations' development. This helps one to be creative beyond one's preconceptions.


References.

Margaret Boden: The Creative Mind. 1992.
George Johnson: In The Palaces Of Memory. 1992.
Fritjof Capra: The Web Of Life. 1996.
Lee Smolin: The Life Of The Cosmos. 1997.
Stephen Hawking: Black Holes And Baby Universes and other essays. 1993.
Kevin Kelly: Out Of Control. The new biology of machines. 1994.
Sanjida O'Connell: Those magnificent genes. ( The Guardian, 21 June 2001. )


( Images: Darwin's home, his study and favorite walk. )



Richard Lung

