This is a tutorial on the processes and patterns of organization and complexity in natural systems.
No technical details are included in describing the models or theories used. Instead, I focus
on the concepts of self-organization, complexity, complex adaptive systems, criticality, the edge of chaos and
evolution as they pertain to the formation of coherent pattern and structure in nature.
Ethan Decker, Department of
Biology, University of New Mexico, Albuquerque, NM 87131
Faculty sponsor: Dr. Bruce T. Milne,
Department of Biology, University of New Mexico, Albuquerque, NM
Please send mail to Ethan Decker at firstname.lastname@example.org
Note for my readers
If you are reading this tutorial with Netscape, the hypertext links will take
you to a variety of sites around the globe. To return to the
tutorial, press the back icon on the toolbar at the top of
the screen until you return, or find the tutorial's title,
Self-Organizing Systems, in the Go menu.
One of the questions guiding subsets of physics, chemistry, and biology
research is: Where does order come from? Following the general laws
of thermodynamics it would seem that dynamic processes would find
the path of least energy until the system found a low spot, a dead
calm, and remained at equilibrium there until some obvious
perturbation moved it from its complacency. For example, a pot of
steaming sugar water will give off matter (water vapor) and energy
(heat) until it reaches equilibrium with its environment. Cooling,
evaporation and crystallization, governed by simple physical and
chemical laws, will drive the system to a point of least energy,
and we should find rock candy in the bottom of a dry pot.
Yet the world abounds with systems and organisms that maintain a
high internal energy and organization in seeming defiance of the
laws of physics. As spin glasses cool, ferromagnetic particles
magnetically align themselves with their neighbors until the entire
lattice is highly organized. Water particles suspended in air form
clouds. An ant grows from a zygote into a complex system of cells,
and then participates in an organized, structured hive society.
What is so fascinating is that the organization seems to arise
spontaneously from disordered conditions, and it doesn't appear
driven by known physical laws. Somehow the order arises from
the multitude of interactions among the simple parts, and the laws that
may govern this behavior are not well understood. It is clear,
though, that the process is nonlinear, using positive and negative
feedback loops among components at the lowest level of the system
and between them and structures that form at higher levels.
For landscape ecology, an SOS perspective might reveal how spatial
and temporal patterns such as patches, boundaries, cycles, and
succession might arise in a complex, heterogeneous community.
Understanding SOS mechanisms might enable models to be more
informative and accurate. Early models of pattern formation use a
'top-down' approach, meaning the parameters describe the higher
hierarchical levels of the system. For instance, individual trees
are not made explicit in patch models, but clumps of trees are. Or
individual predators are absent in a predation model, but a
predator population is programmed as a unit that impacts a prey
population. In this way, the population dynamics are controlled at
the higher level of the population, rather than being the results
of activity at the lower level of the individual.
The problem with this top-down approach is that it violates two
basic features of biological phenomena: individuality and
locality. By modeling a rodent population as a mass of
rodents with some growth and behavior parameters, we obviate any
differences that might exist between individual rodents. Some are
big, some are small, some reproduce more, some get eaten more.
These small differences can lead to larger differences such as
changes in the population gene frequencies, size or location that
might have cascading effects at still larger scales. For instance,
a moving rodent population might draw its predators with
them, away from environments where the predators have some other
important ecological role.
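The contrast between the two formulations can be sketched in a few lines of Python. This is a hypothetical illustration, not any published model: the top-down version applies one shared growth rate to the whole population, while the individual-based version gives each rodent its own heritable reproduction probability.

```python
import random

def topdown_step(pop, r=0.10):
    """Top-down: the whole population grows by one shared rate r."""
    return pop * (1 + r)

def individual_step(rodents, rng):
    """Individual-based: each rodent carries its own reproduction
    probability, and offspring inherit it with small variation."""
    nxt = []
    for repro_p in rodents:
        nxt.append(repro_p)                     # parent survives
        if rng.random() < repro_p:              # individual chance to breed
            child = min(1.0, max(0.0, repro_p + rng.gauss(0, 0.01)))
            nxt.append(child)
    return nxt
```

In the aggregate version the growth rate can never drift; in the individual-based version selection among the heritable rates can shift population-level behavior, which is exactly the individuality that top-down models discard.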
The tenet of locality means that every event or interaction has
some location and some range of effect (Kawata and Toquenaga 1994).
Tree gaps in the tropics have resultant ecological changes that are
extremely limited by the location and the size of the gap.
Obviously, not every seed in the forest has an equal chance of
germinating in the gap, but gap models assume that seeds are
perfectly evenly distributed throughout the forest, and that the
major influence on germination success is a species' relative
abundance in the seed bank. Ignoring locality obscures the factors
that might contribute to spatial and temporal dynamics. For
instance, seedlings located on a high water table might grow better
than those located on arid soil, and as they grow they might
increase the moisture-holding capacity of that area, creating new
landscape patterns. This is a simple illustration of the
ecological principle that pattern affects process (Watt 1947). To say
that a system is self-organized is to say it is not governed by
top-down rules, although there might be global constraints on each
individual component. Instead, the local actions and interactions
of individuals are the source of the higher-level organization of
the system into patterned, ordered structures with recognizable
dynamics. Since the origins of order in SOS are the subtle
differences among components and the interactions among them,
system dynamics cannot be understood by decomposing the system into
its constituent parts. Thus the study of SOS is synthetic
rather than analytic.
Several research institutes now focus on this topic, often from the
perspective of one scientific discipline. Others, such as the Santa Fe Institute and the Center for Complex Systems
Research at the University of Illinois, were formed
specifically to tackle this subject with a multidisciplinary approach.
Mechanisms of self-organization
Several mechanisms and preconditions are
necessary for systems to
self-organize (Nicolis and Prigogine 1989, Forrest and Jones 1995).
These mechanisms are somewhat redundant and somewhat undefined, but
they are useful intuitive indicators of the potential for self-organization.
First, the system (a recognizable entity such as an organ, an
organism, or a population), must be exchanging energy and/or mass
with its environment. In other words, there must be a nonzero
flow of energy through the system. Adding heat to a pot of
water or food to a fish tank are examples of energy flows. A
system must be thermodynamically open because otherwise it would
use up all the available usable energy in the system (and
maximize its corollary, entropy) and reach what is known as heat
death. A nicer name for this is thermodynamic equilibrium. It is
often said that SOS are "far from" thermodynamic equilibrium, but
that's not necessarily the case. They only need be far enough to
avoid collapsing into a local equilibrium condition, and sometimes
that's not very far.
If a system is not at or near equilibrium, the only other option
for its behavior is that it is dynamic, meaning the system is
undergoing continuous change of some sort. One of the most basic
kinds of change for SOS is to import usable energy from its
environment and export entropy back to it. The idea of "exporting
entropy" is a technical way of saying that the system is not
violating the second law of thermodynamics because it can be seen
as a larger system-environment unit. This entropy-exporting
dynamic is the fundamental feature of what chemists and physicists
call dissipative structures. Nobel Laureate Ilya Prigogine believes dissipation is the
defining feature of SOS. There are some, though, who disagree with him.
All natural systems have inherently local interactions; this
condition is noted here because locality is an important mechanism for
self-organization that must be incorporated into models of SOS.
A system with positive and negative feedback loops is modeled with
nonlinear equations. Self-organization can occur when feedback
loops exist among component parts and between the parts and the
structures that emerge at higher hierarchical levels. In fact,
feedback loops and many other interesting
dynamical phenomena are often studied jointly at a large number of
university and private institutes and centers, such as Los Alamos National Lab and
the University of
Copenhagen. In chemistry, when an enzyme catalyzes reactions
that encourage the production of more of itself, it is called
auto-catalysis. It's possible that auto-catalysis played an
important role in the origins of life.
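A one-line model makes the feedback idea concrete. The logistic map, a standard textbook example rather than one presented in this tutorial, couples positive feedback (growth proportional to x) with negative feedback (the crowding term 1 - x); tuning the single parameter r moves the dynamics from a fixed point through cycles to chaos.

```python
def logistic(r, x0=0.2, n=200):
    """Iterate x -> r*x*(1-x): growth amplified by positive feedback,
    checked by the negative-feedback crowding term (1 - x)."""
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
    return x
```

At r = 2.0 the two feedbacks balance at the fixed point x = 0.5; near r = 3.9 the very same feedbacks produce chaotic wandering, with no change in the rule itself.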
Since the magic of self-organization lies in the connections,
interactions, and feedback loops between the parts of the system,
it is clear that SOS must have a large number of parts. Cells,
living tissue, the immune system, brains, populations, hives,
communities, economies, and climates all contain hundreds to
trillions of parts. These parts are often called agents
because they have the basic properties of information transfer,
storage and processing. An agent could be a ferromagnetic particle
in a spin glass, a neuron in a brain, or a firm in an economy.
Models that assign agency at this level are known as individual-based models, such as SFI's ECHO.
They use computer simulations to observe how local, nonlinear
interactions of many agents can develop into complex patterns.
In contrast, traditional system models place agency at the group
level by using group-level parameters such as population growth
rates (Huston et al. 1988).
Probably the most nebulous concept of the bunch (Crutchfield 1994),
the theory of emergence says the whole is greater than the sum of
the parts, and the whole exhibits patterns and structures that
arise spontaneously from the parts. Emergence indicates there is
no code for a higher-level dynamic in the constituent, lower-level
parts (Green 1993). Convection currents, eddies, cellular
dynamics, 'mind,' forest patches, and food webs are examples of
emergent phenomena. (See Jim Crutchfield's paper on this topic.) Some believe emergence is
nothing more than a trick of perception, when the observer's
attention shifts from the micro- level of the agents to the macro-
level of the system. Emergence fits well into hierarchy theory as
a way of describing how each hierarchical level in a system can
follow discrete rule sets.
Emergence also points to the multiscale interactions and effects in
self-organized systems. The small-scale interactions produce
large-scale structures, which then modify the activity at the small
scales. For instance, specific chemicals and neurons in the immune
system can create organism-wide bodily sensations which might then
have a huge effect on the chemicals and neurons. Prigogine
(1984) has argued that macro- scale emergent order is a way for a
system to dissipate micro- scale entropy creation caused by energy
flux, but this is still not theoretically supported.
Even knowing that self-organization can
occur in systems with these
qualities, it's not inevitable, and it's still not clear why it
sometimes does. In other words, no one yet knows the
necessary and sufficient conditions for self-organization.
Complexity at the edge of chaos
SOS often display a highly complex kind of organization. Hives, for example,
have obvious patterns and regularities, but they are not simple
structures. Certainly stochastic (random) elements affect the
structure and dynamics of a hive, but it's not likely that in a
completely deterministic hive the patterns would be simple.
Likewise clouds, weather patterns, ocean circulation, community
assemblages, economies and societies all exhibit complex forms of
self-organization. If so many SOS are characterized by complexity,
it's fair to ask: What is complexity?
What is complexity?
There is no good general definition of
complexity, though there are many. Intuitively,
complexity lies somewhere between order and disorder, between the
glassy-calm surface of a lake and the messy, misty turbulence in
gale-force winds. Complexity has been measured by logical depth,
metric entropy, information content, fluctuation complexity, and
many other techniques. These measures are well-suited to specific
physical or chemical applications, but none describe the general
features of self-organization. Instead, we must settle for the
dictionary definition which pulls relative intractability (i.e. we
can't understand it yet) and intricate patterning into a conceptual
taffy. Obviously, the lack of a definition of complexity doesn't
prevent researchers from using the term.
Langton's cellular automata
A polite way to talk about complexity
when it is so poorly defined
is to describe the boundary between order and chaos - where
complexity would feasibly reside - as the edge of chaos (Packard
1988, Langton 1990, Kauffman 1991, 1993). Chris
Langton (1990) conducted a computer experiment with cellular
automata (CA) in which he attempted to find
out under what conditions a simple CA could possibly support
"computational primitives," which he defines as the transmission,
storage, and modification of information.
In his experiment, a one-dimensional CA is composed of 128
cells connected in a circle. Each cell is capable of four possible
internal states. Each cell takes as its input the states of the
cells in its region, known as its neighborhood. Langton's
neighborhoods consist of five cells: an automaton is considered a
member of its own neighborhood along with the two neighbors on each
side. The cell's internal state at the next time step is
determined by the state of its neighborhood and some transition
function which describes which internal state it should move to
given a neighborhood state. Thus the neighborhood state is
associated with transmission, the automaton internal state with
storage, and the transition function with modification of information.
To examine how order and chaos affect computation, he
formulates a lambda value: the fraction of neighborhood
configurations that lead to a state other than one
particular, arbitrary internal state, called the "quiescent state."
When lambda = 0, all neighborhood states move a cell to the
quiescent state, and the system is immediately completely ordered.
When lambda = 1, no neighborhood states move to the quiescent
state, and the CA will not settle into any ordered regime of states.
When 0 < lambda < 1, the fun begins. As lambda increases, the
time series graphs of the linear CA exhibit longer and larger
streams of cell transitions called transients. (In the time
series, t=1 is the top row, and time flows down.) Transients
supposedly demonstrate the CA's ability to compute. The patterns
that transients exhibit also hint of that elusive quality
complexity. Thus computation seems to be possible at the edge of chaos.
Langton has made an interactive CA
site that allows you to
perturb the lambda value up or down from the default value of 0.25.
When you use it, run the CA and check out the resulting time
series. Note how long and intricate the transients are. Then
adjust the lambda value (go up first) and hit perturb
lambda. It will run again and you'll see how the transients change.
Langton claims that as lambda is increased, the CA undergoes
a phase transition from ordered states to chaotic regimes.
When average transient length is graphed against lambda, there is
a spike of extremely long transients at lambda = 0.50. Langton
shows that the average mutual information (a kind of complexity
measure) of the CA is maximized at the lambda value at which the
phase transition occurs, called its critical value. If
lambda exceeds the critical value the average mutual information
decays as the system becomes more chaotic. Langton suggests that
because computation is associated with this critical value at the
phase transition, a SOS will need to maintain itself at the "edge
of chaos" in order to compute its own organization:
One of the most exciting implications of this point of view
is that life had its origin in just these kinds of extended
transient dynamics.... In order to survive, the early extended
transient systems that were the precursors of life as we now know
it had to gain control over their own dynamical state. They had to
learn to maintain themselves on these extended transients in the
face of fluctuating environmental parameters, and to steer a
delicate course between too much order and too much chaos, the
Scylla and Charybdis of dynamical systems.
-- Langton (1990)
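Langton's setup can be sketched directly. This is a minimal reconstruction: the random rule-table generation here is an assumption of the sketch rather than a detail from his paper, but it preserves the key pieces, a ring of 128 four-state cells, five-cell neighborhoods, and a table in which each neighborhood maps to a non-quiescent state with probability lambda.

```python
import random

N_STATES, N_CELLS, RADIUS, QUIESCENT = 4, 128, 2, 0

def make_rule(lam, seed=0):
    """Random table over all 4**5 neighborhoods; an entry is
    non-quiescent with probability lam (Langton's lambda)."""
    rng = random.Random(seed)
    return [rng.randrange(1, N_STATES) if rng.random() < lam else QUIESCENT
            for _ in range(N_STATES ** (2 * RADIUS + 1))]

def step(cells, rule):
    """One synchronous update of the ring of cells."""
    n, out = len(cells), []
    for i in range(n):
        code = 0
        for j in range(i - RADIUS, i + RADIUS + 1):
            code = code * N_STATES + cells[j % n]   # encode neighborhood
        out.append(rule[code])
    return out
```

Sweeping lam from 0 toward 1 and measuring how long the ring takes to settle reproduces the qualitative picture above: short transients at the ordered end, a burst of very long ones near the critical value, then chaos.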
Phase transitions are mathematically
interesting. They differ from
standard transitions in the sharpness, or steepness, of the break
between two phases or states. The warming of a liquid from freezing
toward boiling temperature is gradual: a smooth slope
from one to the other. But the transition from boiling-temperature
liquid to boiling-temperature gas occupies a small space between
the two phases (e.g., some particular pressure-temperature
combination). After the gradual heating, there is an abrupt change
to the gas phase so that the two phases are clearly distinct,
separated by the boundary at the phase transition conditions. Such
boundaries are very useful for predicting the properties of a
system or substance in different conditions. Phase transitions
are also often the site of interesting dynamics that don't appear
in the phase regions. For instance, a simple solid will absorb
much more energy per unit mass and will dissolve chemical bonds at
the phase transition than within either phase region.
A phase transition also occurs in large networks when the
connectedness between cells reaches a critical value. The degree of
connectedness (i.e. number of connections) determines the
probability that a patch of connected cells spans the entire
lattice. When such a lattice-spanning patch exists, it is said the
system percolates. The boundary between sparsely connected
and percolating networks is well-known to be a phase transition:
with a very large number of runs or a very large lattice, the
boundary region becomes so thin it is approximated by a point.
Percolation allows for long-range correlations between
cells, so that distant cells are linked through the highly-
connected, lattice-spanning patch.
Phase transitions and percolation occur frequently in nature.
For instance, the ranges of two tree-dwelling squirrel species in
New Mexico are divided by the phase-transition border between forest
patches whose canopies are disjoint or percolating. Langton's work
on phase transitions is compelling because it hints of ways to
measure and perceive the special conditions under which self-
organization might be possible.
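The percolation threshold described above is easy to observe numerically. Here is a hedged sketch of standard site percolation, not tied to any particular study cited in this tutorial: occupy each cell of a lattice with probability p, then flood-fill from the top row to see whether an occupied patch spans the lattice.

```python
import random

def percolates(n, p, seed=0):
    """True if occupied sites (each present with probability p)
    form a connected patch spanning top row to bottom row."""
    rng = random.Random(seed)
    occ = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    stack = [(0, j) for j in range(n) if occ[0][j]]   # start at top row
    seen = set(stack)
    while stack:
        i, j = stack.pop()
        if i == n - 1:
            return True                               # reached bottom: spans
        for a, b in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= a < n and 0 <= b < n and occ[a][b] and (a, b) not in seen:
                seen.add((a, b))
                stack.append((a, b))
    return False
```

For a square lattice the spanning probability jumps from near 0 to near 1 around p of roughly 0.59, and the jump sharpens as n grows: the thin boundary, approximated by a point, that the text describes.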
So far we've described the general
mechanisms and features of
SOS and illustrated Langton's experiment which demonstrated a thin
region of complexity between order and chaos at which
self-organization might be possible. Before continuing, it's
important to note that these postulates are not proven, and in fact
are under intense scrutiny (Mitchell, Hraber and Crutchfield 1993,
Horgan 1995, Sigmund 1995) because of the many assumptions the
models make and the many profound conclusions drawn from them.
Research in this area continues, though, because of the appeal of
a theory of self-organization that could help us understand the
origins of order and life, and perhaps the process of evolution as well.
Self-organized criticality
Bak et al. (1988) studied the behavior of
dynamical systems using computer simulations of their 'sandpile'
model. In this model, sand is poured onto a table in a continuous
stream. At a certain point, the pile is as large as it can get,
and more sand falls off the sides. The pile is very sensitive to
perturbation (if the table is jostled sand falls). Yet it cannot
be too sensitive, or the maximum slope would not be a regular
value, but would fluctuate depending on initial conditions and
disturbance. Because of this precarious yet stable balance, Bak et
al. say that the system is critical. Further, they note
that the system self-organizes to this critical state without any
fine tuning of the model: with any initial conditions the system
settles on the critical state. Linking this with "1/f" noise and
fractal self-similarity, they speculate that self-organized
criticality (SOC) "might be the underlying concept for
temporal and spatial scaling in dissipative nonequilibrium
systems." While this claim is still unproved, some have recognized the
diagnostic use of linking SOC with fractal self-similarity and
"1/f" noise. In other words, self-similar structure and
dissipation at all scales might be indicators that a system is at a
self-organized critical state.
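Bak's sandpile has a standard lattice abstraction, sketched here from the usual textbook formulation rather than from the original paper's details: each site holds grains; when a site reaches four grains it topples, sending one grain to each neighbor, and grains at the edge fall off the table. The avalanche size is the number of topplings triggered by a single added grain.

```python
def drop_grain(grid, i, j, threshold=4):
    """Add one grain at (i, j), relax the pile, and return the
    avalanche size (total number of topplings)."""
    n = len(grid)
    grid[i][j] += 1
    size = 0
    stack = [(i, j)]
    while stack:
        a, b = stack.pop()
        while grid[a][b] >= threshold:
            grid[a][b] -= threshold        # topple: shed 4 grains
            size += 1
            for c, d in ((a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)):
                if 0 <= c < n and 0 <= d < n:
                    grid[c][d] += 1        # grains off the edge are lost
                    stack.append((c, d))
    return size
```

Dropping grains at random sites for long enough drives any starting grid to the critical state without fine tuning, and the recorded avalanche sizes then show the power-law, "1/f"-like statistics that Bak et al. report.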
Empirical tests: rainforests and ant colonies
Sole and Manrubia (1995) use Bak's
theories to examine whether
a rainforest exhibits SOC. Knowing that treefall and gap
formation are vital to rainforest dynamics, they claim that the
distribution and abundance of forest gaps are indicative of the
organizational state of the forest. They hypothesize that the gaps
in the Barro Colorado Island forest in Panama will show self-
similar, multiscale distribution. With both the empirical data and
a simulation of gap distribution called the Forest Game, their
hypothesis is supported. Further, in the simulation biomass also
shows fractal properties. Knowing that in the simulation the
system starts with an arbitrary set of trees, Sole and Manrubia
note that the system self-organizes to a state with self-similar
structure characterized by "1/f" distribution of gaps and biomass.
They conclude that this is suspiciously akin to Bak's critical
state, and might indicate that the forest has evolved to SOC.
Sole and Miramontes (1995) followed a similar approach with
Leptothorax ant colonies, determining whether actual and
simulated ant colonies would exhibit self-similar structure with
"1/f" noise at a critical value. They find that they do at a
critical density of ants, at which the connections between
individuals allow for maximum information capacity of the colony.
When the density is reached, the colony shows pulses of activity
that exhibit self-similarity. Sole and Miramontes point out that
the key parameter in determining the critical value of density is
simply the number of automata in their model. This is corroborated
with empirical observation: in actual colonies, when the number of
ants increases substantially (towards a critical density), ants
change the colony boundaries to achieve the critical density value
for the new colony size. Thus Bak's SOC is evident again in a biological system.
Evolution to the edge
The previous section explored the
concept of self-organized
criticality as a place on the edge of chaos where self-organized
complexity and computation are possible, and where self-similar
fractal structure with "1/f" noise are evident. Next we must
investigate why and how a system can move itself to that state from
some other state in the order-chaos spectrum.
For biotic systems, one important
addition to the list of
mechanisms and conditions for SOS is the ability of agents to
adapt. This means agents are capable of changing their
internal information processing functions. This kind of system is
known as a complex adaptive system (CAS, Forrest and Jones 1995).
In Langton's model, cells would be able to change their transition
rules. If this were possible, CA would be able to tune their
transition rules (thus their lambda values) along the order-chaos
spectrum. In Leptothorax ant colonies, it means that ants which
do not respond to changes in density, or respond in an adverse way,
will adapt toward a response which leads to SOC. A series of questions
about SOS arise when adaptation is considered: what are the
mechanisms of adaptation? Under what conditions are they possible?
How do systems choose which direction to move among their adaptive
choices? And do adaptive systems always move towards SOC?
In evolutionary terms, the How of SOC would be the conditions
for evolution (individual phenotypic variation, excess
reproduction, and heritability of traits). A population would be
able to adapt through the inheritance of genetic variations due to
mutation and recombination. The Why of SOC would be natural
selection. A population might evolve towards a critical state
because natural selection removes variants of the system that are
farther from the critical state. This is the basic idea of a
population adapting to a particular condition. SOC theory might be
a sound statistical theory of evolution if, as the above
experiments suggest, populations exhibit the diagnostic qualities
of Bak's SOC.
Stuart Kauffman (1991, 1993, 1995) has
developed a Boolean network called a coupled fitness landscape to
model CAS. In his model N units, each capable of A states, and
each connected to K other agents, are mapped into a K-dimensional
"landscape" which topographically expresses all possible system
states. Kauffman assigns fitness values to each of the A unit
states, so that when system states are calculated through the K
connections between the N agents, fitness "peaks" appear in the
landscape which represent optimal system states. Each landscape
then can represent an agent in an adaptive system: a genome, a
population, a niche type. He then links several of these
landscapes together to study coevolution. One of his findings is
that K is a major determinant of how orderly or chaotic the system
dynamics are. David Green at ANU also believes connectivity is a pattern determinant in landscapes. In
highly-connected landscapes, information travels
very quickly and the system becomes more chaotic, whereas in
sparsely-connected ones the system quickly settles onto a stable or
periodic state. He suggests that the number of connections to each
unit in a self-organizing system might be the sole parameter that
determines the self-organizing dynamics of the system.
Still, in order to drive the system across its fitness landscape
towards higher fitness, Kauffman invokes a rule by which landscapes
can move to nearby system states that offer higher system fitness
(i.e., towards local fitness peaks), but they cannot move "down" to
states with lower fitness. This arbitrary rule is in line with the
evolutionary Why of SOC, and the structure of his many-state
landscapes, which closely resembles Langton's CA, offer the How.
But this evolutionary explanation becomes less convincing when
other systems not driven by genes are considered, particularly
communities, biomes, and of course abiotic systems (such as
mountain ranges or climates).
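Kauffman's scheme can be sketched as an NK-style hill climb. This is a simplified single-landscape version: the coupling between multiple landscapes is omitted, and details such as the random fitness tables are assumptions of the sketch. Each of N binary sites contributes a random fitness that depends on itself and K neighbors, and the walker accepts only uphill flips, Kauffman's no-downhill rule.

```python
import random

def nk_fitness(N, K, seed=0):
    """Random NK fitness: site i's contribution depends on sites
    i..i+K (mod N); contributions are drawn lazily and cached."""
    rng = random.Random(seed)
    tables = [{} for _ in range(N)]
    def fitness(genome):
        total = 0.0
        for i in range(N):
            key = tuple(genome[(i + j) % N] for j in range(K + 1))
            if key not in tables[i]:
                tables[i][key] = rng.random()
            total += tables[i][key]
        return total / N
    return fitness

def adaptive_walk(N, K, steps, seed=0):
    """Flip one random site per step; keep the flip only if fitness
    strictly improves (no moves 'down' the landscape)."""
    rng = random.Random(seed + 1)
    f = nk_fitness(N, K, seed)
    genome = [rng.randrange(2) for _ in range(N)]
    best = f(genome)
    for _ in range(steps):
        i = rng.randrange(N)
        genome[i] ^= 1
        trial = f(genome)
        if trial > best:
            best = trial
        else:
            genome[i] ^= 1                 # reject downhill move
    return best
```

Comparing walks at K = 0 against K = N - 1 illustrates the connectivity point: low K gives smooth, orderly landscapes with high attainable peaks, while high K gives rugged, chaotic ones riddled with poor local optima.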
A few other hypotheses exist for Why SOS move towards critical
states, such as the law of maximum entropy production (Swenson
1989) and perpetual disequilibration (Ito and Gunji 1994), but
these have yet to move beyond conjecture. Thus the two most likely
mechanisms remain natural selection and physical laws such as the
interplay between friction and gravity in Bak's sandpile.
Landscape ecology applications of SOS
Mechanisms of SOS have been
identified as possible sources of
self-organization and complexity in biological and ecological
systems (Brown 1994, Hiebeler 1994, Judson 1994): many parts;
local, spatially-explicit interactions; thermodynamic flux;
multiscale effects; nonlinear dynamics; and adaptation. Food
webs, forest self-stabilization, deforestation, and
host-parasitoid relationships are all thought to be SOS
(Perry 1995). SOC is
intuitively appealing to ecologists who have become frustrated with
traditional analytical techniques that don't seem to capture the
intricate dynamics of systems as a whole. While traditional models
are good predictors of general ecological dynamics and structures,
they are often inadequate at describing or predicting complex
phenomena such as intricate habitat patch mosaics, temporal changes
in community structure, or convoluted species distribution
boundaries (Johnson et al. 1992, Judson 1994). Further,
traditional models often fail to explain the mechanisms which give
rise to such patterns (Huston et al. 1988, Judson 1994, Kawata and
Toquenaga 1994).
If ecological systems are CAS, it is
valid to question the
explanatory usefulness of traditional models whose assumptions
obviate the above factors (Huston et al. 1988, Keitt and Johnson
1995). Believing this might be the case, ecologists are building
individual-based computer models which incorporate the above
factors in order to study the mechanisms of complex ecological
phenomena. Individual-based models explicitly model and track
individual organisms as the agents in the system. System states
and dynamics are displayed and analyzed by the programs at any time
during the system's run (Forrest and Jones 1994, Hiebeler 1994).
Several types of individual-based models have been built,
including cellular automata (CA) lattices (Caswell and Cohen 1991,
Langton 1992), artificial life (A-Life) simulations (see the
review by Kawata and Toquenaga 1994), and gap models (see the
review by Shugart et al. 1992). If CA and ecological systems are
analogous, statistical properties of CA might apply to ecological
systems (e.g., Sole's Forest Game) and could supply explanations
for certain ecological patterns. The possible correlations between
the two systems are being investigated (Langton 1994).
Gap models (e.g., FORET, FORSKA, and ZELIG) simulate the
spatial variation found in forest stands due to the interactions of
developing trees (Shugart et al. 1992). Though gap models are
essentially individual-based models, the limits of growth for
individual organisms are imposed globally, and they appear to be
based more on allometric data than physiological limitations
(Shugart et al. 1992). Thus they may not help generate realistic
mechanistic explanations of large-scale patterns.
A-Life simulations (e.g., ECHO) are designed to
resemble biological systems. Individual agents have their own
copies of behavioral code "genomes" that let them individually
perceive the local environment, evaluate the input, and choose how
to act (Jones and Forrest 1993, Hiebeler 1994). A-Life worlds
employ system-level constraints (e.g., limited resources or spatial
structure) but incorporate no global rules governing individual
behavior. Thus A-Life simulations offer the best opportunity to
study CAS. Further, A-Life environments and agent rules can be
based respectively on landscape ecology and plant physiology data
(e.g., Smith and Huston 1989, McCauley et al. 1993), allowing
ecologists to model biological systems without the simplifying
assumptions of traditional models.
Excitement about these models raises the question of whether
they will be more useful than generally simpler traditional models.
Even if they can simulate a system in more detail, do we need such
a fine level of resolution in our models? More importantly, do
individual-based models help reveal mechanisms of ecological
complexity, or are they simply fancy descriptions?
Landscape ecology can benefit from the tools
used by CAS
researchers. While these tools might not be new, many, such as
Boolean networks, percolation theory, fractal geometry and "1/f"
noise spectra, have been tailored from their field of origin to
apply to biotic systems by CAS researchers. Such tools can and
have been used to study patch connectivity (Keitt), ecotones (Milne
et al. 1995), geologic formations, and threshold conditions for
biotic systems. Other applications for CAS-related tools will
continue to be found.
Overviews for the nonspecialist
- Gell-Mann, M. 1994. The Quark and the Jaguar.
W. H. Freeman, New York, USA.
- Kauffman, S. A. 1995. At Home in the Universe. Oxford
Univ. Press, New York, USA.
- Prigogine, I., and I. Stengers. 1984. Order Out Of
Chaos. Bantam Books Inc., New York, USA.
- Waldrop, M. M. 1992. Complexity: the emerging science at
the edge of order and chaos. Simon & Schuster Inc., New York, USA.
Literature cited
- Bak, P., C. Tang, and K. Wiesenfeld. 1988. Self-organized
criticality. Physical Review A 38:364-374.
- Brown, J. H. 1994. Complex ecological systems. Pages 419-
443 in G. Cowan, D. Pines, and D. Meltzer, editors.
Complexity: metaphors, models, and reality. SFI studies in
the sciences of complexity, proc. vol. XIX. Addison-Wesley, Massachusetts, USA.
- Caswell, H., and J. E. Cohen. 1991. Communities in patchy
environments: a model of disturbance, competition, and
heterogeneity. Pages 97-122 in J. Kolasa and S. T. A.
Pickett, editors. Ecological Heterogeneity. Springer, New
- Crutchfield, J. P. 1994. Is anything ever new?
In G. Cowan, D. Pines, and D. Meltzer, editors. SFI studies in the
sciences of complexity XIX. Addison-Wesley, Massachusetts, USA.
- Forrest, S., and T. Jones. 1994. Modeling complex adaptive
systems with Echo. Pages 3-21 in R. J. Stonier and X. H.
Yu, editors. Complex Systems: mechanisms of adaptation.
IOS Press, Amsterdam.
- Green, D. G. 1993. Emergent behaviour in biological
systems. Pages 24-35 in D. G. Green and T. J. Bossomaier,
editors. Complex Systems - From Biology to Computation.
IOS Press, Amsterdam.
- Hiebeler, D. 1994. The swarm simulation system and
individual-based modeling. In Decision Support 2001: advanced
technology for natural resource management.
- Horgan, J. 1995. From complexity to perplexity.
Scientific American (June 1995):104-109.
- Huston, M., D. DeAngelis, and W. Post. 1988. New computer
models unify ecological theory. Bioscience 38:682-691.
- Huffaker, C. B. 1958. Experimental studies on predation:
dispersion factors and predator-prey oscillations.
Hilgardia 27:343-383.
- Ito, K., and Y. Gunji. 1993. Self-organisation of living
systems towards criticality at the edge of chaos.
- Johnson, A. R., J. A. Wiens, B. T. Milne, and T. O. Crist.
1992. Animal movements and population dynamics in heterogeneous
landscapes. Landscape Ecology 7: 63-75.
- Jones, T., and S. Forrest. 1993. An introduction to SFI
Echo. Santa Fe Institute, Santa Fe, New Mexico, USA.
- Judson, O. P. 1994. The rise of the individual-based model
in ecology. Trends in Ecology and Evolution 9:9-14.
- Kauffman, S. A. 1991. Coevolution to the edge of chaos:
coupled fitness landscapes, poised states, and coevolutionary
avalanches. Journal of Theoretical Biology 149:467-505.
- Kauffman, S. A. 1993. Origins of Order: self-
organization and selection in evolution. Oxford Univ. Press,
New York, USA.
- Kawata, M., and Y. Toquenaga. 1994. Artificial individuals
and global patterns. Trends in Ecology and Evolution
- Keitt, T. H., and A. R. Johnson. 1995. Spatial
heterogeneity and anomalous kinetics: emergent patterns in
diffusion-limited predator-prey interaction. Journal of
Theoretical Biology 172:127-139.
- Langton, C. G. 1990. Computation at the edge of chaos:
phase transitions and emergent computation. Physica D
42:12-37.
- Langton, C. G., editor. 1994. Artificial Life III.
Addison-Wesley, New York, USA.
- Mitchell, M., P. Hraber, and J. P. Crutchfield. 1993.
Revisiting the edge of chaos: evolving cellular automata to
perform computations. Complex Systems 7:89-130.
- McCauley, E., W. G. Wilson, and A. M. de Roos. 1993.
Dynamics of age-structured and spatially structured predator-prey
interactions: individual-based models and population-level
formulations. American Naturalist 142:412-442.
- Nicolis, G., and I. Prigogine. 1989. Exploring
complexity. W. H. Freeman, New York, USA.
- Olson, R. L., and R. A. Sequeira. 1995. An emergent
computational approach to the study of ecosystem dynamics.
Ecological Modelling 79:95-120.
- Perry, D. A. 1995. Self-organizing systems across scales.
Trends in Ecology and Evolution 10:241-244.
- Packard, N. H. 1988. Adaptation toward the edge of chaos.
Pages 293-301 in A. J. Mandell, J. A. S. Kelso, and M. F.
Shlesinger, editors. Dynamic Patterns in Complex Systems.
World Scientific, Singapore.
- Shugart, H. H., T. M. Smith, and W. M. Post. 1992. The
potential for application of individual-based simulation models
for assessing the effects of global change. Annual Review of
Ecology and Systematics 23:15-38.
- Smith, T., and M. Huston. 1989. A theory of the spatial and
temporal dynamics of plant communities. Vegetatio 83:49-69.
- Sole, R. V., and S. C. Manrubia. 1995. Are rainforests self-
organized in a critical state? Journal of Theoretical
Biology 173:31-40.
- Sole, R. V., and O. Miramontes. 1995. Information at the
edge of chaos in fluid neural networks. Physica D 80:171-
180.
- Swenson, R. 1989. Emergent attractors and the law of maximum
entropy production: foundations to a theory of general evolution.
Systems Research 6:187-197.
- Watt, A. S. 1947. Pattern and process in the plant
community. Journal of Ecology 35:1-22.
- Yates, F. E., Editor. 1987. Self-Organizing Systems: the
emergence of order. Plenum Press, New York, USA.
Miscellaneous SOS, SOC, CAS, CA & complexity links
A-life On-line Resources Bibliography by Ezequiel A. Di
Paolo at the School of Cognitive and
Computing Sciences at the University of Sussex.
- A good list of
complexity, self-organization, evolution, and A-life links.
Systems Research Links maintained by Yogesh Malhotra at the
University of Pittsburgh.
- An excellent links list on a variety
of topics by a complex business systems researcher.
by Olivier Bousquet at L'Ecole Polytechnique in Paris.
- A seemingly exhaustive and expansive links list by someone with
too much time on his hands.
- Nonlinear Science e-Print
Archive at Los Alamos National Lab.
- A searchable
archive of on-line, nonlinear-type papers.
Principia Cybernetica Web by the Principia Cybernetica Project (PCP:
how about that acronym?).
- An interesting collection of
materials: "The Project's aim is the computer-supported
collaborative development of an evolutionary-systemic
philosophy. Put more simply, PCP tries to tackle age-old
philosophical questions with the help of the most recent
cybernetic theories and technologies." Trippy at times, but
they have a great links list.
- Chaos at
Maryland, by the Chaos Group at the University of Maryland
at College Park.
- A good academic page by a very good
program, with e-prints, a chaos database, and other
resources.
Complexity Links List maintained by Alex Mallet at the
University of Pennsylvania.
- A big, fun list of a wide
array of complexity links.
Please send comments, questions, corrections and suggestions to Ethan H. Decker