In the 1940s, a trio of legendary animal watchers in Europe -- Konrad
Lorenz, Karl von Frisch, and Niko Tinbergen -- began describing the logical
underpinnings of animal behavior. Lorenz shared his house with geese,
von Frisch lived among honeybee hives, and Tinbergen spent his days with
stickleback fish and sea gulls. By rigorous and clever experiments the
three ethologists refined the lore of animal antics into a respectable
science called ethology (roughly, the study of character). In 1973, they
shared a Nobel prize for their pioneering achievements. When
cartoonists, engineers, and computer scientists later delved into the
literature of ethology, they found, much to their surprise, a remarkable
behavioral framework already worked out by the three ethologists, ready
to be ported over to computers.
At the core of ethological architecture dwells the crucial idea of
decentralization. As formalized in 1951 by Tinbergen in his book The
Study of Instinct, the behavior of an animal is a decentralized
coordination of independent action (drive) centers which are combined
like behavioral building blocks. Some behavioral modules consist of a
reflex; they invoke a simple function, such as: pull away when hot, or
blink when touched. The reflex knows nothing of where it is, what else
is going on, or even of the current goal of its host body. It can be
triggered anytime the right stimulus appears.
A male trout instinctually responds to the following stimuli: a female
trout ripe for copulation, a nearby worm, a predator approaching from
behind. But when all three stimuli are presented simultaneously, the
predator module always wins out by suppressing feeding or mating
instincts. Sometimes, when there is a conflict between action modules,
or several simultaneous stimuli, management modules are triggered to
decide. For instance, you are in the kitchen with messy hands when the
phone rings at the same time someone knocks on the front door. The
conflicting drives -- jump to the phone! no, wipe hands first! no, dash to
the door! -- could lead to paralysis unless arbitrated by a third module of
learned behavior, perhaps one that invokes the holler, "Please wait a second!"
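The trout's arbitration rule can be sketched in code. This is a minimal illustration, not anything from the ethology literature: each drive fires on its own stimulus, and a fixed dominance ordering lets the predator module suppress feeding and mating outright. The drive names and priorities are invented for the example.

```python
def select_action(stimuli):
    """Return the action of the highest-priority triggered drive.

    `stimuli` is a set of strings. The ordering below encodes the
    trout example: escape dominates feeding, which dominates mating.
    """
    drives = [
        ("predator", "flee"),      # most dominant: suppresses all below
        ("worm", "feed"),
        ("ripe female", "mate"),
    ]
    for stimulus, action in drives:
        if stimulus in stimuli:
            return action
    return "idle"                  # no stimulus, no drive fires

# All three stimuli at once: the predator module always wins.
print(select_action({"ripe female", "worm", "predator"}))  # flee
print(select_action({"worm", "ripe female"}))              # feed
```

Note that no module "decides" in any global sense; the winner is simply whichever triggered drive sits highest in the suppression order.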
A less passive way to view a Tinbergen drive center is as an "agent." An
agent (whatever physical form it takes) detects a stimulus, then reacts.
Its reaction, or "output" in computer talk, may be considered input by
other modules, drive centers, or agents. Output from one agent may
enable other modules (cocking a gun's hammer) or it may activate other
modules already enabled (pulling the trigger). Or the signal may disable
(uncock) a neighboring module. Rubbing your tummy and patting your head
at the same time is tricky because, for some unknown reason, one action
suppresses the other. Commonly an output may both enable some centers
and suppress others. This is, of course, the layout of a network swamped
with circular causality and primed to loop into self-creation.
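The cock/fire/uncock wiring described above can be made concrete in a few lines. This is a hedged sketch with invented names, not a real agent framework: an agent's output may enable another module (cock the hammer), activate one already enabled (pull the trigger), or suppress a neighbor (uncock it).

```python
class Agent:
    """One drive center in a Tinbergen-style network (illustrative)."""

    def __init__(self, name):
        self.name = name
        self.enabled = False      # "cocked": ready to fire when activated

    def enable(self):             # another agent's output cocks this one
        self.enabled = True

    def suppress(self):           # another agent's output uncocks this one
        self.enabled = False

    def activate(self):           # pulling the trigger: fires only if cocked
        if self.enabled:
            return f"{self.name} fires"
        return None

# Rubbing your tummy and patting your head: one action suppresses the other.
pat, rub = Agent("pat-head"), Agent("rub-tummy")
pat.enable()
rub.enable()
result = pat.activate()   # pat-head fires...
rub.suppress()            # ...and its output uncocks rub-tummy
print(result, rub.activate())
```

An output that both enables some centers and suppresses others is just an agent calling `enable()` on one neighbor and `suppress()` on another -- exactly the circular wiring the passage describes.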
Outward behavior thus emerges from the thicket of these blind reflexes.
Because of behavior's distributed origin, very simple agents at the
bottom can produce unexpectedly complex behavior at the top. No central
module in the cat decides whether the cat should scratch its ear or lick
its paw. Instead, the cat's conduct is determined by a tangled web of
independent "behavioral agents" -- cat reflexes cross-activating each
other, forming a gross pattern (called licking or scratching) that wells
up from the distributed net.
This sounds a lot like Brooks's subsumption architecture because it is.
Animals are robots that work. The decentralized, distributed control
that governs animals is also what works in robots. Web-strewn diagrams
of interlinked behavior modules in ethology
textbooks appear to computer scientists as computer logic flow charts.
The message is: Behavior is computerizable. By arranging a circuit of
subbehaviors, any kind of personality can be programmed. It is
theoretically feasible to generate in a computer any mood, any
sophisticated emotional response that an animal has. Film creatures will
be driven by the same bottom-up governance of behavior running Robbie
the Robot -- and the very same scheme borrowed from living songbirds and
stickleback fish. But instead of causing pneumatic hoses to pressurize,
or fishtails to flick, the distributed system pumps bits of data which
move a leg on a computer screen. In this way, autonomous animated
characters in film behave according to the same general organizational
rules as real animals. Their behavior, although synthetic, is real
behavior (or at least hyperreal behavior). Thus, toons are simply robots
without hard bodies.
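The hand-off the passage describes -- the same behavior network driving either a pneumatic robot or a figure on screen -- amounts to swapping the final effector. A sketch, with all names invented for illustration:

```python
def behavior_network(stimuli):
    """Subsumption-style arbitration: highest triggered behavior wins."""
    for stimulus, command in [("obstacle", "turn"), ("open floor", "walk")]:
        if stimulus in stimuli:
            return command
    return "rest"

def robot_effector(command):
    # In a hard-bodied robot, the command pressurizes hoses.
    return f"pressurize hoses to {command}"

def toon_effector(command, leg_x):
    # In a toon, the same command just moves a leg on screen.
    return leg_x + 1 if command == "walk" else leg_x

command = behavior_network({"open floor"})
print(robot_effector(command))
print(toon_effector(command, leg_x=10))
```

The arbitration logic never changes; only the last line of plumbing differs, which is why a toon really is a robot without a hard body.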
More than just movement can be programmed. Character -- in the
old-fashioned sense of the word -- can be encapsulated into bit code.
Depression, elation, and rage will all be add-on modules for a
creature's operating system. Some software companies will sell better
versions of the fear emotion than others. Maybe they'll sell "relational
fear" -- fear that not only registers on a creature's body but trickles
into successive emotion modules and only gradually dissipates over