Through experimentation and simulation, scientists are able to gain an understanding of the underlying biological mechanisms involved in living organisms. These mechanisms, both behavioral and structural, serve as inspiration in the development of neural-based robotic architectures. Some examples of animals serving as inspiration for robotic systems are frogs and toads [SN1], the praying mantis [SN2], cockroaches [SN3], and hoverflies [SN4]. To address the underlying complexity in building such biologically inspired, neural-based robotic systems, we usually distinguish between two levels of modeling: behavior (schemas [SN5]) and structure (neural networks [SN6]).
Due to the heavy processing load, most models are designed and implemented at only one of the above two levels of granularity. For example, in Arkin et al. [SN13] we describe a praying mantis prey-predator model simulated and tested in a fielded robotic system exclusively at the behavior level. Such models have served as the basis for new areas of robotic application, such as ecological robotics [SN14]. On the other hand, models that actually involve neural networks are usually limited in scope, as in [SN15], while more complex models [SN16] are simplified in terms of their inherent neural complexity. Yet, unless we can fully experiment with the more complex models, our comprehension of them will remain limited. For example, consider an extension to the latter, the toad's prey-predator visuomotor coordination model described in Weitzenfeld et al. [SN17], with schema- and neural-level components shown in Figure 2.
The diagram shows the two
levels of modeling granularity. At the schema level, blocks correspond
to schemas representing animal or robot behavior. At the neural level,
blocks represent neural networks, some having a direct correspondence to
brain regions [SN18]. Our goal is to execute such a model on a mobile robot in an efficient manner while interacting with its different behavioral and structural components in real time. A preliminary architecture for this system was presented in Weitzenfeld et al. [SN19].
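The two levels of granularity described above can be sketched in code as a composition: a behavior-level schema either computes its activity from a coarse behavioral rule, or delegates to a structure-level neural network that refines it. This is only an illustrative sketch; the class names, the leaky-integrator dynamics, and the threshold rule are assumptions for exposition, not the implementation of the cited models.

```python
# Illustrative sketch of two-level modeling: behavior-level schemas
# composed over structure-level neural networks. All names and dynamics
# here are hypothetical, chosen only to show the separation of levels.
import math

class NeuralNetwork:
    """Structure level: a minimal leaky-integrator neuron layer."""
    def __init__(self, size, tau=0.1):
        self.size = size
        self.tau = tau
        self.membrane = [0.0] * size  # membrane potentials

    def step(self, inputs, dt=0.01):
        # Leaky integration of each neuron toward its input,
        # with a sigmoid firing rate as output.
        for i in range(self.size):
            self.membrane[i] += (dt / self.tau) * (inputs[i] - self.membrane[i])
        return [1.0 / (1.0 + math.exp(-m)) for m in self.membrane]

class Schema:
    """Behavior level: activity comes from a coarse behavioral rule,
    unless the schema has been refined down to a neural network."""
    def __init__(self, name, network=None):
        self.name = name
        self.network = network

    def update(self, stimulus):
        if self.network is not None:
            # Structure-level refinement: delegate to the network.
            return self.network.step(stimulus)
        # Behavior-level approximation: a simple threshold rule.
        return [1.0 if s > 0.5 else 0.0 for s in stimulus]

# A prey-recognition schema refined to the neural level, alongside a
# schema left at the behavior level; both expose the same interface.
prey_schema = Schema("prey_recognizer", network=NeuralNetwork(size=3))
avoid_schema = Schema("predator_avoid")
rates = prey_schema.update([0.2, 0.9, 0.4])      # graded firing rates
decision = avoid_schema.update([0.2, 0.9, 0.4])  # binary activations
```

The point of the sketch is that both levels share one interface, so a model can mix coarse schemas with neurally refined ones, which is the kind of mixed-granularity execution the architecture above aims to support.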