Lecture 2: Paradigms of Cognitive Science - David Vernon
An Introduction to
Artificial Cognitive Systems
David Vernon
Institute for Cognitive Systems
Technical University of Munich
www.vernon.eu
Copyright © 2012 David Vernon
Lecture 2
Paradigms of Cognitive Science
Cognitive Science
• Cybernetics (1943-53)
• Cognitivism (1956-) (AI)
• Emergent Systems (1958-): network dynamics, synergetics, dynamical systems theory
• Enactive Systems (1970-) [Varela 88]
COGNITION
• Cognitivist: the computational paradigm of physical symbol systems and Newell’s Unified Theories of Cognition
• Emergent: self-organization, realized in connectionist systems, dynamical systems, and enactive systems
• Hybrid: combinations of the cognitivist and emergent paradigms
Some Background
on
Cybernetics
Cybernetics
• The word cybernetics has its roots in the Greek word κυβερνητης or
kybernetes, meaning steersman
• It was defined in Norbert Wiener’s book Cybernetics, first published
in 1948, as “the science of control and communication” (this was the
sub-title of the book)
• W. Ross Ashby notes in his book An Introduction to Cybernetics,
published in 1956, that cybernetics is essentially “the art of
steersmanship”
– co-ordination, regulation, and control.
• The word governor derives from gubernator, the Latin version of
κυβερνητης
Cybernetics
• Closed-loop control
• An action by the system causes some change in its environment
and that change is fed to the system via feedback that enables the
system to change its behaviour
• This "circular causal" relationship is necessary and sufficient for a
cybernetic perspective [Wikipedia]
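The circular causal loop above can be sketched in a few lines of code. This is a minimal illustration, assuming a toy first-order “room temperature” plant and a simple proportional controller; all names and parameter values are invented for the example, not taken from the lecture:

```python
# Closed-loop control sketch: the system acts, the environment changes,
# and the sensed change feeds back to shape the next action.

def simulate(setpoint=20.0, steps=50, gain=0.5):
    temp = 10.0                     # sensed state of the environment
    history = []
    for _ in range(steps):
        error = setpoint - temp     # feedback: compare sensing with the goal
        action = gain * error       # act in proportion to the error
        temp += action + 0.1 * (15.0 - temp)  # plant also drifts toward ambient
        history.append(temp)
    return history

h = simulate()   # temperature settles close to the setpoint
```

The loop body is exactly the circular causal chain: action, environmental change, sensing, comparison with the desired goal, and again action.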
Cybernetics
• “The essential goal of cybernetics is to understand and define the
functions and processes of systems that have goals and that
participate in circular, causal chains that move from action to
sensing to comparison with desired goal, and again to action.”
[Wikipedia]
Cybernetics
• Recent definition proposed by Louis Kauffman, President of the
American Society for Cybernetics
"Cybernetics is the study of systems and processes that interact with
themselves and produce themselves from themselves."
Different Paradigms of
Cognitive Systems
The Cognitivist Approach
Information Processing
Physical Symbol Systems
Representationist Approach
‘The world we perceive is isomorphic with our perceptions of it
as a geometric environment’
[Shepard & Hurwitz ‘84]
‘Cognition is a type of computation’
‘People “instantiate” … representations physically as cognitive
codes and that their behaviour is a causal consequence of
operations carried out on these codes’
[Pylyshyn ‘84]
From The Tree of Knowledge – The Biological Roots of Human Understanding
H. R. Maturana and F. J. Varela, 1988
Cognitivist Systems
• Strong representations
– Explicit & symbolic
– Representations denote external objects
– Isomorphic
– Projection
– Implies an absolute and accessible ontology
– That is consistent with human expression …
Cognitivist Systems
• Representations
– Descriptive product of a human designer
– Can be directly accessed & understood by humans
– Human knowledge can be directly injected into an
artificial cognitive system
Cognitivist Systems
• But …
programmer-dependent representations bias the system and
‘blind’ it (cf. Winograd & Flores)
• … can you anticipate every eventuality in your design?
• The semantic gap … and the Symbol Grounding problem
Cognitivist Systems
• Plugging the gap by …
– Machine learning
– Probabilistic modelling
– Better models
– Better logics
– Better reasoning
– Better everything
– … and still …
Cognitive science
is involved in an escalating retreat
from the inner symbol:
a kind of inner symbol flight
A. Clark, Mindware
Cognitivism and Artificial Intelligence
Cognitivism & Artificial Intelligence
• Physical symbol system approach to AI
• Intelligence
– The degree to which a system approximates a knowledge-level system
[Unified Theories of Cognition, Newell 90]
– A knowledge-level system: can bring all its knowledge to bear on every
problem
• Perfect knowledge -> complete use of knowledge
• Humans aren’t there yet!!
– Principle of maximum rationality [Newell 82]
• ‘If an agent has knowledge that one of its actions will lead to one of its goals,
then the agent will select that action’
– Rational analysis [Anderson 89]
• ‘The cognitive system optimizes the adaptation of the behaviour of the
organism’.
Cognitivism & Artificial Intelligence
• Physical Symbol Systems
– Symbols are abstract entities that can be instantiated as tokens
– A physical symbol system has [Newell 90]:
• Memory (to contain the symbolic information)
• Symbols (to provide a pattern to match or index other symbols)
• Operations (to manipulate symbols)
• Interpretations (to allow symbols to specify operations)
• Capacities for
– Composability
– Interpretability
– Sufficient memory
– Symbol systems can be instantiated but … behaviour is
independent of the particular form of the instantiation
Cognitivism & Artificial Intelligence
• Physical Symbol Systems [Newell and Simon 1975]
– The Physical Symbol System Hypothesis
• A physical symbol system has the necessary and sufficient means for
general intelligent action
• Any system that exhibits general intelligence is a physical symbol
system
• A physical symbol system is ‘a machine that produces through time an
evolving collection of symbol structures’
Cognitivism & Artificial Intelligence
• Physical Symbol Systems [Newell and Simon 1975]
– Symbol systems comprise symbol structures (expressions / patterns) and processes
– Processes produce, destroy, and modify symbol structures
– Symbol structures designate objects and processes
– Processes can affect objects, can be affected by objects, and can be interpreted, i.e. carry out the designated process
Cognitivism & Artificial Intelligence
• Physical Symbol Systems [Newell and Simon 1975]
– The Heuristic Search Hypothesis
• The solutions to problems are represented as symbol structures
• A physical symbol system exercises its intelligence in problem-solving
by search
– Generating and progressively modifying symbol structures until it
produces a solution structure
– Effective and efficient search
• ‘The task of intelligence is to avert the ever-present threat of the
exponential explosion of search’
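The heuristic search hypothesis can be illustrated with a standard best-first (A*) search, where a heuristic estimate of remaining cost is what “averts the exponential explosion”. The tiny graph and heuristic values below are invented for the example, not taken from Newell and Simon:

```python
import heapq

def astar(graph, h, start, goal):
    # frontier entries: (f = g + h, cost so far, state, path);
    # symbol structures (paths) are generated and progressively
    # modified until a solution structure is produced
    frontier = [(h[start], 0, start, [start])]
    seen = set()
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return g, path
        if state in seen:
            continue
        seen.add(state)
        for nxt, cost in graph.get(state, []):
            if nxt not in seen:
                heapq.heappush(frontier,
                               (g + cost + h[nxt], g + cost, nxt, path + [nxt]))
    return None

# toy search space; h gives admissible estimates of the distance to G
graph = {'S': [('A', 1), ('B', 4)],
         'A': [('B', 2), ('G', 5)],
         'B': [('G', 1)]}
h = {'S': 3, 'A': 2, 'B': 1, 'G': 0}
```

Without the heuristic the frontier grows blindly; with it, the search expands only the states that look promising, which is the “effective and efficient search” the slide refers to.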
Cognitivism & Artificial Intelligence
• Physical Symbol Systems
– The Physical Symbol System Hypothesis
A physical symbol system has the necessary and sufficient means
of general intelligence
– What are the implications for humans???
– Natural and artificial intelligence are equivalent (why?)
Cognitivism & Artificial Intelligence
Newell’s four levels
• Social: 10⁵–10⁷ s; behaviours
(requires a symbolic level)
• Rational: 10²–10⁴ s; tasks requiring reasoning (find way home)
• Cognitive: 10⁻¹–10¹ s; deliberate acts (reaching), composed operations (shifting gear), actions (steering a car into a gate)
• Biological: 10⁻⁴–10⁻² s; organelle, neuron, neural circuit (all knowledge is here)
Cognitivism & Artificial Intelligence
• (Cognitive) Architecture: defines the manner in which a cognitive agent
manages the primitive resources at its disposal
• Dictates representations and their deployment
• Dictates properties of cognitive system
– Organization & control strategies (coordination/cooperation;
modular/hierarchical)
– Memory, knowledge, representation (world models, declarative
representations, procedural representations, associative memory, episodic
knowledge, meta-knowledge, representational structures)
– Types of learning (deliberative vs reflexive; monotonic vs non-monotonic)
– Types of planning
– Behaviour (coherence, saliency, & adequacy: consistency, relevance,
sufficiency)
Cognitivism & Artificial Intelligence
• Unified Theories of Cognition
– Attempts to explain the mechanisms underlying all problems in its
domain
– Now plausible (Newell) cf the Soar project
– Applies to both natural and artificial cognition
Cognitivism & Artificial Intelligence
• Non-functional Attributes of Cognitive Architectures
– Generality (breadth of tasks)
– Versatility (ibid)
– Rationality (cf. consistency and repeatability)
– Scalability (cf. complexity)
– Reactivity (cf. unpredictability)
– Efficiency (cf. time and space constraints)
– Extendability (cf. reconfigurability)
– Taskability (cf. external direction)
– Psychological validity (cf. human models)
Different Paradigms of
Cognitive Systems
Emergent Approaches
Self-generated AI:
AI by orchestrating the processes that generate it
Luc Steels
Emergent Approaches
• Cognition is the process whereby an autonomous system
becomes viable and effective in its environment
• It does so through a process of self-organization
– System is continually re-constituting itself
– In real-time
– To maintain its operational identity
– Through moderation of mutual system-environment interaction and
co-determination
Maturana & Varela 87
Emergent Approaches
• Co-determination
– Cognitive agent is specified by its environment
– Cognitive process determines what is real or meaningful for the
agent
– The system constructs its reality (world) as a result of its operation
in that world
– Perception provides sensory data to enable effective action, but as
a consequence of the system’s actions
– Cognition and perception are functionally-dependent on the
richness of the action interface [Granlund]
Emergent Approaches
• Cognition is the complement of perception [Sandini]
– Perception deals with the immediate
– Cognition deals with longer time frames
• Primate model of cognitive learning is anticipative skill construction
(not knowledge acquisition)
• The root of intelligence is the ability to guide action and, at the same
time, improve that ability
• Embodied as physical systems capable of physical interaction with the
world
Emergent Approaches
‘Cognitive systems need to acquire information about the external
world through learning or association’
[Granlund’02]
Emergent Approaches
• Self-organization
– “Self-organizing systems are physical and biological systems in which
pattern and structure at the global level arises solely from interactions
among the lower-level components of the system.”
– “The rules specifying interactions among the system’s components are
executed only using local information, without reference to the global
pattern.”
• Emergence
– A process by which a system of interacting elements acquires qualitatively
new pattern and structure that cannot be understood simply as the
superposition of the individual contributions.
Camazine 2006
Emergent Approaches
• Self-organization
– Short-range activator
• Autocatalysis: promotes its own production
• Increases the production of an inhibitor (antagonist)
– Inhibitor
• Diffuses rapidly
– Result:
• Local increase in activation
• Long-range antagonistic inhibition which keeps the self-enhancing
reaction localized
Camazine 2006
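The principle of local activation with long-range inhibition can be sketched numerically. This is a crude stand-in model, assuming a ring of cells in which each cell reinforces itself and its immediate neighbours (short-range autocatalysis), while global normalisation of the total activity plays the role of the fast-diffusing inhibitor; all parameters are illustrative:

```python
import random

def step(a):
    # short-range activation: each cell grows in proportion to itself
    # and its immediate neighbours (autocatalysis)
    n = len(a)
    grown = [a[i] * (a[(i - 1) % n] + 2 * a[i] + a[(i + 1) % n])
             for i in range(n)]
    # long-range antagonism: normalising the total makes every cell
    # compete with every other, keeping activation localized
    total = sum(grown)
    return [g / total for g in grown]

random.seed(1)
n = 40
a = [(1.0 + 0.05 * random.random()) / n for _ in range(n)]
for _ in range(60):
    a = step(a)
# starting from an almost uniform field, activity self-organizes into
# a single localized peak determined only by local interactions
```

No cell has access to the global pattern: the rules use only local information, yet a global structure (a localized peak of activation) emerges, which is the point of the Camazine definition above.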
Emergent Approaches
Local autocatalysis
Non-local negative feedback
Meinhardt 95
Emergent Approaches
Camazine 2006
Connectionist Systems
Connectionist Systems
• Rely on
– Parallel processing
– Non-symbolic distributed activation patterns in networks
– Not logical rules
• Neural networks are the most common instantiations
– Dynamical systems that capture statistical regularities or
associations
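A minimal sketch of the connectionist idea: knowledge held as a distributed pattern of connection weights rather than as symbols or logical rules. It assumes a Hebbian outer-product associator over bipolar patterns; the patterns themselves are invented for the example:

```python
def train(pairs, n_in, n_out):
    # Hebbian outer-product learning: each weight accumulates the
    # co-activation of its input and output units
    w = [[0.0] * n_in for _ in range(n_out)]
    for x, y in pairs:
        for j in range(n_out):
            for i in range(n_in):
                w[j][i] += y[j] * x[i]
    return w

def recall(w, x):
    # parallel recall: every output unit sums its weighted inputs and
    # thresholds the result; no rules or symbols are consulted
    return [1 if sum(wi * xi for wi, xi in zip(row, x)) > 0 else -1
            for row in w]

# two orthogonal bipolar input patterns and their target outputs
pairs = [([1, -1, 1, -1], [1, -1]),
         ([1, 1, -1, -1], [-1, 1])]
w = train(pairs, 4, 2)
```

The associations are captured statistically in the weights: recalling with the first pattern corrupted in one position still retrieves the stored output, which is the graceful, non-symbolic behaviour the slide contrasts with logical rules.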
Dynamical Systems
Dynamical Systems
– A dynamical system is an open dissipative non-linear non-equilibrium
system
– System: large number of interacting components & large number of
degrees of freedom
– Dissipative: dissipates energy; phase-space volume decreases with
time (⇨ preferential sub-spaces)
– Non-equilibrium: unable to maintain structure or function without
external sources of energy, material, information (hence, open)
– Non-linearity: dissipation is not uniform – small number of system’s
degrees of freedom contribute to behaviour
• Order parameters / collective variables
Kelso ‘95: Dynamic Pattern – The Self-Organization of Brain and Behaviour
Dynamical System
q̇ = N(q, p, n)
where q̇ is the time derivative of the state vector q, p is the vector of control parameters, and n is noise.
From [Schöner & Kelso 88]
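As a concrete instance of the state equation q̇ = N(q, p, n), the Haken-Kelso-Bunz order-parameter equation φ̇ = -a sin φ - 2b sin 2φ (for the relative phase between two coordinated limbs, discussed by Kelso) can be integrated with a simple Euler scheme. Parameter values, step sizes, and the noise level are illustrative, not from the lecture:

```python
import math, random

def hkb(q, a, b):
    # N(q, p): the deterministic part of the state equation
    return -a * math.sin(q) - 2 * b * math.sin(2 * q)

def integrate(q0, a, b, dt=0.01, steps=5000, noise=0.01):
    # Euler integration of q' = N(q, p, n) with a small noise term n
    random.seed(0)
    q = q0
    for _ in range(steps):
        q += dt * (hkb(q, a, b) + noise * random.gauss(0, 1))
    return q

# with b/a > 1/4 both in-phase (q near 0) and anti-phase (q near pi)
# coordination patterns are stable attractors; when the control
# parameter ratio b/a falls below 1/4, anti-phase loses stability
q_anti = integrate(2.5, a=1.0, b=1.0)   # settles near pi
q_in = integrate(2.5, a=1.0, b=0.1)     # anti-phase unstable: settles near 0
```

The small number of collective variables (here a single relative phase) capturing the behaviour of a high-dimensional system is exactly the order-parameter idea in the bullet above.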
Enactive Systems
Enaction
• Orthodoxy (cognitivist)
– World as the system experiences it is independent of the cognitive
system (knower)
• Enactive view
– Known and knower ‘stand in relation to each other as mutual
specification: they arise together’
Enaction
• Five key elements to enactive systems
– Autonomy
– Embodiment
– Emergence
– Experience
– Sense-making
Enaction
• Autonomy
– Self-maintenance
– Homeostasis
– Not controlled by outside agencies
– Stands apart from its environment
Enaction
• Embodiment
– Exists as a physical entity
– Directly interacts with its environment
• Structural coupling
• Mutual perturbation
– Constitutive part of the cognitive process
Enaction
• Emergence
– Cognitive behaviour arises from dynamic interplay between
component parts
– Internal dynamics
• Maintains autonomy
• Condition the system’s experiences through their embodiment in a
specific structure
Enaction
• Experience
– History of interaction with the world
– Interactions don’t control
– Interactions do trigger changes in system state
– Changes are structurally-determined
• Phylogeny
• Structural coupling
Enaction
• Sense-making
– Knowledge is generated by the system itself
– Captures some regularity or lawfulness in the interactions
– The ‘sense’ is dependent on the way interaction can take place
• Perception & Action
– Modify its own state (CNS) to enhance
• Predictive capacity
• Action capabilities
– Development: generative autonomous self-modification
Development
Progressive ontogenetic acquisition of anticipatory capabilities
– Cognition cannot short-circuit ontogeny
– Necessarily the product of a process of embodied development
– Initially dealing with immediate events
– Increasingly acquiring a predictive capability
Cognition and perception are functionally-dependent
on the richness of the action interface
Co-determination / Structural Coupling
Autonomy-preserving mutual interaction
Perturbation of the system is only effected by the environment
[Note: this ideogram and similar ones to follow were introduced in Maturana and Varela 1987]
Cognitive system: operationally-closed system with a nervous system
Nervous system facilitates a highly-plastic mapping between
sensor and motor surfaces
Perturbation by both environment and system (of receptors & NS)
Enaction
NERVOUS SYSTEM
(a) Facilitates huge increase in the number of possible
sensor-motor patterns (that result from structural coupling
with the environment)
(b) Creates new dimensions (DoF) of structural coupling
by facilitating association of internal states with the
interactions in which the system is engaged
Anticipation / Planning / Explanation / Prediction
INTERACTION
A shared activity in which the actions of each agent
influence the actions of the other agents in the same interaction,
resulting in a mutually-constructed pattern of shared behaviour
[Ogden et al.]
Meaning emerges through shared consensual
experience mediated by interaction
Interaction is a shared activity in which the actions of each agent
influence the actions of the other agents engaged in the same
interaction, resulting in a mutually-constructed pattern of shared
behaviour
Ogden, Dautenhahn, Stribling 2002
Bond of Union
M. C. Escher, 1956
COGNITION & SENSE-MAKING
Cognition is a process whereby the issues that are important
for the continued existence of the cognitive entity
are brought forth: co-determined by the entity as it interacts
with the environment in which it is embedded
THE SPACE OF PERCEPTUAL POSSIBILITIES
Is predicated not on an objective environment,
but on the space of possible actions
that the system can engage in whilst still
maintaining the consistency of its coupling with the environment
Enaction
• Autopoiesis
– Autopoiesis is a special type of self-organization
– Requires self-specification and self-generation
(autopoiesis means literally `self-production')
– An autopoietic system is a special type of homeostatic
system (i.e. self-regulating system)
• regulation applies not to some system parameter
• but to the organization of the system itself.
Enaction
• Autopoiesis
– Thus, its organization is constituted by relations between
components that by their interaction and transformation
continuously regenerate the network of processes that
produce them
– That is, the relations among components -- a network of
processes of production --- produce the components
themselves
– A significant aspect of autopoiesis is that its function is to
`create and maintain the unity that distinguishes it from the
medium in which it exists'.
Enaction
• Autopoiesis
– Self-production
– System emerges as a coherent systemic entity, distinct from
its environment, as a consequence of processes of self-organization
– Three orders of system …
Enaction
• Autopoiesis
– First-order autopoietic systems correspond to cellular
entities that achieve physical identity through structural
coupling with the environment:
Environmental perturbations trigger structural changes that
permit it to continue operating
Enaction
• Autopoiesis
– Second-order systems correspond to meta-cellular entities
that exhibit autopoiesis through operational closure in their
organization
They specify a network of dynamic processes that don’t
leave the network (organizationally closed)
Engage in structural coupling with the environment, this time
with a nervous system (-> much enlarged space of internal
states & much enlarged space of interaction)
– Cognitive systems (at agent level)
Enaction
• Autopoiesis
– Third-order systems exhibit coupling between second-order
(cognitive) systems
– Common ontogenetic drift between reciprocally-coupled
systems
– Shared epistemology / language
• Reflects but does not abstract the common medium in which
they are coupled
– 3 types of behaviour
• Instinctive behaviour (based on autopoietic self-organization)
• Ontogenic behaviour (based on its development)
• Communication / social behaviour
Enaction
• Autopoiesis
– Linguistic behaviours: intersection of ontogenic and
communication behaviours
– Facilitates creation of a common understanding of a shared
world (in which they are coupled)
– Language is an emergent consequence of the third-order
structural coupling of a socially-cohesive group of cognitive
entities
Enaction
• Autopoiesis
– Consciousness (mind) emerges when language and
linguistic behaviour is applied recursively to the agent itself
– Consciousness is an emergent property of social coupling
– Doesn’t have any independent existence (and you can’t
model it!)
Enaction
• Third-order systems: coupling between second-order
(cognitive) systems
– Recurrent (common) ontogenic drift from reciprocal coupling
– Instinctive behaviour based on second-order organization
(phylogenic evolution)
– Ontogenetic behaviour, development qua learning over its lifetime
– Communicative behaviour: third order structural coupling (social
coupling)
– Linguistic behaviours: establishment of a shared epistemology
The Hierarchy of Enactive Systems
Multilevel autopoietic / cognitive systems; the same approach at each level: operational closure
1. Cellular organization: physical instantiation; structural coupling with the environment
2. Viable organisms: emergence of cognition; intelligent (adaptive & anticipatory) behaviour; rationalization of interactions & perceptions
3. Societies of cognitive entities: emergence of language; explicit deliberative self-reflective reasoning
Enactive Cognition
MEANING
Emerges through shared consensual experience
mediated by interaction
Enactive Cognition
• Structural coupling
– Necessarily embodied
– Operating in synchronous real-time
Machine representations reflect
the machine’s epistemology
Enactive Cognition
Need to achieve synthetic cognition without falling back into using
cognitivist preconceptions basing the ‘design’ on representations
derived by external observers (us)
Bickhard’s Self-Maintenant &
Recursive Self-Maintenant Systems
Emergent Models
• “The grounds for cognition are adaptive far-from-equilibrium
autonomy – recursively self-maintenant autonomy – not symbol
processing nor connectionist input processing.”
Emergent Models
• Autonomy: the property of a system to contribute to
its own persistence
• Different grades of contribution -> different levels of
autonomy
Emergent Models
• Self-maintenant systems
– Make active contributions to their own persistence
– Do not contribute to maintaining the conditions for
persistence
Emergent Models
• Recursive self-maintenant systems
– Do contribute actively to the conditions for persistence
– Can deploy different processes of self-maintenance
depending on environmental conditions
– “they shift their self-maintenant processes so as to maintain
self-maintenance as the environment shifts” !!!
Emergent Models
• Far from equilibrium stability
– Non-thermodynamic equilibrium
– Requires that the system does NOT go to thermodynamic
equilibrium
– Completely dependent for their continued existence on continued
contributions from external factors
– Do require environmental interaction
– Necessarily open processes (which exhibit closed organization)
Emergent Models
• Self-maintenant and recursive self-maintenant systems are both
far-from-equilibrium systems
• Recursive self-maintenant systems do yield the emergence of
representations
– Function emerges in self-maintenant systems
– Representation emerges in a particular type of function
(“indications of potential interactions”) in recursively self-maintenant
systems
Emergent Models
Why development in humanoid form
is important for cognition
Emergent
Systems
Cognition is the process whereby
an autonomous system becomes
viable and effective in its
environment
Emergent Systems
Self-Organization
The system is continually re-constituting itself in real-time
to maintain its operational identity through moderation of
mutual system-environment interaction
Emergent
Systems
Self-
Organization
Codetermination
The system and environment are co-determined
through structural coupling and
contingent self-organization
Co-determination:
Agent is specified by its environment and the
cognitive process determines what is real or
meaningful for the agent
Agent constructs its reality as a result of its
operation in that world.
Emergent
Systems
Self-
Organization
Codetermination
Maturana's and Varela's idea of structural coupling of level-one
autopoietic systems [Maturana & Varela 87]
Kelso's circular causality of action and perception [Kelso 95]
Organizational principles inherent in Bickhard's self-maintenant
systems [Bickhard 00]
Emergent
Systems
Self-
Organization
Codetermination
This implies a particular phylogeny:
Phylogeny
1. To effect the homeostatic self-organization
2. To effect the structural coupling
Emergent
Systems
Self-
Organization
Codetermination
Phylogeny
Neuroscience
Developmental
Psychology
Recent studies in natural cognition
guide the identification of this phylogeny
Emergent
Systems
Self-
Organization
Codetermination
Development
Phylogeny
Development is the cognitive process of
establishing (discovering) the possible space of
mutually-consistent couplings
Bickhard's concept of recursive self-maintenance [Bickhard 00]
Maturana's and Varela's level-two and level-three autopoietic systems [Maturana & Varela 87]
Emergent
Systems
Self-
Organization
Codetermination
Development
Phylogeny
Ontogeny
Development implies ontogenic development:
the cognitive system develops its own
epistemology: its own system-specific history- and
context-dependent knowledge of its world
Emergent
Systems
Self-
Organization
Codetermination
Development
Phylogeny
Ontogeny
Neuroscience
Developmental
Psychology
Neural Nets
Dynamical
Systems
Emergent
Systems
Self-
Organization
Codetermination
Development
Phylogeny
Ontogeny
Emergent
Systems
Self-
Organization
Embodiment
Codetermination
Development
Phylogeny
Ontogeny
Emergent systems also implies embodiment …
But what type of embodiment?
It depends on the needs of the structural coupling
Emergent
Systems
Self-
Organization
Embodiment
Codetermination
Development
Phylogeny
Ontogeny
Structural Coupling
The system-environment perturbations must be rich enough
to drive the ontogenic development but not destructive of the self-organization
No guarantee that the cognitive behaviour will be consistent with human preconceptions
Emergent
Systems
Self-
Organization
Embodiment
Codetermination
Development
Phylogeny
Ontogeny
Humanoid
Meaning emerges through a shared consensual experience,
mediated by interaction (mutually-constructed shared pattern of behaviour)
Therefore, we require a humanoid embodiment to effect mutually-consistent meanings
Otherwise, this is what we’ll get …
Emergent
Systems
Self-
Organization
Embodiment
Codetermination
Development
Phylogeny
Ontogeny
Humanoid
Emergent
Systems
Self-
Organization
Embodiment
Codetermination
Development
Phylogeny
Ontogeny
Humanoid
Morphology is a constitutive component of both co-determination and development
Consequently, a plastic morphology is important: the embodiment shouldn’t be static
Emergent
Systems
Self-
Organization
Embodiment
Codetermination
Development
Phylogeny
Ontogeny
Humanoid
Cognitive
Development
Increasing complexity
in action space
Increasing degree
of Prospection
Different Paradigms of
Cognitive Systems
Hybrid Approaches
Hybrid Models
• Combination of cognitivist & emergent approaches
• Don’t use explicit programmer-based knowledge
• Can use symbolic representations (& representational
invariances) but these are populated by the system itself as it
learns
• Still emphasize perception-action coupling
• Often with action-dependent perception
Hybrid Models
RECAP
Differences between Cognitivist & Emergent Paradigms
1. Computational operation
2. Representational framework
3. Semantic grounding
4. Temporal constraints
5. Inter-agent epistemology
6. Embodiment
7. Perception
8. Action
9. Anticipation
10. Adaptation
11. Motivation
12. Autonomy
13. Cognition
14. Philosophical foundation
[Vernon, Von Hofsten, Fadiga 2010]
What is Cognition?
Cognitivism: information processing, the rule-based manipulation of symbols
Connectionism: the emergence of global states in a network of simple components
Dynamical Systems: a history of activity that brings forth change and activity
Enactive Systems: effective action, a history of structural coupling which enacts (brings forth) a world
Adapted from [Thelen & Smith 94] and [Varela 88]
How Does it Work?
Cognitivism: through any device that can manipulate symbols
Connectionism: through local rules and changes in the connectivity of the elements
Dynamical Systems: through self-organizing processes of interconnected sensorimotor subnetworks
Enactive Systems: through a network of interconnected elements capable of structural changes, undergoing an uninterrupted history
What does a good cognitive system do?
Cognitivism: represents the stable truths about the real world, and solves problems posed to it
Connectionism: develops emergent properties that yield stable solutions to tasks
Dynamical Systems: becomes an active and adaptive part of an ongoing and continually changing world
Enactive Systems: becomes part of an existing on-going world of meaning (in ontogeny) or shapes a new one (in phylogeny)
[From Varela 92]
Which Paradigm is Correct?
• Paradigms are not equally mature
• Dynamical systems
– Arguments are compelling BUT ..
– Not yet clear how to get higher-level cognition
• Cognitivist systems
– More advanced
– Not many achievements in generalization
– More brittle (in principle)
• Enactive (& Dynamical)
– SHOULD be much less brittle (mutual specification through co-development)
– But limited cognition at present
• Hybrid systems
– Best of both worlds?
– But unclear how one can really combine antagonistic philosophies
Which Paradigm is Correct?
• No one has actually designed a complete cognitive system
• Still lots of disagreement on the right approach to take
Recommended Reading
Vernon, D. “Artificial Cognitive Systems – A Primer for Students”,
Chapter 2.
Vernon, D., Metta, G., and Sandini, G. “A Survey of Artificial
Cognitive Systems: Implications for the Autonomous
Development of Mental Capabilities in Computational Agents”,
IEEE Transactions on Evolutionary Computation, special issue
on Autonomous Mental Development, Vol. 11, No. 2, pp. 151-180
(2007).
Vernon, D., von Hofsten, C., and Fadiga, L. A Roadmap for Cognitive
Development in Humanoid Robots, Cognitive Systems
Monographs (COSMOS), Springer, ISBN 978-3-642-16903-8
(2010); Chapter 5.