
An Introduction to Artificial Cognitive Systems

David Vernon
Institute for Cognitive Systems
Technical University of Munich
www.vernon.eu

Copyright © 2012 David Vernon


Lecture 2: Paradigms of Cognitive Science


Cognitive Science

• Cybernetics (1943-53)
• Cognitivism (1956-) (AI)
• Emergent Systems (1958-): Network Dynamics, Synergetics, Dynamical Systems Theory
• Enactive Systems (1970-)

[Varela 88]


COGNITION

• Cognitivist: Physical Symbol Systems; Newell's Unified Theories of Cognition; the computational paradigm
• Emergent (Self-Organization): Connectionist Systems, Dynamical Systems, Enactive Systems
• Hybrid


Some Background on Cybernetics


Cybernetics

• The word cybernetics has its roots in the Greek word κυβερνητης (kybernetes), meaning steersman
• It was defined in Norbert Wiener's book Cybernetics, first published in 1948, as "the science of control and communication" (this was the sub-title of the book)
• W. Ross Ashby notes in his book An Introduction to Cybernetics, published in 1956, that cybernetics is essentially "the art of steersmanship"
  – co-ordination, regulation, and control
• The word governor derives from gubernator, the Latin version of κυβερνητης




Cybernetics

• Closed-loop control
• An action by the system causes some change in its environment, and that change is fed back to the system, enabling it to change its behaviour
• This "circular causal" relationship is necessary and sufficient for a cybernetic perspective [Wikipedia]
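The feedback loop described above is easy to state in code. Below is a minimal sketch, not from the lecture, of a proportional closed-loop controller: the system acts, the environment changes, and the sensed error between goal and state feeds back into the next action. All names and parameter values are illustrative.

```python
# Minimal closed-loop control sketch: a proportional controller
# steering a one-dimensional state toward a goal. Illustrative only.

def simulate(goal=10.0, state=0.0, gain=0.3, steps=20):
    for t in range(steps):
        error = goal - state      # compare sensed state with the desired goal
        action = gain * error     # derive the next action from the error
        state += action           # acting changes the environment ...
        print(f"t={t:2d}  state={state:6.3f}  error={error:6.3f}")
        # ... and the changed state is fed back on the next iteration

simulate()
```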


Cybernetics

• "The essential goal of cybernetics is to understand and define the functions and processes of systems that have goals and that participate in circular, causal chains that move from action to sensing to comparison with desired goal, and again to action." [Wikipedia]


Cybernetics

• A recent definition proposed by Louis Kauffman, President of the American Society for Cybernetics: "Cybernetics is the study of systems and processes that interact with themselves and produce themselves from themselves."


COGNITION

• Cognitivist: Physical Symbol Systems; Newell's Unified Theories of Cognition; the computational paradigm
• Emergent (Self-Organization): Connectionist Systems, Dynamical Systems, Enactive Systems
• Hybrid


Different Paradigms of Cognitive Systems

The Cognitivist Approach

• Information Processing
• Physical Symbol Systems
• A Representationist Approach


'The world we perceive is isomorphic with our perceptions of it as a geometric environment' [Shepard & Hurwitz '84]


'Cognition is a type of computation'

'People "instantiate" … representations physically as cognitive codes and that their behaviour is a causal consequence of operations carried out on these codes' [Pylyshyn '84]


From The Tree of Knowledge: The Biological Roots of Human Understanding, H. R. Maturana and F. J. Varela, 1988


Cognitivist Systems

• Strong representations
  – Explicit & symbolic
  – Representations denote external objects
  – Isomorphic
  – Projection
  – Implies an absolute and accessible ontology
  – That is consistent with human expression …




Cognitivist Systems

• Representations
  – The descriptive product of a human designer
  – Can be directly accessed & understood by humans
  – Human knowledge can be directly injected into an artificial cognitive system


Cognitivist Systems

• But …
  – Programmer-dependent representations bias the system and 'blind' it (cf. Winograd & Flores)
  – Can you anticipate every eventuality in your design?
  – The semantic gap … and the symbol grounding problem


Cognitivist Systems

• Plugging the gap by …
  – Machine learning
  – Probabilistic modelling
  – Better models
  – Better logics
  – Better reasoning
  – Better everything
  – … and still …


'Cognitive science is involved in an escalating retreat from the inner symbol: a kind of inner symbol flight' (A. Clark, Mindware)


Cognitivism and Artificial Intelligence


Cognitivism & Artificial Intelligence

• Physical symbol system approach to AI
• Intelligence
  – The degree to which a system approximates a knowledge-level system [Unified Theories of Cognition, Newell 90]
  – A knowledge-level system can bring all its knowledge to bear on every problem
    • Perfect knowledge -> complete use of knowledge
    • Humans aren't there yet!
  – Principle of maximum rationality [Newell 82]: 'If an agent has knowledge that one of its actions will lead to one of its goals, then the agent will select that action'
  – Rational analysis [Anderson 89]: 'The cognitive system optimizes the adaptation of the behaviour of the organism'


Cognitivism & Artificial Intelligence

• Physical Symbol Systems
  – Symbols are abstract entities that can be instantiated as tokens
  – A physical symbol system has [Newell 90]:
    • Memory (to contain the symbolic information)
    • Symbols (to provide a pattern to match or index other symbols)
    • Operations (to manipulate symbols)
    • Interpretations (to allow symbols to specify operations)
    • Capacities for composability, interpretability, and sufficient memory
  – Symbol systems can be instantiated, but … behaviour is independent of the particular form of the instantiation
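As a rough illustration, not from the lecture, of Newell's ingredients, here is a toy "physical symbol system" in Python: a memory of symbol structures, operations that manipulate them, and an interpretation step that lets symbol structures designate operations. All names and the blocks-world content are invented for the sketch.

```python
# Toy physical-symbol-system sketch: memory, symbols, operations,
# and interpretation (symbols designating operations). Illustrative only.

memory = [("on", "A", "B"), ("on", "B", "table")]   # symbol structures

def op_move(block, dest):
    """Operation: rewrite the symbol structure describing `block`."""
    global memory
    memory = [s for s in memory if not (s[0] == "on" and s[1] == block)]
    memory.append(("on", block, dest))

operations = {"move": op_move}       # interpretation: symbol -> operation

def interpret(expression):
    """Carry out the process designated by a symbol structure."""
    op, *args = expression
    operations[op](*args)

interpret(("move", "A", "table"))    # a symbol structure specifying an operation
print(memory)                        # [('on', 'B', 'table'), ('on', 'A', 'table')]
```

The point of the sketch is only that the ingredients are separable: the same memory and interpreter would run any other set of operations, which is why behaviour is said to be independent of the particular instantiation.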


Cognitivism & Artificial Intelligence

• Physical Symbol Systems [Newell and Simon 1975]
  – The Physical Symbol System Hypothesis
    • A physical symbol system has the necessary and sufficient means for general intelligent action
    • Any system that exhibits general intelligence is a physical symbol system
    • A physical symbol system is 'a machine that produces through time an evolving collection of symbol structures'


Cognitivism & Artificial Intelligence

• Physical Symbol Systems [Newell and Simon 1975]
  – Symbol systems comprise symbol structures (expressions, patterns) and processes
  – Processes produce, destroy, and modify symbol structures
  – Symbol structures designate objects and processes
  – Designated objects can affect, and be affected by, the system
  – Designated processes can be interpreted: the system carries out the designated process


Cognitivism & Artificial Intelligence

• Physical Symbol Systems [Newell and Simon 1975]
  – The Heuristic Search Hypothesis
    • The solutions to problems are represented as symbol structures
    • A physical symbol system exercises its intelligence in problem-solving by search
      – Generating and progressively modifying symbol structures until it produces a solution structure
      – Effective and efficient search
    • 'The task of intelligence is to avert the ever-present threat of the exponential explosion of search'
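A minimal sketch, my illustration rather than Newell and Simon's, of heuristic search: candidate structures are generated and modified, and a heuristic orders the frontier to keep the exponential explosion in check. The puzzle (reach a target number with +3 and *2 moves) and the heuristic are invented for the example.

```python
import heapq

# Minimal heuristic (best-first) search sketch: reach `goal` from `start`
# with +3 and *2 moves; the heuristic |goal - n| orders the frontier.
def best_first(start, goal):
    frontier = [(abs(goal - start), start, [start])]   # (h, state, path)
    seen = set()
    while frontier:
        _, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        if state in seen or state > 4 * goal:          # prune runaway states
            continue
        seen.add(state)
        for succ in (state + 3, state * 2):            # generate successors
            heapq.heappush(frontier, (abs(goal - succ), succ, path + [succ]))
    return None

print(best_first(1, 22))    # -> [1, 4, 8, 16, 19, 22]
```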


Cognitivism & Artificial Intelligence

• Physical Symbol Systems
  – The Physical Symbol System Hypothesis: a physical symbol system has the necessary and sufficient means for general intelligence
  – What are the implications for humans?
  – Natural and artificial intelligence are equivalent (why?)


Cognitivism & Artificial Intelligence

Newell's four levels (characteristic time scales):

• Social: 10⁵–10⁷ s; behaviours; requires a symbolic level
• Rational: 10²–10⁴ s; tasks requiring reasoning (finding your way home)
• Cognitive: 10⁻¹–10¹ s; deliberate acts (reaching), composed operations (shifting gear), actions (steering a car into a gate)
• Biological: 10⁻⁴–10⁻² s; organelle, neuron, neural circuit; all knowledge is here


Cognitivism & Artificial Intelligence

• (Cognitive) architecture: defines the manner in which a cognitive agent manages the primitive resources at its disposal
• Dictates representations and their deployment
• Dictates the properties of the cognitive system
  – Organization & control strategies (coordination/cooperation; modular/hierarchical)
  – Memory, knowledge, representation (world models, declarative representations, procedural representations, associative memory, episodic knowledge, meta-knowledge, representational structures)
  – Types of learning (deliberative vs reflexive; monotonic vs non-monotonic)
  – Types of planning
  – Behaviour (coherence, saliency, & adequacy: consistency, relevance, sufficiency)


Cognitivism & Artificial Intelligence

• Unified Theories of Cognition
  – Attempt to explain all the mechanisms of all problems in their domain
  – Now plausible (Newell); cf. the Soar project
  – Apply to both natural and artificial cognition


Cognitivism & Artificial Intelligence

• Non-functional attributes of cognitive architectures
  – Generality (breadth of tasks)
  – Versatility (ibid.)
  – Rationality (cf. consistency and repeatability)
  – Scalability (cf. complexity)
  – Reactivity (cf. unpredictability)
  – Efficiency (cf. time and space constraints)
  – Extendability (cf. reconfigurability)
  – Taskability (cf. external direction)
  – Psychological validity (cf. human models)




Different Paradigms of Cognitive Systems

Emergent Approaches


COGNITION

• Cognitivist: Physical Symbol Systems; Newell's Unified Theories of Cognition; the computational paradigm
• Emergent (Self-Organization): Connectionist Systems, Dynamical Systems, Enactive Systems
• Hybrid


'Self-generated AI: AI by orchestrating the processes that generate it' (Luc Steels)


Emergent Approaches

• Cognition is the process whereby an autonomous system becomes viable and effective in its environment
• It does so through a process of self-organization
  – The system is continually re-constituting itself
  – In real-time
  – To maintain its operational identity
  – Through moderation of mutual system-environment interaction and co-determination

[Maturana & Varela 87]


Emergent Approaches

• Co-determination
  – The cognitive agent is specified by its environment
  – The cognitive process determines what is real or meaningful for the agent
  – The system constructs its reality (world) as a result of its operation in that world
  – Perception provides sensory data to enable effective action, but as a consequence of the system's actions
  – Cognition and perception are functionally dependent on the richness of the action interface [Granlund]


Emergent Approaches

• Cognition is the complement of perception [Sandini]
  – Perception deals with the immediate
  – Cognition deals with longer time frames
• The primate model of cognitive learning is anticipative skill construction (not knowledge acquisition)
• The root of intelligence is the ability to guide action and, at the same time, improve that ability
• Embodied as physical systems capable of physical interaction with the world


Emergent Approaches

'Cognitive systems need to acquire information about the external world through learning or association' [Granlund '02]




Emergent Approaches

• Self-organization
  – "Self-organizing systems are physical and biological systems in which pattern and structure at the global level arises solely from interactions among the lower-level components of the system."
  – "The rules specifying interactions among the system's components are executed only using local information, without reference to the global pattern."
• Emergence
  – A process by which a system of interacting elements acquires qualitatively new pattern and structure that cannot be understood simply as the superposition of the individual contributions

[Camazine 2006]


Emergent Approaches

• Self-organization: activator-inhibitor dynamics
  – Short-range activator
    • Autocatalysis: promotes its own production
    • Increases the production of an inhibitor (antagonist)
  – Inhibitor
    • Diffuses rapidly
  – Result:
    • A local increase in activation
    • Long-range antagonistic inhibition, which keeps the self-enhancing reaction localized

[Camazine 2006]
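These two ingredients are enough to generate pattern numerically. Below is a minimal one-dimensional sketch, loosely after Meinhardt's activator-inhibitor models; it is my illustration, not code from the lecture, and the parameter values are invented for a quick demo.

```python
import random

# 1-D activator-inhibitor sketch (loosely after Gierer-Meinhardt):
# the activator a is autocatalytic and diffuses slowly; the inhibitor h
# is driven by a and diffuses fast, keeping activation localized.
N, dt, steps = 60, 0.01, 20000
Da, Dh = 0.01, 0.4                 # inhibitor diffuses much faster
a = [1.0 + 0.01 * random.random() for _ in range(N)]
h = [1.0] * N

def lap(u, i):                     # periodic 1-D Laplacian
    return u[(i - 1) % N] - 2 * u[i] + u[(i + 1) % N]

for _ in range(steps):
    da = [a[i] ** 2 / h[i] - a[i] + Da * lap(a, i) for i in range(N)]
    dh = [a[i] ** 2 - h[i] + Dh * lap(h, i) for i in range(N)]
    a = [x + dt * d for x, d in zip(a, da)]
    h = [x + dt * d for x, d in zip(h, dh)]

# localized peaks (#) emerge from near-uniform initial conditions
print("".join("#" if x > 1.5 else "." for x in a))
```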


Emergent Approaches

[Figure: local autocatalysis combined with non-local negative feedback; Meinhardt 95]


Emergent Approaches

[Figure: Camazine 2006]


Connectionist Systems


Connectionist Systems

• Rely on
  – Parallel processing
  – Non-symbolic distributed activation patterns in networks
  – Not logical rules
• Neural networks are the most common instantiations
  – Dynamical systems that capture statistical regularities or associations
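As a minimal sketch, not from the slides, of capturing associations in distributed activation patterns rather than rules, here is a Hebbian pattern associator: correlations in training pairs are stored in a weight matrix. The patterns and sizes are invented for the example.

```python
# Minimal Hebbian pattern associator: the weights store statistical
# associations between input and output patterns. Illustrative only.

def train(pairs, n_in, n_out):
    w = [[0.0] * n_in for _ in range(n_out)]
    for x, y in pairs:                       # Hebb rule: dw_ij = y_i * x_j
        for i in range(n_out):
            for j in range(n_in):
                w[i][j] += y[i] * x[j]
    return w

def recall(w, x):
    return [1 if sum(wij * xj for wij, xj in zip(row, x)) > 0 else -1
            for row in w]

pairs = [([1, -1, 1, -1], [1, -1]),          # two bipolar associations
         ([-1, 1, -1, 1], [-1, 1])]
w = train(pairs, 4, 2)
print(recall(w, [1, -1, 1, 1]))              # noisy cue still recalls [1, -1]
```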


Dynamical Systems


COGNITION

• Cognitivist: Physical Symbol Systems; Newell's Unified Theories of Cognition; the computational paradigm
• Emergent (Self-Organization): Connectionist Systems, Dynamical Systems, Enactive Systems
• Hybrid


Dynamical Systems

• A dynamical system, in this context, is an open, dissipative, non-linear, non-equilibrium system
  – System: a large number of interacting components & a large number of degrees of freedom
  – Dissipative: diffuses energy; the volume of phase space decreases with time (-> preferential sub-spaces)
  – Non-equilibrium: unable to maintain structure or function without external sources of energy, material, information (hence, open)
  – Non-linearity: dissipation is not uniform; a small number of the system's degrees of freedom contribute to behaviour
    • Order parameters / collective variables

[Kelso '95: Dynamic Patterns - The Self-Organization of Brain and Behavior]


Dynamical System

dq/dt = N(q, p, n)

where dq/dt is the time derivative of the state vector q, p denotes the control parameters, and n is noise.

[From Schöner & Kelso 88]
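As a concrete instance of dq/dt = N(q, p, n), here is a sketch, my illustration rather than the lecture's, that Euler-integrates the HKB relative-phase equation dphi/dt = -a sin(phi) - 2b sin(2 phi) from Kelso's coordination dynamics, with the coupling ratio b/a as control parameter and a small noise term. The parameter values are invented.

```python
import math, random

# Euler integration of a 1-D dynamical system dq/dt = N(q, p, n):
# the HKB relative-phase equation from coordination dynamics,
#   dphi/dt = -a*sin(phi) - 2*b*sin(2*phi) + noise
def simulate(phi=2.5, a=1.0, b=1.0, noise=0.05, dt=0.01, steps=5000):
    for _ in range(steps):
        drift = -a * math.sin(phi) - 2 * b * math.sin(2 * phi)
        phi += dt * drift + noise * math.sqrt(dt) * random.gauss(0, 1)
    return phi

print(round(simulate(b=1.0), 2))   # strong coupling: settles near pi (anti-phase stable)
print(round(simulate(b=0.1), 2))   # weak coupling: settles near 0 (only in-phase survives)
```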


Enactive Systems


Enaction

• Orthodoxy (cognitivist)
  – The world as the system experiences it is independent of the cognitive system (the knower)
• Enactive view
  – Known and knower 'stand in relation to each other as mutual specification: they arise together'


Enaction

• Five key elements of enactive systems
  – Autonomy
  – Embodiment
  – Emergence
  – Experience
  – Sense-making


Enaction

• Autonomy
  – Self-maintenance
  – Homeostasis
  – Not controlled by outside agencies
  – Stands apart from its environment


Enaction

• Embodiment
  – Exists as a physical entity
  – Directly interacts with its environment
    • Structural coupling
    • Mutual perturbation
  – A constitutive part of the cognitive process


Enaction

• Emergence
  – Cognitive behaviour arises from the dynamic interplay between component parts
  – Internal dynamics
    • Maintain autonomy
    • Condition the system's experiences through their embodiment in a specific structure


Enaction

• Experience
  – A history of interaction with the world
  – Interactions don't control
  – Interactions do trigger changes in system state
  – Changes are structurally determined
    • Phylogeny
    • Structural coupling


Enaction

• Sense-making
  – Knowledge is generated by the system itself
  – It captures some regularity or lawfulness in the system's interactions
  – The 'sense' is dependent on the way interaction can take place
    • Perception & action
  – The system modifies its own state (CNS) to enhance
    • Predictive capacity
    • Action capabilities
  – Development: generative autonomous self-modification


Development

Progressive ontogenetic acquisition of anticipatory capabilities:

– Cognition cannot short-circuit ontogeny
– It is necessarily the product of a process of embodied development
– Initially the system deals with immediate events
– Increasingly it acquires a predictive capability

Cognition and perception are functionally dependent on the richness of the action interface.


Co-determination / Structural Coupling

Autonomy-preserving mutual interaction: perturbation of the system is effected only by the environment.

[Note: this ideogram and similar ones to follow were introduced in Maturana and Varela 1987]


Cognitive system: an operationally-closed system with a nervous system.

The nervous system facilitates a highly-plastic mapping between sensor and motor surfaces.

Perturbation by both environment and system (of receptors & nervous system).


Enaction

NERVOUS SYSTEM

(a) Facilitates a huge increase in the number of possible sensor-motor patterns (that result from structural coupling with the environment)

(b) Creates new dimensions (degrees of freedom) of structural coupling by facilitating the association of internal states with the interactions in which the system is engaged


[Figure: timelines illustrating anticipation / planning / explanation / prediction]


INTERACTION

A shared activity in which the actions of each agent influence the actions of the other agents in the same interaction, resulting in a mutually-constructed pattern of shared behaviour. [Ogden et al.]


Meaning emerges through shared consensual experience mediated by interaction.

Interaction is a shared activity in which the actions of each agent influence the actions of the other agents engaged in the same interaction, resulting in a mutually-constructed pattern of shared behaviour. [Ogden, Dautenhahn, Stribling 2002]

[Image: Bond of Union, M. C. Escher, 1956]


COGNITION & SENSE-MAKING

Cognition is a process whereby the issues that are important for the continued existence of the cognitive entity are brought forth: they are co-determined by the entity as it interacts with the environment in which it is embedded.


THE SPACE OF PERCEPTUAL POSSIBILITIES

Is predicated not on an objective environment, but on the space of possible actions that the system can engage in whilst still maintaining the consistency of its coupling with the environment.




Enaction

• Autopoiesis
  – Autopoiesis is a special type of self-organization
  – It requires self-specification and self-generation (autopoiesis literally means 'self-production')
  – An autopoietic system is a special type of homeostatic (i.e. self-regulating) system
    • Regulation applies not to some system parameter
    • But to the organization of the system itself


Enaction

• Autopoiesis
  – Its organization is constituted by relations between components that, by their interaction and transformation, continuously regenerate the network of processes that produce them
  – That is, the relations among components (a network of processes of production) produce the components themselves
  – A significant aspect of autopoiesis is that its function is to 'create and maintain the unity that distinguishes it from the medium in which it exists'


Enaction

• Autopoiesis
  – Self-production
  – The system emerges as a coherent systemic entity, distinct from its environment, as a consequence of processes of self-organization
  – Three orders of system …


Enaction

• Autopoiesis
  – First-order autopoietic systems correspond to cellular entities that achieve physical identity through structural coupling with the environment: environmental perturbations trigger structural changes that permit the system to continue operating


Enaction

• Autopoiesis
  – Second-order systems correspond to meta-cellular entities that exhibit autopoiesis through operational closure in their organization
    • They specify a network of dynamic processes that don't leave the network (organizationally closed)
    • They engage in structural coupling with the environment, this time with a nervous system (-> a much enlarged space of internal states & a much enlarged space of interaction)
  – These are cognitive systems (at the agent level)


Enaction

• Autopoiesis
  – Third-order systems exhibit coupling between second-order (cognitive) systems
  – Common ontogenetic drift between reciprocally-coupled systems
  – Shared epistemology / language
    • Reflects but does not abstract the common medium in which they are coupled
  – Three types of behaviour:
    • Instinctive behaviour (based on autopoietic self-organization)
    • Ontogenic behaviour (based on the system's development)
    • Communication / social behaviour


Enaction

• Autopoiesis
  – Linguistic behaviours: the intersection of ontogenic and communication behaviours
  – These facilitate the creation of a common understanding of a shared world (in which the entities are coupled)
  – Language is an emergent consequence of the third-order structural coupling of a socially-cohesive group of cognitive entities


Enaction

• Autopoiesis
  – Consciousness (mind) emerges when language and linguistic behaviour are applied recursively to the agent itself
  – Consciousness is an emergent property of social coupling
  – It doesn't have any independent existence (and you can't model it!)


Enaction

• Third-order systems: coupling between second-order (cognitive) systems
  – Recurrent (common) ontogenic drift from reciprocal coupling
  – Instinctive behaviour: based on second-order organization (phylogenic evolution)
  – Ontogenetic behaviour: development qua learning over the system's lifetime
  – Communicative behaviour: third-order structural coupling (social coupling)
  – Linguistic behaviours: the establishment of a shared epistemology


The Hierarchy of Enactive Systems

Multilevel autopoietic / cognitive systems, with the same approach (operational closure) at every level:

1. Viable organisms: cellular organization; physical instantiation; structural coupling with the environment
2. Emergence of cognition: intelligent (adaptive & anticipatory) behaviour; rationalization of interactions & perceptions
3. Societies of cognitive entities: emergence of language; explicit deliberative self-reflective reasoning


Enactive Cognition

MEANING emerges through shared consensual experience mediated by interaction.




Enactive Cognition

• Structural coupling
  – Necessarily embodied
  – Operating in synchronous real-time

Machine representations reflect the machine's epistemology.


Enactive Cognition

We need to achieve synthetic cognition without falling back on cognitivist preconceptions, that is, without basing the 'design' on representations derived by external observers (us).


Bickhard's Self-Maintenant & Recursive Self-Maintenant Systems


Emergent Models

• "The grounds for cognition are adaptive far-from-equilibrium autonomy – recursively self-maintenant autonomy – not symbol processing nor connectionist input processing."


Emergent Models

• Autonomy: the property of a system to contribute to its own persistence
• Different grades of contribution -> different levels of autonomy


Emergent Models

• Self-maintenant systems
  – Make active contributions to their own persistence
  – Do not contribute to the maintenance of the conditions for persistence


Emergent Models

• Recursive self-maintenant systems
  – Do contribute actively to the conditions for persistence
  – Can deploy different processes of self-maintenance depending on environmental conditions
  – "They shift their self-maintenant processes so as to maintain self-maintenance as the environment shifts"
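A toy sketch, entirely my own invention and not Bickhard's formalism, of the distinction: a merely self-maintenant process would run a single persistence-preserving loop, whereas a recursively self-maintenant one switches among maintenance strategies as the environment shifts.

```python
# Toy contrast: a *recursively* self-maintenant system selects the
# maintenance process appropriate to the current environment, thereby
# maintaining its own self-maintenance. Invented illustration.

def maintain_warm(energy):   return energy - 1     # cheap strategy for warmth
def maintain_cold(energy):   return energy - 3     # costly strategy for cold

class RecursiveSelfMaintenant:
    def __init__(self):
        self.energy = 20
    def step(self, environment):
        # shift the self-maintenant process as the environment shifts
        strategy = maintain_cold if environment == "cold" else maintain_warm
        self.energy = strategy(self.energy) + 2     # +2: resources taken in
        return self.energy > 0                      # persistence condition

agent = RecursiveSelfMaintenant()
for env in ["warm", "warm", "cold", "cold", "warm"]:
    alive = agent.step(env)
    print(env, agent.energy, "alive" if alive else "dissipated")
```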


Emergent Models

• Far-from-equilibrium stability
  – Non-thermodynamic equilibrium
  – Requires that the system does NOT go to thermodynamic equilibrium
  – Such systems are completely dependent for their continued existence on continued contributions from external factors
  – They do require environmental interaction
  – They are necessarily open processes (which exhibit closed organization)


Emergent Models

• Self-maintenant and recursive self-maintenant systems are both far-from-equilibrium systems
• Recursive self-maintenant systems do yield the emergence of representations
  – Function emerges in self-maintenant systems
  – Representation emerges in a particular type of function ("indications of potential interactions") in recursively self-maintenant systems


Emergent Models

Why development in humanoid form is important for cognition


Emergent Systems

Cognition is the process whereby an autonomous system becomes viable and effective in its environment.


Emergent Systems: Self-Organization

The system is continually re-constituting itself in real-time to maintain its operational identity, through moderation of mutual system-environment interaction.


Emergent Systems: Co-determination

The system and environment are co-determined through structural coupling and contingent self-organization.

Co-determination: the agent is specified by its environment, and the cognitive process determines what is real or meaningful for the agent. The agent constructs its reality as a result of its operation in that world.


Emergent Systems: Co-determination

• Maturana and Varela's idea of the structural coupling of level-one autopoietic systems [Maturana & Varela 87]
• Kelso's circular causality of action and perception [Kelso 95]
• The organizational principles inherent in Bickhard's self-maintenant systems [Bickhard 00]


Emergent Systems: Phylogeny

This implies a particular phylogeny:

1. To effect the homeostatic self-organization
2. To effect the structural coupling


Emergent Systems: Phylogeny

Recent studies in natural cognition (neuroscience and developmental psychology) guide the identification of this phylogeny.


Emergent Systems: Development

Development is the cognitive process of establishing (discovering) the possible space of mutually-consistent couplings.

• Bickhard's concept of recursive self-maintenance [Bickhard 00]
• Maturana and Varela's level-two and level-three autopoietic systems [Maturana & Varela 87]


Emergent Systems: Ontogeny

Development implies ontogenic development: the cognitive system develops its own epistemology, i.e. its own system-specific, history- and context-dependent knowledge of its world.


[Diagram recap: Emergent Systems encompass self-organization, co-determination, phylogeny, development, and ontogeny, informed by neuroscience and developmental psychology]


[Diagram recap: the same scheme, with neural nets and dynamical systems as realizations of self-organization]


Emergent Systems: Embodiment

Emergent systems also imply embodiment … but what type of embodiment? It depends on the needs of the structural coupling.


Emergent Systems: Embodiment

Structural coupling: the system-environment perturbations must be rich enough to drive the ontogenic development but not destructive of the self-organization.

There is no guarantee that the cognitive behaviour will be consistent with human preconceptions.


Emergent Systems: Humanoid Embodiment

Meaning emerges through a shared consensual experience, mediated by interaction (a mutually-constructed shared pattern of behaviour). Therefore, we require a humanoid embodiment to effect mutually-consistent meanings.


Otherwise, this is what we'll get …




Emergent Systems: Morphology

Morphology is a constitutive component of both co-determination and development. Consequently, a plastic morphology is important: the embodiment shouldn't be static.


Emergent Systems: Cognitive Development

Cognitive development: increasing complexity in the action space and an increasing degree of prospection.


Different Paradigms of Cognitive Systems

Hybrid Approaches


Hybrid Models

• A combination of cognitivist & emergent approaches
• Don't use explicit programmer-based knowledge
• Can use symbolic representations (& representational invariances), but these are populated by the system itself as it learns (see the sketch below)
• Still emphasize perception-action coupling
• Often with action-dependent perception
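A minimal sketch, invented for illustration, of the hybrid idea: a symbolic knowledge base whose facts are not hand-coded by a programmer but populated by a learned sub-symbolic component, here a trivial prototype-based classifier.

```python
# Hybrid sketch: a sub-symbolic learner populates a symbolic knowledge
# base, instead of a programmer injecting the facts directly.

def nearest_prototype(features, prototypes):
    """Sub-symbolic side: classify by distance to learned prototypes."""
    return min(prototypes, key=lambda label: sum(
        (a - b) ** 2 for a, b in zip(features, prototypes[label])))

# prototypes "learned" from experience (hard-coded here for brevity)
prototypes = {"cup": [0.9, 0.1], "ball": [0.1, 0.9]}   # [graspable, rollable]

knowledge_base = set()                     # symbolic side: relational facts
for obj_id, features in [("obj1", [0.8, 0.2]), ("obj2", [0.2, 0.8])]:
    label = nearest_prototype(features, prototypes)
    knowledge_base.add(("is-a", obj_id, label))        # system fills in symbols

print(knowledge_base)   # {('is-a', 'obj1', 'cup'), ('is-a', 'obj2', 'ball')}
```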






RECAP


Differences between Cognitivist & Emergent Paradigms

1. Computational operation
2. Representational framework
3. Semantic grounding
4. Temporal constraints
5. Inter-agent epistemology
6. Embodiment
7. Perception
8. Action
9. Anticipation
10. Adaptation
11. Motivation
12. Autonomy
13. Cognition
14. Philosophical foundation

[Vernon, von Hofsten, Fadiga 2010]




What is Cognition?

• Cognitivism: information processing; the rule-based manipulation of symbols
• Connectionism: the emergence of global states in a network of simple components
• Dynamical Systems: a history of activity that brings forth change and activity
• Enactive Systems: effective action; a history of structural coupling which enacts (brings forth) a world

Adapted from [Thelen & Smith 94] and [Varela 88]


How Does it Work?

• Cognitivism: through any device that can manipulate symbols
• Connectionism: through local rules and changes in the connectivity of the elements
• Dynamical Systems: through self-organizing processes of interconnected sensorimotor subnetworks
• Enactive Systems: through a network of interconnected elements capable of structural changes undergoing an uninterrupted history


What Does a Good Cognitive System Do?

• Cognitivism: represents the stable truths about the real world, and solves the problems posed to it
• Connectionism: develops emergent properties that yield stable solutions to tasks
• Dynamical Systems: becomes an active and adaptive part of an ongoing and continually changing world
• Enactive Systems: becomes part of an existing ongoing world of meaning (in ontogeny) or shapes a new one (in phylogeny)


[From Varela 92]


Which Paradigm is Correct?

• The paradigms are not equally mature
• Dynamical systems
  – The arguments are compelling BUT …
  – It is not yet clear how to get higher-level cognition
• Cognitivist systems
  – More advanced
  – Not many achievements in generalization
  – More brittle (in principle)
• Enactive (& dynamical) systems
  – SHOULD be much less brittle (mutual specification through co-development)
  – But limited cognition at present
• Hybrid systems
  – The best of both worlds?
  – But it is unclear how one can really combine antagonistic philosophies


Which Paradigm is Correct?

• No one has actually designed a complete cognitive system
• There is still a lot of disagreement on the right approach to take


Recommended Reading

Vernon, D. "Artificial Cognitive Systems – A Primer for Students", Chapter 2.

Vernon, D., Metta, G., and Sandini, G. "A Survey of Artificial Cognitive Systems: Implications for the Autonomous Development of Mental Capabilities in Computational Agents", IEEE Transactions on Evolutionary Computation, special issue on Autonomous Mental Development, Vol. 11, No. 2, pp. 151-180 (2007).

Vernon, D., von Hofsten, C., and Fadiga, L. A Roadmap for Cognitive Development in Humanoid Robots, Cognitive Systems Monographs (COSMOS), Springer, ISBN 978-3-642-16903-8 (2010), Chapter 5.
