transition process, according to one of the two standard neural learning schemes, in which the micro–time level is traversed in discrete steps, i.e., if t = t_0, t_1, ..., t_s then t + 1 = t_1, t_2, ..., t_{s+1}:

(1) A self–organized, unsupervised (e.g., Hebbian–like [Hebb (1949)]) learning rule:

    w_s(t + 1) = w_s(t) + σ η (w^d_s(t) − w^a_s(t)),          (6.135)

where σ = σ(t), η = η(t) denote signal and noise, respectively, while superscripts d and a denote desired and achieved micro–states, respectively; or

(2) A certain form of supervised gradient–descent learning:

    w_s(t + 1) = w_s(t) − η ∇J(t),                            (6.136)

where η is a small constant, called the step size or the learning rate, and ∇J(t) denotes the gradient of the 'performance hyper–surface' at the t-th iteration.

Both Hebbian and supervised learning are used for the local decision–making process (see below) occurring at the intention–formation phase F. In this way, the local micro–level of LSF_total represents an infinite–dimensional neural network. In the cognitive psychology framework, our adaptive path integral (6.133) can be interpreted as semantic integration (see [Bransford and Franks (1971); Ashcraft (1994)]).

Motion and Decision Making in LSF_paths

On the macro–level in the subspace LSF_paths we have the (loco)motion action principle

    δS[x] = 0,

with the Newtonian–like action S[x] given by

    S[x] = ∫_{t_ini}^{t_fin} dt [ ½ g_ij ẋ^i ẋ^j + ϕ_i(x^i) ],    (6.137)

where the overdot denotes the time derivative, so that ẋ^i represents the processing speed, or (loco)motion velocity vector. The first bracketed term in (6.137) represents the kinetic energy T,

    T = ½ g_ij ẋ^i ẋ^j,
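To spell out what the action principle δS[x] = 0 implies for the action (6.137), one can write down the Euler–Lagrange equations by a standard variational computation; the constant–metric simplification in the last line below is an added assumption for illustration, not something stated in the text.

    d/dt ( ∂L/∂ẋ^k ) − ∂L/∂x^k = 0,    with  L = ½ g_ij ẋ^i ẋ^j + ϕ_i(x^i),

which, since each potential term ϕ_i depends only on its own coordinate x^i, gives

    d/dt ( g_kj ẋ^j ) − ½ (∂_k g_ij) ẋ^i ẋ^j − ϕ_k'(x^k) = 0,

and, for a metric g_ij that does not depend on x,

    g_kj ẍ^j = ϕ_k'(x^k)        (no sum over k in the potential term).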
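Returning to the micro–level updates, the following minimal Python sketch iterates the two learning rules (6.135) and (6.136). The identification of the achieved micro–state w^a with the current weight vector, the quadratic performance surface J(w) = ½‖w − w^d‖², and the constant values of σ and η are illustrative assumptions for this sketch only; the text does not specify them.

import numpy as np

def hebbian_update(w, w_desired, w_achieved, sigma, eta):
    # Self-organized, Hebbian-like rule (6.135):
    # w_s(t+1) = w_s(t) + sigma * eta * (w^d_s(t) - w^a_s(t))
    return w + sigma * eta * (w_desired - w_achieved)

def supervised_update(w, grad_J, eta):
    # Supervised gradient-descent rule (6.136):
    # w_s(t+1) = w_s(t) - eta * grad J(t)
    return w - eta * grad_J

# Illustrative run (assumed setup, not from the text): drive the weights
# toward a desired micro-state w_d, treating the current weights as the
# achieved micro-state and J(w) = 0.5*||w - w_d||^2 as the performance surface.
rng = np.random.default_rng(0)
w_d = np.ones(5)                   # desired micro-state (hypothetical)
w = rng.normal(size=5)             # weights trained by the Hebbian-like rule
w2 = rng.normal(size=5)            # weights trained by gradient descent
for t in range(200):
    w = hebbian_update(w, w_d, w, sigma=1.0, eta=0.05)
    w2 = supervised_update(w2, grad_J=w2 - w_d, eta=0.05)

print(np.allclose(w, w_d, atol=1e-3), np.allclose(w2, w_d, atol=1e-3))

With the assumed quadratic J, both rules reduce to the same geometric relaxation toward w^d; for a different performance surface the supervised rule (6.136) would instead follow that surface's gradient.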
