Lakoff and Johnson argued that humans use metaphors whenever possible to better explain their external world. Humans also have a basic stock of concepts from which other concepts can be derived. These basic concepts include spatial orientations such as up, down, front, and back. Humans understand what these concepts mean because they can directly experience them through their own bodies. For example, because human movement revolves around standing erect and moving the body up and down, humans innately possess the concepts of up and down. Lakoff and Johnson contend the same holds for other spatial orientations, such as front and back. As mentioned earlier, this basic stock of spatial concepts is the basis on which other concepts are constructed. Happiness and sadness, for instance, are conceptualized as up and down respectively: when people say they are feeling down, they are really saying that they feel sad. The point is that true understanding of these concepts is contingent on an understanding of the human body. The argument goes that if one lacked a human body, one could not possibly know what up or down mean, or how they could relate to emotional states.
Inspired by the work of the American psychologist James J. Gibson, this next example emphasizes the importance of action-relevant sensory information, bodily movement, and local environment cues. These three concepts are unified by the concept of affordances: possibilities of action that the physical world provides to a given agent. Affordances are in turn determined by the agent's physical body and capacities, as well as the action-related properties of the local environment. Clark uses the example of an outfielder in baseball to illustrate the concept. Traditional computational models would hold that an outfielder's catch of a fly ball can be computed from variables such as the outfielder's running speed and the arc of the baseball. However, Gibson's work shows that a simpler method is possible: the outfielder can catch the ball so long as they adjust their running speed so that the ball appears to move in a straight line in their field of vision. Note that the outfielder's success with this strategy is contingent upon various affordances, including their physical body composition, the environment of the baseball field, and the sensory information they obtain.
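The outfielder's "keep the ball on a straight visual path" strategy has been formalized in the ecological-psychology literature as optical acceleration cancellation. The toy simulation below is a sketch of that idea, not Clark's or Gibson's own model; the function name and all numbers are illustrative. The fielder never predicts the landing point: it only moves so that the ball's optical tangent (apparent height divided by horizontal separation) keeps rising at the constant rate calibrated from the first glimpse, yet it arrives where the ball lands.

```python
def simulate_catch(vx=18.0, vz=22.0, g=9.81, fielder_start=95.0, dt=0.01):
    """Toy fielder using only a perceptual rule: keep tan(elevation)
    rising at the constant rate c observed at the first instant."""
    t = dt
    xb = vx * t                                  # ball, horizontal
    zb = vz * t - 0.5 * g * t * t                # ball, vertical
    c = zb / ((fielder_start - xb) * t)          # calibrated optical rate
    t_land = 2 * vz / g                          # used only to end the sim
    xf = fielder_start
    while t < t_land:
        t += dt
        xb = vx * t
        zb = max(vz * t - 0.5 * g * t * t, 0.0)
        xf = xb + zb / (c * t)                   # enforce tan(elevation) = c*t
    return xf, vx * t_land                       # fielder vs. landing point
```

Because the rule forces the separation to shrink as the ball descends, the fielder's final position coincides with the landing point without any trajectory computation.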
In pure vision, vision is used to create a rich world model so that thought and reason can fully explore it; in other words, pure vision passively reconstructs the external perceivable world so that the faculties of reason can be better used introspectively. Animate vision, by contrast, treats vision as the means by which real-time action can commence: a vehicle through which visual information is obtained so that actions can be undertaken. Clark points to animate vision as an example of embodiment, because it uses both biological and local environmental cues to create an active, intelligent process. Consider Clark's example of going to the drugstore to buy some Kodak film. In one's mind, one is familiar with the Kodak logo and its trademark gold color, and one uses incoming visual stimuli to navigate around the drugstore until one finds the film. Vision, therefore, should be seen not as a passive system but as an active retrieval device that intelligently uses sensory information and local environmental cues to perform specific real-world actions.
Echoing functionalism (philosophy of mind), this point claims that mental states are individuated by their role in a much larger system. Under this premise, the information on a PDA is similar to the information stored in the brain. So if one holds that information in the brain constitutes mental states, it must follow that information in the PDA is a cognitive state too. Consider also the role of pen and paper in a complex multiplication problem: the pen and paper are so involved in the cognitive process of solving the problem that it seems absurd to call them separate from the process, in much the same way the PDA stores information as the brain does. Another example examines how humans control and manipulate their environment so that cognitive tasks can be performed better: leaving one's car keys in a familiar place so they are not forgotten, for instance, or using landmarks to navigate an unfamiliar city. Thus, humans incorporate aspects of their environment to aid their cognitive functioning.
explained under the embodied approach, given the sheer complexity of the action. Depth perception requires that the brain detect the disparity between the retinal images formed by the two eyes, which are set a distance apart. Body and head cues complicate this further: when the head turns in a given direction, objects in the foreground appear to move against objects in the background. From this, it is said that some kind of visual processing occurs without any symbol manipulation, because the objects appearing to move in the foreground are simply appearing to move. The observation suggests that depth can be perceived with no intermediate symbol manipulation necessary.
of the ball is linear, as it follows a sequence of perceiving, calculating, and acting. The affordance approach thus challenges the traditional view of perception by arguing that computation and introspection are unnecessary; instead, perception should be understood as a continuous equilibrium of action adjustment between the agent and the world. Ultimately, Clark does not expressly claim this is certain, but he observes that the affordance approach can explain adaptive response satisfactorily, because it utilizes environmental cues derived from perceptual information that the agent actively uses in real time.
the processing unit finds semantic meaning, and an appropriate output is produced. For example, a human's sensory organs are its input devices, and the stimuli obtained from the external environment are fed into the nervous system, which serves as the processing unit. Because the sensory information follows a syntactic structure, the nervous system can read it and create an output, which in turn produces bodily motions and brings forth behavior and cognition. Of particular note is that cognition is sealed away in the brain: mental cognition is cut off from the external world and is possible only through the input of sensory information.
a given medium. The brain's auditory system takes these factors into account as it processes information, but again without any need for a symbolic manipulation system. This is because the distance between the ears, for example, needs no symbols to represent it: the distance itself creates the necessary opportunity for greater auditory acuity. The density of matter between the ears works similarly, in that the actual amount itself affords the frequency alteration. Given the physical properties of the body, then, a symbolic system is both unnecessary and an unhelpful metaphor.
This is more a theory than a principle, but its implications are widespread. It claims that an agent's internal processing cannot be made more complex unless there is a corresponding increase in the complexity of the agent's motors, limbs, and sensors. In other words, extra complexity added to the brain of a simple robot will not create any discernible change in its behavior; the robot's morphology must already contain enough complexity in itself to allow "breathing room" for more internal processing to develop.
The homunculus argument concluded that semantic meaning could not be derived from symbols without some kind of inner interpretation. If some little man in a person's head interpreted incoming symbols, then who would interpret the little man's inputs? Because of the specter of an infinite regress, the traditionalist model began to seem less plausible. Embodied cognitive science aims to avoid this problem by defining cognition in three ways.
there are also the mechanical matters of how the foot must be constructed so that it can hop. An embodied approach makes it easier to see that for this robot to function, it must exploit its full system; that is, the robot's body should be seen as having dynamic characteristics, as opposed to the traditional view of it as a mere command center that simply executes actions.
Ideally, internal mechanisms in an agent should give rise to things like memory and decision-making in an emergent fashion, rather than being prescriptively programmed from the beginning. Such capacities are allowed to emerge as the agent interacts with its environment. The motto is: build fewer assumptions into the agent's controller now, so that learning can be more robust and idiosyncratic in the future.
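A toy illustration of building few assumptions into the controller (a hypothetical sketch, not any published architecture): the one-dimensional "robot" below stores no plan and no world model; its stopping-short-of-the-obstacle behavior emerges from a loop of purely local sensing and acting.

```python
def step(pos, sensed_gap):
    """One reactive tick: 'advance' proposes a move; 'avoid' suppresses
    it when the locally sensed gap to an obstacle gets too small. No
    plan or world model is stored anywhere."""
    advance = 1
    if sensed_gap <= 1:    # purely local, current-moment sensing
        advance = 0        # 'avoid' overrides 'advance'
    return pos + advance

obstacle = 5
pos = 0
for _ in range(20):        # behavior emerges from the interaction loop
    pos = step(pos, sensed_gap=obstacle - pos)
# the robot halts one unit short of the obstacle without ever planning
```

The controller encodes only two micro-rules; the global behavior (approach, then stop) is a product of agent-environment interaction rather than explicit programming.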
In terms of design, this implies that redundancy should be introduced with respect not only to one sensory modality but to several. It has been suggested that the fusion and transfer of knowledge between modalities can be the basis for reducing the size of the sense data taken from the real world. This again addresses the scalability problem.
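The error-correcting effect of redundant modalities can be seen in a small numerical sketch (illustrative only; the "visual" and "auditory" channels are stand-ins): averaging two independently noisy estimates of the same quantity roughly halves the variance of the result.

```python
import random
import statistics

random.seed(0)
TRUTH = 5.0  # the real-world quantity both modalities estimate

def fuse(a, b):
    """Average two noisy channel estimates; with independent noise the
    fused estimate's variance is roughly half a single channel's."""
    return (a + b) / 2

single, fused = [], []
for _ in range(5000):
    visual = TRUTH + random.gauss(0, 1.0)    # one modality's estimate
    auditory = TRUTH + random.gauss(0, 1.0)  # a redundant second modality
    single.append(visual)
    fused.append(fuse(visual, auditory))

var_single = statistics.pvariance(single)
var_fused = statistics.pvariance(fused)
```

This is the statistical core of "duplicating like channels": each extra redundant channel shrinks the error without any extra internal processing.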
Pfeifer realized that implicit assumptions made by engineers often substantially influence a control architecture's complexity. This insight is reflected in discussions of the scalability problem in robotics: the internal processing needed by some poorly designed architectures can grow out of proportion to new tasks required of an agent.
Embodied cognitive science is an interdisciplinary field of research whose aim is to explain the mechanisms underlying intelligent behavior. It comprises three main methodologies: the modeling of psychological and biological systems in a holistic manner that considers the mind and body as a single entity; the formation
The bluefin tuna (genus Thunnus) long baffled conventional biologists with its ability to accelerate quickly and attain great speeds. A biological examination of the tuna suggests that it should not be capable of such feats; an answer emerges, however, once the tuna's embodied state is taken into account.
A more striking example comes from auditory perception. Generally speaking, the greater the distance between the ears, the greater the possible auditory acuity. Also relevant is the density of matter between the ears, for the strength of a frequency wave alters as it passes through
It can also be maintained that it is best to provide the machine with the best sense organs that money can buy, and then teach it to understand and speak
English. That process could follow the normal teaching of a child. Things would be pointed out and named, etc. Again, I do not know what the right
Clark points out that the latter strategy for catching the ball has significant implications for perception. The affordance approach is non-linear because it relies upon spontaneous real-time adjustments. By contrast, the traditional method of computing the arc
While this does not mean that such beings would be incapable of expressing emotions, it does mean that they would express emotions differently from humans. Human concepts of happiness and sadness would be different, because such beings would have different bodies. So then an organism's body
A traditionalist may argue that objects can aid cognitive processes without being part of a cognitive system. Eyeglasses aid the visual process, but to say they are part of a larger visual system would completely redefine what is meant by a visual system.
The tuna exploits its local environment by finding naturally occurring currents that increase its speed. It also uses its own physical body to this end, utilizing its tailfin to create the vortices and pressure needed to accelerate and maintain high
Traditional cognitive theory is based mainly on symbol manipulation, in which certain inputs are fed into a processing unit that produces an output. These inputs follow certain rules of syntax, from which
constructed by
Raibert and Hodgins to demonstrate further the value of the embodiment paradigm. These robots were essentially vertical cylinders with a single hopping foot. Managing such a robot's behavior can be daunting because, in addition to the intricacies of the program itself,
The first aspect of embodied cognition examines the role of the physical body, particularly how its properties affect the ability to think. This aspect attempts to overcome the symbol manipulation component that characterizes the traditionalist model. Depth perception, for instance, can be better
In forming general principles of intelligent behavior, Pfeifer intended to run contrary to older principles given in traditional artificial intelligence. The most dramatic difference is that his principles apply only to situated robotic agents in the real world, a domain where
The proposed solutions are to have the agent exploit the inherent physics of its environment, to exploit the constraints of its niche, and to base the agent's morphology on parsimony and the principle of redundancy. Redundancy reflects the desire for the error correction of signals afforded by
One of the primary reasons for scalability problems is that the amount of programming and knowledge engineering that the robot designers have to perform grows very rapidly with the complexity of the robot's tasks. There is mounting evidence that pre-programming cannot be the solution to the
It is increasingly clear that, in a wide variety of cases, the individual brain should not be the sole locus of cognitive scientific interest. Cognition is not a phenomenon that can be successfully studied while marginalizing the roles of body, world and action.
A third component of the embodied approach looks at how agents use their immediate environment in cognitive processing. That is, the local environment is seen as an actual extension of the body's cognitive process. The example of a personal digital assistant (PDA) is used to better imagine this.
However, supporters of the embodied approach could make the case that if objects in the environment play the functional role of mental states, then the items themselves should be counted among the mental states.
Lars Ludwig explores mind extension further, outlining its role in technology. He proposes a cognitive theory of 'extended artificial memory', which represents a theoretical update and extension of the memory theories of Richard Semon.
Imagine a spherical being living outside of any gravitational field, with no knowledge or imagination of any other kind of experience. What could UP possibly mean to such a being?
This design principle differs most importantly from the Sense-Think-Act cycle of traditional AI. Since it does not involve that famous cycle, it is not affected by the frame problem.
Embodied cognitive science differs from the traditionalist approach in that it denies the input-output system. This is chiefly due to the problems presented by the homunculus argument.
of a common set of general principles of intelligent behavior; and the experimental use of robotic agents in controlled environments.
speeds. Thus, the bluefin tuna is actively using its local environment for its own ends through the attributes of its physical body.
The following examples, used by Clark, illustrate how embodied thinking is becoming increasingly apparent in scientific thinking.
duplicating like channels. Additionally, it reflects the desire to exploit the associations between sensory modalities (see redundant modalities).
scalability problem ... The problem is that programmers introduce too many hidden assumptions in the robot's code.
He makes the claim that the brain alone should not be the single focus for the scientific study of cognition:
The value of the embodiment approach in the context of cognitive science is perhaps best explained by Andy Clark.
directly affects how it can think, because it uses metaphors related to its body as the basis of concepts.
Embodied cognitive science is an alternative theory of cognition that minimizes appeals to the computational theory of mind, in favor of a greater emphasis on how an organism's body determines how and what it thinks.
Pure vision is an idea that is typically associated with classical artificial intelligence.
From the perspective of autonomous agent design, early work is sometimes attributed to Rodney Brooks or Valentino Braitenberg.
In 1950, Alan Turing proposed that a machine may need a human-like body to think and speak:
Value principle: This was the architecture developed in the Darwin III robot of Gerald Edelman. It relies heavily on connectionism.
traditional artificial intelligence showed the least promise.
An alternative to hierarchical methods of knowledge and action selection.
answer is, but I think both approaches should be tried.
Traditionalist response to local environment claim
From the perspective of artificial intelligence, Understanding Intelligence by Rolf Pfeifer and Christian Scheier, or How the Body Shapes the Way We Think by Rolf Pfeifer and Josh C. Bongard.
Principle of parallel, loosely-coupled processes
Embodied cognitive science borrows heavily from embodied philosophy and the related research fields of cognitive science, psychology, neuroscience and artificial intelligence.
From the perspective of language acquisition, Eric Lenneberg and Philip Rubin at Haskins Laboratories.
General principles of intelligent behavior
Examples of the value of the embodied approach
Clark distinguishes between two kinds of vision: animate and pure.
Principle of cheap design and redundancy
The body's role in the cognitive process
Principle of sensory-motor coordination
Clark uses the example of the hopping robot
From the perspective of anthropology, Edwin Hutchins, Bradd Shore, James Wertsch and Merlin Donald.
From the perspective of neuroscience, Gerald Edelman of the Neurosciences Institute at La Jolla, Francisco Varela of CNRS in France, and J. A. Scott Kelso of Florida Atlantic University.
For approaches to cognitive science that emphasize the embodied mind, see Embodied cognition.
The second aspect draws heavily from George Lakoff's and Mark Johnson's work on concepts.
From the perspective of linguistics, Gilles Fauconnier, George Lakoff, Mark Johnson, Leonard Talmy and Mark Turner.
Contributors to the field include:
From the perspective of philosophy, Andy Clark, Dan Zahavi, Shaun Gallagher, and Evan Thompson.
From the perspective of psychology, Lawrence Barsalou, Michael Turvey, Vittorio Guidano and Eleanor Rosch.
Interaction of local environment
Principle of ecological balance
Physical attributes of the body
The embodied cognitive approach
Traditional cognitive theory
Critical responses