Knowledge

Cognitive model

Source đź“ť

The output signal is the utterance produced by the child. The unseen psychological events that occur between the arrival of an input signal and the production of speech are the focus of psycholinguistic models. Events that process the input signal are referred to as input processes, whereas events that process the production of speech are referred to as output processes. Some aspects of speech processing are thought to happen online; that is, they occur during the actual perception or production of speech and thus require a share of the attentional resources dedicated to the speech task. Other processes, thought to happen offline, take place as part of the child's background mental processing rather than during the time dedicated to the speech task. In this sense, online processing is sometimes defined as occurring in real time, whereas offline processing is said to be time-free (Hewlett, 1990). In box-and-arrow psycholinguistic models, each hypothesized level of representation or processing can be represented in a diagram by a "box," and the relationships between them by "arrows," hence the name. Sometimes (as in the models of Smith, 1973, and Menn, 1978, described later in this paper) the arrows represent processes additional to those shown in boxes. Such models make explicit the hypothesized information-processing activities carried out in a particular cognitive function (such as language), in a manner analogous to computer flowcharts that depict the processes and decisions carried out by a computer program. Box-and-arrow models differ widely in the number of unseen psychological processes they describe and thus in the number of boxes they contain.
Some have only one or two boxes between the input and output signals (e.g., Menn, 1978; Smith, 1973), whereas others have multiple boxes representing complex relationships between a number of different information-processing events (e.g., Hewlett, 1990; Hewlett, Gibbon, & Cohen-McKenzie, 1998; Stackhouse & Wells, 1997). The most important box, however, and the source of much ongoing debate, is that representing the underlying representation (or UR). In essence, an underlying representation captures information stored in a child's mind about a word he or she knows and uses. As the following description of several models will illustrate, the nature of this information and thus the type(s) of representation present in the child's knowledge base have captured the attention of researchers for some time. (Elise Baker et al., Psycholinguistic Models of Speech Development and Their Application to Clinical Practice. Journal of Speech, Language, and Hearing Research, 44, June 2001, pp. 685-702.)
Rather than coupling the environment's and the agent's dynamical systems to each other, an "open dynamical system" defines a "total system", an "agent system", and a mechanism to relate these two systems. The total system is a dynamical system that models an agent in an environment, whereas the agent system is a dynamical system that models an agent's intrinsic dynamics (i.e., the agent's dynamics in the absence of an environment). Importantly, the relation mechanism does not couple the two systems together, but rather continuously modifies the total system into the decoupled agent's total system. By distinguishing between total and agent systems, it is possible to investigate an agent's behavior when it is isolated from the environment and when it is embedded within an environment. This formalization can be seen as a generalization of the classical formalization, whereby the agent system can be viewed as the agent system in an open dynamical system, and the agent coupled to the environment can be viewed as the total system in an open dynamical system.
At the first level of perception and action, an agent and an environment can be conceptualized as a pair of dynamical systems coupled together by the forces the agent applies to the environment and by the structured information provided by the environment. Thus, behavioral dynamics emerge from the agent-environment interaction. At the second level of time evolution, behavior can be expressed as a dynamical system represented as a vector field. In this vector field, attractors reflect stable behavioral solutions, whereas bifurcations reflect changes in behavior. In contrast to previous work on central pattern generators, this framework suggests that stable behavioral patterns are an emergent, self-organizing property of the agent-environment system rather than being determined by the structure of either the agent or the environment.
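The vector-field idea can be sketched in a few lines of code. This is a minimal illustration with made-up parameters, not a model from the behavioral dynamics literature: a one-dimensional vector field dx/dt = -k(x - a) has a single point attractor at x = a, and numerical integration shows trajectories from very different starting states settling on the same stable behavioral solution.

```python
# Hypothetical one-dimensional behavioral dynamic: dx/dt = -k * (x - attractor).
# The vector field has a single stable point attractor at x = attractor.

def settle(x0, attractor=0.5, k=2.0, dt=0.01, steps=1000):
    """Euler-integrate dx/dt = -k * (x - attractor), starting from x0."""
    x = x0
    for _ in range(steps):
        x += dt * (-k * (x - attractor))
    return x

# Trajectories launched from very different initial states converge on the
# same attractor, the signature of a stable behavioral solution.
left = settle(-3.0)
right = settle(4.0)
```

A bifurcation, by contrast, would correspond to this attractor appearing, disappearing, or splitting as a parameter of the vector field is varied.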
In contrast to cognitive architectures, cognitive models tend to be focused on a single cognitive phenomenon or process (e.g., list learning), on how two or more processes interact (e.g., visual search and decision making), or on making behavioral predictions for a specific task or tool (e.g., how instituting a new software package will affect productivity). Cognitive architectures tend to be focused on the structural properties of the modeled system, and help constrain the development of cognitive models within the architecture. Likewise, model development helps to inform the limitations and shortcomings of the architecture. Some of the most popular architectures for cognitive modeling include ACT-R, Clarion, LIDA, and Soar.

In the context of dynamical systems and embodied cognition, representations can be conceptualized as indicators or mediators. In the indicator view, internal states carry information about the existence of an object in the environment, where the state of a system during exposure to an object is the representation of that object. In the mediator view, internal states carry information about the environment which is used by the system in obtaining its goals. In this more complex account, the states of the system carry information that mediates between the information the agent takes in from the environment and the force exerted on the environment by the agent's behavior. The application of open dynamical systems has been discussed for four types of classical embodied cognition examples.

On this interpretation: (1) the total system captures embodiment; (2) one or more agent systems capture the intrinsic dynamics of individual agents; (3) the complete behavior of an agent can be understood as a change to the agent's intrinsic dynamics in relation to its situation in the environment; and (4) the paths of an open dynamical system can be interpreted as representational processes.
These embodied cognition examples show the importance of studying the emergent dynamics of agent-environment systems, as well as the intrinsic dynamics of agent systems. Rather than being at odds with traditional cognitive science approaches, dynamical systems are a natural extension of these methods and should be studied in parallel rather than in competition.

The first function transforms the representation of the agent's actions into specific patterns of muscle activation that, in turn, produce forces in the environment. The second function transforms the information from the environment (i.e., patterns of stimulation at the agent's receptors that reflect the environment's current state) into a representation that is useful for controlling the agent's actions. Other similar dynamical systems have been proposed (although not developed into a formal framework) in which the agent's nervous system, the agent's body, and the environment are coupled together.
After observing the toy being hidden in location A and repeatedly searching for it there, the 2-year-olds were shown a toy hidden in a new location B. When they looked for the toy, they searched in locations that were biased toward location A. This suggests that there is an ongoing representation of the toy's location that changes over time. The child's past behavior influences its model of the locations in the sandbox, and so an account of behavior and learning must take into account how the system of the sandbox and the child's past actions changes over time.
Instances where a functionally equivalent external artifact replaces functions that are normally performed internally by the agent, which is a special case of offloading. One famous example is that of human navigation in a complex environment (specifically, by the agents Otto and Inga) with or without assistance of an artifact.
Grammar is made up of attractors and repellers that constrain movement in the state space. This means that representations are sensitive to context, with mental representations viewed as trajectories through mental space rather than as objects that are constructed and remain static. Elman networks were trained with simple sentences to represent grammar as a dynamical system.
Instances where the environment and agent must work together to achieve a goal, referred to as "intimacy". A classic example of intimacy is the behavior of simple agents working to achieve a goal (e.g., insects traversing the environment). The successful completion of the goal relies fully on the coupling of the agent to the environment.
The A-not-B error is proposed to be not a distinct error occurring at a specific age (8 to 10 months), but a feature of a dynamic learning process that is also present in older children. Children 2 years old were found to make an error similar to the A-not-B error when searching for toys hidden in a sandbox.
Such systems often have no simple, intuitive analytical solution. Rather than deriving a mathematical analytical solution to the problem, experimentation with the model is done by changing the parameters of the system in the computer and studying the differences in the outcome of the experiments.
A number of key terms are used to describe the processes involved in the perception, storage, and production of speech. Typically, they are used by speech pathologists while treating a child patient. The input signal is the speech signal heard by the child, usually assumed to come from an adult speaker.
Behavioral dynamics have been applied to locomotive behavior. Modeling locomotion with behavioral dynamics demonstrates that adaptive behaviors can arise from the interactions of an agent and its environment. According to this framework, adaptive behaviors can be captured by two levels of analysis.
Another feature is that the states are quasi-stable, meaning that they will eventually transition to other states. A simple pattern generator circuit like this is proposed to be a building block for a dynamical system: sets of neurons that simultaneously transition from one quasi-stable state to another are defined as a dynamic module.
Hybrid computers are computers that exhibit features of both analog and digital computers. The digital component normally serves as the controller and provides logical operations, while the analog component normally serves as a solver of differential equations. See hybrid intelligent systems for more details.
By letting the network learn on its own, structure and computational properties naturally arise. Unlike in previous models, "memories" can be formed and recalled by inputting a small portion of the entire memory. Time ordering of memories can also be encoded. The behavior of the system is modeled with vectors.
These dynamic modules can, in theory, be combined to create larger circuits that comprise a complete dynamical system. However, the details of how this combination could occur are not fully worked out.
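The notion of quasi-stable states can be illustrated with a single continuous-time neuron with a strong self-connection. This is a hedged sketch with made-up parameters (the weight w and bias theta are chosen only to produce bistability), not the specific pattern-generator circuit discussed in the literature cited here:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def run_neuron(y0, external=0.0, w=8.0, theta=-4.0, dt=0.01, steps=5000):
    """Euler-integrate dy/dt = -y + w * sigmoid(y + theta) + external.
    With these (illustrative) parameters the unit is bistable: it has a low
    and a high quasi-stable state separated by an unstable fixed point."""
    y = y0
    for _ in range(steps):
        y += dt * (-y + w * sigmoid(y + theta) + external)
    return y

low = run_neuron(0.0)     # settles near the low quasi-stable state
high = run_neuron(6.0)    # settles near the high quasi-stable state
# A transient external input pushes the unit out of the low state; when the
# input is removed, it remains in the high state rather than returning.
flipped = run_neuron(run_neuron(0.0, external=5.0, steps=2000))
```

In this toy picture, a "dynamic module" would be a set of such units whose transitions between quasi-stable states are coordinated.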
A cognitive model is subsymbolic if it is made of constituent entities that are not representations in their turn, e.g., pixels, sound images as perceived by the ear, or signal samples; subsymbolic units in neural networks can be considered particular cases of this category.
People are able to create more words when playing Scrabble if they have the tiles in front of them and are allowed to physically manipulate their arrangement. In this example, the Scrabble tiles allow the agent to offload working memory demands onto the tiles themselves.
These vectors can take on different values, representing different states of the system. This early model was a major step toward a dynamical systems view of human cognition, though many details remained to be added and more phenomena accounted for.
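As a rough illustration of how such a network can store a "memory" and recall it from a small, corrupted portion, here is a minimal Hopfield-style sketch. The pattern and sizes are made up, and units are written as +1/-1 rather than on/off; this is the standard textbook construction, not the specific model described above:

```python
def hebbian_weights(pattern):
    """Hebbian weight matrix storing one +1/-1 pattern (zero diagonal)."""
    n = len(pattern)
    return [[pattern[i] * pattern[j] if i != j else 0.0 for j in range(n)]
            for i in range(n)]

def recall(weights, cue, sweeps=5):
    """Asynchronously update units until the state settles in an attractor."""
    state = list(cue)
    n = len(state)
    for _ in range(sweeps):
        for i in range(n):
            net = sum(weights[i][j] * state[j] for j in range(n))
            state[i] = 1 if net >= 0 else -1
    return state

memory = [1, -1, 1, 1, -1, -1, 1, -1]   # the stored "memory"
weights = hebbian_weights(memory)
cue = list(memory)
cue[0], cue[3] = -cue[0], -cue[3]       # corrupt two of the eight units
recovered = recall(weights, cue)        # the dynamics restore the pattern
```

The stored pattern acts as an attractor: states near it flow back to it, which is what "recall from a small portion of the memory" means in dynamical terms.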
Instances where the use of external artifacts improves the performance of tasks relative to performance without these artifacts. The process is referred to as "offloading". A classic example of offloading is the behavior of Scrabble players.
On this dynamical view, parametric inputs alter the system's intrinsic dynamics rather than specifying an internal state that describes some external state of affairs.
Spencer, J. P., Smith, L. B., & Thelen, E. (2001). Tests of a dynamical systems account of the A-not-B error: The influence of prior experience on the spatial memory abilities of two-year-olds. Child Development, 72(5), 1327-1346.
The state set, or state space, represents the totality of overall states the system could be in. The system is distinguished by the fact that a change in any aspect of the system state depends on other aspects of the same or other system states.

Cognitive models can range from box-and-arrow diagrams to sets of equations to software programs that interact with the same tools that humans use to complete tasks (e.g., computer mouse and keyboard).
Once a basic grammar had been learned, the networks could then parse complex sentences by predicting which words would appear next according to the dynamical model.
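A minimal forward pass of an Elman-style simple recurrent network makes the "trajectory" picture concrete. This is not Elman's trained model; the weights below are untrained, made-up values, and the point is only that the hidden state after a given word depends on the whole preceding sequence:

```python
import math

def srn_step(hidden, x, w_in, w_rec):
    """One Elman-style step: the new hidden state depends on the current
    input and, through the recurrent weights, on the prior hidden state."""
    return [math.tanh(sum(wi * xi for wi, xi in zip(w_in[i], x)) +
                      sum(wr * h for wr, h in zip(w_rec[i], hidden)))
            for i in range(len(hidden))]

w_in = [[0.5, -0.3], [0.8, 0.1], [-0.4, 0.9]]                 # made-up weights
w_rec = [[0.2, -0.5, 0.3], [0.1, 0.4, -0.2], [-0.3, 0.2, 0.5]]
A, B = [1.0, 0.0], [0.0, 1.0]                                  # two one-hot "words"

def run(sequence):
    hidden = [0.0, 0.0, 0.0]
    for word in sequence:
        hidden = srn_step(hidden, word, w_in, w_rec)
    return hidden

# The same final word lands in different hidden states after different
# histories: the representation is a context-dependent trajectory through
# state space, not a fixed symbol.
h_after_aba = run([A, B, A])
h_after_bba = run([B, B, A])
```

In the trained case, a readout layer over this hidden state is what predicts the next word.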
An alternative approach is to define a system with (1) a state of the system at any given time, (2) a behavior, defined as the change over time in overall state, and (3) a state set, or state space.
This CPG contains three motor neurons that control the foot, backward-swing, and forward-swing effectors of the leg. Outputs of the network represent whether the foot is up or down and how much force is being applied to generate torque in the leg joint.
Instances where there is not a single agent; instead, the individual agent is part of a larger system that contains multiple agents and multiple artifacts. One famous example, formulated by Ed Hutchins in his book Cognition in the Wild, is that of navigating a naval ship.
Elman proposed that language and cognition should be treated as a dynamical system rather than a digital symbol processor. Neural networks of the type Elman implemented have come to be known as Elman networks.
Chiel, H. J., Beer, R. D., & Gallagher, J. C. (1999). Evolution and analysis of model CPGs for walking. Journal of Computational Neuroscience, 7, 99-118.
Modern formalizations of dynamical systems applied to the study of cognition vary. One such formalization, referred to as "behavioral dynamics", treats the agent and the environment as a pair of coupled dynamical systems based on classical dynamical systems theory.
A computational model is a mathematical model in computational science that requires extensive computational resources to study the behavior of a complex system by computer simulation. The system under study is often a complex nonlinear system.
Theories of the model's operation can be derived or deduced from these computational experiments. Examples of common computational models are weather forecasting models, earth simulator models, flight simulator models, molecular protein folding models, and neural network models.
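The style of experimentation described here, varying parameters in the computer and comparing outcomes, can be sketched with a deliberately simple model. The logistic map below is a textbook toy system, not one of the models named above; sweeping its parameter reveals a qualitative change in behavior by simulation rather than by analytical solution:

```python
def simulate(r, x0=0.2, steps=300, keep=8):
    """Iterate the logistic map x -> r * x * (1 - x); return the last
    few states, rounded so settled behavior is easy to compare."""
    x = x0
    tail = []
    for t in range(steps):
        x = r * x * (1 - x)
        if t >= steps - keep:
            tail.append(round(x, 6))
    return tail

# A computational "experiment": change the parameter r and compare outcomes.
steady = simulate(2.8)        # settles onto a single fixed point
oscillating = simulate(3.2)   # settles into a two-value oscillation
```

The same workflow, run the simulation across a parameter range and study how the outcome changes, scales up to weather or protein-folding models where no closed-form answer exists.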
By focusing on the output of the neural networks rather than their states, and by examining fully interconnected networks, a three-neuron central pattern generator can be constructed.
It is the internal and external forces that shape a specific trajectory as it unfolds over time, rather than the physical nature of the underlying mechanisms, that carry explanatory force.
Haugeland, J. (1996). Mind embodied and embedded. In J. Haugeland (Ed.), Having thought: Essays in the metaphysics of mind (pp. 207-237). Cambridge, Massachusetts: Harvard University Press.
Hotton, S., & Yoshimi, J. (2010). The dynamics of embodied cognition. International Journal of Bifurcation and Chaos, 20(4), 943-972.
This article is about models of cognition's operation. For a model of external reality as modeled in the mind, see mental model.
Sun, R. (ed.), (2008). The Cambridge Handbook of Computational Psychology. New York: Cambridge University Press.
In this formalization, the information from the environment informs the agent's behavior, and the agent's actions modify the environment.
A cognitive model is a representation of one or more cognitive processes in humans or other animals for the purposes of comprehension and prediction. There are many types of cognitive models.
A symbolic model is expressed in characters, usually non-numeric ones, that require translation before they can be used.
Beer, R. D. (2000). Dynamical approaches to cognitive science. Trends in Cognitive Sciences, 4(3), 91-99.
Sensory information is transformed into symbolic inputs, which produce symbolic outputs that get transformed into motor outputs.
A typical dynamical model is formalized by several differential equations that describe how the system's state changes over time. In doing so, it makes explicit the form of the space of possible trajectories.
Instead of treating language as a collection of static lexical items and grammar rules that are learned and then used according to fixed rules, the dynamical systems view defines the lexicon as regions of state space within a dynamical system.
Hopfield, J. J. (1984). Neurons with graded response have collective computational properties like those of two-state neurons. PNAS, 81, 3088-3092.
In the specific case of perception-action cycles, the coupling of the environment and the agent is formalized by two functions.
A central pattern generator (CPG) can be used to represent systems such as leg movements during walking.
A classic developmental error has been investigated in the context of dynamical systems: the A-not-B error.
Early work in the application of dynamical systems to cognition can be found in the model of Hopfield networks. These networks were proposed as a model for associative memory; they represent the neural level of memory, modeling systems of around 30 neurons that can each be in either an on or off state.
299: 249: 170: 94: 8: 307: 276: 245: 179: 166: 735: 500: 455: 280: 875:
Fajen, B. R., Warren, W. H., Temizer, S., & Kaelbling, L. P. (2003). A dynamical model of visually-guided steering, obstacle avoidance, and route selection. International Journal of Computer Vision, 54, 15-34.
Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. PNAS, 79, 2554-2558.
Breaking down the processes into discrete time steps may not fully capture this behavior.
98: 864:. Journal of Experimental Psychology: Human Perception and Performance, 29, 343-362. 416:
One proposed mechanism of a dynamical system comes from analysis of continuous-time recurrent neural networks (CTRNNs).
887: 342: 325: 303: 291: 187: 174: 106: 955: 826: 794: 762:
Elman, J. L. (1991). Distributed representations, simple recurrent networks, and grammatical structure. Machine Learning, 7, 195-225.
709: 689: 463: 253: 183: 432:
One feature of this pattern is that neuron outputs are either off or on most of the time.
529: 518: 359: 337: 265: 195: 928:
Maglio, P., Matlock, T., Raphaely, D., Chernicky, B., & Kirsh, D. (1999). Interactive skill in Scrabble. In M. Hahn & S. C. Stoness (Eds.), Proceedings of the twenty-first annual conference of the Cognitive Science Society (pp. 326-330). Mahwah, NJ: Lawrence Erlbaum Associates.
891: 1000: 578: 433: 404: 367: 261: 102: 425: 390:
The lexicon is viewed as regions of state space within a dynamical system.
What is missing from this traditional view is that human cognition happens continuously and in real time.
269: 31: 788:
Thelen, E., Schöner, G., Scheier, C., & Smith, L. B. (2001). The dynamics of embodiment: A field theory of infant perseverative reaching. Behavioral and Brain Sciences, 24, 1-86.
632: 873:
829:. Psychological Review, 113(2), 359-389. doi: 10.1037/0033-295X.113.2.358 607: 991: 862:
Fajen, B. R., & Warren, W. H. (2003). Behavioral dynamics of steering, obstacle avoidance, and route selection. Journal of Experimental Psychology: Human Perception and Performance, 29, 343-362.
Beer, R. D. (2003). The dynamics of active categorical perception in an evolved model agent. Adaptive Behavior, 11(4), 209-243.
van Gelder, T., & Port, R. F. (1995). It's about time: An overview of the dynamical approach to cognition. In R. F. Port & T. van Gelder (Eds.), Mind as motion: Explorations in the dynamics of cognition (pp. 1-43). Cambridge, Massachusetts: MIT Press.
513: 371: 851:. Adaptive Behavior, 11(4), 209-243. doi: 10.1177/1059712303114001 387: 383: 379: 541: 429: 333: 57:
In terms of information processing, cognitive modeling is the modeling of human perception, reasoning, memory, and action.
986: 904:
Hotton, S., & Yoshimi, J. (2011). Extending dynamical systems theory to model embodied cognition. Cognitive Science, 35, 444-479.
363: 70: 46: 971: 540:
The interpretations of these examples rely on the following logic.
The entire system operates in an ongoing cycle.
Cognition takes place by transforming static symbol structures in discrete, sequential steps.
Cognitive modeling at the University of Memphis (LIDA)
Cognitive models can be developed within or without a cognitive architecture, though the two are not always easily distinguishable.
Computational-representational understanding of mind
Cognitive modeling historically developed within cognitive psychology and cognitive science (including human factors), and has received contributions from the fields of machine learning and artificial intelligence, among others.
