Learning and Memory

Human learning and memory are often conceived as having three stages: encoding, storage, and retrieval (Melton, 1963). Encoding refers to the acquisition and initial processing of information; storage refers to the maintenance of the encoded information over time; and retrieval refers to the processes by which the stored information is accessed and used. Historically, the scientific study of human memory can be seen as progressing through four phases, the first three of which correspond to an emphasis on encoding, storage, and retrieval, respectively. The fourth phase, which reflects the current state of the field, emphasizes the dynamic interaction among the stages.

The earliest influential concept of memory, derived from Aristotle, was that of an association, a connection between two ideas, thoughts, or events. Associations are formed when two items occur close together in time or space, when two items are very similar, or when two items are very different. The presence of one item, the cue, then brings the other to mind. One point of disagreement was whether associations could be formed only between adjacent items (direct associations) or whether an association could be formed between items that were further removed (a remote association). For example, for three events that occur in a series, not only could there be a direct association between the first and second and between the second and third, but there could also be a remote association between the first and third. Although many early theorists devoted much thought and speculation to the nature of memory, their enthusiasm and insight were hampered by a lack of appropriate tools, methods, and procedures.

Convincing evidence on this issue was not available until 1885, when Hermann Ebbinghaus published the first scientific study on memory. Ebbinghaus was the first scientist to design, conduct, and report an experiment to distinguish between two competing theories of memory. In a series of studies, he reported evidence not only for direct and remote associations, but also for backward associations. Ebbinghaus is also noted for being among the first to use statistical procedures to analyze his data.

Until the 1950s, the verbal learning tradition dominated research on memory in the United States. Following the lead of Ebbinghaus (1885), the emphasis was on the encoding stage, especially on how associations were formed and acquired. One particularly influential line of research, for example, focused on measuring how well learning transferred to new situations (Osgood, 1953). Of course, some emphasis was placed on retrieval; in particular, the dominant theory of forgetting, interference theory, included both unlearning and response competition as factors. Nonetheless, even within this framework, principles of acquisition, such as the differential effectiveness of massed versus distributed practice, dominated (Underwood, 1961).

Beginning in the 1950s, the so-called cognitive revolution ushered in a change of emphasis to storage. The dominant metaphor was the computer, with various buffers, registers, and other forms of storage that were linked to different hypothetical memory structures. The most common view of memory, called the modal model (after a statistical measure, the mode), had three such hypothetical memory structures: a sensory register, a short-term store, and a long-term store (Atkinson & Shiffrin, 1968). Information was first briefly registered in a sensory buffer before being converted from its raw physical form into a more durable (usually verbal) code and deposited into the short-term store. The short-term store had a limited capacity, around five to nine items, or “chunks,” and was intended mainly as a buffer where information could be temporarily stored. Rehearsal was the process whereby an item was either maintained there or copied into the long-term store.
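
The architecture can be caricatured in a few lines of code. The sketch below is only a toy rendering of the modal model's flow of information; the capacity value, the displacement rule, and the assumption that rehearsal copies the oldest resident item into long-term store are illustrative choices, not claims from Atkinson and Shiffrin (1968).

```python
# A toy caricature of the modal model's flow of information.
# The items, capacity, and rehearsal rule are illustrative assumptions.
from collections import deque

STS_CAPACITY = 7  # "around five to nine items, or chunks"

incoming = ["cat", "dog", "sun", "map", "pen", "cup", "box", "key", "jar"]
short_term_store = deque(maxlen=STS_CAPACITY)  # a full store displaces its oldest item
long_term_store = set()

for item in incoming:
    # Encoding: the raw sensory trace is recoded into a durable (verbal) form.
    short_term_store.append(item)
    # Rehearsal: here, simply copy the oldest resident item into long-term store.
    long_term_store.add(short_term_store[0])

print("Short-term store:", list(short_term_store))  # only the last 7 items survive
print("Long-term store: ", sorted(long_term_store))
```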

Despite the considerable success of this type of model (Glanzer, 1972), empirical and logical problems quickly became apparent (Neath, 1998). The most significant was the difficulty of separating the contributions of short- and long-term store in a given situation. Two reactions followed. One was the development of the idea of working memory (Baddeley, 1986), an update of the short-term memory concept. Working memory viewed memory in a broader context, including an attentional and a visuospatial system, and was fundamentally a place where cognitive work was performed. The other reaction was an increasing emphasis on processes rather than structures; this emphasis on processing also followed an intensive study of the retrieval stage in the late 1960s and early 1970s. Indeed, working memory can be seen as a hybrid model, containing both a structure (the phonological store) and a process (the articulatory loop).

According to the levels of processing framework (Craik & Lockhart, 1972), memory is the result of a successive series of analyses, each one at a deeper, more conceptual level than the previous, that are performed on the information. The deeper the level of analysis, the better the memory. Memory is therefore more of a by-product than anything else; it is the residue of the processing that was performed. This view offered an explanation for why intent to learn is not an important factor in subsequent tests of memory (Postman, 1964): if a person tries to memorize something but uses an inappropriate process, performance is poor. Indeed, most of the information that people do remember is not learned intentionally; rather, it is the residue of their processing of the original experience.

Levels of processing focused almost exclusively on encoding and said relatively little about retrieval. The second major processing view of the 1970s, transfer-appropriate processing, was developed as a way of rectifying this omission (Morris, Bransford, & Franks, 1977). The primary difference between levels of processing and transfer-appropriate processing is that the latter explicitly includes retrieval as a factor. According to this view, a particular encoding process leads to better performance not because it is necessarily deeper but rather because it is appropriate given the kind of processing that the test requires.

Currently, memory research is in the fourth phase, where the interaction between encoding and retrieval is emphasized. A good example is Tulving’s (1983) encoding specificity principle. According to this principle, the recollection of an event, or a certain aspect of it, depends on the interaction between the properties of the encoded event and the properties of the encoded cues available at retrieval. Note that there is an explicit acknowledgment of two possible distortions: the representation of the original information may or may not be veridical, and the representation of the retrieval cues may or may not be veridical. Memory is the interaction of these two potentially distorted representations.

It follows from this type of interactive view that memory is inherently cue driven: Information cannot be recollected or otherwise used in processing unless an appropriate cue is present. A second implication is that slight changes in the cue constellation can easily disrupt memory performance. Even a good cue can lose its effectiveness if it becomes associated with too many different items, a phenomenon known as cue overload (Watkins, 1979). A third implication is that memory is a dynamic process, including the potential for multiple ongoing distortions of the event, both from processing that occurs at study and from processing that occurs at test. These implications are inherent in most current theories.

It has already been noted that intention to learn is not necessarily an important factor in subsequent memory performance. One topic that has been of considerable interest in recent years concerns situations where both acquisition and retrieval of information are performed without conscious awareness. This area is typically referred to as implicit memory, although the terminology is quite confusing. The clearest terminology separates the type of learning situation (intentional or incidental) from the type of test (direct or indirect). Traditional memory research focused on intentional learning (“Try to remember the following list of items”) and direct tests (“Recall the list of items you just studied”). Implicit memory research uses incidental learning (“Rate these items for pleasantness”) and indirect tests (“Complete these word fragments with the first word that comes to mind”). (Of course, all combinations are possible.) The interesting finding is that the information processed at study facilitates performance on a variety of tests even though the subject is unaware of this influence.

One ongoing controversy concerns how best to view memory: as a set of multiple memory systems (Schacter & Tulving, 1994) or as a set of processes (Crowder, 1993). The multiple memory systems view attributes memory performance to the underlying memory system. Although there is some disagreement about the number of memory systems, the most popular conception lists five. The procedural memory system is responsible for performance on tasks that involve motor skills (typing, bicycle riding), simple conditioning, and simple associative learning. The perceptual representation system is responsible for identifying and processing visual forms and for recognizing speech. Primary memory (also known as working memory) is responsible for storing information that is to be held briefly, such as a telephone number for the time between looking it up and dialing. Semantic memory processes general knowledge, and episodic memory is concerned with autobiographical information and events that have been personally experienced.

The major advantage of this view is that it is able to explain a large number of dissociations. A dissociation occurs when one variable, such as the delay between study and test, affects one memory task differently than it affects a second. Thus a typical explicit memory task shows worse performance after a long delay, whereas a typical implicit memory task shows almost no detrimental effect of delay. According to the multiple systems view, implicit memory is supported by the perceptual representation system, whereas explicit memory depends on episodic memory. Because two different systems are used, two different results are seen. Similar explanations are offered to account for amnesia and the effects of normal aging: different systems can be affected and selectively impair some types of memory performance while leaving other memory abilities unimpaired.

The two major weaknesses of this approach are the lack of consensus on the number and type of systems and the lack of predictive dissociations. Although most multiple-system theorists subscribe to the divisions presented above, many offer additional systems (such as sensory memory systems reminiscent of the modal model), whereas others prefer fewer systems (such as combining episodic and semantic memory).

The inability of this view to formulate predictive dissociations is more problematic. For example, there is a phenomenon known as the fan effect: the time to respond to a given sentence increases as the number of facts known about the components in the sentence increases. The fact that this is true only for episodic tasks and not for semantic tasks is taken as evidence supporting the distinction between these two systems. However, the exact opposite finding (a fan effect seen only in semantic tasks and not in episodic tasks) would also be taken as support for a distinction between the two systems. The problem is that the multiple-systems view cannot yet predict a priori the nature of the dissociations. Many researchers are currently working on addressing this problem.

The other major theoretical orientation is the processing (or proceduralist) view. Sometimes known as the monolithic view (because of its unwillingness to fractionate memory into multiple systems), this view arose out of and is related to the levels of processing and transfer-appropriate processing views. The main idea is that memory resides in the same neural units that originally processed the experience. When an event is initially experienced, it is processed by certain neural assemblies. Memory is what happens when the same or similar neural units are stimulated by a cue (either an external, bottom-up cue or an internal, top-down one) and similar processing results. As Craik (1994, p. 156) put it, “Encoding is simply the set of processes involved in the perception and interpretation of the original event … and retrieval is the attempted recapitulation of the original pattern of encoding activity.”

Supporting research comes from many areas, including current research into the effects of normal aging on memory. Whenever a task requires a process that is initiated by an internally produced cue, regardless of whether the task is episodic or semantic, performance will be less successful in the elderly than when the process can use an externally provided cue. Thus the type of processing is more predictive of memory performance than the presumed underlying memory system.

One criticism of the processing approach has been that it is vague about exactly how many processes are involved. The process dissociation framework (Jacoby, 1991) is one attempt to separate the contributions of different processes. The basic logic is to examine at least two situations. One test, an inclusion test, is designed so that all processes can contribute beneficially to recall; a second test, the exclusion test, is designed so that one process works against producing the response. In essence, the effect of one process can be subtracted out and its contribution assessed.
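
The subtraction can be made concrete with a small worked example. Under Jacoby's (1991) standard independence assumption, recollection (R) and automatic influence (A) combine as shown in the comments below; the observed proportions used here are hypothetical numbers chosen only for illustration.

```python
# Worked example of the process dissociation logic (Jacoby, 1991), under the
# standard assumption that recollection (R) and automatic influence (A)
# operate independently. The observed proportions below are hypothetical.

p_inclusion = 0.65  # P(produce studied item) when both processes can help
p_exclusion = 0.20  # P(produce studied item) when recollection opposes it

# Inclusion: succeed by recollecting, or automatically when recollection fails:
#     p_inclusion = R + A * (1 - R)
# Exclusion: the studied item slips through only when recollection fails:
#     p_exclusion = A * (1 - R)
R = p_inclusion - p_exclusion  # estimate of recollection
A = p_exclusion / (1 - R)      # estimate of automatic influence

print(f"Recollection estimate R = {R:.2f}")         # 0.45
print(f"Automatic influence estimate A = {A:.2f}")  # 0.36
```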

Several other areas of research highlight the view that memory is cue driven, dynamic, and reconstructive. The reality monitoring (or source monitoring) paradigm examines the ability of people to remember the source of an event. Subjects may be asked to imagine an episode, or they may actually experience the episode. During the test phase, the question of interest is whether the subjects can determine the source. The data show that people are more likely to say that an imagined event was real than that a real event was imagined. The study of eyewitness memory reinforces these findings. Unless objective evidence is available, there is no way of assessing the accuracy of an eyewitness's recollection: it may be very accurate, very inaccurate, or somewhere in between. Among the factors that do not predict subsequent accuracy are the duration of the event; the emotional intensity of the event; the unusualness of the event; the number of details that can be recalled; the confidence expressed about the memory; and the delay between the event and the subsequent questioning.

Current interest focuses on what is unfortunately called false memory, recalling information that was not presented (Roediger & McDermott, 1995). The term is unfortunate because it implies a dichotomy between “true” and “false” memories; if these really were the only options, then all memories would have to be labeled false. The far more interesting and important question is the degree to which the current recollection departs from the original episode. With subsequent tests, the recollection may become more or less accurate, but it always contains some distortion introduced by the rememberer and thus some false elements.

Current formal models of memory also reflect the fourth phase, the emphasis on both encoding and retrieval. Indeed, a large number of models are termed global memory models because they address memory performance in a wide variety of paradigms (Raaijmakers & Shiffrin, 1992). The four most influential are ACT* (pronounced “act star”), SAM (search of associative memory), TODAM (theory of distributed associative memory), and MINERVA2 (after the Roman goddess of wisdom). Connectionist models of memory have not fared well and have had less impact on the field.
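
To give a flavor of how such global models work, the sketch below implements a MINERVA2-style retrieval cycle: every study episode is stored as a separate feature vector, a probe contacts all traces in parallel, and a single echo summarizes their similarity-weighted response. The vector size, number of traces, and random contents are illustrative choices, not fitted parameters from any published simulation.

```python
# Minimal sketch of a MINERVA2-style global matching computation.
# Feature values, sizes, and the random traces are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_features = 50, 20
memory = rng.choice([-1, 0, 1], size=(n_traces, n_features))  # one row per episode

probe = memory[0].copy()  # a retrieval cue resembling one stored trace

# Similarity of the probe to every trace at once, normalized by the number
# of features that are nonzero in either the probe or the trace.
relevant = ((memory != 0) | (probe != 0)).sum(axis=1)
similarity = (memory * probe).sum(axis=1) / relevant

activation = similarity ** 3        # cubing lets highly similar traces dominate
echo_intensity = activation.sum()   # overall familiarity signal (recognition)
echo_content = activation @ memory  # blended content returned by memory (recall)

print(f"Echo intensity for a studied probe: {echo_intensity:.2f}")
```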

Memory, then, is a dynamic, fundamentally reconstructive set of processes that enable previously encoded information to affect current and future performance. The effects of memory need not be consciously available to the rememberer, and each successive recollection may further distort or reconstruct the memory.

References:

  1. Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A proposed system and its control processes. In K. W. Spence & J. T. Spence (Eds.), The psychology of learning and motivation (Vol. 2, pp. 89-195). New York: Academic Press.
  2. Baddeley, A. D. (1986). Working memory. New York: Oxford University Press.
  3. Craik, F. I. M. (1994). Memory changes in normal aging. Current Directions in Psychological Science, 3, 155-158.
  4. Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671-684.
  5. Crowder, R. G. (1993). Systems and principles in memory theory: Another critique of pure memory. In A. F. Collins, S. E. Gathercole, M. A. Conway, & P. E. Morris (Eds.), Theories of memory (pp. 139-161). Hove, UK: Erlbaum.
  6. Ebbinghaus, H. (1885). Über das Gedächtnis. Leipzig: Duncker & Humblot. (Available in English as Memory: A contribution to experimental psychology, H. A. Ruger, Trans., 1964. New York: Dover.)
  7. Glanzer, M. (1972). Storage mechanisms in recall. In G. H. Bower & J. T. Spence (Eds.), The psychology of learning and motivation (Vol. 5, pp. 129-193). New York: Academic Press.
  8. Jacoby, L. L. (1991). A process dissociation framework: Separating automatic from intentional uses of memory. Journal of Memory and Language, 30, 513-541.
  9. Melton, A. W. (1963). Implications of short-term memory for a general theory of memory. Journal of Verbal Learning and Verbal Behavior, 2, 1-21.
  10. Morris, C. D., Bransford, J. D., & Franks, J. J. (1977). Levels of processing versus transfer appropriate processing. Journal of Verbal Learning and Verbal Behavior, 16, 519-533.
  11. Neath, I. (1998). Human memory: An introduction to research, theory, and data. Pacific Grove, CA: Brooks/Cole.
  12. Osgood, C. E. (1953). Method and theory in experimental psychology. New York: Oxford University Press.
  13. Postman, L. (1964). Short-term memory and incidental learning. In A. W. Melton (Ed.), Categories of human learning (pp. 146-201). New York: Academic Press.
  14. Raaijmakers, J. G. W., & Shiffrin, R. M. (1992). Models for recall and recognition. Annual Review of Psychology, 43, 205-234.
  15. Roediger, H. L., III, & McDermott, K. B. (1995). Creating false memories: Remembering words not presented in lists. Journal of Experimental Psychology: Learning, Memory, and Cognition, 21, 803-814.
  16. Schacter, D. L., & Tulving, E. (1994). What are the memory systems of 1994? In D. L. Schacter & E. Tulving (Eds.), Memory systems 1994 (pp. 1-38). Cambridge: MIT Press.
  17. Tulving, E. (1983). Elements of episodic memory. New York: Oxford University Press.
  18. Underwood, B. J. (1961). Ten years of massed practice on distributed practice. Psychological Review, 68, 229-247.
  19. Watkins, M. J. (1979). Engrams as cuegrams and forgetting as cue overload: A cueing approach to the structure of memory. In C. R. Puff (Ed.), Memory organization and structure (pp. 347-372). New York: Academic Press.