Human Memory & Cognition

Apologies for commencing with the naively simple question 'what is memory?', of the kind asked by a small child or perhaps a tiresome teenager. In this case, however, asking such a question will not result in futility or a slammed door. The point here is to illustrate that the existence of memory has far more wide-reaching implications than the lay person might at first suspect. Without memory there would be no past. There would be no ability to employ previously learned skills, no recall of names and places, nor the ability to recognise a face.

There would be only a 'present', one that would not necessarily be our own. There would be no personal identity, no recollection of past days, minutes or even seconds, much like the day-to-day blur that is the heavyweight student's life, who no doubt feels a sense of 'nothingness' on a regular basis, not to mention the severe head pains. The point is that individuals would have no sense of identity were it not for memory. The notion of 'self' relies upon a 'continuity of memories that link our yesterdays to our todays', as Gleitman (1995) eloquently put it.

Thus, the intention of asking 'what is memory?' was to understand its importance and to comprehend the vast field that makes up 'memory', one which is open for psychologists to explore and exploit (notice the similarity between those last two words). The fact that very little was known about memory in general makes this an ideal arena for unknown psychologists to gain notoriety in the field, and money, of course. Many such candidates have taken advantage of this opportunity.

Subsequently their work has not only paved the way for would-be psychologists to understand the terms and depths of memory to a fuller extent, but has given lecturers the opportunity of requiring dissertations on memory research from up-and-coming psychologists currently taking their psychology degree courses. With the importance of memory firmly grasped, it is not surprising that a vast amount of work has been done investigating it.

Certain notable investigations, namely Craik & Lockhart (1972) for the purposes of this dissertation, have re-aligned psychological research in a fresher direction than it was heading in before. The 'Stage Theory of Memory' was the first plausible hypothesis of the mechanics of memory. The crude notions of Broadbent (1958), Waugh & Norman (1965) and Atkinson & Shiffrin (1968), to name but a few, argued that memory consists of several 'storage systems', since memory can delve into the past but at the same time recollect experiences that happened moments before.

They accounted for this diversity by way of hypothetical storage structures, like a warehouse for example. With hindsight it is possible to chuckle at these rudimentary appraisals of memory, but at the time they were given serious consideration and even paved the way for more enlightening research in the future. The well-known 'short-term' (referred to hereon as STM) and 'long-term' memory (LTM) labels were spawned from stage theory research.

STM is characterised as being very limited, and experiments by investigators such as George Miller (1956) quantified the number of items it can hold, famously seven, plus or minus two. LTM, however, is credited with a far larger span, some 80,000 words or so. The Stage Theory of Memory asserts that there is a definite path taken by items that are 'promoted' from STM to LTM.

By way of preposterously mechanical processes, it is argued that STM should be viewed as a 'loading platform' where 'parcels' (memories) await the fork-lift truck journey to the huge 'memory warehouse', one that even Tesco would be proud of. Those 'parcels' that 'sit' on the 'loading platform' for long enough will eventually be taken to the memory warehouse, but most do not make it. It is this notion that dominated memory research in the 1950s and 1960s. 'Decay' and 'displacement' were suggested as reasons why some memories are remembered and others not.

What is significant here is that all items were assumed to have been 'entered' into the memory structure, and it is the manual retrieval of them that fails, thus causing forgetfulness. Later research turned its back on this notion and suggested that successful retrieval depends upon the method of processing the items into easily accessible memories. Stage Theory research dominated the field for several decades but, as time went on, a re-evaluation was made, thankfully.

Notions gradually moved away from such automatic processes of upgrading STM status to LTM status by 'sitting on a platform for long enough', and began acknowledging the effect of 'rehearsal' on memory. Subsequently came the investigation of improving memories. Maintenance rehearsal, although it does relatively little in the long run, was an example of a changing attitude and a new focus. Psychologists were moving away from the idea of storage depots and were starting to concentrate on processes and methods for successful recollection.

Methods such as 'chunking' were proposed. Psychologists started to believe that long-term memories are formed by an active process whereby the 'recollector' has his/her (hereon taken in the masculine form for reasons of practicality rather than sexism) own methods of encoding events. Material to be retained in memory for subsequent retrieval does not depend upon a simple transfer from one memory storage container to another, which is fortunate for those persons with a bad back, but on how this material is processed.

It became acknowledged that the more elaborate the process, the greater the likelihood of retrieval. It is this new era of research that sparked off the most significant firework that is Craik and Lockhart's (1972) investigation of the levels of processing. Craik and Lockhart put forward the leading theories on memory at the time, evaluated them in light of their own findings and suggested new directions that memory research as a whole should take in the future.

Craik and Lockhart very gallantly put forward a detailed account of the theory of 'multistore models' (MMT), as proposed by Broadbent (1958), Waugh & Norman (1965) and Peterson (1966), to name but a few, before proceeding to dissect the theory piece by piece, e.g. 'we believe that the multistore formulation is unsatisfactory …'. Craik and Lockhart ended their paper with a '… suggest[ion] that [multistore models] … are often taken too literally and that more fruitful questions are generated by the present formulation', bringing closure to, in their view, the rather dated notion of MMT.

Craik and Lockhart's paper further rejected MMT by illustrating the complicated necessity of accommodating stores in addition to those already described. They argued with certainty that material is encoded at different times, using different methods, e.g. visual, phonemic, verbal etc. They explained that differently encoded materials persist for different lengths of time and that 'one way of coping with these kinds of inconsistencies is to postulate additional stores'.

Craik and Lockhart began their argument in favour of 'depth of processing' with the conclusive statement that 'Many theorists now agree …', suggesting that any psychologist who does not agree with their argument holds little psychological understanding of memory. Indeed, they may have a point. Although the part-quoted statement does not directly attach itself to their argument, it suggests that 'A' (the methods of perception) has very strong bonds with 'B' (memory encoding through deep processing); thus 'B' is a direct by-product of 'A', and as 'A' certainly exists, 'B' (their argument) therefore exists.

Craik and Lockhart's following statement makes this point perfectly: '… coding characteristics … arise essentially as by-products of perceptual processing (Morton, 1970)'. Craik and Lockhart go some way towards arguing that perceptual processing, although grouped into sensory analyses, elaboration processing, pattern recognition etc., should on the whole be regarded as a 'continuum of analysis'. They argue that memory holds durable memories as a result of semantic encoding, but at the same time retains material through 'recirculating information at one level of processing', i.e. keeping the items in consciousness.

They argue that memory is not divided into stores. They contemplate memory as one entity, but the processes that bind memories together are distinguished, some aspiring to LTM retrieval goals and others confined to keeping material in STM. Craik and Lockhart discussed reasonably briefly the then-current literature on 'incidental learning' and 'selective attention', which gave further credit to the notions promoted in their paper. They argued that incidental learning was accomplished through some degree of semantic encoding (see Hyde & Jenkins, 1969).

They also suggest that 'the effectiveness of retrieval cues depends upon its compatibility with the material's initial encoding and the extent to which the retrieval situation reinstates the learning context'. With regard to selective attention, work by Treisman (1964) argues that the processing necessitated by 'shadowing' trebled the durability of memories and that semantic material could be retrieved more easily. Craik and Lockhart discussed the serial position effects of itemised words, a major source of evidence distinguishing STM from LTM (see Broadbent, 1971; Kintsch, 1970).

They argued that items appearing early in the word lists are retrieved more successfully due to 'Type II Processing', i.e. deeper semantic processing. They argue this by suggesting that initial items must be perceived and rehearsed, thus subjecting them to Type II Processing. Craik and Lockhart go on to point out that the degree to which words are processed deeply depends upon the material and the task employed (see Palmer & Ornstein, 1971; Baddeley, 1968).

One of the concluding comments that Craik and Lockhart make is worthy of full quotation: 'Only deeper processing will lead to an improvement in memory'. This is a significant statement that encapsulates the main reason for a re-orientation in the direction that memory research was heading. Craik and Lockhart produced extensive evidence circulating around a central point: memory can be improved upon. They even give the means by which this improvement can be obtained, i.e. depth of processing.

It is arguably this stance that has led psychologists to investigate the criteria that bring about the best retrieval. It is clear that subsequent research adopts the stance suggested in the Craik and Lockhart paper of 1972. Following that paper, which discussed in depth (no pun intended) the significance of 'depth of processing' methods, psychologists came to realise that phenomenal people such as Shereshevskii (S.) and the Latvian, V.P., were undoubtedly exceptional, but not for the reasons one might suppose.

Such figures were extraordinary because of their abilities to encode materials to such degrees that they could not forget anything at all. Note the subtle difference: they were recognisably extraordinary because of their ability to encode, not because of their actual ability to remember. Their exceptional memory was a by-product of their exceptional depth-of-processing abilities. With this subtle understanding, it seems logical to presuppose that all human beings have the potential to become the S.s and the V.P.s of this world.

With the knowledge that this new depth-of-processing research was generating, 'common' people were able to show such remarkable capabilities (see Ericsson, Chase & Faloon (1980) for a remarkable study of the 'laboratory-produced mnemonist' S.F.). Further work by Chase and Ericsson (1982) produced the Skilled Memory Theory, a method of enhancing memory, which was arguably inspired by the work of Craik and Lockhart (1972), who identified that memory can be refined if manipulated correctly.

Craik and Tulving's (1975) work is a further example of investigations that have attempted to understand the properties of memory, i.e. the effects of 'deep' and 'shallow' processing on later recall. Investigations emphasising the importance of semantics for successful retrieval were given the most recognition (as was the work by Craik and Lockhart, 1972) and thus formed the majority of memory research undertaken at that time. Bransford & Johnson (1972) provided further proof that 'meaningful' encoding helps later recall.

Work investigating the ancient method of 'mnemonics' became fashionable, arguably as a result of Craik and Lockhart's paper, as it deals with improving memory in general. Wollen, Weber & Lowry (1972) investigated visual aids, for example. A change of focus manifested as explorations started to branch out to encompass 'retrieval cues' and their effects on improving memory. The relationship between original encoding and subsequent retrieval was investigated by Godden and Baddeley (1975), who were responsible for the elegant study involving underwater encoding environments and dry-land retrieval.

What was memorised underwater was best retrieved underwater. This is an important finding, especially for students, who need to understand that sitting in a crowded pub with seven empty beer glasses in front of them is not an acceptable arena for study, unless that is where the examination will be taking place, which is unlikely to say the least. However, all is not lost for the dedicated 'pub-student': work by Smith (1979) partly nullifies the claims of Godden and Baddeley.

Smith argues that changing the retrieval cues, e.g. the environmental conditions under which a participant processes materials, is not as significant as the way in which the participant thinks about those conditions at the time of recall. As drinking takes up a large portion of their minds, there should be little problem. Further strategies for improving memory were investigated. Craik and Tulving (1975) produced a study that promotes the use of 'elaborative rehearsal' as a successful means of retrieving memories.

Processes such as organising, chunking and relating material to already-stored memories all combine to create elaborative rehearsal. Gleitman compares elaborative rehearsal to the road system created by the Romans centuries ago: all roads led to Rome, enabling travellers to find their way to the capital city easily. The same holds true for elaborative rehearsal, in that 'the more paths exist, the easier retrieval will be' (Craik and Tulving, 1975).

The core issue highlighted by Craik and Tulving is the importance of comprehension in memorising materials, as it provides a much denser network of pathways than any other mnemonic scheme. Search strategies to improve memory were also investigated, notably by Williams and Hollan (1982), who encouraged participants to think 'out loud' as they tried to recollect names from their pasts; another attempt at finding the most foolproof method of enhancing memory span. Research has come a long way from its rudimentary beginnings.

The most significant shift came following Craik and Lockhart's (1972) investigation, which highlighted that memory is not a static concept; it can be improved and polished if the method is correct. Significance must be attributed to Craik and Lockhart's paper in that it encouraged psychologists to investigate the qualities that enhance retrieval. They submitted a proposal for investigation: depth of processing.

They called this proposal their 'framework within which … [processes] can be understood'. Subsequently, psychologists attempted to understand exactly what defined a successful method for retrieval, and out of this was born the rich and diverse field of memory research that we are fortunate enough to have today. Without it, children would still, for example, be working in front of the TV, and students in pubs, concentrating more on The Tweenies and the effervescence in their beer glasses respectively than on the job in hand.
