Summary: Researchers developed a computer model that mimics how the hippocampus stores new episodic memories without erasing old ones. This model demonstrates that the CA3 region of the hippocampus serves as an anchor point for memories, allowing efficient storage in surrounding regions.
The findings offer insight into how the brain organizes personal experiences and keeps existing memories stable despite constant updates. The model may help deepen our understanding of memory retention and cognitive processing.
Key Facts:
- Episodic Memory Function: Episodic memory allows individuals to store unique personal experiences in a temporal and spatial context, forming a foundation for personal identity.
- Hippocampus Role: The study redefines the function of the hippocampal CA3 region, suggesting it acts as an anchor for memories rather than storing them directly.
- Memory Stability: The model maintains memory stability by organizing new experiences without disrupting existing ones, akin to adding books to a well-organized library.
Source: RUB
The brain is constantly storing new experiences that it has to integrate into the jumble of existing memories. Surprisingly, it does not overwrite previous memory traces in the process.
The first day of school: entering the classroom for the first time, the excited feeling in your stomach and the joy of having a school bag—these are all typical examples of memories from our episodic memory. It stores unique personal episodes in a temporal and spatial order and links them to subjective experiences.
In a study at the Institut für Neuroinformatik at the Faculty of Computer Science of Ruhr University Bochum, Germany, a team led by Professor Laurenz Wiskott has developed a new computer model of episodic memory and thus made significant progress in understanding the hippocampus—the region of the brain that is crucial for the formation of new episodic memories.
The work was published on June 20, 2024, in the journal PLOS ONE.
Reliably storing sequences without destroying previous memories
Episodic memory is an important basis for our personal life story. It helps us to form our identity by storing and connecting past experiences and events in the right order.
“This happens through changes in the connections between the nerve cells in our brain,” explains Laurenz Wiskott.
“Until now, it has been unexplained how the human brain is able to make these changes without forgetting other memories, even though an experience is seen only once and therefore cannot be slowly and carefully integrated into the circuit diagram of the nerve cells.”
The Bochum researchers’ innovative computer model makes it possible to recreate precisely this natural ability of the human brain: to reliably store sequences after a single presentation without destroying previous memories.
The model focuses on the principles of self-organization in the hippocampus and is based on the CRISP theory of Professor Sen Cheng, also a researcher at Ruhr University Bochum. The abbreviation stands for Content Representation, Intrinsic Sequences, and Pattern Completion.
In particular, the model redefines the function of the so-called CA3 region in the hippocampus. “Previously, it was assumed that episodic memories were stored directly in the CA3 network,” says first author Dr. Jan Melchior.
“However, we now use the CA3 region only as a kind of anchor point for memory. Storage takes place in the regions that come before and after CA3.”
A neural network like a well-organized library
To achieve this, the research team trained the CA3 region in their model with pre-information and thus, figuratively speaking, set up a well-organized library in CA3.
“When new books, i.e. new experiences, are added, the library does not have to be completely reorganized. Instead, the new books are added to the existing structure and linked to existing shelves and categories,” continues Jan Melchior. This saves time and keeps the library well organized.
The CA3 region remains stable in the model and can work efficiently without the need to constantly adapt its internal structure. This makes the processing and storage of information faster and more reliable. Neural changes during the learning process occur exclusively in adjacent regions.
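To make this division of labor concrete, here is a minimal, illustrative sketch in Python/NumPy, not the authors' implementation: CA3 is modeled as a fixed, pre-trained sequence of sparse "anchor" patterns, and one-shot plasticity is restricted to the connections into and out of CA3. The network sizes, the sparsity level, and the simple outer-product learning rule are assumptions chosen for illustration; the published model uses the Hebbian descent rule and a more detailed hippocampal circuit.

```python
# Illustrative sketch only (not the published implementation): a fixed, pre-trained
# CA3 "anchor" sequence, with one-shot plasticity confined to the connections into
# and out of CA3. Sizes, sparsity, and the outer-product rule are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_ca3, seq_len = 50, 200, 5

# Pre-trained CA3: a fixed intrinsic sequence of sparse patterns (the "library shelves").
ca3_sequence = (rng.random((seq_len, n_ca3)) < 0.05).astype(float)

# The episode to be stored: a sequence of input patterns seen exactly once.
episode = rng.standard_normal((seq_len, n_in))

# One-shot hetero-association: plasticity only in W_in (input -> CA3) and
# W_out (CA3 -> reconstruction); the CA3 sequence itself never changes.
W_in = np.zeros((n_ca3, n_in))
W_out = np.zeros((n_in, n_ca3))
for x, a in zip(episode, ca3_sequence):
    W_in += np.outer(a, x)    # bind each input pattern to its CA3 anchor
    W_out += np.outer(x, a)   # bind the CA3 anchor back to the input pattern

# Retrieval from a noisy cue: project the cue into CA3, snap it onto the closest
# stored anchor (pattern completion), then replay the intrinsic CA3 sequence and
# decode every step back into input space.
cue = episode[0] + 0.3 * rng.standard_normal(n_in)
similarities = ca3_sequence @ (W_in @ cue)
start = int(np.argmax(similarities))
recalled = [W_out @ ca3_sequence[t] for t in range(start, seq_len)]

first = recalled[0]
cos = first @ episode[0] / (np.linalg.norm(first) * np.linalg.norm(episode[0]))
print(f"similarity of first recalled pattern to the stored one: {cos:.2f}")
```

Because the CA3 representation is fixed, storing a new episode only adds terms to the incoming and outgoing weights, which is why the existing anchors, and the memories already bound to them, are left untouched.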
The results of the simulation convinced the researchers. “I still regard the robustness of the model as surprising,” says Laurenz Wiskott.
“Even with incomplete or incorrect cues, a pattern sequence presented only once can be reliably stored, remembered and retrieved.”
“The model works not only with artificially generated sequences, but also with handwritten numbers and natural images,” adds Jan Melchior. “It can also improve itself without additional input by repeatedly replaying what it has learned.”
About this episodic memory research news
Author: Anke Maes
Source: RUB
Contact: Anke Maes – RUB
Original Research: Open access. "A neural network model for online one-shot storage of pattern sequences" by Jan Melchior et al. PLOS ONE
Abstract
A neural network model for online one-shot storage of pattern sequences
Based on the CRISP theory (Content Representation, Intrinsic Sequences, and Pattern completion), we present a computational model of the hippocampus that allows for online one-shot storage of pattern sequences without the need for a consolidation process.
In our model, CA3 provides a pre-trained sequence that is hetero-associated with the input sequence, rather than storing a sequence in CA3.
That is, plasticity on a short timescale only occurs in the incoming and outgoing connections of CA3, not in its recurrent connections. We use a single learning rule named Hebbian descent to train all plastic synapses in the network.
A forgetting mechanism in the learning rule allows the network to continuously store new patterns while forgetting those stored earlier. We find that a single cue pattern can reliably trigger the retrieval of sequences, even when cues are noisy or missing information.
Furthermore, pattern separation in subregion DG is necessary when sequences contain correlated patterns. Besides artificially generated input sequences, the model works with sequences of handwritten digits and natural images.
Notably, our model is capable of improving itself without external input, in a process that can be referred to as 'replay' or 'offline learning', which strengthens the associations and helps consolidate the learned patterns.
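As a rough illustration of the kind of learning described in the abstract, the sketch below shows error-driven one-shot hetero-association with a simple forgetting term. The update is a delta-rule-like stand-in inspired by the paper's Hebbian descent rule; the exact rule, its centering, and the forgetting mechanism in the published model may differ, and all sizes and constants here are assumptions.

```python
# Hedged sketch: error-driven one-shot storage with gradual forgetting.
# Not the exact Hebbian descent rule from the paper; sizes and constants are assumed.
import numpy as np

rng = np.random.default_rng(1)
n_pre, n_post, n_pairs = 100, 100, 30
lr, decay = 1.0, 0.02          # learning rate and a simple weight-decay "forgetting" term

# Random association pairs: dense cue patterns mapped to sparse target patterns.
pairs = [(rng.standard_normal(n_pre) / np.sqrt(n_pre),
          (rng.random(n_post) < 0.1).astype(float)) for _ in range(n_pairs)]

W = np.zeros((n_post, n_pre))
for x, t in pairs:                  # each association is seen exactly once (online, one-shot)
    y = W @ x                       # current (linear) prediction of the target pattern
    W += lr * np.outer(t - y, x)    # error-driven, Hebbian-style one-shot update
    W *= 1.0 - decay                # forgetting: older associations gradually fade

# Retrieval quality as a function of how long ago the association was stored.
for lag in (0, 10, n_pairs - 1):
    x, t = pairs[-1 - lag]
    y = W @ x
    cos = y @ t / (np.linalg.norm(y) * np.linalg.norm(t) + 1e-12)
    print(f"stored {lag:2d} steps ago -> cosine similarity {cos:.2f}")
```

Because the update is driven by the prediction error rather than by plain co-activity, each new pair mainly changes the weights along directions where the network is still wrong, which limits interference with earlier associations, while the decay term lets the oldest ones fade gradually.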