
Memory serves to process and store information about experiences so that this information can be used in future situations. The transfer from transient storage into long-term memory, which retains information for hours, days, and even years, is called consolidation. In brains, information is primarily stored through the alteration of synapses, a process known as synaptic plasticity. These changes are at first transient, in a so-called early phase, but can be transferred to a late phase, meaning that they become stabilized over the course of several hours. This stabilization has been explained by synaptic tagging and capture (STC) mechanisms. To store and recall memory representations, emergent dynamics arise from the synaptic structure of recurrent networks of neurons. This happens through so-called cell assemblies, which feature particularly strong synapses. It has been proposed that the stabilization of such cell assemblies by STC corresponds to synaptic consolidation, which is observed in humans and other animals in the first hours after acquiring a new memory. The exact connection between the physiological mechanisms of STC and memory consolidation, however, remains unclear. It is equally unknown what influence STC mechanisms exert on further cognitive functions that guide behavior. On timescales of minutes to hours (that is, the timescales of STC), such functions include memory improvement, modification of memories, interference and enhancement of similar memories, and transient priming of certain memories. Thus, diverse memory dynamics may be linked to STC, which can be investigated with theoretical methods based on experimental data from the neuronal and behavioral levels. In this thesis, we present a theoretical model of STC-based memory consolidation in recurrent networks of spiking neurons, which are particularly suited to reproduce biologically realistic dynamics.
Furthermore, we combine the STC mechanisms with calcium dynamics, which have been found to guide the major processes of early-phase synaptic plasticity in vivo. In three included research articles as well as additional sections, we develop this model and investigate how it can account for a variety of behavioral effects. We find that the model enables the robust implementation of the cognitive memory functions mentioned above. The main steps toward this are: (1) demonstrating the formation, consolidation, and improvement of memories represented by cell assemblies; (2) showing that neuromodulator-dependent STC can retroactively control whether information is stored in a temporal or rate-based neural code; and (3) examining the interaction of multiple cell assemblies with transient and attractor dynamics in different organizational paradigms. In summary, we demonstrate several ways in which STC controls the late-phase synaptic structure of cell assemblies. Linking these structures to functional dynamics, we show that our STC-based model implements functionality that can be related to long-term memory. Thereby, we provide a basis for the mechanistic explanation of various neuropsychological effects. Keywords: synaptic plasticity; synaptic tagging and capture; spiking recurrent neural networks; memory consolidation; long-term memory
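To make the described mechanism concrete, the following is a minimal, illustrative sketch of calcium-threshold early-phase plasticity combined with a synaptic tag and late-phase capture. All names, thresholds, and time constants here are assumptions chosen for readability; they are not the parameters of the thesis model or of any published model.

```python
def simulate_stc(calcium_trace, dt=1.0,
                 theta_p=1.0, theta_d=0.5,   # calcium thresholds for LTP/LTD (illustrative)
                 theta_tag=0.2,              # tag-setting threshold on |early change|
                 theta_pro=0.3,              # threshold triggering protein-dependent capture
                 tau_h=3600.0):              # early-phase decay time constant (seconds)
    """Toy STC dynamics: returns (early-phase change, late-phase weight, tag set?)."""
    h = 0.0        # early-phase synaptic change (transient)
    z = 0.0        # late-phase synaptic weight (stable)
    tagged = False
    for c in calcium_trace:
        # Early phase: potentiate above the LTP threshold,
        # depress between the LTD and LTP thresholds.
        if c > theta_p:
            h += 0.01 * dt
        elif c > theta_d:
            h -= 0.005 * dt
        # Early-phase changes decay over hours if not consolidated.
        h -= (h / tau_h) * dt
        # A sufficiently large early-phase change sets a synaptic tag.
        if abs(h) > theta_tag:
            tagged = True
        # Capture: tag plus plasticity-related proteins (here crudely
        # gated by a strong early-phase change) builds the late phase.
        if tagged and abs(h) > theta_pro:
            z += 0.001 * (1 if h > 0 else -1) * dt
    return h, z, tagged
```

A sustained high-calcium input drives early-phase potentiation, sets the tag, and gradually transfers the change into the stable late-phase weight, mirroring the early-to-late transition described above.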
How can neural and morphological computations be effectively combined and realized in embodied closed-loop systems (e.g., robots) such that they can become more like living creatures in their level of performance? Understanding this will lead to new technologies and a variety of applications. To tackle this research question, we here bring together experts from different fields (including biology, computational neuroscience, robotics, and artificial intelligence) to share their recent findings and ideas and to update our research community. This eBook collects 17 cutting-edge research articles, covering neural and morphological computations as well as the transfer of results to real-world applications, such as prosthesis and orthosis control and neuromorphic hardware implementation.
Organisms are equipped with value systems that signal the salience of environmental cues to their nervous system, causing a change in the nervous system that results in modification of their behavior. These systems are necessary for an organism to adapt its behavior when an important environmental event occurs. A value system constitutes a basic assumption of what is good and bad for an agent. Such value systems have been effectively used in robotic systems to shape behavior. For example, many robots have used models of the dopaminergic system to reinforce behavior that leads to rewards. Other modulatory systems that shape behavior are acetylcholine’s effect on attention, norepinephrine’s effect on vigilance, and serotonin’s effect on impulsiveness, mood, and risk. Moreover, hormonal systems, such as oxytocin with its effect on trust, also constitute value systems. This book presents current research involving neurobiologically inspired robots whose behavior is: 1) shaped by value and reward learning, 2) adapted through interaction with the environment, and 3) shaped by extracting value from the environment.
This is the second time that I have had the honor of opening an international symposium dedicated to the functions of the hippocampus here in Pecs. It was a pleasure to greet the participants in the hope that their valuable contributions will make this meeting a tradition in this town. As one of the hosts of the symposium, I had the sorrowful duty to remind you of the absence of a dear colleague, Professor Graham Goddard. His tragic and untimely death represents the irreparable loss of both a friend and an excellent researcher. This symposium is dedicated to his memory. If I compare the topics of the lectures of this symposium with those of the previous one, a striking difference becomes apparent. A dominating tendency of the previous symposium was to attempt to define hippocampal function or to offer data relevant to supporting or rejecting existing theoretical positions. No such tendency is reflected in the titles of the present symposium, in which most of the contributions deal with hippocampal phenomena at the most elementary level. Electrical, biochemical, biophysical, and pharmacological events at the synaptic, membrane, or intracellular level are analyzed without raising the question of what kind of integral functions these elementary phenomena are a part of.
This volume will explore the most recent findings on cellular mechanisms of inhibitory plasticity and its functional role in shaping neuronal circuits, their rewiring in response to experience, drug addiction and in neuropathology. Inhibitory Synaptic Plasticity will be of particular interest to neuroscientists and neurophysiologists.
Serves as a comprehensive introduction and overview of synaptic tagging and capture (STC), covering the topic from molecular and cellular aspects to behavior. Circa 15 years ago, the STC model was proposed to provide a conceptual basis for how short-term memories are transformed into long-term memories. Though the hypothesis remains unconfirmed due to technological limitations, the model is well consolidated and generally accepted in the field. Various researchers have investigated the cellular mechanisms of long-term memory formation using the STC model, but this is the first book-length treatment of STC. This volume features an introduction by Prof. Richard Morris and Prof. Cliff Abraham.
Hebb's postulate provided a crucial framework to understand synaptic alterations underlying learning and memory. Hebb's theory proposed that neurons that fire together also wire together, which provided the logical framework for the strengthening of synapses. Weakening of synapses was, however, addressed only by "not being strengthened", and it was only later that the active decrease of synaptic strength was introduced through the discovery of long-term depression caused by low-frequency stimulation of the presynaptic neuron. In 1994, it was found that the precise relative timing of pre- and postsynaptic spikes determined not only the magnitude, but also the direction of synaptic alterations when two neurons are active together. Neurons that fire together may therefore not necessarily wire together if the precise timing of the spikes involved is not tightly correlated. In the subsequent 15 years, spike-timing-dependent plasticity (STDP) has been found in multiple brain regions and in many different species. The size and shape of the time windows in which positive and negative changes can be made vary across brain regions, but the core principle of spike-timing-dependent changes remains. A large number of theoretical studies have also been conducted during this period that explore the computational function of this driving principle, and STDP algorithms have become the main learning algorithm when modeling neural networks. This Research Topic will bring together all the key experimental and theoretical research on STDP.
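The timing dependence described above is often captured by a pair-based STDP rule: the sign and magnitude of the weight change depend on the interval between pre- and postsynaptic spikes. The following is a minimal sketch; the amplitudes and time constants are illustrative assumptions, not values measured for any particular brain region.

```python
import math

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Weight change for one spike pair, where dt_ms = t_post - t_pre.

    Pre-before-post (dt_ms > 0) yields potentiation (LTP);
    post-before-pre (dt_ms < 0) yields depression (LTD).
    Both effects decay exponentially with the timing interval.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    elif dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_minus)
    return 0.0
```

Note how the rule refines Hebb's postulate: coincident firing alone is not enough, since the same spike pair strengthens or weakens the synapse depending on which neuron fired first, and the effect vanishes for loosely correlated spike times.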
The idea of one's memory "filling up" is a humorous misconception of how memory in general is thought to work; it actually has no capacity limit. However, the idea of a "full brain" makes more sense with reference to working memory, which is the limited amount of information a person can hold temporarily in an especially accessible form for use in the completion of almost any challenging cognitive task. This groundbreaking book explains the evidence supporting Cowan's theoretical proposal about working memory capacity, and compares it to competing perspectives. Cognitive psychologists profoundly disagree on how working memory is limited: whether by the number of units that can be retained (and, if so, what kind of units and how many), the types of interfering material, the time that has elapsed, some combination of these mechanisms, or none of them. The book assesses these hypotheses and examines explanations of why capacity limits occur, including vivid biological, cognitive, and evolutionary accounts. The book concludes with a discussion of the practical importance of capacity limits in daily life. This 10th anniversary Classic Edition will continue to be accessible to a wide range of readers and serve as an invaluable reference for all memory researchers.
Neurons in the brain communicate by short electrical pulses, the so-called action potentials or spikes. How can we understand the process of spike generation? How can we understand information transmission by neurons? What happens if thousands of neurons are coupled together in a seemingly random network? How does the network connectivity determine the activity patterns? And, vice versa, how does the spike activity influence the connectivity pattern? These questions are addressed in this 2002 introduction to spiking neurons aimed at those taking courses in computational neuroscience, theoretical biology, biophysics, or neural networks. The approach will suit students of physics, mathematics, or computer science; it will also be useful for biologists who are interested in mathematical modelling. The text is enhanced by many worked examples and illustrations. There are no mathematical prerequisites beyond what the audience would meet as undergraduates: more advanced techniques are introduced in an elementary, concrete fashion when needed.
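The process of spike generation raised in the first of these questions can be illustrated with the simplest spiking model of the kind such textbooks treat, the leaky integrate-and-fire (LIF) neuron: the membrane voltage integrates input current, leaks back toward rest, and emits a spike when it crosses a threshold. The parameter values below are illustrative assumptions.

```python
def lif_simulate(input_current, dt=0.1, tau_m=10.0,
                 v_rest=-65.0, v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Simulate a leaky integrate-and-fire neuron.

    input_current: external current at each time step (arbitrary units).
    Returns the list of spike times (in ms, with step size dt).
    """
    v = v_rest
    spike_times = []
    for step, i_ext in enumerate(input_current):
        # Membrane equation: tau_m * dV/dt = -(V - v_rest) + R * I
        v += dt / tau_m * (-(v - v_rest) + r_m * i_ext)
        # Threshold crossing: register a spike and reset the voltage.
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times
```

With a sufficiently strong constant input the voltage repeatedly charges up to threshold and resets, producing a regular spike train; with no input the neuron stays silent at rest. Coupling many such units through plastic synapses yields the recurrent spiking networks discussed throughout this page.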