Publications
2023
-
(2023) Current Opinion in Neurobiology. 80, 102721. Abstract
Learning is a multi-faceted phenomenon of critical importance and has therefore attracted a great deal of research, both experimental and theoretical. In this review, we consider some of the paradigmatic examples of learning and discuss common themes in theoretical learning research, such as levels of modeling and their relation to experimental observations, as well as mathematical ideas common to different types of learning.
-
(2023) Learning & Memory. 30, 2, p. 43-47 Abstract
How the dynamic evolution of forgetting changes for different material types is unexplored. By using a common experimental paradigm with stimuli of different types, we were able to directly compare the emerging dynamics and found that, even though the presentation sets differ minimally by design, the obtained curves appear to fall on a discrete spectrum. We also show that the resulting curves do not depend on physical time but rather on the number of items shown. All measured curves were compatible with our previously developed mathematical model, hinting at a potential common underlying mechanism of forgetting.
2022
-
(2022) Physical Review Research. 4, 3, 033090. Abstract
Numerous studies have analyzed the performance of participants in free recall of randomly assembled lists of words, focusing on the average number of words recalled for different experimental parameters such as list length, presentation speed, etc. The distribution of performance around the mean has not been systematically studied, even though it is well known that recall is an unpredictable process resulting in highly variable results over different trials. We recently introduced a mathematical model of free recall that reproduces well the average performance of human participants in experiments with randomly assembled lists of words or short sentences. The model assumes that during recall, each memory item currently recalled triggers the recall of the next item based on a random symmetric matrix of similarity measures between items in the list. When applying the model to experimental data, a crucial assumption was made that upon presentation, a certain fraction of the presented items remain in memory as candidates for recall, and that the number of such items can be estimated with recognition experiments performed by the same group of participants under identical conditions of item presentation as in the recall experiments. It is not clear whether this assumption is valid under different experimental paradigms and with different groups of participants. Here we propose that calculating the variance of recall performance allows one to formulate interesting predictions that can be tested without performing recognition experiments. Comparison of model predictions with experimental data on young and old participants indicates that the same recall algorithm is involved in both groups, even though old participants may have fewer candidate memory items for recall after presentation.
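The associative search described here is simple enough to simulate directly. Below is a minimal sketch (not the authors' code), assuming one plausible reading of the transition rule: each recalled item triggers the most similar item other than the one recalled immediately before; list sizes and trial counts are illustrative.

```python
import numpy as np

def recall_trial(M, rng):
    """One trial of the associative recall walk on a random symmetric similarity matrix.

    Assumed rule: from the current item, move to the most similar item, skipping
    the item recalled just before. Returns the number of distinct items recalled
    before the deterministic walk starts cycling.
    """
    S = rng.random((M, M))
    S = (S + S.T) / 2.0            # symmetric similarity measures
    np.fill_diagonal(S, -np.inf)   # no self-transitions
    prev, cur = -1, int(rng.integers(M))
    recalled, visited_edges = {cur}, set()
    while True:
        order = np.argsort(S[cur])[::-1]          # candidates, most similar first
        nxt = order[0] if order[0] != prev else order[1]
        if (cur, nxt) in visited_edges:           # the walk has entered a cycle
            return len(recalled)
        visited_edges.add((cur, nxt))
        prev, cur = cur, int(nxt)
        recalled.add(cur)

rng = np.random.default_rng(0)
for M in (16, 32, 64, 128):        # M = number of items retained in memory
    R = [recall_trial(M, rng) for _ in range(500)]
    print(f"M={M:4d}  mean R={np.mean(R):5.1f}  std R={np.std(R):4.1f}  "
          f"mean/sqrt(M)={np.mean(R) / np.sqrt(M):.2f}")
```

The trial-to-trial spread of R around its mean is exactly the kind of variance-based signature the abstract proposes to test without running separate recognition experiments.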
-
(2022) Journal of Mathematical Physics. 63, 7, 073303. Abstract
Human memory is an incredibly complex system of vast capacity, yet it is often unreliable. Measuring memory for realistic material, such as narratives, is quantitatively challenging, as people rarely remember narratives verbatim. Cognitive psychologists developed experimental paradigms involving randomly assembled lists of items that make possible quantitative measures of performance in memory tasks such as recall and recognition. Here, we describe a set of mathematical models designed to predict the results of these experiments. The models are based on simple underlying assumptions and agree surprisingly well with experimental results; in addition, they exhibit interesting mathematical behavior that can partially be understood analytically.
2021
-
(2021) Scientific Reports. 11, 1, 17456. Abstract
Memorizing the time of an event may employ two processes: (i) encoding of the absolute time of events within an episode, and (ii) encoding of their relative order. Here we study the interaction between these two processes. We performed experiments in which one or several items were presented, after which participants were asked to report the time of occurrence of each item. When a single item was presented, the distribution of reported times was quite wide. When two or three items were presented, the relative order among them strongly affected the reported time of each of them. A Bayesian theory that takes into account the memory for event order is compatible with the experimental data, in particular in terms of the effect of order on absolute time reports. Our results suggest that people do not deduce order from memorized time; instead, people's memory for the absolute time of events relies critically on the memorized order of the events.
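A toy grid-based illustration (not the authors' model) of the kind of Bayesian interaction described above: two noisy memories of absolute event times, combined with a remembered order constraint, shift the reported times so that order dominates. All numbers are made up.

```python
import numpy as np

# Toy illustration: two events at unknown times t1, t2 within a 10 s episode;
# memory returns noisy readings m1, m2. Conditioning on the remembered ORDER
# (t1 < t2) pulls the reported times apart even when the readings disagree.
T, sigma = 10.0, 2.0
m1, m2 = 5.5, 5.0          # noisy readings, here in the "wrong" order

t = np.linspace(0.0, T, 401)
T1, T2 = np.meshgrid(t, t, indexing="ij")
loglik = -((m1 - T1) ** 2 + (m2 - T2) ** 2) / (2 * sigma ** 2)
post = np.exp(loglik - loglik.max())

def post_means(p):
    p = p / p.sum()
    return (p.sum(axis=1) @ t), (p.sum(axis=0) @ t)

print("ignoring order :", post_means(post))
print("using t1 < t2  :", post_means(post * (T1 < T2)))
```

Without the order constraint the posterior means simply track the noisy readings; with it, the estimate of the first event is pulled earlier and that of the second later, mirroring the reported effect of order on absolute time reports.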
-
(2021) Entropy. 23, 5, 603. Abstract
When humans infer underlying probabilities from stochastic observations, they exhibit biases and variability that cannot be explained on the basis of sound, Bayesian manipulations of probability. This is especially salient when beliefs are updated as a function of sequential observations. We introduce a theoretical framework in which biases and variability emerge from a trade-off between Bayesian inference and the cognitive cost of carrying out probabilistic computations. We consider two forms of the cost: a precision cost and an unpredictability cost; these penalize beliefs that are less entropic and less deterministic, respectively. We apply our framework to the case of a Bernoulli variable: the bias of a coin is inferred from a sequence of coin flips. Theoretical predictions are qualitatively different depending on the form of the cost. A precision cost induces overestimation of small probabilities, on average, and a limited memory of past observations, and, consequently, a fluctuating bias. An unpredictability cost induces underestimation of small probabilities and a fixed bias that remains appreciable even for nearly unbiased observations. The case of a fair (equiprobable) coin, however, is singular, with non-trivial and slow fluctuations in the inferred bias. The proposed framework of costly Bayesian inference illustrates the richness of a resource-rational (or bounded-rational) picture of seemingly irrational human cognition.
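For reference, a minimal sketch of the unconstrained Bayesian observer that the costly-inference models deviate from: exact Beta-Bernoulli updating of an inferred coin bias. The prior, bias, and number of flips below are illustrative; the paper's precision and unpredictability costs produce systematic departures (e.g., over- or underestimation of small probabilities) from this ideal baseline.

```python
import numpy as np

# Unconstrained Bayesian baseline for inferring a coin's bias from flips.
# With a Beta(a, b) prior and h heads out of n flips, the posterior is
# Beta(a + h, b + n - h); its mean is the observer's current estimate.
rng = np.random.default_rng(1)
true_bias = 0.2
a, b = 1.0, 1.0                      # uniform prior over the bias

for n, flip in enumerate(rng.random(50) < true_bias, start=1):
    a, b = a + flip, b + (not flip)
    if n % 10 == 0:
        print(f"after {n:2d} flips: posterior mean = {a / (a + b):.3f}")
```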
-
(2021) Science. 372, 6545, eabg4020. Abstract
Hippocampal place cells encode the animal's location. Place cells were traditionally studied in small environments, and nothing is known about large, ethologically relevant spatial scales. We wirelessly recorded from hippocampal dorsal CA1 neurons of wild-born bats flying in a long tunnel (200 meters). The size of place fields ranged from 0.6 to 32 meters. Individual place cells exhibited multiple fields and a multiscale representation: Place fields of the same neuron differed up to 20-fold in size. This multiscale coding was observed from the first day of exposure to the environment, and also in laboratory-born bats that never experienced large environments. Theoretical decoding analysis showed that the multiscale code allows representation of very large environments with much higher precision than that of other codes. Together, by increasing the spatial scale, we discovered a neural code that is radically different from classical place codes.
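A toy decoding exercise in the spirit of the abstract (not the paper's analysis): maximum-likelihood decoding of position on a 200 m track from Poisson spike counts of cells whose place-field widths span the reported 0.6-32 m range. Cell counts, firing rates, and the time window are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
L, n_cells, dt = 200.0, 60, 0.5          # track length (m), number of cells, window (s)
centers = rng.uniform(0, L, n_cells)
widths = np.exp(rng.uniform(np.log(0.6), np.log(32.0), n_cells))  # 0.6-32 m fields
peak = 15.0                               # peak firing rate (Hz), illustrative

def rates(x):
    """Gaussian place-field tuning plus a small baseline rate."""
    return peak * np.exp(-0.5 * ((x - centers) / widths) ** 2) + 0.1

xs = np.linspace(0.0, L, 2001)
LAM = np.array([rates(x) * dt for x in xs])      # expected counts at each candidate position

def decode(counts):
    loglik = (counts * np.log(LAM) - LAM).sum(axis=1)   # Poisson log-likelihood
    return xs[np.argmax(loglik)]

errs = []
for _ in range(200):
    x_true = rng.uniform(0, L)
    counts = rng.poisson(rates(x_true) * dt)
    errs.append(abs(decode(counts) - x_true))
print(f"median decoding error: {np.median(errs):.2f} m on a {L:.0f} m track")
```

Swapping the log-uniform width distribution for a single fixed width lets one compare a multiscale code against a single-scale one with the same number of cells.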
-
(2021) Journal of Mathematical Neuroscience. 11, 1, 4. Abstract
Memory and forgetting constitute two sides of the same coin, and although the first has been extensively investigated, the latter is often overlooked. A possible approach to better understand forgetting is to develop phenomenological models that implement its putative mechanisms in the most elementary way possible, and then experimentally test the theoretical predictions of these models. One such mechanism proposed in previous studies is retrograde interference, stating that a memory can be erased due to subsequently acquired memories. In the current contribution, we hypothesize that retrograde erasure is controlled by the relevant "importance" measures such that more important memories eliminate less important ones acquired earlier. We show that some versions of the resulting mathematical model are broadly compatible with the previously reported power-law forgetting time course and match well the results of our recognition experiments with long, randomly assembled streams of words.
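One elementary version of the hypothesized mechanism can be simulated in a few lines (a sketch only; the paper analyzes several variants of the model): each item in the stream gets a random importance, and an earlier item is erased as soon as a later, more important item arrives. Retention then decays as a power law of the number of intervening items.

```python
import numpy as np

rng = np.random.default_rng(3)
n_items, n_runs = 200, 5000
retained = np.zeros(n_items)

for _ in range(n_runs):
    imp = rng.random(n_items)                # random "importance" of each item in the stream
    # suffix_max[i] = largest importance among items presented AFTER item i
    suffix_max = np.append(np.maximum.accumulate(imp[::-1])[::-1][1:], -np.inf)
    survives = imp > suffix_max              # erased iff a later, more important item arrives
    retained += survives[::-1]               # index by lag = number of items shown afterwards

retention = retained / n_runs
for lag in (0, 1, 3, 9, 49, 199):
    print(f"lag {lag:3d}: retention {retention[lag]:.3f}   (1/(lag+1) = {1/(lag+1):.3f})")
```

Here retention after n subsequent items is simply the probability that the item is the most important of n+1 random draws, i.e. 1/(n+1): a power-law forgetting curve of the kind mentioned in the abstract.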
2020
-
(2020) Journal of Vision. 20, 11, p. 216 Abstract
Visual encoding (how stimuli evoke neuronal responses) is known to progress from low to high levels. Decoding (how responses lead to perception), in contrast, is less understood but is widely assumed to follow the same, low-to-high-level hierarchy. According to this assumption, orientation decoding must occur in low-level areas such as V1, and consequently, two orientations on opposite sides of the fixation should not interact with each other perceptually. However, Ding et al (PNAS, 2017) provided evidence against the assumption and proposed that visual decoding may follow the opposite, high-to-low-level hierarchy in working memory. If two orientations on opposite sides of the fixation are both task relevant and enter a high-level working-memory area in a delay period, then they should interact with each other. We tested this prediction. Subjects maintained central fixation when two lines with an orientation difference of 5° were flashed on opposite sides of the fixation, with a center-to-center distance of 16° of visual angle. Their eye positions were monitored with an infrared eye tracker and trials with broken fixation were aborted. After a delay, subjects reported the two orientations by drawing and adjusting an indicator line at the fixation. In one condition, the indicator line disappeared after the first report, and was redrawn for the second report, to minimize potential interference. We found that the two lines showed a large and significant repulsion between them, demonstrating the predicted cross-fixation interactions in working memory. The pattern was consistent across 14 subjects. Control conditions and analyses ruled out alternative explanations such as interactions across trials on the same side of the fixation. Moreover, we quantitatively accounted for the repulsion with the retrospective Bayesian decoding model in Ding et al. We conclude that our results support the theory that visual perception may be viewed as high-to-low-level decoding in working memory.
-
(2020) Physical Review Letters. 124, 1, 018101. Abstract
Human memory appears to be fragile and unpredictable. Free recall of random lists of words is a standard paradigm used to probe episodic memory. We proposed an associative search process that can be reduced to a deterministic walk on random graphs defined by the structure of memory representations. The corresponding graph model can be solved analytically, resulting in a novel parameter-free prediction for the average number of memory items recalled (R) out of M items in memory: R = √(3πM/12). This prediction was verified with a specially designed experimental protocol combining large-scale crowd-sourced free recall and recognition experiments with randomly assembled lists of words or common facts. Our results show that human memory can be described by universal laws derived from first principles.
2019
-
(2019) Scientific Reports. 9, 1, 10448. Abstract
Structured information is easier to remember and recall than random information. In real life, information exhibits multi-level hierarchical organization, such as clauses, sentences, episodes and narratives in language. Here we show that multi-level grouping emerges even when participants perform memory recall experiments with random sets of words. To quantitatively probe the brain mechanisms involved in memory structuring, we consider an experimental protocol where participants perform 'final free recall' (FFR) of several random lists of words, each of which was first presented and recalled individually. We observe a hierarchy of grouping organizations in FFR; most notably, many participants sequentially recalled relatively long chunks of words from each list before recalling words from another list. Moreover, participants who exhibited the strongest organization during FFR achieved the highest levels of performance. Based on these results, we develop a hierarchical model of memory recall that is broadly compatible with our findings. Our study shows how highly controlled memory experiments with random and meaningless material, when combined with simple models, can be used to quantitatively probe the way meaningful information can be efficiently organized and processed in the brain.
2018
-
(2018) Nature Communications. 9, 1, 3590. Abstract
Ethologically relevant stimuli are often multidimensional. In many brain systems, neurons with "pure" tuning to one stimulus dimension are found along with "conjunctive" neurons that encode several dimensions, forming an apparently redundant representation. Here we show, using theoretical analysis, that a mixed-dimensionality code can efficiently represent a stimulus in different behavioral regimes: encoding by conjunctive cells is more robust when the stimulus changes quickly, whereas on long timescales pure cells represent the stimulus more efficiently with fewer neurons. We tested our predictions experimentally in the bat head-direction system and found that many head-direction cells switched their tuning dynamically from pure to conjunctive representation as a function of angular velocity, confirming our theoretical prediction. More broadly, our results suggest that optimal dimensionality depends on population size and on the time available for decoding, which might explain why mixed-dimensionality representations are common in sensory, motor, and higher cognitive systems across species.
2017
-
(2017) PLoS Computational Biology. 13, 12, 1005861. Abstract
Recurrent and feedback networks are capable of holding dynamic memories. Nonetheless, training a network for that task is challenging, since one must cope with the non-linear propagation of errors in the system: small deviations from the desired dynamics due to errors or inherent noise might have a dramatic effect in the future. A method to cope with these difficulties is thus needed. In this work we focus on recurrent networks with linear activation functions and a binary output unit, and characterize their ability to reproduce a temporal sequence of actions over the output unit. We suggest casting the temporal learning problem as a perceptron problem. In the discrete case a finite margin appears, providing the network some robustness to noise, in which case it performs perfectly (i.e., it produces the desired sequence for an arbitrary number of cycles flawlessly). In the continuous case the margin approaches zero when the output unit changes its state, so the network is only able to reproduce the sequence with slight jitters. Numerical simulations suggest that in the discrete-time case, the longest sequence that can be learned scales, at best, as the square root of the network size. A dramatic effect occurs when learning several short sequences in parallel: their total length can substantially exceed the length of the longest single sequence the network can learn. The model easily generalizes to an arbitrary number of output units, which boosts its performance. This effect is demonstrated by considering two practical examples of sequence learning. This work suggests a way to overcome stability problems in training recurrent networks and further quantifies the performance of a network under the specific learning scheme.
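A sketch of the proposed reduction under simplifying assumptions of my own (a fixed random linear recurrent network, teacher forcing with the target outputs, and classical perceptron updates on the readout): along the target trajectory, each time step yields one linear constraint on the readout weights, i.e. a perceptron problem.

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 100, 20                               # network size, sequence length
A = rng.normal(0, 0.9 / np.sqrt(N), (N, N))  # random recurrent weights (linear units)
b = rng.normal(0, 1.0, N)                    # feedback weights from the binary output unit
h0 = rng.normal(0, 1.0, N)                   # initial network state
target = rng.choice([-1.0, 1.0], T)          # desired binary output sequence

# Teacher forcing: drive the network with the TARGET outputs, so the state
# trajectory is fixed and each time step yields one perceptron constraint
#   sign(w . h(t)) == target[t].
h, H = h0.copy(), []
for s in target:
    H.append(h.copy())
    h = A @ h + b * s
H = np.array(H)

# Classical perceptron learning on the readout weights.
w = np.zeros(N)
for _ in range(2000):
    mistakes = 0
    for x, y in zip(H, target):
        if y * (w @ x) <= 0:
            w, mistakes = w + y * x, mistakes + 1
    if mistakes == 0:
        break

# Closed-loop test: feed the unit's own output back into the network.
h, out = h0.copy(), []
for _ in range(T):
    o = 1.0 if w @ h > 0 else -1.0
    out.append(o)
    h = A @ h + b * o
print("sequence reproduced:", out == list(target))
```

Because the constraints are satisfied with a strict margin after training, the closed-loop network retraces the teacher-forced trajectory exactly; the margin is what gives the discrete case its robustness to small perturbations, as the abstract notes.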
-
(2017) Neural Computation. 29, 10, p. 2684-2711 Abstract
Human memory is capable of retrieving memories similar to the one just retrieved. This associative ability is at the basis of our everyday processing of information. Current models of memory have not been able to pinpoint the mechanism that the brain could use to actively exploit similarities between memories. The prevailing idea is that, to induce transitions in attractor neural networks, it is necessary to extinguish the current memory. We introduce a novel mechanism capable of inducing transitions between memories in which similarities between memories are actively exploited by the neural dynamics to retrieve a new memory. Populations of neurons that are selective for multiple memories play a crucial role in this mechanism by becoming attractors on their own. The mechanism is based on the ability of the neural network to control the excitation-inhibition balance.
-
(2017) Proceedings of the National Academy of Sciences of the United States of America. 114, 43, p. E9115-E9124 Abstract
When a stimulus is presented, its encoding is known to progress from low- to high-level features. How these features are decoded to produce perception is less clear, and most models assume that decoding follows the same low- to high-level hierarchy of encoding. There are also theories arguing for global precedence, reversed hierarchy, or bidirectional processing, but they are descriptive without quantitative comparison with human perception. Moreover, observers often inspect different parts of a scene sequentially to form overall perception, suggesting that perceptual decoding requires working memory, yet few models consider how working-memory properties may affect decoding hierarchy. We probed decoding hierarchy by comparing absolute judgments of single orientations and relative/ordinal judgments between two sequentially presented orientations. We found that lower-level, absolute judgments failed to account for higher-level, relative/ordinal judgments. However, when ordinal judgment was used to retrospectively decode memory representations of absolute orientations, striking aspects of absolute judgments, including the correlation and forward/backward aftereffects between two reported orientations in a trial, were explained. We propose that the brain prioritizes decoding of higher-level features because they are more behaviorally relevant, and more invariant and categorical, and thus easier to specify and maintain in noisy working memory, and that more reliable higher-level decoding constrains less reliable lower-level decoding.
-
(2017) Hippocampus. 27, 9, p. 959-970 Abstract
Hippocampal place cells represent different environments with distinct neural activity patterns. Following an abrupt switch between two familiar configurations of visual cues defining two environments, the hippocampal neural activity pattern switches almost immediately to the corresponding representation. Surprisingly, during a transient period following the switch to the new environment, occasional fast transitions between the two activity patterns (flickering) were observed (Jezek, Henriksen, Treves, Moser, & Moser, 2011). Here we show that an attractor neural network model of place cells with connections endowed with short-term synaptic plasticity can account for this phenomenon. A memory trace of the recent history of network activity is maintained in the state of the synapses, allowing the network to temporarily reactivate the representation of the previous environment in the absence of the corresponding sensory cues. The model predicts that the number of flickering events depends on the amplitude of the ongoing theta rhythm and the distance between the current position of the animal and its position at the time of cue switching. We test these predictions with new analysis of experimental data. These results suggest a potential role of short-term synaptic plasticity in recruiting the activity of different cell assemblies and in shaping hippocampal activity of behaving animals.
-
(2017) eLife. 6, 23871. Abstract
Working memory and conscious perception are thought to share similar brain mechanisms, yet recent reports of non-conscious working memory challenge this view. Combining visual masking with magnetoencephalography, we investigate the reality of non-conscious working memory and dissect its neural mechanisms. In a spatial delayed-response task, participants reported the location of a subjectively unseen target above chance-level after several seconds. Conscious perception and conscious working memory were characterized by similar signatures: a sustained desynchronization in the alpha/beta band over frontal cortex, and a decodable representation of target location in posterior sensors. During non-conscious working memory, such activity vanished. Our findings contradict models that identify working memory with sustained neural firing, but are compatible with recent proposals of activity-silent working memory. We present a theoretical framework and simulations showing how slowly decaying synaptic changes allow cell assemblies to go dormant during the delay, yet be retrieved above chance-level after several seconds.
-
(2017) Neuron. 94, 5, p. 1027-1032 Abstract
The dilemma that neurotheorists face is that (1) detailed biophysical models that can be constrained by direct measurements, while being of great importance, offer no immediate insights into cognitive processes in the brain, and (2) high-level abstract cognitive models, on the other hand, while relevant for understanding behavior, are largely detached from neuronal processes and typically have many free, experimentally unconstrained parameters that have to be tuned to a particular data set and, hence, cannot be readily generalized to other experimental paradigms. In this contribution, we propose a set of "first principles" for neurally inspired cognitive modeling of memory retrieval that has no biologically unconstrained parameters and can be analyzed mathematically both at neuronal and cognitive levels. We apply this framework to the classical cognitive paradigm of free recall. We show that the resulting model accounts well for puzzling behavioral data on human participants and makes predictions that could potentially be tested with neurophysiological recording techniques.
-
(2017) Opera Medica et Physiologica. 4, p. 42 Abstract
Cortical activity exhibits distinct characteristics in different functional states. In awake, behaving animals it shows less synchrony, while in resting or sleeping states cortical activity is most synchronous. Previous studies showed that switching between functional states can change the efficiency of sensory information flow. Switching between functional states can be triggered by the release of neuromodulators, which affect neurotransmitter release probability and the depolarization of cortical neurons. In this work we focus on the primary visual area V1, using a firing-rate ring model with short-term synaptic depression (STD). We show that the reconstruction of visual features from V1 activity depends on the functional state, with the best precision achieved in the state with intermediate release probability. We suggest that this regime corresponds to the state of maximal visual attention.
-
(2017) Neuron. 93, 2, p. 323-330 Abstract
Psychological studies indicate that the human ability to keep information in readily accessible working memory is limited to four items for most people. This extremely low capacity severely limits execution of many cognitive tasks, but its neuronal underpinnings remain unclear. Here we show that, in the framework of the synaptic theory of working memory, capacity can be analytically estimated to scale with the characteristic time of short-term synaptic depression relative to the synaptic current time constant. The number of items in working memory can be regulated by external excitation, enabling the system to be tuned to the desired load and to clear the working memory of currently held items to make room for new ones.
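Reading off the stated scaling with illustrative numbers (the time constants below are assumptions for the sake of arithmetic, not values taken from the paper):

```latex
% Illustrative arithmetic only: \tau_D and \tau_s are assumed values, not the paper's.
N_{\max} \;\sim\; \frac{\tau_D}{\tau_s} \;\approx\; \frac{200\ \text{ms}}{50\ \text{ms}} \;=\; 4
```

which is of the same order as the psychological capacity limit quoted at the start of the abstract.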
2016
-
(2016) Learning & Memory. 23, 4, p. 169-173 Abstract
A large variability in performance is observed when participants recall briefly presented lists of words. The sources of such variability are not known. Our analysis of a large data set of free recall revealed a small fraction of participants that reached an extremely high performance, including many trials with the recall of complete lists. Moreover, some of them developed a number of consistent input-position-dependent recall strategies, in particular recalling words consecutively ("chaining") or in groups of consecutively presented words ("chunking"). The time course of acquisition and particular choice of positional grouping were variable among participants. Our results show that acquiring positional strategies plays a crucial role in improvement of recall performance.
2015
-
(2015) Frontiers in Computational Neuroscience. 9, DEC, 149. Abstract
Human memory can store a large amount of information. Nevertheless, recall is often a challenging task. In a classical free recall paradigm, where participants are asked to repeat a briefly presented list of words, people make mistakes for lists as short as 5 words. We present a model for memory retrieval based on a Hopfield neural network where transitions between items are determined by similarities in their long-term memory representations. Mean-field analysis of the model reveals stable states of the network corresponding (1) to single memory representations and (2) to intersections between memory representations. We show that oscillating feedback inhibition in the presence of noise induces transitions between these states, triggering the retrieval of different memories. The network dynamics qualitatively predicts the distribution of time intervals required to recall new memory items observed in experiments. It shows that items having a larger number of neurons in their representation are statistically easier to recall, and it reveals possible bottlenecks in our ability to retrieve memories. Overall, we propose a neural network model of information retrieval that is broadly compatible with experimental observations and consistent with our recent graphical model (Romani et al., 2013).
-
(2015) Journal of Neurophysiology. 114, 1, p. 505-519 Abstract
Electrophysiological mass potentials show complex spectral changes upon neuronal activation. However, it is unknown to what extent these complex band-limited changes are interrelated or, alternatively, reflect separate neuronal processes. To address this question, intracranial electrocorticographic (ECoG) responses were recorded in patients engaged in visuomotor tasks. We found that in the 10- to 100-Hz frequency range there was a significant reduction in the exponent χ of the 1/f^χ component of the spectrum associated with neuronal activation. In a minority of electrodes showing particularly high activations, the exponent reduction was associated with specific band-limited power modulations: the emergence of a high-gamma (80-100 Hz) peak and a decrease in the alpha (9-12 Hz) peak. Importantly, the peaks' height was correlated with the 1/f^χ exponent on activation. Control simulations ruled out the possibility that the change in the 1/f^χ exponent was a consequence of the analysis procedure. These results reveal a new global, cross-frequency (10-100 Hz) neuronal process reflected in a significant reduction of the power spectrum slope of the ECoG signal.
-
(2015) Learning & Memory. 22, 2, p. 101-108 Abstract
Human memory stores vast amounts of information. Yet recalling this information is often challenging when specific cues are lacking. Here we consider an associative model of retrieval where each recalled item triggers the recall of the next item based on the similarity between their long-term neuronal representations. The model predicts that different items stored in memory have different probabilities of being recalled, depending on the size of their representation. Moreover, items with high recall probability tend to be recalled earlier and to suppress other items. We performed an analysis of a large data set on free recall and found a highly specific pattern of statistical dependencies predicted by the model, in particular negative correlations between the number of words recalled and their average recall probability. Taken together, the experimental and modeling results presented here reveal complex interactions between memory items during recall that severely constrain recall capacity.
-
(2015) Hippocampus. 25, 1, p. 94-105 Abstract
The rodent hippocampus exhibits strikingly different regimes of population activity in different behavioral states. During locomotion, hippocampal activity oscillates at theta frequency (5-12 Hz) and cells fire at specific locations in the environment, the place fields. As the animal runs through a place field, spikes are emitted at progressively earlier phases of the theta cycles. During immobility, the hippocampus exhibits sharp irregular bursts of activity, with occasional rapid orderly activation of place cells expressing a possible trajectory of the animal. The mechanisms underlying this rich repertoire of dynamics are still unclear. We developed a novel recurrent network model that accounts for the observed phenomena. We assume that the network stores a map of the environment in its recurrent connections, which are endowed with short-term synaptic depression. We show that the network dynamics exhibits two different regimes that are similar to the experimentally observed population activity states in the hippocampus. The operating regime can be solely controlled by external inputs. Our results suggest that short-term synaptic plasticity is a potential mechanism contributing to shaping the population activity in the hippocampus.
2014
-
(2014) Frontiers in Computational Neuroscience. 8, OCT, 129. Abstract
In serial recall experiments, human subjects are requested to retrieve a list of words in the same order as they were presented. In a classical study, participants were reported to recall more words from study lists composed of short words compared to lists of long words, the word length effect. The word length effect was also observed in free recall experiments, where subjects can retrieve the words in any order. Here we analyzed a large dataset from free recall experiments with unrelated words, where short and long words were randomly mixed, and found a seemingly opposite effect: long words are recalled better than short ones. We show that our recently proposed mechanism of associative retrieval can explain both of these observations. Moreover, the direction of the effect depends solely on the way study lists are composed.
-
(2014) PLoS Computational Biology. 10, 4, 1003558. Abstract
The spatial responses of many of the cells recorded in layer II of rodent medial entorhinal cortex (MEC) show a triangular grid pattern, which appears to provide an accurate population code for the animal's spatial position. In layers III, V and VI of the rat MEC, grid cells are also selective to head direction and are modulated by the speed of the animal. Several putative mechanisms of grid-like maps have been proposed, including attractor network dynamics, interactions with theta oscillations, and single-unit mechanisms such as firing rate adaptation. In this paper, we present a new attractor network model that accounts for the conjunctive position-by-velocity selectivity of grid cells. Our network model is able to perform robust path integration even when the recurrent connections are subject to random perturbations.
-
(2014) Current Opinion in Neurobiology. 25, p. 20-24 Abstract
Working memory is a system that maintains and manipulates information for several seconds during the planning and execution of many cognitive tasks. Traditionally, it was believed that the neuronal underpinning of working memory is stationary persistent firing of selective neuronal populations. Recent advances introduced new ideas regarding possible mechanisms of working memory, such as short-term synaptic facilitation, precise tuning of recurrent excitation and inhibition, and intrinsic network dynamics. These ideas are motivated by computational considerations and careful analysis of experimental data. Taken together, they may indicate the plethora of different processes underlying working memory in the brain.
2013
-
(2013) Frontiers in Computational Neuroscience. DEC, 188. Abstract
Keywords: short-term plasticity, phenomenological model, neural information processing, associative memory, network dynamics, neural field model, continuous attractor neural network
-
(2013) PLoS Computational Biology. 9, 10, 1003307. Abstract
Memory storage in the brain relies on mechanisms acting on time scales from minutes, for long-term synaptic potentiation, to days, for memory consolidation. During such processes, neural circuits distinguish synapses relevant for forming long-term storage, which are consolidated, from synapses of short-term storage, which fade. How time-scale integration and synaptic differentiation are simultaneously achieved remains unclear. Here we show that synaptic scaling - a slow process usually associated with the maintenance of activity homeostasis - combined with synaptic plasticity may simultaneously achieve both, thereby providing a natural separation of short-term from long-term storage. The interaction between plasticity and scaling also provides an explanation for an established paradox where memory consolidation critically depends on the exact order of learning and recall. These results indicate that scaling may be fundamental for stabilizing memories, providing a dynamic link between early and late memory formation processes.
-
(2013) Progress in Neurobiology. 103, p. 214-222 Abstract
Working memory is a crucial component of most cognitive tasks. Its neuronal mechanisms are still unclear despite intensive experimental and theoretical explorations. Most theoretical models of working memory assume both time-invariant neural representations and precise connectivity schemes based on the tuning properties of network neurons. A different, more recent class of models assumes randomly connected neurons that have no tuning to any particular task, and bases task performance purely on adjustment of network readout. Intermediate between these schemes are networks that start out random but are trained by a learning scheme. Experimental studies of a delayed vibrotactile discrimination task indicate that some of the neurons in prefrontal cortex are persistently tuned to the frequency of a remembered stimulus, but the majority exhibit more complex relationships to the stimulus that vary considerably across time. We compare three models, ranging from a highly organized line attractor model to a randomly connected network with chaotic activity, with data recorded during this task. The random network does a surprisingly good job of both performing the task and matching certain aspects of the data. The intermediate model, in which an initially random network is partially trained to perform the working memory task by tuning its recurrent and readout connections, provides a better description, although none of the models matches all features of the data. Our results suggest that prefrontal networks may begin in a random state relative to the task and initially rely on modified readout for task performance. With further training, however, more tuned neurons with less time-varying responses should emerge as the networks become more structured.
-
(2013) Neural Computation. 25, 10, p. 2523-2544 Abstract
Most people have great difficulty in recalling unrelated items. For example, in free recall experiments, lists of more than a few randomly selected words cannot be accurately repeated. Here we introduce a phenomenological model of memory retrieval inspired by theories of neuronal population coding of information. The model predicts nontrivial scaling behaviors for the mean and standard deviation of the number of recalled words for lists of increasing length. Our results suggest that associative information retrieval is a dominating factor that limits the number of recalled items.
2012
-
(2012) Frontiers in Computational Neuroscience. 6, JUL, 43. Abstract
Brain computational challenges vary between behavioral states. Engaged animals react according to incoming sensory information, while in relaxed and sleeping states consolidation of the learned information is believed to take place. Different states are characterized by different forms of cortical activity. We study a possible neuronal mechanism for generating these diverse dynamics and suggest their possible functional significance. Previous studies demonstrated that brief synchronized increases in neural firing [Population Spikes (PS)] can be generated in homogeneous recurrent neural networks with short-term synaptic depression (STD). Here we consider more realistic networks with a clustered architecture. We show that the level of synchronization in neural activity can be controlled smoothly by network parameters. The network shifts from asynchronous activity to a regime in which clusters synchronize separately; the synchronization between clusters then increases gradually to a fully synchronized state. We examine the effects of different synchrony levels on the transmission of information by the network. We find that the regime of intermediate synchronization is preferential for the flow of information between sparsely connected areas. Based on these results, we suggest that the regime of intermediate synchronization corresponds to the engaged behavioral state of the animal, while global synchronization is exhibited during relaxed and sleeping states.
2011
-
(2011) Frontiers in Computational Neuroscience. 5, 40. Abstract
Networks with a continuous set of attractors are considered to be a paradigmatic model for parametric working memory (WM), but they require fine tuning of connections and are thus structurally unstable. Here we analyzed a network with a ring attractor in which connections are not perfectly tuned and the activity state therefore drifts in the absence of the stabilizing stimulus. We derive an analytical expression for the drift dynamics and conclude that the network cannot function as WM for a period of several seconds, a typical delay time in monkey memory experiments. We propose that short-term synaptic facilitation in recurrent connections significantly improves the robustness of the model by slowing down the drift of the activity bump. Extending the calculation of the drift velocity to a network with synaptic facilitation, we conclude that facilitation can slow down the drift by a large factor, rendering the network suitable as a model of WM.
-
(2011) Neural Computation. 23, 3, p. 651-655 Abstract
The pattern of spikes recorded from place cells in the rodent hippocampus is strongly modulated by both the spatial location in the environment and the theta rhythm. The phases of the spikes in the theta cycle advance during movement through the place field. Recently, intracellular recordings from hippocampal neurons (Harvey, Collman, Dombeck, & Tank, 2009) showed an increase in the amplitude of membrane potential oscillations inside the place field, which was interpreted as evidence that an intracellular mechanism caused phase precession. Here we show that an existing network model of the hippocampus (Tsodyks, Skaggs, Sejnowski, & McNaughton, 1996) can equally reproduce this and other aspects of the intracellular recordings, which suggests that new experiments are needed to distinguish the contributions of intracellular and network mechanisms to phase precession.
2010
-
(2010) PLoS Computational Biology. 6, 8, 1000869. Abstract
Continuous attractor networks are used to model the storage and representation of analog quantities, such as position of a visual stimulus. The storage of multiple continuous attractors in the same network has previously been studied in the context of self-position coding. Several uncorrelated maps of environments are stored in the synaptic connections, and a position in a given environment is represented by a localized pattern of neural activity in the corresponding map, driven by a spatially tuned input. Here we analyze networks storing a pair of correlated maps, or a morph sequence between two uncorrelated maps. We find a novel state in which the network activity is simultaneously localized in both maps. In this state, a fixed cue presented to the network does not determine uniquely the location of the bump, i.e. the response is unreliable, with neurons not always responding when their preferred input is present. When the tuned input varies smoothly in time, the neuronal responses become reliable and selective for the environment: the subset of neurons responsive to a moving input in one map changes almost completely in the other map. This form of remapping is a non-trivial transformation between the tuned input to the network and the resulting tuning curves of the neurons. The new state of the network could be related to the formation of direction selectivity in one-dimensional environments and hippocampal remapping. The applicability of the model is not confined to self-position representations; we show an instance of the network solving a simple delayed discrimination task.
-
(2010) Journal of Neuroscience. 30, 28, p. 9424-9430 Abstract
Comparing two sequentially presented stimuli is a widely used experimental paradigm for studying working memory. The delay activity of many single neurons in the prefrontal cortex (PFC) of monkeys was found to be stimulus-specific; however, the population dynamics of stimulus representation have not been elucidated. We analyzed the population state of a large number of PFC neurons during a somato-sensory discrimination task. Using the tuning curves of the neurons, we derived a compact characterization of the population state. Stimulus representation by the population was found to degrade after stimulus termination, and to emerge in a different form toward the end of the delay. Specifically, the tuning properties of neurons were found to change during the task. We suggest a mechanism whereby information about the stimulus is contained in activity-dependent synaptic facilitation of recurrent connections.
2009
-
(2009) Frontiers in Computational Neuroscience. 3, NOV, 27. Abstract
Inter-pyramidal synaptic connections are characterized by a wide range of EPSP amplitudes. Although repeatedly observed at different brain regions and across layers, little is known about the synaptic characteristics that contribute to this wide range. In particular, the range could potentially be accounted for by differences in all three parameters of the quantal model of synaptic transmission, i.e. the number of release sites, release probability and quantal size. Here, we present a rigorous statistical analysis of the transmission properties of excitatory synaptic connections between layer-5 pyramidal neurons of the somato-sensory cortex. Our central finding is that the EPSP amplitude is strongly correlated with the number of estimated release sites, but not with the release probability or quantal size. In addition, we found that the number of release sites can be more than an order of magnitude higher than the typical number of synaptic contacts for this type of connection. Our findings indicate that transmission at stronger synaptic connections is mediated by multiquantal release from their synaptic contacts. We propose that modulating the number of release sites could be an important mechanism in regulating neocortical synaptic transmission.
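A compact sketch of the standard binomial quantal model the analysis refers to, with invented parameter values: the mean response is N·p·q, so varying the number of release sites N at fixed release probability p and quantal size q is what moves the mean EPSP over an order of magnitude, as in the reported finding.

```python
import numpy as np

rng = np.random.default_rng(5)

def epsp_samples(n_sites, p_release, q, cv_q=0.3, n_trials=5000):
    """Binomial quantal model: each of n_sites releases independently with
    probability p_release; each released quantum has size ~ N(q, (cv_q*q)^2)."""
    k = rng.binomial(n_sites, p_release, n_trials)            # quanta released per trial
    return k * q + rng.normal(0.0, cv_q * q, n_trials) * np.sqrt(k)

for n_sites in (2, 5, 20, 60):                                # illustrative values only
    amps = epsp_samples(n_sites, p_release=0.5, q=0.2)        # quantal size in mV
    print(f"N={n_sites:3d}  mean EPSP={amps.mean():5.2f} mV  CV={amps.std() / amps.mean():.2f}")
```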
-
(2009) Proceedings of the National Academy of Sciences of the United States of America. 106, 13, p. 5371-5376 Abstract
Our brain is able to maintain a continuously updated memory representation of objects despite changes in their appearance over time (aging faces or objects, growing trees, etc.). Although this ability is crucial for cognition and behavior, it was barely explored. Here, we investigate this memory characteristic using a protocol emulating face transformation. Observers were presented with a sequence of faces that gradually transformed over many days, from a known face (source) to a new face (target), in presentations separated by other stimuli. This practice resulted in a drastic change in the memory and recognition of the faces. Although identification of the source and older face instances was reduced, recent face instances were increasingly identified as the source and rated as highly similar to the memory of the source. Using an object perturbation method, we estimated the corresponding memory shift, showing that memory patterns shifted from the source neighborhood toward the target. Our findings suggest that memory is updated to account for object changes over time while still keeping associations with past appearances. These experimental results are broadly compatible with a recently developed model of associative memory that assumes attractor dynamics with a learning rule facilitated by novelty, shown to hold when objects change gradually over short timescales.
2008
-
(2008) Journal of Computational Neuroscience. 25, 2, p. 308-316 Abstract
The synchronous oscillatory activity characterizing many neurons in a network is often considered to be a mechanism for representing, binding, conveying, and organizing information. A number of models have been proposed to explain high-frequency oscillations, but the mechanisms that underlie slow oscillations are still unclear. Here, we show by means of analytical solutions and simulations that facilitating excitatory (Ef) synapses onto interneurons in a neural network play a fundamental role, not only in shaping the frequency of slow oscillations, but also in determining the form of the up and down states observed in electrophysiological measurements. Short time constants and strong Ef synapse-connectivity were found to induce rapid alternations between up and down states, whereas long time constants and weak Ef synapse connectivity prolonged the time between up states and increased the up state duration. These results suggest a novel role for facilitating excitatory synapses onto interneurons in controlling the form and frequency of slow oscillations in neuronal circuits.
-
(2008) Science. 319, 5869, p. 1543-1546 Abstract
It is usually assumed that enhanced spiking activity in the form of persistent reverberation for several seconds is the neural correlate of working memory. Here, we propose that working memory is sustained by calcium-mediated synaptic facilitation in the recurrent connections of neocortical networks. In this account, the presynaptic residual calcium is used as a buffer that is loaded, refreshed, and read out by spiking activity. Because of the long time constants of calcium kinetics, the refresh rate can be low, resulting in a mechanism that is metabolically efficient and robust. The duration and stability of working memory can be regulated by modulating the spontaneous activity in the network.
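A minimal sketch of the bookkeeping this account relies on, written with the standard Tsodyks-Markram short-term plasticity variables: u plays the role of the presynaptic residual calcium and x of the available resources. The parameter values and spike times are typical of this modeling literature but are chosen here for illustration, not taken from the paper.

```python
import numpy as np

# u: facilitation (residual-calcium-like) variable; x: available synaptic resources.
U, tau_f, tau_d = 0.2, 1.5, 0.2      # baseline release, facilitation/depression constants (s)

def spike_efficacies(spike_times, t_end=2.0, dt=1e-3):
    """Return the drive (u*x) transmitted by each presynaptic spike, in order.
    Convention used here: efficacy takes the post-jump u and the pre-depletion x."""
    u, x, out = U, 1.0, []
    spike_steps = {int(round(t / dt)) for t in spike_times}
    for i in range(int(round(t_end / dt))):
        u += dt * (U - u) / tau_f    # facilitation decays slowly back to U
        x += dt * (1.0 - x) / tau_d  # resources recover quickly toward 1
        if i in spike_steps:
            u += U * (1.0 - u)       # calcium/facilitation jump at a spike
            out.append(u * x)        # drive delivered by this spike
            x -= u * x               # resource depletion by this spike
    return out

burst = [0.02 * k for k in range(5)]               # "loading" burst: 5 spikes at 50 Hz
loaded = spike_efficacies(burst + [1.0])[-1]       # read-out spike 1 s after loading
naive = spike_efficacies([1.0])[-1]                # same read-out spike without loading
print(f"read-out spike efficacy: {loaded:.2f} (after loading) vs {naive:.2f} (naive)")
```

Because tau_f is much longer than tau_d, the read-out spike a second later still transmits more strongly than a naive one, even though no spiking occurred in between; in the paper's account, occasional spikes refresh this silent trace to extend it further.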
2007
-
(2007) Frontiers in Neuroscience. 1, 1, p. 197-209 Abstract
We propose a model of the primary auditory cortex (A1), in which each iso-frequency column is represented by a recurrent neural network with short-term synaptic depression. Such networks can emit Population Spikes, in which most of the neurons fire synchronously for a short time period. Different columns are interconnected in a way that reflects the tonotopic map in A1, and population spikes can propagate along the map from one column to the next, in a temporally precise manner that depends on the specific input presented to the network. The network, therefore, processes incoming sounds by precise sequences of population spikes that are embedded in a continuous asynchronous activity, with both of these response components carrying information about the inputs and interacting with each other. With these basic characteristics, the model can account for a wide range of experimental findings. We reproduce neuronal frequency tuning curves, whose width depends on the strength of the intracortical inhibitory and excitatory connections. Non-simultaneous two-tone stimuli show forward masking depending on their temporal separation, as well as on the duration of the first stimulus. The model also exhibits non-linear suppressive interactions between sub-threshold tones and broad-band noise inputs, similar to the hypersensitive locking suppression recently demonstrated in auditory cortex. We derive several predictions from the model. In particular, we predict that spontaneous activity in primary auditory cortex gates the temporally locked responses of A1 neurons to auditory stimuli. Spontaneous activity could, therefore, be a mechanism for rapid and reversible modulation of cortical processing.
-
(2007) Vision Research. 47, 22, p. 2855-2867 Abstract
Mathematical singularities found in the Signal Detection Theory (SDT) based analysis of the 2-Alternative-Forced-Choice (2AFC) method [Katkov, M., Tsodyks, M., & Sagi, D. (2006a). Analysis of two-alternative force-choice Signal Detection Theory model. Journal of Mathematical Psychology, 50, 411-420; Katkov, M., Tsodyks, M., & Sagi, D. (2006b). Singularities in the inverse modeling of 2AFC contrast discrimination data. Vision Research, 46, 256-266; Katkov, M., Tsodyks, M., & Sagi, D. (2007). Singularities explained: Response to Klein. Vision Research, doi:10.1016/j.visres.2006.10.030] imply that contrast discrimination data obtained with the 2AFC method cannot always be used to reliably estimate the parameters of the underlying model (internal response and noise functions) with a reasonable number of trials. Here we bypass this problem with the Identification Task (IT) where observers identify one of N contrasts. We have found that identification data varies significantly between experimental sessions. Stable estimates using individual session data showed Contrast Response Functions (CRF) with high gain in the low contrast regime and low gain in the high contrast regime. Noise Amplitudes (NA) followed a decreasing function of contrast at low contrast levels, and were practically constant above some contrast level. The transition between these two regimes corresponded approximately to the position of the dipper in the Threshold versus Contrast (TvC) curves that were computed using the estimated parameters and independently measured using 2AFC.
-
(2007) Vision Research. 47, 22, p. 2918-2922 Abstract
Klein [Klein, A. S. (2006). Separating transducer nonlinearities and multiplicative noise in contrast discrimination. Vision Research, 46, 4279-4293] questions the existence of intrinsic singularities in two-alternative force-choice (2AFC) Signal Detection Theory (SDT) models, suggesting that the singularities found in Katkov et al. [Katkov, M., Tsodyks, M., & Sagi, D. (2006a). Singularities in the inverse modeling of 2AFC contrast discrimination data. Vision Research, 46, 259-266; Katkov, M., Tsodyks, M., & Sagi, D. (2006b). Analysis of two-alternative force-choice Signal Detection Theory model. Journal of Mathematical Psychology, 50, 411-420] are due to discarding higher-order terms in the Taylor expansion of d′ and/or limited to steep psychometric functions. Here we provide some simple intuitive examples that illustrate the results described in Katkov et al. (2006a, 2006b). We show, for the constant noise model, that singularities exist when exact values of d′ are computed and that the singularities are not limited to steep psychometric functions. In these cases the disambiguation of the different models requires millions of trials.
-
(2007) Vision Research. 47, 7, p. 965-973 Abstract
We investigated how the recognition and perception of memory-stored visual objects are influenced by cumulative experience with similar stimuli. The memory of a face was established by training observers to identify a set of faces as either "friends" or "non-friends". Subsequently, for multiple daily sessions, observers continued to perform this identification task, in which the presented faces included a sequence of morphed faces, gradually transforming from a friend face (source) to another, initially distinguishable non-friend face (target), interleaved with other faces. Initially, observers identified only the first part of the morph sequence as the friend.
-
(2007) PLoS Computational Biology. 3, 2, p. 323-332 Abstract
Persistent activity states (attractors), observed in several neocortical areas after the removal of a sensory stimulus, are believed to be the neuronal basis of working memory. One of the possible mechanisms that can underlie persistent activity is recurrent excitation mediated by intracortical synaptic connections. A recent experimental study revealed that connections between pyramidal cells in prefrontal cortex exhibit various degrees of synaptic depression and facilitation. Here we analyze the effect of synaptic dynamics on the emergence and persistence of attractor states in interconnected neural networks. We show that different combinations of synaptic depression and facilitation result in qualitatively different network dynamics with respect to the emergence of the attractor states. This analysis raises the possibility that the framework of attractor neural networks can be extended to represent time-dependent stimuli.
2006
-
(2006) Neuron. 52, 2, p. 383-394 Abstract
The ability to associate some stimuli while differentiating between others is an essential characteristic of biological memory. Theoretical models identify memories as attractors of neural network activity, with learning based on Hebb-like synaptic modifications. Our analysis shows that when network inputs are correlated, this mechanism results in overassociations, even up to several memories "merging" into one. To counteract this tendency, we introduce a learning mechanism that involves novelty-facilitated modifications, accentuating synaptic changes proportionally to the difference between network input and stored memories. This mechanism introduces a dependency of synaptic modifications on previously acquired memories, enabling a wide spectrum of memory associations, ranging from absolute discrimination to complete merging. The model predicts that memory representations should be sensitive to learning order, consistent with recent psychophysical studies of face recognition and electrophysiological experiments on hippocampal place cells. The proposed mechanism is compatible with a recent biological model of novelty-facilitated learning in hippocampal circuitry.
-
(2006) Neural Computation. 18, 10, p. 2343-2358 Abstract
Recognizing specific spatiotemporal patterns of activity, which take place at timescales much larger than the synaptic transmission and membrane time constants, is a demand from the nervous system exemplified, for instance, by auditory processing. We consider the total synaptic input that a single readout neuron receives on presentation of spatiotemporal spiking input patterns. Relying on the monotonic relation between the mean and the variance of a neuron's input current and its spiking output, we derive learning rules that increase the variance of the input current evoked by learned patterns relative to that obtained from random background patterns. We demonstrate that the model can successfully recognize a large number of patterns and exhibits a slow deterioration in performance with increasing number of learned patterns. In addition, robustness to time warping of the input patterns is revealed to be an emergent property of the model. Using a leaky integrate-and-fire realization of the readout neuron, we demonstrate that the above results also apply when considering spiking output.
-
(2006) Journal of Mathematical Psychology. 50, 4, p. 411-420 Abstract
A basic problem in psychophysics is recovering the mean internal response and noise amplitude from sensory discrimination data. Since these components cannot be estimated independently, several indirect methods were suggested to resolve this issue. Here we analyze the two-alternative force-choice method (2AFC), using a signal detection theory approach, and show analytically that the 2AFC data are not always suitable for a reliable estimation of the mean internal responses and noise amplitudes. Specifically, we show that there is a subspace of internal parameters that are highly sensitive to sampling errors (singularities), which results in a large range of estimated parameters with a finite number of experimental trials. Four types of singular models were identified, including the models where the noise amplitude is independent of the stimulus intensity, a situation often encountered in visual contrast discrimination. Finally, we consider two ways to avoid singularities: (1) inserting external noise to the stimuli, and (2) using one-interval forced-choice scaling methods (such as the Thurstonian scaling method for successive intervals).
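A small numerical illustration of why 2AFC data alone under-constrain the internal model (this shows only the elementary affine degeneracy; the singularities analyzed in the paper are subtler and persist even after fixing a normalization): two different mean-response/noise pairs, one an affine rescaling of the other, predict identical 2AFC proportions correct for every stimulus pair. The functional forms below are made up.

```python
import numpy as np
from math import erf, sqrt

def p_correct(mu, sigma, c1, c2):
    """2AFC proportion correct under a Gaussian SDT model with mean internal
    response mu(c) and noise amplitude sigma(c)."""
    d = (mu(c2) - mu(c1)) / sqrt(sigma(c1) ** 2 + sigma(c2) ** 2)
    return 0.5 * (1 + erf(d / sqrt(2)))

# Model A: a saturating transducer with signal-dependent noise (invented forms).
mu_a = lambda c: c ** 0.7
sig_a = lambda c: 0.1 + 0.2 * c ** 0.7
# Model B: an affine transformation of model A -- observationally identical in 2AFC.
a, b = 3.0, 5.0
mu_b = lambda c: a * mu_a(c) + b
sig_b = lambda c: a * sig_a(c)

contrasts = np.linspace(0.05, 1.0, 6)
for c1 in contrasts:
    for c2 in contrasts:
        if c2 > c1:
            assert abs(p_correct(mu_a, sig_a, c1, c2) - p_correct(mu_b, sig_b, c1, c2)) < 1e-12
print("models A and B predict identical 2AFC performance for all contrast pairs")
```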
-
(2006) Journal of Computational Neuroscience. 20, 2, p. 219-241 Abstract
The role of intrinsic cortical dynamics is a debatable issue. A recent optical imaging study (Kenet et al., 2003) found that activity patterns similar to orientation maps (OMs) emerge in the primary visual cortex (V1) even in the absence of sensory input, suggesting an intrinsic mechanism of OM activation. To better understand these results and shed light on intrinsic V1 processing, we propose a neural network model in which OMs are encoded by the intrinsic lateral connections. The proposed connectivity pattern depends on the preferred orientation and, unlike previous models, on the degree of orientation selectivity of the interconnected neurons. We prove that the network has a ring attractor composed of an approximated version of the OMs. Consequently, OMs emerge spontaneously when the network is presented with an unstructured noisy input. Simulations show that the model can be applied to experimental data and generate realistic OMs. We study a variation of the model with spatially restricted connections, and show that it gives rise to states composed of several OMs. We hypothesize that these states can represent local properties of the visual scene.
-
(2006) PLoS Computational Biology. 2, 3, p. 174-181 Abstract
The cerebral cortex is continuously active in the absence of external stimuli. An example of this spontaneous activity is the voltage transition between an Up and a Down state, observed simultaneously at individual neurons. Since this phenomenon could be of critical importance for working memory and attention, its explanation could reveal some fundamental properties of cortical organization. To identify a possible scenario for the dynamics of Up-Down states, we analyze a reduced stochastic dynamical system that models an interconnected network of excitatory neurons with activity-dependent synaptic depression. The model reveals that when the total synaptic connection strength exceeds a certain threshold, the phase space of the dynamical system contains two attractors, interpreted as Up and Down states. In that case, synaptic noise causes transitions between the states. Moreover, an external stimulation producing a depolarization increases the time spent in the Up state, as observed experimentally. We therefore propose that the existence of Up-Down states is a fundamental and inherent property of a noisy neural ensemble with sufficiently strong synaptic connections.
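A reduced model of this general type can be sketched in a few lines of Python: an excitatory population rate with a sigmoidal gain, activity-dependent synaptic depression, and a noisy input. The gain function, the Ornstein-Uhlenbeck noise process, and all parameter values below are illustrative assumptions (they give a bistable skeleton, not the paper's fitted model), and the noise amplitude has to be tuned to obtain transitions at realistic rates.

    import numpy as np

    def up_down_rates(T=20.0, dt=1e-3, w=6.0, u=0.5, tau_e=0.01, tau_x=0.5,
                      theta=1.0, beta=0.2, I0=0.2, sigma=0.4, tau_n=0.005, seed=0):
        rng = np.random.default_rng(seed)

        def f(h):
            return 1.0 / (1.0 + np.exp(-(h - theta) / beta))       # population gain (normalized rate)

        E, x, n = 0.0, 1.0, 0.0
        trace = np.empty(int(T / dt))
        for t in range(trace.size):
            n += dt * (-n / tau_n) + sigma * np.sqrt(2 * dt / tau_n) * rng.standard_normal()
            h = w * u * x * E + I0 + n                              # recurrent + external + noise input
            E += dt * (-E + f(h)) / tau_e
            x += dt * ((1.0 - x) / tau_x - u * x * E)               # depression of synaptic resources
            trace[t] = E
        return trace    # the rate dwells near a low (Down) and a high (Up) level, with noise-driven switches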
-
(2006) Vision Research. 46, 1-2, p. 259-266 Abstract
Analytical calculations show that two-alternative forced-choice data are not always suitable for specifying the parameters of the underlying discrimination model. Experimentally, we show here that in the case of contrast discrimination in humans, a variety of models spanning a large range of parameters can explain the data within experimental error. Monte Carlo simulations indicate that the number of trials in psychophysical experiments is not the limiting factor in estimating the parameters in contrast discrimination. These results can therefore explain the contradictory conclusions made by different groups about the relationship between the response to contrast and the noise amplitude.
2005
-
(2005) Neuron. 48, 2, p. 168-169 Abstract
Attractor neural network theory has been proposed as a theory for long-term memory. Recent studies of hippocampal place cells, including a study by Leutgeb et al. in this issue of Neuron, address the potential role of attractor dynamics in the formation of hippocampal representations of spatial maps.
-
(2005) 23 Problems in Systems Neuroscience. van Hemmen J. L. & Sejnowski T. J.(eds.). Abstract
This chapter shows that the spontaneous firing of single neurons is tightly linked to the cortical networks in which they are embedded. The idea of a network is a central concept in theoretical brain research and it is now finally possible to directly visualize the cortical networks and their states in action, at high spatiotemporal resolution.
-
(2005) Methods and Models in Neurophysics. Gutkin B., Meunier C., Chow C. C., Hansel D. & Dalibard J.(eds.). C ed. p. 245-265 Abstract
In this lecture series, I describe the recent advances in studying short-term plasticity in synaptic transmission. The material is divided into three sections. The first section deals with a phenomenological model of synaptic transmission and its underlying biophysical assumptions. In the second section, the model is used to study the implications of synaptic dynamics for information transmission between ensembles of neocortical neurons. Finally, the last section deals with the effects of short-term synaptic plasticity on the dynamics of recurrent networks.
2004
-
(2004) Journal of Vision. 4, 12, p. 993-1005 2. Abstract
Performance in perceptual tasks improves with repetition (perceptual learning), eventually reaching a saturation level. Typically, when perceptual learning effects are studied, stimulus parameters are kept constant throughout the training and during the pre- and post-training tests. Here we investigate whether learning by repetition transfers to testing conditions in which the practiced stimuli are randomly interleaved during the post-training session. We studied practice effects with a contrast discrimination task, employing a number of training methods: (i) practice with a single, fixed pedestal (base-contrast), (ii) practice with several pedestals, and (iii) practice with several pedestals that included a spatial context. Pre- and post-training tests were carried out with the base contrast randomized across trials, under conditions of contrast uncertainty. The results showed that learning had taken place with the fixed pedestal method (i) and with the context method (iii), but only the latter survived the uncertainty test. In addition, we were able to identify a very fast learning phase in contrast discrimination that improved performance under uncertainty. We contend that learned tasks that do not pass the uncertainty test involve modification of decision strategies that require exact knowledge of the stimulus.
-
(2004) Nature. 431, 7010, p. 775-781 Abstract
Sensory perception is a learned trait. The brain strategies we use to perceive the world are constantly modified by experience. With practice, we subconsciously become better at identifying familiar objects or distinguishing fine details in our environment. Current theoretical models simulate some properties of perceptual learning, but neglect the underlying cortical circuits. Future neural network models must incorporate the top-down alteration of cortical function by expectation or perceptual tasks. These newly found dynamic processes are challenging earlier views of static and feedforward processing of sensory information.
-
(2004) Journal Of Physiology-London. 557, 2, p. 415-438 Abstract
Synaptic transmission between pairs of excitatory neurones in layers V (N = 38) or IV (N = 6) of somatosensory cortex was examined in a parasagittal slice preparation obtained from young Wistar rats (14-18 days old). A combined experimental and theoretical approach reveals two characteristics of short-term synaptic depression. Firstly, as well as a release-dependent depression, there is a release-independent component that is evident in smaller postsynaptic responses even following failure to release transmitter. Secondly, recovery from depression is activity dependent and is faster at higher input frequencies. Frequency-dependent recovery is a Ca2+-dependent process and does not reflect an underlying augmentation. Frequency-dependent recovery and release-independent depression are correlated, such that at those connections with a large amount of release-independent depression, recovery from depression is faster. In addition, both are more pronounced in experiments performed at physiological temperatures. Simulations demonstrate that these homeostatic properties allow the transfer of rate information at all frequencies, essentially linearizing synaptic responses at high input frequencies.
-
(2004) Journal of Neurophysiology. 91, 2, p. 704-709 Abstract
Information processing in neocortex can be very fast, indicating that neuronal ensembles faithfully transmit rapidly changing signals to each other. Apart from signal-to-noise issues, population codes are fundamentally constrained by the neuronal dynamics. In particular, the biophysical properties of individual neurons and collective phenomena may substantially limit the speed at which a graded signal can be represented by the activity of an ensemble. These implications of the neuronal dynamics are rarely studied experimentally. Here, we combine theoretical analysis and whole cell recordings to show that encoding signals in the variance of uncorrelated synaptic inputs to a neocortical ensemble enables faithful transmission of graded signals with high temporal resolution. In contrast, the encoding of signals in the mean current is subject to low-pass filtering.
-
(2004) Trends in Neurosciences. 27, 1, p. 11-14 Abstract
A major challenge to understanding behavior is how the nervous system allows the learning of behavioral sequences that can occur over arbitrary timescales, ranging from milliseconds up to seconds, using a fixed millisecond learning rule. This article describes some potential solutions, and then focuses on a study by Mehta et al. that could contribute towards solving this puzzle. They have discovered that an experience-dependent asymmetric shape of hippocampal receptive fields combined with oscillatory inhibition can serve to map behavioral sequences on a fixed timescale.
-
(2004) Neural Networks. 17, 5-6, p. 823-832 Abstract
Sensory discriminations often improve with practice (perceptual learning). Recent results show that practice does not necessarily lead to the best possible performance on the task. It was shown that learning a task (contrast discrimination) that has already reached saturation could be enabled by a contextual change in the stimulus (the addition of surrounding flankers) during practice. Psychophysical results with varying context show a behavior that is described by a network of local visual processors with horizontal recurrent interactions. We describe a mathematical learning rule for the modification of cortical synapses that is inspired by the experimental results and apply it to recurrent cortical networks that respond to external stimuli. The model predicts that repeated presentation of the same stimulus leads to saturation of synaptic modification, such that the strengths of recurrent connections depend on the configuration of the stimulus but not on its amplitude. When a new stimulus is introduced, the modification is rekindled until a new equilibrium is reached. This effect may explain the saturation of perceptual learning when practicing a certain task repeatedly. We present simulations of contrast discrimination in a simplified model of a cortical column in the primary visual cortex and show that performance of the model is reminiscent of context-dependent perceptual learning.
2003
-
(2003) Nature. 425, 6961, p. 954-956 Abstract
Spontaneous cortical activity-ongoing activity in the absence of intentional sensory input-has been studied extensively, using methods ranging from EEG (electroencephalography), through voltage sensitive dye imaging, down to recordings from single neurons. Ongoing cortical activity has been shown to play a critical role in development, and must also be essential for processing sensory perception, because it modulates stimulus-evoked activity, and is correlated with behaviour. Yet its role in the processing of external information and its relationship to internal representations of sensory attributes remains unknown. Using voltage sensitive dye imaging, we previously established a close link between ongoing activity in the visual cortex of anaesthetized cats and the spontaneous firing of a single neuron. Here we report that such activity encompasses a set of dynamically switching cortical states, many of which correspond closely to orientation maps. When such an orientation state emerged spontaneously, it spanned several hypercolumns and was often followed by a state corresponding to a proximal orientation. We suggest that dynamically switching cortical states could represent the brain's internal context, and therefore reflect or influence memory, perception and behaviour.
-
(2003) Journal of Vision. 3, 9, p. 173a Abstract
Performance on perceptual tasks improves with practice. However, contrast discrimination thresholds show a remarkable stability when a large range of contrasts (0-0.6) is practiced. There are two known exceptions: (a) when the practiced target is surrounded by flankers (Adini et al., Nature 415, 790-793, 2002), (b) when practicing with a single base contrast (Yu et al., VSS 2002). The improvement can be explained by increasing the gain of contrast transduction and/or by optimization of discrimination strategies applied to the specific contrast level(s) used during practice. To distinguish between the two accounts, we measured contrast discrimination thresholds before and after learning in conditions where the observers could not predict the target contrast (contrast uncertainty). Learning effects based on plastic changes in the basic sensory mechanism, but not on contrast-specific strategies, are expected to survive such an experimental manipulation. The pre-learning tests (using Gabor signals) showed the expected stable performance with typical threshold-versus-contrast functions for both the certain and the uncertain contrast conditions. Next, observers were trained with contrast discrimination using a constant base (pedestal) contrast (0.5). Discrimination thresholds were almost halved during practice. However, this improvement was found to be specific to the trained condition, and post-training tests with contrast uncertainty showed no improvement. A second group of observers practiced the full contrast range with the target embedded in a chain of flankers, showing the expected improvement in contrast discrimination. This learning effect was found to transfer to the post-learning test with contrast uncertainty. The results imply that contrast transduction is modified when contrast discrimination is practiced with flanked targets. Without flankers, learning may involve improvement of decision strategies, depending on the information available to the observer.
-
(2003) Biopolymers. 68, 3, p. 422-436 Abstract
Spontaneous cortical activity of single neurons is often either dismissed as noise, or is regarded as carrying no functional significance and hence is ignored. Our findings suggest that such concepts should be revised. We explored the coherent population activity of neuronal assemblies in a primary sensory area in the absence of sensory input. Recent advances in real-time optical imaging based on voltage-sensitive dyes (VSDI) have facilitated exploration of population activity and its intimate relationship to the activity of individual cortical neurons. It has been shown by in vivo intracellular recordings that the dye signal measures the sum of the membrane potential changes in all the neuronal elements in the imaged area, emphasizing subthreshold synaptic potentials and dendritic action potentials in neuronal arborizations originating from neurons in all cortical layers whose dendrites reach the superficial cortical layers. Thus, the VSDI has allowed us to image the rather elusive activity in neuronal dendrites that cannot be readily explored by single unit recordings. Surprisingly, we found that the amplitude of this type of ongoing subthreshold activity is of the same order of magnitude as evoked activity. We also found that this ongoing activity exhibited high synchronization over many millimeters of cortex. We then investigated the influence of ongoing activity on the evoked response, and showed that the two interact strongly. Furthermore, we found that cortical states that were previously associated only with evoked activity can actually be observed also in the absence of stimulation; for example, the cortical representation of a given orientation may appear without any visual input. This demonstration suggests that ongoing activity may also play a major role in other cortical functions by providing a neuronal substrate for the dependence of sensory information processing on context, behavior, memory and other aspects of cognitive function.
-
(2003) Journal of Vision. 3, 9, p. 609a Abstract
Psychophysical contrast discrimination is believed to be mediated by internal responses, each characterized by a contrast-dependent mean value and noise amplitude. The standard measure of contrast discrimination, the TvC (threshold versus contrast) curve, does not allow unambiguous characterization of these two components of the internal response, since many possible combinations could account for the same TvC curve. Here we propose a novel approach that is based on performing a larger number of pairwise contrast discriminations. The performance, measured as percent correct discrimination, is compared with the predicted one based on a model that assumes normally distributed responses. The two response components are then determined by matching the model predictions to the experimental results. This procedure requires a minimal number of stimulus pairs in order to derive a complete or over-complete system of equations for mean and noise response amplitudes. The method can also be used for other stimulus configurations, such as local stimuli surrounded by flankers, allowing the study of lateral interactions. We applied the method to a set of 6 isolated Gabor patches (9.2 cpd, sigma=0.11 deg) with different contrasts (randomly mixed) and 10 pairwise discriminations (temporal 2AFC) that resulted in the complete system of model equations. The preliminary experimental results (2 observers, ∼600 trials each) indicate that the shape of the TvC curve is determined by a nonlinear transducer function and a non-trivial contrast-dependence of the noise amplitudes. However, we find that the calculated noise amplitudes are more sensitive to experimental measurement errors. It is possible that we will be able to reduce this uncertainty by considering additional contrast discriminations that will result in an over-complete system of equations for response amplitudes.
2002
-
(2002) Trends in Neurosciences. 25, 12, p. 599-600 Abstract
Depending on the precise temporal relationship between their spiking activities, connections between neurons could be modified in opposite directions. Although the functional implications of this spike-timing-dependent plasticity are not clear, several theoretical studies have indicated that it could underlie important effects such as sequence learning, predictive learning and balancing excitation and inhibition. To explore fully this novel form of synaptic plasticity, it is crucial to understand how the modification builds up over the consecutive spikes of presynaptic and postsynaptic neurons. In the absence of solid data, many theorists assumed a linear summation model. However, recent experiments specifically devised to study this issue have demonstrated that the effects of the consecutive spikes on the overall modification steadily decline, indicating strong non-linearities in the corresponding learning rules.
-
(2002) Nature. 415, 6873, p. 790-793 Abstract
Training was found to improve the performance of humans on a variety of visual perceptual tasks. However, the ability to detect small changes in the contrast of simple visual stimuli could not be improved by repetition. Here we show that the performance of this basic task could be modified after the discrimination of the stimulus contrast was practised in the presence of similar laterally placed stimuli, suggesting a change in the local neuronal circuit involved in the task. On the basis of a combination of Hebbian and anti-Hebbian synaptic learning rules compatible with our results, we propose a mechanism of plasticity in the visual cortex that is enabled by a change in the context.
-
(2002) Journal of Neurophysiology. 88, 2, p. 761-770 Abstract
Spike-frequency adaptation in neocortical pyramidal neurons was examined using the whole cell patch-clamp technique and a phenomenological model of neuronal activity. Noisy current was injected to reproduce the irregular firing typically observed under in vivo conditions. The response was quantified by computing the poststimulus histogram (PSTH). To simulate the spiking activity of a pyramidal neuron, we considered an integrate-and-fire model to which an adaptation current was added. A simplified model for the mean firing rate of an adapting neuron under noisy conditions is also presented. The mean firing rate model provides a good fit to both experimental and simulation PSTHs and may therefore be used to study the response characteristics of adapting neurons to various input currents. The models enable identification of the relevant parameters of adaptation that determine the shape of the PSTH and allow the computation of the response to any change in injected current. The results suggest that spike frequency adaptation determines a preferred frequency of stimulation for which the phase delay of a neuron's activity relative to an oscillatory input is zero. Simulations show that the preferred frequency of single neurons dictates the frequency of emergent population rhythms in large networks of adapting neurons. Adaptation could therefore be one of the crucial factors in setting the frequency of population rhythms in the neocortex.
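A minimal adapting integrate-and-fire neuron along these lines can be sketched in Python (the time constants, adaptation increment and threshold below are generic assumptions, not the fitted values from the paper); repeating the simulation over many realizations of a noisy current and histogramming the spikes gives a PSTH of the kind analyzed above.

    import numpy as np

    def adapting_lif(I, dt=1e-4, tau_m=0.02, tau_a=0.3, R=1.0,
                     v_th=1.0, v_reset=0.0, da=0.15):
        # I: injected current, one value per time step (e.g. a step plus noise)
        v, a = 0.0, 0.0
        spikes = np.zeros(len(I), dtype=bool)
        for t, i_t in enumerate(I):
            v += dt * (-v + R * (i_t - a)) / tau_m      # leaky integration of input minus adaptation current
            a += dt * (-a / tau_a)                      # adaptation current decays between spikes
            if v >= v_th:
                spikes[t] = True
                v = v_reset
                a += da                                 # each spike increments the adaptation current
        return spikes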
-
(2002) Journal of Neurophysiology. 87, 1, p. 140-148 Abstract
Synaptic transmission in the neocortex is dynamic, such that the magnitude of the postsynaptic response changes with the history of the presynaptic activity. Therefore each response carries information about the temporal structure of the preceding presynaptic input spike train. We quantitatively analyze the information about previous interspike intervals, contained in single responses of dynamic synapses, using methods from information theory applied to experimentally based deterministic and probabilistic phenomenological models of depressing and facilitating synapses. We show that for any given dynamic synapse, there exists an optimal frequency of presynaptic spike firing for which the information content is maximal; simple relations between this optimal frequency and the synaptic parameters are derived. Depressing neocortical synapses are optimized for coding temporal information at low firing rates of 0.5-5 Hz, typical to the spontaneous activity of cortical neurons, and carry significant information about the timing of up to four preceding presynaptic spikes. Facilitating synapses, however, are optimized to code information at higher presynaptic rates of 9-70 Hz and can represent the timing of over eight presynaptic spikes.
-
(2002) Journal of Computational Neuroscience. 13, 2, p. 111-124 Abstract
While computation by ensemble synchronization is considered to be a robust and efficient way for information processing in the cortex (C. Von der Malsburg and W. Schneider (1986) Biol. Cybern. 54: 29-40; W. Singer (1994) Inter. Rev. Neuro. 37: 153-183; J.J. Hopfield (1995) Nature 376: 33-36; E. Vaadia et al. (1995) Nature 373: 515-518), the neuronal mechanisms that might be used to achieve it are yet to be uncovered. Here we analyze a neural network model in which computations are performed by near-coincident firing of neurons in response to external inputs. This near-coincident firing is enabled by activity-dependent depression of interneuron connections. We analyze the network behavior by using a mean-field approximation, which allows predicting the network response to various inputs. We demonstrate that the network is very sensitive to temporal aspects of the inputs. In particular, periodically applied inputs of increasing frequency result in different response profiles. Moreover, applying combinations of different stimuli leads to a complex response, which cannot be easily predicted from the responses to the individual components. These results demonstrate that networks with synaptic depression can perform complex computations on time-dependent inputs utilizing the ability to generate temporally synchronous firing of single neurons.
2001
-
(2001) Neuro-Informatics and Neural Modelling. Moss F. & Gielen S.(eds.). p. 969-1000 Abstract
This chapter focuses on the emergence of feature selectivity from lateral interactions in the visual cortex. The tuning properties of neurons responding to oriented moving stimuli result from the interplay between excitation on a short length scale and inhibition dominating at larger distances. The chapter reviews models for the dynamics of activities in cortex that are based on stereotyped intracortical interactions. These models stood at the very beginning of the mathematical description of collective phenomena in the brain. The dynamics of these models has far-reaching consequences and explains a variety of experimental findings. The chapter shows that they might provide a novel explanation for the early development of feature maps in the visual cortex. In chains and two-dimensional neuronal layers, a Mexican-hat shaped coupling induces localized activation patterns. This simple dynamics can be related to response properties of neurons in primary visual cortex. The chapter shows that this approach can also explain the shape of orientation and direction maps as well as the relation of columnar structures to receptive field size and movement.
-
(2001) Neural Computation. 13, 1, p. 35-67 Abstract
The precise times of occurrence of individual pre- and postsynaptic action potentials are known to play a key role in the modification of synaptic efficacy. Based on stimulation protocols of two synaptically connected neurons, we infer an algorithm that reproduces the experimental data by modifying the probability of vesicle discharge as a function of the relative timing of spikes in the pre- and postsynaptic neurons. The primary feature of this algorithm is an asymmetry with respect to the direction of synaptic modification depending on whether the presynaptic spikes precede or follow the postsynaptic spike. Specifically, if the presynaptic spike occurs up to 50 ms before the postsynaptic spike, the probability of vesicle discharge is upregulated, while the probability of vesicle discharge is downregulated if the presynaptic spike occurs up to 50 ms after the postsynaptic spike. When neurons fire irregularly with Poisson spike trains at constant mean firing rates, the probability of vesicle discharge converges toward a characteristic value determined by the pre- and postsynaptic firing rates. On the other hand, if the mean rates of the Poisson spike trains slowly change with time, our algorithm predicts modifications in the probability of release that generalize Hebbian and Bienenstock-Cooper-Munro rules. We conclude that the proposed spike-based synaptic learning algorithm provides a general framework for regulating neurotransmitter release probability.
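A hedged Python sketch of such an update of the release probability is given below; the exponential shape inside the ±50 ms window, the learning amplitudes, and the soft bounds are assumptions made for illustration, with only the sign and the window taken from the description above.

    from math import exp

    def update_release_probability(p, delta_ms, a_plus=0.01, a_minus=0.01,
                                   tau=20.0, window=50.0):
        # delta_ms = t_post - t_pre: positive when the presynaptic spike comes first
        if 0.0 < delta_ms <= window:
            p += a_plus * (1.0 - p) * exp(-delta_ms / tau)   # pre-before-post: upregulate release
        elif -window <= delta_ms < 0.0:
            p -= a_minus * p * exp(delta_ms / tau)           # post-before-pre: downregulate release
        return min(max(p, 0.0), 1.0)                         # keep p a valid probability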
-
(2001) Nature Neuroscience. 4, 4, p. 431-436 Abstract
Previous experiments indicate that the shape of maps of preferred orientation in the primary visual cortex does not depend on visual experience. We propose a network model that demonstrates that the orientation and direction selectivity of individual units and the structure of the corresponding angle maps could emerge from local recurrent connections. Our model reproduces the structure of preferred orientation and direction maps, and explains the origin of their interrelationship. The model also provides an explanation for the correlation between position shifts of receptive fields and changes of preferred orientations of single neurons across the surface of the cortex.
2000
-
(2000) Neurocomputing. 32-33, p. 359-364 Abstract
We investigated the dynamics of localized excitatory and inhibitory populations coupled by dynamic synapses. The synaptic connections exhibit fast depression and facilitation as recently observed in neocortex. Phase plane analysis and numerical simulations are used to investigate population responses to various stimuli. In particular, we analyze the fixed-point structure and its stability as a function of the input currents. We find parameter sets for which the system exhibits anomalous behavior, decreasing the activity of both excitatory and inhibitory populations in response to an increase of the inhibitory input current.
-
(2000) Neurocomputing. 32-33, p. 365-370 Abstract
We investigated the activity of localized excitatory and inhibitory populations coupled by dynamic synapses. Using numerical simulations, we analyzed the Lyapunov exponents as well as the fractal dimension of the network for various sets of parameters in order to find regimes of periodic and chaotic behavior. We found that chaotic behavior usually develops when external inputs are near threshold, and that chaos develops through a series of period doublings. It is robust and stable over a considerable volume in parameter space. Within the chaotic regimes, intermittent behavior is exhibited. We investigated the average behavior of the network and showed that the response of the network is approximately linear in the excitatory input across various dynamical regimes.
-
(2000) The Journal of neuroscience : the official journal of the Society for Neuroscience. 20, 1, p. RC50 Abstract
Throughout the neocortex, groups of neurons have been found to fire synchronously on the time scale of several milliseconds. This near coincident firing of neurons could coordinate the multifaceted information of different features of a stimulus. The mechanisms of generating such synchrony are not clear. We simulated the activity of a population of excitatory and inhibitory neurons randomly interconnected into a recurrent network via synapses that display temporal dynamics in their transmission; surprisingly, we found a behavior of the network where action potential activity spontaneously self-organized to produce highly synchronous bursts involving virtually the entire network. These population bursts were also triggered by stimuli to the network in an all-or-none manner. We found that the particular intensities of the external stimulus to specific neurons were crucial to evoke population bursts. This topographic sensitivity therefore depends on the spectrum of basal discharge rates across the population and not on the anatomical individuality of the neurons, because the connectivity was random. These results suggest that networks in which neurons are even randomly interconnected via frequency-dependent synapses could exhibit a novel form of reflex response that is sensitive to the nature of the stimulus as well as the background spontaneous activity.
1999
-
(1999) Science. 286, 5446, p. 1943-1946 Abstract
The relation between the activity of a single neocortical neuron and the dynamics of the network in which it is embedded was explored by single-unit recordings and real-time optical imaging. The firing rate of a spontaneously active single neuron strongly depends on the instantaneous spatial pattern of ongoing population activity in a large cortical area. Very similar spatial patterns of population activity were observed both when the neuron fired spontaneously and when it was driven by its optimal stimulus. The evoked patterns could be used to reconstruct the spontaneous activity of single neurons.
-
(1999) Hippocampus. 9, 4, p. 481-489 Abstract
Hippocampal pyramidal neurons in rats are selectively activated at specific locations in an environment. Different cells are active in different places, therefore providing a faithful representation of the environment in which every spatial location is mapped to a particular population state of activity of place cells. We describe a theory of the hippocampus, according to which the map results from the cooperative dynamics of the network, in which the strength of synaptic interaction between the neurons depends on the distance between their place fields. This synaptic structure guarantees that the network possesses a quasi-continuous set of stable states (attractors) that are localized in the space of neuronal variables reflecting their synaptic interactions, rather than their physical location in the hippocampus. As a consequence of the stable states, the network can exhibit place-selective activity even without relying on input from external sensory cues.
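The core idea, a quasi-continuous attractor produced by distance-dependent synaptic weights, can be illustrated with a short Python sketch on a circular track; the cosine connectivity profile, the threshold-linear dynamics, and all parameter values are standard textbook assumptions rather than the paper's specific construction.

    import numpy as np

    def place_cell_bump(N=200, steps=2000, dt=0.1, tau=10.0, J0=-0.5, J1=3.0, seed=1):
        # neurons are labelled by the position of their place field on a circular track
        theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
        W = (J0 + J1 * np.cos(theta[:, None] - theta[None, :])) / N   # excitation between nearby fields
        r = np.zeros(N)
        rng = np.random.default_rng(seed)
        for _ in range(steps):
            I = 1.0 + 0.1 * rng.standard_normal(N)                    # broadly tuned, noisy external input
            r += dt * (-r + np.maximum(W @ r + I, 0.0)) / tau         # threshold-linear rate dynamics
        return r   # converges to a localized bump whose position is one of a continuum of stable states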
-
(1999) Neurocomputing. 26-27, p. 361-366 Abstract
Using a mean-field approach we simulate the dynamics of a small patch of the primary visual cortex. The model includes excitatory and inhibitory neuronal populations. Lateral synaptic connections between these populations are determined by the topological structure of the orientation selectivity map. Activity-dependent synaptic transmission across the lateral connections generates complex spatio-temporal patterns of network activity, characterized by spontaneous oscillations with several time scales.
-
(1999) Neural Computation. 11, 2, p. 375-379 Abstract
A recent study of cat visual cortex reported abrupt changes in the positions of the receptive fields of adjacent neurons whose preferred orientations strongly differed (Das & Gilbert, 1997). Using a simple cortical model, we show that this covariation of discontinuities in maps of orientation preference and local distortions in maps of visual space reflects collective effects of the lateral cortical feedback.
1998
-
(1998) Neurobiology of Learning and Memory. 70, 1-2, p. 101-112 Abstract
The efficacy of synaptic transmission between two neurons changes as a function of the history of previous activations of the synaptic connection. This history dependence can be characterized by examining the dependence of transmission on the frequency of stimulation. In this framework synaptic plasticity can also be examined in terms of changes in the frequency dependence of transmission and not merely in terms of synaptic strength, which constitutes only a linear scaling mechanism. Recent work shows that the frequency dependence of transmission determines the content of information transmitted between neurons and that synaptic modifications can change the content of information transmitted. Multipatch-clamp recordings revealed that the frequency dependence of transmission is potentially unique for each synaptic connection made by a single axon and that the classes of the pre- and postsynaptic neurons determine the class of frequency dependence (activity independent), while the unique activity relationship between any two neurons could determine the precise values of the parameters within a specific class (activity dependent). The content of information transmitted between neurons is also formalized to provide synaptic transfer functions which can be used to determine the role of the synaptic connection within a network of neurons. It is proposed that deriving synaptic transfer functions is crucial in order to understand the link between synaptic transmission and information processing within networks of neurons and to understand the link between synaptic plasticity and learning and memory.
-
(1998) Neural Computation. 10, 4, p. 815-819 Abstract
A recent experiment showed that neurons in the primary auditory cortex of the monkey do not change their mean firing rate during an ongoing tone stimulus. The only change was an enhanced correlation among the individual spike trains during the tone. We show that there is an easy way to extract this coherence information in the cortical cell population by projecting the spike trains through depressing synapses onto a postsynaptic neuron.
-
(1998) Neural Computation. 10, 4, p. 821-835 Abstract
Transmission across neocortical synapses depends on the frequency of presynaptic activity (Thomson & Deuchars, 1994). Interpyramidal synapses in layer V exhibit fast depression of synaptic transmission, while other types of synapses exhibit facilitation of transmission. To study the role of dynamic synapses in network computation, we propose a unified phenomenological model that allows computation of the postsynaptic current generated by both types of synapses when driven by an arbitrary pattern of action potential (AP) activity in a presynaptic population. Using this formalism, we analyze different regimes of synaptic transmission and demonstrate that dynamic synapses transmit different aspects of the presynaptic activity depending on the average presynaptic frequency. The model also allows for derivation of mean-field equations, which govern the activity of large, interconnected networks. We show that the dynamics of synaptic transmission results in complex sets of regular and irregular regimes of network activity.
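The per-spike responses of such a phenomenological synapse can be computed with a short iterative sketch (Python). The u-x ("utilization of resources") formulation below is one common way these models are written; the default parameter values and the exact update order at a spike are assumptions for illustration, since conventions differ slightly between papers.

    import numpy as np

    def dynamic_synapse(spike_times, A=1.0, U=0.5, tau_rec=0.8, tau_facil=0.0):
        # Relative postsynaptic response to each spike of a presynaptic train (times in seconds).
        # tau_facil = 0 gives a purely depressing synapse; tau_facil > 0 adds facilitation.
        x, u, last_t, responses = 1.0, 0.0, None, []
        for t in spike_times:
            if last_t is not None:
                dt = t - last_t
                x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)                 # resources recover toward 1
                u = u * np.exp(-dt / tau_facil) if tau_facil > 0 else 0.0   # facilitation decays toward 0
            u = u + U * (1.0 - u)            # utilization jumps at each spike
            responses.append(A * u * x)      # postsynaptic response to this spike
            x = x * (1.0 - u)                # the used fraction of resources is depleted
            last_t = t
        return responses

    # e.g. dynamic_synapse(np.arange(10) * 0.05) shows progressive depression at 20 Hz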
-
(1998) Proceedings of the National Academy of Sciences of the United States of America. 95, 9, p. 5323-5328 Abstract
The nature of information stemming from a single neuron and conveyed simultaneously to several hundred target neurons is not known. Triple and quadruple neuron recordings revealed that each synaptic connection established by neocortical pyramidal neurons is potentially unique. Specifically, synaptic connections onto the same morphological class differed in the numbers and dendritic locations of synaptic contacts, their absolute synaptic strengths, as well as their rates of synaptic depression and recovery from depression. The same axon of a pyramidal neuron innervating another pyramidal neuron and an interneuron mediated frequency-dependent depression and facilitation, respectively, during high frequency discharges of presynaptic action potentials, suggesting that the different natures of the target neurons underlie qualitative differences in synaptic properties. Facilitating-type synaptic connections established by three pyramidal neurons of the same class onto a single interneuron, were all qualitatively similar with a combination of facilitation and depression mechanisms. The time courses of facilitation and depression, however, differed for these convergent connections, suggesting that different pre-postsynaptic interactions underlie quantitative differences in synaptic properties. Mathematical analysis of the transfer functions of frequency-dependent synapses revealed supra-linear, linear, and sub-linear signaling regimes in which mixtures of presynaptic rates, integrals of rates, and derivatives of rates are transferred to targets depending on the precise values of the synaptic parameters and the history of presynaptic action potential activity. Heterogeneity of synaptic transfer functions therefore allows multiple synaptic representations of the same presynaptic action potential train and suggests that these synaptic representations are regulated in a complex manner. It is therefore proposed that differential signaling is a key mechanism in neocortical information processing, which can be regulated by selective synaptic modifications.
-
(1998) Neuropharmacology. 37, 4-5, p. 489-500 Abstract
Recent experimental evidence indicates that in the neocortex, the manner in which each synapse releases neurotransmitter in response to trains of presynaptic action potentials is potentially unique. These unique transmission characteristics arise because of a large heterogeneity in various synaptic properties that determine frequency dependence of transmission such as those governing the rates of synaptic depression and facilitation. A theoretical analysis was therefore undertaken to explore the phenomenologies of changes in the values of these synaptic parameters. The results illustrate how the change in any one of several synaptic parameters produces a distinctive effect on synaptic transmission and how these distinctive effects can point to the most likely biophysical mechanisms. These results could therefore be useful in studies of synaptic plasticity in order to obtain a full characterization of the phenomenologies of synaptic modifications and to isolate potential biophysical mechanisms. Based on this theoretical analysis and experimental data, it is proposed that there exist multiple mechanisms, phenomena and algorithms for synaptic plasticity at single synapses. Finally, it is shown that the impact of changing the values of synaptic parameters depends on the values of the other parameters. This may indicate that the various mechanisms, phenomena and algorithms are interlinked in a 'synaptic plasticity code'.
-
(1998) Journal of Computational Neuroscience. 5, 2, p. 157-169 Abstract
We discuss the first few stages of olfactory processing in the framework of a layered neural network. Its central component is an oscillatory associative memory, describing the external plexiform layer, that consists of inhibitory and excitatory neurons with dendrodendritic interactions. We explore the computational properties of this neural network and point out its possible functional role in the olfactory bulb. When receiving a complex input that is composed of several odors, the network segments it into its components. This is done in two stages. First, multiple odor input is preprocessed in the glomerular layer via a decorrelation mechanism that relies on temporal independence of odor sources. Second, as the recall process of a pattern consists of associative convergence to an oscillatory attractor, multiple inputs are identified by alternate dominance of memory patterns during different sniff cycles. This could explain how quick analysis of mixed odors is subserved by the rapid sniffing behavior of highly olfactory animals. When one of the odors is much stronger than the rest, the network converges onto it, thus displaying odor masking.
1997
-
(1997) Proceedings of the National Academy of Sciences of the United States of America. 94, 19, p. 10426-10431 Abstract
At early stages in visual processing cells respond to local stimuli with specific features such as orientation and spatial frequency. Although the receptive fields of these cells have been thought to be local and independent, recent physiological and psychophysical evidence has accumulated, indicating that the cells participate in a rich network of local connections. Thus, these local processing units can integrate information over much larger parts of the visual field; the pattern of their response to a stimulus apparently depends on the context presented. To explore the pattern of lateral interactions in human visual cortex under different context conditions we used a novel chain lateral masking detection paradigm, in which human observers performed a detection task in the presence of different length chains of high-contrast-flanked Gabor signals. The results indicated a nonmonotonic relation of the detection threshold with the number of flankers. Remote flankers had a stronger effect on target detection when the space between them was filled with other flankers, indicating that the detection threshold is caused by dynamics of large neuronal populations in the neocortex, with a major interplay between excitation and inhibition. We considered a model of the primary visual cortex as a network consisting of excitatory and inhibitory cell populations, with both short- and long-range interactions. The model exhibited a behavior similar to the experimental results throughout a range of parameters. Experimental and modeling results indicated that long-range connections play an important role in visual perception, possibly mediating the effects of context.
-
(1997) Journal of Computational Neuroscience. 4, 2, p. 173-182 Abstract
The external plexiform layer is where the interactions between the mitral (excitatory) and granule (inhibitory) cells of the olfactory bulb (OB) take place. Two outstanding features of these interactions are that they are dendrodendritic and that there seem to be none between excitatory cells. The latter are usually credited with the role of forming Hebbian cell assemblies. Hence, it would seem that this structure lacks the necessary ingredients for an associative memory system. In this article we show that in spite of these two properties this system can serve as an associative memory. Our model incorporates the essential anatomical characteristics of the OB. The memories in our system, defined by Hebbian mitral assemblies, are activated via the interactions with the inhibitory granule cells. The nonlinearity is introduced in our model via a sigmoid function that describes neurotransmitter release in reciprocal dendrodendritic synapses. The capacity (maximal number of odors that can be memorized) depends on the sparseness of coding that is being used. For very low memory activities, the capacity grows as a fractional power of the number of neurons. We validate the theoretical results by numerical simulations. An interesting result of our model is that its capacity increases as a function of the ratio of inhibitory to excitatory populations. This may provide an explanation for the dominance of inhibitory cells in the olfactory bulb.
-
(1997) Proceedings of the National Academy of Sciences of the United States of America. 94, 2, p. 719-723 Abstract
Although signaling between neurons is central to the functioning of the brain, we still do not understand how the code used in signaling depends on the properties of synaptic transmission. Theoretical analysis combined with patch clamp recordings from pairs of neocortical pyramidal neurons revealed that the rate of synaptic depression, which depends on the probability of neurotransmitter release, dictates the extent to which firing rate and temporal coherence of action potentials within a presynaptic population are signaled to the postsynaptic neuron. The postsynaptic response primarily reflects rates of firing when depression is slow and temporal coherence when depression is fast. A wide range of rates of synaptic depression between different pairs of pyramidal neurons was found, suggesting that the relative contribution of rate and temporal signals varies along a continuum. We conclude that by setting the rate of synaptic depression, release probability is an important factor in determining the neural code.
-
(1997) Journal of Neuroscience. 17, 11, p. 4382-4388 Abstract
The neocortex, hippocampus, and several other brain regions contain populations of excitatory principal cells with recurrent connections and strong interactions with local inhibitory interneurons. To improve our understanding of the interactions among these cell types, we modeled the dynamic behavior of this type of network, including external inputs. A surprising finding was that increasing the direct external inhibitory input to the inhibitory interneurons, without directly affecting any other part of the network, can, in some circumstances, cause the interneurons to increase their firing rates. The main prerequisite for this paradoxical response to external input is that the recurrent connections among the excitatory cells are strong enough to make the excitatory network unstable when feedback inhibition is removed. Because this requirement is met in the neocortex and several regions of the hippocampus, these observations have important implications for understanding the responses of interneurons to a variety of pharmacological and electrical manipulations. The analysis can be extended to a scenario with periodically varying external input, where it predicts a systematic relationship between the phase shift and depth of modulation for each interneuron. This prediction was tested by recording from interneurons in the CA1 region of the rat hippocampus in vivo, and the results broadly confirmed the model. These findings have further implications for the function of inhibitory and neuromodulatory circuits, which can be tested experimentally.
-
(1997) Artificial Neural Networks - ICANN 1997 - 7th International Conference, Proceedings. Germond A., Nicoud J-D, Hasler M. & Gerstner W.(eds.). Vol. 1327. p. 13-23 Abstract
Electrical recordings from three neurons revealed that the same spike train emitted by one neuron had markedly different effects on two target neurons. A spike train from a single neocortical pyramidal neuron produced synaptic responses in two target pyramidal neurons that differed in response strength and rates of activity-dependent depression of synaptic transmission. When a pyramidal neuron targeted another pyramidal neuron as well as an interneuron, the responses were also qualitatively different. The responses onto the pyramidal neuron displayed marked activity-dependent depression, while those onto the interneuron displayed marked activity-dependent facilitation. The results suggest that each target could have a unique response to the same presynaptic signal. The information contained within the spike train therefore appears to be fragmented and re-integrated into the network at specific locations. The degree to which the specific fragment extracted by each synapse will influence the spiking activity of the neuron depends on the ongoing integration of input from other presynaptic neurons. It is therefore proposed that differential synaptic transmission enables the neocortex to encode and decode the information contained within spike trains in an associative manner.
-
(1997) Artificial Neural Networks ICANN'97. Germond A., Nicoud J-D, Hasler M. & Gerstner W.(eds.). p. 121-126 Abstract
The timing between individual pre- and post-synaptic action potentials is known to play a crucial role in the modification of the synaptic efficacy during activity. Based on stimulation protocols of two synaptically connected neurons, we infer an algorithm which explains the data by modifying the probability of neurotransmitter discharge as a function of the pre- and postsynaptic spike delays. The characteristic feature of this algorithm is its asymmetry with respect to the delays: if the postsynaptic spike arrives after the presynaptic spike, the probability of discharge is up-regulated, while it is down-regulated if the postsynaptic spike arrives before the presynaptic spike. The algorithm makes it possible to predict stimulation protocols that induce maximal up- and down-regulation of the discharge probability.
1996
-
(1996) Journal Of Physiology-Paris. 90, 3-4, p. 229-232 Abstract
Changing the reliability of neurotransmitter release results in a change in the efficacy of low-frequency synaptic transmission and in the rate of high-frequency synaptic depression; it therefore cannot cause a uniform change in the strength of synapses, and instead results in a change in the dynamics of synaptic transmission referred to as 'redistribution of synaptic efficacy' (RSE). Since the change in synaptic transmission associated with RSE depends on the history of action potential activity, it is concluded that RSE serves as a mechanism to generate a potentially infinite diversity of synaptic input.
-
(1996) Hippocampus. 6, 3, p. 271-280 Abstract
O'Keefe and Recce ([1993] Hippocampus 3:317-330) have observed that the spatially selective firing of pyramidal cells in the CA1 field of the rat hippocampus tends to advance to earlier phases of the electroencephalogram theta rhythm as a rat passes through the place field of a cell. We present here a neural network model based on integrate-and-fire neurons that accounts for this effect. In this model, place selectivity in the hippocampus is a consequence of synaptic interactions between pyramidal neurons together with weakly selective external input. The phase shift of neuronal spiking arises in the model as a result of asymmetric spread of activation through the network, caused by asymmetry in the synaptic interactions. Several experimentally observed properties of the phase shift effect follow naturally from the model, including 1) the observation that the first spikes a cell fires appear near the theta phase corresponding to minimal population activity, 2) the overall advance is less than 360 degrees, and 3) the location of the rat within the place field of the cell is the primary correlate of the firing phase, not the time the rat has been in the field. The model makes several predictions concerning the emergence of place fields during the earliest stages of exploration in a novel environment. It also suggests new experiments that could provide further constraints on a possible explanation of the phase precession effect.
-
(1996) Artificial Neural Networks ICANN 96. Sendhoff B., von der Malsburg C., Vorbrüggen J. C. & von Seelen W.(eds.). p. 445-450 Abstract
The transmission across neocortical synapses changes dynamically as a function of presynaptic activity [1]. A switch in the manner in which a complex signal, such as a burst of presynaptic action potentials, is transmitted between two neocortical layer 5 pyramidal neurons was observed after coactivation of both neurons. The switch involved a redistribution of synaptic efficacy during the burst such that the synapses transmitted more effectively only the first action potential in the burst. A computational analysis reveals that this modification in dynamically changing transmission enables pyramidal neurons to extract a rich array of dynamic features of ongoing activity in networks of pyramidal neurons, such as the onset and amplitude of abrupt synchronized frequency transitions in groups of presynaptic neurons, the size of the group of neurons involved and the degree of synchrony. These synapses transmit information about dynamic features by causing transient increases of postsynaptic current which have characteristic amplitudes and durations. At the same time, the ability of synapses to signal the sustained level of presynaptic activity is limited to a narrow range of low frequencies, which becomes even narrower after synaptic modification.
-
(1996) Nature. 382, 6594, p. 807-810 Abstract
Experience-dependent potentiation and depression of synaptic strength has been proposed to subserve learning and memory by changing the gain of signals conveyed between neurons. Here we examine synaptic plasticity between individual neocortical layer-5 pyramidal neurons. We show that an increase in the synaptic response, induced by pairing action potential activity in pre- and postsynaptic neurons, was only observed when synaptic input occurred at low frequencies. This frequency-dependent increase in synaptic responses arises because of a redistribution of the available synaptic efficacy and not because of an increase in the efficacy. Redistribution of synaptic efficacy could represent a mechanism to change the content, rather than the gain, of signals conveyed between neurons.
1995
-
(1995) Network-Computation In Neural Systems. 6, 2, p. 111-124 Abstract
We have explored a network model of cortical microcircuits based on integrate-and-fire neurons in a regime where the reset following a spike is small, recurrent excitation is balanced by feedback inhibition, and the activity is highly irregular. This regime cannot be described by a mean-field theory based on average activity levels because essential features of the model depend on fluctuations from the average. We propose a new way of scaling the strength of synaptic interaction with the size of the network: rather than scale the amplitude of the synapse, we scale the neurotransmitter release probabilities with the number of inputs to keep the average input constant. This is consistent with the low transmitter release probability observed in a majority of hippocampal synapses. Another prominent feature of this regime is the ability of the network to switch rapidly between different states, as demonstrated in a model based on orientation columns in the mammalian visual cortex. Both network and intrinsic properties of neurons contribute to achieving the balance condition that allows rapid state switching.
-
(1995) International Journal of Neural Systems. Vol. 6. p. 81-86 Abstract
Many hippocampal pyramidal neurons in rats are selectively activated at specific places in the environment. We present a network model for the CA3 area of the hippocampus. The network produced place-selective activity even when the external sensory input was broadly tuned and noisy. The model predicts that the place fields should be nonuniformly distributed, clustering in the places where the synaptic interactions between neurons are strongest. This may occur at locations of special significance, such as locations where there has been food in the past.
1994
-
(1994) Journal of Neuroscience. 14, 11, p. 6435-6445 Abstract
Interpreting recent single-unit recordings of delay activities in delayed match-to-sample experiments in anterior ventral temporal (AVT) cortex of monkeys in terms of reverberation dynamics, we present a model neural network of quasi-realistic elements that reproduces the empirical results in great detail. Information about the contiguity of successive stimuli in the training sequence, representing the fact that training is done on a set of uncorrelated stimuli presented in a fixed temporal sequence, is embedded in the synaptic structure. The model reproduces quite accurately the correlations between delay activity distributions corresponding to stimulation with the uncorrelated stimuli used for training. It also reproduces the activity distributions of spike rates on sample cells as a function of the stimulating pattern. It is, in our view, the first time that a computational phenomenon, represented on the neurophysiological level, is reproduced in all its quantitative aspects. The model is then used to make predictions about further features of the physiology of such experiments. Those include further properties of the correlations, features of selective cells as discriminators of stimuli provoking different delay activity distributions, and activity distributions among the neurons in a delay activity produced by a given pattern. The model also has predictive implications for the dependence of the delay activities on different training protocols. Finally, we discuss the perspectives of the interplay between such models and neurophysiology as well as its limitations and possible extensions.
-
(1994) Neural Computation. 6, 4, p. 642-657 Abstract
We propose a model of coupled oscillators with noise that performs segmentation of stimuli using a set of stored images, each consisting of objects and a background. The oscillators' amplitudes encode the spatial and featural distribution of the external stimulus. The coherence of their phases signifies their belonging to the same object. In the learning stage, the couplings between phases are modified in a Hebb-like manner. By mean-field analysis and simulations, we show that an external stimulus whose local features resemble those of one or several of the stored objects generates a selective phase coherence that represents the stored pattern of segmentation.
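A minimal sketch of the mechanism: phase couplings are built Hebb-like from stored segmentation labels, the oscillator amplitudes encode the current stimulus, and noisy phase dynamics then synchronize only the units that belong to the stimulated object. The network size, the two stored "objects", the coupling constant, and the noise level are illustrative assumptions.

```python
# Kuramoto-like phase dynamics with Hebb-like couplings learned from stored segmentations.
import numpy as np

rng = np.random.default_rng(1)
N = 60
objects = [np.arange(0, 20), np.arange(20, 40)]   # two stored "objects"; the rest is background

# Hebb-like learning of phase couplings: +1 between units of the same stored object.
J = np.zeros((N, N))
for obj in objects:
    label = np.zeros(N)
    label[obj] = 1.0
    J += np.outer(label, label)
np.fill_diagonal(J, 0.0)

# Stimulus resembling the first stored object: its units get a large amplitude.
A = 0.1 * np.ones(N)
A[objects[0]] = 1.0

theta = rng.uniform(0.0, 2.0 * np.pi, N)
K, noise, dt = 4.0 / N, 0.3, 0.05
for _ in range(2000):
    coupling = (J * np.outer(A, A) * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * K * coupling + np.sqrt(dt) * noise * rng.standard_normal(N)

def coherence(idx):
    return np.abs(np.exp(1j * theta[idx]).mean())

print("phase coherence of the stimulated object  :", round(coherence(objects[0]), 2))
print("phase coherence of the unstimulated object:", round(coherence(objects[1]), 2))
```

After the dynamics settle, the units of the stimulated object are strongly phase coherent while the remaining units stay incoherent, i.e. the phases express the stored segmentation selected by the stimulus.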
-
(1994) Journal of Physics A: Mathematical and General. 27, 3, p. 741-756 Abstract
We analyse the extensive loading of the neural network model proposed to describe neurophysiological experiments in which correlated attractors associated with uncorrelated patterns are found. The phase diagram is obtained and discussed. Some generalizations of the original model are also considered. In all cases we demonstrate the existence of a region in the phase diagram with correlated attractors. Results from numerical simulations, which confirm the mean-field theory results, are also presented.
1993
-
(1993) Physical Review Letters. 71, 8, p. 1280-1283 Abstract
Systems of globally coupled oscillators often display states of full synchrony in which all oscillators are phase locked. It is shown that for globally coupled oscillators with neuronlike pulse interactions, the phase-locked state is unstable to inhomogeneity in the local frequency. For weak inhomogeneity the system breaks into two subpopulations: one that is phase locked and another one that consists of aperiodic oscillators. The fraction of the unlocked population remains finite in the limit of vanishing inhomogeneity.
-
(1993) Neural Computation. 5, 1, p. 1-17 Abstract
It is shown that a simple modification of synaptic structures (of the Hopfield type) constructed to produce autoassociative attractors produces neural networks whose attractors are correlated with several (learned) patterns used in the construction of the matrix. The modification stores in the matrix a fixed sequence of uncorrelated patterns. The network then has correlated attractors, provoked by the uncorrelated stimuli. Thus, the network converts the temporal order (or temporal correlation) expressed by the sequence of patterns into spatial correlations expressed in the distributions of neural activities in attractors. The model captures phenomena observed in single-electrode recordings in performing monkeys by Miyashita et al. The correspondence is close enough to reproduce the fact that, given uncorrelated patterns as sequentially learned stimuli, the attractors produced are significantly correlated up to a separation of 5 (five) in the sequence. This number 5 is universal in a range of parameters and requires essentially no tuning. We then discuss learning scenarios that could lead to this synaptic structure as well as experimental predictions following from it. Finally, we speculate on the cognitive utility of such an arrangement.
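The construction can be sketched directly: a Hopfield-type matrix built from a fixed sequence of random patterns, with an extra symmetric term of strength a that links each pattern to its successor. The network size, the number of patterns, and the value of a below are illustrative choices (the correlated-attractor regime requires a in an intermediate range), not the parameters used in the paper.

```python
# Sketch of a synaptic matrix storing a fixed sequence of uncorrelated patterns,
# and of the correlated attractor reached when one pattern is presented.
import numpy as np

rng = np.random.default_rng(3)
N, P, a = 2000, 30, 0.7
xi = rng.choice([-1.0, 1.0], size=(P, N))          # uncorrelated patterns, learned as a sequence

# Autoassociative term plus a symmetric term linking consecutive patterns.
J = (xi.T @ xi + a * (xi[:-1].T @ xi[1:] + xi[1:].T @ xi[:-1])) / N
np.fill_diagonal(J, 0.0)

def attractor(start, max_iter=100):
    S = xi[start].copy()
    for _ in range(max_iter):                      # synchronous updates until a fixed point
        S_new = np.sign(J @ S)
        S_new[S_new == 0] = 1.0
        if np.array_equal(S_new, S):
            break
        S = S_new
    return S

S = attractor(P // 2)                              # stimulate with the pattern in the middle
overlaps = xi @ S / N                              # correlation of the attractor with every pattern
for d in range(8):
    print(f"separation {d}: overlap {overlaps[P // 2 + d]:+.2f}")
```

The state reached from the stimulated pattern overlaps not only with that pattern but also, with decreasing strength, with its neighbours in the learned sequence over several steps of separation; the exact overlap values depend on a and on finite-size effects.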
-
(1993) ICANN 93. Kappen B. & Gielen S. (eds.). p. 622-627 Abstract
It is shown that a network of globally coupled integrate-and-fire neurons with pulse interaction possesses a variety of dynamical states with different patterns of synchronization. In the case of homogeneous external input, the network falls into the state of full synchrony in which all oscillators are phase-locked. In a network of excitatory neurons, this state is unstable to weak inhomogeneity: the system breaks into two subpopulations, one that is phase-locked and another that consists of aperiodic oscillators, the overall network activity being a periodic function of time. In the limit of vanishing inhomogeneity, the fraction of the unlocked population remains finite. Increasing the inhomogeneity quickly drives the network into the incoherent state. Adding a population of inhibitory neurons stabilizes the synchronization substantially and extends the dynamical variability of the system. Depending on the values of the parameters, the system can display periodic activity with several subpopulations, or synchronized aperiodic activity.
1992
-
(1992) Network: Computation in Neural Systems. 3, 2, p. 121-137 Abstract
Single-neuron spike dynamics is reconsidered in a situation in which the neural afferent spike input, originating from non-specific spontaneous activity, is very large compared with the input produced by specific (task-related) operation of a cortical module. This, the authors argue, is the situation prevailing in associative cortex. It is shown that the Frolov-Cowan 'point approximation' can be derived systematically in this case, even in the presence of shunting inhibition. The same type of logic is then applied to the cable theory equation for the neuron. Here too, under a low ratio of signal to spontaneous activity in the input, the dynamics linearizes, leading to an integrate-and-fire behaviour for the effective neuron. This element sums its synaptic inputs linearly. Its parameters are the resting parameters of the bare neuron, renormalized by the heavy barrage of impinging spontaneous activity. The only remnant of the geometric structure of the dendritic tree is an effective weakening of the postsynaptic potential, due to the spatial decay of the spike influence travelling from the synapse to the spike-emitting part of the membrane, and a time delay for the arrival of the peak of the spike influence. This description can then be re-expressed in terms of rates. The role of the low rates of the selectively spiking neurons is found to be essential at many stages in the argument.
-
(1992) International Journal of Neural Systems. 03, Supp 01, p. 51-56 Abstract
We propose a model of coupled oscillators with noise that performs binding and segmentation of objects using a set of stored images, each consisting of figures and a background. The amplitudes of the oscillators encode the spatial and featural distribution of the external stimulus. In the learning stage the couplings between the phases are modified in a Hebb-like manner. By mean-field analysis and simulations we show that an external stimulus whose local features resemble those of one or several of the stored figures causes a selective phase coherence that retrieves the stored pattern of segmentation.
1991
-
(1991) Europhysics Letters. 14, 8, p. 727-732 Abstract
We apply the theory of chaotic regimes for asymmetric networks to the case of highly diluted neural networks with Hebb learning rule. We find the critical capacity and the transition point to chaos.
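A crude numerical illustration of the diluted setting: each neuron receives a small number C of randomly chosen inputs with Hebbian weights, and retrieval is probed at different loadings α = P/C. The sizes, the value of C, and the loadings are illustrative assumptions; the sketch only probes the storage-capacity side (the classical mean-field result for extremely diluted Hebbian networks puts the critical capacity near 2/π) and does not examine the chaotic dynamics itself.

```python
# Retrieval in a highly diluted, asymmetric Hebbian network at different loadings.
import numpy as np

rng = np.random.default_rng(7)
N, C = 3000, 20                     # each neuron receives C randomly chosen inputs, C << N

def retrieval_overlap(alpha, sweeps=20):
    P = max(1, int(round(alpha * C)))
    xi = rng.choice([-1.0, 1.0], size=(P, N))
    pre = rng.integers(0, N, size=(N, C))            # presynaptic partners (asymmetric dilution;
                                                     # occasional self-connections are ignored)
    W = np.einsum('mi,mik->ik', xi, xi[:, pre])      # Hebbian weights on the existing connections
    S = xi[0].copy()                                 # start from the first stored pattern
    for _ in range(sweeps):                          # parallel dynamics
        h = (W * S[pre]).sum(axis=1)
        S = np.where(h >= 0, 1.0, -1.0)
    return float(np.mean(S * xi[0]))

for alpha in (0.3, 0.5, 0.8, 1.2):
    print(f"alpha = P/C = {alpha:.1f}: retrieval overlap {retrieval_overlap(alpha):+.2f}")
```

With these sizes the overlap stays high for loadings well below the critical value and collapses above it, although the finite C and N blur the sharp transition of the mean-field limit.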
-
(1991) Network: Computation in Neural Systems. 2, 3, p. 259-273 Abstract
We discuss the conversion of the description of the dynamics of a neural network from a temporal variation of synaptic currents driven by point spikes and modulated by a synaptic structure to a description of the current dynamics driven by spike rates. The conditions for the validity of such a conversion are discussed in detail and are shown to be quite realistic in cortical conditions. This is done in preparation for a discussion of a scenario of an attractor neural network, based on the interaction of synaptic currents and neural spike rates. The spike rates are then expressed in terms of the currents themselves to provide a closed set of dynamical equations for the currents. The current-rate relation is expressed as a neuronal gain function, converting currents into spike rates. It describes an integrate-and-fire element with noisy inputs, under explicit quantitative conditions which we argue to be plausible in a cortical situation. In particular, it is shown that the gain of the current-to-rate transduction function, deduced from realistic parameters, does not exclude the possibility of a stable operation of the prospective ANN at low spike rates. The actual integration into an associative memory network is left for the consecutive article.
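The central object here, the gain function of an integrate-and-fire element with noisy input, can be estimated by direct simulation. The sketch below uses a leaky integrate-and-fire neuron driven by a constant mean current plus Gaussian white noise; the membrane time constant, the noise level, and the current values are illustrative assumptions rather than the realistic cortical parameters discussed in the paper.

```python
# Estimating the current-to-rate transduction function of a noisy leaky
# integrate-and-fire neuron by direct simulation.
import numpy as np

rng = np.random.default_rng(4)
tau, vth, vr, dt, T = 0.02, 1.0, 0.0, 1e-4, 20.0   # 20 ms membrane time constant, 20 s per point

def rate(mu, sigma):
    """Firing rate (spikes/s) for a mean input current mu plus white noise of strength sigma."""
    n_steps = int(T / dt)
    noise = sigma * np.sqrt(dt / tau) * rng.standard_normal(n_steps)
    v, spikes = vr, 0
    for step in range(n_steps):
        v += dt * (mu - v) / tau + noise[step]
        if v >= vth:
            v, spikes = vr, spikes + 1
    return spikes / T

for mu in (0.6, 0.8, 1.0, 1.2):
    print(f"mean input {mu:.1f} (threshold 1.0): {rate(mu, sigma=0.4):5.1f} spikes/s")
```

For subthreshold mean currents the firing is noise-driven and the rate is low, rising smoothly as the mean current approaches and crosses threshold; it is this smooth, shallow low-rate part of the transduction function that makes stable operation at low spike rates conceivable.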
-
(1991) Network: Computation in Neural Systems. 2, 3, p. 275-294 Abstract
A network of current-rate dynamics with a symmetric synaptic matrix is analysed and simulated for its low-rate attractor structure. The dynamics is deterministic, with the noise included in a realistic current-rate transduction function (discussed in part I). The analysis is carried out in mean-field theory. It is shown that at low loading the network retrieves without errors, with uniform low rates, that there are no simple spurious states, and that the low-rate attractors, retrieving single patterns, are stable to the admixture of additional patterns. The analysis of the attractors in a network with an extensive number of patterns is carried out in the replica-symmetric approximation. The results for the dependence of the retrieval rates on the loading level, for the distribution of rates among neurons, as well as for the storage capacity, are compared with simulations. Simulations also show that retrieval performance is very robust to random elimination of synapses. Moreover, errors in the stimulus, relative to the stored patterns, are very rapidly corrected. It is shown that memory saturation expresses itself either in a drift toward a quiescent state, or in the freezing of rates in the two classes of neurons in a pattern, the foreground and the background. Freezing means that in each class a subset of neurons are active at high rates and the others are quiescent. The two activity distributions become indistinguishable.
-
A simulation model of autoassociative memory in the form of a neural network with a low level of activity. (1991) Biofizika. 36, 2, p. 339-343 Abstract
The information parameters of a neural network performing the function of autoassociative memory were investigated by simulation modelling. A fully connected network with graded Hebbian synapses was studied. Its information capacity was shown to increase significantly with a decrease in the activity level of the stored patterns. This agrees well with the analytical result obtained earlier by the replica method.
1990
-
(1990) Modern Physics Letters B. 4, 11, p. 713-716 Abstract
A simple learning algorithm for a neural network with binary synapses, which takes one step to store one pattern, is considered. The resulting model turns out to be palimpsestic, and the number of patterns that can be effectively retrieved scales as L ∼ N^(1/2).
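The palimpsest property, with recent patterns overwriting older ones, can be illustrated with a generic binary-synapse network in which each new pattern switches mismatched synapses with some probability. This stochastic switching rule and all parameter values are illustrative stand-ins and may differ from the paper's one-step algorithm.

```python
# A generic binary-synapse palimpsest: sequential one-shot storage followed by
# retrieval of patterns of different ages.
import numpy as np

rng = np.random.default_rng(5)
N, P, q = 1000, 80, 0.1            # q: probability of switching a mismatched binary synapse

J = rng.choice([-1, 1], size=(N, N)).astype(np.int8)
patterns = rng.choice([-1, 1], size=(P, N)).astype(np.int8)

for xi in patterns:                # sequential storage: new patterns overwrite old ones
    target = np.outer(xi, xi)
    flip = (J != target) & (rng.random((N, N)) < q)
    J[flip] = target[flip]
np.fill_diagonal(J, 0)

def retrieval_overlap(xi, sweeps=10):
    S = xi.astype(np.float64)
    for _ in range(sweeps):        # let the network settle starting from the stored pattern
        S = np.sign(J @ S)
        S[S == 0] = 1.0
    return float(np.mean(S * xi))

for age in (1, 2, 4, 8, 16, 32, 64):
    m = retrieval_overlap(patterns[P - age])
    print(f"pattern stored {age:2d} patterns ago: overlap {m:+.2f}")
```

Recently stored patterns are recovered almost perfectly while patterns stored further in the past are lost, which is the palimpsest behaviour; the L ∼ N^(1/2) scaling of the memory span is the analytical result of the paper and is not verified by this small example.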
-
(1990) Modern Physics Letters B. 4, 4, p. 259-265 Abstract
A Hopfield-like neural network that can store hierarchically correlated patterns with a low level of activity is studied. Three learning rules are proposed that enable nearly optimal storage capacity to be obtained. These learning rules differ in their degree of biological relevance and in the restrictions they place upon the structure of the hierarchical tree. By varying the value of the neural threshold, it is possible to climb up and down the hierarchical tree.
1989
-
(1989) Modern Physics Letters B. 3, 7, p. 555-560 Abstract
We consider the Hopfield model with the most simple form of the Hebbian learning rule, when only simultaneous activity of pre- and post-synaptic neurons leads to modification of the synapse. An extra inhibition proportional to the full network activity is needed. Both symmetric non-diluted and asymmetric diluted networks are considered. The model performs well at extremely low levels of activity, p ≪ 1.
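A minimal sketch of such a low-activity network: 0/1 patterns with a small fraction p of active units, a purely coincidence-based Hebbian matrix, and a uniform inhibition proportional to the instantaneous network activity. The network size, sparseness, loading, and inhibition strength w are illustrative choices.

```python
# Sparse 0/1 Hopfield-like network with a coincidence Hebbian rule and
# global inhibition proportional to the current network activity.
import numpy as np

rng = np.random.default_rng(6)
N, P, p, w = 1000, 30, 0.05, 0.4

eta = (rng.random((P, N)) < p).astype(np.float64)   # sparse 0/1 patterns
J = eta.T @ eta / N                                 # modified only by coincident activity
np.fill_diagonal(J, 0.0)

# Degraded cue: pattern 0 with a third of its active units silenced and some spurious units added.
target = eta[0]
V = target.copy()
active = np.flatnonzero(target)
V[rng.choice(active, size=len(active) // 3, replace=False)] = 0.0
V[rng.choice(N, size=len(active) // 3, replace=False)] = 1.0

for _ in range(5):                                  # parallel dynamics with activity-dependent inhibition
    h = J @ V - w * V.mean()
    V = (h > 0).astype(np.float64)

recovered = (V * target).sum() / target.sum()
spurious = (V * (1 - target)).sum()
print(f"fraction of the pattern's units recovered: {recovered:.2f}, spurious active units: {spurious:.0f}")
```

Starting from the degraded cue, the activity-proportional inhibition keeps the overall activity near its sparse target level while the recurrent input recruits the missing units of the stored pattern and switches off the spurious ones.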
1988
-
(1988) Europhysics Letters. 7, 3, p. 203-208 Abstract
We extend the analysis of asymmetric diluted networks to the case of a low activity level. The same learning algorithm which was used for the symmetric model turns out to be successful. The use of “V-variables” (V = 0, 1) leads to a significant enhancement of the storage capacity. The overloading phase transition is found to be of first order, which means good retrieval quality in all associative memory phases. The intensity of time-dependent nonthermal noise can be diminished considerably by an appropriate choice of the neural threshold. A certain “universality” of the performance of networks with a low activity level can be noted.
-
(1988) Europhysics Letters. 6, 2, p. 101-105 Abstract
The modified Hopfield model defined in terms of “V-variables” (V = 0, 1), which is appropriate for the storage of correlated patterns, is considered. A learning algorithm is proposed that significantly enhances the storage capacity in comparison with previous estimates. At low levels of neural activity, p ≪ 1, we obtain α_c(p) ∼ (p|ln p|)^(-1), which resembles Gardner's estimate for the maximum storage capacity.
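As a quick illustration of what this scaling form implies, the nominal capacity per synapse grows rapidly as the patterns become sparser; the prefactor of order one is not fixed by the scaling law and is simply omitted in the evaluation below.

```python
# Evaluating the scaling form alpha_c(p) ~ (p |ln p|)^(-1), without its order-one prefactor.
import numpy as np

for p in (0.1, 0.01, 0.001):
    print(f"p = {p:5.3f}: alpha_c ~ {1.0 / (p * abs(np.log(p))):7.0f} (up to a prefactor of order one)")
```

For comparison, the standard Hopfield capacity for dense (p = 0.5) patterns is of order 0.14 patterns per neuron.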
1987
-
(1987) Journal of Experimental and Theoretical Physics. 65, 1, p. 124-127 Abstract
The dynamics of amorphous magnets with strong random anisotropy is considered in the ferromagnetic-correlation region τ = (T − T_c)/T_1 ≫ τ_c, where T_1 = (c/3)∫J(r)d³r and τ_c ≈ κ³/2c [κ⁻¹ is the interaction radius of J(r) and c is the density of the magnetic ions]. It is assumed that cκ⁻³ ≫ 1. It is shown by the dynamic-functional method that in the principal approximation the dynamic susceptibility takes the form G⁻¹(k,ω) ∝ (iω/Γ₀) + τ + k²/κ², which is typical of purely dissipative dynamics. The corrections to G(k,ω) necessitated by the application of an external magnetic field are calculated. There is no ferromagnetic resonance.
1986
-
(1986) Journal of Experimental and Theoretical Physics. 64, 3, p. 562-569 Abstract
A study is made of amorphous magnets having a strong random anisotropy. On the assumption of a long-range interaction it is shown that the correlations in such a system at high temperatures are ferromagnetic in nature, and the maximum correlation length is large compared to the range of the interaction. The phase transition to the disordered phase turns out to be equivalent to the phase transition in an Ising spin glass. The ordinary and nonlinear susceptibilities are calculated for different temperature regions above the transition point.