Toward reconstructing spike trains from large-scale calcium imaging data

HFSP Journal ◽  
2010 ◽  
Vol 4 (1) ◽  
pp. 1-5 ◽  
Author(s):  
Alex C. Kwan


2017 ◽  
Author(s):  
Stephanie Reynolds ◽  
Therese Abrahamsson ◽  
P. Jesper Sjöström ◽  
Simon R. Schultz ◽  
Pier Luigi Dragotti

Abstract: In recent years, the development of algorithms to detect neuronal spiking activity from two-photon calcium imaging data has received much attention. Meanwhile, few researchers have examined the metrics used to assess the similarity of detected spike trains with the ground truth. We highlight the limitations of the two most commonly used metrics, the spike train correlation and success rate, and propose an alternative, which we refer to as CosMIC. Rather than operating on the true and estimated spike trains directly, the proposed metric assesses the similarity of the pulse trains obtained from convolution of the spike trains with a smoothing pulse. The pulse width, which is derived from the statistics of the imaging data, reflects the temporal tolerance of the metric. The final metric score is the size of the commonalities of the pulse trains as a fraction of their average size. Viewed through the lens of set theory, CosMIC resembles a continuous Sørensen-Dice coefficient — an index commonly used to assess the similarity of discrete, presence/absence data. We demonstrate the ability of the proposed metric to discriminate the precision and recall of spike train estimates. Unlike the spike train correlation, which appears to reward overestimation, the proposed metric score is maximised when the correct number of spikes have been detected. Furthermore, we show that CosMIC is more sensitive to the temporal precision of estimates than the success rate.
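The smooth-then-compare computation that the abstract describes can be sketched in a few lines. This is a hypothetical re-implementation for illustration only, not the authors' reference code: the function name, the triangular pulse shape, and the representation of spikes as sample indices are all assumptions.

```python
import numpy as np

def cosmic_like_score(true_spikes, est_spikes, n_samples, pulse_width):
    """Continuous Sorensen-Dice-style score between two spike trains.

    Each train is convolved with a triangular smoothing pulse whose
    width sets the temporal tolerance; the score is the size of the
    pulse trains' commonalities as a fraction of their average size.
    """
    half = pulse_width // 2
    pulse = np.concatenate([np.arange(1, half + 2), np.arange(half, 0, -1)])
    pulse = pulse / pulse.max()  # normalize to a peak of 1

    def pulse_train(spikes):
        train = np.zeros(n_samples)
        train[np.asarray(spikes, dtype=int)] = 1.0
        return np.convolve(train, pulse, mode="same")

    x, y = pulse_train(true_spikes), pulse_train(est_spikes)
    # Pointwise minimum plays the role of the intersection in the
    # discrete Dice coefficient 2|A ∩ B| / (|A| + |B|).
    return 2 * np.minimum(x, y).sum() / (x.sum() + y.sum())
```

With this construction, identical spike trains score 1 and trains whose pulses never overlap score 0, matching the behaviour the abstract describes.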


2018 ◽  
Vol 30 (10) ◽  
pp. 2726-2756 ◽  
Author(s):  
Stephanie Reynolds ◽  
Therese Abrahamsson ◽  
Per Jesper Sjöström ◽  
Simon R. Schultz ◽  
Pier Luigi Dragotti



2020 ◽  
Author(s):  
Darian Hadjiabadi ◽  
Matthew Lovett-Barron ◽  
Ivan Raikov ◽  
Fraser Sparks ◽  
Zhenrui Liao ◽  
...  

Abstract: Neurological and psychiatric disorders are associated with pathological neural dynamics. The fundamental connectivity patterns of cell-cell communication networks that enable pathological dynamics to emerge remain unknown. We studied epileptic circuits using a newly developed integrated computational pipeline applied to cellular resolution functional imaging data. Control and preseizure neural dynamics in larval zebrafish and in chronically epileptic mice were captured using large-scale cellular-resolution calcium imaging. Biologically constrained effective connectivity modeling extracted the underlying cell-cell communication network. Novel analysis of the higher-order network structure revealed the existence of ‘superhub’ cells that are unusually richly connected to the rest of the network through feedforward motifs. Instability in epileptic networks was causally linked to superhubs whose involvement in feedforward motifs critically enhanced downstream excitation. Disconnecting individual superhubs was significantly more effective in stabilizing epileptic networks compared to disconnecting hub cells defined traditionally by connection count. Collectively, these results predict a new, maximally selective and minimally invasive cellular target for seizure control.

Highlights:
- Higher-order connectivity patterns of large-scale neuronal communication networks were studied in zebrafish and mice
- Control and epileptic networks were modeled from in vivo cellular resolution calcium imaging data
- Rare ‘superhub’ cells unusually richly connected to the rest of the network through higher-order feedforward motifs were identified
- Disconnecting single superhub neurons more effectively stabilized epileptic networks than targeting conventional hub cells defined by high connection count
- These data predict a maximally selective novel single cell target for minimally invasive seizure control
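The feedforward motif (a→b, a→c, b→c) at the heart of the superhub analysis can be illustrated with a toy count on a directed edge list. This is a sketch only: the paper's pipeline first fits effective connectivity from imaging data before any motif analysis, and `feedforward_motif_counts` is a hypothetical helper that simply counts, for each node, the motifs in which it acts as the source.

```python
def feedforward_motif_counts(edges):
    """For each node a, count feedforward triangles a->b, b->c, a->c
    in which a is the source. edges is an iterable of (a, b) pairs."""
    succ = {}
    for a, b in edges:
        succ.setdefault(a, set()).add(b)
        succ.setdefault(b, set())
    counts = {n: 0 for n in succ}
    for a in succ:
        for b in succ[a]:
            # any c reached both directly (a->c) and via b (b->c)
            counts[a] += len((succ[a] & succ.get(b, set())) - {a, b})
    return counts
```

A node that sits at the apex of many such triangles would, in this toy picture, be the kind of richly feedforward-connected cell the abstract calls a superhub.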


2019 ◽  
Author(s):  
Shreya Saxena ◽  
Ian Kinsella ◽  
Simon Musall ◽  
Sharon H. Kim ◽  
Jozsef Meszaros ◽  
...  

Widefield calcium imaging enables recording of large-scale neural activity across the mouse dorsal cortex. In order to examine the relationship of these neural signals to the resulting behavior, it is critical to demix the recordings into meaningful spatial and temporal components that can be mapped onto well-defined brain regions. However, no current tools satisfactorily extract the activity of the different brain regions in individual mice in a data-driven manner, while taking into account mouse-specific and preparation-specific differences. Here, we introduce Localized semi-Nonnegative Matrix Factorization (LocaNMF), a method that efficiently decomposes widefield video data and allows us to directly compare activity across multiple mice by outputting mouse-specific localized functional regions that are significantly more interpretable than those produced by more traditional decomposition techniques. Moreover, it provides a natural subspace to directly compare correlation maps and neural dynamics across different behaviors, mice, and experimental conditions, and enables identification of task- and movement-related brain regions.
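The core idea of a localized factorization can be sketched as an ordinary NMF whose spatial components are confined to predefined region masks. This is loosely in the spirit of LocaNMF but is not the authors' algorithm (which also allows semi-nonnegative temporal components and seeds regions from a brain atlas); `masked_nmf` and its multiplicative-update scheme are illustrative assumptions.

```python
import numpy as np

def masked_nmf(V, masks, n_iter=200, eps=1e-9):
    """Rank-k NMF of video data V (pixels x time) in which each
    spatial component is confined to a binary region mask.

    masks: list of k binary arrays, one per region, length = n_pixels.
    Returns spatial components W (pixels x k) and temporal H (k x time).
    """
    rng = np.random.default_rng(0)
    M = np.stack(masks, axis=1)               # pixels x k locality masks
    W = rng.random(M.shape) * M               # spatial, zero outside masks
    H = rng.random((M.shape[1], V.shape[1]))  # temporal
    for _ in range(n_iter):
        # standard Lee-Seung multiplicative updates
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ (H @ H.T) + eps)
        W *= M                                # re-impose locality each step
    return W, H
```

Because the updates are multiplicative, pixels zeroed out by a mask stay exactly zero, so each recovered component remains localized to its region by construction.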


2020 ◽  
Vol 39 (4) ◽  
pp. 1094-1103
Author(s):  
Younes Farouj ◽  
Fikret Isik Karahanoglu ◽  
Dimitri Van De Ville

2017 ◽  
Author(s):  
Andrea Giovannucci ◽  
Johannes Friedrich ◽  
Matt Kaufman ◽  
Anne Churchland ◽  
Dmitri Chklovskii ◽  
...  

Abstract: Optical imaging methods using calcium indicators are critical for monitoring the activity of large neuronal populations in vivo. Imaging experiments typically generate a large amount of data that needs to be processed to extract the activity of the imaged neuronal sources. While deriving such processing algorithms is an active area of research, most existing methods require the processing of large amounts of data at a time, rendering them vulnerable to the volume of the recorded data, and preventing real-time experimental interrogation. Here we introduce OnACID, an Online framework for the Analysis of streaming Calcium Imaging Data, including i) motion artifact correction, ii) neuronal source extraction, and iii) activity denoising and deconvolution. Our approach combines and extends previous work on online dictionary learning and calcium imaging data analysis, to deliver an automated pipeline that can discover and track the activity of hundreds of cells in real time, thereby enabling new types of closed-loop experiments. We apply our algorithm to two large-scale experimental datasets, benchmark its performance on manually annotated data, and show that it outperforms a popular offline approach.
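The streaming character of step iii) can be illustrated with a deliberately crude per-frame deconvolution: under an AR(1) decay model c_t = γ·c_{t-1} + s_t, the innovation y_t − γ·y_{t-1} estimates the spike input from one new sample at a time. This toy stand-in is not the algorithm from the paper; the generator name, the fixed γ, and the hard threshold are all assumptions.

```python
def online_deconvolve(trace_stream, gamma=0.95, threshold=0.1):
    """Streaming deconvolution of a single calcium fluorescence trace.

    Consumes one fluorescence sample per frame and yields a spike
    estimate immediately, never holding more than one past sample.
    """
    prev = 0.0
    for y in trace_stream:          # one sample arrives per frame
        s = y - gamma * prev        # AR(1) innovation
        prev = y
        yield s if s > threshold else 0.0
```

The point of the sketch is the interface, not the estimator: each frame is processed as it arrives, which is what makes closed-loop use possible.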


Neuron ◽  
2009 ◽  
Vol 63 (6) ◽  
pp. 747-760 ◽  
Author(s):  
Eran A. Mukamel ◽  
Axel Nimmerjahn ◽  
Mark J. Schnitzer

2020 ◽  
Author(s):  
Ashwini G. Naik ◽  
Robert V. Kenyon ◽  
Aynaz Taheri ◽  
Tanya Berger-Wolf ◽  
Baher Ibrahim ◽  
...  

Abstract:
Background: Understanding functional correlations between the activities of neuron populations is vital for the analysis of neuronal networks. Analyzing large-scale neuroimaging data obtained from hundreds of neurons simultaneously poses significant visualization challenges. We developed V-NeuroStack, a novel network visualization tool to visualize data obtained using calcium imaging of spontaneous activity of cortical neurons in a mouse brain slice.
New Method: V-NeuroStack creates 3D time stacks by stacking 2D time frames for a period of 600 seconds. It provides a web interface that enables exploration and analysis of data using a combination of 3D and 2D visualization techniques.
Comparison with Existing Methods: Previous attempts to analyze such data have been limited by the tools available to visualize large numbers of correlated activity traces. V-NeuroStack can scale to data sets with at least a few thousand temporal snapshots.
Results: V-NeuroStack’s 3D view is used to explore patterns in the dynamic large-scale correlations between neurons over time. The 2D view is used to examine any timestep of interest in greater detail. Furthermore, a dual-line graph provides the ability to explore the raw and first-derivative values of a single neuron or a functional cluster of neurons.
Conclusions: V-NeuroStack enables easy exploration and analysis of large spatio-temporal datasets via a web interface, using two visualization paradigms: (a) a space-time cube and (b) two-dimensional networks. It will support future advancements in in vitro and in vivo data capturing techniques and can bring forth novel hypotheses by permitting unambiguous visualization of large-scale patterns in neuronal activity data.


2018 ◽  
Author(s):  
Tsubasa Ito ◽  
Keisuke Ota ◽  
Kanako Ueno ◽  
Yasuhiro Oisi ◽  
Chie Matsubara ◽  
...  

Abstract: The rapid progress of calcium imaging has reached a point where the activity of tens of thousands of cells can be recorded simultaneously. However, the huge amount of data in such records makes it difficult to carry out cell detection manually. Consequently, because cell detection is the first step of multicellular data analysis, there is a pressing need for automatic cell detection methods for large-scale image data. Automatic cell detection algorithms have been pioneered by a handful of research groups. Such algorithms, however, assume a conventional field of view (FOV) (i.e. 512 × 512 pixels) and need significantly higher computational power for a wider FOV to work within a practical period of time. To overcome this issue, we propose a method called low computational-cost cell detection (LCCD), which can complete its processing even on the latest ultra-large FOV data within a practical period of time. We compared it with two previously proposed methods, constrained non-negative matrix factorization (CNMF) and Suite2P. We found that LCCD makes it possible to detect cells from a huge amount of high-density imaging data within a shorter period of time and with an accuracy comparable to or better than those of CNMF and Suite2P.


GigaScience ◽  
2020 ◽  
Vol 9 (12) ◽  
Author(s):  
Ariel Rokem ◽  
Kendrick Kay

Abstract:
Background: Ridge regression is a regularization technique that penalizes the L2-norm of the coefficients in linear regression. One of the challenges of using ridge regression is the need to set a hyperparameter (α) that controls the amount of regularization. Cross-validation is typically used to select the best α from a set of candidates. However, efficient and appropriate selection of α can be challenging. This becomes prohibitive when large amounts of data are analyzed. Because the selected α depends on the scale of the data and correlations across predictors, it is also not straightforwardly interpretable.
Results: The present work addresses these challenges through a novel approach to ridge regression. We propose to reparameterize ridge regression in terms of the ratio γ between the L2-norms of the regularized and unregularized coefficients. We provide an algorithm that efficiently implements this approach, called fractional ridge regression, as well as open-source software implementations in Python and MATLAB (https://github.com/nrdg/fracridge). We show that the proposed method is fast and scalable for large-scale data problems. In brain imaging data, we demonstrate that this approach delivers results that are straightforward to interpret and compare across models and datasets.
Conclusion: Fractional ridge regression has several benefits: the solutions obtained for different values of γ are guaranteed to vary, guarding against wasted calculations, and they automatically span the relevant range of regularization, avoiding the need for arduous manual exploration. These properties make fractional ridge regression particularly suitable for analysis of large complex datasets.
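The reparameterization described above can be sketched directly: compute ridge solutions through the SVD of the design matrix and search for the α whose solution norm is the requested fraction γ of the OLS norm. This is a minimal illustration, not the fracridge package's algorithm, which handles many targets and fractions at once far more efficiently; `fractional_ridge` and the simple binary search are assumptions of this sketch.

```python
import numpy as np

def fractional_ridge(X, y, gamma, tol=1e-6):
    """Ridge regression specified by gamma = ||beta(alpha)|| / ||beta_OLS||.

    Uses the SVD X = U S V^T, for which the ridge solution is
    beta(alpha) = V diag(s / (s^2 + alpha)) U^T y, and the solution
    norm decreases monotonically in alpha, so a binary search finds
    the alpha matching the requested fraction.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    uty = U.T @ y

    def coef_norm(alpha):
        return np.linalg.norm(s * uty / (s ** 2 + alpha))

    target = gamma * coef_norm(0.0)          # fraction of the OLS norm
    lo, hi = 0.0, 1.0
    while coef_norm(hi) > target:            # grow hi until it brackets
        hi *= 2
    while hi - lo > tol * (1 + hi):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if coef_norm(mid) > target else (lo, mid)
    alpha = (lo + hi) / 2
    return Vt.T @ (s * uty / (s ** 2 + alpha)), alpha
```

Because γ is a scale-free fraction in [0, 1], a grid of γ values is directly comparable across models and datasets in a way that a grid of raw α values is not, which is the interpretability benefit the abstract highlights.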

