Emerging information-theoretic measures and methods in neuroscience

Workshop at CoSyNe 2007

Organizers:

Direct application of information-theoretic tools to laboratory measurements of stimulus-response relationships has resulted in a number of important insights. However, these approaches often require very large amounts of data, since the response space that must be sampled grows rapidly with the number of neurons analyzed; they are thus of limited practicality in vertebrate systems, especially the central nervous system. Moreover, there are sound theoretical reasons for using an information-theoretic approach even when the neurons under study do not behave "optimally."

In response to these issues, several research groups have, over the past few years, developed a second generation of information-theoretic tools. The goal of this workshop is to provide an in-depth snapshot of these investigations in some of their most exciting aspects, including:

  1. Notions of optimality of information representations in neurons
  2. Correlation and information measures of redundancy in populations of neurons, and the implications of limited data
  3. Refined methods for estimating mutual information from measured data, with a particular focus on populations of neurons (a minimal numerical sketch follows this list)
  4. Use of information-theoretic tools as a means to characterize the nature of the neural code, rather than the quantity of information carried
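
To make topic 3 concrete, here is a minimal numerical sketch of the core difficulty: the standard "plug-in" estimate of mutual information is biased upward when data are limited, and simple corrections such as the Miller-Madow term only partly repair this. This is an illustration of the general problem, not any speaker's method; the function names and the toy stimulus-response example are ours.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy in bits from an array of counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def miller_madow_entropy(counts):
    """Plug-in entropy plus the Miller-Madow bias correction.

    The plug-in entropy estimate is biased downward by roughly
    (m - 1) / (2 N ln 2) bits, where m is the number of occupied bins
    and N the number of samples.
    """
    N = counts.sum()
    m = np.count_nonzero(counts)
    return plugin_entropy(counts) + (m - 1) / (2 * N * np.log(2))

def plugin_mutual_information(joint_counts, correct_bias=False):
    """I(S;R) = H(S) + H(R) - H(S,R) from a stimulus-by-response count table."""
    H = miller_madow_entropy if correct_bias else plugin_entropy
    h_s = H(joint_counts.sum(axis=1))   # stimulus marginal
    h_r = H(joint_counts.sum(axis=0))   # response marginal
    h_sr = H(joint_counts)              # joint
    return h_s + h_r - h_sr

# Toy example: stimulus and response are independent, so the true mutual
# information is 0 bits, yet the raw plug-in estimate is clearly positive
# at small sample sizes and shrinks only as data accumulate.
rng = np.random.default_rng(0)
for n_samples in (50, 500, 5000):
    s = rng.integers(0, 4, size=n_samples)    # 4 stimulus classes
    r = rng.integers(0, 16, size=n_samples)   # 16 response patterns (e.g., 4 binary neurons)
    joint = np.zeros((4, 16))
    np.add.at(joint, (s, r), 1)
    print(n_samples,
          plugin_mutual_information(joint),
          plugin_mutual_information(joint, correct_bias=True))
```

Because the number of joint bins here (64) grows exponentially with population size, the bias worsens rapidly for multineuronal data, which is why refined estimators were a focus of the workshop.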

Speakers
Toby Berger (Cornell/U. Virginia)
Energy-efficient recursive estimation by variable-threshold neurons (Slides)
Michael Berry (Princeton University)
Correlated Neural Populations in the Retina (Slides)
Additional materials: see below.
Dmitri Chklovskii (Cold Spring Harbor)
Optimal Information Storage in Noisy Synapses under Resource Constraints (Related Paper)
Adrienne Fairhall (U. Washington)
Model evaluation using information
David Field (Cornell)
Measuring the information content and dimensionality of complex signals: An example of natural scenes and proximity distributions
Michael Gastpar (UC Berkeley)
Scaling Information Measures for Population Codes
William Levy (U. Virginia)
The interaction between timeliness and information in determining the energetic cost of the action potential of unmyelinated nerves
Liam Paninski (Columbia University)
Model-based methods for stimulus decoding, information estimation, and information-theoretic optimal stimulus design
Jonathan Pillow (University College London)
Neural characterization using an information-theoretic generalization of spike-triggered average and covariance analysis (Slides) (Related Paper)
Jonathon Shlens (Salk Institute)
Exploring the network structure of primate retina using maximum entropy methods
Naftali Tishby (The Hebrew University)
Optimal adaptation and predictive information
Jonathan Victor (Cornell)
Why it is difficult to calculate information, and why there are so many approaches (Slides)
Additional Materials and Publications
Michael Berry
  • R. Segev, J. Goodhouse, J. Puchalla, and M.J. Berry II, “Recording Spikes from a Large Fraction of Ganglion Cells in a Retinal Patch”, Nature Neuroscience 7: 1155-1161 (2004).
  • J. Puchalla, E. Schneidman, R. Harris, and M.J. Berry II, “Redundancy in the Population Code of the Retina”, Neuron 46: 493-504 (2005).
  • E. Schneidman, S. Still, M.J. Berry II, and W. Bialek, “Network Information and Connected Correlations”, Physical Review Letters 91: 238701 (2003).
  • E. Schneidman, M.J. Berry II, R. Segev, and W. Bialek, “Weak pairwise correlations imply strongly correlated network states in a neural population”, Nature 440: 1007-1012 (2006).
  • G. Tkačik, E. Schneidman, M.J. Berry II, and W. Bialek, “Ising Models for Networks of Real Neurons”, arXiv:q-bio.NC/0607017 (2006), http://arxiv.org/pdf/q-bio.NC/0607017.
Jonathan Pillow
  • J.W. Pillow and E.P. Simoncelli, “Dimensionality reduction in neural models: an information-theoretic generalization of spike-triggered average and covariance analysis”, Journal of Vision 6(4): 414-428 (2006). (Download from Pillow's website)