Abstract
Synergy is a fundamental concept in complex systems that has received much attention in computational biology (Narayanan et al. 2005; Balduzzi and Tononi 2008). Several papers (Schneidman et al. 2003a; Bell 2003; Nirenberg et al. 2001; Williams and Beer 2010) have proposed measures for quantifying synergy, but there remains no consensus on which measure is most valid.
Keywords
- Mutual Information
- Joint Distribution
- Independent Component Analysis
- Redundant Information
- Unique Information
References
Amari, S.: Information geometry on hierarchical decomposition of stochastic interactions. IEEE Transactions on Information Theory 47, 1701–1711 (1999)
Anastassiou, D.: Computational analysis of the synergy among multiple interacting genes. Molecular Systems Biology 3, 83 (2007)
Balduzzi, D., Tononi, G.: Integrated information in discrete dynamical systems: motivation and theoretical framework. PLoS Computational Biology 4(6), e1000091 (2008)
Bell, A.J.: The co-information lattice. In: Amari, S., Cichocki, A., Makino, S., Murata, N. (eds.) Fifth International Workshop on Independent Component Analysis and Blind Signal Separation, Springer (2003)
Bertschinger, N., Rauh, J., Olbrich, E., Jost, J.: Shared information – new insights and problems in decomposing information in complex systems. CoRR, abs/1210.5902 (2012)
Chechik, G., Globerson, A., Anderson, M.J., Young, E.D., Nelken, I., Tishby, N.: Group redundancy measures reveal redundancy reduction in the auditory pathway. In: Dietterich, T.G., Becker, S., Ghahramani, Z. (eds.) NIPS 2002, pp. 173–180. MIT Press, Cambridge (2002)
Comtet, L.: Advanced Combinatorics: The Art of Finite and Infinite Expansions. Reidel, Dordrecht (1998)
Cover, T.M., Thomas, J.A.: Elements of Information Theory. John Wiley, New York (1991)
DeWeese, M.R., Meister, M.: How to measure the information gained from one symbol. Network 10, 325–340 (1999)
Gat, I., Tishby, N.: Synergy and redundancy among brain cells of behaving monkeys. In: Advances in Neural Information Processing Systems, pp. 465–471. MIT Press (1999)
Gawne, T.J., Richmond, B.J.: How independent are the messages carried by adjacent inferior temporal cortical neurons? Journal of Neuroscience 13, 2758–2771 (1993)
Han, T.S.: Nonnegative entropy measures of multivariate symmetric correlations. Information and Control 36(2), 133–156 (1978)
Harder, M., Salge, C., Polani, D.: A bivariate measure of redundant information. Physical Review E 87(1), 012130 (2013)
Latham, P.E., Nirenberg, S.: Synergy, redundancy, and independence in population codes, revisited. Journal of Neuroscience 25(21), 5195–5206 (2005)
Lei, W., Xu, G., Chen, B.: The common information of n dependent random variables. In: Forty-Eighth Annual Allerton Conference on Communication, Control, and Computing, pp. 836–843 (2010); arXiv: abs/1010.3613
Lizier, J.T., Flecker, B., Williams, P.L.: Towards a synergy-based approach to measuring information modification. In: IEEE Symposium Series on Computational Intelligence (SSCI 2013) — IEEE Symposium on Artificial Life, Singapore. IEEE Press (April 2013)
Maurer, U.M., Wolf, S.: Unconditionally secure key agreement and the intrinsic conditional information. IEEE Transactions on Information Theory 45(2), 499–514 (1999)
Narayanan, N.S., Kimchi, E.Y., Laubach, M.: Redundancy and synergy of neuronal ensembles in motor cortex. The Journal of Neuroscience 25(17), 4207–4216 (2005)
Nirenberg, S., Carcieri, S.M., Jacobs, A.L., Latham, P.E.: Retinal ganglion cells act largely as independent encoders. Nature 411(6838), 698–701 (2001)
Nirenberg, S., Latham, P.E.: Decoding neuronal spike trains: How important are correlations? Proceedings of the National Academy of Sciences 100(12), 7348–7353 (2003)
Panzeri, S., Treves, A., Schultz, S., Rolls, E.T.: On decoding the responses of a population of neurons from short time windows. Neural Comput. 11(7), 1553–1577 (1999)
Pola, G., Thiele, A., Hoffmann, K.P., Panzeri, S.: An exact method to quantify the information transmitted by different mechanisms of correlational coding. Network 14(1), 35–60 (2003)
Schneidman, E., Bialek, W., Berry II, M.: Synergy, redundancy, and independence in population codes. Journal of Neuroscience 23(37), 11539–11553 (2003a)
Schneidman, E., Still, S., Berry, M.J., Bialek, W.: Network information and connected correlations. Phys. Rev. Lett. 91(23), 238701–238705 (2003b)
Weisstein, E.W.: Antichain (2011), http://mathworld.wolfram.com/Antichain.html
White, D., Rabago-Smith, M.: Genotype-phenotype associations and human eye color. Journal of Human Genetics 56(1), 5–7 (2011)
Williams, P.L., Beer, R.D.: Nonnegative decomposition of multivariate information. CoRR, abs/1004.2515 (2010)
Copyright information
© 2014 Springer-Verlag Berlin Heidelberg
Cite this chapter
Griffith, V., Koch, C. (2014). Quantifying Synergistic Mutual Information. In: Prokopenko, M. (eds) Guided Self-Organization: Inception. Emergence, Complexity and Computation, vol 9. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-53734-9_6
DOI: https://doi.org/10.1007/978-3-642-53734-9_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-53733-2
Online ISBN: 978-3-642-53734-9
eBook Packages: Engineering (R0)