Unified framework for information integration based on information geometry
Abstract
Assessment of causal influences is a ubiquitous and important subject across diverse research fields.
Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements.
Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties.
First, if each influence is separately quantified in a part-based manner and then simply summed, overestimation occurs because the influences are interdependent.
Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences.
To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach.
We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected.
This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected.
In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner.
Introduction
Quantitative assessment of causal influences among elements in a complex system is a fundamental problem in many fields of science, including physics (1), economics (2), gene networks (3), social networks (4), ecosystems (5), and neuroscience (6).
There have been many previous attempts to quantify causal influences between elements in stochastic systems.
Information theory has played a pivotal role in these endeavors, leading to various measures, including predictive information (7), transfer entropy (8), and stochastic interaction (9).
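For instance, transfer entropy from an element X to an element Y quantifies how much the past of X improves prediction of the present of Y beyond what the past of Y already provides. With a single time step of history (the notation here is illustrative rather than taken from the present paper), it can be written as

\[
T_{X \to Y} \;=\; I\!\left(Y_t ; X_{t-1} \mid Y_{t-1}\right)
\;=\; \sum_{y_t,\, y_{t-1},\, x_{t-1}} p\!\left(y_t, y_{t-1}, x_{t-1}\right)
\log \frac{p\!\left(y_t \mid y_{t-1}, x_{t-1}\right)}{p\!\left(y_t \mid y_{t-1}\right)},
\]

which vanishes exactly when the present of Y is conditionally independent of the past of X given the past of Y.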
Drawn from consciousness studies involving measurement of integration of neural activity (10, 11), the mathematical concept of integrated information is also useful as a framework for analyzing causal relationships in complex systems with multiple elements.
Recent research suggests that the brain loses the ability to integrate information when consciousness is lost during dreamless sleep (12), general anesthesia (13), or vegetative states (14), indicating that quantifying the integration of information can serve as a neurophysiological marker of consciousness (10, 11, 15).
The integrated information theory (IIT) of consciousness (16, 17) proposes a measure of integration called integrated information that quantifies multiple causal influences among elements of a system.
Integrated information is theoretically motivated by the holistic property of consciousness experienced as a unified whole that is irreducible into separate parts or experiences.
Whereas integrated information was originally motivated by the aim of elucidating the neural substrate of consciousness, it can in principle be applied to many research fields.
Despite its broad potential impact, the application of integrated information (16, 18) to experimental data is severely limited (19, 20) because the original measure was derived under restricted conditions, wherein the probability distribution of past states of the system is assumed to be uniform and the variables are assumed to be discrete (18).
In an effort to broaden the applicability, several measures have been proposed under general conditions (9, 19, 21).
However, these proposed measures are limited by mathematical problems.
Quantification of a pairwise causal influence from one element to another can be achieved with existing measures, but quantifying multiple causal influences among many parts poses the problems of overestimation and of confounding noncausal influences.
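As a schematic illustration of the first problem (a sketch under simplifying assumptions, not an example from this paper), consider the naive part-based total obtained by summing pairwise transfer entropies over all ordered pairs of elements,

\[
\sum_{i \neq j} T_{X_j \to X_i}.
\]

When the past states of different elements are strongly correlated, several of these terms attribute the same shared dependence to distinct causal influences, so the same statistical relationship is counted more than once and the sum overestimates the genuinely integrated causal influence.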
To overcome these problems, we propose a unified framework for quantifying causal influences based on information geometry (22).
The measure we propose, called “geometric integrated information” ΦG, overcomes the described difficulties, provides geometric interpretations of existing measures, and elucidates the relationships among the measures in a hierarchical manner.
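Consistent with the geometric interpretation stated in the Abstract, ΦG can be expressed schematically as a minimized Kullback–Leibler divergence (the precise constraints defining the disconnected model are developed in the sections below):

\[
\Phi_G \;=\; \min_{q \in M_D} D_{\mathrm{KL}}\!\left[\, p\!\left(X^{t-1}, X^{t}\right) \,\middle\|\, q\!\left(X^{t-1}, X^{t}\right) \right],
\]

where p denotes the actual joint distribution of past and present states of the system and M_D denotes the manifold of distributions in which causal influences among elements are statistically disconnected.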
The mathematical solution we derive should have broad utility in elucidating complex systems.
Three Postulates on Strength of Influences
Postulate 1.
Postulate 2.
Postulate 3.
Significance
A Unified Derivation of Existing Measures
Total Causal Influences: Mutual Information
Partial Causal Influences: Conditional Transfer Entropy
A Measure of Integrated Information
Comparisons with Other Measures
Stochastic Interaction
The Sum of Transfer Entropies
Analytical Calculation for Gaussian Variables
Hierarchical Structure
Discussion
References
Acknowledgments
Supporting Information
Fig. 1.
Fig. 2.
Fig. 3.
Fig. 4.
Fig. 5.