Asymptotic dependency structure of multiple signals: Asymptotic equipartition property for diagrams of probability spaces

Rostislav Matveev, J.W. Portegies

Research output: Contribution to journal › Journal article › Academic › peer review


Abstract

We formalize the notion of the dependency structure of a collection of multiple signals, relevant from the perspective of information theory, artificial intelligence, neuroscience, complex systems and other related fields. We model multiple signals by commutative diagrams of probability spaces with measure-preserving maps between some of them. We introduce the asymptotic entropy (pseudo-)distance between diagrams, expressing how much two diagrams differ from an information-processing perspective. If the distance vanishes, we say that two diagrams are asymptotically equivalent. In this context, we prove an asymptotic equipartition property: any sequence of tensor powers of a diagram is asymptotically equivalent to a sequence of homogeneous diagrams. This sequence of homogeneous diagrams expresses the relevant dependency structure.
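For a single probability space, the asymptotic equipartition property specializes to Shannon's classical AEP: for an i.i.d. source, the per-symbol log-probability of a long sample concentrates around the entropy. A minimal numerical sketch of that classical statement (not the paper's diagram-level construction; the distribution `p` and the helper names are illustrative):

```python
import math
import random

def entropy(p):
    """Shannon entropy H(p) in bits of a distribution given as {symbol: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def empirical_rate(p, n, seed=0):
    """Draw one length-n i.i.d. string from p and return -(1/n) log2 of its
    probability. The classical AEP says this converges to H(p) as n grows."""
    rng = random.Random(seed)
    symbols = list(p)
    weights = [p[s] for s in symbols]
    sample = rng.choices(symbols, weights=weights, k=n)
    return -sum(math.log2(p[s]) for s in sample) / n

p = {"a": 0.5, "b": 0.25, "c": 0.25}
H = entropy(p)                      # 1.5 bits for this distribution
rate = empirical_rate(p, n=100_000)  # close to H for large n
```

The paper's result replaces the single space by a commutative diagram of probability spaces and replaces "close in probability" by closeness in the asymptotic entropy distance between diagrams.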
Original language: English
Pages (from-to): 237-285
Journal: Information Geometry
Volume: 1
Issue number: 2
DOI: 10.1007/s41884-018-0013-5
Status: Published - Dec 2018


Cite this

@article{dc90fc110319470ab1ac91e568c60e21,
title = "Asymptotic dependency structure of multiple signals: Asymptotic equipartition property for diagrams of probability spaces",
abstract = "We formalize the notion of the dependency structure of a collection of multiple signals, relevant from the perspective of information theory, artificial intelligence, neuroscience, complex systems and other related fields. We model multiple signals by commutative diagrams of probability spaces with measure-preserving maps between some of them. We introduce the asymptotic entropy (pseudo-)distance between diagrams, expressing how much two diagrams differ from an information-processing perspective. If the distance vanishes, we say that two diagrams are asymptotically equivalent. In this context, we prove an asymptotic equipartition property: any sequence of tensor powers of a diagram is asymptotically equivalent to a sequence of homogeneous diagrams. This sequence of homogeneous diagrams expresses the relevant dependency structure.",
keywords = "Asymptotic Equipartition Property, Entropy distance, Diagrams of probability spaces, Multiple signals",
author = "Rostislav Matveev and J.W. Portegies",
year = "2018",
month = "12",
doi = "10.1007/s41884-018-0013-5",
language = "English",
volume = "1",
pages = "237--285",
journal = "Information Geometry",
issn = "2511-2481",
publisher = "Springer",
number = "2",

}


TY - JOUR

T1 - Asymptotic dependency structure of multiple signals

T2 - Asymptotic equipartition property for diagrams of probability spaces

AU - Matveev, Rostislav

AU - Portegies, J.W.

PY - 2018/12

Y1 - 2018/12

N2 - We formalize the notion of the dependency structure of a collection of multiple signals, relevant from the perspective of information theory, artificial intelligence, neuroscience, complex systems and other related fields. We model multiple signals by commutative diagrams of probability spaces with measure-preserving maps between some of them. We introduce the asymptotic entropy (pseudo-)distance between diagrams, expressing how much two diagrams differ from an information-processing perspective. If the distance vanishes, we say that two diagrams are asymptotically equivalent. In this context, we prove an asymptotic equipartition property: any sequence of tensor powers of a diagram is asymptotically equivalent to a sequence of homogeneous diagrams. This sequence of homogeneous diagrams expresses the relevant dependency structure.

AB - We formalize the notion of the dependency structure of a collection of multiple signals, relevant from the perspective of information theory, artificial intelligence, neuroscience, complex systems and other related fields. We model multiple signals by commutative diagrams of probability spaces with measure-preserving maps between some of them. We introduce the asymptotic entropy (pseudo-)distance between diagrams, expressing how much two diagrams differ from an information-processing perspective. If the distance vanishes, we say that two diagrams are asymptotically equivalent. In this context, we prove an asymptotic equipartition property: any sequence of tensor powers of a diagram is asymptotically equivalent to a sequence of homogeneous diagrams. This sequence of homogeneous diagrams expresses the relevant dependency structure.

KW - Asymptotic Equipartition Property

KW - Entropy distance

KW - Diagrams of probability spaces

KW - Multiple signals

U2 - 10.1007/s41884-018-0013-5

DO - 10.1007/s41884-018-0013-5

M3 - Article

VL - 1

SP - 237

EP - 285

JO - Information Geometry

JF - Information Geometry

SN - 2511-2481

IS - 2

ER -