Advanced Model of Complex Information System

DOI: 10.4018/978-1-5225-2255-3.ch381

Chapter Preview

Background

Data represent a change of state, for example from 0 to 1 or from 1 to 0, where the state vector need not be only digital or one-dimensional. Every such change can be described by a quantity of information measured in bits.
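As a minimal illustration (ours, not the chapter's), distinguishing one of N equally likely states requires log2(N) bits, which is why information in bits is a natural measure of such state changes:

```python
import math

# Distinguishing one of N equally likely states takes log2(N) bits.
for n_states in [2, 4, 256]:
    print(f"{n_states} states -> {math.log2(n_states):.0f} bits")
```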

Information theory was founded by Shannon (1948) and his colleagues in the 1940s and was associated with coding and data transmission, especially in the newly emerging field of radar systems, which became a component of defensive systems during the Second World War.

Syntactic (Shannon) information is defined by the degree of probability of a given event and answers the question: how often does a message appear? For example, by telling you that the solar system would cease to exist tomorrow, I would be giving you the maximum possible information, because the probability of this phenomenon occurring is nearly zero. The probability model of information so defined has been used for the design of error-correcting codes, digital modulation schemes, and other technical applications. Telecommunications specialists and radio engineers concentrated on a probabilistic description of encoded data and on minimizing the probability of errors during data transmission.
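A short sketch of this idea (our example, with hypothetical probabilities) uses the self-information I(x) = -log2 p(x): the rarer the message, the more bits it carries.

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information I(x) = -log2 p(x), in bits."""
    return -math.log2(p)

# A common message (p = 0.5) carries 1 bit; a nearly impossible event,
# such as the solar system ceasing to exist tomorrow (here assumed
# p = 1e-9), carries about 30 bits.
print(self_information(0.5))   # 1.0
print(self_information(1e-9))  # ~29.9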

The model-theoretic work on semantic information was done by Carnap and Bar-Hillel (1953). Semantic information, by contrast, asks: how often is a message true? Zadeh (1965) introduced the theory of fuzzy sets as functions that map a value, which might be a member of a set, to a number between zero and one indicating its actual degree of membership.
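As a sketch of Zadeh's idea (the fuzzy set "warm" and its breakpoints below are our hypothetical choices, not from the chapter), a membership function maps each value to a degree of membership between zero and one:

```python
def warm_membership(temp_c: float) -> float:
    """Triangular membership function for the fuzzy set 'warm',
    peaking at 22 C and falling to zero at 10 C and 34 C."""
    if temp_c <= 10 or temp_c >= 34:
        return 0.0
    if temp_c <= 22:
        return (temp_c - 10) / 12
    return (34 - temp_c) / 12

for t in [10, 16, 22, 30]:
    print(f"{t} C -> membership {warm_membership(t):.2f}")
```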

Currently, a number of interesting results have been obtained in the field of quantum information science, which takes the foundations of quantum physics as its basis and uses for the modeling of complex systems principles that do not arise in classical physics, such as entanglement and quantization. In the technical literature, we read that the behavior of entangled states is very odd: entanglement spreads rapidly among systems by means of a property known as entanglement swapping. The quantity of quantum information in bits can be measured, e.g., by the von Neumann entropy (Vedral, 2006), which quantifies the amount of uncertainty contained in the density operator while also taking into account wave-probabilistic features such as entanglement, quantization, and bosonic/fermionic quantum behavior (Svítek, 2012).
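A minimal numerical sketch (ours, assuming NumPy) of the von Neumann entropy S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of the density matrix:

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # drop numerical zeros
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

# Pure state |0><0|: no uncertainty, entropy 0 bits.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])

# Reduced density matrix of one qubit of a Bell pair: maximally mixed,
# entropy 1 bit, reflecting the entanglement with its partner.
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])

print(von_neumann_entropy(pure))   # ~0.0
print(von_neumann_entropy(mixed))  # ~1.0
```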

On the basis of these information theories, a number of methods and algorithms have emerged that attempt to eliminate or minimize indefiniteness and to extract the real, useful information from data more effectively. An excellent example is the Bayes method of Peterka (1981), which interprets the probability density not as a description of a random quantity, but rather as a description of the indefiniteness of the system, i.e. how much information we have available about the monitored system. The system itself might be completely deterministic (describable without probability theory), yet we may have very little information about it. With continuous measurement we obtain more and more data, and therefore more information about our system, and the system appears to us more definite. The elimination of indefiniteness therefore increases the quantity of information we have about the monitored system.
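The following beta-binomial sketch (our illustration, not Peterka's formulation) shows this effect: the posterior variance, read here as a measure of indefiniteness, shrinks as measurements of a deterministic but unknown parameter accumulate.

```python
import numpy as np

np.random.seed(0)
true_value = 0.7        # deterministic but unknown parameter
alpha, beta = 1.0, 1.0  # uniform prior: maximal indefiniteness

# Each run starts from the uniform prior and observes n noisy
# binary measurements; more data means a narrower posterior.
for n in [10, 100, 1000]:
    data = np.random.rand(n) < true_value
    a, b = alpha + data.sum(), beta + n - data.sum()
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    print(f"n={n:5d}  posterior mean={mean:.3f}  variance={var:.2e}")
```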

Key Terms in this Chapter

Semantic Information: Describes the content level of a given event and answers the question: how often is a message true?

Information Power: Is the product of information flow and information content and is measured in Joules per second.

Shannon Entropy: Is a measure of the uncertainty in a random variable and quantifies the expected value of the information (in bits) contained in a message.

Syntactic Information: Describes the degree of probability of a given event and answers the question: how often does a message appear?

Information Flow: Describes the input or output of information per unit of time and is measured in bits per second.

von Neumann Entropy: Is the extension of Shannon entropy to quantum-mechanical systems described by a density matrix.

Quantum Information: Studies how to integrate information theory with quantum mechanics by examining how information can be stored in and retrieved from a quantum-mechanical system. The basic unit of information in quantum information theory is the qubit, the analog of the bit (0 or 1) in classical information theory.
