
Miroslav Svítek (Czech Technical University in Prague, Czech Republic)

Copyright: © 2015
Pages: 6

DOI: 10.4018/978-1-4666-5888-2.ch733

## Background

*Data* mean a change of state, for example from 0 to 1 or from 1 to 0, where the state vector is not necessarily only digital or one-dimensional. Every such change can be described with the use of a quantity of *information* in bits.

Information theory was founded by Shannon (1948) and his colleagues in the 1940s and was associated with coding and data transmission, especially in the newly emerging field of radar systems, which became a component of defensive systems during the Second World War.

*Syntactic (Shannon) information* is defined in terms of the probability of a given event and answers the question: how often does a message appear? For example, by telling you that the solar system will cease to exist tomorrow, I would be giving you nearly the maximum information possible, because the probability of this event is close to zero. This probabilistic model of information has been used in the design of error-correcting codes, digital modulations, and other technical applications. Telecommunications specialists and radio engineers concentrated on a probabilistic description of encoded data and on minimizing the probability of errors during data transmission.
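The idea that rarer events carry more information can be made concrete. A minimal sketch (the function names are illustrative, not from the text): the self-information of an event of probability *p* is −log₂ *p* bits, and Shannon entropy is its expected value over a distribution.

```python
import math

def self_information(p: float) -> float:
    """Bits of syntactic (Shannon) information carried by an event of probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

def shannon_entropy(probabilities) -> float:
    """Expected self-information of a distribution, in bits."""
    return sum(p * self_information(p) for p in probabilities if p > 0.0)
```

A certain event (p = 1) carries zero bits, while a near-impossible event, such as the solar system ceasing to exist tomorrow, carries very many.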

The model-theoretic account of *semantic information* was developed by Carnap and Bar-Hillel (1953). In contrast to syntactic information, semantic information asks: how often is a message true? Zadeh (1965) introduced the theory of *fuzzy sets*: functions that map a value, which might be a member of a set, to a number between zero and one indicating its degree of membership.
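A fuzzy membership function can be sketched in a few lines; the triangular shape below is a common illustrative choice, not one prescribed by the text.

```python
def triangular_membership(x: float, a: float, b: float, c: float) -> float:
    """Degree of membership of x in a fuzzy set with a triangular shape.

    Membership rises linearly from 0 at a to 1 at the peak b,
    then falls linearly back to 0 at c (requires a < b < c).
    """
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```

Unlike classical set membership, which is 0 or 1, the returned degree varies continuously between the two.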

Currently, a number of interesting results have been discovered in the field of *quantum information science*, which builds on the foundations of quantum physics and models complex systems using principles that do not arise in classical physics, such as *entanglement* and *quantization*. The technical literature notes that the behavior of entangled states is very odd: entanglement spreads rapidly among various phenomena, exploiting a property known as *entanglement swapping*. The quantity of quantum information in bits can be measured, for example, by the *von Neumann entropy* (Vedral, 2006), which quantifies the amount of uncertainty contained in the density operator, taking into account wave-probabilistic features such as entanglement, quantization, and bosonic/fermionic quantum behavior (Svítek, 2012).
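The von Neumann entropy S(ρ) = −Tr(ρ log₂ ρ) reduces to the Shannon entropy of the eigenvalues of the density matrix ρ. As a minimal sketch for a real symmetric 2×2 density matrix (the function name and the closed-form eigenvalue computation are illustrative choices):

```python
import math

def von_neumann_entropy_2x2(rho) -> float:
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho) of a real symmetric
    2x2 density matrix rho = [[a, b], [b, d]], computed as the Shannon
    entropy of its eigenvalue spectrum."""
    a, b = rho[0]
    _, d = rho[1]
    mean = (a + d) / 2.0                               # half the trace (trace of rho is 1)
    radius = math.sqrt(((a - d) / 2.0) ** 2 + b * b)   # eigenvalue spread
    eigenvalues = (mean + radius, mean - radius)
    return -sum(p * math.log2(p) for p in eigenvalues if p > 1e-12)
```

A pure state such as [[1, 0], [0, 0]] has zero entropy, while the maximally mixed state [[0.5, 0], [0, 0.5]] carries one full bit of uncertainty.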

## Key Terms and Definitions

Quantum Information: The integration of information theory with quantum mechanics, studying how information can be stored in and retrieved from a quantum mechanical system. The basic unit of quantum information is the qubit, the analog of the bit (0 or 1) in classical information theory.

Shannon Entropy: A measure of the uncertainty in a random variable, quantifying the expected value of the information (in bits) contained in a message.

Semantic Information: The content level of a given event; it answers the question: how often is a message true?

von Neumann Entropy: The extension of Shannon entropy to the field of quantum mechanical systems described by a density matrix.

Syntactic Information: The degree of probability of a given event; it answers the question: how often does a message appear?

Information Power: The product of information flow and information content; it is measured in Joules per second.

Information Flow: The input or output of information per unit of time; it is measured in bits per second.
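The relation between the last three glossary entries can be sketched numerically. The units chosen here (flow in bits per second, content in Joules per bit, so that the product comes out in Joules per second) are one assumed reading of the definitions above, not values given in the text.

```python
def information_power(flow_bits_per_s: float, content_joules_per_bit: float) -> float:
    """Information power as the product of information flow and information
    content, per the glossary definitions; assumed units give Joules/second."""
    return flow_bits_per_s * content_joules_per_bit
```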
