Information Model and Measurement

Manjunath Ramachandra
DOI: 10.4018/978-1-60566-888-8.ch002

Abstract

Accurate modeling and measurement of digital information is essential for ascertaining its commercial value. The value of and demand for information fluctuate along the information supply chain, posing a challenge for the players in the chain and calling for a dynamic supply chain model. In this chapter, a predictive model based on information feedback is introduced to exploit this dynamism.

Introduction

The previous chapter explained the need for web technologies in supply chain management. The different players in the supply chain need to exchange a great deal of “meaningful” and “useful” data along with the commodities. The meaning and usefulness of the data may be enhanced by knowing how to measure them. Here, a model for measuring the information associated with the data is provided, and the various properties of this model that may be used for enhancing the information are discussed.

The intention of a good communication system (Bekenstein, 2003) is to transfer meaningful data from the source to the destination. Although the meaning associated with the data is subjective, it can be measured in the statistical sense. This chapter addresses popular questions such as what exactly information is (Floridi, 2005) and how transferred information may be parameterized and measured. A new approach to information measurement using classifiers is provided, laying a foundation for a better understanding of the book. Throughout this chapter, and to some extent in the entire book, the concepts are introduced through the example of statistical estimators or classifiers. For business and purchasing departments, this provides a technique to ascertain the value of a data source such as a library, an album, or an online game based on its information content.

The amount of information in a message, measured in the unit “bits”, is dictated by the choice the source makes in selecting the message. If selecting the message is exactly as likely as not selecting it, the message carries one bit of information. This selection of messages imparts a kind of randomness into the message, making it ambiguous. The degree of randomness or ambiguity associated with a message is called entropy, a term borrowed from thermodynamics. The randomness associated with a message decreases as more information is attached to it. For example, “I am crying” is more ambiguous than “I am crying for a chocolate”, as the latter carries more information and reduces the randomness.
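As a rough sketch of this measure (the code and the helper name entropy_bits are illustrative, not from the chapter), the entropy of a message can be estimated from the empirical frequencies of its symbols:

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy of a message's symbol distribution, in bits."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two equally likely symbols: the choice between them carries one bit.
print(entropy_bits("01"))  # 1.0
```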

The information from the source requires a physical medium to reach the destination. For a given channel or medium, there is a theoretical maximum rate at which it can carry information. This is called the “Shannon limit”, after its discoverer, and is expressed in bits per second. To transfer more information over a loaded channel, it is generally necessary to process the information before transfer by way of encoding. The transmission of information in compressed form over a band-limited channel is the topic of a chapter in the next part.
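For the common case of a band-limited channel with additive Gaussian noise, this limit takes the Shannon-Hartley form C = B log2(1 + S/N). A minimal sketch, assuming the signal-to-noise ratio is given as a linear ratio rather than in decibels:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity of a noisy band-limited channel, in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone-grade channel at 30 dB SNR (a linear ratio of 1000):
print(shannon_capacity_bps(3000, 1000))  # about 29,902 bits per second
```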

Amari has provided a geometric interpretation of information that is of great theoretical importance. Like “bits” in Shannon’s description of information, another interesting measure of information is the “Kullback divergence”. The Kullback divergence between a pair of probability distribution functions gives the information distance between the two estimators, for example in the family of feedback neural networks.
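For discrete distributions, the Kullback divergence D(p || q) can be computed directly from its definition; the sketch below (the function name and the coin example are assumptions for illustration) measures how far a biased coin is from a fair one:

```python
import math

def kl_divergence_bits(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    p and q are discrete distributions over the same support;
    q must be nonzero wherever p is.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Information distance of a biased coin from a fair coin:
print(kl_divergence_bits([0.9, 0.1], [0.5, 0.5]))  # about 0.531 bits
```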

Shannon’s theorem on data coding also implies that if a data source with a certain entropy pumps information into a channel of limited capacity, and the entropy of the source is less than the channel capacity, then there exists a coding scheme that can reduce the errors to any desired rate. However, no coding scheme can help if the channel capacity is less than the source entropy.
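At this level of description, the theorem reduces to a comparison between source entropy and channel capacity; the fragment below merely restates that condition and is in no way a substitute for the theorem’s proof:

```python
def reliable_transmission_possible(source_entropy: float,
                                   channel_capacity: float) -> bool:
    """Noisy-channel coding theorem as a yes/no condition: some coding
    scheme achieves arbitrarily low error exactly when entropy < capacity."""
    return source_entropy < channel_capacity

print(reliable_transmission_possible(0.8, 1.0))  # True: a suitable code exists
print(reliable_transmission_possible(1.2, 1.0))  # False: no code can help
```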

According to the theory, there can be a mismatch between the received signal and the transmitted signal as a result of channel impairments such as noise and random disturbances; channel noise reduces the capacity of the channel to transfer information (Yeung, 2002). By providing adequate redundancy in the message, it is still possible to recover it without errors.
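A classic, if inefficient, illustration of such redundancy is the repetition code, in which every bit is transmitted several times and recovered by majority vote. The sketch below (the triple repetition and the function names are illustrative choices) shows a single channel error being corrected:

```python
from collections import Counter

def encode_repetition(bits, n=3):
    """Add redundancy by repeating every bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(coded, n=3):
    """Recover each bit by majority vote over its n copies."""
    return [Counter(coded[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(coded), n)]

sent = encode_repetition([1, 0, 1])  # [1,1,1, 0,0,0, 1,1,1]
sent[1] = 0                          # the channel flips one bit
print(decode_repetition(sent))       # [1, 0, 1] -- recovered without error
```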

Throughout this book, an approach that supplements Shannon’s information measure is followed.
