
Andrew Targowski (Haworth College of Business, USA)

DOI: 10.4018/978-1-60566-004-2.ch010

Chapter Preview

This is one of the oldest perspectives on information meaning. From a quantitative perspective, *information* is the successful selection of signs or words from a given list, rejecting all “semantic meaning” as a subjective factor [1]. Hartley (1928) showed that a message of N signs chosen from an “alphabet” or a code book of S signs has S^{N} possibilities, and that the “quantity of information” is most reasonably defined as a logarithmic equation:

H = N log S [1]
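As an illustration (my own, not from the original text; the function name is assumed), Hartley's measure can be computed directly. Taking the logarithm base 2 expresses H in bits:

```python
import math

def hartley_information(n_signs: int, alphabet_size: int) -> float:
    """Hartley's measure H = N log S, taken base 2 so the result
    is in bits; a message of N signs drawn from an alphabet of
    S signs has S**N possible forms."""
    return n_signs * math.log2(alphabet_size)

# A 3-sign message over an 8-symbol alphabet: 8**3 = 512 possible
# forms, so H = 3 * log2(8) = 9 bits.
print(hartley_information(3, 8))
```

Note that the choice of logarithm base only rescales H; base 2 is conventional because it counts binary selections.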

Since Hartley’s time, this definition of information as a selection of symbols has been generally accepted, although widely interpreted. As a result, Hartley’s theory crystallized into an exact mathematical definition, provided by Shannon (1948). According to him, the information I carried by an event α of probability p(α) is:

I = - log_{2} p(α) [2]

This approach is not useful in business decision-making. Let us assume, for example, that a message “the distance from Kalamazoo to Chicago is α = 150 miles” has p = 1 and therefore I = 0, since log_{2} 1 = 0 (because 2^{0} = 1). In other words, from the quantitative perspective, this message contains no information. However, for an individual using a personal car for a business purpose, this message contains information that can be measured monetarily: if for each mile driven the individual receives compensation of $0.40, those 150 miles mean $60 in information value for him/her.
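A short sketch (my own illustration, with assumed names) contrasting the two measures in the paragraph above, Shannon's selective information versus pragmatic monetary value:

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information I = -log2 p(a), in bits."""
    return -math.log2(p)

# A certain message (p = 1) carries zero bits, since 2**0 == 1...
certain = self_information(1.0)   # equals 0.0

# ...yet the same message has pragmatic, monetary value:
miles, rate_per_mile = 150, 0.40
value = miles * rate_per_mile     # $60 of mileage compensation
```

The point of the example is that the two quantities measure different things: I depends only on probability, while the dollar value depends on what the receiver does with the message.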

An increase in information yields a corresponding reduction of chaos, or entropy. In statistical thermodynamics (the Second Law), entropy is a function of the probability of the states of the particles that form a gas. In quantitative communication theory, entropy measures how much information must be introduced into a given information-oriented system to make it informationally organized and thereby reduce its chaos. The relationship between information and entropy is expressed most objectively by the Shannon-Weaver formula (1949):

H = - Σ p(α) log_{2} p(α) (BIT: Binary digIT) [3]
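Formula [3] translates directly into code. This sketch (my own, with an assumed function name) computes H for an arbitrary discrete distribution:

```python
import math

def entropy(probabilities) -> float:
    """Shannon-Weaver entropy H = -sum p(a) * log2 p(a), in bits.
    Terms with p = 0 contribute nothing (the limit of p*log p is 0)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A skewed three-outcome source:
# H = 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
print(entropy([0.5, 0.25, 0.25]))  # 1.5
```

A uniform distribution maximizes H for a given number of outcomes, which is why equally likely messages are the "hardest" to encode.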

In a descriptive thermodynamic sense, entropy is referred to as a “measure of disorder.” Information introduced to a given system eliminates that disorder and is therefore said to be “like” negative entropy or order. Starr (1971) demonstrates the idea of entropy using the following example: suppose that eight different commands can be transmitted from the bridge of a ship to the engine room. If each of those commands is equally likely, then the probability of any of these being sent is p=1/8. Knowing p, entropy H can be determined:

H = -8[(1/8) log_{2}(1/8)] = log_{2} 8 = 3 [4]

This result indicates that eight different orders, coded into a binary format as 000 through 111, can be transmitted via a 3-bit-wide channel of communication.
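Starr's ship example can be checked numerically. This sketch is my own; the 3-bit code words are a reconstruction of the coding table omitted from this preview:

```python
import math

# Eight equally likely bridge-to-engine-room commands, p = 1/8 each.
p = 1 / 8
H = -8 * (p * math.log2(p))   # = log2(8) = 3 bits
print(H)  # 3.0

# Three bits therefore suffice to give every command a distinct code:
codes = [format(i, "03b") for i in range(8)]
print(codes)  # ['000', '001', '010', '011', '100', '101', '110', '111']
```

Because all eight commands are equiprobable, the entropy equals the code length exactly; with unequal probabilities, H would drop below 3 bits and shorter average codes would become possible.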

The entropy function is widely used in communication networks for coding, in particular to assess channel capacity and code efficiency. From the human communication point of view, however, this perspective has limited application, because it attaches no human-oriented meaning to the “bits and probabilities.” Its significance is chiefly technical, concerning how to design a communication channel. Finally, the entropy function lacks the semantic meaning of information, which drives human communication.

