Information and Organization

Andrew Targowski
Copyright: © 2009 | Pages: 36
DOI: 10.4018/978-1-60566-004-2.ch010

Abstract

The purpose of this chapter is to define information, mainly in terms of cognition units, and to examine its other perspectives and images. Once we understand information, it becomes possible to define its role in an organization, particularly at the level of information systems. The issue of how more complex information systems may advance an organization to higher levels of structure (configuration) is investigated. The modern complex organization is quite recent, only about 50 years old, but can already be seen to have passed through several evolutionary phases. Finally, the transformation from the industrial to the informated model of an enterprise is described, both models are compared, and some conclusions are drawn about their meaning for civilization’s well-being.
Chapter Preview

Perspectives of Information

The Quantitative Perspective of Information

This is one of the oldest perspectives on the meaning of information. From the quantitative perspective, information is the successful selection of signs or words from a given list, rejecting all “semantic meaning” as a subjective factor [1]. Hartley (1928) showed that a message of N signs chosen from an “alphabet” or a code book of S signs has S^N possibilities, and that the “quantity of information” is most reasonably defined as a logarithmic equation:

  • H = N log S [1]
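As a simple illustration of equation [1], the following Python sketch computes Hartley's measure for a message of N signs drawn from an alphabet of S signs; the function name, the example values, and the choice of logarithm base are illustrative assumptions, not part of Hartley's original formulation.

    import math

    def hartley_information(n_signs, alphabet_size, base=10):
        # Hartley's measure H = N log S for a message of n_signs symbols
        # drawn from an alphabet of alphabet_size symbols.
        return n_signs * math.log(alphabet_size, base)

    # A 5-letter message over a 26-letter alphabet:
    print(hartley_information(5, 26))            # about 7.07 (base-10 units)
    print(hartley_information(5, 26, base=2))    # about 23.50 bits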

Since Hartley’s time, this definition of information as a selection of symbols has been generally accepted, although widely interpreted. As a result, Hartley’s theory crystallized into an exact mathematical definition, provided by Shannon (1948). According to him, the amount of information I conveyed by an event α occurring with probability p(α) is:

  • I = - log2 p(α) [2]
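Equation [2] can be computed directly, as in the short sketch below (the function name is an illustrative assumption). It also shows that a certain event, p(α) = 1, carries zero information in this sense, which is exactly the situation discussed next.

    import math

    def self_information(p):
        # Shannon's self-information I = -log2 p(alpha) = log2(1/p), in bits.
        return math.log2(1 / p)

    print(self_information(0.5))   # 1.0 bit (a fair coin toss)
    print(self_information(1.0))   # 0.0 bits (a certain event carries no information)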

This approach is not useful in business decision-making. Let us assume, for example, that the message α = “the distance from Kalamazoo to Chicago is 150 miles” has p(α) = 1 and therefore I = 0, since log2 1 = 0 (because 2^0 = 1). In other words, from the quantitative perspective, this message contains no information. However, for an individual using a personal car for a business purpose, this message contains information that can be measured monetarily: if each mile driven is compensated at $0.40, those 150 miles mean $60 in information value.

An increase in information yields a corresponding reduction of chaos, or entropy. Entropy, in statistical thermodynamics (the Second Law), is a function of the probability of the states of the particles that form a gas. In quantitative communication theory, entropy indicates how much information must be introduced into a given information-oriented system to make it informationally organized and, at the same time, reduce its chaos. The relationship between information and entropy is expressed most objectively by the Shannon-Weaver formula (1949):

  • H(α) = - Σ p(α) log2 p(α) (BIT: Binary digIT) [3]
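A minimal sketch of formula [3], again with assumed names and example values only: it computes the entropy of a discrete probability distribution in bits, treating p log2 p as zero when p = 0.

    import math

    def entropy(probabilities):
        # Shannon-Weaver entropy H = -sum(p * log2 p), in bits;
        # zero-probability outcomes contribute nothing (0 * log 0 -> 0).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))     # 1.0 bit  (a fair coin)
    print(entropy([0.25] * 4))     # 2.0 bits (four equally likely outcomes)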

In a descriptive thermodynamic sense, entropy is referred to as a “measure of disorder.” Information introduced to a given system eliminates that disorder and is therefore said to be “like” negative entropy or order. Starr (1971) demonstrates the idea of entropy using the following example: suppose that eight different commands can be transmitted from the bridge of a ship to the engine room. If each of those commands is equally likely, then the probability of any of these being sent is p=1/8. Knowing p, entropy H can be determined:

  • H = -8[(1/8) log2 (1/8)] = log2 8 = 3 [4]

This result indicates that eight different orders, coded into a 3-bit binary format (000 through 111), can be transmitted via a 3-bit-wide communication channel, as illustrated in the sketch below.
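To make Starr's example concrete, the sketch below lists eight hypothetical bridge commands (the specific command names are an assumption, since the example does not enumerate them), verifies that their entropy is 3 bits, and assigns each command a distinct 3-bit code.

    import math

    # Eight hypothetical bridge-to-engine-room commands (illustrative names).
    commands = ["full ahead", "half ahead", "slow ahead", "stop",
                "slow astern", "half astern", "full astern", "stand by"]

    # Each command is equally likely: p = 1/8, so H = log2 8 = 3 bits.
    p = 1 / len(commands)
    H = -sum(p * math.log2(p) for _ in commands)
    print(H)  # 3.0

    # Hence each command fits into a distinct 3-bit code word:
    for i, command in enumerate(commands):
        print(f"{i:03b}  {command}")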

The entropy function is widely used in communication networks, in coding, for assessing channel capacity and code efficiency. From the human communication point of view, however, this perspective has limited application, because it attaches no human-oriented meaning to the “bits and probabilities.” Its significance is technical, concerning how to design a communication channel. Finally, the entropy function lacks the semantic meaning of information, which is what drives human communication.
