Information

Definition

In our modern techno-cultural discourse, “information” has become a ubiquitous term for all types of knowledge and communication, but this was not always the case. Information was first formulated as a mathematical-theoretical entity by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication”. There, information is defined as a function of the probability of a given message, allowing engineers to compute the most efficient way to encode any message. In Shannon’s theory, all communication can in principle be reduced to binary code, allowing “meaningful” communication to transcend the “noise”, chaos, or entropy that is understood as the backdrop to communication.[1] It was a theory designed to maximize the efficiency of communication channels, and it underlies the global telecommunications networks and the internet we know today.
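
To make the probability framing concrete, here is a minimal Python sketch (mine, not the article’s) of Shannon’s self-information, I(x) = -log₂ p(x): the information value of a message, in bits, depends only on how probable that message is. For N equally likely messages this reduces to log₂ N, which is why five yes/no questions identify one of 32 possibilities (compare note 1 below), and why a message the receiver already expects carries no information at all.

    from math import log2

    def self_information(p: float) -> float:
        """Bits carried by a message that occurs with probability p."""
        if not 0.0 < p <= 1.0:
            raise ValueError("p must be in (0, 1]")
        return 0.0 if p == 1.0 else -log2(p)

    # One of 32 equally likely numbers: -log2(1/32) = 5 bits,
    # i.e. five binary (yes/no) questions suffice to find it.
    print(self_information(1 / 32))  # 5.0

    # A message whose content is already certain tells us nothing:
    # p = 1 yields 0 bits.
    print(self_information(1.0))     # 0.0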

There are several implications inherent in the idea of “information”. The first is a metaphysics of pattern/randomness that underlies information theory. Successful communication entails a message being formulated, encoded, transferred, decoded, and understood.

If I send an e-mail from a Mac to a PC, my patterned communication might be disrupted, turning my apostrophe into “#*%”. This is an example of the encoding/decoding process failing and entropic noise interfering with my patterned communication. Entropy/noise (the two are equated in Norbert Wiener’s formulation of information) is the constant enemy of successful information transfer. Yet entropic noise is also necessary for successful communication: if there is no ambiguity, there is no lack for the information transfer to fill. If I already know it is Thursday, an interlocutor’s statement “It is Thursday” has little information value.[2] Information theory is thus about preserving the pattern of communication against an onslaught of noise that is simultaneously required by, and working in opposition to, that pattern.
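
The garbled apostrophe is exactly this kind of encoding/decoding failure. The precise Mac-to-PC mechanism will vary, but the general phenomenon is easy to reproduce; in this Python sketch (a hypothetical illustration, not the article’s example), the sender’s UTF-8 bytes are decoded with the wrong legacy codec:

    # The sender encodes a message containing a curly apostrophe
    # (U+2019), which UTF-8 represents as three bytes.
    sent = "it\u2019s Thursday".encode("utf-8")

    # The receiver assumes a one-byte-per-character Windows codec,
    # so the three apostrophe bytes decode as three stray characters.
    received = sent.decode("cp1252")
    print(received)  # itâ€™s Thursday

The pattern survives only when sender and receiver share the same coding convention; the noise here lies not on the wire but in the mismatch between encodings.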

Perhaps the most profound implication of “information” is that it was formulated with no reference to “meaning”. Because it was formulated in reference to maximum efficiency in communication technology, information value needed to be autonomous from the specific contexts in which it is enacted. For example, the e-mail “We are all going to die today” is “informationally” the same whether it is a joke or a warning. Mathematically this made sense for Shannon and Wiener. Accounting for the range of variables involved in determining how much meaning, and what meaning, someone will derive from a communication would yield a vastly more complex theory.[3] Semantic meaning is always dependent on context, yet information theory ignores context in favor of mathematical ease.
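
The exclusion of meaning is visible in the very shape of the mathematics. The hypothetical Python sketch below estimates Shannon entropy from a message’s own symbol frequencies (a simplification of Shannon’s source model); note that its signature offers no way to pass in a context such as “joke” or “warning”:

    from collections import Counter
    from math import log2

    def entropy_bits_per_symbol(message: str) -> float:
        """Shannon entropy estimated from the message's own symbol
        frequencies: text in, bits out, context nowhere."""
        n = len(message)
        return -sum((c / n) * log2(c / n) for c in Counter(message).values())

    msg = "We are all going to die today"
    # Joke or warning, the same text yields the same number.
    print(entropy_bits_per_symbol(msg))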

Shannon was aware of the limited applications of his theory of information. When scholars from other fields tried to use “information” as a means for conceptualizing social phenomena, Shannon warned that he did not see “too close a connection between the notion of information as we use it in communication engineering and what you are doing here … the problem here is not so much finding the best encoding of symbols … but, rather, the determination of the semantic question of what to send and to whom to send it.”

Despite Shannon’s caution, “information” proved too strong a meme to contain, and many scholars adopted the term without considering its larger metaphysical implications. It is one of the central paradoxes of modern techno-culture that it reduces all value to information, despite information’s inherent lack of reference to value itself.

Wiener himself often conflated information with meaning. In The Human Use of Human Beings, one of the founding documents of cybernetics, Wiener contrasts information (associated with patterned organization, communication, form, and coherence) with entropy (the force of disorder, randomness, and disintegration).[4] The move was partly justified by the fact that Shannon’s equation for information has exactly the same form as Ludwig Boltzmann’s equation for thermodynamic entropy, the quantity at the heart of the second law of thermodynamics.[5]
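
For reference, here are the two equations whose formal identity is at issue, in their standard textbook forms (the article itself does not write them out):

    H = -\sum_i p_i \log_2 p_i        % Shannon's information entropy
    S = -k_B \sum_i p_i \ln p_i       % Boltzmann-Gibbs thermodynamic entropy

Up to the constant k_B and the base of the logarithm, the expressions are identical; as note 5 below recounts, whether information is then identified with entropy or with its opposite comes down to the choice of a sign.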

Asking “is this devil (entropy) Manichean or Augustinian?”, Wiener concludes that entropy, the devil faced by Western science, is Augustinian. We do not have to accept this conclusion to note the significance of Wiener’s use of a theodicean metaphor to explain the nature of entropy and information. Entropy here is understood as evil, the (non-)entity that marks the absence of information, meaning, and the good. Erik Davis explains how Wiener’s theodicy gives information, and, derivatively, cybernetics, cosmic significance:

The devil that the scientist fights is simply confusion, the lack of information, and not an organized resistance waged by some dark trickster. “Nature offers resistance to decoding, but it does not show ingenuity in finding new and undecipherable methods for jamming our communication with the outer world”. The enemy is dumb and blind, Wiener says, “defeated by our intelligence as thoroughly as by a sprinkle of holy water.”[6]

With Wiener, information becomes the cybernetic “holy water” that defeats unintelligent nature and is linked “to meaning, value, and life itself. Wiener even suggests that the order- and form-generating power of information systems is basically analogous to what some people call God.”[7] Information is our primary weapon against “nature’s [entropy’s] tendency to degrade the organized and to destroy the meaningful.”[8] Thus the reification of information becomes complete in Wiener’s cosmological extension of Shannon’s laws of efficient communication.

References

  1. Hayles gives a good example of how Shannon’s information theory works: imagine a paranoid bookie who has a code for callers who are placing a bet. When the caller places a bet, there is a message that necessitates a binary response (“If your number is between 1-16, press 1; between 17-32, press 2”). Through five of these question/binary-response rounds (between 1-8? between 1-4? 1 or 2? is it 1?), the program will know your number, because 32 = 2^5. Information can therefore be understood as I = log₂ N, N being the number of possibilities in a given communication, assuming equal probability of each option. Since there is never a real-life situation in which ambiguity is nonexistent, some type of information can always be extrapolated from an utterance. Perhaps my friend said, “It is Thursday” because he thinks I have my head in the clouds, or because there are two Thursdays this particular week, causing general confusion.
  2. N. Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago, Ill.: University of Chicago Press, 1999), p. 8.
  3. Donald MacKay’s information theory tried to do exactly this and became the dominant paradigm in British information theory, but it was soon displaced as Wiener and Shannon’s American theory became the industry standard. Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics, p. 56.
  4. Ibid., p. 54.
  5. There is a very interesting story behind this math. When Shannon found the equation for information to be the same as the equation for entropy, he equated the two and saw information as entropy. Wiener took the equation and reversed it, defining information as the opposite of entropy. In mathematical terms this is only the difference between a plus and a minus sign, but Wiener’s interpretation became the dominant paradigm for understanding information, a paradigm that allowed him to make the larger metaphysical and cosmological metaphors that eventually became the basis of transhumanist metaphysics. Ibid.
  6. Ibid., p. 90, quoting Norbert Wiener, Cybernetics; or, Control and Communication in the Animal and the Machine (New York: M.I.T. Press, 1961), pp. 36, 34.
  7. Erik Davis, Techgnosis: Myth, Magic + Mysticism in the Age of Information (New York: Three Rivers Press, 1998), p. 86.
  8. Ibid., p. 86.