Information Is a Physical Order
This week’s readings about information theory remind me of a book I read last semester, Why Information Grows: The Evolution of Order, from Atoms to Economies. The author, César Hidalgo, maintains that information is not a thing; rather, it is the arrangement of physical things, a physical order. The counterpart of order is randomness, so information grows by overcoming randomness. This view aligns with Claude Shannon’s. Borrowing the term from thermodynamics, Shannon calls this randomness, or uncertainty, “entropy.” Most importantly, Shannon shows that entropy can be measured and controlled, which makes it possible to reduce uncertainty in order to generate and communicate information. Since entropy can be measured, information, by the same logic, can be measured as well. This basic measurement quantifies how much information is needed to encode, transmit, and decode electronic signals, and its unit is the bit (binary digit), which comes from Boolean logic.
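To make this measurement concrete, here is a minimal sketch (my own illustration in Python, not from the readings): Shannon entropy assigns a number of bits per symbol to a source based only on how probable each symbol is, with no reference to what any symbol means.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits, computed from symbol frequencies alone."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A maximally uncertain source (like a fair coin) carries 1 bit per symbol;
# a heavily skewed source is more predictable, so each symbol carries less information.
print(shannon_entropy("HTHTHTHT"))  # 1.0
print(shannon_entropy("HHHHHHHT"))  # ~0.54
```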
Why can’t we extrapolate from the “information theory” model to explain the transmission of meaning?
Because the information theory model excludes meaning.
The signal transmission model is built on binary mathematics combined with probability theory. It is a method for transmitting signals, not meanings. People often blur the boundary between signal transmission and meaning transmission because we have the ability to interpret messages and infuse them with meaning. But what goes through the physical wires is not meaning; it is signal. The information itself does not contain any meaning.
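A toy sketch (again my own illustration in Python) makes this boundary visible: the receiver can recover the exact string of symbols from the bits on the wire, but nothing in those bits carries what the string means.

```python
# The channel only moves bits, not meaning.
message = "HELLO"

# Encode: map each character to its 8-bit code (UTF-8 here).
bits = "".join(format(byte, "08b") for byte in message.encode("utf-8"))
print(bits)  # 0100100001000101010011000100110001001111

# Decode: the receiver reconstructs the exact symbols from the bits...
decoded = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("utf-8")
print(decoded)  # HELLO

# ...but whether HELLO is a greeting, a test string, or a distress call is
# decided by the people at both ends, not by anything in the bit stream.
```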
Where are the meanings in our understanding of messages, media, and artefacts?
Meaning is created during semiosis.
Meaning exists nowhere but in the process of perceiving signals. As socially symbolic beings, we live in technically mediated symbol systems and use information to exchange meanings. Since humans are social and collective animals, meaning-making happens within a community whose members share common ground for interpreting specific signals. How people interpret signals is embedded in social structure and context, but that does not mean meaning exists somewhere as a thing. Meaning-making is a dynamic process, even though it is, to some extent, shaped by social context.
What is needed to complete the information-communication-meaning model to account for the contexts, uses, and human environments of presupposed meaning not explicitly stated in any specific string of symbols used to represent the “information” of a “transmitted message”?
Information theory + semiotics = the whole story.
Shannon’s signal transmission model explains very well how information is encoded and decoded at the physical level. What it leaves out is the meaning system. Semiotics bridges this gap between bits and meaning: it addresses the process that leads from symbols to meanings, which helps complete the path from the bit, itself a kind of symbol, to meaning. Information theory only describes the input and output on both sides of the black box, while semiotics clarifies what is inside the box. For example, Peirce’s triadic structure (object, representamen, interpretant) and Jackendoff’s parallel architecture help us understand how a symbol is interpreted on the phonological, syntactic, and semantic levels simultaneously.
[1] Irvine, Martin. “Introduction to the Technical Theory of Information.”
[2] Hidalgo, César. “Why Information Grows: The Evolution of Order, from Atoms to Economies.”