Information transmission and generation

Why is the information theory model essential for everything electronic and digital, yet insufficient when extended to model meaning systems (our sign and symbol systems)?

“Shannon’s classical information theory demonstrates that information can be transmitted and received accurately by processes that do not depend on the information’s meaning. Computers extend the theory, not only transmitting but transforming information without reference to meaning. How can machines that work independently of meaning generate meaning for observers? Where does the new information come from?”

Denning and Bell pose this question in their piece on solving the information paradox as it applies to computers today: the conflict between the classical view, in which information can be processed independently of its meaning, and the empirical fact that meaning, and thus new information, is generated in that very processing.

Shannon’s classical information theory proposed that information could be encoded and transmitted by a sender with enough redundancy to overcome noise and equivocation, allowing a receiver to decode it and make sense of the message. The main concern was to “efficiently encipher data into recordable and transmittable signals” (Floridi, 2010, p. 42). In his Very Short Introduction to Information, Floridi explains that MTC (the Mathematical Theory of Communication proposed by Shannon) applies so well to information and communication technologies, like the computer, because these are syntactic technologies, ready to process data on a syntactic level (2010, p. 45). As Floridi explains, for information to exist according to the General Definition of Information (GDI), there must be data that is ‘well formed’ and meaningful. ‘Well formed’ means the data is “rightly put together according to the rules (syntax) that govern the chosen system, code, or language being used” (2010, p. 21). Shannon’s theory deals with information at this syntactic level in order to find a way to encode and transmit it.
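
To make this concrete, the following is a minimal sketch (in Python, with illustrative names of my own such as noisy_channel; a toy example, not Shannon’s own construction) of the kind of scheme his theory underwrites: a repetition code adds redundancy so that a receiver can recover the message by majority vote even after noise, and nothing in the process ever consults what the bits mean.

    import random

    def encode(bits, r=3):
        # Repetition code: send each bit r times (redundancy against noise).
        return [b for b in bits for _ in range(r)]

    def noisy_channel(bits, p=0.1):
        # Flip each bit with probability p; the channel never asks what the bits mean.
        return [b ^ 1 if random.random() < p else b for b in bits]

    def decode(bits, r=3):
        # Majority vote over each group of r repetitions recovers the likely original bit.
        return [1 if sum(bits[i:i + r]) > r // 2 else 0
                for i in range(0, len(bits), r)]

    message = [0, 1, 1, 0, 1, 0, 0, 1]   # any bits at all; their meaning is irrelevant
    received = decode(noisy_channel(encode(message)))
    print(received == message)           # usually True under moderate noise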

Denning and Bell’s question arises because we see people today communicating and creating through interactive computer programs: how does meaning emerge from the transmission of data at this syntactic level? They resolve the paradox by relying on a more comprehensive theory of information, proposed by Rocchi, which holds that information has two parts, sign and referent, and that meaning emerges in the link between the two (p. 477). Moreover, they explain that the interactive features of today’s computers allow for the creation of meaning: every time a user’s interaction with a computer produces a new output, meaning (new information) emerges because the user puts together sign and referent, making sense of the transmitted data.
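
As a toy sketch of the two-part structure the authors attribute to Rocchi (the class and field names here are illustrative, not Rocchi’s notation), one might model an item of information as a pairing of sign and referent: the channel carries only the sign, and the observer supplies the link in which meaning, and hence new information, resides.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Information:
        sign: str      # what the channel actually transmits: mere symbols
        referent: str  # what the observer takes those symbols to stand for

    transmitted = "72"  # the channel carries only this string of characters
    interpreted = Information(sign=transmitted, referent="degrees Fahrenheit outside")
    print(interpreted)  # meaning lives in the pairing, not in the sign alone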

The authors paraphrase Tim Berners-Lee’s interpretation of this process on the web: “someone who creates a new hyperlink creates new information and new meaning” (p. 477). In doing so, they help illustrate how the Internet, a sociotechnical system that can be viewed from a systems perspective attentive to its modular components and their interactions, can also be viewed from the perspective of information transmission. In both accounts, the system only makes sense once all of its components are considered: not only the sender, receiver, channel, and message, but also the processes by which the message is linked to a referent within the broader system. The classical information theory model can thus be complemented by this two-part understanding of information while remaining an essential part of our electronic information and communication technologies.


Peter J. Denning and Tim Bell, “The Information Paradox.” American Scientist 100, Nov.–Dec. 2012.

Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010.