“Man the food-gatherer reappears incongruously as information gatherer” (McLuhan, 1967, quoted in Gleick, p. 7).
The quote above struck me as especially poignant in today’s world of ubiquitous computing. Information is everywhere, enabling communication that bridges perceptions, distances and even foreign languages.
A relevant example, and one that may act as a microcosm of some of the theories we explored in the readings, is Google Translate, an app that translates one language into another in real time. This is both reflective of and dependent on Shannon’s theory, as it requires the following:
Info Source > Transmitter > SIGNAL > Receiver > Destination
(language input)                               (new language)
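The pipeline above can be sketched in code. This is only a toy illustration of Shannon’s source-to-destination model; the tiny “codebook” below is invented for the example and is of course nothing like how Google Translate actually works:

```python
# A toy version of Shannon's model:
# information source -> transmitter (encoder) -> signal -> receiver (decoder) -> destination.
# The codebook is a made-up stand-in for a real translation model.

codebook = {"hello": "bonjour", "world": "monde"}

def transmit(message):
    """Transmitter: encode each English word into the signal (here, French words)."""
    return [codebook.get(word, word) for word in message.split()]

def receive(signal):
    """Receiver: decode the signal into a message for the destination."""
    return " ".join(signal)

signal = transmit("hello world")   # the encoded signal: ['bonjour', 'monde']
print(receive(signal))             # prints "bonjour monde"
```

The point of the sketch is that no “elves” sit between the two functions: the meaning travels as an encoded signal that the receiving end decodes according to a shared, pre-programmed scheme.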
It is not, as it may seem, “a room full of bilingual elves” working behind the scenes to convert one language into meaning for the receiver; rather, it is a microcosm of the way Shannon’s theory works. Much like the diagram Kevin drew for us last week to explain how Face ID works on an Apple iPhone, there is a seemingly obscured process that goes on as the message is transmitted from point A to point B: in this case, from a native speaker of English, for example, to a native speaker of French, rendered in the receiver’s language so that the two can communicate. The input (the source language) is decoded to reflect the chosen, pre-programmed output. I found this video explaining how Google Translate works to be quite illuminating:
Therefore, my understanding is that whether the information transmitted is received successfully depends largely on the receiver, circling back to the concept of entropy in conjunction with the freedom of choice one has in constructing a message.
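The link between entropy and “freedom of choice” can be made concrete: the more equally likely messages a source is free to send, the higher its entropy. A small sketch of Shannon’s formula H = −Σ p·log₂(p), with made-up probabilities chosen only to illustrate the idea:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the message probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source that always sends the same message carries no information.
print(entropy([1.0]))        # prints 0.0
# Four equally likely messages: maximum freedom of choice, 2 bits.
print(entropy([0.25] * 4))   # prints 2.0
# A biased source (one message far more likely) has lower entropy.
print(entropy([0.9, 0.1]))   # roughly 0.47 bits
```

In Shannon and Weaver’s terms, the uniform source leaves the receiver maximally uncertain about what will arrive, which is exactly why each received message resolves the most uncertainty.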
• Irvine, Martin. “Introduction to the Technical Theory of Information.” Feb. 4, 2019.
• Gleick, James. The Information: A History, a Theory, a Flood. New York, NY: Pantheon, 2011.
• Shannon, Claude E., and Warren Weaver. The Mathematical Theory of Communication. Champaign, IL: University of Illinois Press, 1949.