This week’s readings were dense for me, especially the mathematical parts of information theory. Still, working through them was rewarding, since it refreshed my original understanding of “information”.
Imagine we have an idea in mind. There are multiple ways to express it: painting it on a wall, writing a song, sending a telegraph, posting it on Facebook, etc. These are different forms of the same idea, the same information. But why are they the same? If we wanted to prove that vapor and ice are essentially the same, we would probably list the chemical elements in each and compare them. Information is different. It is used so metaphorically and so abstractly in daily life that few of us can state its exact meaning in a technical context. Are there fundamental “particles” common to all forms of communication? Can we measure information scientifically? These are the questions Shannon answers in information theory, where he applies mathematical formulas to strictly measure the amount of information, reflecting the statistical nature of how information is expressed. It is similar to measuring mass: whether the object is water, rock, or a human body, we can use a standard unit such as the kilogram to measure it and compare accurately. What Shannon did allows us to precisely measure and compare information using a quantity called “entropy”. With this information scale, we can say that some hieroglyphics on a wall carry the same amount of information as a page of an unknown book, a piece of music, or a string of unreadable code, as long as they all contain the same number of bits. Each bit corresponds to the answer to one yes/no question, and entropy measures the average uncertainty, in bits, of a message from a given source. This simple measure applies even to the most powerful invention in human history: language and symbol systems.
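The yes/no-question intuition above can be made concrete with Shannon’s entropy formula, H = −Σ p·log₂(p). The following is a small sketch of my own (the function name and examples are mine, not from the readings), just to show how the formula turns probabilities into bits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of uncertainty:
# one yes/no question settles the outcome.
print(entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each flip
# carries less information on average.
print(entropy([0.9, 0.1]))   # about 0.47 bits

# A fair 8-sided die: three yes/no questions pin down the outcome.
print(entropy([1/8] * 8))    # 3.0
```

This is why the “information scale” works across media: the measure depends only on the probabilities of the symbols, not on whether they are hieroglyphics, notes, or letters.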
For the transmission model, as clearly shown in Shannon’s original diagram, there are six basic elements: an information source, which produces a message; a transmitter, which encodes the message into signals; a channel, through which the signals travel; a receiver, which reconstructs the message from the signal; a destination, where the message arrives; and a dysfunctional factor, noise, which may interfere with the message as it travels along the channel. In conversation, one person’s mouth is the transmitter, the signal is the sound waves, the other person’s ears are the receiver, and the noise might come from surrounding distractions. For the telephone, the channel is a wire and the signal is an electrical current; the transmitter and receiver are the telephone handsets, and the noise might include static or crackling on the wire. As for the mobile phone in my hand, it converts my voice into electrical signals, which are then transmitted as radio waves and converted back into sound by my friend’s phone. Parallel to Lasswell’s model of communication (“who says what in which channel to whom with what effect?”), this transmission model vividly depicts a commonsense understanding of what communication is.
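The source-to-destination pipeline can be sketched as a toy simulation. This is my own illustration, not anything from Shannon or the readings (the function name and noise model are assumptions): the transmitter encodes a text message into bits, the channel flips each bit with some probability, and the receiver decodes whatever arrives.

```python
import random

def transmit(message, noise_rate=0.0, seed=42):
    """Toy pipeline: source -> transmitter (encode) -> noisy channel
    -> receiver (decode) -> destination."""
    rng = random.Random(seed)
    # Transmitter: encode the message into a signal (a sequence of bits).
    bits = [int(b) for byte in message.encode("utf-8")
                   for b in format(byte, "08b")]
    # Channel with noise: each bit may be flipped with probability noise_rate.
    received = [bit ^ 1 if rng.random() < noise_rate else bit for bit in bits]
    # Receiver: reconstruct the message from the (possibly corrupted) signal.
    out = bytes(int("".join(map(str, received[i:i + 8])), 2)
                for i in range(0, len(received), 8))
    return out.decode("utf-8", errors="replace")

print(transmit("hello, world"))                   # noiseless channel: arrives intact
print(transmit("hello, world", noise_rate=0.05))  # noisy channel: some characters garbled
```

Even this crude sketch shows why noise is a structural element of the model rather than an afterthought: the receiver has no way to tell a flipped bit from an original one without redundancy added by the transmitter.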
“This model provides an essential abstraction layer in the designs of all electronic and digital systems. It does not provide an extensible model for the larger sense of communication and meaning systems that all our post-digital symbolic-cognitive technologies allow us to implement”. The transmission model has strengths in its simplicity, generality, and quantifiability, but also weaknesses in how it misrepresents the nature of human communication. First, it is a highly mechanistic model based on “conduit” and “container” metaphors. In these metaphors, the communicator puts ideas into words, which serve as containers, and sends them to others, who take the ideas out of the words. The process resembles transporting goods. But thoughts and feelings are not real “objects” or goods, and language cannot function exactly like a conduit, since the same words can be interpreted in different ways. The whole process of communication in this theory rests on biased assumptions about language. If this view of language were correct, learning something new would not be hard, since knowledge would be absorbed accurately and effortlessly. Second, the model is linear, while communication is not one-way: the receiver may give feedback and further influence the exchange. Third, the model assumes that communicators are isolated individuals with the same social roles and power. In reality, communication is a shared social system, and its participants are social beings with different roles, which means that not all meanings carry equal value. For instance, if my friends ask how I feel about my recent studies, I am likely to answer in a somewhat different way than I would if my professor asked the same question. Overall, the transmission model assumes communicators are isolated individuals, making no allowance for differing purposes, alternative interpretations, unequal power relations, or situational contexts.
All of these constraints make it insufficient to extend into a model for meaning systems. As noted in the article, “the semiotic dimensions of information theory are always there, but formally bracket off from the operational focus of electrical engineering and computing data designs.”
Martin Irvine, "Introduction to the Technical Theory of Information."
Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010.
Peter Denning and Tim Bell, "The Information Paradox." American Scientist 100 (November–December 2012).