Category Archives: Week 8

Context, you would need, and respect it, you will

“Meaning is not a property that lies in the data or bits; it is an event that happens at the same time as we pick up and decode the encoded data or bits.”

It’s hard for people to really look at “communication,” as so much about it is taken for granted. “Too much information – we call our era the information age and complain about information overload. As social beings, there are few moments in a day that don’t involve communication and interaction with others in language and other symbolic media” (Irvine, 2014). There are many kinds of mediums, starting with the most basic one: air. Sound waves and vibrations are transmitted through air and picked up by our ears.

Our eyes are another main receiver of information. Before language even existed, we communicated with body language and gestures. Later, written text became an important carrier of information. We learned how to recognize words and extract the information that was encoded into texts.

Now we have all kinds of digital mediums: videos, movies, music. They can be converted into various forms. Though we still deal with them mainly through our ears and eyes, the richness of these media and mediums provides more stimulation to our sensory organs than ever before. We tend to combine different layers of sense together to gain a better perception.

One thing that touched me is the concept of “noise.” In acoustics, noise clearly means the discordant sound that interferes with the musical sound. But the concept can be widened to any factor that prevents us from getting the intended message. We have always assumed that a better medium and better technology give us a better capability for delivering information. The answer is yes and no: while successful delivery in an “advanced medium” can achieve a better result, we have to keep in mind that it also requires more external factors to support the technology.

Meanings are transmitted during any communication process.

The signal-code-transmission model is useful for explaining what happens at the micro level: how messages are turned into signals on physical substrates and carried by encoding and decoding electrical signals. From the viewpoint of contemporary semiotics and information theory, the model seems a little outdated, since it does not include the impact of contexts (socioeconomic, cultural) in forming messages. However, I don’t think the model fails on this count; it just needs alternative explanations and extensions.

From my own interpretation, the process of encoding already embodies the analysis of contextual information. Since every piece of information carried by a substrate is internally a symbol (Irvine indicated everything IS meaning), it already contains myriad meanings contemplated by individuals and society. No single term can be analyzed stripped of its context, just as no one can live without their surroundings. For example, when any information regarding the 9/11 tragedy is electrically encoded to be digitally computed, it already embodies a certain degree of conception, subject to the personal experience of individuals who each have a distinct perception of the event. That perception stems from the narrative of the event, nurtured by collective consciousness to construct its ‘context.’ Thus, defining 9/11 as a ‘tragedy’ already indicates that contextual information is included in the specific information underlying the representamen.

Also, I think how contextual meaning circulates is not necessarily excluded from the narrative of the information model. Shannon’s hypothesis rests on the assumption that the encoding and decoding of information should be engineered to reduce entropy (unpredictability), which also corresponds, in my opinion, to the development of society and culture. The formation of civilization is, to a degree, meant to ensure that entities within its sphere of influence enjoy increasing welfare; therefore any unpredictable circumstances should be anticipated and circumvented. On this premise, we can apply the information model to the analysis of context, given that substrate and context abide by the same principles of development. The source can be interpreted as changes at the macro level, and the same effort should be given to reducing the noise – the unpredictable circumstances in the process of communication.


Martin Irvine, “Introduction to the Technical Theory of Information.”


Packet-switching across a network

There is so much information around us. As Floridi puts it, information is notorious for coming in many forms and having many meanings. Over the past decades, it has been common to adopt a General Definition of Information (GDI) in terms of data and meaning. That means we can manipulate, encode, and decode data as long as the data comply with the meanings (semantics) of a chosen system, code, or language. There has also been a transition from analogue data to digital data. The most obvious difference is that analogue data can only record information (think of vinyl records), while digital data can encode information rather than just recording it.

But how is the information measured?

Claude Shannon, in his publication “A Mathematical Theory of Communication,” used the word bit to measure information; as he said, a bit is the smallest unit of information, holding a single binary value, either 0 or 1.
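Shannon’s measure can be made concrete in a few lines of code. The sketch below (plain Python, nothing beyond the standard library) computes the entropy of a source in bits, using Shannon’s formula H = −Σ p·log2(p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the source's symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip is maximally unpredictable: each flip carries exactly 1 bit.
print(entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so each flip carries less information (~0.469 bits).
print(entropy([0.9, 0.1]))
```

The more predictable the source, the fewer bits each message carries, which is exactly why Shannon could quantify information without reference to meaning.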

When I think of information, I almost never associate it with data, but rather with meaning. In a way, information to me serves the function of communicating a message. But it is when we look at how the message is sent and delivered that we can see the data in it.

Shannon’s first diagram, a version of which he used for encryption and decryption techniques in World War II, outlines a simple, one-way, linear signal path without the surrounding symbolic and social motivation for the signs and symbols encoded, transmitted, and decoded.

Now let’s take a look at how information is sent over the web and how computers exchange data. In 1961, Leonard Kleinrock introduced the packet-switching concept in his MIT doctoral thesis on queuing theory, “Information Flow in Large Communication Nets.” His host computer became the first node of the Internet in September 1969, and the first message to pass over the Internet was sent from it.

So how does packet-switching work?

An animation demonstrating data packet switching across a network.

First, the TCP protocol breaks data into packets or blocks. Then, the packets travel from router to router over the Internet using different paths, according to the IP protocol. Lastly, the TCP protocol reassembles the packets into the original whole, and that’s how the message is delivered.
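As a toy illustration of these three steps (not real TCP, just the split, route, and reassemble idea), the sketch below numbers each chunk of a message, delivers the packets in scrambled order, and restores the original whole from the sequence numbers:

```python
import random

def send(message, size=8):
    # Step 1: break the message into numbered packets (a greatly simplified TCP).
    packets = [(seq, message[i:i + size])
               for seq, i in enumerate(range(0, len(message), size))]
    # Step 2: packets may take different routes and arrive out of order.
    random.shuffle(packets)
    return packets

def receive(packets):
    # Step 3: reassemble the packets into the original whole by sequence number.
    return "".join(chunk for _, chunk in sorted(packets))

msg = "Hello from the other side of the network!"
print(receive(send(msg)) == msg)  # True
```

The sequence numbers are what let the receiving end tolerate arbitrary arrival order, which is the heart of why packet-switching can use many paths at once.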

When you send a message from your computer to a friend, using the Internet as the means of communication, that message is divided into packets/blocks as we saw earlier. The packets find different paths from the modem to the router, to the Domain Name Server, and then to the appropriate web server using the Internet protocols; at that point the packets are reassembled into the original whole, and that’s how your friend receives the message. There is a trade-off between complexity and performance in these design principles, but the end goal of this architecture is an effective flow of information: the transmission of data packets from one end to the other.

As Dr. Irvine explains, information theory contributes to the designs for the physical architectures and kinds of digital information encoding and decoding that we now use in well-recognized, standardized formats and platforms. So information theory and semiotics together give the more complete picture of meaning-making in our digital electronic environment.


Floridi, Luciano. Information: A Very Short Introduction.

Gleick, James. The Information: A History, a Theory, a Flood. New York, NY: Pantheon, 2011.

Irvine, Martin. Introduction to the Technical Theory of Information.

Shannon, Claude E. “A Mathematical Theory of Communication.” The Bell System Technical Journal 27 (October 1948): 379–423, 623–656.


Interaction and common grounds—A further understanding of the information theory-Wency


The development of different forms of information transmission is inseparable from humans’ need for mutual interaction. In earlier times, people could only communicate in person, or relied on human carriers to pass information (i.e., letters) when they needed distant connections. Only later did people realize the usability of physical resources (i.e., electrical and magnetic waves, which are not completely limited by space and distance), and the electrical telegraph and the electromagnetic system of signals began to blossom. At this point, human beings and these machines were nearly isolated: the human communication system seemed to live in a world separate from the physical communication and transmission system. On the one hand, each individual in the human communication system constantly generates meanings. People communicate with each other within a social and cultural context that is more or less embedded in their memory and affects the whole process: how they make selections in perceiving signs, realize what is inside the box (i.e., the object), come up with further connotations, and produce output that contributes back to the social and cultural context. On the other hand, people understand how to utilize the bottom layer of communication (i.e., electrical signals) to pass signals regardless of distance. By correlating pulses of electrical current with an abstract symbol set, Morse was able to implement the interface between this bottom layer and the earliest human “compilers,” who interpreted those symbols and transferred them into a higher-level, human-readable form of information.
At this point, according to Shannon, what mattered was not the distance but the control: how to minimize the loss of information, counteract the noise that is inevitable during the transmission of signals along the channels, and recover the information on the receiver’s side. In A Mathematical Theory of Communication, he pointed out that the entropy of information (i.e., its randomness and uncertainty) can be measured and controlled mathematically, so that information can be transferred at the maximum rate without being significantly changed or lost. To be sure, human beings still interact with the physical communication system, but because updates by humans are input into the system quite slowly, such interaction is too slight to be taken into account.


  1. Growing interactions between human and machine

However, Shannon’s classical information theory – that information can be transmitted and received accurately by processes that do not depend on the information’s meaning – seems more and more doubtful by today’s standard. After Alan Turing’s first conception of what makes a computing machine (although it was purely imaginary, without physical support), the blossoming of the computer industry is obviously making the transformation of information (i.e., the creation of new information) incredibly fast. This seems easy to explain: computers operate on electricity passing between transistors, and these signals propagate at a speed close to that of light (3×10^8 m/s), which means a large binary sequence can be updated in a fraction of a second. Moreover, as compilers have taken over the position of the human reader of Morse’s era, interaction between human and machine has become much faster while remaining human-readable. When we read a text message or a digital image, listen to a piece of music, watch an online YouTube video, or write comments through the interface of a digital device, we are using symbolic interpretive processes all the time: mapping representations onto recognizable patterns and making inferences about what motivated the encoded information transmitted through the system. All of these events happen within a macro social and cultural context, where we absorb knowledge and basic rules that affect our interpretation, while also changing the context ourselves. Our user actions dynamically enter the front-end programming language of the computer and are transferred layer by layer until they reach the binary system, where data packets can be sent under the control of the TCP/IP model.
Therefore (according to the figure above), information, although still unchanged at the bottom level while being transferred, is constantly being updated at the application level (through the user interface) with the help of human agents as constant meaning generators.

  2. Understanding the similarities between the tokens of the Turing machine

The Turing machine, though depicted by Turing in an imaginary way in those early years, is a universal one that set the standard for any machine or system to be computable. According to the early depiction of the universal Turing machine, three major items must be mentioned here: tape, symbols, and states. A tape is a long strip divided into squares where symbols are written or read through the head of the control unit. States determine the machine’s actions, which in short can be described as: in state i, execute algorithm Ni. We can thus easily assign these items to different tokens of the universal Turing machine and thereby realize an incredibly large number of connections between the human mind and the computer. At a macro level, the tape, as temporary memory storing the symbols written onto it, suggests an analogy to human memory (both long-term and short-term) and also to mechanical memory (RAM, ROM, hard disks, and today’s cloud storage). The symbols would then be the contents stored in memory, which we can read, or better, retrieve for later actions (e.g., when analyzing a sentence according to the linguistic system, humans perceive the words and retrieve memory to map those lexicons syntactically). Last but not least, the states stand for all the processes by which we analyze something or make a decision (if this is the case, then I will…; otherwise I will…), and such states also map well onto the digital world, where we first interpret those states of mind into pseudocode and then code them into a large number of IF-ELSEs.
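The tape-symbols-states description above can be sketched in a few lines. The simulator below is a minimal toy of my own devising (not Turing’s original formalism): the rule table maps (state, symbol) to (symbol to write, direction to move, next state), and the example machine simply flips every bit on the tape and halts at the first blank:

```python
def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Minimal Turing machine: a tape of symbols, a head, and a state table.
    rules maps (state, symbol) -> (write_symbol, move 'L'/'R', next_state)."""
    cells = dict(enumerate(tape))  # the tape: square index -> symbol
    head = 0
    while state != "halt" and max_steps > 0:
        symbol = cells.get(head, blank)            # read the square under the head
        write, move, state = rules[(state, symbol)]  # look up the action for this state
        cells[head] = write                          # write, then move the head
        head += 1 if move == "R" else -1
        max_steps -= 1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# In state "start": flip each bit and move right; halt upon reading a blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("1011", rules))  # 0100
```

Different rule tables turn the same three ingredients into entirely different machines, which is exactly the sense in which the model is universal.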

Further, if we look at it at a micro level, what amazes us is that the physical structure of signals and systems even has a biological match: every neuron could somehow be considered a micro system containing a transmitter (axon terminals), a channel (axon), and a receiver (dendrites), where the electricity passing between transistors in a computer system is substituted by electrochemical signals, all more or less controlled by the central nervous system (probably similar to the CPU). What is more interesting is that even electrochemical signals are not completely analog (there are several instantaneous changes between states). What currently confuses many psychologists is how these electrochemical signals later generate meaning at a larger semantic level (in computer systems, by contrast, we know how electricity is represented in the binary system and goes all the way up to human-readable programming languages). But what we do know is humans’ ability to jump out of existing symbols and information into an external virtual imagination in order to complete their logical thinking. (If we fully understood how this is realized anatomically, we would probably be able to bring computers to a more autonomous level; for now we are still in the process of making computers more and more human-like, including enabling computers to compute in a more embodied way, i.e., capture.)

Nevertheless, no matter how abstract we have become, we need to bear in mind that all forms of information rest on a thermodynamic system, where information is inseparable from energy and physical structures (computers release heat, human beings consume calories while thinking, etc.). And since, according to the second law, entropy always tends to increase, how to recycle information efficiently can never be avoided in the discussion.


  1. Martin Irvine, “Introduction to the Technical Theory of Information.”
  2. Luciano Floridi, Information: A Very Short Introduction.
  3. James Gleick, The Information: A History, a Theory, a Flood. New York, NY: Pantheon, 2011.
  4. Peter Denning and Tim Bell, “The Information Paradox.” American Scientist, 100, Nov–Dec. 2012.

From Information to Information

As is written in Shannon’s famous paper, “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”[1]

Traditionally, at least for me, when trying to deliver a piece of information to another person as accurately as possible, what I would do is add as much description and as many postscripts as possible. However, what Shannon did is somehow totally the opposite.

It seems that he deals with information by abandoning the pragmatic elements – from my perspective, not only pragmatics but also semantics. The technical design in Shannon’s theory focuses on how to deal with the interpretable part of the whole information, the “type,” or the “pattern,” so that it can be put into the “semiotic envelope”[2] and delivered to the destination as accurately as possible. It seems that Shannon considered the “envelope” very little but paid full attention to how to encode and represent the “letter.” So what happens to the semiotic functions when the part of the whole information that can be represented by 0s and 1s goes through the physical architecture?

This reminds me of a really interesting conversation last week. The conversation happened on WeChat, a Chinese SNS application that allows free chat between friends.

This is how the dialogue goes in text.

Friend: “Which do you prefer, Samsung or Apple?”

Me: “Apple.”

Friend: “Why?”

Me: “iOS works better than Android.”

Friend: “But Samsung looks better.”

Me: “It is even forbidden on airplanes.” (P.S. Because of numerous reports of explosions, a specific Samsung smartphone series, the Note 7, is forbidden on planes in several Asian countries, including China and Japan.)

Friend: “Apple is a bit more expensive.”

Me: “It may work longer.”

Friend: “But I love Jing Boran.” (P.S. Jing Boran is an actor who is the new spokesman for Samsung.)

Me: “Well, it seems that you have already made up your mind.”

This is a really normal daily dialogue, but after learning information theory it looks a bit different to me. All the Chinese characters, punctuation marks, and even the spaces in the sentences are the representable and translatable part of the information that can be encoded into bits and delivered through the movement of electricity.
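As a small illustration of that encoding, here is how a Chinese word such as might appear in the dialogue becomes bits and back. (This uses UTF-8 for concreteness; WeChat’s actual wire format is of course more involved, and the word chosen is just a hypothetical sample.)

```python
text = "苹果"  # "Apple" in Chinese, a plausible token from the dialogue

# Each character is encoded into bytes (here UTF-8), i.e., into bits,
# before it can travel as electrical signals.
encoded = text.encode("utf-8")
bits = "".join(f"{byte:08b}" for byte in encoded)
print(bits)  # 48 bits: two characters, three bytes each

# The receiving side decodes the same bits back into characters;
# the *meaning* of the word is never part of this process.
decoded = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("utf-8")
print(decoded == text)  # True
```

The round trip is exact at the level of bits, which is precisely what Shannon’s model guarantees; everything about what “Apple” means to the two friends rides outside it.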

First, the most important question: what is the conversation about? The background story is that my friend wanted to buy a new smartphone and was struggling between the latest iPhone and the latest Samsung product. This background story is not information that my friend hoped to deliver to me through electricity this time, but it unconsciously becomes part of the “semiotic envelope” without which I would not understand what she is asking. The same goes for who Jing Boran is and why he is relevant to our conversation: these are all collective knowledge, at least collective knowledge between the two of us.

However, even with this pre-existing knowledge, when purely watching these words on the screen, what can we get? There is still much more semiotic logic at work, without which the whole message still cannot be received exactly. Here I would like to talk about Samsung and Apple. They are actually the names of two brands, but clearly my friend is not thinking of purchasing a whole company (this is also part of the collective mind). From my perspective, there is an imprecise usage here in our conversation, led by a really “semiotic” thing: brand. Neither company sells only the cellphone my friend is struggling with; each sells different series of smartphones and other electronic devices, such as tablets and laptops, simultaneously. In our conversation, these two words actually work as different tokens even when they appear the same way on the screen. Sometimes they separately represent the specific smartphone series my friend is interested in, sometimes all the smartphones sold under that brand, and sometimes the brand, the company, itself.

There is still a lot to dig into when considering how complex it is for my friend and me to go through the conversation smoothly. But the more complex this process is, the more amazing Shannon’s idea is to me. It is not only about how the idea enables information to be delivered in human society, but more fundamentally, how talented Shannon was to come up with an idea that somehow separates the symbolic system from the whole of information and uses a mathematical model to represent that specific part of it.

I do not think I have perfectly understood information theory, but the mental move that translates information in a general context into information in a technical system really struck me and has somehow influenced how I think of media.

[1] Shannon, C. “A Mathematical Theory of Communication.” ACM SIGMOBILE Mobile Computing and Communications Review, vol. 5, no. 1, 2001, pp. 3-55.

[2] Irvine, M. “Introduction to the Technical Theory of Information.”


How the Physical Components Are Made Abstractable

This week’s readings offer me innovative connections between the semiotics we learned before and communication systems, which broadens the boundaries of these seemingly separate concepts and creates a more comprehensive theoretical framework. I would like to try to answer the second question Prof. Irvine put forward: how the physical components of symbol systems are made abstractable into a different kind of physical signal unit for transmission and recomposition.

In 506, we learned the concept of “capture,” which means the conversion of human behavior and real-world data into machine input. The “capture” process involves humans and the world, a sensor, a transducer, an encoder, and machines and computers. It is consonant with Claude Shannon’s transmission model. Needless to say, both involve an information source and a destination. In the case of capture, the transmitter and receiver are the transducer and encoder, which transduce physical signals into electricity and then encode it into machine-readable language. The noise source also plays a vital role in this process: it refers to any disturbance affecting the accuracy and certainty of the information transmitted. In the “capture” case, noise can be an unstable Wi-Fi connection, typos, or a dead battery. Shannon even applied the physical term “entropy” to describe this uncertainty. In physics, entropy originally means a disordered, messy state: the messier, the higher the entropy. Things have a tendency to flow from low-entropy states to high-entropy states; in other words, from tidy states to messy states.
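The transducer and encoder stages of capture can be made concrete with a rough sketch. The voltage range and 8-bit resolution below are hypothetical choices of mine, just to show how a continuous physical measurement becomes a machine-readable number:

```python
def transduce(analog_value, v_min=-1.0, v_max=1.0):
    """Transducer stage: clamp a physical measurement into the sensor's voltage range."""
    return max(v_min, min(v_max, analog_value))

def encode(voltage, levels=256, v_min=-1.0, v_max=1.0):
    """Encoder stage: quantize the voltage into one of 256 machine-readable levels."""
    step = (v_max - v_min) / (levels - 1)
    return round((voltage - v_min) / step)

# A "captured" sound-pressure sample travels world -> transducer -> encoder:
sample = 0.4237                      # a continuous, real-world value
code = encode(transduce(sample))     # an 8-bit integer the machine can store
print(code)
print(f"{code:08b}")                 # the same value written as bits
```

Everything after this point in the pipeline, storage, transmission, decoding, operates only on the integer code, never on the original physical quantity, which is why noise introduced before encoding can never be removed later.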

Though we now have a basic idea of how information gets transmitted from physical components to digital symbols in computing systems, meaning has not been mentioned anywhere in the process, which is the very dilemma Peter J. Denning and Tim Bell put forward. If meanings are separated from the information transmission model, how do meanings get transmitted? They concluded that the sign-referent interpretation – that “information consists of both signs and referents” – can resolve this paradox.

According to Peirce’s model, there are three basic elements in semiosis: a sign, an object, and an interpretant. Simply speaking, a sign is a representamen, something that can be interpreted. An object is the subject matter of a sign and its interpretant. An interpretant is a sign’s actual meaning. They connect with each other tightly to form a complete meaning system; meaning does not exist in any one of them but in the whole process.

In addition, meanings are not isolated but largely depend on meaning communities. They are built upon socially shared common sense and agreed rules. For instance, a religious group has shared symbolic systems, such as rituals, clothing, or diet. Meanings are embodied in these symbols and get transmitted when the symbols are enacted. Therefore, meanings are not independent of the information transmission system but deeply inside the information transmission process itself. Just as Prof. Irvine stated, “Meaning is not ‘in’ the system; it is the system.”

Writing this, I am not sure whether I truly understand how to solve this issue. My understanding still feels obscure, and I fail to organize my thoughts logically. I hope this problem can be gone over in detail in class.



  1. Denning, P. J., & Bell, T. (2012). The information paradox. American Scientist, 100(6), 477.
  2. Irvine, M. Introduction to the Technical Theory of Information.

Noise in Information Communication

Shannon developed the famous information theory. In this theory, Shannon put forward the idea that noise is a necessary element in information communication, and he proved that we can reliably transmit units of information over noisy electrical channels. In communication theory, noise refers to anything that intervenes between the message source and destination; it obstructs the process of encoding and decoding information. Noise cannot be thoroughly avoided or eliminated, but it can be controlled or reduced as far as possible.
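Shannon’s point that reliability can be engineered despite noise can be illustrated with the simplest possible redundancy scheme: a repetition code with majority voting. (This toy sketch is my own illustration, not one of Shannon’s actual codes; the flip probability is an arbitrary assumption.)

```python
import random

def transmit(bits, flip_prob=0.1, repeat=3):
    """Send each bit `repeat` times over a channel that randomly flips bits,
    then recover each bit on the receiving side by majority vote."""
    received = []
    for bit in bits:
        # The channel may corrupt each copy independently.
        copies = [bit ^ (random.random() < flip_prob) for _ in range(repeat)]
        # Majority vote: the redundancy usually outvotes the noise.
        received.append(int(sum(copies) > repeat // 2))
    return received

random.seed(42)
message = [1, 0, 1, 1, 0, 0, 1, 0]
print(transmit(message))  # usually identical to message: three copies outvote one flip
```

With three copies, a single flipped copy of a bit is corrected automatically; increasing `repeat` drives the error rate down further, at the cost of sending more signal, which is exactly the trade-off Shannon’s theory quantifies.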

In information communication, the existence of noise can lead to inconsistency of information between source and destination; the communication might be unsuccessful and the information might be distorted. Noise can be divided into the following categories. First is physical noise, such as noise from the external environment or interference from a third party when two people are communicating. Second is semantic noise, which arises when the message source and destination have different understandings of certain terms and grammar, leading to communication barriers. Third is noise due to differences in social status, gender, occupation, and economic level between the message source and destination, which can cause interference and information distortion.

Here are a few examples of how noise works and generates:

  • Reading foreign novels. We all read some classics. When the languages are not the same, we can only depend on translation. That is to say, the translator plays the role of a middleman between the original author and the audience: he not only enables the communication but also, more or less, becomes noise in the communication. The reading process is also a process of information communication. While reading foreign novels, the first kind of noise comes from the outside environment: if the audience does not read in a quiet environment, any other sound, such as aircraft, car horns, or human voices, is noise in the process of communication. Secondly, because the translator’s ability and background knowledge differ from the original author’s, his cognition of the author’s viewpoint is subjective. According to “The Information Paradox,” in a communication system the information should not be subjective. That is to say, the communication process is divided into two parts: first between the author and the translator, and second between the translator and the reader. The gap between the end of the first part and the beginning of the second is closely related to the translator’s understanding of the author’s terms, grammar, and so on. The third kind of noise comes from the readers’ social status, gender differences, etc. As the saying goes, there are a thousand Hamlets in a thousand people’s eyes: different readers have different understandings of the same translation. After several noise disturbances, the information received by the audience is not the same as that sent by the original author.
  • Listening to music. The sounds that we recognize as a music genre also go through several layers of information processes. The digital audio file gets interpreted in software, which hands the information off to a codec; those signals are sent to an electronic transducer, which converts digital binary information into a form of energy we can perceive: sound waves played through the audio functions of our devices. In this process, there are several kinds of noise. One is the external noise described above. The second comes from the devices: for example, the MP3 player is too old, or the music was downloaded in HQ (compressed) quality rather than SQ quality, which is lossless and will not introduce extra noise. Also, if the audience uses headphones, the quality of the music differs from listening without them. These are all possibilities that can cause noise and affect the quality of the music. The third kind of noise arises because different audiences have different understandings of the same song. For example, consider special songs that singers write for their fans: fans of the singer and others will understand the same song differently, because their background knowledge of the singer is completely different. The fullest meaning that the singer puts into the song might not be transmitted to those who lack background knowledge of the singer.



  1. Denning, P. J., & Bell, T. (2012). The information paradox. American Scientist, 100(6), 470.
  2. Irvine, M. Introduction to the Technical Theory of Information.