Category Archives: Week 7

Information transmission and creation

Jun Nie
In computer networks, information is transmitted in the form of 0 and 1. Meaningful information is encoded by the modulator and transmitted quickly through channels. Upon arrival, it is decoded by the demodulator into information that can be interpreted by the receiver. In order to resist the interruption of the noise, designers have created various methods to identify the data that may be lost or garbled in the process so that it can minimize the negative impact and guarantee the accuracy. The information theory model is essential, because it helps us to understand the process of information transmission clearly. But we need to known that the meaning carried by the information is meaningless to the transducer during the transduction. Like the content of letter  is meaningless to the postman, the address on the envelope is the only message they need to know when they deliver it. Besides,  different delivers responsible for  interpreting the messages at respective level, which means that they even don’t need to read the whole long address, passing it to the next person in charge,  then he will deliver it to the next transfer station or the destination.

Similarly, encoded bytes, like the contents of a letter, make sense only to the recipient of the data, and different people may interpret the same information differently. I think the human brain is an advanced decoding machine with more flexibility and unpredictability. If we use the content/container and transport/conduit metaphors to describe communication, we will be confused about where new information comes from. This model simplifies the transmission process in a way that seems to suggest information never increases and keeps flowing in its original form. However, the amount of information is constantly enriched through spatial and intergenerational transmission; that is why we have such a rich culture. So where and how is new information created? There are two answers. One is that the computer itself can produce new output through programming; the other is that people generate new associations and build new connections from their own experience and accumulated knowledge as they interpret information. It reminds me that when I look through Wikipedia and click on hyperlinks to find more illustrations, the transmitted information always helps me make new associations for a paper with new content.

References:
Martin Irvine, Introduction to the Technical Theory of Information.

Peter Denning and Tim Bell, “The Information Paradox.” American Scientist 100 (Nov-Dec 2012).

Information systems and our society

This week’s reading was enlightening, to say the least. Our society has become saturated with the sharing and comprehension of information in various forms. The engineering of information transfer was arguably first worked out with the creation of Morse code and telegraph systems. The signal transmission model we have learned about has sets of corresponding entities that give and receive, send and deliver, organize and disseminate information in order to transmit it successfully. However, it is clear that this model, while extremely helpful to us as we navigate design thinking, has natural constraints when scaled up to mass communication.

Floridi’s description of the information society can explain the general layout of these systems: “The information society is like a tree that has been growing its far-reaching branches much more widely, hastily, and chaotically than its conceptual, ethical, and cultural roots.” 

This ‘chaos’ is born of the fact that information can be faulty, and therefore the sending of information isn’t necessarily “error-free.” These models also cannot describe the meaning or substance of a morsel of information. The information is reduced to its bare bones in order to be sent and delivered; this, however, is only a basic replica of what an information system is. Many more functions are implicated along the timeline, and there are layers of information and processes that cannot be seen in a rudimentary view of a system. Designing information systems must overcome this lack of transparency and the complexity of these systems. As is mentioned in the Irvine reading: “What we can’t observe is the ‘semiotic envelope’ that encloses all the analyzable layers in the whole information data-to-meaning system. Semiotic motivation wraps around the whole structure: we are never ‘outside’ a semiotic position in the information-meaning system because the technical design is all about encoding and representing interpretable representations.”

Martin Irvine, Introduction to the Technical Theory of Information

Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010.

James Gleick, Excerpts from The Information: A History, a Theory, a Flood. (New York, NY: Pantheon, 2011).

Peter Denning and Tim Bell, “The Information Paradox.” From American Scientist, 100, Nov-Dec. 2012.

Source to destination: The concept of ‘Travel’

For communication to take place in the full information theory cycle, information is sent by a sender from a source, travels through a channel (medium) toward the receiver, who then acknowledges receipt of the message, signaling that it arrived in the form and context the sender intended from the start.

Let us consider a package of books traveling from an Amazon warehouse in New York to a student in Washington: Amazon ensures that the books are securely packaged and intact for dispatch. The dispatch company is usually a third-party delivery company whose primary function is to pick up and deliver packages securely; it does not concern itself with the content of the package. Upon delivery, the student examines the contents of the box to ensure that the received package is consistent with the agreed-upon items (typically determined by the final book purchase made in the Amazon app). Amazon, however, expects some feedback if the student either receives a wrong package or is dissatisfied with some part of the service or the package itself, signifying a breakdown in a part of the process.

As technological advancements have made geopositioning and tracking possible, the student also enjoys the luxury of observing the path traveled by the packaged books via a tracking app, which updates at every major handover as the package travels toward the receiver. Tracking gives the student a sense of awareness and helps manage expectations, because the incoming delivery has a computed estimated time of arrival (ETA). This concept of a traveling package mirrors the way information is transferred from the sender (source) to the receiver (destination) in Gmail, for example. To understand what happens to an ebook sent as an email attachment from your home in New York to a friend in Washington, we might treat this traveling concept as a non-physical property in information theory (all types of signals travel). From here we can see how an ebook sent over email within seconds takes the form of a ‘package’ (like the physically packaged book dispatched from Amazon to the student). The ebook as a package is how we explain digital packets, i.e., the digitally transmitted streams of bits and bytes that the book becomes after the sender hits the send button.

Both senders and receivers of email know through basic cognition that the email, essentially an ebook transmitted as a data signal over a network, has ‘traveled’ in a digital sense. In the case of email, we can see how mentally engaged we become with the concept of travel, given that the traveling packet cannot be seen physically and there is no physical path that lets the sender or receiver adequately track the position of the ebook as a digital packet dispatched over the network – well, until it arrives. Both sender and receiver are unaware (in the sense that it cannot be seen from the email user interface) that the ebook as a digital packet passes through multiple black-boxed servers in place to help guarantee (by means of digital encoding and decoding processes) that the ebook arrives at the right place (the inbox) and to the intended receiver.

Thinking of emails and what happens after they are sent, by considering the physical aspects of travel as a concept in information delivery, supports one key statement: all of the processes in the semiotic dimensions of information theory are always there but formally bracketed off from the operational focus of electrical engineering and computing data designs.

References:

Ronald E. Day, “The ‘Conduit Metaphor’ and the Nature and Politics of Information Studies.” Journal of the American Society for Information Science 51, no. 9 (2000).

Martin Irvine, Introduction to the Technical Theory of Information 

The universality and uniqueness of information

Xiebingqing Bai

Information is a combination of universality and uniqueness. In daily life, we communicate our messages through diverse formats, such as writing, speaking, drawing, and singing. Information theory, founded upon probability and mathematical statistics, concerns the fundamental particle of all forms of digital communication. In the physical world, all items can be measured in kilograms or pounds, a unified scale for comparing different things. Information can likewise be measured on a universal scale: entropy. In thermodynamics, entropy means the degree of disorder; in information theory it similarly measures how improbable an event is. In its simplest form, information can be thought of as the exchange of symbolic differences. For example, the amount of information in two e-books can be compared using a unit called the bit, a measure of surprise, and the bit corresponds to the simplest possible question: yes or no. In this way, the multiple types of information on the Internet are essentially the same: entropy is the scale, and the bit is the minimal displayed unit. That is why the signal-code-transmission model is not a description of meaning: the model puts all kinds of information with different meanings on the unified scale of the bit.
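This universal scale can be computed directly. Here is a minimal sketch of Shannon’s entropy formula, H = -Σ p·log2(p), in Python:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
print(entropy([0.5, 0.5]))    # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))    # ~0.469

# A certain outcome carries no information at all.
print(entropy([1.0]))         # 0.0
```

Whether the outcomes are letters, pixels, or pages of an e-book, the same formula applies, which is exactly the sense in which information theory treats all content as “essentially the same.”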

When we use the content/container and transport/conduit metaphors, we split the real content of meanings out of the information transmission model, and assume that the bits working in electronics are merely containers of real meanings rather than possessing meanings themselves. The conduit metaphor helps us reason about the speed limits, maximum density, and transport construction of information transmission. With that in mind, we can deepen our thinking about how to design the conduit to maximize information density. Today’s trend of increasing bit density on electronic devices is a result of conduit thinking.

Since information theory is mainly about the universality of information transmission, it is not sufficient to explain the whole process of our daily communication on digital devices, which is unstable and volatile. So the combination of information theory and semiotics can tell the whole story. Language is where communication begins, and each language can be broken down into many symbolic patterns. During communication, we encode our thoughts into symbolic patterns and match received messages against existing ones. Beyond the pure transmission level, it is humans who possess interpretable features and make inferences with social-cultural meanings. Therefore, to successfully encode and decode information, sender and receiver should share similar social-cultural backgrounds and cognitive patterns. Different kinds of information are not so distinct in essence; it is our cognitive and symbolic ability that renders them unique. Information is universal by mathematical measurement, yet unique by our symbolic cognition.

And I have one question regarding information transmission: What kinds of “noise sources” could affect the transmission process? How can we avoid or reduce these noises?

References:

Martin Irvine, Introduction to the Technical Theory of Information

Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010.

Text message as information

This week’s reading helps us better understand information and its transmission systems. By simplifying the process of information transmission, Shannon’s model of an information system opens the black box of our communication devices. Digitization has made texting the most common, convenient, and simplest way of transmitting information.

By texting, we usually refer to SMS texting, which is sent over a cellular network. Our cell phones are always exchanging signals with a cell tower over a control channel, even when they are at rest. When a text message is typed, it is encoded via code books as bytes and transmitted in data packets, which form the signal the cell phone sends. The signal first arrives at the control channel, the medium that carries it; then it is stored at the short message service center, or sent immediately if the receiver is available. The receiver decodes the data packets; more specifically, the software translates the signal into information the user can understand.
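Real SMS uses the GSM 7-bit alphabet and network-specific packet formats, but the general pipeline described above — encode text as bytes, split it into sequenced packets, reassemble and decode on arrival — can be sketched in a toy form (UTF-8 is used here purely for simplicity):

```python
# Toy sketch of the encode -> packetize -> reassemble -> decode pipeline.

def packetize(message, size=4):
    """Encode text as bytes and split it into (sequence number, chunk) packets."""
    data = message.encode("utf-8")
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    return list(enumerate(chunks))

def reassemble(packets):
    """Reorder packets by sequence number, rejoin, and decode back to text."""
    ordered = sorted(packets)                    # packets may arrive out of order
    return b"".join(chunk for _, chunk in ordered).decode("utf-8")

packets = packetize("See you at noon!")
received = list(reversed(packets))               # simulate out-of-order arrival
print(reassemble(received))                      # See you at noon!
```

The sequence numbers play the role of the “address on the envelope”: the network only needs them, never the meaning of the message, to put the signal back together.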

By understanding the meaning of the text message and forming a reply, we know it has been successfully transmitted and received as information. According to the GDI, the reason we can understand the information transmitted by the text is that it is made of well-formed data, at both the syntactic and semantic levels. The syntax of a language is its grammatical rules, which render the meanings of the information, its semantic level. According to Professor Irvine, “meanings are enacted by cognitive agents who use collectively understood material sign structures in living contexts of interpretation and understanding.” Language is a kind of sign system with conventions and a set of man-made rules behind it. What we text serves as symbols of meaning: people recognize the pattern, and thus understand the meaning expressed by the signs.

Shannon’s approximations of language indicate that the statistical structure of a language is what lets us recognize its patterns. An English speaker can recognize the letters and words, and thus the sentences. However, if the information source is Chinese and the receiver knows nothing about Chinese, he may recognize the text only by its physical properties, not by its meanings, since Chinese is not in his knowledge system. In this case, the transmission of information is not successful, because the receiver cannot recognize the pattern of the information from the sender.
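Shannon demonstrated this statistical structure with “approximations” to English. The sketch below uses a handful of rough, illustrative letter frequencies (not Shannon’s actual tables) to contrast letters drawn uniformly at random with letters drawn according to English-like frequencies:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Rough relative frequencies for a few common English symbols (illustrative only).
freq = {" ": 18.0, "e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "n": 6.7, "s": 6.3}
letters = list(freq)
weights = list(freq.values())

# Zero-order approximation: every symbol is equally likely.
zero_order = "".join(random.choice(letters) for _ in range(40))

# First-order approximation: symbols drawn according to their frequencies.
first_order = "".join(random.choices(letters, weights=weights, k=40))

print(zero_order)    # statistically flat, noise-like
print(first_order)   # skewed toward common letters and spaces
```

Higher-order approximations (conditioning each letter on the previous ones) look progressively more like real English, which is exactly the pattern-recognition a fluent reader exploits and a non-reader of the language lacks.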

I also came up with a question: Is there any standard for measuring how successfully information is transmitted?

Reference

Martin Irvine, Introduction to the Technical Theory of Information

Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010.

James Gleick, Excerpts from The Information: A History, a Theory, a Flood. (New York, NY: Pantheon, 2011).

Peter Denning and Tim Bell, “The Information Paradox.” From American Scientist, 100, Nov-Dec. 2012.

Understanding about the Information Theory

From Wikipedia, we know that in signal processing, a signal is a function that conveys information about a phenomenon. In information transmission, a signal can carry audio, video, images, and other human cognitive representations. In information theory, these representations are encoded into signals, and the signals are transmitted through a channel. During transmission, noise can lower the quality of the signal; a certain level of redundancy helps ensure that most of the code reaches the decoder. The receiver’s decoder then decodes the codes, and the “original” information is rebuilt after decoding.
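One simple (if inefficient) form of such redundancy is a repetition code: each bit is sent several times, and the receiver takes a majority vote. The Python sketch below simulates a noisy channel with a fixed random seed so the run is reproducible:

```python
import random

def encode(bits, n=3):
    """Repeat each bit n times: the redundancy that protects against noise."""
    return [b for bit in bits for b in [bit] * n]

def noisy_channel(bits, flip_prob=0.1, rng=None):
    """Flip each bit independently with probability flip_prob (simulated noise)."""
    rng = rng or random.Random(42)
    return [bit ^ 1 if rng.random() < flip_prob else bit for bit in bits]

def decode(bits, n=3):
    """Majority vote over each group of n repeats recovers the original bit."""
    groups = [bits[i:i + n] for i in range(0, len(bits), n)]
    return [1 if sum(g) > n // 2 else 0 for g in groups]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(encode(message))
print(decode(received) == message)   # True: single flips within a group are outvoted
```

Tripling every bit is wasteful; practical codes (Hamming, Reed-Solomon, LDPC) achieve the same protection with far less redundancy, but the principle — extra bits spent to survive the channel — is the same.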

Why is the signal-code-transmission model not a description of meaning? In my opinion, the first reason is that humans do not interact with this transmission process directly: it is our cognitive representations that are encoded, transmitted, and decoded. From the codes alone we cannot understand their meaning, which is unlike any other agency in society and culture. Second, the model’s function is achieved by machines, and the machines’ work plays an even more important role than the meaning they carry: if the encoder, channel, or decoder does not work well, the information will be distorted.

The reason information theory is essential for everything electronic and digital is that these devices rely on it to function. Electronic and digital devices need to receive signals to perform their functions, especially smarter devices like computers and mobile phones. Cooperation between different devices depends on information theory: code is a common language among them, and through the encode-decode process, code can be transported through channels to different devices. It is safe to say that information theory makes electronic and digital devices more powerful.

In my perspective, information theory serves as an auxiliary to the human social meaning system. It surely enhances our ability to transmit and represent cognitive material. However, does it improve the meaning in these signals? No. Even worse, noise might damage the codes during transmission through the channel. Another reason is that humans do not use code to communicate with each other directly. Therefore, this theory helps a great deal with transmission, but it cannot be used to expand our sign and symbol systems.

References:

Martin Irvine, Introduction to the Technical Theory of Information.

Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010.

Features and limitations of information theory

Information theory defines information as something used to eliminate random uncertainty. In signal transmission theory, entropy was created to measure the uncertainty and redundancy of information quantitatively. In general, what a source sends is uncertain and can be measured in terms of the probability of its occurrence: the higher the probability of an event, the less information its occurrence carries, and vice versa. The unit for encoding and measuring information is the bit.
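This inverse relationship between probability and information can be made concrete with Shannon’s measure of self-information, -log2(p), a small sketch of which follows:

```python
import math

def surprise(p):
    """Self-information of an event with probability p, measured in bits."""
    return -math.log2(p)

# A highly likely event carries almost no information...
print(surprise(0.99))      # ~0.0145 bits

# ...while a rare event carries much more.
print(surprise(1 / 1024))  # 10.0 bits
```

An event with probability 1/1024 answers exactly ten yes/no questions’ worth of uncertainty, which is why the bit is the natural unit here.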

Information in transmission theory can be transmitted and received without its meaning. Accurate signal transmission depends on a point-to-point model, which regards information as content in a container, with sender and receiver at the two ends of a conduit. Sending information at specific frequencies, which guarantees integrity and reliability through transmission, can erase information errors.

However, in Shannon’s theory, information is dead: the theory does not consider semantics, nor the inconsistency or even the contradiction of information, so there is no need to accept feedback or make compromise adjustments. Information theory also does not consider that information is often incomplete and partial and needs to be integrated; people can usually derive new information from given information.

Moreover, much information can be perceived but cannot be measured, because no appropriate definition of such information has been found, and therefore no theory like Shannon’s can be established for it. For example, we can perceive the joy, anger, and sadness in emotional information, but it is difficult to measure them; we can only use vague adjectives to describe different degrees of feeling. We cannot say exactly how many bits one love contains, or how many more bits it has than another.

It is also difficult to give an objective quantitative description of information such as aesthetics, taste, and smell. This kind of information is highly subjective, which makes accurate measurement even more challenging. Due to the diversity of real things, their vast differences, and their endless changes, information theory does not apply to our meaning system. As Peter Denning asks in his article: “How can a system process information without regard to its meaning and simultaneously generate meaning in the experience of its users?”

Another reason information theory cannot be extended into models for meaning systems is its point-to-point structure. In many communication systems, information may come from, or be transmitted to, many different ends; in these situations, information theory is insufficient for designing models.

References:

Martin Irvine, Introduction to the Technical Theory of Information.

James Gleick, Excerpts from The Information: A History, a Theory, a Flood. (New York, NY: Pantheon, 2011).

Peter Denning and Tim Bell, “The Information Paradox.” From American Scientist, 100, Nov-Dec. 2012.

Ronald E. Day, “The ‘Conduit Metaphor’ and the Nature and Politics of Information Studies.” Journal of the American Society for Information Science 51, no. 9 (2000).

Information system: from machine to machine learning

Xueying Duan

An information system is a convergent system that combines telecommunication and computing techniques to deliver information to receivers. What makes the system reliable is the process by which the information (signal) is transformed into something most receivers can easily comprehend while the sender’s intention is interpreted clearly. The signal-code-transmission model is widely used in today’s situations and applications, whether we are transferring information through a physical speaker or a product’s designers need to deliver their purpose clearly through a screen. A central feature of the model is the randomness and uncertainty inherent in information communication: partly due to the limitations of technology, any change that happens during the transformation of information on a machine can be misunderstood. The introduction of information entropy therefore gave researchers a way to optimize the effectiveness of information communication.

The introduction of the bit also greatly helps with storage and communication in all electronic devices. The binary system matches the way digital information is created, transformed, and stored: the machine first tears every input apart and then reconstructs it. But the meaningful parts of our systems, like signs and symbols, on most occasions do not have a single, universally acknowledged interpretation. Moreover, most knowledge in our cultural accumulation cannot be reduced to a dichotomy, which makes it even harder to decode those mysterious codes.

Human culture’s continuity is, to some extent, based on human habits of historical learning and the gradual transformation of signs into symbols. Without a uniform rule for coding and interpreting, we humans can still receive other people’s information with a low error rate, while digital devices require complicated rules to regulate the communication process in order to eliminate or minimize the influence of noise. On the other hand, it is the consistency within machines and electronic devices that simplifies communication between different systems, as long as they share the same rules.

The theory of information systems reminds me of machine learning and AI. The process of machine learning is like a continuous attempt to reduce information entropy. As Denning and Bell say, a computer is a machine that uses some information to control how it transforms other information. Now that the techniques for controlling a computer are quite mature, scientists have started to research how to let a computer choose effective information independently. I am thinking about how an AI program sorts information among thousands of items: cross-checking two identical databases and running error-correcting programs inside a machine both improve the correctness of judgments made during the machine learning process. Binary encoding also guarantees speed and smooth transmission within an intelligent machine.

References:

Martin Irvine, Introduction to the Technical Theory of Information.

Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010.

Peter Denning and Tim Bell, “The Information Paradox.” From American Scientist, 100, Nov-Dec. 2012.

Entropy and Meaning

Claude Shannon wrote that “the fundamental problem of communication is that of reproducing at one point either exactly or approximately the same message selected at another point.” Perhaps a more humanistic rephrasing would sound something like, assuming interpreting actors in two or more locations understand a perceptible artifact or series of perceptible artifacts to correlate to the same imperceptible and cognitively generated meanings, the fundamental problem of communication only requires reproducing, at one or more points, the perceptible artifact.

For Shannon, the journey of “information” from point A to point B does not require the perceptible artifact to remain the same the entire time — it can be added to or taken away from, it can be reorganized and jumbled, sent in part or in whole — as long as by the time it arrives at its destination, the perceptible artifact returns to either “exactly or approximately” the same form as when it left its sender. Shannon’s theory, and the information theory that grows out of it, turns communication on its head: it accounts only for what is said (broadly speaking), not what is meant.

It would be easy here to disregard meaning and meaningfulness from information theory entirely. However, important principles rely on an assumption of meaningfulness as a crucial part of the process of reproducing a message at two different points. In particular, the insights gleaned from entropy and redundancy depend on the meaningfulness of a system, either to remove redundancy for purposes of efficiency or to add it in order to ensure the integrity of a transmission. In other words, Shannon realized that human meaning systems were patterned and therefore predictable. A highly patterned message, which tends away from randomness (i.e., exhibits low entropy), carries a greater degree of redundancy and can therefore be predicted probabilistically with greater accuracy. On the other hand, a message with high entropy, or a great degree of randomness, provides new information with every bit, making it more difficult to predict accurately.

In this way, for Shannon, determining entropy depends on meaningfulness, not in terms of the actual Peircean object of a bit of information (or what we tend to think of as “content”), but because by assuming meaningfulness, one can assume a pattern, or a level of redundancy, which means the system does not tend toward randomness, and the probabilistic likelihood of any bit of information can be discerned from known bits of information. Of course, while an insight like this can provide remarkable understanding of modern technologies, such as autocorrect, speech-to-text software, and even machine learning, this assumed “meaningfulness” still disregards “meaning” itself. In other words, even if we program our machines to look for redundancies because our meaning systems are full of them, they still cannot know the Peircean “object” of the physical artifact, which, no matter the extent of mathematical fancy footwork, requires a cognitive agent to be truly understood.
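The contrast between a patterned, redundant message and a high-entropy one can be estimated directly from symbol frequencies, without any reference to content. Here is a small sketch of such an empirical estimate:

```python
import math
from collections import Counter

def empirical_entropy(text):
    """Per-character entropy estimated from the text's own symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

patterned = "abababababababab"       # highly redundant, easy to predict
varied = "abcdefghijklmnop"          # every symbol is new: maximal surprise

print(empirical_entropy(patterned))  # 1.0 bit per character
print(empirical_entropy(varied))     # 4.0 bits per character
```

The measure sees only the distribution of marks: it would assign the same entropy to a Shakespeare sonnet and to its letters shuffled into nonsense, which is precisely the sense in which the mathematics captures meaningfulness-as-pattern while remaining blind to meaning itself.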

References:

Martin Irvine, Introduction to the Technical Theory of Information

James Gleick, The Information: A History, a Theory, a Flood. (New York, NY: Pantheon, 2011).