Understanding Information and Data

This week’s reading gave me new insight into the concepts of information and data, which I can connect to our daily use of digital communication.


According to Shannon and Weaver, a communication system consists of an information source, a transmitter, a channel, noise, a receiver, and a destination, which together form a conduit for transmitting information. Take an instant messaging app as an example. When we send texts or photos to others, the words, signs, and images are encoded into binary bits; at the receiving end, these bits are decoded back into texts and photos so that receivers can read the content. Shannon states in classical information theory that “information can be transmitted and received accurately by processes that do not depend on the information’s meaning” (Denning and Bell, 2012). During transmission, the actual meaning of a sentence becomes irrelevant once it is converted into strings of 0s and 1s. Represented as binary data, an image is just a series of numbers derived from its pixel grid, rather than the structure, characters, and background we see in the photo.
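This meaning-blind encoding and decoding can be sketched in a few lines of Python. The channel here is just a string of 0s and 1s; the round trip works regardless of what the message says:

```python
# A minimal sketch of Shannon-style transmission: the channel carries only
# bits, with no regard for what the message means.
message = "Hi!"

# Encode: characters -> bytes -> a string of 0s and 1s
bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))
print(bits)  # 010010000110100100100001

# Decode: bits -> bytes -> characters (the receiver recovers the text)
raw = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
decoded = raw.decode("utf-8")
assert decoded == message  # the content survives, but the channel never "understood" it
```

The same scheme would carry a photo: its pixel values become the byte stream, and the bits reveal nothing about the picture's content.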

With the development of human-computer interaction, however, it has become hard for information to exclude meaning. Rocchi points out that “meaning is the association between sign and referent”, two components of information. Consider Siri as an example: in Shannon’s generation, it would have been impossible to imagine a smartphone talking with people. As an intelligent assistant, Siri performs an association between detected human voices (instructions in the form of words or sentences, as signs) and the corresponding functions (a phone call, an app activation, or a rejection of the instruction, as referents). As new information emerges continuously nowadays, we cannot ignore the importance of meaning when discussing the substance of information theory.
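Rocchi's sign-referent association could be sketched as a simple lookup, though this toy is nothing like Siri's actual implementation; the phrases and action names below are hypothetical:

```python
# A toy model of meaning as an association between sign and referent:
# recognized phrases (signs) map to device actions (referents).
# All phrase and action names here are invented for illustration.
associations = {
    "call mom": "start_phone_call",
    "open camera": "launch_camera_app",
}

def interpret(sign: str) -> str:
    # An unrecognized instruction maps to a rejection, as when an
    # assistant declines a request it cannot parse.
    return associations.get(sign.lower().strip(), "reject_instruction")

print(interpret("Open camera"))    # launch_camera_app
print(interpret("do my homework")) # reject_instruction
```

The point of the sketch is that the system's behavior now depends on what the sign *refers to*, not merely on transmitting its bits accurately.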

Data (Further findings about “redundancy”)

The transmission of documents and images reminds me of the interesting concept of redundancy in Information: A Very Short Introduction. According to Khalvati, image data contains psychovisual redundancy (irrelevant information that is “ignored by human vision system”), coding redundancy (caused by “using higher number of bits per pixel”), and interpixel redundancy (“the correlation among neighboring pixels”). Reducing data redundancy does no harm to the image itself. If the sender wants to send a high-quality image to the receiver, it is efficient to compress the image by reducing its redundancy, which shrinks the file size while keeping the original quality, and also increases the transmission speed. I wonder whether this reduction of data redundancy can be treated as the lossless compression that Denning and Bell discuss in “The Information Paradox,” as opposed to lossy compression, in which file quality is decreased.
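Lossless compression of redundant data can be demonstrated with Python's standard zlib module. The repetitive byte string below is a hypothetical stand-in for pixel data with strong interpixel redundancy (large near-uniform regions):

```python
import zlib

# A sketch of lossless compression: redundancy is squeezed out for
# transmission, and the receiver reconstructs the data bit-for-bit.
# The repeated byte pattern stands in for highly redundant pixel values.
pixels = bytes([200, 200, 200, 210] * 1000)  # hypothetical image data

compressed = zlib.compress(pixels)
restored = zlib.decompress(compressed)

print(len(pixels), len(compressed))  # the compressed form is far smaller
assert restored == pixels            # lossless: recovered exactly, no quality loss
```

A lossy scheme such as JPEG, by contrast, additionally discards psychovisually redundant detail, so the original bytes cannot be recovered exactly.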


From Floridi’s Information: A Very Short Introduction, I am confused about the distinction between factual semantic content and factual semantic information: why can the former be false while the latter must be true?


Floridi, Luciano. Information: A Very Short Introduction. Oxford: Oxford University Press, 2010.

Denning, Peter J., and Tim Bell. “The Information Paradox.” American Scientist 100 (Nov.–Dec. 2012).

Khalvati, Farzad. “Computational Redundancy in Image Processing.” University of Waterloo, 2009. Retrieved from https://uwspace.uwaterloo.ca/bitstream/handle/10012/4151/Khalvati_Farzad.pdf?sequence=1