Category Archives: Week 7

Could Quantum Information Theory Solve the Symbol Grounding Problem?

Shannon’s foundational mathematical theory of communication (MTC) is simply not concerned with the meaning of the content, and doesn’t need to be. Since meaning is not a property of the signal but an event, the entire information transmission process, signal, system, and processing together, constitutes the meaning of the message, medium, or artifact as we understand it. This is why we know that a hand-written note means something different from a text message. “Hartley had to admit that some symbols might convey more information, as the word was commonly understood, than others. For example, the single word ‘yes’ or ‘no,’ when coming at the end of a protracted discussion, may have an extraordinarily great significance.” (Floridi) Therefore, if meaning relies on the popularity of symbols in a meaning-making community, could probability theory be applied to explore the transmission of meaning? Although Shannon’s theory cheerfully neglected the meaning of information, he concluded “that apples and oranges are after all equivalent, or if not equivalent, then fungible.” Isn’t this the entire point of semiotics, and of human symbolic cognition?

If data + meaning = information, where is the meaning? Is it latent in the data until it is transmitted and interpreted as information that is valuable? Or does the presence of meaning in the transmitted data prompt its interpretation as information that can be used? “How data can come to have an assigned meaning and function in a semiotic system like a natural language is one of the hardest questions in semantics, known as the symbol grounding problem.” (Floridi) In a quantum setting, information can be transmitted through the phenomenon of entanglement. Information is then coded in the correlation, not in either entangled state alone, relaxing the binary constraint of the classical case. Is this a relevant step forward for a semiotic model of information?
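I can’t model the semantics here, but the statistics behind “information in the correlation” can be sketched classically. Below is a toy Python simulation; the Bell state, the computational-basis measurement, and the 10,000-shot sample size are my own illustrative choices, not anything taken from the readings.

```python
import numpy as np

rng = np.random.default_rng(0)

# A Bell pair (|00> + |11>)/sqrt(2): neither qubit alone carries a definite bit.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2                  # outcome probabilities for 00, 01, 10, 11

# Sample 10,000 joint measurements in the computational basis.
outcomes = rng.choice(["00", "01", "10", "11"], size=10_000, p=probs)

# Each qubit on its own looks like a fair coin (~0.5)...
print("P(first qubit = 1):", np.mean([o[0] == "1" for o in outcomes]))
# ...yet the two readings always agree: the information sits in the correlation.
print("P(readings agree): ", np.mean([o[0] == o[1] for o in outcomes]))
```

Each marginal is a fair coin, yet the joint readings always agree; whatever is “said” lives in the relation between the two, not in either one.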

From a strictly symbolic perspective, the text (sign vehicle) is converted by transistors into electrical signals that correspond to other signs, by way of “electromagnetic actions” at the arithmetic logic unit (ALU) of the computer, which further render a manifestation (by way of computation) of the original sign vehicle (the text). An image is converted into pixels, areas of lightness and darkness; a sound pressure is converted into electrical current (charged free electrons in the semiconductor); numbers are converted into binary states via base-two notation; and letters are converted into numbers in the ASCII code (developed in 1963). Shannon soon realized that the more clearly discrete the signal levels, the more reliable the transmission of a message; this is largely why binary won out over quaternary (four states) and even quinary (five states). However, quantum transmission involves superposition, where data can be coded in multiple states at once owing to the nature of quantum particles. Would this involve a trade-off between efficiency and reliable communication?
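As a small concrete illustration of that letter-to-number-to-binary chain (the string "Hi!" is an arbitrary example; real text today usually travels as UTF-8, a superset of ASCII):

```python
# Each character maps to a number in the ASCII table, and each number to a
# base-two pattern: the clearly discrete two-level states Shannon's analysis favors.
for ch in "Hi!":
    print(f"{ch!r} -> {ord(ch):3d} -> {ord(ch):08b}")
```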

How do we know what a text message means? First, a digital message today usually arrives tagged with a specific sender, which generally implies it is human-motivated. Second, the message carries a degree of syntactical familiarity. “According to Shannon, a message can behave like a dynamical system, its future course conditioned by its past history.” The electronic/digital system is designed so that the electromagnetic currents driving the signals can be decoded back into the original form, according to the code used to encode the message in the first instance, and it is engineered around the amount of information to be transmitted. I think all communication acts can be understood at the signal level alone, provided the signals are commonly known by the communicators. Symbolic cognition occurs when these signals are supplied and “more developed” signs are available from which to draw inferences (through the interpretant).

Can the conduit metaphor be rendered obsolete by quantum information theory? Can quantum information theory provide better metaphors than even “network”? Moreover, in the quantum state there lies the potential for technology to shift its mode of communication from a transmission view to a higher-dimensional one. These are questions I am becoming increasingly interested in.

References:

  1. Luciano Floridi, Information, Chapters 1-4.
  2. James Gleick, The Information: A History, a Theory, a Flood.
  3. Ronald E. Day, “The ‘Conduit Metaphor’ and the Nature and Politics of Information Studies.”
  4. Crash Course Computer Science, YouTube, http://bit.ly/2xB9N9y
  5. John Preskill, Making Weirdness Work: Quantum Information and Computation.

A Reflection on Daily Routines

As inhabitants of the infosphere (borrowing the term from Floridi), we are so used to employing text messages, e-mails, and other modern technologies to enhance our communication with others every day that we are almost unaware of the informational and symbolic processes involved. A reflection on our daily routines, along with the concepts from this week’s readings, might help us probe and understand more deeply what we do every day but hardly bother to ask how.

Let’s take text messages as an example. We take out our mobile phone and compose the message with letters, punctuation marks, and blanks, all of which are sets of binary bits from the phone’s perspective. When we finish composing and press “send,” the phone transmits signals containing those sets of binary bits to another mobile phone by way of cell service providers. The phone on the other end then receives and “decodes” those bits into letters, punctuation marks, and blanks, so that the receiver can read the message. This process conforms to the communication model elaborated by Shannon and Weaver, which features an information source, transmitter, noise source, receiver, and destination in a conduit form.
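Here is a minimal sketch of that round trip, using UTF-8 text encoding in place of any real cellular protocol (which would add compression, error correction, and radio modulation on top); the message itself is invented:

```python
# A bare-bones sketch of the Shannon-Weaver pipeline for a text message.
# The "channel" here is just a Python string of bits.
message = "See you at 7?"                                            # information source

bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))    # transmitter encodes
print(bits[:24] + "...")                                             # the signal

received = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
print(received.decode("utf-8"))                                      # receiver decodes
```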

The reason we can understand the meaning of a text message is, first, that the signal system mobile phones use enables text on one end to be reproduced, or “re-tokenized,” exactly or with little loss on the other, so that we can convey what is intended to another person. The second reason, which I think is more important, is that text messages rely on written language, a presupposed, shared, and collective symbolic system, to convey meaning.

As many articles state, meaning is not a property of electronic signals. Strings of “1” and “0” do not mean anything concrete to us human beings. We can extract meaning from texts on the screen because we recognize them as tokens of a type we are already familiar with. At the end of the day, meaning lies in conventions, like languages, and in the ways we translate languages into binary bits recognizable by computers and other devices.

Email messages are similar to text messages: texts are encoded into binary bits and decoded on the other end. Social media messages combining text, images, and emojis are a little more complicated, but the core processes that ensure transmission and understanding are still the same: binary bits and presupposed symbolic systems. Images, which belong to a system we are also familiar with, can likewise be translated into binary bits and reproduced pixel by pixel. Though the meaning system of emojis is trickier, conventions for using them are gradually forming.
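A toy illustration of the pixel-by-pixel translation (the 2x2 grid and one-byte grayscale pixels are simplifying assumptions; real formats like JPEG add headers and compression):

```python
# A toy 2x2 grayscale "image": one byte per pixel (0 = black, 255 = white).
pixels = [[0, 255],
          [128, 64]]

bits = "".join(f"{p:08b}" for row in pixels for p in row)
print(bits)          # the image flattened into a binary signal, pixel by pixel
```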

Therefore, the shared symbolic system is what was left out of the communication model, and it is what will complete the model.

Power of “Uncertainty”

I want to start with certainty. In “A Cultural Approach to Communication,” the author regards communication as being to humans what water is to fish. In fact, I think we might be unaware of the logic inside communication, but we are certainly aware of its power, so aware that we take every advantage of it. Information has always been powerful to some extent. In earlier eras, literacy and communication were treated as privileges of the nobility. Not only were ordinary citizens deprived of the right to read and write; punishments for offenders (found in almost all cultures) involved cutting off the tongue or ears and gouging out the eyes, which in my view symbolically amounts to cutting off someone’s channels for receiving or giving information. Later, in the Middle Ages, when French became the favorite “alphabet” of upper-class society, communication between nobles and commoners became harder and harder. Not to mention today, when I, a “digital immigrant” in Floridi’s term, am more than overwhelmed by the flood of information. The times belong to those who can get hold of information.

Then, in this week’s readings, I reached the topic of “uncertainty.”

In Shannon’s view, “uncertainty” is a data deficit measured over a known set of possibilities. A coin in my fist yields 2 possible outcomes, two coins yield 4, three coins yield 8, and so on. “Information can be quantified in terms of decrease in data deficit.” So, quite apart from the Bar-Hillel-Carnap paradox, which holds that an impossible statement is maximally informative, “uncertainty” has the power to be more informative than I previously thought.
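The coin counting can be written out directly (a small sketch; note that revealing one coin halves the remaining outcomes and so repays exactly one bit of the deficit):

```python
import math

# n hidden coins leave 2**n equally likely outcomes; the data deficit in bits
# is log2 of the number of outcomes still open.
for n in (1, 2, 3, 10):
    outcomes = 2 ** n
    print(f"{n} coin(s): {outcomes} possible outcomes, "
          f"deficit = {math.log2(outcomes):.0f} bits")
```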

Cryptography has a great deal of charm because of the uncertainty its “riddles” produce, from the ciphers used in war to the famous “I am _ _ _ _locked” plot in the BBC’s Sherlock.

There is an old Chinese poem whose line, translated word by word, reads “right now, no sound at all is better than sound.” The line is now used to say that in some circumstances no words are needed. Originally it described the closing part of a pipa (a traditional Chinese instrument) performance, when the music is so captivating that even in a brief pause, the silence stirs more imagination in the audience’s mind. The same can be seen in many musical performances, where the silences between passages are far more informative. Even film directors like to use this charm of “uncertainty”: a few seconds of black or static frames can more easily produce suspense.

The most impressive scene in the TV show Person of Interest is when the scientist is teaching his A.I. to play chess, and the A.I., relying too heavily on calculating possibilities, spends a long time predicting before its first move. The scientist says: “Each possible move represents a different game… there are more possible games of chess than there are atoms in the universe. No one could possibly predict them all, even you. Which means that the first move can be terrifying… but it also means that if you make a mistake, there’s a nearly infinite number of ways to fix it. So you should simply relax and play.”

I like this piece of dialogue because its logic has a lot in common with the meaning of “uncertainty.” In a chess game, the uncertainty contains an amount of data even an A.I. cannot calculate. I tend to fear uncertainty precisely because of the nearly infinite information it could produce. But these lines offer a different angle: the power of uncertainty is an advantage not only for the message (the informer), but also for myself (the informee).

Limits of “IRP”

First of all, I want to say this week’s readings are my favorite so far!!!

What impresses me most in information theory is the “IRP,” the Inverse Relationship Principle: simply speaking, data carry more semantic information when the proposition, event, or utterance is less probable. It sounds counterintuitive at first. Thought over for a few seconds, however, it makes complete sense: uncertainty means more open possibilities, so it is reasonable to call its resolution more informative.
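The usual way to make this quantitative is Shannon’s surprisal, -log2(p); a quick sketch (the probability values are arbitrary examples):

```python
import math

def surprisal(p: float) -> float:
    """Information in bits carried by an outcome of probability p (Shannon surprisal)."""
    return -math.log2(p)

# The less probable (more uncertain) the outcome, the more bits its occurrence resolves.
for p in (0.9, 0.5, 0.1, 0.01):
    print(f"p = {p:<4}: {surprisal(p):5.2f} bits")
```

A near-certain outcome resolves a fraction of a bit; a one-in-a-hundred outcome resolves more than six.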

According to Floridi’s Information: A Very Short Introduction, this principle works well on the mathematical level, where the efficiency of encoding and transmitting data is the main concern and information is treated simply as data communication. On this level, theories built around the IRP explain many phenomena and underpin technologies such as the telegraph and the telephone. “Redundancy” is used to counteract the inevitable loss of and interference with data in transmission, and “entropy,” a concept from thermodynamics, is introduced into information theory, lending it remarkable logical and scientific rigor.
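Redundancy in the MTC sense can be seen in a few lines: a three-fold repetition code pushed through a simulated noisy channel (the 5% flip probability and the eight-bit message are arbitrary illustrative choices):

```python
import random

random.seed(1)

def encode(bits):
    """Repetition code: redundancy is added by sending each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(channel_bits, flip_p=0.05):
    """Interference: each transmitted bit is flipped with probability flip_p."""
    return [b ^ (random.random() < flip_p) for b in channel_bits]

def decode(received):
    """Majority vote over each triple corrects any single flipped bit."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
print(decode(noisy_channel(encode(message))) == message)   # usually True despite noise
```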

But Floridi soon points out that the IRP causes problems on the semantic level: the “scandal of deduction” and the “Bar-Hillel-Carnap paradox.” Deduction, by which people draw conclusions from given premises and background knowledge, is claimed to be completely contained in what is already known; that is, it is 100% certain. All people “merely” do is draw the conclusions out, or reorganize them into an obvious shape, so deduction yields nothing new as information. This may annoy countless mathematicians. As for the paradox, it follows from the IRP that a completely contradictory statement creates the most “informativeness.” Floridi explains the paradox as the result of “meaning” playing no role in this whole model of communication: if the meaning of the language/proposition/event, and whether it is true or false at the factual level, is considered first along with common knowledge, the paradox can easily be eliminated.
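Both problems show up in a ten-line sketch if we measure content, Carnap-style, as the fraction of possible worlds a statement rules out (two propositional atoms here, purely for illustration):

```python
from itertools import product

def content(statement, n_atoms=2):
    """Carnap-style semantic content: fraction of possible worlds a statement excludes."""
    worlds = list(product([False, True], repeat=n_atoms))
    excluded = sum(1 for world in worlds if not statement(*world))
    return excluded / len(worlds)

print(content(lambda a, b: a or not a))    # tautology:     0.0  (scandal of deduction)
print(content(lambda a, b: a and b))       # contingent:    0.75
print(content(lambda a, b: a and not a))   # contradiction: 1.0  (Bar-Hillel-Carnap)
```

A tautology rules out nothing and so carries zero content, while a contradiction rules out every world and scores as maximally “informative,” which is exactly the paradox.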

What I take from all these fascinating examples and from Floridi’s work is that “meaning,” along with people’s mental process of understanding meaning in communication, whether through the most traditional methods or the most advanced technology, cannot be left out of any attempt to understand information theory. When it comes to digital and other modern, technology-clad forms of information, concepts like the “IRP” and “entropy” should play their roles within the explanations of physical theory. Digital and other technical forms are vehicles of natural language (or of painting, dance, and other relatively traditional semiotic systems), which, as we have discussed before, are themselves vehicles of “ideas”; that is to say, technological forms are the vehicle of the vehicle of ideas. So how can we skip the semantic process in information theory?

Another problem appears when information is defined solely through logical, scientific models and theories such as the IRP, according to Day’s article “The ‘Conduit Metaphor’ and the Nature and Politics of Information Studies,” which engages the political context of the Cold War but remains quite valuable to me. To my understanding (Day’s article is quite readable in general, though sadly I don’t follow some parts in the middle), the author wants to add literary, poetic, humanistic, and cultural colors to information theory, alongside its strictly and precisely scientific treatment.

Language/information should not be treated merely as a transmission or communication medium; rather, “it’s an agency for social and cultural and political change.” Knowledge is not a self-evident, self-intentional, systemic process; it works through “hermeneutic” and “poetic” devices. Otherwise we end up with the scenario depicted in George Orwell’s 1984, where language and information are treated only in terms of rationality and accuracy, through the continuous amendment and censorship of the Newspeak dictionary. That would be creepy.

Finally, I have some questions about Floridi’s theory. 1) In his classification of data content, Floridi distinguishes “instructional” information from factual information. What is the point of this distinction? 2) In chapter one, Floridi describes different generations as “digital immigrants” and “digital natives.” I wonder whether, and how, these new methods of communication will change the cognitive processes of human beings.