Interaction and common ground—A further understanding of information theory—Wency
The development of different forms of information transmission is inseparable from humans' need for mutual interaction. In earlier times, people could only communicate with each other in person, and distant connections required human messengers to carry the information (e.g., letters). It was only when people later realized the usability of physical resources (i.e., electrical and magnetic waves, which are not completely limited by space and distance) that the electric telegraph blossomed and systems of electromagnetic signals came into use. At this point, human beings and those machines were nearly isolated from each other: the human communication system seemed to live in a world separate from the physical communication and transmission system. On the one hand, each individual in the human communication system constantly generates meaning; people communicate with each other within a social and cultural context that is more or less embedded in their memory, and this context affects the whole process in which they make selections in perceiving signs, recognize what is inside the box (i.e., the object), come up with further connotations, and produce output that contributes back to that social and cultural context. On the other hand, people understood how to use the bottom layer of communication (i.e., electrical signals) to pass signals regardless of distance. By correlating pulses of electrical current with an abstract symbol set, Morse was able to implement the interface to this bottom layer, with the earliest human "compilers" interpreting those symbols and transferring them to a higher, human-readable level of information.
At this point, according to Shannon, what mattered was not distance but control: in other words, how to minimize the loss of information, how to counteract the noise that is inevitable as signals travel along the channel, and how to recover the information at the receiver's side. In A Mathematical Theory of Communication, he pointed out that the entropy of information (i.e., its randomness and uncertainty) can be measured and controlled mathematically, so that information can be transferred at a maximum rate without being changed or lost significantly. To be sure, human beings still interacted with the physical communication system, but because updates to the whole information system were fed in quite slowly, such interaction was too slight to be taken into account.
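Shannon's measure of uncertainty is concrete enough to compute directly. A minimal sketch of his entropy formula, H = −Σ p·log₂(p), in Python (the function name is my own):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)); terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))       # 0.0
```

The less predictable the source, the higher the entropy, and the more channel capacity is needed to transmit it without loss.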
- Growing interactions between human and machine
However, Shannon's classical claim that information can be transmitted and received accurately by processes that do not depend on the information's meaning seems more and more doubtful by today's standards. After Alan Turing's first conception of what makes a computing machine (although it was purely imaginary, without physical support), today's blossoming computer industry is obviously making the transformation of information (i.e., the creation of new information) incredibly fast. This seems easy to explain: computers operate on electricity passing between transistors, and such signals propagate at a speed close to that of light (about 3×10^8 m/s), which means a large binary sequence can be updated in a fraction of a second. Moreover, with compilers of various kinds substituting for the human readers of Morse's era, interaction between human and machine is becoming much faster in a human-readable way. When we read a text message or a digital image, listen to a piece of music, watch an online YouTube video, or write comments through the interface of a digital device, we are using symbolic-interpretive processes all the time: mapping representations onto recognizable patterns and making inferences about what motivated the encoded information that gets transmitted through the system. All of these events happen within a macro social and cultural context in which we absorb knowledge and basic rules that affect our interpretation, while also changing that context ourselves. Our user actions dynamically enter the front-end programming language of the computer and are transferred layer by layer until they reach the binary system, where data packets can be sent under the control of the TCP/IP model.
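The first step of this layering can be sketched with Python's built-in string encoding: human-readable application-level text becomes bytes, and bytes become the bit patterns the lower layers actually carry (the TCP/IP packetizing itself is omitted here as out of scope).

```python
# From human-readable symbols down toward the binary layer:
message = "Hi"

# Application layer: text is encoded into a byte sequence (UTF-8).
raw = message.encode("utf-8")

# Toward the physical layer: each byte rendered as 8 binary digits.
bits = "".join(f"{byte:08b}" for byte in raw)

print(bits)  # 0100100001101001
```

The reverse journey happens on the receiving side: bits are regrouped into bytes, decoded back into characters, and finally interpreted by a human reader who supplies the meaning.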
Therefore (according to the figure above), information, although still unchanged at the bottom level while being transferred, is constantly being updated at the application level (through the user interface) with the help of human agents as constant meaning generators.
- Understanding the similarities between the tokens of the Turing machine
The Turing machine, though depicted by Turing in those early years in an imaginary way, is a universal one that set the standard for what counts as a computable machine or system. According to the early description of the universal Turing machine, three major items must be mentioned here: the tape, the symbols, and the states. The tape is a long strip divided into squares, where symbols are written and read through the head of the control unit. The states determine the machine's actions, which in short can be described as: in state i, execute algorithm N_i. We can thus easily map these items onto different tokens of the universal Turing machine and thereby recognize an incredibly large number of connections between the human mind and the computer. At the macro level, the tape, as temporary memory storing the symbols written onto it earlier, suggests an analogy to human memory (both long-term and short-term) as well as to mechanical memory (RAM, ROM, hard disks, and today's cloud storage). The symbols would then be the content stored in that memory, which we can read, or better, retrieve for later actions (e.g., when human beings analyze a sentence according to the linguistic system, they perceive the words and retrieve memory to map those lexical items syntactically). Last but not least, the states stand for all the processes by which we analyze something or make a decision (if this is the case, then I will…; otherwise I will…), and such states also map well onto the digital world, where we first translate those states of mind into pseudocode and then code them into a large number of IF-ELSEs.
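The three items above—tape, symbols, states—are enough to simulate a Turing machine directly. A minimal sketch in Python (the function and rule names are my own, not Turing's notation): a single state q0 walks right along the tape, inverting each bit until it reaches a blank, which is exactly the "in state i, execute algorithm N_i" pattern.

```python
def run_turing_machine(tape, rules, start, halt):
    """Simulate a one-tape Turing machine.

    rules maps (state, read_symbol) -> (next_state, write_symbol, move),
    where move is +1 (right) or -1 (left) and "_" is the blank symbol.
    """
    cells = dict(enumerate(tape))  # the tape: squares addressed by position
    head, state = 0, start
    while state != halt:
        read = cells.get(head, "_")            # read the symbol under the head
        state, write, move = rules[(state, read)]  # the state dictates the action
        cells[head] = write                    # write a symbol back to the square
        head += move                           # move the head along the tape
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# One state, three rules: invert bits while moving right; halt on blank.
rules = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "_"): ("halt", "_", +1),
}
print(run_turing_machine("1011", rules, "q0", "halt"))  # 0100
```

Everything the machine "knows" lives in the rule table and on the tape, which is what makes the analogy to memory, stored content, and decision states so direct.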
Further, if we look at the micro level, what amazes us is that the physical structure of signals and systems even has a biological match: every neuron could somehow be considered a micro-system containing a transmitter (the axon terminals), a channel (the axon), and a receiver (the dendrites), where the electricity passing between transistors in a computer system is substituted by electrochemical signals, all more or less controlled by the central nervous system (probably analogous to the CPU). What is more interesting is that even the electrochemical signals are not completely analog (there are somehow several instantaneous changes between states). What currently confuses many psychologists is how these electrochemical signals later generate meaning on a larger semantic scale (in a computer system, by contrast, we know how electricity is represented in the binary system and goes all the way up to human-readable programming languages). But what we do know is humans' ability to jump out of the existing symbols and information into an external, virtual imagination in order to complete their logical thinking. (If we fully understood how this is realized anatomically, we would probably be able to bring computers to a more autonomous level; for now we are still in the process of making computers more and more human-like, including enabling them to compute in a more embodied way, i.e., the capture.)
Nevertheless, no matter how abstract we have become by now, we need to bear in mind that all forms of information rest upon a thermodynamic system in which information is inseparable from energy and physical structure (computers release heat, human beings consume calories while thinking, etc.), and since according to the second law there is always a tendency for entropy to increase, the question of how to recycle information efficiently can never be left out of the discussion.
- Martin Irvine, "Introduction to the Technical Theory of Information."
- Luciano Floridi, Information: A Very Short Introduction (Oxford: Oxford University Press, 2010).
- James Gleick, The Information: A History, a Theory, a Flood (New York, NY: Pantheon, 2011).
- Peter Denning and Tim Bell, "The Information Paradox," American Scientist 100 (Nov.–Dec. 2012).