How the Physical Components of Symbol Systems Become Abstractable

This week’s readings offer me innovative connections between semiotics, which we learned before, and communication systems, broadening the boundaries of these seemingly separate concepts and creating a more comprehensive theoretical framework. I would like to try to answer the second question Prof. Irvine put forward: how are the physical components of symbol systems abstractable into a different kind of physical signal unit for transmission and recomposition?

In 506, we learned the concept of “capture,” which means the conversion of human behavior and real-world data into machine input. The “capture” process runs from humans and the world, through sensors, transducers, and encoders, to machines and computers. It is consonant with Claude Shannon’s transmission model. Needless to say, both involve an information source and a destination. In the case of capture, the transmitter and receiver are the transducer and encoder, which transduce physical signals into electricity and then encode it into machine-readable language. The noise source also plays a vital role in this process: it refers to any disturbance affecting the accuracy and certainty of the transmitted information. In the “capture” case, noise can be an unstable Wi-Fi connection, typos, or a power outage. Shannon even borrowed the physical term “entropy” to quantify the uncertainty of information. In physics, entropy originally describes a disordered, messy state: the messier a system, the higher its entropy. Things tend to move from low-entropy states to high-entropy states; in other words, they tend to drift from order toward disorder.
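To make this concrete, here is a minimal sketch in Python (my own illustration, not from the readings) of Shannon’s entropy formula, H = −Σ p·log₂(p), applied to the symbol frequencies of a message. A message whose symbols are all the same is perfectly ordered and has zero entropy per symbol, while one whose symbols are all different is maximally “messy”:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Entropy in bits per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# An ordered (repetitive) message is fully predictable; a varied one is not.
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol (8 equally likely symbols)
```

Note this measures only statistical uncertainty in the signs themselves; it says nothing about their meaning, which is exactly the gap discussed below.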

Though we now have a basic idea of how information gets transmitted from physical components to digital symbols in computing systems, meaning has not been mentioned anywhere in this process, which is the very dilemma Peter J. Denning and Tim Bell put forward: if meaning is separated from the information transmission model, how does meaning get transmitted? They concluded that the sign-referent interpretation, in which “information consists of both signs and referents,” can resolve this paradox.

According to Peirce’s model, there are three basic elements in semiosis: a sign, an object, and an interpretant. Simply speaking, a sign is a representamen, something that can be interpreted. An object is the subject matter of a sign and its interpretant. An interpretant is a sign’s actual meaning, the effect the sign produces in an interpreter. The three connect tightly with one another and together form a complete meaning system. Meaning does not exist in any one of them alone but in the whole process of semiosis.
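As a toy illustration (the field names here are my own hypothetical choices, not Peirce’s terminology rendered into code), the triadic relation can be sketched as a small data structure whose significance lies in the relation among the three fields rather than in any single field:

```python
from dataclasses import dataclass

@dataclass
class Semiosis:
    """Toy model of Peirce's triad; purely illustrative."""
    sign: str          # the representamen: a perceivable mark or token
    obj: str           # the object: the subject matter the sign stands for
    interpretant: str  # the meaning-effect produced in an interpreter

# "Smoke" as a sign of fire, interpreted as a warning of danger:
smoke = Semiosis(sign="smoke", obj="fire", interpretant="danger nearby")
print(f"{smoke.sign} -> {smoke.obj} -> {smoke.interpretant}")
```

The point of the sketch is that no single field “contains” the meaning; only the complete triple does.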

In addition, meanings are not isolated but depend largely on meaning communities. They are built upon shared social conventions and agreed rules. For instance, a religious group has shared symbolic systems, such as rituals, clothing, or diet. Meanings are embodied in these symbols and get transmitted when the symbols are enacted. Therefore, meaning is not independent of the information transmission system but lies deep inside the transmission process itself. Just as Prof. Irvine stated, “Meaning is not ‘in’ the system; it is the system.”
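A hedged sketch of this point, using a made-up two-sign code of my own: Shannon’s channel carries only signs, and the referent is recoverable only because sender and receiver belong to a community that shares the same convention (modeled here as a tiny codebook dictionary):

```python
# Hypothetical shared convention of a small "meaning community":
codebook = {"01": "rain expected", "10": "clear skies"}

def transmit(sign: str) -> str:
    """The channel carries the sign alone; no referent travels with it."""
    return sign  # (a real channel would also introduce noise)

received = transmit("01")
# Interpretation works only inside the community that shares the codebook:
meaning = codebook[received]
print(meaning)  # rain expected
```

A receiver outside the community gets the same two bits but cannot interpret them, which is one way to read the claim that meaning is the system of shared conventions, not a payload inside the signal.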

Having written this far, I am not sure whether I truly understand how to resolve this issue. I feel I have a rough understanding, but it is still in a mess, and I fail to organize my thoughts logically. I hope this problem can be gone over in detail in class.


References:

  1. Denning, P. J., & Bell, T. (2012). The information paradox. American Scientist, 100(6), 477.
  2. Irvine, M. Introduction to the Technical Theory of Information.