This week’s reading introduced Shannon’s information theory. What is fascinating in his argument is that information is independent of meaning. He held that information can be measured and standardized. Information theory allows us to understand information and data at a fundamental level.
In his paper, Shannon argued that “the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point” (Shannon, 1948). For example, music can be thought of as the transmission of information from one point to another. Put in terms of a communication system, the sound of music is a message, and an encoder generates a distinct signal for that message. Signals pass through a channel that connects the transmitter and the receiver. A decoder on the receiving end converts the signals back into sound waves that we can perceive.
According to Shannon, “information is entropy.” Entropy is a measure of disorder or uncertainty about the state of a system: the more disordered a set of states is, the higher the entropy. Shannon considered entropy to be the measure of the inherent information in a source (Gleick, 2011). Denning also pointed out that information exists as physically observable patterns. Building on these ideas, Febres and Jaffé found a way to classify musical genres automatically.
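The connection between disorder and information can be made concrete with Shannon’s entropy formula, H = −Σ p·log₂(p), taken over the relative frequencies of the symbols in a message. The short sketch below (names and example strings are my own, not from the readings) shows how a uniform, “disordered” message carries more bits per symbol than a repetitive one:

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(symbols)
    total = len(symbols)
    return sum(-(c / total) * log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: one state, no uncertainty
print(shannon_entropy("abababab"))  # 1.0 bit: two equally likely symbols
print(shannon_entropy("abcdefgh"))  # 3.0 bits: eight equally likely symbols
```

Maximum entropy for an alphabet of n equally likely symbols is log₂(n), which is why the eight-symbol message comes out at exactly 3 bits per symbol.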
Febres and Jaffé approached music classification by using the entropy of MIDI files. A MIDI file is a digital representation of a piece of music that can be read by a wide variety of computers, music players, and electronic instruments. Each file contains information about a piece’s pitch, velocity, volume, vibrato, and so on, which enables the music to be reproduced accurately from one point to another. Because a MIDI file is composed of an ordered series of 0s and 1s, Febres and Jaffé were able to compress each set of symbols into the minimal “fundamental” set needed to regenerate the original music. They then measured the entropy associated with each piece of music based on that fundamental set, and found that music from the same genre shares similar values of second-order entropy. This case is an application of information theory, and it is inspiring that the theory has the potential to be applied in many other fields.
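As a rough illustration of what a second-order measure looks like, the sketch below computes entropy over adjacent pairs of bytes rather than single bytes. This is only a simplified stand-in: Febres and Jaffé’s actual method first derives a fundamental symbol set from the bit stream before measuring entropy, and the file path shown is a placeholder.

```python
from collections import Counter
from math import log2

def second_order_entropy(data: bytes) -> float:
    """Entropy (bits per symbol) over adjacent byte pairs (digrams).

    Simplified illustration only: the published method measures entropy
    over a compressed 'fundamental' symbol set, not raw byte pairs.
    """
    pairs = [data[i:i + 2] for i in range(len(data) - 1)]
    counts = Counter(pairs)
    total = len(pairs)
    return sum(-(c / total) * log2(c / total) for c in counts.values())

# Hypothetical usage with a MIDI file (path is a placeholder):
# with open("song.mid", "rb") as f:
#     print(second_order_entropy(f.read()))
```

The intuition carries over from the first-order case: a piece whose note-to-note transitions are highly predictable yields low second-order entropy, and genres with similar transition structure cluster around similar values.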
Peter J. Denning and Craig H. Martell, Great Principles of Computing, chap. 3, “Information.”
Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (Champaign, IL: University of Illinois Press, 1949).
James Gleick, The Information: A History, a Theory, a Flood (New York, NY: Pantheon, 2011).
Martin Irvine, “Introduction to the Technical Theory of Information.”
“Musical Genres Classified Using the Entropy of MIDI Files,” MIT Technology Review, https://www.technologyreview.com/s/542506/musical-genres-classified-using-the-entropy-of-midi-files/