Through the course of this week’s readings, my mind kept returning to two distinct but related questions. First, does the intentionality of figures like Alan Kay, Doug Engelbart, and others in designing computing systems to remediate “real world” commonplaces such as the desktop exempt them from the pattern of media theory, first observed by Marshall McLuhan, that “the content of the new medium is the old medium”? Second, are computational systems, and particularly commercial computational systems, unique among human artifacts in that the remediating action always involves symbols remediating other symbols, and not merely stylistic imitations (as Manovich describes with Gutenberg Bibles imitating manuscripts, or cinema imitating the theater)?
Regarding the first question, several of the readings for this week, including the Manovich and Moggridge excerpts, discussed the process of making cultural computing viable through the use of iconic remediations like the desktop metaphor and the GUI display. These histories also emphasized the role of their inventors in providing these breakthroughs in human-centered design for computers. And while these innovations certainly ought not be trivialized or overlooked, they seem to carry with them a fact that is simultaneously self-evident and ontologically limiting: that computers remediate the way they do because they were designed to do so. On a certain level, this is irrefutably true. The actualized remediations that the digital citizen interacts with every day owe the debt of their existence to Stu Card, Larry Tesler, Doug Engelbart, and so on. But none of these figures invented remediation for computational media. In fact, the problem with formulating these innovations so as to equate them with remediation is that it obscures the fact that even before computers enlisted the help of icons for mainstream acceptance and use, computers were still remediating something. Isn’t the digital substrate itself a remediation of Boolean logic? Doesn’t the command-line interface remediate the syllogism? The point I am trying to make is that it seems unfair to imply, even unintentionally, that these men initiated remediation in the history of computing. They simply initiated the version of remediation that is actualized and recognizable to the mainstream computer user.
This leads me to my second question. There are, as noted with Boolean logic and the command-line interface, cases where existing phenomena seem to be remediated by a computer in the same way as, say, the theater was remediated by filmmaking. However, a greater number of remediated objects on the computer must first go through a process of being represented by other symbols and finally reconstructed (as is the case with icons, the GUI, etc.). This seems categorically different from what traditionally occurred in remediation. In the case of the cinema or the printed book, the cultural practices of the old medium influenced the cultural practices of the new medium, which could then continue these practices or change them as a society saw fit. In the case of digital iconicity, however, the old medium is semiotically constructed so as to communicate something to the user. In other words, its old cultural practices are insignificant, and the representation of the old medium is significant only as a device of communication to the user. Hence skeuomorphs like the floppy disk in the save icon, or a postage stamp for email. The differences between these two types of remediation seem implicit in our reading, but perhaps need further working out.
Lev Manovich (2013) Software Takes Command. New York, NY: Bloomsbury.
Marshall McLuhan (1964) Understanding Media. Cambridge, MA: The MIT Press.
Bill Moggridge, ed. (2007) Designing Interactions. Cambridge, MA: The MIT Press.