Rather than treating computation and media technologies merely as manufactured products, viewing them as "symbolic-cognitive artefacts" gives insight into their design principles, opening them up to critique and to the recognition of possible improvements.
Let's look at the computer. For a long time, "the computer" meant an assortment of parts: a separate display, a mouse, a keyboard, a CPU. Then came the laptop, the mobile phone, the Mac desktop, the tablet, and new form factors continue to appear. The computer is now seen as a device for a wide variety of cognitive tasks, from painting, photo editing, designing, and writing to guiding weapons systems. But at its core lies algorithmic processing; its essential task is to execute lines of code. An algorithm is a finite sequence of well-defined, computer-implementable instructions, typically used to solve a class of problems or to perform a computation, and this definition has not changed since the time of Ada Lovelace, the first programmer (The Definitive Glossary; Fuegi & Francis, 2003).
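The definition above can be made concrete with a small sketch. Euclid's procedure for finding the greatest common divisor, known millennia before electronic computers, is exactly such a finite sequence of well-defined, mechanically followable instructions (the function name and example values here are illustrative, not from the sources cited):

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder is zero."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # prints 12
```

The same rule could be, and was, carried out by hand on paper; the electronic computer only executes it faster, which is the essay's point about a continuum of symbolic procedure rather than a break with it.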
When computers are seen as more than devices or machines, we can peer behind their current functionality and recognise the "cognitive" and "symbolic" continuum that we are part of (Irvine, 2). The essence of computing is still algorithmic, and we come closest to that essence when we treat the computer as a symbolic-cognitive artefact: "We are simply at one point in a longer 'cognitive continuum' that begins with language and symbolic representation, and expands into our ability to think with and represent multiple levels of abstraction (spanning writing, mathematics, symbolic media like images and combinations in film and multimedia, and computer code)" (Irvine, 2). As a machine, the computer has been redesigned over the past 70 years; as a symbolic-cognitive artefact, it has gone through centuries of redesign. It began materially as fingers combined with a rule for how to count on them and add and subtract. Rule-following computation stretches back to logarithmic tables, where massive computations could be carried out by indexing numbers on a grid. From the symbolic-artefact point of view, we become aware that logarithm tables, Charles Babbage's Analytical Engine, ENIAC, and the MacBook are all united by this lineage of evolving computation.
When we look at the MacBook, we talk about its processing speed in terms of how fluid the transitions on our display will be; when we talk about its RAM, we think about how many applications it could handle at a time. Don Norman calls this way of looking at cognitive artefacts the personal view (Norman, 20). Within the system view, by contrast, we look at the expansion of cognitive capabilities from a former state (without a machine, or with an inferior one) to a latter state (with a more powerful computational machine).
Awareness of the computer as a cognitive artefact brings its symbolic-manipulation capabilities to light against the backdrop of all the previous artefacts that served the same goal. We forget that performing calculations and following procedures is what computers were made for. To this effect, outdated technologies may still have something to teach us.
Don Norman gives an excellent example of how antiquated symbolic systems are not necessarily inferior. He compares tally marks, Roman numerals, and Arabic numerals (the modern number system). For computations such as addition and multiplication, the Arabic numerals are clearly superior. But for tasks like comparing numerical values, the tally system wins: the user can simply see which row of marks is longer, since length corresponds directly to numerical value, a more natural mapping. Roman numerals attempt the best of both worlds by being symbolic while still letting the length of a numeral track its value to an extent (Norman, 32).
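Norman's comparison can be sketched in a few lines of code (the helper names `to_tally` and `to_roman` are hypothetical, introduced here only for illustration). Rendering the same quantity in each notation shows how tally length maps directly onto value, while the length of a Roman numeral tracks it only roughly:

```python
def to_tally(n):
    """Tally notation: one mark per unit, so visual length IS the value."""
    return "|" * n

def to_roman(n):
    """Greedy conversion to Roman numerals using the standard
    subtractive pairs (CM, XC, IV, etc.) in descending order."""
    pairs = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
             (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
             (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in pairs:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

for n in (3, 8, 14):
    # e.g. 8 -> "||||||||" (length 8) vs "VIII" (length 4)
    print(n, to_tally(n), to_roman(n))
```

Comparing 8 and 14, the tally strings can be ranked at a glance by length alone; the Roman forms ("VIII" vs "XIV") are shorter and symbolic, but their lengths no longer order the values reliably, which is exactly the trade-off Norman describes.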
There may be points in the history of computer design where particular constraints locked in particular designs. But when we look at the computer as a symbolic-cognitive artefact, the intent behind those designs comes forward and can be subjected to scrutiny once again. Perhaps Roman numerals will inspire another number system, and perhaps Babbage's Analytical Engine has yet to teach us something that only the advent of microprocessors could make available.
Norman, Donald A. "Cognitive Artifacts." In Designing Interaction, edited by John M. Carroll, 17–38. New York, NY: Cambridge University Press, 1991.
Fuegi, John, and Jo Francis. "Lovelace & Babbage and the Creation of the 1843 'Notes'." IEEE Annals of the History of Computing 25, no. 4 (October–December 2003): 16–26. doi:10.1109/MAHC.2003.1253887.
Irvine, Martin. "Introduction to Cognitive Artefacts for Design Thinking." Seminar unit introduction.
"The Definitive Glossary of Higher Mathematical Jargon — Algorithm." Math Vault, August 1, 2019. Archived from the original on February 28, 2020. Retrieved November 14, 2019.