Author Archives: Rohan Somji

The iPhone and Latour’s Meanings of Mediation

“We may then be able, finally, to understand these nonhumans, which are, I have been claiming since the beginning, full-fledged actors in our collective; we may understand at last why we do not live in a society gazing out at a natural world or in a natural world that includes society as one of its components.” (Latour, 174)

One of the features of good design is that it becomes invisible: the user can focus on the task at hand rather than on the instrument she uses. Yet it is good design itself that results in black-boxing (though socio-cultural forces also contribute; see Irvine, 1). De-blackboxing, then, at least in one way, entails understanding the mediating role of technologies.

Latour specifies four meanings of mediation, and I would like to illustrate these four meanings with the example of an iPhone. Let's lay down the way Latour's analysis applies to an iPhone and then look at the meanings of his terms. The iPhone entered the smartphone market with the unique capacity to successfully integrate touchscreens with mobile computing.

With regard to mobile computing, Apple allied itself with touchscreen technology (composition) and brought together all that was needed to realize a convenient-to-use smartphone (delegation). After this, the iPhone handled things all by itself (black-boxing). It did this by changing the program of action of smartphone users: from using styluses and buttons on their devices to using a finger to manipulate the items on the screen. Translation, composition, reversible black-boxing, and delegation each form an aspect of technical mediation that could not exist without the others. (Here I adapt Peter-Paul Verbeek's explanation of Bruno Latour's speed-bump example to de-blackbox an iPhone; see Verbeek, 131.)

Latour gives us a new tool to analyse this situation: Actor-Network Theory. Here we understand the user and her phone as two separate agents, and the agents can influence each other (see Latour's gun example, pp. 176-180). When the iPhone comes into the mix, we encounter translation as the first meaning of mediation. The interference of a new device in our midst creates a new link that to some degree modifies both agents (Latour, 179).

The second meaning of mediation is composition. The original goal was, let's say, for the user to make a list of clients she wants to call on her smartphone. But the two agents combined can generate a third goal (such as searching the internet for a ready-made list).

The third meaning of mediation is reversible black-boxing. Part of the reason the iPhone is good is that we do not need to worry about its technical aspects; we do not need to understand its constraints and affordances in detail to operate it. Once we learn how to use it, the device becomes opaque and invisible at the same time (a paradoxical reflex in designing complex structures, as Irvine puts it; Irvine, 6). But when it breaks down, we realize how many people were involved in assembling it and how far away its pieces come from.

The fourth meaning of mediation is delegation. The past decisions of Steve Jobs exert influence on the touchscreen of my present iPhone SE. He wanted to achieve the final goal of getting users to use their fingers for tasks on smartphones, and he achieved it by delegating that task to a device that requires its users to use nothing but their fingers. As Latour says, "I rely on many delegated actions that themselves make me do things on behalf of others who are no longer here" (Latour, 189).

Technologies are constantly in mediation with society and culture, and the distinction between them is largely a remnant of the subject-object distinction. Within a network analysis, the system is composed of actors, actants, and goals, with a fluid transition between culture and technologies, allowing an iPhone and its user to be part of a network of many goals, interactions, and programs of action.




Bruno Latour, “On Technical Mediation,” as re-edited with title, “A Collective of Humans and Nonhumans — Following Daedalus’s Labyrinth,” in Pandora’s Hope: Essays on the Reality of Science Studies. Cambridge, MA: Harvard University Press, 1999, pp. 174-217.

Martin Irvine, “Understanding Media, Mediation, and Sociotechnical Systems: Developing a De-Blackboxing Method” [Conceptual and theoretical overview.]

Verbeek, Peter-Paul. "Artifacts and Attachment: A Post-Script Philosophy of Mediation." In Inside the Politics of Technology, 2005, pp. 125-146.

Computers as Symbolic Cognitive Artefacts

Rather than considering computational and media technologies merely as manufactured products, viewing them as "symbolic-cognitive artefacts" provides insight into their design principles, opening a possibility for critique and even scope for recognising improvements.

Let’s look at the computer. For a long time, the computer meant an assortment of a separate display unit, a mouse, a keyboard, and a CPU. But then came the laptop, the mobile phone, the Mac desktop, and the tablet, and new forms continue to develop. The computer is now seen as a device for a wide variety of cognitive tasks, from painting, photo editing, designing, and writing to guiding weapons systems. But at the core of it lies algorithmic processing: its essential task is to execute lines of code. An algorithm is a finite sequence of well-defined, computer-implementable instructions, typically used to solve a class of problems or to perform a computation, and this definition has not changed since the time of Ada Lovelace, the first programmer (The Definitive Glossary; Fuegi & Francis, 2003).
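To make that definition concrete, here is a small sketch of my own (not from the sources cited): Euclid's algorithm for the greatest common divisor, a finite sequence of well-defined steps that predates any electronic machine by over two millennia, yet runs unchanged on a modern computer.

```python
def gcd(a, b):
    """Euclid's algorithm: a finite sequence of well-defined,
    computer-implementable instructions that always terminates."""
    while b != 0:
        a, b = b, a % b  # each step strictly shrinks b toward 0
    return a

print(gcd(48, 18))  # 6
```

The same instructions could be (and historically were) carried out by hand with pencil and paper, which is exactly the continuity between older symbolic artefacts and the modern computer that this section is after.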

When computers are seen as more than devices or machines, we can peer behind their current functionalities and look at the "cognitive" and "symbolic" continuum that we are part of (Irvine, 2). The essence of computing is still algorithmic, and we come closest to this essence when we look at the computer as a symbolic cognitive artefact. "We are simply at one point in a longer 'cognitive continuum' that begins with language and symbolic representation, and expands into our ability to think with and represent multiple levels of abstraction (spanning writing, mathematics, symbolic media like images and combinations in film and multimedia, and computer code)" (Irvine, 2). As a machine, the computer has undergone redesign over the past 70 years; as a symbolic cognitive artefact, it has gone through centuries of redesign. It began materially as fingers combined with a rule for counting, adding, and subtracting on them. Computation on the basis of rule-following stretches back to logarithmic tables, where massive computations could be carried out by indexing numbers on a grid. From the symbolic-artefact point of view, we become aware that logarithms, Charles Babbage's Analytical Engine, ENIAC, and the MacBook are all united by this lineage of evolving computation.


Analytical Engine (first described in 1837)

Electronic Numerical Integrator and Computer, ENIAC (1945): the first electronic general-purpose digital computer, able to solve "a large class of numerical problems" through reprogramming.

When we look at the MacBook, we talk about its processing speed in terms of how fluid the transitions on our display will be; when we talk about its RAM, we think about how many applications it could handle at a time. Don Norman calls this way of looking at cognitive artefacts the personal view (Norman, 20). Within the system view, by contrast, we look at an expansion of cognitive capabilities from the former state (either without a machine or with an inferior machine) to the latter state (with the more powerful computational machine).

Awareness of the computer as a cognitive artefact brings to light its symbolic manipulation capabilities against the backdrop of all the previous artefacts that served the same goal as the computer. We forget that calculation and procedure-following are what computers were made for. To this effect, outdated technologies might still have something to teach us.

Don Norman gives an excellent example of this notion that antiquated symbolic systems are not necessarily inferior. He compares tally systems, Roman numerals, and Arabic numerals (the modern number system). If we want to perform computations like addition and multiplication, then yes, the Arabic numerals are superior. But for tasks like comparing numerical values, the tally system is superior, since it lets the user see which row of tally marks is longer: length corresponds directly to numerical value, allowing for a more natural mapping. Roman numerals try to get the best of both worlds by being symbolic while still letting the length of the written number represent its value to an extent (Norman, 32).
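Norman's contrast can be sketched in a few lines of code (my own illustration, not drawn from Norman). The `to_tally` and `to_roman` functions below are hypothetical helpers showing how each notation trades off computability against the natural length-to-magnitude mapping.

```python
def to_tally(n):
    """Represent n as tally marks; magnitude maps directly onto length."""
    return "|" * n

def to_roman(n):
    """Convert n (1-3999) to Roman numerals: symbolic, yet larger
    numbers still tend to be written with more marks."""
    vals = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
            (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
            (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for v, sym in vals:
        while n >= v:
            out.append(sym)
            n -= v
    return "".join(out)

# Comparing magnitudes with tallies is a perceptual task:
# the longer row of marks is simply the larger number.
print(len(to_tally(7)) < len(to_tally(12)))  # True: 7 < 12, visible as a shorter row

# Arabic numerals, by contrast, excel at computation:
print(7 + 12)        # 19
print(to_roman(19))  # XIX
```

The point survives the toy example: each representation is "better" only relative to the task, which is exactly the design lesson Norman draws from these older systems.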

There may have been problems in the history of computer design that imposed certain constraints, and those constraints in turn shaped certain designs. But when we look at the computer as a symbolic cognitive artefact, the intent behind those designs comes forward, and we can subject it to scrutiny once again. Maybe Roman numerals will inspire another number system, and maybe Babbage's Analytical Engine has yet to teach us something that only the advent of microprocessors could make available.


Donald A. Norman, “Cognitive Artifacts.” In Designing Interaction, edited by John M. Carroll, 17-38. New York, NY: Cambridge University Press, 1991. Read pp. 17-23.

Fuegi, John, and Jo Francis. "Lovelace & Babbage and the Creation of the 1843 'Notes'." Annals of the History of Computing 25, no. 4 (October-December 2003): 16-26. doi:10.1109/MAHC.2003.1253887.

Martin Irvine, “Introduction to Cognitive Artefacts for Design Thinking” (seminar unit intro).

“The Definitive Glossary of Higher Mathematical Jargon — Algorithm.” Math Vault, August 1, 2019.