Category Archives: Week 5

If Latour’s theory is difficult for you, your mindset is probably influenced by Western culture

Latour’s theory is sophisticated and should fall like an anvil on the heads of scholars who hold that new technologies are the result of internal dynamics and combinatorial evolution, “constructed mentally before they are constructed physically” (Arthur, 2009).

While Arthur’s passive voice above testifies to the centrality of technology itself in his theoretical approach, Latour is clear in saying that behind an object there are innumerable mediators, from engineers to lawyers and, ultimately, corporations, which make technical objects, and also human beings, into what he calls object-institutions. This powerful idea prevents one from understanding an object as simply made of matter. As the author explains, when in contact with a technical object, one stands at the end of an extensive process of proliferating mediators. De-blackboxing it means understanding these relations.

Technology and society are thus embedded in one another. Through delegation, objects are given actions that execute human tasks. To imagine a world where human beings would be independent of objects is, in Latour’s terms, to imagine a nonhuman world. Objects are agents, actants, and through the articulation of their characteristics with human beings they become part of the collective of humans and nonhumans.

For many people, Latour’s perspectivism can be difficult to understand. The symmetry between agents and actants, where responsibility for actions is shared, imposes a new way of looking at mediation. A human being with a technical object in hand is as different from before as the object is once it is in human hands. In this contact, they exchange competences and transform each other.

In this sense, the speed bump is partially a sleeping policeman, just as my cellphone is partially my father helping me wake up in the morning. In a study I conducted among the Kambebas, an indigenous community in the Amazon region, when I asked a mother how she would translate the word “computer” into the Kambeba language, she answered: “the man who knows.” She, unsurprisingly, understands deeply what Latour is saying, thanks to the perspectivism characteristic of her culture. For us in Western culture, however, this articulation is still a blurred zone that needs to be better revealed.

Citizen journalism as another side of mediation and socio-technical artifacts.

Galib Abbaszade

A couple of years ago, I launched and managed a website aimed at promoting and enhancing social journalism in my country. Under conditions in which essential freedoms are limited, I saw this site as another attempt and chance for people to speak freely. In general, sites of this type give all people an opportunity to outline their problems and bring them into public discussion.

There were multiple reasons, including financial ones, why I needed to close this site for a while. Here, I would like to focus on the problems that originated in the content and style of the articles, most of which were written by unskilled journalists. As the site gradually became a social portal easily accessed by thousands of people, each article became a piece of mediation between the writer and a large audience of readers. As a consequence, the owner of the site (in this case, myself) was responsible for all misused information and biased interpretations of facts in the articles posted by portal users. From my own experience I learned that any published article, thought, or even word is part of a mediation process between individuals who can use modern techniques to be heard by others.

Within a few months, this portal – www.globalpublicvoice.com – became a popular technical artifact, which made me morally responsible[i] to the users of the site. The site itself, as a technical object, cannot be considered a moral or immoral tool; it cannot hurt or praise others by itself. However, users of this artifact could direct its features toward their own interests (often by unfair methods) and infringe on the rights of others. I needed to find ways to moderate this challenge professionally, which required involving skilled operators and allocating more financial resources. However, to manage such skilled operators, the owner must understand the nature of the problem him/herself in order to resolve complicated situations. Therefore, I decided to start with myself and enhance my education and expertise in the relevant areas.

Lately, I have realized that I took an improper approach to designing the site in terms of what it means to its users. Now I understand that I first need to clarify how I see this site in terms of goal translation and composition, and how to present these to users while considering factors of time and space[ii]. Also, since this site should be an agent connecting individuals, it needs to meet the important requirements of an “inter-agent” from both perspectives – being instrumentally easy to use, and being a friendly, satisfying interactive tool for users’ needs[iii].

And, maybe last but not least among the problems I faced and want to mention in this short essay, was my personal attitude, or the site’s policy, toward the subjects, problems, and points raised by users. I realized that one of the hardest skills to learn is the ability to maintain neutrality and not use the owner’s privileges to interrupt the communication of others by introducing subjectivity or biased ideology, or by defending some points against others.

In short, the creator and launcher of such social network portals needs to meet certain professional criteria in order to become a communication agent or to design a technical artefact that connects people.

[i] Vermaas, Pieter, Peter Kroes, Ibo van de Poel, Wybo Houkes, and Maarten Franssen. A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems. San Rafael, CA: Morgan & Claypool Publishers, 2011, p. 16.

[ii] Latour, Bruno. Pandora’s Hope. Cambridge, MA: Harvard University Press, 1999.

[iii] Rammert, Werner. “Where the Action Is: Distributed Agency Between Humans, Machines, and Programs.” Open Access Repository (www.ssoar.info), Berlin, 2008, p. 6.

My thoughts on mediation – from the perspective of UX design

The final goal of a product designer is to provide users with the best user experience. User experience (UX) is a complex concept, since its definition varies from industry to industry. Lidwell, Holden, and Butler discuss some basic rules of design in Universal Principles of Design, the reading material from the beginning of this term. Of course, designers from different industries will focus on different principles. But one thing that is undeniable is that UX design requires designers to learn deeply about users’ preferences, mentalities, and subconscious tendencies.

Obviously, it is absurd to separate technologies from society and culture. In fact, everyday products, especially computer software and mobile applications, can be regarded as structurally hierarchical. The bottom layer, of course, is the coding layer, which supports the realization of function through intricate logical computation, and the top layer is the user interface (UI), which interacts with users directly. So, according to the definition of technical artefacts given in A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems, we must discuss the use plan and the practical function together when trying to grasp the essence of a design case. This means that the people involved in the system construction process, not only designers but also the users and participants of a hybrid system such as a civil aviation system, matter as well. Here are two interesting examples of people’s cognition influencing the effect of information transmission.

The first one is about the same book with two different cover designs:


The same book in two versions with different covers

In this case, the Chinese version has a hook-like symbol in the middle of the cover, while the English version uses an X-like symbol. These two symbols, however, express the same meaning of confirmation. Against different cultural backgrounds, people often use diverse, even opposite, gestures or symbols to express the same meaning in the same circumstance. The most famous example may be the traffic rules in most Commonwealth countries, where people drive on the left side of the road. To be honest, localization should be common sense for every designer (see the sketch below).
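To make the localization point concrete, here is a minimal Python sketch, with a purely hypothetical locale-to-glyph table, of selecting a confirmation symbol by locale instead of hard-coding one symbol for all users:

```python
# Illustrative only: a hypothetical locale-to-glyph table showing that the
# same meaning ("confirm") can be carried by different symbols in different
# cultural contexts, as with the two book covers above.

CONFIRM_GLYPHS = {
    "zh": "✓",  # the hook-like mark on the Chinese cover
    "en": "✗",  # the X-like mark on the English cover
}

DEFAULT_GLYPH = "✓"


def confirm_symbol(locale: str) -> str:
    """Return the glyph used to signal confirmation for a given locale."""
    return CONFIRM_GLYPHS.get(locale, DEFAULT_GLYPH)


if __name__ == "__main__":
    for loc in ("zh", "en", "fr"):
        print(loc, confirm_symbol(loc))
```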

The second one is about slides presented by a Baidu designer at a professional conference:


Slides used by two presenters at the same conference (upper: Microsoft; lower: Baidu)

I do not want to say much about the color of the slides (in fact, it reminds me of the blue screen that appears when Windows crashes). The problem here is whether it is a good idea to put selfies of girls on slides to illustrate one’s ideas on such an occasion. Do professional designers from different countries need to understand trends in design development in this way? Maybe young students with high hormone levels are more easily attracted by such slides, and interestingly, this designer did indeed use the slides during campus recruitment. If the audience at the conference is deemed to be the users of this inappropriate presentation, I guess their user experience will not be good.

So, without their users and participants, design and technology would be meaningless.

Now let’s dig deeper into the sociotechnical system. In his book Pandora’s Hope: Essays on the Reality of Science Studies, Latour shows us four meanings of technical mediation. I think we can use the term information flow to explain and understand technical mediation. Latour describes the mental change of a person holding a gun and asserts that a man with a gun can create a totally different effect from what the two agents can bring about separately. Lu Xun, the famous Chinese writer, criticized some conservatives by stating that “Chinese can be imaginative when seeing short sleeves, because they will link the sleeves with white arms, then naked torsos, and finally sex.” In fact, human beings are always good at associating one object with another. Here is a screenshot of the Steam store:


Information for a discounted game in the Steam store

Players will not realize how much they can save when they only see “-70%,” nor how great the discount is when they only view the price change. However, these two things together leave users with the impression that they now have a perfect opportunity to purchase the game. What’s more, the due date on the left, with a countdown clock, reminds players that the chance is fading away. This is extremely effective for players who are hesitating.

Latour also describes a meaning called composition, using the example of a chimpanzee. The search system used by every shopping website is a good example here. For instance, someone who wants to purchase a laptop on Amazon can search for the ideal product by entering color, weight, brand, screen resolution, and many other parameters into the search system and checking the narrowed results. In this process, the need has been divided into subprograms, just as Latour says in his book (see the sketch below).

The last meaning described by Latour reminds me of last week’s readings, which explain the connotations of symbolic cognition. The process from human to nonhuman describes the formation of symbolic language. For example, in the Visual Studio interface, I naturally regard keywords like while, for, and else as representations of logical relationships instead of their primary meanings in English.
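To make the composition idea concrete, here is a minimal Python sketch, with hypothetical product data rather than any real shopping API, of how one goal (“find my laptop”) is divided into small filtering subprograms and then composed back together:

```python
# A minimal sketch of "composition": one shopping goal divided into small
# filtering subprograms and recombined. The product records and fields are
# hypothetical, not any real store's catalog or API.

from dataclasses import dataclass


@dataclass
class Laptop:
    brand: str
    color: str
    weight_kg: float
    resolution: str
    price: float


CATALOG = [
    Laptop("BrandA", "silver", 1.3, "2560x1600", 1299.0),
    Laptop("BrandB", "black", 2.1, "1920x1080", 899.0),
    Laptop("BrandA", "black", 1.1, "2560x1600", 1499.0),
]


# Each subprogram handles one sub-goal of the original need.
def by_brand(items, brand):
    return [x for x in items if x.brand == brand]


def by_color(items, color):
    return [x for x in items if x.color == color]


def under_weight(items, max_kg):
    return [x for x in items if x.weight_kg <= max_kg]


# Composing the subprograms reconstitutes the original goal:
# a light, black laptop from a preferred brand.
results = under_weight(by_color(by_brand(CATALOG, "BrandA"), "black"), 1.5)
for laptop in results:
    print(laptop)
```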


References:
[1] Vermaas, Pieter, Peter Kroes, Ibo van de Poel, Maarten Franssen, and Wybo Houkes. A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems. San Rafael, CA: Morgan & Claypool Publishers, 2011.
[2] Latour, Bruno. Pandora’s Hope: Essays on the Reality of Science Studies. Cambridge, MA: Harvard University Press, 1999.
[3] Debray, Régis. “What is Mediology?” Le Monde Diplomatique, Aug. 1999. Trans. Martin Irvine.
[4] Deacon, Terrence W. The Symbolic Species: The Co-evolution of Language and the Brain. New York, NY: W. W. Norton & Company, 1998.
[5] Lidwell, William, Kritina Holden, and Jill Butler. Universal Principles of Design. Revised ed. Beverly, MA: Rockport Publishers, 2010.

De-blackboxing in research

It is a typical tale that media technologies are received by societies with one of two reactions: either to celebrate them and imagine a utopia in which the technology finally solves one major issue as if by magic, and thus represents a force that can be liberating for individuals and societies, or to condemn and fear them by emphasizing features that can have “negative effects.” We have seen these narratives in many forms with regard to all sorts of media, from books to the telephone, television, and the Internet. A systems view of media, technologies, and sociotechnical artefacts, however, not only presents us with an argument against the technological determinism in these types of responses, but also invites us to consider that our relationship with media technologies, and technologies in general, is not as simple as a “social construction of technology” either.

“To conceive of humanity and technology as polar opposites is, in effect, to wish away humanity: we are sociotechnical animals, and each human interaction is sociotechnical. We are never limited to social ties. We are never faced only with objects.” (Latour, 1999, p.214)

My questions for this week revolve around using the systems view not only to think about design, but also to think about how to use this approach in the research of media technologies. How to work out this de-blackboxing would then be one of the first challenges. The next may be to determine the level at which the de-blackboxing can serve a specific research question. A systems view would mean using the principles of modularity, recursiveness, and combinability to make sense of how different components are combined, how they interact, and the combined effects and dynamics they create, each on its own level, with the broader social system of which they are part. This means that instead of taking media technologies as closed units of analysis, we need to look further, decompose them, and make sense of which level(s) may be more relevant for analysis. When Latour (1999) goes through the eleven layers of his “Myth of Progress,” he explains that each of the sociotechnical layers he discusses is different from the one below/above it, as each has gone through an iteration that has changed it, either on the human/“subjective” side or the nonhuman/“objective” side. Using this to approach a media technology means that an analysis would have to consider not only the role of the technology in a group, but also its evolution, and specifically its evolution with regard to specific groups. The analysis also has to be tailored specifically to those components that are relevant.

As an example, a hashtag on Twitter, as a media technology, could be decomposed in various ways. As a feature of social network sites, it could be decomposed into its technical components (the hashtag serves as a link; it also organizes a page on which all tweets that include it are displayed, ordered by recency, popularity, media use, and other options; it is part of a popular social media outreach repertoire; etc.). As a term used by a social movement, it has a particular social, cultural, and historical context, one that makes it part of a larger system of actors/actants and processes. As a hashtag or key term on the web, it also becomes part of a larger system, one that includes information about the topic and that can be linked across the web. (Or, it may be part of a larger collection of information online but, because it is part of a proprietary platform, it may not actually be linked to all the information to which it could be linked.) Decompositions may go a number of different ways, which is why this approach is helpful in making sense of the different dynamics that take place when we speak of sociotechnical systems. Moreover, another issue to analyze is that of the different iterations of the sociotechnical mutually shaping each other. An analysis of a hashtag would also have to consider how its use has evolved over time, whether there are specific moments that can be considered to define each iteration, what was left behind in each iteration, and so on. But to a certain extent, not all components can realistically be de-blackboxed and analyzed, so defining these types of limits in research design could be a helpful discussion. A rough sketch of the technical layer alone follows below.
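To make the technical decomposition concrete, here is a minimal Python sketch, using hypothetical tweet records rather than the real Twitter API or data model, of the two most basic technical functions named above: linking tweets to a topic and organizing a page of them:

```python
# A minimal sketch of the technical layer of a hashtag, using hypothetical
# tweet records rather than the real Twitter API: the tag both links tweets
# to a topic and organizes a page ordered by recency or popularity.

from dataclasses import dataclass


@dataclass
class Tweet:
    text: str
    timestamp: float  # seconds since epoch
    likes: int


def extract_hashtags(tweet: Tweet) -> set:
    """The hashtag as a link: any '#word' token connects this tweet to a topic."""
    return {word.lower() for word in tweet.text.split() if word.startswith("#")}


def hashtag_page(tweets, tag, order="recency"):
    """The hashtag as an organizer: collect and order all tweets carrying the tag."""
    matching = [t for t in tweets if tag.lower() in extract_hashtags(t)]
    key = (lambda t: t.timestamp) if order == "recency" else (lambda t: t.likes)
    return sorted(matching, key=key, reverse=True)


tweets = [
    Tweet("march planned for saturday #occupy", 1000.0, 52),
    Tweet("photos from today #occupy #downtown", 2000.0, 17),
    Tweet("unrelated post", 1500.0, 3),
]
for t in hashtag_page(tweets, "#occupy", order="popularity"):
    print(t.text, t.likes)
```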

Alibaba’s Dimensions of Mediation

Chen Shen

Alibaba (Taobao) is China’s biggest online commerce company. Founded in 2003, it accounts for 80% of China’s online shopping market, which is estimated to reach 713 billion in 2017. There are three typical ways to shop on Taobao: via a PC’s web browser, or using the Taobao app on a smartphone or tablet. The main differences among these three ways lie in three aspects: layout, commodity detail, and means of payment. Though one can buy basically the same things through each interface, the differences mentioned above cater to different users and lead to a major dichotomy. Here I mainly use the tablet interface as the subject of a case study and try to figure out its different dimensions of mediation.

[Images: PC interface, tablet interface, smartphone interface]

In Latour’s framework of technical mediation:

  • The First Meaning: Interference

To rephrase the core concept of Latour’s first meaning: I am no longer the same when I use this app, nor is the interface; “A third agent emerges from a fusion of the other two.” What is more important, this new agent’s goal is different from the one I used to have. Personally, I can testify to this change, because many a time I was looking for a single simple item but ended up with a whole basket of goods. This rarely happens without the interface, since if one is shopping in a physical store or an unintegrated online store, there is much less interference during the transaction and much more to overcome for a whimsical desire. By simultaneously providing me with pictures, links, and sales, the interface changes my mental state from “I need something” to “do I need these other things?” The interface changes along with me: the original goal of the app is to let users browse commodities, but when it crosses paths with a user with paying ability, a new agent emerges whose goal is to make relatively optional and affordable purchases.

  • The Second Meaning: Composition

To illustrate the composition level of Taobao, I need to regard the interface as a subsystem and its own subsystems as different agents. For example, suppose my task is to seek opinions about a certain nib for which no buyer’s comment is available at hand. First I can browse calligraphy supply stores using the store-level search subsystem; then locate a particular one using the store-scoring subsystem; then find some nibs in that store with the top-seller recommendation function; the next step is to trace possible users who chose similar goods by looking through the transaction history; and after finally locating a certain user whose opinion I value, I can contact him/her using Taobao’s module Aliwangwang, an instant-messaging app between buyers and sellers. In this chain of actions and subprograms, the actant of each step is a composition of the ones mobilized by its precursor; a rough sketch of the chain follows.
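Here is a minimal Python sketch of that chain, using hypothetical placeholder functions rather than Taobao’s real modules, to show how each subprogram hands its result to the next until the original goal is met:

```python
# A minimal sketch (hypothetical placeholder functions, not Taobao's real
# modules) of one goal, seeking an opinion on a nib, carried by a chain of
# subprograms, each building on what the previous one mobilized.

def search_stores(category):
    """Store-level search subsystem."""
    return ["store_a", "store_b"]


def pick_by_score(stores):
    """Store-scoring subsystem: choose the best-rated store."""
    return stores[0]


def top_sellers(store):
    """Top-seller recommendation function within the chosen store."""
    return ["nib_x", "nib_y"]


def buyers_of(item):
    """Transaction-history subsystem: who bought similar goods."""
    return ["buyer_42"]


def contact(buyer):
    """Aliwangwang messaging module between buyers and sellers."""
    return f"message sent to {buyer}"


# The composed chain: each actant is a composition of the ones before it.
store = pick_by_score(search_stores("calligraphy supplies"))
item = top_sellers(store)[0]
buyer = buyers_of(item)[0]
print(contact(buyer))
```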

  • The Third Meaning: The Folding of Time and Space

This aspect of mediation is rather easy to recognize for Taobao as an interface. It serves as a platform for a cornucopia of commodities, each one enmeshed in an internet of things and carrying various histories. Even the interface itself, as shown in the last part, is an aggregation of functional modules. At the code layer, the algorithms and data structures it employs can also be traced back to the dawn of modern programming. Like a telescope in search of millions of stars, in our Taobao case both the “telescope” and the “stars” have different dimensions of time and space wrapped into them.

  • The Fourth Meaning: Crossing the Boundary between Signs and Things

In his A Collective of Humans and Nonhumans: Following Daedalus’s Labyrinth, Latour uses the concept of delegation to explain how humans are folded into nonhumans and vice versa. Using the Taobao interface, the user, as enunciatee, also interacts with different kinds of delegation designed and deployed by enunciators who are now absent from the process. For example, other buyers also play the part of the sleeping policeman. Still using the nib example, each buyer comment on the commodity detail page is an active actant in my purchasing action. But the buyer is obviously absent here; I interact with him/her via the interface. In this way, the human is folded into the nonhuman. And by buying this nib, I myself am also folded into the nonhuman: my preferences, my comments, and my transaction records will remain there to interact with future users until eternity (or at least as long as the server keeps running). By this means, even by purchasing a single nib, I interact with craftsmen, calligraphers, designers, manufacturers, programmers, salesmen, web designers, sales reps, deliverymen, and so on. Their work and labor, even that done long ago, are transferred, abstracted, and encoded in the form of either material or information, and swirled together into this sociotechnical maelstrom.
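To make this folding concrete in software terms, here is a minimal Python sketch, with a hypothetical data model that is not Alibaba’s, of how a buyer’s comment, once stored, keeps acting on later buyers after its author has left the scene:

```python
# A minimal sketch (hypothetical data model, not Alibaba's) of a buyer's
# comment persisting on the server and continuing to act on later buyers
# after its author has left the scene.

from dataclasses import dataclass, field


@dataclass
class Review:
    author: str
    text: str
    rating: int  # 1 to 5


@dataclass
class Listing:
    name: str
    reviews: list = field(default_factory=list)

    def average_rating(self) -> float:
        return sum(r.rating for r in self.reviews) / len(self.reviews)


nib = Listing("steel calligraphy nib")
nib.reviews.append(Review("buyer_42", "smooth flow, slightly stiff", 4))

# Much later, a new buyer consults the stored delegation of past buyers:
if nib.average_rating() >= 4:
    print("add to cart")
```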

  • From another point of view, and my concerns

Not totally agreeing with Latour’s approach of establishing a symmetry and embracing a flattened concept of agency, Rammert adopts another model of agency in his Where the Action Is: Distributed Agency Between Humans, Machines, and Programs. Actions are categorized by causality, contingency, and intentionality. Rammert’s model relies on the nature of the action: whether it is mechanical and predetermined, interactive and self-regulatory, or reflexive and intentional. Using this scheme, we can also analyze which parts of the Taobao interface sit at which level, as sketched below.

But my concern here is a noticeable trend: the human part of this interaction is continuously being demoted from intentionality to contingency to causality. Taobao’s Chinese name, 淘宝, can be roughly translated as “seek treasure,” which clearly emphasizes the human part of the seeker, who intentionally seeks what is valuable to him. But as technology advances and the interface becomes more intelligent, the routine process involves (even tolerates) fewer and fewer intentional actions. The ever-expanding number of entries (800 million at the moment) objectively requires users to choose without thorough thought, usually through some sorting algorithm that is rather opaque. And subjectively, the recommendations, whether based on purchasing history, peer choices, local trends, seasonal trends, or whatever else, deprive users of discretion even further. A great part of online purchasing is now responsive rather than intentional. And in the near future, with sociotechnologies like Instant Ink by HP or the Dash Button by Amazon, some human actions will be demoted to mere irritation. Meanwhile, the nonhuman part is taking more and more control in this hierarchical model of actions.

One may say, isn’t it nice to free us from meaningless labor? But just as, in trivial actions, humans concede predetermined, interactive, and intentional actions successively, isn’t it possible that we concede discretion in trivial matters, less trivial matters, and key matters successively? Latour’s fourth meaning indeed connects the individual with more and more human and nonhuman agencies, but the more one is connected, the less urgent is his discretion. Time after time we see an apocalyptic world in science fiction where humans are finally enslaved; that is truly far beyond reach, but from small traces we can see that humans are losing control, and not in a good way.
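As a rough illustration, here is a short Python sketch applying Rammert’s three levels to parts of the Taobao interaction; the assignments are my own interpretive labels, not Rammert’s or Alibaba’s:

```python
# A rough illustration of Rammert's three levels of agency applied to parts
# of the Taobao interaction. The assignments are my own interpretive labels,
# not taken from Rammert or Alibaba.

from enum import Enum


class Agency(Enum):
    CAUSALITY = "mechanical and predetermined"
    CONTINGENCY = "interactive and self-regulatory"
    INTENTIONALITY = "reflexive and intentional"


interface_parts = {
    "payment processing pipeline": Agency.CAUSALITY,
    "recommendation and ranking algorithms": Agency.CONTINGENCY,
    "the buyer deciding what is worth seeking": Agency.INTENTIONALITY,
}

for part, level in interface_parts.items():
    print(f"{part}: {level.value}")
```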

If you log in to www.taobao.com, you can see a meticulously woven net that ensnares treasures within my scope of interests. My question is: right now we may be the Arachne collecting goodies on the mesh, but who’s to say there won’t come a day when we become the victims in this Daedalean maze and inevitably follow Ariadne’s thread to wherever she means us to go?

References

Vermaas, Pieter, Peter Kroes, Ibo van de Poel, Maarten Franssen, and Wybo Houkes. A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems. San Rafael, CA: Morgan & Claypool Publishers, 2011.

Debray, Régis. “What is Mediology?” Le Monde Diplomatique, Aug. 1999.

Latour, Bruno. “A Collective of Humans and Nonhumans: Following Daedalus’s Labyrinth.” Cambridge, MA: Harvard University Press, 1999.

Rammert, Werner. “Where the Action Is: Distributed Agency Between Humans, Machines, and Programs.” 2008.

Irvine, Martin. “Working with Mediology and Actor Network Theory: How to De-Blackbox an iPhone.”

A Simple Sociotechnical Interpretation of Artificial Intelligence – Jieshu

“The price of metaphor is eternal vigilance[i].” –Norbert Wiener

This is a quotation in Professor Irvine’s Working with Mediology and Actor Network Theory, and it is also what I would like to start with if I were to explain a different way of thinking about design to someone who interprets the relationship between technology and society in the fashion of popular literature.

When trying to digest a new thing, metaphor is an oversimplified but useful tool for grasping the basic concepts: for example, the word “digest” I just used, “black hole” in physics, or “content” in media technology. We don’t literally eat the things we try to understand, a black hole is not actually a hole, and content is not invariably something inside a container. A black hole in physics is a region of space with a huge mass in a small volume. Content is an interface (or set of interfaces) mediating different components in sociotechnical systems. Metaphor is a black box that conveniently cuts our inquiry short.

Here, I will try to use Artificial Intelligence (AI) as an example to elaborate on how a systems view can help us think differently about future designs. I don’t know whether my interpretation is correct, since this week’s readings are very abstract, but I’ll try to use the concepts I have learned to do so.

When considering the future of AI, there are two polar camps of thought.

  • One is that AI will one day outsmart and ultimately destroy us if it is not under our control, a view represented by Elon Musk and Stephen Hawking.
  • The other is that AI will not be smarter than us, and that even if it achieves high intelligence, it will not be an existential risk to us; this view is mostly held by computer scientists and engineers who devote all their work to this field. It resembles the position of the National Rifle Association, which holds to technological neutrality and insists that it is people, not guns, that kill people, a stance discussed both in Bruno Latour’s “On Technical Mediation,” excerpted from Pandora’s Hope[ii], and in Pieter Vermaas et al.’s A Philosophy of Technology.

First of all, I think both camps make the mistake of seeing AI as a mere technical artifact, defined by Pieter Vermaas et al. in A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems as “physical objects designed by humans that have both a function and a use plan”[iii], instead of as a collective agency in a sociotechnical system[iv]. They interpret AI either as a means to fulfill human expectations and goals or as a natural object with a bundle of functions and structures.

The root of their mistake, I think, lies in a dualistic point of view that sees society and technology as two separate things. In their view, therefore, AI belongs to the technological domain and merely has the potential to impact our social domain.

However, AI is more than that. From a sociotechnical-system point of view, AI is more like an interface that mediates the components of a hybrid system, with all its people and factors coming from different areas, both technical and social.

AI can be integrated into many systems. In fact, they are woven together seamlessly, serving as data givers and takers at the same time and constantly forming new systems. For example, an iPhone user asks Siri for the location of a nearby mall. Here, AI is an interface that combines the user, the iPhone, Siri, and geographic information into a new system that is responsible for what happens next, including the user checking the mall’s website, Google Maps suggesting a route, and the user calling an Uber to get there. Another example is the system formed by an AI and its designers. Just like the system formed by a gun and a man, whether the AI will do harm to people is determined not by the designers or the AI alone, but by the new system.

In Artificial Intelligence: A Modern Approach, Stuart Russell and Peter Norvig take the dimension of “acting rationally” in artifacts and see AI as a rational agent that does the right thing given the perceived environment[v]. There are two main points here: responding to the environment and doing the right thing. The former requires sensors to perceive the world, and the latter needs powerful algorithms to turn the data into optimal decisions. In line with the principle of modularity we learned about weeks ago, both parts have quite a few branches of disciplines probing into them. For example, on the algorithm side, some people prefer supervised learning while others prefer unsupervised learning; both have many clever methods, brilliant proponents, and long histories. Meanwhile, many standards have been established and shared by people in this community.
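To make the rational-agent framing concrete, here is a minimal Python sketch, with a hypothetical environment and decision rule rather than anything from Russell and Norvig, of the perceive-decide-act loop the definition implies:

```python
# A minimal sketch of a rational agent's perceive-decide-act loop. The
# environment, percept, and decision rule are hypothetical, purely to
# illustrate the structure implied by "acting rationally."

def perceive(environment):
    """Sensors: read the part of the world the agent can observe."""
    return {"temperature": environment["temperature"]}


def decide(percept):
    """Policy: choose the action expected to do the right thing."""
    return "cool" if percept["temperature"] > 24 else "idle"


def act(environment, action):
    """Actuators: the chosen action changes the environment."""
    if action == "cool":
        environment["temperature"] -= 1


environment = {"temperature": 27}
for step in range(5):
    percept = perceive(environment)
    action = decide(percept)
    act(environment, action)
    print(step, percept, action)
```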

So AI not only serves as an interface for Siri users, but also combines and mediates engineers, scientists, companies, philosophers, smart devices, industries, standards, information, academic journals, conferences, historical events, databases, institutions, and so on: technically, socially, spatially, and temporally.

In system design, we should ask two questions: where are the system’s boundaries, and how do we control its predictability[iii]? As for the question of whether AI will destroy us, I think the scope of the system in question should dynamically include all the stakeholders and factors, with rules and instructions keeping the sociotechnical system running. Laws and standards should also be established to direct these new systems. I am not able to give specific suggestions for future AI, but I believe that its role in the sociotechnical system, and the debates around it, will change over time, because the people discussing these questions and the sociotechnical structure itself are both being continually mediated.


References

[i] Irvine, Martin. n.d. “Working with Mediology and Actor Network Theory: How to De-Blackbox an iPhone.”

[ii] Latour, Bruno. 1999. Pandora’s Hope: Essays on the Reality of Science Studies. Cambridge, Mass: Harvard University Press.

[iii] Vermaas, Pieter E., ed. 2011. A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems. Synthesis Lectures on Engineers, Technology, and Society, #14. San Rafael, Calif.: Morgan & Claypool Publishers.

[iv] Rammert, Werner. 2008. “Where the Action Is: Distributed Agency Between Humans, Machines, and Programs.”

[v] Russell, Stuart J., and Peter Norvig. 2010. Artificial Intelligence: A Modern Approach. 3rd ed. Prentice Hall Series in Artificial Intelligence. Upper Saddle River, N.J: Prentice Hall.