
The War That Changed The World


(Fig. 1) Dr. J.W. Mauchly makes an adjustment to ENIAC, the massive computer he designed to assist the U.S. military during World War II.


This paper provides an overview of the technological advances of World War II in their historical context, the institutions and individuals that played key roles in the creation of computers, and the impact of those advances on our current technology within the context of armed conflict. It aims to go beyond ‘what if’ scenarios and take a closer look at the moments in history that marked a before and after for computing technology, the key actors behind them, and how they shaped and defined the computers we use today and the ways we use them.

Topics, concepts and keywords:

  • History of technology development during and after World War II.
  • University research and government funding for the development of technology during wartime.
  • Computing, graphical interfaces for human interaction, and combinatorial technologies.
  • Keywords: World War II, Department of Defense, DARPA, ARPA, ENIAC, Cold War, MIT, MIT Lincoln Labs.

Research questions

  • In what ways is the current technology around computers influenced by the technological achievements of World War II?
  • Within this context, under what circumstances does the combination of international conflict and government involvement with university research teams motivate the advancement of technology?


In popular culture, it is common to refer to our current times as the age of technology. We live in a world that is not only intrinsically tied to technology but also incredibly dependent on it. This trend is not entirely new; we have been shaped by technological advances for a very long time, since well before the harnessing of electricity. There is no denying, however, that the pace of technological advancement has sped up drastically in the last half century. Not long ago we lived in a world without the Internet, cellphones, GPS, or digital cameras, to name just a few. More surprisingly, technology now advances so fast that new devices often render their predecessors obsolete within a few years.


(Fig. 2) Source: Google images.

Society marvels at new technological advances in different fields and wonders, “How is it possible?” The rapid pace and the opaque, black-boxed character of modern technological advancement make it seem magical, almost inevitable and unstoppable. To demystify technology as an autonomous entity that magically evolves independently of us, it is important to ask: what happened 50-60 years ago that unleashed this phenomenon? Who played a part in it? And how did it affect the current state of our technology?

A snapshot in time

To begin to answer our question, it is necessary to look at what was happening in the world at the time. Upon analysis, we find that it was not one specific event but rather an interdependent chain of events that happened with perfect timing. Likewise, it was not one specific individual but a group of different actors and institutions whose actions determined the path technology would take in the future.

Even though technology remains very much present and a determining factor in later conflicts, and although earlier inventions from World War I served as ancestors on which new technology was built, no war has had a greater impact on the technology of our current lives than World War II (1939-45).

It was a peculiar moment in history in which a unique combination occurred: the urgent need for technological advances to defeat the enemy coincided with an intellectual flourishing of revolutionary ideas in the field. Government and private-sector funding joined forces with academic research in the United States, at institutions such as MIT and Stanford, and the result was not only the victory of the Allies; its effects still resonate in the way we interact with technology in our everyday activities.

(Fig. 3) Transportation technology advances in World War II included amphibious landing vehicles, aircraft carriers, vastly improved tanks, the first appearance of helicopters in combat support roles, long-range bomber aircraft, and ballistic missiles.

Many technologies and scientific discoveries were adapted for military use. Major developments happened in such a short period that it is difficult to study and analyze all of them in this limited space. To name a few, there were design advances in weapons, ships, and other war vehicles, and improvements in communications and intelligence through devices such as radar, which allowed not only navigation but also remote location of the enemy. Other fields drastically influenced by these advances were medicine and the creation of nuclear, biological, and chemical weapons, the most notorious case being the atomic bomb.

On this subject, Dr. David Mindell of MIT draws attention to a few specific cases and their impact, both on the war and its outcome and on the current state of our technology:

“We can point to numerous new inventions and scientific principles that emerged during the war. These include advances in rocketry, pioneered by Nazi Germany. The V-1 or “buzz bomb” was an automatic aircraft (today known as a “cruise missile”) and the V-2 was a “ballistic missile” that flew into space before falling down on its target (both were rained on London during 1944-45, killing thousands of civilians). The “rocket team” that developed these weapons for Germany were brought to the United States after World War II, settled in Huntsville, Alabama, under their leader Wernher von Braun, and then helped to build the rockets that sent American astronauts into space and to the moon. Electronic computers were developed by the British for breaking the Nazi “Enigma” codes, and by the Americans for calculating ballistics and other battlefield equations. Numerous small “computers”—from hand-held calculating tables made out of cardboard, to mechanical trajectory calculators, to some of the earliest electronic digital computers, could be found in everything from soldiers’ pockets to large command and control centers. Early control centers aboard ships and aircraft pioneered the networked, interactive computing that is so central to our lives today” (Mindell, 2009).


(Fig. 4) The V-1 or “buzz bomb”, an early automatic aircraft (cruise missile) used during World War II.

(Fig. 5) Radar system in operation in Palau during World War II.

The history of how all of these advances came to be is fascinating, and it would be easy to get sidetracked into analyzing each of them. However, this paper does not aim to be a mere recounting of facts already well documented by historians. Let us instead look at the specific case of advances in computing, probably the biggest, if not the main, takeaway from World War II.

Even though ‘computing’ as a way of thinking and seeing the world, including computing machinery, had existed long before these events, there is no denying that the leap of the last 50-60 years has been staggering, and we owe it in large part to the research and funding achieved during and after World War II.

As a field, computing formally began in the 1930s, when scholars such as Kurt Gödel, Alonzo Church, Emil Post, and Alan Turing published revolutionary papers, such as “On Computable Numbers, with an Application to the Entscheidungsproblem” (Turing, 1936), that established the importance of automatic computation and gave it mathematical structure and foundations.

Alan Turing

(Fig. 6) Alan Turing, considered the father of computer science.

The Perfect Trifecta: University Research Teams + Government Funding + Private Sector

Before World War II, the most important analog computing instrument was the Differential Analyzer, developed by Vannevar Bush at the Massachusetts Institute of Technology in 1929: “At that time, the U.S. was investing heavily in rural electrification, and Bush was investigating electrical transmission. Such problems could be encoded in ordinary differential equations, but these were very time-consuming to solve… The machine was the size of a laboratory and it was laborious to program it… but once done, the apparatus could solve in minutes equations that would take several days by hand” (Mindell, 2009).

(Fig. 7) Vannevar Bush (1890–1974) with his differential analyzer. Bush joined MIT at age 29 as an electrical engineering professor and led the design of the differential analyzer. During World War II, he chaired the National Defense Research Committee and advised President Franklin D. Roosevelt on scientific matters. (Source: Computer History Museum)

During World War II, the U.S. Army commissioned teams of women at the Aberdeen Proving Ground to calculate ballistic tables for artillery. These were used to determine the angle, direction, and range needed to hit a target effectively. The process, however, was error-prone and took considerable time, and the teams could not keep up with the demand for ballistic tables. In light of this, the Army commissioned the first electronic computing machine project, the ENIAC, at the University of Pennsylvania in 1943: “The ENIAC could compute ballistic tables a thousand times faster than the human teams. Although the machine was not ready until 1946, after the war ended, the military made heavy use of computers after that” (Denning & Martell, 2015).

(Fig. 8) 1946, ENIAC programmers Frances Bilas (later Frances Spence) and Betty Jean Jennings (later Jean Bartik) stand at its main control panels. Both held degrees in mathematics. Bilas operated the Moore School’s Differential Analyzer before joining the ENIAC project. (Source: Computer History Museum).

This is one of the first examples of government and university research teams combining to fund and advance technology. It was not, however, the only such project in the world at the time. In fact, the only one completed before the war ended was the top-secret project at Bletchley Park, UK, which cracked the German Enigma cipher using methods designed by Alan Turing (Denning & Martell, 2015).

Nevertheless, projects such as ENIAC (1943, US), UNIVAC (1951, US), EDVAC (1949, US, a binary serial computer), and EDSAC (1949, UK) provided ground-breaking achievements that later allowed the design of more efficient, reliable, and effective computers: “Even relatively straightforward functions can require programs whose execution takes billions of instructions. We are able to afford the price because computers are so fast. Tasks that would have taken weeks in 1950 can now be done in the blink of an eye” (Denning & Martell, 2015).

These projects sparked a flourishing of ideas that transformed computing into what it is today. Computers changed from mere calculators into information processors, and pioneers John Backus and Grace Hopper played key roles in that shift. In 1957, Backus led the team that developed FORTRAN, a language for numerical computation. In 1959, Hopper led the team that developed COBOL, a language for business records and calculations. Both programming languages are still used today: “With these inventions, the ENIAC picture of programmers plugging wires died, and computing became accessible to many people via easy-to-use languages” (Denning & Martell, 2015).

(Fig. 9) 1952, Mathematician Grace Hopper completes A-0, a program that allows a computer user to use English-like words instead of numbers to give the computer instructions. It possessed several features of a modern-day compiler and was written for the UNIVAC I computer, the first commercial business computer system in the United States. (Source: Computer History Museum).

The role of government funding during this period was essential, but it went beyond granting money to university research teams. In February 1958, President Dwight D. Eisenhower ordered the creation of the Advanced Research Projects Agency (ARPA), later renamed the Defense Advanced Research Projects Agency (DARPA), an agency of the United States Department of Defense whose mission is the development of emerging technologies for military use. International armed conflict not only played a part in the creation of this agency; it was the reason behind it. On the climate surrounding its creation:

“ARPA [originally] was created with a national sense of urgency amidst one of the most dramatic moments in the history of the Cold War and the already-accelerating pace of technology. In the months preceding [the creation] … the Soviet Union had launched an Intercontinental Ballistic Missile (ICBM), the world’s first satellite, Sputnik 1… Out of this traumatic experience of technological surprise in the first moments of the Space Age, U.S. leadership created DARPA” (Official website).

The agency states its purpose clearly: “the critical mission of keeping the United States out front when it comes to cultivating breakthrough technologies for national security rather than in a position of catching up to strategically important innovations and achievements of others” (official website). From this description, it is not difficult to conclude that tension between countries over armed conflict definitely impacts their willingness to invest in the creation of new technology.

The projects funded by this agency since its creation have provided significant technological advances, with an impact reaching far beyond military uses. The most ground-breaking were the early stages of computer networking and the Internet, along with developments in graphical user interfaces, among others.

(Fig. 10) 1962, J. C. R. Licklider, first director of DARPA’s Information Processing Techniques Office (IPTO) discusses concepts with students at MIT. (Source: DARPA)

Along the same lines, the Department of Defense, in collaboration with the Massachusetts Institute of Technology, created the MIT Lincoln Laboratory as a research and development center focused on applying advanced technology to problems of national security: “Research and development activities focus on long-term technology development as well as rapid system prototyping and demonstration… The laboratory works with industry to transition new concepts and technology for system development and deployment” (Freeman, 1995).

Other institutions, such as the Stanford Research Institute, likewise grew out of the combination of university work and government funding after World War II and continue to develop technology to better the lives of the public. Among its accomplishments are the first prototype of the computer mouse, inkjet printing, and involvement in the early stages of ARPANET.

When the future becomes now

Many people involved in projects created during World War II went on to start computer companies in the early 1950s, and by the late 1950s universities had begun offering programs of study in the new field. Computer Science programs were founded in 1962 at Purdue University and Stanford University, facing early criticism from scholars who believed there was nothing new outside of mathematics and engineering. “The field and the industry have grown steadily ever since, into a modern behemoth whose Internet connections and data centers are said to consume over 3% of the world’s electricity” (Denning & Martell, 2015).

Over the years, computing provided new insights and developments at such a pace that, in a matter of a few decades, it advanced further than many older fields had over far longer spans: “By 1980 computing had matured in its understanding of algorithms, data structures, numerical methods, programming languages, operating systems, networks, databases, graphics, artificial intelligence, and software engineering” (Mindell, 2009).

The first forty years or so of the new field focused on developing and perfecting computing technology and networks, providing ground-breaking results that suited it for combinatorial design and further advancement. In the 1980s another shift began: interaction with other disciplines and the computational sciences: “Recognizing that the computer itself is just a tool for studying information processes, the field shifted its focus from the machine itself to information transformations” (Denning & Martell, 2015).

The biggest advances of this field have been integrated into our world seamlessly, shaping not only our lives but the way we see and interact with that world. Design achievements such as the microchip, the personal computer, and the Internet not only introduced computing into public life but also sparked the creation of new subfields. This effect replicates itself almost like a cycle, as Denning and Martell explain: “Network science, web science, mobile computing, enterprise computing, cooperative work, cyberspace protection, user-interface design, and information visualization. The resulting commercial applications have spawned new research challenges in social networks, endlessly evolving computation, music, video, digital photography, vision, massive multiplayer online games, user-generated content, and much more” (Denning & Martell, 2015).

(Fig. 11) Evolution of the computer. (Source: Google Images)

David Mindell clearly expresses this marvelous achievement: “Perhaps the single most remarkable development was that the computer—originally designed for mathematical calculations—turned out to be infinitely adaptable to different uses, from business data processing to personal computing to the construction of a global information network”. (Mindell, 2009)


What if World War II hadn’t happened? Would our current technology be at the stage that it is today? In what ways would it be different? How long would it have taken us to achieve these technological advancements if military conflict wasn’t present in the context?

Such hypothetical questions plagued my mind when I started this research, and there is no clear answer to them. The impact World War II had on society is undeniable and impossible to measure. The world was never the same, and no field was left untouched. International relations and diplomacy, with the creation of the UN and the Universal Declaration of Human Rights, and world politics, especially in Europe, were forever changed, leading to dictatorships and further armed conflict in the region. Other fields, including physics, biological weaponry, engineering, medicine, and genetics, also went through drastic changes sparked by the events of this period, which in turn led to later conflicts such as the Cold War and the development of nuclear weapons by various nations.

At the core of all these changes is technology. World War II and its impact on the development and advancement of technology shaped the world as we know it now, in ways that we’re still trying to comprehend and address.

Would technology be less mature, robust, or advanced if World War II hadn’t happened? Probably, but more in terms of a change of pace than of a different path. There were astounding technological advances before the war, and technological achievements not sparked by military conflict continue today. However, wartime stimulates inventiveness and advances because governments are more willing to spend money, with urgency, on revolutionary and sometimes risky projects.

In the specific case of World War II, the creation of computers was the result of different actors and institutions (universities, government agencies, computer scientists, and researchers), with varied interests, pushed by armed conflict to work together with perfect timing in one of the most drastically world-changing cases of serendipity in history. It is the ‘before and after’ not only of our generation but of our civilization.





  • Campbell-Kelly, Martin. “Origin of Computing.” Scientific American 301, no. 3 (September 2009): 62–69.
  • DARPA official website:
  • Denning, Peter J., and Craig H. Martell. Great Principles of Computing. Cambridge, Mass.: MIT Press, 2015.
  • Freeman, Eva C. MIT Lincoln Laboratory: Technology in the National Interest. Lexington, Mass.: MIT Lincoln Laboratory, 1995.
  • Geiger, Roger L. Research and Relevant Knowledge: American Research Universities Since World War II. Transaction Publishers, 2008.
  • Hall, Daniel, and Lewis Pike. “If the World Wars hadn’t happened, would today’s technology be less advanced?” Guru Magazine, web source:
  • Mindell, David. “The War That Changed Your World: The Science and Technology of World War II.” Introductory essay for the exhibition “Science and Technology of World War II” at the National WWII Museum, 2009. Web source:


  • Fig. 1:
  • Fig.2: Google Images.
  • Fig. 3:
  • Fig. 4 and 5:
  • Fig. 6:
  • Fig. 7:
  • Fig. 8:
  • Fig. 9:
  • Fig. 10:
  • Fig. 11:

Comparative Analysis between Smart Phone Cameras and Digital Cameras

In What Ways Can Smartphone Cameras Not Replace Dedicated Cameras?

Jiaxin Liu

CCTP 820 Leading by Design

Instructor: Dr. Martin Irvine


With the rapid development of digital technologies and optimization software that can mimic the look and “style” of film cameras, smartphone cameras now seem able to replace digital cameras. Hence my research questions: can smartphone cameras replace digital cameras? For which kinds of customers can cellphone cameras replace digital cameras, and for which can they never do so? This paper is therefore a comparative analysis of digital cameras and cellphone cameras. My argument is that, for ordinary consumers, cellphone cameras are more portable, convenient, and useful; the optimization software in cellphones can reprocess images to “perfect” their size, color, and quality. However, for professional photographers, graphic artists, and anyone who demands the highest image quality, smartphone cameras are not enough; they still need a DSLR to take high-quality photos. I will collect data, surveys, and case studies, and I will support my argument by analyzing them.


Nowadays, smartphone camera companies are constantly working on simulating the standard camera: creating the filters consumers use to process images, improving image quality, and developing optimization software to mimic the “look” and feel that film cameras give people. It is an irreversible trend that smartphone cameras are taking over the market, and fewer people will choose heavy, bulky, and relatively expensive digital cameras, especially DSLRs. Some people therefore argue that the smartphone camera can replace the standard camera, being more portable and easier to use. But that holds at the level of the ordinary consumer, not the serious photographer; I want to explore, from a design perspective, the degree to which smartphone cameras cannot replace standard cameras.

In this paper, I first introduce the shared history of smartphone cameras and digital cameras, to establish their common ground and to explain why some people argue the smartphone camera can replace the digital camera. In the second part, I demonstrate the rapid development of cellphone cameras, show how smartphone developers are working on software that simulates the standard camera, and explain, by analyzing the data and surveys I collected, for which kinds of customers cellphone cameras can replace digital cameras. However, for the few customers who prioritize image quality, such as professional photographers, the cellphone camera can never replace the Digital Single-Lens Reflex camera (DSLR).

In the next part, I distinguish the differences between smartphone cameras and DSLRs and explain, using case studies and data analysis, why the cellphone camera cannot replace the professional camera. The major differences lie in the user’s control over ISO, aperture (a smartphone camera has a fixed aperture, while photographers can change the aperture on a DSLR according to their position), and shutter. In addition, a high-quality DSLR can capture more light information. With a DSLR, professional photographers can use deeper focus to create artistic effects and faster shutter speeds to capture fast motion. DSLRs also have larger sensors, which allow professional photographers to take extremely high-definition photos. Finally, I also want to point out the difference in physical design between the smartphone camera and the DSLR. A well-designed camera “melds” into people’s hands and allows a line-of-sight connection with the subject. The design of the digital camera follows the principle of affordance: people “intuitively” know how to use it when they hold the camera and view the scene through the viewfinder.

The Common Ground for Digital Cameras and Smart Phone Cameras

Tracing back through history, the word photography derives from two Greek terms: phos and graphe. The first means light; the second means writing or drawing (Osterman, n.d.). From these two roots, photography can be literally interpreted as using light to write and draw. Painting, writing, and drawing, or what we may call 2D image-substrate technologies, simulate the monoptically projected image, and light is the essential element of the photo-making process. According to Dr. Irvine, “photography is based on a lens projection from light reflected off a three-dimensional spatial source, the photographic image will always embody a direct analogy with the human eye” (n.d.). For individuals, photographs record significant moments in time; for the human world, photography, as part of the semiotic system, records history.

Light, as one of those roots, plays an irreplaceable role in photography. All cameras, including digital cameras, film cameras, and cellphone cameras, share the same optical principle. In the 11th century, the principle of the camera obscura was described by the Arab scholar Alhazen. As Figure 2 shows, an image can be projected through a small hole into a large room: the image of an object passes through a small hole in a screen, but reversed and inverted. Light travels in straight lines, so the rays reflected from an object pass straight through the small hole in the membrane. A ray from a high point on the object travels downward through the hole to a low point on the wall, while a ray from a low point travels upward to a high point on the wall; rays from the left and right cross over in the same way (Sagers & Patterson, 2010). In the 19th century, the camera obscura box was developed into the photographic camera, and the hole in the membrane became the aperture in the lens of modern cameras.
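This straight-line geometry can be sketched numerically: the image is inverted, and its size follows from similar triangles on either side of the hole. The function and the distances below are illustrative assumptions, not taken from the sources.

```python
# Pinhole (camera obscura) projection: because light travels in straight
# lines through the hole, the projected image is inverted, and its height
# is the object's height scaled by the ratio of image distance to object
# distance (similar triangles).

def pinhole_image_height(object_height, object_distance, image_distance):
    """Return the projected height; the negative sign marks inversion."""
    return -object_height * (image_distance / object_distance)

# Illustrative values: a 2 m tall subject standing 10 m from the hole,
# projected onto a wall 0.5 m behind it.
print(pinhole_image_height(2.0, 10.0, 0.5))  # -0.1 (a 10 cm image, upside down)
```

Moving the wall farther from the hole enlarges (and dims) the image, which is why the 19th-century photographic camera fixed both distances inside a box.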

In modern times, photography has been recast in digital terms. With the advent of binary code, digital photography became the process of assigning numbers in binary code to form an image from light. In a digital image, the pixel is the smallest unit of the picture shown on the screen. Though the technology for recording an image has changed, the principle of managing light to record a correct exposure remains the same. In 1975, the first digital camera was invented by Steven Sasson, an engineer at Eastman Kodak. The camera weighed 8 pounds, and its shutter was much slower than in most cameras today: it took about 23 seconds to record a 10,000-pixel photograph. It is nonetheless a milestone in the history of cameras and is considered one of the most important cameras in the world, because it was the pioneer that took the first digital image, in black and white (Sagers & Patterson, 2010).

During the 20th century, cameras gained the flash, color photographs, and telephoto and wide-angle lenses. But the basic parts of any camera remain the same: lens, iris or diaphragm, shutter, and medium (Ron, 1944). Different objects reflect different light rays; the lens focuses and captures those rays and turns them into an image. The diaphragm, or iris, determines the amount of light that can enter the medium and controls the depth of field. The shutter speed determines the time of exposure (White, 2007). The medium is the material on which the light rays are transformed into recorded images.
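The trade-off between the aperture and the shutter described above is commonly summarized with the exposure value, EV = log2(N²/t), where N is the f-number and t the exposure time in seconds. The settings below are illustrative, a minimal sketch rather than anything drawn from the cited sources.

```python
import math

def exposure_value(f_number, shutter_seconds):
    """EV = log2(N^2 / t); each +1 EV step ("one stop") halves the light."""
    return math.log2(f_number ** 2 / shutter_seconds)

# Halving the exposure time at the same aperture costs exactly one stop:
ev_fast = exposure_value(4.0, 1 / 250)  # f/4 at 1/250 s
ev_slow = exposure_value(4.0, 1 / 125)  # f/4 at 1/125 s
print(round(ev_fast - ev_slow, 3))  # 1.0
```

This is why a photographer can trade a wider aperture for a faster shutter and still record the same total exposure on the medium.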

With the development of digital technologies and the smartphone, people have found they can make high-quality photos with their cellphones. Digital cameras are heavier, less portable, and relatively expensive for consumers. The goal of taking photos, for an ordinary consumer, is to record and preserve the significant moments of life, and the smartphone can now do the same thing at a lower cost. We can take Apple and Huawei as examples of what smartphone companies have done to improve their built-in cameras. In 2007, traditional digital camera companies treated the iPhone camera as a joke when it first launched. In 2010, Apple began paying more attention to its cellphone camera, because that year Instagram came out and created a trend of recording life through photos. Compared to digital cameras, smartphone cameras are more accessible to ordinary people, and the smartphone itself works as the interface between the camera and Instagram: photos taken by the cellphone camera can be imported directly into Instagram. In the same year, the iPhone 4 came out, the first iPhone to add a front-facing VGA camera alongside a rear-facing camera with a 3.85 mm f/2.8 lens and an LED flash. The rear-facing camera could record HD video in 720p, equivalent to a point-and-shoot camera of the time.

Apple kept developing its camera after tasting the big success of the iPhone 4. The iPhone 7 Plus improved on its lens; it has two: a 28-mm 12-megapixel lens and a 56-mm 12-megapixel telephoto lens. Apple also managed to pack in premium features: longer exposures, a wider aperture, and the ability to shoot digital negatives, which professionals call DNGs. A DNG is, essentially, a photo file that captures all the visual information possible for further manipulation, such as enhancing shadows or removing highlights. The new iPhone uses circuitry, software, and algorithms to create images that look and feel as if they came out of high-end cameras.

Apple is not the first or only smartphone company to put effort into improving cameras; Huawei has worked with Leica for a long time and has shown its ambition by trying to create the best “cellphone camera ever”. The following photo was taken by a Huawei P9. We can tell that the lens captures more light information, and thus the color is more natural. One feature worth highlighting is the “pro” mode, a fanciful name for manual control: it simulates professional cameras and allows users to set the ISO (from 50 to 3200) and the shutter speed (from 1/4000 s to 30 s) (Tan, 2016). The appearance of this kind of smartphone camera reveals the future of the category: its features will become ever more similar to those of professional cameras.


From these cases, we can see what smartphone companies have done to improve their cameras. According to data from the NPD Group, 27 percent of photos shot this year were taken by smartphone cameras, while last year the figure was 17 percent. Accordingly, the share of photos shot by digital cameras dropped from 52 to 44 percent. The figure shows the comparison between the growth in smartphone camera use and the decline in digital camera use. This phenomenon shows that the rapid growth of smartphone cameras has started to disrupt the use of traditional digital cameras. People now have more faith in smartphone cameras, and in the future the contrast between cellphone camera use and digital camera use will become even sharper.

In the meanwhile, optimization software has had a great influence. The camera is also an interface for apps (optimization software) that further process the recorded images. For example, Photoshop and Lightroom allow people to adjust brightness, white balance, and effects, and even change details of a photo. Therefore, even if a cellphone cannot take high-quality photos, users can reprocess them in Photoshop or Lightroom.

How Professional Cameras Function Differently from Smartphone Cameras

The idea of “making a photograph” was promoted by Ansel Adams. Instead of passively “taking” a photo to record an image, he argued that a well-designed photograph is art crafted by the photographer in his or her own genre (Adams, 1935). The photo here is not waiting to be taken: a professional photographer should have a plan and know what he or she really wants. Namely, it should be an intentional artifact. And photographers have created a visual culture. Photography reflects numerous human cultures: realism, modernism, and postmodernism. It is not only a visual culture that allows people to appreciate its beauty aesthetically, but one with deeper social meanings. It helps people remember. Photography is already rooted in human society and deeply connected with culture and politics; it has become a cultural symbol in human history. For example, this figure was taken during the Vietnam War. It is art, but it is also associated with politics and society. This photo helps people remember the horrors of the war. Only a professional camera had the chance to capture such an important moment and record it in relatively high quality, making people feel its historical depth.

For professional and amateur photographers, the quality of the photo has priority. Professional cameras, for example DSLRs, usually have larger sensors. They can gather more light and offer more depth-of-field control. One of the main criticisms of smartphone cameras is the lack of shallow depth of field: the tiny sensor with a wide-angle lens design delivers images with extensive depth of field, frustrating photographers who are used to using shallow depth of field for creative effects. Here I would like to take the Canon 5D Mark III as a case. The sensor of the 5D Mark III is roughly 50 times bigger than that of the iPhone 6. That large sensor allows photographers to get images that are physically impossible with a phone. Its ISO value can reach 6,400, while the maximum for the iPhone 6 is ISO 800. From this comparison, the constraints of the cellphone camera can be seen: with a cellphone camera, photographers cannot really control ISO, aperture, and shutter speed to any large degree, and hence the quality of images recorded by a cellphone is not as good as a DSLR’s. For many professional photographers, the use of a DSLR has already become a habit, even an obsession; their control over the camera’s settings gives them satisfaction.
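As a rough back-of-the-envelope check on the “50 times bigger” claim, we can compare the two sensor areas. Note these are approximate dimensions (the iPhone 6 figure is an estimate for its 1/3″-type sensor, not an official Apple specification):

```python
# Approximate sensor dimensions, in millimeters.
full_frame_w, full_frame_h = 36.0, 24.0   # Canon 5D Mark III (full frame)
iphone6_w, iphone6_h = 4.8, 3.6           # iPhone 6 (1/3"-type, estimated)

full_frame_area = full_frame_w * full_frame_h   # 864 mm^2
iphone6_area = iphone6_w * iphone6_h            # 17.28 mm^2

ratio = full_frame_area / iphone6_area
print(f"Full-frame sensor is about {ratio:.0f}x larger")  # about 50x
```

The area ratio, not the diagonal, is what matters for light-gathering, which is why the gap sounds so dramatic.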


Getting a glimpse of the current situation, cell phone cameras have gradually become the preferred way of recording images for ordinary consumers. However, for many professional and amateur photographers and graphic artists, digital cameras, especially DSLRs, cannot be replaced by smartphone cameras because of their high-quality images and the photographic culture they create. Photography has become a cultural symbol rooted in human society. Smartphone cameras may never completely replace professional and digital cameras.


Adams, A. (1935). Making a Photograph: An Introduction to Photography (2nd ed.). Studio.

Burgin, V. (1982). Thinking Photography. London: Macmillan.

Chesher, C. (2012). Between Image and Information: The iPhone Camera in the History of Photography. University of Sydney.

Estes, J. E. (2005). The Camera Obscura. University of California at Santa Barbara, Santa Barbara, CA.

Nelson, T. (2016). Review: The Leica-Branded Huawei P9 is Impressive…for a Phone Camera.

Slivka, E. (2010, June 8). “A Look at iPhone 4’s Camera Quality”. MacRumors. Retrieved June 20, 2010.

Sagers, S., & Patterson, R. (2010). History of Photography. Utah State University.

White, R. (1994). How Digital Photography Works (2nd ed.). ISBN 0-78973630-6.

Deblackboxing: Taobao Leading by Design

The most powerful things about our technologies are invisible.

— Irvine’s Law


Taobao is a Chinese Internet-based third-party platform where many small businesses and individuals sell goods and services. The website combines the key features of previous online shopping platforms while plugging in several new functions, and it adopts an efficient recommendation system to reach a balance between offering more merchandise and helping consumers make decisions in a shorter time. It is a complex system that should not be regarded as a mere website, but as a mini-society where social background, culture, regulation, and people all play a crucial part.

In this paper, I will focus on the question of why Taobao, the e-commerce giant, is designed in this way rather than another way. To figure it out, I’m going to briefly introduce the current situation of Taobao through a series of statistics. Then, I will use design, computational, and socio-technical thinking to de-blackbox it, trying to find out what is behind the website.


Unlike Amazon or eBay, which focus on either C2C or B2C, Taobao offers both.

According to the Online Consumption Report, 2016, released by a Chinese e-commerce think tank, in 2016 Taobao took up 85.6% of the Chinese e-commerce market, with gross merchandise value amounting to RMB 3,767 billion ($569 billion); it had 466 million active users, about one quarter of the country’s population. Taobao is a success.


Design thinking:

In Chinese, “Taobao” literally means “searching for treasures.” As Norman once said, “when things are visible, they tend to be easier than when they are not”; Taobao conveys an obvious message to consumers: this is a place full of treasure waiting to be explored.

Modularity & Abstraction:

Abstraction is a technique for managing complexity that is deeply ingrained in human beings.

— Baldwin & Clark, Design Rules, Volume 1: The Power of Modularity

the four-hierarchy taobao interface

The interface of Taobao can be briefly divided into four hierarchies from top to bottom in terms of consumers’ shopping experience:

  • Front page

The front page serves mainly as a search page. It provides three ways for consumers to search for items: the search box (in the top middle of the page), category sections (under the search box and on the left side of the page), and the recommendation section (taking up the rest of the page). Users can search for items by directly typing keywords into the search box or find things they may be interested in among the suggestions.

  • Searching result page

On this page, users can sort items in order of lowest/highest price, best sellers, or sellers’ credit. Certain filters (like brands and features) can also be applied to narrow down the search results.

  • Description page

This page provides users with detailed descriptions of an item as well as reviews.

  • Payment page

The four layers are relatively independent from each other, while information can only flow from a higher layer to a lower layer through interaction points, i.e., interfaces. By clicking on the picture of a specific item on the search result page, a consumer can enter the description page of that item. In this case, the picture is an interface bridging the two pages. But the consumer cannot go to an upper-level page from a subpage unless he clicks the browser’s “Back” button.

This structure hides a substantial amount of the information in each module, leaving parameters to the discretion of the module designers. Therefore, the designers can work separately, adding new product pages to the website without affecting other pages.
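The one-way, layered navigation described above can be sketched in a few lines of code. This is a toy model with invented class and page names, not Taobao’s actual implementation: each page module only knows its own downward interaction points.

```python
class Page:
    """A page module that exposes only downward links (interaction points)."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or {}   # label of interaction point -> child Page

    def click(self, label):
        # Information flows only downward; there is no upward link here.
        # Going back up requires the browser's "Back" button.
        return self.children[label]

payment   = Page("Payment page")
item_desc = Page("Description page", {"buy": payment})
results   = Page("Searching result page", {"item picture": item_desc})
front     = Page("Front page", {"search": results})

# A consumer's path through the four layers:
page = front.click("search").click("item picture").click("buy")
print(page.name)  # Payment page
```

Because each `Page` hides its internals and exposes only labeled links, designers can add or change a description page without touching any other module, which is the modularity benefit the paragraph above describes.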

Cumulative Combinatoriality & Extensibility of Combination:

The development of technology is the procedure of combining already existing technologies. In his book, Arthur said that “novel technologies must somehow arise by combination of existing technologies”. Taobao is a combination of previous online shopping websites: an extensible platform with future adaptations and growth built in.

Amazon home page in 1995. Source:

The year 1995 witnessed the birth of Amazon and eBay. At that time, the former was an “online bookstore” receiving online book orders from all over the world, finding the books from publishers, and delivering them out. Although its page looks quite simple compared with modern websites, it functioned well, processing orders totaling 12 thousand dollars in the first week after the site opened its virtual doors. Years later, the company expanded from a “bookstore” to a “grocery store,” selling not only books but almost everything, such as clothes and electronic products.

Different from Amazon, eBay was more like a flea market, providing a platform for users to make deals with each other online. Long before eBay, there were flea markets, but eBay was the first to move them online.

Then came the year 2003, when Taobao was launched by Alibaba Group. The website was born under the influence of Amazon and eBay, but it developed further. It is a Chinese Amazon where consumers can search, compare, and buy things online; a Chinese eBay where users can hold an auction or sell their used products; an online platform for ordering food from different restaurants; a travel agency where people can rent cars, buy tickets, and consult with trip advisers; a movie agency selling tickets; and an online branch of telecommunication companies offering self-service bill payment.

Besides, Jack Ma, founder of Taobao, introduced two novel features to Taobao, setting the website apart from competitors. First, consumers can directly talk with sellers at any time on Taobao to acquire the information they want and even ask for discounts. This gives consumers the feeling of shopping in a real store. Second, joined by “Alipay,” a payment system developed by Alibaba, Taobao has a different payment system from other e-commerce platforms. The website assures consumers that once an order is placed, the payment will be kept by Taobao and won’t be sent to the seller until the buyer confirms that he has received the product or the order has been placed for one month. As Chinese consumers were skeptical of the safety of e-commerce when Taobao was launched, this model gave people the confidence to give the website a try. In 2012, cooperating with HuaTai Insurance Company, Taobao launched a new “shipping insurance” for users. When placing an order, a consumer can buy shipping insurance (usually costing 0.5 to 2 yuan, i.e., 0.1 to 0.3 dollars) in case he does not like the product when he receives it. With the insurance, he can get compensation covering the shipping fee for returning the product.

Computational thinking

The recommendation section takes up the largest part of Taobao’s front page. Why?

The rapid development of technology brings people more ways to gain information than before. Meanwhile, information overload occurs, as people have a fairly limited cognitive processing ability.

When shopping online, a consumer will get hundreds of search results for one keyword. This is a strength as well as a drawback of online shopping platforms. Users want as many choices as possible when shopping, but the more choices they get, the more confused they become, eventually spending more time finding the one that suits them best. The flood of information makes it difficult for people to make decisions.

A recommendation system, an information-filtering technology, is an efficient tool for online shopping platforms to achieve a balance between the number of products and the length of the consumption-decision time. The system collects massive volumes of raw user event data captured during sessions in which consumers log in to the website. By compiling and analyzing these data, the system can predict future consumption actions and trends and recommend products or online sellers to users. On one hand, the recommender system reduces the time it takes a user to find his ideal product and thus brings more sales to the platform. On the other hand, it can redistribute the traffic of the platform, attracting users to less-popular items and new sellers by showing those items or traders in the recommendation section.

To some extent, we can say that the recommender system plays a crucial part in the success of an online shopping platform.

The recommender system of Taobao, as shown below, can be divided into three levels.

the three-level Taobao recommendation system


Pre-analysis:

  • Collect raw user event data, including: what the user is searching for, which sellers he adds as favorites, how much money he has spent, etc.
  • Build up a basic database dividing products and users into different classes.
  • Based on the basic database, the system assumes that products A and B share a similar tag if users fond of product A usually favorite product B, and that users A and B belong to the same group if the favorite list of user A is similar to that of user B. It also analyzes the user’s behavior over sessions to better understand him. With these data, the system sets up a core database with more specific classifications of products and users.
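The similarity inference described above is, in essence, a co-occurrence count over users’ favorite lists. Here is a toy sketch with invented data, not Taobao’s actual algorithm: products that are co-favorited by many users would receive a shared tag in the core database.

```python
from itertools import combinations
from collections import Counter

favorites = {                      # user -> set of favorited products
    "user1": {"A", "B", "C"},
    "user2": {"A", "B"},
    "user3": {"A", "B", "D"},
    "user4": {"C", "D"},
}

# Count how often each pair of products appears together in a favorite list.
co_counts = Counter()
for items in favorites.values():
    for pair in combinations(sorted(items), 2):
        co_counts[pair] += 1

# The most co-favorited pair is the best candidate for a shared tag.
print(co_counts.most_common(1))    # [(('A', 'B'), 3)]
```

Real systems refine these raw counts (e.g., normalizing for item popularity), but the basic "users fond of A usually favorite B" signal is exactly this kind of co-occurrence statistic.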

Real-time response:

  • External factors, such as the physical surroundings and the time a user logs in to the website, will also be taken into consideration.

Response valuation:

  • Examine and improve the database by analyzing data like CTR (click-through rate), GMV (gross merchandise volume) and conversion rate.
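The evaluation metrics named above can be computed directly from logged counts. This is a minimal sketch with invented numbers for a single recommendation slot over one day:

```python
# Invented daily figures for one recommendation slot.
impressions = 10_000      # times a recommended item was shown
clicks = 300              # times it was clicked
orders = 30               # orders that followed a click
revenue = 4500.0          # total value of those orders (GMV), in RMB

ctr = clicks / impressions          # click-through rate
conversion = orders / clicks        # click -> order conversion rate
avg_order_value = revenue / orders  # GMV per order

print(f"CTR={ctr:.1%}, conversion={conversion:.1%}, GMV={revenue:.0f} RMB")
```

Comparing these numbers across recommendation strategies is how the system "examines and improves" its database: a variant with higher CTR but lower conversion may be attracting clicks without actually helping purchase decisions.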

Berners-Lee once pointed out in his book that “a computer typically keeps information in rigid hierarchies and matrices, whereas the human mind has the special ability to link random bits of data”. The recommender system, to some extent, gives the computer this ability of “associating,” narrowing the gap between computer and human.

the interface of recommendation system. Source from:

Meanwhile, the interface of the recommendation system is also personalized in two aspects: the categories of recommendations and the specific items shown on the front page. Different people see the recommendation categories in different orders. For female users, the categories tend to be shown in the order of clothes, cosmetics, and overseas shopping, while males are more likely to have electronic products at the head of the recommendation list, followed by sports accessories. Besides, the specific item displayed in a category also varies: users preferring spicy food tend to see a picture of Latiao (a kind of spicy gluten) representing the snack category, while users who like sweets may see Hershey’s.

Different platforms employ different recommender systems, as they have different strategies and targets. JD, the second-largest e-commerce giant in China according to the report, has a recommender system built on two models: a recalling model and a reordering model.

Ever since its first day online, that platform has had the slogan “Making life easier and happier.” Its recommender system is good at calculating users’ potential interests but falls short in timeliness, as it requires a great amount of calculation. On the other hand, the pre-analysis section and the modularity feature give Taobao the advantage of calculating logically at high speed. This is consistent with Taobao’s recent actions: improving delivery speed and announcing that the platform would pay refunds to good-credit users on behalf of sellers before the items are sent back. But Taobao’s emphasis on speed also leads to the system’s lack of personalization in recommendation.

Socio-technical thinking:

Taobao is not a website built up of code and machines; it is an interdependent social configuration of technologies and institutions.

Social background:

Both Amazon and Taobao are e-commerce giants offering comprehensive online shopping services, but their development tracks are totally different.

American society experienced the Dot-com Bubble starting in 1995, the year when Amazon was set up. At that time, the World Wide Web already had a solid foundation of American users. People believed that the Web would play a crucial part in the future, which led to a boom of new internet companies. Before Amazon was set up, there were already several e-commerce corporations in America. When designing the website, Jeff Bezos targeted it at being “Earth’s Biggest Bookstore,” as the American book market had not yet been monopolized by any single company. Books are easy to deliver over long distances and had a relatively high market demand in America. Besides, online book selling can largely reduce the cost of a book by eliminating bookstore rental fees. Two years after its foundation, in 1997, Amazon was listed on NASDAQ, valued at 5,400 million dollars.

In contrast, Chinese e-commerce started in the late 1990s, years later than in America. When Taobao was set up in May 2003, eBay had the largest share of the Chinese online retail market. At that time, traditional Chinese companies could not accept the emerging way of trading online, while many individual sellers were more willing to make deals online. Based on this social environment, Jack Ma oriented Taobao toward a C2C (consumer-to-consumer) online shopping platform. To win the battle with eBay, Taobao adopted a series of rather liberal regulations: everyone could sell goods on the platform, even those without a business license or offline stores, and traders did not need to pay any registration fee to register a virtual shop.

Driven by the increasing trader population and higher penetration levels, Taobao moved forward into the B2C (business-to-consumer) market and developed a sub-website, “Tmall,” in 2007. Taobao finally became a platform offering nearly all kinds of goods and services.

Physical surroundings are another factor that needs to be considered. The reason Bezos chose Seattle, WA as the birthplace of Amazon is that Seattle was already a big IT city, with Microsoft located there. The headquarters of the biggest book distributor in the U.S. at that time, Ingram Book Group, was also close to Seattle.

Similarly, when founding Taobao, Jack Ma compared the top three cities of mainland China at that time: Beijing, Shanghai, and Hangzhou. The capital city was filled with state-owned companies, while Shanghai was the hotbed of foreign companies. In both cities, competition was fierce and rents were expensive. On the contrary, the local government of Hangzhou had a series of policies encouraging people to set up their own companies. Therefore, Jack Ma eventually chose Hangzhou as the headquarters of Taobao.


Traders’ nationality and cultural background can influence the structure of an e-commerce platform. Hofstede said in his book, Culture’s Consequences, that people in the US rate high on individualism and moderate to low on power distance, while those in China rate high on collectivism and power distance. These differences affect the Chinese e-commerce market structure and Taobao’s behavior.

Meanwhile, Taobao has reshaped Chinese culture. In 2009, Taobao started the first “‘Double 11’ Shopping Festival” on November 11 to boost sales and get more people involved in the platform by offering lower prices than normal. The first festival was an immediate success: total sales that day reached RMB 52 million ($7.8 million), far exceeding the website’s ordinary daily sales. The eight-year-old festival has now become the Chinese version of “Black Friday.” On that day, not only Taobao but all Chinese e-commerce platforms hold a series of discount activities. On this year’s Double 11 Day, the festival set a record: the daily sales of the Chinese e-commerce market exceeded RMB 253.9 billion ($38.3 billion), with RMB 168.2 billion ($25.4 billion) from Taobao.

Along with the festival have appeared many new meanings for traditional Chinese phrases to describe consumers’ feelings or actions. For example, one may use the phrase “Duo Shou (剁手)” to say that he spent a huge sum of money on online shopping at one time, while the phrase’s traditional meaning is to chop off one’s hand to stop oneself from doing (including buying) anything. Literally, “Bao Cang (爆仓)” means storing so many things in a warehouse that it eventually explodes; nowadays, it is a phrase that specifically refers to the scene where express delivery companies have too many packages in their warehouses to deliver, as people make millions of orders during the “‘Double 11’ Shopping Festival.”


National law can affect the operation of the website.

Regulated by the government, Taobao provides a commercial voucher for every user shopping in “Tmall Supermarket,” the online supermarket of Taobao. In June 2010, the Ministry of Cultural Affairs issued new regulations on the online games industry aimed at providing a better online game environment, including banning transaction services for uncensored or unrecorded online games. Soon after, Taobao removed all transaction services for overseas in-game currency from the website.

Apart from national law, Taobao has its own regulations to keep the platform working well. Take the “‘Double 11’ Shopping Festival” again as an example. According to the website, on November 11 the price of off-price merchandise must be lower than 90 percent of the lowest price of that item during the period from September 15 to November 10; the trader must raise the price right after the festival and cannot lower it again for at least a month.


It is not only Taobao Team that is working behind Taobao, but millions of people from different industries.

The people involved in Taobao can be briefly shown as follows:

A project done by the School of Labor and Human Resources of Renmin University of China shows that by the end of 2016, Taobao had provided 33 million job opportunities in China, creating many new occupations, such as cybershop designers and online-selling trainers.



What can be studied is always a relationship or an infinite regress of relationships.

Never a ‘thing’. 

—Gregory Bateson

Looking back at the question raised at the beginning of the article, why Taobao is designed in this way rather than another way, it can be answered as follows:

Influenced by existing e-commerce platforms, Taobao combines all their key features while continually introducing new features to the website over its years of development. It solves the problem of information overload, regulates the ways traders make deals on the platform, and connects consumers with sellers directly. The design is also affected by social background, regulations, and cultural factors.

Taobao does not have to be designed in this way; in fact, there is no single way to design a specific thing. Design is not born from necessity. What leads Taobao to the design we see today is a set of ways of thinking, including design thinking, computational thinking, and socio-technical thinking. By de-blackboxing Taobao, people can find useful clues for perfecting the conceptual model of e-commerce platforms.


References:

Online Consumption Report, 2016. (2016). Retrieved from:

Arthur, W. B. (2011). The Nature of Technology What It Is and How It Evolves. New York: Free Press.

Baldwin, C. Y., & Clark, K. B. (2000). Design rules volume 1: the power of modularity. Cambridge: MIT Press.

Bateson, G. (1972). Steps to an ecology of mind. Chicago: University of Chicago Press.

Berners-Lee, T. (1999). Weaving the Web: the past, present and future of the world wide web by its inventor. London: Orion Business.

Greeven, M., Yang, S. Y., Yue, T., Heck, E., & Krug, B. (2012). How Taobao bested eBay in China. Retrieved from:

Hofstede, G. (2001). Culture’s consequences: comparing values, behaviors, institutions, and organizations across nations. London: Sage Publications.

Hong, L., Ren, Q. Y., Liang, S. X.(2016). A comparative study on the recommendation system in Chinese e-commercial websites — take Taobao, JD and Amazon as examples. Book Information, 60 (23):97-110.

Kalpanik, S. (2011). Inside the giant machine: An Amazon.com story. Los Angeles, CA: Center of Artificial Imagination.

Li, J. (2017). A study on the profit model of B2C companies. Caihui Express, (20):61-65.

Ma, S. (2007). An analysis on Amazon. Time of Commerce & Trade, (07):47-48.

Norman, D. A. (2013). The design of everyday things. New York, NY: Basic Books.

Norman, D. A. (1999). Affordance, conventions, and design. Interactions, 6(3), 38–43. doi:10.1145/301153.301168

Xiong, F. J. (2010). A study on the profit model of Taobao. Shandong University.

De-blackboxing Alipay Wallet from design and social perspectives


Alipay is a third-party online payment platform launched by Alibaba Group in 2004. It has become one of the most important online services and exerts great influence on the development of e-commerce in China. Its mobile app, Alipay Wallet, has reached more than 500 million registered users. Nowadays Alipay Wallet is far more than simply an interface for convenient online payment; its wide popularity is tightly associated with a systematic design structure for personal wealth management and integrated social interaction functions that greatly change people’s lifestyles. This paper will first discuss the design of the main user interface. Then the paper will divide the big picture of Alipay Wallet into three specific parts: online transactions, personal wealth management, and its dynamic features of social interaction, analyzing each from both design and social perspectives.

Introduction of Alipay Wallet

Alipay stepped into the mobile payment industry and developed its mobile version for both the iOS and Android operating systems in 2009. The mobile app was renamed “Alipay Wallet” in 2013, which highlighted its convenience as a virtual wallet and its huge impact on creating a cashless society. Since Ant Financial Services Group, a FinTech spin-off from Alibaba Group, took over the business, more expanded features and affordances have been provided that enable users to do nearly every money-related activity in this single mobile app (video 1).

(video 1 – An introduction of Alipay Wallet.)

We will specifically focus on the latest 10.1.8 version of Alipay Wallet available on the App Store for iOS devices. Users can simply create an Alipay account with a phone number or email address. After creating an account password, users are required to set up a payment password to enhance account security for future transactions. Beyond the basic account registration, Alipay Wallet has a strict real-name system: only by filling out real personal information can users experience all the app’s functions. We will later dig deeper into the real-name system regarding governmental restrictions on online transaction platforms.

Taking a look at the general layout of Alipay Wallet, users are able to freely switch between different parts of the main user interface by clicking four icons at the bottom. The interface is designed in a highly customizable fashion. Take the default “Home” page as an example. Functions are well organized into different rows. To deal with the conflict between the increasing number of add-on features and the demand for simplicity, users are able to select the eleven most frequently used services and arrange their icons on the main interface (image 1). Under such circumstances, users participate in designing their own customized interface, which makes it much more convenient for them to navigate to certain features.

(image 1- Customizing the order of icons in the area highlighted by a red frame.)

Customization of the main user interface is also correlated with the user’s GPS location. The second page of the interface shows huge differences when Alipay Wallet is used in different countries, despite being the same version of the app (image 2). The second page is named “Coupon” for users located in Washington, D.C., where users have access to services including travel, overseas shopping, and exchange-rate reports. For users in Mainland China, the second page is “Koubei,” a life-service platform operated by both Alibaba and Ant Financial Services Group. The services include local transportation, entertainment consumption, and product promotions.

(image 2 – Layout differences based on GPS location: Washington, D.C. on the left vs. Shanghai on the right.)

The main user interface of Alipay Wallet is designed based on knowledge of customers’ needs and expectations. An important task of digital designers is to “have an open-ended assessment of human needs and widen the range of design choices” (Murray, 2012). To meet consumers’ demands in ever-changing global contexts, Alipay Wallet extends its services to people all around the world by permitting registration with foreign phone numbers, providing them with online payment methods for goods and services from abroad. However, the app has constraints in terms of language barriers. Simplified Chinese is the dominant language of Alipay Wallet. Although multiple language versions are available for foreign users, the user interface does not display all content in English even if the user shifts the language setting. The language constraints increase users’ confusion, which negatively influences their future user experience.

Furthering its evolution into a “global lifestyle super app,” Alipay Wallet accumulates an increasing number of functions in one single app. Inspired by combinatorial design principles, Alipay Wallet combines existing online payment technologies with new integrative designs. This paper attempts to categorize the complex but inventive system of Alipay Wallet into three major modules: online transactions, wealth management, and a mass of social interaction features. These subsystems are designed separately in different ways, with various social and cultural dependencies.

1. Online Transactions

As an online transaction platform, Alipay Wallet provides three major payment methods. Quick pay adds debit or credit card information directly to the app. With online banking payment, users are redirected during the payment process to the online banking systems provided by the corresponding banking institutions. The most convenient method is to pay by account balance. For Alipay Wallet, there are two distinct types of online transactions: secured transactions and instant transactions.

1.1 Secured Transaction

Secured transactions were originally designed to boost e-commerce on Taobao, an online shopping website owned by Alibaba Group. Alipay and PayPal share a similar role as an intermediary between sellers and buyers. However, the operational processes of the two services differ greatly. If a buyer makes a purchase through a PayPal account, PayPal will immediately send the money to the seller’s PayPal account as soon as the buyer places the order. Using Alipay as a payment platform, the buyer first remits money to Alipay, confirming the order placement. The seller is then notified of the buyer’s payment and informed to ship the goods to the buyer. Alipay holds the funds during the shipment process. The money is not forwarded to the seller’s Alipay account until the buyer confirms delivery and satisfaction with the commodities, or until a limited period of time, roughly a week after the order shipment, has passed. PayPal acts as an agent of money collection, while Alipay can be treated as an escrow holder. The differences between the U.S. and Chinese legal systems greatly affect the distinctions between these two third parties. There are strict rules and statutory laws about escrow in the U.S., such as the California Escrow Law, which sets a high standard for licensing companies to perform online escrow services (Yu & Shen, 2015). It is reasonable for PayPal to distinguish itself from escrow, in order to avoid costs under the supervision of strict regulations. However, China’s legal system lacks rigorous rules regarding escrow, which gives Alipay free space to set up systematic online escrow services. Therefore, governmental regulations and legal systems significantly influence the design of secured transaction systems in different countries.
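The escrow flow described above is essentially a small state machine. Here is a simplified sketch with hypothetical state names (not Alipay’s actual implementation): funds are held until the buyer confirms, or until the timeout elapses.

```python
class EscrowOrder:
    """Simplified model of Alipay's secured-transaction flow."""
    def __init__(self, amount):
        self.amount = amount
        self.state = "created"

    def buyer_pays(self):
        # Buyer remits money; Alipay holds it and the seller is told to ship.
        self.state = "held_by_alipay"

    def buyer_confirms(self):
        # Buyer confirms delivery and satisfaction: funds go to the seller.
        if self.state == "held_by_alipay":
            self.state = "released_to_seller"

    def timeout_elapsed(self):
        # Without confirmation, funds are released after the waiting period.
        if self.state == "held_by_alipay":
            self.state = "released_to_seller"

order = EscrowOrder(amount=100)
order.buyer_pays()
order.buyer_confirms()
print(order.state)  # released_to_seller
```

The contrast with PayPal’s collection-agent model is visible in the code: in a PayPal-style flow, `buyer_pays` would release funds to the seller immediately, with no intermediate held state.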

1.2 Instant Transaction

While secured transactions grew primarily out of online shopping, instant transactions are a newer payment model that facilitates peer-to-peer payments and everyday offline purchases.

Peer-to-peer transaction, or P2P transaction, refers to an interpersonal online money transfer. To provide a better user experience, Alipay Wallet combines online transactions with instant messaging services, embracing functions that already exist in messaging apps. Once users become friends on Alipay Wallet, they can freely send texts, photos, emojis, videos, and even real-time locations to each other. However, Alipay Wallet goes far beyond simple emulation of prior technologies by embedding a cumulative, combinatorial design approach: it allows users to transfer money to their friends directly in the chat box. This combination of instant transactions and instant messaging shapes a new form of online interaction.

Instant transactions on Alipay Wallet also promote innovation in the offline purchasing environment. Without cash or credit cards, users can make purchases in any store where Alipay payment is accepted. Two payment methods are available offline: by QR code and by conventional barcode. In a regular grocery store, a purchase can be processed either way: customers can scan the store's QR code to make the purchase (image 3), or store cashiers can complete the purchase by scanning a barcode displayed in the customer's Alipay Wallet app (image 4).

(image 3 – How to process a payment by QR code.)

(image 4 – How to process a payment by conventional barcode.)

One of the prominent affordances is the QR code, a two-dimensional information matrix. It holds far more information than a conventional barcode (a version 40 QR code can encode up to 7,089 numeric characters, whereas a typical one-dimensional barcode holds only a few dozen) and is more robust to read. Although it is now widely used for offline transactions, the QR code has not replaced the conventional barcode, despite this technical superiority.

The barcode is widely familiar to the public, having been used for commodity purchases for over 40 years. From the perspective of distributed cognition, the conventional barcode displayed in Alipay Wallet acts as a cognitive schema: users can immediately figure out what the barcode on the app is for (Murray, 2012). They understand that the cashier should scan it in order to complete the purchase. This may be one reason why the barcode, as an affordance, is still preserved by Alipay Wallet's designers.

1.3 Security Mechanisms of Online Transactions

Regardless of the type of transaction, Alipay Wallet processes it with a high level of security. On the surface, users can easily notice the mechanisms that secure their accounts and funds, such as the two layers of passwords: a login password for entering the interface, and a separate payment password required to process any online transaction in the app. Users are reminded to choose two different passwords in order to enhance account security.
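The two-layer scheme amounts to keeping two independent secrets per account: one gating login, another gating every payment. A minimal sketch follows; the hashing scheme (PBKDF2) and its parameters are standard choices picked for illustration, not Alipay's actual implementation.

```python
import hashlib
import hmac
import os

def _hash(password: str, salt: bytes) -> bytes:
    # PBKDF2 is a standard password-hashing primitive; the iteration
    # count here is illustrative.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

class Account:
    """Sketch of the two-layer password scheme: one secret gates login,
    a separate one gates every payment. Not Alipay's real mechanism."""

    def __init__(self, login_pw: str, payment_pw: str):
        self._salt = os.urandom(16)
        self._login = _hash(login_pw, self._salt)
        self._payment = _hash(payment_pw, self._salt)

    def login(self, pw: str) -> bool:
        # compare_digest avoids timing side channels on the comparison.
        return hmac.compare_digest(self._login, _hash(pw, self._salt))

    def authorize_payment(self, pw: str) -> bool:
        return hmac.compare_digest(self._payment, _hash(pw, self._salt))

acct = Account(login_pw="open-app", payment_pw="move-money")
```

The point of the design is that compromising the login password alone does not let an attacker move money: `authorize_payment` checks a different secret.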

From a social perspective, government regulation offers effective measures to improve the overall security of the system. The People's Bank of China (PBC), the country's central bank, announces monetary policy and regulates financial institutions. With the rising prevalence of non-bank financial services, the PBC has made great efforts to regulate third-party online transactions in order to maintain a secure financial environment.

First, under the "Administrative Measures for the Online Payment Business of Non-Banking Payment Institutions" announced in 2015, the PBC set strict limits on the frequency and amount of online transactions. For Alipay Wallet, the maximum single payment is 200,000 RMB, roughly 30,000 U.S. dollars; mobile users may make up to 100 online transactions per day; and 200,000 RMB is the maximum total of online transactions per year. Such limits help prevent users from becoming victims of Internet fraud.
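The three limits quoted above can be encoded as a simple pre-payment check. The constants come straight from the figures in the text; the function name and the idea of running this as a single gate before each payment are illustrative.

```python
# Limits quoted from the 2015 PBC measures, as described in the text.
MAX_SINGLE_PAYMENT_RMB = 200_000   # per transaction
MAX_DAILY_TRANSACTIONS = 100       # per mobile user per day
MAX_ANNUAL_TOTAL_RMB = 200_000     # per user per year

def may_pay(amount: int, todays_count: int, year_total: int) -> bool:
    """Return True if one more payment of `amount` stays within all limits.

    `todays_count` is how many transactions the user already made today;
    `year_total` is the running RMB total for the current year.
    """
    return (amount <= MAX_SINGLE_PAYMENT_RMB
            and todays_count + 1 <= MAX_DAILY_TRANSACTIONS
            and year_total + amount <= MAX_ANNUAL_TOTAL_RMB)
```

For example, a 1,000 RMB payment early in the year passes, while a 250,000 RMB payment fails the single-payment cap, and any payment fails once the daily count or annual total is exhausted.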

Second, under upgraded policies, real-name authentication became an indispensable step for using third-party online transaction services in 2016. Alipay users are encouraged to complete their personal information, including identification card numbers (the Chinese counterpart of U.S. Social Security numbers) for Chinese citizens and passport numbers for users from other countries. This raises the credibility of interpersonal online transactions.

Third, the government is dedicated to fighting money laundering and potential theft by consolidating its control over transaction data. The relevant policies greatly affect how online transactions are processed and the relationships between third-party payment services and commercial banks. Previously, transactions between bank accounts could be processed freely on third-party services, which held dominant control of transaction data and records; it was difficult for the government to obtain accurate information about transactions and money flows. In response, the PBC began to intervene with third-party intermediaries in order to consolidate its supervisory power over the financial environment. A recently issued rule requires third-party transaction platforms such as Alipay to route payments through an independent online clearing house controlled by the PBC, to be implemented in June 2018 (Cheung, 2017). By adding one more layer between third parties and financial institutions (image 5), the PBC can aggregate and secure transaction data from disparate platforms and institutions, diminishing the potential for online transaction fraud. Overall, the security-related features of Alipay Wallet cannot be discussed apart from this governmental dependency.

(image 5 – Chinese government ensures security of online transactions by adding the layer of online clearing house system highlighted in the red circle.)

2. Personal Wealth Management

2.1 As Online Bookkeeping

Alipay Wallet is also a platform that makes it convenient to manage personal wealth. Active e-commerce participants create huge numbers of transactions on Alipay Wallet every day, so a readable, portable online account book is needed to track their transaction history. The "Money Tracker" is such a cognitive artifact, a function "designed to maintain, display, or operate upon information in order to serve a representational function" (Norman, 1991). It offers three ways of viewing transaction records: in chronological order, by category of transaction purpose, and by the account used for payment (image 6). In the physical world, bookkeeping habits vary from person to person; providing three layouts makes it possible to meet more users' needs and expectations. The views reinforce users' memory of their past transactions on Alipay Wallet and let them understand their activity and spending habits in different ways.

(image 6 from left to right – Three ways of viewing transaction history: by date of transactions, by category, and by account.)
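All three Money Tracker views are, structurally, the same operation applied to different keys: grouping a flat list of transaction records. The records and field names below are hypothetical; the sketch only shows how one generic grouping yields all three layouts.

```python
from collections import defaultdict

# Hypothetical transaction records; field names are illustrative.
records = [
    {"date": "2018-03-01", "category": "food",   "account": "balance", "amount": 30},
    {"date": "2018-03-01", "category": "travel", "account": "debit",   "amount": 120},
    {"date": "2018-03-02", "category": "food",   "account": "balance", "amount": 25},
]

def group_by(records, key):
    """Group records by one field; each Money Tracker view is one choice of key."""
    views = defaultdict(list)
    for r in records:
        views[r[key]].append(r)
    return dict(views)

by_date = group_by(records, "date")          # chronological view
by_category = group_by(records, "category")  # view by transaction purpose
by_account = group_by(records, "account")    # view by account used
```

The design point is that one underlying dataset supports several representational layouts, which is exactly what makes the artifact useful to users with different bookkeeping habits.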

2.2 As Online Commercial Bank

While people are immersed in online transactions, Alipay Wallet also gives them opportunities to invest. Yu'ebao is an Internet fund product operated by Tianhong Asset Management Co., Ltd., which manages users' account balances and returns the profits to them. Funds in Yu'ebao can be topped up or withdrawn at any time for any online transaction on Alipay Wallet. With considerable returns, Yu'ebao attracts a great deal of attention and interest: it has offered around 6% annualized return, in contrast to the mere 0.36% offered on bank deposits (Tu, 2014). With 325 million active users in 2017, Yu'ebao has become the largest money market fund in China. Given the high returns and convenience, people are more likely to keep their money in Yu'ebao than in bank deposits.
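The gap between the two rates quoted above compounds quickly. The sketch below assumes, for simplicity, annual compounding at constant rates (real money-market returns fluctuate daily), using the figures from the text.

```python
def grow(principal: float, annual_rate: float, years: int) -> float:
    """Compound a deposit once per year at a constant rate (simplifying assumption)."""
    return principal * (1 + annual_rate) ** years

# Rates quoted in the text: ~6% for Yu'ebao (Tu, 2014) vs 0.36% on bank deposits.
yuebao = grow(10_000, 0.06, 5)
deposit = grow(10_000, 0.0036, 5)
```

After five years, a 10,000 RMB balance grows to roughly 13,382 RMB in Yu'ebao but only about 10,181 RMB as a bank deposit under these assumptions, which helps explain the migration of savings the paragraph describes.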

In addition to Yu'ebao substituting for bank deposits, the credit service Huabei has become a replacement for bank loans. Consumers can shop both online and offline with money borrowed on Huabei, just as they would check out with a traditional credit card in real life. The available credit line ranges from 500 to 50,000 RMB, based on the virtual Alipay credit score the account holder has accumulated. Those who make purchases with Huabei must repay them on the tenth day of each month. More and more people are becoming reluctant to shop with credit cards: the number of Huabei users has reached 100 million, and "60 percent of the users haven't linked their credit card to Alipay accounts" (Ding, 2017).
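The text gives only the endpoints of the relationship: a 500–50,000 RMB credit line driven by a 350–950 credit score (the score range is discussed below). Alipay's actual formula is not public, so the linear interpolation here is purely a hypothetical illustration of how a score could map to a line.

```python
MIN_SCORE, MAX_SCORE = 350, 950    # Alipay credit-score range
MIN_LINE, MAX_LINE = 500, 50_000   # Huabei credit-line range, in RMB

def credit_line(score: int) -> int:
    """Hypothetical linear mapping from credit score to Huabei credit line.

    The real scoring model is proprietary; this only shows the shape of
    a score-to-line mapping with the endpoints quoted in the text.
    """
    score = max(MIN_SCORE, min(MAX_SCORE, score))  # clamp to the valid range
    frac = (score - MIN_SCORE) / (MAX_SCORE - MIN_SCORE)
    return round(MIN_LINE + frac * (MAX_LINE - MIN_LINE))
```

Under this toy mapping, the lowest score yields the 500 RMB floor, the highest yields the 50,000 RMB ceiling, and a mid-range score of 650 lands at 25,250 RMB.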

Yu'ebao and Huabei have much in common. From the perspective of design thinking, the creation of perceived affordances is the starting point of their name design. According to Norman, "designers care more about what actions the user perceives to be possible than what is true" (1999). Perceived affordances suggest, and drive, possible actions, so the designers embed meanings in the names that let people easily tell what each function does. Both names come from the Chinese phonetic alphabet. Translated directly, "Yu'ebao" means "a treasure of balance," with the extended sense of a good tool for managing one's account balance; users readily perceive this feature and then adopt Yu'ebao for investment. "Huabei" means roughly "just spend" in Chinese, suggesting spending without worry when one runs out of short-term savings; it signals that Huabei is the interface for borrowing money, making users more likely to adopt the credit service for purchases. Name design also has a cultural dependency: the connection between meaning creation and shared cultural knowledge cannot be ignored. Chinese users easily perceive the meaning of "Yu'ebao" and "Huabei," but the words hardly make sense to users unfamiliar with Chinese language and culture.

From the perspective of social consequences, Yu'ebao and Huabei significantly affect the operation of commercial banks and the work of government. Commercial banks face great challenges in retaining potential customers when competing with better online investment and credit services; with Yu'ebao and Huabei, users are more likely to pay from their account balance than with linked debit or credit cards. On the positive side, the design of virtual credit scores facilitates the development of the social credit system in China. The Alipay credit score, which ranges from 350 to 950, is compiled from transaction records, payment history, the credibility of personal information, personal impressions formed through online interactions, and so on. Not only is the score used to calculate a reasonable credit line on Huabei, it also brings convenience to daily life: for example, users with higher Alipay credit scores can waive security deposits for shared bikes. Virtual credit scores assess citizens' financial trustworthiness, an important factor in advancing the "social credit score plan" in China. China's State Council issued a planning outline for the construction of a social credit system in 2014, aiming to score citizens through omnidirectional analysis of their daily behavior, including "financial transactions, political and social participations, as well as their general lifestyles" (Dörrer, 2017). Although the Alipay credit score is not part of the official social credit system, it provides the Chinese government with invaluable resources for establishing social credit scores in contemporary society.

3. Online social interactions

In addition to its dominant online transaction services and fast-growing wealth-management products, Alipay Wallet places more and more emphasis on innovative designs for online social interaction. These designs are closely tied to a deep knowledge of new media, as well as to social and cultural dependencies. The most significant features are Virtual Red Packets and Ant Forest.

3.1 Virtual Red Packets

Sending red packets during festivals and important occasions has been a Chinese custom for thousands of years. Enclosing an amount of money, a red packet represents the sender's best wishes to friends and relatives. The evolution of virtual red packets has now brought about a new form of online transaction. Most online transaction services, most notably Alipay Wallet and WeChat Pay, let people send money in red packets once they become friends. Unlike a regular transfer notification, the receiver cannot see the actual amount of money until they tap the red packet icon, mirroring the act of opening a paper red packet in the real world. Building on this respect for Chinese custom, Alipay Wallet's designers extended the conventional meaning of red packets with the inspiration of the metamedium. Alipay launched a marketing campaign during the 2016 Chinese Spring Festival: every Alipay Wallet user had an equal chance to join a lucky draw for 200 million RMB in red packets after successfully collecting all five kinds of lucky cards. The cards, named "Aiguo Fu," "Fuqiang Fu," "Hexie Fu," "Youshan Fu," and "Jingye Fu," represent the good fortunes of patriotism, prosperity, harmony, friendship, and dedication, meanings correlated with the core values promoted by the Chinese government in recent years ("Fu" specifically means good fortune in Chinese). To highlight social interaction, friends could exchange cards to complete the set. The fundamental way to collect cards, however, was to scan the character "Fu" anywhere, using AR technology (image 7). During scanning, the character is digitized into discrete data that the mobile system can interpret and recognize.
If the scanned character matched "Fu," the user randomly received one of the five cards. "A metamedium is a combination of already-existing and not-yet-invented media" (Manovich, 2013). As an early medium, a simple written or printed "Fu" helps add new properties to media born from digitization, such as the random fortune cards and virtual red packets.

(image 7 from left to right – How to scan a written or printed Chinese character “Fu” with AR technology & an example of rewarded red packet after successfully collecting all five kinds of lucky cards.)
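The card-collection rule above is a set-completion condition: repeated scans may yield duplicates, but only holding all five distinct cards unlocks the draw. A minimal sketch, with card names from the text and an invented function name:

```python
# The five lucky cards from the 2016 Spring Festival campaign.
FIVE_FU = {"Aiguo Fu", "Fuqiang Fu", "Hexie Fu", "Youshan Fu", "Jingye Fu"}

def may_enter_draw(collected) -> bool:
    """A user enters the lucky draw only once all five cards are held.

    Duplicates from repeated scans don't help, so we compare as sets.
    """
    return FIVE_FU <= set(collected)

# Example scan history: a duplicate "Hexie Fu" and one card still missing.
scans = ["Hexie Fu", "Aiguo Fu", "Hexie Fu", "Fuqiang Fu", "Youshan Fu"]
```

With the history above the user is not yet eligible; one more scan that yields "Jingye Fu" completes the set. The exchange mechanic among friends is precisely a way to trade duplicates for missing cards.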

3.2 Ant Forest

Ant Forest is an interactive game in Alipay Wallet that embodies the significance of environmental protection in China. Ant Forest users plant virtual trees by collecting "green energy," measured in kilograms. The energy comes from users' low-carbon activities as detected by Alipay Wallet: for example, the app counts daily walking steps and offline payment transactions that save paper receipts. All eligible activities are converted into energy that makes a tree grow bigger. Once an activity becomes energy, it must be collected manually, and an interactive twist is that users can visit friends' Ant Forest accounts and steal their uncollected energy. According to Yin (2017), technology "can be used to mobilize the public. If everyone is involved, we can easily popularize a low-carbon lifestyle." The virtual trees planted by Alipay Wallet users are not merely a game achievement; they also exist in the real world. Cooperating with Ant Financial Services Group, public-welfare organizations plant real trees in China's desert areas on behalf of the virtual trees' owners. Building on this environmental mission, Ant Forest is designed as a media hybrid that lets users see what their trees look like in reality through satellite images, real-time photos, and locations (image 8). Media hybridization stands for "a more fundamental reconfiguration of media universe in which media properties are exchanged, and new structure are created" (Manovich, 2013). Ant Forest is a media hybrid combining graphic design, photography, GPS location, and satellite-communication technology; this recombination of media forms enriches both users' online social interactions and their offline daily lives.

(image 8 from left to right – The user interface of Ant Forest & viewing actual trees through real-time photos.)
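The game mechanics above reduce to a small accounting model: activities generate pending energy, which must be manually collected, and friends can steal only what is still pending. The conversion rates below are entirely hypothetical (Alipay's real formula is not public), and units are grams for convenience.

```python
# Hypothetical conversion rates; Ant Forest's real formula is not public.
GRAMS_PER_1000_STEPS = 180
GRAMS_PER_PAPERLESS_PAYMENT = 5

def daily_energy(steps: int, paperless_payments: int) -> int:
    """Convert one day's low-carbon activity into 'green energy' (grams)."""
    return (steps // 1000) * GRAMS_PER_1000_STEPS \
        + paperless_payments * GRAMS_PER_PAPERLESS_PAYMENT

class Tree:
    """Energy must be collected manually; friends may steal uncollected energy."""

    def __init__(self):
        self.pending = 0    # generated but not yet collected
        self.collected = 0  # safely banked toward tree growth

    def generate(self, grams: int):
        self.pending += grams

    def collect(self):
        self.collected += self.pending
        self.pending = 0

    def stolen_by_friend(self, grams: int) -> int:
        # Friends can only take energy that is still uncollected.
        taken = min(grams, self.pending)
        self.pending -= taken
        return taken
```

For example, 8,000 steps plus two paperless payments yield 1,450 g of pending energy under these made-up rates; if a friend steals 100 g before the user collects, only 1,350 g is banked.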

4. Conclusion

“It’s not enough that we build products that function, that are understandable and usable, we also need to build products that bring joy and excitement, pleasure and fun, and yes, beauty to people’s lives.” –Don Norman (2004)

By analyzing the three main parts of Alipay Wallet from both design and social perspectives, we can see that Alipay Wallet is on the right path to becoming a "lifestyle super app" in China. With its combinatorial design structure, Alipay Wallet consolidates its dominant online transaction services while strengthening newer functions and features, including wealth management and online social interaction. Its operation and design could not succeed without their social dependencies: products with Chinese characteristics are designed to meet people's daily needs and expectations within particular governmental and cultural contexts. In the scope of globalization, Alipay Wallet should pay closer attention to the contexts and needs of foreign users, making its products and designs more adaptive to the global market.


Alibaba Group. (2016, April 30). What is Alipay? Retrieved from

Arthur, W. B. (2009). The Nature of Technology: What It Is and How It Evolves. New York, NY: Free Press.

Chan, J. (2017, September 30). Alipay and Tenpay compete head-to-head for overseas market share. ASEAN Today. Retrieved from

Cheung, M. (2017, August 21). As China Tries to Go Cashless, Government Casts a Wary Eye. eMarketer. Retrieved from

Dörrer, K. (2017, March 31). Hello, Big Brother: How China controls its citizens through social media. DW. Retrieved from

Manovich, L. (2013). Software Takes Command. New York and London: Bloomsbury.

Murray, J. (2012). Inventing the Medium: Principles of Interaction Design as a Cultural Practice. Cambridge, MA: MIT Press.

Norman, D. A. (1999). Affordance, Conventions, and Design. Interactions 6(3), 38-43. doi:10.1145/301153.301168

Norman, D. A. (1991). Cognitive Artifacts. In Designing Interaction, edited by John M. Carroll, 17-38. New York, NY: Cambridge University Press.

Norman, D. A. (2004). Introduction to This Special Section on Beauty, Goodness, and Usability. Human-Computer Interaction, 19(4), 311-318. doi:10.1207/s15327051hci1904_1

Norman, D. A. (2010). Living with Complexity. Cambridge, MA: The MIT Press.

Sheed, C. (2009, April 3). We need to build products that bring joy and excitement to people’s lives.

Tu, L. (2014, March 19). Alibaba’s Yuebao dents Chinese banks’ profitability. REUTERS. Retrieved from

Yin, X. (2017, October 27). Ant Forest: from virtual trees to real forests. CHINA PLUS. Retrieved from

Yu, Y., & Shen, M. (2015, March 31). Alibaba and the Threat to China’s Banking Sector. The Foundation for Law, Justice and Society. Retrieved from

Yue, J. (2017, April 27). Alibaba’s Yu’eBao Becomes First Chinese Fund Management Firm With Over RMB 1 Trillion In Assets. CHINA MONEY NETWORK. Retrieved from

Mini Program of WeChat


WeChat, one of the most popular and widely used social media applications in China, owes its enormous user base to several design principles. Its combinatorial design makes WeChat an app that "rules them all." In January 2017, a new feature called Mini Program was launched inside the WeChat app. Mini Program integrates various applications under the principle of "use and go." Built on HTML5 web-application technology, it improves the user experience to a large extent. Within a few months of its launch, Mini Program had reached more than 200 million users. Why is this program so competitive in the market, and what are the design principles behind it?



WeChat is a very popular app in China. Originally an instant messaging app from Tencent, it is now the dominant social media app in the Chinese market. Behind such a successful app lie a number of technical design principles that have enabled WeChat's success.

According to Tencent's annual report, in 2017 WeChat (counting both the Chinese version, Weixin, and the international version, WeChat) reached 889 million monthly active users. On average a user has 128 contacts on the platform, and more than 600 million online payment transactions are processed per month. As can easily be seen, one of WeChat's most distinctive features is that it combines many functions: it plays the roles of Facebook, voice messaging, Instagram, Twitter, streaming video, online banking, and so on. Each new function brought an increase in users, eventually producing this enormous user base.

"In 2016, Tencent, with a set of strategies and a comprehensive online economic ecosystem, reinforced its leading position in the industry and became more competitive. The convenient social and local service functions have secured a promising future for WeChat and QQ." (Ma Huateng, co-founder of Tencent)


Why WeChat has such a large user group

WeChat came to market in January 2011. At the very beginning, it had the limited functions of an instant messaging app and could only send text and photos. At the end of 2011, WeChat first worked interactively with other online functions: it implemented the Talkbox-style function that lets users communicate by voice message, and that was the first time the number of users increased significantly.

The Moments function arrived in WeChat's second year on the market and came with a boom in users, up to more than 100 million people. Moments, also called the "friend circle," is a sharing function similar to Instagram and Path: you can share photos or videos with text, or record and shoot pictures and share them instantly.

WeChat's large user base can also be attributed to design thinking about affordances. The app's clear interface makes its functions visible, trackable, and easy to learn. At the bottom of the interface are four main pages that can be switched among: "Chats," which serves the basic chatting function; "Contacts," which lists all of your WeChat contacts; "Discover," which collects Moments, QR code scanning, Shake, People Nearby, and Message in a Bottle; and "Me," which shows your account information, including your online banking information.

The design principle of dependence is also visible in the app. Obviously, to achieve such a mixture of functions, WeChat must use all the native features of the smartphone. For example, to post a picture in Moments, WeChat needs access to the camera, speaker, storage, and the Internet. The online payment function likewise cooperates with online banking systems, the Alipay application, and other online shopping websites. The user base has grown with each new function and with the development of the phones themselves: WeChat is now available on both iOS and Android, covering 94% of smartphones in the Chinese market. This illustrates that the dependencies in WeChat's design are another factor in the application's success.


Mini Program

In January 2017, a new feature called Mini Program came into existence. The main design principle behind it is modularity: it combines the functions of other apps, making WeChat an app of apps. A mini program takes the form of a full application with complete functions. It appears on the "Discover" page after the first time you use a mini program in WeChat. You can reach different mini programs by scanning a QR code, searching for the name in the search bar, receiving a share from friends or groups, or searching for nearby programs. The interface of a mini program is similar to the original app's, and a WeChat user inside a mini program can use the functions the program provides.

From a program creator's perspective, Mini Program provides a platform on which anyone can write code and develop their own app without needing approval from an app store. To develop a mini program, a user first registers for an App ID, then sets the program's name and tag and binds the ID to a user account. After that, the user can log in with the ID as a manager and write code for the program; text and adjustments can all be done on this page. To help users create applications with more comprehensive functions and interfaces more easily, WeChat also provides its APIs and user access to developer-users, together with a JavaScript code framework.



Mini Program makes WeChat an exemplar of modularity. The designer did not design anything new, nor any new technology; he combined pre-existing technologies to create this new and appealing function. Mini Program uses the concept of HTML5 apps, the search function, QR code scanning, access to contacts, and other technology from existing apps or from native smartphone functions, combining them with WeChat's own functions in a new way that makes for an appealing design. WeChat believes that by housing these applications inside its app, users will be more willing to stay in one app, producing further growth in WeChat users and more business transactions inside WeChat. Given this integration of functions, small companies with official websites and applications are willing to build a mini app inside WeChat so that they can be more competitive and gain users, along with the resources Tencent provides. Big companies' apps also want to join the program, since they would lose competitiveness if they were the only ones left out, and their native apps may not be as convenient as a mini program. Developers, too, are glad to use the platform, since developing, cultivating, and marketing a brand-new standalone application by themselves would be harder.


Distributed cognition

WeChat has subscription accounts and service accounts, both official accounts created by users that can push news and provide service information, such as hospital or restaurant information, as local services. To complete these services, Mini Program now underpins the official accounts. One goal accomplished through Mini Program is the connection with offline business. For instance, when you scan a QR code offline, you enter the mini program of a given service; the QR code may be a poster in a subway station or part of an offline activity hosted by a real shop, and by entering the mini program you can join the activity and get rewards or service offline. This connection extends the user experience beyond the digital world: the service is distributed throughout the environment through the interface of the WeChat app. The program realizes connections between people and people, people and business, and people and services.


The interface of Mini Program

To get a clearer idea of what a mini program looks like, take Didi Dache as an example. The Mini Program entry sits in the last slot of the "Discover" page once you have used a mini program. By searching for the program's name, a user can find it there, and on the Mini Program page, previously used programs are shown just under the search bar. Once you tap into an application's page, you can use the functions the mini program provides. Comparing the Didi Dache program housed in WeChat with its native app, the interface is the same and the same functions are provided.

Didi Dache is a taxi-booking application similar to Uber and Lyft, and a very popular mini program. With the mini program, you no longer need to download the native app to book a taxi; it is housed inside WeChat and takes up no extra phone storage. On Android, it can even create a home-screen icon to make it easier to use.


The rise of the QR code

As Mini Program ties the offline world closely to online apps, the QR code has become a unique kind of hyperlink that has flourished in China. A QR code takes the form of an icon that is easy to print offline. With a QR code, people can scan to pay a bill, add a friend, obtain information about an account, or reach a website; any URL can be turned into a QR code. With its distinctive but easily recognized form, the QR code spread quickly after its appearance.

With the invention of Mini Program, the QR code has become even more important, since many mini programs require you to scan a code. For example, many restaurants put QR codes on their posters, not only inside the restaurant but also in public places such as bus stations, flyers, and shopping malls. As a convenient and powerful marketing tool, the QR code is in heavy use in Mini Program, acting as the gateway that contains the vital entry information. As this new function continues to grow the user base, the QR code is also likely to become more popular around the world.
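The "gateway" role amounts to encoding a routable payload in the QR code: when the app scans it, it parses the payload and decides whether to open a mini program or an ordinary web page. The sketch below uses a made-up `weapp://` scheme purely for illustration; it is not WeChat's real URI format.

```python
from urllib.parse import parse_qs, urlparse

def route(payload: str):
    """Decide what a scanned QR payload should open.

    Returns (action, target, params). The 'weapp' scheme is hypothetical;
    any other payload is treated as a plain web URL.
    """
    url = urlparse(payload)
    if url.scheme == "weapp":
        # e.g. weapp://restaurant-menu?table=12 opens a mini program
        # with its launch parameters.
        return ("open_mini_program", url.netloc, parse_qs(url.query))
    return ("open_webpage", payload, {})

action, target, params = route("weapp://restaurant-menu?table=12")
```

A restaurant's table-side poster, in this sketch, encodes both which mini program to open and which table placed the order, which is exactly the offline-to-online connection described above.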


A tool of connection

The designer of Mini Program has explained that he tries to think from the perspective of users, making the program serve people rather than serving the interests of the application.

The fundamental principle of WeChat is that the app acts only as an effective tool for communication. All connections and services ultimately lead to access to certain people; this is the nature of WeChat's usage. It strives to build bridges between people, which is why service accounts and subscription accounts are so popular and have become a characteristic of WeChat, while business accounts and online shopping functions are comparatively less welcome. Moreover, the bridge-building should be highly efficient; keeping WeChat a highly effective tool is also why Mini Program was designed for direct use, with no download beforehand and no deletion afterwards. As the designer of Mini Program once said: the entry point to a website is the search bar, while the entry point to an application is the QR code.


HTML5 app

HTML5 technology is now widely used for web-based applications. One of its advantages is that it can store information for offline use, and it makes an all-in-one application model possible, in which different apps run and navigate inside a single web browser.

Baidu, one of the largest search companies in China, once launched a Light App campaign aimed at using HTML5 web-application technology to build a function much like the Mini Program. It did not get the same response, because Baidu is by nature a search engine rather than a social platform, and social platforms have larger user bases and easier channels for communication.

The Mini Program uses HTML5 technology and packs its data files into a zip archive; because the content is already stored with HTML5, a user can open an app within seconds. This design marks a shift from the application to the web app, from local storage to cloud service. We can operate it much as we do a PC web browser: we can open several pages at the same time and have them work separately, and we no longer download software for a single use and then delete it. The Mini Program is, in fact, a compromise between the web-browser and application formats. It inherits the advantages of both, but the compromise also constrains further development, since the Mini Program still requires the WeChat application to be installed first and remains limited by hardware and data traffic.
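The zip-packaging idea can be sketched in a few lines of Python; the file names and contents below are invented for illustration, not taken from WeChat's actual format:

```python
import io
import zipfile

# Hypothetical asset bundle for a mini program: file names and contents.
ASSETS = {
    "app.json": '{"pages": ["index", "cart"]}',
    "index.html": "<h1>Welcome</h1>",
}

def pack(assets):
    """Pack all assets into a single in-memory, compressed zip archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, content in assets.items():
            zf.writestr(name, content)
    return buf.getvalue()

def unpack(blob):
    """Restore the asset dictionary from the zip blob in one pass."""
    with zipfile.ZipFile(io.BytesIO(blob)) as zf:
        return {name: zf.read(name).decode() for name in zf.namelist()}
```

Shipping one archive instead of many small files is what lets the client fetch and open an app quickly: a single download, then everything is available locally.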

With this technology, the Mini Program is also very attractive on Android, since the same program can run on both Android and iOS. It may remedy drawbacks of Android's sprawling, loosely controlled app stores and thereby improve the user experience, which is a great opportunity for commercial app developers.


What makes the Mini Program competitive

Since its launch in January 2017, the Mini Program's user base has reached 200 million within months. That is a huge figure, and it shows that users like and welcome the Mini Program. What makes this new program so popular in the market?

First, WeChat's real-time sharing functions are also available in the Mini Program. When you share information from an app in the Mini Program, your friends can see the information page directly, without having to search for and open the Mini Program again.

Second, the connection between official accounts and apps is built in. Users can open a Mini Program through a subscription account, which saves time when they find a website interesting but would otherwise need an extra search-and-download process to get the application. And when you have a problem with an app, you can message its official account and get feedback faster.

Third, the Mini Program can search for nearby apps. Imagine that you are in the middle of nowhere, want to find the nearest station, and want to know when the bus will come. In that case, you can use the "nearby program" function to get the app and the information easily, using little data.

The Mini Program also lets multiple apps run at the same time, so users do not need to close one running app before opening another. This function is important and makes the in-house apps feel more like real, native ones. You may even open another Mini Program from within the page of one program, since some programs are related to each other.

As an integration of apps with various functions, WeChat commands a large flow of traffic, which can speed up processing when using some apps. A Mini Program costs little to design and maintain and does not require heavy investment in advertising. The easy-come, easy-go model fits the rapid pace of modern life, which is why users have embraced it.



WeChat, a popular social media app with 879 million monthly active users, has released a new platform called the Mini Program, which attained more than 200 million users within a few months. The platform is full of different apps: an app of apps. Users do not have to download the original app and can use the functions of the native app inside WeChat. Thanks to HTML5 web-application technology, users get a better experience and can open an app within about a second, realizing an all-in-one house of apps. WeChat insists on the principle of designing its product as a tool for its target group, and modularity is a central design principle that enables the Mini Program to function well. Despite a few disadvantages, the Mini Program has grown steadily more popular since its release, and there should be a large market for it, since it fits the coming technical structure for applications and always focuses on user experience.



  1. Anu Hariharan. Apr 12, 2017. "On Growing: 7 Lessons from the Story of WeChat." YC Research.
  2. Donald A. Norman. May 1999. "Affordance, Conventions, and Design." Interactions 6(3).
  3. Janna Anderson and Lee Rainie. March 23, 2012. "The Future of Apps and Web." Pew Research Center's Internet & American Life Project.
  4. Noah Wardrip-Fruin and Nick Montfort. 2002. "New Media: Eight Propositions." The MIT Press (excerpt from "New Media from Borges to HTML").
  5. Donald A. Norman. 2013. The Design of Everyday Things: Revised and Expanded Edition. New York: Basic Books.
  6. Tingyi Chen. Apr 6, 2017. "10 Examples of GREAT WeChat Design." WALKTHECHAT: WeChat guides and tips.
  7. Xiaobo Wang and Baotong Gu. November 2015. "The Communication Design of WeChat: Ideological as Well as Technical Aspects of Social Media." Communication Design Quarterly Review 4(1).
  8. Yiling Qiu. Jan 19, 2017. "WeChat Mini Program Part I: What Is It and Why Is It Significant?" Medium.
  9. Tracey Xiang. Jun 4, 2014. "HBuilder: To Make HTML5 Booming in China." Technode.

The Internet as we don’t know it

CCTP 820: Leading by Design: Principles of Technical and Social Systems – Fall 2017

By Linda Bardha

“Having the world at your fingertips”


Today we are globally connected with each other. We have the opportunity to learn, chat, and share information, and we can do all of these things from the convenience of our homes. Just by having an electronic device (computer, phone, or tablet) connected to the Internet, we can have, as they say, the world at our fingertips. But how are we connected to the internet? What is the history of the Internet? What design principles and methods were used to design the Internet's architecture? What were the factors, and who were the people, that played an important role in designing the Internet we use today? How did we go from the Internet to the Web? Having a Bachelor's degree in Computer Science, it was my curiosity that led me to explore this topic and try to answer some questions I have had for a while. As a programmer, I have created websites using languages and tools such as HTML, CSS, JavaScript, Python, and SQL. This experience, from a back-end developer's point of view, helped me understand that "the invisible" part, what we cannot physically see, is so much more powerful than "the visible" part. The Internet is something we use every day, but it is not just a singular artefact. As a consumer society, we rarely worry about how something works, and the Internet in particular is a delicate case. Since we cannot physically see the connections that occur when information is transmitted from one point to another, we sometimes use the word "magic" to make up for the lack of knowledge. The Internet is a complex socio-technical system that has changed the world we live in, and trying to understand the history and the design principles behind it will help us better understand what it means to be connected to the Internet.


“The Internet has always been, and always will be, a magic box”. Marc Andreessen

There is something about this quote that just doesn't seem right. I will admit it: I used to think like Mr. Andreessen. The internet, "this thing" that we use every day, is such a strange concept, and because we cannot physically see how it actually works, we don't think twice before using the word "magic" to account for the work done in the background. But the internet is not an isolated box, and it certainly isn't a magical one. The internet isn't just a thing. It is a system of distributed agencies and technical mediations that has changed the world we live in. The internet is a product of its social environments, and it has shaped the characteristics of communication media and information technology.

Major historical events lead to the creation of new technologies and developments. The Internet was born as a historical accident, and different agents played significant roles in creating what it is today. The question we need to ask about the Internet is not about inventions or inventors, since, as we will see, a whole group of people, organizations, and technologies shaped the Internet we use today. The better question concerns the design principles and the combinations of methods and technologies that led to the architecture we use now. First, let's look at the historical events that shaped the Internet.

The history of the internet timeline

As Janet Abbate explains in her book "Inventing the Internet", the Internet and its predecessor, the ARPANET, were created by the US Department of Defense's Advanced Research Projects Agency (ARPA), a small agency that has been deeply involved in the development of computer science in the United States. In 1957, the USSR launched Sputnik into space, and with it a new era of global communications. In 1958, the United States government created ARPA in response to the Sputnik launch. The ARPANET was a single network that connected a few dozen sites, which computer scientists used to trade files and information. Joseph C. R. Licklider was the first director of ARPA's Information Processing Techniques Office (IPTO). Licklider's influential 1960 paper "Man-Computer Symbiosis" became an important blueprint for computer science and technology serving the needs and aspirations of the human user, rather than forcing the user to adapt to the machine. Licklider (1960, pp. 4–5) wrote:

“The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today. . . . Those years should be intellectually the most creative and exciting in the history of mankind”.

Another important figure who needs to be mentioned is Robert Taylor, a systems engineer from the aerospace industry. He believed that if ARPA's computers could be linked together, hardware, software, and data could be efficiently pooled among contractors rather than wastefully duplicated (Abbate, 2000). In 1966, Taylor recruited Lawrence Roberts, a program manager at MIT's Lincoln Laboratory, to oversee the development of the ARPANET. A group of computer scientists worked continually on system-building strategies that used effective design principles such as layering and modularization. These two principles, characteristic of the ARPANET, later became successful models for the architecture of the internet. In 1967, Lawrence Roberts led the ARPANET design discussions and published the first ARPANET design paper, "Multiple Computer Networks and Intercomputer Communication". Wesley Clark suggested that the network be managed by interconnected "Interface Message Processors" placed in front of the major computers. Called IMPs, they evolved into today's routers.

One of the major problems the engineers and computer scientists were trying to solve while working on the ARPANET was designing a network that could allow any kind of computer to exchange data over a common network with no single point of failure. The concept of switching small blocks of data was first explored independently by Paul Baran, who described a general architecture for a large-scale distributed network. The core of his idea was a decentralized network in which a message could be successfully delivered between any two points over multiple paths. The message would be divided into blocks and reassembled when it reached its destination. In 1961, Leonard Kleinrock laid the mathematical groundwork for packet switching in his MIT doctoral research on queuing theory, "Information Flow in Large Communication Nets". A host computer at Kleinrock's UCLA laboratory became the first node of the ARPANET in September 1969, and the first message passed over the network shortly afterward.

An animation demonstrating data packet switching across a network.

First, the TCP protocol breaks data into packets or blocks. Then, the packets travel from router to router over the Internet using different paths, according to the IP protocol. Lastly, the TCP protocol reassembles the packets into the original whole, and that’s how the message is delivered.
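The break-up, independent travel, and reassembly of packets described above can be sketched in a few lines of Python. This is a toy model, not real TCP: the 8-character "MTU" is an arbitrary illustration value, and real packets carry far more header information than a sequence number.

```python
import random

MTU = 8  # toy maximum payload per packet, in characters (real MTUs are in bytes)

def to_packets(message):
    """Sender side: break the message into small, numbered packets."""
    return [(seq, message[i:i + MTU])
            for seq, i in enumerate(range(0, len(message), MTU))]

def reassemble(packets):
    """Receiver side: sort by sequence number and rejoin the original whole."""
    return "".join(chunk for _seq, chunk in sorted(packets))

message = "Hello, ARPANET! This message travels as packets."
packets = to_packets(message)
random.shuffle(packets)  # packets may arrive out of order over different routes
restored = reassemble(packets)
```

The shuffle stands in for packets taking different routes and arriving out of order; the sequence numbers are what let the receiver restore the original message regardless.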

Working with this idea were two other scientists, Vint Cerf and Bob Kahn. According to Kahn, the word "Internet" comes from "inter-networking". In the late 1960s, he faced the problem of three communication networks that did not connect to each other, and he worked with Vint Cerf to solve it. They invented an internetworking protocol for sharing information using the packet-switching method. The Transmission Control Protocol (TCP) is the main protocol of the Internet Protocol (IP) suite, and many internet applications that share information over the internet rely on it. So packet switching, with data protocols usable by any computer, solved the problem I mentioned earlier: now any computer could exchange data over a common network, with no single point of failure. This was a major invention of the ARPANET research, and it remains one of the core concepts of the networks connected today on the Internet.

To get a better idea of how messages are sent from one point to another, consider this video, in which Spotify engineer Lynn Root and Internet pioneer Vint Cerf explain what keeps the internet running and how information is broken down into packets.

While the ARPANET was a single network connecting a few dozen sites, the Internet is a system of many interconnected networks, capable of indefinite expansion. At the start of the 1980s, the internet was still under military control (Abbate, 2000), but it then shifted from military control to academic research. In 1983, the US Department of Defense split the ARPANET into MILNET, a military network, and ARPANET, which became a civilian research network. This division made it possible for scientists and organizations from around the world to do research and explore the possibilities of designing the internet architecture we have today.

While that research was still going on, the idea of dividing the internet's name space into smaller domains was introduced by Paul Mockapetris, who invented the Domain Name System (DNS). Six large top-level domains were created to represent different types of network sites: edu (education), gov (government), mil (military), com (commercial), org (organization) and net (network). This division helped categorize different types of networks and made expansion possible. Beneath the top-level domains sit further subdivisions, so under edu each university has its own domain, and so on.
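The hierarchical delegation that DNS introduced can be sketched as a toy lookup table; the zone data and the IP addresses below are invented for illustration, and real DNS resolution involves many more steps (caching, recursion, authoritative servers).

```python
# Toy zone tree: top-level domains first, then second-level names beneath
# them, mirroring how DNS delegates authority downward. Addresses made up.
ZONES = {
    "edu": {"georgetown": "141.161.0.1"},
    "gov": {"nasa": "192.0.2.7"},
}

def resolve(hostname):
    """Walk the name right-to-left: first the TLD, then the domain below it.

    Subdomains to the left of the second-level name (e.g. 'www') are
    ignored in this sketch; a real resolver would keep delegating.
    """
    labels = hostname.split(".")
    tld, domain = labels[-1], labels[-2]
    return ZONES[tld][domain]
```

Reading the name from right to left is exactly the hierarchy described above: edu delegates to georgetown, which could in turn delegate to its own subdomains.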

ARPA helped in funding the research that was done in creating a new generation of technologies for inter-networking, the concept of packet switching, the development of Transmission Control Protocol and Internet Protocol (TCP/IP), the concept of the “network of networks”.

So yes, the Department of Defense and the military programs that funded the research shaped the history of the internet we use today. But that is only one part of the story. To make global connections possible, distributed agencies known collectively as the "Internet Ecosystem" also help develop the internet. Important organizations include the International Organization for Standardization (ISO), the Internet Engineering Task Force (IETF), the World Wide Web Consortium (W3C), the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Assigned Numbers Authority (IANA), and the Regional Internet Registries (RIRs). There are policy- and decision-makers who regulate cross-border communications, there are international agreements, there are vendors who provide network infrastructure, and there are internet users and educators who use the Internet to communicate, teach and build new technologies. My point is that the Internet is not a thing, a company or a product. The internet is a global system of distributed agencies and technical mediations that makes it possible to link networking devices worldwide.

The Internet Architecture and the Design Principles

Knowing the history and the events that shaped it, it is important to also try to "de-blackbox" the design principles and the architecture of the Internet. As Irvine explains in his article "The Internet: Design Principles and Extensible Futures", three main design principles make it possible for us to use the Internet in so many different ways today: Extensibility, Scalability and Interoperability.

Extensibility is the idea that an implementation can grow: to extend a system means to add new functionality or modify parts of the system without changing its overall behavior.

Scalability is the ability of a program or a system to run effectively, even when it is changed in size or volume.

Interoperability is the ability of a system to exchange/communicate information.

Irvine highlights the Internet as “the mother of all case studies”.  The Internet is a modular system, it is a complex socio-technical system, it has cumulative combinatorial design principles, and it has an open architecture.

Modularity is a design principle where the components in a system are highly independent. (Schewick, 2010). This means that there are minimal dependencies among the components of a system. So, you can change certain parts of the modules, without affecting the whole system. This design principle reduces the complexity of a system.

As explained by Schewick, Layering is a special form of Modularity.  In a layered system, modules are organized in layers that constrain dependencies between modules.  The architecture of the Internet is based on a layering principle called relaxed layering with a portability layer.  By basing design on increasing levels of abstraction, layering greatly reduces complexity.

Variants of the layering principle (Schewick)
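A minimal sketch of the layering idea: each layer adds its own header and depends only on the interface of the layer directly beneath it, so any layer can be swapped out without touching the others. The "APP|", "TCP|", "IP|" tags below are made-up stand-ins for real protocol headers.

```python
def app_layer(message, lower):
    # Application layer: adds its header, knows only the transport interface.
    return lower("APP|" + message)

def transport_layer(segment, lower):
    # Transport layer: adds its header, knows only the network interface.
    return lower("TCP|" + segment)

def network_layer(packet):
    # Network layer: the bottom of this sketch.
    return "IP|" + packet

def send(message):
    """Push a message down the stack; each layer wraps the one above."""
    return app_layer(message, lambda s: transport_layer(s, network_layer))

frame = send("hello")  # -> "IP|TCP|APP|hello"
```

Because each layer only calls the one below it through a narrow interface, replacing `network_layer` with a different implementation would require no change to the upper layers; this is the dependency-constraining property Schewick describes.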

As Irvine suggests, it is comparatively easy to manage the technical layers of how networks are connected, a complex but manageable engineering problem; it is much harder to resolve the international political-economic issues between countries, conflicts over ownership of network infrastructure, and agreements on standards and control of content. Exactly these issues make the Internet a complex socio-technical system.

The Internet was built and designed in an open environment, where communities and researchers from all over the world worked on the prototype. This was possible because of what has come to be called the Internet's "open architecture". Ronda Hauben, in her entry in the Encyclopedia of Computers and Computer History, explains what an open architecture means:

“Open architecture…describes the structure of the Internet, which is built on standard interfaces, protocols, a basic data format, and a uniform identifier or addressing mechanism. All the information needed regarding the interconnection aspects is publicly available. In the case of networks, the challenge in designing an open architecture system is to provide local autonomy, the possibility of interconnecting heterogeneous systems, and communication across uniform interfaces. Providing a basic format for data and a common addressing mechanism makes possible data transmission across the boundaries of dissimilar networks.”

With an idea of the design principles and the architecture of the Internet in place, we need to understand how different agents are connected and work together to transmit information. The Internet is a system that includes everything from the cables that carry information to the routers, modems, servers, cell-phone towers and satellites, all interconnected and transmitting information using the Internet protocols.

When you send a message from your computer to a friend, using the Internet as the medium of communication, that message is divided into packets, as we saw earlier. The packets take different paths from your modem to a router, to the Domain Name System and then to the appropriate web server, following the Internet protocols; at the destination the message is reassembled from the packets into the original whole, and that is how your friend receives it. There is a trade-off between complexity and performance in these design principles, but the end goal of this architecture is the effective flow of information: the transmission of data packets from one end to the other.

The Internet and the Web

When we use the terms "the Internet" and "the Web", we usually mean the same thing, but there is a distinction between the two, and it is important to know the difference. As explained earlier, the Internet is a system of interconnected computer networks that use TCP/IP to link networking devices worldwide. The Web, on the other hand, is a system of web pages and sites that use the Internet to share their information. It was Tim Berners-Lee who invented the Web in 1989. In his book "Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web", Berners-Lee explains that when he thought of the Web, he envisioned "a space in which anything could be linked to anything". He wanted it to be a single, global information space. The idea behind this "space" was that every piece of information would be labeled and have an address; by being able to reference that information, a computer could represent associations between things, and it could all be an open space for everyone to use and share. The Web is a protocol layer that works over the architecture of the Internet, based on standards and protocols including data standards, network services, and the HTTP protocol. Different layers make up the web architecture.
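The Web-over-Internet layering becomes concrete when you look at what a browser actually sends: an HTTP request is plain text, carried over the Internet's TCP/IP transport. The sketch below composes a minimal HTTP/1.1 GET request (the header set is simplified; real browsers send many more headers).

```python
def http_get_request(host, path="/"):
    """Compose the plain-text HTTP/1.1 request a browser would send
    over a TCP connection to a web server."""
    lines = [
        f"GET {path} HTTP/1.1",   # request line: method, resource, version
        f"Host: {host}",          # which site on the server we want
        "Connection: close",      # ask the server to close after replying
        "",                       # blank line terminates the header section
        "",
    ]
    return "\r\n".join(lines)

request = http_get_request("example.com", "/index.html")
```

Everything here is just a string; it is the TCP/IP layers beneath that break it into packets, route them, and reassemble them, which is precisely why the Web could be built on top of the Internet without changing the Internet itself.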

Understanding web architecture (Petri Kainulainen)


The three layers of every web page (Alex Walker)


The Internet that we use today was born as a historical accident, and different agents (government research, private research, university research) played significant roles in designing its architecture. ARPA helped fund the research that created a new generation of internetworking technologies: the concept of packet switching, the development of the Transmission Control Protocol and Internet Protocol (TCP/IP), and the concept of the "network of networks". Packet switching provided the solution to one of the main challenges researchers were trying to solve: designing a network that could allow any kind of computer to exchange data over a common network with no single point of failure. Internet development then converged with the PC industry. In the beginning, computers were seen as single artefacts that could perform calculations quickly; now their purpose has changed and evolved, and network chips are standard equipment in every PC so that it can connect to the network. Two main dimensions made the Internet successful: first, its design principles, a modular architecture based on open standards; second, computer networks as mediators of the larger network of social, political and economic factors. The Internet is a complex socio-technical system with a cumulative combinatorial design and an open architecture built on TCP/IP. To make global connections possible, a whole "Internet ecosystem" helps regulate international agreements and standards, so the Internet's architecture remains open and is not owned or controlled by any group with dominant power over the others.
There is a lot of debate lately about "the future of the internet", and we hear much about net neutrality, the conflict between corporations who own network infrastructure and those who own access to content and media services. By understanding the design principles and the architecture of the Internet, we can take part in these discussions and find ways to combine different technologies so that the principles of Extensibility, Scalability and Interoperability keep us globally connected in an open environment. That is why it is important not to think about the Internet as just "a thing" or "magic". Rather, it is a complex system that we are all part of, and the more we know and understand its history, design principles, and architecture, the more we can help develop and design its future.


Abbate, Janet. Inventing the Internet. Cambridge, MA: The MIT Press, 2000.

Andrews, Evan. Who Invented the Internet? History Stories. December 2013.

ARPANET archival documentary Computer Networks: The Heralds Of Resource Sharing. Arpanet. 1972.

Baldwin, Carliss Y, and Kim B. Clark. Design Rules, Vol. 1: The Power of Modularity. Cambridge, MA: The MIT Press, 2000.

Baran, Paul. On Distributed Communications Networks. IEEE Trans. Comm. Systems, March 1964.

Bardha, Linda. The internet, this complex social-technical system. Leading By Design: Principles of Technical and Social Systems. Georgetown University. November 2017

Bardha, Linda. How does Google search bar work?. Leading By Design: Principles of Technical and Social Systems. Georgetown University. November 2017

Berners-Lee, Tim. Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web. New York, NY: Harper Business, 2000.

Cerf, Vint. A Brief History of Packets. IEEE Computing Conversations. 1997.

Cerf, Vint, and David Clark. A Brief History of the Internet. Internet Society. 1997.

Cerf, Vint, and R. E. Kahn. A protocol for packet network interconnection. IEEE Trans. Comm. Tech., vol. COM-22, V 5, pp. 627-641, May 1974.

Clark, David. The Design Philosophy of the DARPA Internet Protocols. Originally published in Proceedings SIGCOMM ‘88, Computer Communication Review Vol. 18, No. 4, August 1988

Denning, Peter J. Design Thinking. Communications of the ACM, 56, no. 12. December 2013.

Evolution of the Web. Visualization.

Hauben, Ronda. Open Architecture. Raul Rojas (ed),”Encyclopedia of Computers and Computer History”, Fitzroy Dearborn, Chicago, 2001. vol. 2 pg 592.

Hobbes, Robert Zakon. Hobbes’ Internet Timeline. An Internet timeline highlighting some of the key events and technologies that helped shape the Internet as we know it today.

Internet Society. Who Makes the Internet Work: The Internet Ecosystem. February 2014.

Irvine, Martin. Introducing Internet Design Principles and Architecture: Why Learn This?

Irvine, Martin. Intro to the Design and Architecture of the Internet. November 2014

Kleinrock, Leonard. Information Flow in Large Communication Nets. RLE Quarterly Progress Report, July 1961.

Licklider, Joseph C. R., and W. Clark. On-Line Man-Computer Communication. August 1962.

Lidwell, William, Kritina Holden, and Jill Butler. Universal Principles of Design. Revised ed. Beverly, MA: Rockport Publishers, 2010.

Roberts, Lawrence. Multiple Computer Networks and Intercomputer Communication. ACM Gatlinburg Conf., October 1967.

Schewick, Barbara van.  Internet Architecture and Innovation. Cambridge, MA: The MIT Press, 2012.

Walker, Alex. The Three Layers of Every Web Page. SitePoint. Infographic. May 2014.

White, Ron. How the Internet Works. 9th ed. Que Publishing, 2007.

Zittrain, Jonathan. The Future of the Internet–And How to Stop It. New Haven, CT: Yale University Press, 2009.

Modular Design and Socio-Technical Dependencies: A Case Study on WeChat


In China today, WeChat is one of the most popular social platforms on mobile phones. It is not simply a messaging platform; it also provides functions that meet many of people's everyday needs. This essay analyzes the design principles of WeChat, particularly its modular design and socio-technical dependencies, around two research questions: (1) what functions does WeChat provide that make it an application people cannot live without; and (2) what are the technical and social reasons WeChat does not provide users with delivery status notifications (DSN)?

Key words: modular design; socio-technical dependencies; WeChat; social media; DSN


1. Introducing WeChat

(One Day of WeChat)

Currently, WeChat is the most popular mobile chat application in China. Released in January 2011 and developed by Tencent, WeChat provides users with multimedia communication as well as other functions such as online payment and location sharing (Xu, 2016). Step by step, WeChat has built up a unique ecosystem, and it is now regarded as an application "for China's everything" (Pasternack, 2017). According to the statistics, users check WeChat about 10 times per day on average, and more than half of them spend more than 90 minutes on the application per day (WeChat Blog, 2016). This essay examines the features of WeChat from different aspects: the following sections analyze the reasons for WeChat's popularity in terms of its modular design, as well as its design principles from a socio-technical perspective.

(WeChat Statistics, 2016)

2. Why is WeChat so Popular? – From the Modular Design Point of View

WeChat is a product with many add-ons that make life easier and more convenient. It is not only a cell-phone application for people to contact each other; it is also a platform to post and share life with friends, a channel through which people receive all sorts of information, and a tool to transfer money and make payments. With each update, WeChat has gradually penetrated more important aspects of people's lives by combining more modules and functions into the application. As Arthur puts it, new technologies arise by combining already existing technologies (2011). By interconnecting subcomponents, the designed system can manage a larger and more complex structure of functions. Every added module in WeChat has a hierarchical structure, combining more detailed design principles and features. Wang and Grover, product managers of WeChat, also state that WeChat is just "simple features organized in a good way" (WeChat Blog, 2015). The following sections discuss some of the most frequently used features and examine WeChat as a modular design.


2.1 As a Multi-Media Communication Tool

As a social messaging application, WeChat's most basic and essential function is to help users build and keep connections. By using the internet instead of SMS text messages, WeChat gives users an environment in which to stay connected with others free of charge. One of the underlying assumptions behind WeChat's core values, according to Wang and Grover, is that users will always be online (WeChat Blog, 2015). The internet environment supported by fast-growing mobile networks and Wi-Fi lets users stay on WeChat and remain connected as long as they want. On that basis, WeChat offers various forms of messages, including text messages, voice messages, and personalized stickers. With all these features at hand, users can choose the channel they want to communicate through and express themselves freely.

Moreover, WeChat allows users to recall a sent message within two minutes of sending it. Although the recall leaves a permanent notification in the chat for all parties in the conversation, the feature lets users withdraw an unwanted message and keep the other person from seeing its content. This is useful when a user has sent a message to the wrong person, attached the wrong file, or said something improper.

(WeChat recall feature)
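The recall rule described above can be modeled as a simple time-window check. The sketch below is purely illustrative: the two-minute window and the permanent notice come from the description above, but all names and data structures are our own assumptions, not WeChat's actual implementation.

```python
from datetime import datetime, timedelta

RECALL_WINDOW = timedelta(minutes=2)  # WeChat's two-minute recall window

def can_recall(sent_at: datetime, now: datetime) -> bool:
    """A message may be recalled only within two minutes of sending."""
    return now - sent_at <= RECALL_WINDOW

def recall(chat_log: dict, message_id: int, now: datetime) -> dict:
    """Blank the message body but leave a permanent recall notice
    that every party in the conversation can see."""
    msg = chat_log[message_id]
    if not can_recall(msg["sent_at"], now):
        raise ValueError("recall window has expired")
    msg["body"] = None
    msg["notice"] = "You recalled a message"
    return msg
```

Note that the content is removed while the notice persists, matching the trade-off described above: the other party learns that something was recalled, but not what it said.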

2.2 As an Ingroup Broadcast Channel – Moments

Moments is a unique function for timeline story posting and experience sharing. As on Facebook and Instagram, users can post stories to their timelines, decide which groups of people can see each post, and like and comment on other users' posts. Unlike other social media applications, however, Moments only lets users see the likes and comments made by mutual friends of the viewer and the person who posted the moment. In other words, a user cannot see likes and comments from people they are not friends with.

According to Wang, this kind of "circle design" changes the content of the posts (WeChat Blog, 2015). Because the person who posted a moment is the only one who knows how many likes or comments it received, users tend to be more genuine, posting authentic things to share with friends rather than catering to public taste and turning posting into a competition. This, in turn, leads to more interesting posts.

(WeChat Moments)

2.3 As an Online Payment Method – WeChat Wallet

With the 5.0 update, WeChat introduced a wallet and payment feature. Users can bind their bank accounts to the application and complete monetary transactions. Through WeChat Wallet, users can pay for a variety of things, from mobile top-ups, utility fees, and film tickets to taxi rides and food delivery. Online investment is also integrated into WeChat, offering users a high interest rate to encourage them to move money out of their savings accounts, which in turn gives Tencent a fundraising channel.

(Various purchases that can be realized by WeChat wallet)

This feature has boosted micro-business in China. With free, almost real-time money transfers, people are able to run small businesses through their WeChat accounts, using WeChat as a free advertising platform as well as a one-on-one customer service channel.

(micro-business runner advertising her products via Moments )

2.4 As an Information Channel – Official Account

Beyond its main function of multi-media communication, WeChat is also used as a channel for receiving all sorts of information via official accounts. News feeds are sent to users through a mechanism called server push, which can "send the data from server to client in an asynchronous fashion" (Sampathkumar, 2010). After users subscribe to official accounts according to their interests, those accounts begin to feed them articles. Because the messages are asynchronous, users can read a feed any time after it is published without worrying that it will disappear or be refreshed away if not read immediately, and they can share it with friends on WeChat or on other social media platforms such as Weibo.
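The asynchronous server-push pattern described above can be sketched as a per-subscriber queue: published articles wait for each follower until read, regardless of whether the follower was online at publication time. This is a toy model under our own assumptions, not WeChat's actual architecture.

```python
from collections import defaultdict, deque

class OfficialAccountFeed:
    """Toy model of asynchronous server push: each article is queued
    per subscriber and persists until that subscriber reads it."""

    def __init__(self):
        self.followers = defaultdict(set)    # account -> set of users
        self.pending = defaultdict(deque)    # user -> queued articles

    def subscribe(self, user: str, account: str) -> None:
        self.followers[account].add(user)

    def publish(self, account: str, article: str) -> None:
        # Asynchronous push: queue for every follower, online or not.
        for user in self.followers[account]:
            self.pending[user].append((account, article))

    def read_all(self, user: str) -> list:
        """The user reads whenever convenient; nothing is lost meanwhile."""
        items = list(self.pending[user])
        self.pending[user].clear()
        return items
```

The key property is that `read_all` can be called long after `publish` with no loss, mirroring the "read anytime after it is published" behavior described above.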

Official accounts have boosted many business and media related industries in China. For companies and advertisers, official accounts enable a new way of engaging with consumers (Xu, 2016). Because messages are sent to followers individually, users can reply to the server push and have "direct conversations" with service providers via official accounts, from Q&A and feedback to booking a service, and no other users can see the conversation even if they follow the same account. Official accounts have also spurred the rise of self-media: as long as their identity is verified, anyone can create a platform and post articles and opinions, creating a channel for direct communication between bloggers and their followers. Since users can subscribe to or unsubscribe from an official account at any time, self-media and advertisers alike are pushed to offer "more valuable contents as well as a high level of interactive experience with the audience" (Xu, 2016), instead of using the channel as a broadcast tool for message bombardment.

(Types of Official Accounts that Users Follow, 2016)

(Number of Followers on Official Accounts, 2016)


3. Why There is No Message Delivery Notification? — A Socio-Technical Analysis

With all these modules combined, WeChat is an important application penetrating every aspect of life in China: it is used not only for contacting each other, but also for making payments and receiving all sorts of information. However, as a messaging application, WeChat does not provide users with message delivery status notifications (DSN), also called sent/read confirmations. This feature lets users know whether their messages have been delivered or read, normally via a small icon below the sent message. WeChat also does not tell users whether the person they are talking to is online. Although Facebook Messenger and WhatsApp treat these features as essential, WeChat is designed without them. The next part of this essay analyzes the reasons behind that choice from socio-technical perspectives.

(Messenger: Know when messages are delivered and seen to reach people instantly)

(WhatsApp: Message Status Identification)

(WeChat: Only shows whether message is pending to deliver/failed to deliver)


3.1 Technical Aspects

The DSN feature may have been left out of WeChat for the sake of an optimized user experience: not because WeChat lacks the technology to support it, but out of a broader concern with ensuring that users receive messages in real time. According to Pappu, Carvalho, and Pardalos, Quality of Service (QoS) is a description of the overall performance of a service, and transmission delay is an important aspect of QoS measurement (2013). QoS capabilities allow system designers and administrators to attach priority to a message or communication channel (Pappu, Carvalho, Pardalos, 2013). To have messages delivered within the required time, the bandwidth of the network as well as network traffic are important considerations.

Deshpande, Kim, and Thottan determined that the maximum allowable delivery time for a message is 8 ms, which translates into a bandwidth requirement of 7.144 Mb/s (2011). They also note that differences in network traffic and network architecture will alter the results, so the actual bandwidth needed to deliver a message could be higher or lower depending on the network.
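The relationship between the two cited figures is simple arithmetic: bandwidth multiplied by the delivery deadline gives the number of bits that can cross the link in time. The implied per-message size below is our own inference from the cited numbers, not a figure stated in the source.

```python
# Back-of-the-envelope check of the cited figures.
max_delay_s = 0.008        # 8 ms maximum delivery time (Deshpande et al., 2011)
bandwidth_bps = 7.144e6    # cited bandwidth requirement, in bits per second

bits_per_message = bandwidth_bps * max_delay_s   # ~57,152 bits
implied_kb = bits_per_message / 8 / 1000         # ~7.144 KB implied message size
print(round(bits_per_message), round(implied_kb, 3))
```

On a link slower than 7.144 Mb/s, or one congested by other traffic, a message of that size cannot meet the 8 ms deadline, which is the constraint the following paragraph applies to China's average bandwidth.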

However, connectivity in China is barely enough to meet the requirement for real-time message delivery. The average bandwidth in China is 7.6 Mb/s (Akamai, 2017), and that figure differs from place to place: in some rural areas, internet connectivity can be poor, making the sending and receiving of messages slow. Moreover, according to Wong, by 2016 China's online population had reached 688 million, half of the total population, and nearly 90% of them could access the internet via their phones (2016). This drives up network traffic, which in turn requires higher bandwidth for message delivery.

(China Internet Bandwidth Ranking, 2017)

Under these operating conditions, DSN could be a heavy burden on message delivery and could slow down the sending and receiving process. Rather than offering the more advanced service of telling users whether a receiver has read a message, WeChat needs to prioritize delivering messages as fast as possible to increase its Quality of Service. WeChat is therefore designed without this feature so that message deliveries can meet the timing requirements.


3.2 Social Aspects

Besides the bandwidth and network traffic constraints, WeChat is designed without a DSN feature for social reasons related to its privacy policy, the nature of the application, and the communicating style of its users.

On WeChat ChatterBox, the official WeChat blog, the WeChat team states that a message read confirmation feature will not be provided, in order to protect users' privacy (WeChat Team, 2014). In the blog post "Why You Won't Find Blue Ticks in WeChat", the team writes, "we believe the exact time you read a message in WeChat is your business and no one else's – unless you make that decision yourself. Users can chat freely in WeChat knowing the other party will not see any timestamp or receive information about your messaging behavior other than the content of the conversation you choose to share" (WeChat Team, 2014). In other words, DSN is deliberately excluded from WeChat's features so that users need not worry about exposing their messaging behavior, or about the other party measuring the time lapse between when they see a message and when they reply.

Also, according to Social Networks in China, WeChat does not place a heavy emphasis on simultaneous online communication (Che and Ip, 2017). It provides an asynchronous communication mode, in which "instant responses are not necessary for information exchange" (Che and Ip, 2017). The book's authors compare WeChat with QQ, another chatting application designed and owned by Tencent, which launched in 1999. They state that, to some extent, a platform's communication features are defined by its characteristics (Che and Ip, 2017), and WeChat and QQ have different market positions accordingly. While QQ is a platform for synchronous chatting on both PC and cell phone when both parties are online, WeChat users are more likely to converse on their phones during fragmented time slots in daily life. A feature showing whether the other party has seen a message is therefore unnecessary: since users are expected to check WeChat during fragmented moments, the timing of seeing and replying to messages is non-deterministic. In other words, an instant reply is not expected on WeChat, and a time lapse may exist between seeing a message and replying to it. Without a DSN feature, users can feel free to read a message and reply whenever they are comfortable.

Moreover, users would to some extent alter their messaging behaviour if a DSN feature were introduced. Read notifications work as an "awareness cue" that "offers the interaction partner a detailed feedback about the online activities of a user", and this information may increase users' response pressure (Marques & Batista, 2017). Knowing that "the sender will be notified if I have seen the message", a user will either avoid opening a message when she cannot reply, or feel pressured to reply right after seeing it. In addition, according to Wang and Gu, the Chinese rhetorical style is high-context and indirect, placing great emphasis on vagueness (2016). A DSN feature provides users with excessively explicit information that they must consider and deal with. Although DSN was intended to inform users of a message's delivery status and support higher-quality communication, to some degree it actually causes concern and pressure for users.



In conclusion, this essay has discussed WeChat in terms of its modular design and its socio-technical dependencies. The first part analysed the crucial modules that make WeChat an indispensable application in people's lives, and the second part explained why WeChat does not provide a DSN feature. We can also see that design decisions influence user habits, and that users are sometimes affected in ways different from the designers' intentions. Designers should take the impact of their designs on users into consideration as they add new features to an application.



Arthur, W. B. The Nature of Technology: What It Is and How It Evolves (Reprint edition). New York: Free Press, 2011.

Che, Xianhui, and Barry Ip. Social Networks in China. Chandos Publishing, 2017.

Deshpande, Jayant G., Eunyoung Kim, and Marina Thottan. “Differentiated Services QoS in Smart Grid Communication Networks.” Bell Labs Technical Journal 16, no. 3 (December 1, 2011): 61–81.

Marques, Rui Pedro Figueiredo, and João Carlos Lopes Batista. Information and Communication Overload in the Digital Age. IGI Global, 2017.

“Global State of the Internet Connectivity Reports | Akamai.” Accessed December 14, 2017.

“Messenger.” Accessed December 14, 2017.

Pappu, Vijay, Marco Carvalho, and Panos Pardalos. Optimization and Security Challenges in Smart Power Grids. Springer Science & Business Media, 2013.

Pasternack, Alex. “How WeChat Became China’s App For Everything.” Fast Company, January 2, 2017. Accessed December 15, 2017.

Tencent, IBG. “Why You Won’t Find Blue Ticks in WeChat.” WeChat Blog: Chatterbox. Accessed December 14, 2017.

Wang, Xiaobo, and Baotong Gu. “The Communication Design of WeChat: Ideological As Well As Technical Aspects of Social Media.” Commun. Des. Q. Rev 4, no. 1 (January 2016): 23–35.

Wong, Edward. “China’s Internet Speed Ranks 91st in the World.” The New York Times, June 3, 2016, sec. Asia Pacific.

Xiaoge, Xu. Handbook of Research on Human Social Interaction in the Age of Mobile Devices. IGI Global, 2016.

Sampathkumar, Padmashree. “Using WebSphere Application Server Community Edition to Connect Ajax Client and Asynchronous Data,” October 21, 2010. Accessed December 14, 2017.

Tencent, IBG. “Tech Tip – Your Guide to WeChat Moments.” WeChat Blog: Chatterbox. Accessed December 15, 2017.

Tencent, IBG. “The 2016 WeChat Data Report.” WeChat Blog: Chatterbox. Accessed December 15, 2017.

Tencent, IBG. “We Chat About WeChat #3: An Inside Look at How and Why We Build WeChat.” WeChat Blog: Chatterbox. Accessed December 15, 2017.

“2017 WeChat User Report Is Out! – China Channel.” Accessed December 15, 2017.


Building The Cloud: Principles in Information Technology Design


If one watches the tech space for a long enough period of time, they will start to notice a recurring pattern of new and trendy technologies being touted as the “Next Big Thing”. A quick glance at the current state of our tech discourse reveals a bevy of tools and technological phenomena that promise to be socially and economically transformational. Virtual/Augmented Reality, Artificial Intelligence, Big Data, Internet of Things…all of these technologies have taken up prime position in our tech consciousness, but it is the advancements in Cloud Computing that I find most interesting.

The transition from our traditional Information Technology infrastructure to the Cloud environment has been one that encompasses many of the principles central to the digital economy. As we move more and more of our social lives and economic activity to the Internet, design concepts such as scalability, distribution, resource pooling, virtualization, and rapid elasticity will form the bedrock foundation for how we create, move and compute in this new digital environment. Yet while we acknowledge their ubiquity and importance in modern times, it’s important to understand the history of these design principles and how they have taken shape during the evolution of Information Technology.

In this paper, I intend to trace that history – from the era of tabulating machines and large mainframes to the modern cloud era – and to show how the Cloud is the furthest advancement of the design principles at the foundation of Information Technology, crucial in unlocking the business potential of this new digital economy.


A Quick History of the Early IT Environment: From Hollerith to IBM System/360

Merriam-Webster defines Information Technology as “the technology involving the development, maintenance, and use of computer systems, software, and networks for the processing and distribution of data” (“Information Technology”, Merriam-Webster). In the early part of the 20th Century, this meant using tabulating machines, which were large and relatively limited in their functionality. These machines, designed by Herman Hollerith, were electromechanical in operation and used punch cards symbolically representing data to execute large computations. The first major application of these tabulating machines was the 1890 US Census, which was completed much faster and more cost-effectively than the previous census. It quickly became apparent that tabulating machines would be computationally useful in business contexts for large firms like the railroad companies that dominated the era, particularly for tasks such as accounting, payroll, billing, and inventory tracking (Zittrain, 11-12).

Figure 1: Herman Hollerith’s Electric Tabulating Machine.

The high level of functional knowledge needed to operate these machines meant that the firms using them preferred to rent them from Hollerith instead of purchasing them outright. That way, there was a direct vendor to appeal to if something went wrong.

Decades of progress in computational hardware and theory led to the onset of the electronic computer era of the mid-20th Century, in which the tabulating machine gave way to the mainframe computer as the dominant player in the Information Technology arena. In the early years, these computers were behemoths that would take up entire rooms, but compared to the tabulating machines that came before them, mainframe computers had much more functionality and versatility, and could process amounts of data previously unheard of.

Figure 2: IBM System/360. Released in 1965, it went on to become one of the most commercially successful and influential products in the history of computing. (NBC)

Extremely expensive and requiring customization, these computers were initially used only by government agencies and large corporations, but they would go on to serve as the backbone of the 20th Century Information Technology landscape. They were designed to process the ever-increasing deluge of data produced by the modern economy, which included such novel concerns as real-time financial transactions, logistical computing, enterprise resource planning, speed ordering from wholesalers, and instant reservation services (“IBM Mainframes”, IBM). IBM was the principal player in the design and construction of the mainframes of this era, but it faced competition from a group of companies known by the acronym BUNCH – Burroughs, UNIVAC, NCR, Control Data Corporation, and Honeywell. Together, these companies were responsible for much of the innovation surrounding the new machines serving as the engine of 20th Century business.

With the 1980s came the Personal Computer revolution, which took computers – once the domain of business and government – and put them in the hands of the general population. Information Technology models had to adjust to this proliferation of computational literacy. The new environment pushed forward the idea of linking computers together, and the Internet – in development since the 1960s – was brought to the masses (Campbell-Kelly & Aspray, 275). Networking, previously used only by higher education institutions and select organizations, became a major possibility for the enterprise computing community. Further increases in data production led to larger, more powerful mainframes and servers, but a new Information Technology model would be needed to truly wrestle with the features and components of this new digital environment.

Figure 3: Digital Equipment Company’s Microvax 3600. Unveiled in 1987. We can see the computer shrinking and becoming more ubiquitous. (Computer Weekly)

Design Principles of the Early IT Environment: RAS and More

In terms of principles for the design and building of mainframe computers, the acronym “RAS” – Reliability, Availability, and Serviceability – was the industry’s accepted guiding philosophy. According to IBM – the industry leader in mainframe design – “When we say that a particular computer system ‘exhibits RAS characteristics’ we mean that its design places a high priority on the system remaining in service at all times” (“Mainframe strengths: Reliability, Availability, and Serviceability”, IBM).

According to IBM, reliability means that “The system’s hardware components have extensive self-checking and self-recovery capabilities. The system’s software reliability is a result of extensive testing and the ability to make quick updates for detected problems” (“Mainframe strengths: Reliability, Availability, and Serviceability”, IBM). One can imagine how crucial this would be for systems highly sensitive to downtime. Mainframes in early enterprise environments were not as reliably built as computers are today, and calculations took orders of magnitude longer. This, along with their steep purchase price, meant that if they were not reliable in operation, the vendor was likely to lose a customer. These were the early days of adoption, so discomfort with the technology would only be exacerbated by an unreliable machine.

Availability was defined as the ability of “the system [to] recover from a failed component without impacting the rest of the running system. This term applies to hardware recovery (the automatic replacing of failed elements with spares) and software recovery (the layers of error recovery that are provided by the operating system)” (“Mainframe strengths: Reliability, Availability, and Serviceability”, IBM). While the automated nature of this process would only arrive in later mainframe models, the basic rationale behind this design principle is to account for system failure. These machines were incredibly complex and multifaceted, and a single failed component should not deactivate the entire machine. Ideally, the machine would keep running while replacements or fixes were made, giving it robustness.

Serviceability was considered to be in effect when “The system can determine why a failure occurred. This capability allows for the replacement of hardware and software elements while impacting as little of the operational system as possible. This term also implies well-defined units of replacement, either hardware or software” (“Mainframe strengths: Reliability, Availability, and Serviceability”, IBM). Good design accounts for the instances in which something goes wrong, and serviceability speaks to that concern. Diagnostic analysis is a large part of any computational maintenance, and being able to identify and fix the problem once discovered without compromising the operation of the entire system would have been a large advantage in the early Information Technology environment.

While RAS served as the central guideline for mainframe design, other design concepts were beginning to take root in the early Information Technology landscape. Variability was one such concept that became central as technology progressed. Early mainframe systems had to be custom built for their purpose, such as IBM’s Sabre (Semi-Automatic Business Research Environment), a central reservation system designed and built just for American Airlines (“Sabre, The First Online Reservation System”, IBM). The mainframe industry would soon abandon this customized model for a more modular one: inter-compatible families of mainframe computer systems applicable to many purposes and many business contexts.

Figure 4: Promotional material for the American Airlines-IBM SABRE airline reservation system. (IBM)

As the population grew, so did the amounts and types of data that businesses needed to crunch. Instead of computing one single massive problem, the computers of this era needed to process numerous smaller, simpler transactions and data points. Real-time transaction processing was a key mainframe feature that unlocked many of the abilities we now take for granted, such as airline reservations and credit card authorizations. Mainframe IT designers met this requirement by increasing the number of I/O (Input/Output) channels for connectivity and scalability purposes (“Mainframe Hardware: I/O Connectivity”, IBM).

A Quick History of the Modern IT Environment: Clients and Clouds

The immediate predecessor to the cloud computing model was the client-server model. According to Ron White,

“In a client/server network, one central computer is the file server. The server contains programs and data files that can be accessed by other computers in the network. Servers are often faster and more powerful than personal computers…Personal computers attached to a server are the clients. Clients run the gamut from fat clients – computers that run most programs from their own hard drives and use a minimum of network services – to inexpensive thin clients that might have no hard drive at all. They run programs and graphics using their own microprocessor, but depend entirely on a server to access programs and store data files. A dumb terminal is a monitor, keyboard, and the bare minimum of hardware needed to connect them to the network. It uses the server’s microprocessor to perform all functions.” (White, 318)

The key design feature of this model is that multiple client computers are networked to a central server, onto which multiple computational functions and resources are offloaded. Whether it is a file server as described above, a print server giving everyone on the network shared access to a printer, or a communications server providing shared access to an internal email system and Internet services, the client is able to collaborate with other clients on its network.

Figure 5: Illustration of the Client-Server model. (The Tech Terms Computer Dictionary)
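White's distinction between fat and thin clients can be captured in a small in-process sketch: a thin client asks the server for everything, while a fat client caches locally and minimizes network use. The class and file names are illustrative assumptions, and real clients would of course talk to the server over a network rather than a method call.

```python
class FileServer:
    """Central server holding shared programs and data files."""
    def __init__(self):
        self.files = {}

    def store(self, name: str, data: str) -> None:
        self.files[name] = data

    def fetch(self, name: str) -> str:
        return self.files[name]

class ThinClient:
    """Keeps nothing locally; every request goes to the server."""
    def __init__(self, server: FileServer):
        self.server = server

    def open(self, name: str) -> str:
        return self.server.fetch(name)

class FatClient(ThinClient):
    """Runs mostly from its own storage, using a minimum of network services."""
    def __init__(self, server: FileServer):
        super().__init__(server)
        self.cache = {}

    def open(self, name: str) -> str:
        if name not in self.cache:
            self.cache[name] = self.server.fetch(name)  # fetch once, then serve locally
        return self.cache[name]
```

The trade-off in White's gamut shows up directly: the thin client always reflects the server's latest state at the cost of constant network dependence, while the fat client trades freshness for independence from the server.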

This had massive implications for enterprise environments, consequently creating an entire industry around enterprise Information Technology management. The client-server model would go on to become the dominant Information Technology model of the 1990s and early 2000s, and it was out of this model that cloud computing was born. By taking the server in the client-server model and replacing it with a collection of interconnected servers run and maintained by a cloud hosting company, many design principles were able to evolve and realize their full potential.

Figure 6: Traditional Hosting Model vs Cloud Hosting Model. (Wicky Design)

Large tech companies – such as Amazon, Salesforce, Microsoft, Google, and IBM – would build giant warehouses containing 100,000 servers, and then rent out portions of their mammoth server capacity to other companies. These cloud hosting services could replace much of the high-cost Information Technology work previously done on-site (Patterson & Hennessy, 7), and they could offer a whole new model of services through the cloud. By breaking offerings down into three distinct layers – infrastructure, platform, and application – these companies could segment the specific Information Technology services used by modern businesses and offer customizable services tailored to the desires and needs of each client (Campbell-Kelly & Aspray, 300).

Design Principles of the Modern IT Environment: Old Ideas, New Technology

Although the technological manifestations are novel, many of the design principles that went into architecting the cloud computing Information Technology model are borrowed from older IT models.

Distribution and Redundancy

As we saw with mainframe RAS design, a high priority was placed on the system being on and available at all times. Cloud computing takes this design principle to the next level and fully actualizes it. The nature of our modern socioeconomic environment requires a system ready to process activity 24 hours a day, 7 days a week. One of the main promises of the cloud is to always be available, which is accomplished through redundancy and distribution. Whereas the early mainframe model was susceptible to shutdowns if the mainframe computer malfunctioned, the client-server model improved on this by distributing the computational load across multiple servers that jointly handled requests; if one server went down, the others could pick up the slack. Although this model was an improvement, it still left the network susceptible to breakdown if the server site was compromised. Cloud computing addresses this concern by further decentralizing the computational hardware: instead of on-site servers, cloud servers are housed in server farms across the country and the globe, accessed via the Internet. This mitigates the risk of regional difficulties and allows for a much more distributed computational network.

Figure 7: Cloud servers in various locations across the globe, accessible via the internet. (My Techlogy)
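The redundancy argument above reduces to a simple invariant: requests keep succeeding as long as at least one replica in the pool is healthy. The sketch below illustrates that invariant under our own assumptions; real cloud load balancers add health checks, retries, and weighted routing on top of it.

```python
import random

class DistributedService:
    """Requests succeed as long as at least one replica remains up."""

    def __init__(self, servers):
        self.servers = dict.fromkeys(servers, True)  # server -> healthy?

    def fail(self, server: str) -> None:
        """Simulate a regional outage taking one replica down."""
        self.servers[server] = False

    def handle(self, request: str) -> str:
        healthy = [s for s, up in self.servers.items() if up]
        if not healthy:
            raise RuntimeError("total outage: every replica is down")
        # Distribute load by picking any healthy replica.
        return f"{request} served by {random.choice(healthy)}"
```

Distributing replicas across geographic regions, as the paragraph above describes, is what makes simultaneous failure of every replica, and hence the `RuntimeError` branch, unlikely in practice.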

Rapid Elasticity

The ability to scale your Information Technology architecture up and down as, and when, you need it is a key business feature of the modern economy. According to Rountree and Castillo, “The rapid elasticity feature of cloud implementations is what enables them to be able to handle the ‘burst’ capacity needed by many of their users. Burst capacity is an increased capacity that is needed for only a short period of time” (Rountree & Castillo, 5). Consider a seasonal business: it needs far greater processing capability in season than out of season, yet under a non-cloud Information Technology model it would have to pay for computing resources sized to that maximum threshold, leaving them underutilized out of season. The customizability of the cloud allows a “pay for what you use” model of resource allocation, much like utilities such as water or electricity. This elasticity is crucial to survival in the modern economic environment, allowing businesses to grow at scales of efficiency previously unreachable.

Figure 8: Only pay for what you use. (London Metropolitan University)
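The seasonal-business example can be made concrete with a small cost comparison. The usage pattern and unit rate below are invented for illustration; only the pricing logic, provision for the peak versus pay for actual use, comes from the argument above.

```python
def fixed_cost(peak_units: int, rate: float, months: int = 12) -> float:
    """Non-cloud model: own capacity sized for peak demand, paid year-round."""
    return peak_units * rate * months

def elastic_cost(monthly_usage: list, rate: float) -> float:
    """Cloud model: pay only for what each month actually consumed."""
    return sum(units * rate for units in monthly_usage)

# Hypothetical seasonal business: 100 units of demand for 3 months, 10 otherwise.
usage = [100] * 3 + [10] * 9
print(fixed_cost(100, rate=5))      # 6000 -- provisioned for peak all year
print(elastic_cost(usage, rate=5))  # 1950 -- pay-for-what-you-use
```

The gap between the two totals is exactly the cost of the idle out-of-season capacity that rapid elasticity eliminates.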

Virtualization and Resource Pooling

As is the case with all technology, the moment of mass adoption is rarely the moment of invention. Technologies usually have long gestation periods, waiting for the stars to align before crossing the threshold of commercial implementation. Virtualization technology was no different. All the way back in 1972, IBM released its Virtual Machine Facility/370 operating software, designed for its System/370 mainframe. After ebbs and flows of relevancy, that original virtualization system now serves as the foundation of IBM’s current z/VM system. According to Rountree and Castillo, “With virtualization, you are able to host multiple virtual systems on one physical system. This has cut down implementation costs. You don’t need to have separate physical systems for each customer. In addition, virtualization allows for resource pooling and increased utilization of a physical system” (Rountree & Castillo, 12). Without virtualization, the cloud would lose much of its ability to deliver cross-OS services to customers, and could not function as a SaaS, PaaS, or IaaS platform.

Figure 9: Platform/Application/Infrastructure as a Service are all available through cloud virtualization. (Business News Daily)
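Resource pooling, as Rountree and Castillo describe it, means carving one physical machine's capacity into multiple virtual systems. The sketch below models only that allocation bookkeeping, using CPUs as the pooled resource; names and numbers are illustrative, and real hypervisors manage memory, storage, and scheduling as well.

```python
class PhysicalHost:
    """One physical machine whose capacity is pooled among virtual machines."""

    def __init__(self, cpus: int):
        self.cpus = cpus
        self.vms = {}  # vm name -> CPUs allocated to it

    def free_cpus(self) -> int:
        return self.cpus - sum(self.vms.values())

    def provision(self, name: str, cpus: int) -> None:
        """Carve a virtual machine out of the shared pool, if capacity allows."""
        if cpus > self.free_cpus():
            raise RuntimeError("pool exhausted on this host")
        self.vms[name] = cpus

    def release(self, name: str) -> None:
        """Tear down a VM and return its CPUs to the pool."""
        del self.vms[name]
```

Several customers' VMs sharing one `PhysicalHost` is the "increased utilization" in the quotation above: capacity released by one tenant is immediately available to provision for another.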

Ease of Maintenance

Maintaining computational infrastructure is a major factor in the design of an Information Technology environment. Hardware and software upgrades are a major resource sink for any organization with a robust Information Technology operation, and they can slow down the productivity and efficiency of the system. This was the case with the pre-cloud mainframe and client-server models, where all maintenance had to be completed on-site, because that is where the hardware lived. Now that the hardware is hosted by the cloud service provider – be it Amazon, Microsoft, or IBM – the provider handles updates and maintenance. As Rountree and Castillo put it, “You don’t have to worry about spending time trying to manage multiple servers and multitudes of disparate client systems. You don’t have to worry about the downtime caused by maintenance windows. There will be few instances where administrators will have to come into the office after hours to make system changes. Also, having to maintain maintenance and support agreements with multiple vendors can be very costly. In a cloud environment, you only have to maintain an agreement with the service provider” (Rountree & Castillo, 10).

Figure 10: Cloud vendors handle maintenance, leaving the client to focus on more important matters. (Elucidat Blog)

Modularity & Combinatorial Design 

Just as the mainframe model of the mid-20th century transitioned from highly specialized, custom-built machines to general-purpose machines, the cloud model is able to serve a multitude of customers due to its inherently modular design. According to Barbara van Schewick, “The goal of modularity is to create architectures whose components can be designed independently but still work together” (van Schewick, 38). The application-layer offerings of cloud computing vendors allow individuals and companies to create apps, websites, and other digital offerings by combining modules of pre-existing tools and building blocks (“How Cloud Computing Is Changing the Software Stack”, Upwork). A cloud computing data warehouse is a lesson in modularity, as the servers that fill it are standardized pieces of hardware that can be slotted in and out to shift capabilities and resources as needed. By following this modular design principle, the other cloud design principles – such as scalability and ease of maintenance – are augmented.
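
Van Schewick’s definition can be illustrated with a minimal sketch. The two storage modules below (both hypothetical names, invented for this example) are written independently yet interoperate, because each honors the same small put/get interface – much as standardized servers slot into a rack:

```python
class LocalStorage:
    """One independently designed module."""
    def __init__(self):
        self._db = {}
    def put(self, key, value):
        self._db[key] = value
    def get(self, key):
        return self._db[key]

class CachingStorage:
    """A drop-in replacement honoring the same put/get interface."""
    def __init__(self, backend):
        self._backend, self._cache = backend, {}
    def put(self, key, value):
        self._cache[key] = value
        self._backend.put(key, value)
    def get(self, key):
        return self._cache.get(key) or self._backend.get(key)

def run_app(storage):
    # The application assumes only the put/get interface, so any
    # conforming module can be slotted in without redesigning the rest.
    storage.put("greeting", "hello cloud")
    return storage.get("greeting")

print(run_app(LocalStorage()))                  # → hello cloud
print(run_app(CachingStorage(LocalStorage())))  # → hello cloud
```

Because neither module knows the other’s internals, either can be redesigned independently – the essence of the modularity principle the cloud inherits.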

Figure 11: Microsoft Cloud Server Farm. (Data Center Frontier)



In conclusion, while the evolution of Information Technology has been a long and storied one, the design principles undergirding the progress have been somewhat consistent. The promise of enterprise computing has always been to expand and improve upon the capabilities of the computational landscape. From the tabulating machines of the early-20th Century to the 21st Century cloud computing services and platforms, we see certain features and design values hold throughout the various technological iterations. In what form the next advancement in the evolution of Information Technology will appear is uncertain, but having a grounding in these basic design principles will provide one with the necessary toolkit to understand and impact this field.



Zittrain, Jonathan. The Future of the Internet–And How To Stop It. Yale University Press, 2009.

Campbell-Kelly, Martin, et al. Computer: a History of the Information Machine. Westview Press, 2016.

White, Ron. How Computers Work. Que Publishing, 2008.

Rountree, Derrick, and Ileana Castrillo. The Basics of Cloud Computing: Understanding The Fundamentals of Cloud Computing in Theory and Practice. Syngress, 2014.

Patterson, David A., and John L. Hennessy. Computer Organization and Design: The Hardware/Software Interface. Morgan Kaufmann, 2014.

“Information Technology.” Merriam-Webster, n.d. Web. 15 Dec. 2017.

“IBM Mainframes.” IBM Archives,

“Mainframe Strengths: Reliability, Availability, and Serviceability.” IBM Knowledge Center,

“Sabre: The First Online Reservation System.” IBM100 – Icons of Progress,

“Mainframe Hardware: I/O Connectivity.” IBM Knowledge Center,

“Types of Network Server.” Higher National Computing: E-Learning Materials,

Mahoney, Michael S. “The Histories of Computing(S).” Interdisciplinary Science Reviews, vol. 30, no. 2, June 2005, pp. 119-135. EBSCOhost, doi:10.1179/030801805X25927.

Denning, Peter J., and Craig H. Martell. “Great Principles of Computing.” MIT Press, 15 Jan. 2015,

Schewick, Barbara Van. Internet Architecture and Innovation. MIT Press, 2012.

Wodehouse, Carey. “How Cloud Computing Is Changing the Software Stack.” Upwork, 25 Nov. 2017,

Applied Cryptography in Electronic Commerce: Impact of DRM on music consumption and perception

Grace Chimezie
FALL 2017
CCTP 820

Applied Cryptography in Electronic Commerce:

Impact of DRM on music consumption and perception


Digital Rights Management (DRM) has been an area of continuous discussion in the music industry, with music streaming companies constantly at loggerheads with the rest of the industry, including users and consumers. In January of 2017 the ¹Financial Times published a report headlined “How streaming saved the music industry”. While some arguments made by Anna Nicolaou may hold weight, I feel strongly in the opposite direction, and here is why. One would have thought the fight had come to an end with the removal of DRM licensing from major music platforms, Apple Inc. leading the way a few years back. It turns out we thought wrong: a few years later we are plagued with companies like Spotify and Pandora entrenching DRM in our daily lives in subtle ways through their online streaming platforms. This research paper tries to assess the impact of this new trend and how it has shaped music consumption, from both the user and creative ends. It takes a close look at how the spiral effect of this has produced the new norm of recommender systems as a way of engaging users and keeping them locked onto these platforms. Users are mostly at the receiving end of the whole socio-technical complexity, but some actors sit on both sides of the table. These decisions are complex and affect human behavior and how music is perceived. The paper also highlights that cryptography in e-commerce, in the form of ratings, serves different personas through the recommending algorithms, showing that this isn’t a one-size-fits-all problem. This research intends to use information and network theory to measure the implications of living with this complexity. The title makes the intention clear, but the connections between relying on these companies and on ratings as a means of e-commerce aren’t as simple as they may appear.
Research on this topic is important in that it provides information as to how users and consumers are affected by open system architectures and the implementation of recognizable and automatic features. ²Hence, this paper will look into other studies that demonstrate the integration of information from multiple modalities, which in turn is used by encrypted nodes to translate into transactable and interpretable data.

Keywords: E-commerce, cryptography, DRM,  ratings, consumption, music, information theory, socio-technical systems


²A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions


A maximum of explicitness leads to a minimum of understandability

–Ungeheuer, 1982



Making connections

Music, like all forms of art, is a way of representing human feelings, emotions, wants and needs (Street, 2012: 1), and so is commerce, embodying the same qualities in a social system. Many have argued that music and commerce are two separate entities, but as we’ve seen over the years, our social systems are all interconnected in one form or another (McDonnell and Powers, 1995; Toynbee, 1993; Whiteley, 1997). Technology, on the other hand, has seen these different systems come together in a streamlined way that makes discussing the issues surrounding their impact and consumption worthwhile.

Commerce can be defined as the exchange of goods and services, usually of monetary or economic value, between different parties. E-commerce connotes a similar concept, except that the transaction occurs through electronic communication methods such as mobile or internet networks. Primarily, e-commerce is understood as the transaction of business across the internet or mobile networks. Challenges, however, followed this new socio-technical system, challenges unique in its history. Unlike the physical transaction between two parties seen in traditional commerce, e-commerce is conducted without physical presence and is largely anonymous.

One of the many ways in which commerce happens online is through music, which is distributed through licensed music streaming companies employing Digital Rights Management (DRM), e.g. iTunes, Spotify, Google Play, Pandora, etc. One of the best ways of ensuring that the artists whose content is found on their platforms get the right exposure is through ‘recommender systems, which are an important part of the information and e-commerce ecosystem’ (Ekstrand, 2010: 1). They act as ‘tools allowing users to manage large information and product spaces’.

The argument mostly hinges on the claim that users are not negatively affected by the rules and affordances online streaming platforms provide them. At almost no cost, subscribers have immediate access to millions of songs, not limited to their environment, on a single, easy-to-use platform. In reality, however, there are no clear winners and losers here, and the spiral effect affects us all.

E-commerce as defined by Information Resources Management Association USA 369

Michael Ekstrand (2010), Collaborative Filtering Recommender Systems. Foundations and Trends in Human-Computer Interaction, Vol. 4.


Most people do not realise that their music is locked up and tied to a particular system. They treat issues like their system crashing and losing all their music as normal. Corporations claim that DRM is necessary to fight copyright infringement online and to keep consumers safe from viruses, but there’s no evidence that DRM does either of those things. Long before now, people had to line up at their favorite stores for CDs that could only be played on devices that could accept them. From records, to cassette tapes, to CDs, the ways of consuming music have changed vastly from what they used to be, and continue to change. Music streaming platforms have contributed a great deal to these dynamics, with companies such as Apple Music, Spotify, Pandora, and Google Play offering their streaming services in different ways. The situation turns sour when you look at the different issues that arise from using these music services.


Outside the technology industry there isn’t clear knowledge of what DRM does. Everything is black-boxed, and users are simply presented with software applications. Between 2003 and 2009, most music purchased through Apple’s iTunes store was locked using Apple’s FairPlay digital rights management (DRM) software, which is designed to prevent users from copying music they purchased. Apple did not seem particularly concerned by the fact that FairPlay was not effective at stopping unauthorized distribution, and users could strip it off with public tools. But for the most part, FairPlay was effective at keeping most users from playing their purchased music on devices that were not made by Apple (Kim, Howard, Ravindranath, & Park, 2008; Sobel, 2007).

FairPlay permitted purchased music to play on about five authorized devices, and was forced on users by a recording industry paranoid about file sharing and, more importantly, by technology companies like Apple, eager to control the digital infrastructure of music distribution and consumption. In 2007 Apple began offering DRM-free ‘iTunes Plus’ versions at a roughly 30% premium: users had to pay per song (or a percentage of the album cost) to upgrade their existing music. After numerous lawsuits filed in Europe and in the US, and years of protest, Apple took its users’ complaints into consideration and removed DRM from most of its iTunes music catalog. Unfortunately, the victory was short-lived: a few years later the return of DRM is no longer news, and it has appeared on several online music streaming platforms.

Research Question

My mind wanders around, and I conceive of different things day and night. Like a science-fiction writer, I’m thinking, “What if it were like this?”

–Claude Shannon (1948)

Looking at the task at hand, a few questions emerge as to the extent to which DRM has influenced how music is consumed and perceived:

  • What happens when one of the DRM streaming companies you’re subscribed to changes its mind?
  • How have streaming companies managed this?
  • How has the outcome influenced new features added to streaming platforms, e.g. recommender systems?

Moving from Consumer to thought leader


Fig 1.1 Online music streaming services



iTunes approaches fair use by working to integrate capabilities within its FairPlay DRM solution, developed by Apple for the storage, categorization, and playback of digital media. While early versions of the software focused on music, the ability to manage and play podcasts, television shows and movies, music videos, video games and other plug-in applications has since been added to its portfolio.

iTunes has also been popular in supporting mobile access to managed digital media by offering support for both proprietary and non-Apple portable media devices. Criticism of its DRM-enabled digital media, and loss of revenue to competitors, have forced Apple to change its strategy to survive competition and remain in the market.

iTunes Approach to fair use

This software application is installable on both Macintosh and Windows platforms. It currently supports the following video and audio formats: WAV, MP3, and MPEG-4. iTunes creates and maintains a library using two files, in ITL and XML format, as a database to categorise information about the digital media in the library, including: artist, genre, comments, ratings, play count, last played date, playlists used by the user, track numbers, location of file, and other media-specific details.
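
As an illustration only – the real iTunes library XML is an Apple property-list, and the tags and values below are invented – such an XML catalogue can be queried for exactly the fields listed above:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified library file showing how fields like artist,
# genre, rating and play count can be catalogued and queried.
LIBRARY_XML = """
<library>
  <track id="1"><artist>Lady Gaga</artist><genre>Pop</genre>
    <rating>5</rating><playcount>42</playcount></track>
  <track id="2"><artist>Kanye West</artist><genre>Hip-Hop</genre>
    <rating>3</rating><playcount>7</playcount></track>
</library>
"""

root = ET.fromstring(LIBRARY_XML)
tracks = [
    {
        "artist": t.findtext("artist"),
        "genre": t.findtext("genre"),
        "rating": int(t.findtext("rating")),
        "playcount": int(t.findtext("playcount")),
    }
    for t in root.iter("track")
]

# A query over the catalogue, e.g. every five-star artist:
five_star = [t["artist"] for t in tracks if t["rating"] == 5]
print(five_star)  # → ['Lady Gaga']
```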

The FairPlay DRM uses an MP4 container to hold a protected AAC file, with most algorithms applied in the encryption scheme being public (AES, MPEG-4) with the exception of the user’s key database component (Grzonkowski et al., 2007) – another area of concern, since the proprietary protection of FairPlay prevents interoperability.
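
The encryption idea can be sketched with a toy stream cipher. This is emphatically not FairPlay’s actual AES scheme; a SHA-256 counter-mode keystream stands in here purely to show why the content inside the container is playable only by software holding the user’s key:

```python
import hashlib

# Toy stream cipher (NOT FairPlay's real AES scheme) illustrating the
# core DRM idea: the audio payload inside the container is encrypted,
# and only a player with the user's key can recover it.

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key (counter mode)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

user_key = b"device-authorization-key"   # hypothetical per-user key
song = b"protected AAC frames..."
locked = encrypt(user_key, song)

assert decrypt(user_key, locked) == song             # authorized device
assert decrypt(b"other-device-key", locked) != song  # unauthorized device
```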

However, Apple needed to cash in on the booming world of online music streaming, hence its launch of Apple Music; by 2016 the streaming service already had 11 million subscribers.

Apple Music

Users with free accounts have access to Beats 1, an internet radio station. Paying customers can play any song on demand. The service is available on all of Apple’s platforms, with users able to use it on only one device at a time for both the free and paid versions. Family plans, however, allow up to six people to stream music. Paying subscribers can stream music when their device isn’t connected to data or Wi-Fi networks.


Spotify is a proprietary online music streaming platform using DRM, supported by many big record companies, such as Warner, Sony, EMI, and Universal, giving instant access to millions of songs. Because of its proprietary use of DRM, its users cannot enjoy Spotify music freely, such as by playing it on car players, burning it to a CD, and so on.

With over 75 million users, it’s difficult to miss this name, created by Daniel Ek and Martin Lorentzon in 2005. Music can be played on only one device at a time, with up to three devices signed in to the service at any time, and offline playback is available only to Premium users.


The Pandora player is a free, Web-based Flash application; it requires Flash 7 or 8 installed on the computer. For $36 per year or $12 for three months, the user is provided an ad-free version.

Pandora delivers a 128-Kbps stream of music, and it only works with a broadband connection. It derives its music license from the DMCA (Digital Millennium Copyright Act of 1998) guidelines for streaming internet radio. The use of DRM can be seen in notable ways. Pandora will never play a specific song on demand; if you add a song to a station it will show up eventually, but Pandora can play it only at random. Users can skip only a limited number of songs per hour. “The licence also limits the number of times Pandora can play a particular song or artist in a particular time period” (Julia Layton), with Pandora storing a history in your computer’s local Flash storage indicating what it has played. Pandora also plays the explicit versions of songs, so as not to take away from the artist’s original intentions for the song.

Like all other online music streaming services, it also stores your user data to provide recommendations. Pandora’s music recommenders are built to meet two needs in balance: users want to discover new music while also listening to music they know they like (Herlocker et al., 2015).

One cool feature the company has incorporated into its platform is the Music Genome Project. When a listener chooses a song for the radio station, the Music Genome Project chooses the songs that have the strongest edges to the original song.

Google play

After an Android user purchases an MP3 on Google Play, Google prevents competing applications and third-party developers from accessing the file using technical and legal means, making it obvious that DRM is implemented on the platform, since purchased music can only be played in Google’s Play app.

It goes further, limiting the number of devices on which you can listen to your own music: it allows you to “deauthorize” only 4 devices per year, including phones and tablets. “In addition, each time you flash your device with a popular custom ROM such as CyanogenMod, you use one of your authorizations” (John Lech, 2015).

It also doesn’t allow you to share your music library with members of your household and can only download the music twice from play music to play on PC or Mac, until the end of time.

There exist other online streaming music services, such as SoundCloud, Tidal, etc.


Artist’s Discontent

The arguments mostly reduce to saying that users are the ones with the most benefits, but analysed in depth, there is no winner or loser; both groups face similar, if varying, issues. For little or no cost, consumers are provided access to millions of songs through a simple, easy-to-use platform. The most visible losers are the songwriters, producers, and others involved in the creative aspects of producing music. More often than not, artists are paid a pittance (Future of Music Coalition, 2015) for the streaming of their songs on platforms such as Spotify and Pandora.

A big part of the problem is that most consumers attribute very little value to the recording itself, with video and streaming services like YouTube or BitTorrent available at zero cost to the listener (Paul Resnikoff, 2015).

Effect on Users consumption and Perception; From Arguments to Examples (Cases)

Lady Gaga: In 2015 the artist, through her manager Troy Carter, decried receiving so little for millions of streams on platforms like Spotify. He said Universal Music Group had paid the singer nothing despite the number of times her music was downloaded from their platform. However, “Spotify says they pay the labels, though this is often with huge, multi-million dollar advances and/or equity positions attached” (Paul Resnikoff, 2015). This money unfortunately doesn’t reach the artist, for reasons legitimate or otherwise.

Taylor Swift: In June of 2015, Taylor Swift made headlines “when she objected to Apple’s plan to offer free trials at the expense of artists and labels”, writing an open letter to Apple Music in which she made known that she would not release her album ‘1989’ on their streaming service because of their free 3-month trial policy, during which writers, producers and artists are not paid. She explained herself in this light:

“ This is not about me… This is about the new artist or band that has released their first single and will not be paid for its success. This is about the young songwriter who just got his or her first cut and thought that the royalties from that would get them out of debt. This is about the producer who works tirelessly to innovate and create”.

Not long after this she also had a similar issue with Spotify, under similar circumstances. These are not the only stars who have been forthcoming about their experiences: there was also Kanye West and Tidal, resulting in him pulling his “Life of Pablo” from Tidal’s streaming platform.

This, in turn, sparked a national conversation about the economics surrounding worldwide digital music. The digital market, valued at around $6.9B, is on most days not the center of controversy.

With these issues, companies are finding ways to keep users on their platforms, and here come recommender systems, which have provided opportunities for users to be exposed to a wide range of music of interest, without regional boundaries.

Source: Digital music

Fig 2.1



Recommender System

Users are unaware of what goes on behind the scenes; one day, they are left to wonder what has become of all the music they loved. They have barely any control over the direction these streaming services decide to take. Rather, they are bombarded with whatever features these software companies decide to implement to keep them glued to their services. One of these is the recommender system.

Many recommender systems are developed in particular contexts, and their evaluations will be on data sets relevant to that context, or on internal data sets.

Recommendation is not the only need users have with respect to their relationship with a recommender system; it can also be used to alter user experience and behavior (Cosel et al.). Recommender systems are frequently “black boxes”, presenting recommendations to the user without any explanation of why the user might like the recommended item. Music streaming e-commerce sites typically use recommendations to increase sales volume, increasing the importance of persuasion as a goal (Michael Ekstrand). If the system, however, gets a reputation for recommending the wrong kinds of songs, it loses the trust of users and suffers in the long run.
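
A minimal user-based collaborative filter, in the spirit of the Ekstrand survey cited above, makes the mechanism concrete (the users and ratings below are invented for illustration):

```python
import math

# Invented ratings: users -> {song: star rating}
RATINGS = {
    "ann": {"song_a": 5, "song_b": 4, "song_c": 1},
    "bob": {"song_a": 4, "song_b": 5, "song_d": 4},
    "eve": {"song_c": 5, "song_d": 2, "song_a": 1},
}

def cosine(u, v):
    """Cosine similarity between two users' rating vectors."""
    shared = set(u) & set(v)
    num = sum(u[s] * v[s] for s in shared)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def predict(user, item):
    """Similarity-weighted average of neighbours' ratings for item."""
    num = den = 0.0
    for other, ratings in RATINGS.items():
        if other == user or item not in ratings:
            continue
        w = cosine(RATINGS[user], ratings)
        num += w * ratings[item]
        den += abs(w)
    return num / den if den else None

# Ann never rated song_d; bob (similar taste) gave it 4, eve gave it 2,
# so the prediction is weighted toward bob's 4.
print(round(predict("ann", "song_d"), 2))
```

The opacity the text describes is visible even here: the user would see only the final score, not the neighbour similarities that produced it.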

Social Impact of Recommenders:

Due to the nature of recommender systems, which includes collecting users’ data in substantial volume, users of e-commerce platforms face important privacy and security challenges. I would like to highlight one famous way of recommending: star ratings.


Economics is recognizing itself as an information science, considering that part of its developmental arc has been a transformation from matter to bits, stored in large computers and on magnetic strips. “Even when money seemed to be material treasure, heavy in pockets and ships’ holds and bank vaults, it always was information” (James Gleick, 2011).

In the bid to keep up and to see that music is consumed and accessible to people at a wider range, software companies rely heavily on recommender systems such as the like button, ratings, etc. This is good for lower-selling artists who do not have access to larger platforms, especially those in less developed regions. My argument is that DRM gave rise to the use of recommenders. At the same time, the instability of features and content on streaming platforms, influenced by DRM, is affecting how music is consumed and perceived, with many users and music providers caught up in what happens behind the scenes.

Network Dependence enabling E-commerce: Star Ratings


“The emergence of stars as indicators of quality is, of course, not confined to music” (John Street, 2012). Ratings have been used for many purposes: rating restaurants on a five-star scale, rating books on Amazon’s Goodreads, rating hospitals, rating university departments for quality of research, and rating products from user experiences. Colin Symes (2004: 186) suggests that the use of ratings in music may be the legacy of Baedeker travel guides. Now part of a general process of audit culture, or the audit society (Power, 1997), the star system has become a way of delivering new obligations.

Music criticism is one of the forces that have shaped the modern world, a flow from Europe stretching back as far as the eighteenth century. Since then, criticism has come to encompass our choices for a better standard in modern times. This logic was expounded by the economist Richard Caves (2000), who viewed cultural criticism primarily as a way of conveying market information to others, a concept that has been built into recommender algorithms.

Ratings are so important because they influence what shows up in your recommendations: the less helpful the rating system, the worse your recommendations will be.

After its disappearance in 2016, the iTunes star rating system returned later that same year in the 10.2 beta, suggesting that star ratings may mean much more to users who tend to their playlists. The iTunes case isn’t a peculiar one: in March, Netflix swapped out its five-star rating system for a simpler one. According to the online streaming service, its reason is a clear-cut one: a 5-star rating impresses people, while a thumbs up or down is brutal honesty (David Sims, 2017).

Most users use star ratings as a final control on what gets onto their playlists and synced to their devices. The star rating system has been implemented for different purposes on music streaming platforms. Some ratings serve as recommenders, as seen on Google Play, but iTunes uses star ratings to help sort and arrange its users’ playlists, while providing other features like the love and dislike buttons to aid recommendations. iTunes maintains the digital library it creates using two files, in ITL and XML format, as a database to help categorize information on the digital media available in the library; Spotify, for its part, has relied on star ratings as well.
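
A small invented example shows why the choice of rating scale matters to what gets recommended, echoing the Netflix five-star versus thumbs debate: a polarizing song and a consistently liked song can trade places depending on how the votes are aggregated:

```python
# Invented votes: song_x is loved by most but hated by some;
# song_y is consistently liked by everyone.
votes = {
    "song_x": [5, 5, 5, 5, 1],
    "song_y": [4, 4, 4, 4, 4],
}

def mean_stars(ratings):
    return sum(ratings) / len(ratings)

def thumbs_up_share(ratings, threshold=4):
    # Netflix-style collapse: a star rating >= threshold counts as a thumb up.
    return sum(r >= threshold for r in ratings) / len(ratings)

stars_rank = sorted(votes, key=lambda s: mean_stars(votes[s]), reverse=True)
thumbs_rank = sorted(votes, key=lambda s: thumbs_up_share(votes[s]), reverse=True)

print(stars_rank)   # → ['song_x', 'song_y']  (4.2 vs 4.0 mean stars)
print(thumbs_rank)  # → ['song_y', 'song_x']  (100% vs 80% thumbs up)
```

The same votes produce opposite rankings, which is precisely why the helpfulness of the rating system shapes the quality of the recommendations built on it.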

Source: Spotify community


Michael Ekstrand (2010), Collaborative Filtering Recommender Systems. Foundations and Trends in Human-Computer Interaction, Vol. 4.

Brusilovsky (1996), Methods and Techniques of Adaptive Hypermedia. User Modeling and User-Adapted Interaction, Vol. 6, No. 2, pp. 87-129.


Sociotechnical systems and  Network Dependency of Ratings.

A system is understood to be an entity that can be separated into parts, which are all simultaneously linked to each other in a specific way.

                                                                                                                —  Pieter Vermaas (2010)

“Socio-technical systems comprise the interactions and dependencies between aspects such as human actors, organisational units, communication processes, documented information, work procedures, processes, technical units, human-computer interactions and competencies” (Kunau, Loser and Menold, 2004; Jahnke, 2007).

The starting point is to realise that, in socio-technical systems thinking, culture and media technologies are co-produced, or co-constituted, and thus form a necessary system of co-mediation.

A rating system does not in itself exist as a physical property; it depends on the built-in functions of physical technologies such as PCs and mobile devices. Since the main focus here is on music, two defining traits of sociotechnical systems deserve attention: first, the ability to accommodate many users at any one moment, and second, that they involve people. Some of the issues to be addressed are the interoperability of DRM solutions, support, portability, choice of content provider, and support for true archives of owned data.

The complexity of sociotechnical systems revolves around the fact that they have many users, which is unlikely to be found in other typical technical artefacts, e.g. a calculator. The functioning of the rating system as a whole, as it appears to each of its users, requires not only coordination between the technical or hardware aspects of the system and the behavior of users, but also, and especially, the mutual coordination of the behavior of the many users. An example can be seen in the recent return of iTunes star ratings to its music player.

Rating an item isn’t limited to music alone; moreover, systems of rating have existed since the beginning. In reply to that argument, one might note that while engineers are likely to anticipate uses, they cannot determine them (Vermaas et al., 2011).

All this is made possible by concepts adapted from information and communication theory, and we can look at how it all gets connected by analysing Shannon’s theory in his paper “A Mathematical Theory of Communication”.

Peter Kroes (2010), A Philosophy of Sociotechnical Systems. Morgan & Claypool Publishers.


The Theory Behind Mp3

Uncompressed digital CD-quality audio signals consume a large amount of data and are therefore not suited for storage and transmission. “The need to reduce this amount without any significant quality loss was stated in the late 80’s by the International Organisation for Standardization (ISO)” (Rassol Raissi, 2002). A working group within the ISO, referred to as the Moving Picture Experts Group (MPEG), developed a standard containing several techniques for both audio and video compression. The audio part consists of three layers, with the third layer compressing CD music from 1.4 Mbit/s to 128 kbit/s with almost no audible degradation. This technique, now widely implemented, has become popular and known as MP3.
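
The arithmetic behind those two figures is worth making explicit:

```python
# Compressing a 1.4 Mbit/s CD-quality stream to a 128 kbit/s MP3 stream.
cd_rate = 1_400_000   # bits per second, uncompressed CD audio
mp3_rate = 128_000    # bits per second, MPEG-1 Layer III at 128 kbit/s

ratio = cd_rate / mp3_rate
print(round(ratio, 2))  # → 10.94, i.e. roughly an 11:1 reduction

# What a 4-minute song occupies at each rate, in megabytes (1 MB = 10**6 bytes):
seconds = 4 * 60
cd_mb = cd_rate * seconds / 8 / 1e6
mp3_mb = mp3_rate * seconds / 8 / 1e6
print(round(cd_mb, 1), round(mp3_mb, 1))  # → 42.0 3.8
```

That shrinkage, from 42 MB to under 4 MB per song, is what made downloading and streaming music over consumer connections practical.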

The theory of data compression was first formulated by Claude E. Shannon in 1948, when he released his paper “A Mathematical Theory of Communication”, proving that it was possible to compress data without losing information.

Information Theory and Transmission model of communication:

Perhaps coming up with a theory of information and its processing is a bit like building a transcontinental railway. You can start in the east, trying to understand how agents can process anything, and head west. Or you can start in the west, with trying to understand what information is and then head east. One hopes that these tracks will meet

                                                                                                                   —Jon Barwise (1986)


Transforming information

During his keynote address at the ACM conference in 2009, Martin argued that the algorithms are a small part of the problem: some work well, but there is still much work to be done on user experience, data collection, and the other problems that make up the whole of the recommender experience.

Computers do more than just transmit information: they transform it. ‘Transformation opens many new possibilities, most notably the creation of new information’ (Peter Denning, 2003). Shannon’s classical information theory shows that information can be transmitted and received accurately by processes that do not depend on the information’s meaning. Information, however, depends on the observer.
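
Shannon’s point, that information can be measured without any reference to meaning, can be made concrete: the entropy below is computed purely from symbol frequencies, never from what the message says.

```python
from collections import Counter
from math import log2

def entropy_bits_per_symbol(message: str) -> float:
    """Shannon entropy H = sum p * log2(1/p), using symbol frequencies only."""
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * log2(n / c) for c in counts.values())

print(entropy_bits_per_symbol("aaaa"))  # → 0.0  (fully predictable, no surprise)
print(entropy_bits_per_symbol("abab"))  # → 1.0  (one bit per symbol)
print(entropy_bits_per_symbol("abcd"))  # → 2.0  (four equally likely symbols)
```

The same measure underlies the MP3 discussion earlier: a stream with lower entropy per symbol admits a shorter lossless encoding, regardless of whether the symbols encode music, text, or noise.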

This brings us to Weaver’s and Wiener’s formulations of information theory, which are significant in regard to the recommender system.

In Gleick’s “The Information” he gives an overview of how we arrived at today’s concept, and it shows how relevant the theory remains in the features and new concepts explored by online music streaming services: “…Shannon proposed feeding ‘cultural things,’ such as music, to an electronic brain”.

Ronald Day, 2000: “The conduit Metaphor” and The nature and politics of Information Studies: University of Oklahoma

James Gleick, 2011: The information: A History, A Theory, A Flood: Pantheon Books, New York



MetaData might be the solution

According to Annie Lin (2015), “Technology has made it possible to offer massive quantities of music to millions of users at once, making metadata more important than ever”. Music licensing has a complicated structure and remains a difficult hassle. However, a few companies are trying to find a middle ground. For example, Sony Music is giving artists 100% access to their streaming data: an application built for transparency that provides data and shares key information on an artist’s daily streaming earnings, as well as the profile and category of the people who listen to their songs.

Formerly, “No single comprehensive database of song ownership metadata exists, which means that identifying the owner of any single song requires a hunt-and-peck search across multiple limited proprietary databases”, says Annie Lin in her article on


This paper has looked into DRM, its impact on the music industry, and how recommender systems grew out of companies’ desperation to keep users locked in, looking at star ratings as one of the most used forms of recommendation. We also looked at how we got here, through the information theory propounded by Shannon.

Technology is advancing too quickly for the world to keep up with, and with DRM still prevalent in music streaming, the future of music becomes questionable. Online, various articles and software tools are paraded as workarounds that let users obtain music freely through illegal means, cutting away the financial gain. As more and more users pay to be on more than one streaming service because of the different features available, music moves away from one of its initial intents, which is to entertain or educate, and becomes more political.


More research needs to be done on the financial implications of online streaming applications for their users. It would also be encouraging to see substantial evidence and theories on how metadata might offer a more enduring solution.



Arthur, W. B. (2011). The Nature of Technology: What It Is and How It Evolves (Reprint edition). New York: Free Press.

Gleick, J. (2012). The Information: A History, A Theory, A Flood (2.5.2012 edition). New York: Vintage.

Murray, J. H. (2011). Inventing the Medium: Principles of Interaction Design as a Cultural Practice (1st edition). Cambridge, Mass: The MIT Press.

Norman, D. (2013). The Design of Everyday Things: Revised and Expanded Edition (Rev Exp edition). New York, New York: Basic Books.

Debray, R. (1999, August). What is Mediology? (M. Irvine, Trans.). Le Monde Diplomatique.

Vermaas, P., Kroes, P., Franssen, M., van de Poel, I., & Houkes, W. (2011). A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems. San Rafael, CA: Morgan & Claypool Publishers.

Rammert, W. (2008). Where the Action Is: Distributed Agency Between Humans, Machines, and Programs. Social Science Open Access Repository (SSOAR).

Clickable Links

Netflix Officially Kills Star Ratings, Replacing Them With Thumbs Up and Down

Content Management or Virtual Learning Environment: A deeper look at Canvas LMS


Learning Management Systems are a hot topic of debate as to whether they function primarily as Virtual Learning Environments or as Learning Content Management Systems. This essay explores this debate by opening up and examining the architecture of one specific case, Canvas by Instructure. Upon closer examination, Canvas seems to be most intuitively used for managing learning processes, albeit with extensibility incorporated for refining deeper learning environments with extra effort and commitment. The approaches taken to arrive at this conclusion consisted of an examination of affordances and conventions presented as part of a socio-technical view of the architecture, paying attention to the major components within a system that is both technical and human. In this way, the essay examines the Cloud Architecture, Stakeholders, Abstraction Layers, and Interoperability potential of LTI standards.


Higher Education Institutions today are tasked with the design, delivery, and administration of learning experiences across in-person and online domains. Students sign up to learn, and institutions seek to facilitate that learning in both the pedagogical and administrative sense. They do so through a variety of software tools and platforms. In their annual review of educational technologies and trends, the New Media Consortium and Educause Learning Initiative defined one of the common types of platforms used for this purpose:

“Learning Management Systems (LMS), also referred to as Virtual Learning Environments, comprise a category of software and web applications that enable the online delivery of course materials as well as the tracking and reporting of student participation. Viewed as a centralized location for the ephemera of learning experiences, LMS have long been adopted by colleges and universities worldwide to manage and administer online and blended courses” (New Media Consortium // Educause Learning Initiative, 2017).

“Canvas” is one such LMS. Created by the vendor company Instructure, Canvas has been adopted by a growing number of higher education institutions. As an aspiring learning designer and technologist, my goal in this essay is to open up some of the major components and layers that make the platform work in order to better understand it contextually, architecturally, and functionally; and to outline how Canvas’ design architecture lends itself especially well to the administrative management of learning processes, but requires extra effort on the part of institutions to successfully incorporate deeper learning in the virtual environment. The software affords learning experiences in service of students; however, its design features are often more conducive to management and administration in the service of institutional stakeholders.

Why is Canvas designed the way it is, versus some other way? I’ve used a few questions as guiding heuristics or methods to answer that question:

  1. Sociotechnical Systems approach: Technical systems and human stakeholders do not exist in a vacuum, especially when examining core mission-related processes such as how learning is managed and delivered by large institutions such as Universities.
  2. Affordances and Constraints of the Architecture: What action possibilities exist because of these architectures and designs? In particular, as it relates to pedagogical approaches and/or the management of the institution.

The Cloud Architecture creates scalability for IT processes and availability for users and stakeholders.  

Figure 1.1

Image Source: (Serrano et al., 2015)

Cloud architecture makes institutional processes scalable, available, and extensible, easing the burden on institutional stakeholders responsible for administering learning in blended and online environments.

The National Institute of Standards and Technology defines cloud computing as “…a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” (Grance & Mell, 2011).   

Observing Figure 1.1 above, the “client” is best understood as the end users, such as institutional administrators, faculty, or students. Canvas is typically implemented as part of a “private” cloud model, where a given university has access to its own instance, or individualized online infrastructure, using the service. While Canvas is also public in the sense that there are instances open to all users, such as Canvas Network (“Canvas Network | Free online courses | MOOCs,” n.d.), the core business model revolves around offering a private cloud setup for a given institution.

It’s important to note that while Instructure interfaces with institutions to develop and offer the software, the infrastructure housing the data lives in data centers offered by other providers such as Amazon Web Services, as seen below in Figure 1.2. The hosting entity offers its infrastructure to the vendor, Instructure in this case, and that vendor then develops and manages its software platform for use by institutions and universities.

Figure 1.2

In practice, users can access the software, with corresponding shared resources, databases, and artefacts, from anywhere as long as their digital device is connected to the internet. In this manner Canvas delivers its software as a service: the institution does not have to host any servers or computers locally on their campus to make it work. It operates entirely through their internet connection. Computing infrastructure such as memory, storage, processing, and networking is paid for at scale with the rate of consumption, instead of buying hardware outright. This is distinct from physical hosting, where customers purchase a physical server in a data center or host servers locally.

Serrano et al. describe some of the organizational advantages of a cloud-based solution: “Besides the economic advantages from a cost perspective, the main competitive advantages are the flexibility and speed the cloud architecture can add to your IT environment. In particular, this kind of architecture can provide faster deployment of and access to IT resources, and fine-grain scalability.” (Serrano et al., 2015).

At its heart, cloud computing is all about providing Information Technology resources and management at a distance. The most obvious affordance of this architecture is the ability to off-load the management of IT onto another organization so that institutional stakeholders can focus more on managing content and learners. Instructure continuously updates the software platform, and those updates trickle down to the institutions, where beta testing can be handled in their test environments, making repeated software updates a much simpler process. With this infrastructure hosted and managed off-site, it also creates a scalable solution for universities: if there is a surge in new users for the platform, the model allows for easily purchasing new licenses from the vendor.

At the end of the day, if Instructure is facilitating all of the back-end infrastructure, institutions need much less overhead in the form of certified IT administrators specific to the platform. The focus can remain on funding the salaries of individuals who can focus on implementation, training, and the design and delivery of learning content.

Stakeholders, affordances of the software, and convention.

Figure 1.3

Stakeholders will default to ease-of-use and convention despite advanced features for learning included as extras.

Content Management is the Trend for Faculty and Students using Learning Management Systems

Key stakeholders typically interface with the canvas platform as part of a socio-technical system. These stakeholder categories typically also have other responsibilities as part of their job description, but often fit into the following categories in relation to the Canvas Platform:

  • The Cloud Hosting Platform: The organization hosting the software platform and its corresponding data and servers, in this example Amazon Web Services.
  • The Vendor: Instructure is the company developing the software, relevant updates, and facilitation of back-end data.
  • The Administrator(s): Institutional and/or program level administrators paying for the service and responsible for the university organization.
  • The Faculty: The instructors facilitating blended and online environments for classes as end-users for the platform.
  • The Students: The learners taking classes as end-users and participants in the platform.
  • The Technologists: The staff members supporting regular implementation of updates and training for faculty, students and staff on a technical level
  • The Learning Designers: The staff members who are helping to design and developing online courses, online course components, and online programs.

Canvas implements many of the typical functions of the LMS and is often at the forefront of developing and implementing new features. That being said, it functions similarly to many of the other leading LMS providers in that users are primarily inclined to manage the learning process rather than catalyze learning for the individual student. There is data for this kind of use on a wider scale, albeit for how Learning Management Systems are being used generally. Having surveyed upwards of 17,000 faculty and 75,000 students, as well as evaluated data and metrics related to IT practice from more than 800 institutions, the Educause Center for Analysis and Research put out a report (Dahlstrom, Brooks, & Bichsel, 2014) with the following statistics:

  • 99% of participating institutions have an LMS in place
  • 85% of participating faculty in the survey use a Learning Management System
  • 56% of faculty reported using it every day.
  • 74% of Faculty say it is a useful tool to enhance teaching.
  • 83% of surveyed students reported using the LMS
  • 56% of surveyed students said they used it in most or all of their courses.
  • 41% of surveyed Faculty said they used it to promote interaction outside the classroom.  

These numbers speak to widespread adoption rates across institutions that are using learning management systems. But adoption isn’t the same as impact, and that final statistic speaks to some important nuance in how the LMS is being used. Examining this same data set, Brown et al. speak to how these statistics showcase how the LMS is actually being used:

“Despite the high percentages of LMS adoption, relatively few instructors use its more advanced features: just 41 percent of surveyed faculty report using the LMS ‘to promote interaction outside the classroom’ … What is clear is that the LMS has been highly successful in enabling the administration of learning but less so in enabling learning itself. Tools such as the grade book and mechanisms for distributing materials (e.g., the syllabus) are invaluable for the management of a course, but these resources contribute only indirectly, at best, to learning success” (Brown, Dehoney, & Millichap, 2015).

In other words, users are defaulting to the most basic functions in order to manage and facilitate standard pedagogical practice. For the users who are most in touch with the LMS on the ground, the system is being used to manage content and solicit content rather than to engage learners in a more collaborative virtual learning environment. Now these statistics are speaking to the LMS generally, and are not specific to Canvas. But they are helpful to keep in mind as a heuristic when opening up the design of a specific case like Canvas, which does contain features both built-in and available from third parties that can help facilitate better learning practices and collaborative engagement.

Affordances vs Convention

This relationship is pertinent within the boundaries of a specific case, Canvas, because of the disparity between what is offered and what is actually used in most learning management systems. End users will act on the functions they see as most useful and actionable, based on what intuitively appears possible. Faculty will run courses that feel most natural as extensions of their teaching process, despite the online environment offering very real differences in what is and is not possible. Canvas offers the option to incorporate advanced features for collaboration and deeper learning, but because that involves extra steps, a majority of users will likely fall back on what is most natural: content management, the pushing and absorbing of documents and multimedia as described above.

Don Norman describes this disparity in terms of affordances and perceived affordances, or the inherent action possibilities and those that are seen as possible action possibilities because of convention. Speaking to the importance of distinguishing between these two concepts, he asks the reader to “Please don’t confuse affordances with perceived affordances. Don’t confuse affordances with conventions. Affordances reflect the possible relationships among actors and objects: they are properties of the world. Conventions, conversely, are arbitrary, artificial, and learned.” (Norman, 1999).

Canvas’ built environment offers both, however, most users will stick to what is basic, built-in, and obvious. In most cases, that correlates to managing procedures and processes simply because they are the least common denominator. This emphasis ultimately makes it less conducive for experimenting with new approaches for learning, barring a concerted and combined effort to integrate these approaches pedagogically and technically.

Using Accounts to Manage Learners & Artefacts

Figure 1.4

Accounts and Subaccounts are used to manage people and permissions within the system, differentiated by roles and permissions.

Instructure offers a variety of Canvas Guides for learning how it is organized, but some of the main building blocks include accounts, sub-accounts, courses and modules. They are defined therein as follows: 

The terms account and sub-account are organizational units within Canvas. Every instance of Canvas has the potential to contain a hierarchy of accounts and sub-accounts but starts out with just one account (referred to as the top-level account). Accounts include sub-accounts, courses, and sections, all of which can be added manually in Canvas, via the API, or via SIS imports (“What is the hierarchical structure for Canvas accounts? | Canvas Admin Guide | Canvas Guides (en),” n.d.). 

Accounts and subaccounts comprise the main skeleton upon which the instance for an entire institution, program, or school is built out. They are separate from individual user accounts, which are what an individual person uses to log in to the platform and participate. The top-level account is usually defined by the largest overall organization using the platform, typically the university as a whole or an individual college or school that decides separately to use the LMS. Sub-accounts then account for the branching units of that organization, as shown above in Figure 1.4 and below in Figure 1.5.
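The nesting described above can be sketched as a simple tree. This is only an illustration of the hierarchy, not how Canvas itself is implemented; the organizational names are hypothetical, and Canvas exposes this structure through its admin interface and APIs rather than through a class like this:

```python
# Minimal sketch of a top-level account with nested sub-accounts.

class Account:
    def __init__(self, name, parent=None):
        self.name = name
        self.sub_accounts = []
        if parent is not None:
            parent.sub_accounts.append(self)

    def descendants(self):
        """All sub-accounts nested beneath this one, at any depth."""
        found = []
        for sub in self.sub_accounts:
            found.append(sub)
            found.extend(sub.descendants())
        return found

# One branch of a hypothetical institutional hierarchy:
root = Account("University (top-level account)")
college = Account("College of Arts & Sciences", parent=root)
dept = Account("English Department", parent=college)
program = Account("Writing Program", parent=dept)
```

Courses and sections would then hang off the lowest tiers of such a tree, which is where the next sections pick up.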


Figure 1.5

Image Source: Canvas Guides 

As you continue to go down that chain, courses and modules exist as sub-units, typically housed within sub-accounts for departments, programs, or other tiers most affiliated with faculty and classes at the institution.


Figure 1.6

“Permissions” and “roles” are the designations given to individual user accounts that let them either participate in or modify accounts, sub-accounts, courses, sections, or even their own settings. Students typically have limited permissions, as their “role” as a student is limited to participation in whichever courses and sections they are a member of. Faculty and designers might have increased permissions to modify and build out the courses they are attached to, and administrators at differing levels can make broader changes to upper-level sub-accounts depending on their role at the institution.

Administrators will typically have permissions to modify or add to sub-accounts depending on which tier of the organization they are managing, as differentiated by the permissions they are given. Individual schools, departments, and programs will often have unique configurations of apps and integrations associated with the courses managed by their sub-account. This enables sub-accounts to manage the affairs of students and faculty unique to those subunits, whether in the form of courses designed around what those students are learning or connections to other software platforms and databases that correspond to that unit.
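The role-and-permission logic described above amounts to a lookup: a user's role determines the set of actions the system will allow. A hedged sketch, in which the role names and permission strings are illustrative rather than Canvas' actual values:

```python
# Illustrative role -> permission mapping, roughly mirroring the tiers
# described above: students participate, faculty build and grade,
# administrators also manage organizational sub-accounts.

ROLE_PERMISSIONS = {
    "student": {"participate_in_course"},
    "teacher": {"participate_in_course", "edit_course", "grade_submissions"},
    "admin":   {"participate_in_course", "edit_course", "grade_submissions",
                "manage_sub_accounts"},
}

def can(role, permission):
    """Check whether a user account's role grants a given permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

In the real platform these checks are also scoped to *where* in the hierarchy the role is assigned, so an administrator of one sub-account cannot modify a sibling sub-account.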

File management is an important part of the structure as well: each of these building blocks, along with each individual user, has a designated amount of storage space for associated digital media. There are storage folders associated with students, faculty members, courses, and so on up the hierarchy.

Abstraction layers designed for managing the complexity of the institution

So why take the time to lay out these building blocks for Canvas as an instance for the institution? By examining how these units exist in nested layers of abstraction, you begin to see how the software is built to manage the complexity of administering the learning process for the institution. Learning experiences for students exist only at the course layer and below; the layers above exist to organize and administer them.

Professor Martin Irvine, a faculty member at Georgetown University, describes this method of organization in terms of layering, abstraction, and black-boxing. Because learning in higher education is managed across multiple levels, both horizontally and vertically, a software platform that purports to manage that process needs to be designed so that it can account for varying degrees of permission and access across those layers. For that reason, “the details of complexity in a module or subsystem can be ‘hidden’ (black-boxed) from the rest of the system with only ‘interfaces’ (structures that create interconnections) to the module as needed by the system” (Irvine, 2017). From a student’s perspective, all they see is their list of courses, with the relevant information communicated to them through their view of the system.
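The black-boxing idea can be made concrete with a small sketch: the student-facing interface exposes only the course list, while the institutional hierarchy behind each course stays hidden. All names here are hypothetical, and this is an illustration of the principle rather than of Canvas' internals:

```python
# A narrow student-facing interface over a more complex internal structure.

class StudentView:
    def __init__(self, enrollments):
        # Internally, each course maps to the sub-account path it lives under,
        # but the public method below never reveals that detail.
        self._enrollments = enrollments

    def my_courses(self):
        """The only thing the student layer exposes: a list of course names."""
        return sorted(self._enrollments)

view = StudentView({
    "Intro to Writing": "University/CAS/English/Writing Program",
    "Data Ethics": "University/CAS/Philosophy",
})
```

The sub-account paths are the "hidden complexity"; `my_courses()` is the "interface" in Irvine's sense.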

In this manner, Canvas allows institutional leaders and stakeholders to manage student learning at the micro level on up to organizational structure for courses and departments at higher tiers of the organization. It is a structure that is very good at managing and administering that learning process. But because student learning is taking place at that course level and below, any assertion for the relative importance of management vs quality of the learning experience will need to be discussed with a deeper look at the functionality of those units in the larger structure.

Courses and Modules define where a learning experience is either managed for utility, for learning, or both.

Modules organize the flow for learners and learning experiences within a course

Figure 1.7

Image Source: Canvas Guides


Modules are what give flow and direction to an online or blended course by grouping individual pages and assignments into a cohesive unit. The folks at Instructure define these modules as the organizational unit for courses, saying that:

“Modules allow instructors to organize content to help control the flow of the course. Modules are used to organize course content by weeks, units, or a different organizational structure. Modules essentially create a one-directional linear flow of what students should do in a course. Each module can contain files, discussions, assignments, quizzes, and other learning materials” (Canvas Doc Team, 2017).

The elements contained within these modules are the pieces that determine what kind of experience students will have, and Canvas natively has some of these as standard available templates to design components for a module and course.  
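The "one-directional linear flow" in the definition above can be sketched as an ordered list of typed items. The item types mirror those listed (pages, discussions, assignments, quizzes); the titles and the progression logic are hypothetical:

```python
# A module as an ordered sequence of items: students move through it
# front to back, one item at a time.

module = [
    {"type": "page",       "title": "Week 1: Introduction"},
    {"type": "discussion", "title": "Introduce Yourself"},
    {"type": "assignment", "title": "Reading Response 1"},
    {"type": "quiz",       "title": "Week 1 Check-in"},
]

def next_item(module, completed_count):
    """Linear flow: the next item is simply the first one not yet completed."""
    if completed_count < len(module):
        return module[completed_count]["title"]
    return None  # module finished
```

Ordering and (optionally) completion requirements are what turn a loose pile of course files into a designed sequence.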

Pages accommodate text, images, and video as the most direct method for delivering content. These media elements can be organized using HTML to position and format where they fit, as well as to include tags that guide screen readers for learners who cannot see the content. Discussions are essentially mini message boards centered on a given topic, where the conversation is limited to that board and to users who are in the course and have permission to participate. Quizzes offer templates for assessment within the module, and so on.

All of these elements are standard, and none are particularly engaging. All of them do a good job of streamlining the learning process, the grading process, and containing it into a neatly wrapped experience inside of the learning management system. The screen-reader tags even help make sure that content is accessible to users who need it. These qualities are excellent for administering a course but are not particularly inspiring for experimenting with pedagogical approaches outside of models that emphasize the delivery of content by explaining, demonstrating, then assessing through a combination of rich-media, writing prompts, and quizzes. 

But these modules do allow for flexibility and experimentation for those who put in the time to design for it, especially once they begin looking outside of the LMS for additional tools. In conversations with 70 thought leaders in the LMS space, the New Media Consortium concluded that “Overall, a ‘Lego’ approach to LMS was recommended to empower both institutions and individuals with the flexibility to create bespoke learning environments that accommodate their unique requirements and needs” (New Media Consortium, 2017).

While modules and their standard elements offer some flexibility for moving pieces around, Canvas also offers the ability to integrate third-party tools into modules, courses, and even higher-level building blocks such as sub-accounts. These outside “Lego” pieces are where Canvas gives more options for accommodating learners or, for some institutions, for reinforcing the administrative strengths of the platform.

LTI is used for interoperability, allowing administrators, designers, and faculty to integrate third-party applications unique to their sub-account or course.


Figure 1.8

Image Source: imsglobal

LTI, which stands for Learning Tools Interoperability, is a means for Learning Management Systems such as Canvas to integrate third-party tools using agreed-upon standards: software systems establish secure connections with each other and then interact with the relevant corresponding digital resources and databases (whether learning objects, documents, or records for users and participants), similar to an API, or Application Programming Interface.

In this case, as with other similar standards, there is an organization that helps facilitate agreement on how this interoperability can take place and through what kinds of protocols. From their website: 

“Learning Tools Interoperability is a standard developed by IMS Global Learning Consortium. LTI prescribes a way to integrate rich learning applications (often remotely hosted and provided through third-party services) with platforms like learning management systems (LMS), portals, learning object repositories or other educational environments managed locally or in the cloud. In LTI, these learning applications are called Tools, delivered by Tool Providers, and the LMS or platforms are called Tool Consumers” (“Learning Tools Interoperability | IMS Global Learning Consortium,” n.d.).
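In the classic LTI 1.1 flow, the Tool Consumer (the LMS) launches a tool by sending a signed form POST carrying standardized parameters that tell the Tool Provider who is arriving and from where. A sketch of those parameters, with the OAuth 1.0 signing step omitted and all the id values invented for illustration:

```python
# Illustrative LTI 1.1 basic-launch parameters. The three REQUIRED keys are
# defined by the LTI 1.1 specification; the values here are hypothetical.

launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version":      "LTI-1p0",
    "resource_link_id": "course-101-module-3-tool",  # where the tool is placed
    "user_id":          "opaque-user-123",           # who is launching
    "roles":            "Learner",                   # e.g. Learner, Instructor
    "context_id":       "course-101",                # the originating course
}

REQUIRED = {"lti_message_type", "lti_version", "resource_link_id"}

def is_valid_launch(params):
    """A Tool Provider's first sanity check: required launch fields present."""
    return REQUIRED.issubset(params)
```

Because both sides agree on these field names in advance, any conforming tool can plug into any conforming LMS, which is the interoperability the standard is named for.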


Figure 1.9

Image Source: Canvas Guides


LTI enables Canvas to act as an open-source “platform” where third-party vendors can sell or give away integrations for use with the LMS. In some cases these integrations are standalone additions developed specifically to operate with the LMS; in other cases they allow for synergy between the LMS and another platform, such as Google Drive or social media. The app store is a tool-rich environment where software developers can create, customize, test, and deploy new applications, but the LTI format also gives institutions the option to incorporate their own customized solutions. Institutions without the resources to build standalone integrations can mix and match those available in the app store.

While LTI allows for many combinations, the limitation in Canvas’ interoperability with other systems lies in its limited support for other standards and channels, essentially limiting the pool of integrations and use-cases to the apps available in the Canvas app store. This is what makes it difficult to completely distinguish Canvas from a “Content Management System” (CMS) model, as these systems operate off similar cloud-based models that channel users into proprietary app stores. A CMS also has users and often gives you the ability to organize and deliver content. What would differentiate Canvas more is the ability to incorporate additional standards that allow for increased interoperability and synergy with existing and emerging e-learning formats: SCORM, Tin Can, and so on.


By opening up the basic layers and building blocks of the system it becomes apparent that Canvas does its job well as a Learning Management System. It can also offer powerful and extensible options to create unique learning experiences, however, not without bucking convention and putting extra thought into design, implementation, and training.  

I’ll conclude by going back to the Horizon Report’s insights on how these adjustments might be taken into consideration for anyone seeking to implement or rethink their approach to Canvas as a Learning Environment, and not just as a Management System:

“The overarching goal of next-generation LMS is to shift the focus of these platforms from enabling administrative tasks to deepening the act of learning. Traditional LMS functions are still a part of the ecosystem, but reimagined incarnations deviate from the one-size-fits-all approach to accommodate the specific needs of all faculty and students.” (New Media Consortium // Educause Learning Initiative, 2017).

Canvas is a streamlined “one-size-fits-all” platform, however, it achieves that in large part by enclosing its users within the platform. Institutional stakeholders are in a position to enhance the learning experience by consciously taking that aspect of the platform into account when seeking out integrations and training stakeholders such as Faculty and Designers who can “open up” increased options for deeper learning.



Brown, M., Dehoney, J., & Millichap, N. (2015). What’s Next for the LMS? EDUCAUSE Review, 50(4). Retrieved from

Canvas Admin Guide | Canvas Guides (en). (n.d.). Retrieved December 15, 2017, from

Canvas Admin Tour. (n.d.). Retrieved December 15, 2017, from

Canvas Doc Team. (2017). What are Modules. Retrieved from

Canvas Network | Free online courses | MOOCs. (n.d.). Retrieved December 17, 2017, from

Cloud Computing Architecture: an overview. (2015, March 5). Retrieved December 16, 2017, from

Dahlstrom, E., Brooks, D. C., & Bichsel, J. (2014). The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty and IT Perspectives. EDUCAUSE Center for Analysis and Research. Retrieved from

Grance, T., & Mell, P. (2011). The NIST Definition of Cloud Computing (No. NIST Special Publication 800-145). National Institute of Standards and Technology. Retrieved from

Instructure | Learning + Tech = Awesome. (n.d.). Retrieved December 16, 2017, from

Intro-Systems-and-Architectures.pdf. (n.d.). Retrieved December 16, 2017, from

Irvine, M. (2017). Intro-Modularity-Abstraction.pdf. Retrieved December 18, 2017, from

Learning management system. (2017, November 16). In Wikipedia. Retrieved from

Learning Tools Interoperability | IMS Global Learning Consortium. (n.d.). Retrieved December 18, 2017, from

Manovich, L. (2013). Software Takes Command (INT edition). New York; London: Bloomsbury Academic.

New Media Consortium // Educause Learning Initiative. (2017). NMC Horizon Report > 2017 Higher Education Edition. Retrieved from

Norman, D. A. (1999). Affordance, Convention, and Design. Interactions, 6(3). Retrieved from

Norman, D. A. (2002). The Design of Everyday Things (Reprint edition). New York: Basic Books.

Serrano, N., Gallardo, G., & Hernantes, J. (2015). Infrastructure as a Service and Cloud Technologies. IEEE Software, 32(2), 30–36.

The Web: Extensible Design. (n.d.). Retrieved December 16, 2017, from

What are External Apps (LTI Tools)? | Canvas Community. (2017). Retrieved December 18, 2017, from

What is the hierarchical structure for Canvas accounts? | Canvas Admin Guide | Canvas Guides (en). (n.d.). Retrieved December 18, 2017, from