Category Archives: Week 11: Digital Media, New Media

Week 11: Digital and New Media



In the readings for this week, this quote stood out to me: “…For users who only interact with media content through application software, the ‘properties’ of digital media are defined by the particular software as opposed to solely being contained in the actual content (i.e., inside digital files)” (5). I definitely agree with Manovich’s point. We experience digital media through the mediums and formats in which it is presented to us. Following the example given, one would experience an image differently if it were presented in iPhoto, on Facebook, in Microsoft Paint, etc. Cellular technology is an excellent example of Manovich’s point, and Apple’s iOS can serve as a case study of this phenomenon.

Photos can be experienced through Apple’s camera application, or via any of the other phone applications available for download in the App Store. Many devices support advanced camera apps that let users adjust black and white clipping, add a vignette, adjust green and magenta tint, alter exposure, adjust color temperature, and remove blemishes and other imperfections: features that go beyond the basic cropping and border capabilities of the past. One can argue that the invisible structure underlying these apps is Adobe Photoshop. Photo application developers give modern-day users of varying photographic and editing ability the chance to approximate professional-grade software on smartphones and tablets.
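As a rough illustration of what those sliders do beneath the interface, here is a small sketch of my own (not any particular app’s code): an “exposure” or “temperature” adjustment is ultimately arithmetic applied to pixel values.

```python
# Illustrative sketch only: what an "exposure" or "warmth" slider might do to
# raw pixel data. Real camera apps and Photoshop are far more sophisticated.
def adjust_pixel(rgb, exposure=1.0, warmth=0):
    """Scale brightness and shift the red/blue balance of one RGB pixel."""
    r, g, b = rgb
    r = min(255, max(0, int(r * exposure) + warmth))   # warmer: push red up
    g = min(255, max(0, int(g * exposure)))
    b = min(255, max(0, int(b * exposure) - warmth))   # warmer: pull blue down
    return (r, g, b)

photo = [(120, 100, 90), (200, 180, 170)]              # a tiny stand-in "image"
edited = [adjust_pixel(p, exposure=1.2, warmth=10) for p in photo]
print(edited)
```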

Another example of media accessed through smartphone/tablet mediums is music: particularly in video applications such as YouTube and Vimeo, users encounter the same play/pause/fast-forward/rewind controls that were popular on cassette players in the past.

I found this quote:

“Over time, the culture’s construction of the medium is inevitably subverted, as new communication technologies emerge to the fore. Users are confronted with the problem of multiple representation, and challenged to consider why ‘one medium might offer a more appropriate representation than another’” (Bonnett)

to be particularly relevant to questions of social media today: many marketers are constantly re-evaluating which mediums best reach their desired audiences. The quote also reflects Bolter and Grusin’s theories on media: user experience does change how we perceive media. As technology develops, small business owners and marketers will be especially attuned to how new mediums affect media, that is, the content they produce for consumption and sharing.

http://www9.georgetown.edu/faculty/irvinem/theory/Manovich-Media_after_software-2012.pdf

http://quod.lib.umich.edu/j/jahc/3310410.0005.111/–jay-david-bolter-and-richard-grusins-remediation?rgn=main;view=fulltext

 

Digital Nutrition



Elizabeth-Burton Jones
“Digital Nutrition”
Week 11


The iPad has many functions and various uses, from taking notes to playing games to playing instruments. However, it seems as though its role in “media-entertainment” is typically restricted by the advertising world. For instance, like any branding venture, Apple advertises the iPad in specific ways to target specific consumers.

Mini Ad

For illustration, I will use my own journey with the iPad. At first, I saw the iPad as a new highway route to my destination of creating digital music. This desire to create music with the iPad came from the many advertisements for the drum and piano apps. That said, this desire has not been muted; rather, it has shifted with the technology and my coursework. The MacBook, the iPad, and the iPhone seem to have their own places and definitions for their roles in the “media-entertainment” field, and I have started to divide them based on these roles.

iPad Drums

What are these roles?
The MacBook is more of the universal hub for the other objects. It is the home of all of my purchased iTunes music, all of my Word documents, all of my GarageBand files, etc.

Education and Technology

The iPad is my information seeker. I use it during class to record information in my notes; these notes are then sent back to my MacBook Pro and reassessed.

The iPhone is my mobile music center: I frequently download music, identify music (through SoundHound), record music and lyrics, and also send text messages, use transportation apps, etc.

With the MacBook Pro as the nucleus, this arrangement loosely defines the “media-entertainment” roles. But how does the “Apple System” create these roles in “media-entertainment,” and what does it mean? What does it mean to say that the “iPad transforms education”? What does it mean to stroke the tablet rather than hold a mouse?

To start to look at these questions, I read Manovich. Manovich tells many stories about the evolution of technology (cinema, the virtual camera, etc.). The Apple System has also had a sort of “evolution” (and not just from huge computers to small computers). But as we assess these different roles and different evolutions, Manovich reminds us to remember the dialogic nature of these animals.

Apple’s role as a “global entertainment conglomerate” reminds the community that Apple is not only a technology powerhouse but also a place for innovation (Manovich 54).

From click to touch:

I guess my biggest concern comes from the differences between the interfaces, specifically when it comes to playing games and music. Manovich introduces me to this question when he says that “A particularly important example of how computer games use — and extend — cinematic language, is their implementation of a dynamic point of view” (Manovich 91). Before I read this quote, I was searching for ways to add a musical aspect to this post. I was searching for the meaning of the difference between interfaces and why it mattered. This quote helped me find the “point of view”: I interpreted the point of view (with regard to Apple) as the difference between clicking and touching.

For instance, the iPad has changed so much about how gamers play their games of choice. In the game “Virtual Villagers,” the gamer is given a village and has to rebuild it. To do so, the gamer is given a “bird’s eye view” of the island and can pick up the villagers and place them where they should go. The gamer has more of a “hand” in the decision making of the villagers.

This game is similar to other virtual games, but the difference is the perceived power that the gamer has over the game. The Manovich article mentions Tamagotchis, and I couldn’t help but think about the group of Tamagotchis that I would take to school and take care of. But what makes the iPad’s “Virtual Villagers” different from the Tamagotchis or the other “Pygmalion” games would have to be the point of view and the touch.

Another way to look at this would be through music. Each Apple product (MacBook Pro, iPad, iPhone) has a musical component, and with this musical component, each interface is different. Each way that the media is displayed is different. For instance, on the iPhone the music player is obviously smaller but also resembles the Walkman, whereas the iPad is more of a music lover’s dream when it comes to aesthetics.

For instance, with the iPad, the point of view lends itself to a different feel. The image of the album is very big, and the device seems like more of a professional instrument. To exemplify the iPad’s relationship with music, consider its relationship to actual performance: last year I did a performance with a full band, and our soundcheck was done entirely from the iPad. This is only one of the many ways that the iPad gives a different, powerful point of view. I know that this point of view is not unique to the realm of music, but it is distinctive (not every platform makes such an easy transition to the iPad; photography, for instance, does not).

With all of this information I am still searching for the meaning of the point of view. I know that a difference exists, but why is it important? What does it mean to the “interface world”? What does the difference of touch versus click really mean?

Resources:

  • Lev Manovich, The Language of New Media (excerpts). Cambridge: MIT Press, 2001.
    A very influential book in the field. Read selections from chap. 1 (What is New Media) and chap. 2 (“The Interface”)
    Note the categories Manovich set up in chap. 1 for defining “New Media.”
  • Author’s website with supplements to the book.
  • Lev Manovich: “New Media from Borges to HTML.” Introduction to The New Media Reader. Edited by Noah Wardrip-Fruin and Nick Montfort. Cambridge, MA: The MIT Press, 2003, 13-25. (File from author’s site: www.manovich.net).
    See especially the section on “What is New Media: Eight Propositions.”
  • Lev Manovich, “Media After Software,” Journal of Visual Culture, 2012. Author’s preprint version.
    See also a brief preview of his argument, “There is Only Software” (author’s site); pdf version.

From iPhones to Digital Pills



iPhones adhere to Manovich’s five characteristics, but so do other “new media.” For my 506 project, I am currently doing research on Digital Pills, or Smart Pills. The device is composed of three subsystems: the IEM, the personal monitor (patch), and the software. The IEM is a tiny microchip (the size of a grain of sand) with a digestible antenna made of silver nanoparticles that can be attached to any pill. Once the patient ingests the pill, the fluids of the stomach activate the microchip, which sends a signal to the personal monitor (patch). The microchip can also send other data, including the type and dose of medication, the date and time of ingestion, and other biological and behavioral information (e.g., heart rate, weight, activity). The personal monitor is a patch attached to the torso; in the future it may also be attached to a watch or a cellphone. The patch receives and stores the data and sends it wirelessly to software, such as on the doctor’s computer or a mobile phone, that organizes and displays the information. Recipients of the data are not restricted to health care providers; relatives or friends can also receive it (Hoover & Howell, 2010; FDA, 2012; Gagnon et al., 2012; O’Reilly, 2012). In May 2012 the FDA approved the digital pill (FDA, 2012).
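To make the data flow described above a bit more concrete, here is a minimal, purely illustrative sketch of how an ingestion event might be modeled and relayed from the patch to its recipients. All class names, fields, and addresses are hypothetical; they are not taken from the actual IEM system or any vendor’s API.

```python
# Hypothetical sketch of the sensor-to-patch-to-software flow described above.
# Names and fields are illustrative only, not the real IEM/patch interfaces.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class IngestionEvent:
    medication: str          # type of medication
    dose_mg: float           # dose
    ingested_at: datetime    # date and time of ingestion
    heart_rate_bpm: int      # accompanying biometric reading
    activity: str            # e.g. "resting", "walking"

@dataclass
class PatchMonitor:
    """The wearable patch: receives, stores, and forwards events wirelessly."""
    recipients: List[str] = field(default_factory=list)   # doctor, relative, friend
    stored: List[IngestionEvent] = field(default_factory=list)

    def record(self, event: IngestionEvent) -> None:
        self.stored.append(event)

    def transmit(self) -> None:
        # Stand-in for the wireless hand-off to the monitoring software.
        for recipient in self.recipients:
            for e in self.stored:
                print(f"-> {recipient}: {e.medication} {e.dose_mg} mg "
                      f"at {e.ingested_at:%Y-%m-%d %H:%M} (HR {e.heart_rate_bpm})")

patch = PatchMonitor(recipients=["doctor@clinic.example", "relative@example.com"])
patch.record(IngestionEvent("Medication A", 50.0, datetime(2013, 4, 15, 9, 30), 72, "resting"))
patch.transmit()
```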

The Smart Pill thus (1) uses Numerical Representation, as it uses digital technology and works with numbers on one platform as indicators of health information; (2) has Modularity, because we can change which vital statistics and health information we want to send and receive without changing the physical pill; (3) is Automated, as its ability to send health information is triggered automatically at ingestion; (4) has Variability and Multiformity, as the various streams of health information converge to make the pill work and send/receive the information; and (5) is Transcoded, as the data that is sent/received can be read in many different ways, e.g., as an image, as text, and even live.

Manovich states that the use of digital has become so banal and ubiquitous that it becomes “something which does not seem to require much reflection about”. Manovich is thus in line with mediologists in the assertion that, despite our symbolic faculties, it is easy for us to fall into the trap of blackboxing our objects. And just as mediologists would argue, Manovich posits that “all culture, past and present, is being filtered through a computer, with its particular human-computer interface. Human-computer interface comes to act as a new form through which all older forms of cultural production are being mediated.” In the case of the digital pill we can identify many cultural functions that have already been embedded: the healthcare function, the doctor-patient relationship function, the drug adherence function, etc.

Bolter and Grusin discuss remediation by explaining our lack of recognition that iPhones are just an amalgamation of past technologies packaged well into one device: “Our culture wants both to multiply its media and to erase all traces of mediation: ideally, it wants to erase its media in the very act of multiplying them…virtual reality should come as close as possible to our daily visual experience and transparent interface is one that erases itself.” Bolter and Grusin not only ring true for the iPhone, with its clever interface that renders mediation almost invisible, but this invisibility can also be witnessed in the Digital Pill. The Pill is taken just like any other pill, and once the stomach acids dissolve the outer shell and reach the antenna, the pill is activated and starts monitoring, recording, and sending information to doctors, healthcare providers, and even family members. And all of this happens as the user merely ingests it.

Choose your own ending.



Yesterday I was sitting on the bus, and as usual, everyone was on his or her iPhone playing around, waiting until they got home. I could see multiple screens from my seat, and it was clear that everyone was doing something different with the same machine: one person was reading a book on their phone, another playing a game, someone was writing a list, one person was flipping through Instagram, and there I sat, sifting through my Spotify.

We talk about iOS and iPhones a lot in CCT – I know we did when I took 506 – but it’s such a good example of what’s possible with interface. Apple knows what’s up when it comes to designing easy-to-use machines. And this interface allows for all different types of software (within the Apple circle of trust, of course, but that’s a whole other political issue), basically allowing its user a huge, but not infinite, range of activities. When Manovich talked in The Language of New Media about how new technology puts more responsibility into the user’s hands, it reminded me of my iPhone, but also of basically everything I do online. In talking about variability, Manovich explained how our interfaces now correspond with a modern society where every person is unique and so is her experience. Instead of offering one path to do things, we now get customized information delivered to us through our devices. It’s a sort of choose-your-own-ending approach, where a user can meander online and find a piece of information in a certain way that maybe nobody else ever will.

Manovich put it this way: “More generally, every hypertext reader gets her own version of the complete text by selecting a particular path through it. Similarly, every user of an interactive installation gets her own version of the work. And so on. In this way new media technology acts as the most perfect realization of the utopia of an ideal society composed from unique individuals. New media objects assure users that their choices — and therefore, their underlying thoughts and desires — are unique, rather than pre-programmed and shared with others.”

He goes on to ask whether we should want that kind of freedom. This is something that I’ve struggled with, and that friends of mine who aren’t so keen on jumping on every new technology also ask. With that freedom comes a lot of choice and a lot of noise. Manovich treats this delegation of decision making as a moral issue. I think I could agree with that, but almost in a different way. If technology has evolved to deliver media to us in a way that caters to our interests, is that what we really need? The readings mentioned how technology has been shaped by culture, and vice versa, which sort of implies that we’re getting what we want from our devices. We’re utilizing our interfaces and software to take over certain tasks like communication, research, and news gathering.

But back to the bus. Thinking about McLuhan’s idea of the medium being the message, that day with everyone’s cell phone blazing kind of sent its own message about society. While everyone was flipping through their iPhone interface, it seemed like the message was almost more about distraction than any sort of meaningful content. (It feels weird to say this because I really love my phone, and I’m constantly on it.) And maybe that’s just a product of the environment; on a commute, polite societal rules go out the window. But watching everyone flip through their phones, myself included, doing completely different tasks at the same time, we all looked the same. And we all kept flipping. What does that mean? Foulger’s idea of the medium being part of the message also intrigued me in the example of the guy reading a book on his phone. Does reading a long-form book on that kind of device or interface impart a different message than a physical book or even a bigger device like a Kindle?

Touch Screen



by Alexis Hamann-Nazaroff

My central question: how can we apply the thinking of Manovich in The Language of New Media and “Media After Software,” as well as that of Bolter and Grusin in Remediation, to the touchscreen interface that has now become so prevalent?

The Usefulness of the touch-screen for the Cultural Interface: One of my “Inspire” paintings

In the time since Manovich wrote The Language of New Media, a major trend has been the move away from mouse-based user-computer interaction toward a touch-screen-based interface. iPods, iPads, tablets, phones: all are objects that the user interacts with by touching icons on their screen with a finger rather than by dragging a mouse and guiding an arrow. I wonder about the reasons behind, and the implications of, this shift.

One way to think about the touch-screen interface is through what Bolter and Grusin call new media’s “logic of transparent immediacy.” Media developers, users, and enthusiasts keep pushing media to be more transparent, for the interface to disappear in the presentation of content. By this logic, eliminating the mouse is like eliminating a middleman. It erases one step in the functioning of the interface, and users feel closer to the links they open. On a touch screen, icons become like objects with mass and friction that respond to pushing and pulling just like objects in the 3-D world.

The idea that touch screens are any “more natural” than a mouse-based interface is, however, a myth. They have been developed in response to the particular ideologies and priorities of our time. Manovich talks about a shift in the 1990s from what we can call a “Human-Computer Interface” being the central aspect of new media, to new media acting more often as a “Cultural Interface.” In other words, computers today are less about storing and calculating data (they still have these functions, but they are now secondary) and more about sharing, presenting, manipulating, and working with culture. It is in this context that we might be able to explain the touch screen a bit more fully. The touch screen corresponds to a fantasy we have about interacting with cultural objects. When we see cultural objects in person, we always have to be very careful how we handle them (if we are allowed to handle them at all). Visit an archive, and you wear cotton gloves; visit a museum, and you can’t get too close to the paintings without a scolding from the guard; even the books you own, you handle with respect. The touch screen imitates a feeling of immediacy that in the 3-D world can only ever be a fantasy.

I don’t have a lot of touch-screen gadgets, but I do have an iPod, and about a year ago I downloaded an application called “Inspire.” It’s a drawing application, and it is an example of a new media function that is far more feasible on a touch screen than on the older mouse interface. This fits well with Manovich’s point about computers playing a more and more central role as Cultural Interfaces. “Inspire” takes advantage of all the subtlety possible on a touch-screen interface and allows the user to create art. Manovich says “the language of cultural interfaces is largely made up from the elements of other, already familiar cultural forms” (81), so I pondered what elements “Inspire” borrows. On the one hand, drawing in “Inspire” is like drawing in a pre-pencil era, like writing in the sand or finger painting. On the other, one can build up so much detail in “Inspire” that the finished product looks like a painting. Bolter and Grusin said that virtual reality aims to be “photorealistic”: users judge success based on how close a VR product comes to the visual codes of photography. If I may invent a term, “Inspire” is “painting-realistic”: users judge success based on how close an “Inspire” image comes to the visual codes of painting.

One last thought about “Inspire”: Manovich’s “Media After Software” article makes the point that software, and software’s creators, in fact do far more to determine the properties of the media, even as they disappear behind the ideal of automatic computer response and user control. Anything I create in “Inspire,” I owe far more to the hard work of the software developers than to my couple of hours of finger sliding.

The iOS from a Manovich Viewpoint



By: Somaiya Sibai

Using Manovich’s eight propositions that define new media, we can analyze how the iOS used in mobile Apple products, such as iPhones and iPads, can be considered new media. iOS has been an innovation in the world of technology that changed the face of mobile computing, and it has set a standard for similar technologies from other companies to follow. iOS devices have become a taken-for-granted part of modern life, where people use them for numerous tasks and daily activities.

According to Manovich, the distinction between cyberculture and New Media lies in that cyberculture is concerned with all the social interactions enabled by technology, while New Media concerns the cultural objects and paradigms involved in networking and beyond. He explains such cultural objects as those which use computing as a means to distribute and exhibit information. iOS consists of applications, which can be considered cultural objects that perform all sorts of media functions, such as displaying and browsing websites, running games, utilities, and so on. He also mentions that New Media is digital data controlled by software, where certain algorithms control how the media operates, which is exactly how iOS applications work.

This means it follows the principles of modularity, automation, and variability. A piece of software, or an app in this case, is designed to run autonomously according to a certain set of algorithms. A game always operates the same way for everyone and progresses in the same way. Variability, however, allows the customization of some features according to the preferences of each user. We can see this clearly in many apps where users set preferences and options according to their liking. Examples include weight-loss apps, where the user customizes a diet plan according to their personal needs and goals, and coupon/special-offer apps, which a user customizes according to their location and preferences.
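Here is a small, hypothetical sketch (not drawn from any real app) of how that variability can work in software: the algorithm stays fixed for every user, while stored preferences vary the output.

```python
# Illustrative only: one fixed procedure (automation) whose output varies with
# each user's stored preferences (Manovich's variability principle).
def daily_calorie_target(weight_kg: float, goal: str) -> int:
    """Same algorithm for every user; only the inputs differ."""
    baseline = int(weight_kg * 30)                     # crude maintenance estimate
    adjustment = {"lose": -500, "maintain": 0, "gain": 300}
    return baseline + adjustment.get(goal, 0)

# Two users, one app: identical code path, different personalized results.
users = [
    {"name": "User A", "weight_kg": 80, "goal": "lose"},
    {"name": "User B", "weight_kg": 60, "goal": "gain"},
]
for u in users:
    print(u["name"], daily_calorie_target(u["weight_kg"], u["goal"]), "kcal/day")
```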

 

Another feature of New Media is that it morphs two different sets of cultural forces together: for example, turning an image that is normally meant to be looked at and gazed upon into an interactive object that one interacts with and interprets as information. In iOS, the GUI depends heavily on pictorial icons that are “touched” rather than merely looked at in order to operate apps. Users also now mindlessly slide their fingers over images to switch, zoom, or edit them. The apps themselves depend on imagery the user taps and drags to operate the app, as in games such as Angry Birds.

Manovich also explains that any new technology that offers “more immediacy,” “represents what before could not be represented,” “will contribute to the erosion of moral values,” and “will destroy the natural relationship between humans and the world by eliminating the distance between the observer and the observed” can be considered New Media. He argues that every form of media, such as film, photography, telephone, and television, has done this at a certain stage, and thus has passed through the New Media stage. This can also be applied to iOS, which has been an innovation in the field of mobile computing, one that has certainly contributed to all the points he mentions.

According to Manovich, the definitions described so far are all focused on technology. The remaining definitions focus on the encoding of cultural tendencies, turning what was once manual activity into digital activity. Algorithms today are based on linear steps executed in order to complete a task. This is similar to tasks performed by humans, except that technology executes them much faster. Similarly, art was once man-made rather than digitally produced. This means that the entire digital world is based on functions that were once done manually, only with their processes sped up. Examples of this in iOS are numerous: there are apps for note taking, to-do lists, maps, photo retouching, address books, etc., all of which were done manually not too long ago and have been transformed today into almost entirely digital, much faster, and more reliable forms.

Finally, just as the avant-garde period (1915-1928) and the post-World War II period revolutionized ways to represent reality and see the world, New Media revolutionizes the ways we access and manipulate information, with techniques such as hypermedia, databases, search engines, data mining, image processing, visualization, and simulation. All these techniques are present in iOS, which relies on a combination of them to run its apps. As the first mobile computing system to depend heavily on connectivity to the Internet, iOS has made those functions inseparable from itself.

Manovich, the iPhone, and Pictures Under Glass




Sara Levine

Manovich re-energizes the concept of “new media” by attempting to narrow down the specifics of this umbrella term. He explores a variety of components that make up “new media,” including its variability, transcoding processes, and interactivity. Manovich also posits that many of the innovative aspects of “new media” rely on older forms of media. Similarly, Bolter and Grusin emphasize that mediation and remediation are inseparable. I will use the iPhone as an example in order to determine whether or not these definitions are applicable to modern day media platforms.

Case Study: the iPhone

The Apple iPhone is a “new media” technology that has become an incredibly popular entertainment, communication, and computing device. There are a few key components of the iPhone’s design that highlight the way “new media” forms are presented and consumed by the general public. Its presentation as a desirable product is communicated through advertising, and Apple ran a very successful campaign. Manovich discussed marketing in The Language of New Media (Manovich 60). Instead of targeting mass audiences, companies like Apple have been emphasizing the individual. Advertisements for the iPhone highlight features like Siri in order to present the iPhone as customizable and attuned to personal needs. However, the user experience is also presented as simple and direct. Bolter and Grusin point out that there is a noticeable effort to erase the user’s awareness of the medium. The interface of the iPhone, therefore, should be intuitive to the user.

The Black Box

Here are a couple of diagrams I found through Google Image Search of the components of an iPhone:




Manovich explained that “new media” objects such as the iPhone are composed of digital code, automation, and other number-based elements (Manovich 48). However, these are not the parts of “new media” that users directly interact with. Manovich posited that the role of software in the structure of “new media” is much more important than many of us may realize. He argues that “While digital representation makes possible for computers to work with images, text, sounds and other media types in principle, it is the software that determines what we can do with them” (Manovich 3).

Software’s Centrality

Manovich’s concept that “There is Only Software” (Manovich 4) may be applicable to the iPhone’s software components. The software for the iPhone is generally referred to as apps, or applications. Users do not work with the numerical representations of the iPhone’s functions. Instead, they use the software as translations of these hidden processes. Consequently, users forget about the invisible technical details of the technology. As Manovich writes, “media becomes software” (Manovich 12).

New Media as Post-Media, Meta-Media, or Remediation

The implementation of software and apps on the iPhone is not particularly innovative. Software has been used as a method of communicating with hardware since the early days of PC computing. Manovich, Bolter, and Grusin all discussed this process of building on old forms of media in order to produce the “new media” that we interact with on a daily basis. “New media” technologies such as the iPhone combine a variety of familiar media platforms, cultural semiotic codes, and other primary building blocks. The iPhone, for example, contains numerous instances of this remediation. The camera and its application borrow from photographic tools and digital photography. iMessage seems loosely based on many instant messaging programs. The phone component displays functions of a cell phone, a PDA contact list, and a voicemail machine. Additionally, Apple includes its iPod capabilities in the iPhone’s music software. All of these components derive from earlier versions of media. They are combined and reformatted in a remediation process that forms the newest meta-media: the iPhone.

Interface

Manovich discussed HCI in his writing, and used the game Myst as an example of “new media” that made use of cinematic techniques as part of its interface (Manovich 81). The touchscreen seems to be the most prominent method of interaction between user and iPhone. A couple of years ago I read a blog post by Bret Victor about touchscreen technology. The post was titled “A Brief Rant on the Future of Interaction Design,” and made the argument that touchscreen technology such as the one implemented through the iPhone does not successfully utilize all of the capabilities of the human hand. He calls the touchscreen “Pictures Under Glass,” and posits that “Pictures Under Glass sacrifice all the tactile richness of working with our hands, offering instead a hokey visual facade.” This discussion of human capability and user interaction reminds me of the importance of cognitive science within the realm of “new media.” Victor seems to believe that the touchscreen of an iPhone numbs the senses in a human limb that has historically been used to manipulate tools in a tactile manner. How are our cognitive processes affected by this apparent repression? Victor also posits that Pictures Under Glass is a “transitional technology.” He pleads with researchers to look into the development of technologies that work with more intuitive gestures and hand motions rather than simple slide-across-the-screen movement. The touchscreen technology may not possess the same familiarity as the cinematic techniques used in Myst. Its history of remediation may not stretch back very far, and Victor makes the claim that the interactivity required by a touchscreen does not take advantage of the full range of cognitive and physical processes displayed by humans.

References

Bolter, J. David, and Richard A. Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT, 1999. Print.

Manovich, Lev. The Language of New Media. Cambridge, MA: MIT, 2002. Print.

Manovich, Lev. “Media After Software.” Journal of Visual Culture (2012): n. pag. Web.

Manovich, Lev. “New Media from Borges to HTML.” Introduction. The New Media Reader. Ed. Noah Wardrip-Fruin and Nick Montfort. Cambridge, MA: MIT Press, 2003. Print.

Victor, Bret. “A Brief Rant on the Future of Interaction Design.” Web log post. Bret Victor. N.p., 8 Nov. 2011. Web. <http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/>.

Internally Inspecting iPhones



A major theme in our dissection of the worlds of media and digital artifacts, one that becomes clearer and clearer, is that on the surface we view our interactions with objects such as the iPhone as a given process of society, and we assume that these “new” technologies simply provide us with easier and quicker communication than ever before. The majority of people do not question the screen interface, the media functions of their apps, or the mediation of the policies, production, capitalism, and cultural messages that go into making and distributing the iPhone. Manovich states that our use of the digital has “become an assumed part of the everyday existence, something which does not seem to require much reflection about,” summing up the tendency to blackbox things that we have discussed throughout the course.

At first glance, the interface of my iPhone 4 was nothing more than a bunch of icons on a screen that quickly informed me of its functions and allowed me to fulfill my goals of calling, texting, using social media, taking photos, checking the weather, etc. However, after taking into account the kinds of theories proposed by McLuhan and Barthes, the semantics and reproduction found on the screen became much more evident: a microphone for voice memos, an address book for contacts, a musical note for music. All of these not only represent culturally significant items that have been reproduced in the digital interface and its functions, but also continue a set of pre-established rules and symbols, as in the cultural encyclopedia and the recycling of mediums within one another. Manovich goes further in explaining this in “New Media from Borges to HTML” when he says that “all culture, past and present, is being filtered through a computer, with its particular human-computer interface. Human-computer interface comes to act as a new form through which all older forms of cultural production are being mediated” (p. 7).

Looking at McLuhan’s theory that the medium is the message alongside Manovich’s analysis in “Media After Software,” the iPhone’s media functions are more than playing a YouTube video or snapping a photo and uploading it to Facebook. Using the example of photographs presented in the reading, the point of contact and immediacy once met as light hit the camera, capturing an immediate moment in time. Now, images are captured by the software within the iPhone, which is in sync with the software from Facebook, to which we can immediately upload a post and let our friends know what we were doing a few seconds beforehand. While the software is faster and presents a new form of immediacy, it is framed within the cultural, social, and political context of its predecessor.

The notion of remediation discussed by Bolter and Grusin goes farther in explaining the lack of discussion and recognition of iPhones as a technology that has developed based on past creations: “Our culture wants both to multiply its media and to erase all traces of mediation: ideally, it wants to erase its media in the very act of multiplying them” (p. 5). Their statement rings true as our iPhones are updated and proliferated with new apps to make our lives easier and to essentially place our entire lives onto a single device, a path created in part by Apple’s marketing, software development, and business practices. Once the touch screen became a popular feature, it helped to eliminate another barrier in the human-computer interaction experience. Bolter and Grusin state, “virtual reality should come as close as possible to our daily visual experience and transparent interface is one that erases itself.” As we see the proliferation of media interfaces that enable touch and 3D technology, we are seeing a movement to erase acknowledgement of the technology. For example, a new screen cover for the iPhone 5 allows for 3D viewing without wearing glasses or any other device that would intrude upon reality. We are drawn even deeper into the technology as a given object of reality.

More information on 3D iPhone covers:

http://news.cnet.com/8301-17938_105-57577518-1/eyefly-3d-screen-protector-makes-iphone-5-3d-capable/

 

New Media Ourselves to Death



Wanyu Zheng

In the winter of 2011, I set up a Chinese “Tumblr” page called New Media Ourselves to Death (http://zhan.renren.com/communications). More than a tribute to the famous book Amusing Ourselves to Death, I wanted to emphasize the power of new media at the time and how it produced a rushed way of life. I was fascinated by this revolutionary power and couldn’t help imagining how it had changed and would continue to change everything. Apple devices and social networking sites save our time and waste our time simultaneously, and we enjoy them all the same. I like Manovich’s database theory, because new media presents a world of databases: new media forms are so compatible and inclusive that all our videos, texts, and images are archived as files on personal computers, organized yet fragmented. “The database presents the world as a list of items, and it refuses to order its list” (Manovich, The Language of New Media, 41). I’d say my current desktop is a good example: I’m doing multiple things simultaneously on one platform/interface, but by opening all the windows, web pages, and software, I’ve interacted with different interfaces and am able to construct my own desktop database.

 

Right in the place where we take this course, the CCT studio, I noticed an old yet potentially evolutionary device standing silently at the back of the room, designed as a “Microsoft Surface” in its time (2004). It was perhaps the earliest tablet ever and an ancestor of the current Microsoft Surface, iPad & Nexus X. The device looks like a heavy metal desk with an electronic glass screen on its surface, and inside the desk, a black-box-like body, are complicated wire cables and circuit boards. The screen allows both direct multi-touch and mouse control through a computer. What’s even more remarkable is that the device is installed with software and applications for gaming and working. In short, it’s a desk-sized Microsoft Surface. The circuit board inside is an interface providing the networks for wires and technological components to connect to each other, which generates machine-to-machine interaction; the glass screen is an interface for users to “talk” directly with the machine, which involves human-technology interaction; and the software in the device can be viewed as an interface as well, since it enables the user to enjoy the process of playing games: the action itself is the transmission of design, code, and ideas. Although it does not tell a story, this device is a collection of items in which each has the same significance as the others, creating an integration of different levels of interface.

Standing in front of this precursory tablet, I felt so excited that I was even picturing what the desk of the future would look like: a neat, large, plain desk with nothing on it. By touching any point, the desk surface might transform into any media form we want: a large piece of paper, a PC/Mac system desktop, a social networking site platform, a digital piano, or simply a piece of solid wood. It obeys all five characteristics identified by Manovich: numerical representation (it is digital and works with sound and image as numbers on one platform); modularity (we can change any form/page without changing the structure, the desk itself); automation (the ability to accomplish the task is automated); variability (variability and multiformity during the media convergence process); and transcoding (data can be read in many different ways). Most importantly, the future desk already exists in essence, conveying an idea of the abundance of today’s new media.

On my “New Media Ourselves To Death” page, I posted Steve Jobs’ classic lines from the launch of the first-generation iPhone in San Francisco in January 2007:

“Today, we’re introducing three revolutionary products of this class. The first one is a widescreen iPod with touch controls. The second is a revolutionary mobile phone. And the third is a breakthrough Internet communications device. An iPod, a phone, and an Internet communications device. Are you getting it? These are not three separate devices, this is one device, and we are calling it iPhone.”

New media never dies; it re-mediates.

 

References:

Lev Manovich, The Language of New Media (excerpts). Cambridge: MIT Press, 2001.
A very influential book in the field. Read selections from chap. 1 (What is New Media) and chap. 2 (“The Interface”)

http://zhan.renren.com/communications

Steve Jobs Introducing iPhone At the MacWorld 2007:

http://www.youtube.com/watch?v=x7qPAY9JqE4

 

Re-Mediation of the iPad and Its Effects



As someone who is studying communication, culture and technology, one of my darkest secrets is my tendency to cling to familiar technology whenever a new gadget is introduced to the market. When I was a child, I didn’t understand what was so different about using a DVD instead of a VHS, and in middle school I distinctly remember making fun of a friend for wanting to carry around all of his music with him on his iPod. More recently, the concept of a tablet/iPad was more confusing than interesting to me when it initially gained popularity. I just did not see the value in re-shaping a computer screen in an attempt to make it seem new and interesting.

Having read Manovich’s and Bolter’s discussions about what digital media entails in modern times, I realized that it wasn’t so much the physical object of the iPad/tablet that bothered me, but rather the re-mediated qualities embedded in the technology. In an iPad, re-mediation is visible at all levels: on the interface, in its media functions, and in its mediations. However, each layer of re-mediation depends on the user’s interaction with, and perception of, the level of invisibility of various forms of mediation.

The interface of an iPad is a physical re-mediation of a computer monitor and, before that, a notepad. However, this notion comes with one added caveat: the user can manipulate objects on the screen with their hands, rather than with a cursor. This tangible component of the iPad is also a re-mediation of how humans naturally touch and readjust surfaces. This type of interaction through remediation is described by Manovich as “the mix between older cultural conventions for data representation, access and manipulation and newer conventions of data representation, access and manipulation.” The interface of the iPad therefore creates a sense of personalization that would otherwise have been unattainable on a computer, given the newly incorporated features of creating things by hand.

Similarly, the media functions of an iPad largely re-mediate those of a computer, but on a more basic level due to the constraints that its tangibility places on the technology. Essentially, the iPad is technically able to process media as well as a computer, but its interface limits the scope of whatever is being processed. Additionally, the apps included in the software of an iPad demonstrate how strategic implementation of software can make the mediated functions of a technology seem invisible. For example, take a look at your address book app. The images shown on the screen replicate how a “real” address book would look, complete with hand-flipped pages. The familiarity of the original, re-mediated object makes the software of the app invisible. People are unlikely to question what kind of code was used to create the app, but will likely notice how “easy to use” it is, since it is based on a model they already know.

The mediation level of an iPad is more difficult to discuss, since an iPad can mediate any number of things: email, mail, entertainment, and any other type of information. To limit my discussion, I will focus on the iPad’s remediation of television functions and a few effects it has had thus far. With digital apps, the convenience of watching television with the freedom to choose the time and place you tune in has been heightened. Apps catering to this convenience continue to thrive, creating a somewhat symbiotic relationship of re-mediation between themselves and the interfacing technology. For example, Netflix has been an extremely successful online service, and by providing viewers with a steady stream of entire show seasons, rather than one episode at a time like traditional television, the company has been able to re-mediate not only how people interact with their iPad technology, but also how they interact with their televisions. In doing so, Netflix learned that people tend to “binge watch” entire seasons quickly, which prompted the company to invest in creating its own show, House of Cards, which would only be shown on Netflix.

Since not all iPad/Netflix users participate in these “binge watching” sessions, this situation demonstrates how re-mediation of a medium can make the digital version of the technology seem more “real” because it is better matched to the kind of software the iPad is capable of running, thereby highlighting the “cultural bias” toward digital endeavors. Any paid-for television channel with its own specific set of shows could be similarly successful if it were to show all episodes of one show in marathon succession. However, the personalization of choosing when to watch the shows is what gives the remediation of television shows its weight. Overall, the re-mediation involved in all Apple products and some other tablets creates success for the brand as a whole thanks to its strong basis in familiar analog technologies.

Packing Up Libraries



Yiran Sun

E-readers are peculiar objects in today’s world. On one hand, the e-reader is commonly considered an electronic device in the same class as iPads and other tablets, which usually appear side by side with computers or smartphones; on the other hand, it is relatively narrowly fixed on just one task, reading e-books, which makes it more similar to digital cameras and other task-specific electronic devices. The first e-reader as we know it today was released in 2004 (the Sony Librie; Amazon’s Kindles started in late 2007), so it is also peculiar in the sense that it combines the most ancient medium, inscription, with an extremely new technology, electronic paper. According to an IDC study from March 2011, 48% of all e-book readers sold worldwide were Kindle models. I myself used to own a K3 and now own a Paperwhite. But what’s inside those little devices? Since they are so single-tasked, they can’t possibly be too complicated, can they?

An object: What is inside the device

But it is complicated. The Kindle Paperwhite packs an awful lot (see illustration below) into a 6.7’’x4.6’’x0.36’’ body: the battery, the WAN board, the circuit board, the 3G antenna, the WiFi antenna, the e-ink display, the touch screen, and the light guide with the LED lights. Physically, the core technology of the Paperwhite lies in two parts: the e-ink display and the lighting system. The e-ink display sets e-readers apart from similar devices such as the iPad mini and the Kindle Fire, while the lighting system makes the Paperwhite stand out among a number of models and brands. However, neither is a revolutionary technology: the lighting is intricate, yet people had been using clip-on LED lights since 2007 and they worked fine; the e-ink is eye-friendly, but it also limits content possibilities because of its low refresh rate.

A platform: Software and what lies beyond

A great part of the Kindles’ power resides in the “Kindle” application, a piece of software that can be used across various operating systems, from Mac to PC, from iOS to Android. This standardization of platform enables users to make the most of the Kindle system, and it also greatly increases people’s likelihood of exposure to it. But a platform would be useless if there were no content, and this is THE factor that made Kindles the major market holder: behind them stands Amazon, the most influential bookstore of this decade. (There is another entire story behind the power of Amazon, but let’s not delve into that for now.) Almost every published book can be found on Amazon, and a great many of them have a free or purchasable Kindle version, while for those that do not, the user can always click the “Tell the publisher” link to signal interest in a Kindle version. As time passes, more and more “nodes” (e-books) are added to the “network” (the Kindle system), and the more nodes there are, the more powerful the network becomes.

An Interface: Pathway to the most ancient

But why do people bother to purchase these e-readers if their contents can be accessed on any operating system? For this question, the hat should be tipped toward a most ancient function in our society, the inscription function, and two functions derived from it: the book function and the library function. The latter two have been so deeply incorporated into our society via important cultural cores like history, religion, and legislation that they have come to be “naturally” associated with knowledge and prestige. Combined with the single-task design of e-readers, a person with a Paperwhite would most likely be identified as learned, or at least interested in the pursuit of knowledge, while the same person reading books off the Kindle application on a smartphone would not trigger such an association.

It is also fascinating how closely a Paperwhite resembles an actual physical book. (Even its name suggests this!) It is of a similar size, the screen is matte and feels like the surface of paper, and the e-ink display with the LED lights gives the text the same contrast against its background as text on paper… Every aspect of the interface design is aimed at creating immediacy, producing the illusion that the user is holding an actual book, only one that is lighter and holds potentially unlimited information. In fact, this is exactly what I have been telling my friends and family: it’s just like a book!

A Revolution: Yes and no

And the best thing about it is that it’s not just one book: it’s a library. This is the part where many find the technology of e-readers to be revolutionary. However, as mentioned before, none of the technologies in e-readers is particularly cutting-edge or exclusively unique. The use of its content is the same as well: one can read the books, add bookmarks and highlights, and make notes, but one cannot change the content of the published book, nor republish it. E-readers allow users to do what they can do with a physical book, but nothing more… except having the contents of countless books in the size of one. This feature is certainly useful for people on the move and for those who cannot afford to physically store a large quantity of books. In my case, since I have not yet made any plan to stay in any particular city, I cannot store many books with me. For this I am grateful to my dear Kindle. Yet I still purchase physical books, usually after I’ve read them either from the library or from the e-library, and have them shipped directly to my home in China. Through time, books have taken on a symbolic value that is somewhat hardwired into their materiality, and people of our generation still feel “a different vibe” when we lay our hands on leaves of paper. I do not know whether this superstitious view of physical books will carry on to the next generation, but I do know that e-books will keep up their pursuit of immediacy, and soon even our generation may not be able to tell apart the analogue-analogue and digital-analogue versions of texts.

Media in My Pocket



Apple claims that loving “it” is easy and great. But what is “it” exactly? Not just the phone, but the old, new, and digital media functions that come along with it.

I entered the iPhone world quite recently, this past January. Up until then I wasn’t necessarily opposed to the iPhone, but I found more cost-efficient cell phone options. After figuring out how to make my cell phone plan work with a gently used iPhone 4 given to me by a friend, I figured I’d give the iPhone a shot and see what all the hype was about.

Tying this week’s readings in with real-world experiences helped me conceptualize how frequently we try to fit PCs, iDevices, and tablets into one or two rigid categories. Take the ubiquitous iPhone, for example: it is a phone, a digital content maker, a repository of media artifacts, and a medium for digital and new media. I found that Manovich’s argument about the computer can be applied to the iPhone:

“No longer just a calculator, a control mechanism or a communication device, a computer becomes a media processor.” (Manovich, 2001, p. 48)

This statement by Manovich shows how, initially, people viewed computers as single-tiered operational tools. Now the computer has evolved into a robust processor of information, communication, and media. Like the computer, the iPhone is not just a passive device for making and receiving phone calls. It is a processor of media: the iPhone mediates verbal communication (via SMS and MMS), it has its own port of entry into the Internet, and it can capture photos in a digital format which can be shared with other phones. Of course, the iPhone does much more than this, but I think it is useful to treat it as a media processor, just as a regular computer is.

Furthermore, when we consider the media functions of a specific device, its “invisibleness” often gets overlooked. As Manovich (2012) stated in “Media After Software,” “The new ways of media access, distribution, analysis, generation and manipulation all come from software” (p. 2). The software within the iPhone is mostly invisible because it is not the first part of the device you notice. The interface (the screen, the buttons, the total physical entity) gets noticed first, followed by the ‘regular’ media components such as telephone options and Internet capabilities. But the “cool” digital and new media functions cannot happen on their own; they depend on the software and its constant updates.

Lastly, I found Bolter and Grusin’s (2000) take on immediacy relevant to the topic of the iPhone and the convergence of mediated systems. Their example of the 1996 presidential election being the first one covered on the Net was interesting, because now it is hard to picture an election that is not mediated through various communication channels. The iPhone serves as a platform for mediated communications and new media (defined by Manovich as cultural objects that use computing to distribute and exhibit information).

My takeaway is that we cannot take our devices at face value and should dig past the shiny interface of a screen to ask “what does this device do for me?” and “how is new media being implemented here?” Questions about the evolution of devices should also be asked: do they truly extend our cultural memories? Or do such devices fragment our cultural memory into invisible pieces across digital and non-digital platforms?


References:

Bolter, Jay D. and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: The MIT Press, 2000.

Lev Manovich’s 5 Principles of New Media. YouTube. http://www.youtube.com/watch?v=PA8x4BZdwVo

Manovich, Lev.  The Language of New Media (excerpts). Cambridge: MIT Press, 2001.

Manovich, Lev. “Media After Software,” Journal of Visual Culture, 2012.