Who Dun It: Faultlessness Thanks to Computers

Of the narratives and backstories provided about pioneers in computing, Vannevar Bush's story stuck out like a sore thumb to me this week. His instrumental contributions to science laid technological foundations for warfare, an unforeseen outcome for the visionary innovator. The military-industrial complex was built on unstable political ground as America transitioned from World War II to the Cold War. As Bush explained, however, the optimum use of science is to benefit society, not dismantle it. The need for and utility of computers surpass efficiency; they are about wiping our hands clean of guilt and liability.

Our dependency on, and abhorrence of, computers is complex. We are the minds that created these technologies, determining their purpose, flexibilities, possibilities, and accommodations. Yet it is clear that technologies are more malleable and capable than we are. Bush explains that we operate by association and past experience; aside from the Web's breadcrumbing of page history, computers do not. The next web page you visit after YouTube isn't necessarily Vimeo, unlike our memories and cognitive moments, which are chained together by association. The inner workings of web browsing attempt to mimic human associative relationship formation (e.g., cookies). Moreover, in computing there is often only one route to a specific piece of information: delving into subclasses. Conditions must be satisfied to reach an outcome, whereas humans can weigh other factors before deciding to act.


The interfaces of computers create a bridge between the human user and the desired action-oriented program. All of the capabilities of these programs are things humans can do, but as Bush explains, "such machines have enormous appetites." Moreover, computers lack morality and carry out tasks as long as their objectives are met. For example, one of the opening scenes of the 1987 film RoboCop shows a boardroom meeting in which engineers and executives discuss the newest model of law enforcement, the ED-209. This seemingly effective technology maliciously "malfunctions" and kills an armed executive during a demonstration, leaving several, but not all, parties appalled. Needless to say, the ED-209 is rejected as a solution to the city's rising crime rate, and thus RoboCop is created. The ED-209 is a computer fulfilling the needs and desired actions of humans while lacking morality and displacing accountability. The ED-209 will say that anyone who steals is wrong and should be shot, while humans may evaluate the motive behind the theft, like feeding one's family, before passing final judgment. This is what makes computers so daunting and enchanting to humans: they are more perfect than we are, yet they serve as a scapegoat for our actions. The U.S. military-industrial complex works the same way, in that drone strikes devastate residential areas with one goal in mind, neglecting the side effects of the attacks.

Final note: the cover art of Engelbart's text shows a man with a third-eye technological device. Does this remind anyone else of the Dr. T. J. Eckleburg billboard? It makes me think of the post-World War I atmosphere and the steady corruption of America's moral fiber. A possible precursor to the military-industrial complex? Or is this a stretch?


It is clear that the intellectual ability of the scholar is not stunted by technical innovations, but for the layman, is it stifled? With our mundane use of technology, particularly HCI, how will this alter human-to-human interactions, along with individual and community-dependent skill sets? Understanding the concepts for this week was a bit of a struggle, but I'm hoping that after some discourse with my fellow students, my intellectual vision will be cleared.