
(Fig. 1) Dr. J.W. Mauchly makes an adjustment to ENIAC, the massive computer he co-designed to assist the U.S. military during World War II. (Source: www.scienceclarified.com)
Abstract
This paper provides an overview of technological advances in the historical context of World War II, the institutions and individuals that played a key role in the creation of computers, and the impact of those advancements on the technology we use today within the context of armed conflicts. It aims to look beyond ‘what if’ scenarios and take a closer look at the moments in history that marked a before and after for computing technology, the key actors behind them, and how they shaped and defined the computers we use today and the ways we use them.
Topics, concepts and keywords:
- History of technology development during and after World War II.
- University research and government funding for the development of technology during wartime.
- Computing, graphical interfaces for human interaction and combinatorial technologies.
- Keywords: World War II, Department of Defense, DARPA, ARPA, ENIAC, Cold War, MIT, MIT Lincoln Labs.
Research questions
- In what ways is the current technology around computers influenced by the technological achievements of World War II?
- Within this context, under what circumstances does the combination of international conflict and government involvement with university research teams motivate the advancement of technology?
Introduction
In popular culture, it is common to refer to our current times as the age of technology. We live in a world that is not only intrinsically tied to technology but also incredibly dependent on it. This trend is not entirely new; we have been shaped by technological advancements for a very long time, even before the harnessing of electricity. However, there is no denying that the pace at which technology advances has sped up drastically in the last half century. It was not that long ago that we lived in a world without the internet, cellphones, GPS, or digital cameras, just to name a few. More surprisingly, technology is advancing so fast that new devices often render their predecessors obsolete within a few years.

(Fig. 2) Source: Google Images.
Society marvels at new technological advances in different fields and wonders, “How is it possible?” The rapid pace and the mysterious, black-boxed character of modern technological advancement make it seem like something magical, almost inevitable and unstoppable. In order to demystify technology as an autonomous entity that magically evolves independently of us, it is important to ask: what happened 50-60 years ago that set this phenomenon in motion? Who played a part in it? And how did it affect the current state of our technology?
A snapshot in time
To begin to answer these questions, it is necessary to look at what was happening in the world at the time. Upon analyzing this period, we find that it was not one specific event but rather an interdependent chain of events that happened at precisely the right time. Moreover, it was not one specific individual, but a group of different actors and institutions whose actions shaped the path technology would take in the future.
Even though technology remains very much present and a determining factor in conflicts to this day, and although earlier inventions from World War I served as the ancestors on which new technology was built, no war has had a greater impact on the technology of our current lives than World War II (1939-45).
It was a peculiar moment in history in which a unique combination of factors occurred simultaneously: the need for technological advances to defeat the enemy coincided with an intellectual flourishing of revolutionary ideas in the field. Government and private-sector funding joined forces with academic research in the United States, at institutions such as MIT and Stanford, which resulted not only in the victory of the Allies but in effects that still resonate in the way we interact with technology in our everyday activities.

(Fig. 3) The transportation technology advances in World War Two included amphibious landing vehicles, aircraft carriers, vastly improved tank technology, the first appearance of helicopters in combat support roles, long range bomber aircraft and ballistic missiles. (Source: www.21stcentech.com)
Many technologies and scientific discoveries were adapted for military use. So many major developments happened in such a short period of time that it is difficult to study and analyze all of them in this limited space. To name a few, there were design advancements in weapons, ships, and other war vehicles, and improvements in communications and intelligence with devices such as radar, which allowed not only navigation but also remote location of the enemy. Other fields drastically influenced by these advancements were medicine and the creation of biological, chemical, and nuclear weapons, the most notorious case being the atomic bomb.
On this subject, Dr. David Mindell of MIT draws attention to a few specific cases and their impact, both on the war and its outcome and on the current state of our technology:
“We can point to numerous new inventions and scientific principles that emerged during the war. These include advances in rocketry, pioneered by Nazi Germany. The V-1 or “buzz bomb” was an automatic aircraft (today known as a “cruise missile”) and the V-2 was a “ballistic missile” that flew into space before falling down on its target (both were rained on London during 1944-45, killing thousands of civilians). The “rocket team” that developed these weapons for Germany were brought to the United States after World War II, settled in Huntsville, Alabama, under their leader Wernher von Braun, and then helped to build the rockets that sent American astronauts into space and to the moon. Electronic computers were developed by the British for breaking the Nazi “Enigma” codes, and by the Americans for calculating ballistics and other battlefield equations. Numerous small “computers”—from hand-held calculating tables made out of cardboard, to mechanical trajectory calculators, to some of the earliest electronic digital computers, could be found in everything from soldiers’ pockets to large command and control centers. Early control centers aboard ships and aircraft pioneered the networked, interactive computing that is so central to our lives today”. (Mindell, 2009).

(Fig. 4) The V-1 or “buzz bomb,” an early cruise missile used by Germany during World War II. (Source: www.learnnc.org)

(Fig. 5) Radar system in operation in Palau during World War II. (Source: www.learnnc.org)
The history of how all of these advancements came to be is fascinating, and it would be easy to get sidetracked into analyzing each of them. However, this paper does not aim to be a mere recounting of facts that are already well documented by historians. Instead, let us look at the specific case of advances in computing, probably one of the biggest takeaways, if not the main one, from World War II.
Even though ‘computing’ as a way of thinking and seeing the world, including its machinery, had existed for a very long time before these events, there is no denying that the leap of the last 50-60 years has been enormous, and we owe it, in large part, to the research and funding achieved during and after World War II.
As a field, computing began formally in the 1930s, when notable scholars such as Kurt Gödel, Alonzo Church, Emil Post, and Alan Turing published revolutionary papers, such as “On Computable Numbers, with an Application to the Entscheidungsproblem” (Turing, 1936), that established the importance of automatic computation and sought to give it mathematical structure and foundations.

(Fig. 6) Alan Turing, considered to be the father of computer science. (Source: www.biography.com)
The Perfect Trifecta: University Research Teams + Government Funding + Private Sector
Before World War II, the most important analog computing instrument was the Differential Analyzer, developed by Vannevar Bush at the Massachusetts Institute of Technology in 1929: “At that time, the U.S. was investing heavily in rural electrification, and Bush was investigating electrical transmission. Such problems could be encoded in ordinary differential equations, but these were very time-consuming to solve… The machine was the size of a laboratory and it was laborious to program it… but once done, the apparatus could solve in minutes equations that would take several days by hand”. (Mindell, 2009).

(Fig. 7) Vannevar Bush (1890–1974) with his differential analyzer. Bush joined MIT at age 29 as an electrical engineering professor and led the design of the differential analyzer. During World War II, he chaired the National Defense Research Committee and advised President Franklin D. Roosevelt on scientific matters. (Source: Computer History Museum)
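To make the analyzer’s task concrete, the short sketch below integrates a simple ordinary differential equation numerically with Euler’s method in Python. The equation (the charging of an RC circuit) and its parameter values are illustrative assumptions, not one of Bush’s actual transmission problems; the point is only that solving such an equation means accumulating many small integration steps, the repetitive work the Differential Analyzer performed mechanically and that otherwise took days by hand.

```python
# A minimal sketch (not Bush's actual problem): numerically integrating
# a simple ordinary differential equation, dV/dt = (V_source - V)/(R*C),
# with Euler's method. Each loop pass is one small integration step of
# the kind the Differential Analyzer carried out mechanically.

def solve_rc_charging(v_source=100.0, r=1_000.0, c=1e-3,
                      dt=0.001, t_end=5.0):
    """Return (time, voltage) samples for a charging RC circuit."""
    v, t = 0.0, 0.0
    samples = []
    while t <= t_end:
        samples.append((t, v))
        v += (v_source - v) / (r * c) * dt  # Euler step
        t += dt
    return samples

if __name__ == "__main__":
    # Print roughly one sample per simulated second.
    for t, v in solve_rc_charging()[::1000]:
        print(f"t = {t:4.1f} s   V = {v:6.2f} V")
```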
During World War II, the U.S. Army commissioned teams of women at the Aberdeen Proving Ground to calculate ballistic tables for artillery. These tables were used to determine the angle, direction, and range at which to shoot in order to hit a target more effectively. However, the process was error-prone and time-consuming, and the teams could not keep up with the demand for ballistic tables. In light of this, the Army commissioned the first electronic computing machine project, the ENIAC, at the University of Pennsylvania in 1943: “The ENIAC could compute ballistic tables a thousand times faster than the human teams. Although the machine was not ready until 1946, after the war ended, the military made heavy use of computers after that” (Denning, Martell, 2015).

(Fig. 8) 1946, ENIAC programmers Frances Bilas (later Frances Spence) and Betty Jean Jennings (later Jean Bartik) stand at its main control panels. Both held degrees in mathematics. Bilas operated the Moore School’s Differential Analyzer before joining the ENIAC project. (Source: Computer History Museum).
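As a rough illustration of the repetitive arithmetic behind the firing tables described above, the sketch below tabulates range against elevation angle for a drag-free projectile, R = v² · sin(2θ) / g. The formula and the chosen muzzle velocity are simplifying assumptions for illustration only; actual ballistic tables modeled air resistance, wind, and shell characteristics, which is exactly what made them so laborious to compute by hand and such a natural first job for ENIAC.

```python
# A simplified sketch of a firing-table calculation (illustrative only;
# real WWII ballistic tables accounted for drag, wind, and shell type).
import math

G = 9.81  # gravitational acceleration, m/s^2

def range_no_drag(muzzle_velocity_ms: float, elevation_deg: float) -> float:
    """Range of a drag-free projectile: R = v^2 * sin(2*theta) / g."""
    theta = math.radians(elevation_deg)
    return muzzle_velocity_ms ** 2 * math.sin(2 * theta) / G

if __name__ == "__main__":
    velocity = 450.0  # m/s, an assumed illustrative muzzle velocity
    print("elevation (deg)   range (m)")
    for elevation in range(5, 50, 5):
        print(f"{elevation:15d}   {range_no_drag(velocity, elevation):9.0f}")
```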
This is one of the first examples of government and university research teams combining efforts to fund and advance technology. However, it is worth noting that this was not the only such project underway in the world at the time. In fact, the only one completed before the war ended was the top-secret project at Bletchley Park, UK, which cracked the German Enigma cipher using methods designed by Alan Turing (Denning, Martell, 2015).
Nevertheless, projects such as ENIAC (1943, US), UNIVAC (1951, US), EDVAC (1949, US, a binary serial computer), and EDSAC (1949, UK) provided ground-breaking achievements that later allowed for the design of more efficient, reliable, and effective computers: “Even relatively straightforward functions can require programs whose execution takes billions of instructions. We are able to afford the price because computers are so fast. Tasks that would have taken weeks in 1950 can now be done in the blink of an eye”. (Denning, Martell, 2015).
These projects sparked a flourishing of ideas that transformed computing into what it is today. Computers changed from being mere calculators to being information processors, and pioneers John Backus and Grace Hopper played a key role in that shift. In 1957, Backus led a team that developed FORTRAN, a language for numerical computations. In 1959, Hopper led a team that developed COBOL, a language for business records and calculations. Both programming languages are still used today: “With these inventions, the ENIAC picture of programmers plugging wires died, and computing became accessible to many people via easy-to-use languages” (Denning, Martell, 2015).

(Fig. 9) 1952, Mathematician Grace Hopper completes A-0, a program that allows a computer user to use English-like words instead of numbers to give the computer instructions. It possessed several features of a modern-day compiler and was written for the UNIVAC I computer, the first commercial business computer system in the United States. (Source: Computer History Museum).
The role of government funding during this period was essential, but it went beyond simply granting money to university research teams. In February 1958, President Dwight D. Eisenhower ordered the creation of the Advanced Research Projects Agency (ARPA), later renamed the Defense Advanced Research Projects Agency (DARPA), an agency of the United States Department of Defense whose mission is the development of emerging technologies for use by the military. International armed conflict not only played a part in the creation of this agency; it was the very reason for it. On the climate surrounding its creation:
“ARPA [originally] was created with a national sense of urgency amidst one of the most dramatic moments in the history of the Cold War and the already-accelerating pace of technology. In the months preceding [the creation] … the Soviet Union had launched an Intercontinental Ballistic Missile (ICBM), the world’s first satellite, Sputnik 1… Out of this traumatic experience of technological surprise in the first moments of the Space Age, U.S. leadership created DARPA” (Official website).
The agency states its purpose clearly: “the critical mission of keeping the United States out front when it comes to cultivating breakthrough technologies for national security rather than in a position of catching up to strategically important innovations and achievements of others” (Official website). Given this description, it is not difficult to see that tension between countries engaged in armed conflict directly affects their willingness to invest in the creation of new technology.
However, the projects funded by this agency since its creation have provided significant technological advances that have had an impact not only on military uses but on many other fields. The most ground-breaking include the early stages of computer networking and the Internet, as well as developments in graphical user interfaces, among others.
(Fig. 10) 1962, J. C. R. Licklider, first director of DARPA’s Information Processing Techniques Office (IPTO), discusses concepts with students at MIT. (Source: DARPA)
Along the same lines, the Department of Defense, in collaboration with the Massachusetts Institute of Technology, created the MIT Lincoln Laboratory as a research and development center focused on the application of advanced technology to problems of national security: “Research and development activities focus on long-term technology development as well as rapid system prototyping and demonstration… The laboratory works with industry to transition new concepts and technology for system development and deployment” (Freeman, 1995).
Other institutions, like the Stanford Research Institute, emerged from this same combination of university research and government funding after World War II and continue to develop technology to better the lives of the public. Among its accomplishments are the first prototype of a computer mouse, inkjet printing, and involvement in the early stages of ARPANET.
When the future becomes now
Many people involved in the projects created during World War II went on to start computer companies in the early 1950s, and universities began offering programs of study in the new field by the late 1950s. More specifically, computer science programs were founded in 1962 at Purdue University and Stanford University, facing early criticism from scholars who believed there was nothing new outside of mathematics and engineering. “The field and the industry have grown steadily ever since, into a modern behemoth whose Internet connections and data centers are said to consume over 3% of the world’s electricity”. (Denning, Martell, 2015).
Over the years, computing provided new insights and developments at such a pace that, in a matter of a few decades, it advanced further than many other fields had since their creation: “By 1980 computing had matured in its understanding of algorithms, data structures, numerical methods, programming languages, operating systems, networks, databases, graphics, artificial intelligence, and software engineering”. (Mindell, 2009).
The first forty years or so of the new field were thus focused on developing and perfecting computing technology and networks, producing ground-breaking results that better suited it for combinatoriality and further advancement. In the 1980s another shift started in the field: interaction with other disciplines and the computational sciences: “Recognizing that the computer itself is just a tool for studying information processes, the field shifted its focus from the machine itself to information transformations”. (Denning, Martell, 2015).
The biggest advances of this field have been integrated into our world seamlessly, shaping not only our lives but the way we see and interact with that world. Design achievements such as the microchip, the personal computer, and the Internet not only introduced computing into the public’s lives but also promoted and sparked the creation of new subfields. This effect, in fact, repeats itself almost like a cycle, explain Denning and Martell: “Network science, web science, mobile computing, enterprise computing, cooperative work, cyberspace protection, user-interface design, and information visualization. The resulting commercial applications have spawned new research challenges in social networks, endlessly evolving computation, music, video, digital photography, vision, massive multiplayer online games, user-generated content, and much more”. (Denning, Martell, 2015).

(Fig. 11) Evolution of the computer. (Source: Google Images)
David Mindell clearly expresses this marvelous achievement: “Perhaps the single most remarkable development was that the computer—originally designed for mathematical calculations—turned out to be infinitely adaptable to different uses, from business data processing to personal computing to the construction of a global information network”. (Mindell, 2009)
Conclusion
What if World War II hadn’t happened? Would our current technology be at the stage that it is today? In what ways would it be different? How long would it have taken us to achieve these technological advancements if military conflict wasn’t present in the context?
Such hypothetical questions were the ones that plagued my mind when I started this research, and there is no clear answer to them. The impact World War II had on society is undeniable and impossible to measure. The world was never the same in any aspect, and no field was left untouched. International relations and diplomacy, with the creation of the UN and the Universal Declaration of Human Rights, and world politics, especially in Europe, were forever changed, leading to dictatorships and further armed conflict within the region. Other fields such as physics, biological weaponry, engineering, medicine, and genetics, just to name a few, went through drastic changes as well, sparked by the events of this time, which in turn led to future conflicts such as the Cold War and the development of nuclear weapons by various nations.
At the core of all these changes is technology. World War II and its impact on the development and advancement of technology shaped the world as we know it now, in ways that we’re still trying to comprehend and address.
Would technology be less mature, robust, or advanced if World War II had not happened? Probably, but the difference would be more one of pace than of path. There were astounding technological advances before the war, and technological achievements not sparked by military conflict continue to occur. However, wartime stimulates inventiveness and advancement because governments become more willing to spend money, urgently, on revolutionary and sometimes risky projects.
In the specific case of World War II, the creation of computers was the result of different actors and institutions (universities, government agencies, computer scientists and researchers), each with their own interests, pushed by armed conflict to work together at exactly the right time in one of the most world-changing cases of serendipity in history. It is the ‘before-and-after’ not only of our generation but of our civilization.
References
Texts:
- Campbell-Kelly, Martin. “Origin of Computing.” Scientific American 301, no. 3 (September 2009): 62–69.
- DARPA official website: https://www.darpa.mil/about-us/timeline/where-the-future-becomes-now
- Denning, Peter J., and Craig H. Martell. Great Principles of Computing. Cambridge, Mass.: MIT Press, 2015.
- Freeman, Eva C. MIT Lincoln Laboratory: Technology in the National Interest. Lexington, Mass.: MIT Lincoln Laboratory, 1995.
- Geiger, Roger L. Research and relevant knowledge: American research universities since World War II. Transaction Publishers, 2008.
- Hall, Daniel and Lewis Pike. If the World Wars hadn’t happened, would today’s technology be less advanced? Guru Magazine, web source: http://gurumagazine.org/askaguru/if-the-world-wars-hadnt-happened-would-todays-technology-be-less-advanced/
- Mindell, David. The War That Changed Your World: The Science and Technology of World War II. Introductory essay for the exhibition “Science and Technology of World War II” at the National WWII Museum, 2009. Web source: http://www.ww2sci-tech.org/essays/essay2.html
Images:
- Fig. 1: http://www.scienceclarified.com/scitech/Artificial-Intelligence/The-First-Thinking-Machines.html
- Fig. 2: Google Images.
- Fig. 3: http://www.21stcentech.com/technology-war-part-3-war-impact-transportation-technology/
- Fig. 4 and 5: http://www.learnnc.org/lp/editions/nchist-worldwar/6002
- Fig. 6: https://www.biography.com/people/alan-turing-9512017
- Fig. 7: http://www.computerhistory.org/revolution/analog-computers/3/143
- Fig. 8: http://www.computerhistory.org/revolution/birth-of-the-computer/4/78
- Fig. 9: http://www.computerhistory.org/timeline/1952/#169ebbe2ad45559efbc6eb35720dca99
- Fig. 10: https://www.darpa.mil/about-us/timeline/ipto
- Fig. 11: https://robertocamana.files.wordpress.com/2014/08/articulo-no-140.jpg