We really do need to think like computer scientists

In Introduction to Computing: Explorations in Language, Logic, and Machines, Evans (2011) argues that computer science should be taught as “a liberal art, not an industrial skill” (p. ix). He explains that everyone should study computing because “1. Nearly all of the most exciting and important technologies, arts, and sciences of today and tomorrow are driven by computing” and “2. Understanding computing illuminates deep insights and questions into the nature of our minds, our culture, and our universe” (p. 1). After only a few lessons of Code Academy’s introductory Python course, Evans’ motivation becomes clear: the logic of computing sets in motion a manner of thinking that is distinct and, in a way, empowering. Even in the first lessons, when learning how to properly format instructions in the Python programming language, thinking from an instructive perspective feels different from other ways of thinking. Moreover, having to reason logically and step by step is empowering because it gives you a feeling of control; even with basic programming instructions, you are the one giving orders, and results emerge from them.

Denning and Martell (2015) argue that computer science is a branch of science in its own right because it has its own approach to discovering facts. Moreover, they argue that its approach differs from that of other sciences because it is transformative, not just focused on discovery, classification, storage, and communication: “Algorithms not only read information structures, they modify them” (p. 15). It is also generative, not just descriptive: “An algorithm is not just a description of a method for solving a problem, it causes a machine to solve the problem” (ibid). This way of thinking is felt right away when writing a few lines of basic code by which I, as the programmer, could define variables and determine how they would behave in relation to other variables and to different logical instructions. However, the idea of machine learning tempered this empowering feeling as I kept going through the module.
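A few lines of Python in the spirit of those early exercises are enough to feel this transformative quality; the variable names and values below are my own illustrative choices, not taken from the course itself.

```python
# Define variables and relate them through logic: the program does not
# just describe a rule, it enacts it when run.
hours_studied = 6
lessons_completed = 4

# A logical instruction relating the two variables
making_progress = hours_studied > 5 and lessons_completed >= 3

# As Denning and Martell note, algorithms also modify the information
# structures they read:
lessons_completed = lessons_completed + 1

print(making_progress)    # prints True
print(lessons_completed)  # prints 5
```

Even this small sketch shows the generative point: the code is not a description of a check someone could perform, it causes the machine to perform it.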

Both Denning and Martell’s and Evans’ proposals make sense for today’s world. On the one hand, distinguishing the scientific approach of computer science from that of other sciences is essential at a broader, more public level at this point. While computer scientists already know this as they rapidly advance the field (we are already speaking about artificial intelligence and high levels of machine learning), the public may not be as clear about the wide world that is computing. As Wing (2006) explains, “thinking like a computer scientist means more than being able to program a computer. It requires thinking at multiple levels of abstraction” (p. 34), but the main narrative about programming we have today may not tell the whole story. On the other hand, Evans is right: computing is everywhere, and understanding it can only help us better understand ourselves and our culture.

Making computing more widely used is a challenge on several fronts. The one that came to my mind, considering the history of computing told by Campbell-Kelly (2009) and the growing amount of news and media coverage about algorithms, machine learning, and artificial intelligence, is the constantly widening digital divide within the computing field. The push to make computing more accessible has been advanced by policies emerging in different sectors and at different levels, which is why a website with such well-designed self-learning software as Code Academy exists for free today. However, even such programs may not fully illustrate to users how fast the field is growing, and this lack of awareness means that those who are not learning this logic are being left behind.

As I mentioned at the beginning, the feeling of empowerment that comes from being the one giving instructions was great when I started the learning module. As I was thinking about this, Ada Lovelace’s argument came to my mind: “The Analytical Engine has no pretensions whatever to originate any thing. It can do whatever we know how to order it to perform” (Evans, p. 35). However, after a few more lessons I reached the stage in which you can program interactivity with the user, and I realized that keeping the computing logic in mind is essential not just when I want to code something, but also while constantly interacting with ubiquitous computing.
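Lovelace’s point was tangible at this stage of the course. A minimal sketch in the style of those user-interaction lessons (the prompt and responses are my own invention, not the course’s) shows that even an “interactive” program only ever does what it was ordered to do:

```python
def respond(answer):
    # Every branch below was explicitly instructed; the machine
    # "originates" nothing, as Lovelace argued.
    if answer.strip().lower() == "yes":
        return "Great, let's keep coding."
    else:
        return "No problem, come back later."

# In the interactive version, the answer would come from the user:
# answer = input("Do you want to continue? ")
print(respond("yes"))  # prints: Great, let's keep coding.
```

The interactivity is real, but the space of possible responses was fixed in advance by the programmer.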

In an interview, mathematician Cathy O’Neil, author of the book Weapons of Math Destruction, explained that algorithms are such a big part of our lives because they process not just the information that we personally input into our computers, but also information about us that companies process in order to make decisions that affect our lives. Big data that profiles people in order to advertise services or information to them may end up causing harm because, as she puts it, it is used to categorize people into winners and losers. If the algorithms determine you are more likely to be persuaded to vote, for example, a certain kind of information will reach you that would not if you were categorized differently, even mistakenly. Our access to information is mediated by algorithms, and I think this means that the logic of computing has to be part of our media literacy as well.

When consuming and processing information today, it is important that we develop a layer of thinking in which we question how information was processed by algorithms to be shaped the way it is. If there is anything that taking the Python course made clear to me, it is that nothing in computing is accidental: it may have been instructed by mistake, but it does not happen by chance. What happens happens because it has been instructed to happen. When we apply for services such as health insurance and receive certain information, we have to be able to question how our profiles were processed. And when consuming information online, we also have to keep asking why we find some information instead of other information. The issue, as O’Neil puts it, is that we as a society blindly trust algorithms: “we don’t push back on algorithmic decisioning, and it’s in part because we trust mathematics and in part because we’re afraid of mathematics as a public.” This is highly problematic when we consider that it gets in the way of our interaction with culture and knowledge in society today.

As noted, these challenges are starting to be addressed in different ways today. The idea that everyone should learn basic programming is increasingly part of the narrative, especially in developed countries. In 2013, for example, Code.org was launched, funded heavily by the private tech industry, to promote this idea for children by providing tools for teachers, schools, and kids. And the US government has been investing more in getting people into the STEM (science, technology, engineering, and math) fields. Part of this effort should include learning the abstract computational thinking method not only to create but also to consume. As Evans explains, when computer scientists are faced with a problem, they think about it “as a mapping between its inputs and desired outputs” and thus “develop a systematic sequence of steps for solving the problem for any possible input, and consider how the number of steps required to solve the problem scales as the input size increases” (p. 16). As consumers of information, we likewise need to consider how our information has gone through a number of steps before reaching us and is thus shaped in a particular way. We already do this when we think about the news we consume: we know there is a journalist who researched and wrote an article along with an editor, and that editing decisions went into the topic, the framing, the placement of the news item, and so on. We need to add a layer of thinking in which we consider that information was also selected and processed through algorithms. We need to be able to imagine the mappings Evans mentions, but for this we need to know they are there.
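Evans’ description can be made concrete with a toy example of my own devising: a problem stated as a mapping from inputs to outputs, solved by a systematic sequence of steps whose count grows with the size of the input.

```python
def find_item(items, target):
    """Map (list, target) -> (index of target, steps taken); index -1 if absent."""
    steps = 0
    for index, item in enumerate(items):
        steps += 1  # one step per element examined
        if item == target:
            return index, steps
    return -1, steps

# The same systematic sequence of steps works for any input...
print(find_item(["a", "b", "c"], "c"))    # prints (2, 3)
# ...but a larger input can require proportionally more steps.
print(find_item(list(range(1000)), 999))  # prints (999, 1000)
```

Counting the steps alongside the answer is exactly the scaling question Evans says computer scientists ask, here in its simplest form.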


Evans, David. Introduction to Computing: Explorations in Language, Logic, and Machines. 2011.

Denning, Peter J., and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015.

Campbell-Kelly, Martin. “Origin of Computing.” Scientific American 301, no. 3 (2009).

Wing, Jeannette. “Computational Thinking.” Communications of the ACM 49, no. 3 (2006).