Category Archives: Week 9

Computational thinking is all about human thinking – Ruizhong Li

Among the computing principles, the COMPUTATION category impressed me the most. It addresses the problem of converting a complex problem into a series of computational steps that lead to a solution.

When I was learning Python on campus, the tutor always started solving a problem with a flowchart. For example, if she wanted to print "Hello World" five times, here is her workflow:

First, she would draw a flowchart:

[Flowchart: the while-loop version of printing "Hello World" five times]

Computational thinking is a way humans solve problems. In the process of drawing the flowchart, we are thinking in a human way. To print "Hello World" five times, we intuitively have to define how many times to print and what to print, and then do the printing. These steps in our mind are exactly what the flowchart, and the code, depict.

The code, using a while loop:

i = 0
nbrTimesToPrint = 5
while i < nbrTimesToPrint:
    print('Hello World')
    i = i + 1

In the code, we use the variable i as a counter that counts from 0 to 4, which is five iterations. Using the while loop, the little program repeats the print action five times in only five lines. The number of lines stays the same even if we want to print hundreds or thousands of times; only nbrTimesToPrint changes.

Therefore, the advantage of abstraction is manifest. If you cannot abstract "printing 'Hello World' five times" as a repeating process, it is impossible to think of putting a loop in your code. In that case the flowchart will look like this:

[Flowchart: the version without a loop, one print step after another]

and the code will be:

print('Hello World')
print('Hello World')
print('Hello World')
print('Hello World')
print('Hello World')

Here the number of lines is about the same as in the loop version, and this version is even more explicit. It is the coder's call. For now the requirement is easy to meet: even the second version, without a loop, solves the problem in a few lines. But looking further ahead, the loop version is clearly more advantageous.

It should be noted that human thinking comes first, and computational thinking follows. The decision of whether to loop or to issue the command five times directly is made in the human mind. The idea of a loop is not taught to us by computation or software; it is our developed logical thinking that enables us to simplify the steps by making use of a loop.

We can also see something close to RECURSION in this tiny program (strictly speaking, the while loop is iteration, since it repeats with a counter rather than calling itself). By using the value of the counter variable i and comparing it with the variable nbrTimesToPrint, the program knows when to stop. The counter i needs an initial value, i = 0, and with i = i + 1 at the end of the loop body, the counter increases by 1 each time the program runs through the process. The counter is the descriptive feature of the program that records how many times it has run, and it is this value that determines when the program ends.
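For comparison, a genuinely recursive version of the same behavior would have a function call itself with a smaller count until it reaches zero. This is a small sketch of my own; the function name is an illustration, not part of the tutor's example.

# A recursive sketch of "print 'Hello World' n times".
# The function calls itself with n - 1 until n reaches 0 (the base case).
def print_hello(n):
    if n <= 0:          # base case: nothing left to print
        return
    print('Hello World')
    print_hello(n - 1)  # recursive call on a smaller problem

print_hello(5)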

Computing is the automation of abstractions, and computational thinking focuses on the process of abstraction. Computation and software rely on human minds, and by writing out code, humans have a chance to recount what is going on in their minds in an observable way.

References:

"Key Concepts in Technology: Week 7: Computational Thinking & Software." YouTube video. Accessed October 27, 2016. https://www.youtube.com/watch?v=CawtLHSC0Zw&feature=youtu.be.
"Jeannette M. Wing – Computational Thinking and Thinking About Computing." YouTube video. Accessed October 27, 2016. https://www.youtube.com/watch?v=C2Pq4N-iE4I.
Denning, Peter J. "The Great Principles of Computing." Accessed October 27, 2016. https://drive.google.com/file/u/0/d/0Bxfe3nz80i2GdFZsMnhxSjNqekE/view?usp=embed_facebook.
Evans, David. Introduction to Computing: Explorations in Language, Logic, and Machines. Accessed October 27, 2016. https://drive.google.com/file/d/0Bxfe3nz80i2GbkVkRkdmZjFhNDQ/view?usp=embed_facebook.

Programming, Language, Conditionals, and … Animals? – Lauren

In Great Principles of Computing, Denning writes, "A programming language is a set of syntax rules describing a precise notation for each of the above structures." It was important for me to realize that we communicate with a computer in ways both similar to and different from how we communicate with other people. While coding in Python this week, and in my previous experience with Python and JavaScript, I came to realize the strict importance of syntax. Computers fail to understand contextual information unless they have already been programmed with a library of understanding about that information. Therefore, you need to provide very strict grammatical patterns for the computer to make sense of your program.

This means computational thinking is needed because we are not speaking to another human. In her Computational Thinking video and essay, Jeannette Wing notes, "The process of building a computational solution to a problem is fraught with errors, which can occur at any of the four keypoints: expressing intentions and desires accurately, programming accurately, compiling accurately, and executing accurately. Here: 'accurate' means preserving the original intention of the designer." This explanation helped me understand that, beyond its linguistic dimensions, computational thinking and programming clearly pull from engineering and mathematical theory, as Wing says in her video. Programming is building in a strict fashion so that what is executed can be interpreted in one way and one way only. In linguistics, patterns can be rearranged and interpreted in an infinite number of ways. In engineering, if the instructions and numbers can be interpreted in more than one way, human error and miscommunication can cause rockets to explode.

Wing goes on to explain, "Computational thinking involves solving problems, designing systems, and understanding human behavior, by drawing on the concepts fundamental to computer science." I began to understand that programming itself is not all of computational thinking. Programming is the interaction between the human and the computer in which the meaning-making, semiotic process occurs; computational thinking is the process one takes toward solutions, and the language is the set of instructions for the solution. In her video, Wing said that space, time, and energy are the fundamental efficiency constraints in computational thinking. Similar to Occam's Razor, the simplest answer, the one consuming the least time and effort, is the one we must choose, because we live in a universe with physical constraints.

Wing continues, "It is recognizing both the virtues and the dangers of aliasing, or giving someone or something more than one name. It is recognizing both the cost and power of indirect addressing and procedure call." While programming in Python, I learned just how clear I had to be with my definitions. Because computer programs have very little vocabulary stored in their libraries compared to humans, creating new vocabulary for them must happen in the programming language. This is when we define and set our variables: Red = 5 and Monty = 23. Unlike humans, computers cannot function if Monty can equal both 23 and 29 unless they are specifically told when it equals 23 and when it equals 29. This can be expressed in a conditional statement: if 1 + 1 == 2 is True, then Monty = 23; else, if 1 + 1 == 4, then Monty = 29. This conditional logic operates similarly to our understanding of words in sentence context. If you say "I am all fired up," or "I was fired today," or even just the word "fire," they have very different meanings even though the grammatical structures are not that different. As natural language processors, we can make sense of these conditionals through cultural absorption.
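Written out as actual Python, using the values from the paragraph above, that conditional might look like this sketch:

# Variables act as new vocabulary we define for the computer.
Red = 5
Monty = 23

# A conditional tells the computer exactly when Monty equals 23 and when it equals 29.
if 1 + 1 == 2:
    Monty = 23
elif 1 + 1 == 4:
    Monty = 29

print(Monty)   # prints 23, because only the first condition is True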

My own personal side note on the future of programming:

This week I was curious about some of the new ideas and projects from Vint Cerf (who also wrote the foreword to Great Principles of Computing) and was excited to come across this TED Talk. As usual, I like to take a multispecies view of newly arising technology. In the talk, computer scientists and animal-behavior biologists come together to define elephants, great apes, and dolphins as sentient or self-aware beings (able to identify themselves in a mirror). They then imagine a world in which the complex communication patterns of these animals are brought within our understanding through computing. Vint Cerf describes this project as a precursor to larger efforts, like those connected to the ISS and Mars missions, to extend computational language processing beyond the scope of humans, with the intention of communicating with intelligent extraterrestrial life if we were ever to come across it.

While at first this all seems very "wishy washy" and reminiscent of pseudoscience, the TED Talk made me realize that there is actually a lot of science funding going into this type of computational programming, because we exist within a universe of endless possibilities, and any physicist or mathematician can tell you that there is a statistical chance of coming across intelligent, communicative life. Moreover, even getting a glimpse of animal communication networks can help us better understand our own cognition and biological evolution on our own planet.

I wonder how our understanding of abstraction and computational thinking may change as further research into multispecies computing continues…

 

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

Peter J. Denning, “The Great Principles of Computing.” American Scientist, October, 2010.

Extended Cognition and the Automaton

This week's readings and Python exercises really helped me to see the major distinction between computational thinking and thinking more generally. I had used Python before for manipulating large sets of Twitter feeds in Social Media Analytics, but the readings, along with doing the course on Codecademy, helped me connect a few dots I hadn't seen before.

First and foremost was the idea of layers of abstraction (Wing). That computational processes strip a human abstraction down to only the layers necessary to carry out a function really helped me de-blackbox this idea of computational thinking. That human abstractions can be reasoned with by separating them into layers really tossed me around this week, because this idea differs from the more traditional semiotic process. It seems like in computational thinking we start with the meaning, break it into its components, and reason with them that way – a reverse semiosis. The treatment of artifacts was interesting as well. That in computer science the scientific method observes artifacts we ourselves created is fundamentally different from how we scientifically study other things, especially because computers are so intertwined with our cognition (Dasgupta). That a computer's human-given purpose must be considered in its scientific study effectively makes the study of computers the study of humans.

However, the utility of computers lies in the idea that they automate abstractions (Wing). This was reminiscent of cognitive offloading and the extended mind for me. We use computers to offload complex processing and automate it, making it more efficient. And then these automated processes become part of our cognitive process. Even though our brains aren't the ones doing the automating work, computers allow us to behave as if they are. So the question still stands… what is and isn't computable? What is the range of possibility for an extended cognition that uses this kind of advanced computational thinking? The question is further complicated if we accept that computers are not the origin of computational thinking (Campbell-Kelly).

References

  • Campbell-Kelly, Martin. “Origin of Computing.” Scientific American 301, no. 3 (September 2009): 62–69.
  • Dasgupta, Subrata. It Began with Babbage: The Genesis of Computer Science. Oxford, UK: Oxford University Press, 2014.
  • TheIHMC. “Jeannette M. Wing – Computational Thinking and Thinking About Computing.” YouTube video, October 30, 2009. Accessed October 27, 2016.

Coding: Ain’t Just For Nerds Anymore! – Alex MacGregor

Having little to no experience with coding before I came to CCT, I found this week’s exercise to be very informative. Being given the opportunity to test the parameters and abilities of coding was helpful in seeing how it is truly a language, replete with syntactic and grammatical rules. Misplacing a certain element of code or confusing one variable for another resulted in errors, much the same way our brain has trouble processing nonsensical linguistic structures. One cannot escape the computational history of coding, either. Concepts we’ve discussed in class that have computational uses, like “Boolean values” and binary, also have fundamental roles in the architecture of coding. By “de-blackboxing” the coding process, you’re able to see how coding is not some arduous task requiring specialized knowledge, but rather just an extension of our own language system and ways of thought. Yet again, we see that there is nothing alien about computing; it came from, and belongs to, us.

This was also evident from the readings this week, particularly Campbell-Kelly’s “Origin of Computing” and Dasgupta’s “It Began with Babbage: The Genesis of Computer Science”. Plotting the computational lineage from the literal human computers of the 19th century to the exponentially smaller and faster machines we’re dealing with in this era was really cool. Seeing the various disciplines and influences that shaped the history and future of computing made me wonder what the computational landscape would look like if certain events, like WW2 or Babbage abandoning the Difference Engine, had never happened.

After reading the article “Computational Thinking” by Jeannette Wing, I started to think of coding as a mechanism through which computational thinking can be wired and framed in our minds. The absolutely imperative role that computers play in not only our daily lives but also the long-term trajectory of our species has been more or less accepted by the public at large, yet coding is still seen as a relatively esoteric field. The bottlenecking of the functional knowledge required to operate these incredibly important cognitive technologies seems to me an undesirable situation. So I share Wing’s insistence on placing computational thinking on the same level as the traditional Three R’s of education. That model is from the turn of the 19th century, and we’ve quite clearly gone through multiple socio-technological revolutions since then, so the de-blackboxing of these systems and technologies should be an educational imperative. I believe coding should be to the 21st century what literacy was to the 20th.

References

  1. Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35
  2. Martin Campbell-Kelly, “Origin of Computing.” Scientific American 301, no. 3 (September 2009): 62–69.
  3. Subrata Dasgupta, It Began with Babbage: The Genesis of Computer Science. Oxford, UK: Oxford University Press, 2014.

Computer Language and Computational Thinking (Roxy)

I had at least learned something connected with semiotics before, when I was an undergraduate student, but computer science, or computation, is a brand-new field to me. Before I touched this field, I felt it was really cool and challenging; it is one of the greatest inventions and social changes, with a great influence on human beings. So I finished the free Python course on the Codecademy website. After coding with my own hands, I felt computation is not as hard as I thought. In Python, text wrapped in triple quotes (""" ... """) or led by the # sign acts as symbols that mean things, since the computer ignores these words; they are comments addressed to human readers. The other statements, without quotation marks or the # sign, are symbols that do things, but they can only be written in a particular form.
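A tiny sketch of that distinction (my own example, with made-up names):

"""
This triple-quoted text is just a string the program never uses,
so Python effectively ignores it; it means something only to the reader.
"""
# This is a comment: it also means something only to the human reader.

greeting = 'Hello'      # these statements are symbols that do things:
print(greeting * 3)     # prints HelloHelloHello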

According to Prof. Dasgupta, computer science is a science of artifacts. Unlike natural science, which devotes itself to figuring out the existing functions of organs, fossils, or oxygen, computer science studies artifacts whose purposes and functions are all defined by human beings, and so they are decodable. As an outsider, I really hope coding and computation can be as easy as possible, so I cannot help but ask why we cannot code in natural language. English may not be able to perform this task, but even French or German, which are more rigorous than English, still could not meet the requirements. There is an impassable gulf between natural language and computer language. I think the root of this gulf is the huge difference between how human beings think and how computers process instructions. People communicate through a language of thought, while computers can only work through precise, descriptive language.

But we can see the gap between computer language and natural language shrinking over the history of coding. At first, people used 0s and 1s as the basic components of code, which is very hard for human beings to memorize, check, and read. Now we have JavaScript, Python, and so many other computer languages, and they are really similar to natural language except for their fixed forms and arrangements.

With the help of computer language, not only can we code, we can also have computational thinking. The core of computational thinking is the problem-solving skill of reformulating a difficult problem into one we know how to solve (Jeannette M. Wing). For example, iOS is a relatively closed system compared with Android. Yes, it has some open-source bits, but the vast majority of the operating system is closed-source, and there is no real possibility of an application changing system settings. So how do some music apps manage to display scrolling lyrics on the screen when the phone is locked, even though iTunes and the built-in Music app do not offer that function? The developers reformulated this hard question into one they knew how to solve: they render each lyric line onto a copy of the same cover image, slide by slide, and swap the images every few seconds. To the user it looks as if the cover never changes and only the lyrics are rolling. I think this is a really smart practice, and if we could apply this kind of computational thinking to daily life, it could solve some hard problems too.


Questions:

Why can't different computer languages be defined with the same framework or arrangement?

References:

[1] Denning, Peter J., and Craig H. Martell. Great Principles of Computing (Cambridge, MA: MIT Press, 2015).

[2] Wing, Jeannette. “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35

[3] Subrata Dasgupta, It Began with Babbage: The Genesis of Computer Science. Oxford, UK: Oxford University Press, 2014. Excerpts: Prologue and Chapter 1.

 

 

Game of Life – Carson

The Game of Life is a program created by John Conway around 1970. In this program, cells live or die depending on binary, true/false rules. In Conway's design, each cell has neighbors, and the number of neighbors a cell has during one generation determines whether that cell will live, die, or reproduce in the next generation. The program repeats itself (a recursive process, applying the rules to its own previous output) until, in many runs, only still-life clusters of cells are left: cells that have the right number of neighbors to stay alive, but not the right number to reproduce.
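Since the rules come down to counting neighbors and checking true/false conditions, one update step fits in a short Python sketch (my own illustration, not the JavaScript version described further down):

# One generation of Conway's Game of Life on a small grid of 0s and 1s.
def step(grid):
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # count the live neighbors of cell (r, c)
            neighbors = sum(
                grid[r + dr][c + dc]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols
            )
            if grid[r][c] == 1:
                new[r][c] = 1 if neighbors in (2, 3) else 0  # survive or die
            else:
                new[r][c] = 1 if neighbors == 3 else 0       # reproduction
    return new

# A "blinker": three cells in a row flip between horizontal and vertical.
blinker = [[0, 0, 0],
           [1, 1, 1],
           [0, 0, 0]]
print(step(blinker))   # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]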

(Sorry for this poor explanation… here is a gif to make up for it)

[Animated GIF via GIPHY]

(jk… here is an image)

[Image: Game of Life]

The Game of Life has been re-created countless times using various programming languages. I believe this is because it is such a straightforward visual example of what programming does. Everything is apparent, an abstraction of an abstraction if you will…


After working with Python for a few hours, I gained the confidence to try to create my own Game of Life in Python. However, the software did not get along with my computer, so I had to stick to JavaScript. Below are a few frames from my Game of Life program. Instead of using squares to represent the cells, I used lines to show how the code can be manipulated.

[Two frames from the JavaScript Game of Life program]

 

Here is a link to the Game of Life Wikipedia page for general info.

 

 

Okay so the gif does not show up on the blog post, but the link works if you want to check it out…

Beautiful Is Better Than Ugly (Becky)

I knew the process of making meaning as part of the symbolic species was complex before I started on this week’s adventure. But as I tried to wrap my human brain around all of the processes needed to translate human speak into electrical signals, my mind was blown.

At each stage of the transition from symbols that mean things to symbols that do things, an astounding amount of human symbolic power was needed to create these technologies in the first place. And that it all runs efficiently and quickly and mostly without errors, no doubt in large part thanks to Claude Shannon, is hard to believe.

Using the efficient software and apps are humans, with our apparently complex, ambiguous, irregular, uneconomic, and limited natural language (thanks for the adjectives, Evans). We make meaning out of our natural language despite and because of its imperfections, but computers can’t make sense of it like we do. They need precise and unambiguous instructions to do their jobs.

One of the interfaces that helps us communicate with the machines is programming language. Python is one. We can read and write Python, make meaning of it; computers can execute it with the help of some other code. Interestingly, I’ve always used the word “understand” rather than “execute” in that last part, but I stopped myself this time because while the machines are processing symbols, they aren’t understanding meaning. They’re executing.

Python is a relatively high-level programming language that was developed to be accessible and readable to humans versed in natural language; the principles "beautiful is better than ugly" and "readability counts" are part of the canon. Yet I find trying to learn Python a bit difficult precisely because it is so close to natural language. I assume that if or for-in statements should do certain things based on my knowledge of English, but as Codecademy and Coursera have taught me, my assumptions are not always correct. I wonder if a more abstract language would be better for me. But I digress.
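One small example of the kind of English-based assumption that tripped me up (my own sketch): a for-in loop over a string walks through its characters, not its words.

sentence = 'beautiful is better than ugly'

# English intuition might suggest this loops over words...
for item in sentence:
    print(item)          # ...but it prints one character per line

# Looping over words requires splitting the string explicitly.
for word in sentence.split():
    print(word)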

A compiler (or an interpreter for other programming languages) translates the code I’ve written and that I can understand into code the machine can do something with, or at least starts that process. This has usually been the boundary of the black box for me, but I think it’s been pried open this week.

The compilers map our code onto Python’s underlying grammar, a more abstract symbol system that some very smart people created. That grammar translates the information into machine code, which directs instructions to the computer’s hardware in more symbols—binary 1s and 0s—and ends up in electrical signals (I think). The machine, through symbols, is acting on the commands I gave it using another symbol system. And the symbol system I made meaning of translates into a physical action in the form of electrical pulses. The results of my little program are stored in memory until the program wraps up and the results are sent back to me so I can interpret and make meaning of them. (Although, I think there is another layer before machine code for Python so it can work with lots of different operating systems, kind of like Java, but I’m really on shaky ground here.)
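As far as I can tell, that hunch about an extra layer is right: Python first compiles source code into bytecode for a virtual machine (much like Java), and the standard dis module lets you peek at that intermediate layer. A minimal sketch:

import dis

def add_one(x):
    return x + 1

# Print the bytecode instructions the Python compiler produced for add_one,
# the layer that sits between my source code and anything machine-level.
dis.dis(add_one)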

With all this complexity and all the work that went into developing these processes, let alone the complex pieces of software and tiny pieces of hardware involved, I probably shouldn’t get too grumpy when Microsoft Word freezes up every once in a while.

I saw the fruits of compilers when using Python, but I think I’m finally starting to grasp how they work thanks to Evans, Denning, and Martell. The P=NP problem and the concept of stacks are also much clearer than they’ve ever been. Recursion in general makes a lot of sense, and Python training has helped to clarify that more, but the idea as described by Evans is still a little fuzzy. And I find myself wondering about the concept of cognitive scaffolding—does the concept have parallels in computing? Both the process of using heuristics to get answers to problems that can’t be logically computed (described by Denning and Martell) and regular expressions in programming languages reminded me of the concept of cognitive scaffolding, but I imagine this might be an incorrect comparison.

I leave this week wondering if computation is the universal language. And I certainly see the value of teaching computational thinking. But there is beauty and adventure in the imprecision and inefficiency of life that would certainly be a shame to lose.

 

Works Referenced

Denning, Peter J., and Craig H. Martell. Great Principles of Computing (Cambridge, MA: MIT Press, 2015).

Evans, David. Introduction to Computing: Explorations in Language, Logic, and Machines. August 19, 2011 edition. CreateSpace Independent Publishing Platform, Creative Commons Open Access: http://computingbook.org/.

“PEP 20 — The Zen of Python.” Python.org. Accessed October 27, 2016. https://www.python.org/dev/peps/pep-0020/.

Wing, Jeannette. “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

———.  “Jeannette M. Wing – Computational Thinking and Thinking About Computing.” YouTube video, 1:04:58. Posted by ThelHMC. October 30, 2009.

 

Python Has Much in Common with Natural Languages – Jieshu Wang

I have completed approximately 40% of the Python course on Codecademy. It’s not much, but it is enough to prompt me to look back on the ideas we have learned so far, including those about linguistics, distributed cognition, information theory, and the computing principles mentioned in this week’s reading.

Like the natural languages we discussed in earlier weeks, programming languages have something like the tripartite parallel architecture mentioned in Ray Jackendoff’s Foundations of Language[i]. Programming languages are made of basic elements of meaning called primitives[ii]. Primitives are predesigned symbols that mean things or do things. For example, strings and variables are symbols that mean things; we can assign meanings to them or change their meanings later. “True” and “False” are Booleans, a kind of primitive that represents a truth value[ii]. “print” is a primitive procedure[ii]: it displays the strings after it on the screen. “import” pulls modules or individual functions into the current context. “%” in a calculation means the modulus operation, while in a string it is a placeholder whose value is supplied immediately after the string.
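Here is a short sketch collecting those primitives in one place (the variable names are my own):

import math                     # "import" pulls a module into the current context

is_even = True                  # a Boolean represents a truth value
name = 'Jieshu'                 # a string: a symbol we assign meaning to

print(10 % 3)                   # "%" between numbers is the modulus: prints 1
print('Hello, %s!' % name)      # "%" in a string is a placeholder: prints Hello, Jieshu!
print(math.sqrt(16))            # using the imported module: prints 4.0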

Likewise, primitives are organized by syntax. For example, the equals sign is used to assign a value, as in “spam = True”. Triple quotation marks are used to add block comments (technically they create multi-line strings). “else” should come after “if”. A function definition header must end with a colon. Parentheses have to come in pairs. However, the syntax of a programming language is much stricter than that of a natural language. When you are speaking a natural language, you don’t have to be precisely grammatical in order to be understood; but if you omit just one colon after an “if” statement, your entire section of code can’t be interpreted by Python. Thanks to the programs running behind the online Python testers, we can easily identify where the errors are located.
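A small illustration of how unforgiving that syntax is (my own example):

x = 5

# Missing colon: Python refuses to run the whole block.
#   if x > 3
#       print('big')
# SyntaxError: invalid syntax   <- roughly what an online tester reports

# With the colon restored, the same two lines work fine.
if x > 3:
    print('big')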

[Screenshot: a Python online tester identifying mistakes]

Even so, programming languages share the property of arbitrariness with natural languages, as Prof. Irvine mentioned in this week’s Leading by Design course. That’s because you can write so many different versions of code to achieve the same goal.

In this week’s reading, one thing that surprised me is that there are so many problems computers can’t solve efficiently. For example, “the only algorithms guaranteed to solve exponentially hard problems are enumeration methods[iii]”, as Peter Denning notes in Great Principles of Computing. Because the time needed to enumerate an exponentially hard problem is too long, we have to use heuristic methods to approximate the best solutions. In other words, there probably exist better solutions than those given by computers. Maybe quantum computers will solve some of these problems in the future.
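To make the enumeration point concrete, here is a tiny sketch of my own (not Denning’s) of a brute-force subset-sum search; it checks every subset, so the work roughly doubles with each extra item.

from itertools import combinations

# Brute-force subset sum: does any subset of `items` add up to `target`?
# With n items there are 2**n subsets, so enumeration grows exponentially.
def subset_sum(items, target):
    for size in range(len(items) + 1):
        for combo in combinations(items, size):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum([3, 9, 8, 4, 5, 7], 15))    # prints (8, 7)
print(2 ** 20, 'subsets to check for just 20 items')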

Questions

  1. Should a programmer memorize all the syntax rules in order to write a program?
  2. Jeannette Wing emphasized the importance of computational thinking. She said we should add computational thinking to every child’s analytical ability[iv]. She also explained what computational thinking is. I was wondering: how does one build computational thinking?

References

[i] Jackendoff, Ray. 2002. Foundations of Language: Brain, Meaning, Grammar, Evolution. OUP Oxford.

[ii] Evans, David. 2011. Introduction to Computing: Explorations in Language, Logic, and Machines. United States: CreateSpace Independent Publishing Platform.

[iii] Denning, Peter J., and Craig H. Martell. 2015. Great Principles of Computing. Cambridge, Massachusetts: The MIT Press.

[iv] Wing, Jeannette. 2006. “Computational Thinking.” Communications of the ACM 49 (3): 33–35.

We’re all paranoid androids (Jameson)

After “knowing” it on a superficial, theoretical level since the beginning of the course, I think I now understand on a deeper level what “computing” really means. I knew that computing existed before our modern conception of “computers” (what are referred to as “automatic computers”), but these readings fleshed out the idea further and illustrated that what we think of as “computers” (ACs) are really just bundles of calculations and processes that humans could theoretically do themselves (albeit at a much, much slower pace). Walking through the Python tutorial and manipulating the “inputs” for the code, I could see that the program was, on a basic level, running calculations. It could solve math problems. Moving up a conceptual level, it could tell the date and time. Moving up even more, it could synthesize information from different lines or variables. It was performing calculations and initiating processes that humans are capable of, just automated.
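Those levels can be lined up in a few lines of Python (a sketch of my own, with made-up variable names):

from datetime import datetime

# Level 1: a plain calculation a human could do slowly by hand.
total = 17 * 24 + 3

# Level 2: asking for the current date and time.
now = datetime.now()

# Level 3: synthesizing different variables into one piece of information.
print('As of %s, the total is %d.' % (now, total))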

Another concept that became much clearer this week was the idea of binary and how it can be used in computing to create commands. I could understand the concept of binary as a language on its own, and separately I could understand the idea of computer programming code, but I didn’t understand how the two worked together and talked to each other. The binary tree was particularly helpful in illustrating how binary can be used to send messages or operations using nothing but “yes” and “no.” I could also see how a particular value or operation, reached through a series of “yes”es and “no”s, could be assigned a label or signifier. For example, the value reached by “no,” “yes,” “yes,” “no” in a particular tree is distinct from all other possible values in that tree. It has its own distinct signified, and the series of “yes”es and “no”s is, in a way, its signifier.
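Here is a toy Python sketch of that idea (the tree and the letters are my own invented example): each yes or no answer picks a branch, and the whole path of answers acts like a signifier for the value it reaches.

# A tiny binary tree: at each level, "no" takes the first branch, "yes" the second.
tree = ((('A', 'B'), ('C', 'D')),
        (('E', 'F'), ('G', 'H')))

def decode(path):
    """Follow a list of 'no'/'yes' answers down the tree to a single value."""
    node = tree
    for answer in path:
        node = node[1] if answer == 'yes' else node[0]
    return node

print(decode(['no', 'yes', 'yes']))   # the path no-yes-yes reaches 'D'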

One final thought I had was regarding the definition of “language” as used in Introduction to Computing. According to David Evans, a language “is a set of surface forms and meanings, and a mapping between the surface forms and their associated meanings.” [1] Compared to our understanding of language and meaning-making as discussed in class so far, this seems more akin to de Saussure’s model of semiotics than to Peirce’s triadic model. Evans’ “surface forms” would be the signifier, while the “associated meanings” would be the signified. The “mapping between” the two is similar to the idea of a black box, but it does not specify an essential interpretant. This seems to be a more binary, if you will, way of thinking about language.

 

References

[1] Evans, David. Introduction to Computing: Explorations in Language, Logic, and Machines. United States, David Evans, 2011.

Coding and Computing

With my background in operations management, it didn’t take much for Jeannette Wing to convince me of how pervasive “computational thinking” is within my field. For example, Wing’s descriptions of computational thinking below could just as accurately describe operations management:

  • Computational thinking is thinking in terms of prevention, protection, and recovery from worst-case scenarios through redundancy, damage containment, and error correction.
  • Computational thinking is using heuristic reasoning to discover a solution. It is planning, learning, and scheduling in the presence of uncertainty.
  • Computational thinking involves solving problems, designing systems, and understanding human behavior… (Wing 34)

Of course, these excerpts describe very broad concepts. On a practical level, there are dozens of operation platforms that can be tailored to meet individual organizations’ needs by performing the operations described in these concepts.  That said, even though the companies I worked for used operation platforms, the tool we most often used was Excel.

Working with the Python tutorial reminded me of using Excel, except with a higher degree of tailorability and functionality. In Excel, the standard grid of lettered columns and numbered rows allows users to define cells in much the same way that they can define variables in Python. Furthermore, Excel is equipped with a broad range of functions that can be used for anything from organizing data sets to generating solutions to complex algorithms. In Python, it seems that you can program even more functions to change the nature of data and generate abstractions based on complex calculations. Based on this experience, it seems that while Excel offers various aids, such as auto-filled functions, that help users program computing abstractions, it is limited by the spatial constraints of the grid on which it operates. Python, on the other hand, has no similar restrictions, giving the user a much greater degree of control and customization. The only catch, in my mind, is that users have to learn and somehow remember the coding language on which Python functions, with all of its syntax rules.
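A small sketch of the parallel (my own example): where Excel would put values in cells like A1 and B1 and combine them with a formula such as =A1*B1, Python uses named variables and a user-defined function.

# In Excel: A1 = 120, B1 = 0.08, C1 = =A1*B1
# In Python, the "cells" are named variables and the formula is a function.
price = 120
tax_rate = 0.08

def tax(amount, rate):
    """A reusable 'formula', not tied to any grid position."""
    return amount * rate

print(tax(price, tax_rate))   # 9.6
print(tax(55, 0.1))           # the same function applied to new inputs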

While that task seems somewhat daunting to me, one thing I did find more user-friendly about Python was the responsiveness of its compiler. Initially, when reading about compilers and interpreters in Great Principles of Computing (Denning & Martell 92), I had trouble understanding the difference between the two. Using Python, however, I was able to see how a compiler translates code once the user has completed it, rather than continuously translating it the way an interpreter does. Unlike Excel, the Python compiler was relatively specific in highlighting the exact error within the code that I had written. Excel, on the other hand, will either compute the procedure you’ve designed or simply tell you that it didn’t work, forcing the user to retrace his steps in order to find the error. That said, and perhaps Python also has this option, Excel allows the user to toggle between something like an interpreter and a compiler. When building a system in Excel with multiple calculations over large amounts of data, you want to switch to the compiler-like mode because, as explained in Introduction to Computing, an interpreter, which is constantly translating, will execute functions more slowly than a compiler, which translates after an entire set of functions has been programmed (Evans 38-9).

On the whole, between building on my past experience with Excel and the readings from this week, I thought the Python tutorial was a helpful way of understanding computing concepts. However, one concept whose direct connection to coding I’m not sure I understood is the stack. I think I understand the concept in general as described in Introduction to Computing (Evans 24-5), but I don’t see how it works in code. I would be interested in understanding this concept further by discussing it in class!
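One small way the connection shows up in code (my own sketch, not from Evans): a Python list can act as a stack, where the last item pushed on is the first one popped off, which is also how an interpreter keeps track of nested function calls.

# A Python list used as a stack: append pushes, pop removes the most recent item.
stack = []
stack.append('first call')
stack.append('second call')
stack.append('third call')

print(stack.pop())   # 'third call'  - last in, first out
print(stack.pop())   # 'second call'
print(stack)         # ['first call'] is still waiting underneath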

Peter J. Denning and Craig H. Martell, Great Principles of Computing. Cambridge, Massachusetts: The MIT Press, 2015.

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition. CreateSpace Independent Publishing Platform; Creative Commons Open Access: http://computingbook.org/.

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.