I knew the process of making meaning as part of the symbolic species was complex before I started on this week’s adventure. But as I tried to wrap my human brain around all of the processes needed to translate human speak into electrical signals, my mind was blown.
At each stage of the transition from symbols that mean things to symbols that do things, an astounding amount of human symbolic power was needed to create these technologies in the first place. And it’s hard to believe that it all runs quickly, efficiently, and mostly without error, no doubt in large part thanks to Claude Shannon.
Using the efficient software and apps are humans, with our apparently complex, ambiguous, irregular, uneconomic, and limited natural language (thanks for the adjectives, Evans). We make meaning out of our natural language despite and because of its imperfections, but computers can’t make sense of it like we do. They need precise and unambiguous instructions to do their jobs.
One of the interfaces that helps us communicate with the machines is programming language. Python is one. We can read and write Python, make meaning of it; computers can execute it with the help of some other code. Interestingly, I’ve always used the word “understand” rather than “execute” in that last part, but I stopped myself this time because while the machines are processing symbols, they aren’t understanding meaning. They’re executing.
Python is a relatively high-level programming language that was developed to be accessible and readable to humans versed in natural language—the principles “beautiful is better than ugly” and “readability counts” are part of the canon. Yet I find learning Python a bit difficult precisely because it is so close to natural language. I assume that if or for-in statements should do certain things based on my knowledge of English, but as Codecademy and Coursera have taught me, my assumptions are not always correct. I wonder if a more abstract language would be better for me. But I digress.
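Here is a toy sketch of my own (not from either course) of how English intuition misleads: I might read the condition below as “if name is Alice or Bob,” but Python evaluates the bare string on its own, and a non-empty string always counts as true.

```python
name = "Carol"

# Reads like English, but the bare string "Bob" is always truthy,
# so this condition is True no matter what name holds.
if name == "Alice" or "Bob":
    looks_like_a_match = True
else:
    looks_like_a_match = False

# The precise, unambiguous version the machine actually needs:
is_actually_a_match = name == "Alice" or name == "Bob"

print(looks_like_a_match)    # True, misleadingly
print(is_actually_a_match)   # False
```

The machine isn’t wrong here; it is doing exactly what the symbols say, which is the whole point about precision.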
A compiler (or an interpreter, depending on the language) translates the code I’ve written and that I can understand into code the machine can do something with, or at least starts that process. This has usually been the boundary of the black box for me, but I think it’s been pried open this week.
The compiler parses our code against Python’s underlying grammar, a more abstract symbol system that some very smart people created, and translates it toward machine code, which directs instructions to the computer’s hardware in more symbols—binary 1s and 0s—and ends up in electrical signals (I think). The machine, through symbols, is acting on the commands I gave it using another symbol system. And the symbol system I made meaning of translates into a physical action in the form of electrical pulses. The results of my little program are stored in memory until the program wraps up and the results are sent back to me so I can interpret and make meaning of them. (Although, I think there is another layer before machine code for Python, bytecode run by a virtual machine, so it can work with lots of different operating systems, kind of like Java, but I’m really on shaky ground here.)
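That in-between layer can actually be inspected. Assuming the standard CPython interpreter, the standard-library dis module shows the bytecode a tiny function compiles to, the symbols one step below the Python I read and write (the exact instruction names vary by Python version):

```python
import dis

def add(a, b):
    return a + b

# Print the bytecode instructions CPython compiled this function to:
# roughly, load the two arguments, do a binary add, and return.
dis.dis(add)
```

A virtual machine then interprets those bytecode instructions, which is what lets the same compiled files run across operating systems, much like Java’s class files on the JVM.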
With all this complexity and all the work that went into developing these processes, let alone the complex pieces of software and tiny pieces of hardware involved, I probably shouldn’t get too grumpy when Microsoft Word freezes up every once in a while.
I saw the fruits of compilers when using Python, but I think I’m finally starting to grasp how they work thanks to Evans, Denning, and Martell. The P = NP problem and the concept of stacks are also much clearer than they’ve ever been. Recursion in general makes a lot of sense, and Python training has helped to clarify that more, but the idea as described by Evans is still a little fuzzy. And I find myself wondering about the concept of cognitive scaffolding—does the concept have parallels in computing? Both the process of using heuristics to get answers to problems that can’t be feasibly computed (described by Denning and Martell) and regular expressions in programming languages reminded me of cognitive scaffolding, but I imagine this might be an incorrect comparison.
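To make the recursion-and-stacks connection concrete, here is a minimal sketch of my own (not Evans’s example): each recursive call is pushed onto the call stack and waits there until the base case finally sends a value back up.

```python
def factorial(n):
    # Base case: stops the recursion and starts unwinding the stack.
    if n <= 1:
        return 1
    # Recursive case: this frame waits on the call stack
    # until factorial(n - 1) comes back with its answer.
    return n * factorial(n - 1)

print(factorial(5))  # 120, i.e. 5 * 4 * 3 * 2 * 1
```

The stack is doing the remembering for me, which may be the closest computing analogue to scaffolding I can point to.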
I leave this week wondering if computation is the universal language. And I certainly see the value of teaching computational thinking. But there is beauty and adventure in the imprecision and inefficiency of life that would certainly be a shame to lose.
Denning, Peter J., and Craig H. Martell. Great Principles of Computing. Cambridge, MA: MIT Press, 2015.
Evans, David. Introduction to Computing: Explorations in Language, Logic, and Machines. August 19, 2011 edition. CreateSpace Independent Publishing Platform, Creative Commons Open Access: http://computingbook.org/.
“PEP 20 — The Zen of Python.” Python.org. Accessed October 27, 2016. https://www.python.org/dev/peps/pep-0020/.
Wing, Jeannette. “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.
———. “Jeannette M. Wing – Computational Thinking and Thinking About Computing.” YouTube video, 1:04:58. Posted by TheIHMC. October 30, 2009.