I have worked through roughly 40% of the Python course on Codecademy. It’s not much, but it is enough to prompt me to reflect on the ideas we have learned so far, including those about linguistics, distributed cognition, information theory, and the computing principles covered in this week’s reading.
Like the natural languages we discussed in earlier weeks, programming languages show something like the tripartite parallel architecture Ray Jackendoff describes in Foundations of Language[i]. Programming languages are built from basic elements of meaning called primitives[ii]. Primitives are predesigned symbols that either mean things or do things. For example, variables are symbols that mean things: we can assign values to them, such as strings, and change those values later. “True” and “False” are Booleans, a kind of primitive that represents a truth value[ii]. “print” is a primitive procedure[ii]: it displays the value that follows it on the screen. “import” pulls modules or individual functions into the current editing context. Between numbers, “%” computes the modulus (the remainder of a division), while after a string it marks a placeholder that is filled by the value supplied immediately after the string.
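To make these primitives concrete, here is a minimal sketch in Python 3 (the names spam, count, and greeting are my own, invented for illustration):

```python
import math  # "import" pulls a module into the current editing context

spam = True                  # a Boolean primitive representing a truth value
count = 17 % 5               # between numbers, "%" computes the modulus: 17 mod 5 is 2
greeting = "pi is about %.2f" % math.pi  # after a string, "%" fills a placeholder

print(greeting)              # "print" displays a value on the screen -> pi is about 3.14
print(spam, count)           # -> True 2
```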
Likewise, primitives are organized by syntax. For example, the equals sign assigns a value, as in “spam = True”. Triple quotation marks delimit multi-line strings, which are commonly used as block comments and docstrings. An “else” must come after an “if”. A function definition header must end with a colon. Parentheses have to come in pairs. However, the syntax of a programming language is much stricter than that of a natural language. When you speak a natural language, you don’t have to be precisely grammatical in order to be understood. But if you drop just one colon after an “if” statement, Python refuses to interpret your entire section of code. Thanks to the programs running behind the online Python testers, we can easily identify where the errors are located.
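The sketch below (the function describe is hypothetical, written only for this example) puts those rules side by side; every block header ends in a colon, and removing any one of them keeps the whole file from running:

```python
spam = True  # "=" assigns a value to a name

def describe(value):  # a function definition header must end with a colon
    """Triple-quoted strings like this one are often used as block comments."""
    if value:         # the "if" header needs its colon too
        return "truthy"
    else:             # an "else" must follow an "if"
        return "falsy"

print(describe(spam))  # -> truthy

# Deleting the colon after "if value" makes Python raise a SyntaxError
# (the exact wording varies by version) pointing at the offending line,
# which is exactly what the online testers display.
```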

Figure: an online Python tester identifying mistakes.
Even so, programming languages share the property of arbitrariness with natural languages, as Prof. Irvine mentioned in this week’s Leading by Design course: you can write many different versions of code that all achieve the same goal.
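As a small illustration of that arbitrariness, the three versions below (a toy example of my own) compute the same sum of the integers from 1 to 100 in entirely different ways:

```python
# Version 1: an explicit loop
total = 0
for n in range(1, 101):
    total += n

# Version 2: the built-in sum over a range
total_builtin = sum(range(1, 101))

# Version 3: Gauss's closed-form formula, n * (n + 1) / 2
total_formula = 100 * 101 // 2

print(total, total_builtin, total_formula)  # -> 5050 5050 5050
```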
In this week’s reading, one thing that surprised me is how many problems computers cannot solve in any practical amount of time. For example, “the only algorithms guaranteed to solve exponentially hard problems are enumeration methods[iii]”, as Peter Denning notes in Great Principles of Computing. Because enumerating an exponentially hard problem takes far too long, we have to use heuristic methods to approximate the best solutions. In other words, there may exist better solutions than the ones computers give us. Perhaps quantum computers will solve some of these problems in the future.
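To see the contrast Denning describes, here is a toy knapsack-style sketch (the weights and the capacity are invented for illustration): enumerating every subset is guaranteed to find the best load, but its cost doubles with each added item, while a greedy heuristic answers instantly yet can miss the optimum.

```python
from itertools import combinations

items = [12, 11, 9, 5]  # hypothetical weights
capacity = 20           # hypothetical limit

# Enumeration: check all 2**n subsets -- guaranteed optimal, but the
# running time doubles every time one more item is added.
best = max(
    (combo
     for r in range(len(items) + 1)
     for combo in combinations(items, r)
     if sum(combo) <= capacity),
    key=sum,
)

# Heuristic: greedily pack the largest items that still fit -- fast,
# but only an approximation of the best solution.
load, remaining = [], capacity
for weight in sorted(items, reverse=True):
    if weight <= remaining:
        load.append(weight)
        remaining -= weight

print(sum(best), best)  # -> 20 (11, 9): the true optimum
print(sum(load), load)  # -> 17 [12, 5]: the heuristic falls short
```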
Questions
- Does a programmer have to memorize all the syntax rules in order to write a program?
- Jeannette Wing emphasized the importance of computational thinking, arguing that we should add computational thinking to every child’s analytical ability[iv]. She also explained what computational thinking is, but I am still wondering: how does one build computational thinking?
References
[i] Jackendoff, Ray. 2002. Foundations of Language: Brain, Meaning, Grammar, Evolution. Oxford: Oxford University Press.
[ii] Evans, David. 2011. Introduction to Computing: Explorations in Language, Logic, and Machines. CreateSpace Independent Publishing Platform.
[iii] Denning, Peter J., and Craig H. Martell. 2015. Great Principles of Computing. Cambridge, Massachusetts: The MIT Press.
[iv] Wing, Jeannette. 2006. “Computational Thinking.” Communications of the ACM 49 (3): 33–35.