Through learning Python, I have come to see how a programming language serves as the computer's "natural language": a way to give the computer instructions to complete specific tasks. It is the language between humans and computers; according to David Evans, it is a language humans can understand and machines can execute. Here are some reflections from my learning process:
Programming languages and natural languages share quite a lot of similarities. Words and abbreviations are borrowed from natural language to function in a programming language: for example, "print" presents a result on the screen, and "len()" measures the length of a string. During the first few lessons, where few symbols were involved, I felt I could fully understand what an algorithm was about just by reading the instructions I had written down. Just as in natural language the smallest units of meaning are morphemes, according to David Evans in Introduction to Computing the smallest units in a programming language are primitives. Built from those primitives, a Scheme program is capable of processing expressions and definitions.
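The borrowed words behave like this in Python (a small sketch of my own; the word used here is just an illustration):

```python
# "print" presents a result on the screen; "len" measures a string's length.
word = "computing"
print(word)        # shows the word itself
print(len(word))   # shows 9, the number of characters in "computing"
```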
Programming also has its own grammar system, the algorithm, as described in Denning's Great Principles of Computing. The structure of a program is somewhat recursive; as "Computational Thinking" puts it, "a computation is an information process in which the transitions from one element of the sequence to the next are controlled by a representation." Different symbols are combined with primitives to perform different operations, and I need to be careful and attend to small details to run a program successfully. Certain rules have to be followed: for example, "6.75%" should actually be written as "6.75/100" (6.75 divided by 100, instead of just showing the percent sign), because in Python "%" performs a different function (the modulo operation) than marking a percentage as we normally do in maths. Likewise, after applying "str()" the digits of a number do not change, but its type, and therefore its meaning, has changed.
Based on that, it is interesting to point out that Python demands a great degree of accuracy, and the flexibility of the language is correspondingly limited. The "interaction" between the human mind and the computer is not the same as that between humans. While human brains can understand each other, complete unfinished sentences, autocorrect mistakes, and develop meanings further, when we communicate with a computer we need to be accurate and specific about what we actually mean. The computer will not complete or refine the program itself; any small mistake can make the run fail.
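For instance, a one-letter typo that a human reader would silently autocorrect stops Python entirely (a hypothetical sketch; the variable names are my own):

```python
message = "hello"
try:
    print(mesage)   # typo: "mesage" instead of "message"
except NameError as err:
    # Python does not guess what we meant; it simply reports the failure.
    print("error:", err)
```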
It is a shame that when I reached "String Formatting with %, Part 2" in the Python tutorial, the webpage got stuck whenever I clicked "run", and I could not go any further than that point; otherwise I would have had the chance to explore more aspects of programming.
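Although the tutorial page froze, the "%" string formatting it covers works roughly like this (my own sketch, not the tutorial's code; the name and number are made up):

```python
# Between a string and a tuple of values, "%" fills the placeholders
# in the string: %s for a string, %.2f for a number with two decimals.
name = "Ada"
score = 6.75
print("%s scored %.2f percent" % (name, score))  # Ada scored 6.75 percent
```

This is the same "%" symbol as the modulo operator, given yet another meaning by context, which again shows how strictly the language depends on its rules.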
Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.
Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015, chapters 4, 5, 6.
David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition. CreateSpace Independent Publishing Platform; Creative Commons Open Access: http://computingbook.org/.