Yanjun Liu – Final Project: Analysis of Computation from Coding Procedure

Analysis of computation from coding procedure

       What is computation? According to the Merriam-Webster Dictionary (https://www.merriam-webster.com/dictionary/computation), it is 1) the act or action of computing: CALCULATION, 2) the use or operation of a computer, 3) a system of reckoning, or 4) an amount computed. In short, computation is a procedure in which numerical values are changed according to systematic rules, and it can be carried out by people or by digital devices. Alan Turing (1912-1954) regarded computation as “the evaluation of mathematical functions” (Denning & Martell, 2015). Looking back at human history, from the distribution of weapons and food among clans in the Stone Age to the complex analysis of data and the writing of code on computers today, computation has been present at every stage. Böhm and Jacopini (1966) showed that, in general, any computation can be built from three kinds of steps: 1) perform instructions in strict sequential order (sequencing), 2) make a choice between two alternative calculations based on the outcome, true or false, of a test (choice), and 3) repeat a calculation many times until a test says to stop (iteration). With the development of computing devices and information technology, the nature of computation remains the same, but its interpretation is changing and becoming more diverse. What does computation mean nowadays? Why do we have different ways of computing? How do we carry them out? What exactly happens when we use a computing device? In this article, these questions will be answered by analyzing the procedure of coding in the Python language. The main body is divided into two parts: 1) definitions of key concepts and 2) analysis of the coding procedure.
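       To make these three building blocks concrete, the short Python sketch below (my own illustration, not part of the survey program discussed later) uses all three of them:

# Sequencing: the instructions run one after another, in strict order.
numbers = [3, 1, 4, 1, 5]
total = 0

# Iteration: a calculation is repeated until a test (reaching the end of the list) says to stop.
for n in numbers:
    # Choice: one of two alternative calculations is picked based on a true/false test.
    if n % 2 == 0:
        total = total + n * 2   # even values are doubled
    else:
        total = total + n       # odd values are added as they are

print("Result:", total)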

       Code, as a noun, means “a system of symbols (such as letters or numbers) used to represent assigned and often secret meanings”; as a transitive verb it means “to put in or into the form or symbols of a code”, and as an intransitive verb “to create or edit computer code” (https://www.merriam-webster.com/dictionary/coding). Since code consists of symbols, or collections of symbols, we need to discuss the term “symbol” and its main carrier, language, before we turn to the coding procedure.

       The study of symbols belongs to semiotics, the “theory of signs” (Chandler, 201), in which a sign is defined as “something which stands for something. All meaningful phenomena (including words and images) are signs” (p.8). That means that when we say a word (in any language), make a gesture, or see an object, beyond its physical existence there are further meanings behind it, meanings that originate from different cultures and can be interpreted differently.

       Language is the main carrier of symbols, and human beings are the only species that can use it, thanks to our cognitive system. From a linguistic perspective, language consists of words, rules, and sentences (Pinker, 1999). When we speak of language, we usually refer to “natural language”, a term that means “any human language acquired ‘naturally’ by being born into a language community”, or, in other words, “our mother tongue” (Irvine, 2020). When we are coding, we use a programming language instead of natural language. According to Denning and Martell (2015), a programming language is “an artificial language with its own rules of syntax, used for expressing programs” (p.84); it is a “metalanguage”, a kind of language developed for logic, mathematics, and computer programming (Irvine, 2020).

       Why does computing on a computer require a different language? There are two reasons. First, the computer, as a computing machine, cannot understand natural language. It can only read binary code, which represents everything with 0s and 1s. A programming language is a language that can be translated into binary code, thus enabling the computer system to understand the instructions it receives as input and then execute them by running programs. The basic binary units are the “bit” and the “byte”; a byte consists of eight bits and is also the basic unit of information processing. A program is “a set of instructions arranged in a pattern that causes the desired function to be calculated” (Denning & Martell, 2015). We use computers in our daily lives because of their ability to run programs, such as reading an e-book in a PDF viewer, holding meetings over Zoom, or editing videos in Adobe Premiere. Second, compared to natural language, a programming language is far more precise. Unlike human beings, a computer cannot understand ambiguous information; it can only execute functions given clear instructions. We understand things and the meanings of symbols by being immersed in the related cultural or social environment, so we grasp a word even when it is used in very different situations, whereas a computer requires a clear definition of everything. You cannot tell a computer something as simple as “let’s go!” and expect it to “go with you”. Your friend may understand it because you share an understanding of that expression, but for a computer, you need to tell it what that action is and where it leads. The necessity of this precision will be demonstrated further in the analysis of the coding procedure below.
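       As a hypothetical illustration (the function name and the destination below are invented for this example, not taken from any real program), the vague instruction “let’s go!” would have to be spelled out for a computer roughly like this in Python:

# Every term has to be defined: what "go" means and where it leads.
def go(destination):
    # Here "going" is defined, very simply, as printing a movement message.
    print("Moving to " + destination)

# The destination must be stated explicitly; "let's go!" alone is not enough.
go("the library")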

       In my coding practice, I used Visual Studio Code as the code editor and Python as the programming language. Visual Studio Code is a source-code editor made by Microsoft; it supports many programming languages and provides features such as syntax highlighting, automated debugging, and building code. It works as an “IDE”, an integrated development environment that provides features to speed up code development. The programming language I used was Python, an interpreted, general-purpose language that is widely used in many different areas and has readable syntax rules. It has been widely adopted because of its understandability, which helps programmers write code and programs with clear logical expression. Many other programming languages exist for different purposes, such as C++, Java, JavaScript, and Ruby; these languages usually have different syntax rules and different ways of defining functions.

       The code I wrote is really simple. The content is a little grading survey about the past online semester. The purpose is to get feedback from students about their general feeling toward the past experience and, if they gave a low grade, their suggestions for the coming semester, which will still be online. So, in my program design, I needed to present three basic parts: 1) a greeting and introduction to the survey, 2) the actual questions, and 3) interactive feedback. Here is the screenshot of my source code (P1):

                                             (P1: screenshot of the source code)
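       Since the screenshot itself cannot be reproduced here, the following is a rough sketch of what those lines of Python might look like, reconstructed from the description in this article; the exact wording of the prompts is illustrative rather than the original text:

# Part 1: greeting and introduction about the survey
print("Hello! Welcome to the grading survey for the past online semester.")
print("It only takes a minute to finish.")

ready = input("Are you ready? ")

if ready == "yes":
    # Part 2: the grading question, from 1 (lowest) to 5 (highest)
    grade = int(input("Please grade your online learning experience for this semester from 1 to 5: "))
    if grade > 4:
        # Part 3a: positive feedback, then the session ends
        print("Great! Thank you for your time. Goodbye!")
    else:
        # Part 3b: ask for suggestions by email, then the session ends
        print("Sorry to hear that. Please email your suggestions to the instructor. Goodbye!")
else:
    # Any other answer simply ends the session
    print("No problem. Come back whenever you are ready. Goodbye!")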

       As you can see, I wrote 17 lines of code, which is quite short but basically covers all of the requirements mentioned above. Different colors indicate different structural parts and functions. For example, the orange text inside the parentheses is the text that will be shown on the screen; the green part is a feature of programming languages called a “comment”, which is never presented to users and is visible only to those who have access to the source code. Comments are designed to help programmers better understand what exactly each part of the code does, since a big program is usually divided into parts and written by many programmers. The other colors indicate Python’s own rules and statements, which I need to know beforehand; otherwise, instead of the intended results, errors come out because the computer cannot recognize what I mean. The errors caused by the wrong use of a programming language’s statements are called “syntax errors”, which happen when the written code does not follow the expected rules. A small illustration of comments and syntax rules is given below, followed by the running results of my code in three situations.
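       As a quick illustration (this snippet is my own, not taken from the survey code), a comment starts with the # symbol and is simply ignored by the interpreter, while breaking a syntax rule stops the program from running at all:

# This line is a comment: the interpreter ignores it, and a user of the program never sees it.
answer = "yes"

# Correct syntax: the "if" statement ends with a colon and its body is indented.
if answer == "yes":
    print("Ready to start.")

# Writing the same line without the colon, e.g.  if answer == "yes"
# would violate Python's grammar, and running the file would produce a SyntaxError.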

Situation 1

       In Situation 1, when asked “Are you ready?”, the respondent typed “No” or any other answer that is not “yes”. The respondent is not prepared (or simply does not want to cooperate), so the program ends unless the respondent goes back to the greeting part (for example, by clicking on the link once again, if this survey were actually published somewhere online).

Situation 2

       In Situation 2, the respondent typed “yes” and officially started the grading by following the instruction “grade your online learning experience for this semester from 1 to 5”, where 1 is the lowest and 5 is the highest; this corresponds to the design in the source code, the two “if” statements shown in P1. Here, the respondent typed “5”, the highest grade, meaning “very satisfied with the past online learning experience”; the program then offered its feedback and ended the session.

Situation 3 

       In Situation 3, the respondent entered a negative result, marking the experience as very bad and giving only 1 point as the grade. The program, as set in the “else” part in P1, offered feedback and asked the respondent to email her suggestions to an email address.
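       Assuming the sketch of the source code given earlier, a run of Situation 2, for example, would look roughly like this on the screen (the wording is illustrative):

Are you ready? yes
Please grade your online learning experience for this semester from 1 to 5: 5
Great! Thank you for your time. Goodbye!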

       In all, the basic logic of this little program is as below:

Greeting + introduction  →  “Are you ready?”
        → “yes”           →  grade > 4   →  session end
                          →  grade <= 4  →  session end + suggestion feedback
        → other answers   →  session end

      In every application that we use in our daily lives, similar logic is expressed in the source code using statements like “if…else” or “def” (definition). I would like to draw a connection between programming and teaching: both require you to define things and to follow certain rules, and the target audience, whether a computer or a child, learns from your statements, tries to understand the meanings behind them, and acts according to your instructions. The programming procedure clearly shows why programming languages must be precise; it is also a procedure in which programmers translate their ideas into another language, a programming language, just as I did.
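       As a small example (the function name below is my own, not from the survey code), a “def” statement lets the programmer state a rule once, precisely, and then reuse it wherever an “if…else” decision is needed:

# Define once, and precisely, what "satisfied" means for this survey.
def is_satisfied(grade):
    return grade > 4

# The definition can then be reused in any if...else decision.
if is_satisfied(5):
    print("Thank you for the positive feedback!")
else:
    print("Please email your suggestions.")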

       But what exactly happens inside the computer while I am programming? First, we need some basic knowledge about the hardware, the computer itself. Two main hardware devices are at work when I am running Visual Studio Code and writing code. The CPU (central processing unit) is a hardware device that reads instructions from a program and executes them, one at a time, in the order prescribed by the program. RAM (random access memory) is a hardware device that holds data values in locations that can be read or written by the CPU; it is called “random access” because it can access any location in the same amount of time (Denning & Martell, 2015, p.65). When my computer is on and I am working on it, it loads programs and data into RAM. Besides the hardware, the interpreter is also at work: since Python is an interpreted language, the Python interpreter (which Visual Studio Code invokes when I run my code) processes my source code line by line each time the program runs. Basically, there are three main ways to translate source code into machine code: compile it, interpret it, or a combination of both (Davis, 2019).
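       As a rough illustration of this translation (the function below is invented for the example, and what is shown is CPython’s intermediate bytecode rather than actual machine code), Python’s built-in dis module can display the lower-level instructions that the interpreter produces from a short piece of source code:

import dis

# One short piece of source code...
def add_one(x):
    return x + 1

# ...is translated by the interpreter into lower-level bytecode instructions,
# which are then executed one at a time, much as the CPU reads machine
# instructions from RAM and executes them in the order the program prescribes.
dis.dis(add_one)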

       Also, when I am writing code in a program, I am staring at a screen, typing on a keyboard, and clicking buttons with my mouse. These devices are called peripherals, and they can be regarded as extensions of the human body.

       One last thing that I want to introduce is computational thinking, which is also something I learned from this small programming experience. Computational thinking is a method, or rather a model, that we can apply to solving big and complicated problems. As Wing (2006) pointed out, “computational thinking is using abstraction and decomposition when attacking a large complex task or designing a large complex system.” By applying computational thinking, we redefine, decompose, create new concepts, and integrate small solutions into a complete answer. This way of thinking can be practiced simply by trying programming, as I did: keeping a goal in mind and then separating it into different parts to be explained and solved.
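       To make decomposition concrete, here is a minimal sketch (the function names are my own) of how the survey goal could be broken into small parts and then integrated back into a whole:

# Decomposition: each small function solves one piece of the larger problem.
def greet():
    print("Welcome to the online semester grading survey.")

def collect_grade():
    return int(input("Grade this semester from 1 to 5: "))

def respond(grade):
    if grade > 4:
        print("Thank you for the positive feedback!")
    else:
        print("Please email your suggestions.")

# Integration: the small solutions are combined into a complete answer.
greet()
respond(collect_grade())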

       Computer-related knowledge like programming often seems confusing, because people usually focus on its practicality instead of the reasons behind the usage and the actions; it is something hidden in a black box, but definitely not something mythical. Programming has enabled me to look at computers and computation from a brand-new perspective.

 

References

Böhm, C., & Jacopini, G. (1966). Flow diagrams, Turing machines and languages with only two formation rules. Communications of the ACM, 9(5), 366-371. doi:10.1145/355592.365646

Coding. (n.d.). In Merriam-Webster.com dictionary. Retrieved December 13, 2020, from https://www.merriam-webster.com/dictionary/coding

Computation. (n.d.). In Merriam-Webster.com dictionary. Retrieved December 13, 2020, from https://www.merriam-webster.com/dictionary/computation

Davis, A. (2019, July 22). The fundamentals of programming [Video]. Programming Foundations: Fundamentals. LinkedIn Learning. Retrieved December 13, 2020, from https://www.linkedin.com/learning/programming-foundations-fundamentals-3/the-fundamentals-of-programming?resume=false

Denning, P., & Martell, C. (2015). Great principles of computing. The MIT Press.

Irvine, M. (2020). Linguistics, Language, and Symbolic Cognition: Key Concepts. Retrieved December 13, 2020, from https://drive.google.com/file/d/1DIN2gFzjugV8J7iCWyqTLY4zxWBzJqna/view

Pinker, S. (1999). Words and rules: The ingredients of language (1st ed.). Basic Books.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33-35. doi:10.1145/1118178.1118215