Computational thinking and the countless possibilities of implementation – Wency

It might not be a bad idea to start the discussion with an example we learnt this week. We want to obtain a word from whoever responds to our prompt, i.e., from user input. We want to make it neat by converting all of the characters in the word to lower case. We also want to do something with the word: move its first character to the end, and then append some other self-defined characters to the end of the new word.

Now we have a clear need and a clarified task. To accomplish the task, we break it down into several procedures with the help of our natural language (a minimal Python sketch of these steps follows the list):

  1. Define a string variable and assign a value to it.
  2. Define a variable and assign the user input to it.
  3. If the length of the user input is greater than zero and all of its characters are alphabetic, perform the steps below:
    • convert all the characters in the word to lower case
    • define a variable and assign the first character of the word to it
    • append that first character and the value of the string variable defined in step 1 to the word
    • cut the original first character off the front of the new string
  4. If the length of the user input equals zero, output an empty result.
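
Translated into Python, the steps might look roughly like the sketch below. The suffix string and the variable names are my own illustrative assumptions; the logic simply follows steps 1–4.

```python
# A minimal sketch of steps 1-4 (variable names and the suffix are assumptions)
suffix = "ay"                          # step 1: a self-defined string variable
word = input("Enter a word: ")         # step 2: capture the user input

if len(word) > 0 and word.isalpha():   # step 3: non-empty and letters only
    word = word.lower()                # convert every character to lower case
    first = word[0]                    # hold the first character
    new_word = word + first + suffix   # append the first character and the suffix
    new_word = new_word[1:]            # cut the original first character off the front
    print(new_word)
elif len(word) == 0:                   # step 4: empty input yields an empty output
    print("")
```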

The steps above, i.e., initiating a need, defining the task, analyzing it, and breaking it down into several logical steps or functions, reflect how we naturally work. It is human nature to accomplish a complex task step by step, with each step relying on a specific function that is relatively independent from the other parts, and, interestingly, we are sometimes unaware of doing so. The way human beings solve problems is thus a form of computational thinking, which, as Wing put it, is a way that humans, not computers, think. Within just a few steps we are able to conceptualize a large chunk of the task at multiple levels of abstraction (Wing, 2006, p. 35).

At this level, however, our computational thinking remains at a mathematical level; to connect our daily tasks with a computer, we need a further transformation. As Evans notes, natural language has several inevitable problems, including complexity, ambiguity, irregularity, uneconomic expression, and limited means of abstraction (Evans, 2011, p. 36). Therefore, just as human society has its own grammar for natural language, there is a set of syntax we must follow to interact with computers (Irvine, 2018, p. 7). In this example, we use the syntax of Python to transfer our natural language into a first level of computer-understandable language. We implement Boolean logic and the abstraction of different variables in the code stored in script.py; the interpreter later converts that language into binary values, i.e., the 0s and 1s that machines can manipulate (Evans, 2011, p. 38).
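
As a rough illustration of that downward translation, Python's built-in dis module can display the lower-level instructions the interpreter produces from our Boolean condition. The bytecode it prints is only an intermediate step on the way to the machine's 0s and 1s, not the binary itself; the function below simply reuses the condition from step 3.

```python
import dis

def check(word):
    # the Boolean condition from step 3: non-empty AND alphabetic
    return len(word) > 0 and word.isalpha()

dis.dis(check)  # prints the bytecode instructions the interpreter actually executes
```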

Basically speaking, 0s and 1s stand for two states, which embodies the concept of digitization: analog values are divided into discrete regions based on standards or boundaries. To implement just two states physically, we could use many different devices, such as a fluid computer or the Jacquard loom. However, because electrical signals travel at close to the speed of light (about 3*10^8 m/s) and only need to travel nanometers between transistors, the two states of a switch, ON and OFF, can flip millions of times per second, which makes information processing much faster; using electricity to implement binary operations is therefore the dominant approach in modern computation. Transistors are combined into logic gates, including AND, OR, NOT, and XOR; the logic gates are combined into more complicated modules, including arithmetic units such as adders and comparators; and those modules, such as memory, the control unit, and the arithmetic logic unit, are in turn combined and made to interact with each other under control so that they work efficiently (Irvine, 2018, p. 5). We thus use instructions, encoded as binary sequences, to perform tasks (Hillis, 1999, pp. 20-27). At this point, returning to Wing's characterization of computational thinking, it is easier to understand the claim that computational thinking complements and combines mathematical and engineering thinking (Wing, 2006, p. 35). Computation, although understood by many scientists as mathematical thinking, is physically limited, and the power of a computing machine is measured by how much information it can process and how fast it can process it (Evans, 2011, p. 3).
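
To make the gate-to-module step concrete, here is a small sketch of my own (in Python rather than in hardware) of how AND, OR, and XOR gates can be wired into a half adder, a full adder, and then a multi-bit adder operating on binary sequences.

```python
# Gate-level sketch: how AND/OR/XOR combine into adders (illustrative only)
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two bits; return (sum, carry)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry; return (sum, carry_out)."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def add_bits(x_bits, y_bits):
    """Add two binary sequences, least significant bit first."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

print(add_bits([1, 0, 1, 0], [1, 1, 0, 0]))  # 5 + 3 = 8 -> [0, 0, 0, 1, 0]
```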

Of course, the example I provided above is just one case from our daily lives. In fact, as Evans mentions, we can use sequences of bits to represent many kinds of data, including numbers, discrete values, text, and rich data (e.g., a picture divided into discrete squares known as pixels), and because there is a limit to the detail human senses can distinguish, such discrete approximations are good enough. Whatever can be broken down into smaller chunks and procedures in daily life and laid out as logical steps can therefore be transferred into machine-understandable language, interpreted into binary sequences, and implemented with electricity and transistors (Evans, 2011, p. 11).
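
A brief illustration of that point: in Python we can inspect how a number, a piece of text, and a pixel-like value all reduce to bit patterns (the specific values here are arbitrary examples of mine).

```python
# Illustrative only: different kinds of data as sequences of bits
number = 42
print(bin(number))                              # 0b101010 -- a number as bits

text = "word"
print([format(ord(ch), '08b') for ch in text])  # each character as an 8-bit pattern

pixel = (255, 128, 0)                           # one pixel as three 8-bit color values
print([format(v, '08b') for v in pixel])
```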

Looking back on the steps we used to accomplish a complicated task, it is not difficult to recognize how much symbolic meaning and representation we assign at each level of the task, and how we implement such symbolic representation to enable our interaction with computers. We are offloading and distributing human agency and cognition into software: not only are we able to direct computers and enable their automatic information processing, but we are also thinking computationally ourselves, with abstraction, recursion, and modularization, in every task we confront. We live in a digital-analog continuum for implementing many kinds of design concepts in software designed to automate symbol processing. Computation is the outcome of cumulative human symbolic thought for representing abstract patterns and processes, and it performs a metafunction: it does not merely represent meanings, but can also be used to perform actions on other symbols (Irvine, 2018).

References:

  1. Evans, D. (2011). Introduction to Computing: Explorations in Language, Logic, and Machines.
  2. Hillis, W. D. (1999). The Pattern on the Stone: The Simple Ideas That Make Computers Work.
  3. Irvine, M. (2018). Introductory video.
  4. Irvine, M. (2018). An Introduction to Computational Concepts.
  5. Wing, J. (2006). Computational Thinking. Communications of the ACM, 49(3), 33-35.