Category Archives: Week 8

Coding as a process

“Understanding computing illuminates deep insights and questions into the nature of our minds, our culture, and our universe” – David Evans

In early 2009, when I first became aware of coding, I recall feeling less powerful for knowing so little about it or about how to access the geeky back-end interface, which always looked like a dark web space for super-intelligent people (hackers, maybe) that must require very good knowledge of mathematics and computer science. As time went by, I gained more education and awareness of computing and the development of computer operating systems, and I began to grasp the basic concepts surrounding coding and how writing several lines of code in a computer program is responsible for powering millions of software applications used in our everyday life. This transition and progression is empowering to watch, even though I never put much thought into it until now.

To become an expert at anything, one simply needs to practice – often.

Codecademy, a website famous for providing beginner-level coding modules, points out that one does not require any special skill to take its coding modules, and it provides users with self-help tools that have proven very useful, especially for anyone looking to start a career in programming or simply learning to code for fun. After taking a beginner-level JavaScript module on Codecademy, I found that I could relate easily to the processes involved in the opening steps, drawing on my prior knowledge of logic gates and arithmetic progressions from mathematics.

Questions about who should or should not learn to code create a lot of misconceptions and gender wars in the world of computing. One common view is that programming and coding software applications are reserved for ‘very gifted’ individuals; this view is widespread, yet untrue. My own experience with learning to code shows how it involves repeating basic steps (an algorithm) and can be mastered by anyone after several repetitions. We ought to think of coding the way we think of the processes involved in a basic task such as cooking or baking. Learning to code, however, requires far more focus, dedication, practice, and patience, because errors are not welcome at all in this area, unlike cooking or baking, where one might go over or under with a spice and get away with it.

Coding is also a lot like writing in terms of flow, and I find coding useful for learning how to structure my writing properly. In writing, we identify the main points and all the variables that best explain each point, then arrange the points so that the subject is explained in modular blocks (sentences), with each point opening and closing to set up succeeding points, just like the opening and closing of braces when writing lines of code in a block. This resemblance between coding and writing shows how coding inspires ways of coordinating steps that mirror systematic processing in computing.

New insights from computational thinking

Jun Nie

After working through the Codecademy tutorial and this week’s readings, I realize that humans and computers are ideal partners because their ways of thinking are complementary. The creativity and innovation of the human mind can optimize the programming and systems of computers, while computers’ powerful computing capacity handles tons of complicated and tedious work for humans. As computers improve through continuous machine learning, human beings should gain a deeper understanding of computational thinking, which means “thinking at multiple levels of abstraction,” because it is an essential way to participate in the world around us.

In my opinion, computational thinking requires us to get rid of the limitations of thinking as a consumer. Even though users can find shortcomings in products based on their experience, they tend to complain or put forward fuzzy expectations, which can be considered “Should Questions.” Only the programmers or designers seem able to answer the “How Questions,” but sometimes even they may miss details of the logical connections in an interaction. In order to close the cognitive gap between consumers and designers, everyone needs to master the fundamental skill of computational thinking. Reflecting the core principles of computer systems, programming teaches us to decompose tasks into multiple small steps in our mental process, and coding helps us transform those instructions into a written language that a computer can understand. Computer science shows us that patterns emerge from the mundane, and that small changes brought by subtle steps can create opportunities for great optimization and improvement.
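As a concrete illustration of that decomposition, here is a minimal Python sketch of turning a fuzzy “Should Question” (“the app should remind me to drink water”) into explicit “How” steps; the function names and the two-hour rule are my own hypothetical choices, not from the readings.

```python
# Decomposing a fuzzy request into small, explicit steps.

def hours_since(last_drink_hour, current_hour):
    """Step 1: measure how much time has passed."""
    return current_hour - last_drink_hour

def needs_reminder(elapsed_hours, threshold=2):
    """Step 2: apply an explicit rule to decide whether to act."""
    return elapsed_hours >= threshold

def remind(user):
    """Step 3: perform the action."""
    print(user + ", time to drink some water!")

if needs_reminder(hours_since(9, 12)):
    remind("Jun")   # prints the reminder, since 3 >= 2
```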

Previously, I longed for the innovation of new business models and admired their success, but I attributed the creation of a blue ocean to a flash of inspiration that might come about by accident. However, if every application can be split into modules composed of meticulous procedural code, each serving its own function, it becomes easier for us to identify entry points for optimization, increasing the efficiency of resource utilization and providing more considerate service to users.

Besides, I am still confused about the specific functions of the four main languages of web development (HTML, CSS, JavaScript, and SQL), especially in the context of Web 2.0, in which everyone can post personal content on the Internet and interact with others immediately. How has website design developed from static browsing to today’s dynamic interaction? What kind of Internet technology provides the support? How can we use JavaScript to add interactivity to a website and connect the front-end interface with back-end servers?

References:
Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.
David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. 2011 edition.

Understanding the binary system and computer programming

From Evans’s reading, I gained a deeper understanding of how the binary system serves as the fundamental building block of computer science and how information transmission is based on probability. How many 0s or 1s are needed to represent a specific message hinges on the depth of the tree of binary questions, which is determined by how many possible values the message could have. This also gives me a clearer idea of the information theory and entropy from the last course: through the simple mathematical formula log2(k), we can know how many bits are needed in advance of transmission. Different from the analog computer, which has a continuous representation, the digital computer is discrete because of the binary system, and text, audio, and visual information can all be designed as binary questions. However, although the binary system can encode infinitely many messages and computer hardware still follows Moore’s Law, the design of computing and programming remains rigorously restricted, because human cognitive capability has firm limits. So how to frame a programming language in a simple, clear, regular, and economical way, with the flexibility to abstract, is a significant design question.
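To make the log2(k) formula concrete, here is a small worked example in Python (my own illustration, not from Evans): it prints how many yes/no questions, i.e., bits, are needed to pick one value out of k equally likely possibilities.

```python
import math

# Bits needed to select one of k equally likely values: ceil(log2(k)).
for k in [2, 8, 26, 52]:
    bits = math.log2(k)
    print(f"1 of {k} values needs {math.ceil(bits)} bits "
          f"(log2({k}) = {bits:.2f})")

# For example, identifying one playing card out of 52 takes
# ceil(log2(52)) = 6 binary questions.
```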

The Language and Computing part helps me better understand data structures in the computer. Like natural language, a program is composed of expressions and definitions. Expressions can be primitives or application expressions, and at the basic level primitives come in three types: numbers, Booleans, and primitive procedures. As detailed as code is, we have many ways to abstract it so that it is simpler and easier to understand. To abstract away details in programming, we can use a compiler or an interpreter as a bridge between a high-level language and the machine-level language; nowadays an Integrated Development Environment is frequently used as a program combining different abstracting functions. To abstract the specific mathematics of code, we can use functions to encapsulate a kind of computation in an abstract procedure. As Evans says, what distinguishes a computer from other machines is its programmability: a computer program is a series of procedures executed automatically, and it is dealt with at an abstract level rather than in physical matter, as in other machines. For abstracting text, we can use strings to refer to blocks of text. Additionally, a variable can be seen as an abstraction for a value, much as we use the abstract word “meal” to refer to the specific food we eat every day, and we can assign a single variable for multiple references.
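The layers described above can be shown in a few lines of Python (a sketch of the general idea, not Evans’s Scheme examples): primitives, a means of combination, and abstraction through a named function and a variable.

```python
# Primitives: the basic units.
3.14          # a number
True          # a Boolean
# '+' and '*' are primitive procedures.

# Means of combination: primitives joined into larger expressions.
2 + 3 * 4     # an application expression, value 14

# Means of abstraction: naming a computation and a value.
def square(x):            # the function encapsulates 'x * x'
    return x * x

meal = "noodle soup"      # a variable abstracts a value; a string abstracts text
print(square(5), meal)    # prints: 25 noodle soup
```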

As detailed and accurate as it is, computer programming uses many techniques to achieve different aspects of abstraction. Beyond the artifact itself, it reflects both the symbolic power of human cognition and the distinct limitations that cognition imposes on us.

References:

Martin Campbell-Kelly, “Origin of Computing.” Scientific American 301, no. 3 (September 2009)

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006)

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. 2011 edition.

Some basic “takeaways”

Looking back over the days since the invention of the computer, it is easy to accept that computers have been the most powerful and influential invention of all time. The opinion is so common that rarely does anybody think about why, and in what ways, computers affect our daily life; sometimes we simply shrug it off. The computer is an ingenious invention hidden in plain sight. We have been taught to understand a computer as a laptop or desktop, an artifact with a keyboard and a screen, but we never think of a microwave or a car as a computer. Indeed, if we define a computer as an artifact that receives external signals, processes abstract information, and then gives out results, we get a new view of what computers are.

A computer receives a signal, stimulating an impulse that activates trillions of procedural steps; these procedures are automated by computational algorithms, through preset programs and software, and eventually provide feedback as views that humans can interpret. That is why, even though we do not know how to code or how to put a computer together piece by piece, we can still operate one. But the most powerful thing is the internal process: by understanding more details of these systems through computational thinking, we can take a more active role as problem-solvers instead of passively submitting to the constraints of the artifacts we are given.

What fascinated me was the information/coin-toss metaphor. What does the information stand for, specifically? Why does it require a yes-or-no answer? What result follows from the answers? Evans explains in his book that bits of information are the essential steps the system needs in order to process and provide a precise outcome, reducing the level of uncertainty to its lowest. But not all steps are necessary: by choosing questions whose binary answers are equally likely, the steps can be simplified and the time needed can be reduced.
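A simple way to see why equally likely yes/no questions are efficient is the number-guessing game below, a Python sketch of my own rather than Evans’s example: each question splits the remaining possibilities in half, so a secret number between 1 and 100 is found in at most ceil(log2(100)) = 7 questions.

```python
def guess(secret, low=1, high=100):
    questions = 0
    while low < high:
        mid = (low + high) // 2
        questions += 1            # ask: "is the number greater than mid?"
        if secret > mid:          # a yes/no answer, equally likely either way
            low = mid + 1
        else:
            high = mid
    return low, questions

print(guess(73))   # (73, 7): seven binary questions pin down the answer
```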

I do not have a specific question about this week’s readings; however, I do feel there are a lot of key terms we need to understand before we can sort out the logic and connections. Can we discuss some of the key terms in class and how they are interpreted in a computational context? Apart from that, I would also like to learn more about the mathematics employed in the information tree and its possible depth, as it is a bit intricate to read and digest on my own.

Rules for computer programming

A computer program consists of a series of instructions that humans give the computer to process, such as calculating the result of 2 + 3, displaying the letter Y on the screen, or determining whether the user hit the space bar so that a pixel on the screen turns white. Programming languages are the way humans express these instructions to computers.

Through this week’s tutorial lessons on the Python programming language, I understand more about computing. First of all, as Jeannette Wing said, “computational thinking is thinking recursively. It is parallel processing.” Computer programming decomposes a complicated problem into several small issues and can operate on those problems at the same time. For example, during the lessons, one instruction asked me to make a grocery list to plan a budget. In the program, the problem of planning a budget became three small problems: the number of cucumbers, the price per cucumber, and the total cost of the cucumbers, as the sketch below shows. No matter how complex a problem is, it can eventually be broken down, step by step, into single basic instructions and the way those instructions fit together.
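Here is roughly what that exercise looks like in Python; the variable names and numbers are my own reconstruction, not necessarily the tutorial’s exact values.

```python
cucumbers = 100                   # small problem 1: how many?
price_per_cucumber = 3.25         # small problem 2: how much does each cost?
total_cost = cucumbers * price_per_cucumber   # small problem 3: combine them

print(total_cost)                 # prints: 325.0
```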

Secondly, formatting is essential in programming: the computer cannot recognize orders that disobey its code format. In Python, print(" ") follows a strict basic rule. When I was inputting code, I made several mistakes with quotation marks, braces, and capital letters.

Computer language has three components: primitives, means of combination, and means of abstraction. Primitives are the basic units in programming that cannot be broken into smaller parts. Means of combination are the structures that combine primitives and build up “sentences” in the computer. Abstraction is a process of generalization: it replaces complex concepts with simple names. When programming, you can find these three components everywhere.

Thirdly, programming uses symbols to represent words, and these representative symbols carry the information transmitted in a program. For example, in Python “=” assigns a value to a name (while “==” tests equality), “+” means the sum of the input numbers, “*” means the product of the input numbers, “#” hides the rest of a line from the interpreter as a comment, and “print” says the message should be shown on the terminal.
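The following few lines, runnable as written, demonstrate each of those symbols:

```python
a = 4            # '=' assigns the value 4 to the name a
b = 5
print(a + b)     # '+' gives the sum: 9
print(a * b)     # '*' gives the product: 20
print(a == b)    # '==' tests equality: False
# print("hidden")  <- '#' makes the interpreter ignore this whole line
```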

In conclusion, the essence of programming is computability. All programming becomes a computer representation of mathematical calculations over a limited range of abstract models. It is the common language in the computer, connecting the input side with the output side through strings of words, numbers, and symbols.

Resources:

Martin Campbell-Kelly, “Origin of Computing.” Scientific American 301, no. 3 (September 2009)

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006)

Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: MIT Press, 2015.

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. 2011 edition.

Code: Writing for Humans and Machines

There’s a level of irony in the fact that even for our most abstract symbol systems, on the level of human interaction, language persists. In other words, even though a computer might abstract a line of Python code a level further into a series of 1s and 0s, the programming language has been designed to accommodate the human proclivity to deal with information through computable symbols, which is to say, with words. Of course, there are a number of differences between natural language and computer code, but the extreme formalism of code is the most immediately obvious.

Of course, any computer code needs to be designed so that the human “writer” can give instructions to the machine without ambiguity. Furthermore, the code must be designed in a way that accommodates the material limits of computer design. In other words, the code must account for the key insight of information theory (that the meaning of the message does not matter) and for the material process of reproducing a message at a different point, which, in our computing devices as they exist today, means being designed according to the limits of Boolean logic.

Code exists in this liminal space: it must be meaningful to the coder and meaningless to the computer (in the information theory sense of meaninglessness). In this way, it must account for the semiotic possibilities of human cognitive faculties, as well as the material limits of the computer. Because of this bi-directionality, computer code differs from perhaps all other writing practices in the course of history in that, on some level, it needs to contain both symbols that say and symbols that do.

It has become the practice of some avant-garde artists, particularly in the earlier days of personal computing, to exploit this very affordance of programmable media by constructing highly formalized poems that can both be read by a human reader and run as an operational program. One such attempt at this bi-directional code/poetry comes from the literary artist and theorist John Cayley.

Even if the human side of such a poem is awkward and constrained by the limits required to maintain its status as working code, it proves an interesting point about the design of programming languages: they always have to account for the semiotic processes of both the human and the machine.

Works cited:

Cayley, John (2002). “The Code is Not the Text (Unless it is the Text)”. Electronic Book Review.

Wing, Jeannette (2006). “Computational Thinking.” Communications of the ACM 49, no. 3: 33–35.

From Computing to Ubiquitous Computing

In the beginning, computing was almost equivalent to calculating. The first “computers” were people who would tediously compute sums by hand to fill in artillery tables. However, human computing was definitely not able to fulfill the dramatically increasing need for calculation brought by the wars, so computing evolved and the real computer appeared. Mathematics, physics, and chemistry set the foundation for the development of the computer, and this foundation also suggests the interdisciplinary nature of computing and its future wide implementation. Nowadays, computing has already become important and necessary not only in our work but also in our lives.

Bread, which is so different from computing, can illustrate the influence of computing on our work and life. Humans have been eating bread for thousands of years; the habit remains, but the ways of producing and eating bread have changed a great deal. In a modern bread factory, bread is made by machines whose operation is regulated by programs rooted in control systems. With these programs, a machine can operate automatically and precisely. As consumers, we also benefit from computing, because it lowers costs and lets us buy cheaper products. We can even use applications to buy bread, and the development of those applications is inseparable from computing.

Humans now live in a society of ubiquitous computing. As Wikipedia puts it, ubiquitous computing is a concept in software engineering and computer science: in contrast to desktop computing, it can occur using any device, in any location, and in any format. Computing has already surpassed the boundary of computers and penetrated other products, including phones, televisions, cars, and refrigerators. It is safe to say that all aspects of our life (eating, drinking, living, and transport) are closely related to computing. Even unfamiliar phrases like data mining, machine learning, and artificial intelligence are not so far from us: data mining can detect the spam emails we receive, and machine learning can improve advertising.

Since our life is deeply interconnected with computing, and this trend will not stop in the short term, it is better for us to learn about computational thinking in order to fit into this computing society and find our position. Just as Jeannette M. Wing emphasizes in “Computational Thinking,” “Computational thinking is a fundamental skill for everyone, not just for computer scientists.” Computational thinking does not mean that we should think like a computer, which is impossible as well as unreasonable. Rather, it requires us to stand in a higher position to evaluate and use computing: not only knowing the principles of computing, but also knowing how to apply these principles to keep developing computing, or to use them as a logical tool, a way of thinking, for solving problems and finishing tasks. Not everyone is able to code or needs to code, but everyone has the ability to think computationally, an easy way to improve quality of life and work efficiency.

 

Recitation:

Martin Campbell-Kelly, “Origin of Computing.” Scientific American 301, no. 3 (September 2009): 62–69.

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015, chapters 4, 5, 6.

Programming language and computing system

This week’s reading changed my understanding of programming languages. In Professor Irvine’s video, he mentions that we use symbols to represent meanings and to represent and interpret other symbols; language is such a symbol system, since we use language to represent language. A programming language does not represent what the computer speaks and thinks (as it may seem) but imitates human thinking patterns. Just as Evans points out, “we designed artificial languages for a specific purpose such as for expressing procedures to be executed by computers”: the computer is a machine that decodes our information and executes our commands, a procedure through which it helps us solve problems. Instead of being chaotic, a program is actually highly organized and follows the syntax rules of a language, and a programming language attends only to the surface form of the text, where each word or sentence generates a new meaning. The third function of symbols, mentioned by Professor Irvine, is that they do not merely represent meanings but perform actions on other symbols. For example, an operating system is designed to manage and control other software applications; in this case, we use a programming language to control the operating system and, through it, to control the software applications.

Different from the symbolic human language system, which is complex, ambiguous, irregular, and uneconomical, a programming language serves as a more powerful means of abstraction: it is simple, direct, and easy to execute. Last week we learned about Shannon’s transmission model of communication and information, and I see programming as a process of information transmission from the human to the programming system. Take the example of a Scheme program (even though Scheme is not widely used today): we humans first put Scheme, the highest level of language, into the programming system as the input source. Then the Scheme interpreter decodes and transmits that higher-level language, the information, to the machine processor, which is the receiver. Finally, the machine executes the command.
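To make the interpreter’s role concrete, here is a toy sketch in Python (my own illustration, not from the readings) that evaluates tiny Scheme-like expressions written as nested lists: it decodes a higher-level expression and reduces it, step by step, to values the machine can handle.

```python
import operator

# The 'primitive procedures' our toy language understands.
PRIMITIVES = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    """Recursively decode an expression into a value."""
    if isinstance(expr, (int, float)):   # a primitive value: nothing to decode
        return expr
    op, left, right = expr               # an application expression
    return PRIMITIVES[op](evaluate(left), evaluate(right))

# ["+", 1, ["*", 2, 3]] plays the role of (+ 1 (* 2 3)) in Scheme.
print(evaluate(["+", 1, ["*", 2, 3]]))   # prints: 7
```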

The computing system connects the programming system and the machine by calculating the functions humans put in and transmitting human commands to the machine. Each part of the hardware has its own function in the calculating process, and there are interfaces that connect the separate parts and pass information between them. For example, RAM stores information while the CPU calculates. From the block diagram of the CPU and RAM, we can see that the IP (instruction pointer) indicates the location of the next instruction, while the SP (stack pointer) holds the address of the most recently stored information. The ALU takes two input numbers and produces one output number, and the CPU and RAM pass information and values back and forth. By looking inside the machine’s hardware, we start to visualize where the information goes after we put code into the programming system. The way the system calculates functions is a mathematical process, and the way each part of the hardware works is an engineering process; that is why Wing sees computational thinking as complementing and combining mathematical and engineering thinking.
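A toy fetch-execute loop in Python can dramatize that block diagram; this is a sketch of the idea only, with an invented instruction set rather than any real CPU’s.

```python
def alu(op, x, y):
    """The ALU: two numbers in, one number out."""
    return x + y if op == "ADD" else x * y

ram = [("ADD", 2, 3), ("MUL", 4, 5), ("HALT", 0, 0)]  # the stored program
ip = 0                                                # instruction pointer

while True:
    op, x, y = ram[ip]                 # fetch the instruction IP points to
    if op == "HALT":
        break
    print(ip, op, "->", alu(op, x, y)) # execute it in the ALU
    ip += 1                            # IP now locates the next instruction
```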

Reference:

Prof. Irvine, Introduction to Computation and Computational Thinking

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: MIT Press, 2015, chapters 4, 5, 6.

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. 2011 edition.

The Development of Computers

Early computer design was technically human-brain design: computers have long been modelled to recreate the way humans process information and calculate. I’m reminded of the film “Hidden Figures,” which features the groups of female computers at NASA who were in charge of doing all the calculating; when machines begin to take over some of that work, one character becomes adept at using and running a machine computer so that the process becomes more efficient.

Even in the design of modern computers and laptops, the way one interacts with the interface comes down to how humans interact with their office spaces, as mentioned in one of this week’s readings: “In 1974 Tim Mott was an outsider at Xerox PARC, working for a Xerox subsidiary on the design of a publishing system. He describes how the idea of a desktop came to him as part of an ‘office schematic’ that would allow people to manipulate entire documents, grabbing them with a mouse and moving them around a representation of an office on the screen. They could drop them into a file cabinet or trashcan, or onto a printer. One of the objects in the office was a desktop, with a calendar and clock on it, plus in- and out-baskets for electronic mail.”

The computer, whether human or machine, is by its very nature a way of interpreting information; since we as people have our own ways of calculating things (equations, signals, using language and codes to send messages, and so on), this must be reflected in the mass creation of physical computers. As Professor Irvine discusses in his reading: “In the context of our computational and software screen metaphors, a computer device interface is a design module for enabling semiotic inter-actions with software and transformable representations for anyone taking up the role of the cognitive interpreting agent.”

This idea of ‘agency’ is vital for opening up computing to the general public. Tech companies had to give agency to people by mirroring their thought patterns in a technological way, allowing human beings to interact with a machine as they would with other physical, non-technical artifacts. One must be able to manipulate, identify, and understand the information presented on a screen in order to truly interact with it and prove the computer a reasonable object for use.

By Eish Sumra

Some readings.

Computer Interface Design Concepts: Major Historical Developments

Designing Interactions by Bill Moggridge

Martin Irvine, Computing with Symbolic-Cognitive Interfaces for All Media Systems: The Design Concepts that Enabled Modern “Interactive” “Metamedia” Computers

Computing and Coding

Xueying Duan

One idea from this week’s readings is that the concept of the computer is derived from people’s calculating ability: inspired by human calculating habits, the computer is designed to repeat human work at a higher speed. The concept of computational thinking brings me back to a chicken-and-egg situation, because I see a lot in common between the characteristics of computational thinking and the design-thinking process we discussed for developing a product (for example, a computer). Computational thinking requires us to think in reverse, which means we need to trace back to our intention in making every decision. What would we like to achieve? How should we arrange the modules and each component and let them interact with each other? How do we simplify the computing processes of different occasions into something universal, something that can be adopted widely to support overall requests (like the introduction of binary)? What are its affordances and limitations? How do we improve it to fit diverse situations? These questions all come to me when I start to think about making a new product that fulfills users’ new requirements, and the same goes for the development of the computer. Bringing the computational thinking process into product development enables the vitality and success of modern artifacts.

Computer science, from the very beginning of its birth, has been related to various fields like mathematics, electrical engineering, and computation-oriented sciences; it was built to solve daily questions for people. Thus the subject was first created and observed, and only then defined and clarified as an independent research field. What makes me a little confused is the installation of algorithms in the computer: if people use some uniform “language” (an algorithm) to control the actions of the computer, what is the process before computers have learned to recognize these languages? Is there a teaching and studying process?

“Computer science equals programming,” or so it seems, so I took the Basics of Programming tutorial on Codecademy. It first introduces the principle of computer language: we must make each sentence extremely uniform to make it recognizable for computers. It then introduces three basic data types: numbers, strings, and Boolean data. I ran into big confusion when learning the definition of a function and its use in JavaScript, and then I came across the definition of control flow, which is the order of execution in a program and creates different logical commands for computers. By the way, the program I wrote to tell a story was along the lines of the sketch below.
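Since the original program is not reproduced here, the following is a hypothetical reconstruction of that kind of control-flow story exercise; the tutorial itself used JavaScript, but the sketch is in Python to match the other examples in this archive, and the story details are invented.

```python
hour = 23            # data type 1: a number
name = "Xueying"     # data type 2: a string
sleepy = hour > 22   # data type 3: a Boolean

def tell_story(name, sleepy):
    """A function bundles the story's steps under one name."""
    if sleepy:       # control flow: exactly one branch executes...
        print(name + " closes the laptop and goes to sleep.")
    else:            # ...depending on the data the program was given
        print(name + " writes one more line of code.")

tell_story(name, sleepy)   # prints the sleepy branch, since 23 > 22
```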

After the tutorial, what is clearer to me is the procedure of computational language. Basically, it accords with people’s everyday speaking habits, which makes it a lot easier for us to understand each command simply by reading it. By mastering some basic rules for writing to a computer, like the use of specific punctuation and terms, we can then “chat” with the computer. Every miracle we now see is happening thanks to hundreds and thousands of sentences just like these. Everything about the computer is based on human intelligence and calculating habits.

References:

Martin Campbell-Kelly, “Origin of Computing.” Scientific American 301, no. 3 (September 2009): 62–69.

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

Peter J. Denning, “The Great Principles of Computing.” American Scientist, October, 2010.