Category Archives: Week 8

I once saw it as just the less-than symbol, <. Now I refer to it as an open tag…

Grace Chimezie

 

Moore’s law is a violation of Murphy’s law. Everything gets better and better. – Gordon Moore

 

Introduction

I’ve always had an interest in computing, but for some reason I never felt inclined to indulge myself in the process. Mathematics wasn’t my strong suit, and when I saw people adding greater-than signs to ordinary grammar and alphabets, I was done with the whole thing.

This week’s reading was interesting. Evans brought back those same emotions I had about computing and the principles that underlie it, except this time I didn’t have a choice. At the beginning it felt like I was reading a lot of jargon, and I had to pause at some point and try out Codecademy; then the scales fell from my eyes. What I once dismissed as a less-than sign became an open tag, and I could finally relate to some of the code I was familiar with from all my years of blogging (a story for another day).

In previous weeks I kept seeing the recurring terms semantics and semiotics; for the first time I had to revisit those words with a different view and new knowledge. Now I see my own humanistic attributes playing out before me in my computer. Today, however, I’ll dwell more on the programming principles in this week’s reading and how the hierarchy that exists in our everyday social structures also exists within the computing family.

<!DOCTYPE html>

I need to <br/> this conversation into <div> or

<p>. Now let’s get to the <body>, or is it <body></body>, of this argument.

<!-- This is what happens when you code, you get -->

Some of the terms I was able to clarify:

Hypertext Markup Language (HTML), self-closing tags, doctype, head, page titles, body: the list is endless. I learned that to program computers, we need tools that allow us to describe procedures precisely and succinctly. Since procedures are carried out by a machine, every step needs to be described.

Conclusion

Computing changes how we think about problems and how we understand the world. Programming languages come in many flavors, and it is difficult to satisfy all desired properties at once, since simplicity is often at odds with expressiveness. Every feature added to increase expressiveness, e.g. <div>, <p>, <body></body>, incurs a cost in reduced simplicity and regularity, e.g. <br/>.
Better, constant practice with the book, its words, and a host of examples will relieve the fear that computing or programming is left to those who wear prescription glasses, although our professor, Irvine, doesn’t wear any. There’s a lot of hope for people like us.

Reference

Evans, David. Introduction to Computing: Explorations in Language, Logic, and Machines. University of Virginia, 2011, chapters 1–3.

Computer Science, the science that teaches you how to find solutions to everyday problems

Every time I answered the question of what my major was, I usually got the same response: “Oh, that must be so hard, I could never do it.”

And the irony is that EVERYONE is capable of learning how to code. As Denning says, “Computing may be the fourth great domain of science along with the physical, life, and social sciences,” which to me means it is a science like any other; we just need to give it a try. In my opinion, everything looks hard if it is unknown. As humans, we would rather do something we feel comfortable with than “take a leap into the unknown” and get comfortable with the uncomfortable.

Computer Science was not offered everywhere as a major, but now you can learn how to code and program online

Back home in Albania, Computer Science was not a popular program for girls, and the stereotype of guys putting their headphones on, not socializing, and doing their own thing was true. By the time I was ready to graduate from high school and pursue my education, Computer Science had become such a hot topic, and everyone was encouraging students to give the major a try and see if we liked it. Computer Science received attention in the media, and certain programs talked about all the different opportunities this major could offer.

So I decided to study Computer Science as my major during undergrad. I was always interested in how computers work and how a computer understands a certain command; specifically, I wanted to find “the magic” behind it. I didn’t have previous experience with programming. My high school didn’t offer Computer Science classes, and frankly I didn’t know what to expect, but knowing that I could change my major at any time if I didn’t like it put me at ease.

My first programming class was Java 101. At the beginning I don’t think I really knew what I was doing, but as the class progressed, I came to understand how powerful coding and programming can be.

Give a computer the most complicated exercise in finding a certain value, and it will never disappoint you

For a computer, calculating values is the easiest thing in the world. In last week’s readings on information theory, we saw that for a computer the smallest unit of storage is a bit, which stores either a 0 or a 1. This seems very inefficient to us, because we are used to working in base 10, but for a computer, base 2 does all the work. Combine eight bits into a byte and you get 256 combinations; string together two bytes and so on, and all of a sudden you have all these switches working at the same time. That is powerful.
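The arithmetic behind those combinations is easy to check in Python itself (a quick sketch of my own, not from the readings):

```python
# A bit stores a 0 or a 1; eight bits make a byte.
n_byte = 2 ** 8        # combinations in one byte
n_two_bytes = 2 ** 16  # combinations in two bytes
print(n_byte)          # 256
print(n_two_bytes)     # 65536

# The same switches, read as a base-2 number:
print(int("01000001", 2))  # 65, the byte that encodes "A" in ASCII
```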

What is fascinating to me is the use of symbols and the grammar that we use when we program. It is like learning a new language, in a way: you have to follow the rules of its grammar. There are a lot of different languages to choose from based on what your end goal is.

Symbols are powerful because not only do they represent different data structures, they also give meaning to the commands we write.

Python is one of my favorite languages. You can write commands in fewer lines (compared to Java), which makes code easier to read and understand. As part of the assignment, I opened an account with Codecademy, and having already finished my undergraduate degree, it reminded me of my first experience with programming and how everything started from learning the syntax, defining variables, assigning values, and running the code.
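Those first steps look something like this (a minimal sketch; the variable names and line counts are my own illustration):

```python
# First steps from the tutorial: define variables, assign values, run the code
language = "Python"            # a string value
lines_in_java = 10             # hypothetical line counts, for illustration
lines_in_python = 4
print("I am learning " + language)
print(lines_in_java - lines_in_python)  # 6 fewer lines for the same task
```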

But apart from coding and writing programs to do a certain task, what computer science mostly taught me was logic. We use algorithms all the time, every day. Whenever you read a manual on how to operate a new piece of equipment, you were following an algorithm; whenever you had to choose between two things (say there are two ways to get from your house to school), you made a decision based on some variable, such as time, and chose the shortest way from your house to school.
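That route decision can itself be written as a tiny program (a sketch with made-up travel times):

```python
def choose_route(minutes_a, minutes_b):
    """Pick the faster of two routes, deciding on a single variable: time."""
    return "route A" if minutes_a <= minutes_b else "route B"

# Made-up travel times from house to school
print(choose_route(15, 22))  # route A
print(choose_route(30, 22))  # route B
```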

Computer Science helps you find solutions to the different problems we face, not just homework assignments. Thinking “algorithmically” about the world helps you tackle a problem fundamentally: breaking it down into its simplest parts, studying it, and finding better solutions to possible errors, just like running a program in the console.

Nowadays we can combine the power of computing and programming with any other discipline, and the options for what can be achieved are limitless. From the social sciences to the humanities, fine arts, engineering, science, and technology, we can expand our curiosity and knowledge, and we can start just by taking the first step into getting uncomfortable until it becomes comfortable.

Sources:

Denning, Peter J., and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015.

Evans, David. Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition.

Irvine, Martin. Introduction to Computation and Computational Thinking.

Verma, Adarsh. “How To Pick Your First Programming Language (4 Different Ways).” Fossbytes, 6 Mar. 2017. Web.

The Importance of Coding

This wasn’t my first go-round with Codecademy’s Python tutorial, so many of the introductory concepts were already known to me. That said, it was still interesting to see how code and programming languages fit into our existing linguistic conceptual structures. Unless you are a linguist, you aren’t constantly aware of exactly how you use language, so the process of learning a new one can make apparent the underlying, tacit processes going on every time we communicate.

The concept of symbols that mean and symbols that do is made very transparent by a programming language. We are required to set and define variables, imbuing them with meaning so they are useful in various contexts. We fill the objectively meaningless shell of a word with the numbers, strings, etc., that become the meaning of this placeholder word. We are then able to make symbols do things, using preset symbols together with the symbols we defined. The actions these symbols produce when combined are dependent on the meaning packed into them. The program reads them, computes, and returns to us even more symbols packed with meaning. It’s symbols all the way down!
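A small Python sketch of that distinction (the names here are my own illustration):

```python
# Symbols that mean: variables filled with values
greeting = "Hello"
name = "Ada"

# A symbol that does: a function that acts on other symbols
def shout(text):
    return text.upper() + "!"

# Combining them: the program reads symbols, computes,
# and returns more symbols packed with meaning
message = shout(greeting + ", " + name)
print(message)  # HELLO, ADA!
```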

What I find interesting is how even those of us who don’t pay close attention to the programming languages and computations taking place in our technology are utterly beholden to them. Our entire lives are centred around computers, and this gulf between those who know and those who use highlights the importance of good design in this area. It is for this reason I also believe coding literacy should be a fundamental skill learned by all. Websites like Codecademy do a wonderful job of opening up these seemingly esoteric, black-boxed domains and making them more accessible to the general public.

Thoughts about computer programming from a Python beginner

This week’s reading and assignment were a great leap for my knowledge of computer science. As a new learner of coding, I really enjoyed the tutorial lessons in Python on Codecademy, from which I came to see the distinctions between natural languages and programming languages. Furthermore, it reminded me that computational thinking is happening in my daily learning process.

Natural language vs. Programming language

We’ve spent a long time since birth learning the millions of words that natural languages provide us for communication and the transmission of meaning. As Evans notes in his critique of natural languages as uneconomic, we have to express an idea with a good many words and a good deal of space. Written in English, describing the calculation of a total restaurant bill requires lines of sentences. Unlike that tedious natural-language expression, the same scenario is described concisely in a programming language like Python.
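The Codecademy exercise in question is not reproduced here, so this is a sketch of it from memory, with the tutorial’s standard values assumed; the 3rd, 7th, and 8th lines referenced in the next paragraph correspond to this layout:

```python
# Codecademy-style tip calculator (a reconstruction; values assumed)

meal = 44.50
tax = 0.0675
tip = 0.15

meal = meal + meal * tax   # line 7: "meal" is reassigned to include tax
total = meal + meal * tip  # line 8: a new variable built from line 7's meal
print("%.2f" % total)
```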

The words “meal”, “tip”, “tax” and “total” are variables we create for the computer to recognize and operate on. The computer does not care what these words mean in real-world transactions. In particular, not every occurrence of the word “meal” has the same meaning. In the 7th line, the “meal” on the left side of the assignment is a reassigned variable, while the “meal” on the right side is the same as the one in the 3rd line. Similarly, the “meal” in the 8th line shares the meaning of the “meal” on the left side of the 7th line’s assignment, and helps create a new variable named “total”. However, some words, whether fully spelled out or abbreviated, carry meanings similar to those in natural language. For instance, the string method “len()” represents counting the number of characters in a variable, and “print” represents displaying the result.

Natural languages are all ambiguous to a certain extent. Like English, Chinese has three main pronouns for people: “他”, “她” and “它”, meaning “he”, “she” and “it” respectively. However, all three characters share the same pronunciation, “ta”, so people get confused if they are not familiar with the context of the conversation. As Evans describes, computer scientists are dedicated to creating simple, structured programming languages with less ambiguity. With Python’s syntax, the layout of the language is simple. In Codecademy’s editor, different data types are distinguished by color, with variables in orange and strings in yellow. From a beginner’s point of view, this makes it easy to detect errors by the colors of the code. From my training experience on Codecademy, I completely agree with Denning and Martell’s observation that “the virtual worlds created by software tend to be highly sensitive to errors” (2015). I got stuck when trying to format the date and time in the order mm/dd/yyyy hh:mm:ss.

I was hesitating over how to show a space between the date and the time, so I coded the date and time separately, as shown on the right of the image. After running the code, an error occurred even though all the information was correct and the actual date and time were displayed. I had no idea what the error was until I requested the correct answer, which was simpler than I expected (as shown on the left of the image).
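For reference, the simpler answer looked something like this (a reconstruction from memory of the Codecademy exercise):

```python
from datetime import datetime

now = datetime.now()

# One format string covers both date and time; the space between them
# is just part of the string, so no separate print is needed
print('%s/%s/%s %s:%s:%s' % (now.month, now.day, now.year,
                             now.hour, now.minute, now.second))
```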

My experience using Wikipedia

I found an example of computational thinking in my ongoing Wikipedia project for another class. Contributors can edit wiki pages via visual editing or source editing. Source editing, in particular, lets people edit and format a page with markup specifically designed for Wikipedia, called Wikitext. For example, equal signs are the symbols that format heading sizes, and “~~~~” is the symbol for your signature when you want to interact with other contributors on the page.

A screenshot of Wikitext Cheatsheet

It is also error-sensitive. Unlike Python on Codecademy, the code is not distinguished by colors. I seldom use source editing because the syntax is really difficult to memorize, and it is hard for me to locate an error once one occurs. Since the layout of visual editing is more user-friendly (“What You See Is What You Get”), it is hard for Wikitext to win over ordinary contributors. However, after this week’s reading on Wing’s idea of computational thinking, I’m trying to shift my thinking from mere editing of a wiki article to a systematic process of page formatting from a mathematical and engineering perspective. The computational thinking involved not only helps us solve problems in wiki editing but also permeates our online conversation and interaction.

References:

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition. CreateSpace Independent Publishing Platform; Creative Commons Open Access: http://computingbook.org/.

Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015, chapters 4, 5, 6.

Help:Cheatsheet. (2017, October 15). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Help:Cheatsheet&oldid=805470734

WYSIWYG. (2017, October 19). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=WYSIWYG&oldid=806034486

Computational Thinking and Musical Composition

In his open-access book on computation, David Evans says that “Computing changes how we think about problems and how we understand the world.” It certainly has for me this week, but not in the way I expected. I was fascinated to see how computing and computational thinking have enabled research labs and enthusiasts to develop algorithms that compose music in the style of a given genre.

Jeannette Wing, Professor of Computer Science at Columbia University, has consistently evangelized computational thinking as an essential skill across all domains, not just within the traditional confines of computer programming.

Computational thinking is a way humans solve problems; it is not trying to get humans to think like computers. Computers are dull and boring; humans are clever and imaginative. We humans make computers exciting. Equipped with computing devices, we use our cleverness to tackle problems we would not dare take on before the age of computing and build systems with functionality limited only by our imaginations (Wing 2006).

The possibilities for interesting applications are astounding in an era where we can set algorithms loose on challenges such as musical composition.

(Credit: Algorithmic Music Composer)

In this video, you can watch a computer generate an improvised jazz track. Watch as the two melodies stream along, one a bass track and the other on guitar. But how would a computer know how to do that? In and of itself, it doesn’t. As Wing says, computers are dull, not imaginative in and of themselves. People empower computers to do imaginative things such as improvisational composition, and they often do so by working through the problem computationally, both before and during the actual process.

In Google’s course on computational thinking for educators, the process of solving problems computationally is outlined in more detail (Google 2017); an abbreviated bullet list serves our purposes here:

  • Decomposition – Breaking down data, processes, or problems into smaller, manageable parts
  • Pattern Recognition – Observing patterns, trends, and regularities in data
  • Abstraction – Identifying the general principles that generate these patterns
  • Algorithm Design – Developing the step-by-step instructions for solving this and similar problems
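To make the list concrete, here is a toy sketch of the four steps applied to a melody; the data is made up, and a real system would analyze a large corpus of jazz:

```python
from collections import Counter

# Toy question: which melodic interval does this tune favor?
melody = [60, 62, 64, 62, 60, 62, 64, 65, 64, 62]  # made-up MIDI notes

# 1. Decomposition: reduce the melody to its note-to-note transitions
intervals = [b - a for a, b in zip(melody, melody[1:])]

# 2. Pattern recognition: count how often each interval occurs
counts = Counter(intervals)

# 3. Abstraction: treat the most frequent interval as a stylistic tendency
favorite, _ = counts.most_common(1)[0]

# 4. Algorithm design: generate a next note by reusing that tendency
next_note = melody[-1] + favorite
print(favorite, next_note)  # 2 64
```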

Following these steps, someone imaginative had to use computational thinking to break down the problem, amass a collection of jazz music to analyze, and then develop procedural syntax for the computer to find principles and patterns within that music, so it knows how, and which ones, to emulate in its own compositions.

Another example comes in these two videos from “Flow Machines”, a research group developing algorithms for musical composition.

Check out this video for an AI-generated melody in the vein of the Beatles:

(Credit: Flow Machines)

Or this video for harmonies steeped in the style of Bach:

(Credit: Flow Machines)

In the case of the Beatles video, musicians are collaborating with the algorithms, adding vocals to the AI’s melody. For the harmonies imitating Bach’s style of composition, the data comes from a database of sheet music; you can even try to guess the difference here. In any case, it’s been fascinating to see how interesting problem solving with computational thinking can be in traditionally right-brained areas such as music.

Sources

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition. CreateSpace Independent Publishing Platform; Creative Commons Open Access: http://computingbook.org/.

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

Google. Computational Thinking for Educators, Unit 1: Introducing Computational Thinking. Retrieved October 22, 2017, from https://computationalthinkingcourse.withgoogle.com/unit

Programming Language and Natural Language

Through learning Python, I am starting to see how a programming language serves as the computer’s “natural language”, letting us give the computer instructions to complete specific tasks. It is the language between humans and the computer: according to David Evans, it is a language humans can understand and machines can execute. These are some reflections I had through the learning process:

Programming languages and natural languages share quite a lot of similarities. Words and abbreviations are borrowed from natural language to function in a programming language; for example, “print” means to present the result on the screen and “len()” means to measure the length of a word. During the first few lessons, where not many symbols are involved, I felt I could fully understand what an algorithm was about just by reading the instructions I wrote down. In natural language, the smallest units of meaning are morphemes; according to David Evans in Introduction to Computing, the smallest units in a programming language are primitives. Built from these, a Scheme program is capable of processing expressions and definitions.

Programming also has its own grammar system, which expresses algorithms, as described in Denning’s The Great Principles of Computing. The structure of programming is recursive; as “Computational Thinking” puts it, “a computation is an information process in which the transitions from one element of the sequence to the next are controlled by a representation.” Different symbols are combined with primitives to perform different operations, and I need to be careful and pay attention to small details to run a program successfully. Certain rules have to be followed. For example, “6.75%” should actually be written as “6.75/100” (6.75 divided by 100 instead of just the percent sign), because “%” implies a different function than the percentage we normally use in math. Also, applying “str()” does not change the digits of a number, but its meaning and nature have changed.
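A quick sketch of those two rules:

```python
# "%" between numbers is the modulo operator, not a percent sign
print(7 % 3)        # 1, the remainder of 7 divided by 3
print(6.75 / 100)   # 0.0675: how "6.75%" must actually be written

# str() leaves the digits alone, but the value's nature changes
n = 42
s = str(n)
print(s + "!")             # string concatenation now works
print(type(n), type(s))    # <class 'int'> <class 'str'>
```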

Based on that, it is interesting to point out that Python demands accuracy to a great extent, and the flexibility of the language is limited. The “interaction” between the human mind and the computer is not the same as that between humans. In the latter, human brains can understand each other, complete unfinished sentences, autocorrect mistakes, and develop meanings; when communicating with the computer, we need to be accurate and specific about what we actually mean. The computer will not complete or refine the programming language itself, and any small mistake can make the run fail.

It is a shame that when I proceeded through the Python tutorial to “String Formatting with %, Part 2”, the webpage got stuck when I clicked “run”, and I could not go any further in the tutorial; otherwise I would have had the chance to explore more aspects of programming.

 

References

Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

Peter J. Denning and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015, chapters 4, 5, 6.

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition. CreateSpace Independent Publishing Platform; Creative Commons Open Access: http://computingbook.org/.

Computational Thinking is Everywhere (as it should be)

I have to admit that, up until now, the words computational thinking and coding seemed not only foreign but unreachable to me. This personal preconception is starting to change, slowly but surely. After this week’s readings and activities I’m still positive that I can’t code yet, but now I know it is possible to understand and use its concepts and principles to think about the way I interact with technology and the way I actively design my everyday activities. It is also worth noting that this false preconception of computing and coding as something difficult and inaccessible is a main flaw in our educational system and mainstream media, and I agree with Jeannette Wing’s call for a different educational approach in her article “Computational Thinking”.

In this short but poignant article, Wing demystifies the way people think about computational thinking. She lists many ways in which computational thinking is embedded in our everyday activities. We use it without noticing, mainly because computational thinking is, in my opinion, ‘human thinking’: “… it is using abstraction and decomposition when attacking a large complex task or designing a large complex system… Computational thinking is planning, learning, and scheduling in the presence of uncertainty” (p. 1).

This last statement caught my attention. When she lists common examples of everyday activities in which computational thinking is very present, such as gathering the things you need before you leave your house, retracing your steps if you lose something, or choosing a check-out line at the supermarket, it is clear that uncertainty is not only a common factor but the motivator for this pattern of behavior. It occurred to me that I’ve been actively using computational thinking my whole life, but I’ve been calling it ‘logical thinking’, or in extreme cases ‘common sense’.

To me it seems like an obvious way of thinking. We often find ourselves thinking, ‘why do people do X when they could do Y and it would be so much easier, faster, cheaper, better, etc.’ Therefore, what might seem an evident and inherent way of human thinking doesn’t always turn out to be that common. As the saying goes: common sense is the least common of the senses.

Here are some funny examples of design fails:

(Fig. 1)

(Fig. 2)

Wing says “Computational Thinking involves solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science” (p. 1), and this clearly relates to two facts: computer science concepts are human thinking concepts, and we can reformulate a problem so that a pattern we already know how to solve applies.

We’ve been reading about and coming to understand the idea that technology is designed (by us) to do everything it does. Therefore, the ‘magic’ behind technology is the ‘magic’ behind human thinking and the human condition. In that regard, Wing says that one of the characteristics of computational thinking is that it is “a way that humans, not computers, think… it is a way humans solve problems… We make computers exciting. With computing devices, we use our cleverness to tackle problems we would not dare take on before the age of computing” (p. 3).

This idea resonated deeply with me. When coming to CCT, I told myself that the reason I wanted the “technology” part of my education was that technology seems to advance so fast that we, as a society, cannot keep up with it; we seem to be one step behind, solving problems caused by technology instead of anticipating them. But I’m starting to think I was looking at it from the wrong perspective. Yes, in many ways we are behind technology (our laws are a clear example), but technology is made by us, designed by us to do what it does. Therefore, in order to solve the problems caused by technology, we have to use computational thinking, the same thinking we used to design that technology, and more importantly, we need to think about these outcomes while we are designing it. We should use computational thinking as actively when solving problems related to technology as we use it, unconsciously, in everyday activities.

Learning and understanding how to talk to my computer: Python

This was my first encounter with a coding language, and I have to say that it was less scary than I originally thought, and I enjoyed it (at the beginning) more than I expected. Before starting the lessons on Python I took a few minutes to navigate the Codecademy website and found myself excited and interested in what it offers. The fact that this knowledge is so accessible, both free and understandable, seems almost shocking to me.

I know the word ‘language’ in ‘coding language’ is pretty obvious, but I was still a little surprised by the similarities I found between Python, languages, and music. During the first lesson I caught myself thinking ‘oh, this is like learning a new language’ and immediately rolled my eyes at myself, because that is exactly what I was doing.

(Fig. 3) Python. First Lesson. Deborah Oliveros. Code Academy. Quote: William Shakespeare.

Because of my experience with languages, I could see the similarities in using symbols to represent meaning and to represent and interpret other symbols (such as variables). It seemed like I was learning a new language with an alphabet different from mine, as I would be if I were learning Chinese, Korean or Arabic.

However there is an extra element: the console. Even though the whole thing can be described as a translator of sorts, I related it more to playing an instrument. I play, although not very well, the guitar and the ukulele. To play music there is also an alphabet assigned to notes or chords. Now, if you’re familiar with The Sound of Music then you already know this:

(Fig. 4) Music alphabet starting with Do(C).

Depending on the instrument you’re playing, the note Do (C) will require a different positioning of your fingers, but the sound will be the same. You can play C on every instrument and it will be the same note, even though the way you play it changes. So if you know the finger position for C on a piano, a guitar, or a ukulele, then you can play any song as long as you have the chord ‘lyrics’. In this way, you can basically teach yourself to play any instrument, because the universal musical alphabet, or language, lets you convert and interpret these symbols from one instrument to another. In my case, I learned the basic chords on a guitar and learned the alphabet; with that information I taught myself how to play the ukulele, and briefly applied the same pattern to a melodica and a piano.

(Fig. 5) Chords chart for “Love Is a Losing Game” by Amy Winehouse.

If you look at this image you see the lyrics of the song and, above them in blue, the musical-alphabet chords telling you what note to play at what time. Although this is from a guitar chords website, I can use it to play the song on a ukulele, a piano, or any other instrument, as long as I know the value of C, Dm7, Fdim, and Cmaj7 on those instruments. However, in this case I am acting as the console, or as the “print” function in Python, which brings us to the last characteristic of symbols expressed by Prof. Irvine in the introduction video:

“We use symbols (software) not only to represent meanings but to perform actions on other symbols” (Irvine). I have to act as the ‘print’ function for my musical instrument. I cannot tell my ukulele “play C now, then D, then B”. However, I can tell Python to print C, D and B with 3 seconds between each, and it performs the action. This is the main difference I noticed when comparing the languages: I can ‘tell’ my computer to perform actions for me, actions that I could perform myself but perhaps not as quickly and accurately as the computer.
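That instruction is short enough to write out; here is a sketch of what I mean (the notes are just strings being printed, not sounds):

```python
import time

# "Play C, then D, then B, with 3 seconds between each"
for note in ["C", "D", "B"]:
    print(note)
    time.sleep(3)  # wait three seconds before the next note
```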

There is also another huge difference between Python and natural or musical language: there is no room for mistakes. It is not as flexible as our native languages or musical language, where you don’t have to be 100% accurate to communicate what you want. In this case, it has to be accurate always in order to work:

(Fig. 6) Learning Python. Second Lesson. Deborah Oliveros.

This is the part where I started to become less excited and more frustrated with the new language, and it was clear that my lack of computing background and my severe aversion to math showed. I still think it is exciting, and I will probably use this website in the future, because I would like to ‘teach myself’ the same way I did with English and musical language. However, the most important takeaway from this experience is realizing that I’ve been using this way of thinking my whole life unconsciously: now is the time to start doing it consciously.

 

 

References:

  • Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.
  • Figure 1 and 2: “Poor Design Decisions Fails”. Bored Panda Blog: https://www.boredpanda.com/poor-design-decisions-fails/
  • Figure 3 and 6: Codecademy, Learning Python, First and Second Lesson. Deborah Oliveros. Quote: William Shakespeare.
  • Figure 4. Musical Alphabet SOL(G). Music Notes 101 Blog: https://musicnotes101.wordpress.com/2010/04/20/the-musical-alphabet-clefs-the-musical-staff-and-the-keyboard/
  • Figure 5: “Love Is a Losing Game”, Amy Winehouse and Mark Ronson, Chord Chart from Ultimate Guitar Tabs: https://tabs.ultimate-guitar.com/a/amy_winehouse/love_is_a_losing_game_crd.htm
  • Martin Irvine. Key Concepts in Technology: Week 7: Computational Thinking & Software. Accessed October 25, 2017. https://www.youtube.com/watch?v=CawtLHSC0Zw&feature=youtu.be.

What I have learned about computational thinking

  • “Computational thinking is a fundamental skill for everyone, not just for computer scientists” (Wing, 2006).

I used to believe computational thinking was far removed from my life as a communication student. However, after this week’s readings and a basic understanding of Python, my thoughts about computational thinking have changed. Computational thinking is more like a thinking method or model that helps us solve problems in a complex system. It also helps us solve everyday problems and is closely connected to our daily life.

Consider a question I got from the Python tutorial: how much do we need to pay for a meal, including tax and tip? This case is pretty simple, and we face this kind of problem every time we go out to eat in a restaurant. It’s really interesting to see everyday stuff expressed in computational language. The program automatically gives me the answer: 54.63.

As far as I can see, computational thinking helps people enhance their analytical ability and leads us to a better, more systematic way of thinking and solving problems. Instead of calculating the numbers directly, Python lets us write a procedure for calculating the total payment. It decomposes a big, complex problem into small, relatively simple problems that can be solved. First, we need the cost of the meal with tax: meal = meal + meal * tax. Then the total equals the new meal (the cost including tax) plus new meal * tip. It also shows a fast, flexible way to use data in our work as well as in daily life.
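The decomposition above can be written out as the tutorial does. The figures below are the ones the Codecademy “tip calculator” exercise uses (44.50 for the meal, 6.75% tax, 15% tip), which produce the 54.63 mentioned above:

```python
meal = 44.50   # price of the meal
tax = 0.0675   # 6.75% sales tax
tip = 0.15     # 15% tip

meal = meal + meal * tax    # the meal now includes tax
total = meal + meal * tip   # the tip is computed on the taxed amount
print("%.2f" % total)       # prints 54.63
```

Each line solves one small sub-problem, and their sequence solves the original question.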

  • The dynamic interactions between computing and other fields: implementation and influence

As Dr. Denning addressed in The Great Principles of Computing, the principles of computing have been categorized into computation, communication, coordination, recollection, automation, evaluation, and design (2010). Those seven categories sometimes overlap with each other. For example, artificial intelligence can be seen as a computation system, an automation system, and a design system. Additionally, we can also see interactions between computing and other areas. There are two ways one scientific phenomenon can interact with another: implementation and influence.

  1. Implementation: the combination of a phenomenon with existing components. Take the software Construct 3 as an example. It is a program for making simple digital games and animations, so I can say that computation, graphics, and a scripting language together implement this software.
  2. Influence: two phenomena influencing each other. In the same software, the scripting language and code influence how every object in an animation behaves. Only by giving a command to the object CO2 in the correct system language can the object CO2 start to work.

Reference:

“Learn Python.” Codecademy. Accessed October 24, 2017. https://www.codecademy.com/en/courses/learn-python/lessons/strings–console-output/exercises/strings?action=lesson_resume.

Wing, J. (2006). Computational Thinking. Retrieved from: https://drive.google.com/file/d/0Bxfe3nz80i2GZ21FcXlfdGNhWDA/view [Accessed 25 Oct. 2017].

Denning, P. (2010). The Great Principles of Computing. American Scientist. Retrieved from https://drive.google.com/file/d/0Bxfe3nz80i2GZ21FcXlfdGNhWDA/view [Accessed 25 Oct. 2017].


How to communicate with a computer

This is the first time I have learned a computing language, and I find Python charming. I’m going to talk about my understanding of computation based on what I learned this week.

At the core of how a computer functions is binary. In fact, before this week, I didn’t know what language lives inside the head of a “computer.” Is it English? Does a Chinese computer “speak” Chinese while one from Korea “speaks” Korean? These questions once popped into my mind like a flash, but I never grasped them, nor did I explore computing languages any further.

From what I read this week, I found the answer: in a computer’s eyes, the world is built out of numbers. A computer cannot understand the meaning of the word “red,” but it can figure out what “fill (255,0,0)” means. For a computer, the world is a set of “yes or no” questions, and the values for these binary questions are “1” and “0”.
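A short sketch of that idea: the color name means nothing to the machine, but the number triple does, and each number is ultimately a pattern of yes/no answers, 1s and 0s:

```python
# "Red" as a computer sees it: three channel values, each stored
# as an 8-bit binary pattern of 1s and 0s.
red = (255, 0, 0)
bits = [format(channel, "08b") for channel in red]
print(bits)  # ['11111111', '00000000', '00000000']
```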

As I learned last week, a bit is “a unit for measuring information,” and we can use bit sequences to represent a set of things including text, pictures, and movies. Programming, then, is like talking with the computer. The computer is an alien who can neither speak nor understand our language, which means a translator is a must if we want to get this “computer-alien” to do anything. The translators are computer languages like Python, JavaScript, C++, and so on. However, the translator is not fully fluent, as it can only partly understand our language. Computing languages share something with English: words like “variable” and “print” are ones we use in daily life, and “del” and “str” can be recognized as abbreviations of “delete” and “string.” But Evans also notes in his book that natural languages are complex, ambiguous, irregular, and uneconomic, and thus should not be used directly for programming. When we want the computer to print the value 10, we cannot say, “Hey, print ‘10’.” Instead, we say the following:

my_variable = 10

print my_variable

Then the interpreter or compiler turns this high-level language into commands that can be executed directly by the computer; the CPU records a value as 1 when there is current in the circuit. As Wing says in her essay, “Computational thinking is using abstraction and decomposition when attacking a large complex task or designing a large complex system.” In this procedure, we break the task of printing “10” into two steps: set a variable and print the variable.

 

David Evans. (Oct 2011) Introduction to Computing: Explorations in Language, Logic, and Machines.

Jeannette Wing. (March 2006). Computational Thinking. Communications of the ACM 49, no.3. 33 – 35.

James Gleick. (2011). The Information: A History, a Theory, a Flood. (New York, NY: Pantheon).

The Language of Computing

Last week in class we reviewed the transmission model of communication and information in which Claude Shannon described the goal of transmission to be “reproducing at one point either exactly or approximately a message selected at another point” (Shannon 1948). However, we also specified that in this model, the meaning of the message is irrelevant. Human meaning making, as expressed through symbols and language, is flexible and complex. Any attempt to freeze our symbolic processes for the purpose of “more accurate” communication between humans would have dire and far reaching consequences.

If, however, we are trying to communicate meaning to a computational device, the flexibility of language becomes a problem. Instructions, represented as symbols, need to be translated into a computational language that follows clear syntax and grammar. Variables need to be defined clearly. Computer programming languages are what David Evans calls “designed languages,” tools carefully crafted to eliminate undue complexity, ambiguity, and irregularity, and to enhance abstraction and economy (Evans 2011). Programming languages such as Python utilize symbols in three ways, as described by Professor Irvine:

  1. Symbols to represent meanings (e.g., + means to add)
  2. Symbols to describe and interpret other symbols (e.g., variable = 5)
  3. Symbols to perform actions on other symbols (e.g., PRINT variable)
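All three uses appear together in a three-line sketch (the variable name and values are arbitrary):

```python
variable = 5             # (2) one symbol interprets another: 'variable' now stands for 5
variable = variable + 2  # (1) the symbol '+' represents the meaning "add"
print(variable)          # (3) a symbol performs an action on other symbols; prints 7
```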

The clear definition of variables and consistent use of grammar is key. A breakdown will result in the dreaded SyntaxError:

Figure 1: Katie’s Python Lesson 1 (“Learn Python” 2017)

Beyond correct grammar and syntax, programs have to be designed from a computational perspective. Jeannette Wing describes computational thinking as “reformulating a seemingly difficult problem into one we know how to solve, perhaps by reduction, embedding, transformation, or simulation” (Wing 2006). In other words, programs have to be built logically, the kind of logic I studied in undergrad because I erroneously believed that class, located in the philosophy department, would be easier than calculus. (At least in calculus the final exam wouldn’t have been only five extremely difficult questions upon which 70% of your grade was based.)

Coding requires a detailed understanding of processes, to be implemented in a specific order. Only when the ordering and steps are clear can the CPU retrieve programs and data stored in RAM and compute an output. The recursiveness and depth of layering in the code enable programs to carry out complex processes that appear to run instantaneously by encoding instructions to the computer’s hardware. One wrong keystroke, however, could break the routine and stall the entire system. Leaders of programming teams need to allow time for testing and user feedback to ensure that the outputs match expectations.

Figure 2: The Possibilities of Coding (Apple App Store 2017)

Automated computing feels like a step away from the human computing that existed long before the invention of ENIAC; however, “coding,” like “writing,” is a humanistic pursuit. Most of our more elaborate programs involved teams of hundreds, if not thousands, of designers to reach their current state. In order to support this type of collaboration, code needs to contain instructions for people as well as for the computer. If a programmer needs to determine whether an algorithm can be improved, they first need to understand the logic of the initial algorithm. Python, for example, contains grammar such as the # and the “””, which allow programmers to include explanatory notes or instructions for other people who might one day work on their code. Human language is in that way nested inside programming language.
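A small invented example shows both kinds of notes at work (the function double is made up purely for illustration):

```python
# The '#' begins a note addressed to human readers; the interpreter skips it.
def double(n):
    """A triple-quoted docstring: a longer explanation that other
    programmers (or help(double)) can read, but that never executes
    as an instruction."""
    return n * 2

print(double(5))  # prints 10
```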

Figure 3: Katie’s Python Lesson 2 (“Learn Python” 2017)

A friend of mine who creates programs to run complex economic models said that he can tell which member of his team wrote which lines of code based on their style. He even suggested that the entire narrative of a program’s creation could be read and understood if a person knew where to look. Coding is another art form, reflecting human intention in its design.

Works Cited

“App Store” Apple. Accessed October 24, 2017.

Claude E. Shannon, “A Mathematical Theory of Communication.” The Bell System Technical
Journal 27 (October 1948): 379–423, 623–656.

David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition. CreateSpace Independent Publishing Platform; Creative Commons Open Access: http://computingbook.org/.

“Learn Python.” Codecademy. Accessed October 24, 2017. https://www.codecademy.com/en/courses/learn-python/lessons/strings–console-output/exercises/strings?action=lesson_resume.


Jeannette Wing, “Computational Thinking.” Communications of the ACM 49, no. 3 (March 2006): 33–35.

Martin Irvine. Key Concepts in Technology: Week 7: Computational Thinking & Software. Accessed October 25, 2017. https://www.youtube.com/watch?v=CawtLHSC0Zw&feature=youtu.be.