Computer Coding & Language – Lauren Neville

My understanding of human language actually first came from my experience coding in JavaScript and HTML, and from realizing the extent to which syntax and structure must convey meaning from the human to the computer. Every word has to be placed appropriately before it can form a command or sentence-like structure, and those structures are then nested and looped within one another to create complex architectures.
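To show what I mean, here is a minimal toy example in plain JavaScript (my own illustration, not my original project code): a single command sits nested inside a loop, which is itself nested inside another loop, and the placement of each word determines what the computer does.

    // Each "word" must sit in its proper place; nesting and looping build structure.
    for (let row = 0; row < 3; row++) {      // outer loop: repeat the whole "sentence"
      let line = "";
      for (let col = 0; col < 3; col++) {    // inner loop nested inside the outer one
        line = line + "*";                   // each pass adds one small unit
      }
      console.log(line);                     // prints "***" three times
    }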

Of course, I have learned that coding is not language, because language is not writing but rather words, rules, and interfaces. In computer coding there is a very strict set of rules and grammars that must be followed. Within my JavaScript platform, every word I used came from a library that had been created beforehand, and each of those words carried a very specific meaning.
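For example (a small illustration of my own, not from the readings), a built-in word like toUpperCase works because the JavaScript library already defines it, while an invented word like shoutLoudly is simply rejected:

    // Only words the platform already "knows" are accepted.
    console.log("hello".toUpperCase());    // "HELLO": toUpperCase comes from the built-in library
    // console.log("hello".shoutLoudly()); // made-up word: this line would throw a TypeError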

Comparing this experience to the readings about language, I was able to understand the unlimited ways in which language itself can work. Words, contexts, and syntax are constantly remixed to form new understandings, from evolving vernaculars to new words coined through morphology, like "Googleable." The lexicon of a single language grows endlessly and allows for generative, combinatorial phrasing.

So while computer coding is not language by Steven Pinker's definition, it is clearly modeled after our understanding of language, and it uses the affordances of recursion and combinatoriality through looping logic statements. Additionally, while many programs offer limited libraries of commands, such as "Circle" or "Canvas," we can also define new objects within code by declaring variables and functions and then referencing them as newly coined words. This, in many ways, is how we bring a diverse lexicon and new meanings into the very rule-oriented, structured system built into coding platforms.
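Here is a rough sketch of that idea in plain JavaScript (the word blossom is my own hypothetical, not a library command): once it is defined, the new word can be used just like any built-in one.

    // Defining a new "word" and adding it to the working lexicon.
    function blossom(size) {        // blossom did not exist until this line
      return "*".repeat(size);      // built out of words the library already knows
    }
    console.log(blossom(5));        // the new word is now usable: prints "*****"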

My question, however, is whether coding limits language. As Pinker noted, computers cannot understand context very well, and the same word in language can have many meanings depending on context; in code, a word can only be defined as one thing at a time. I suppose the future of AI lies in building programs modeled more closely on language, in which words can be rearranged in infinite ways and carry many contextual components.
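A tiny illustration of that limit (again my own example): in English, "bank" can mean the side of a river or a place that holds money depending on context, but in code the name holds exactly one definition at a time.

    // In code, a name points to only one meaning at a time.
    let bank = "side of a river";
    bank = "place that holds money";   // the new meaning replaces the old one entirely
    console.log(bank);                 // only "place that holds money" survives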


Jackendoff, Ray. 2002. Foundations of Language: Brain, Meaning, Grammar, Evolution. Oxford: Oxford University Press.

Pinker, Steven. 2012. Linguistics as a Window to Understanding the Brain.