Comparing this experience to the readings about language, I began to understand the nearly unlimited ways that language itself can work. Words, contexts, and syntax are constantly remixed to form new understandings, from vernaculars to new words coined through morphology, like “Googleable.” The lexicon of a single language grows endlessly and allows for generative, combinatorial phrasing.
So while computer code is not language by Steven Pinker’s definition, it is clearly modeled on our understanding of language and uses the affordances of recursion and combinatoriality through looping logic statements. Additionally, while many programs offer limited libraries of commands, such as “Circle” or “Canvas,” we can also define new objects within code by declaring variables and then referencing them as newly defined words. This, in many ways, is how we bring a diverse lexicon and new meanings to the very rule-oriented, structured system built into coding programs.
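The ideas above can be sketched in a few lines of code. This is a minimal illustration in Python (the original does not name a specific coding environment, so the `circle` function and the variable names here are hypothetical stand-ins for built-in commands like “Circle”):

```python
# Built-in vocabulary: a stand-in for a program's "limited library" of commands.
def circle(size):
    return f"a circle of size {size}"

# Defining a new "word": a variable bound to a meaning we chose ourselves.
sun = circle(100)

# Recursion: a phrase embedded inside a phrase of the same kind.
def nested_circles(depth):
    if depth == 0:
        return "a circle"
    return f"a circle containing ({nested_circles(depth - 1)})"

# Combinatoriality through looping: recombining pieces into new "phrases."
shapes = [circle(s) for s in [1, 2, 3]]

print(sun)               # the new word stands in for its definition
print(nested_circles(2)) # recursive embedding, as in sentences within sentences
print(shapes)            # many new combinations from a small vocabulary
```

Just as the readings describe for natural language, a small fixed vocabulary plus rules for combination and embedding yields an open-ended set of expressions.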
My question, however, is whether coding limits language. As Pinker noted, computers cannot handle context very well, and the same word in natural language can carry many meanings across varying contexts. In code, a word can be bound to only one definition at a time. I suppose the future of AI lies in building computer programs modeled more closely on language, in which words can be rearranged in infinitely many ways and carry many contextual components.
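The contrast with polysemy can be made concrete. In this hypothetical Python sketch, a name holds exactly one definition at a time; rebinding it replaces the old meaning rather than adding a second, context-dependent one:

```python
# In natural language, "bank" means both things at once, and context
# selects the reading. In code, a name has exactly one binding.
bank = "the side of a river"
bank = "a place that holds money"  # rebinding replaces, not accumulates

print(bank)  # only the latest definition survives
```

This is one way of seeing the limit the paragraph describes: the program has no mechanism for keeping both senses alive and choosing between them by context.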
Jackendoff, Ray. 2002. Foundations of Language: Brain, Meaning, Grammar, Evolution. OUP Oxford.
Pinker, Steven. 2012. Linguistics as a Window to Understanding the Brain.