I wanted to see how my conception of “language” changed as a result of the readings and video this week, so I first wrote down some keywords I associated with the term. I came up with: culture, history, speech, understanding, communication, and transmission.
After doing the readings, I must admit I was guilty of the sociological problem Jackendoff describes: laypeople are especially susceptible to the Dunning-Kruger effect when it comes to linguistics. I think this most likely stems from the fact that, as Jackendoff and Pinker both mention, we acquire and deploy these incredibly elaborate linguistic skills from such a young age that we're never really cognizant of the process. Even more so if, as Chomsky proposes, we're hard-wired for it. I was really blown away by the sheer amount of behind-the-scenes cognitive work (or, as Jackendoff calls it, “f-mental” or “f-language”) that must be taking place in the child learning her or his way around language. Pinker's segment on structure-dependent rules was a great illustration of this. I personally had no idea these rules even existed, yet I immediately knew something was off when they were altered. It was intuitively clear to me, and it reminded me of a post I saw a few weeks ago.
Another thing I learned from the Jackendoff reading is just how scientific and complex this area of study is. I was struck by the similarity between the illustrations of linguistic rules and structures and the illustrations of chain reactions you'd find in a chemistry textbook. Which, I suppose, is what sentences are: chain reactions that prompt understanding and meaning. The sequencing and hierarchy of the elements within both linguistic and chemical chains are crucial to their outcome. The wrong linguistic sequencing leads to an illogical or awkward sentence, whereas the wrong chemical sequencing leads to a totally different formula or compound (I think…I'm not a chemist).
I'm interested in the efforts to mirror or mimic these linguistic structures in computers and AI, so when Jackendoff talked of “structures built of discrete combinatorial units”, I was reminded of the discussion we had a couple of classes ago about Morse and the foundations of modern computing. Stripping computing down to its fundamentals in binary illustrated how crucial chaining and coupling are to computing. It seems there may be some kinship between this conceptualization of computing and linguistics.
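That kinship is easy to see in a few lines of code. The sketch below is my own toy illustration, not anything from the readings: just two discrete units, dot and dash, combining under sequencing rules into letters and words, the way Jackendoff's “discrete combinatorial units” build structures of meaning.

```python
# Toy illustration (my own, not from the readings): two primitive units,
# "." and "-", combine under sequencing rules to produce meaning.
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def encode(text: str) -> str:
    # Units chain into letters; letters chain into messages.
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

def decode(code: str) -> str:
    # Sequencing is everything: the same two units in a different
    # order mean something different (".-" is A, but "-." is N).
    reverse = {v: k for k, v in MORSE.items()}
    return "".join(reverse[unit] for unit in code.split())
```

Just as in language, the inventory of primitives is tiny and fixed, but the ordering rules let a small alphabet generate an unbounded set of messages.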
So after going through the assigned material, I would add the words science, structure, innate, and computing to my previous list.
A few questions I had:
- One of the critiques of Chomsky is that universal grammar may not be specific to language, but what else could it apply to? Pinker mentions vision, motor control, and memory. Does this imply that there could be hard-wired ways in which we physically move, see, and remember?
- Is the solution to the problem of pragmatics in AI linguistics to program for existing social interactions and contexts? Is there some “learning” ability that an AI should be able to exercise when it comes across unaccounted-for situations and contexts?
References:

- Ray Jackendoff, Foundations of Language: Brain, Meaning, Grammar, Evolution. New York: Oxford University Press, 2003.
- Steven Pinker, Linguistics as a Window to Understanding the Brain (video), 2012.