Understanding how the human brain works can lead to new, exciting technologies

Symbolic cognition can bridge the gap between humans and technologies, mind and language. As Colin Renfrew argues in his article, symbolic culture has undergone different phases and transitions, from Episodic culture (characteristic of primate cognition) to External Symbolic Storage (characteristic of urban societies). Although it is hard to fully understand these transitions, one thing is clear: the human brain and human intelligence have been evolving, adjusting to their environment, conditions, and cultural background from one historic period to another.

As seen in babies and kids, much of the learning process happens through frequent repetition. As they grow up, they increasingly incorporate logic and cognitive artifacts to regulate their interactions with the world. Cognitive artifacts, according to Cole, are simultaneously both ideal (conceptual) and material. We learn a language by studying its symbols, and we use different tools to produce material products. An example of this idea would be going from studying a language, to using that language to write books, which are then stored in libraries (physical or digital), which we in turn use to develop new technologies. This will always be an ongoing process, and in studying the cognitive continuum it is important to take a step back and see how we got to where we are today, because we sometimes take technologies and tools for granted when they are clearly interconnected and always will be.

By studying cognitive science we can find ways to enhance human abilities. As Norman mentions in his article, most of these studies have come from contemporary cognitive science, despite the importance of discoveries from the early days of psychological and anthropological investigation. Let us look at an example of how we use our brain for new technologies.

NYT: "Computers Are Taking Design Cues From Human Brains"

In a recent article I read in The New York Times, "Chips Off the Old Block: Computers Are Taking Design Cues From Human Brains," new technologies are testing the limits of computer semiconductors. To deal with that, researchers have gone looking for ideas in nature, including the human brain. For a long time, computer engineers built systems around a single chip, the CPU. Now, machines are dividing work into tiny pieces and spreading them among simpler, specialized chips that consume less power. Companies like Microsoft are using neural networks to improve their products and services for their customers. Neural networks have been applied to a variety of tasks and systems, including computer vision, speech recognition, machine translation, and many other domains. Systems that rely on neural networks can learn largely on their own: the system learns by studying different patterns repeatedly, which requires a lot of trial and error, and repeated tweaking of the algorithm and the training data. In other words, my point is that even though we are creating exciting new technologies, we still need the human brain to make it all possible.
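To make the idea of "learning by repeated trial and error" concrete, here is a minimal sketch of that loop using a single artificial neuron (a perceptron) learning the logical OR function. This is a toy stand-in I constructed for illustration, not the actual systems the article describes: the neuron guesses, measures its error against the correct answer, nudges its weights, and repeats over the same examples many times, much like the repetition-driven learning discussed above.

```python
# A single artificial neuron learning logical OR by repeated trial and error.
# Each pass over the data, it adjusts its weights in proportion to its error.

# Training data: inputs and the correct OR output for each
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights, start knowing nothing
b = 0.0          # bias term
lr = 0.1         # learning rate: how big each corrective nudge is

def predict(x):
    """Fire (output 1) if the weighted sum of inputs exceeds zero."""
    s = w[0] * x[0] + w[1] * x[1] + b
    return 1 if s > 0 else 0

# Repeated exposure: many passes over the same small set of examples
for epoch in range(20):
    for x, target in data:
        error = target - predict(x)      # trial: guess, then measure the miss
        w[0] += lr * error * x[0]        # error: nudge weights toward the answer
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])     # after training: [0, 1, 1, 1]
```

After enough repetitions the neuron's outputs match the targets exactly; real neural networks scale this same correct-by-error loop up to millions of weights and examples.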


Irvine, Martin. "Introduction to Cognitive Artefacts and Semiotic Technologies."

Renfrew, Colin. "Mind and Matter: Cognitive Archaeology and External Symbolic Storage." In Cognition and Material Culture: The Archaeology of Symbolic Storage, edited by Colin Renfrew, 1-6. Cambridge, UK: McDonald Institute for Archaeological Research, 1999.

Cole, Michael. "On Cognitive Artifacts." From Cultural Psychology: A Once and Future Discipline. Cambridge, MA: Harvard University Press, 1996. Connected excerpts.

Norman, Donald A. "Cognitive Artifacts." In Designing Interaction, edited by John M. Carroll, 17-38. New York, NY: Cambridge University Press, 1991. Read pp. 17-23.

Metz, Cade. (2017, September 17). "Computers Are Taking Design Cues From Human Brains." Retrieved September 26, 2017, from http://www.post-gazette.com/news/nation/2017/09/17/Computers-are-taking-design-cues-from-human-brains/stories/