Training in Samuel Morse's electrical telegraph code was a prerequisite for completing my tenth-grade mandatory summer camp back in Syria. I didn't know then that this method of transforming "patterns of electrical pulses into written symbols" would inspire scientists to create modern computers. The concept behind the Morse system served as the basis for transforming computers from digital binary machines into symbol processors. As the system matured, it underwent many leaps that turned it from a number-crunching tool into a symbol-manipulating process. Over time, six principles were identified that produce computation in this seemingly complex system. Understanding the bottom-up design approach provided by these main principles will help us better understand this system and decipher its codes.
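The core Morse idea, a fixed table that maps symbols to patterns of pulses and back, can be sketched in a few lines of code. This is a minimal illustration, not Morse's full code table: the three letters below use their standard International Morse patterns, but everything else (function names, the space-separated pulse format) is my own choice for demonstration.

```python
# A fixed code table maps each symbol to a pattern of pulses
# (dot = short pulse, dash = long pulse); inverting the same table
# decodes pulses back into symbols. Only a few letters are included
# here for illustration.
MORSE = {"S": "...", "O": "---", "E": "."}
DECODE = {pattern: letter for letter, pattern in MORSE.items()}

def encode(text):
    """Encode text as space-separated pulse patterns."""
    return " ".join(MORSE[ch] for ch in text.upper())

def decode(pulses):
    """Decode space-separated pulse patterns back into text."""
    return "".join(DECODE[p] for p in pulses.split(" "))

print(encode("SOS"))          # ... --- ...
print(decode("... --- ..."))  # SOS
```

The round trip works because the mapping is one-to-one: the same shared table that imposes meaning on pulses at the sender recovers it at the receiver.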
In his video, Professor Irvine explained thoroughly how the binary system, which has only two states, was used to transform digital binary computers into symbol processors. The system uses binary electronics and logic, along with base-2 math, for encoding and processing computations. It relies on electronics because they provide the fastest physical structures for registering and transmitting signals that we can encode, and it uses electricity to impose a pattern on a form of natural energy. Imposing this pattern, together with assigning human symbolic meanings and values to physical units, creates a unified subsystem to build on. We can add different layers on top of this subsystem to transform inputs into outputs for any technology. This process helps us understand computation and the components of the computer system.
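The step from two-state electronics to symbols can be made concrete with a short sketch. Assuming the standard ASCII/Unicode convention that assigns each character a number, the snippet below shows how a human symbol becomes a base-2 bit pattern and back; the helper names are my own for illustration.

```python
# A symbol is assigned a number (its standard ASCII/Unicode code
# point), and that number is stored as a base-2 pattern of bits,
# the only two states the underlying electronics must distinguish.
def to_bits(ch):
    """Return the 8-bit binary pattern for a single character."""
    return format(ord(ch), "08b")

def from_bits(bits):
    """Recover the character from its 8-bit binary pattern."""
    return chr(int(bits, 2))

print(to_bits("A"))           # 01000001
print(from_bits("01000001"))  # A
```

As with Morse code, the pattern itself carries no meaning; the shared convention layered on top of the two physical states is what makes it a symbol.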
In their book Great Principles of Computing, Denning and Martell introduce six principles of computing: communication, computation, coordination, recollection, evaluation, and design. The authors emphasize that these principles are tools used by practitioners in many key domains and are considered the fundamental laws that both empower and constrain technologies. Today's computing technologies use combinations of principles from all six categories. Each category carries a different weight in a given technology, but a combination of the six exists in any technology we examine today. The bottom-up approach stems from the fact that these principles work as the basis (bottom) that supports technologies' domains (up). Knowing that computing as a whole depends on these principles is very intriguing; it opens the door to questioning and investigating how these principles work and interact to produce new discoveries.
Understanding the subsystems and layers that compose computers, and the way computing principles support any technology we use today, helped me make sense of this complex system. However, as a novice in the field of technology, grasping how these principles support other domains will be one of my learning objectives in this class.
Denning, Peter J., and Craig H. Martell. Great Principles of Computing. Cambridge, MA: The MIT Press, 2015.