This week provided almost an evolutionary timeline of computation, which was super cool. Starting the readings with Prof. Irvine’s video, we learn that computation is information processing in the sense of information transformation for all types of digitally encodable symbolic forms. This is done through binary electronics, in which we impose a structure on electricity as sequences of on/off states and then assign symbolic values to those physical units. From this we created the modern digital computer and computer systems that “orchestrate” (combine, sequence, and make active) symbols that mean (data representations) and symbols that do (programming code) in automated processes for any programmable purpose.
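To make that idea concrete for myself, here is a tiny Python sketch (my own toy illustration, not something from the readings) of how the same string of on/off states only becomes meaningful once we impose a symbolic interpretation on it:

```python
# Toy sketch (my own example): the same raw bits can be read as a number or
# as a character depending on the symbolic interpretation we impose on them.
bits = "01000001"              # a sequence of on/off states

as_number = int(bits, 2)       # interpret the bits as an unsigned integer -> 65
as_character = chr(as_number)  # interpret that value as a text symbol -> 'A'

print(bits, "->", as_number, "->", as_character)
```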
Then we go into computing principles with Denning and Martell, who further define computation as dependent on specific practices and principles, where each category of principle is a perspective on computing. This image sums up the initial chapter:
Then we make a big jump into Machine Learning, which is the next step of computation in part because the world has regularities: we can collect data of example observations and analyze it to discover relationships. Machine Learning involves the development and evaluation of algorithms that enable a computer to extract (or learn) functions from a dataset (a set of examples). This is done through algorithms that induce (or extract) a general rule (a function) from a set of specific examples (the dataset) plus assumptions (the inductive bias). Following this is Deep Learning, another derivative of computation, introduced as the subfield of Machine Learning that focuses on the design and evaluation of training algorithms and model architectures for modern neural networks, using mathematical models loosely inspired by the brain. I see this as an evolution of Machine Learning because Deep Learning has the ability to learn useful features from low-level raw data, and complex non-linear mappings from inputs to outputs, rather than having a human specify every feature (correct me if I’m wrong, but features are the input variables that describe each example in a dataset). Deep Learning was spurred on by Big Data, which raises some notable ethical questions regarding privacy that I would love to further dissect. Overall this means that Deep Learning models can often be more accurate than Machine Learning models that rely on hand-engineered features, especially on very large datasets.
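To check my own understanding of “inducing a general rule from specific examples,” here is a minimal Python sketch (again my own toy example, not from the readings): the dataset below was generated by the regularity y = 2x + 1, and a simple least-squares fit recovers that rule from the examples alone.

```python
# Toy sketch (my own example): "learning" a function from a dataset of examples.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.0, 5.0, 7.0, 9.0, 11.0]   # examples generated by the regularity y = 2x + 1

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least squares for a single feature: induce slope and intercept
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"learned function: y = {slope:.1f}x + {intercept:.1f}")  # y = 2.0x + 1.0
```

The inductive bias here is the assumption that the relationship is a straight line; as I understand it, Deep Learning swaps that hand-chosen form for a neural network that can learn much more complex, non-linear mappings.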
It is honestly inspiring and jaw-dropping to see the jump from Dartmouth to Machine Learning and now Deep Learning. So many questions still exist, but now I have a decent grasp that the devices I’m using to create this post consist of humans imposing symbolic meaning on electricity that at its root is just 1s and 0s, and that through the layers of my computer system those states become comprehensible images. From that, we have evolved computers from devices that store and transport data into machines capable of learning from data. I’m still curious about the nature of Deep Learning and its difference from, and applicability to, our issues today as opposed to Machine Learning. Also, what is noise?
Best,
Chloe
References