Anyway, now that we've started experimenting and have an intuitive feel for input and output, let's explore the space between input and output. Let's talk about computing itself.
We begin with a trip into the world of wacky anthropomorphisms. Let's say you're a chess player. You know you need to be really smart to play chess well. On the road to becoming a better chess player, you'll learn things about the game that will help you teach other people to play well, strengthen your logical reasoning, and grow as a competitor.
Now consider the following: for $7 USD, you can buy software, Fritz, that will turn your computer into a machine that can destroy you at chess. This must mean the computer is quite brilliant and intelligent, right? Perhaps the computer could teach us how to play, and maybe installing Fritz has increased the computer's ability to think logically?
As it turns out, Fritz's real strength is its ability to select extremely strong chess moves for a given position. Your computer's ability to think logically is actually unchanged by installing Fritz. The features of Fritz that give amateur chess lessons aren't significantly affected by improvements to Fritz's ability to play the game.
Computers...aren't actually very smart. If your computer isn't very smart, how can $7 software make your computer defeat every human you know at chess?
For a little more than $7, your computer can take on the best human players in the world.
The secret is that Fritz is little more than an enormous list of simple instructions, and your computer's one great talent is executing instructions, in order, astonishingly fast. This execution of instructions in the pursuit of an optimal or best-effort decision is the "computing" to which this series refers. Programming a computer is the act of writing instructions, in order, that will compute the desired output for any given input.
When your program runs, the computer executes each instruction in order as fast as it can. No more, no less. Ever. Other devices may assist users of a program with input and output (e.g. a digital camera will turn light hitting a sensor into a form your phone can understand as input for its camera app), but computing is about that space between the input and output, where fast execution of specific instructions allows us to make machines do what machines have never done in human history.
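To make that concrete, here is a minimal sketch (in Python, my choice of language; the example and its details are not from this post) of a program as nothing more than an ordered list of instructions that turns an input into an output:

```python
# A toy program: the computer executes each instruction, top to bottom,
# turning an input (a list of numbers) into an output (their average).

def average(numbers):            # input: a list of numbers
    total = 0                    # step 1: start a running total at zero
    count = 0                    # step 2: start a counter at zero
    for n in numbers:            # step 3: for each number in the input...
        total = total + n        #   ...add it to the total
        count = count + 1        #   ...and bump the counter
    return total / count         # step 4: output the total divided by the count

print(average([3, 5, 10]))       # prints 6.0
```

Nothing here is clever; the computer simply does each step, in order, very quickly.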
Exercise:
- Open a cookbook or browse for an enticing recipe online. Pick a recipe that has between 3 and 5 numbered steps. All this blogging is making me hungry--I could really go for some bruschetta right now. Mmm...bruschetta....
- Imagine you have an untrained but literate, obedient, efficient, and self-disciplined child to help you prepare dinner. Your child will do exactly what he is told in the order instructed, but, unfortunately, nothing else. In addition to moving about the kitchen, your child can follow instructions like "if this is true, then take this action; otherwise, take this other action" and "go back to step 2." Now look at the recipe you selected. Does it gloss over any major details, given that the recipe is all your child has to go on when preparing the dish? Imagine how you would have to augment the recipe's instructions to avoid starvation. We'll call the result an "augmented recipe." (A toy sketch of one appears after this list.)
- Let's say you could teach your child 10 kitchen-specific skills (boiling, chopping, etc.). Which skills would simplify future augmented recipes for the largest variety of dishes?
- What are your child's inputs and outputs?
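For illustration, here is a toy "augmented recipe" written in Python rather than English. The dish, the slice count, and the timings are all invented; the point is its shape: a sequence of steps, an "if this is true, then take this action; otherwise, take this other action," and a "go back to step 2" loop.

```python
# A hypothetical augmented recipe: toasting bread for bruschetta, spelled out
# as literal, ordered instructions of the kind the child could follow.

SLICES_NEEDED = 4        # how many slices we want (invented for the example)
MINUTES_PER_SIDE = 2     # how long each side needs (also invented)

slices_toasted = 0
while slices_toasted < SLICES_NEEDED:               # "go back to step 2" until all slices are done
    minutes_on_grill = 0
    while minutes_on_grill < MINUTES_PER_SIDE * 2:  # toast both sides of this slice
        minutes_on_grill += 1                       # one minute passes (simulated)
        if minutes_on_grill == MINUTES_PER_SIDE:    # if the first side is done,
            print("Flip the slice.")                # then flip it;
        else:                                       # otherwise,
            print("Keep waiting.")                  # keep waiting.
    slices_toasted += 1
    print(f"Slice {slices_toasted} done, {SLICES_NEEDED - slices_toasted} to go.")
```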
Extra credit:
- How could you instruct your child to pay your bills?
- If you're mathematically inclined, how could you instruct a very patient child to apply Euler's method to trace out a curve specified by a differential equation?
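If you want to check your instructions for that last one, here is one possible translation of them into Python. The particular differential equation (dy/dx = y), the starting point, and the step size are assumptions made for the example; Euler's method itself only needs a slope function and somewhere to start.

```python
# Euler's method as instructions for a very patient child:
# starting at a known point, repeatedly follow the current slope
# for one small straight step, then write down where you landed.

def euler_trace(f, x0, y0, step, n_steps):
    """Trace the curve of dy/dx = f(x, y) from (x0, y0), one small step at a time."""
    points = [(x0, y0)]
    x, y = x0, y0
    for _ in range(n_steps):
        y = y + step * f(x, y)   # move along the current slope for one step
        x = x + step             # advance x by the step size
        points.append((x, y))    # record the new point on the curve
    return points

# Trace dy/dx = y starting at (0, 1); the exact answer is y = e^x.
curve = euler_trace(lambda x, y: y, x0=0.0, y0=1.0, step=0.1, n_steps=10)
for x, y in curve:
    print(f"x = {x:.1f}, y = {y:.4f}")   # y roughly follows e^x
```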
Bonus:
In modern times, we have the bumper-sticker sized phrase "Garbage in, garbage out" (sometimes "GIGO") to express that computers deal with the data we input, not the data we would have liked to input. While this is sometimes frustrating, it is also logically inevitable after brief reflection on the nature of computing as we have discussed it.
Charles Babbage is credited with creating the first design for a general-purpose computer in 1837. Most people didn't understand GIGO back then, either: Babbage recounted being asked, more than once, whether his machine would still produce the right answers if the wrong figures were put in.