Abstraction is Everything
A computer can't really do much, in theory: read a number from memory, write it back to memory, add one to it, see if it equals zero, and maybe jump to a new location. If I remember my CS theory classes from college right, that's about it.
Which is why all the programmers in the world have been planning a secret party for the day someone finally figures out how to coax a computer into synthesizing a believable human voice, a party where we'll laugh the laughs of movie villains: "Did you see the NYT headline? 'Intelligent Computer Unveiled'! We've tricked them! All we've done is add 1 and compare to 0 over and over!"
But to claim that the humble premise of a computer makes it a stupid device is a bit misleading, I suppose. It is a bit like atoms. There isn't much to a proton, neutron, or electron, but the ravioli I had for lunch is a pretty impressive accomplishment. Something simple, repeated enough times at a fast enough speed, doesn't just give the illusion of complexity; it is the only possible source of it. So 1s and 0s are the Real Deal.
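To see just how far "add 1 and compare to 0" can go, here is a toy sketch in Python (the function names are mine, and I've allowed "subtract one" as well, in the spirit of a counter machine): addition is just incrementing repeatedly, and multiplication is just addition repeated.

    def add(a, b):
        # Addition from almost nothing: bump a up by one, b times,
        # using only "add one", "subtract one", and "see if it equals zero".
        # Assumes b is a non-negative integer.
        while b != 0:
            a += 1
            b -= 1
        return a

    def multiply(a, b):
        # Multiplication is just addition repeated, which is "+1" repeated.
        total = 0
        while b != 0:
            total = add(total, a)
            b -= 1
        return total

    print(multiply(6, 7))  # -> 42

Nothing clever is happening in there, and that's the point: pile up enough of the trivial steps and you get arithmetic for free.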
It all boils down to Abstraction. Programmers deal with the inherent simplicity of a computer by creating metaphors for themselves that can be reused as building blocks for increasingly complicated systems. So out of bits come bytes, from bytes come characters, from characters come strings, and so on. Pretty soon you have buttons on a screen and MP3 music playing in the background. It works wonderfully, and it is the fundamental practice that enables the technology we enjoy today. When developers look at code and ooh and aah in admiration, it's for one of two things: low-level feats of magic or elegant abstraction.
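As a tiny illustration of that ladder, here is a Python sketch (the bit pattern is one I made up) where each layer is just a reinterpretation of the one below it:

    # bits -> bytes -> characters -> a string, one rung at a time
    bits = "01001000 01101001"                       # raw 1s and 0s
    byte_values = [int(b, 2) for b in bits.split()]  # group the bits into bytes: [72, 105]
    raw_bytes = bytes(byte_values)                   # a bytes object: b'Hi'
    text = raw_bytes.decode("ascii")                 # bytes become characters, characters a string
    print(text)                                      # -> Hi

At no point did the 1s and 0s go anywhere; we just agreed to read them at a higher level.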
But abstraction comes with a catch. Users at each layer of abstraction are largely ignorant of the finer-grained layers beneath them. This is absolutely necessary for dealing with the complexity of our world -- we couldn't function at higher levels of thought if we had to be concerned with every detail underpinning existence. Have you ever tried to make a Twinkie by hand? Me neither. But not understanding the building blocks of an abstraction becomes a big risk if the abstraction turns out to be flawed. Most people take it for granted that a Twinkie is a particular abstraction of food, in the delicious category. But what if, due to some design flaw at the molecular level, a Twinkie were actually a petroleum-based product? You'd never know! And incorporating that abstraction into your diet would carry the error right through your system.
So it is useful to allow ourselves to be ignorant of the finer-grained details beneath an abstraction, but it can also be hazardous not to understand those details when it matters.
This is why learning to be a good programmer is an art of mindset rather than an art of syntax. Learning a programming language is different from learning to program. A programming language introduces you to a specific set of operators and abstractions, one particular perspective on controlling a computer. But really learning to program is the art of abstraction itself: realizing when new metaphors are necessary, seeing through the ones that already exist, and constructing new ones that generalize beyond the problem at hand.