Has Computer Programming Really Changed Much Since Lovelace's Time?
Everyone talks about new programming languages and about how much better one is than another. But has computer programming really changed all that much over time?
“Wow. You work in the computer industry. It must be a real challenge keeping up, given how quickly things in that industry change.”
That’s a common sentiment you’ll hear when the person with whom you’re making casual conversation discovers that you work in the information technology field. In such circumstances, I typically try to enhance my mystique by playing along with such assertions, talking up what a challenge it is to work in the computer field while stressing how smart, talented and handsome one must be to excel in it. But the truth of the matter is, it’s all just a ruse.
The more things change, the more they stay the same …
If you want to know the truth, programming really hasn’t changed all that much since Ada Lovelace hacked out some code for Charles Babbage’s Analytical Engine back in the 1840s.
Computers are useful because they can do three things well.
First, computers can manipulate and store vast amounts of data. Sure, an iPod can store far more data than a Commodore 64 of yesteryear, but the basic truth hasn’t changed over time: computers are useful because they can manipulate and store data.
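To make that concrete, here’s a trivial Java sketch (the variable names and numbers are invented purely for illustration) of the age-old routine of declaring some data, storing it and manipulating it:

```java
public class DataDemo {
    public static void main(String[] args) {
        // Point #1: declare and initialize some data
        int songsOnIpod = 2000;
        int songsOnCommodore64 = 1;   // roughly, on a good day

        // ...and manipulate it
        int difference = songsOnIpod - songsOnCommodore64;
        System.out.println("Difference in capacity: " + difference + " songs");
    }
}
```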
Second, computers can do conditional logic. Basically, a computer can process an if statement, performing some logic if a condition is true and other logic if that condition is false. Branching on the data the computer is maintaining is really the only impressive trick a computer program is capable of performing.
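Here’s an equally contrived Java example (again, the names and values are made up) of a program branching on the data it holds:

```java
public class ConditionDemo {
    public static void main(String[] args) {
        // Point #2: branch on the data the program is maintaining
        int freeMemoryKb = 38;   // about what a Commodore 64 left you for BASIC

        if (freeMemoryKb < 64) {
            System.out.println("Better keep this program small.");
        } else {
            System.out.println("Plenty of room. Load the whole album.");
        }
    }
}
```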
Third, computers are fast. You can throw a whole bunch of data manipulations (Point #1) and conditional logic (Point #2) into a while loop that iterates a million times, and the whole process will complete within the blink of an eye. But again, the fact that computers are fast isn’t a revelation of modern-day programming. Sure, a modern processor cycling three billion times per second (3 GHz) is certainly faster than a VIC-20 running at a million cycles per second (1 MHz), but this difference in speed is just that: a difference; and as impressive as it is, it hasn’t had any fundamental effect on how we program computers.
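And here’s a small, self-contained Java sketch (again, purely illustrative) that wraps some data manipulation (Point #1) and conditional logic (Point #2) in a loop that runs a million times (Point #3), then reports how quickly it finished:

```java
public class LoopDemo {
    public static void main(String[] args) {
        long total = 0;
        long start = System.nanoTime();

        // Point #3: a million iterations of Points #1 and #2
        for (int i = 0; i < 1_000_000; i++) {
            if (i % 2 == 0) {        // conditional logic
                total += i;          // data manipulation
            }
        }

        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
        System.out.println("Sum of the even numbers below a million: " + total);
        System.out.println("Finished in roughly " + elapsedMillis + " ms");
    }
}
```

On just about any modern machine, that loop finishes in a handful of milliseconds, which is the whole point.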
Evolution without revolution
Of course, that’s not to say the way we program computers hasn’t evolved. Certainly the manner in which we interact with computers has changed. We now input data using touchscreens instead of punch cards, and we view the responses a computer generates on a high-definition LCD monitor instead of a green screen. Computer languages have certainly evolved too. Java, for example, is an object-oriented programming language, which means it provides better facilities for organizing data (Point #1). And newer languages like Scala and Clojure have evolved to squeeze more performance out of the multi-processor machines (Point #3) that are becoming cheaper and cheaper these days. But no matter what the language is, or how the syntax varies from one programming language to another, they all boil down to the same three basic things: managing data by declaring and initializing variables, performing conditional logic with if…else semantics, and using various types of loops to ensure that all of this gets done a heck of a lot faster than a million monkeys sitting in front of a million typewriters.