I've been programming for a long looong time now, not only in years (over 30) but as a proportion of my lifespan. I had some formal training in programming during the early stages of a degree course, but the vast majority of my time spent programming has been using techniques and knowledge that I've accrued through practice. I've made many mistakes, and in the beginning at least, I rarely had anyone telling me that my approach or technique was a bad one.
To give some perspective on just how far back we're talking here, my first programming experience was on Research Machines Link 380Z machines at school. My first program was intended to print the time, but I managed to put a bug into a two-line program, and instead of the time it printed "Tim" over and over again. Given that's my name, I was instantly hooked. It also shows just how terrible I was at ensuring my code was correct, and that's something that's barely changed over nearly 20 years of professional programming.
I've written in a variety of languages, starting with BASIC on a ZX-81 at my dad's house, followed by BASIC on a Dragon 32 at home (and similar experience with Spectrums and Commodore machines). I modified production code written by someone else while trying to complete a game on the Dragon, and had a foray into assembly on the BBC Micro and Spectrum.
Various diversions around making music (Soundtracker and Octamed being the instruments of choice) and graphics followed, before I wrote a program to generate horoscope birth charts, which was my last non-professional step before the games industry.
Writing games is very, very different to just about any other programming pursuit. The hardware is always constrained compared to any desktop machine. The expected outcome of a game is something visually stunning, audibly impressive, containing animation, effects, and a whole host of other gloss, while essentially performing a simulation of a world running to some constrained set of rules - and by the time I started programming games, this was all expected to happen in real time. In other words: assuming you've got an NTSC telly, you've got exactly 1/60th of a second to figure out the result of simulating 1/60th of a second of your game world, and also to draw that result.
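To make that concrete, here's a minimal sketch (in C, which I'll use for all the examples in this post) of the kind of per-frame loop that constraint implies. The update/draw/vblank hooks are just placeholders I've made up, not code from any real game:

```c
/* Sketch of the real-time budget described above: simulate 1/60th of a second
 * of the world, draw it, and be done before the next NTSC frame starts.
 * The update/draw/vblank hooks are hypothetical placeholders. */
#include <stdbool.h>

#define FRAME_SECONDS (1.0f / 60.0f)   /* one NTSC frame */

static int frames_left = 600;          /* stand-in for "is the game still running?" */

static void update_world(float dt) { (void)dt; /* advance the simulation */ }
static void draw_world(void)       { /* render the current state */ }
static void wait_for_vblank(void)  { /* on real hardware: block until the display is ready */ }

int main(void)
{
    while (frames_left-- > 0) {
        update_world(FRAME_SECONDS);   /* figure out 1/60th of a second of the world... */
        draw_world();                  /* ...draw the result... */
        wait_for_vblank();             /* ...and fit all of it inside one frame */
    }
    return 0;
}
```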
It's incredible just how much you can achieve in that timescale (especially so with modern hardware like the PS3), but even back when I was writing game code for the SEGA Genesis, a game like Lemmings - simulating the behaviour of tens of critters all walking around in a world - was pretty darned cool. Of course, to get those results (including all the shiny graphics) you had to cut every corner possible, shave every instruction you could, and cheat like a mofo in every action you took.
To give a concrete example: when writing Shadow of the Beast for the Mega CD, instead of accurately modelling what happened when the beast responded to the jump button (how heavy is he? how fast is he moving upwards? apply gravity for 1/60th of a second... how fast is he moving now? where is he now?), we simply had a table of numbers which we added to his current screen height, which looked roughly like this: 7, 5, 4, 3, 2, 1, 2, 1, 0, 0, -1 ... Yes, I typed that list in by hand and tweaked it a few times till it looked right. No, it's nowhere near an accurate simulation of a velocity curve, but who's counting?
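For the curious, that trick looks something like this in C (the real thing was 68000 assembly, and every name here apart from the numbers in the table is invented for illustration):

```c
/* The hand-tweaked jump table described above, sketched in C rather than the
 * original 68000 assembly.  The deltas are the ones quoted in the text (the
 * original list carried on a little further); names like beast_y and
 * jump_frame are made up. */
#include <stddef.h>
#include <stdio.h>

/* Per-frame change in screen height while jumping: hand-tweaked, not physics. */
static const int jump_deltas[] = { 7, 5, 4, 3, 2, 1, 2, 1, 0, 0, -1 };
#define JUMP_FRAMES (sizeof jump_deltas / sizeof jump_deltas[0])

static int beast_y = 100;      /* current screen height */
static size_t jump_frame = 0;  /* how far through the jump we are */

/* Called once per frame while the beast is in the air. */
static void update_jump(void)
{
    if (jump_frame < JUMP_FRAMES) {
        beast_y += jump_deltas[jump_frame];  /* no mass, no gravity, just a table */
        jump_frame++;
    }
}

int main(void)
{
    for (size_t i = 0; i < JUMP_FRAMES; i++) {
        update_jump();
        printf("frame %zu: y = %d\n", i, beast_y);
    }
    return 0;
}
```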
Every action in a program has a cost, in terms of how much memory it uses and how long that action takes to perform. On the earlier consoles, even something as simple as multiplying two numbers was expensive - if you could stick to multiplying by powers of two, that was a lot cheaper. Don't even think about using floating point values! And when you've got eight registers and 32k of main memory to play with, you make damn sure you don't waste even a byte of it (and sometimes you make damn sure you're not wasting bits, either).
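In C-ish terms, the kind of penny-pinching I mean looks something like this - shifts instead of multiplies, fixed point instead of floats, and flags packed into single bits. The names and values are just for illustration, and the fixed-point bit is one common workaround rather than anything specific I described above:

```c
/* A few of the cost-cutting habits mentioned above, sketched in C.
 * The struct-of-flags and the specific numbers are invented for illustration. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Multiplying by a power of two is just a shift - cheap on a CPU where a
     * general multiply costs many cycles. */
    int x = 13;
    int times_eight = x << 3;               /* same as x * 8 */

    /* Fixed point instead of floating point: store 1/256ths in an integer. */
    int16_t speed = 384;                    /* 1.5 in 8.8 fixed point (1.5 * 256) */
    int16_t pos = 0;
    pos += speed;                           /* advance by 1.5 units per frame */

    /* Don't waste bits: pack several one-bit flags into a single byte. */
    enum { FACING_LEFT = 1 << 0, ON_GROUND = 1 << 1, INVULNERABLE = 1 << 2 };
    uint8_t flags = 0;
    flags |= ON_GROUND;

    printf("%d %d %u\n", times_eight, pos >> 8, (unsigned)flags);
    return 0;
}
```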
Pretty much every one of these tricks is terribly bad if you look at a program in terms of how likely it is to contain errors and bugs. It starts with questions like: is your math correct? (Obviously not!) But it leads on to: is the program performing the same actions as it was a second ago? (Well, no - I just wrote some self-modifying code there because I needed to reuse the memory, so it's definitely not.) Is the data in memory you just created from that calculation the same as it was a second ago? (Well, no - I re-used that memory to do something totally different a few milliseconds back.) Does that function give the same results when you give it the same input? (It should, but it doesn't - because that global variable there got changed by some other function elsewhere and now the result is twice as big.)
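That last one is worth a tiny illustration. Here's a toy C version - all the names are made up - of a function that looks like a pure calculation but isn't, because a global tweaked somewhere else changes its answer:

```c
/* A toy version of the last problem above: a function that should be a pure
 * calculation of its inputs, but isn't, because shared global state changes
 * its result.  All names here are invented for illustration. */
#include <stdio.h>

static int damage_multiplier = 1;   /* global state, shared by everyone */

/* Looks like a pure function of its argument... */
static int damage_for_hit(int base_damage)
{
    return base_damage * damage_multiplier;
}

/* ...but some unrelated code elsewhere quietly changes the global. */
static void enter_rage_mode(void)
{
    damage_multiplier = 2;
}

int main(void)
{
    printf("%d\n", damage_for_hit(10));  /* prints 10 */
    enter_rage_mode();
    printf("%d\n", damage_for_hit(10));  /* same input, now prints 20 */
    return 0;
}
```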
Over the years, I've moved from simple to complex to insanely complex hardware, and from teams of one or two people to teams of 50 people. The largest programming team I've worked on had over 100 people actively changing code during the last couple of weeks of crunch.
The resulting code is a mess. Elegant? Sure. Fantastically clever? Sure. Brilliantly constructed given the constraints? Always.
Correct? Never.
Every single game I've ever worked on had at least one bug in it when it shipped. Some had hundreds, or thousands.
Of course, you can blame my lack of skillz as a programmer (and indeed many people have!), but ultimately the reason these bugs are there is twofold. First, the techniques and methods used to write the software make the bugs possible. Second, those same techniques and methods make the bugs mostly invisible except under certain circumstances.
There has to be a better way, surely?
If you're expecting an unequivocal yes, then you've obviously never written a program before. But there are options. See part two of this series for more details.