I bought a new Dell Inspiron notebook on behalf of a friend and thought I'd better run it up and make sure all was well (including the driver for the very groovy Dell 922 scanner/printer) before I sent it over to her. You can't fault Dell for their machinery - well built, modestly stylish (it's a computer after all!) and with after-sales support second to none. The thing that really surprised me was how many bits of rubbish they pre-install. After the first boot (authenticating online etc.) I timed how long it took to power up and reach the desktop - 4 minutes 37 seconds! Once I'd removed the various 90-day try-out versions of EVERY Norton product (see here for what I think of their products!) and put a good anti-virus package on there it was booting in 46 seconds. And why do they bundle installers for all the worst ISPs as well? Having AOL9 pre-loaded and in the startup seems like it should be almost illegal! Anyhow - it is a tidy little laptop.
This got me wondering about the larger relationship between software and hardware, and I thought back twenty years to when I was doing a degree in maths and programming. At the time object-oriented languages were thin on the ground (Smalltalk was the only one I had any exposure to), so whether you worked in a procedural language (C, Pascal, COBOL etc.) or a functional or logic language (LISP, Prolog), the developer could see how the abstraction mapped onto the final assembled code. All the programmers I graduated with (for example) could also code in Z80 and x86 assembler - not too many programmers can do that today. There was even the idea of using induction to prove code correct. This was an idea that came out of defence contracting: you showed that a routine (typically in LISP) was good for every case by showing it worked for the base case of n=1 and then showing that if it worked for n it also worked for n+1. You could have confidence that a subroutine or even a whole program would work properly every time. Fast-forward to a few years ago when I was still running engineering for Resolution and I had to commission a database application that would support a 24-7 reality TV show, and the programmer refused to stand by what he wrote - there was no way he'd be on call-out to fix any bugs we discovered down the line. It seems that as software development has moved into the realms of bolting objects together the idea of robustness has gone (get a programmer to explain the idea of garbage collection and why it is necessary in modern languages).
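To make the induction idea concrete, here's a minimal sketch - in C rather than LISP, and purely illustrative rather than anything from a real contract. The argument is the classic one: show the routine is right for n=1, then show that if it's right for n it must be right for n+1, and you've covered every n.

```c
#include <stdio.h>

/* Claim: sum_to(n) returns 1 + 2 + ... + n for every n >= 1.
 * Base case: sum_to(1) returns 1 - true, by the first branch.
 * Inductive step: assume sum_to(n) is correct; then sum_to(n + 1)
 * returns (n + 1) + sum_to(n) = (n + 1) + (1 + ... + n),
 * which is exactly 1 + ... + (n + 1). So the claim holds for all n >= 1. */
unsigned long sum_to(unsigned long n)
{
    if (n <= 1)
        return n;                 /* base case */
    return n + sum_to(n - 1);     /* inductive step */
}

int main(void)
{
    printf("sum_to(10) = %lu\n", sum_to(10));   /* prints 55 */
    return 0;
}
```

That's the whole trick: once the two steps are argued, you trust the routine for every input, which is a very different mindset from shipping it and waiting for the bug reports.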
This is well illustrated if you compare the state of the two biggest NLE applications in the late '90s - Avid and Lightworks.
- Avid v.7, which if you used/installed/fixed it you'll know, was a capable application that needed a 300MHz computer to run on and typically required 256 megs of memory. It crashed often and took many minutes to boot. Judging by the multitude of error messages I got familiar with, it was written in at least four languages and had components in it that hadn't been re-written in a decade (I know v.1 came out in 1991 but even today it has elements of EditDroid, the Lucasfilm application that Avid acquired in the late '80s).
- Lightworks v.6, by comparison, was a stable product that crashed once a month (on average), booted in under thirty seconds (due in part to DTX - they wrote their own disk handler rather than relying on an off-the-shelf OS!) and had incredibly modest hardware requirements - a 66MHz '486 with 64 megs of RAM! It also came on three floppies!
Having supported several Lightworks suites for three years, I was horrified at how bad Avid was when we got our first few at Oasis. I knew one of the guys at Lightworks and he told me that it was all written in C++, with some C and assembler for the time-critical bits.
Compare all this to how far hardware has come - back when I was doing that degree the 25MHz '386 was the fastest cheap processor around, and even then people were sounding the death-knell for the CISC architecture. The Von Neumann bottleneck was perceived as a real problem for traditional processor design, and RISC chips (especially the T800 series of transputers) were going to save IT. But hardware design got better and none of those problems has held back the Moore's Law express train.