Converting units - done badly

In February 1968, I wrote my first computer program.  It is a little scary to realise that I have been doing that for nearly fifty years.  I was in my "gap year" before going to university and had a job attached to a team of physics researchers working on new semiconductors.  Their analysis and model building were done in the computer language Algol, which provided the basis of many later and more powerful languages; it was a very good first language.  However, Algol in its original form did not include input/output routines as standard, which was a bit of a handicap.  The computer staff we worked with in 1968 had devised their own, which included a feature I have never seen since: data could be read in either as exact values or to the accuracy of the number of significant decimal digits in the input stream, and the program offered this choice to the modeller.  So if the data read 1968, that might mean 1968.000000000, or somewhere in the range 1967.5 to 1968.5 (all this was converted t…
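The original Algol routines are long gone, but the idea is easy to sketch. Here is a minimal illustration in Python (the function name and the use of `Decimal` are my own, not anything from that 1968 system): a numeric token is read to the accuracy of its written digits, so the true value is taken to lie within half a unit of the last digit.

```python
from decimal import Decimal

def as_interval(token: str) -> tuple[Decimal, Decimal]:
    """Interpret a numeric token to the accuracy implied by its
    significant decimal digits: the true value lies within half a
    unit in the last place written."""
    value = Decimal(token)
    # Exponent of the last written digit: "1968" -> 0, "19.68" -> -2.
    half = Decimal(1).scaleb(value.as_tuple().exponent) / 2
    return (value - half, value + half)

# "1968" read to significant digits means somewhere in [1967.5, 1968.5];
# "1968.000000000" pins the value down far more tightly.
print(as_interval("1968"))
print(as_interval("1968.000000000"))
```

Reading `1968` this way yields the interval 1967.5 to 1968.5, exactly as the modeller's choice described above, while writing more decimal places narrows the interval accordingly.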
