I'm sorting out my bookshelves and, as usually happens, browsing is slowing the work down.
A book on analog(ue) computing (ref. below) reminded me of an old project. Can't really follow it now - it must be my brain that's slowed down. Anyway, I thought I'd write a note as the silly season is coming to an end.
Early analogue machines were mechanical: bombsights, gunsights and similar nasties were very sophisticated.
The technique became quite widespread with the introduction of electronics. Quantities are usually represented by voltages, all arithmetic functions are available, and you arrange to integrate and differentiate with respect to time.
Horrendously difficult differential equations, and more, solved for mathematical dummies while you watch.
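The direct mapping of an equation onto integrators can be sketched digitally. Below is a minimal simulation of a classic analogue patch for a damped oscillator, x'' + 2ζωx' + ω²x = 0: one summer forms the highest derivative from fed-back signals, and two integrator stages (each the counterpart of one op-amp integrator) recover x' and x. The parameter values are illustrative, not from the original post.

```python
# Digital sketch of an analogue-computer patch solving
#   x'' + 2*zeta*w*x' + w^2*x = 0   (damped oscillator).
# Each integration step below stands in for one op-amp integrator;
# the summing line stands in for the inverting summer.

def simulate(zeta=0.2, w=2.0, x0=1.0, v0=0.0, dt=1e-4, t_end=5.0):
    x, v = x0, v0          # initial conditions, set on the integrators
    trace = []
    t = 0.0
    while t < t_end:
        a = -2 * zeta * w * v - w * w * x  # summer: forms x'' from feedback
        v += a * dt                        # first integrator: x'' -> x'
        x += v * dt                        # second integrator: x' -> x
        trace.append((t, x))
        t += dt
    return trace

trace = simulate()
```

The point of the exercise is visible in the wiring: the block diagram is the equation, which is why end users found these machines easy to follow.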
An advantage for end users (as opposed to mathematicians and programmers) is that implementations are relatively easy to understand, because the problem is mapped more-or-less directly onto the hardware. Programmers didn't like it because of the hard-wiring, though you don't need to know much about electronics.
There was renewed interest in analogue computing in the 1970s, when the key component - the operational amplifier - became very inexpensive, and digital computers could be used for timing and to control the electromechanical relays (hybrid computing). I used a very simple machine to model metabolic processes in different body compartments, and saw a big setup being used to model vehicle suspension systems. Digital computers were nowhere near powerful enough for that. Even now, you need a supercomputer or BOINC.
The technique seems to have died out, although with modern technology the main practical difficulties that drove us mad would no longer exist: generating and recording arbitrary functions, accurate and inexpensive multiplication and division, relays and manual patchboards replaced by semiconductors.
The usable dynamic range of an operational amplifier is about 10^5 or 10^6; not too bad compared to some compilers still in use, but you did have to worry about scaling. Nowadays, this could be managed automatically by the digital computer.
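Scaling on those machines meant choosing, by hand, a mapping from problem units onto the amplifier's reference voltage so no stage saturated. A minimal sketch of that bookkeeping, assuming a typical ±10 V reference and an analyst-supplied estimate of each variable's maximum (neither figure comes from the original post):

```python
# Amplitude scaling as done by hand on classic analogue machines:
# each problem variable x is mapped to a machine voltage
#   X = (x / x_max) * V_REF
# so every op-amp stays inside its usable range. x_max is the
# analyst's estimate of the variable's largest expected value.

V_REF = 10.0  # assumed +/-10 V machine reference

def to_machine(x, x_max):
    """Problem units -> machine volts."""
    return (x / x_max) * V_REF

def from_machine(volts, x_max):
    """Machine volts -> problem units."""
    return (volts / V_REF) * x_max

# e.g. a temperature expected to peak at 500 degrees:
v = to_machine(250.0, 500.0)  # half of full scale
```

Automating exactly this step - estimating maxima and rescaling the patch - is what the supervising digital computer could take over.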
Digital computing techniques are getting more and more powerful, and although analogue could be reconsidered at least for add-on accelerators, I don't suppose anyone would be interested in a change of mindset. Analogue computers were proprietary commercial products. A modern implementation would be no great challenge for an electronic engineer, who would perhaps use existing control and monitoring hardware and software (high accuracy, low noise and drift, but speeds as slow as you like). You need only a very limited set of hardware module types.
However, defining suitable (presumably open) hardware and software standards would involve some effort that may or may not be justified.
Perhaps someone brighter than me and with a bit of time to spare would like to review the subject, if only from the cultural/historical point of view.
Christopher R. Lee
Wilkins, B. R., Analogue and Iterative Methods in Computation, Simulation, and Control. Chapman and Hall, 1970.
ISBN 10: 0412099608
ISBN 13: 9780412099601
Available secondhand from the usual sources.
Copyright © 2021 University of California. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation.