Mainly from Prof Najm’s office hours. Analogue computers aren’t really used any more, and they’re no longer an active topic of academic research. The main reason is noise: every analogue stage adds noise to the signal, which limits how many amplifier stages can usefully be chained in an analogue computer. They’re also usually single-purpose and rigid compared to general-purpose digital computers.
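A toy sketch of the noise problem (my own illustration, not from the notes): if each analogue stage adds a little independent noise, the error compounds with chain depth, so long amplifier chains drift far from the ideal signal.

```python
import random

random.seed(0)

def chain_error(stages, trials=2000, noise_std=0.05):
    # Average absolute deviation from the ideal signal value of 1.0
    # after passing through `stages` noisy unity-gain analogue stages.
    total = 0.0
    for _ in range(trials):
        signal = 1.0
        for _ in range(stages):
            signal += random.gauss(0, noise_std)  # each stage adds noise
        total += abs(signal - 1.0)
    return total / trials

# Error grows roughly with the square root of the number of stages,
# so a 100-stage chain is about 10x worse than a single stage.
print(chain_error(1), chain_error(100))
```

The stage count and noise level here are arbitrary; the point is only that analogue errors accumulate rather than being corrected along the way.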
Some modern analogue computers convert their signals to digital and back to analogue, since the digital form can be stored and regenerated without degradation, so the signal is preserved.
Digital computers are easy by comparison: signals snap to the discrete logic levels they’re meant to take, so small amounts of noise get restored away at every stage, and the machines are general-purpose and reliable.
Analogue computers also can’t match digital’s computational power. It’s hugely difficult to design analogue chips or circuits that aren’t badly affected by noise and signal loss, and the design time investment is much higher. Compare that with digital chips, where programmable arrays like FPGAs let us rewire a design quickly and easily.
New methods of analogue computing promise to accelerate AI computations (especially matrix multiplication) in a way that’s very different from the computers of the past. For example, analogue camera- or audio-based front ends can run at much lower power and still do their job, such as detecting the condition that turns on a digital circuit.
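A hedged sketch of the crossbar idea often cited for analogue AI acceleration (my illustration; the notes don’t specify a scheme): store each matrix weight as a conductance G, apply inputs as voltages V, and physics does the multiply-accumulate, since Ohm’s law gives I = G·V per cell and Kirchhoff’s current law sums currents on a shared wire. The whole matrix-vector product happens in one step, at the cost of analogue noise on the result.

```python
import random

random.seed(0)

def crossbar_matvec(G, V, noise_std=0.01):
    # Each output current is a noisy dot product of one conductance
    # row with the input voltage vector (Ohm's law + current summation).
    out = []
    for row in G:
        current = sum(g * v for g, v in zip(row, V))
        out.append(current + random.gauss(0, noise_std))
    return out

G = [[0.2, 0.5], [0.1, 0.9]]   # weight matrix encoded as conductances
V = [1.0, 2.0]                 # input vector encoded as voltages
exact = [1.2, 1.9]             # the true matrix-vector product
print(crossbar_matvec(G, V))   # close to exact, up to analogue noise
```

The trade-off shown here is the whole story in miniature: a free, parallel multiply-accumulate from physics, but with noise on every output instead of exact digital arithmetic.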