Reversible computing

10.04.2009 19:35

I recently stumbled upon the Wikipedia page about reversible computing. It's a concept I hadn't come across before.

The idea is that there's a theoretical limit to the efficiency of computation. Every irreversible bit operation (for example AND or OR, but not NOT; a plain two-input XOR is also irreversible unless one of its inputs is kept around, as in the CNOT gate) erases information, and each erased bit is thought to double the number of possible microscopic states of the thermodynamic system that's doing the computation, increasing its entropy by at least:

\Delta S = k\ln{2}
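
To make the information loss concrete, here's a quick Python sketch: a two-input AND gate maps four possible input states onto only two output values, so in general the input can't be reconstructed from the output.

    from itertools import product

    # Group the four possible inputs of AND by their output value.
    preimages = {0: [], 1: []}
    for a, b in product([0, 1], repeat=2):
        preimages[a & b].append((a, b))

    print(preimages)
    # {0: [(0, 0), (0, 1), (1, 0)], 1: [(1, 1)]}
    # An output of 0 could have come from three different inputs:
    # that ambiguity is the erased information the gate has to pay for.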

If the temperature T of the computer remains constant, an amount of heat must be released to the environment:

\Delta Q = kT\ln{2}

That's called the von Neumann-Landauer limit, and it comes to around 33 picowatts per gate at a 10 GHz clock and 75°C.
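
A quick back-of-the-envelope check of that figure in Python, using nothing more than the Boltzmann constant and the formula above:

    import math

    k = 1.380649e-23      # Boltzmann constant [J/K]
    T = 75 + 273.15       # 75 degrees C in kelvin
    f = 10e9              # 10 GHz clock, one bit erased per cycle

    energy_per_bit = k * T * math.log(2)   # Landauer limit per erased bit [J]
    power = energy_per_bit * f             # dissipated power per gate [W]

    print("Energy per bit: %.2e J" % energy_per_bit)    # ~3.3e-21 J
    print("Power at 10 GHz: %.1f pW" % (power * 1e12))  # ~33.3 pW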

As its name suggests, reversible computing explores the possibility of doing computation with only reversible operations, that is, logic gates with a one-to-one mapping between input and output states. In contrast to irreversible operations, there is no known lower limit to the amount of work such a gate has to do.
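
A concrete example of such a gate is the Toffoli (CCNOT) gate, which maps three bits (a, b, c) to (a, b, c XOR (a AND b)). Because it keeps its inputs around, the mapping is one-to-one, it's its own inverse, and with c = 0 it computes AND without throwing anything away. A small Python sketch:

    from itertools import product

    def toffoli(a, b, c):
        # Toffoli (CCNOT) gate: flips c only when both a and b are 1.
        return a, b, c ^ (a & b)

    states = list(product([0, 1], repeat=3))
    outputs = [toffoli(*s) for s in states]

    # One-to-one: all 8 outputs are distinct.
    assert len(set(outputs)) == len(states)

    # Self-inverse: applying the gate twice returns the original state.
    assert all(toffoli(*toffoli(*s)) == s for s in states)

    # With c = 0 the third output bit is a AND b, with no information lost.
    print(toffoli(1, 1, 0))   # (1, 1, 1)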

Such computers would be very different from what we have today. For example, a mere variable assignment in procedural languages is irreversible, since the previous content of the variable is lost. On the other hand, you could run a program backwards starting with the result and ending with the input parameters. That would be an interesting feature for a debugger.
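
As a toy illustration (with made-up operation names), a machine restricted to operations that each have an inverse can step backwards from the result to the initial state:

    # Each operation is a pair of functions: forward and inverse.
    # Only reversible updates are allowed -- e.g. "x += c" instead of "x = c".
    ops = {
        "add": (lambda x, c: x + c, lambda x, c: x - c),
        "xor": (lambda x, c: x ^ c, lambda x, c: x ^ c),
    }

    program = [("add", 5), ("xor", 3), ("add", 2)]

    def run_forward(x, program):
        for name, c in program:
            x = ops[name][0](x, c)
        return x

    def run_backward(x, program):
        # Apply the inverses in reverse order to recover the input.
        for name, c in reversed(program):
            x = ops[name][1](x, c)
        return x

    result = run_forward(1, program)       # 1 -> 6 -> 5 -> 7
    print(result)                          # 7
    print(run_backward(result, program))   # 1, the original input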

As I see it, this kind of research fits in the same box as Dyson spheres and warp drives. These limits won't be reached in practice any time soon (though they may bug the Universal AC eventually). But it is still fascinating to see just how far you can push a concept before the most basic physics starts to get in the way.

Posted by Tomaž | Categories: Ideas
