# Thread: Computer Minds and Reversible Turing Machines.

1. Computations usually generate waste heat, i.e. entropy. This is mostly caused by the operation of erasing information: the energy of the erased bit has to go somewhere, and that somewhere is the environment, as waste heat. This creates problems as circuits get smaller and smaller.

Reversible algorithms avoid this by never erasing information. They are symmetrical in time. If you compute 1+3=4 in normal computing, the +, the 1 and the 3 are lost afterward: given 4 you have no idea how it was arrived at. Was it 1+3, 2+2, 2*2, 1*4? Reversible computing never erases anything, so if you have 1+3=4 you can run it backwards and see that, in this case, the 4 came from 1+3. No information is dumped, and so, theoretically speaking, there is no innate entropy increase.
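A minimal sketch of the difference in Python (the pairing trick here is just an illustration of the idea, not any specific published construction):

```python
# The usual addition discards its operands: given only 4, there is no
# way to tell whether it came from 1+3, 2+2, 2*2, or 1*4.
# A reversible variant keeps one operand alongside the sum, so the map
# (a, b) -> (a, a + b) is a bijection and can be run backward.
def add_forward(a, b):
    return (a, a + b)

def add_backward(a, s):
    return (a, s - a)

state = add_forward(1, 3)         # (1, 4)
recovered = add_backward(*state)  # (1, 3): the inputs come back
```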

There is no entropy decrease either, and in practice there will always be some entropy increase due to errors and so forth, but this is reduced the more microscopic the circuit, entropy being a macroscopic property. So even in practice, an optimally miniaturized reversible computation has almost no entropy increase, the only source being errors caused by cosmic rays and other such disturbances from the environment. An optimally miniaturized non-reversible (normal) computation, on the other hand, produces huge amounts of waste heat as a matter of course.

Any normal, non-reversible computation can be converted to a reversible computation.
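One hedged sketch of how such a conversion can work: any step that loses information can be made reversible by emitting the lost information as extra output, which is the idea behind the standard history-keeping construction. The halving example below is my own illustration.

```python
# Irreversible step: integer halving discards the low bit.
def step(x):
    return x // 2

# Reversible version: also output the discarded bit, so the pair
# (x // 2, x % 2) determines x uniquely and the step can be inverted.
def step_rev(x):
    return (x // 2, x % 2)

def step_rev_inverse(half, bit):
    return 2 * half + bit

assert step_rev_inverse(*step_rev(13)) == 13  # nothing was lost
```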

Now consider an AI mind running as a reversible computation. It could be our mind. The interesting bit is that the computation works equally well running from the future to the past as it does running from the past to the future. There is no innate direction of time to the computation.

The non-reversible version, by comparison, quite definitely follows our own sensation of moving from the past to the future. Our normal, conscious sensation of time is replicated in the underlying algorithm creating us.

But in a reversible computation there is no such correspondence. Because it is perfectly symmetrical with respect to time, there is one interpretation of the algorithm that corresponds to our perception of time, but there is a second, equally valid interpretation that has the whole thing going backward. If you were handed the entire algorithm and an example of it running, and you didn't know what it was, you wouldn't be able to say which direction in time was the beginning and which was the end.
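A tiny sketch of that symmetry, using two invented reversible steps: read backward, the run is just another valid sequence of reversible steps, and nothing in the trace itself marks which direction is "forward".

```python
# Two reversible steps and their inverses, applied in reverse order.
forward = [lambda x: x + 5, lambda x: x ^ 3]
backward = [lambda x: x ^ 3, lambda x: x - 5]

state = 10
for f in forward:
    state = f(state)   # 10 -> 15 -> 12
for g in backward:
    state = g(state)   # 12 -> 15 -> 10: the same trace, reversed
assert state == 10
```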

It would be like a conscious AI palindrome. So if we had such an algorithm, and we happened, without knowing it, to be viewing it "backward", would we interpret it as a mind running backward, or would we come up with some other interpretation that corresponded to a different mind still running forward?

2.

3. What if the time algorithm was an infinitely recursive loop? I mean, you'd need a hard drive the size of the universe, but with your reversible computation technique it might be quite feasible.

4. Erase data, huh?

You don't erase data, you overwrite it. Some bits end up being the same. I'm not really sure where you're going with this. Are you more or less referring to caching results so that problems don't need to be solved a second time? Your erase idea has little theory behind it; in fact, as I said, we overwrite stuff, not erase it.

An overwrite is not at all the same as a true erase.

5. By erased I mean lost, sorry, I guess I should have said "overwritten."

Overwriting means the original information is lost to the classical algorithm, and as information is physical it has to go somewhere, in this case into the environment as waste heat.

From http://en.wikipedia.org/wiki/Reversible_computing

Probably the largest motivation for the study of hardware and software technologies aimed at actually implementing reversible computing is that they offer what is the only potential way to improve the energy efficiency of computers beyond the von Neumann-Landauer limit [1] of kT ln 2 energy dissipated per binary operation, where k is Boltzmann's constant of 1.38 × 10^-23 J/K, and T is the temperature of the environment into which unwanted entropy will be expelled.

There are two major and closely-related types of reversibility that are of particular interest for this purpose: physical reversibility and logical reversibility. A process is said to be physically reversible to the extent that it results in no increase in physical entropy, or in other words is isentropic. Although no real physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well-isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known.
Where I am going with this is the consideration of a reversible mind algorithm. Any non-reversible algorithm can be converted to a reversible one. So if a software version of a human was made, and was reversible, and we were running it backward, is it possible that, apart from an interpretation as a mind running backward, it would also have an interpretation as a different kind of mind still running forward? Though it would need to be isolated from the external world to protect its temporal view, so the algorithm would have to include its own little environment for the mind.

Edit: And a little more on the erase/overwrite idea. Basically when you overwrite something you need to displace the information that was already there, at least in traditional computing.

From http://www.powermanagementdesignline.com/howto/52600761

John von Neumann, the famous early 20th-century mathematician and physicist (and proponent of the "von Neumann machine") suggested a limit in a 1949 lecture that was based on fundamental thermodynamic considerations. In modern terms, the intuition behind this limit was that whenever you overwrite a storage location (or logic node) with a new value, the previously contained information must be displaced to the environment, since information is (physically speaking) indestructible.
In other words, classical computing generates a lot of entropy from all the information that gets overwritten (which I referred to as being erased, apologies if this has confused some. I'll make sure to say "overwrite" instead of "erase" in the future.)

6. In other words, classical computing generates a lot of entropy from all the information that gets overwritten (which I referred to as being erased, apologies if this has confused some. I'll make sure to say "overwrite" instead of "erase" in the future.)
Well yes, this is certainly true. It is also, however, unavoidable. Even the address registers that access the memory need to be overwritten in order to access another portion of memory. The internal stack space (work area) also would need to be overwritten, as would the CPU registers. In all reality what you're suggesting is merely a form of caching. It would yield little difference in energy usage, though it may yield some time savings. You have to remember that in order to process any information through a microprocessor the internal registers have to change with the data. So even just retrieving previously calculated information would result in many overwrites and transformations.

7. In all reality what you're suggesting is merely a form of caching.
It is certainly related to caching, but I think in saying "merely" you may have missed the point of the theory of reversible computing and not just how it differs from non-reversible computing but, more importantly, why it differs.

Also, reversible computing is not something I made up; I only recently stumbled across the concept while looking into the absolute theoretical limits to computation given the known (and speculated) laws of physics.

Primarily, reversible computing is about incorporating thermodynamics into the principles of computation. There are fundamental thermodynamic differences between normal, non-reversible computation of the kind we are used to and reversible computation. Basically, reversible computation can be pushed further than non-reversible computation: by not dissipating old information as waste heat, the circuits can be smaller, and can exceed the theoretical limits of non-reversible computation as far as speed and efficiency go.

From http://www.zyvex.com/nanotech/reversible.html (Emphasis mine.)

As we pack more and more logic elements into smaller and smaller volumes and clock them at higher and higher frequencies, we dissipate more and more heat. This creates at least three problems:

* Energy costs money.
* Portable systems exhaust their batteries.
* Systems overheat.

When a computational system erases a bit of information, it must dissipate ln 2 x kT energy, where k is Boltzmann's constant and T is the temperature. For T = 300 Kelvins (room temperature), this is about 2.9 x 10^-21 joules. This is roughly the kinetic energy of a single air molecule at room temperature.

Today's computers erase a bit of information (in the sense used here) every time they perform a logic operation. These logic operations are therefore called "irreversible." This erasure is done very inefficiently, and much more than kT is dissipated for each bit erased.

If we are to continue the revolution in computer hardware performance we must continue to reduce the energy dissipated by each logic operation. Today, because we are dissipating much more than kT, we can do this by improving conventional methods, i.e., by improving the efficiency with which we erase information.

An alternative is to use logic operations that do not erase information. These are called reversible logic operations, and in principle they can dissipate arbitrarily little heat. As the energy dissipated per irreversible logic operation approaches the fundamental limit of ln 2 x kT, the use of reversible operations is likely to become more attractive. If current trends continue this should occur sometime in the 2010 to 2020 timeframe. If we are to reduce energy dissipation per logic operation below ln 2 x kT we will be forced to use reversible logic.
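The figures in both quotes are easy to check: at T = 300 K the Landauer bound kT ln 2 works out as follows.

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0         # room temperature, K

# Minimum energy dissipated per erased bit at room temperature.
landauer = k * T * math.log(2)
# landauer is roughly 2.87e-21 J, matching the ~2.9e-21 J quoted above.
```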
It is not just a software thing either, it is more of a hardware consideration. This from http://en.wikipedia.org/wiki/Toffoli_gate (Emphasis mine.)

Reversible gates have been studied since the 1960s. The original motivation was that reversible gates dissipate less heat (or, in principle, no heat). In a normal gate, input states are lost, since less information is present in the output than was present at the input. This loss of information loses energy to the surrounding area as heat, because of Thermodynamic entropy. Another way to understand this is that charges on a circuit are grounded and thus flow away, taking a small charge of energy with them when they change state. A reversible gate only moves the states around, and since no information is lost, energy is conserved.

More recent motivation comes from quantum computing. Quantum mechanics requires the transformations to be reversible but allows more general states of the computation (superpositions). Thus, the reversible gates form a subset of gates allowed by quantum mechanics and, if we can compute something reversibly, we can also compute it on a quantum computer.
(A Toffoli gate is a reversible logic gate that can be used instead of AND gates, which are not reversible. NOT gates are already reversible, so a combination of Toffoli gates and NOT gates can build any required logic circuit.)
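A quick truth-table check of the Toffoli gate's reversibility (my own sketch, enumerating all eight three-bit inputs):

```python
from itertools import product

# Toffoli (controlled-controlled-NOT): flips the target bit c only when
# both controls a and b are 1. With c = 0 the output target is a AND b.
def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

states = list(product((0, 1), repeat=3))
# Bijective on all 8 three-bit states, and its own inverse:
assert len({toffoli(*s) for s in states}) == 8
assert all(toffoli(*toffoli(*s)) == s for s in states)
```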

It came to my attention while I was looking up the absolute limits of physical computation. I was reading about computers the size of planets. We already have trouble keeping CPUs cool enough to operate now; imagine the same problem when you make a computer the size of a planet. The issue of dumping overwritten information into the environment becomes a critical problem, because for the entire system apart from the outer shell, the environment of any particular processor is more processors.

BTW, I wasn't originally meaning for this to be a dissertation on reversible computing. It was the idea of mind algorithms running on reversible computers not knowing which direction was the future that interested me. I wasn't expecting the background idea of reversible computation to be that controversial (though it does have its critics.)

8. The easier solution is to move to a light-based computing model where the laws of thermodynamics no longer apply.

The use of a light-based processor would eliminate most of the problems this article describes. Also, there has been some research into tri-state logic to help greatly reduce the amount of power needed, the third state being NULL, neither true nor false. It would float between the two and use no power. A form of analog computing has also been researched where, instead of bits that represent two states, each bit would represent multiple states; this would greatly increase the amount of combinations and numbers a series of these bits could represent. It also would require less energy to transition between states.

I just don't see what this article suggests as being used for much outside of some research. There are better solutions that require less work and yield more substantial long term results.

9. Using light doesn't change the limit if you are still using non-reversible computing. Light is energy, and if you overwrite information that energy has to go somewhere.

So if you have a light-based system that uses non-reversible logic, and a light-based system that uses reversible logic, the reversible system can run faster and use less energy. The non-reversible system is stymied by the von Neumann-Landauer limit; the reversible system is not.

Which is, however, beside the point because I wasn't ever discussing whether we should invest in reversible or non-reversible computation technology development. I was thinking about AI in a reversible computation environment. You know, just thinkin' about minds and stuff. This is the philosophy forum, right?

10. Hey, what the hell, we're off on a tangent anyway, and you've obviously thought about this kind of thing.

Where do you see computation possibly going in the future, (In)sanity? Not just short term, but long after humans are a distant memory. I only know of reversible computation because I was looking into just such a topic, and it seemed energy and energy dissipation become serious issues once computers get large (they already are, but this is mainly because current CPUs are fairly inefficient; there's still a lot of slack that can be cut out.)

SquirrelWorshipper: I've been working on the assumption that the computed mind (plus environment) took up a finite number of bits in total. Endless loops would kind of mess things up.

11. If you do a little research on optical computing you will find it does in fact use far less power and generates far less heat. It also has the potential to compute at faster speeds due to the slight difference in the speed of light vs the speed of electrons.

In the end I think you'll find parallel computing will be the way to go, perhaps with analog light-based CPUs. We can move to 64-bit (doing so now) and then 128-bit later; however, this pales in comparison to a more analog or multi-position bit: 2<sup>32</sup> vs say 64<sup>32</sup>. Even with current technology this uses less power at the gates, as most of the transitions would require little movement in state. The problem is the circuitry needed for this is a bit complex at the moment. I know some people are researching this concept; not sure how far they have got.
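The state-count comparison in that post can be made concrete: 32 two-level bits give 2^32 states, while 32 sixty-four-level "bits" give 64^32, which equals 2^192.

```python
binary = 2 ** 32       # states representable by 32 two-level bits
multilevel = 64 ** 32  # states with 64 levels per position

# 64 = 2**6, so 64**32 = 2**(6*32) = 2**192.
assert multilevel == 2 ** 192
```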

12. Fair enough. I consider that a sort of mid-level speculation, looking to the future but not too far. The stuff that brought reversible computation to my attention was of a more extreme nature, related to the absolute physical limits of computation and heat dispersion, where the only boundary is the laws of physics. Not a practical position, I admit, but I think still interesting.

Say we take such an optical system and try to build the biggest computer we possibly can. Do you get a sense of how adding reversible computation on top of every other trick might be advantageous? Even in a purely light-based system, it has to be built of actual physical components. Overwriting means energy has to be dissipated. If you have the components too close together, this dissipated heat will either reach a noise level that destroys the computations or, even worse, actually melt the components!

You can get around this by spreading the components out, but then you introduce time lags, because information cannot get from one part of the apparatus to the other faster than the speed of light. (Quantum computation notwithstanding, and even then lightspeed places a limit on the rate at which you can get answers from the device.) There are also limits on how much information you can pack into a single quantum of electromagnetic energy. Too much will become uncontrollable and, at the limit, form a black hole.

How large and how fast were you thinking as absolute limits on the size of these optically based devices? Do you disagree that, on top of other innovations, reversible computing might be useful?

Reversible computing is actually still theoretical; I don't think any practical devices have surfaced yet, though people at MIT and elsewhere are working on it. Still, I find it interesting. It will probably be realised before large-scale quantum computing, and is more likely to be implemented as a practical solution to the desire for more speed.

13. I agree with what you're saying. I just feel a more analog approach will yield more results in the near future. As I pointed out, one can process an extreme amount more information on the same footprint using bits that represent multiple values. Let's say we took it from 64 levels to 256 levels. Each bit could represent what we now call a byte. Use 32 of these bits and the amount of data one could process in one cycle would be over 32 times what we can now (if not more). In the end, clock speeds would yield far more work being done and use less energy. Reversible computing would not be needed, as the transitions between steps would be very slight. You would not be switching a bit on and off; you would merely be changing its charge slightly, plus or minus. That brute-force destruction of a bit would be gone.
