Could Information Be Energy?

John Vandivier

This article documents my admittedly ignorant wondering about whether information might be a kind of energy.

Having recently begun studying very basic information theory, it wasn't long before I ran into the concept of the <a href="http://www.evolutionnews.org/2012/08/conservation_of063671.html">conservation of information</a>. Having taken some science back in the day, particularly chemistry, I thought this sounded a bit like the <a href="http://en.wikipedia.org/w/index.php?title=Second_law_of_thermodynamics&oldid=602037190">Second Law of Thermodynamics</a>. Let's review some definitions:

  • The Law of Conservation of Information: Raising the probability of success of a search does nothing to make attaining the target easier, and may in fact make it more difficult, once the informational costs involved in raising the probability of success are taken into account.
  • The Second Law of Thermodynamics: The entropy of an isolated system never decreases, because isolated systems always evolve toward thermodynamic equilibrium, a state of maximum entropy.
  • The First Law of Thermodynamics, better known as the Law of Conservation of Energy: Energy can be neither created nor destroyed, but can change form. (For instance, chemical energy can be converted to kinetic energy in the explosion of a stick of dynamite.)
  • Entropy (information theory): A measure of the uncertainty in a random variable.
  • Entropy (thermodynamics): A measure of the number of specific ways in which a thermodynamic system may be arranged, often taken to be a measure of disorder, or of progress toward thermodynamic equilibrium. (A quick sketch contrasting these two entropy measures follows this list.)
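Since these two entropy definitions drive the whole argument, here is a minimal Python sketch contrasting them. It's my own illustration, not drawn from either linked source, and the function names are mine: Shannon entropy H = -sum(p * log2(p)) for a probability distribution, and Boltzmann entropy S = k_B * ln(W) for a count of arrangements W. Both grow with the number of possibilities.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def boltzmann_entropy(num_arrangements):
    """Boltzmann entropy S = k_B * ln(W), where W counts arrangements."""
    k_B = 1.380649e-23  # Boltzmann constant in J/K
    return k_B * math.log(num_arrangements)

# Both measures grow as the number of possibilities grows: a uniform
# distribution over N outcomes has Shannon entropy log2(N), and a system
# with N possible arrangements has Boltzmann entropy k_B * ln(N).
for n in (2, 4, 8):
    uniform = [1 / n] * n
    print(f"N={n}: H={shannon_entropy(uniform):.2f} bits, "
          f"S={boltzmann_entropy(n):.3e} J/K")
```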
What I will now try to show is that the entropy of information is equivalent to the entropy of energy, not merely at a semantic level but in a real sense:
  1. The probability of success of a single uniform search is 1/N, where N is the number of possible outcomes (the number of elements in the sample space).
  2. If we decrease the size of the sample space (I'm not sure that's the correct information-theory term, but it is the term in probability theory), the success of the search becomes increasingly likely, and with the same action we also reduce the uncertainty in the system. While I'm not sure this is the formal definition of entropy in information theory, entropy can be approximated by 1 - P(success) = 1 - 1/N, a number between 0 and 1 where 0 means the system is fully deterministic, or fully non-entropic. (For reference, the formal Shannon entropy of a uniform search over N outcomes is log2(N), which likewise shrinks with the sample space; see the sketch after this list.)
  3. In thermodynamics, entropy is a measure of the number of ways a system can be arranged. While it's not clear to me how this number could change, the fact that it can change is sufficient for my purpose here. If the number of ways a system can be arranged is the size of the sample space, we again find that a smaller sample space means a less entropic, more deterministic system.
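To see steps 1 through 3 numerically, here's a quick Python sketch. The entropy_proxy function is my own formalization of step 2's rough measure, not a standard information-theoretic quantity. It shows that shrinking the sample space raises the probability of success while lowering both the proxy and the formal Shannon entropy:

```python
import math

def success_probability(n):
    """Step 1: a single uniform search over n outcomes succeeds with probability 1/n."""
    return 1 / n

def entropy_proxy(n):
    """Step 2's rough proxy: 1 - P(success), between 0 and 1.
    0 means fully deterministic (only one possible outcome)."""
    return 1 - success_probability(n)

# Shrinking the sample space raises the chance of success and lowers
# both the proxy and the formal Shannon entropy of a uniform search.
for n in (16, 8, 4, 2, 1):
    print(f"N={n:2d}  P(success)={success_probability(n):.3f}  "
          f"proxy={entropy_proxy(n):.3f}  Shannon={math.log2(n):.2f} bits")
```

Running it shows P(success) climbing from 0.063 to 1.0 as N falls from 16 to 1, while both entropy measures drop to 0.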
In short, both measures are substantively, mathematically, intuitively, and semantically similar. In conclusion, it seems possible that information can be seen as a form of energy. If so, what might that imply? It would imply many things, but for one it would slightly modify the Law of Conservation of Information.

From:

Information in a closed system does not increase.

To:

Information in a closed system neither increases nor decreases.

It also begets a couple of new questions. Whereas the community was previously predisposed to consider information, or at least consciousness, as an emergent property of matter, we can now rationally wonder what many have intuitively wondered before: could matter be an emergent property of information or consciousness? Lastly, perhaps in light of new findings neither of those two ideas works, and the relationship between matter and information becomes more akin to the relationship between water and ice: the same essential substance in a different form.