Information/entropy relationship (Topic #112961)

What is the relationship between information and entropy? This issue is taken up elsewhere in the map. Claude Shannon introduced the concept of information entropy, so named because the relevant equations have exactly the same form as those for thermodynamic entropy. Follow the cross-link for more.
References (1)
[1] Landauer's Principle
Citation: Hooker, Robert Perry (Computer Science grad student, University of Montana)
Cited by: Peter Baldwin, 6:01 AM 4 July 2011 GMT
Citerank: (1) #403464 Complexity-based explanation: Conscious life with memories and a psychological arrow, being highly complex, cannot evolve or exist in an environment at or near thermodynamic equilibrium, and a closed system not at thermodynamic equilibrium will evolve toward it. Therefore we should expect to see entropy increasing.
Excerpt: In 1961, Rolf Landauer published a landmark paper titled "Irreversibility and Heat Generation in the Computing Process", in which he observed that irreversible computational processes emit heat. Landauer's principle is significant because it provides a direct link between physics and information theory, and, as of this writing, the full importance of this connection is not fully understood. [1]
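For context (this bound is standard in the literature, though it is not quoted in the excerpt itself): Landauer's principle puts a lower limit on the heat Q dissipated when one bit of information is irreversibly erased,

    Q \geq k_B T \ln 2,

where k_B is Boltzmann's constant and T is the temperature of the environment. At room temperature (T ≈ 300 K) this works out to roughly 3 \times 10^{-21} J per erased bit.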
Thirteen years prior to Landauer's discovery, Claude Shannon launched the field of information theory with his seminal paper "A Mathematical Theory of Communication". In this paper, Shannon defined the notion of information entropy, which quantifies the information content of any particular message. [2] Interestingly, information entropy H was so named NOT because of a direct link to the thermodynamic quantity S, but because the formulas for S and H look exactly the same...
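To see the formal identity the excerpt refers to (standard statements of both quantities, added here for reference): for a probability distribution {p_i}, the Gibbs entropy of statistical mechanics and Shannon's information entropy are

    S = -k_B \sum_i p_i \ln p_i
    H = -\sum_i p_i \log_2 p_i

The two expressions differ only by the constant factor k_B and the base of the logarithm (natural log for S, base 2 for H, so that H is measured in bits).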