For some reason, I have been thinking about negentropy this week. The term has different meanings in different contexts. I would like to consider it in relation to information theory. The central tenet is about the relationships between energy and information. Every bit of information has attached to it an energy cost. This was neatly illustrated by Maxwell’s Demon and further refined by Szilard’s Engine – there is no free lunch in physics. I don’t profess to have an amazing understanding of physics – far from it – but I think I can grasp some of the implications from biological and geopolitical perspectives. Had I spent more effort in gaining information about physics when I was in the sixth form (those were the days!) at school instead of playing darts in the pub at lunchtimes, my negentropy score might be a bit higher! But my dart scores would be lower. Curiously, darts is a game where your score decreases as the game progresses. I will not explore this observation further in relation to negentropy, because I don’t really understand it.
So, the Frothy Filosofer’s simple guide to negentropy is as follows….
Entropy is the degree to which the universe is moving towards decay and disintegration – it is a natural state. We can view it at both molar and molecular levels. The former is at the level of the universe, the latter is at the level of the behaviour of individual cells. A flower blooms and dies, and at every stage there is an exchange of energy between the organism and the environment.
The concept of negentropy can be traced back to the work of Erwin Schrödinger (he of cat fame), but the term itself was coined by Brillouin. Negentropy is the opposite of entropy – a force moving towards growth and organisation. This relies on information (in its broadest sense within, for example, Shannon’s information theory), and of course therefore entails an energy cost.
Simply stated, every bit of information reduces entropy. However, the closely related laws of thermodynamics tell us that energy can neither be created nor destroyed, and that the total entropy of a closed system never decreases. So a reduction of entropy in one part of a system must be paid for by an increase in another part.
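To put a number on “every bit of information” – a minimal sketch using Shannon’s entropy formula (the function name is mine, just for illustration) – consider a fair coin: before it lands there are two equally likely outcomes, so one bit of uncertainty; once we learn the outcome, the uncertainty drops to zero. That drop of exactly one bit is the information gained.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before the coin lands: two equally likely outcomes -> 1 bit of uncertainty.
before = shannon_entropy([0.5, 0.5])  # 1.0

# After we learn the outcome: a single certain outcome -> 0 bits.
after = shannon_entropy([1.0])
```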
Implications in Psychology
I am particularly interested in learning theory in relation to negentropy. When we first learn to undertake an activity, we often make mistakes and tend to be inefficient. We expend a lot of energy to achieve our goals. The essence of being skilled at a task is that we perform it with high efficiency and hence with minimal energy cost. Think how our daily routine activities demand little from our cognitive resources – the behaviours are habitual and almost automatic (we can perform them on “autopilot”).
We can relate this to sporting activities too. Of course, I am interested in running at the moment. Training is about becoming more energy efficient, so my technique is developed to make better use of the energy used by my muscles, and my cardiovascular system adapts to the new demands. The latter is measured by the VO2 max (the maximum amount of oxygen my body can use, measured in millilitres per kilogram of body weight per minute – it increases as fitness levels increase) and more simply by the time it takes my heart rate to return to a resting baseline after physical exertion. Measurement of oxygen consumption can be used as a proxy for measuring how well someone is learning a task.
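As a quick units check on the VO2 max definition above (illustrative numbers only, not my own scores): a 70 kg runner with a VO2 max of 50 ml/kg/min can use at most 3.5 litres of oxygen per minute at peak effort.

```python
vo2_max = 50       # ml of oxygen per kg of body weight per minute (illustrative)
body_mass = 70     # kg (illustrative)

# Peak oxygen consumption in litres per minute.
litres_per_min = vo2_max * body_mass / 1000  # 3.5
```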
Implications in Geopolitics
The wealth and power of nations can be understood in terms of the energy resources they control – food, oil, nuclear and so forth. At a more molecular level, it is the animals that can secure access to and control of energy resources that are often more successful in mating – the strongest and cleverest get the food needed to rear their young. There is a whole lot of other fascinating stuff, such as how food rations were calculated during the war based on the energy expended (in calories) by different people in society, but this is not the time to go into those aspects.
I have been running – improving my efficiency and VO2 max scores.
Others have been running for political office – to what end? I don’t often comment directly on politics, but it strikes me that the US presidential campaigns were based more on misinformation than on information. Now, strictly speaking, information is neither positive nor negative – it always tells us something. Knowing that the coin did not land heads-up tells me that it was tails (unless, improbably, it landed on its edge). But suppose the quality of the information does make a difference. If information reduces entropy (chaos and uncertainty), then it is possible that misinformation has a negative effect on negentropy (sorry about the double negatives!) and thus actually increases the degree of uncertainty….
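One way to make that double negative concrete – a toy sketch using cross-entropy, which is my choice of illustration rather than anything the pollsters actually computed – is this: when reality follows one set of probabilities but misinformation leads us to believe another, our average surprise (cross-entropy) is always at least the true entropy, and usually more. Misinformed beliefs leave us, in effect, more uncertain than honest ones.

```python
import math

def entropy(p):
    """True uncertainty, in bits, of a distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def cross_entropy(p, q):
    """Average surprise, in bits, when reality follows p but we believe q."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

truth = [0.5, 0.5]         # the election really was a coin toss
misinformed = [0.9, 0.1]   # misinformation skews our beliefs towards one outcome

h_true = entropy(truth)                      # 1.0 bit
h_believed = cross_entropy(truth, misinformed)  # > 1 bit: effectively more uncertain
```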
So, this uncertainty was made manifest in the failure of polls to predict the result. And, perhaps more worryingly, we have a result that makes the world at large a place of increased uncertainty, chaos, and disintegration.
For your information, I’m just going to focus on my running. That is something I can control.
Please sponsor me for the 2017 Brighton Marathon and support MIND to help people affected by mental health problems.