9th Posted January 23, 2014

https://www.simonsfoundation.org/quanta/20140122-a-new-physics-theory-of-life/

At the heart of England’s idea is the second law of thermodynamics, also known as the law of increasing entropy or the “arrow of time.” Hot things cool down, gas diffuses through air, eggs scramble but never spontaneously unscramble; in short, energy tends to disperse or spread out as time progresses. Entropy is a measure of this tendency, quantifying how dispersed the energy is among the particles in a system, and how diffuse those particles are throughout space. It increases as a simple matter of probability: There are more ways for energy to be spread out than for it to be concentrated. Thus, as particles in a system move around and interact, they will, through sheer chance, tend to adopt configurations in which the energy is spread out.

Eventually, the system arrives at a state of maximum entropy called “thermodynamic equilibrium,” in which energy is uniformly distributed. A cup of coffee and the room it sits in become the same temperature, for example. As long as the cup and the room are left alone, this process is irreversible. The coffee never spontaneously heats up again because the odds are overwhelmingly stacked against so much of the room’s energy randomly concentrating in its atoms.

Although entropy must increase over time in an isolated or “closed” system, an “open” system can keep its entropy low — that is, divide energy unevenly among its atoms — by greatly increasing the entropy of its surroundings. In his influential 1944 monograph “What Is Life?” the eminent quantum physicist Erwin Schrödinger argued that this is what living things must do. A plant, for example, absorbs extremely energetic sunlight, uses it to build sugars, and ejects infrared light, a much less concentrated form of energy.
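The counting argument above ("there are more ways for energy to be spread out than for it to be concentrated") can be made concrete with a short sketch. This is my own toy illustration, not something from the article: two idealized "Einstein solid" blocks share a fixed number of energy quanta, and the number of microstates for q quanta spread over n oscillators is the standard multiplicity C(q + n - 1, q). Counting the joint microstates for every possible split shows the even split dwarfing the all-in-one-block arrangement.

```python
# Toy illustration (not from the article): count microstates of two
# Einstein solids sharing energy, to show why "spread out" energy wins.
from math import comb

def multiplicity(q, n):
    """Number of ways to distribute q energy quanta among n oscillators:
    the stars-and-bars count C(q + n - 1, q)."""
    return comb(q + n - 1, q)

def shared_multiplicities(q_total, n_a, n_b):
    """Joint microstate count for each way of splitting q_total quanta
    between block A (n_a oscillators) and block B (n_b oscillators)."""
    return {q_a: multiplicity(q_a, n_a) * multiplicity(q_total - q_a, n_b)
            for q_a in range(q_total + 1)}

omega = shared_multiplicities(q_total=100, n_a=50, n_b=50)
best = max(omega, key=omega.get)
# Compare all energy concentrated in one block vs. an even split:
print(omega[0], omega[50], best)
```

For two identical blocks the maximum lands at the even split (best = 50), and the ratio omega[50] / omega[0] is astronomically large, which is the probabilistic sense in which dispersed energy "wins."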
The overall entropy of the universe increases during photosynthesis as the sunlight dissipates, even as the plant prevents itself from decaying by maintaining an orderly internal structure. Life does not violate the second law of thermodynamics, but until recently, physicists were unable to use thermodynamics to explain why it should arise in the first place.

In Schrödinger’s day, they could solve the equations of thermodynamics only for closed systems in equilibrium. In the 1960s, the Belgian physicist Ilya Prigogine made progress on predicting the behavior of open systems weakly driven by external energy sources (for which he won the 1977 Nobel Prize in chemistry). But the behavior of systems that are far from equilibrium, which are connected to the outside environment and strongly driven by external sources of energy, could not be predicted.

This situation changed in the late 1990s, due primarily to the work of Chris Jarzynski, now at the University of Maryland, and Gavin Crooks, now at Lawrence Berkeley National Laboratory. Jarzynski and Crooks showed that the entropy produced by a thermodynamic process, such as the cooling of a cup of coffee, corresponds to a simple ratio: the probability that the atoms will undergo that process divided by their probability of undergoing the reverse process (that is, spontaneously interacting in such a way that the coffee warms up). As entropy production increases, so does this ratio: A system’s behavior becomes more and more “irreversible.” The simple yet rigorous formula could in principle be applied to any thermodynamic process, no matter how fast or far from equilibrium. “Our understanding of far-from-equilibrium statistical mechanics greatly improved,” Grosberg said.

England, who is trained in both biochemistry and physics, started his own lab at MIT two years ago and decided to apply the new knowledge of statistical physics to biology.
Using Jarzynski and Crooks’ formulation, he derived a generalization of the second law of thermodynamics that holds for systems of particles with certain characteristics: The systems are strongly driven by an external energy source such as an electromagnetic wave, and they can dump heat into a surrounding bath. This class of systems includes all living things. England then determined how such systems tend to evolve over time as they increase their irreversibility. “We can show very simply from the formula that the more likely evolutionary outcomes are going to be the ones that absorbed and dissipated more energy from the environment’s external drives on the way to getting there,” he said.

The finding makes intuitive sense: Particles tend to dissipate more energy when they resonate with a driving force, or move in the direction it is pushing them, and they are more likely to move in that direction than any other at any given moment. “This means clumps of atoms surrounded by a bath at some temperature, like the atmosphere or the ocean, should tend over time to arrange themselves to resonate better and better with the sources of mechanical, electromagnetic or chemical work in their environments,” England explained.
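The Jarzynski–Crooks ratio the article describes can be illustrated with a minimal toy model. This is my own sketch, not England's actual derivation: a two-level system hands a packet of energy dE to a bath at temperature T, with forward (energy released) and reverse (energy reabsorbed) transition probabilities fixed by detailed balance. The entropy read off from the log of the probability ratio then matches the familiar Clausius value, heat divided by temperature.

```python
# Toy sketch of the Crooks-style ratio (illustrative only, not England's
# formula): for a two-level system exchanging energy dE with a bath at
# temperature T, detailed balance fixes p_forward / p_reverse = exp(dE / kT),
# so the entropy handed to the bath equals k_B * ln(p_forward / p_reverse).
from math import exp, log, isclose

k_B = 1.380649e-23  # Boltzmann constant, J/K

def transition_probs(dE, T):
    """Forward/reverse probabilities for releasing/absorbing dE at bath
    temperature T, normalized to sum to 1 (Glauber-like rates)."""
    w_fwd = exp(dE / (2 * k_B * T))   # release dE to the bath
    w_rev = exp(-dE / (2 * k_B * T))  # reabsorb dE from the bath
    z = w_fwd + w_rev
    return w_fwd / z, w_rev / z

dE = 4e-21   # roughly 1 k_B*T at room temperature, in joules
T = 300.0
p_fwd, p_rev = transition_probs(dE, T)

entropy_from_ratio = k_B * log(p_fwd / p_rev)  # from the probability ratio
entropy_to_bath = dE / T                       # Clausius: heat / temperature
assert isclose(entropy_from_ratio, entropy_to_bath, rel_tol=1e-9)
```

The more energy a process dumps into the bath, the larger the forward-to-reverse ratio grows, which is the precise sense in which a dissipative process becomes "irreversible."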
Zhongyongdaoist Posted January 23, 2014

Thanks for the post and link.

"In the 1960s, the Belgian physicist Ilya Prigogine made progress on predicting the behavior of open systems weakly driven by external energy sources (for which he won the 1977 Nobel Prize in chemistry). But the behavior of systems that are far from equilibrium, which are connected to the outside environment and strongly driven by external sources of energy, could not be predicted."

I read Prigogine thirty-plus years ago and decided that the chances that thermodynamics favored the evolution of life were very strong. The link between information and entropy can also be taken as pointing in this direction, since living systems are information rich. Back then I reached the conclusion that physics would move in the direction of information theory as a means of solving the problems of integrating relativity and quantum mechanics. Time has borne that conclusion out.

For an interesting complementary read, the book Physics in Mind by biophysicist Werner Lowenstein is a very good discussion integrating information theory and quantum computing into a theory of consciousness. It is up to date and rather more convincing than other authors I have read on the subject.
thelerner Posted January 23, 2014

Seems to me entropy increases unless there's an energy source. Looking up, there it is, the sun, wonderful source of energy. Looking down, a hot radioactive core at Earth's center. We've got pretty big energy sources to work with. Energy and time = stuff. Particularly in a gravity well like Earth's, where stuff can't spread ad infinitum once created. Seems like the planet is an ideal cooking pot moving through space.

I was looking at the forms the element bismuth makes when it's heated and cooled. Very intricate, almost machine-like, but it doesn't mean anything. Our vascular system, the leaves on a tree... life on the planet is clearly related to each other, right down to a cellular level.
Taomeow Posted January 23, 2014 (edited)

Chaos is inherently self-organizing. They make a mistake positing an external source required for processes that locally lower entropy by universally increasing it. The very existence of the external source of energy is the outcome of the availability of a larger source of self-organizing -- the sun, e.g., is not a piece of chaos, it's a chunk of order. Life does not utilize chaotic energy, it utilizes pre-organized, structured energy. To structure energy into pockets of order is an inherent ability of chaos, which neither has, nor needs, an external source. Where would THAT come from?..

Our physicists can't break away from some god-creator in the final analysis, they always have to externalize something to build a model of anything. The whole line of pre-trimmed interpretations of the facts they are observing strikes me as expansionism applied to physics, an ideology of a parasitic mode of functioning trying to prove it is the only mode of functioning possible. Feed off something, exhaust it to extinction, move on. They call it "evolving." I call it cognitive blindness.

Life functions by balancing rather than increasing entropy. A cooled-off room-temperature cup of coffee won't heat up above room temperature unless I drink it. Then it will, beautifully. Here, I just did it. I organized a local entropy-lowering event, that's the whole point of my being here. The sun which let the coffee tree produce the coffee beans effectively accomplished the same. The galaxy that produced the sun accomplished the same. The chaos that produced the galaxy produces local self-organizing entropy-lowering pockets of existence by itself, ziran. It is one of its inherent properties, or as a taoist might say, virtues. Life as we know it is just one fraction of the fractal of self-patterning, self-organizing, entropy-balancing whole (tao fa ziran, chaos organizes itself into patterns naturally).
Prigogine, incidentally, got into chaos and fractals later in his own development -- and whoever missed this particular bandwagon is bound to miss everything...

Edited January 23, 2014 by Taomeow
Marblehead Posted January 23, 2014

I recently watched a documentary that spoke to this theory. I'm not ready to buy into it yet, but then I have no arguments against it either.