In reply to the discussion: Thermodynamic footprints
napoleon_in_rags
I mean those moments where you get a breakthrough in insight. As I get older, those are the moments I value more than anything. It's quite something to understand the universe in a deep way. I liked Einstein's terminology, "understanding the mind of the Ancient One." There is a sense of gaining intimacy with the universe as a being or friend when you understand it better.
On completion: Your post really got me thinking, and I used this response as a sort of sketch board to put my own ideas together. I do that. Open-source thinking, I guess. As a result it's long, so don't feel obliged to read it; I sure don't have time for every long post. If you don't, just take a thanks for putting out some interesting things to think about, as I take these ideas into my own life.
Entropy is such a powerful idea because it's so simple, and it applies to everything. At its simplest level I think of it as:
s1 = s2 + e.
Which is to say, the state of things 1 equals state 2 (the higher-entropy state) plus something lost, e (often energy). So a cart on top of a hill is s1; it releases harnessable energy e as it rolls down, and once at the bottom it's in s2, the higher-entropy state. Or take gasoline combustion:
2 C8H18 + 25 O2 = 18 H2O + 16 CO2 + E
S1 is the octane-oxygen mix in the cylinder, S2 is the water vapor and CO2 that come out the exhaust pipe, and E drives the car. It's important to note that if you mix the components of S2 together, you get club soda, not gasoline and oxygen. You would have to pour energy back in, through electrolysis and other energy-hungry steps, to recover the octane and oxygen, because energy is the lost ingredient.
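Just as a sanity check on that equation, here's a throwaway sketch in Python (my own illustration, and the language I'll use for the little examples below) that counts the atoms on each side. It says nothing about E; it only confirms that no atoms go missing between S1 and S2:

```python
# Quick atom-count check of: 2 C8H18 + 25 O2 = 18 H2O + 16 CO2 (+ E)
# Each entry: (coefficient, {element: atoms per molecule})
reactants = [(2, {"C": 8, "H": 18}),   # octane
             (25, {"O": 2})]           # oxygen
products  = [(18, {"H": 2, "O": 1}),   # water vapor
             (16, {"C": 1, "O": 2})]   # carbon dioxide

def count_atoms(side):
    totals = {}
    for coeff, formula in side:
        for element, n in formula.items():
            totals[element] = totals.get(element, 0) + coeff * n
    return totals

print(count_atoms(reactants))  # {'C': 16, 'H': 36, 'O': 50}
print(count_atoms(products))   # {'H': 36, 'O': 50, 'C': 16} -- same totals, balanced
```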
Thermodynamic entropy and information entropy are both probabilistic, so for the former the lost element is called "order":
s1 = s2 + order
Or certainty/simplicity for the latter, depending on how you look at it. They're so related that their equations can be swapped. For instance, suppose you have a hot frying pan and you are going to put it in a tub of cool water. Let p1 be the probability that an arbitrary hot molecule (defined as above the mean temperature they will both eventually arrive at) is in the pan, and p2 the probability that it's in the water. Then the probability vector (p1, p2) moves from (1, 0) to (1/2, 1/2) as the heat disperses from the pan. Plug this into the equation for Shannon information entropy, and you get a smooth transition from entropy 0 to 1. Look here to see what the chart looks like for entropy, where the quantities measured are the probabilities of a hot molecule being in the pan vs. the water (note the right half of the chart is the reverse, if the water were hotter than the pan; the central point is maximum entropy):
http://en.wikipedia.org/wiki/Entropy_%28information_theory%29#Example
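To make that concrete, here's a minimal sketch (my own illustration, not anything from the linked page) that plugs the probability vector (p1, p2) into Shannon's formula as the heat disperses:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = sum of p * log2(1/p), skipping p = 0 terms."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# p1 = probability an arbitrary hot molecule is still in the pan,
# p2 = 1 - p1 = probability it has moved into the water.
for p1 in (1.0, 0.9, 0.75, 0.6, 0.5):
    p2 = 1.0 - p1
    print(f"(p1, p2) = ({p1:.2f}, {p2:.2f})  ->  H = {shannon_entropy((p1, p2)):.3f} bits")

# H climbs smoothly from 0.000 bits at (1, 0) to its maximum of 1.000 bit
# at (1/2, 1/2) -- the rise toward the peak of the curve on the page above.
```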
But wait! Information entropy is the measure of how much information is in a channel. Well, if each hot molecule were a bit, with its location determining its value (let pan = 0, water = 1), then it does work as a channel. At first, the channel is an endless stream of zeros (all hot molecules are in the pan), so it clearly contains no information, and there is no uncertainty about what the next bit will be. But by the end, it has an equal number of 1s and 0s, so it carries as many bits of information as there are molecules, and total uncertainty about what the next bit will be.
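The channel picture gives the same numbers if you literally write the molecules out as a 0/1 stream and compute the entropy from the observed frequencies; another small sketch along the same lines, with the stream lengths made up for illustration:

```python
import math
from collections import Counter

def stream_entropy(bits):
    """Empirical Shannon entropy, in bits per symbol, of a stream of 0s and 1s."""
    n = len(bits)
    return sum((c / n) * math.log2(n / c) for c in Counter(bits).values())

start  = [0] * 1000              # every hot molecule is still in the pan: all zeros
finish = [0] * 500 + [1] * 500   # half of them have crossed into the water

print(stream_entropy(start))   # 0.0 -- no information, no uncertainty about the next bit
print(stream_entropy(finish))  # 1.0 -- a full bit per molecule, total uncertainty
```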
So information entropy and physical entropy are intimately linked. The resolution of Maxwell's Demon, which is a PROFOUND result that too many people overlook, makes this fact clear: the demon can't make the observations and computations (involving information entropy) needed to decide which particles to let through without changing the equation for the physical entropy it's supposed to be reversing. In simple terms, computation takes energy; CPUs get hot, which we knew.
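The usual way to put a number on "computation takes energy" is Landauer's principle, the minimum energy needed to erase one bit of information at temperature T. This is standard physics rather than anything from the thread, just a back-of-the-envelope sketch:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def landauer_limit(temp_kelvin):
    """Minimum energy (joules) to erase one bit of information at temperature T."""
    return K_B * temp_kelvin * math.log(2)

print(landauer_limit(300))   # ~2.87e-21 J per bit at roughly room temperature
# Tiny per bit, but strictly greater than zero -- which is why the demon can't
# sort molecules "for free": its bookkeeping carries an entropy cost of its own.
```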
So anyway, the thing about information entropy is that its reciprocal is soooo important in using it. For instance, when we are doing detective work to solve a problem (computation), we are actually moving from maximum uncertainty (entropy) to certainty, so entropy is decreasing. But in another framing of the equation, entropy is increasing.
Example: let's view evolution as a computational process which determines which species is best adapted to a given unchanging environment. For simplicity, we have one gene, which produces two different kinds of birds: the poorly adapted and the well adapted. Now we know that the poorly adapted are going to die out and the well adapted will thrive. If we set up the equation as the probability that a given bird is well adapted or poorly adapted, we will actually see entropy decrease. But now suppose (again for simplicity) that we have the well-adapted birds in the east and the maladapted birds in the west of the ecosystem. Now we set up the equation as the probability of finding a well-adapted bird in the west vs. the east, sort of like we did with the hot molecules. Now the entropy actually increases as the poorly adapted birds die off and the well-adapted birds move into their habitats. And there are a million other ways to rephrase the equation which will show an increase or decrease in entropy, depending on whether we have chosen to measure an entropic process or its reciprocal as it progresses. The main thing is, both have the quality of irreversibility:
S1 = S2 + E
In the first formulation, E was genetic diversity which left the system. In the second, E was the maladapted birds which have died and left the system. They left the state just like energy leaves a chemical equation. Same locally irreversible phenomenon, different formulations: one which appears to make entropy increase, one which appears to make it decrease. But that's just how you look at it; underneath it all, the same thing applies.
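Here's a toy sketch of those two framings side by side (the bird counts are invented for illustration): the same die-off shows entropy falling under the "adapted vs. maladapted" framing and rising under the "east vs. west" framing.

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Invented numbers: start with 500 well-adapted birds (all in the east) and
# 500 maladapted birds (all in the west); end with the maladapted gone and
# the thriving well-adapted birds spread evenly across both halves.
start = {"well_east": 500, "well_west": 0,   "mal_west": 500}
end   = {"well_east": 500, "well_west": 500, "mal_west": 0}

def framing_1(pop):
    """P(a random living bird is well adapted) vs. P(it is maladapted)."""
    total = sum(pop.values())
    well = pop["well_east"] + pop["well_west"]
    return [well / total, (total - well) / total]

def framing_2(pop):
    """P(a random well-adapted bird is in the east) vs. P(it is in the west)."""
    well = pop["well_east"] + pop["well_west"]
    return [pop["well_east"] / well, pop["well_west"] / well]

print(H(framing_1(start)), "->", H(framing_1(end)))  # 1.0 -> 0.0  (entropy decreases)
print(H(framing_2(start)), "->", H(framing_2(end)))  # 0.0 -> 1.0  (entropy increases)
```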
But the thing is, entropy is locally irreversible, and irreversible on the whole. Once E leaves, the cart doesn't roll back up the hill unless you put E back in. But you can put E back in; it just has to come from somewhere else. The birds model assumes an unchanging environment. But suppose that environment is hot, the maladapted birds are actually cold-adapted, and an ice age comes: then the entropic process "reverses" for the new environment, and the cold-adapted birds take over. If you leave a frying pan out in the sun so it gets hot, then at night rain fills the tub around it, the same process unfolds. But if it clears up the next day, the water evaporates, and the pan heats up again, entropy "reverses" locally. It's just the simple equation above run backwards, with the E coming from the sun. Similarly, CO2 and H2O were the maximum-entropy state those chemicals were in before life on earth. But if hydrocarbons come from ancient life, then ancient plants took in CO2 and H2O, added the E of sunlight, and made them (or their precursors) within themselves. So that stored energy you reference very clearly came from somewhere.
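For reference, and just as standard textbook chemistry rather than anything from your post: photosynthesis is roughly the combustion equation run backwards, with the E supplied by sunlight and the sugar serving as the stored precursor.

6 CO2 + 6 H2O + E (sunlight) = C6H12O6 + 6 O2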
So let's stand back and look at the whole scene with a plant getting sunlight:
1) The entropy in the solar system is continuously increasing as energy moves out from the sun to surrounding space
2) The entropy of the cube of space on earth the plant lives in increases every evening as the heat absorbed from the sunlight by certain materials balances out into the surrounding materials, but decreases every morning as more energy pours in and is absorbed by those materials.
3) The information entropy of the genetic system increases through evolution for a given environment, but decreases as the environment changes, unless we reformulate our metric.
So the bottom line is, there is an infinitude of different metrics of entropy, depending on what we are measuring. Entropy always increases in isolation, but a local measure of it can decrease with energy coming in from the outside.
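One way to see "increases on the whole, can decrease locally" with a single number is the textbook entropy bookkeeping for heat leaving the sun and landing here. A rough sketch with round-figure temperatures:

```python
# Entropy bookkeeping for a chunk of heat Q flowing from the hot sun to the cooler earth.
# dS = Q / T for heat absorbed or released at (roughly) constant temperature T.
Q = 1000.0        # joules of sunlight, an arbitrary chunk
T_SUN = 5800.0    # K, approximate surface temperature of the sun
T_EARTH = 300.0   # K, rough temperature down here

dS_sun = -Q / T_SUN       # the sun loses entropy along with the heat
dS_earth = Q / T_EARTH    # the earth gains more entropy than the sun lost
print(f"sun: {dS_sun:+.3f} J/K, earth: {dS_earth:+.3f} J/K, "
      f"total: {dS_sun + dS_earth:+.3f} J/K")
# The total is positive: the whole system's entropy rises, even though that
# incoming energy can locally push a plant (or a frying pan) into a lower-entropy state.
```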
So taking all this back to what you're saying, I'm sure you're right... for many metrics. But by my accounting, there is no single universal metric of entropy; there is an infinitude of them, especially when you include information entropy. So statements about entropy increasing aren't meaningful without keeping track of which metric you're using. For instance, the idea of life speeding up entropy makes one kind of sense for the ancient plants which turned sunlight into fossil-fuel precursors, and another kind of sense for the humans who burn those fossil fuels, but not for both, because those are opposite activities.
But at the same time, I think you've touched on something; you're onto something really, really powerful. Really, really big.
Even more than losing my virginity or my first acid trip. But then, I've always been a little strange, don't ya know...
Lol! Ah, the first acid trip. Did you ever read The Old Man and the Sea by Hemingway? It's the sad story of an old fisherman with a little boat who goes out to deep waters and snags a giant fish. He fights with it for days to reel it in, but in the end it's too big for his boat, and he has to lash it to the side. Just a tiny mistake makes a cut in the fish, so it starts to bleed. This draws the sharks. In the end, he arrives back at shore, to society, with only bits of the skeleton, and weeps. Only the wise old veteran fisherman sees the skeleton and says "that must have been a big fish." But does it matter to the rest? No. He didn't bring the fish to shore.
It's a metaphor for genius. The fish is the concept, the boat is the mind. Anybody can snag a big fish (have a genius concept); only those with an appropriate boat can bring it back to shore, to society. So when somebody with a background in biochemistry, having done all the discipline (big boat), takes acid, you get this:
Francis Crick, the Nobel Prize-winning father of modern genetics, was under the influence of LSD when he first deduced the double-helix structure of DNA nearly 50 years ago.
When they don't, you get a hippy saying "Oh man! I can see life, it's like a spiral, man!"
A large concept swimming below the water here. But whether there is a boat large enough to bring it to shore is a separate question!
Peace!