
napoleon_in_rags

(3,992 posts)
18. Eureka! I love that.
Sat Mar 2, 2013, 07:55 PM
Mar 2013

I mean those moments where you get a breakthrough in insight. As I get older, those are the moments I value more than anything. It's quite something to understand the universe in a deep way. I liked Einstein's terminology, "understanding the mind of the ancient one". There is a sense of gaining intimacy with the universe, as a being or friend, when you understand it better.

On completion: Your post really got me thinking, and I used this response as a sort of sketch board to put my own ideas together. I do that. Open source thinking, I guess. But as a result it's long; don't feel obliged to read it, I sure don't have time for every long post. If you don't, just take a thanks for putting out some interesting things to think about, as I take these ideas into my own life.


Entropy is such a powerful idea because it's so simple, and it applies to everything. At its simplest level I think of it as:

s1 = s2 + e.

Which is to say, state 1 equals state 2 (the higher entropy state) plus something lost, e (often energy). So a cart on top of a hill is s1; it releases harnessable energy e as it rolls down, and once at the bottom it's in s2, the higher entropy state. Or take gasoline combustion:

2 C8H18 + 25 O2 = 18 H2O + 16 CO2 + E

S1 is the octane-oxygen mix in the cylinder, S2 is the water vapor and CO2 that comes out the exhaust pipe, and E drives the car. It's important to note that if you mix the elements of S2 together, you get club soda, not gasoline and oxygen. You would have to pour energy in, through electrolysis and other processes, to recover the octane and oxygen, because that energy is the lost ingredient.

Thermodynamic and information entropy are probabilistic, so the lost element for the former is called "order"
s1 = s2 + order
Or certainty/simplicity for the latter, depending on how you look at it. They're so related that their equations can be swapped. For instance, suppose you have a hot frying pan, and you are going to put it in a tub of cool water. Let p1 be the probability that an arbitrary hot molecule (defined as above the mean temp they will both eventually arrive at) is in the pan, and p2 be the probability that it's in the water. Then the probability vector (p1, p2) moves from (1, 0) to (1/2, 1/2) as the heat disperses from the pan. Plug this into the equation for Shannon information entropy, and you get a smooth transition from entropy 0 to 1. Look here to see what the entropy chart looks like, where the measured quantities are the probabilities of a hot molecule being in the pan vs. the water (note the right half of the chart is the reverse, as if the water were hotter than the pan; the central point is max entropy):
http://en.wikipedia.org/wiki/Entropy_%28information_theory%29#Example
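That pan-and-water transition is easy to sketch numerically. Here's a minimal Python sketch; the intermediate probability steps are made up for illustration, only the endpoints (1, 0) and (1/2, 1/2) come from the example above:

```python
import math

def shannon_entropy(p):
    """Binary Shannon entropy, in bits, of the probability vector (p, 1 - p)."""
    if p in (0.0, 1.0):
        return 0.0  # no uncertainty at all
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# p = probability an arbitrary hot molecule is still in the pan,
# stepping from (1, 0) toward equilibrium at (1/2, 1/2)
for p in [1.0, 0.9, 0.75, 0.6, 0.5]:
    print(f"p(pan) = {p:.2f}  entropy = {shannon_entropy(p):.3f} bits")
```

The printout climbs smoothly from 0.000 bits (all heat in the pan) to 1.000 bits (heat evenly dispersed), which is exactly the left half of the chart at the link.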

But wait! Information entropy is the measure of how much information is in a channel. Well, if each hot molecule were a bit, and its location determined its value (let pan = 0, water = 1), then it does work as a channel. At first, the channel is an endless stream of zeros (all hot molecules are in the pan), so it clearly contains no information, and there is no uncertainty about what the next bit will be. But at the end, it has an equal number of 1's and 0's, so it carries as many bits of information as there are molecules, and there is total uncertainty about what the next bit will be.
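That channel framing can be checked empirically: measure the entropy of the actual bit stream rather than the probability vector. A few lines of Python (the stream lengths here are arbitrary):

```python
import math
from collections import Counter

def stream_entropy(bits):
    """Empirical Shannon entropy, in bits per symbol, of a stream of 0s and 1s."""
    n = len(bits)
    h = 0.0
    for count in Counter(bits).values():
        p = count / n
        h -= p * math.log2(p)
    return h

print(stream_entropy([0] * 16))    # all hot molecules in the pan -> 0.0
print(stream_entropy([0, 1] * 8))  # evenly dispersed -> 1.0
```

Same numbers as plugging the probability vector into the entropy formula directly, which is the point: the physical dispersal and the information measure agree.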

So information entropy and physical entropy are intimately linked. The solution to Maxwell's Demon, which is a PROFOUND result that too many people overlook, makes this fact clear: the demon can't make the observations and computations (involving information entropy) to decide which particles to let through without changing the equation for the physical entropy, which it's supposed to be reversing. In simple terms, computation takes energy, and CPUs get hot, which we knew.

So anyway, the thing about information entropy is that its reciprocal is soooo important in using it. For instance, when we are doing detective work to solve a problem (computation) we are actually moving from max uncertainty (entropy) to certainty, so entropy is decreasing. But in another framing of the equation, entropy is increasing.

Example: Let's view evolution as a computational process, which determines which species is best adapted to a given unchanging environment. For simplicity, we have one gene, which produces two different kinds of birds: the poorly adapted, and the well adapted. Now we know that the poorly adapted are going to die out and the well adapted will thrive. If we set up the equation as the probability that a given bird is well adapted or poorly adapted, we will actually see entropy decrease. But now suppose (again for simplicity) that we have the well adapted birds in the east, and the maladapted birds in the west of the ecosystem. Now we set up the equation as the probability of finding a well adapted bird in the west vs. the east, sort of like we did with the hot molecules. Now the entropy actually increases as the poorly adapted birds die off and the well adapted birds move into their habitats. And there are a million other ways to rephrase the equation which will show an increase or decrease in entropy, depending on whether we have chosen to measure an entropic process or its reciprocal as it progresses. The main thing is, both have the quality of irreversibility:

S1 = S2 + E

In the first formulation, E was genetic diversity which left the system. In the second, E was the maladapted birds which have died, which left the system. They left the state just like energy leaves a chemical equation. It's the same locally irreversible phenomenon in different formulations, one of which appears to make entropy increase and one of which appears to make it decrease. But that's just how you look at it; under it all, the same thing applies.
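The two bird-model bookkeepings can be put side by side in a toy simulation. All the numbers here are invented for illustration; the only claim is the direction each metric moves:

```python
import math

def h(p):
    """Binary Shannon entropy, in bits, of the probability vector (p, 1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Metric 1: probability a random bird is well adapted. It rises from 0.5
# to 1.0 as the maladapted birds die off, so this entropy FALLS.
# Metric 2: probability a well-adapted bird is in the west. It rises from
# 0.0 to 0.5 as they move into the emptied habitat, so this entropy RISES.
for step in range(5):
    p_adapted = 0.5 + 0.125 * step
    p_west = 0.125 * step
    print(f"H(adaptedness) = {h(p_adapted):.3f}   H(location) = {h(p_west):.3f}")
```

One process, two metrics, opposite trends: the same irreversible die-off shows up as entropy decreasing under one formulation and increasing under the other.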

But the thing is, entropy is only locally reversible; it's irreversible on the whole. Once E leaves, the boulder doesn't roll back up the hill unless you put E back in. But you can put E back in, it just has to come from somewhere else. The bird model assumes an unchanging environment. But if that environment is hot, the maladapted birds are actually cold adapted, and an ice age comes, then the entropic process "reverses" for the new environment, and the cold adapted birds take over. If you leave a frying pan out in the sun so it gets hot, then at night rain fills the tub around it, the same process unfolds. But if it clears up the next day, the water evaporates, and the pan heats up again, entropy "reverses" locally. It's just the simple equation above reversed; the E comes from the sun. Similarly, CO2 and H2O were the max entropy state those chemicals were in before life on earth. But if hydrocarbons come from ancient life, then ancient plants took in CO2 and H2O and, plus the E of sunlight, made them (or their precursors) within. So that stored energy you reference very clearly came from somewhere.

So let's stand back and look at the whole scene with a plant getting sunlight:
1) The entropy in the solar system is continuously increasing as energy moves out from the sun to surrounding space
2) The entropy of the cube of earth the plant lives in increases every evening as the heat absorbed from sunlight by certain materials balances out to the surroundings, but decreases every morning as more energy pours in and is absorbed by those materials.
3) The information entropy of the genetic system increases through evolution for the environment, but decreases as the environment changes, unless we reformulate our metric.

So the bottom line is, there are an infinitude of different metrics of entropy, depending on what we are measuring. Entropy always increases in isolation, but a local measure of it can decrease with energy coming in from the outside.

So taking all this back to what you're saying, I'm sure you're right... for many metrics. But by my accounting, there is no single universal metric of entropy; there is an infinitude of them, especially when you include information entropy. So statements about entropy increasing aren't meaningful without tracking which metric you're using. For instance, the idea of life speeding up entropy makes one kind of sense for the ancient plants which turned sunlight into fossil fuel precursors, and another kind of sense for the humans who burn those fossil fuels, but not for both, because those are opposite activities.

But at the same time, I think you've touched on something; you're onto something really, really powerful. Really, really big.

Even more than losing my virginity or my first acid trip. But then, I've always been a little strange, don't ya know...

Lol! Ah, the first acid trip. Did you ever read The Old Man and the Sea by Hemingway? It's the sad story of an old fisherman with a little boat, who goes out to deep waters and snags a giant fish. He fights with it for days to reel it in, but in the end it's too big for his boat, and he has to lash it to the side. Just a tiny mistake makes a cut in the fish, so it starts to bleed. This draws the sharks. In the end, he arrives back at shore, to society, with only bits of the skeleton, and weeps. Only the wise old veteran fisherman sees the skeleton and says, "that must have been a big fish". But does it matter to the rest? No. He didn't bring the fish to shore.

It's a metaphor for genius. The fish is the concept, the boat is the mind. Anybody can snag a big fish (have a genius concept), but only those with an appropriate boat can bring it back to shore, to society. So when somebody with a background in biochemistry, having done all the discipline (big boat), takes acid, you get this:

Francis Crick, Nobel Prize-winning father of modern genetics, was under the influence of LSD when he first deduced the double-helix structure of DNA nearly 50 years ago.

When they don't, you get a hippie saying, "Oh man! I can see life, it's like a spiral, man!"

A large concept swimming below the water here. But whether there is a boat large enough to bring it to shore is a separate question!

Peace!

Thermodynamic footprints [View all] GliderGuider Mar 2013 OP