Monday, July 25, 2022

Philosophical Implications of Entropy

Preliminary readings:

https://thestandupphilosophers.co.uk/the-trump-card-of-modern-nihilism-entropy/

https://youtu.be/Cco0T7cj-B4

https://www.life.illinois.edu/crofts/papers/Life_information_entropy_and_time.html

https://thestandupphilosophers.co.uk/what-is-entropy/

and if you got this far....

https://www.hindawi.com/journals/complexity/2020/8769060/


Musing

The concept of entropy is broad and deep, with both physical and intellectual interpretations. In our physical universe, entropy is a quantity never less than zero. It was first introduced in 1850 by the German physicist Rudolf Clausius and evolved into the second law of thermodynamics, primarily through the rigor of Boltzmann and his statistical-mechanical approach to estimating the most probable state of a system. By linking entropy to probability, Boltzmann describes it not as the mass and energy of a system themselves, but as the mode of their organization and distribution. Since entropy estimates how that mass and energy are distributed in any given state, a change in entropy signals a change in how the mass or energy is arranged. And as the laws of physics tell us, for an isolated system that change is never negative: the change in organization is unidirectional, by virtue of how one defines a change.
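For the formula-inclined, Boltzmann's famous relation (the one on his tombstone) makes the "organization" reading concrete: entropy counts the number of microscopic arrangements W compatible with the macroscopic state you actually observe, and the second law says that, for an isolated system, this count never effectively goes down:

  S = k_B \ln W, \qquad \Delta S \ge 0 \quad \text{(isolated system)}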

Almost 100 years later, Shannon delves into what "information" means in his study of information theory. He treats information in terms of randomness: a source is a system that generates messages or signs, each with its own probability, and the information a message carries depends on how unpredictable it is.
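Concretely, for a source that emits symbol i with probability p_i, Shannon's entropy (the average information per symbol, measured in bits) is:

  H = -\sum_i p_i \log_2 p_i

A fair coin maximizes this at one bit per toss; a two-headed coin carries zero bits, precisely because its output is not random at all.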

These two ideas have profound interpretations in a metaphysical sense. 

From the Boltzmann perspective, entropy is never less than zero (at least in the physical universe we experience). As far as I can tell, evidence that dS > 0 requires an increment of time over which to measure the change. So does this mean that when dS ≈ 0, the state of a system simply is what it is, with no corresponding change in time? The missing bits of DNA after millions of replications would suggest so.
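One way to formalize that worry: a measured entropy change accumulates over an interval,

  \Delta S = \int_{t_0}^{t_1} \frac{dS}{dt} \, dt ,

so a nonzero \Delta S presupposes a nonzero stretch of time. Whether the converse holds, i.e., whether dS ≈ 0 means nothing is "happening" in time, is exactly the question.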

Second: What is "information theory" anyway, and what does randomness have to do with information? Shannon entropy quantifies this in a purely mathematical way, as the sketch below shows. Perhaps this is really just a relative argument, since one doesn't have information in a vacuum; rather, information is put into context with other information (which supports my general theory that McIntosh apples, with their colorful skin, are way more informative than plain old Golden Delicious apples… and yes, I will fight you on this).
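To make the randomness-information link tangible, here is a minimal Python sketch; the apple "skin-color distributions" are numbers I made up purely for the sake of the joke:

import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical skin-color distributions, invented for illustration only:
mcintosh = [0.5, 0.3, 0.2]          # red, green, yellow, unevenly mixed
golden_delicious = [0.98, 0.02]     # almost always the same golden skin

print(shannon_entropy(mcintosh))          # ~1.49 bits: lots of "surprise"
print(shannon_entropy(golden_delicious))  # ~0.14 bits: barely any

The colorful, unpredictable skin really does carry more Shannon information per glance than the near-uniform one. Whether that settles the apple debate is left to the comments.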

Is the universe just another victim of entropy? 

What also needs to be discussed are these loosey-goosey interpretations of entropy and how they connect with philosophy in our modern-day lives. These are broad, generalized statements: "… well, things just fall apart, so … [insert waxing poetry]." But underneath all these sentiments is a worldview that "nothing has meaning" because:

A) of the inevitable collapsing of the universe.

B) my brain and neurons are two different things, so I give up.

C) my pastor told me so.

D) etc.

Of particular note, A) has some (physical) legitimacy, since the universe tends toward thermodynamic equilibrium: entropy goes up, and the energy available to do anything useful goes down (quick cut to the "entropy is justification for nihilism" memes)… because it's physics! Things break down, and we just have to deal with the consequences.


Random topics of query:

1. The YouTube video asserts that "we create order (and progress) at the expense of disorder elsewhere." Do you agree with this assertion?

2. Is entropy really nihilism's trump card?

3. Does thinking of a universe with negative entropy make sense?

4. What is the fascination with humans trying to create order anyway? 

