What is entropy? An exploration of life, time, and immortality.

Rich Mazzola · Published in The Startup · May 9, 2020

Introduction

This is a good time for a disclaimer: you do not need any background in chemistry, physics, or any other science-related field to understand entropy.

Most people don't believe this, because when you Google "what is entropy" you typically get responses like this:

In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations that are consistent with the macroscopic quantities that characterize the system.

Imagine saying that to someone who asked you that question…

It’s true that entropy is related to the second law of thermodynamics and it’s true that entropy describes how everything will move from order to disorderly chaos. But it’s important to know why that happens.

This post sets out to explain that process in an inviting way. After finishing this article, you should have a scientific perspective on philosophical questions like:

— Why does life exist?

— What is time?

— Can I live forever?

There's no time like the quaran-time¹ to explore those questions.

What is Entropy, Part 1: Arrangements

The not-easy-to-understand definition of entropy is:

Entropy is a measure of the number of possible arrangements the atoms in a system can have. The entropy of an object can also be a measure of the amount of energy which is unavailable to do work.

In part 1 we’ll focus on arrangements.

To understand arrangements, we'll use an analogy of cars and driveways. In this example, cars = atoms and driveways = potential states. A state² in thermodynamics is the set of variables (position, velocity vector) that describes the atom (or whatever it is you're describing).

In a simple example you can see that there is 1 car in 1 driveway³. This creates 1 possible arrangement.

[Figure: one car and one driveway, giving a single possible arrangement]

But now, if we add a second car and a second driveway, the number of arrangements grows non-linearly. There are now 3 possible arrangements of the cars in the driveways:

[Figures: the three possible arrangements (#1, #2, #3) of two cars across two driveways]

Again, add a third car and a third driveway and there are now 10 arrangements of the cars and the driveways.

Once you have 10 cars and 10 driveways there are over 92,000 potential arrangements of the cars in the driveways. This is exponential growth at its finest. There is an important trait of all these possible arrangements: not all of them are equally likely to occur.
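
If you want to check these counts yourself, here is a minimal sketch (mine, not the author's) that counts arrangements of indistinguishable cars across driveways using the standard "stars and bars" formula; the function name is just for illustration, but it reproduces the numbers quoted above.

```python
from math import comb

def count_arrangements(cars: int, driveways: int) -> int:
    """Ways to place indistinguishable cars into driveways (any driveway can
    hold any number of cars) -- the 'stars and bars' count."""
    return comb(cars + driveways - 1, cars)

for n in (1, 2, 3, 10):
    print(n, "cars in", n, "driveways ->", count_arrangements(n, n), "arrangements")

# Output:
# 1 cars in 1 driveways -> 1 arrangements
# 2 cars in 2 driveways -> 3 arrangements
# 3 cars in 3 driveways -> 10 arrangements
# 10 cars in 10 driveways -> 92378 arrangements
```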

In the example below, let's say we have 6 cars across 6 driveways split between two different houses (adding the houses helps represent the additional states that the molecules, our 'cars', could take on).

[Figure: six cars, six driveways, and two houses]

If we randomly assign each car to a driveway and ask what the percent chance is that 0, 1, 2, 3, 4, 5, or 6 cars end up in the top 3 driveways, we get a graph like this:

[Figure: histogram of the percent chance that 0 through 6 cars end up in the top property's driveways]

Here, we can see that it's more likely to have 3 cars in the top property than 0, or all 6. Not only is it more likely, but we can actually observe how much more likely it is. In this case there's a 21.6% chance that 3 cars end up in the top property's driveways, with 100 ways to arrange them:

[Figure: the 100 ways to arrange 3 cars in the top property's driveways (21.6% chance)]

However, there is only a 6.1% chance that 0 cars end up in the top property, with 28 ways to arrange it:

[Figure: the 28 ways to arrange 0 cars in the top property's driveways (6.1% chance)]

Why is this important? Because 1 part of entropy is a measure of the number of possible arrangements the atoms in a system can have. By walking through the example above, you just calculated one part of the entropy of a system that included x number of cars and y number of driveways across two different properties (states).
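
To make that concrete, here is a rough sketch (again mine, not from the article) that reproduces those percentages under the same assumption used above: every stars-and-bars arrangement of indistinguishable cars is equally likely. The function and variable names are purely illustrative.

```python
from math import comb

def ways_with_k_in_top(cars: int, k: int, top: int, bottom: int) -> int:
    """Arrangements of indistinguishable cars with exactly k of them in the
    'top' property's driveways (stars-and-bars counting on each property)."""
    return comb(k + top - 1, k) * comb(cars - k + bottom - 1, cars - k)

cars, top, bottom = 6, 3, 3
total = comb(cars + top + bottom - 1, cars)  # 462 equally likely arrangements

for k in (3, 0):
    ways = ways_with_k_in_top(cars, k, top, bottom)
    print(f"{k} cars in the top property: {ways} ways, {ways / total:.1%}")

# Output:
# 3 cars in the top property: 100 ways, 21.6%
# 0 cars in the top property: 28 ways, 6.1%
```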

We've now established that entropy is probabilistic. Entropy is a measure of the number of arrangements things can have; however, the possible arrangements are not all equally likely. They follow a bell-shaped (approximately normal) distribution.

To understand the implications of this, let's take another example, but with two important distinctions:

  1. The units we'll use are much smaller
  2. The number of units is much larger

This is important because entropy describes how microscopic arrangements (of atoms) affect how macroscopic things in our world (cups, people, planets) behave.

Sand Castles⁴

Think of this example in three parts:

  1. Individual pieces of sand
  2. Individual pieces of sand gathered into a sand pile
  3. Individual pieces of sand constructed to create a sand castle

If you go to the beach and you pick up sand with your hands and put it into a pile, it’s a high entropy structure.

The sand pile is high entropy because you could rearrange the individual pieces of sand in many different combinations. In all those combinations you still have a sand pile.

But now if you take your sand pile and construct a castle, you've just decreased the entropy of that structure. The reason is that there are fewer ways you could rearrange that pile of sand that would yield that same sand castle structure (i.e., it has fewer states).

Low entropy = fewer ways to rearrange microscopic things to create a macroscopic structure

High entropy = many ways to rearrange microscopic things to create a macroscopic structure

In other words, there are more ways to be high entropy than low entropy.

Now, if you leave the castle there all day, the ocean winds will eventually blow the sand castle apart, increasing the entropy of the sand.

But why is this happening?

There's nothing fundamental that says the wind couldn't blow the sand apart and build another sand castle. This is technically possible. It's just very, very unlikely, because there are very few ways of organizing the sand to look like a castle. If we look back to our histogram and increase the number of cars (grains of sand) to 50 and the number of states used to distribute them (driveways) to 50, we get the following histogram:

[Figure: histogram of arrangements for 50 grains of sand across 50 states]

This histogram illustrates an important principle: as you add more states to your system, it becomes more likely that the arrangement you observe is the most likely arrangement. For example, with just 50 states, the odds of observing the least likely arrangement are 1 in 132,931,168,175. Now imagine the sand castle has hundreds of millions of grains of sand. The probability that the sand pile would blow away and land as a sand castle, the least likely random arrangement, is almost incalculably low. But not impossible.

It's just overwhelmingly more likely that the low-entropy sand castle turns into a high-entropy sand pile. This is why entropy is always increasing: it's simply more likely that it will.
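
The article doesn't go into formulas, but the standard statistical-mechanics way to turn "number of arrangements" into a numeric entropy is Boltzmann's relation S = k·ln(W). Here is a minimal sketch of that idea; the arrangement counts for the castle and the pile are made up purely to show the direction of the effect.

```python
import math

BOLTZMANN_K = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(num_arrangements: float) -> float:
    """Boltzmann's S = k * ln(W): more possible arrangements means higher entropy."""
    return BOLTZMANN_K * math.log(num_arrangements)

# Hypothetical arrangement counts, purely for illustration:
W_castle = 1e10   # few ways to rearrange the grains and still have a castle
W_pile   = 1e100  # vastly more ways to rearrange them and still have a pile

print(boltzmann_entropy(W_pile) > boltzmann_entropy(W_castle))  # True
```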

Congratulations, you now understand one part of entropy.

What is Entropy, Part 2: Energy Potential

Let's revisit our definition of entropy: Entropy is a measure of the number of possible arrangements the atoms in a system can have (Part 1). The entropy of an object can also be a measure of the amount of energy which is unavailable to do work (Part 2).

The reason that the definition says ‘unavailable to do work’ is because of the nuanced difference between thermal energy and heat. The difference is:

— Thermal energy is not in the process of being transferred; it’s being stored. So under current conditions it is unavailable to do work.

— Heat is thermal energy that’s moving. This is energy that’s available to do work.

Put simply, heat is the transfer of thermal energy.

Entropy is a measure of the energy being stored because it looks at (1) the potential number of arrangements and (2) the potential energy.

However, adding heat to a system will always increase its entropy, because you increase the number of potential arrangements. Take the example below:

System 1: cold, low entropy. When molecules are cold, they move more slowly. This is why ice forms; ice represents molecules that can barely move.

System 2: hot, high entropy. When molecules heat up, their velocity increases, and so does the likelihood that they will bounce into one another.

Thus, heat increases entropy.

[Figure: two systems, one cold and one hot, illustrating how heat increases entropy]

In part 1 we saw that the entropy of a system increases exponentially in relation to the number of possible states. The 'state' of the molecules in the diagram above includes where they are, where they are going, and how quickly they're moving (velocity).

System 1 isn’t that interesting.

Cold system → molecules don't move that much → fewer possible arrangements → lower entropy

System 2 is fascinating.

Hot system → molecules partying → many possible arrangements → higher entropy.

The importance of thermal energy and heat in these systems should now be clear. For example, let's assume that system 1 and system 2 are both filled with a random distribution of hydrogen and oxygen molecules.

The absence of heat means that these molecules are not likely to interact. However, when thermal energy is transferred, we have the conditions for simple molecules to interact with one another and create complex structures. System 2 now has the ability to produce complex structures like water.

Thermal energy → Heat → More possible arrangements & states → Increased Entropy → Rise of complex structures.
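
As a toy numerical illustration of that chain (not the article's actual figures), you can discretize velocity into a set of states and compare the entropy of a narrow "cold" distribution with a wide "hot" one. The distribution shape and parameter values below are assumptions chosen only to show that spreading over more states raises entropy.

```python
import numpy as np

def shannon_entropy(p: np.ndarray) -> float:
    """Gibbs/Shannon entropy, -sum(p * ln p), of a discrete probability distribution."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Discretize velocity into 101 toy "states" between -5 and +5 (arbitrary units).
velocities = np.linspace(-5, 5, 101)

def velocity_distribution(spread: float) -> np.ndarray:
    """Toy thermal distribution: a hotter system spreads probability over more velocity states."""
    weights = np.exp(-velocities**2 / (2 * spread**2))
    return weights / weights.sum()

cold = velocity_distribution(spread=0.3)  # System 1: molecules barely move, few likely states
hot  = velocity_distribution(spread=2.0)  # System 2: molecules "partying", many likely states

print(f"cold system entropy: {shannon_entropy(cold):.2f}")
print(f"hot  system entropy: {shannon_entropy(hot):.2f}")  # noticeably larger
```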

Congratulations, you now understand the basic components of entropy. With this foundational understanding, we will look at the implications for the universe, time, and life as we know it.

Principles for reading on:

  1. Entropy is always increasing because it’s most likely that it will.
  2. Thermal energy transfer increases entropy, which enables the rise of complex structures.

Implication #1: The reason life exists

The universe started out as a singularity that is commonly referred to as 'the big bang'. The initial conditions after the big bang meant that the universe was small, extremely hot⁵, and extremely organized. We know that this means the universe had low entropy.

However, entropy (disorder⁶) always increases over time because it is more likely to do so. This means that as the universe expanded and cooled, entropy increased. A good way to think about this is that as the universe expands, it creates more space for a greater number of arrangements to occur.

From our heat example in system 2, we learned that there is a 'sweet spot' where systems are interesting. You need the right conditions that enable particles to interact with one another and create complex structures (life, for example, is a very complex structure). As entropy and disorder increase, it's more likely that complexity will emerge as interactions between distinct atoms increase.

While entropy will always increase, complexity has an arc. During the big bang there was no complexity⁷, just uniformity. Eventually the sun burns out, the universe expands so far that the distance between molecules becomes unimaginable, the strong nuclear force that holds atoms together degrades, atoms fall apart, and black holes evaporate⁸. There won't be complexity at the end either.

The reason human life exists is because we are at the sweet spot of increasing entropy, which has created emergent complexity. If it weren't for entropy increasing, we'd be in equilibrium and complexity couldn't exist.

You could look at this chart and ask: why does life exist? It could be that the answer is the same as asking why the sand castle turned into a sand pile. Because it's overwhelmingly the most likely outcome.

This is the reason that abiogenesis, the natural process by which life arises from non-living structures, is the leading hypothesis among scientists as to how life emerged. Life requires a certain level of organization to exist. So, on our path to extreme disorganization (increased entropy), there needs to be a boundary that separates nonliving organic material from living organic material. Of course, this boundary is not open-ended. Just as entropy creates the initial balance between organization and disorganization for life to form, the continued disorder also breaks that balance.

[Figure: entropy's arc, showing order moving to disorder, then uniformity, over time]

Implication #2: The arrow of time

“What is time? If nobody asks me, I know; but if I were desirous to explain it to one that should ask me, plainly I know not.”

— Saint Augustine, 397 AD

It’s increasingly evident that entropy governs most of the universe as we know it; so it shouldn’t be a surprise that entropy also governs time itself. Yet the paradoxical nature of time is still confounding, even with the knowledge of how entropy works.

Our universe is governed by 4 dimensions: 3 dimensions of space and 1 dimension of time. Interestingly, one of these things is not like the other. Dimensions of space are symmetrical. You can go left, but also right. You can navigate forward, but also backwards. You can ascend up, but also descend down. Time, however, marches in only one direction: forward. Why?

The answer is hidden in what was discussed in part 1 (arrangements) and part 2 (energy). Entropy always increases, which means there was a lower entropy in the past. This makes entropy and time indistinguishable. Both time and entropy march in one direction. This is why entropy gets the moniker ‘the arrow of time’.

In physics, the arrow of time is defined in the following way.

Start with a body, T1. Let’s imagine T1 represents a steaming hot tea kettle:

[Figure: a single body, T1, the steaming tea kettle]

Now we add a second body, T2: a stream of cold water. Let's imagine we connect T1 to T2 via some manual process of pouring cold water into the steaming tea kettle (this process is referred to as a working body).

That manual process is represented by Q. We then end up with a diagram like this:

[Figure: the two bodies, T1 and T2, connected by the process Q]

When the hot water from the tea kettle (T1) is brought into contact with the cold water (T2), the energy will always flow in one direction, from T1 to T2. Given time, the system will reach equilibrium at a lukewarm temperature.⁹

As energy flows from T1 to T2, entropy increases and the process is irreversible. This irreversibility is what represents the arrow of time. Applied at a macro level, it is why you can remember the past but not the future¹⁰, and why you can't go back in time: because you can't decrease the total entropy of a system.
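
Here is a minimal sketch of that bookkeeping, using the Clausius relation ΔS = Q/T mentioned in footnote ⁹ (the heat amount and temperatures are made-up values): heat leaving the hot body lowers its entropy by Q/T1, the cold body gains Q/T2, and because T1 > T2 the total always goes up.

```python
def total_entropy_change(q_joules: float, t_hot_k: float, t_cold_k: float) -> float:
    """Clausius relation dS = Q/T: heat Q leaves the hot body (-Q/T1) and enters the
    cold body (+Q/T2). The sum is positive whenever T1 > T2, so the flow is irreversible."""
    return -q_joules / t_hot_k + q_joules / t_cold_k

# Hypothetical numbers: 1,000 J flowing from near-boiling water (370 K) to cold water (285 K)
delta_s = total_entropy_change(1000.0, t_hot_k=370.0, t_cold_k=285.0)
print(f"total entropy change: {delta_s:+.2f} J/K")  # positive, so the arrow of time points forward
```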

These observations give rise to an incredible principle:

Every difference between the past and the future can be described by entropy increasing.

Intuitively we all know this. For example, if I asked you which picture came first, A or B, you would know instantly.

[Figure: eggs and entropy; two photos of an egg, one in its shell and one cracked and cooked]

The egg began in a shell, then it was cracked open and cooked, increasing entropy. The process is irreversible and governed by the arrow of time. By looking at these two photos, you can tell time: the egg in the shell (low entropy) came before the cracked egg that's been cooked (high entropy).

The arrow of time can be extremely simple, as in the above example. However, it's still a mystery with its origins unknown. We can use the arrow of time to learn much about our universe. It's why we can look into deep space and see the initial conditions of the universe: we are literally looking 'back in time' to a lower-entropy environment.

[Figure: the cosmic microwave background; observing it enables scientists to understand the initial conditions of the universe. Source: Berkeley Cosmology Group]

However, we don't know why the universe started as an ordered system (low entropy), and thus we don't know why time only flows in one direction (increasing entropy). While these mysteries may never be solved, their attributes give us much to ponder:

"The Arrow of Time dictates that as each moment passes, things change, and once these changes have happened, they are never undone. Permanence is a fundamental part of being human. We all age as the years pass by — people are born, they live, and they die. I suppose it's part of the joy and tragedy of our lives, but out there in the universe, those grand and epic cycles appear eternal and unchanging. But that's an illusion. See, in the life of the universe, just as in our lives, everything is irreversibly changing."

— Brian Cox

Implication #3: The impossibility of immortality

Pop culture talks about science overcoming the limits of life, but scientific progress can't break the laws of physics that govern the universe. Immortality would mean overcoming entropy.

To refute this argument, futurists like Ray Kurzweil claim that "our species will soon be able to defeat disease and degeneration, and live indefinitely."

His thinking follows two paths:

  1. The shift from biological to digital.

As he describes, "we'll be uploading our entire minds to computers by 2045 and our bodies will be replaced by machines within 90 years". I won't speculate on the timing or feasibility of this claim, only on the physics governing it. Assuming it is possible, the increase of entropy over time will eventually degrade even the circuit boards that store our consciousness. While it's possible this could extend life for hundreds of thousands of years, or even millions of years, it can't overcome the entropic nature of the universe. Thus, any claim of immortality has a hidden argument nested within it: reversing nature's entropic tendencies.

2. Stopping Aging.

Can you slow aging? Certainly. Can you stop aging altogether? No. And once again it's because you would need to halt the ever-advancing force of entropy.

At the most basic level, you can see this occurring in individual cells within the human body. The degradation of cells is responsible for the altered immune systems, viruses, and cancers that are ultimately associated with aging. The Society for Cellular Biology puts this best:

“as the cell ages, translational defects and entropy progressively increase the amount of cellular damage.”

Looking at the visual representation of this, it's the same principle that turned the sand castle into a sand pile.

[Figure: young vs. old cells, showing entropy's impact on aging. Source: Journal of Neuroscience, https://www.jneurosci.org/content/31/44/16033/tab-figures-data]

The cell structure on the left is extremely ordered. Then, you can see the disorder introduced into the pattern in the older cells on the right. Those imperfections introduce disease. So while you may be able to treat a specific 'disorder', eventually the ordered system will become disordered.

When discussing lifespans, biology, and future healthcare interventions, these subjects need to be presented accurately and within the physical bounds of what we know about the universe. Presenting them otherwise opens innovative and credible ideas to scrutiny that will impede their progress over the long term.

Conclusion

Entropy tends to carry a connotation of bleakness. Upon learning of the unstoppable force of entropy pushing the universe into disorder, many associate it with existential angst.

That brings up a common question I've heard while researching this topic: what's the point? Why should I care about arrangements, energy potential, and forces on a universal scale?

Scientific benefits aside, the answer is humility. Understanding the direction the universe is marching in, and that this direction is governed by the same processes that govern the cells in our body, is profound. You could interpret the impermanence that arises out of this march with apprehension, or you could interpret it as a part of what it means to be human.

There's great value not only in understanding that order → disorder, but in knowing that among all this disorder, there was a small system where the disorder was so perfect that life existed. A grandiose fact such as that can be uniting.

The knowledge that nothing, not even protons, lasts forever can instill humility. Many people may argue this is a sad outlook, or that the knowledge of impermanence creates angst. But embracing that impermanence will help us cherish what exists.

— — — — — — — — — — — — — — — — — — — — — — — — — — —

Footnotes & Sources

¹ This was written during the COVID-19 pandemic. When else could you take the time to learn about entropy?

² Source: https://physics.stackexchange.com/questions/223564/what-is-a-state-in-physics

³ Full credit for this example and inspiration / open source code from the charts to Aatish Bhatia, https://aatishb.com/

⁴ Brian Cox famously uses the example of a sand castle to display the probabilistic nature of entropy, and how unlikely it would be for a low-entropy structure to stay that way.

⁵ This is confusing, because previously it was stated that heat increases entropy. Yes, that's true, but in the early universe the number of states was greatly reduced because the size of the universe was 1/1,000,000,000 of a pinpoint. So even though there was heat, there was still a very low number of possible arrangements.

⁶ You commonly hear entropy described as the ‘amount of disorder’. Sand castles will always become sand piles. This is the scientific version of saying eventually everything goes to shit.

⁷ An example of emergence as it relates to complexity: carbon has to come out of a dying star → an example of order to disorder. This creates a strange loop of biological life coming from carbon atoms in stars.

⁸ Overview of the second law of thermodynamics

⁹ This means that the change in entropy of this system can be described as Q (the heat transferred by the connecting process) divided by T (the equilibrium temperature of the two bodies combined).

¹⁰ Imagine what it would be like to remember the future and predict the past. This is what life would be like if the arrow of time went in multiple directions.

