Entropy: The Heat Death of The Universe

Meet Arthur.

Arthur is really good at math and he always has been. In fact, he planned on going to college to study math. But recently, life didn’t pan out the way he expected it to, and so he’s not quite where he thought he would be by now. Sounds like most of us.

What is more probable? 

  1. That Arthur is a lifeguard at the local beach.
  2. Or, that Arthur is a math tutor at a local school, and a lifeguard in his free time.

The correct answer here is 1: that Arthur is a lifeguard at the local beach, and only that. Now don’t worry if you got it wrong; the majority of people do, even people with years of experience in statistics.

If you chose 2, the error you made is known as the conjunction fallacy. You see, even without knowing anything else about Arthur beyond what the problem tells us, we can be certain that 2 is less probable than 1.

Why? Because it’s more specific: it requires that Arthur is a lifeguard AND that he is a math tutor, both at once - and the probability of two things happening together can never be greater than the probability of either one alone.

Questions like this one were developed in the 1980s by the psychologists Amos Tversky and Daniel Kahneman to test cognitive biases, among other things. Why is this important for this video, though? Because it offers a general statistical insight that is relevant everywhere else as well, even, as you will see, in the deepest trenches of reality. The insight is simply that more specificity is less likely. The scenario of Arthur being a lifeguard and a math tutor may seem more likely given how I’ve (intentionally) described Arthur to be mathematically inclined. But no matter how math-y he might sound, option 2 is the more specific scenario of the set, meaning more things have to go a certain way for it to occur, and it’s therefore less likely. Got it? For a more drastic example, which is more likely: getting struck by lightning, or getting struck by lightning and getting hit by a car on the same day?
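The rule underneath all of this is simply that P(A and B) can never exceed P(A). Here is a tiny sketch; the 0.3 and 0.6 below are made-up probabilities for Arthur, and independence is assumed purely for simplicity:

```python
# Conjunction rule: a joint event is never more likely than its parts.
# The numbers here are invented purely for illustration.
p_lifeguard = 0.3          # hypothetical P(Arthur is a lifeguard)
p_tutor = 0.6              # hypothetical P(Arthur is a math tutor)

# Assuming independence for simplicity, the joint probability is the product:
p_both = p_lifeguard * p_tutor

# No matter what numbers you pick, the conjunction can't win:
assert p_both <= p_lifeguard
assert p_both <= p_tutor
```

Swap in any probabilities you like: since each factor is at most 1, multiplying can only shrink the result.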

Now, onto the "deepest trenches of reality."

Have you ever wondered why most of the processes that go on in our lives are so seemingly irreversible? Why do they tend to go one way and not the other? Why is it that sugar can be stirred into a cup, but when you move your spoon the other way, it never un-stirs itself?

Why is it that the sock drawer always needs to be organized by someone and never does so of its own accord? Why is it that a cup of cold water doesn’t suddenly start boiling?

These questions might seem unwarranted, even dumb. We have so much intuition for these things that it almost seems unnecessary to ask them in the first place. But they need to be asked, because there is actually nothing in the underlying laws of physics - no law or theory governing the motion of individual particles - that forbids these processes from reversing themselves. The laws that are obeyed when a cup of warm water cools down are obeyed just as well when it heats up. So there must be something else, something tipping the balance - something that wants gases to disperse through a room, something that wants that cup to go cold. That thing is known as entropy, and it may have more to do with probability than with physics or chemistry.

But first, let us try to understand what entropy means. You can learn a lot about a concept just by thinking about how it was named. The term ‘entropy’ was coined in 1865 by Rudolf Clausius, who used it to formulate the second law of thermodynamics. However, it’s much better if we try to understand the term through its classical roots. The Greek root of entropy, tropē, doesn’t refer directly to “disorder” like you might expect. It means “turning” or “transformation” - and the flavor of turning we care about is turning that leaves things looking the same. Entropy is about the number of ways you can change something, or transform something, and still have it look the same.

If you want to think of it in simpler terms, think of a disorganized desk. There are only so many ways you could organize a desk, and you can specify them, since an ‘organized desk’ refers to a particular arrangement of particular things. When it comes to a disorganized desk, though, there is really no rule that needs to be followed - a desk can be disorganized in vastly more ways. Of course, a stack of papers on the left might ‘look’ different from a stack of papers on the right, but in the grand scheme of things, both are simply disordered, especially when the only distinction you are drawing is the one between order and disorder.

Now, these things might seem simple enough, but there are certain missing links we need to address before we jump from an analogy like that to reality. First, let me remind you of what we just talked about at the start of the video - that more specificity is less likely. If we are to use that in the desk analogy, it means that since an organized desk is a comparatively more specific arrangement, it is less likely than a disorganized desk to just occur by random chance. 

I’m sure you have some intuition for this. If you organize your desk at the start of the week, unless you do something to maintain that order, it’s probably not going to stay very ordered as the week goes on. Another missing link is the fact that in reality, things aren’t super distinguishable. Particles at the atomic level are much harder to tell apart than the things on your desk. Even if certain disorganized versions of your desk might look different to you, you can rest assured that won’t be the case when you’re looking at a million particles that all look the same. 

But arguably the biggest link that makes entropy a hard concept to digest is the scale of things. You and I really have no clue, and I mean it when I say it, no clue, as to the real size and number of things. Of course, you’ve heard that there are more stars than grains of sand, and whatnot. But you don’t really grasp how big those numbers are. The term ‘unfathomable’ gets thrown around quite a lot, but rarely is its use as fitting as it is here. Reality is unfathomably complicated. Even if you measure ‘complexity’ just by the number of particles, you and I will fail to internalize the sheer size of things no matter how hard we try… but I’m gonna go ahead and try anyways.

Let’s look at a box with 2 compartments and a few balls in them. Let’s start nice and easy, with just 2 identical balls. We allow the balls to move freely between the 2 compartments. At any given point in time, where could the balls be? Well, both could be in the left compartment - that’s scenario number one. Or one ball could be in the left and the other in the right - and that actually accounts for 2 scenarios: if the balls were labeled, swapping them would give a second, distinct arrangement, even though the two look identical to us. Or both could be in the right compartment - the fourth and final scenario. All four scenarios are equally likely.

Now, what is the probability of finding both balls in the left compartment? 1 out of the 4 possible scenarios meets our requirement, so it is going to be ¼, or 25%.
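This tiny case is small enough to brute-force. A sketch that simply lists every scenario and counts:

```python
from itertools import product

# Every way 2 (labeled) balls can sit in 2 compartments, 'L' or 'R'.
scenarios = list(product("LR", repeat=2))
# -> [('L','L'), ('L','R'), ('R','L'), ('R','R')]: 4 equally likely scenarios

# Count the scenarios where both balls sit in the left compartment.
both_left = [s for s in scenarios if all(side == "L" for side in s)]
probability = len(both_left) / len(scenarios)
print(probability)  # 0.25
```

Changing `repeat=2` to any other number of balls reruns the same count for a bigger box.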

Now, let’s bump it up a bit. Instead of having just 2 identical balls, let’s have 4. Since there are more balls to move around, the number of possible arrangements also goes up - it’s 2 to the power of the number of balls. In this case, that’s 2^4 = 16. Consequently, our luck needs to be that much better for the balls to all spontaneously place themselves in the left compartment. And sure enough, the probability of finding all 4 balls in the left compartment is now down to 1 in 16.

Let’s keep going. What happens with 5 balls? The probability drops to 1 in 32. With 16 balls? 1 in 65,536. Notice how quickly the numbers jump? Let’s leap ahead to 128 balls, a number that is still quite fathomable for us. The probability is now down to a staggering 1 in 340,282,366,920,938,463,463,374,607,431,768,211,456.
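A short loop makes the blow-up easy to see; Python’s arbitrary-precision integers handle the exact 39-digit value for 128 balls without breaking a sweat:

```python
# Probability that all n identical balls spontaneously land in the
# left compartment: 1 chance in 2**n equally likely arrangements.
for n in (2, 4, 5, 16, 128):
    print(f"{n} balls -> 1 in {2**n}")
# The last line prints the exact value of 2**128:
# 128 balls -> 1 in 340282366920938463463374607431768211456
```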

Now let’s take a step back to put things in perspective. We rarely have these well-defined “compartments” in reality. Things move around wherever they want, so the number of ways in which they can arrange themselves, even with just 2 balls, is - I’ll say it again - ‘unfathomably’ large. And that’s still just 2 balls.

Remember earlier in the video I said it would be hard for you to distinguish between particles when you’re looking at a million of them at once? Even a million particles is such a gross underestimation of reality it’s not even a contest. Something as little as 20 ml of water, basically the last sip of water in a bottle, has not a million particles, not 10 million, not even a quadrillion particles. It has hundreds of sextillions of particles - roughly 7 × 10^23 of them. I’ll spare you the math on this one, but let this be the takeaway: things get really complicated, really fast.
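For the curious, the spared math is a one-line trip through Avogadro’s number: 20 ml of water weighs about 20 g, and each mole of water (about 18 g) contains roughly 6.022 × 10^23 molecules.

```python
# Back-of-envelope particle count for ~20 ml of water.
AVOGADRO = 6.022e23        # molecules per mole
MOLAR_MASS_WATER = 18.0    # grams per mole (approximate)

grams = 20.0               # 20 ml of water weighs about 20 g
moles = grams / MOLAR_MASS_WATER
molecules = moles * AVOGADRO
print(f"{molecules:.2e}")  # ~6.69e+23 molecules
```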

To connect the dots with our other takeaway: in reality, an ordered arrangement of particles - a gas that has not dispersed, or sugar that has not been mixed - is also significantly more specific than the countless disorderly arrangements, just like the organized desk. That specificity makes it that much less likely: there are that many more particles to arrange, and that much more luck is needed for an orderly arrangement to exist.

When you put all of that together, it starts making sense why things tend towards disorder. It’s not so much that the laws of physics are biased towards one arrangement or another. Rather, disordered arrangements are so overwhelmingly more likely than ordered arrangements, that if you let nature do its thing, things will always become more and more disordered. It’s remarkable that, at the core of it all, it’s not a law of physics or chemistry, but rather one of probability. 

Interestingly though, this leads to a curious insight: it is indeed possible for a cup of water to spontaneously go from cold to hot. If the trend towards disorder is a matter of probability rather than a prohibition written into the laws of physics, there should be no problem with that happening.
And in reality, there really isn’t.

Why don’t we see it then? Well, the scale of things comes into play again. Going back to the ball-compartment example: even if we reduce the number of balls to just 1,000, and say the balls shuffle into a different arrangement a million times every second, we would have to wait three hundred quattuornonagintillion years (300 × 10^285 years) to see them all in the left compartment. That is many orders of magnitude longer than the age of the universe.
All for something as simple as 1,000 balls being put into a box.
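That waiting time is a straightforward back-of-envelope division: 2^1000 equally likely arrangements, sampled a million times per second.

```python
# Expected wait for 1,000 balls to all land in the left compartment,
# shuffling through a million arrangements per second (rough estimate).
arrangements = 2 ** 1000            # equally likely ways to place 1,000 balls
shuffles_per_second = 1_000_000
seconds = arrangements / shuffles_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.1e}")               # on the order of 10**287 years
```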

It is fair to say it wouldn’t be very wise to wait for that to happen… But what if you did? What if you waited out all of the arrangements of everything in the universe? Well, that brings us to the power of entropy in giving us an insight into the end of existence.

You see, energy that we can use often needs to be in a particular form, in an ordered form. You can think of it in terms of food. Foods are just molecules, and everything has molecules. But you can’t just eat everything. You can only eat things in which those molecules are arranged in very particular ways. In any other, ‘disordered’ arrangement, energy simply isn’t very useful. 

And across the universe, energy is constantly transitioning from an ordered to a disordered form, causing a loss of useful energy. This sort of progression also implies that at the start of the universe, energy was probably very ordered. But what got it there in the first place? Was it just a fortuitous ordering of particles? And what happens if this process keeps going on? Well, the most likely outcome is what is known as the heat death of the universe. As the arrow of time pushes us forward, each day the universe inches closer to maximum entropy: a state of no thermodynamic free energy. It doesn’t necessarily mean that the universe will be hot, or that it will be cold. It just means that once everything has equalized into disorder, there will be no temperature difference left to drive any thermodynamic process. “Nothing” changes, nothing “happens.” Time ceases to have any meaning.

This gradual process is inevitable, but it is nonetheless hard to see. The universe is hardly distinguishable from one day to the next. So to imagine that it might all one day simply cease to exist is a bit hard to wrap your head around. But that is not to say that we are oblivious to the regularity of decay. It’s all around us. Fading memories, aging, death, and the ultimate impermanence of our existence repeatedly remind us of decay. But if it is all so inevitable, what is really the point of it all? Why even bother? 

In his book “Enlightenment Now,” Steven Pinker says: 

“The Second Law of Thermodynamics defines the ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy, and carve out refuges of beneficial order. An underappreciation of the inherent tendency toward disorder, and a failure to appreciate the precious niches of order we carve out, are a major source of human folly.”

Maybe, it really is just that. Small moments of apparent order against an inescapable reality of decay. But eventually, the lights have to go out one last time.

The heat death of the universe is still only the most likely outcome, not a certainty - because to rule anything else out, we would have to cycle through all the different ways in which things can arrange themselves. And while the sheer numbers and timescales involved are too large for a human lifetime, that’s not the case for the universe. The universe has time, and in a sense, is time. Who is to say that in going through all the different combinations, we won’t just stumble upon an orderly arrangement again? Sure, it might take many quattuornonagintillion years for something like that to happen, but it can happen. The possibility is vanishingly small, but it remains non-zero.

And when that does happen, the lights just might turn back on, and for all we know, we might just end up back at square one.

“Discovery is seeing what everybody else has seen and thinking what nobody else has thought.” - Albert Szent-Györgyi

- MA, MM