Why Does Time Move Forward?

Disclaimer: these are not original ideas but are largely based on Sean Carroll’s “From Eternity to Here,” which I highly recommend.

You may have wondered why time seems to move forward and not backwards. Then, of course, you may have wondered what exactly one means by this. In other articles I’ve adopted a more philosophical approach; today I will focus on the physics.

There are several clues that tell us there is a fundamental difference between past and future. For example, we can only remember the past, not the future. We see glasses breaking, but never re-assembling themselves before our very eyes. When we put blue coloring in water and stir, the coloring disperses through the water. No matter how long we keep stirring, the two will never separate again.

This would make one think there is something in the laws of physics that differentiates between past and future. Alas, this is not the case (for physicists out there: it depends on how you define “time reversal,” and here I’m treating CPT symmetry as a synonym for time-reversal symmetry). The laws of physics are time-symmetrical: they make exactly the same predictions in both directions of time.

Let’s see what we mean by this. Imagine I film a pool game, disregarding the players and focusing on the balls themselves. The laws being time-symmetrical means I have no way to tell, upon watching the video, whether it is being played forward or backwards. Both chains of events are compatible with the laws of physics. If I do the same with any interaction between subatomic particles, I will get exactly the same result.
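To make this concrete, here is a minimal sketch (a toy example of my own, not from the article or the book): a single particle under gravity, integrated with the time-reversible velocity Verlet method. Play the film forward, flip the velocity, play it again, and you land back exactly where you started; both runs obey the same law.

```python
# A ball under constant gravity, integrated with velocity Verlet,
# a scheme that is time-reversible (up to floating-point rounding).
g, dt = -9.8, 0.01

def step(x, v):
    """Advance position and velocity by one time step dt."""
    v_half = v + 0.5 * g * dt
    x = x + v_half * dt
    v = v_half + 0.5 * g * dt
    return x, v

x, v = 10.0, 0.0          # start 10 m up, at rest
for _ in range(100):      # run the film forward for one second
    x, v = step(x, v)

v = -v                    # "play it backwards": reverse the velocity
for _ in range(100):      # the very same law of motion, applied again
    x, v = step(x, v)

print(f"x = {x:.6f}, v = {v:.6f}")  # back to x = 10, v = 0
```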

[Image: Pool (Photo credit: davehuehn)]

Some astute readers may be thinking: “you’re wrong! There is a well-established law which is not time-symmetrical: the second law of thermodynamics.” If you don’t know what this is, here’s a brief summary. The second law of thermodynamics states that there is a quantity, called “entropy,” that never decreases in an isolated system. This gives us an arrow of time: the future is the direction in which entropy increases. There! Done.

Not so fast! Unfortunately, the laws of thermodynamics are not fundamental laws. They are actually consequences of the basic laws of nature (quantum mechanics or Newtonian mechanics, whatever tickles your fancy) together with probability theory. They are a prediction of our time-symmetrical laws, not an independent entity.

But how can something time-symmetrical predict something that is not time-symmetrical? The short answer is it can’t. The only way to achieve this result is to cheat.

But first, an aside on entropy, which is the most important concept we will deal with in this article. Entropy is usually explained to laypeople as a measure of how disorderly a system is. I will try to go a little beyond this simplistic view and give a more realistic account that’s actually in line with the math.

[Image: Thermodynamic system with a small entropy (Photo credit: Wikipedia)]

Entropy is (related to) the number of ways you can microscopically arrange a system without changing its macroscopic properties. For example, take a room full of air: if I swap one molecule in the top right with another one in the top left, there will be no appreciable difference in the air’s properties: it will have the same pressure, volume and temperature. In this case, a change in the microstate (the positions and velocities of all the molecules) has absolutely no effect on the macrostate (pressure, volume and temperature).

Each macrostate (set of properties visible with our “naked eye”) is compatible with a certain number of microstates. That is: for each pressure, volume and temperature combination there is a certain set of microscopic configurations that will give us this result. Entropy is defined as the logarithm of the number of microstates compatible with the current macrostate. That is, suppose I want to know the entropy of a gas at a certain temperature, pressure and volume: I count all the possible microscopic configurations that give rise to the observed properties and take the logarithm of that number.
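If you want to see the counting in action, here is a minimal sketch (a toy model of my own, not from the article): a box of N molecules whose macrostate is simply how many molecules sit in the left half. The number of compatible microstates is then the number of ways of choosing which molecules those are.

```python
import math

# Toy model: N gas molecules in a box. The macrostate is "n_left molecules
# in the left half"; a microstate also says *which* molecules they are.
N = 100

def microstate_count(n_left):
    """Microstates compatible with the macrostate: C(N, n_left) choices."""
    return math.comb(N, n_left)

def entropy(n_left):
    """Boltzmann entropy S = ln W, in units where Boltzmann's constant is 1."""
    return math.log(microstate_count(n_left))

print(entropy(0))    # all molecules on one side: W = 1, so S = 0
print(entropy(50))   # evenly spread: the maximum, S ~ 66.8
```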

If we understand this, it is easy to see where the second law, which says that entropy always increases, comes from. In fact, all the second law says is that systems tend to be in their most likely state, basically because it’s, well, more likely. But what does the number of microstates have to do with likelihood? A lot, in fact. Imagine a system (a gas in a container) that can be in one of three macroscopic states. For each macrostate, there is a certain number of microstates compatible with it: 1 for the first one, 10 for the second one and 100,000 for the third one. What are the odds of the system being in the third state? 100,000 / 100,011, or 99.99% (the compatible microstates divided by all the possible microstates). As you can see, the third state is the likeliest and is also the one with the highest entropy. Higher entropy equals higher probability.
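Here is the same arithmetic as a snippet, using the three made-up microstate counts from the paragraph above:

```python
# Probability of a macrostate = (its microstates) / (all microstates),
# assuming every microstate is equally likely.
counts = {"state 1": 1, "state 2": 10, "state 3": 100_000}
total = sum(counts.values())  # 100,011 microstates in all

for name, w in counts.items():
    print(f"{name}: p = {w / total:.4%}")
# state 3: p = 99.9890%, i.e. the highest-entropy macrostate dominates
```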

[Image: Entropy ≥ Memory . Creativity ² (Photo credit: jef safi \ ‘Parker Mojo Flying)]

So we’ve cracked it, haven’t we? Systems tend to evolve to likelier states (duh!) which means higher entropy, which means we will always see an increase from lower to higher entropy, which gives us an arrow of time. Right?

Wrong. Because this argument, which works from past to future, should also work from future to past. If I take the state of the world at this moment, the same logic dictates that its evolution towards the past should also increase its entropy, not decrease it! We should see an increase of entropy in both directions from the present, because the laws of physics are time-symmetrical. So, as I said before, our only way of deriving the time-asymmetrical second law is to cheat: entropy will always increase towards the future, as long as entropy in the past was very low. The question now is finding out why on Earth entropy was so low in the past, when it should have been high. We are right where we started!
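You can watch this statistical behavior in a toy simulation. Below is a sketch of the Ehrenfest urn model (my choice of illustration, not something from the article): start with all the particles on one side, a low-entropy “past,” and let them hop at random. Entropy climbs toward its maximum and then just fluctuates around it, which is the second law as a statistical tendency.

```python
import math
import random

# Ehrenfest urn model: N particles; at each step one particle, chosen
# uniformly at random, hops to the other half of the box. From a low-entropy
# start, S = ln C(N, n_left) rises and then hovers near its maximum.
random.seed(0)  # fixed seed so the run is reproducible
N = 100
n_left = N      # low-entropy initial condition: everything on the left

for t in range(501):
    if t % 100 == 0:
        S = math.log(math.comb(N, n_left))
        print(f"step {t:3d}: n_left = {n_left:3d}, S = {S:5.1f}")
    if random.random() < n_left / N:  # the chosen particle was on the left
        n_left -= 1
    else:                             # ...it was on the right
        n_left += 1
```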

Not quite, though. Because now we have transformed our original question (“why does time move forward?”) into “why was entropy so low in the past?”, which is a lot more specific. If we manage to answer that, we will have finally cracked the mystery that has had the best minds on the planet scratching their heads for centuries.

So, ready for the solution?

Hah! Who said there’s a solution? Unfortunately, this has not been resolved yet. There are several promising lines of inquiry that I may go into some other day, but for the moment nothing is settled. In short: we have some wild guesses but really no idea. Feel free to contribute!

Who said that science claims to know everything?


27 thoughts on “Why Does Time Move Forward?”

  1. geneticfractals

    I always thought that if I could somehow cause an implosion in a smoky room, there is a chance in 10**xxx that the CO2, H2O, SO2, and whatever else I should add, would recombine into a firecracker. This would decrease entropy, but only because I add the right energy to the system. Therefore, time’s arrow and entropy are a different ‘thing’. I guess I am being simplistic 🙂

    Reply
    1. David Yerle Post author

      The thing is: how would you be able to tell whether you’re in a universe where the entropy increases or decreases? Also, I forgot to mention something: the second law, in the modern context, is a statistical law, which means it can be broken. It’s just very, very unlikely. So you could actually see all your atoms recombine into a firecracker without adding any energy!

      Reply
  2. Brad

    Is the macrostate of an isolated system defined in terms of “pressure, volume and temperature”? Or is it rather defined as a “set of properties visible with our “naked eye””?

    No doubt it is both (please correct me if I’m wrong).

    I can imagine a situation in which an isolated system having a fixed pressure, volume, and temperature gives rise to observable (by the naked eye) properties, but such a system would tell us nothing about entropy at global or universal scales, regardless of which way the time arrow goes.

    It does raise an interesting problem regarding the question that Henk at Genetic Fractals has pondered recently – where do the cherry blossoms come from? If they can be made to grow in a closed system like a glasshouse (at a strictly fixed temperature) it would call the definition of a macrostate into question.

    Reply
    1. David Yerle Post author

      I’d say the definition of a macrostate is somewhat fuzzy and depends on the material being studied. In fact, the whole definition of entropy is information-based: it does not illustrate a property of the universe, but of our state of knowledge. In this sense, problems like the one you suggest are completely expected. In fact, the second law is known to have been violated (this has been done in the lab: all you need is a small enough sample and some time).
      In other words, the definition of a macrostate does not have to hold always because it is just a tool to make our life easier in certain situations. In complex systems such as a human being or a quantum computer, where the position of an electron can have an effect on the whole result, the naive thermodynamic approach is doomed to give not-very-helpful results.

      Reply
      1. Brad

        In the spiel about Sean Carroll’s book that you link to, it is said that “[t]he only way to understand the origin of entropy is to understand the origin of the universe.” Yet you have stated that “entropy is information-based: it does not illustrate a property of the universe”. Why, then, do physicists appeal to entropy in their attempts to understand the arrow of time in the universe?

        Reply
        1. David Yerle Post author

          I’d put that quote more precisely: “the only way to understand the current value of entropy is to understand the origin of the universe.” Physicists appeal to entropy to understand the arrow of time because it illustrates macroscopic, statistical properties of systems. That is: there is no difference (laws-of-physics-wise) between past and future. The only difference is in the specific distribution of matter, which is a macroscopic concept. In order to quantify this, we need macroscopic, statistical tools. Entropy is one such tool. Put in a different way: having an arrow of time doesn’t even make sense from a microscopic, fundamental point of view.

          Reply
  3. Tongue Sandwich™

    Reading your scientific articles often makes me feel like a yokel — which is a rare pleasure of sorts. I’m going to peruse the National Enquirer now, just to get my self-esteem back into working order.

    Reply
  4. elkement

    Thanks for the summary – I have not read this book (yet?).
    Is there something really new you learned from the book – as a physicist? What is the hottest news in this field of research – the kind that is probably not taught in the classroom today (if there is any)?

    I am not familiar with the entropy of black holes and stuff – is this something Carroll touches on in his book?

    Reply
    1. David Yerle Post author

      Yes and no. I had never thought about the arrow of time as deeply, but you can actually learn most of what he says in his blog, since he speaks of similar subjects there. If I remember correctly, he doesn’t touch on entropy of black holes (not much at least). However, there is this blog by “The hammock physicist” that has a brilliant explanation which uses low-level quantum mechanics concepts and some basic GR.

      Reply
  5. livelysceptic

    Thank you for your explanation of entropy. It sounded quite different from the ‘entropy for dummies’ that I was familiar with. Maybe after this I should doubt what I know about science even more…but that is my problem. 🙂
    PS: Of course I subscribed to Big Science Magazine!

    Reply
    1. David Yerle Post author

      Yes, people seem to dismiss it by saying “it’s just disorder,” and I find that not to be very accurate. With this explanation, which is the one physicists use, you could derive all the mathematics yourself, if you felt so inclined.

      Reply
  6. Matthew Rave

    David Deutsch has a novel spin on time in The Fabric of Reality. The idea is that different times (slices in a 4D spacetime, if you will) are really just different “universes”, each static and independent (in a way). Now, you remember what happened one second ago, and it seems to you that time is “flowing”, but it’s just an illusion. If you are just a static 4D object, then there is no “motion” along the time direction; rather, at each time level you could “ask” the corresponding “you” and they would think that they were preceded by a series of earlier selves. It’s hard to be clear in such a short note but I’d recommend Deutsch’s book if you’re interested.

    Reply
    1. David Yerle Post author

      I read that book, it’s one of my favorites and it’s what really convinced me that the MWI was right. I like his view of time: even if it’s not “true” and the universe is not organized like that, his insights still apply. You have no way to tell whether you live in a frozen universe-slice or in a relativity-type space-time universe.

      Reply
  7. Steve Armstrong

    Time symmetry applied to entropy seems connected, metaphorically at least, with the Penrose thing about large and small universes that you mentioned before. This might be more troubling for a theory of mind as opposed to physics. Please correct me if I’m wrong.

    Reply
  8. Pingback: 5 Amazing Scientific Discoveries | Science Facts

    1. David Yerle Post author

      Well, we can actually derive the 2nd law from first principles, and entropy has a very clear definition which is related to information. However, because of its statistical nature it does lend itself to being taken a bit less seriously than other laws (though some other physicists may kill me for saying that).

      Reply
  9. Pingback: entropy | Wuunderland
