Thermodynamics
The universe trends toward disorder. Coffee cools, ice melts, rooms get messy. Why can't these processes spontaneously reverse? The answer is entropy — and it connects thermodynamics to information theory.
01 — The Concept
Ask a physicist to define entropy and you'll get different answers depending on context: a thermodynamicist talks about heat flow and energy no longer available for work, a statistical physicist talks about counting microstates, an information theorist talks about missing bits. These are all the same concept, viewed through different lenses. At its core, entropy measures how many ways a system can be arranged while still looking the same at the macroscopic level.
Low entropy: a new deck of cards in factory order, an ice cube at 0°C, all gas molecules in one corner of a room, a clean desk. There are few arrangements that look like this.
High entropy: a shuffled deck, water at room temperature, gas spread uniformly, a messy desk. There are astronomically many arrangements that look like this.
The key insight: there are vastly more disordered states than ordered ones. A shuffled deck has 52! ≈ 8 × 10⁶⁷ possible arrangements. Only one of those is "new deck order." If you randomly shuffle, the odds of hitting that one special state are effectively zero.
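To put a number on "effectively zero", here's a quick Python check of the arithmetic:

```python
# The arithmetic behind "effectively zero": one factory order out of 52!.
import math

arrangements = math.factorial(52)
print(f"{arrangements:.3e}")      # ~8.066e+67 possible orderings
print(f"{1 / arrangements:.1e}")  # ~1.2e-68 chance a random shuffle hits it
```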
02 — The Law
The Second Law of thermodynamics says that the entropy of an isolated system never decreases. It is the odd one out: every other law of physics is time-symmetric. Newton's laws, Maxwell's equations, Schrödinger's equation — they all work equally well forwards or backwards in time. A video of planets orbiting the sun looks equally plausible in reverse.
But a video of an egg unscrambling itself, or a broken glass reassembling, instantly looks wrong. That's the Second Law at work. These processes aren't forbidden — they're just breathtakingly improbable.
03 — Interactive Demo
Start with all gas particles on one side of a box. Remove the barrier. Watch them spread. The reverse — particles spontaneously gathering on one side — is possible but unimaginably unlikely.
The entropy measure here is qualitative: maximum when particles are evenly split, minimum when all are on one side. Real entropy is S = k log W, where W is the number of microstates — which grows exponentially with particle count.
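A minimal sketch of that counting, under the simplifying assumption that the only macrostate variable is how many of N particles sit on the left (so W is a binomial coefficient, and we set k = 1):

```python
# A toy version of the demo's entropy measure: the macrostate is
# "n of N particles on the left", the microstate count is W = C(N, n),
# and S = k log W with k set to 1.
import math

N = 100  # particle count; the effect sharpens exponentially as N grows

def entropy(n_left: int) -> float:
    """Boltzmann entropy of the macrostate with n_left particles on the left."""
    return math.log(math.comb(N, n_left))

print(entropy(0))       # 0.0   -> all on one side: exactly one microstate
print(entropy(N // 2))  # ~66.8 -> even split: overwhelmingly the most microstates
```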
04 — The Deep Question
Why does time flow in one direction? Why do we age, why do we remember the past but not the future, why do processes happen in a specific order?
The answer: the Second Law. Time's arrow is entropy's arrow. We perceive time as moving forward because entropy is increasing. At the microscopic level (individual particle collisions), physics is reversible. But macroscopically, the sheer number of particles makes entropy decrease so improbable that it defines a direction.
Why was entropy ever low? If high-entropy states are overwhelmingly more common, why didn't the universe start in maximum entropy (thermal equilibrium, no structure)?
The answer is one of the deepest unsolved problems in physics: the Big Bang was an extraordinarily low-entropy state. All the matter and energy in the universe was concentrated in a tiny, smooth, hot region. Every increase in entropy since then — stars forming, life evolving, your coffee cooling — is a consequence of that initial condition.
We don't know why the Big Bang had low entropy. That's the cosmological entropy problem.
05 — The Thought Experiment
In 1867, James Clerk Maxwell proposed a paradox. Imagine a tiny demon sitting at a door between two gas chambers. When a fast molecule approaches from the left, the demon opens the door. When a slow molecule approaches, it keeps the door closed. Over time, the left side cools (slow molecules) and the right side heats (fast molecules) — without doing work.
This appears to violate the Second Law. Heat flows from cold to hot, entropy decreases, and all without energy input.
The resolution took a century to pin down: the demon must measure each molecule and store the result, and by Landauer's principle, erasing that stored memory dissipates at least kT ln(2) of energy per bit, enough to repay the entropy the sorting removed. This resolution revealed something profound: information is physical. Forgetting information has a thermodynamic cost. Shannon entropy and Boltzmann entropy are not just analogous — they're two aspects of the same underlying reality.
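Here's a toy sketch of the demon's bookkeeping: random molecule energies, a fast/slow sort, and the Landauer bill for erasing the demon's memory. The model (Gaussian velocities, a median-energy threshold) is an illustrative assumption, not Maxwell's original setup:

```python
# A toy Maxwell's demon, assuming Gaussian molecular velocities and a
# median-energy threshold: fast molecules are sorted right, slow left,
# and every sorting decision costs the demon one recorded bit.
import math
import random
import statistics

random.seed(0)
energies = [random.gauss(0, 1) ** 2 for _ in range(10_000)]  # ~kinetic energies
threshold = statistics.median(energies)

left, right, bits_recorded = [], [], 0
for e in energies:
    bits_recorded += 1  # one fast/slow measurement per molecule
    (right if e > threshold else left).append(e)

print(f"mean energy left  (cold): {statistics.mean(left):.3f}")
print(f"mean energy right (hot):  {statistics.mean(right):.3f}")

# Landauer: erasing the demon's memory dissipates at least kT ln(2) per bit,
# which repays the entropy removed by the sorting.
k, T = 1.380649e-23, 300  # J/K, room temperature
print(f"minimum erasure cost: {bits_recorded * k * T * math.log(2):.2e} J")
```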
06 — The Deep Link
Shannon's information entropy and Boltzmann's thermodynamic entropy have the same mathematical form. This is not a coincidence — they're measuring the same thing: how many yes/no questions you'd need to ask to fully specify the state.
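A quick numerical check of that claim, for the simplest case of W equally likely microstates:

```python
# For W equally likely microstates, Shannon entropy H = -sum(p * log2 p)
# comes out to log2(W): the number of yes/no questions needed to pin down
# the state. Boltzmann's S = k ln W is the same count in different units.
import math

W = 1024
p = [1 / W] * W
H = -sum(pi * math.log2(pi) for pi in p)
print(H)             # 10.0 bits
print(math.log2(W))  # 10.0: ten yes/no questions single out one microstate
```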
Erasing one bit of information in a computer requires dissipating at least kT ln(2) ≈ 3 × 10⁻²¹ joules of energy as heat at room temperature. This is Landauer's limit: the thermodynamic cost of forgetting.
Modern computers dissipate far more than this per bit erased (billions of times more), but as we approach quantum and reversible computing, Landauer's limit becomes relevant.
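The quoted figure is just the constants multiplied out:

```python
# Landauer's limit at room temperature: k * T * ln(2).
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 300           # room temperature, K
print(k * T * math.log(2))  # ~2.87e-21 J per bit erased
```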
Black holes have entropy proportional to their surface area: S = (k A) / (4 l_P²), where l_P is the Planck length. A solar-mass black hole has entropy ~10⁷⁷ k — far more than any other object of similar mass. Black holes are the maximum entropy configurations of matter.
This led to the holographic principle: the information content of a region of space is bounded by its surface area, not its volume. This is still an active area of research connecting thermodynamics, quantum mechanics, and gravity.
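The ~10⁷⁷ k figure for a solar-mass black hole can be checked directly from the area formula, assuming standard SI constants and the Schwarzschild radius for the horizon:

```python
# Bekenstein-Hawking entropy of a solar-mass black hole:
# S/k = A / (4 * l_P^2), with A the horizon area from the Schwarzschild radius.
import math

G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34  # SI units
M_sun = 1.989e30                            # kg

r_s = 2 * G * M_sun / c**2  # Schwarzschild radius, ~2.95 km
A = 4 * math.pi * r_s**2    # horizon area, m^2
l_P2 = hbar * G / c**3      # Planck length squared, m^2

print(A / (4 * l_P2))       # ~1e77, entropy in units of k
```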
07 — The Far Future
If entropy always increases, where does it end? The answer: maximum entropy — a state of perfect thermal equilibrium. No temperature differences, no gradients, no available energy to do work. This is called the heat death of the universe.
Now: stars shine, galaxies form, life exists. The universe is far from equilibrium — low entropy by cosmological standards.
~10¹⁴ years: star formation ends. The last stars burn out. The universe is mostly black holes, neutron stars, and cold gas.
~10⁴⁰ years: protons decay (if they do). Matter dissolves into radiation. Black holes are the only large structures left.
~10¹⁰⁰ years: black holes evaporate via Hawking radiation. The universe is a cold, dilute gas of photons and neutrinos at near-absolute-zero temperature.
Beyond that: thermal equilibrium. Maximum entropy. Nothing happens — because there's no free energy to make anything happen. Heat death.
This is the ultimate consequence of the Second Law. Unless there's new physics we don't know about (vacuum decay, cyclic cosmologies, quantum tunneling to a new universe), the far future is cold, dark, and boring.