Maxwell’s Demon Continues to Startle Scientists (nautil.us)
107 points by dnetesn on May 21, 2021 | 87 comments


I don't really have the necessary background in physics so maybe that's it but this article just doesn't seem to get the point across very well.

> Two advances would be crucial to solving Maxwell’s demon.

At this point the article has described Maxwell's demon as "imagine that this nonexistent thing does X and has this outcome", but it's utterly unclear what the question is that needs to be solved. Whether such a system can exist? Whether two rooms with more order would violate some law, and why? No idea. Can't I have two insulated rooms if some magical creature magically teleported certain particles from one room to the other? I saw other comments about the pointlessness of the thought experiment, but that's not my gripe. There must be an explanation of why it really matters and what is to be "solved" here when the word "solve" is used.

I have the same problem with other thought experiments, or rather their dumbed-down descriptions, e.g. Schrödinger's cat. They focus too much on the quirky scenario and superficial outcome ("oh, that would mean ...") but not on what specific questions it raised in the end and what the exact follow-ups were. And most importantly they leave out very crucial details. For instance, I think one important piece of Maxwell's demon (from Wikipedia) is that the demon doesn't do "work".


> however it's utterly unclear what the question is that needs to be solved

What Maxwell’s gate appears to imagine is a perpetual motion machine, which if possible would obviously be very important for our survival as a species.

Remember, once the universe suffers heat death, obviously we’ll be gone too. As we reach nearer and nearer to the heat death of the universe it will become more important to utilize heat energy efficiently.

The original term “demon” was intended as “daemon”, a tireless godlike worker, not a malicious demon. I find it clearer to instead envision a membrane (hence my use of the term “gate” instead of “demon”, which I think is a poor name for the thought experiment because it’s distracting) that allows heat to travel in only one direction, essentially providing free air conditioning by not allowing that heat energy to dissipate into thermal equilibrium.

This may seem like a silly and minor thing (free air conditioning, what’s the big deal, right?). It’s a big deal because, assuming an alien race survives far enough into the future and is orbiting the last star in the universe, that heat will be the only thing letting them power their starship, and the air conditioning will be the only thing keeping them from being fried to a crisp.


The resolution of Maxwell's demon would involve proving that there is no way to implement such a device in an energy-positive manner. People generally suspect this is the case, but nobody can say why it must be the case.


Thanks for that concise explanation.

My immediate thought was "the demon will be doing work to open and close its door, so the overall entropy is likely to be rising, only local order is increased"

I guess the problem is around establishing that that has to be true, for any given demon.


The work to open and close the door can be reduced to an arbitrarily small level, so that doesn't resolve the paradox.

The solution is that the demon has to collect and then destroy the information relating to its decision whether to open or close the door. It is the cost of destroying this information that balances out the entropy decrease in the system.
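That erasure cost (Landauer's bound) is concrete enough to compute. Here is a minimal sketch; the formula E = kT ln 2 per erased bit is standard, though the helper function name is my own:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_cost(temperature_k, bits=1):
    """Minimum heat (joules) dissipated by irreversibly erasing
    `bits` bits at the given temperature, per Landauer's principle."""
    return bits * K_B * temperature_k * math.log(2)

# At room temperature (~300 K), erasing one bit costs about 2.87e-21 J.
print(landauer_cost(300.0))
```

The striking point is how tiny this is per bit, yet it is strictly nonzero, which is exactly what is needed to balance the demon's entropy books.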


Yeah we've just gone past my level of understanding :)

I should probably learn about the information theory aspects here, they seem quite ... derived.


That's okay, speaking as a physicist the connections between entropy and information are not at all obvious at first glance and you have to “get used to them.” For example, Maxwell's demon makes more sense if you phrase the second law as “there is no physical process which can move heat long-term from a colder temperature to a hotter temperature, without some extra input of energy.” So this says that refrigerators can work, but they only work by consuming power. There is no “spontaneous” flow the other way.

Given that statement of the second law, you can very well imagine Maxwell saying, “well if a tiny fluctuation can transfer heat from cold to hot temperature, why can't I just look at these fluctuations and filter out the ones that I don’t want, and then I have a long-term process?” And it is not obvious that every mechanism for filtering is going to require energy input.

Feynman gave a better example: a ratchet. A ratchet allows motion one way but not the other. Connect it to a wall between two gases: now the statistical fluctuations where the cold side happens to be pushing more on the wall can move the wall one way, and then the hot side cannot move it back because of the ratchet. You kind of have to peek inside the ratchet and see these springs and bits of metal interlocking to see how the problem is resolved.

The “entropy of erasure” argument is huge because it gives a very different language for all of this, one where these ratchets and filters are viewed as information processors like computers, and run into fundamental bounds on information processing which require the input of energy. So yes, it is several steps of derivation removed (the above definition of the 2nd law doesn't even mention entropy!) and is more abstract, which is necessary to be more universal. But if you're not trying to be universal, you can see the ratchet with its asymmetrical cog and the pawl and the spring holding them together, and I just tell you that at nonzero temperature the spring & pawl must be randomly jittery and you can start to realize, “oh, ratchets can only work if they are refrigerated to get that energy out of the spring, once the random jitter is about the size of the cog’s notches, fuggedaboutit!”... Entropy of erasure says basically that this is going to be a universal problem.


In Maxwell’s words: “One of the best established facts in thermodynamics is that it is impossible in a system enclosed in an envelope which permits neither change of volume nor passage of heat, and in which both the temperature and the pressure are everywhere the same, to produce any inequality of temperature or pressure without the expenditure of work. This is the second law of thermodynamics, and it is undoubtedly true as long as we can deal with bodies only in mass, and have no power of perceiving the separate molecules of which they are made up. But if we conceive of a being whose faculties are so sharpened that he can follow every molecule in its course, such a being, whose attributes are still as essentially as finite as our own, would be able to do what is at present impossible to us.”

The reason why we can’t move heat from cold to hot is that we don’t know enough. If the demon knows the position of every particle, there is no notion of entropy.


I've always thought of entropy as gradients. The universe somehow started in a non-homogeneous state and, just due to statistics and random walks, will eventually get to a fully homogeneous state.

In the meantime, we can extract useful work from the areas in the universe where there are relatively steep gradients. Energy is what we call these gradients while we can still harvest useful work from them, and entropy is what we call it when near the homogenous state and we can no longer harvest useful work.

Is the information erasure perspective compatible with this gradient perspective, or is something wrong about viewing it as gradients?


Hi! This is a fun question. Certainly, thermal equilibrium involves something called the equipartition theorem, which says that all of the energy is divided equally across all of the degrees of freedom so that each has an average small amount of energy which we call the temperature. (It's hard to define temperature in any other circumstance! But let's not get into ensembles.)

The big caution with the gradient idea is that you can get uncomfortably close, if you are not paying attention, to saying that refrigerators are impossible. An energy flow from degree of freedom A to degree of freedom B manages to “slurp up” some of this thermal energy from a system's degrees of freedom and move them to another system: one gradient undoes a flow across a different gradient. And like, this happens all the time in everyone's kitchen.

Information erasure looks like this: a bit needs to look like a degree of freedom with some energy stored above the baseline average. It could be energy neutral in that you might have two degrees of freedom, {ifZero, ifOne}, so that a NOT gate doesn't take any energy, but just swaps the energy in the two degrees of freedom. But the energy stored in one or the other needs to be roughly one temperature's worth of energy, or so. And the reason why is that “equipartition” specifies an average, but there is also a standard deviation: these are the energy fluctuations in thermal equilibrium. So, as a result of Ludwig Boltzmann we know that this is an exponentially-distributed random variable and therefore the standard deviation is equal to the mean. So if you want your bits to not be thermally excited, they need to hold a few temperatures’ worth of energy.
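The "standard deviation equals the mean" property of exponentially-distributed thermal energies is easy to check numerically. A minimal sketch, with units chosen so that kT = 1 (sample count and seed are arbitrary choices):

```python
import random

# In thermal equilibrium the energy of one degree of freedom is
# exponentially distributed (Boltzmann), so std dev == mean.
random.seed(0)
k_t = 1.0  # energy units where the temperature scale is 1
samples = [random.expovariate(1.0 / k_t) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((e - mean) ** 2 for e in samples) / len(samples)
std = var ** 0.5

print(f"mean ~ {mean:.3f}, std ~ {std:.3f}")  # both close to k_t
```

This is why a bit whose energy barrier is only about one kT is constantly at risk of being flipped by an ordinary fluctuation.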

Information erasure says that the only circuits on these bits which preserve the energy levels inside them, are reversible circuits like NOT. Irreversible dynamics can look like reversible dynamics plus erasure: in the simplest erasure, we drain both ifZero and ifOne energies into the environment and then we refill ifZero from some energy reservoir. And this says basically that there is nothing much better than that for erasing a bit.

So, just like the ratchet had a spring that was kept at low energy in order to facilitate a transfer of energy from cold to hot, and could only work as long as you kept that spring well-refrigerated, Maxwell's demon has a similar “coldness buffer” in terms of these bits that are present in a definite prepared state ahead of time. The information processing step takes it out of that zero state, and if it wants to go back there it has to erase, but that requires dumping some energy out into an environment, and getting energy from some reservoir to compensate. And then Maxwell's demon becomes just a normal refrigerator like the other ones that we know about.


Interesting, thanks for the detailed answer


Another way to put it: the decision to open the door requires some minimum unit of energy. That energy is itself at maximum entropy: there's no way to regain it for useful work. The total effect of this is that you can't beat entropy in the long term.

That's often expressed in terms of deleting the information, which really does amount to the same thing viewed through a different lens. Looked at that way, the demon would be regaining the (organized) energy it would need to make the next decision. That requires energy input.

I find the "deletion" view kinda confusing, but it does work. It's all a matter of where you want to set your zero points.


> Then in the 21st century, with the thought experiment solved, the real experiments began. “The most important development is we can now realize Maxwell’s demon in laboratories,” said Sagawa.

> In 2007 scientists used a light-powered gate to demonstrate the idea of Maxwell’s demon in action; in 2010, another team devised a way to use the energy produced by the demon’s information to coax a bead uphill; and in 2016 scientists applied the idea of Maxwell’s demon to two compartments containing not gas, but light.

About the first experiment: https://www.nature.com/news/2010/101114/full/news.2010.606.h...

> The experiment does not actually violate the second law of thermodynamics, because in the system as a whole, energy must be consumed by the equipment — and the experimenters — to monitor the bead and switch the voltage as needed. But it does show that information can be used as a medium to transfer energy, says Sano. The bead is driven as a mini-rotor, with a information-to-energy conversion efficiency of 28%.

About the second experiment: https://www.quantumlah.org/about/highlight/2016-02-maxwells-...

> The team also worked out a theoretical description of the setup – accounting not only for the thermodynamic effects on the light, but also for the information gained by the demon – and found a good match with the data. It has long been known that the resolution of the apparent violation of the second law by Maxwell's demon (for it is only apparent) lies with the demon itself. The demon not only does work on the system, but also acquires information about the system that it would take work to erase. When all these factors are budgeted for, there's no net gain.

So they didn't make a Maxwell's demon. They just made a very inefficient motor with an overhyped story around it.


It's a common misconception that the Landauer Limit, which says it takes kT*ln(2) energy to erase a bit, "solves" Maxwell's demon. That argument is actually circular, because the Landauer Limit relies on the Second Law in its derivation. I cannot find the source now, so it's possible I'm wrong, but my recollection is that even Bennett, who advanced the argument in 2003, eventually admitted as much around 2006/2007, if memory holds.[1]

I spent years of my PhD reading and thinking about these topics, and made some contributions to these Wikipedia pages years ago.

[1] https://en.wikipedia.org/wiki/Landauer%27s_principle#Challen...

(To be clear, I am not saying that Landauer Limit or the Second Law are wrong - just that it's wrong to use the Landauer Limit to "prove" the Second Law.)


Isn't there no "proof" of the second law within the laws of physics? You must posit a "Past Hypothesis".


Depends on what level. Historically it was an empirical observation that was taken as an axiom of classical thermodynamics, so it couldn't be derived from other laws.

If we move to statistical mechanics we can derive the second law from probabilities. In essence it is possible for any system to break it, it's just very very improbable. Just like if you toss a coin billions of times. It's possible for you to get more, or less, than half of the tosses to be heads. It's just very very improbable to get anything more than very tiny deviations.
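The coin-toss analogy can be made quantitative: the deviation of the heads fraction from 1/2 shrinks like 1/sqrt(n), which is why macroscopic systems never visibly violate the second law. A quick sketch (seed and sample sizes are arbitrary choices):

```python
import random

# Relative deviation from half-heads shrinks as ~1/sqrt(n): large
# systems "obey" the second law because big fluctuations become
# absurdly improbable, not because they are strictly forbidden.
random.seed(1)

def heads_fraction(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

for n in (100, 10_000, 1_000_000):
    print(n, abs(heads_fraction(n) - 0.5))
```

With ~10^23 "tosses" in a mole of gas, the expected relative fluctuation is on the order of 10^-12, far below anything observable.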


But when we apply statistical mechanics to "microscopic" arrangements like a few particles in an idealized rigid box or in a vacuum, there is no arrow of time. The laws of physics themselves are totally symmetric (CPT theorem), and it is not thought that the t-asymmetry of the weak interaction is responsible for the global macroscopic arrow of time. So without some statement of initial/prior conditions, we have no reason to see just the "forward" arrow. (Loschmidt's Paradox https://en.wikipedia.org/wiki/Loschmidt%27s_paradox)


You can prove it in the context of Ergodic theory. Though that's a model so assuming that it correctly approximates the real world is still an assumption.


Are you sure Ergodic theory does the trick? When I hear people like David Albert or Sean Carroll speak, they always make clear you need some kind of initial/prior conditions or else stat mech does not guarantee why we only see the forward arrow of time. If we assume we can reduce all physical process to the behavior of fundamental particles (or fields), and the physical laws that describe the microscopic world are entirely symmetrical (CPT theorem), you get Loschmidt's Paradox https://en.wikipedia.org/wiki/Loschmidt%27s_paradox. They say the Past Hypothesis (low entropy big bang prior condition) is necessary to resolve it.

*And the t-asymmetry of the weak interaction isn't the reason, according to the Wikipedia article on time asymmetry - you still need initial conditions. "Time reversal violation is unrelated to the second law of thermodynamics, because due to the conservation of the CPT symmetry, the effect of time reversal is to rename particles as antiparticles and vice versa. Thus the second law of thermodynamics is thought to originate in the initial conditions in the universe."


Well it doesn't tell you why the original state of the universe had such a low entropy, but it can at least be used to show that entropy should increase from there.


Do you have any recommendations for a book? I studied classical thermodynamics (and physics) long ago, and I consider that I have good probability and statistics foundations, but I never studied statistical mechanics. From my ignorance, I cannot understand how it is possible that (macroscopic) entropy considerations are applicable to a microscopic demon.


The idea is that temperature always gives you an energy distribution p(E) = exp(-E/kT)*const. https://de.m.wikipedia.org/wiki/Boltzmann-Statistik If you want to store information with one particle in a double-well potential (the tiniest information storage), you need the barrier at least so high that you don't just hop to the next bin due to these fluctuations. This means you have to introduce a minimum energy effort, depending on temperature, needed to delete the information later.
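That barrier argument reduces to a Boltzmann factor: the chance that thermal noise kicks the particle over a barrier of height E_b falls off as exp(-E_b/kT). A minimal illustration (the function name and the example barrier heights are my own choices):

```python
import math

def hop_probability(barrier_over_kt):
    """Boltzmann factor: probability per attempt that a thermal
    fluctuation carries the particle over a barrier of the given
    height, measured in units of k_B * T."""
    return math.exp(-barrier_over_kt)

# A ~1 kT barrier leaks constantly; a ~40 kT barrier holds a bit
# stable for practical purposes.
for barrier in (1, 10, 40):
    print(f"E_b = {barrier} kT -> hop prob per attempt ~ {hop_probability(barrier):.2e}")
```

This is why stored bits must sit behind barriers of many kT, and why erasing them later has an unavoidable, temperature-dependent cost.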

PS: Statistical physics is slightly weird; I'm not sure about a good book. Maybe the Berkeley Physics Course, vol. 5, or so.

About the Landauer principle itself, I remember the paper actually being quite accessible.


I'm not aware of a good book, but here's a website that has finally made me understand postmodern thermodynamics: (Though I perhaps have an uncommon path: learning modern thermodynamics, then skipping basic physical statistics, only to jump into advanced physical statistics later?)

http://www.av8n.com/physics/thermo/entropy.html

Some core ideas :

- 19th-century thermodynamics is hopelessly obsolete (at least if you want to understand the "why" and/or do stuff more advanced than "basic" heat engines?), with tons of confused notions like "heat".

- The term "thermodynamics" itself is bad, "energo-informatics" would perhaps be a better one ?

- Entropy is just a redundant term for (the lack of) information about a system (= macrostate = distribution).

- Hence, you need at least some quantum mechanics: Heisenberg's uncertainty principle (and Planck's constant h) to be able to build the theory from the microstate up, because you need to be able to quantize position and momentum to define information.

- "Disorder" is another of those misleading terms.


I think of it as saying that the two are really just equivalent to one another, in a different basis. At a mathematical level, there's a fine line between "proof" and "redefining your terms".


Possessing only an undergrad level of physics, I've never understood why this is a remotely interesting question.

It seems to effectively be equivalent to "Imagine we have a refrigerator with a massless, frictionless motor that also doesn't need to be plugged in. Isn't it crazy that this refrigerator would violate the second law of thermodynamics?"


A motor that doesn't require energy violates the second law by definition; but it was not obvious or provable that a selectively-enforced barrier that lets hot particles through but not cold ones must cost energy.

If that sounds like a cop out, remember that the Demon is a metaphor that's supposed to represent an as-yet-undiscovered technology. Like, instead of a literal massless door that opens and closes, something like, "What if we discovered a clever way of designing a little gap between two chambers and ionizing the particles such that only hot ones go through."


> a selectively-enforced barrier that lets hot particles through but not cold particles didn't obviously provably cost energy.

I think you're leaving something out of your description. We have passive selective barriers that will admit low-energy particles while reflecting high-energy particles. (Cotton, say, will reflect visible light while being transparent to infrared light.)

So what do you get by hanging a sheet between two rooms? Nothing.

I think Maxwell's Demon will only let hot particles pass through in one direction, but that's nothing new either; we have one-way mirrors.


> I think Maxwell's Demon will only let hot particles pass through in one direction, but that's nothing new either; we have one-way mirrors.

We don't, actually. We have semi-transparent mirrors that allow a small fraction of the light through, and they appear "one-way" only when one side is vastly brighter than the other.


If one way mirrors worked like that we wouldn't need heat pumps. We do because they don't work like that.


Imagine the door had some mass and some friction, but the particles were really heavy, so even a few correct openings could create a large energy imbalance.

There’s no apparent relationship to the energy gain per separated particle and the energy used by the door. That makes it hard to prove it’s not generating energy.

That’s what makes it an interesting problem.


Interesting, thanks. I hadn't considered the disconnect between the particles and the door.


This approach assumes thermodynamics is an unquestionably true law.

That could be fine for a student that is learning physics but not for a scientist.

Treating something as unquestionably true is a sign of belief and religion.

Now, for most everyday situations it is practical to assume that a device can't be creating energy from nothing and can't be ordering atoms in two containers without using energy, because we know the second law of thermodynamics exists.

But Maxwell's Demon is an experiment specifically aimed at testing thermodynamics.

We can use the second law of thermodynamics exactly because we expend/expended effort to test it to a sufficiently high degree and we are revisiting as we learn new facts and push the boundary.


I agree but for a different reason. It seems obvious that Maxwell's demon would itself use energy, thereby paying for the "decreased entropy" with this energy.

This is essentially the same as:

> The second vital piece of the puzzle was the principle of erasure. In 1961, the German American physicist Rolf Landauer showed that any logically irreversible computation, such as the erasing of information from a memory, would result in a minimal nonzero amount of work converted into heat dumped into the environment, and a corresponding rise in entropy.


Why is it obvious that Maxwell's demon would use energy? It could be a frictionless spring-loaded door with latches to hold it open or closed and it bounces between states when it's unlatched.


I wish someone would address your point. This leapt to my mind immediately but I don’t understand enough about the physics to go any further with it.


IIRC Landauer assumes the second law, so it's not a good refutation here.


The interesting thing is that this theoretically perfect refrigerator operates in a way that hints at deep connections between intelligence, information, and entropy.


What’s the connection between intelligence and entropy?


"Intelligence" is a little misleading, it's more along the lines of "awareness of one's environment".

Per Liouville's Theorem, information about your surroundings is a conserved quantity. A physical system (persistent along a time-like curve) that has more information about a particular part of its environment (e.g. "the content of this movie is so-and-so") in one light cone of an event than the other must have correspondingly more information about some other part of its environment (e.g. "this heat sink that I am erasing scratch space with has particle velocities within bounds such-and-such") in the other light cone compared to the first one.

If one light cone has much larger quantities of compressible ('cold', kind of) information than the other, such that interesting information about our surroundings always appears on the opposite side of events from it, we call that light cone "the past".

I don't know that we have a better explanation than the anthropic principle for why one of our light cones is like that, though.


> we call that light cone "the past"

Hmmm. I had the impression that the flow of time was implicit in the fact that we talk about "light cones" instead of "light spheres". Is that not the case?


Given well-behaved (ie relatively flat-ish) spacetime, there's a distinction between light cones going this way, and light cones going that way (a particle is a graph[0] edge that is attached to this-ward side of one event and the that-ward side of another), but general relativity, quantum mechanics, and particle physics - assuming you also handle the charge and parity from CPT symmetry - don't distinguish which of 'this' versus 'that' is past or future.

0: Quantum field theory makes this more complicated, since particles are more like amplitude distributions over possible graph edges, except events also aren't vertices but distributions over vertices, and if you can figure out a way to explain this that actually makes sense you might win a Nobel Prize, but it empirically is well approximated by a graph at macro scales.


I would imagine that a theoretical physicist on LSD would sound like these comments. I love it.

But more seriously (and with tons of respect, because these topics are difficult) - why invoke all these complicated concepts? When we discuss statistical mechanics it is kind of understood that the ferromagnet we try to describe is not placed on the innermost stable orbit around a black hole, and similarly when I look in a mirror there is little need to worry about the difference between P and CPT.

I guess one can say that the essence of a physical system is not always in the fundamentals of physics?

Having said that, I do not know what "intelligence" is, so I do not know how to relate it to entropy.


> I would imagine that a theoretical physicist on LSD would sound like these comments. I love it.

I don't need LSD!

> When we discuss statistical mechanics it is kind of understood that the ferromagnet we try to describe is not placed on the innermost stable orbit around a black hole

If your theory of statistical mechanics requires that, then your theory is wrong, or at least badly incomplete. That might not be a problem you need to care about for practical reasons, but it is a problem.

> I guess one can say that the essence of a physical system is not always in the fundamentals of physics?

Depends on what you mean by "essence" and "fundamentals" - you're not going to have a good time trying to analyze a computer [0] running a game-of-life sim in terms of its constituent quarks, obviously - but there's usually something useful to be learned from the underlying physical laws.

0: (whose primary descriptive quality is that it's)


information


That refrigerator would still use energy compressing the refrigerant. That's Work=F*x energy which won't go to zero with a massless frictionless motor.

Or do you mean you also have a generator to recover the energy from the temperature differences it creates? So it's not really doing refrigerating because you're using all the cooling power to run the motor?


It's interesting because it's one of the intersections between physics and information theory. Why it's not possible is not at all obvious.


Entropy can't be reversed, even if you have a massless, frictionless, motor that doesn't need to be plugged in. That is effectively what Maxwell's demon is.


Of course entropy can be reversed, I literally do it in my sleep!

However, reversing entropy requires an amount of energy commensurate with the amount of entropy you are reducing. Maxwell's demon would appear as a violation of this.

A massless, frictionless motor could trivially reverse all the entropy in the universe given enough time, but it is not clear that Maxwell's demon REQUIRES something equivalent.


Maxwell's demon requires erasure which would subsequently increase entropy.

https://www.nature.com/news/the-unavoidable-cost-of-computat....


As someone else was noting, this result is based on Landauer's limit, which is itself ultimately based on the second law of thermodynamics. If Maxwell's demon existed, the second law would be wrong, so Landauer's limit would be wrong, so there could be a way to erase information without increasing entropy.

I'm also not sure why Maxwell's demon would require erasure.


Wikipedia summarizes this well.

https://en.m.wikipedia.org/wiki/Maxwell%27s_demon

I think it's clear that by now the 2nd law always wins. Always.


Because it's a way to investigate the properties of information and entropy


I made a short Twitter thread relating to this article here: https://twitter.com/mikepfrank/status/1395731216922464263?s=...

I recommend the books “Maxwell’s Demon” and the second edition “Maxwell’s Demon 2”, both edited by Leff and Rex, which collect a lot of the classic literature on this topic.

Bottom line is, of course no real machine can break the Second Law, but the Maxwell’s Demon thought experiment helps illuminate the physical nature of information.


>In the thought experiment, Maxwell imagined splitting a room full of gas into two compartments by erecting a wall with a small door. Like all gases, this one is made of individual particles. The average speed of the particles corresponds to the temperature of the gas—faster is hotter. But at any given time, some particles will be moving more slowly than others. What if, suggested Maxwell, a tiny imaginary creature—a demon, as it was later called—sat at the door. Every time it saw a fast-moving particle approaching from the left-hand side, it opened the door and let it into the right-hand compartment. And every time a slow-moving particle approached from the right, the demon let it into the left-hand compartment. After a while, the left-hand compartment would be full of slow, cold particles, and the right-hand compartment would grow hot. This isolated system would seem to grow more orderly, not less, because two distinguishable compartments have more order than two identical compartments. Maxwell had created a system that appeared to defy the rise of entropy, and thus the laws of the universe.

What do I miss?

The system only "defies the rise of entropy/laws of the universe" because we've added an arbitrary agent in the middle (the "daemon"). Shouldn't the needs (entropy and otherwise) of that agent also be accounted for?


If we believe in the conservation of energy, wouldn’t the universe itself likely be a version of Maxwell’s Demon? Could black holes/white holes be the membrane driving such a perpetual motion machine?


Thermodynamics is mostly about statistical effects and reversion to the mean.

Maxwell's demon seems (to me) to turn probabilities into some Deus Ex Machina thing. (Though in the other way, saying that "disorder" is mandatory)

Maybe the thermalization hypothesis will give a more adequate answer to it.


It seems to me that information theory has already provided an adequate answer to Maxwell's demon (as mentioned in the article).

To operate, Maxwell's demon must perform operations on information. Those operations can be quantified, and they necessarily generate more waste heat than the demon saves by separating the particles.


Does that imply we can put a lower bound on the energy required for certain types of operations on certain types of information?


Yes, exactly. There's some theoretical minimum amount of energy per non-reversible operation that a perfectly efficient CPU wouldn't be able to beat.

> or even in developing more advanced computer chips, which may be approaching a fundamental limit dictated by Landauer’s principle.

However, this last part is a bit of a stretch... Current hardware is many many orders of magnitude above this limit.
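To put rough numbers on that gap, one can compare the room-temperature Landauer limit against an order-of-magnitude figure for a modern logic operation. The femtojoule value below is an illustrative assumption, not a measured datum; real switching energies vary widely by process and circuit:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Landauer limit at room temperature (~300 K), per bit erased.
landauer = K_B * 300 * math.log(2)

# Assumed, illustrative energy for one modern logic operation.
modern_op = 1e-15  # J (order of femtojoules per switched bit)

print(f"Landauer limit: {landauer:.2e} J/bit")
print(f"Assumed modern op is roughly {modern_op / landauer:.0e}x above it")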


I've heard this explanation alluded to, but I've never understood the details. What does the last sentence mean?


Basically, any temperature difference between two regions of space can be used to do Work* (essentially, there is a maximum amount of Energy that you can extract from that temperature difference to do anything with). If everything in your closed system is the same temperature, then there is no Energy that you can capture in order to do Work.

Well, if we have this demon that can open this gate at _just_ the right times so that it can separate hotter particles from colder particles, then it is able to _create_ a temperature difference where there wasn't one. So, this demon would allow us to use our closed system to do an infinite amount of Work. It would act as a sort of perpetual motion machine.

- Step 1: Let the demon open the gate at the right times to separate the hotter gasses (more energetic individual particles) from the colder gasses (less energetic individual particles).

- Step 2: Replace the barrier/gate with a heat engine, and let the natural flow of energy from hot -> cold, drive our heat engine and push the box.

- Step 3: Once the temperature has diffused enough that our heat engine can no longer do any productive Work (that is, the temperature has equalized), then we replace the heat engine with the barrier/gate and let the demon go back to work.
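The work extractable in Step 2 is capped by Carnot efficiency. A minimal sketch of that bookkeeping, with made-up compartment temperatures:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on the fraction of heat a heat engine can turn into work."""
    return 1.0 - t_cold_k / t_hot_k

def max_work(q_hot_joules: float, t_hot_k: float, t_cold_k: float) -> float:
    """Most work extractable from heat flowing between the two compartments."""
    return q_hot_joules * carnot_efficiency(t_hot_k, t_cold_k)

# Hypothetical compartments the demon prepared: 320 K vs 280 K.
print(max_work(100.0, 320.0, 280.0))  # at most 12.5 J of work per 100 J of heat
```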

That sets us up to discuss the final sentence. With the advent of information theory, we now know that there is a _minimum amount_ of Work that a hypothetical demon would need just to know when to open and close the gate. That Work requires Energy to perform. As it happens, that Work that the demon must do requires more energy than the demon is able to create by separating the particles.

Which means the closed system of particles + demon will still increase entropy, because the demon _consumes more energy than it creates_. Basically, we were able to do the math about the theoretical limits of a perfect demon, and if you were able to make one it doesn't break the 2nd law of thermodynamics.

The reason this stumped scientists for a long time is we didn't have the ability to calculate the _minimum amount of Energy_ that would be needed by the demon's brain.

* Work here being the physics concept of work. For example, we could move a box around by using a heat engine to turn the temperature difference into adding a physical force to the box.


We can use potential barriers to separate hot and cold molecules, e.g.

water pool <-> water surface (barrier) <-ambient heat-> vapor <-> condenser <-> water <-> hydroelectric powerplant <-> water pool


> What does the last sentence mean?

"Waste" here means entropy. So the "thinking" that the demon does actually increases the entropy overall. In other words, even though the system's entropy seems to be decreasing (since particles are being neatly sorted), the process used in the sorting must create some kind of random garbage that actually will be a net increase in entropy, in spite of the now-sorted particles.


It's not really a Maxwell's Demon, but it kind of acts like one: the Ranque-Hilsch vortex tube [1].

It's a simple mechanical device with no moving parts invented in 1931 with one input port and two output ports.

Force compressed air into the input port, and hot air (up to 200 ℃) comes out one of the outputs and cold air (down to -50 ℃) out the other.

[1] https://en.wikipedia.org/wiki/Vortex_tube


I don't understand the concept of Maxwell's Demon.

Wouldn't this demon require its own energy source to perform this sorting of hot/cold particles? It has to move its arms or whatever it is doing to manipulate the door. It has to use its brain and eyes to determine which particles to sort.

How is this any different than if you had a heat exchanger or other mechanical device sitting there? I must be missing something obvious.


I believe the idea is that the demon could be arbitrarily efficient since it opens and then closes the door, allowing a thought experiment of perfect efficiency. Like saying infinity when you mean an arbitrarily large number.


I'm wondering the same, it seems that the common denominator in any system that we can observe where entropy first stops increasing and begins to decrease is the addition of a mind.


I really enjoyed "The demon in the machine" - anyone know of other good books exploring the topic?


Not a book but another article from Quanta, and my favourite on the subject, not for its technical depth but for its perspective:

https://www.wired.com/2017/02/life-death-spring-disorder/

I felt like this one brings the modern understanding, combining Maxwell, Shannon, and Landauer, into a tangible perspective on everyday reality. Maxwell's demon is not "solvable"; it's an extreme example for the purpose of a thought experiment. However, practical "versions" of it exist everywhere: once you combine the costs per Landauer's contributions and swap an infinite tape for a large tape, you essentially have a cache. When you start to observe biological and complex chemical systems with this perspective, life really can be thought of as little emergent upstream "anti-entropy" eddies in a torrent of entropy, like a dead fish that naturally swims upstream. There is no defying physics here; entropy still increases universally, but these are natural local variations.

It's hard not to believe that "reducing entropy" is the fundamental "meaning of life" from this point of view, a meaning shared among all life: to not dissolve into oblivion, to collect little hacks of physics and chemistry and find better and better ways to evade it.


This is also my perspective: that living things are fundamentally machines which externalize entropy, and the processing of information is crucial to this task. The task of living is that of using information to harvest order out of entropy, much as Maxwell's demon does.

Of course this increases overall entropy, only conserving order locally, so you could also think of life as a “fire” that burns order out of imperfectly disordered systems.


I missed this

> that living things are fundamentally machines which externalize entropy

That's a clearer way to put it: externalising, or displacing. It makes it more palatable for those who get stuck on the laws of thermodynamics.


> It needed to record and store information about individual particles in order to decide when to open and close the door.

This isn't true, all it needs is to store the average speed of particles on either side of the door and it would still converge.


I have never understood Maxwell's demon at all and would love if anyone could explain its significance.


The general idea of Maxwell's demon as a problem is that it conflicts with the second law of thermodynamics, which essentially states that the total entropy of a closed system will increase over time.

Which, essentially, implies that if we have a (perfectly sealed, closed-system) box of gas particles well mixed at temperature T, then we will never observe that box eventually separate into two different temperatures of gas, with T(cold) on the left side of the box and T(hot) on the right side. However, if we already had a box that was separated into two different temperatures of gas, we would expect those gasses to eventually become well mixed over time.

The paradox of Maxwell's Demon was that we had this apparently closed system (gas particles + Demon), that wasn't doing any Work, but apparently had the ability to violate the 2nd law of thermodynamics. The thinking was that this shouldn't be possible, even in principle.

With the caveat that the second law of thermodynamics is a statistical law, so it's really saying "the odds of entropy decreasing by any significant amount are vanishingly small."
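To put a number on "vanishingly small": if each particle independently ends up in either half of the box, the chance that all N land in one chosen half is (1/2)^N. A tiny sketch (the particle counts are illustrative):

```python
import math

def log10_prob_all_left(n_particles: float) -> float:
    """log10 of the probability that all n particles sit in one chosen half."""
    return n_particles * math.log10(0.5)

print(log10_prob_all_left(100))       # about -30: hopeless for even 100 particles
print(log10_prob_all_left(6.022e23))  # a mole of gas: beyond astronomically unlikely
```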


What bothers me is that we derive entropy -- a blind, involuntary physical law -- from observation of inanimate systems.

So, for example, a very big box full of hot hydrogen will eventually cool off. But, there's a tiny chance it may actually coalesce into life bearing stars. Such life may intentionally work towards increasing order, like Maxwell's demon.

So we try to extend entropy to life too and strenuously prove that it'll increase even when aware biological organisms work against it, while still excluding the same organisms from the observations entropy is derived from.


It's easier to understand with a refrigerator.

A refrigerator decreases the entropy inside (ice cubes have low entropy) but increases the entropy outside, so the total amount of entropy in the universe increases.

There are even some refrigerators that run on an alcohol burner instead of electricity, which to me is very weird.

Now imagine a big box filled with hydrogen and a refrigerator (with batteries or something). The entropy inside the refrigerator will decrease, but the total entropy will increase.

Now imagine that there is a star in the big box of hydrogen, and the refrigerator has a solar panel. The entropy inside the refrigerator will decrease, but the total entropy will increase.

Life is just that, a weird solar powered refrigerator.
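The refrigerator bookkeeping can be sketched directly: the interior loses Q/T_inside of entropy while the room gains (Q + W)/T_outside, and for a real fridge the gain always wins. The numbers below are assumed for illustration:

```python
def entropy_change(q_removed_j: float, work_input_j: float,
                   t_inside_k: float, t_outside_k: float) -> float:
    """Net entropy change (J/K) of fridge interior plus surroundings."""
    ds_inside = -q_removed_j / t_inside_k                     # interior loses entropy
    ds_outside = (q_removed_j + work_input_j) / t_outside_k   # room gains more
    return ds_inside + ds_outside

# Assumed numbers: pull 300 J from a 270 K interior using 100 J of work,
# dumping 400 J into a 300 K kitchen. The result is positive (net increase).
print(entropy_change(300.0, 100.0, 270.0, 300.0))
```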


Talking about physical entropy (Boltzmann) or information entropy (Shannon), in either case we must consider the system or channel.

Entropy only has meaning with respect to a system. There is no "universal" application of the second law.

Entropy within a system may very well decrease so long as more entropy is exported from that system.

Living organisms import low entropy such as food. And they export higher entropy.

The balance in a notional system that encompasses both the organism and its energy source, with respect to everything beyond that system, will be a net increase in its entropy.


Are there no natural non-biological processes which decrease entropy locally just as life does? Surely something like a weather system does that, doesn't it?


Crystals might be another example.


One application of Maxwell's demon would be to sort hot atoms in a gas from cold atoms thus allowing you to create both an efficient heater and air conditioner.


That sounds a lot like a vortex tube, a tube with no moving parts that separates a stream of air into hot and cold, which at first sounds kind of impossible!

https://en.wikipedia.org/wiki/Vortex_tube

http://vortextube.scienceontheweb.net/index.html


The essential difference is that you have to put in energy to pump the fluid through the tube. Essentially a vortex tube is simply a very inefficient refrigerator that is fully in agreement with the second law of thermodynamics. With Maxwell's demon it is not immediately obvious that it has to do work in order to sort fast from slow atoms.


Wikipedia mentions that the inefficiency of that is less than a normal AC unit even.

So the vortex tube is actually using up more of the energy in the system than you might originally expect


Here's a simple way to understand the situation. Imagine you have a room of uniform pressure. Air particles are bouncing around at random speeds, but it's chaotic and uniform.

The randomness means that there will be moments where the room's pressure is not EXACTLY uniform.

Now imagine you have a demon that can divide the room at one of those times. The demon does no work on the particles in the room. It simply divides the room. ...But now you have two rooms with a pressure differential. You can use that difference to do work, and doing so will only return the rooms to uniform pressure again.

You get work for free. How can this be? This is the paradox.
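The catch at macroscopic scale is that these fluctuations shrink, relative to the total, as the particle count grows. A Monte Carlo sketch (the trial count is arbitrary):

```python
import random

def relative_imbalance(n_particles: int, trials: int = 2000, seed: int = 0) -> float:
    """Average |left - right| / n over random left/right placements of n particles."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        left = sum(rng.random() < 0.5 for _ in range(n_particles))
        total += abs(2 * left - n_particles) / n_particles
    return total / trials

for n in (10, 100, 1000):
    print(n, relative_imbalance(n))  # shrinks roughly as 1/sqrt(n)
```

For a mole of gas the relative fluctuation is so small that no usable pressure differential ever appears on human timescales.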


I have worked on implementing gambling algorithms for online casinos and have also created my own lottery apps, and this article possesses the __blind optimism__ that drives poor devils into gambling and makes them lose their money for years. It begins like this:

> "You can either play all night on the roulette table, or you can stop if you win $100,”

The lucky thermodynamic demon that can extract, on average, more free energy than the work spent over many iterations will be forbidden from gambling ever again! If he returns to the system, entropy will make sure to balance his earnings in the long run. Losing and winning devils will balance out as well. For each lottery winner there are lots of losers.


The analogy was just to help clarify the situation, not to be taken so literally.

Where it doesn't match gambling is that there is no risk of ruin, only time waited.

In other words, the universe is constantly flipping dice, and when it looks in our favour we stop. But we didn't pay to play.
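The "only time waited" point can be sketched as a symmetric random walk: a fair coin with no stake eventually reaches any positive target with probability 1, though the waiting time varies wildly (its expectation is actually infinite). A toy sketch; the step cap is just a safety guard:

```python
import random

def flips_until_up(target: int, seed: int, cap: int = 10_000_000) -> int:
    """Fair-coin flips until cumulative winnings first reach +target (capped)."""
    rng = random.Random(seed)
    winnings, flips = 0, 0
    while winnings < target and flips < cap:
        winnings += 1 if rng.random() < 0.5 else -1
        flips += 1
    return flips

# Each run reaches +10 eventually, but how long it takes varies run to run.
for seed in range(3):
    print(flips_until_up(10, seed))
```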


Yeah, but imagine a gambler who wins once and then never plays in the same casino again. What if he takes his money to another casino? He could lose or win. Let's say he wins again, and goes around the city playing in different casinos. In the long run his earnings will be 0. So: win in the first casino, then stop.

The gambler dies, and his son gets the money. Now his son is a new gambler who lives in a different city. If he plays in other casinos he could win or lose, but in the long run his earned money will be 0 too.

It means that for a gambler to actually win, he must win only once and then never play again -- nor may any other gambler play with his money, since entropy is "connected" to that money; the money is part of the same system.

Transform that money into energy and give it to the gambler demon, what will happen?

Now how much money/energy does it cost to create one of these demons?



