There's no problem reconciling the quantum with the Newtonian. Quantum mechanics recovers Newtonian mechanics in the appropriate limit. The problem is reconciling the quantum and the Einsteinian.
Actually, Newtonian gravity can be added to QM and work perfectly well. It's GR gravity that doesn't work with QM, especially if you try to model very high curvature like you'd get near a black hole.
Quantum Electrodynamics (QED) is the application of Special Relativity (non-accelerating frames of reference, i.e. moving at a constant speed) to Electromagnetism. Thus, the issue is with applying accelerating frames of reference (the General in GR) to QM.
None of this has anything to do with what I said. SR works just as well as classical mechanics with acceleration. If SR didn't work with acceleration, it would never have been accepted as a valid theory at all; it would have been a laughing stock, as acceleration had been well understood since the time of Newton.
What general relativity does differently from special relativity is that it extends the concept of relativity from inertial frames of reference to all frames of reference, even accelerating ones. In the process, it also explains why inertial mass and gravitational mass happen to be the same, by tying the gravitational interaction fundamentally to the concept of acceleration.
> Special Relativity (non-accelerating frames of reference, i.e. moving at a constant speed)
Sorry, but this is a pet peeve of mine: special relativity works perfectly well in accelerating frames of reference, as long as the spacetime remains flat (a Minkowski space[1]), for example when any curvature caused by gravity is small enough that you can ignore it.
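For the curious, the textbook example is constant proper acceleration a in flat spacetime (the "relativistic rocket"). Stated from memory, the standard result for the worldline as seen from an inertial frame, with tau the traveller's proper time, is

    t(tau) = (c/a) * sinh(a*tau/c)
    x(tau) = (c^2/a) * (cosh(a*tau/c) - 1)

all derived entirely within special relativity; no GR required.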
I disagree. Our notion of waves is no less an analogy to macroscopic phenomena than billiard balls. There’s no avoiding the dual nature, and there’s no problem with saying that the wave analogy works in some places, but the particle analogy works in others. The only real truth here is “neither.” A photon is a photon, and there is no macroscopic analogy it reduces to perfectly.
I think neither analogy is correct. We're using macro metaphors (real world things at human time and spatial scales) to explain microscopic phenomena that may not correspond to anything that we find familiar.
I agree with this. As a physicist, I believe the most accurate resolution is to say that «quantum fields» and «quantum particles» describe neither waves (in the sense of e.g. water or acoustic waves) nor particles (in the sense of marbles and billiard balls), but a third thing that simply has some things in common with both classical waves and classical particles. The analogies are useful for understanding that third thing, but if you believe the analogies too literally, then you’ll make mistakes.
3Blue1Brown has a very good explanation of how light works as a wave
And the barber pole effect shows how matter (sugar) rotates light
https://www.youtube.com/watch?v=QCX62YJCmGk
There is also evidence that "photons" are just thresholds in the material that is used to detect light. The atoms vibrate with the EM-wave and at a certain threshold they switch to a higher vibration state that can release an electron.
If the starting state is random, the release of an electron will often coincide with the light that is transmitted from just one atom.
This threshold means that one "photon" can cause zero or multiple detections. This was tested by Eric Reiter in many experiments, and he saw that this variation indeed happens, especially when the experiment is tuned to reveal it, for example by using high-frequency light. It happens also in experiments done by others, but they disregarded the zero or multiple detections as noise. I think the double-detection effect was discovered when he worked in the laboratory with ultraviolet light.
One has to wonder how far emergence can stretch given enough time. Some kind of entropic limit probably exists, but I'm just a layman; hopefully someone more knowledgeable can share whether we already know a physical hard limit for emergence.
After a brilliant start (atoms, etc.) it starts to become problematic once one hits societies. After all, the earlier progressions are undeniably astounding, stable successes in their various incarnations. A pessimist might say 'stable' societies so far have tended eventually towards being self-destructive tyrannies.
They are increasingly unstable, hence why I wondered about some entropic limit the higher up you go on the entropy ladder.
Atoms are quite stable, even though they also suffer from quantum decay; then molecules can be stable but are less stable than atoms; up the ladder to biochemistry it starts to become more unstable the more complex it gets; so on and so forth.
Stable societies might be something that humans haven't achieved yet but somewhere in the Universe some other lifeform might, each rung of the ladder will filter out the most unstable versions of it, coalescing into the emergence of the more stable versions of it. Advanced technology is very unstable for us, requiring constant maintenance by intelligent humans.
If we take a simple definition of technology - such as “tool” or some external inanimate thing we use as an extension of ourselves - then I think all animals on Earth that we have deemed intelligent to some degree use “technology”. Crows using sticks to pick things out of holes, chimps crafting spears for hunting, dolphins wearing “hats”, octopuses building stone fortresses, etc. So I guess it’s important to define the limit of the definition of technology.
What a timely article and comment. I've been watching a lecture series over the last few days about quantum mechanics and the many worlds interpretation. And I have questions.
I may have missed it, or didn't understand it when I heard it explained. What underpins the notion that when a particle transitions from a superposed to a defined state, the other basis states continue to exist? If they have to continue to exist, then okay, many worlds; but why do we think (or know?) they must continue to exist?
Another interpretation of the double-slit posits a guiding 'Pilot Wave' separate from physical particles... aka de Broglie-Bohm theory or Bohmian mechanics.
Apparently it's not popular among professional physicists, though John Bell investigated it a bit. Einstein had some unpublished notes in the 1920s about a "Gespensterfeld" (ghost field) that guided particles.
Born was influenced by this 'ghost field' idea when he published his famous interpretation of the wave function, reading |Ψ|^2 as a probability density rather than as a physical field.
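For anyone unfamiliar with the notation: the Born rule reads |Ψ|^2 as a probability density, so for a normalized wave function

    P(a < x < b) = integral from a to b of |Ψ(x)|^2 dx,  with the integral over all x equal to 1

i.e. Ψ is treated as bookkeeping for where the particle is likely to be found, not as a physical field filling space.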
The way I've always thought of this is that there are interactions, and potentials for interactions.
Interactions act like point particles; potentials for interactions act like waves.
Arguing over the distinction is a bit like debating whether people are the things they do, or the thing that does things. There is some philosophical discussion to be had, but for the most part it doesn't really matter.
It still interferes with itself, and that interference affects the pattern of detections. It's as if the photon were a wave right up until the moment of detection, at which point it's forced to “particalize” and pick a spot to be located at; but it's the amplitude of the wave it was just before detection that determines where on the detection screen the photon is likely to show up. If you send many photons through one at a time, the detections (each just a point on the screen) will fill out the expected double-slit pattern.
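You can watch this happen numerically in a few lines of Python (a minimal sketch; the wavelength, slit separation, and screen distance are made-up illustrative values, and I've ignored the single-slit envelope): each detection is sampled from the squared wave amplitude, and the fringes fill in one dot at a time.

    # Minimal sketch: single detections sampled from the double-slit
    # intensity pattern. All parameter values are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    wavelength, slit_sep, screen_dist = 650e-9, 100e-6, 1.0  # metres

    # Far-field fringe intensity, ignoring the single-slit envelope:
    # I(x) ~ cos^2(pi * d * x / (lambda * L))
    x = np.linspace(-0.02, 0.02, 4001)  # positions on the screen (m)
    intensity = np.cos(np.pi * slit_sep * x / (wavelength * screen_dist)) ** 2
    p = intensity / intensity.sum()     # normalize into a sampling distribution

    # Each "photon" is one point on the screen, drawn from |amplitude|^2.
    hits = rng.choice(x, size=5000, p=p)

    # Histogram the dots: the familiar fringes emerge from single detections.
    counts, edges = np.histogram(hits, bins=80)
    for c, e in zip(counts, edges[:-1]):
        print(f"{e*1e3:+7.2f} mm {'#' * (c // 5)}")

With 5000 samples you get clean fringes about 6.5 mm apart (lambda * L / d); with only a handful of samples it just looks like random dots, which is exactly the point.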
I've always wondered what degree of confidence exists amongst the cognoscenti that a single-photon event happened. I tend to think the criteria of measurement here would suggest the most likely outcome was a shitload more than 1 photon, and that all the "but we measured, we can see only one" measurements are themselves hedged by a bunch of belief.
That said, I do like the single photon experiment, when it's more than a thought experiment.
The double-slit experiment has been done with electrons, which are, afaik, much easier to detect and send single-file. It's been done with molecules. It's not a thought experiment.
Quantum superposition is real. There's no doubt about that.
Not a physicist, just here to observe that single photons weren't reliably emitted until the modern era, like the 1970s. The double-slit experiment pre-dates this; it's from 1801. The one which confirms "self-interaction" was 1974. I was in high school 1973-78, so the stuff we did was comparatively "new" physics in that sense. Not a message I remember receiving at the time.
From the pop-sci history reading I do, "detecting" reliable generation of single-photon STREAMS in the early days depended on using mechanisms which would inherently release a sequence of photons on a time base, and then gating the time sufficiently accurately to have high confidence you know the time base and can discriminate an "individual" from the herd.
I don't doubt quantum theory. I only observe that it's mostly, for young students (like almost all received wisdom), grounded in experiments which don't actually do what people think they do. The ones you run in the school lab are illustrative, not probative.
What people do in places like the BIPM in Paris, or CERN, isn't the same as that experiment you did with a ticker tape and a weighted trolley car down a ramp. "It's been confirmed" is the unfortunate reality of received wisdom, and inherently depends on trust in science. I do trust science.
Now that we have quantum dots, and processes which will depend on reliably emitting single photons and single electrons, the trust has moved beyond "because they did it at CERN" into "because it's implemented in the chipset attached to the system I am using". QC will need massive amounts of reliably generated single-instance signals.
> just here to observe that single photons weren't reliably emitted until the modern era.
A dim light bulb from a few feet away emits on the order of 1k photons/sec, which is low enough that you can count individual emissions using fairly simple analog equipment [0] [1].
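For scale, the arithmetic is simple: a visible photon carries E = h*c/lambda, so at 550 nm

    E ≈ (6.63e-34 * 3e8) / 550e-9 ≈ 3.6e-19 J
    1000 photons/s * 3.6e-19 J ≈ 3.6e-16 W

i.e. a count rate of ~1k photons/sec corresponds to well under a femtowatt of collected light, which is why geometry and attenuation matter so much.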
> The double-slit experiment pre-dates this; it's from 1801. The one which confirms "self-interaction" was 1974.
There's an experiment from 1909 that demonstrated the double-slit experiment with single(ish) photons [2].
> I only observe that it's mostly, for young students (like almost all received wisdom), grounded in experiments which don't actually do what people think they do. The ones you run in the school lab are illustrative, not probative.
> What people do in places like the BIPM in Paris, or CERN, isn't the same as that experiment you did with a ticker tape and a weighted trolley car down a ramp. "It's been confirmed" is the unfortunate reality of received wisdom, and inherently depends on trust in science. I do trust science.
The double-slit experiment is actually fairly easy and cheap to run [3]. Certainly more complicated than ticker tape, but not by much.
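If you do try it, the one design equation you need is the standard small-angle fringe spacing on the screen,

    Δy ≈ λ * L / d

so e.g. a 650 nm laser pointer, slits 0.1 mm apart, and a screen 1 m away give fringes about 6.5 mm apart, easily visible by eye.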
It's a wave of probability that interferes through the slits and then collapses into a probability of one somewhere along the wavefront at the point of detection. Whatever that means :-)
As the other comments have already mentioned, it interferes with itself, so you still observe the same interference patterns [0] [1]. Which admittedly seems impossible at first, but so does the rest of quantum physics.
Depends on the definition of miracle, I guess. There's all sorts of unintuitive shit going on in the quantum world, but we can make it happen so reliably that it's hardly a miracle anymore. Wikipedia defines a miracle as "an event that is inexplicable by natural or scientific laws and accordingly gets attributed to some supernatural or praeternatural cause". But we understand the "how" of quantum mechanics quite well, even if the behavior described by the equations is not very intuitive to humans.
> To quantify this influence, the team applied their model to Terbium Gallium Garnet (TGG), a crystal widely used to measure the Faraday effect. They found that the magnetic field of light accounts for about 17% of the observed rotation at visible wavelengths and up to 70% in the infrared range.
Nearly 20% already seems significant, but 70%?! That's massive.
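For context, the measured quantity here is the Faraday rotation angle, which for a field B applied along a path of length L through the crystal is

    θ = V * B * L

where V is the material's Verdet constant. As I read it, the paper's claim is about how much of that measured θ comes from the light's magnetic field rather than its electric field; the formula is standard, the percentages are theirs.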
This isn't exactly new. It is an obvious and predicted effect of ECE theory. I'm surprised that neither the article nor any other commenter has mentioned it yet.
tl;dr on ECE Theory: Gravity is a curvature of spacetime, electromagnetism is a torsion.
Einstein–Cartan–Evans theory or ECE theory was an attempted unified theory of physics proposed by the Welsh chemist and physicist Myron Wyn Evans ..., which claimed to unify general relativity, quantum mechanics and electromagnetism. The hypothesis was largely published ... between 2003 and 2005. Several of Evans's central claims were later shown to be mathematically incorrect and, in 2008, the new editor of Foundations of Physics, Nobel laureate Gerard 't Hooft, published an editorial note effectively retracting the journal's support for the hypothesis.
I’m sorry if I offended anyone’s consciousness by bringing this up, but this can affect how we view the health effects of radio frequency electromagnetic fields.
Since the magnetic fields of these EMFs can pass deeper into the body than the electric field, that would mean that the magnetic field can affect many of the voltage-gated ion channels in the body, including those in the brain.
> Since the magnetic fields of these EMFs can pass deeper into the body than the electric field,
Can they actually though?
I'm not an expert, but to my knowledge the penetration depth of an electromagnetic wave depends only on its frequency. I can't find anywhere online that makes a distinction between electric waves and magnetic waves.
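For reference, the standard skin-depth formula for a conductive medium is

    δ = sqrt(2 / (μ * σ * ω))

so penetration depends on the angular frequency ω and on the material's conductivity σ and permeability μ, and as far as I know the electric and magnetic components of a propagating wave attenuate together rather than separately.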
When the electromagnetic wave hits a substance, it splits into separate electric and magnetic waves. The electric wave does not penetrate deeply, but magnetic waves can go through anything. Magnetic waves can only be diverted, not blocked.
But actually everything is merely waves and fields.
There's going to be a time when humans finally reconcile the quantum with the Newtonian -- and I can't wait for that day.