
> It's not. It only has this Schmidt decomposition in one basis. In other bases there will be cross terms among the basis elements. What you're doing is privileging Schmidt bases as ones that give experiences. In another basis with states w,z say the state will be: |w>|w> + |z>|z> + |w>|z> + |z>|w>

The state's evolution will be completely equivalent to (a linear superposition of) the evolution of |x>|x> and |y>|y>. That's a physically observable fact that's independent of your choice of basis (it's less obvious in the |z>/|w> basis, but it's still true).

Any physically valid concept of "experience" would have to behave the same way. If your state is equivalent to a linear superposition of "experiencing x" and "experiencing y" then it can be characterised completely in terms of "experiencing x" and "experiencing y", and that's not dependent on your choice of basis (though it may be easier to see in one basis or another).

> No you can't, it's a direct consequence of the Kochen-Specker theorem. If the device is treated quantum mechanically and it enters an entangled state of the form you gave then you cannot perform conditioning as the Kochen-Specker theorem, via the non-uniqueness of Hilbert space orthogonal decompositions, prevents an unambiguous formulation of Bayes's law. I can link to papers proving this if you wish.

> The fact that we do experiments where we can condition is, in light of this theorem, a demonstration that our measurement devices do not enter into the kind of CHSH states you're giving.

I don't know what you're trying to claim here. All the available evidence is that measurement devices, being ordinary physical objects, follow the laws of quantum mechanics, and that includes conditioning behaving as entanglement; if you've got evidence that that's not the case then a Nobel prize awaits. Non-uniqueness is a red herring, because choice of basis does not and cannot change experimental predictions; the basis exists only in the map, not the territory.
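To make the map/territory point concrete, here's a minimal numpy sketch (the state, observable, and rotation angle are arbitrary choices of mine, not from any of the linked papers): rewriting the same vector in a rotated basis introduces cross terms, but every expectation value is untouched.

```python
# Sketch: the same two-qubit state in two coordinate systems.
# Cross terms appear in the rotated basis, but predictions don't change,
# because the basis change is just a relabeling of coordinates.
import numpy as np

psi = np.array([0.8, 0, 0, 0.6])                 # 0.8|xx> + 0.6|yy>

# Rotate each subsystem by 30 degrees: new labels w, z.
t = np.pi / 6
R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
U = np.kron(R, R)
psi_wz = U.T @ psi                               # same vector, new coordinates

# An arbitrary observable, expressed in each coordinate system.
O = np.array([[1, 0, 0, 0],
              [0, -1, 2, 0],
              [0, 2, -1, 0],
              [0, 0, 0, 1]], dtype=float)
O_wz = U.T @ O @ U

print(np.round(psi_wz, 3))                       # all four cross terms nonzero
print(psi @ O @ psi, psi_wz @ O_wz @ psi_wz)     # identical predictions
```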



Continuing from the longer post below.

The device has to have its contextual observable algebra develop a non-trivial center, not just become entangled, as discussed in section 5 of the paper I linked. It's well known that entanglement alone isn't enough, which again is why entanglement alone has been called "pre-measurement" since the 1980s.
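For anyone following along, here's a minimal numpy sketch of the pre-measurement stage (a toy CNOT coupling of my own choosing, not the construction in the paper): entangling with a pointer removes the interference terms of the system's reduced state, but the global state is still a pure superposition, so by itself this does not select a single outcome.

```python
# Sketch: "pre-measurement" as entanglement with a toy pointer qubit.
import numpy as np

a, b = 0.6, 0.8
sys0 = np.array([a, b])                  # system: a|0> + b|1>
pointer0 = np.array([1.0, 0.0])          # pointer ready state |0>

before = np.kron(sys0, pointer0)         # product state, no coupling yet
# CNOT-style coupling: the pointer copies the system's basis label.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
after = cnot @ before                    # a|00> + b|11>: pre-measurement

def reduced_system(psi):
    """Partial trace over the pointer qubit."""
    m = psi.reshape(2, 2)                # row index: system, column: pointer
    return m @ m.conj().T

print(np.round(reduced_system(before), 3))  # off-diagonals a*b survive
print(np.round(reduced_system(after), 3))   # diagonal: interference gone
```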

Note how this involves hard mathematics, not vague talk about "obvious features of subjective experience". I'll also note that this is a general feature of discussions about this stuff among non-physicists online, especially in programming communities like this one: the knowledge is stuck in the late 1970s.


> The state's evolution will be completely equivalent to (a linear superposition of) the evolution of |x>|x> and |y>|y>. That's a physically observable fact that's independent of your choice of basis (it's less obvious in the |z>/|w> basis, but it's still true).

Of course the state can be written in the form |xx> + |yy>. I never denied that. The point is that it can be written in other forms. So it's equally correct to say we'd "experience" |zz> + |ww> + |zw> + |wz> as to say we'd experience |xx> + |yy>, so there's no reason to say we'd "obviously" experience only the latter. Your argument is just "that expansion is always available", but since other expansions are also always available, I don't see what the force of this argument is.
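The non-uniqueness is easy to check numerically for a maximally entangled state, where degenerate Schmidt coefficients make the |xx> + |yy> form available in infinitely many bases (a standard textbook fact; the concrete vectors below are my own illustration):

```python
# Sketch: one vector, two Schmidt forms.
# (|00> + |11>)/sqrt(2) is also exactly (|++> + |-->)/sqrt(2).
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

plus = np.array([1, 1]) / np.sqrt(2)     # |+>
minus = np.array([1, -1]) / np.sqrt(2)   # |->
same_state = (np.kron(plus, plus) + np.kron(minus, minus)) / np.sqrt(2)

print(np.allclose(bell, same_state))     # True: two expansions, one vector

# The degeneracy behind the non-uniqueness is visible in the singular
# values of the coefficient matrix.
print(np.linalg.svd(bell.reshape(2, 2), compute_uv=False))  # [0.707, 0.707]
```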

Even worse, in QFT there isn't an expansion of the form |xx> + |yy> available, due to the Reeh-Schlieder theorem, so your whole construction is moot anyway. Again, where is the paper deriving the Born rule from unitarity and basic facts about subjective experience?

> I don't know what you're trying to claim here. All the available evidence is that measurement devices, being ordinary physical objects, follow the laws of quantum mechanics, and that includes conditioning behaving as entanglement.

I'm claiming a consequence of a well-known theorem from quantum probability. See section 4.2 of this paper: https://arxiv.org/abs/1310.1484

Quantum states without superselection (e.g. the entangled states of the form you are considering) leave Bayesian conditioning undefined. As the paper mentions this is a direct consequence of the Kochen-Specker theorem via non-unique orthogonal expansion. It's not a red herring but a rigorously proved theorem.

I don't know what the "Nobel prize" remark is about, as it is well known that entanglement alone doesn't give well-defined conditioning. That's why entanglement with the device alone is called "pre-measurement" in most papers on measurement theory, following terminology introduced by Zurek in the early 80s. A good example of the issues with pre-measurement alone is here: https://arxiv.org/abs/2003.07464. You can't just treat the device as entering some CHSH- or GHZ-style entangled state and think that solves everything about measurement. It doesn't, per the theorem in the paper above (among other issues).
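For reference, the kind of CHSH-style correlations being discussed, in a minimal numpy sketch (standard textbook measurement angles, nothing from the linked papers): the Bell state reaches the Tsirelson bound 2*sqrt(2), past the classical CHSH bound of 2.

```python
# Sketch: CHSH value of the Bell state with measurements in the z-x plane.
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # (|00> + |11>)/sqrt(2)

def A(theta):
    """Spin measurement at angle theta in the z-x plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

def E(ta, tb):
    """Correlator <A(ta) (x) A(tb)> in the Bell state."""
    return bell @ np.kron(A(ta), A(tb)) @ bell

a, a2, b, b2 = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(S, 2 * np.sqrt(2))                        # both ~2.828
```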


> Of course the state can be written in the form |xx> + |yy>. I never denied that. The point is that it can be written in other forms. So it's equally correct to say we'd "experience" |zz> + |ww> + |zw> + |wz> as to say we'd experience |xx> + |yy>, so there's no reason to say we'd "obviously" experience only the latter. Your argument is just "that expansion is always available", but since other expansions are also always available, I don't see what the force of this argument is.

If there's a simple description of the wavefunction that's valid then there should be a correspondingly simple description of our experiences that's valid. The fact that there's also a more complicated valid description of the wavefunction is neither here nor there. It's like looking at a basket of 4 apples and asking why your experience doesn't correspond to there being 6 - 2 apples.

> Quantum states without superselection (e.g. the entangled states of the form you are considering) leave Bayesian conditioning undefined. As the paper mentions this is a direct consequence of the Kochen-Specker theorem via non-unique orthogonal expansion. It's not a red herring but a rigorously proved theorem.

OK, I take your point: saying that we can just condition was overly flippant. If there are cross terms (i.e. entanglement), then classical conditional probability doesn't always accurately describe the behaviour of a system, and of course that's true for a system that includes experimenters inside it. But if we treat an experimenter's conditioning as creating entanglement, like any other QM interaction, and treat the subsequent evolution of the system quantum-mechanically, then there's no problem.
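Here's the cross-term point in a minimal numpy sketch (a Hadamard as a toy interference step is my own choice of example): a superposition and a 50/50 classical mixture assign identical probabilities to a direct measurement, so naive classical conditioning can't tell them apart, but a subsequent interference step can, which is why conditioning has to be handled quantum-mechanically while cross terms survive.

```python
# Sketch: superposition vs classical mixture under an interference step.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: interference step

psi = np.array([1, 1]) / np.sqrt(2)            # superposition (|0> + |1>)/sqrt(2)
rho_mix = np.eye(2) / 2                        # classical 50/50 mixture

p_super = np.abs(H @ psi) ** 2                 # cross terms interfere: [1, 0]
p_mix = np.diag(H @ rho_mix @ H.conj().T)      # no cross terms: [0.5, 0.5]
print(p_super, p_mix)
```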

> A good example of the issues with pre-measurement alone is here: https://arxiv.org/abs/2003.07464. You can't just treat the device as entering some CHSH- or GHZ-style entangled state and think that solves everything about measurement. It doesn't, per the theorem in the paper above (among other issues).

That paper amounts to nothing more than redefining "outcome" as something that cannot be in a superposition, and then using this redefinition to argue that it makes their unfounded notion of decoherence physically meaningful. If we assume that experimenters are physical systems that can undergo superpositions like any other, then of course Bell-style "no hidden variables" results apply when those variables are the outcomes of experiments. Big whoop. (Would you find the following argument convincing: "Pre-measuring the polarisation of the photon might have one of two possible results, so it doesn't have an outcome according to any reasonable notion of 'outcome'. Therefore if any observer has measured a photon's polarisation, a physically meaningful process of decoherence must have occurred"? Put like that, it's hopefully obvious that this is nothing more than asserting the primacy of the Copenhagen interpretation.)

> Note how this involves hard mathematics, not vague talk about "obvious features of subjective experience". I'll also note that this is a general feature of discussions about this stuff among non-physicists online, especially in programming communities like this one: the knowledge is stuck in the late 1970s.

Look, I'm not a big fan of credentialism, but I do have a master's in this from a reputable institution. If working physicists have found a compelling argument that there's something mysterious about measurement or experience, then that knowledge hasn't made its way into even taught postgrad courses, let alone the wider public, and the blame for that has to rest with the physicists. (I rather suspect that there's no such argument that has reached any significant consensus among working physicists, and that the "late 1970s" view in the public sphere reflects that.)



