Once again, Nature News delivers reporting that is accessible yet exciting and deep:
He suspects that a strange quantum concept known as negative probability — negative dips in the probability distribution of a particle’s location or momentum — could be at the heart of the issue. These dips may mean that a measuring device disturbs the system less than the uncertainty principle seems to allow. “The fact these two different definitions give you a different answer is telling you something about the weirdness of quantum mechanics,” says Wiseman.
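The "negative dips" Wiseman refers to show up in the Wigner quasiprobability distribution, which behaves like a joint probability distribution over position and momentum except that it can go negative for genuinely non-classical states. As a minimal illustration (my own sketch, not the method used in the study), here is the standard closed-form Wigner function of a harmonic-oscillator number state, which is negative at the origin for a single photon:

```python
import numpy as np
from scipy.special import eval_laguerre

def wigner_fock(n, x, p):
    """Wigner quasiprobability of the n-th harmonic-oscillator
    number (Fock) state at phase-space point (x, p), with hbar = 1.
    Standard formula: W_n = (-1)^n/pi * exp(-r^2) * L_n(2 r^2)."""
    r2 = x**2 + p**2
    return (-1)**n / np.pi * np.exp(-r2) * eval_laguerre(n, 2 * r2)

# The vacuum state (n = 0) is an ordinary Gaussian, positive everywhere:
print(wigner_fock(0, 0.0, 0.0))  # 1/pi ~ 0.318
# The single-photon state (n = 1) dips negative at the origin,
# so it cannot be read as a classical probability density:
print(wigner_fock(1, 0.0, 0.0))  # -1/pi ~ -0.318
```

Integrating this function over momentum still recovers the usual (non-negative) position distribution; the negativity only appears in the joint phase-space picture, which is exactly what makes it a signature of quantum weirdness rather than an experimental artifact.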
Related to one of my early posts on the blog:
One of the most fascinating and, to my mind, central questions of contemporary physics is the ontological status of quantum objects: does the quantum wavefunction describe reality as it is, or merely our possible knowledge of it? A related question is: where does the boundary between the quantum and the classical lie? Earlier assumptions that quantum effects are confined to extremely small systems (see also a TED talk on the topic), to non-biological systems, or to extremely cold systems have all been vigorously pushed back by improvements in experimental techniques.
Update (12.9.2012): a new article was just posted on Nature News on the limitations of Heisenberg’s original formulation of the uncertainty principle, which is quite relevant to this discussion. The bottom line is that the uncertainty is not the result of the measurement perturbing the object, but an inherent property of quantum systems. Ars Technica again has an extremely lucid story on the study.
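To make the distinction concrete (my own gloss, not taken from the article): the relation found in modern textbooks, due to Robertson, bounds the intrinsic spreads of the state itself,

```latex
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2},
```

whereas Heisenberg's original reading was about measurement: with $\varepsilon(x)$ the error of a position measurement and $\eta(p)$ the disturbance it causes to momentum, it would say $\varepsilon(x)\,\eta(p) \ge \hbar/2$. It is this error-disturbance form that the new experiments show can fail; Ozawa's corrected inequality,

```latex
\varepsilon(x)\,\eta(p) \;+\; \varepsilon(x)\,\sigma_p \;+\; \sigma_x\,\eta(p) \;\ge\; \frac{\hbar}{2},
```

holds instead, and it permits the product $\varepsilon(x)\,\eta(p)$ alone to dip below $\hbar/2$, which is why the uncertainty survives as an inherent property of the state even when a clever measurement disturbs the system less than naively expected.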