Friday, February 17, 2023

A Superposition among Quantum Physicists

1. Introduction

The entirety of Modern Physics is built on two fundamental theories: Relativity and Quantum Mechanics. Apart from being entirely different in their mathematical and theoretical frameworks, these theories are also fundamentally distinct in their historical origins. The Theory of Relativity was born out of an essentially theoretical construction. Of course, the theory was carefully mounted on previous knowledge whose origin was experimental in nature (the failure to detect the ether, the constancy of the speed of light, the breakdown of Galilean Relativity, etc.), but a fair share of this information was itself fairly theoretical.

Quantum Physics, on the other hand, emanated from a string of seemingly unrelated experiments. It was as if nearly simultaneous events and discoveries in the physics of the late 19th and early 20th centuries suddenly connected together to produce the rigid theory of Quantum Mechanics. One could argue that the true beginning of Quantum Mechanics lay in the studies of Black-Body radiation and the "Ultraviolet Catastrophe". But then I'd interject with the Stern-Gerlach Experiment, which contributed just as much to the development of Quantum Theory. De-Broglie's Hypothesis, which started out quite independently, later connected with Schrodinger's formulation of the wavefunction, while a nearly parallel "matrix formulation" of Quantum Mechanics was being developed by Heisenberg. This wasn't the case with Relativity at all. There were no parallel, uncorrelated formulations of the theory. It was a product given solely by Einstein, supplemented by great minds like Minkowski, Lorentz and Schwarzschild, and tested by experimentalists like Arthur Eddington. What difference does this make? The question can be answered by analyzing the various interpretations of Quantum Theory and, to put it gently, the present haze of mild confusion surrounding it.

2. Difference Between a Theory and a Model

It is a common observation that whenever a scientific idea relies too heavily on observational and experimental information, rather than depending at least moderately on theoretical advances, the idea becomes overly empirical and loses its flavor of intuition. This is frequently seen in Chemistry, where many ideas rest on empirical observations of chemical experiments and reactions. An unpopular opinion, perhaps, but the same can be observed in Physics as well, particularly in the field of Observational Cosmology. Much of present-day Observational Cosmology consists of generating cosmological models to fit the observed astronomical data (supernovae, redshifts and the like) and then constraining the cosmological parameters using that same data.



This particular approach of fitting a model to experimental data is not well suited to producing an intuitive idea, and such models often fade away before gaining much recognition. It is like tailoring a piece of clothing for a person and then cutting the cloth down in some places and stitching it up in others to make it fit. The piece of clothing loses its value and continuity. The same difference exists between a theory which happens to explain the present experimental observations and a theory which is molded and bent to forcefully fit them. Another important feature is that the former usually continues to explain and accommodate future experimental discoveries. Newton's Laws of Motion and Gravitation and Einstein's Theory of Relativity are two strikingly spectacular examples of theories which solved the mechanical problems of their times but have continued to appear in experiments and applications to this day.

This elegance of a theory is what distinguishes it from being a mere model of reality. Of course, this is my individual opinion and it may be quite controversial, to say the least. Most people will object that theories and models are synonymous in the realm of Physics. However, a model is something constructed to replicate or mold an observed phenomenon or event; it doesn't extend much beyond the domain of that phenomenon. A theory, on the other hand, goes well beyond this limit and even continues to apply in some interdisciplinary fields. Thus, a theory is a strong indicator that we are moving in the right direction, whereas a model should be viewed as a temporary tool for understanding reality and obtaining a first-hand solution to the present experimental problems.

3. The Haze of Confusion

If one were to analyze the nature of Quantum Theory on the grounds of the preceding discussion, it would appear that Quantum Mechanics leans more towards being a "model" than a theory. But in truth, it is neither. The conventionally accepted origin of Quantum Physics in Planck's idea of quantizing energy, back in 1900, gives the impression that it was made to fit the observations of Black-Body radiation and to solve the "Ultraviolet Catastrophe". However, it should be noted that Planck himself stumbled upon this idea unintentionally and even dismissed it as little more than a mathematical trick until it was revived by Einstein through his explanation of the Photoelectric Effect. Planck's idea simply happened to resolve the infinities in the Black-Body spectrum. It then took De-Broglie to make quite a bold move and assert that "if light can follow Wave-Particle Duality, then why not matter?", and the rest is history.
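For concreteness, the two relations at the heart of this story are Planck's quantization rule and De-Broglie's matter-wave hypothesis:

E = nhν    (Planck: the energy of an oscillator of frequency ν comes in whole-number multiples of hν)

λ = h/p    (De-Broglie: a particle of momentum p is associated with a wave of wavelength λ)

where h is Planck's constant. De-Broglie essentially took the relation already known for light quanta and carried it over to matter.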

What's the confusion about, then? It's about the various interpretations that circle around the experimental consequences of Quantum Mechanics. When Heisenberg gave the famous Uncertainty Principle, constraining the precision with which we can measure a particle's position and momentum simultaneously, he had something different in mind. In fact, he blamed this stubborn Uncertainty Principle on the measuring apparatus and our inability to probe a particle's position without disturbing it.

Imagine an extremely high-power microscope which can be used to view particles at subatomic scales by shining light on them. This microscope has a knob to change the wavelength of the light which is shone on the particle. The basic principle of "seeing" any object is bouncing light, or electromagnetic radiation, off the object; the reflected light then enters our eyes and we see it. One basic condition which needs to be obeyed when "seeing" something is that the physical size of the object should be larger than the wavelength of the light, or conversely, the light we bounce off the object should have a sufficiently short wavelength compared to the size of the object. Otherwise, the light would simply diffract around it without resolving it (as happens with radio waves and everyday objects). So we tune the wavelength knob of our microscope and make the wavelength small enough that the light is reflected off the subatomic particle. But here we run into a big problem: it is a known fact that light itself carries energy and momentum and exerts a pressure when incident on objects, called "radiation pressure". On large scales, this pressure is too feeble to produce any measurable changes in macroscopic objects. In the case of a microscopic particle, however, the light bouncing off it imparts a momentum kick to the particle. Therefore, by attempting to measure the particle's position with great precision, we perturb its momentum. This thought experiment came to be known as "Heisenberg's Microscope".
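In rough numbers, the textbook version of this estimate runs as follows: light of wavelength λ collected through a lens of half-angle θ can locate the particle no better than

Δx ≈ λ / (2 sin θ)

while the scattered photon, carrying momentum h/λ, can arrive anywhere within the lens aperture, leaving the particle's momentum along the screen uncertain by roughly

Δp ≈ 2h sin θ / λ

so that the product Δx·Δp ≈ h no matter how the wavelength knob is tuned: a shorter wavelength sharpens the position but delivers a harder momentum kick.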

It was a good physical interpretation of Heisenberg's Principle, whose mathematical basis is that the position (x̂) and momentum (p̂) operators in Quantum Mechanics do not commute, i.e.

x̂p̂ − p̂x̂ = iℏ

where ℏ = h/2π is the reduced Planck constant; this commutator is what leads to the familiar bound Δx·Δp ≥ ℏ/2.

But then physicists started objecting that this physical interpretation via Heisenberg's microscope is not valid. The Uncertainty Principle is not a consequence of the limitations of our measuring ability; in fact, it's not a consequence of our measurements at all. It stems from the fact that in Quantum Mechanics, position and momentum are what is known as a "Fourier Pair", just like Energy and Time (or, equivalently, Frequency and Time). A particle can only have a precise position in space if it is localized at a single point. According to De-Broglie's theory, the momentum carried by a particle is inversely proportional to the wavelength of its matter wave. A particle with an infinitely precise position, however, corresponds to a superposition of matter waves of every possible wavelength, which leaves its momentum completely undetermined. On the other hand, a particle with a well-defined momentum is spread out in space as a matter wave with a well-defined wavelength, and then it has no fixed position. By this argument, a particle intrinsically cannot have an arbitrarily well-defined position and momentum simultaneously. Which explanation is true, then? The more intuitive Heisenberg's microscope, or the abstract mathematical explanation in terms of Fourier pairs? Or could the two simply be different ways of looking at the same phenomenon? The answer is really not known. There is a fair amount of research arguing against the "Heisenberg's Microscope" picture, showing that the disturbance caused by a measurement can be smaller than what the microscope argument predicts, even though the intrinsic uncertainty relation itself still holds.
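To make the Fourier-pair argument above concrete, here is a minimal numerical sketch (assuming numpy and working in units where ℏ = 1): a Gaussian wave packet that is squeezed in position automatically spreads in momentum, and the product of the two widths stays pinned near ℏ/2.

import numpy as np

hbar = 1.0                                    # natural units
x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]

for sigma in (0.5, 2.0, 8.0):                 # spatial width of the wave packet
    psi = np.exp(-x**2 / (4 * sigma**2))      # Gaussian wavefunction in position space
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

    # momentum-space wavefunction via a discrete Fourier transform (p = hbar * k)
    phi = np.fft.fftshift(np.fft.fft(psi)) * dx
    p = hbar * 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
    dp = p[1] - p[0]
    phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)

    # standard deviations of position and momentum from the two probability densities
    prob_x = np.abs(psi)**2 * dx
    prob_p = np.abs(phi)**2 * dp
    delta_x = np.sqrt(np.sum(prob_x * x**2) - np.sum(prob_x * x)**2)
    delta_p = np.sqrt(np.sum(prob_p * p**2) - np.sum(prob_p * p)**2)

    print(f"sigma = {sigma:4.1f}   delta_x * delta_p = {delta_x * delta_p:.3f}   (hbar/2 = {hbar/2})")

Squeezing the packet (smaller sigma) sharpens the position but broadens the momentum distribution by exactly the compensating amount; the product never drops below ℏ/2.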

Some hardcore "Copenhagen Interpreters" of Quantum Theory go even as far as claiming that the notion of position of  particle has no meaning. The particle in fact has no position prior to any measurement and the act of measurement forces the particle to take a position and materialize into existence. It is as if the existence of particles is a manifestation of our own attempt to "look" at them. This is the idea behind the claim that "Reality does not exist when you are not looking at it." There could be a giant elephant behind you as you are reading this post until you decide to look back. I am unaware of the correctness of this statement but it sure as hell is creepy.

4. A Superposition among Quantum Theorists

You may have noticed that the coherence and flow of this blogpost started to diverge in the final paragraph of the previous section. That is because this is where we cross the threshold up to which all physicists agree with each other: the wave nature of matter, the discreteness of physical variables and the Uncertainty Principle. After that, it's all a matter of interpreting the empirical results in one's own way. Some say the Uncertainty Principle is due to our measuring limitations, while others say it's an intrinsic property. Some say that a particle has no position prior to measurement, while others say its position is simply unknown to us because of "Hidden Variables". The 2022 Nobel Prize in Physics was awarded for experiments demonstrating the violation of Bell's Inequalities, which rules out local Hidden Variables; non-local hidden variable theories, however, are still not excluded. There are loads of other interpretations built around the core principles of Quantum Mechanics that I won't touch here, like the Many Worlds Interpretation, Pilot Wave Theory (also known as Bohmian Mechanics), gravity-induced collapse models, and so on. But this is a well-known difference of opinion, and it is true that the different interpretations make no difference to the applications of Quantum Mechanics. Quantum computers would still work whether there exist many worlds in which every branch of the superposition is realized, or whether the particle is piloted by a guiding wave.
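As an aside on the Bell test mentioned above, here is a minimal sketch (assuming numpy) of the CHSH form of Bell's Inequality evaluated for a two-qubit singlet state: any local hidden variable model must satisfy |S| ≤ 2, while the quantum prediction for these particular measurement angles reaches 2√2.

import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    # spin measurement along a direction at angle theta in the x-z plane
    return np.cos(theta) * sz + np.sin(theta) * sx

# the singlet state (|01> - |10>) / sqrt(2)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(theta_a, theta_b):
    # correlation <A (x) B> between Alice's and Bob's measurement outcomes
    AB = np.kron(spin(theta_a), spin(theta_b))
    return np.real(singlet.conj() @ AB @ singlet)

a, a2 = 0.0, np.pi / 2            # Alice's two measurement settings
b, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"|S| = {abs(S):.4f}   (local hidden variables demand |S| <= 2)")

The experiments recognized by the 2022 prize measure this kind of quantity on real entangled pairs and find values above 2, which is what rules out the local hidden variable picture.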

The most striking disagreement, however, is over the nature of the most fundamental quantity in Quantum Physics: the wavefunction. Its defining characteristic is that it contains all the information that can be known about a physical quantum system. The absolute square of the wavefunction gives the probability distribution for finding the particle at a particular position. By its nature, the wavefunction is a complex-valued function living in an infinite-dimensional Hilbert Space, so it is awkward to think of it as a tangible quantity in real space. Yet the wavefunction and the Schrodinger Wave Equation are themselves rooted in the fundamental De-Broglie Hypothesis, which relates the so-called "matter waves" to momentum. De-Broglie himself believed that these matter waves are physical in nature, like ripples on the surface of a pond or electromagnetic waves. Later, Max Born's statistical interpretation put forth the probabilistic nature of these matter waves, and since it is hard to visualize how a wave of probabilities can be physical, the physical view of matter waves was discarded. The unanswered question is: has it really been discarded? There is still a subset of quantum theorists who maintain that matter waves are physical in nature. And why shouldn't they be, after all? If one can observe the usual wave effects of interference and diffraction, then what separates these waves from real physical waves?
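Whatever their ontological status, the way those interference effects are predicted is purely through the complex wavefunction and the Born rule. Here is a small illustrative sketch (assuming numpy; the wavelength, slit separation and screen distance are made-up numbers, not data from any particular experiment): superpose the matter waves arriving from two slits, and the observable |ψ|² develops fringes.

import numpy as np

wavelength = 50e-12                      # assumed de Broglie wavelength (m)
k = 2 * np.pi / wavelength               # wavenumber of the matter wave
d = 10e-6                                # assumed slit separation (m)
L = 1.0                                  # assumed slit-to-screen distance (m)
x = np.linspace(-20e-6, 20e-6, 2000)     # positions on the screen (m)

# path lengths from each slit to every point on the screen
r1 = np.sqrt(L**2 + (x - d / 2)**2)
r2 = np.sqrt(L**2 + (x + d / 2)**2)

# complex amplitudes from the two slits and their superposition
psi = np.exp(1j * k * r1) + np.exp(1j * k * r2)

# Born rule: the measurable quantity is |psi|^2, and that is where the fringes live
prob = np.abs(psi)**2

# locate the bright fringes and compare their spacing with the textbook formula
peaks = x[1:-1][(prob[1:-1] > prob[:-2]) & (prob[1:-1] > prob[2:])]
print(f"fringe spacing (numerical):   {np.mean(np.diff(peaks)) * 1e6:.2f} micrometres")
print(f"fringe spacing (lambda*L/d):  {wavelength * L / d * 1e6:.2f} micrometres")

The point of the sketch is only that the thing being added and squared is irreducibly complex-valued, while the thing that hits the detector is a perfectly real pattern; the debate is about what, if anything, the former actually is.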

I think the real problem in Quantum Physics is this: if matter waves are abstract and non-physical, then their transition into the real world, their manifestation in experiments demonstrating the wave nature of matter, is not clearly defined. After all, there should be some mechanism to facilitate this transition. If matter waves were entirely abstract and the wavefunction always lived in a non-physical space, then these effects should only show up in the mathematics, not on a detector screen. Ehrenfest's Theorem gives only a quantitative account of how quantum problems reduce to their classical counterparts on average, stating that the expectation values obey Newton-like equations of motion,

d⟨x⟩/dt = ⟨p⟩/m    and    d⟨p⟩/dt = ⟨−∂V/∂x⟩

and this same spirit of averaging appears in many other corners of physics and statistics: the Central Limit Theorem, the reduction of electrical noise by increasing the averaging time, the elimination of random errors by averaging repeated measurements, and so on. We still fall short of a way to connect the abstract mathematics of Quantum Theory with its physical consequences. If the wavefunction isn't physical, then what are those ripples that we observe in real images of atoms, such as the one below?

Image from the short film A Boy and His Atom (Credit: IBM)



There isn't even a common, rigid opinion among theorists who support the same interpretation. This might point towards the fact that the formalism of Quantum Physics is still incomplete and that the mathematics we use now might just be a "model", sufficient for the moment to describe the observed phenomena and to make things like quantum computers, semiconductors and lasers work. Even so, the model starts to fall apart when we stretch it too far and ask meaningful questions about the physical nature of quantum systems. Perhaps in the future we will get our hands on a more extensible theory of Quantum Physics which will answer these questions and collapse this superposition among Quantum Physicists.


- Thank You