INTRODUCTION TO QUANTUM PHYSICS

Quantum physics is the science of the very small: the body of scientific principles that explains the behaviour of matter and its interactions with energy on the scale of *atoms and subatomic particles*. A great many findings of quantum physics are counter-intuitive and appear to contradict common sense. The findings are based on complex mathematics, and a thorough understanding of quantum physics requires a high order of mathematical skill. What follows has been simplified as much as possible (without losing the essential aspects of our current understanding of quantum physics) and keeps the mathematics to an absolute minimum. Despite this, there is no getting round the fact that quantum physics is a difficult subject to understand, so it is strongly recommended that the video clips be viewed to aid your understanding of this complex subject.

**THE DEVELOPMENT OF QUANTUM PHYSICS**

Classical physics explains matter and energy on a scale familiar to human experience, including the behaviour of astronomical bodies. It remains the key to measurement for much of modern science and technology. However, toward the end of the 19th century, scientists discovered phenomena in both the large (macro) and the small (micro) worlds that classical physics could not explain.

Coming to terms with these limitations led to a major revolution in physics and created a shift in the original scientific paradigm, first with the theory of relativity, later with the development of quantum mechanics. Physicists discovered the limitations of classical physics and developed the main concepts of the quantum theory that replaced it in the early decades of the 20th century. These concepts are described in roughly the order in which they were first discovered.

The word quantum means the minimum amount of any physical entity involved in an interaction. Certain characteristics of matter can take only discrete values.

Light behaves in some respects like particles and in other respects like waves. Matter, in the form of particles such as electrons and atoms, exhibits wavelike behaviour too. Some light sources, including neon lights, give off only certain discrete frequencies of light. Quantum mechanics shows that light, along with all other forms of electromagnetic radiation, comes in discrete units, called photons, and predicts its energies, colours, and spectral intensities.

*Some aspects of quantum mechanics can seem counterintuitive or even paradoxical, because they describe behaviour quite different from that seen at larger length scales.* In the words of Richard Feynman, quantum mechanics deals with "nature as She is – absurd". For example, the (Heisenberg) uncertainty principle of quantum mechanics means that the more closely one pins down one measurement (such as the position of a particle), the less precise another measurement pertaining to the same particle (such as its momentum) must become.

**The First Quantum Theory: Max Planck & Black-body Radiation**

Video: *Quantum Leap: Max Planck and Black Body Radiation*

Thermal radiation is electromagnetic (EM) radiation emitted from the surface of an object due to the object's temperature. If an object is heated sufficiently, it starts to emit light at the red end of the spectrum – it is red hot.

Heating it further causes the colour to change from red to yellow, white, and blue, as light at shorter wavelengths (higher frequencies) begins to be emitted. It turns out that a perfect emitter is also a perfect absorber. When it is cold, such an object looks perfectly black, because it absorbs all the light that falls on it and emits none. Consequently, an ideal thermal emitter is known as a black body, and the radiation it emits is called *black-body radiation.*

In the late 19th century, thermal radiation had been fairly well characterized experimentally. However, classical physics was unable to explain the relationship between temperatures and predominant frequencies of radiation. Physicists were searching for a single theory that explained why they got the experimental results that they did.

The first model that was able to explain the full spectrum of thermal radiation was put forward by Max Planck in 1900. He came up with a mathematical model in which the thermal radiation was in equilibrium with a set of harmonic oscillators. To reproduce the experimental results, he had to assume that each oscillator produced an integer number of units of energy at its single characteristic frequency, rather than being able to emit any arbitrary amount of energy. In other words, the energy of each oscillator was *quantized*. The quantum of energy for each oscillator, according to Planck, was proportional to the frequency of the oscillator; the constant of proportionality is now known as the *Planck constant*. The Planck constant, usually written as h, has the value of 6.63 × 10⁻³⁴ J s. So, the energy E of an oscillator of frequency f is given by

E = nhf, where n = 1, 2, 3, …
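As a quick numerical sketch of Planck's rule, the allowed energies of a single oscillator can be listed directly; the frequency used below is an arbitrary illustrative value, not one from the text:

```python
# Sketch: allowed energies of a single Planck oscillator, E = n*h*f.
h = 6.63e-34          # Planck constant, J s (value quoted in the text)
f = 5.0e14            # oscillator frequency in Hz (assumed example value)

quantum = h * f       # the smallest allowed step of energy, hf
allowed = [n * quantum for n in (1, 2, 3)]   # E = nhf for n = 1, 2, 3

print(f"energy quantum hf = {quantum:.3e} J")
for n, E in zip((1, 2, 3), allowed):
    print(f"n = {n}: E = {E:.3e} J")
```

The key point the numbers make is that no energy between the listed values is possible: the oscillator cannot emit, say, half a quantum.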

Planck's law was the first quantum theory in physics, and Planck won the Nobel Prize in 1918 "in recognition of the services he rendered to the advancement of Physics by his discovery of energy quanta". At the time, however, Planck's view was that quantization was purely a mathematical trick, rather than (as we now believe) a fundamental change in our understanding of the world.

**Photons: The Quantisation of Light**

Video: *Chemistry: The Nature of Light*

In 1905, Albert Einstein took an extra step. He suggested that quantisation was not just a mathematical trick: the energy in a beam of light occurs in individual packets, which are now called photons. The energy of a single photon is given by its frequency multiplied by Planck's constant:

E = hf

For centuries, scientists had debated between two possible theories of light: was it a wave or did it instead comprise a stream of tiny particles? By the 19th century, the debate was generally considered to have been settled in favour of the wave theory, as it was able to explain observed effects such as *refraction, diffraction, and polarization*.

James Clerk Maxwell had shown that electricity, magnetism, and light are all manifestations of the same phenomenon: the electromagnetic field. Maxwell's equations, which are the complete set of laws of classical electromagnetism, describe light as waves: a combination of oscillating electric and magnetic fields. Because of the preponderance of evidence in favour of the wave theory, Einstein's ideas were met initially with great scepticism. Eventually, however, the photon model became favoured. One of the most significant pieces of evidence in its favour was its ability to explain several puzzling properties of the photoelectric effect, described in the following section. Nonetheless, the wave analogy remained indispensable for helping to understand other characteristics of light, such as diffraction.

**The Photoelectric Effect**

Video: *The photo-electric effect explained*

In 1887, Heinrich Hertz observed that when light of sufficiently high frequency hits a metallic surface, the surface emits electrons. In 1902, Philipp Lenard discovered that the maximum possible energy of an ejected electron is related to the *frequency of the light*, not to its intensity: if the frequency is too low, no electrons are ejected regardless of the intensity. The lowest frequency of light that can cause electrons to be emitted, called the *threshold frequency*, is different for different metals. __This observation is at odds with classical electromagnetism, which predicts that the electron's energy should be proportional to the intensity of the radiation.__

Einstein explained the effect by postulating that a beam of light is a stream of particles (photons) and that, if the beam is of frequency f, then each photon has an energy equal to hf. An electron is likely to be struck only by a single photon, which imparts at most an energy hf to the electron. Therefore, the intensity of the beam has no effect and only its frequency determines the maximum energy that can be imparted to the electron.

To explain the threshold effect, Einstein argued that it takes a certain amount of energy, called the work function, denoted by φ, to remove an electron from the metal. *This amount of energy is different for each metal.* If the energy of the photon is less than the work function, then it does not carry sufficient energy to remove the electron from the metal. The threshold frequency, f0, is the frequency of a photon whose energy is equal to the work function:

φ = hf0
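Einstein's two relations can be sketched numerically; the work function used below is an assumed illustrative figure, not a measured value for any particular metal:

```python
# Sketch of Einstein's photoelectric relations: hf0 = phi, E_max = hf - phi.
h = 6.63e-34                 # Planck constant, J s
phi = 3.7e-19                # assumed work function of some metal, J

f0 = phi / h                 # threshold frequency: photon energy equals phi

def max_kinetic_energy(f):
    """Maximum energy of an ejected electron, or None below threshold."""
    surplus = h * f - phi
    return surplus if surplus > 0 else None

print(f"threshold frequency f0 = {f0:.3e} Hz")
print(max_kinetic_energy(1.0e15))   # above threshold: electrons ejected
print(max_kinetic_energy(4.0e14))   # below threshold: no emission (None)
```

Note that intensity appears nowhere in the function: a brighter beam ejects more electrons, but never more energetic ones.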

Einstein's description of light as being composed of particles extended Planck's notion of quantised energy: a single photon of a given frequency, f, delivers an invariant amount of energy, hf. In other words, individual photons can deliver more or less energy, but only depending on their frequencies. *However, although the photon is a particle, it was still being described as having the wave-like property of frequency*. Once again, the particle account of light was being compromised.

**Consequences of Light Being Quantised**

The relationship between the frequency of electromagnetic radiation and the energy of each individual photon is why ultraviolet light can cause sunburn, but visible or infrared light cannot. A photon of ultraviolet light will deliver a high amount of energy – enough to contribute to cellular damage such as occurs in a sunburn. A photon of infrared light will deliver a lower amount of energy – only enough to warm one's skin. So, an infrared lamp can warm a large surface, perhaps large enough to keep people comfortable in a cold room, but it cannot give anyone a sunburn.
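A rough comparison of the energy per photon makes the point concrete. The frequencies below are illustrative assumptions (roughly 300 nm ultraviolet and 10 μm infrared):

```python
# Sketch: energy per photon for ultraviolet vs infrared light, using E = h*f.
h = 6.63e-34        # Planck constant, J s
eV = 1.602e-19      # joules per electronvolt

uv = h * 1.0e15 / eV        # ultraviolet photon (~300 nm), in eV
ir = h * 3.0e13 / eV        # infrared photon (~10 micrometres), in eV

print(f"UV photon: {uv:.2f} eV")    # a few eV: comparable to chemical bonds
print(f"IR photon: {ir:.3f} eV")    # a small fraction of that: only warms
```

A few electronvolts is roughly the energy scale of chemical bonds, which is why the ultraviolet photon can contribute to cellular damage while the infrared one cannot, no matter how many infrared photons arrive.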

If each individual photon had identical energy, it would not be correct to talk of a high-energy photon. Light of high frequency could deliver more energy only by flooding a surface with more photons arriving per second, and light of low frequency could deliver less energy only by delivering fewer photons per second. If all photons carried the same energy, then doubling the rate of photon delivery would double the number of energy units arriving each second, regardless of the frequency of the incident light. Einstein rejected that wave-dependent classical approach in favour of a particle-based analysis, in which the energy of each particle is fixed and varies with frequency in discrete steps (i.e. it is quantised). *All photons of the same frequency have identical energy, and all photons of different frequencies have proportionally different energies.*

In nature, single photons are rarely encountered. The sun emits photons continuously at all electromagnetic frequencies, so they appear to propagate as a continuous wave, not as discrete units. The emission sources available to Hertz and Lenard in the 19th century shared that characteristic. A star that radiates red light or a piece of iron in a forge that glows red may both be said to contain a great deal of energy. It might be surmised that adding continuously to the total energy of some radiating body would make it radiate red light, orange light, yellow light, green light, blue light, violet light, and so on in that order. But that is not so, as larger stars and larger pieces of iron in a forge would then necessarily glow with colours more toward the violet end of the spectrum. To change the colour of such a radiating body, it is necessary to change its temperature. An increase in temperature changes the quanta of energy available to excite individual atoms to higher levels, enabling them to emit photons of higher frequencies.

The total energy emitted per unit of time by a star (or by a piece of iron in a forge) depends both on the number of photons emitted per unit of time and on the amount of energy carried by each of those photons. In other words, *the characteristic frequency of a radiating body is dependent on its temperature.* When physicists were looking only at beams of light containing huge numbers of individual and virtually indistinguishable photons, it was difficult to understand the importance of the energy levels of individual photons. So when physicists first discovered devices exhibiting the photoelectric effect, they initially expected that a higher intensity of light would produce a higher voltage from the photoelectric device. Instead, they discovered that strong beams of light toward the red end of the spectrum might produce no electrical potential at all, while weak beams of light toward the violet end of the spectrum would produce higher and higher voltages. *Einstein's idea that individual units of light may contain different amounts of energy, depending on their frequency, made it possible to explain experimental results that had hitherto seemed quite counterintuitive.*

Although the energy imparted by photons is invariant at any given frequency, the initial energy state of the electrons in a photoelectric device prior to absorption of light is not necessarily uniform. Anomalous results may occur in the case of individual electrons. For instance, an electron that was already excited above the equilibrium level of the photoelectric device might be ejected when it absorbed uncharacteristically low frequency illumination. Statistically, however, the characteristic behaviour of a photoelectric device will reflect the behaviour of the vast majority of its electrons, which will be at their equilibrium level. This point is helpful in comprehending the distinction between the study of individual particles in quantum dynamics and the study of massed particles in classical physics.

**The Quantisation of Matter: The Bohr Model of the Atom**

By the dawn of the 20th century, evidence required a model of the atom with a diffuse cloud of negatively-charged electrons surrounding a small, dense, positively-charged nucleus. These properties suggested a model in which the electrons circle around the nucleus like planets orbiting a sun. *However, it was also known that the atom in this model would be unstable*: according to classical theory, orbiting electrons undergo centripetal acceleration and should therefore give off electromagnetic radiation; the loss of energy would cause them to spiral toward the nucleus, colliding with it in a fraction of a second.

A second, related, puzzle was the emission spectrum of atoms. *When a gas is heated, it gives off light only at discrete frequencies.* For example, the visible light given off by hydrogen consists of four different colours, as shown in the picture below.

The intensity of the light at different frequencies is also different. By contrast, white light consists of a continuous emission across the whole range of visible frequencies. By the end of the nineteenth century, a simple rule had been found which showed how the frequencies of the different lines were related to each other, though without explaining why this was, or making any prediction about the intensities. The formula also predicted some additional spectral lines in ultraviolet and infrared light which had not been observed at the time. These lines were later observed experimentally, raising confidence in the value of the formula.
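The simple rule referred to above is the Rydberg formula. A short sketch can reproduce the four visible hydrogen lines from it (the value of the Rydberg constant is standard):

```python
# Sketch: the Rydberg formula, 1/lambda = R * (1/n1^2 - 1/n2^2).
# For the visible (Balmer) lines of hydrogen, the lower orbit is n1 = 2.
R = 1.097e7   # Rydberg constant, 1/m

def balmer_wavelength_nm(n2):
    """Wavelength of the hydrogen line for a jump from orbit n2 down to 2."""
    inv_wavelength = R * (1 / 2**2 - 1 / n2**2)   # in 1/m
    return 1e9 / inv_wavelength                    # convert metres to nm

# The four visible hydrogen lines come from n2 = 3, 4, 5, 6.
for n2 in (3, 4, 5, 6):
    print(f"n2 = {n2}: {balmer_wavelength_nm(n2):.0f} nm")
```

The n2 = 3 line comes out at about 656 nm (the red line) and n2 = 6 at about 410 nm (violet), matching the observed hydrogen spectrum; setting the lower orbit to 1 or 3 instead of 2 yields the ultraviolet and infrared series the formula predicted in advance.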

In 1913 Niels Bohr proposed a new model of the atom that included quantized electron orbits: electrons still orbit the nucleus much as planets orbit around the sun, but they are only permitted to inhabit certain orbits, not to orbit at any distance. When an atom emitted (or absorbed) energy, the electron did not move in a continuous trajectory from one orbit around the nucleus to another, as might be expected classically. *Instead, the electron would jump instantaneously from one orbit to another, giving off the emitted light in the form of a photon.* The possible energies of photons given off by each element were determined by the differences in energy between the orbits, and so the emission spectrum for each element would contain a number of lines.

Starting from only one simple assumption about the rule that the orbits must obey, the Bohr model was able to relate the observed spectral lines in the emission spectrum of hydrogen to previously-known constants. In Bohr's model the electron simply wasn't allowed to emit energy continuously and crash into the nucleus: once it was in the closest permitted orbit, it was stable forever. Bohr's model didn't explain why the orbits should be quantised in that way, and it was also unable to make accurate predictions for atoms with more than one electron, or to explain why some spectral lines are brighter than others.

Although some of the fundamental assumptions of the Bohr model were soon found to be wrong, the key result that the discrete lines in emission spectra are due to some property of the electrons in atoms being quantised is correct. The way that the electrons actually behave is strikingly different from Bohr's atom, and from what we see in the world of our everyday experience; this modern quantum mechanical model of the atom is discussed below.

**Wave-particle Duality**

Louis de Broglie in 1929. De Broglie won the Nobel Prize in Physics for his prediction that matter acts as a wave, made in his 1924 PhD thesis.

Just as light has both wave-like and particle-like properties, matter also has wave-like properties.

Matter behaving as a wave was first demonstrated experimentally for electrons: A beam of electrons can exhibit diffraction, just like a beam of light or a water wave. Similar wave-like phenomena were later shown for atoms and even small molecules.

The wavelength, λ, associated with any object is related to its momentum, p, through the Planck constant, h:

λ = h/p

The relationship, called the de Broglie hypothesis, holds for all types of matter: *all matter exhibits properties of both particles and waves.*
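A minimal sketch of the de Broglie relation shows why the wave character of matter only matters at very small scales; the masses and speeds below are illustrative assumptions:

```python
# Sketch: de Broglie wavelength, lambda = h / p, with p = m * v.
h = 6.63e-34   # Planck constant, J s

def de_broglie_wavelength(mass_kg, speed_m_s):
    return h / (mass_kg * speed_m_s)

# An electron at about 1% of the speed of light:
electron = de_broglie_wavelength(9.11e-31, 3.0e6)
# A thrown ball (assumed 0.16 kg at 30 m/s):
ball = de_broglie_wavelength(0.16, 30.0)

print(f"electron: {electron:.2e} m")   # comparable to atomic spacings
print(f"ball:     {ball:.2e} m")       # unimaginably small
```

The electron's wavelength is of the order of the spacing between atoms in a crystal, which is why electron diffraction is observable; the ball's wavelength is some twenty-four orders of magnitude smaller than an atomic nucleus, which is why everyday objects never visibly diffract.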

The concept of wave–particle duality says that neither the classical concept of "particle" nor of "wave" can fully describe the behaviour of quantum-scale objects, either photons or matter. Wave–particle duality is an example of the principle of complementarity in quantum physics. An elegant example of wave–particle duality, the double slit experiment, is discussed in the section below.

**Double-slit Experiment**

Video: *Dr Quantum - Double Slit Experiment*

The diffraction pattern produced when light is shone through one slit (top) and the interference pattern produced by two slits (bottom). The much more complex pattern from two slits, with its small-scale interference fringes, demonstrates the wave-like propagation of light.

In the double-slit experiment, as originally performed by Thomas Young in 1803 (and by Augustin Fresnel a decade later), a beam of light is directed through two narrow, closely spaced slits, producing an interference pattern of light and dark bands on a screen. If one of the slits is covered up, one might naively expect that the intensity of the fringes due to interference would be halved everywhere. In fact, a much simpler pattern is seen: a plain diffraction pattern centred diametrically opposite the open slit. Exactly the same behaviour can be demonstrated in water waves, and so the double-slit experiment was seen as a demonstration of the wave nature of light.

The double-slit experiment has also been performed using electrons, atoms, and even molecules, and the same type of interference pattern is seen. *Thus it has been demonstrated that all matter possesses both particle and wave characteristics.*

Even if the source intensity is turned down so that only one particle (e.g. photon or electron) is passing through the apparatus at a time, *the same interference pattern develops over time.* The quantum particle acts as a wave when passing through the double slits, but as a particle when it is detected. This is a typical feature of quantum complementarity: a quantum particle will act as a wave when we do an experiment to measure its wave-like properties, and like a particle when we do an experiment to measure its particle-like properties. The point on the detector screen where any individual particle shows up will be the result of a random process. However, the distribution pattern of many individual particles will mimic the diffraction pattern produced by waves.

Video: *Double Slit Experiment explained by Jim Al-Khalili*

**Application to the Bohr Model**

De Broglie expanded the Bohr model of the atom by showing that an electron in orbit around a nucleus could be thought of as having wave-like properties. In particular, an electron will be observed only in situations that permit a standing wave around a nucleus. An example of a standing wave is a violin string, which is fixed at both ends and can be made to vibrate. The electron's wavelength therefore determines that only Bohr orbits of certain distances from the nucleus are possible. In turn, at any distance from the nucleus smaller than a certain value it would be impossible to establish an orbit. *The minimum possible distance from the nucleus is called the Bohr radius.*
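The standing-wave condition can be stated compactly: a whole number of wavelengths must fit around the circumference of the orbit, and combining this with the de Broglie relation recovers Bohr's quantised orbits:

```latex
n\lambda = 2\pi r \quad\text{(a standing wave fits the orbit)}, \qquad
\lambda = \frac{h}{p} \quad\text{(de Broglie)}
\;\;\Longrightarrow\;\;
p\,r = n\,\frac{h}{2\pi}, \qquad n = 1, 2, 3, \dots
```

That is, the angular momentum of the electron can only take whole-number multiples of h/2π, which is precisely the assumption the Bohr orbits require.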

De Broglie's treatment of quantum events served as a starting point for Schrödinger when he set out to construct a wave equation to describe quantum theoretical events.

**Development of Modern Quantum Mechanics**

When Bohr assigned his younger colleagues the task of finding an explanation for the intensities of the different lines in the hydrogen emission spectrum, Werner Heisenberg moved forward from a recent success in explaining a simpler problem. In 1925, by means of a series of mathematical analogies, he wrote out the quantum mechanical analogue for the classical computation of intensities. Shortly afterwards, Heisenberg's colleague Max Born realised that Heisenberg's method of calculating the probabilities for transitions between the different energy levels could best be expressed by using the mathematical concept of matrices.

Erwin Schrödinger built on de Broglie's hypothesis, which he learned of in 1925, and during the first half of 1926 successfully described the behaviour of a quantum mechanical wave. *The mathematical model, called the Schrödinger equation after its creator, is central to quantum mechanics: it defines the permitted stationary states of a quantum system and describes how the quantum state of a physical system changes in time.* The wave itself is described by a mathematical function known as a "wave function", usually represented by the Greek letter psi. In the paper that introduced Schrödinger's cat, he says that the wave function provides the "means for predicting probability of measurement results", and that it therefore provides "future expectation[s], somewhat as laid down in a catalogue."

Schrödinger was able to calculate the energy levels of hydrogen by treating a hydrogen atom's electron as a classical wave, moving in a well of electrical potential created by the proton. This calculation accurately reproduced the energy levels of the Bohr model.

In May 1926, Schrödinger proved that Heisenberg's matrix mechanics and his own wave mechanics made the same predictions about the properties and behaviour of the electron; mathematically, the two theories were identical.

**Copenhagen Interpretation**

The Niels Bohr Institute in Copenhagen, which served as a focal point for researchers into quantum mechanics and related subjects in the 1920s and 1930s. Most of the world's best known theoretical physicists spent time there, developing what became known as the Copenhagen interpretation of quantum mechanics.

Bohr, Heisenberg and others tried to explain what these experimental results and mathematical models really mean. Their description, known as the Copenhagen interpretation of quantum mechanics, aimed to describe the nature of reality that was being probed by the measurements and described by the mathematical formulations of quantum mechanics.

The main principles of the Copenhagen interpretation are:

• A system is completely described by a wave function, ‘psi’. (Heisenberg)

• How ‘psi’ changes over time is given by the Schrödinger equation.

• The description of nature is essentially probabilistic. The probability of an event – for example, where on the screen a particle will show up in the two slit experiment – is related to the square of the absolute value of the amplitude of its wave function. (Born rule, due to Max Born, which gives a physical meaning to the wave function in the Copenhagen interpretation: the probability amplitude)

• It is not possible to know the values of all of the properties of the system at the same time; those properties that are not known with precision must be described by probabilities. (Heisenberg's uncertainty principle)

• Matter, like energy, exhibits a wave–particle duality. An experiment can demonstrate the particle-like properties of matter, or its wave-like properties; but not both at the same time. (Complementarity principle due to Bohr)

• Measuring devices are essentially classical devices, and measure classical properties such as position and momentum.

• The quantum mechanical description of large systems should closely approximate the classical description. (Correspondence principle of Bohr and Heisenberg)

Various consequences of these principles are discussed in more detail in the following subsections.
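The Born rule in the list above can be illustrated with a small sketch; the amplitudes are made-up numbers for a hypothetical two-outcome system:

```python
# Sketch of the Born rule: probabilities are the squared magnitudes
# of the (complex) wave-function amplitudes for each outcome.
import math

amplitudes = [0.6 + 0.0j, 0.0 + 0.8j]    # two possible outcomes
probs = [abs(a) ** 2 for a in amplitudes]

print(probs)
# A properly normalised state has probabilities summing to 1:
assert math.isclose(sum(probs), 1.0)
```

The amplitudes themselves can be complex numbers and can interfere (as in the double-slit experiment), but the observable probabilities are always real and non-negative.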

**Schrödinger’s Cat (Thought) Experiment**

No description of quantum physics would be complete without mention of the famous Schrödinger's cat (thought) experiment.

Video: *Schrödinger's Cat*

Schrödinger's cat is a famous illustration of the principle in quantum theory of superposition, proposed by Erwin Schrödinger in 1935. __Schrödinger's cat serves to demonstrate the apparent conflict between what quantum theory tells us is true about the nature and behaviour of matter on the microscopic level and what we observe to be true about the nature and behaviour of matter on the macroscopic level __- everything visible to the unaided human eye.

Here is Schrödinger's thought experiment: we place a living cat into a steel chamber, along with a device containing a vial of hydrocyanic acid and a very small amount of a radioactive substance. If even a single atom of the substance decays during the test period, a relay mechanism will trip a hammer, which will, in turn, break the vial and kill the cat.

The observer cannot know whether or not an atom of the substance has decayed, and consequently, cannot know whether the vial has been broken, the hydrocyanic acid released, and the cat killed. Since we cannot know, according to quantum law, the cat is both dead and alive, in what is called a superposition of states. It is only when we break open the box and learn the condition of the cat that the superposition is lost, and the cat becomes one or the other (dead or alive). This situation is sometimes called quantum indeterminacy or the observer's paradox: the observation or measurement itself affects an outcome, so that the outcome as such does not exist unless the measurement is made. (That is, there is no single outcome unless it is observed.)

We know that superposition actually occurs at the subatomic level, because there are observable effects of interference, in which a single particle is demonstrated to be in multiple locations simultaneously. What that fact implies about the nature of reality on the observable level (cats, for example, as opposed to electrons) is one of the stickiest areas of quantum physics. Schrödinger himself is rumoured to have said, later in life, that he wished he had never met that cat.

Schrödinger’s Cat has been used to illustrate the differences between emerging theories in quantum mechanics, by testing how they would approach the experiment.

For example, the ‘many worlds interpretation’, developed in the 1950s, would argue that when the box is opened, the observer and the dead-and-alive cat split into two realities: in one the observer sees a dead cat, and in the other a living one.

**Uncertainty Principle**

Werner Heisenberg at the age of 26. Heisenberg won the Nobel Prize in Physics in 1932 for the work that he did at around this time.

Suppose that we want to measure the position and speed of an object – for example a car going through a radar speed trap. We assume that the car has a definite position and speed at a particular moment in time, and how accurately we can measure these values depends on the quality of our measuring equipment – if we improve the precision of our measuring equipment, we will get a result that is closer to the true value. In particular, we would assume that how precisely we measure the speed of the car does not affect its position, and vice versa.

In 1927, Heisenberg proved that these assumptions are not correct. Quantum mechanics shows that certain pairs of physical properties, like position and speed, cannot both be known to arbitrary precision: the more precisely one property is known, the less precisely the other can be known. This statement is known as the uncertainty principle. The uncertainty principle isn't a statement about the accuracy of our measuring equipment, but about the nature of the system itself – our assumption that the car had a definite position and speed was incorrect. On a scale of cars and people, these uncertainties are too small to notice, but when dealing with atoms and electrons they become critical.

Heisenberg gave, as an illustration, the measurement of the position and momentum of an electron using a photon of light. The higher the frequency of the photon, the more accurately the position of the impact can be measured, but the greater the disturbance of the electron, which absorbs a random amount of energy. This renders the measurement of its momentum increasingly uncertain, for one necessarily measures its post-impact, disturbed momentum from the collision products, not its original momentum. With a photon of lower frequency, the disturbance (and hence the uncertainty) in the momentum is less, but so is the accuracy of the measurement of the position of the impact.

The uncertainty principle shows mathematically that the product of the uncertainty in the position and the momentum of a particle (momentum being velocity multiplied by mass) can never be less than a certain value, and that this value is related to the Planck constant.
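In its commonly quoted modern form, that minimum value is h/(4π). A numerical sketch (the confinement distance below is an assumed illustrative value) shows the bound in action:

```python
# Sketch: the uncertainty product (delta x)*(delta p) can never drop
# below h / (4*pi), the commonly quoted form of Heisenberg's bound.
import math

h = 6.63e-34                       # Planck constant, J s
bound = h / (4 * math.pi)          # minimum value of dx * dp

# An electron confined to roughly an atom's width (assumed dx):
dx = 1.0e-10                       # metres
dp_min = bound / dx                # smallest possible momentum uncertainty

print(f"lower bound on dx*dp:  {bound:.2e} J s")
print(f"dp for dx = 1e-10 m:   {dp_min:.2e} kg m/s")
```

For a car, the same bound is swamped by everyday measurement errors; for an electron squeezed into an atom, the implied momentum uncertainty is comparable to the electron's momentum itself, which is why the effect dominates atomic physics.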

**Wave Function Collapse**

Wave function collapse is the expression used for what happens when it becomes appropriate to replace the description of an uncertain state of a system by a description of the system in a definite state. Explanations for the nature of the process of becoming certain are controversial. At any time before a photon "shows up" on a detection screen, it can only be described by a set of probabilities for where it might show up. When it does show up, for instance in the CCD of an electronic camera, the time and the space where it interacted with the device are known within very tight limits. However, the photon has disappeared, and the wave function has disappeared with it. In its place some physical change in the detection screen has appeared, e.g., an exposed spot in a sheet of photographic film, or a change in electric potential in some cell of a CCD.

**Pauli Exclusion Principle**

In 1924, Wolfgang Pauli proposed a new quantum degree of freedom (or quantum number), with two possible values, to resolve inconsistencies between observed molecular spectra and the predictions of quantum mechanics. In particular, the spectrum of atomic hydrogen had a doublet, or pair of lines differing by a small amount, where only one line was expected. Pauli formulated his exclusion principle, stating that "There cannot exist an atom in such a quantum state that two electrons within [it] have the same set of quantum numbers."

**Spin**

A year later, Uhlenbeck and Goudsmit identified Pauli's new degree of freedom with a property called spin.

In 1922, Otto Stern and Walther Gerlach shot silver atoms through an inhomogeneous magnetic field. In classical mechanics, a magnet thrown through such a field may be deflected a small or large distance upwards or downwards, depending on its orientation. The atoms that Stern and Gerlach sent through the field acted in a similar way; however, while the magnets could be deflected by variable distances, the atoms were always deflected by one constant distance, either up or down. This implied that the property of the atom corresponding to the magnet's orientation must be quantized, taking one of two values, rather than being chosen freely from any angle.

The idea, originating with Ralph Kronig, was that electrons behave as if they rotate, or "spin", about an axis. Spin would account for the missing magnetic moment, and allow two electrons in the same orbital to occupy distinct quantum states if they "spun" in opposite directions, thus satisfying the exclusion principle. The quantum number represented the sense (positive or negative) of spin.

**Dirac Equation**

In 1928, Paul Dirac extended the Pauli equation, which described spinning electrons, to account for special relativity. The result was a theory that dealt properly with situations, such as an electron orbiting the nucleus, in which particles move at a substantial fraction of the speed of light. By using the simplest electromagnetic interaction, Dirac was able to predict the value of the magnetic moment associated with the electron's spin, and found the experimentally observed value, which was too large to be that of a spinning charged sphere governed by classical physics. He was able to solve for the spectral lines of the hydrogen atom, and to reproduce from physical first principles Sommerfeld's successful formula for the fine structure of the hydrogen spectrum.

Dirac's equations sometimes yielded a negative value for energy, for which he proposed a novel solution: he posited the existence of an antielectron and of a dynamical vacuum. This led to the many-particle quantum field theory.

**Philosophical Debate**

Quantum theory has reopened debates that were previously thought to be purely philosophical. In the video below, issues of free will and determinism are discussed. Please note that this is only one perspective among many!

Quantum Mechanics Documentary on Determinism vs Free Will

Since its inception, quantum mechanics has provoked strong philosophical debate through its many counter-intuitive results.

Albert Einstein, himself one of the founders of quantum theory, disliked this loss of determinism. He held that quantum mechanics must be incomplete, and produced a series of objections to the theory. The most famous of these was the EPR paradox.

Our understanding of quantum physics tells us that even a vacuum is not entirely empty. Particles can appear spontaneously in a vacuum and equally spontaneously disappear. Over a period of time the appearance and disappearance of these ‘virtual’ particles balances out and hence the principle that matter cannot be created out of nothing is not violated!

The many worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory simultaneously occur in a "multiverse" composed of mostly independent parallel universes. Some scientists have argued that every choice possible for an individual is 'acted out' in some parallel universe. So if you had a choice of marrying X or Y, and chose X, in another parallel universe you chose Y. According to this way of thinking, on some parallel Earth (in an alternate universe) Hitler won WW2, or the Cuban Missile Crisis of 1962 resulted in nuclear war between the US and USSR.

While the multiverse is deterministic, we perceive non-deterministic behaviour governed by probabilities because we can observe only the universe we inhabit.

Is There A Multiverse? - Universe Documentary

**Quantum Mechanics: Applications**

Much of modern technology operates under quantum mechanical principles. Examples include the laser, the electron microscope, and magnetic resonance imaging. Most of the calculations performed in computational chemistry rely on quantum mechanics.

Many of the phenomena studied in condensed matter physics are fully quantum mechanical, and cannot be satisfactorily modelled using classical physics. This includes the electronic properties of solids, such as superconductivity and semiconductivity. The study of semiconductors has led to the invention of the diode and the transistor, which are indispensable for modern electronics.

Even in the simple light switch, quantum tunnelling is vital: without it, the electrons in the electric current could not penetrate the potential barrier formed by a layer of oxide. Flash memory chips found in USB drives also use quantum tunnelling, to erase their memory cells.
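The oxide-barrier tunnelling just described can be estimated with the standard approximate formula for a rectangular barrier, in which the probability falls off exponentially with barrier width. The numbers below (a 5 eV barrier, a 1 eV electron) are illustrative assumptions added here, not values from the text.

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # one electronvolt in joules

def tunnel_probability(barrier_ev, energy_ev, width_m):
    """Rough estimate exp(-2*kappa*d) for a rectangular barrier of width d,
    where kappa = sqrt(2m(V - E)) / hbar for barrier height V and energy E."""
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A 1 eV electron facing a hypothetical 5 eV oxide barrier: the probability
# collapses as the barrier thickens by fractions of a nanometre.
for width_nm in (0.1, 0.5, 1.0):
    p = tunnel_probability(5.0, 1.0, width_nm * 1e-9)
    print(f"{width_nm} nm barrier: T ~ {p:.3e}")
```

The exponential sensitivity to width is exactly what flash memory exploits: a voltage that effectively thins or lowers the barrier switches tunnelling from negligible to significant.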

Researchers are currently seeking robust methods of directly manipulating quantum states. Efforts are being made to develop quantum cryptography, which will allow guaranteed secure transmission of information. A more distant goal is the development of quantum computers, which are expected to perform certain computational tasks with much greater efficiency than classical computers.

You might find these video documentaries by Jim Al-Khalili interesting.

The Secrets of Quantum Physics Episode 1 Einstein's Nightmare BBC Documentary 2014

The Secrets of Quantum Physics: Let There Be Life Episode 2 BBC Full Documentary 2013

**Recommended Further Reading**

*Quantum Physics for Dummies*, by Steve Holzner, and

*Quantum Mechanics: The Theoretical Minimum*, by Susskind and Friedman.
