The universe is weird at the smallest scales. Particles pop in and out of existence, exist in multiple places at once, and somehow “know” what their partners are doing from across the cosmos.

For nearly a century, physicists have had the math to predict these bizarre behaviors with stunning accuracy. But they’ve struggled with a more fundamental question: What does it all mean? In his book What Is Real?, Adam Becker argues that most scientists simply gave up trying to find an answer.

Read on for our overview of What Is Real?: The Unfinished Quest for the Meaning of Quantum Physics to explore what quantum mechanics reveals about reality and why the debate over its meaning still matters today.

Overview of What Is Real? (Adam Becker)

Quantum physics presents a contradiction: Its mathematical laws, called quantum mechanics, describe a world where particles exist in multiple states at the same time and affect each other across vast distances. This seems alien to our observable reality, where things exist in definite locations and behave in predictable ways. But if quantum mechanics applies to all of the particles that compose our universe, why don’t we see quantum effects in everyday objects made of those same particles? And why, when physicists try to observe quantum systems, do their quantum properties seem to disappear? Where—and why—do the rules change? 

Physicists have struggled with these questions since the 1920s, when the mathematical framework describing this new kind of physics emerged. In the 2018 book What Is Real?, Adam Becker argues that most physicists chose to sidestep the problem rather than solve it. They adopted the “Copenhagen interpretation” of quantum mechanics, which says the rules don’t actually change—instead, nothing exists in definite form until it’s observed. This eliminates the mystery of why we don’t see quantum strangeness in our everyday world: Objective reality emerges from the act of observation rather than existing independently.

Becker argues that this interpretation didn’t emerge as the winner in a fair scientific debate. Instead, viable alternatives were marginalized due to political pressures and philosophical fashion. Becker holds a PhD in astrophysics from the University of Michigan and has written for the New York Times and the BBC. He published his book when the development of technologies that exploit quantum theory’s strangest features began to force a reconsideration of what quantum mechanics tells us about the nature of reality.

Our overview of What Is Real?: The Unfinished Quest for the Meaning of Quantum Physics unpacks Becker’s argument in four sections. First, we establish what quantum mechanics reveals about the microscopic world and how this conflicts with classical physics’ assumptions about reality. We then examine how quantum mechanics created a philosophical crisis about science’s purpose and limits. Next, we follow Becker’s account of how anti-realist interpretations became accepted, and finally, we explore the ongoing revival of approaches that take quantum strangeness seriously as a feature of reality.

Dramatis Personae: The Key Players in the Story of Quantum Physics

The debate over quantum mechanics involves a cast of characters spanning nearly a century. Here are the key figures whose ideas and arguments shaped this controversy.

The Founders: Max Planck discovered that energy comes in discrete packets, launching the quantum revolution despite his discomfort with its implications. Albert Einstein contributed crucial early insights about light particles but became a critic of quantum mechanics, troubled not by its randomness but by its apparent requirement for what he called “spooky action at a distance.”

The Debaters: Niels Bohr led the argument that particles don’t have definite properties until measured. Werner Heisenberg developed the uncertainty principle and matrix mechanics, suggesting particles exist only as “potentialities” until observation makes them real. Erwin Schrödinger created wave mechanics and a famous thought experiment that illustrates quantum mechanics’ apparent absurdities. John von Neumann created an “impossibility proof” that shut down debate on quantum mechanics’ philosophical questions for decades.

The Alternatives: David Bohm developed a realistic interpretation of quantum mechanics in which “pilot waves” guide particles along definite paths, but he faced political persecution and professional exile during the 1950s. Hugh Everett proposed that all quantum possibilities occur in parallel universes, but his many-worlds theory was buried by his own advisor and he left physics entirely.

The Revolutionary: John Bell proved that Einstein’s “spooky action at a distance” could be tested, transforming philosophical arguments into decisive laboratory experiments that vindicated quantum mechanics’ strangest predictions.

What We Know About the World

At the dawn of the 20th century, physicists believed they had mapped reality’s basic structure. But experiments with atoms shattered their most fundamental assumptions about the world. Becker reports that this forced physicists to develop an entirely new branch of physics—and a new mathematics to describe it. It revealed that nature’s building blocks operate according to rules so strange they seem to violate logic. In this section, we’ll examine what physicists thought they knew about the world and how quantum mechanics overturned those ideas.

What Physicists Thought They Knew

Becker explains that classical physics rested on intuitive assumptions about reality that successfully explained the observable world. Physicists viewed atoms as the fundamental building blocks of matter—tiny spheres that combined to create chemical compounds. In their view, each atom had a specific position, velocity, and energy that only changed according to Newton’s laws. Later discoveries revealed that atoms aren’t actually solid spheres but consist mostly of empty space, with electrons orbiting a dense nucleus containing positively charged protons. This “planetary model” suggested that atoms functioned like miniature solar systems, obeying the same physical laws as planets and stars.

The planetary model of atoms wasn’t the only assumption classical physics took for granted. Physicists believed energy flowed continuously, like water from a faucet—you could have any amount of energy, from a lot to a little to any fraction in between. They distinguished between waves and particles: Waves spread out through space and interfered with each other when they met, while particles followed definite paths and collided like solid objects. Finally, physicists assumed all objects had definite properties, whether you observed them or not.

What Quantum Mechanics Reveals Instead

At the time, physics stood on the precipice of a paradigm shift: Becker explains that experiments with atoms and light revealed a microscopic world where energy comes in discrete chunks rather than flowing smoothly, where matter and light exhibit properties of both waves and particles simultaneously, and where electrons can occupy only specific energy levels rather than any arbitrary energy value. When physicists developed new mathematics to explain these observations, they discovered that particles can exist in multiple states at once and influence each other across vast distances—phenomena that seem to contradict everyday experience.

Energy Comes in Packets Called “Quanta”

Becker reports that the first assumption to fall was that of energy’s continuity. In 1900, German physicist Max Planck tackled a practical problem: improving lightbulbs by studying how heated objects emit light. Everyone knew the pattern: A poker left in a fireplace starts black, then glows red, then orange, then white-hot. Classical physics, however, predicted that heated objects should radiate ever more intensely at higher frequencies, a prediction so badly at odds with observation that it became known as the “ultraviolet catastrophe.” Planck discovered he could explain real-world observations only by abandoning the continuity assumption and replacing it with a new one: Energy comes in indivisible packets, which Planck called “quanta,” like coins that can’t be broken into smaller denominations.

According to Becker, Planck’s key insight was that different colors of light require different-sized energy packets. Higher-frequency light (like blue) demands larger energy packets than lower-frequency light (like red). This explained the poker’s color progression: At low temperatures, heated objects lack the energy to create the large packets that higher-frequency colors require, so they emit red and orange light. As temperature increases, more energy becomes available, shifting the glow toward white. Albert Einstein extended this insight in 1905, proving that light itself travels in discrete, quantized packets called “photons.”
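Planck’s rule fits in a single line. As a sketch in standard textbook notation (not Becker’s own presentation): a light quantum’s energy E equals its frequency f times a new constant of nature, now called Planck’s constant:

```latex
E = h f, \qquad h \approx 6.626 \times 10^{-34}\ \text{J·s}
```

Because blue light oscillates at a higher frequency than red light, each blue quantum costs more energy, which is why a cool poker can afford red quanta long before it can afford blue or white ones.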

Matter and Light Behave Like Both Waves and Particles

Next to fall was the classical distinction between waves and particles. Physicists discovered that light and matter exhibit wave and particle properties at the same time, depending on how you observe them. Becker explains how the double-slit experiment demonstrates this: When physicists fire individual electrons toward a barrier with two parallel slits, classical physics predicts the electrons should behave like tiny particles, passing through one slit or the other and creating two distinct bands on a detection screen. Instead, something seemingly impossible happens: The electrons create interference patterns—alternating bright and dark bands shaped like patterns created by waves when they overlap, such as those in a pond. 

The only explanation is that each electron somehow travels through both slits simultaneously and interferes with itself on the other side. But, as Becker notes, the mystery deepens: If physicists place detectors at the slits to track which path each electron takes, this makes the wave pattern vanish. The electrons behave like ordinary particles, each traveling through exactly one slit. The act of observation changes the electron’s behavior from wavelike to particle-like, showing that the universe’s basic building blocks are neither waves nor particles, but something that exhibits aspects of both categories.

Particles Can Occupy Only Specific Energy States

Classical physics predicted that electrons should orbit nuclei like planets around the sun, occupying any possible orbit and gradually spiraling inward while radiating energy. This energy would be detectable as a continuous rainbow of colors as electrons move through all possible positions. But Becker explains that when physicists heated different elements to examine the light they produced, they discovered something unexpected: Each element produces only specific colors with sharp boundaries between them. Sodium always produces yellow light, hydrogen emits a handful of sharply defined wavelengths (including its distinctive red line), and every other element has its own unique colors.

Danish physicist Niels Bohr solved this puzzle in 1913 by proposing that electrons can only occupy specific energy levels around atomic nuclei—like a staircase where electrons can stand on particular steps but never in the spaces between. When electrons jump between levels, they emit or absorb light with energy precisely equal to the step difference, explaining why each element produces characteristic colors. As Becker explains, this principle applies universally: Vibrating molecules, spinning nuclei, and all atomic-scale systems exist only in discrete states determined by quantum mathematics. The smooth, continuous world of classical physics was thus completely overthrown at the microscopic level.
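Bohr’s staircase can be made concrete for hydrogen. In the standard textbook formulation (a sketch, not Becker’s notation), the allowed energy levels and the frequency f of the emitted light are:

```latex
E_n = -\frac{13.6\ \text{eV}}{n^2} \quad (n = 1, 2, 3, \ldots), \qquad h f = E_{n_\text{high}} - E_{n_\text{low}}
```

The jump from n = 3 to n = 2, for example, produces hydrogen’s characteristic red line at about 656 nm. Because only whole-number steps exist, only a handful of colors ever appear.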

Mathematics Revealed Superposition and Entanglement

These experimental discoveries demanded entirely new mathematical tools. Becker writes that from 1925 to 1926, Werner Heisenberg and Erwin Schrödinger independently developed frameworks to describe quantum reality. Heisenberg’s “matrix mechanics” used abstract number arrays where normal arithmetic rules failed, while Schrödinger’s “wave mechanics” treated particles as waves governed by precise equations. Connecting these approaches showed that wave functions represent the probabilities of particles appearing in different states, meaning quantum mechanics could only predict likelihoods, never definite outcomes. 

Once Heisenberg and Schrödinger developed the math, things got even stranger. Becker explains that the mathematics revealed quantum reality’s most bizarre features: superposition and entanglement. Superposition means particles can exist in multiple states simultaneously. Entanglement creates connections between particles so that measuring one immediately affects its partner, regardless of how far apart they are, seemingly violating the rule that nothing travels faster than the speed of light. Nevertheless, Becker notes that Heisenberg and Schrödinger’s math could predict experimental results perfectly. This created a crisis: If the math works so well, what does it tell us about the nature of reality?
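In modern notation (a standard sketch, not drawn from the book), each feature takes one line. A particle with two possible states, written |0⟩ and |1⟩, can exist in a weighted blend of both, and two particles can share a single joint state that neither owns separately:

```latex
\text{superposition:}\quad |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
```

```latex
\text{entanglement:}\quad |\Phi^{+}\rangle = \frac{|00\rangle + |11\rangle}{\sqrt{2}}
```

The squared weights |α|² and |β|² are the probabilities Becker describes: the math predicts only likelihoods. Measuring the first particle of the entangled pair gives 0 or 1 with equal probability, and the second particle’s result then matches, no matter the distance between them.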

How the Measurement Problem Forced Physicists to Make a Choice

Becker explains that quantum mechanics seems to demand two different sets of physical laws for identical particles, and which laws apply depends on whether anyone’s watching, as in the double-slit experiment. Physicists call this the “measurement problem”—the act of measurement appears to change the rules that govern particles. This creates a puzzle: Where does the transition between one set of rules and the other occur?

Schrödinger responded with a thought experiment: Imagine a cat in a box with a Geiger counter and a radioactive atom that has a 50% chance of decaying—triggering a hammer to break a vial of poison. Quantum mechanics says the radioactive atom exists in superposition, both decayed and not-decayed. If quantum mechanics applies universally, superposition extends to the Geiger counter (triggered and not-triggered), the vial (broken and intact), and the cat (dead and alive). Only when you open the box does everything “choose” definite states. Schrödinger thought this ridiculous: Cats are alive or dead regardless of observation. This exposed that either quantum mechanics was incomplete or reality was stranger than anyone imagined.

Three Responses to the Measurement Problem

Becker explains that physicists developed three answers to the problem. Einstein and other realists insisted that quantum mechanics must be incomplete: that particles have properties the theory fails to describe. Bohr and the anti-realists suggested that particles don’t have properties until measured, which makes questions about unmeasured reality meaningless. Heisenberg, also an anti-realist, argued particles exist as “potentialities” until measurement makes them actual. By 1927, these crystallized into two competing visions: Realists insisted physics must describe an objective world that exists independently of observation, while anti-realists saw quantum mechanics as a tool for organizing experimental results rather than describing reality.

Einstein’s Realist Position: Quantum Mechanics Must Be Incomplete

Einstein, who’d contributed to quantum theory, found other physicists’ interpretations of the math unsatisfactory. Becker explains that Einstein objected to abandoning a reality that exists independently of observation. He believed science should describe the world as it really is and argued that, if quantum mechanics described situations such as Schrödinger’s cat, the theory must be incomplete. 

Einstein aired this objection in a thought experiment about two particles that bounce off each other. If you measure one particle’s position (or, alternatively, its momentum) after the collision, that instantly determines the other particle’s corresponding property, regardless of the distance between them. However, according to quantum mechanics, the other particle can only exist as a probability wave until it’s directly observed. So, either that particle has properties (momentum and position) that quantum mechanics doesn’t describe, or nature violates the principle of locality—the idea that objects can only be influenced by their immediate surroundings. Because of this, Einstein concluded that quantum mechanics couldn’t represent the final truth about reality.

Einstein believed future developments would reveal quantum mechanics to be a statistical approximation of some deeper, more complete theory. Becker explains that, in Einstein’s mind, this deeper theory could restore both locality and objective reality while preserving quantum mechanics’ practical successes.

Bohr’s Anti-Realist Position: Questions About Unmeasured Reality Are Meaningless

Unlike Einstein, Bohr’s response was to abandon the goal of making physics describe objective reality. As Becker explains, Bohr’s principle of complementarity held that certain pairs of properties can’t be observed at the same time, and that physicists needed both wave and particle descriptions to fully explain the world: Different experiments would reveal that light and matter have both of these “complementary” aspects, but they never apply at the same time. Further, Bohr argued that particles don’t have definite properties independent of measurement, so asking about where they are or what they’re doing when nobody is measuring them is meaningless. In sum, he concluded that quantum phenomena aren’t independently real.

Becker points out that Bohr’s interpretation created a divide in the anti-realist view of the world: There was a classical realm of real measurement devices and concrete experimental outcomes, and a quantum realm existing only as a mathematical formalism, not an independent reality. Bohr dismissed questions about what happens in the absence of observation, arguing that physics should focus on experimental results, not speculate about what’s unobservable. This let physicists use quantum mechanics without confronting its interpretive puzzles. Rather than asking what the mathematics meant about reality, they could just use it to predict experimental outcomes and leave philosophical questions aside.

Heisenberg’s Uncertainty Principle: A Different Kind of Anti-Realism

Heisenberg approached the measurement problem through his uncertainty principle: the idea that the more precisely you measure a particle’s position, the less precisely you can know its momentum, and vice versa. This wasn’t due to imperfect instruments but to constraints built into quantum mechanics itself.
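In symbols, the principle puts a hard floor under the product of the two uncertainties, Δx in position and Δp in momentum, where ħ is Planck’s constant divided by 2π (standard notation, not Becker’s):

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

Squeeze Δx toward zero and Δp must grow without limit; no improvement in instruments can beat the bound.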

Becker explains that, like Bohr, Heisenberg took an anti-realist position, arguing that particles don’t have definite properties until measured. Yet while Bohr denied that any reality existed between measurements, Heisenberg proposed that particles exist, but only in a realm of “potentialities” rather than actualities. This solution created new puzzles: If particles exist only as potentialities, how do they interact with scientific instruments to produce definite measurements? How can something without actual characteristics cause specific readings? Despite their differences, both Bohr and Heisenberg reached the same conclusion: Questions about what particles are doing between measurements are meaningless.

Why Bohr’s Anti-Realism Prevailed

Becker contends that the measurement problem should have started a debate that didn’t stop until answers emerged. Instead, physicists accepted Bohr’s anti-realism—not because it offered a compelling solution to the problems posed by quantum mechanics, but because world events and institutional forces made pursuing answers professionally dangerous. The textbook story is that physicists agreed on a new interpretation of quantum mechanics at the 1927 Solvay Conference. But Becker argues this story is false. The debate revealed no unified position among Bohr’s supporters, just an alliance of opposition to Einstein’s realism. Only decades later would this collection of anti-realist views be labeled the “Copenhagen interpretation.”

There were two other reasons that anti-realism prevailed. First, physics evolved from a philosophical discipline into a massive military enterprise. During World War II, thousands of physicists worked on the Manhattan Project, the US’s program to build atomic bombs. After the war, military funding continued pouring into physics to develop weapons, radar systems, and other technologies. This meant physicists spent their time completing practical calculations rather than solving the theoretical puzzles that Einstein and Bohr’s generation debated.

Second, physicists who attempted to develop realistic alternatives to the Copenhagen interpretation faced career destruction. Becker reports that those who proposed viable interpretations were dismissed without serious scientific engagement and often lost their chances of finding academic employment because they didn’t “toe the line.” By the 1960s, the physics community had stopped asking hard questions about the meaning of quantum mechanics, treating this abandonment of foundational inquiry as scientific maturity rather than intellectual failure.

Foundational Questions Survived the Forced Consensus

Despite decades of institutional hostility, the fundamental questions about quantum mechanics’ meaning proved impossible to eliminate. Becker explains that experimental breakthroughs and theoretical innovations gradually rehabilitated foundational research—and revealed that the same questions troubling Einstein and Schrödinger remained unresolved, creating ongoing tensions about science’s ultimate purpose and the nature of reality itself.

Bell’s Theorem Transforms Philosophy Into Experiment

The return to these fundamental questions began in 1964 with John Bell, who was skeptical about a mathematical argument that had supposedly proven the Copenhagen interpretation was the only possible approach to quantum mechanics. This argument was John von Neumann’s 1932 “impossibility proof,” which claimed to show that no theory with “hidden variables,” where particles have definite properties before measurement, could reproduce quantum mechanics’ predictions. As Becker notes, this seemed to prove that realist positions such as Einstein’s were mathematically impossible: If particles can’t have definite properties before measurement, then only anti-realist interpretations such as Copenhagen could be correct. 

But Bell discovered that von Neumann’s proof was flawed. Bell then reconsidered Einstein’s thought experiment and transformed his philosophical concerns into mathematical tests. Bell reasoned that if particles have definite properties before measurement and no influence travels faster than light, then measurements on entangled particles must obey certain mathematical limits on how strongly correlated the results can be, which became known as “Bell’s inequalities.” Quantum mechanics predicts that entangled particles will violate these limits, so this gave physicists a decisive test, and experiments in 1972 and 1982 confirmed that particles do violate Bell’s inequalities. This meant Einstein was right about “spooky action at a distance” being real, but wrong about quantum mechanics being incomplete. The theory wasn’t missing information; reality really was nonlocal.
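The best-known form of Bell’s inequalities is the CHSH version (a standard formulation, not the notation Becker uses). Each experimenter chooses between two detector settings (a, a′ on one side; b, b′ on the other), and the measured correlations E combine into a single quantity S:

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \leq 2 \ \ \text{(local hidden variables)}
```

Any theory with definite pre-existing properties and no faster-than-light influences must keep |S| at or below 2, but quantum mechanics predicts values up to 2√2 ≈ 2.83 for suitably chosen settings. The 1972 and 1982 experiments measured violations of the classical bound.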

Physicists Developed Three Alternative Paths Forward

Becker explains that Bell’s theorem forced physicists to face a choice: Abandon the principle of locality (and accept the idea of instant connections across space), abandon realism (and accept that properties don’t exist before measurement), or abandon the idea that quantum mechanics is complete. Three alternative interpretations represent different responses to this choice. They all revolve around the question of what causes wave function collapse—the moment when the “probability wave” of a quantum particle’s potential location and momentum “collapses” into the specific characteristics it takes on when it’s observed and measured.

Many-Worlds: Preserve Everything by Multiplying Universes

The many-worlds interpretation offers one escape route: Physicists could preserve both locality and realism by abandoning the assumption that only one outcome occurs. Becker explains that in this view, wave functions never collapse. Instead, all possible measurement outcomes happen in parallel branches of reality. This dissolves Bell’s dilemma by denying there’s a single definite result to correlate across space. When you measure an entangled particle, every possible outcome occurs, each observed by a version of you in its own branch. The apparent nonlocality results from observers’ limited perspective: We only see one branch of reality while remaining unaware of countless others.

Becker points out that, under this interpretation, Schrödinger’s cat is both alive and dead, but in separate branches of reality. The measurement problem vanishes because measurements don’t force choices—they simply reveal which branch of the universal wave function we happen to be experiencing.

Pilot-Wave Theory: Accept Nonlocality, Restore Objective Reality

The pilot-wave interpretation takes a different approach: Accept Bell’s proof of nonlocality while restoring the objective reality Einstein sought. Becker notes that according to this view, particles always have definite positions and properties, and they’re guided by “pilot waves” that can influence distant particles instantly. This eliminates the measurement problem by removing the need for wave function collapse. Particles follow definite trajectories determined by waves, and measurements reveal where particles are. There’s no mystery about obtaining definite results: The particles detected in any experiment existed in definite states all along; we just didn’t know which ones until we measured them.

In the double-slit experiment, for example, each electron takes a definite path through one slit or the other, but the pilot waves go through both slits and create the interference patterns that guide where electrons can land on the detection screen. This explains the wave-like results without requiring particles to somehow pass through multiple slits simultaneously. Becker explains that the price is explicit nonlocality: Pilot waves connecting entangled particles provide the “spooky action at a distance” that Bell proved was unavoidable. Many physicists find this disturbing, but the interpretation at least makes the nonlocal connections explicit rather than hiding them within the measurement process itself.

Spontaneous Collapse: Modify the Mathematics

Spontaneous collapse theories take a third approach: They modify quantum mechanics to make wave function collapse a natural physical process rather than something mysterious triggered by measurement. These theories propose that wave functions randomly collapse on their own, with larger objects collapsing much more frequently than individual particles. Becker explains that this preserves both locality and objective reality by making collapse happen randomly rather than through nonlocal measurement interactions. Individual particles might remain in superposition for billions of years, but macroscopic objects that contain countless particles resolve into definite states almost instantly as random events accumulate.
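The numbers make the idea concrete. In the best-known spontaneous-collapse model (GRW, proposed in 1986; the figures below are its standard parameter choices, not taken from Becker’s text), each particle’s wave function collapses at a tiny rate λ, but the rates add up across a large object’s N particles:

```latex
\lambda \approx 10^{-16}\ \text{s}^{-1} \ \text{per particle}, \qquad \text{effective rate} \approx N\lambda
```

A lone electron would wait billions of years on average, but even a dust grain with roughly 10¹⁸ particles collapses within about a hundredth of a second, and a cat-sized object far faster still, which is why superposition survives at the microscale and vanishes at ours.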

According to Becker, this approach dissolves the measurement problem by eliminating the need for special measurement processes—collapse happens naturally through the theory’s modified dynamics. Schrödinger’s cat wouldn’t remain in a “both dead and alive” state of superposition for more than a split second because random wave function collapse would quickly force a definite outcome.

Why These Questions Matter

Becker reports that the revival of research into the interpretation of quantum mechanics coincided with the emergence of quantum technologies, which exploit the same strange phenomena—superposition, entanglement, and nonlocality—that created the original crisis. Quantum computers derive their power from superposition, performing many calculations simultaneously. Quantum cryptography exploits entanglement to create secure communications. Every successful quantum technology validates quantum mechanics’ mathematical accuracy while highlighting how little we understand what that mathematics means about reality. 

Becker argues that dismissing the questions posed by quantum mechanics represents a retreat from physics’ mission to understand the nature of reality. The measurement problem touches the core of our most successful scientific theory and may prove essential for developing theories capable of unifying quantum mechanics with gravity and cosmology.

What Is Real? Adam Becker on Quantum Physics (Book Overview)

Elizabeth Whitworth

Elizabeth has a lifelong love of books. She devours nonfiction, especially in the areas of history, theology, and philosophy. A switch to audiobooks has kindled her enjoyment of well-narrated fiction, particularly Victorian and early 20th-century works. She appreciates idea-driven books—and a classic murder mystery now and then. Elizabeth has a Substack and is writing a book about what the Bible says about death and hell.
