In 1905, a patent clerk in Bern published four papers that restructured physics. Special relativity, the photoelectric effect, Brownian motion, an explanation of mass-energy equivalence—all from one mind in a single year. The temptation is to treat this as destiny, as though the universe was always going to hand us these insights on that particular schedule. But what if Einstein had pursued a different career? What if he'd secured an academic post and been consumed by teaching obligations rather than liberated by clerical monotony?
This isn't idle speculation. The question of whether scientific knowledge is inevitable—converging toward the same truths regardless of who discovers them or when—or deeply contingent on particular people, cultures, and sequences of events sits at the heart of how we understand scientific progress. It shapes how we fund research, how we train scientists, and how much confidence we place in the edifice of knowledge we've constructed.
The history of science is littered with simultaneous discoveries—Darwin and Wallace, Leibniz and Newton, multiple independent derivations of the same theorem—which suggests a kind of gravitational pull toward certain ideas when conditions are ripe. But it also contains singular creative leaps so idiosyncratic, so dependent on a particular mind's architecture, that their inevitability becomes difficult to defend. Exploring these counterfactuals reveals something profound about the relationship between truth, discovery, and the human minds that mediate between the two.
Necessity vs. Contingency: The Convergence Debate
The strongest argument for scientific inevitability comes from the phenomenon of multiple independent discovery. Sociologist Robert Merton catalogued hundreds of cases—calculus, natural selection, the telephone, oxygen—where two or more investigators arrived at essentially the same insight within a narrow window. The pattern is so pervasive that Merton proposed it as a structural feature of science itself: when the prerequisite knowledge, tools, and problems align, the discovery becomes almost overdetermined.
This convergence thesis carries a reassuring implication. It suggests that scientific truth exerts a kind of gravitational pull on inquiry. The world has a structure, and sufficiently sophisticated investigation will uncover that structure regardless of which particular humans do the investigating. The ideal that philosopher Hilary Putnam dubbed the "God's eye view"—a single, complete, observer-independent description of reality—captures this intuition, though Putnam himself famously argued against its coherence. The details might differ, but the deep architecture of knowledge would be recognizable across any sufficiently advanced civilization.
Yet the convergence thesis has serious vulnerabilities. Many so-called simultaneous discoveries, on closer inspection, turn out to be less simultaneous and less identical than the narrative suggests. Darwin and Wallace shared a commitment to natural selection, but their conceptions of its mechanisms, scope, and implications diverged significantly. Newton and Leibniz developed calculus with fundamentally different notations and conceptual frameworks—differences that shaped the subsequent mathematical traditions of England and the continent for over a century.
Philosopher Ian Hacking has argued that scientific convergence operates robustly for what he calls "low-level" regularities—empirical patterns close to observation—but becomes far less reliable for high-level theoretical frameworks. We might all discover that objects fall, but the explanation for why they fall—Aristotelian natural motion, Newtonian gravitation, Einsteinian spacetime curvature—is deeply shaped by the theoretical vocabulary available at the time of inquiry. The same phenomena can sustain radically different theoretical structures.
The implication is uncomfortable for those who see science as a purely cumulative march toward truth. If theoretical frameworks are underdetermined by the evidence they explain—if multiple incompatible theories can account for the same observations—then which framework a scientific community adopts may depend on factors that have nothing to do with the world's intrinsic structure: aesthetic preferences, institutional power, the charisma of particular advocates, the accidents of who happened to be working on what problem at what moment.
Takeaway: Simultaneous discovery suggests the empirical world constrains inquiry powerfully, but the theoretical frameworks we build to explain that world may be far more contingent than we assume—shaped as much by historical circumstance as by nature's structure.
Path Dependence: How the Sequence of Discovery Shapes the Landscape of Knowledge
Consider a deceptively simple observation: the order in which discoveries occur shapes which questions scientists think to ask next. This is path dependence—the idea that early events in a sequence constrain and channel later possibilities in ways that are not easily reversed. In economics and technology studies, path dependence explains why inferior standards sometimes persist (the QWERTY keyboard being the canonical example). In science, its effects may be even more profound.
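Path dependence of this sort has a standard toy model in the economics literature: the Polya urn, in which every draw reinforces the outcome drawn, so that early chance events lock in a long-run "market share" that is stable but essentially arbitrary. The sketch below is illustrative only (the function and its parameters are not from the text):

```python
import random

def polya_urn(steps=10000, seed=0):
    """Simulate a Polya urn: start with one ball of each colour;
    each draw adds another ball of the colour drawn, so early
    random draws reinforce themselves and lock in a final share."""
    rng = random.Random(seed)
    a, b = 1, 1  # one ball of each colour to start
    for _ in range(steps):
        if rng.random() < a / (a + b):
            a += 1  # colour A drawn: add another A ball
        else:
            b += 1  # colour B drawn: add another B ball
    return a / (a + b)  # long-run share of colour A

# Different random histories settle on different, arbitrary shares:
shares = [round(polya_urn(seed=s), 2) for s in range(5)]
print(shares)
```

Each run converges to a stable share, but which share it converges to depends entirely on the accidents of the first few draws—a minimal caricature of how an early standard like QWERTY, or an early theoretical exemplar, can entrench itself.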
Quantum mechanics provides a striking illustration. The theory emerged from a specific sequence of puzzles—blackbody radiation, the photoelectric effect, atomic spectra—encountered in a particular order by a particular community of European physicists steeped in particular mathematical traditions. The formalism that resulted—Hilbert spaces, operator algebras, the Born rule—is notoriously unintuitive. Physicists have spent a century arguing about what it means. But had the empirical puzzles arrived in a different sequence, or been encountered by a community with different mathematical tools, the resulting formalism might have looked very different—perhaps more intuitive, perhaps less, but almost certainly different in ways that would have channeled subsequent research along alternative trajectories.
Path dependence operates through what Kuhn called the "exemplar"—the paradigmatic problem-solution that trains each generation of scientists in how to see the world. Once a community adopts a particular exemplar, it becomes the lens through which anomalies are perceived, research programs are designed, and careers are built. The exemplar doesn't just describe the world; it constitutes the framework within which new phenomena become visible or remain invisible. Discoveries that don't fit the reigning exemplar may be ignored, dismissed, or simply unnoticed—not through malice but through the cognitive architecture of trained perception.
Barbara McClintock's work on transposable genetic elements—"jumping genes"—offers a vivid case. Her discoveries in the 1940s and 1950s were largely ignored for decades, not because the evidence was poor but because the prevailing paradigm of genetics had no conceptual space for them. The framework assumed that genes were fixed entities in stable chromosomal locations. McClintock's findings required a different way of thinking about the genome—one that the field was not prepared to adopt until molecular biology had developed sufficiently to provide independent corroboration and a new conceptual vocabulary.
What makes path dependence so consequential is its invisibility. Scientists working within an established paradigm rarely perceive the contingency of their own framework. The questions they ask seem natural, obvious, inevitable—because the paradigm has trained them to see the world in a way that makes those questions salient. Alternative questions, the ones that a different historical path might have foregrounded, simply don't occur. The road not taken in science is not merely unexplored; it is, for all practical purposes, unimaginable from within the paradigm that took the other road.
Takeaway: The sequence in which discoveries arrive doesn't just affect the timeline of knowledge—it shapes which questions become thinkable and which remain invisible, making the landscape of science far less neutral than it appears from the inside.
Alternative Histories: The Roads Scientific Progress Might Have Taken
Thought experiments about alternative scientific histories aren't just philosophical indulgences—they expose the hidden load-bearing assumptions in our current knowledge. Consider the case of continental drift. Alfred Wegener proposed the idea in 1912, supported by compelling evidence: the jigsaw-fit of coastlines, matching fossil assemblages, geological continuities across oceans. The theory was rejected, sometimes viciously, for nearly half a century—primarily because Wegener, a meteorologist, lacked standing in geological circles, and because no plausible mechanism for the movement of continents was available within the reigning geophysical framework.
Had plate tectonics been accepted in the 1910s rather than the 1960s, the consequences would have ramified across multiple disciplines. Biogeography, paleontology, and evolutionary biology would have developed with a fundamentally different spatial framework. The isolation and reconnection of landmasses would have been a central explanatory tool decades earlier. Entire research programs that flourished in the interim—land bridge hypotheses, static-earth models of species distribution—would never have consumed the intellectual energy they did. The science we have is always shadowed by the science we might have had.
An even more dramatic counterfactual involves the development of computing. Charles Babbage's Analytical Engine, designed in the 1830s, anticipated the fundamental architecture of modern computers by more than a century. Had Victorian engineering and institutional support been sufficient to realize his designs, mechanical computation might have developed alongside, or even ahead of, electrical approaches. The mathematical and logical frameworks that accompanied early computing—Boolean algebra, formal logic, information theory—might have matured in a profoundly different intellectual context, potentially accelerating or redirecting entire branches of mathematics and physics.
Perhaps the most philosophically provocative counterfactual concerns the relationship between quantum mechanics and general relativity. These two pillars of modern physics remain fundamentally incompatible—a situation that has persisted for nearly a century. One can reasonably ask whether the order in which they were discovered, and the particular formalisms chosen to express them, has made their unification more difficult than it needed to be. A different historical path—one where a unified framework emerged before the two theories crystallized into separate mathematical traditions with separate communities and separate institutional structures—might have avoided the impasse entirely.
These thought experiments do not prove that our current science is wrong. They suggest something more subtle and more important: that the body of knowledge we possess is one of several possible bodies of knowledge that the natural world could sustain. The truths we've discovered are real, but they may not be the only truths—or even the most illuminating truths—that a different history of investigation might have uncovered. This recognition should cultivate a productive humility about what we know and a heightened sensitivity to the paths our current paradigms may be preventing us from exploring.
Takeaway: The science we have is real but not unique—the natural world likely supports multiple coherent frameworks of understanding, and the one we inhabit is as much a product of historical accident as of nature's necessity.
The counterfactual lens does not diminish science—it deepens our appreciation for its complexity. Scientific knowledge is neither a pure mirror of nature nor a mere social construction. It is something more interesting: a dialogue between the world's structure and the historically situated minds that interrogate it.
Recognizing the contingency of our scientific trajectory is not a counsel of despair. It is an invitation to intellectual humility and creative ambition. If the paths we've taken are not the only ones available, then genuinely novel frameworks—not just incremental extensions of existing paradigms—remain possible. The next paradigm shift may require not better data but a different starting question.
What we know is shaped by the order in which we came to know it. Sitting with that thought changes what it means to do science—and what it might mean to do it differently.