
The Empty Half of the Glass May Also Be Full

How many scientific breakthroughs have been lost because they came from outside the mainstream?

Finding an ant in the kitchen often triggers my sense of alarm, because for any ant I see there must be many more out of view. The same applies to many aspects of life, occasionally with greater consequences.

Take scientific breakthroughs as an example. The 2019 Nobel Prize in Physics was awarded in part for the first discovery of a Jupiter-like planet orbiting close to a sunlike star, made by Michel Mayor and Didier Queloz in 1995. A search for such a planet had been proposed by Otto Struve in 1952, but it took four decades for mainstream astronomers to agree that a risky hunt for a planetary system so different from our own was worth the precious telescope time it would consume.

Similarly, the revolutionary theory of continental drift, advanced by Alfred Wegener in 1912, was rejected by mainstream geologists for four decades and gained acceptance only after the mechanism of plate tectonics was recognized. In biology, the rules of genetic heredity formulated by Gregor Mendel in 1866 were ignored by the scientific community, rediscovered by Hugo de Vries and Carl Correns three decades later, and eventually explained by the molecular chemistry of DNA almost a century after Mendel’s work.


Such examples are often used to support the notion that the scientific method works and that the truth eventually prevails. But these success stories reflect a selection bias. For every case that barely made it to a successful end, there must have been many that never came to our attention since their value was never recognized. Bearing in mind the “ants in the kitchen” metaphor, there must have been many scientific innovations that were suppressed, and their originators bullied, because they were ahead of their time. That these breakthroughs never came to fruition constitutes a net loss to humanity.

This lesson also reverberates throughout the professional lives of scientists, including my own. My eventual focus on astrophysics was enabled by John Bahcall, who chose to offer an untrained foreigner like me a five-year fellowship at the Institute for Advanced Study in Princeton. I paid his generosity forward three decades later, when I hired the brilliant but unrecognized scientist Manasvi Lingam as my postdoctoral fellow. My collaboration with Manasvi over the past four years has blossomed into 35 papers and a forthcoming textbook. A similar but more extreme example involves the career of the mathematician Srinivasa Ramanujan, who produced groundbreaking theorems after being plucked out of obscurity by Godfrey Hardy of Cambridge University, who recognized his raw talent. There must be many Ramanujans in developing countries who lack the opportunity to realize their talents.

The same applies to artists and their creative work. Vincent van Gogh was considered a madman and a failure throughout his life, but his reputation changed to that of a misunderstood genius when elements of his painting style were incorporated by the expressionists several decades after his suicide in 1890 at age 37. Today van Gogh’s paintings are among the most expensive ever sold.

The writer Samuel Beckett could not get his first novel published, so he shelved it. The novel eventually appeared in 1992, three years after Beckett’s death and 23 years after he was awarded the 1969 Nobel Prize in Literature. In an even more unusual example, the extraordinary novelist Franz Kafka instructed his friend Max Brod to burn his literary writings after his death at age 40. Had Brod followed those instructions, Kafka’s remarkable works would have been lost forever. How many works of art and literary treasures have vanished this way from our collective memory?

Darwinian selection based on short-term popularity does not necessarily favor the human creations that matter most in the long run. Rather than assuming that our evaluation system has functioned well in selecting the worthiest products of our civilization, historians of art and science should dig into past records in search of lost treasures.

Recognizing these “unborn babies” of our past and hearing their muted messages should convince us to do better in the future. Most important, we should not use social media to bully those among us who are different but should instead celebrate the innovation they bring. Constructive debates can nowadays be recorded online at relatively low cost.

During these turbulent times of polarization, it is crucial to cultivate tolerance for a diversity of ideas. We must learn to listen and to explain why we disagree, especially on matters of science, where evidence serves as the ultimate arbiter of disputes. “The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently,” wrote the philosopher Friedrich Nietzsche in his 1881 book The Dawn of Day.

The lesson from our troubled past is simple. We should nurture those who think creatively and use merit, rather than the number of “likes” on Twitter, to gauge the value of their insights. Private and federal funding agencies should establish scholarships to support brilliant individuals who have insufficient education or adverse socioeconomic backgrounds. If we do not get our act together, advanced alien civilizations searching for intelligent life in the cosmos might downgrade us as not particularly intelligent or worthy of their attention. From their perspective, we might be a form of life as primitive and common in the cosmos as ants are in a kitchen. Through better behavior, it is in our power to rule out this possible explanation for why we have not heard from them, the puzzle known as the Fermi paradox.