I am a scientist. I am often wrong, and that’s okay.
You may have heard about major errors in science and engineering that made the news headlines, like the collapse of the Tacoma Narrows Bridge, aka “Galloping Gertie,” or the 1999 crash of the Mars Climate Orbiter. Or maybe you’ve seen the recent video from SpaceX, “How Not to Land an Orbital Rocket Booster.” You may not realize how often scientists are wrong, but being wrong is actually part of the process of doing science. The trick is to catch errors before they leave the lab, and certainly before they make the front-page news, though, obviously, that doesn’t always happen.
Thomas Edison famously said, “Genius is one percent inspiration and 99 percent perspiration.” Why so much perspiration? Because of all the effort you put into testing your inspirations, having them be wrong and not work, and then trying again. This is brilliantly illustrated by something Edison supposedly also said about making lightbulbs: “I have not failed 700 times. I’ve succeeded in proving 700 ways how not to build a lightbulb.”
I have had many moments of being wrong in my scientific career. One of the most memorable was during a hands-on exam in school, when I was given equipment to observe and measure the radioactive decay of a certain isotope. I remember thinking that I needed to repeat the measurement several times to find the average decay rate. I diligently recorded the number of clicks registered by the Geiger counter during a fixed time interval and then averaged the results. In doing this averaging, I completely overlooked the fact that the rate of radioactive decay decreases over time. I had incorrectly assumed that this experiment would have the same property as most science experiments: that the results (in this case, the decay rate) wouldn’t change over time. Hearing the chatter of the other students after the exam, I immediately realized my mistake, but it was too late. My answers on the exam were completely wrong. I was mortified.
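You can see the averaging mistake with a few lines of arithmetic. In this hypothetical sketch (the half-life, counting window, and count numbers are invented for illustration, not taken from my exam), the expected counts in successive equal windows fall exponentially, so a single averaged count doesn’t describe the rate in any actual window:

```python
import math

# Hypothetical isotope: 60 s half-life, counted in back-to-back 10 s windows,
# starting at an expected 1000 counts in the first window.
half_life = 60.0   # seconds (assumed for illustration)
interval = 10.0    # counting window, seconds
decay_const = math.log(2) / half_life

start_times = [i * interval for i in range(6)]  # six consecutive windows
counts = [1000 * math.exp(-decay_const * t) for t in start_times]

# My exam-day analysis: average all the readings into one "decay rate."
naive_average = sum(counts) / len(counts)

# The readings trend steadily downward, so the average falls between the
# first and last windows and matches the true rate in none of them.
print([round(c) for c in counts])   # expected counts per window
print(round(naive_average))         # the misleading single number
```

A better analysis would have fit an exponential to the per-window counts and extracted the decay constant, instead of blurring the downward trend into one number.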
Looking back, this experience taught me several lessons. First, I learned that science can be humbling. I shouldn’t be overly confident in my conclusions because there’s always a chance I might be wrong, something that Mother Nature will no doubt reveal to me (or my colleagues) at some point. More importantly, though, the experience taught me that it’s okay to be wrong if you are willing to accept that possibility and make corrections. In this case, I had followed the scientific method, but I ran out of time before I could correct myself. In other words, I hypothesized that the rate of radioactive decay did not change over time. I tested the hypothesis by observing and recording the rates. I analyzed the data, but I failed to notice that there was a downward trend. With more time, I probably would have caught my error and revised my hypothesis and data analysis accordingly.
In my current work, I often follow the same basic framework. I have a hypothesis that I want to test. I do experiments and analyze the data to look for evidence that will confirm or disprove that hypothesis. Many times, the trend I find does not match my expectations, so I go back and re-examine my hypothesis and/or check whether I’m doing the experiment correctly. Problems with an experiment are common because it’s easy to overlook factors like the temperature stability or uniformity inside an oven, or the alignment of a laser in the experimental setup. A lot of effort in the laboratory is spent troubleshooting and repeating experiments before arriving at a conclusion.
This vigilance against errors is the key ingredient to making advances in science. One of the greatest discoveries made at NIST was that of quasicrystals by Dan Shechtman, who earned the 2011 Nobel Prize in Chemistry for this work. While studying the electron diffraction patterns from a rapidly solidified aluminum alloy in 1982, Dan saw symmetries that, according to the existing theory of crystal structure, were impossible. His observations and hypothesis were opposed by both the prevailing theory and two-time Nobel Prize-winner Linus Pauling, one of the world’s most famous scientists. Shechtman spent more than two years gathering data and debating with colleagues before he was able to publish his work. To do this, Dan had to painstakingly eliminate all the other possible explanations for his measurements, including experimental errors. Dan’s determination, his perspiration, proved the conventional wisdom about crystal symmetry, and a double Nobel Prize-winner, wrong.
See? Even world-renowned experts can be wrong sometimes! (Pauling, however, never conceded.)
As this example shows, being wrong is not the same as being incompetent. Incompetence means being wrong while also lacking the conceptual tools to discover that you’re wrong; it’s okay to be wrong if you are able to recognize your error and take steps to both correct it and learn from it. Thinking like a scientist involves recognizing that you will occasionally (or more than occasionally) be wrong and knowing how to find out why. Science is a journey, and part of that journey is making errors and being empowered to make changes based on lessons learned.
Even the news-making science errors have had lasting, positive impacts. From the Tacoma Narrows Bridge collapse, engineers learned the importance of accounting for wind and aerodynamics in bridge design. After the Mars Climate Orbiter crash, NASA made changes that enabled the success of the two Mars rovers, Spirit and Opportunity. Making errors in science is just part of the process and allows scientists to learn and broaden what we know. It’s only by being wrong that we ever learn what’s right.
So, to all you scientists and non-scientists, go forth and be wrong! You’ll probably discover something new on your journey.