How do we know climate change is really happening?
Climate change is often cast as a prediction made by complicated computer models. But the scientific basis for climate change is much broader, and models are actually only one part of it (and, for what it’s worth, they’re surprisingly accurate).
For more than a century, scientists have understood the basic physics behind why greenhouse gases like carbon dioxide cause warming. These gases make up just a small fraction of the atmosphere but exert outsized control on Earth’s climate by trapping some of the planet’s heat before it escapes into space. This greenhouse effect is important: It’s why a planet so far from the sun has liquid water and life!
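That century-old physics can be checked with a back-of-envelope energy balance. The sketch below uses standard textbook values (solar constant, albedo, Stefan-Boltzmann constant), not figures from this article: balancing sunlight absorbed against heat radiated shows that without greenhouse gases, Earth's average surface temperature would be roughly minus 18 degrees Celsius, far below freezing, instead of the observed average of about 15 degrees Celsius.

```python
# Back-of-envelope radiative balance: what Earth's average surface
# temperature would be WITHOUT the greenhouse effect.
# All values are standard textbook figures (assumptions, not data
# from this article).

SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight arriving at top of atmosphere
ALBEDO = 0.3              # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

# Energy in = energy out:  S * (1 - albedo) / 4  =  sigma * T^4
# (the factor of 4 is the ratio of Earth's surface area to its
# sunlit cross-section)
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4          # ~238 W/m^2
temp_no_greenhouse = (absorbed / SIGMA) ** 0.25       # kelvin

print(f"Without greenhouse gases: {temp_no_greenhouse - 273.15:.0f} C")
print("Observed global average:   about 15 C")
```

The roughly 33-degree gap between the bare calculation and the observed average is the greenhouse effect at work.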
However, during the Industrial Revolution, people started burning coal and other fossil fuels to power factories, smelters and steam engines, which added more greenhouse gases to the atmosphere. Ever since, human activities have been heating the planet.
We know this is true thanks to an overwhelming body of evidence that begins with temperature measurements taken at weather stations and on ships starting in the mid-1800s. Later, scientists began tracking surface temperatures with satellites and looking for clues about climate change in geologic records. Together, these data all tell the same story: Earth is getting hotter.
Average global temperatures have increased by 2.2 degrees Fahrenheit, or 1.2 degrees Celsius, since 1880, with the greatest changes happening in the late 20th century. Land areas have warmed more than the sea surface, and the Arctic has warmed the most — by more than 4 degrees Fahrenheit just since the 1960s. Temperature extremes have also shifted. In the United States, daily record highs now outnumber record lows two to one.
This warming is unprecedented in recent geologic history. A famous illustration, first published in 1998 and often called the hockey-stick graph, shows how temperatures remained fairly flat for centuries (the shaft of the stick) before turning sharply upward (the blade). It’s based on data from tree rings, ice cores and other natural indicators. And the basic picture, which has withstood decades of scrutiny from climate scientists and contrarians alike, shows that Earth is hotter today than it’s been in at least 1,000 years, and probably much longer.
In fact, surface temperatures actually mask the true scale of climate change, because the ocean has absorbed 90 percent of the heat trapped by greenhouse gases. Measurements collected over the last six decades by oceanographic expeditions and networks of floating instruments show that every layer of the ocean is warming up. According to one study, the ocean has absorbed as much heat between 1997 and 2015 as it did in the previous 130 years.
We also know that climate change is happening because we see the effects everywhere. Ice sheets and glaciers are shrinking while sea levels are rising. Arctic sea ice is disappearing. In the spring, snow melts sooner and plants flower earlier. Animals are moving to higher elevations and latitudes to find cooler conditions. And droughts, floods and wildfires have all gotten more extreme. Models predicted many of these changes, but observations show they are now coming to pass.
How much agreement is there among scientists about climate change?
There’s no denying that scientists love a good, old-fashioned argument. But when it comes to climate change, there is virtually no debate: Numerous studies have found that more than 90 percent of scientists who study Earth’s climate agree that the planet is warming and that humans are the primary cause. Most major scientific bodies, from NASA to the World Meteorological Organization, endorse this view. That’s an astounding level of consensus given the contrarian, competitive nature of the scientific enterprise, where questions like what killed the dinosaurs remain bitterly contested.
Scientific agreement about climate change started to emerge in the late 1980s, when the influence of human-caused warming began to rise above natural climate variability. By 1991, two-thirds of earth and atmospheric scientists surveyed for an early consensus study said that they accepted the idea of anthropogenic global warming. And by 1995, the Intergovernmental Panel on Climate Change, a famously conservative body that periodically takes stock of the state of scientific knowledge, concluded that “the balance of evidence suggests that there is a discernible human influence on global climate.” Currently, more than 97 percent of publishing climate scientists agree on the existence and cause of climate change (as does nearly 60 percent of the general population of the United States).
Do we really only have 150 years of climate data? How is that enough to tell us about centuries of change?
Earth’s climate is inherently variable. Some years are hot and others are cold, some decades bring more hurricanes than others, some ancient droughts spanned the better part of centuries. Glacial cycles operate over many millenniums. So how can scientists look at data collected over a relatively short period of time and conclude that humans are warming the planet? The answer is that the instrumental temperature data that we have tell us a lot, but they’re not all we have to go on.
Historical records stretch back to the 1880s (and often before), when people began to regularly measure temperatures at weather stations and on ships as they traversed the world’s oceans. These data show a clear warming trend during the 20th century.
Some have questioned whether these records could be skewed, for instance, by the fact that a disproportionate number of weather stations are near cities, which tend to be hotter than surrounding areas as a result of the so-called urban heat island effect. However, researchers regularly correct for these potential biases when reconstructing global temperatures. In addition, warming is corroborated by independent data like satellite observations, which cover the whole planet, and other ways of measuring temperature changes.
Much has also been made of the small dips and pauses that punctuate the rising temperature trend of the last 150 years. But these are just the result of natural climate variability or other human activities that temporarily counteract greenhouse warming. For instance, in the mid-1900s, internal climate dynamics and light-blocking pollution from coal-fired power plants halted global warming for a few decades. (Eventually, rising greenhouse gases and pollution-control laws caused the planet to start heating up again.) Likewise, the so-called warming hiatus of the 2000s was partly a result of natural climate variability that allowed more heat to enter the ocean rather than warm the atmosphere. The years since have been the hottest on record.
Still, could the entire 20th century just be one big natural climate wiggle? To address that question, we can look at other kinds of data that give a longer perspective. Researchers have used geologic records like tree rings, ice cores, corals and sediments that preserve information about prehistoric climates to extend the climate record. The resulting picture of global temperature change is basically flat for centuries, then turns sharply upward over the last 150 years. That picture has been a target of climate denialists for decades. However, study after study has confirmed the results, which show that the planet hasn’t been this hot in at least 1,000 years, and probably longer.