On January 15th, the US Energy Information Administration released a report highlighting expected trends in energy production and consumption through 2020. Among those highlights was a forecast that US carbon emissions will fall in both 2019 and 2020, continuing a decade-long trend of carbon emission reductions in the US. This is encouraging for multiple reasons.
When this trend began a decade ago, hydraulic fracturing was revolutionizing the oil and gas industry. While the technology had been around for decades, it was advancements made in the 1990s and early 2000s that made it a commercial success. Massive underground reservoirs of oil and gas that were previously inaccessible became economically viable. Natural gas has since displaced a large share of coal-based energy production in the United States. Because natural gas generation emits significantly less carbon dioxide than coal-fired generation, emissions have been falling.
Carbon emissions had been rising steadily since 1985, hitting their peak in 2007, right around the time the fracking boom began. That same year set a record for the highest total energy consumption in the United States. 2018 was the first year since 2007 to see an increase in carbon emissions over the year before. The EIA attributes that increase to (1) abnormally warm summer months, (2) abnormally cold winter months, and (3) total energy consumption that fell just 0.4 percent short of the 2007 record. Even with consumption that high relative to 2017, carbon emissions in 2018 were still 12 percent lower than they were in 2007.
Yes, energy production in the United States is a hotly debated topic, but the data show that innovations and growth in hydraulic fracturing have done more to reduce carbon emissions in the power sector than any other technology introduced over the past decade.
While it may not seem like it, fracking should be understood as an example of how we can tackle climate change. It is an example of what innovation can do, often unexpectedly, to solve the problems we face as a society.
Much of the climate debate today focuses heavily on intervention. President Trump was widely criticized for pulling out of the Paris climate accord. Yet, the US is already about two-thirds of the way to meeting the goal established by the Paris accord for 2025. This dramatic reduction with little coaxing by way of public policies is due in large part to coal-fired power being replaced by cleaner natural gas.
In fact, traditional regulatory efforts are often a blunt tool for reducing carbon emissions. Instead of focusing on performance (e.g., reduce carbon emissions by X percent), they focus on design (e.g., use X device to reduce carbon emissions). The problem with such an approach, however, is that as technology changes, the regulations don't. What's more, lawmakers have no way of predicting what better alternatives may lie ahead in the next 10, 20, or 50 years that could provide a better path to reducing carbon emissions.
Take, for example, renewable portfolio standards (RPS). These common state-level policies are meant to encourage the growth of specific technologies (e.g., wind and solar) rather than focus on reducing emissions. While wind and solar can be useful energy systems, forcing their adoption by subsidizing and mandating their energy production creates unintended consequences. As research by energy economist Carolyn Fischer shows, once a 3 to 7.5 percent threshold is met, an RPS causes the cost of energy for consumers to rise rapidly.
Regulations like RPS suffer from what I call a “technology bias.” Instead of focusing on the outcome, they focus on the mode of achieving it. Most state RPS regulations mandate wind and solar almost exclusively. Other viable, carbon-neutral sources like nuclear and hydropower are left out; though neither produces carbon emissions, most states do not consider them renewable. This bias toward particular technologies doesn’t just hurt current competitors, it also fails to consider what the next generation of technology may look like, greatly reducing the incentive to innovate under those standards. As Patrick McLaughlin, economist and Director of Policy Analytics at the Mercatus Center at George Mason University, noted in testimony before the US House Judiciary Committee,
Such short-sighted policy measures designed to pick winners and losers may do more harm than good in the long run. That isn’t to say there isn’t hope for massive reductions in carbon emissions. There certainly is. It just isn’t going to be found in politics.
Throughout all of human history, no force has played a bigger role in improving economic and social welfare than human ingenuity. This is especially true when considering the changes the world has experienced since the end of the 18th century. In 1776, the year America declared independence from the British Empire, life expectancy for Europeans and Americans hovered around 35 years (Figure 1). While not great, this was still better than in the rest of the world. At that time, the economies of both regions were still largely agrarian. The scientific revolution had begun 200 years prior, bringing with it the idea of scientific reasoning. It was followed by the Enlightenment of the 18th century, which began to challenge traditional ruling powers and examine the rights of individual human beings.
The ideas that came out of these revolutions were changing the way men and women thought, but the day-to-day aspects of human life remained largely unchanged. There is no reason to believe anyone in 1776 could have predicted that just 200 years later, life expectancy in developed countries would more than double. Certainly, they would have never anticipated the development of the internet, modern medicine, autonomous vehicles, or any of a number of innovations that have improved the quality of human life.
Britain (and subsequently the American Colonies) grew rapidly during the 18th century. Britain’s population nearly doubled over the course of the century, growing from roughly 5 million people at the turn of the 18th century to over 9 million by 1801. This rapid growth was a source of concern for many who worried that populations were growing faster than food supplies. The most notable proponent of this idea was an English economist named Thomas Robert Malthus.
In 1798, Malthus published a book titled “An Essay on the Principle of Population.” He had observed that populations seemed to grow at a faster rate than food supplies and deduced that the world would soon reach its carrying capacity. He wrote:
Writing again in 1830 on the topic, he stated:
It is hard not to wonder how Malthus would feel about his predictions nearly 200 years later. Certainly, he felt he was doing the right thing by trying to reduce overall human suffering, but Malthus suffered from one fatal flaw: just like all of us, he could not see into the future.
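Malthus’s core argument can be sketched numerically. He assumed population grows geometrically (he used a doubling every 25 years) while food supply grows only arithmetically, by a fixed increment each period. The starting values below are arbitrary illustrations, not figures from his essay:

```python
# A minimal sketch of Malthus's model: geometric population growth
# versus arithmetic growth in food supply. Starting units are arbitrary.

periods = 8          # eight 25-year periods = 200 years
population = 1.0     # starts exactly fed by the food supply
food = 1.0

for t in range(1, periods + 1):
    population *= 2  # geometric: doubles every 25 years
    food += 1        # arithmetic: fixed increment per period
    print(f"after {t * 25:3d} years: population {population:5.0f}, food {food:3.0f}")
```

Under these assumptions population outruns food within a few periods, which is the “carrying capacity” conclusion. What the model leaves out is exactly what the next two centuries delivered: innovation that raised yields far faster than any fixed increment.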
Malthus published his calls for population control just as the industrial revolution began. As Figure 2 shows, the industrial revolution, driven by human innovation, led to a massive and rapid increase in wealth across the world.
From the early 1800s to today, the world population grew from 1 billion to over 7 billion people. Innovations like pesticides, tractors, refrigerators, and genetically modified organisms have resulted in higher agricultural yields while reducing overall per capita arable land use required to produce those yields. As Matt Ridley, author of The Rational Optimist, explains,
Of course, none of this growth in human prosperity would have been possible without a reliable source of fuel. Coal was the foundation upon which the industrial revolution was built. It powered mankind through an era of unprecedented prosperity. As recently as 1908, coal still accounted for three-quarters of American energy use.
However, such high levels of coal use created major pollution problems. As recently as the 1950s, smoke and soot from coal-fired power plants filled the skies in cities like Chicago. Today, however, coal accounts for only 14 percent of the US energy portfolio. That reduction was made possible by the development and use of newer, cleaner technologies like natural gas, oil, hydro, nuclear, wind, and solar. Since 2001, nuclear, hydro, and natural gas have replaced coal as the top electricity-generating source in 14 states.
During the second half of the 20th century, numerous discussions focused on whether the world had reached the height of oil production, often referred to as peak oil. Discussions surrounding peak oil were often littered with threats of a looming energy crisis, rising liquid fuel costs, and economic downturns. It was Malthus reborn.
Yet few could have predicted how we would innovate our way out of that crisis. Most recently, breakthroughs in hydraulic fracturing have boosted oil production to all-time highs, bringing with them natural gas supplies that have lowered the cost of energy for consumers and reduced reliance on coal.
In many ways, fracking can be seen as the perfect example of the unpredicted solution. While many were confident in their doom-and-gloom predictions of the future, very few painted a picture in which a dramatic shift in technology would turn the world on its head. No one 50 years ago was predicting that the United States would be the largest producer of natural gas and crude oil in the world. Yet, here we are. Likewise, the most confident prediction we can make about the future is that almost every single prediction made will be proven wrong in ways we could have never imagined.
There are a number of exciting technologies currently being explored that hold real promise for changing the way we consume energy. Small modular nuclear reactors are scaling down the size of traditional reactors, along with their costs and risks. Fusion reactors under development could produce vastly more energy than any current energy technology while eliminating the long-term radioactive waste problems facing traditional nuclear power. Technologies like closed-cycle natural gas generators and carbon capture and storage have the potential to provide carbon-neutral energy from traditional fossil fuel sources. Battery technology is advancing to the point that renewable systems may eventually overcome their intermittency and provide stable, consistent baseload energy.
Of course, in addition to all of these promising technologies, there are energy sources that have yet to be invented or even imagined. Each of these technologies comes with its own set of challenges, both foreseen and unforeseen. Some are still a long way from economic viability. Others face physical and technological limitations. What all of them have in common, however, is a need for a regulatory climate that allows them to grow to sustainable levels.
Time and again, the prophets of doom have fallen victim to human ingenuity. Inventors, thinkers, and entrepreneurs have pulled humanity out of poverty and pushed us toward a brighter future. It is crucial that we recognize the danger of deciding what tomorrow will look like today. Instead, and especially in a world where politics has become so toxic, we must lean on our best and brightest thinkers to solve environmental problems. It’s only then that our future will truly take shape.
CGO scholars and fellows frequently comment on a variety of topics for the popular press. The views expressed therein are those of the authors and do not necessarily reflect the views of the Center for Growth and Opportunity or the views of Utah State University.