For some things Harvard suffices; this blog is for the rest.

Climate Change and the Risk of Stagnation

The discussion of climate change and the potential dangers for our species has sadly been turned into a virtue-signaling contest by politicians. In this essay we will stay away from these political discussions; we will also not discuss the science behind the different climate models. Instead, we will look at something more important: the risks involved.

Precautionary Principle

The problem of climate change should be viewed in the bigger context of reducing the risk of global extinction (global ruin). The distinction between local and global ruin is essential here: the risk of local ruin is a normal part of innovation and civilization, whereas the risk of global ruin should be reduced to a minimum. In a healthy economy, you want people to take bounded, local risks. Entrepreneurship works this way: many people take risks and some fail (local ruin), improving the economy as a whole. This risk-taking becomes a problem when the risk is global and everyone is damaged by failure.

In the case of climate change, we are obviously dealing with a risk of global ruin, no matter how small that risk may be. This makes the discussion about the accuracy of models secondary; we are taking a global risk and should be extremely cautious. The burden of proof falls on those advocating inaction, those claiming that more CO₂ does not cause harm, and not on those wanting to reduce emissions. To clarify: nobody needs to prove the damage caused by more CO₂; rather, the absence of damage caused by more CO₂ has to be demonstrated to a reasonably high degree. It is justified to be skeptical of our current climate models, given their mostly poor track record, but contrary to popular belief, uncertainty about the harm of CO₂ and the accuracy of the models is not a reason for inaction.

Long Walks

The line of reasoning described above was put forward by Joseph Norman, Rupert Read, Yaneer Bar-Yam, and Nassim Nicholas Taleb in their paper “Climate models and precautionary measures.”

There is a problem, however: this line of reasoning (looking only at risk) could be used to justify any measure to reduce CO₂ emissions, including the most extreme ones. Severely cutting CO₂ emissions has economic consequences, and enforcing such regulations is difficult, but for some people these are merely small obstacles in their fight to “save the planet.” I intuitively knew there had to be another reason, besides the well-being and prosperity of humanity (some people seem not to care about those things), why we cannot slow down the progress of society too much; something was missing. I finally found that missing piece after reading David Deutsch (The Beginning of Infinity).

The Risk of Slowing Down Progress

The missing piece was the risk of slowing down progress; yes, slowing down progress is not only inconvenient but also risky. As we saw at the beginning, the goal is to reduce the risk of global ruin, and slowing down innovation increases this risk. How? Murphy’s Law: “Anything that can go wrong will go wrong.” In the future, we will face unknown challenges that can only be survived with new technologies, i.e., new applied knowledge. These challenges are by definition unknowable in the present. If we do not have the appropriate knowledge and technologies to deal with these unknown problems, we will go extinct (global ruin). Because we cannot know these challenges ahead of time, we need to prepare as best we can, which means innovating as fast as possible. New knowledge and new technologies give us the best chance of surviving the obstacles we do not yet know. This is the risk argument for fast progress. Slowing down the progress of society in order to reduce CO₂ emissions decreases the risk posed by climate change, but it increases the risk of future extinction from some cause unknown in the present. This constrains the measures we should take to reduce CO₂ emissions.

In Practice

Since we do not, and cannot, have accurate risk numbers for either climate change or unknown future events (again, they are unknowable), it is difficult to recommend concrete actions. There is, however, one technological solution that reduces CO₂ emissions without slowing down progress: nuclear power. The technology has a bad reputation because of its costly failures in the past, but I think we should reconsider our negative stance toward it. It is not only efficient and reliable but also produces far lower CO₂ emissions. The two main problems with nuclear power plants are a) the nuclear waste and b) the blow-up risk. Neither problem is negligible, but in contrast to climate change, both present local risks. The blow-up of a power plant, even in the worst case, damages only the region nearby; even if this nearby region is large, it is still local. The same is true for nuclear waste: it is a problem we have not solved (yet), but the risk to humanity as a whole is comparatively small.

Yes, there are other arguments against the use of nuclear energy, concerning, for example, the supply of the needed uranium. Although this essay’s main focus is not on providing an easy solution to a complex problem but on laying out a risk-focused framework for evaluating different solutions, I want to comment on the supply problem. Looking at our track record of predicting the available supply of natural resources, we have to admit that these predictions were, at best, inaccurate; our track record is awful. There are many reasons for this, but the most important one is epistemological: we can, by definition, not know what knowledge and technologies we are going to create in the future. Ignoring this future development has been the mistake of many (if not all) of those who have made predictions. New knowledge and new technologies will enable us to a) find more supply, b) use the existing supply more efficiently, and c) substitute new resources for the task. There is no guarantee that all (or any) of these things will happen in this specific case, but it is likely. This epistemological limit on our ability to predict should make us skeptical of arguments like the one described above.

The only reason I have mentioned a specific technology in this essay is that otherwise some people (those who miss the main argument) would dismiss this essay as “useless philosophizing” without relevance for the real world. Again, I do not want to debate the pros and cons of nuclear energy or any other specific technology (there are people who are much more knowledgeable about this than I am); I want to discuss our framework for evaluation and decision making. We are all, at least for the present, confined to this planet, so let’s try to minimize the risk that we all die from climate change, aliens, meteorites, or other unknown events in the future.