Time For a Reality Check On Computer Models

Predictive computer models have their place, writes Bob Dinneen of the RFA, but where concrete real-world data and observations exist, that evidence should take precedence. Sound policymaking and regulation must be grounded in reality, not hypothetical fancy.
By Bob Dinneen | January 19, 2015

Policy wonks and regulatory agencies have long had an affinity for predictive computer models. The allure is understandable. Assumptions and hypothetical scenarios chosen by the user can be fed into a model, processed through a series of complex equations and algorithms, and—voila—tidy results are spit out the back end. Certainly, these models can be useful in guiding policy and regulatory development. But too often, regulators treat these models as “answer machines” and use them as tools for rigid regulatory enforcement. And, frequently, the results from these models just don’t make any sense when compared to real-world data and observations. Two prime examples of the disconnect between model results and reality have surfaced in recent months.

First, a new study by economists at Iowa State University exposed the absurdity of the results from economic models used by the California Air Resources Board to estimate indirect land use change (ILUC) emissions for biofuels regulated under the Low Carbon Fuel Standard. The Iowa State study found that farmers around the world have responded to higher crop prices in the past decade mainly by using existing land resources more efficiently, not by converting forest and grassland into cropland. According to the paper, “the primary land use change response of the world’s farmers in the last 10 years has been to use available land resources more efficiently rather than to expand the amount of land brought into production. This finding is not new ... however, this finding has not been recognized by regulators who calculate indirect land use.”

In response, CARB staff told stakeholders that looking at real-world land use data is “not productive.” In other words, the agency would rather regulate biofuels based on predictive model results than on real-world outcomes. Indeed, CARB recently proposed new ILUC penalties for biofuels that entirely ignore the results of the Iowa State research in favor of new computer modeling results. Because every point of carbon intensity under the LCFS means dollars and cents to ethanol producers, CARB’s blind faith in computer modeling has real financial consequences for our industry. Fortunately, Oregon regulators working on their own LCFS program seem to understand that the real world matters. They recently elected to exclude ILUC, stating that “recent data has shown that both food (human and animal) and fuel production has increased while the amount of land farmed has stayed constant.”
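
To put the financial stakes in concrete terms, the sketch below walks through a simplified LCFS-style crediting calculation, in which credit value scales with the gap between a benchmark carbon intensity and a fuel's rated carbon intensity, multiplied by the energy delivered. The general structure mirrors how LCFS crediting works, but every number in the example (the benchmark, the ethanol CI, the ILUC adder, the fuel volume, and the credit price) is a hypothetical placeholder rather than an actual CARB figure.

```python
# Illustrative sketch only: a simplified LCFS-style crediting calculation.
# All numbers are hypothetical placeholders, not CARB values.

GRAMS_PER_TONNE = 1_000_000  # grams of CO2e per metric ton


def credit_value(benchmark_ci, fuel_ci, energy_mj, credit_price):
    """Credits (tonnes CO2e) = (benchmark CI - fuel CI) * energy / 1e6,
    valued at the prevailing credit price (dollars per tonne)."""
    credits_tonnes = (benchmark_ci - fuel_ci) * energy_mj / GRAMS_PER_TONNE
    return credits_tonnes * credit_price


# Hypothetical example: 10 million gallons of ethanol at roughly 80 MJ/gal,
# a 95 g/MJ benchmark, a 70 g/MJ ethanol rating, a 10-point ILUC adder,
# and credits trading at $25 per tonne.
energy = 10_000_000 * 80.0  # total energy delivered, MJ

without_iluc = credit_value(95.0, 70.0, energy, 25.0)
with_iluc = credit_value(95.0, 80.0, energy, 25.0)

print(f"Credit value without ILUC adder: ${without_iluc:,.0f}")
print(f"Credit value with a 10-point ILUC adder: ${with_iluc:,.0f}")
print(f"Cost of those 10 CI points: ${without_iluc - with_iluc:,.0f}")
```

Under these made-up assumptions, 10 points of carbon intensity are worth roughly $200,000 on a single batch of fuel, which is why modeled ILUC penalties flow straight to producers' bottom lines.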

The second recent example of trusting models over real-world experience is a paper published by University of Minnesota researchers. Using a black-box computer model and a series of questionable assumptions, the study asserts that increased ethanol use would cause higher levels of ozone and fine particulates (PM2.5). But there’s one little problem with this finding: actual data from 222 U.S. EPA air sensor sites show that ozone and PM2.5 concentrations have trended downward during the period in which the use of ethanol-blended gasoline has dramatically increased. Ozone concentrations have fallen 33 percent since 1980, while PM2.5 is down 34 percent since 2000. In recent years, both ground-level ozone and PM2.5 concentrations have dropped below their respective national standards, according to EPA.
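
The kind of sanity check the sensor record invites is simple: fit a trend to the observed annual concentrations and compare its direction with what the model predicts. The sketch below shows one minimal way to do that; the concentration values are made-up placeholders for illustration, not actual EPA monitoring data.

```python
# Illustrative sketch only: compare a model's predicted direction of change
# against the trend in observed monitoring data. The observations below are
# made-up placeholder values, not actual EPA sensor readings.


def trend_slope(years, values):
    """Ordinary least-squares slope of values against years."""
    n = len(years)
    mean_y = sum(years) / n
    mean_v = sum(values) / n
    num = sum((y - mean_y) * (v - mean_v) for y, v in zip(years, values))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den


# Hypothetical annual-average PM2.5 concentrations (ug/m3) over a period of
# rising ethanol blending; placeholder numbers only.
years = list(range(2000, 2014))
observed_pm25 = [13.5, 13.1, 12.8, 12.6, 12.2, 12.0, 11.6,
                 11.4, 11.0, 10.5, 10.2, 10.0, 9.8, 9.6]

model_predicted_slope = 0.1  # a model asserting concentrations should rise
observed_slope = trend_slope(years, observed_pm25)

print(f"Observed slope: {observed_slope:+.3f} ug/m3 per year")
if (observed_slope > 0) != (model_predicted_slope > 0):
    print("The model's predicted direction disagrees with the observed trend.")
```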

Further, there is a substantial body of evidence based on actual tailpipe testing that shows ethanol reduces both exhaust hydrocarbon and carbon monoxide emissions, and thus can help reduce the formation of ground-level ozone. After all, ethanol’s high oxygen content and its ability to reduce exhaust hydrocarbon and carbon monoxide emissions are the primary reasons it is used as an important component of reformulated gasoline in cities with high smog levels. In addition, studies have shown that increasing the oxygen content in gasoline reduces primary PM2.5 from the tailpipe.

To be clear, I’m not suggesting that all computer models are useless and should be ignored. Model results can be instructive, but they should be validated against empirical data whenever possible. Computer models can and should be used to fill knowledge and data gaps, but where concrete real-world data and observations exist, that evidence should take precedence. Sound policymaking and regulation must be grounded in reality, not hypothetical fancy. If there are data points available on actual global land use responses to higher crop prices during the biofuels era, why not see what we can learn from them? If we have air quality data from air sensors across the country, and results from actual tailpipe emissions studies, why not use that information to shape our understanding of ethanol’s impact on air quality? Models are fine, but insight and wisdom gained from real-world data and experience are simply irreplaceable.

Author: Bob Dinneen
President and CEO,
Renewable Fuels Association
202-289-3835