Researchers Fault EPA for Resisting Efforts to Verify Accuracy of Computer Models on Methane
The Environmental Protection Agency should withdraw and reissue a proposed rule regulating methane emissions because it hasn’t provided enough information to verify the computer modeling behind it, Heritage Foundation researchers say.
By violating the modeling requirements of the Clean Air Act, the federal law designed to reduce air pollution, the EPA is operating under a double standard: one for itself, another for state agencies and other regulated entities, the two researchers explain in comments submitted to the EPA.
The researchers are Kevin Dayaratna, chief statistician in Heritage’s Center for Data Analysis, and Mario Loyola, a senior research fellow at Heritage for environmental policy and regulation who also is a professor at Florida International University. (The Daily Signal is Heritage’s news and commentary outlet.)
In January, President Joe Biden’s EPA unveiled what it calls the Waste Emissions Charge for Petroleum and Natural Gas Systems. The agency’s stated purpose: to “impose and collect an annual charge on methane emissions that exceed specified waste emissions thresholds from applicable oil and gas facilities.”
The overall objective is to impose fees on the largest emitters of methane to create an incentive for them to curtail what EPA describes as “harmful air pollutants.” If the Biden administration can make the case for a higher “social cost” of carbon, it would be in a stronger position to rationalize stricter regulations to combat climate change.
But Dayaratna told The Daily Signal in a phone interview that the cost-benefit analysis underlying the computer models EPA uses to simulate a real-world system is open to debate.
“The real key here, the meat on the bones, is their quantifying of economic damages,” Dayaratna said of EPA’s methodology. “Any government agency is entitled to use statistical modeling in cost-benefit analysis, but the problem here is they need to provide enough information so their calculation of the proposed damages is reproducible. But in my opinion, all they are doing is providing the [computer] codes, and the underlying assumptions of the codes that go into the calculations of damages aren’t reproducible.”
In their comments to EPA, Dayaratna and Loyola burrowed into the legal problems the proposed rule is likely to encounter.
Federal courts have ruled that “it is an abuse of discretion for the EPA to fail to follow its own prior standards,” the two Heritage researchers write. They say the courts also ruled “that it is arbitrary for EPA to rely on models [whose] reliability and predictiveness cannot be independently determined because of insufficient collection and correlation of empirical data.”
Their comments focus primarily on EPA’s efforts to calculate the “social cost of methane.” Methane, a compound of carbon and hydrogen, is a greenhouse gas released during the production and transportation of oil, gas, and coal. By quantifying its social cost, EPA seeks to attach a firm figure to the climate effects that result from human emissions of methane.
But the two Heritage scholars said they see many problems with the associated computer modeling. Some fundamental problems with one model, the “Data-driven Spatial Climate Impact Model” or DSCIM, stand out.
In their comments, Dayaratna and Loyola point to what they call “the unavailability of computer codes necessary to be able to reproduce the damage function coefficients in the DSCIM model.”
Dayaratna explained why this is a problem.
“Without the code, there is no way to verify the accuracy of the model’s estimates, which hinge on these coefficients,” Heritage’s chief statistician told The Daily Signal. “Although we sent correspondence to EPA staff asking for the data needed to substantiate the modeling results, they simply referred us to another research group. The onus is on [EPA officials] to provide codes to reproduce the analysis they are using to justify the proposed regulatory policy.”
The Daily Signal asked EPA’s press office for comment on the modeling concerns raised by the two Heritage analysts and, specifically, whether the agency would release the information they requested.
The deadline for public comments on EPA’s proposed rule was March 26, and the agency is reviewing the submitted comments, “including those from The Heritage Foundation,” an EPA spokesman responded late Tuesday.
EPA has made mistakes before in calculations involving the social cost of carbon, Dayaratna and Loyola argue, so it is vital that the agency provide data it uses to promulgate a methane regulation with far-reaching ramifications. The two researchers identify “subcomponents” of the damage calculations affecting coastlines, agriculture, human mortality, energy, and labor.
What’s the answer to potential errors in computer modeling and EPA’s unwillingness to share information?
Dayaratna recommends strict adherence to the requirements of the Clean Air Act, so that regulations with wide-ranging ramifications for industry and energy consumers are rooted in hard facts.
“The EPA staff don’t tell you where they get those numbers that they are essentially just multiplying and adding together,” Dayaratna told The Daily Signal. “This is problematic because there could be a major mistake with the modeling, which we’ve seen in the past.”
“These models could be the basis for potentially burdensome regulations,” he added. “The EPA staff can’t just be waving their magic wand and have an agenda to regulate what they want. If they are going to claim there is a reason, they have to provide the reason and show that it is legitimate and valid. So far, I’m not convinced they have.”