Carbon capture and storage is an old technology, first commercialized in the 1970s. Back then it was called enhanced oil recovery: carbon dioxide recovered from oil and gas operations was injected into depleting oil and gas reservoirs to re-pressurize them and extract more hydrocarbons.
As the climate change movement gained momentum, the oil and gas industry shrewdly rebranded enhanced oil recovery as a “climate-friendly” process under a new name: carbon capture, utilization, and storage. Today, over 70 percent of carbon capture projects are, in fact, enhanced oil recovery projects used to produce more oil and gas, resulting in yet more greenhouse gas emissions.
The Institute for Energy Economics and Financial Analysis (IEEFA) estimates that roughly 80–90 percent of all the carbon ever captured has been used for enhanced oil recovery. Only the remaining 10–20 percent of projects have stored carbon in dedicated geological formations without using it to produce more oil and gas.
Despite its long history, carbon capture remains a problematic technology. A new IEEFA study reviewed the capacity and performance of 13 flagship projects and found that 10 of them failed outright or underperformed against their designed capture capacities, mostly by large margins.