Tech
What it will take to achieve affordable carbon removal
Published 2 years ago by Drew Simpson
A pair of companies have begun designing what could become Europe’s largest direct-air-capture plant, capable of capturing as much as a million metric tons of carbon dioxide per year and burying it deep beneath the floor of the North Sea.
The sequestered climate pollution will be sold as carbon credits, reflecting the rising demand for carbon removal as droves of nations and corporations lay out net-zero emissions plans that rely heavily, whether directly or indirectly, on using trees, machines, or other means to pull carbon dioxide out of the air.
Climate researchers say the world may need billions of tons of carbon dioxide removal annually by midcentury to address the “residual emissions” from things like aviation and agriculture that we can’t affordably clean up by then—and to pull the climate back from extremely dangerous levels of warming.
The critical and unanswered question, however, is how much direct air capture will cost—and whether companies and nations will decide they can afford it.
The facility proposed by the two companies, Carbon Engineering and Storegga Geotechnologies, will likely be located in North East Scotland, enabling it to draw on plentiful renewable energy and funnel captured carbon dioxide to nearby sites offshore, the companies said. It’s expected to come online by 2026.
“We can’t stop every [source of] emissions,” says Steve Oldham, chief executive of Carbon Engineering, which is based in British Columbia. “It’s too difficult, too expensive, and too disruptive. That’s where carbon removal comes in. We’re seeing an increasing realization that it’s going to be essential.”
Getting to $100 a ton
Oldham declines to say how much the companies plan to charge for carbon removal, and says they don’t yet know the per-ton costs they’ll achieve with the European plant.
But he is confident it will eventually reach the target cost levels for direct air capture identified in a 2018 analysis in Joule, led by Carbon Engineering founder and Harvard professor David Keith. It put the range at $94 to $232 per ton once the technology reaches commercial scale.
Getting to $100 per ton is essentially the point of economic viability, as large US customers generally pay $65 to $110 for carbon dioxide used for commercial purposes, according to a little-noticed May paper by Habib Azarabadi and direct-air-capture pioneer Klaus Lackner, both at Arizona State University’s Center for Negative Carbon Emissions. (The $100 doesn’t include the separate but considerably smaller cost of carbon sequestration.)
At that point, direct air capture could become a reasonably cost-effective way of addressing the 10% to 20% of emissions that will remain too difficult or expensive to eliminate—and may even compete with the cost of capturing carbon dioxide before it leaves power plants and factories, the authors state.
But the best guess is that the sector is nowhere near that level today. In 2019, the Swiss direct-air-capture company Climeworks said its costs were around $500 to $600 per ton.
What it will take to get to that $100 threshold is building a whole bunch of plants, Azarabadi and Lackner found.
Specifically, the study estimates that the direct-air-capture industry will need to grow by a factor of a little more than 300 in order to achieve costs of $100 a ton. That’s based on the “learning rates” of successful technologies, or how rapidly costs declined as their manufacturing capacity grew. Getting direct air capture to that point may require total federal subsidies of $50 million to $2 billion, to cover the difference between the actual costs and market rates for commodity carbon dioxide.
Lackner says the key question is whether their study applied the right learning curves from successful technologies like solar—where costs dropped by roughly a factor of 10 as scale increased 1,000-fold—or if direct air capture falls into a rarer category of technologies where greater learning doesn’t rapidly drive down costs.
“A few hundred million invested in buying down the cost could tell whether this is a good or bad assumption,” he said in an email.
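To make that learning-curve arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. It assumes Wright’s-law scaling with a solar-like exponent (costs falling about 10-fold for every 1,000-fold increase in capacity) and a starting cost of roughly $600 per ton, the top of Climeworks’ reported range; these are illustrative assumptions, not the model from the Azarabadi and Lackner paper.

```python
# Rough learning-curve sketch (Wright's law): cost scales as capacity**(-b).
# Illustrative assumptions, not the paper's model: a solar-like exponent and
# a starting cost of ~$600 per ton at today's scale.
import math

b = math.log(10) / math.log(1000)        # ~0.33: costs fall 10x per 1,000x scale-up
cost_today = 600.0                       # $/ton, top of Climeworks' 2019 range
cost_target = 100.0                      # $/ton viability threshold discussed above

scale_up = (cost_today / cost_target) ** (1 / b)
print(f"Required industry scale-up: roughly {scale_up:.0f}x")   # ~216x with these inputs
```

Nudging the starting cost or the exponent shifts the answer, but it lands in the same ballpark as the paper’s estimate of a little more than 300-fold growth, and it shows why Lackner’s question about which learning curve applies matters so much.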
Dreamcatcher
The United Kingdom has set a plan to zero out its emissions by 2050 that will require millions of tons of carbon dioxide removal to balance out the emissions sources likely to still be producing pollution. The government has begun providing millions of dollars to develop a variety of technical approaches to help it hit those targets, including about $350,000 to the Carbon Engineering and Storegga effort, dubbed Project Dreamcatcher.
The plant will likely be located near the so-called Acorn project developed by Scotland-based Storegga’s subsidiary, Pale Blue Dot Energy. The plan is to produce hydrogen from natural gas extracted from the North Sea, while capturing the emissions released in the process. The project would also repurpose existing oil and gas infrastructure on the northeast tip of Scotland to transport the carbon dioxide, which would be injected into sites below the seabed.
The proposed direct-air-capture plant could leverage the same infrastructure for its carbon dioxide storage, Oldham says.
The companies initially expect to build a facility capable of capturing 500,000 tons annually but could eventually double the scale, given market demand. Even the low end would far exceed what would otherwise be Europe’s largest facility under way, Climeworks’ Orca plant in Iceland, slated to remove 4,000 tons annually. Only a handful of other small-scale plants have been built around the world.
The expected capacity of the Scotland plant is essentially the same as that of Carbon Engineering’s other full-sized facility, planned for Texas. It will also begin as a half-million-ton-a-year plant with the potential to reach a million. Construction is likely to start on that plant early next year, and it’s expected to begin operation in 2024.
Much of the carbon dioxide captured at that facility, however, will be used for what’s known as enhanced oil recovery: the gas will be injected underground to free up additional oil from petroleum wells in the Permian Basin. If done carefully, that process could potentially produce “carbon neutral” fuels, which at least don’t add more emissions to the atmosphere than were removed.
Oldham agrees that building more plants will be the key to driving down costs, noting that Carbon Engineering will see huge declines just from its first plant to its second. How sharply the curve bends from there will depend on how rapidly governments adopt carbon prices or other climate policies that create more demand for carbon removal, he adds. Such policies essentially force “hard-to-solve” sectors like aviation, cement, and steel to start paying someone to clean up their pollution.
Tech
ChatGPT is about to revolutionize the economy. We need to decide what that looks like.
Published 1 day ago on 03/25/2023 by Drew Simpson
Power struggle
When Anton Korinek, an economist at the University of Virginia and a fellow at the Brookings Institution, got access to the new generation of large language models such as ChatGPT, he did what a lot of us did: he began playing around with them to see how they might help his work. He carefully documented their performance in a paper in February, noting how well they handled 25 “use cases,” from brainstorming and editing text (very useful) to coding (pretty good with some help) to doing math (not great).
ChatGPT did explain one of the most fundamental principles in economics incorrectly, says Korinek: “It screwed up really badly.” But the mistake, easily spotted, was quickly forgiven in light of the benefits. “I can tell you that it makes me, as a cognitive worker, more productive,” he says. “Hands down, no question for me that I’m more productive when I use a language model.”
When GPT-4 came out, he tested its performance on the same 25 questions that he documented in February, and it performed far better. There were fewer instances of making stuff up; it also did much better on the math assignments, says Korinek.
Since ChatGPT and other AI bots automate cognitive work, as opposed to physical tasks that require investments in equipment and infrastructure, a boost to economic productivity could happen far more quickly than in past technological revolutions, says Korinek. “I think we may see a greater boost to productivity by the end of the year—certainly by 2024,” he says.
What’s more, he says, in the longer term, the way the AI models can make researchers like himself more productive has the potential to drive technological progress.
That potential of large language models is already turning up in research in the physical sciences. Berend Smit, who runs a chemical engineering lab at EPFL in Lausanne, Switzerland, is an expert on using machine learning to discover new materials. Last year, after one of his graduate students, Kevin Maik Jablonka, showed some interesting results using GPT-3, Smit asked him to demonstrate that GPT-3 is, in fact, useless for the kinds of sophisticated machine-learning studies his group does to predict the properties of compounds.
“He failed completely,” jokes Smit.
It turns out that after being fine-tuned for a few minutes with a few relevant examples, the model performs as well as advanced machine-learning tools specially developed for chemistry in answering basic questions about things like the solubility of a compound or its reactivity. Simply give it the name of a compound, and it can predict various properties based on the structure.
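As a rough illustration of what that kind of lightweight fine-tuning can look like in practice, the sketch below formats a handful of yes/no solubility questions as prompt/completion pairs in JSONL, the format GPT-3-style fine-tuning endpoints accept. The compounds, prompts, and labels here are invented for illustration and are not data or code from Smit’s group.

```python
# Minimal sketch: a few chemistry Q&A examples formatted as prompt/completion
# pairs for GPT-3-style fine-tuning. Compounds and answers are illustrative.
import json

examples = [
    {"prompt": "Is ethanol soluble in water?\n\n###\n\n", "completion": " yes"},
    {"prompt": "Is benzene soluble in water?\n\n###\n\n", "completion": " no"},
    {"prompt": "Is sodium chloride soluble in water?\n\n###\n\n", "completion": " yes"},
]

with open("solubility_train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# The resulting file would then be uploaded to a provider's fine-tuning endpoint;
# after training, the model is queried with the same prompt format and asked
# about compounds it has not seen.
```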
Tech
Newly revealed coronavirus data has reignited a debate over the virus’s origins
Published 1 day ago on 03/24/2023 by Drew Simpson
Data collected in 2020—and kept from public view since then—potentially adds weight to the animal theory. It highlights a potential suspect: the raccoon dog. But exactly how much weight it adds depends on who you ask. New analyses of the data have only reignited the debate, and stirred up some serious drama.
The current ruckus starts with a study shared by Chinese scientists back in February 2022. In a preprint (a scientific paper that has not yet been peer-reviewed or published in a journal), George Gao of the Chinese Center for Disease Control and Prevention (CCDC) and his colleagues described how they collected and analyzed 1,380 samples from the Huanan Seafood Market.
These samples were collected between January and March 2020, just after the market was closed. At the time, the team wrote that they had found the coronavirus only in samples that also contained genetic material from people.
There were a lot of animals on sale at this market, which sold more than just seafood. The Gao paper features a long list, including chickens, ducks, geese, pheasants, doves, deer, badgers, rabbits, bamboo rats, porcupines, hedgehogs, crocodiles, snakes, and salamanders. And that list is not exhaustive—there are reports of other animals being traded there, including raccoon dogs. We’ll come back to them later.
But Gao and his colleagues reported that they didn’t find the coronavirus in any of the 18 species of animal they looked at. They suggested that it was humans who most likely brought the virus to the market, which ended up being the first known epicenter of the outbreak.
Fast-forward to March 2023. On March 4, Florence Débarre, an evolutionary biologist at Sorbonne University in Paris, spotted some data that had been uploaded to GISAID, a website that allows researchers to share genetic data to help them study and track viruses that cause infectious diseases. The data appeared to have been uploaded in June 2022. It seemed to have been collected by Gao and his colleagues for their February 2022 study, although it had not been included in the actual paper.
Tech
Fostering innovation through a culture of curiosity
Published 2 days ago on 03/24/2023 by Drew Simpson
And so I think a big part of it as a company, by setting these ambitious goals, it forces us to say if we want to be number one, if we want to be top tier in these areas, if we want to continue to generate results, how do we get there using technology? And so that really forces us to throw away our assumptions because you can’t follow somebody, if you want to be number one you can’t follow someone to become number one. And so we understand that the path to get there, it’s through, of course, technology and the software and the enablement and the investment, but it really is by becoming goal-oriented. And if we look at these examples of how do we create the infrastructure on the technology side to support these ambitious goals, we ourselves have to be ambitious in turn because if we bring a solution that’s also a me too, that’s a copycat, that doesn’t have differentiation, that’s not going to propel us, for example, to be a top 10 supply chain. It just doesn’t pass muster.
So I think at the top level, it starts with the business ambition. And then from there we can organize ourselves at the intersection of the business ambition and the technology trends to have those very rich discussions and being the glue of how do we put together so many moving pieces because we’re constantly scanning the technology landscape for new advancing and emerging technologies that can come in and be a part of achieving that mission. And so that’s how we set it up on the process side. As an example, I think one of the things, and it’s also innovation, but it doesn’t get talked about as much, but for the community out there, I think it’s going to be very relevant is, how do we stay on top of the data sovereignty questions and data localization? There’s a lot of work that needs to go into rethinking what your cloud, private, public, edge, on-premise look like going forward so that we can remain cutting edge and competitive in each of our markets while meeting the increasing guidance that we’re getting from countries and regulatory agencies about data localization and data sovereignty.
And so in our case, as a global company that’s listed in Hong Kong and we operate all around the world, we’ve had to really think deeply about the architecture of our solutions and apply innovation in how we can architect for a longer term growth, but in a world that’s increasingly uncertain. So I think there’s a lot of drivers in some sense, which is our corporate aspirations, our operating environment, which has continued to have a lot of uncertainty, and that really forces us to take a very sharp lens on what cutting edge looks like. And it’s not always the bright and shiny technology. Cutting edge could mean going to the executive committee and saying, Hey, we’re going to face a challenge about compliance. Here’s the innovation we’re bringing about architecture so that we can handle not just the next country or regulatory regime that we have to comply with, but the next 10, the next 50.
Laurel: Well, and to follow up with a bit more of a specific example, how does R&D help improve manufacturing in the software supply chain as well as emerging technologies like artificial intelligence and the industrial metaverse?
Art: Oh, I love this one because this is the perfect example of there’s a lot happening in the technology industry and there’s so much back to the earlier point of applied curiosity and how we can try this. So specifically around artificial intelligence and industrial metaverse, I think those go really well together with what are Lenovo’s natural strengths. Our heritage is as a leading global manufacturer, and now we’re looking to also transition to services-led, but applying AI and technologies like the metaverse to our factories. I think it’s almost easier to talk about the inverse, Laurel, which is if we… Because, and I remember very clearly we’ve mapped this out, there’s no area within the supply chain and manufacturing that is not touched by these areas. If I think about an example, actually, it’s very timely that we’re having this discussion. Lenovo was recognized just a few weeks ago at the World Economic Forum as part of the global lighthouse network on leading manufacturing.
And that’s based very much on applying around AI and metaverse technologies and embedding them into every aspect of what we do about our own supply chain and manufacturing network. And so if I pick a couple of examples on the quality side within the factory, we’ve implemented a combination of digital twin technology around how we can design to cost, design to quality in ways that are much faster than before, where we can prototype in the digital world where it’s faster and lower cost and correcting errors is more upfront and timely. So we are able to much more quickly iterate on our products. We’re able to have better quality. We’ve taken advanced computer vision so that we’re able to identify quality defects earlier on. We’re able to implement technologies around the industrial metaverse so that we can train our factory workers more effectively and better using aspects of AR and VR.
And we’re also able to, one of the really important parts of running an effective manufacturing operation is actually production planning, because there’s so many thousands of parts that are coming in, and I think everyone who’s listening knows how much uncertainty and volatility there have been in supply chains. So how do you take such a multi-thousand dimensional planning problem and optimize that? Those are things where we apply smart production planning models to keep our factories fully running so that we can meet our customer delivery dates. So I don’t want to drone on, but I think literally the answer was: there is no place, if you think about logistics, planning, production, scheduling, shipping, where we didn’t find AI and metaverse use cases that were able to significantly enhance the way we run our operations. And again, we’re doing this internally and that’s why we’re very proud that the World Economic Forum recognized us as a global lighthouse network manufacturing member.
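As a hint of what a production planning model looks like in miniature, here is a toy linear-programming sketch in Python using SciPy. The products, capacities, and profit figures are invented for illustration; a real planner, as Art describes, juggles thousands of parts and constraints rather than two.

```python
# Toy production-planning sketch: maximize profit subject to shared capacity.
# All numbers are illustrative, not Lenovo data.
from scipy.optimize import linprog

# Maximize 40*laptops + 30*tablets; linprog minimizes, so negate the objective.
c = [-40, -30]
A = [[2, 1],     # assembly hours needed per unit
     [1, 1]]     # chipsets needed per unit
b = [1000, 600]  # assembly hours and chipsets available this week

res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
laptops, tablets = res.x
print(f"Build ~{laptops:.0f} laptops and ~{tablets:.0f} tablets "
      f"for ~${-res.fun:,.0f} in profit")
```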
Laurel: It’s certainly important, especially when we’re bringing together computing and IT environments in this increasing complexity. So as businesses continue to transform and accelerate their transformations, how do you build resiliency throughout Lenovo? Because that is certainly another foundational characteristic that is so necessary.