Stories this week:
- CO2 levels see highest jump in five decades
- How big a deal is Trudeau and Obama’s methane pact?
- Pinning down the role of climate change in extreme events
- Will longer toxic algal blooms become the new normal?
- Turning waste cooking oil into biodiesel
CO2 levels see highest jump in five decades
Carbon dioxide levels in the atmosphere jumped higher in the last 12 months than at any time in more than five decades of tracking.
Researchers with the US National Oceanic and Atmospheric Administration (NOAA) at the Mauna Loa Observatory in Hawaii reported last week that their measurements showed CO2 concentrations had climbed by 3.05 parts per million (ppm) during 2015.
Over the last decade, the average annual increase had been 2ppm, but from February 2015 to February 2016, the increase soared to 3.76ppm—and this leap was only partly due to the 2015-2016 El Niño, an extended period of warmer Pacific sea surface temperatures.
“Carbon dioxide levels are increasing faster than they have in hundreds of thousands of years,” according to Pieter Tans, a lead scientist at NOAA. “It’s explosive compared to natural processes.”
In February 2016, the average global atmospheric CO2 level stood at 402.59ppm. Prior to the 18th century Industrial Revolution, which was powered by the combustion of coal, gas, and oil, such concentrations averaged around 280ppm. For much of human civilisation, that figure had hovered around 275ppm.
The observatory is not just any research outpost, but the world's oldest continuous CO2 monitoring station. Scientists treat its data as the gold standard for measuring the gas because the station sits 3,400m above sea level in Hawaii, far from any continent and high above the altitudes where smog gets trapped near the ground.
The NOAA researchers also noted last week that 2015 had hit another ignominious record: the fourth consecutive year that CO2 had climbed by more than 2ppm.
That’s a lot of numbers to take in, but why are they important?
In 2008, former NASA climate scientist James Hansen wrote that if humanity wishes to preserve the conditions under which civilisation developed, the evidence suggests that current CO2 concentrations need to be reduced to “at most 350ppm, but likely less than that.” In other words, 350ppm is the ‘safe level’ we need to strive for. He was writing at a time when atmospheric CO2 levels had hit 385ppm.
This figure isn’t plucked out of the air, but comes from a comparison with the ‘paleoclimate’, or what the climate was like in the deep past. The last time the planet was nearly ice-free was some 50 million years ago, when CO2 levels were around 450ppm, give or take 100ppm. Above 350ppm, will we see a similar ice-free world? At 402.59ppm, we’re clearly already overshooting this target, but Hansen and other scientists feel that so long as the overshoot is brief, and we begin to turn the situation around via immediate, sharp cuts in greenhouse gas emissions alongside measures to suck up some of that carbon, we should be okay.
How big a deal is Trudeau and Obama’s methane pact?
A new Canada-US agreement to slash oil and gas sector methane emissions by 40-45 percent below 2012 levels by 2025 could have global political ramifications.
The deal was announced by Canadian Prime Minister Justin Trudeau and US President Barack Obama last week during the first official visit to Washington DC by a Canadian leader in 19 years.
While much of the press coverage of the meeting focussed on the ‘bromance’ between the two leaders who are closely ideologically aligned, and the presence of a string of Canadian celebrities at a state dinner at the White House, the core result of their meeting, the joint methane target, has enjoyed only passing attention. But what are the details, and how big a deal are the cuts anyway?
Campaigners who have been working for years to win tougher action on the powerful greenhouse gas say it’s actually quite a big deal: it could lead to Mexico taking similar action, and it increases pressure on other jurisdictions to up their game as well. If similar action were adopted everywhere around the world, the effect would be the same as shuttering a third of the planet’s coal-fired power plants.
Methane has a global warming potential some 25 times greater than that of carbon dioxide. The gas represents 15 percent of Canada’s greenhouse gas emissions, and 43 percent of those emissions come from the oil and gas sector, roughly mirroring the global pattern (although belching cows, rice production and landfills are also big emitters).
Breaking that sector down further, most of the methane is emitted during natural gas and oil production and processing (93 percent), with another five percent coming from natural gas transportation and storage, and two percent from gas distribution.
The new regulations have yet to be detailed by either side, although Ottawa says they would target venting and fugitive emissions from both new and existing facilities responsible for extraction, processing and storage. Pipelines, however, will not be covered, due to the small percentage of methane emissions represented by this part of the production chain. The US has already begun to address emissions from new facilities, but until now, existing infrastructure had been excluded. Both governments will begin discussing details in the coming weeks, with industry consultations starting next month, but Washington does not expect the process to be finalised before 2017.
For comparison, the new Ottawa-US agreement closely resembles Alberta’s pledge, announced in November, to reduce methane emissions 45 percent from 2012 levels by 2025. Alberta has welcomed the federal deal, saying a national commitment will make the province more competitive. Last week, BC Premier Christy Clark said her government aims to harmonise its methane strategy with that of Alberta. Meanwhile, last November, British Columbia’s government-appointed Climate Leadership Team set an even tighter timeline, recommending a provincial methane reduction target of 40 percent below 2015 levels by 2021.
Pinning down the role of climate change in extreme events
Scientists have known for a long time that climate change drives an increase in the number and intensity of extreme weather events such as droughts, hurricanes and floods. Nevertheless, until very recently, researchers were reluctant to attribute any single event to human-caused global warming.
But the new science of extreme-event attribution studies has advanced rapidly in the last few years, according to a comprehensive assessment of the state of the field by the US National Academies of Sciences, Engineering, and Medicine out this week. Scientists now have much greater ability to describe the role played by climate change in specific events.
Here, attribution means how much anthropogenic global warming made an event more or less likely, and to what extent it intensified or weakened it, rather than whether a specific hurricane or drought was caused by humans.
“It all started with reporters after one of these events sticking a microphone in our faces and asking, did we cause this? This is not a well-posed question,” says Francis Zwiers, co-author of the report and director of PICS’ sister organisation, the Pacific Climate Impacts Consortium. This is because natural variability almost always plays some role: both human and natural factors must align to set up a given event. “And try to find an event that isn’t affected by climate change.”
The National Academies report concludes that confidence is greatest in attribution where the underlying physics is best understood, particularly for events that are related to temperature such as heat waves and extreme cold. Researchers however have the lowest confidence in describing the role played by climate change on individual storms, tornadoes, and wildfires. And they have a middling confidence in attributing the role played by climate change on droughts, downpours and snowstorms.
Explaining some of the difficulty, the report notes that while many studies have shown that climate change is producing an increase in wildfires, the risk of any individual fire also depends on forest management practices in addition to background climate variability and human-caused climate change.
One of the ways that researchers have improved attribution is by bringing together multiple models and varying approaches: when different methods and analyses produce similar results, the uncertainty is reduced.
Ultimately, researchers want to develop tools that provide forecasts of future events at lead times of seasons or just days. These would not be like weather forecasts, but forecasts of risk that would allow authorities to mitigate such disasters, for example by pre-emptively moving fire-fighting equipment into those areas at greatest risk of wildfire, or clearing culverts in regions where floods are more likely.
Will longer toxic algal blooms become the new normal?
Longer and more frequent blooms of toxic algae may become the new normal off the Pacific coast, from Alaska through British Columbia down to California. The phenomenon appears to be related to warmer waters, but the question for scientists is whether this is a result of climate change.
In a study appearing last week in the journal Harmful Algae, a group of US researchers report the results of sampling the carcasses of some 900 Alaskan sea mammals for two of the most common algal bloom toxins. The neurotoxins saxitoxin and domoic acid both appear regularly off the coast of California and have been detected in BC waters. But while saxitoxin has long been found in Alaskan shellfish, domoic acid poisoning had not been spotted in animals north of California until last year, when a sea lion in Washington waters was found to be afflicted.
The researchers wanted to know whether the problem was moving northward as waters warmed. They found that of the 13 animal species investigated, all showed a low-level presence of domoic acid and ten showed a presence of saxitoxin.
Kathi Lefebvre, a biologist with the National Oceanic and Atmospheric Administration (NOAA), told reporters that while her study focused on Alaska, its findings held a warning for BC: “It’s the same coastline.”
Last year, unusually warm Pacific temperatures resulted in a vast ‘red tide’ off the west coast of North America, spotted by NOAA surveyors stretching from California to Alaska. Harmful blooms are nothing new, but historically they have tended to restrict themselves to much smaller areas and to wane after a few weeks. Last year’s much more extensive bloom lasted for months, with higher levels of neurotoxins than are typical. Researchers report that we are experiencing larger algal blooms more often, in more places, and lasting longer than documented in earlier decades.
Scientists suspect that last year’s record red tide was caused by ‘the Blob’, a large stretch of water in the northeastern Pacific some 2°C warmer than normal. Appearing first in 2013, the blob did not dissipate until the end of 2015.
There have been blobs before, but none with such record-breaking sea surface temperatures, and none nearly as strong or distinct. Oceanographers think warming ocean waters together with unusual circulation caused the blob, but once formed, the blob itself may have created atmospheric conditions—a persistent ridge of high pressure that sat there for two years—that maintained it. Whether all this is a product of normal perturbation in the atmosphere and ocean or of climate change remains unclear.
However, the oceans are warming, and so these types of conditions are likely to become more common. By studying the blob’s effects, researchers are able to explore how sensitive ecosystems are to such warming and the nature of future anomalous algae events.
Turning waste cooking oil into biodiesel
Canadian clean technology that converts waste feedstock oil into low-carbon diesel is assisting communities in developing nations around the world, according to the Vancouver company behind the BioCube product.
The BioCube is a 20-foot repurposed shipping container equipped with a biodiesel processor. It works by converting surplus feedstock oils such as waste vegetable oil from restaurants, crude palm oil, soya, corn, coconut, pongamia or tallow into usable biodiesel. The product comes out ready for use in any diesel engine without need for modification.
The biodiesel it produces can reduce carbon emissions by up to 70 per cent relative to regular diesel. And biodiesel produced in BC from waste vegetable oil clocked in at 95 per cent less carbon-intensive than regular diesel.
The BioCube is portable, weighing 3.5 tonnes, and self-generating—that is, it uses its own biodiesel to operate, though it can also connect to grid power where available. This makes it particularly suitable for agriculture in Africa, for example, where one model is fuelling a remote palm oil mill plantation in the Democratic Republic of Congo.
While industrial-scale refineries are stationary and cost millions to set up and operate, BioCube is geared towards the mid-range market, where one unit costs around 15 per cent of the cheapest mid-sized refinery and is much smaller.
A visit earlier this month to the Coquitlam-based company by Prime Minister Justin Trudeau has helped attract attention to the tech company as it looks to expand its client base across the developed and developing world, including here at home.
The firm is currently servicing clients in African states, India, and Australia, but it hopes to begin selling its wares in North America as well—if current barriers can be overcome.
“We have customers here in BC who want a BioCube; unfortunately political roadblocks and tough economics make biodiesel production unfeasible for them,” says BioCube Director Peter Wilken.