The new composition for fluorine-containing electrolytes promises to maintain high battery charging performance for future electric vehicles even at sub-zero temperatures. (Image by Shutterstock.)
Topics: Battery, Chemistry, Climate Change, Global Warming, Lithium, Materials Science
Scientists have developed a new, safer electrolyte for lithium-ion batteries that works as well in sub-zero conditions as it does at room temperature.
Many owners of electric vehicles worry about how effective their batteries will be in very cold weather. Now new battery chemistry may have solved that problem.
In current lithium-ion batteries, the main problem lies in the liquid electrolyte. This key battery component transfers charge-carrying particles called ions between the battery’s two electrodes, allowing the battery to charge and discharge. But the liquid begins to freeze at sub-zero temperatures, severely limiting the effectiveness of charging electric vehicles in cold regions and seasons.
To address that problem, a team of scientists from the U.S. Department of Energy’s (DOE) Argonne and Lawrence Berkeley national laboratories developed a fluorine-containing electrolyte that performs well even in sub-zero temperatures.
“Our research thus demonstrated how to tailor the atomic structure of electrolyte solvents to design new electrolytes for sub-zero temperatures.” — John Zhang, Argonne group leader.
“Our team not only found an antifreeze electrolyte whose charging performance does not decline at minus 4 degrees Fahrenheit, but we also discovered, at the atomic level, what makes it so effective,” said Zhengcheng “John” Zhang, a senior chemist and group leader in Argonne’s Chemical Sciences and Engineering division.
This low-temperature electrolyte shows promise of working for batteries in electric vehicles, as well as in energy storage for electric grids and consumer electronics like computers and phones.
Topics: Battery, Chemistry, Climate Change, Economics, Global Warming
Welcome back to The Green Era, a weekly newsletter bringing you the news and trends in the world of sustainability.
The shift to renewable energy has caused consternation over the fate of workers in the fossil fuel industry. Those same concerns are hitting the automotive sector as U.S. demand for electric vehicles grows.
EVs require not just new assembly lines and parts but also factories to build the batteries that power them. The president of one of the biggest unions called the transition the largest in the industry’s history.
The automotive sector and its workers are no strangers to factory closures. The Great Recession brought the Big Three automakers to their knees, forcing the federal government to bail them out and leaving cities like Detroit and large swaths of the Midwest with autoworkers out of a job.
This time could be different. Many factories are being converted, and automakers are investing in retraining their workers. The batteries and charging infrastructure required present another opportunity. Ford, General Motors, and Volkswagen are all building new battery manufacturing plants or expanding existing ones in Tennessee.
Researchers detected a surprising rise in levels of chlorofluorocarbons between 2010 and 2020 using a monitoring network that includes the Jungfraujoch research station in Switzerland. Credit: Shutterstock
Topics: Chemistry, Civilization, Climate Change, Environment, Global Warming
From my resume: "I eliminated ozone-depleting materials using Failure Mode and Effects Analysis (FMEA) and Taguchi Methods of Quality Engineering - using an L16 Orthogonal Array - in the Poly Silicon etch substituting out CFCs in manufacturing processes." How I did it: I substituted our CFC with Sulfur Hexafluoride and Nitrogen (SF6/N2). On the negative photoresist product, the CFC over-etch was 50 seconds. For the positive photoresist, CFC had a 25-second process. I was able to reduce each product line to two seconds, increasing throughput, and the process increased die yields. It is possible to balance the positive impact of product improvement and the environment. I did it in the 90s, so the following report is disappointing.
*****
The Montreal Protocol, which banned most uses of ozone-destroying chemicals known as chlorofluorocarbons (CFCs) and called for their global phase-out by 2010, has been a great success story: Earth’s ozone layer is projected to recover by the 2060s.
So atmospheric chemists were surprised to see a troubling signal in recent data. They found that the levels of five CFCs rose rapidly in the atmosphere from 2010 to 2020. Their results are published today in Nature Geoscience.
“This shouldn’t be happening,” says Martin Vollmer, an atmospheric chemist at the Swiss Federal Laboratories for Materials Science and Technology in Dübendorf, who helped to analyze data from an international network of CFC monitors. “We expect the opposite trend. We expect them to slowly go down.”
At current levels, these CFCs do not pose much threat to the ozone layer’s healing, said Luke Western, a chemist at the University of Bristol, UK, at an online press conference on 30 March. CFCs, once used as refrigerants and aerosols, can persist in the atmosphere for hundreds of years. Given that they are potent greenhouse gases, eliminating emissions of these CFCs will also have a positive impact on Earth’s climate. The collective annual warming effect of these five chemicals on the planet is equivalent to the emissions produced by a small country like Switzerland.
It’s highly likely that manufacturing plants are accidentally releasing three of the chemicals — CFC-113a, CFC-114a, and CFC-115 — while producing replacements for CFCs. When CFCs were phased out, hydrofluorocarbons (HFCs) were brought in as substitutes. But CFCs can crop up as unintended by-products during HFC manufacture. This accidental production is discouraged by the Montreal Protocol but not prohibited by it.
A vertical shock tube at Los Alamos National Laboratory is used for turbulence studies. Sulfur hexafluoride is injected at the top of the 5.3-meter tube and allowed to mix with air. The waste is ejected into the environment through the blue hose at the tube tower’s lower left; in fiscal year 2021, such emissions made up some 16% of the lab’s total greenhouse gas emissions. The inset shows a snapshot of the mixing after a shock has crossed the gas interface; the darker gas is SF6, and the lighter is air. The intensities yield density values.
Topics: Civilization, Climate Change, Global Warming, Research
Reducing air travel, improving energy efficiency in infrastructure, and installing solar panels are among the obvious actions that individual researchers and their institutions can implement to reduce their carbon footprint. But they can take many other small and large steps, too, from reducing the use of single-use plastics and other consumables and turning off unused instruments to exploiting waste heat and siting computing facilities powered by renewable energy. On a systemic level, measures can encourage behaviors to reduce carbon emissions; for example, valuing in-person invited job talks and remote ones equally could lead to less air travel by scientists.
So far, the steps that scientists are taking to reduce their carbon footprint are largely grassroots, notes Hannah Johnson, a technician in the imaging group at the Princess Máxima Center for Pediatric Oncology in Utrecht and a member of Green Labs Netherlands, a volunteer organization that promotes sustainable science practices. The same goes for the time and effort they put in for the cause. One of the challenges, she says, is to get top-down support from institutions, funding agencies, and other national and international scientific bodies.
At some point, governments are likely to make laws that support climate sustainability, says Astrid Eichhorn, a professor at the University of Southern Denmark whose research is in quantum gravity and who is active on the European Federation of Academies of Sciences and Humanities committee for climate sustainability. “We are in a situation to be proactive and change in ways that do not compromise the quality of our research or our collaborations,” she says. “We should take that opportunity now and not wait for external regulations.”
Suppose humanity manages to limit emissions worldwide to 300 gigatons of carbon dioxide equivalent (CO2e). In that case, there is an 83% chance of not exceeding the 1.5 °C temperature rise above preindustrial levels set in the 2015 Paris Agreement, according to a 2021 Intergovernmental Panel on Climate Change special report. That emissions cap translates to a budget of 1.2 tons of CO2e per person annually through 2050. Estimates for the average emissions by researchers across scientific fields are much higher and range widely in part because of differing and incomplete accounting approaches, says Eichhorn. She cites values from 7 to 18 tons a year for European scientists.
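The per-person figure above follows directly from dividing the remaining emissions cap by population and years remaining. As a sanity check, here is the arithmetic in a few lines of Python; the population and year count are my own round illustrative assumptions, not numbers from the IPCC report:

```python
# Back-of-envelope check of the per-person carbon budget cited above.
budget_gt = 300.0      # remaining emissions cap, Gt CO2e (IPCC figure quoted in the text)
population = 8.5e9     # assumed average global population through 2050 (illustrative)
years = 29             # roughly 2021 through 2050

per_person_per_year = budget_gt * 1e9 / (population * years)
print(f"{per_person_per_year:.2f} t CO2e per person per year")  # ~1.2, matching the text
```

Against that roughly 1.2-ton budget, the 7-to-18-ton estimates Eichhorn cites for European scientists are 6 to 15 times over the line.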
National Ignition Facility operators inspect a final optics assembly during a routine maintenance period in August. Photo credit: Lawrence Livermore National Laboratory
After the heady, breathtaking coverage of pop science journalism, I dove into the grim world inhabited by the Bulletin of the Atomic Scientists for their take on the first-ever fusion ignition. I can say that I wasn’t surprised. With all this publicity, it will probably get a Nobel Prize nomination (my guess). Cool Trekkie trivia: the National Ignition Facility was the backdrop for the Enterprise’s warp core in Star Trek Into Darkness.
*****
This week’s headlines have been full of reports about a “major breakthrough” in nuclear fusion technology that, many of those reports misleadingly suggested, augurs a future of abundant clean energy produced by fusion nuclear power plants. To be sure, many of those reports lightly hedged their enthusiasm by noting that (as The Guardian put it) “major hurdles” to a fusion-powered world remain.
Indeed, they do.
The fusion achievement that the US Energy Department announced this week is scientifically significant, but the significance does not relate primarily to electricity generation. Researchers at Lawrence Livermore National Laboratory’s National Ignition Facility, or NIF, focused the facility’s 192 lasers on a target containing a small capsule of deuterium–tritium fuel, compressing it and inducing what is known as ignition. In a written press release, the Energy Department described the achievement this way: “On December 5, a team at LLNL’s National Ignition Facility (NIF) conducted the first controlled fusion experiment in history to reach this [fusion ignition] milestone, also known as scientific energy breakeven, meaning it produced more energy from fusion than the laser energy used to drive it. This historic, first-of-its-kind achievement will provide the unprecedented capability to support [the National Nuclear Security Administration’s] Stockpile Stewardship Program and will provide invaluable insights into the prospects of clean fusion energy, which would be a game-changer for efforts to achieve President Biden’s goal of a net-zero carbon economy.”
Because of how the Energy Department presented the breakthrough in a news conference headlined by Energy Secretary Jennifer Granholm, news coverage has largely glossed over its implications for monitoring the country’s nuclear weapons stockpile. Instead, even many serious news outlets focused on the possibility of carbon-free, fusion-powered electricity generation—even though the NIF achievement has, at best, a distant and tangential connection to power production.
To get a balanced view of what the NIF breakthrough does and does not mean, I (John Mecklin) spoke this week with Bob Rosner, a physicist at the University of Chicago and a former director of the Argonne National Laboratory who has been a longtime member of the Bulletin’s Science and Security Board. The interview has been lightly edited and condensed for readability.
The Shisper Glacier in April 2018, left, and April 2019, right. The surging ice blocked a river fed by a nearby glacier, forming a new lake. Credit: Yale Environment 360 / NASA
Topics: Civilization, Climate Change, Environment, Existentialism, Global Warming
Everything about Earth and the organization of human civilization is about the control of resources.
We’ve come up with arbitrary “rules” about who is worthy of those resources and how much they can hoard or obtain. Pharaohs, priests, secret societies, and guilds all have “knowledge” they jealously guard, or the dividing line may be as simple as caste or color. Every society with billionaires, emperors, kings, oligarchs, potentates, and sheiks has a designated group to blame for the ills of poor planning and sadistic resource management: indigenous peoples or servants imported by force, the easy go-to designated pariahs. It is a cynical way to get rich, but a poor method of species survival. A resource we all need, from billionaires to pariahs, is potable water to drink. Jackson, Mississippi is a foreshadowing of what we might expect.
This continual differentiation of mankind by caste, color, station, and monetary wealth has brought us to this rolling train wreck catastrophe. Climate refugees occurred in 2005 in the aftermath of Hurricane Katrina. Climate refugees occurred after the flooding in Pakistan. Climate refugees will occur in the aftermath of future superstorms. Lest we think ourselves immune, we may all be seeking higher ground, leaving homes and businesses for something we could have solved decades ago except for avarice.
The permafrost is melting, and that will release viruses that haven't seen the light of day for several millennia, and we have no vaccines for what will likely be carried on the wind and zoonotically transferred between animals and humans.
Starships are as real as magic carpets, genies, Yetis, and mermaids.
There is no “planet B,” life, or wealth on a nonfunctional planet.
Warmer air is thinning most of the vast mountain range’s glaciers, known as the Third Pole because they contain so much ice. The melting could have far-reaching consequences for flood risk and for water security for a billion people who rely on meltwater for their survival.
Spring came early this year in the high mountains of Gilgit-Baltistan, a remote border region of Pakistan. Record temperatures in March and April hastened melting of the Shisper Glacier, creating a lake that swelled and, on May 7, burst through an ice dam. A torrent of water and debris flooded the valley below, damaging fields and houses, wrecking two power plants and washing away parts of the highway and a bridge connecting Pakistan and China.
Pakistan’s climate change minister, Sherry Rehman, tweeted videos of the destruction and highlighted the vulnerability of a region with the largest number of glaciers outside the Earth’s poles. Why were these glaciers losing mass so quickly? Rehman put it succinctly. “High global temperatures,” she said.
Just over a decade ago, relatively little was known about glaciers in the Hindu Kush Himalayas, the vast ice mountains that run across Central and South Asia, from Afghanistan in the west to Myanmar in the east. But a step-up in research in the past 10 years — spurred in part by an embarrassing error in the Intergovernmental Panel on Climate Change’s 2007 Fourth Assessment Report, which predicted that Himalayan glaciers could melt away by 2035 — has led to enormous strides in understanding.
Scientists now have data on almost every glacier in high-mountain Asia. They know “how these glaciers have changed not only in area but in mass during the last 20 years,” says Tobias Bolch, a glaciologist with the University of St Andrews in Scotland. He adds, “We also know much more about the processes which govern glacial melt. This information will give policymakers some instruments to really plan for the future.”
Cool stuff: the diagram shows how the temperature of the caloric material was measured. The plot in the center shows the temperature change in the sample when exposed to a magnetic field. The plot on the right shows the change in temperature when the sample is strained. (Courtesy: Peng Wu et al/Acta Materialia 237 118154)
Topics: Global Warming, Green Tech, Materials Science, Solid-State Physics, Thermodynamics
Researchers in China have shown that applying strain to a composite material using an electric field induces a large and reversible caloric effect. This novel way of enhancing the caloric effect without a magnetic field could open new avenues of solid-state cooling and lead to more energy-efficient and lighter refrigerators.
The International Institute of Refrigeration estimates that 20% of all electricity used globally is expended on vapor-compression refrigeration – which is the technology used in conventional refrigerators and air conditioners. What is more, the refrigerants used in these systems are powerful greenhouse gases that contribute significantly to global warming. As a result, scientists are trying to develop more environmentally friendly refrigeration systems.
Cooling systems can also be made from completely solid-state systems, but these cannot currently compete with vapor compression for most mainstream applications. Today, most commercial solid-state cooling systems use the Peltier effect, which is a thermoelectric process that suffers from high cost and low efficiency.
Topics: Civilization, Climate Change, Environment, Global Warming
Humans can survive up to 108.14 F, or 42.3 C before our brains and constitutions (bodies) start turning to mush. As a species, we're going to have to decide if enriching a handful of global oligarchs is more important than survival. Wealth cannot be measured on a dysfunctional planet.
Nobody in Ashish Agashe’s seven-story apartment building in Thane, a suburb of Mumbai, had air conditioning 20 years ago. Today, his apartment is one of only two of the 28 units without it.
“Once you make peace with sweating,” says Agashe, “it is easy to survive this weather.” He decided against air conditioning because it gives him a “faux feel,” and he doesn’t believe his income should determine his lifestyle choices. Later, he was “chuffed” to learn that his choice is better for the planet.
Unlike Agashe, many Indians are adopting air conditioning to deal with more frequent and more intense heat waves. Earlier this year, temperatures in parts of India and Pakistan surpassed 120 degrees Fahrenheit.
At age 37, Agashe hopes temperatures do not rise high enough in his lifetime to require air conditioning in Mumbai, a humid and densely populated city on India’s west coast that today rarely sees temperatures above 40 degrees Celsius (104 degrees Fahrenheit). But even if the climate stopped changing, he worries that the heat produced by all the air conditioners in his building, which spills in through his open window, may force him to install air conditioning, too.
Kayakers and other boaters paddled up to Manchin, who famously lives on a houseboat named “Almost Heaven” when he’s in DC. The subtitle should be “for the rest of you, hell.” Source: Washingtonian, Maya Pottiger, 10/14/21
Topics: Civilization, Environment, Existentialism, Global Warming
Four more people died that night. In the morning the sun again rose like the blazing furnace of heat it was, blasting the rooftop and its sad cargo of wrapped bodies. Every rooftop and, looking down at the town, every sidewalk was now a morgue. The town was a morgue, and it was as hot as ever, maybe hotter. The thermometer now said 42 degrees (107.6 F), humidity 60 percent.
—Kim Stanley Robinson, from The Ministry for the Future
The first chapter of Kim Stanley Robinson’s The Ministry for the Future takes my breath away. Not just because I can almost feel the heat and humidity dripping off the pages, but because I know that—although the story is fictional—similar scenes are already playing out in real life.
Topics: Astronautics, Climate Change, Environment, Futurism, Global Warming, Mars, Spaceflight
When Elon Musk founded SpaceX in 2002, he envisioned a greenhouse on Mars, not unlike the one later depicted in the 2015 blockbuster The Martian. Soon, his fantasy grew from a small-scale botanical experiment into a vision for a self-sustaining Martian city. In a speech at the 67th International Astronautical Congress in 2016, he argued his point. “History is going to bifurcate along two directions. One path is we stay on Earth forever, and then there will be some eventual extinction event,” Musk said. “The alternative is to become a space-faring civilization and a multi-planet species, which, I hope you would agree, is the right way to go.”
Though Musk later clarified that the extinction event he referenced may take place millennia (or even eons) in the future, the conditions on earth today are becoming increasingly dangerous for human beings. Deadly heatwaves, food insecurity, and catastrophic natural disasters are a few of the hazards that we face as the planet continues to warm. Unfortunately, the Red Planet is a very long way from becoming a viable alternative home. While we measure carbon dioxide concentrations in parts per million on earth, Mars’ atmosphere contains 96% CO2, just one of a litany of logistical nightmares that Martian colonists would have to overcome.
In a perfect world, Musk’s dreams of extraterrestrial civilization could coexist with the eco-forward values that have driven ventures like Tesla’s solar program. But while SpaceX’s aspirations are in space, its operations have an undeniable impact at home. Unlike a Tesla sports car, SpaceX’s rockets aren’t propelled by electricity; they burn kerosene.
Carbon emissions from space launches are dwarfed by other sources of greenhouse gasses, but they could have an outsized impact on climate. The reason for this stems from one particular product of rocket propulsion: black carbon. These tiny chunks of crystalline carbon atoms are short-lived in the atmosphere, but highly absorptive of sunlight. On the Earth’s surface, black carbon from diesel, coal, and wood combustion poses a threat to environmental and public health, particularly in developing countries. But in the upper atmosphere, rocket engines are the sole source of black carbon. For years, scientists have warned that these emissions could have unpredictable effects on climate. Still, research on the topic has been frustratingly slow.
“We identified the issue with black carbon in 2010,” says Darin Toohey, an atmospheric scientist at the University of Colorado Boulder. “The story comes and goes, but the basic players remain the same.”
Clean energy sources like wind turbines are part of Argonne’s decades-long effort to create a carbon-free economy. (Image by Shutterstock/Engel.ac.)
Topics: Battery, Biofuels, Climate Change, Existentialism, Global Warming
Reducing carbon dioxide emissions and removing them from the atmosphere is critical to the global fight against climate change. Called decarbonization, it is one of the focal points in the nation’s strategy to ensure a bright future for our planet and all who live on it.
The U.S. Department of Energy’s (DOE) Argonne National Laboratory has been at the forefront of the quest to decarbonize the U.S. economy for decades.
Argonne scientists are developing new materials for batteries and researching energy-efficient transportation and sustainable fuels. They are expanding carbon-free energy sources like nuclear and renewable power. Argonne researchers are also exploring ways to capture carbon dioxide from the air and from industrial sources, use it to produce chemicals, or store it in the ground.
The ultimate goal? To reduce the greenhouse gases that trap heat in the atmosphere and warm the planet.
Fast physics Formula E has created huge advances in electric vehicles off the racing circuit as well as on, but they still have drawbacks. (Courtesy: Luis Licona/EPA-EFE/Shutterstock)
Topics: Alternate Energy, Battery, Biofuels, Climate Change, Global Warming
Cars – and in particular racecars – might seem the villains in a world grappling with climate change. Racing Green: How Motorsport Science Can Change the World hopes to convince you of exactly the opposite, with science journalist Kit Chapman showing how motorsports not only pioneers new, planet-friendlier machines and materials, but saves lives on and off the track too.
The first part of Chapman’s argument tracks the historical development of cars and competition. His stories show how, from its start, racing has served as a research lab and proving ground for new technologies. The first organized motor races were competitions to encourage innovation, akin to today’s X-Prizes. In 1894 Le Petit Journal offered a purse for the first car to make it from Paris to Rouen, while later races emphasized pure speed or, like the legendary 24 Hours of Le Mans, endurance. Chapman provides a whirlwind tour through the development of the internal combustion engine-powered car and its damning limitations, including the copious greenhouse-gas emissions and the inability to ever achieve more than 50% thermal efficiency.
He then introduces us to new racing series like Formula E and Extreme E, which have changed electric cars “from an eccentric folly to the undisputed future of the automotive industry”. Chapman highlights the advantages of electric vehicles without glossing over their drawbacks: recycling challenges, the potential for difficult-to-extinguish fires resulting from thermal runaway, and ethical/sustainability issues surrounding the materials used. Throughout this section, he links motorsport advances with “real-life” applications. For example, the same flywheels that enabled Audi’s hybrid racecars to take all three podium spots at the 24 Hours of Le Mans in 2012 made London buses more energy efficient. Some connections are a little more tenuous than others, but they are uniformly fascinating.
A radical reimagining of information processing could greatly reduce the energy use—as well as greenhouse gas emissions and waste heat—from computers. Credit: vchal/Getty Images
In case you had not noticed, computers are hot—literally. A laptop can pump out thigh-baking heat, while data centers consume an estimated 200 terawatt-hours each year—comparable to the energy consumption of some medium-sized countries. The carbon footprint of information and communication technologies as a whole is close to that of fuel used in the aviation industry. And as computer circuitry gets ever smaller and more densely packed, it becomes more prone to melting from the energy it dissipates as heat.
Now physicist James Crutchfield of the University of California, Davis, and his graduate student Kyle Ray have proposed a new way to carry out computation that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their approach, described in a recent preprint paper, could bring heat dissipation below even the theoretical minimum that the laws of physics impose on today’s computers. That could greatly reduce the energy needed to both perform computations and keep circuitry cool. And it could all be done, the researchers say, using microelectronic devices that already exist.
In 1961 physicist Rolf Landauer of IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y., showed that conventional computing incurs an unavoidable cost in energy dissipation—basically, in the generation of heat and entropy. That is because a conventional computer has to sometimes erase bits of information in its memory circuits in order to make space for more. Each time a single bit (with the value 1 or 0) is reset, a certain minimum amount of energy is dissipated—which Ray and Crutchfield have christened “the Landauer.” Its value depends on ambient temperature: in your living room, one Landauer would be around 10⁻²¹ joule. (For comparison, a lit candle emits on the order of 10 joules of energy per second.)
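The Landauer bound is simply k_B·T·ln 2, where k_B is the Boltzmann constant and T the ambient temperature. A few lines of Python reproduce the living-room figure quoted above (300 K is my assumed room temperature):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy dissipated to erase one bit: k_B * T * ln(2)."""
    return k_B * temp_kelvin * math.log(2)

# At a "living room" temperature of ~300 K:
print(f"{landauer_limit(300.0):.2e} J")  # on the order of 1e-21 J, as the text says
```

Note how directly the bound scales with temperature: cooling circuitry lowers the erasure cost, which is one reason the temperature dependence matters to the argument here.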
Topics: Climate Change, Existentialism, Global Warming, Research
More moisture in a warmer atmosphere is fueling intense hurricanes and flooding rains.
The summer of 2021 was a glaring example of what disruptive weather will look like in a warming world. In mid-July, storms in western Germany and Belgium dropped up to eight inches of rain in two days. Floodwaters ripped buildings apart and propelled them through village streets. A week later a year’s worth of rain—more than two feet—fell in China’s Henan province in just three days. Hundreds of thousands of people fled rivers that had burst their banks. In the capital city of Zhengzhou, commuters posted videos showing passengers trapped inside flooding subway cars, straining their heads toward the ceiling to reach the last pocket of air above the quickly rising water. In mid-August a sharp kink in the jet stream brought torrential storms to Tennessee that dropped an incredible 17 inches of rain in just 24 hours; catastrophic flooding killed at least 20 people. None of these storm systems were hurricanes or tropical depressions.
Soon enough, though, Hurricane Ida swirled into the Gulf of Mexico, the ninth named tropical storm in the year’s busy North Atlantic season. On August 28 it was a Category 1 storm with sustained winds of 85 miles per hour. Less than 24 hours later Ida exploded to Category 4, whipped up at nearly twice the rate that the National Hurricane Center uses to define a rapidly intensifying storm. It hit the Louisiana coast with winds of 150 miles an hour, leaving more than a million people without power and more than 600,000 without water for days. Ida’s wrath continued into the Northeast, where it delivered a record-breaking 3.15 inches of rain in one hour in New York City. The storm killed at least 80 people and devastated a swath of communities in the eastern U.S.
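Ida’s jump can be checked against the National Hurricane Center’s rapid-intensification benchmark, which is roughly a 35 mph (30-knot) gain in maximum sustained winds within 24 hours; the wind speeds below come straight from the paragraph above, and the threshold value is my own gloss on the NHC definition:

```python
RI_THRESHOLD_MPH = 35  # NHC rapid-intensification benchmark: ~35 mph (30 kt) gain in 24 h

wind_before = 85   # mph, Category 1 on August 28
wind_after = 150   # mph, approaching landfall less than 24 hours later

gain = wind_after - wind_before
print(f"{gain} mph gain, {gain / RI_THRESHOLD_MPH:.1f}x the RI threshold")
```

A 65 mph gain in under a day is just shy of double the benchmark, which is the "nearly twice the rate" in the text.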
What all these destructive events have in common is water vapor—lots of it. Water vapor—the gaseous form of H2O—is playing an outsized role in fueling destructive storms and accelerating climate change. As the oceans and atmosphere warm, additional water evaporates into the air. Warmer air, in turn, can hold more of that vapor before it condenses into cloud droplets that can create flooding rains. The amount of vapor in the atmosphere has increased about 4 percent globally just since the mid-1990s. That may not sound like much, but it is a big deal to the climate system. A juicier atmosphere provides extra energy and moisture for storms of all kinds, including summertime thunderstorms, nor’easters along the U.S. Eastern Seaboard, hurricanes, and even snowstorms. Additional vapor helps tropical storms like Ida intensify faster, too, leaving precious little time for safety officials to warn people in the crosshairs.
The Eye of Providence, Wikipedia. Not a conspiracy theorist, but greed, not mutual survival by cooperation, is how we got where we are.
Topics: Climate Change, Existentialism, Global Warming, Human Rights, Politics
Note: Because certain states are tied umbilically to coal, fossil fuels, and natural gas, this is a quandary. The same industry that has known about this problem since 1979 (my senior year in high school) hired the same law firms that obfuscated the risk of lung cancer for so many Americans, among them my father, who died from the accumulated damage a lifetime of smoking did to his lungs. Washington lobbyists are there to push an agenda for the companies they represent, companies with influence over Capitol Hill lawmakers. They have a seat at the Paris Climate Accords because the goal of capitalism is to maximize profits, sadly at the sacrifice of the ground under our feet.
“Only when the last tree has been cut down, the last fish been caught, and the last stream poisoned, will we realize we cannot eat money.”
― Cree Indian Prophecy
The majority of the planet’s fossil fuel reserves must stay in the ground if the world wants even half a chance—literally—at meeting its most ambitious climate targets.
A new study published yesterday in the journal Nature found that 60 percent of oil and natural gas, and a whopping 90 percent of coal, must remain unextracted and unused between now and 2050 in order for the world to have at least a 50 percent shot at limiting warming to 1.5 degrees Celsius.
These results are broadly consistent with the findings of numerous recent reports, from the United Nations, the International Energy Agency, and others, which have “all provided evidence that dramatic cuts in fossil fuel production are required immediately in order to move towards limiting global heating to 1.5 degrees,” said Dan Welsby, a researcher at University College London and lead author of the study, at a press conference announcing the results.
Under the Paris climate agreement, nations are working to keep global temperatures within 2 C of their preindustrial levels, and within 1.5 C if at all possible. Research suggests that the effects of climate change—melting ice, rising seas, more extreme weather, and so on—will be worse at 2 C than at 1.5 C, and worse still at higher temperatures. These targets are an attempt to limit the consequences of global warming as much as possible.
Yet studies increasingly suggest that the 1.5 C target is looming closer and closer.
The world has already warmed by more than a degree Celsius since the start of the industrial era, which began about 150 years ago. A landmark U.N. report on climate change, released last month by the Intergovernmental Panel on Climate Change, warned that the 1.5 C mark could be reached within two decades.
To have even a 50 percent chance of meeting the target, the U.N. report suggests, the world can emit only about 460 billion metric tons of additional carbon dioxide into the atmosphere. That’s another 12 years or so of emissions at the rate at which the world is currently going.
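The "12 years or so" figure above is easy to sanity-check. As a minimal sketch, assuming current global CO2 emissions of roughly 37 billion metric tons per year (a commonly cited recent figure, not stated in the article):

```python
# Back-of-envelope check of the remaining carbon budget timeline.
# ANNUAL_EMISSIONS_GT is an assumption, not a figure from the U.N. report.
REMAINING_BUDGET_GT = 460   # billion metric tons CO2 (from the U.N. report)
ANNUAL_EMISSIONS_GT = 37    # assumed current global emissions rate

years_left = REMAINING_BUDGET_GT / ANNUAL_EMISSIONS_GT
print(f"Budget exhausted in about {years_left:.0f} years at current rates")
```

Dividing the budget by the assumed annual rate gives roughly 12 years, consistent with the report's framing.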
That means global carbon emissions need to fall sharply, and immediately, in order to meet the goal.
Topics: Climate Change, Existentialism, Global Warming
The hundreds of climate experts who compiled the mammoth new climate report released today by the U.N. Intergovernmental Panel on Climate Change (IPCC) had to work under unprecedented pandemic conditions. At vast meetings forced online, scientists wrestled with how to convey the extent of the global crisis and the urgent need to act. It was uncanny to see “the echoes of one crisis in another,” says Claudia Tebaldi, a climate scientist at Pacific Northwest National Laboratory and one of the authors of the report.
The report paints an alarming picture but emphasizes there is still time for swift action to mitigate the worst of the projected impacts of climate change. Current average warming is now estimated at 1.1°C compared to preindustrial records, a revision based on improved methods and data that adds 0.1°C to previous estimates. Under every emissions scenario explored by the report, average warming of 1.5°C—a major target of the Paris climate accord—will very likely be reached within the next 20 years.
That timetable “underscores a sense of urgency for immediate and decisive action by every country, especially the major economies,” says Jane Lubchenco, deputy director for climate and the environment at the White House Office of Science and Technology Policy. “This is a critical decade for keeping the 1.5°C targets within reach.” And the projections mean countries should come to the United Nations Climate Change Conference, scheduled for November, with the most “aggressive, ambitious” targets possible, she says.
A river snakes its way through the Amazon rain forest in Peru. Credits: USDA Forest Service
Topics: Climate Change, Economics, Environment, Existentialism, Global Warming
The finding comes out of an effort to map where vegetation is emitting and soaking up carbon dioxide from the atmosphere.
Earth’s trees and plants pull vast amounts of carbon dioxide out of the atmosphere during photosynthesis, incorporating some of that carbon into structures like wood. Areas that absorb more carbon than they emit are called carbon sinks. But plants can also emit the greenhouse gas during processes like respiration, when dead plants decay, or during combustion in the case of fires. Researchers are particularly interested in whether – and how – plants at the scale of an ecosystem like a forest act as sources or sinks in an increasingly warming world.
A recent study led by scientists at NASA’s Jet Propulsion Laboratory in Southern California identified whether vegetated areas like forests and savannas around the world were carbon sources or sinks every year from 2000 to 2019. The research found that over the course of those two decades, living woody plants were responsible for more than 80% of the sources and sinks on land, with soil, leaf litter, and decaying organic matter making up the rest. But they also saw that vegetation retained a far smaller fraction of the carbon than the scientists originally thought.
In addition, the researchers found that the total amount of carbon emitted and absorbed in the tropics was four times larger than in temperate regions and boreal areas (the northernmost forests) combined, but that the ability of tropical forests to absorb massive amounts of carbon has waned in recent years. This decline stems from large-scale deforestation, habitat degradation, and climate change effects, like more frequent droughts and fires. In fact, the study, published in Science Advances, showed that 90% of the carbon that forests around the world absorb from the atmosphere is offset by the amount of carbon released by such disturbances as deforestation and droughts.
The scientists created maps of carbon sources and sinks from land-use changes like deforestation, habitat degradation, and forest planting, as well as forest growth. They did so by analyzing data on global vegetation collected from space using instruments such as NASA’s Geoscience Laser Altimeter System (GLAS) aboard ICESat and the agency’s Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra and Aqua satellites. The analysis used a machine-learning algorithm that the researchers first trained using vegetation data gathered on the ground and in the air with laser-scanning instruments.
Topics: Climate Change, Existentialism, Global Warming, Politics
Fahrenheit to Celsius
Celsius to Fahrenheit
(5/9)(°F - 32) = °C
(9/5) °C + 32 = °F
Handy-Dandy Conversion Table
Even though the Big Think video is informative, my critique is that it presumes a great deal about its audience, presumably the species.
The assumption is that even with the equivalent of supercomputers on our hips, humans will be motivated beyond the video to learn the difference between Fahrenheit and Celsius. What the average human mind will process is "two degrees," which doesn't sound like much, since mathematical dexterity is only encouraged in those interested in STEM.
On Wednesday, when former Rhode Island Gov. Lincoln Chafee announced his bid for the Democratic presidential nomination, his remarks on the occasion contained some of the usual sentiment about the importance of being a bold and inspiring nation—but they also contained something a bit unusual. “Here’s a bold embrace of internationalism: let’s join the rest of the world and go metric,” he said. “I happened to live in Canada as they completed the process. Believe me, it is easy. It doesn’t take long before 34 degrees is hot. Only Myanmar, Liberia, and the United States aren’t metric and it will help our economy!”
Resistance to the metric system (originally from the French) has, to quote Ms. Rothman, "a long, tortuous history" in the United States. Resistance to "change" is inherently political, and one of our two major political parties is famous for looking backward, as well as for celebrating and preserving a hierarchical status quo.
I'm not saying the video isn't informative. The above formulas were drilled into me in middle school science class, and since I make my living in STEM and continue my education there, mental conversion is a familiar exercise.
It should be for average citizens also. The video concerns two degrees Celsius hotter; the title I derived from one degree hotter (in bold below):
(9/5) 0°C + 32 = 32°F
(9/5) 1°C + 32 = 1.8 + 32 = 33.8°F
(9/5) 2°C + 32 = 3.6 + 32 = 35.6°F
(9/5) 3°C + 32 = 5.4 + 32 = 37.4°F
(9/5) 4°C + 32 = 7.2 + 32 = 39.2°F
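The formulas and worked table above can be sketched as two small functions, which anyone can run to check my arithmetic (the function names are mine, chosen for illustration):

```python
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit: (9/5)C + 32."""
    return (9 / 5) * celsius + 32

def f_to_c(fahrenheit: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius: (5/9)(F - 32)."""
    return (5 / 9) * (fahrenheit - 32)

# Reproduce the table: each degree Celsius of warming adds 1.8 °F.
for c in range(5):
    print(f"{c} °C above freezing = {c_to_f(c):.1f} °F")
```

The key takeaway is the 9/5 factor: "two degrees" Celsius is 3.6 degrees Fahrenheit, nearly double what an American ear hears.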
Add that to the average summer temperatures in the Arctic, California, Texas, or North Carolina, and you can see why environmental scientists are hair-on-fire excited.
Some of the best science lectures I've attended are when the speaker assumes the audience is hearing the information for the first time, provides a primer of about 15 to 20 minutes, then gives a 35- to 40-minute lecture, allowing time for questions. It respects the intelligence and time of the audience.
The opposite: the lecturer is so excited about their work that they hit Warp Seven after clearing orbital drydock and head for Andromeda, 2.537 million light-years away. The only time they stop is when the host informs them their time is up, and by then it's evident the crowd has tuned out, checking social media and drooling as they wait for the lecture/torture to end.
To convey the gravity of the situation, I feel we need to communicate better with the general public to win buy-in on two points: 1. there is a crisis, and 2. we have to do something about it.
By logical extension, science communication can mean life or death. Ninety-nine-point-five percent of new COVID deaths are from the unvaccinated, so armchair conspiracy theories are not proving helpful. I took the Moderna vaccine. I did not become magnetic. I did not become the carrier of a variant. I'm a grandfather, so my infertility at this stage is kind of irrelevant. No one started tracking me (for what reason, God only knows).
Please feel free to share my post, and check my calculations. We all need a clear understanding, not fossil fuel industry/corporate lobbyist gaslighting, on where we're headed if we don't heed the warnings.
“Science-fiction writers foresee the inevitable, and although problems and catastrophes may be inevitable, solutions are not.”
― Isaac Asimov
Optimal size: wind farm efficiency drops as installations become bigger. (Courtesy: iStock/ssuaphoto)
Topics: Alternate Energy, Climate Change, Existentialism, Global Warming, Green Tech, Thermodynamics
Optimizing the placement of turbines within a wind farm can significantly increase energy extraction – but only until the installation reaches a certain size, researchers in the US conclude. This is just one finding of a computational study on wind turbines’ effects on the airflow around them, and consequently the ability of nearby turbines – and even nearby wind farms – to extract energy from that airflow.
Wind power could supply more than a third of global energy by 2050, so the researchers hope their analysis will assist in better designs of wind farms.
It is well known that the efficiency of turbines in a wind farm can be significantly lower than that of a single turbine on its own. While small wind farms can achieve a power density of over 10 W/m2, this can drop to as little as 1 W/m2 in very large installations. The first law of thermodynamics dictates that turbines must reduce the energy of the wind that has passed through them. However, turbines also inject turbulence into the flow, which can make it more difficult for downstream turbines to extract energy.
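To get a feel for what that power-density range means in practice, here is a minimal sketch using an assumed example farm of 10 square kilometers (the farm size and the loop values are illustrative numbers, not from the study):

```python
# Illustration: total output of a hypothetical 10 km^2 wind farm at the
# small-farm (10 W/m^2) and very-large-farm (1 W/m^2) power densities.
FARM_AREA_M2 = 10e6  # 10 km^2 expressed in square meters (assumed size)

for density_w_per_m2 in (10, 1):
    power_mw = density_w_per_m2 * FARM_AREA_M2 / 1e6  # watts -> megawatts
    print(f"{density_w_per_m2} W/m^2 over 10 km^2 -> {power_mw:.0f} MW")
```

Under these assumptions the same land area delivers 100 MW at small-farm density but only 10 MW at very-large-farm density, which is why the placement and sizing question matters.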
“People were already aware of these issues,” says Enrico Antonini of the Carnegie Institution for Science in California, “but no one had ever defined what controls these numbers.”
A polar bear perches on a thick chunk of sea ice north of Greenland in March 2016. These thicker, older pieces of sea ice don’t fully protect the larger region from losing its summer ice cover. (Image credit: Kristin Laidre/University of Washington)
Topics: Climate Change, Existentialism, Global Warming
The "Last Ice Area," an Arctic region known for its thick ice cover, may be more vulnerable to climate change than scientists suspected, a new study has found.
This frozen zone, which lies to the north of Greenland, earned its dramatic name because even though its ice grows and shrinks seasonally, much of the sea ice here was thought to be thick enough to persist through summer's warmth.
But during the summer of 2020, the Wandel Sea in the eastern part of the Last Ice Area lost 50% of its overlying ice, bringing coverage there to its lowest since record-keeping began. In the new study, researchers found that weather conditions were driving the decline, but climate change made that possible by gradually thinning the area's long-standing ice year after year. This hints that global warming may threaten the region more than prior climate models suggested.