The Secret Life of the Universe...

Topics: Astrobiology, Biology, Instrumentation, James Webb Space Telescope, Research, SETI

"The Secret Life of the Universe" by Dr. Nathalie Cabrol, the SETI Institute's chief scientist and Director of the Carl Sagan Center at the SETI Institute, is coming out this week, both in the US (August 13, 2024) and in the UK (August 15, 2024). Scriber/Simon & Schuster publishes both editions. Cabrol articulates an overview of where we stand today in our search for life in the universe, what's coming, and how looking out for life beyond Earth teaches us about our place on our planet.

Here is an excerpt to inspire you:

On July 11, 2022, the James Webb Space Telescope (JWST) returned its first images, penetrating the wall of time to show us the universe just a few hundred million years after its formation. In a marvelous cosmic irony, this immersion into the depths of our origins propels us into the future, where a revolution looms large in astronomy, in cosmology, and in astrobiology—the search for life in the universe. JWST comes after a few decades of space and planetary exploration during which we have discovered countless habitable environments in our solar system—for (simple) life as we know it—but also thousands of exoplanets in our galaxy, some of them located in the habitable zone of their parent stars.

We are living in a golden age in astrobiology, the beginning of a fantastic odyssey in which much remains to be written, but where our first steps bring the promise of prodigious discoveries. And these first steps have already transformed our species in one generation in a way that we cannot foresee just yet.

Copernicus taught us long ago that the Earth was neither at the center of the universe nor the center of the solar system, for that matter. We also learned from the work of Harlow Shapley and Henrietta Swan Leavitt that the solar system does not even occupy any particularly prominent place in our galaxy. It is simply tucked away at the inner edge of Orion’s spur in the Milky Way, 27,000 light-years from its center, in a galactic suburb of sorts. Our sun is an average-sized star located in a galaxy propelled at 2.1 million kilometers per hour in a visible universe that counts maybe 125 billion such cosmic islands, give or take a few billion. In this immensity, the Kepler mission taught us that planetary systems are the rule, not the exception.

This is how, in a mere quarter of a century, we found ourselves exploring a universe populated by as many planets as stars. Yet, looking up and far into what seems to be an infinite ocean of possibilities, the only echoes we have received so far from our explorations have been barren planetary landscapes and thundering silence. Could it be that we are the only guests at the universal table? Maybe. As a scientist, I cannot wholly discount this hypothesis, but it seems very unlikely and “an awful waste of space,” and for more than one reason.

The Secret Life of the Universe, SETI Institute

Read more…

Ice, Snow, Water, Nada...

Figure 1. The Vadret da Tschierva glacier in 1935 (top) and in 2022 (bottom). Photos courtesy of swisstopo, L. Hösli, G. Carcanade, M. Huss, VAW-ETHZ.

Topics: Civilization, Climate Change, Fluid Mechanics, Global Warming, Meteorology, Research

Glaciers—dynamic masses of ice descending from the mountain tops—have always been fascinating to humankind. They intrinsically belong to the high-alpine environment. Countless photographs immortalize their bright white beauty and the power they radiate. Glaciers have been depicted in oil and on parchment for centuries, as if trying to capture their transience. They are constantly moving; under the influence of gravity, the ice generated at high elevation flows downwards and shapes tremendous glacier tongues that are speckled with deep fissures known as crevasses. Sometimes, the openings to these crevasses are hidden by a light dusting of snow, so mountaineers need a lot of experience to accurately judge their exposure to the ice.

Even though glaciers are not living things, they are not lifeless. For many mountain regions worldwide, glaciers function similarly to lungs: They absorb snow in wintertime and “breathe” out water during hot summer days. This glacier water is urgently needed, especially in dry periods.

Glaciers consequently have a relevance that goes far beyond the mountain peaks where they reside. A reduction in meltwater from glaciers would be painful for nature and the global economy: irrigation of fields would be restricted, the temperature and mineralization of rivers would change, and during periods of drought, serious bottlenecks could arise in the drinking water supply and in shipping on rivers. In addition, melting glacial ice contributes to sea-level rise and therefore directly or indirectly affects billions of people living near the coast.

Despite, or perhaps because of, their majestic appearance, glaciers can also pose an immediate threat: they can produce floods and ice avalanches that endanger villages in the valleys below. Together with permafrost, glaciers also stabilize mountain flanks and therefore reduce the potential for rock falls and landslides—a stabilizing role that is increasingly being lost.

The so-called “eternal” ice of glaciers tells a long and dynamic story. During the Ice Age, ice sheets covered a large part of the North American continent, as well as Europe. The last time this happened was around 20,000 years ago—a blink of the eye from a geological perspective. Since then, the climate has changed, due both to natural factors and to anthropogenic influences—human-caused factors—which have massively accelerated over the past 100 years (Marzeion et al., 2014). As a result, the glaciers are still present, but they are getting smaller every year.

The Alps’ iconic glaciers are melting, but there’s still time to save (most of) the biggest, Matthias Huss, Bulletin of the Atomic Scientists.

Read more…

Spectral Molecule...

Scientists detected 2-Methoxyethanol in space for the first time using radio telescope observations of the star-forming region NGC 6334I. Credit: Massachusetts Institute of Technology

Topics: Astronomy, Chemistry, Instrumentation, Interstellar, Research, Spectrographic Analysis

New research from the group of MIT Professor Brett McGuire has revealed the presence of a previously unknown molecule in space. The team's open-access paper, "Rotational Spectrum and First Interstellar Detection of 2-Methoxyethanol Using ALMA Observations of NGC 6334I," was published in the April 12 issue of The Astrophysical Journal Letters.

Zachary T.P. Fried, a graduate student in the McGuire group and the lead author of the publication, worked to assemble a puzzle composed of pieces collected from across the globe, extending beyond MIT to France, Florida, Virginia, and Copenhagen, to achieve this exciting discovery.

"Our group tries to understand what molecules are present in regions of space where stars and solar systems will eventually take shape," explains Fried. "This allows us to piece together how chemistry evolves alongside the process of star and planet formation. We do this by looking at the rotational spectra of molecules, the unique patterns of light they give off as they tumble end-over-end in space.

"These patterns are fingerprints (barcodes) for molecules. To detect new molecules in space, we first must have an idea of what molecule we want to look for, then we can record its spectrum in the lab here on Earth, and then finally we look for that spectrum in space using telescopes."

Researchers detect a new molecule in space, Danielle Randall Doughty, Massachusetts Institute of Technology, Phys.org.

Read more…

Esse Quam Videri...

Credit: Menno Schaefer/Adobe

Starlings flock in a so-called murmuration, a collective behavior of interest in biological physics — one of many subfields that did not always “belong” in physics.

Topics: Applied Physics, Cosmology, Einstein, History, Physics, Research, Science

"To be rather than to seem." Translated from the Latin Esse Quam Videri, which also happens to be the state motto of North Carolina. It is from the treatise on Friendship by the Roman statesman Cicero, a reminder of the beauty and power of being true to oneself. Source: National Library of Medicine: Neurosurgery

If you’ve been in physics long enough, you’ve probably left a colloquium or seminar and thought to yourself, “That talk was interesting, but it wasn’t physics.”

If so, you’re one of many physicists who muse about the boundaries of their field, perhaps with colleagues over lunch. Usually, it’s all in good fun.

But what if the issue comes up when a physics faculty makes decisions about hiring or promoting individuals to build, expand, or even dismantle a research effort? The boundaries of a discipline bear directly on the opportunities departments can offer students. They also influence those students' evolving identities as physicists and how they think about their own professional futures and the future of physics.

So, these debates — over physics and “not physics” — are important. But they are also not new. For more than a century, physicists have been drawing and redrawing the borders around the field, embracing and rejecting subfields along the way.

A key moment for “not physics” occurred in 1899 at the second-ever meeting of the American Physical Society. In his keynote address, the APS president Henry Rowland exhorted his colleagues to “cultivate the idea of the dignity” of physics.

“Much of the intellect of the country is still wasted in the pursuit of so-called practical science which ministers to our physical needs,” he scolded, “[and] not to investigations in the pure ethereal physics which our Society is formed to cultivate.”

Rowland’s elitism was not unique — a fact that first-rate physicists working at industrial laboratories discovered at APS meetings, when no one showed interest in the results of their research on optics, acoustics, and polymer science. It should come as no surprise that, between 1915 and 1930, physicists were among the leading organizers of the Optical Society of America (now Optica), the Acoustical Society of America, and the Society of Rheology.

That acousticians were given a cold shoulder at early APS meetings is particularly odd. At the time, acoustics research was not uncommon in American physics departments. Harvard University, for example, employed five professors who worked extensively in acoustics between 1919 and 1950. World War II motivated the U.S. Navy to sponsor a great deal of acoustics research, and many physics departments responded quickly. In 1948, the University of Texas hired three acousticians as assistant professors of physics. Brown University hired six physicists between 1942 and 1952, creating an acoustics powerhouse that ultimately trained 62 physics doctoral students.

The acoustics landscape at Harvard changed abruptly in 1946, when all teaching and research in the subject moved from the physics department to the newly created department of engineering sciences and applied physics. In the years after, almost all Ph.D. acoustics programs in the country migrated from physics departments to “not physics” departments.

The reason for this was explained by Cornell University professor Robert Fehr at a 1964 conference on acoustics education. Fehr pointed out that engineers like himself exploited the fundamental knowledge of acoustics learned from physicists to alter the environment for specific applications. Consequently, it made sense that research and teaching in acoustics passed from physics to engineering.

It took less than two decades for acoustics to go from being physics to “not physics.” But other fields have gone the opposite direction — a prime example being cosmology.

Albert Einstein applied his theory of general relativity to the cosmos in 1917. However, his work generated little interest because there was no empirical data to which it applied. Edwin Hubble’s work on extragalactic nebulae appeared in 1929, but for decades, there was little else to constrain mathematical speculations about the physical nature of the universe. The theoretical physicists Freeman Dyson and Steven Weinberg have both used the phrase “not respectable” to describe how cosmology was seen by physicists around 1960. The subject was simply “not physics.”

This began to change in 1965 with the discovery of thermal microwave radiation throughout the cosmos — empirical evidence of the nearly 20-year-old Big Bang model. Physicists began to engage with cosmology, and the percentage of U.S. physics departments with at least one professor who published in the field rose from 4% in 1964 to 15% in 1980. In the 1980s, physicists led the satellite mission to study the cosmic microwave radiation, and particle physicists — realizing that the hot early universe was an ideal laboratory to test their theories — became part-time cosmologists. Today, it's hard to find a medium-to-large-sized physics department that does not list cosmology as a research specialty.

Opinion: That's Not Physics, Andrew Zangwill, APS

Read more…

Spongy Narks...

Scientists used samples from sclerosponges off the coast of Puerto Rico to calculate ocean surface temperatures going back 300 years. Douglas Rissing/iStockphoto/Getty Images

Topics: Climate Change, Existentialism, Global Warming, Research, Thermodynamics

CNN — Using sponges collected off the coast of Puerto Rico in the eastern Caribbean, scientists have calculated 300 years of ocean temperatures and concluded the world has already overshot one crucial global warming limit and is speeding toward another.

These findings, published Monday in the journal Nature Climate Change, are alarming but also controversial. Other scientists say the study contains too many uncertainties and limitations to draw such firm conclusions and could end up confusing public understanding of climate change.

Sponges — which grow slowly, layer by layer — can act like data time capsules, allowing a glimpse into what the ocean was like hundreds of years ago, long before the existence of modern data.

Using samples from sclerosponges, which live for centuries, the team of international scientists was able to calculate ocean surface temperatures going back 300 years.

They found human-caused warming may have started earlier than currently assumed and, as a result, global average temperature may have already warmed more than 1.5 degrees Celsius above pre-industrial levels. Researchers say the results also suggest global temperature could overshoot 2 degrees of warming by the end of the decade.

Under the 2015 Paris Agreement, countries pledged to restrict global warming to less than 2 degrees above pre-industrial levels, with an ambition to limit it to 1.5 degrees. The pre-industrial era — or the state of the climate before humans started burning large amounts of fossil fuels and warming the planet — is commonly defined as 1850-1900.

Data from centuries-old sea creatures suggest the world is warming faster than scientists thought, Rachel Ramirez, CNN

Read more…

Boltwood Estimate...

Credit: Public Domain

Topics: Applied Physics, Education, History, Materials Science, Philosophy, Radiation, Research

We take for granted that Earth is very old, almost incomprehensibly so. But for much of human history, estimates of Earth’s age were scattershot at best. In February 1907, a chemist named Bertram Boltwood published a paper in the American Journal of Science detailing a novel method of dating rocks that would radically change these estimates. In mineral samples gathered from around the globe, he compared lead and uranium levels to determine the minerals’ ages. One was a bombshell: A sample of the mineral thorianite from Sri Lanka (known in Boltwood’s day as Ceylon) yielded an age of 2.2 billion years, suggesting that Earth must be at least that old as well. While Boltwood was off by more than 2 billion years (Earth is now estimated to be about 4.5 billion years old), his method undergirds one of today’s best-known radiometric dating techniques.
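
The logic of modern uranium-lead dating fits in a few lines. This is a hedged sketch of the present-day decay-law form, not Boltwood's own procedure: in 1907 he worked from empirically estimated lead-accumulation rates, since decay constants were not yet well known.

```python
import math

# Sketch of the modern U-Pb age equation: N_Pb/N_U = exp(lambda*t) - 1,
# so t = ln(1 + Pb/U) / lambda.

HALF_LIFE_U238_YR = 4.468e9                    # uranium-238 half-life, years
LAMBDA_U238 = math.log(2) / HALF_LIFE_U238_YR  # decay constant, 1/year

def u_pb_age_years(pb_to_u_ratio):
    """Age from the radiogenic-lead/uranium atom ratio."""
    return math.log(1.0 + pb_to_u_ratio) / LAMBDA_U238

print(f"{u_pb_age_years(0.4):.2e} years")
# A ratio of 0.4 gives ~2.2e9 years, the scale of Boltwood's thorianite result.
```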

In the Christian world, Biblical cosmology placed Earth’s age at around 6,000 years, but fossil and geology discoveries began to upend this idea in the 1700s. In 1862, physicist William Thomson, better known as Lord Kelvin, used Earth’s supposed rate of cooling and the assumption that it had started out hot and molten to estimate that it had formed between 20 and 400 million years ago. He later whittled that down to 20-40 million years, an estimate that rankled Charles Darwin and other “natural philosophers” who believed life’s evolutionary history must be much longer. “Many philosophers are not yet willing to admit that we know enough of the constitution of the universe and of the interior of our globe to speculate with safety on its past duration,” Darwin wrote. Geologists also saw this timeframe as much too short to have shaped Earth’s many layers.
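
Kelvin's range can be reproduced with the standard half-space conduction model: a body that starts uniformly hot develops a surface temperature gradient G = T0/√(πκt), so an observed gradient implies an age t = T0²/(πκG²). A rough Python sketch with illustrative values:

```python
import math

# Kelvin-style cooling arithmetic under the half-space conduction model.
# All input values below are illustrative assumptions, not Kelvin's exact ones.

T0 = 2000.0      # assumed initial temperature excess of molten rock, K
KAPPA = 1.0e-6   # thermal diffusivity of rock, m^2/s
G = 0.035        # observed near-surface gradient, K/m (about 35 K per km)

t_s = T0**2 / (math.pi * KAPPA * G**2)
print(f"{t_s / (365.25 * 24 * 3600):.1e} years")
# ~3e7 years: squarely in Kelvin's later 20-40 million year range, and far
# too short -- the model omits the radiogenic heating about to be discovered.
```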

Lord Kelvin and other physicists continued studies of Earth’s heat, but a new concept — radioactivity — was about to topple these pursuits. In the 1890s, Henri Becquerel discovered radioactivity, and the Curies discovered the radioactive elements radium and polonium. Still, wrote physicist Alois F. Kovarik in a 1929 biographical sketch of Boltwood, “Radioactivity at that time was not a science as yet, but merely represented a collection of new facts which showed only little connection with each other.”

February 1907: Bertram Boltwood Estimates Earth is at Least 2.2 Billion Years Old, Tess Joosse, American Physical Society

Read more…

A Path From Panic...

PAC1R-expressing dorsal raphe neurons in the mouse brain (red) serve as the projection targets for PACAP parabrachial neurons to mediate panic-like behavioral and physical symptoms. Credit: Salk Institute

Topics: Biology, Medicine, Research, Science

Overwhelming fear, sweaty palms, shortness of breath, rapid heart rate—these are the symptoms of a panic attack, which people with panic disorder have frequently and unexpectedly. Creating a map of the regions, neurons, and connections in the brain that mediate these panic attacks can provide guidance for developing more effective panic disorder therapeutics.

Now, Salk researchers have begun to construct that map by discovering a brain circuit that mediates panic disorder. This circuit consists of specialized neurons that send and receive a neuropeptide—a small protein that sends messages throughout the brain—called PACAP. What's more, they determined that PACAP and the neurons that produce its receptor are possible druggable targets for new panic disorder treatments.

The findings were published in Nature Neuroscience.

"We've been exploring different areas of the brain to understand where panic attacks start," says senior author Sung Han, associate professor at Salk.

"Previously, we thought the amygdala, known as the brain's fear center, was mainly responsible—but even people who have damage to their amygdala can still experience panic attacks, so we knew we needed to look elsewhere. Now, we've found a specific brain circuit outside of the amygdala that is linked to panic attacks and could inspire new panic disorder treatments that differ from currently available panic disorder medications that typically target the brain's serotonin system."

Scientists uncover key brain pathway mediating panic disorder symptoms, Salk Institute.

Read more…

Anthrobots...

An Anthrobot is shown, depth colored, with a corona of cilia that provides locomotion for the bot. Credit: Gizem Gumuskaya, Tufts University

Topics: Applied Physics, Biology, Biomimetics, Biotechnology, Research, Robotics

Researchers at Tufts University and Harvard University's Wyss Institute have created tiny biological robots, which they call Anthrobots, from human tracheal cells. The bots can move across a surface and have been found to encourage the growth of neurons across a region of damage in a lab dish.

The multicellular robots, ranging in size from the width of a human hair to the point of a sharpened pencil, were made to self-assemble and shown to have a remarkable healing effect on other cells. The discovery is a starting point for the researchers' vision to use patient-derived biobots as new therapeutic tools for regeneration, healing, and treatment of disease.

The work follows from earlier research in the laboratories of Michael Levin, Vannevar Bush Professor of Biology at Tufts University School of Arts & Sciences, and Josh Bongard at the University of Vermont, in which they created multicellular biological robots, called Xenobots, from frog embryo cells. The Xenobots were capable of navigating passageways, collecting material, recording information, healing themselves from injury, and even replicating for a few cycles on their own.

At the time, researchers did not know if these capabilities were dependent on their being derived from an amphibian embryo or if biobots could be constructed from cells of other species.

In the current study, published in Advanced Science, Levin, along with Ph.D. student Gizem Gumuskaya, discovered that bots can, in fact, be created from adult human cells without any genetic modification, and they are demonstrating some capabilities beyond what was observed with the Xenobots.

The discovery starts to answer a broader question that the lab has posed—what are the rules that govern how cells assemble and work together in the body, and can the cells be taken out of their natural context and recombined into different "body plans" to carry out other functions by design?

Anthrobots: Scientists build tiny biological robots from human tracheal cells, Tufts University

Read more…

Dark Matter, Ordinary Matter...

Topics: Astronomy, Astrophysics, Dark Matter, Research, Theoretical Physics

Dark matter, composed of particles that do not reflect, emit, or absorb light, is predicted to make up most of the matter in the universe. However, its lack of interactions with light prevents its direct detection using conventional experimental methods.

Physicists have been trying to devise alternative methods to detect and study dark matter for decades, yet many questions about its nature and its presence in our galaxy remain unanswered. Pulsar Timing Array (PTA) experiments have been trying to probe the presence of so-called ultralight dark matter particles by examining the timing of an ensemble of galactic millisecond radio pulsars (i.e., celestial objects that emit radio pulses at regular, millisecond-scale intervals).

The European Pulsar Timing Array, a multinational team of researchers based at different institutes that uses six radio telescopes across Europe to observe specific pulsars, recently analyzed the second set of data they collected. Their paper, published in Physical Review Letters, sets more stringent constraints on the presence of ultralight dark matter in the Milky Way.

"This paper was basically the result of my first Ph.D. project," Clemente Smarra, co-author of the paper, told Phys.org. "The idea arose when I asked my supervisor if I could carry out research focusing on gravitational wave science, but from a particle physics perspective. The main aim of the project was to constrain the presence of the so-called ultralight dark matter in our galaxy."

Ultralight dark matter is a hypothetical dark matter candidate, made up of very light particles that could potentially address long-standing mysteries in the field of astrophysics. The recent study by Smarra and his colleagues was aimed at probing the possible presence of this type of dark matter in our galaxy via data collected by the European Pulsar Timing Array.

"We were inspired by previous efforts in this field, especially by the work of Porayko and her collaborators," Smarra said. "Thanks to the longer duration and the improved precision of our dataset, we were able to put more stringent constraints on the presence of ultralight dark matter in the Milky Way,"

The recent paper by the European Pulsar Timing Array makes different assumptions than those made by other studies carried out in the past. Instead of probing interactions between dark matter and ordinary matter, it assumes that these interactions only occur via gravitational effects.

"We assumed that dark matter interacts with ordinary matter only through gravitational interaction," Smarra explained. "This is a rather robust claim: in fact, the only sure thing we know about dark matter is that it interacts gravitationally. In a few words, dark matter produces potential wells in which pulsar radio beams travel. But the depth of these wells is periodic in time; therefore, the travel time of the radio beams from pulsars to the Earth changes with a distinctive periodicity as well."

New constraints on the presence of ultralight dark matter in the Milky Way, Ingrid Fadelli, Phys.org.

Read more…

Pines' Demon...

Lurking for decades: researchers have discovered Pines' demon, a collection of electrons in a metal that behaves like a massless wave. It is illustrated here as an artist’s impression. (Courtesy: The Grainger College of Engineering/University of Illinois Urbana-Champaign)

Topics: Particle Physics, Quantum Mechanics, Research, Solid-State Physics, Theoretical Physics

For nearly seven decades, a plasmon known as Pines’ demon has remained a purely hypothetical feature of solid-state systems. Massless, neutral, and unable to interact with light, this unusual quasiparticle is reckoned to play a key role in certain superconductors and semimetals. Now, scientists in the US and Japan say they have finally detected it while using specialized electron spectroscopy to study the material strontium ruthenate.

Plasmons were proposed by the physicists David Pines and David Bohm in 1952 as quanta of collective electron density fluctuations in a plasma. They are analogous to phonons, which are quanta of sound, but unlike phonons, their frequency does not tend to zero when they have no momentum. That’s because finite energy is needed to overcome the Coulomb attraction between electrons and ions in a plasma in order to get oscillations going, which entails a finite oscillation frequency (at zero momentum).
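
To put a number on that finite frequency: for an ordinary single-band plasmon, the energy cost at zero momentum is set by the plasma frequency ω_p = √(ne²/ε₀m). A quick sketch with an illustrative electron density, roughly that of copper's conduction band:

```python
import math

# Minimal sketch of the finite zero-momentum energy of an ordinary plasmon:
# the free-electron plasma frequency omega_p = sqrt(n*e^2 / (eps0*m)).
# The electron density is an illustrative, copper-like value.

E = 1.602e-19     # elementary charge, C
EPS0 = 8.854e-12  # vacuum permittivity, F/m
M_E = 9.109e-31   # electron mass, kg
HBAR = 1.055e-34  # reduced Planck constant, J*s

n = 8.5e28  # conduction-electron density, m^-3
omega_p = math.sqrt(n * E**2 / (EPS0 * M_E))
print(f"hbar*omega_p ~ {HBAR * omega_p / E:.1f} eV")
# ~10.8 eV: the energy an ordinary plasmon must pay at zero momentum,
# and exactly the cost the gapless demon avoids.
```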

Today, plasmons are routinely studied in metals and semiconductors, which have conduction electrons that behave like a plasma. Plasmons, phonons, and other quantized fluctuations are called quasiparticles because they share properties with fundamental particles such as photons.

In 1956, Pines hypothesized the existence of a plasmon which, like sound, would require no initial burst of energy. He dubbed the new quasiparticle a demon in honor of James Clerk Maxwell's famous thermodynamic demon. Pines' demon forms when electrons in different bands of a metal move out of phase with one another such that they keep the overall charge static. In effect, a demon is the collective motion of neutral quasiparticles whose charge is screened by electrons from another band.

Demon quasiparticle is detected 67 years after it was first proposed, Edwin Cartlidge, Physics World.

Read more…

Polluting the Pristine...

The sea floor near Australia’s Casey station in Antarctica has been found to have levels of pollution comparable to those in Rio de Janeiro’s harbor. Credit: Torsten Blackwood/AFP via Getty

Topics: Antarctica, Biology, Chemistry, Environment, Physics, Research

Antarctica is often described as one of the most pristine places in the world, but it has a dirty secret. Parts of the sea floor near Australia’s Casey research station are as polluted as the harbor in Rio de Janeiro, Brazil, according to a study published in PLoS ONE in August.

The contamination is likely to be widespread across Antarctica’s older research stations, says study co-author Jonathan Stark, a marine ecologist at the Australian Antarctic Division in Hobart. “These contaminants accumulate over long time frames and don’t just go away,” he says.

Stark and his colleagues found high concentrations of hydrocarbons — compounds found in fuels — and heavy metals, such as lead, copper, and zinc. Many of the samples were also loaded with polychlorinated biphenyls, highly carcinogenic chemical compounds that were common before their international ban in 2001.

When the researchers compared some of the samples with data from the World Harbor Project — an international collaboration that tracks large urban waterways — they found that lead, copper, and zinc levels in some cases were similar to those seen in parts of Sydney Harbour and Rio de Janeiro over the past two decades.

Widespread pollution

The problem of pollution is not unique to Casey station, says Ceisha Poirot, manager of policy, environment, and safety at Antarctica New Zealand in Christchurch. “All national programs are dealing with this issue,” she says. At New Zealand’s Scott Base — which is being redeveloped — contamination left from past fuel spills and poor waste management has been detected in soil and marine sediments. More of this historical pollution will emerge as the climate warms, says Poirot. “Things that were once frozen in the soil are now becoming more mobile,” she says.

Most of Antarctica’s contamination is due to historically poor waste management. In the old days, waste was often just dumped a small distance from research stations, says Terence Palmer, a marine scientist at Texas A&M University–Corpus Christi.

Research stations started to get serious about cleaning up their act in 1991. In that year, an international agreement known as the Protocol on Environmental Protection to the Antarctic Treaty, or the Madrid Protocol, was adopted. This designated Antarctica as a “natural reserve, devoted to peace and science,” and directed nations to monitor environmental impacts related to their activities. But much of the damage had already been done — roughly two-thirds of Antarctic research stations were built before 1991.

Antarctic research stations have polluted a pristine wilderness, Gemma Conroy, Nature.

Read more…

Quantum Vortexes...

A new study by KTH Royal Institute of Technology and Stanford University revises our understanding of quantum vortices in superconductors. Pictured: an artist's depiction of quantum vortices. Credit: Greg Stewart, SLAC National Accelerator Laboratory

Topics: Modern Physics, Quantum Mechanics, Research, Superconductors

Within superconductors, little tornadoes of electrons, known as quantum vortices, can occur, which have important implications in superconducting applications such as quantum sensors. Now a new kind of superconducting vortex has been found, an international team of researchers reports.

Egor Babaev, professor at KTH Royal Institute of Technology in Stockholm, says the study revises the prevailing understanding of how electronic flow can occur in superconductors, which is based on work on quantum vortices that was recognized by the 2003 Nobel Prize in Physics. The researchers at KTH, together with researchers from Stanford University, TD Lee Institute in Shanghai, and AIST in Tsukuba, discovered that the magnetic flux produced by vortices in a superconductor can be divided into a wider range of values than previously thought.

That represents a new insight into the fundamentals of superconductivity and also potentially can be applied in superconducting electronics.

A vortex of magnetic flux happens when an external magnetic field is applied to a superconductor. The magnetic field penetrates the superconductor in the form of quantized magnetic flux tubes, which form vortices. Babaev says that research originally held that quantum vortices each carry a single quantum of magnetic flux as they pass through a superconductor; arbitrary fractions of a flux quantum were not a possibility entertained in earlier theories of superconductivity.
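
For reference, the conventional "one quantum of magnetic flux" is the superconducting flux quantum Φ₀ = h/2e; the new result concerns vortices carrying arbitrary fractions of this value. A two-line check of its size:

```python
# The superconducting flux quantum, Phi_0 = h / (2e), the unit that
# conventional vortices were thought to carry one of apiece.

H_PLANCK = 6.62607015e-34   # Planck constant, J*s (exact)
E_CHARGE = 1.602176634e-19  # elementary charge, C (exact)

phi_0 = H_PLANCK / (2 * E_CHARGE)
print(f"Phi_0 = {phi_0:.4e} Wb")  # ~2.0678e-15 Wb
```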

Using a superconducting quantum interference device (SQUID) at Stanford University, Babaev's co-authors, research scientist Yusuke Iguchi and Professor Kathryn A. Moler, showed at a microscopic level that quantum vortices can exist in a single electronic band. The team was able to create and move around these fractional quantum vortices, Moler says.

"Professor Babaev has been telling me for years that we could see something like this, but I didn't believe it until Dr. Iguchi actually saw it and conducted a number of detailed checks," she says.

Tiny quantum electronic vortexes can circulate in superconductors in ways not seen before, KTH Royal Institute of Technology, Phys.org.

Read more…

An X-ray flash illuminates a molecule. Credit: Raphael Jay

Topics: Chemistry, Climate Change, Green Tech, High Energy Physics, Research, X-rays

The use of short flashes of X-ray light brings scientists one big step closer to developing better catalysts to transform the greenhouse gas methane into a less harmful chemical. The result, published in the journal Science, reveals for the first time how carbon-hydrogen bonds of alkanes break and how the catalyst works in this reaction.

Methane, one of the most potent greenhouse gases, is being released into the atmosphere at an increasing rate by livestock farming and the thawing of permafrost. Transforming methane and longer-chain alkanes into less harmful and, in fact, useful chemicals would remove the associated threats and, in turn, make a huge feedstock available for the chemical industry. However, transforming methane necessitates, as a first step, the breaking of a C-H bond, one of the strongest chemical linkages in nature.
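
To get a feel for how strong that linkage is, consider a hedged back-of-envelope estimate (the ~439 kJ/mol bond dissociation energy is a standard literature value): breaking one C-H bond outright takes an ultraviolet photon's worth of energy.

```python
# Back-of-envelope: methane's C-H bond dissociation energy, ~439 kJ/mol,
# converted to the photon wavelength carrying that much energy per bond.

AVOGADRO = 6.022e23
H_PLANCK = 6.626e-34  # Planck constant, J*s
C_LIGHT = 2.998e8     # speed of light, m/s

bde_per_bond_j = 439e3 / AVOGADRO  # ~7.3e-19 J per bond
wavelength_nm = H_PLANCK * C_LIGHT / bde_per_bond_j * 1e9
print(f"~{wavelength_nm:.0f} nm")  # ~273 nm: ultraviolet, not visible light
```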

Forty years ago, molecular metal catalysts that can easily split C-H bonds were discovered. The only thing found to be necessary was a short flash of visible light to "switch on" the catalyst, and, as if by magic, the strong C-H bonds of alkanes passing nearby are easily broken almost without using any energy. Despite the importance of this so-called C-H activation reaction, it remained unknown for decades how the catalyst performs this function.

The research was led by scientists from Uppsala University in collaboration with the Paul Scherrer Institute in Switzerland, Stockholm University, Hamburg University, and the European XFEL in Germany. For the first time, the scientists were able to directly watch the catalyst at work and reveal how it breaks those C-H bonds.

In two experiments conducted at the Paul Scherrer Institute in Switzerland, the researchers were able to follow the delicate exchange of electrons between a rhodium catalyst and an octane C-H group as it gets broken. Using two of the most powerful sources of X-ray flashes in the world, the X-ray laser SwissFEL and the X-ray synchrotron Swiss Light Source, the reaction could be followed all the way from beginning to end. The measurements revealed everything from the initial light-induced activation of the catalyst, within 400 femtoseconds (0.0000000000004 seconds), to the final C-H bond breaking, after 14 nanoseconds (0.000000014 seconds).

X-rays visualize how one of nature's strongest bonds breaks, Uppsala University, Phys.org.

Read more…

Organic Solar Cells...

Prof. Li Gang invented a novel technique to achieve breakthrough efficiency with organic solar cells. Credit: Hong Kong Polytechnic University

Topics: Chemistry, Green Tech, Materials Science, Photonics, Research, Solar Power

Researchers from The Hong Kong Polytechnic University (PolyU) have achieved a breakthrough power-conversion efficiency (PCE) of 19.31% with organic solar cells (OSCs), also known as polymer solar cells. This remarkable binary OSC efficiency will help enhance these advanced solar energy device applications.

The PCE, a measure of the power generated from a given solar irradiation, is considered a significant benchmark for the performance of photovoltaics (PVs), or solar panels, in power generation. The improved efficiency of more than 19% that was achieved by the PolyU researchers constitutes a record for binary OSCs, which have one donor and one acceptor in the photoactive layer.
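
For context, here is a hedged sketch of how such a PCE figure is computed from standard device parameters; the numbers below are illustrative, not the PolyU team's reported values.

```python
# Standard solar-cell efficiency bookkeeping under AM1.5G illumination
# (100 mW/cm^2): PCE = J_sc * V_oc * FF / P_in. Inputs are illustrative.

def pce_percent(jsc_ma_cm2, voc_v, ff, p_in_mw_cm2=100.0):
    """Power-conversion efficiency, in percent."""
    return jsc_ma_cm2 * voc_v * ff / p_in_mw_cm2 * 100.0

# A short-circuit current of 26.8 mA/cm^2, open-circuit voltage of 0.90 V,
# and fill factor of 0.80 would yield an efficiency near the record value.
print(f"{pce_percent(jsc_ma_cm2=26.8, voc_v=0.90, ff=0.80):.1f}%")  # ~19.3%
```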

Led by Prof. Li Gang, Chair Professor of Energy Conversion Technology and Sir Sze-yuen Chung Endowed Professor in Renewable Energy at PolyU, the research team invented a novel OSC morphology-regulating technique by using 1,3,5-trichlorobenzene as a crystallization regulator. This new technique boosts OSC efficiency and stability.

The team developed a non-monotonic intermediated state manipulation (ISM) strategy to manipulate the bulk-heterojunction (BHJ) OSC morphology and simultaneously optimize the crystallization dynamics and energy loss of non-fullerene OSCs. Unlike the strategy of using traditional solvent additives, which is based on excessive molecular aggregation in films, the ISM strategy promotes the formation of more ordered molecular stacking and favorable molecular aggregation. As a result, the PCE was considerably increased, and the undesirable non-radiative recombination loss was reduced. Notably, non-radiative recombination lowers the light generation efficiency and increases heat loss.

Researchers achieve a record 19.31% efficiency with organic solar cells, Hong Kong Polytechnic University, Tech Xplore.

Read more…

Solar...

The LRESE parabolic dish: the solar reactor converts solar energy to hydrogen with an efficiency of more than 20%, producing around 0.5 kg of "green" hydrogen per day. (Courtesy: LRESE EPFL)

Topics: Applied Physics, Energy, Environment, Research, Solar Power

A new solar-radiation-concentrating device produces “green” hydrogen at a rate of more than 2 kilowatts while maintaining efficiencies above 20%. The pilot-scale device, which is already operational under real sunlight conditions, also produces usable heat and oxygen, and its developers at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland say it could be commercialized in the near future.

The new system sits on a concrete foundation on the EPFL campus and consists of a parabolic dish seven meters in diameter. This dish collects sunlight over a total area of 38.5 m2, concentrates it by a factor of about 1000, and directs it onto a reactor that comprises both photovoltaic and electrolysis components. Energy from the concentrated sunlight generates electron-hole pairs in the photovoltaic material, which the system then separates and transports to the integrated electrolysis system. Here, the energy is used to “split” water pumped through the system at an optimal rate, producing oxygen and hydrogen.

Putting it together at scale

Each of these processes has, of course, been demonstrated before. Indeed, the new EPFL system, which is described in Nature Energy, builds on previous research from 2019, when the EPFL team demonstrated the same concept at a laboratory scale using a high-flux solar simulator. However, the new reactor’s solar-to-hydrogen efficiency and hydrogen production rate of around 0.5 kg per day is unprecedented in large-scale devices. The reactor also produces usable heat at a temperature of 70°C.
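
Those headline figures hang together, as a quick consistency check shows; the daily sun-hours value below is an assumption.

```python
# Hedged consistency check: hydrogen's lower heating value is ~120 MJ/kg,
# so a ~2 kW equivalent-hydrogen output held over a day's useful sunlight
# (assumed ~8.3 hours here) should give roughly 0.5 kg of hydrogen.

LHV_H2_MJ_PER_KG = 120.0

output_kw = 2.0
sun_hours = 8.3  # assumption: hours of strong sunlight per day
energy_mj = output_kw * sun_hours * 3.6  # kWh -> MJ (1 kWh = 3.6 MJ)
print(f"~{energy_mj / LHV_H2_MJ_PER_KG:.2f} kg of H2 per day")  # ~0.50 kg
```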

The versatility of the new system forms a big part of its commercial appeal, says Sophia Haussener, who leads the EPFL's Laboratory of Renewable Energy Science and Engineering (LRESE). "This co-generation system could be used in industrial applications such as metal processing and fertilizer manufacturing," Haussener tells Physics World. "It could also be used to produce oxygen for use in hospitals and hydrogen for fuel cells in electric vehicles, as well as heat in residential settings for heating water. The hydrogen produced could also be converted to electricity after being stored for days or even inter-seasonally."

Concentrated solar reactor generates unprecedented amounts of hydrogen, Isabelle Dumé, Physics World.

Read more…

Balsa Chips...

Modified wood modulates electrical current: researchers at Linköping University and colleagues from the KTH Royal Institute of Technology have developed the world’s first electrical transistor made of wood. (Courtesy: Thor Balkhed)

Topics: Applied Physics, Biomimetics, Electrical Engineering, Materials Science, Research

Researchers in Sweden have built a transistor out of a plank of wood by incorporating electrically conducting polymers throughout the material to retain space for an ionically conductive electrolyte. The new technique makes it possible, in principle, to use wood as a template for numerous electronic components, though the Linköping University team acknowledges that wood-based devices cannot compete with traditional circuitry on speed or size.

Led by Isak Engquist of Linköping’s Laboratory for Organic Electronics, the researchers began by removing the lignin from a plank of balsa wood (chosen because it is grainless and evenly structured) using a NaClO2 chemical and heat treatment. Since lignin typically constitutes 25% of wood, removing it creates considerable scope for incorporating new materials into the structure that remains.

The researchers then placed the delignified wood in a water-based dispersion of an electrically conducting polymer called poly(3,4-ethylenedioxythiophene)–polystyrene sulfonate, or PEDOT:PSS. Once this polymer diffuses into the wood, the previously insulating material becomes a conductor with an electrical conductivity of up to 69 siemens per meter – a phenomenon the researchers attribute to the formation of PEDOT:PSS microstructures inside the 3D wooden "scaffold."

Next, Engquist and colleagues constructed a transistor using one piece of this treated balsa wood as a channel and additional pieces on either side to form a double transistor gate. They also soaked the interface between the gates and channels in an ion-conducting gel. In this arrangement, known as an organic electrochemical transistor (OECT), applying a voltage to the gate(s) triggers an electrochemical reaction in the channel that makes the PEDOT molecules non-conducting and therefore switches the transistor off.

A transistor made from wood, Isabelle Dumé, Physics World

Read more…

Atomic analog: when a beam of light is shone into a water droplet, the light is trapped inside. (Courtesy: Javier Tello Marmolejo)

Topics: Modern Physics, Optics, Quantum Mechanics, Quantum Optics, Research

Light waves confined in an evaporating water droplet provide a useful model of the quantum behavior of atoms, researchers in Sweden and Mexico have discovered. Through a simple experiment, a team led by Javier Marmolejo at the University of Gothenburg has shown how the resonance of light inside droplets of specific sizes can provide robust analogies to atomic energy levels and quantum tunneling.

When light is scattered by a liquid droplet many times larger than its wavelength, some of the light may reflect around the droplet’s internal edge. If the droplet’s circumference is a perfect multiple of the light’s wavelength inside the liquid, the resulting resonance will cause the droplet to flash brightly. This is an optical example of a whispering gallery mode, whereby sound can reflect around a circular room.
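
The resonance condition is simple enough to sketch: the droplet flashes whenever its optical circumference, n·2πr, holds a whole number of wavelengths. A short Python illustration (the laser wavelength here is an assumed value, not necessarily the one used in the experiment):

```python
import math

# Approximate whispering-gallery condition: a droplet of radius r resonates
# when n * 2*pi*r = m * lambda for integer m. Wavelength below is assumed.

N_WATER = 1.33         # refractive index of water
WAVELENGTH_NM = 660.0  # assumed trapping-laser wavelength, nm

def resonant_radius_um(m):
    """Droplet radius (microns) whose rim holds m wavelengths."""
    return m * WAVELENGTH_NM / (2 * math.pi * N_WATER) / 1000.0

for m in (100, 99, 98):
    print(f"m = {m}: r = {resonant_radius_um(m):.3f} um")
# An evaporating droplet sweeps through successive integers m, flashing at
# radius steps of lambda/(2*pi*n), roughly 79 nm apart for these values.
```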

This effect was first described mathematically by the German physicist Gustav Mie in 1908 – yet despite the simplicity of the scenario, the rich array of overlapping resonances it produces can create some incredibly complex patterns, some of which have yet to be studied in detail.

Optical Tweezers

To explore the effect in more detail, Marmolejo and the team devised an experiment in which they confined water droplets using optical tweezers and evaporated the liquid by heating it with a fixed-frequency laser. As the droplets shrank, their circumferences would sometimes equal a multiple of the laser's wavelength. At these "Mie resonances," the droplets flashed brightly.

As they studied this effect, the researchers realized that the flashing droplets are analogous to the quantum behaviors of atoms. In these “optical atoms,” orbiting electrons are replaced with resonating photons. The electrostatic potential that binds electrons to the nucleus is replaced by the droplet’s refractive index, which tends to trap light in the droplet by internal reflection. The quantized energy levels of an atom are represented by the droplet sizes where Mie resonances occur.

Flashing droplets could shed light on atomic physics and quantum tunneling, Sam Jarman, Physics World.

Read more…

AAAS Science Awards...

Topics: Diversity in Science, Education, Research, STEM, Theoretical Physics

The American Association for the Advancement of Science has announced the 2023 winners of eight longstanding awards that recognize scientists, engineers, innovators, and public servants for their contributions to science and society.

The awards honor individuals and teams for a range of achievements, from advancing science diplomacy and engaging the public in order to boost scientific understanding to mentoring the next generation of scientists and engineers.

The 2023 winners were first announced on social media between Feb. 23 and Feb. 28; see the hashtag #AAASAward to learn more. The winners were also recognized at the 2023 AAAS Annual Meeting, held in Washington, D.C., March 2-5. The winning individuals and teams were honored with tribute videos and received commemorative plaques during several plenary sessions.

Six of the awards include a prize of $5,000, while the AAAS David and Betty Hamburg Award for Science Diplomacy awards the winning individual or team $10,000, and the AAAS Newcomb Cleveland Prize awards the winning individual or team $25,000.

Learn more about the awards’ history, criteria, and selection processes via the AAAS awards page, and read on to learn more about the individuals and teams who earned the 2023 awards.

*****

Sekazi Mtingwa is the recipient of the 2023 AAAS Philip Hauge Abelson Prize, which recognizes someone who has made significant contributions to the scientific community — whether through research, policy, or civil service — in the United States. The awardee can be a public servant, scientist, or individual in any field who has made sustained, exceptional contributions and other notable services to the scientific community. Mtingwa exemplifies a commitment to service and dedication to the scientific community, research workforce, and society. His contributions have shaped research, public policy, and the next generation of scientific leaders, according to the award’s selection committee.

As a theoretical physicist, Mtingwa pioneered work on intrabeam scattering that is foundational to particle accelerator research. Today a principal partner at Triangle Science, Education and Economic Development, where he consults on STEM education and economic development, Mtingwa has been affiliated during his scientific career with North Carolina A&T State University, Harvard University, the Massachusetts Institute of Technology, and several national laboratories.

His contributions to the scientific community have included a focus on diversity, equity, and inclusion in physics. He co-founded the National Society of Black Physicists, which today is a home for more than 500 Black physicists and students. His work has also contributed to rejuvenating university nuclear science and engineering programs and paving the way for the next generation of nuclear scientists and engineers. Mtingwa served as the chair of a 2008 American Physical Society study on the readiness of the U.S. nuclear workforce, the results of which played a key role in the U.S. Department of Energy allocating 20% of its nuclear fuel cycle R&D budget to university programs.

“I have devoted myself to being an apostle for science for those both at home and abroad who face limited research and training opportunities,” said Mtingwa. “Receiving the highly prestigious Philip Hauge Abelson Prize affirms that I have been successful in this mission. Moreover, it provides me with the armor to press onward to even greater contributions.”

AAAS Recognizes 2023 Award Winners for Contributions to Science and Society, Andrea Korte

Read more…

Where No One Has Gone Before...

Images of six candidate massive galaxies, seen 500-800 million years after the Big Bang. One of the sources (bottom left) could contain as many stars as our present-day Milky Way but is 30 times more compact. Credit: NASA, ESA, CSA, I. Labbe (Swinburne University of Technology); Image processing: G. Brammer (Niels Bohr Institute’s Cosmic Dawn Center at the University of Copenhagen)

Topics: Astronomy, Astrophysics, Cosmology, Research

Nobody expected them. They were not supposed to be there. And now, nobody can explain how they formed.

Galaxies nearly as massive as the Milky Way and full of mature red stars seem to be dispersed in deep-field images obtained by the James Webb Space Telescope (Webb or JWST) during its early observation campaign. They are giving astronomers a headache. 

These galaxies, described in a new study based on Webb's first data release, are so far away that they appear only as tiny reddish dots to the powerful telescope. By analyzing the light emitted by these galaxies, astronomers established that they were viewing them in our universe's infancy, only 500 million to 700 million years after the Big Bang.

Such early galaxies are not in themselves surprising. Astronomers expected that the first star clusters sprang up shortly after the universe moved out of the so-called dark ages — the first 400 million years of its existence, when only a thick fog of hydrogen atoms permeated space.

But the galaxies found in the Webb images appeared shockingly big, and the stars in them were too old. The new findings are in conflict with existing ideas of how the universe looked and evolved in its early years and don't match earlier observations made by Webb's less powerful predecessor, the Hubble Space Telescope.

JWST Discovers Enormous Distant Galaxies That Should Not Exist, Tereza Pultarova, Scientific American/Space.com.

Read more…

Small Steps, Large Changes...

A vertical shock tube at Los Alamos National Laboratory is used for turbulence studies. Sulfur hexafluoride is injected at the top of the 5.3-meter tube and allowed to mix with air. The waste is ejected into the environment through the blue hose at the tube tower's lower left; in fiscal year 2021, such emissions made up some 16% of the lab's total greenhouse gas emissions. The inset shows a snapshot of the mixing after a shock has crossed the gas interface; the darker gas is SF6, and the lighter is air. The intensities yield density values.

Topics: Civilization, Climate Change, Global Warming, Research

Reducing air travel, improving energy efficiency in infrastructure, and installing solar panels are among the obvious actions that individual researchers and their institutions can implement to reduce their carbon footprint. But they can take many other small and large steps, too, from reducing the use of single-use plastics and other consumables and turning off unused instruments to exploiting waste heat and siting computing facilities powered by renewable energy. On a systemic level, measures can encourage behaviors to reduce carbon emissions; for example, valuing in-person invited job talks and remote ones equally could lead to less air travel by scientists.

So far, the steps that scientists are taking to reduce their carbon footprint are largely grassroots, notes Hannah Johnson, a technician in the imaging group at the Princess Máxima Center for Pediatric Oncology in Utrecht and a member of Green Labs Netherlands, a volunteer organization that promotes sustainable science practices. The same goes for the time and effort they put in for the cause. One of the challenges, she says, is to get top-down support from institutions, funding agencies, and other national and international scientific bodies.

At some point, governments are likely to make laws that support climate sustainability, says Astrid Eichhorn, a professor at the University of Southern Denmark whose research is in quantum gravity and who is active on the European Federation of Academies of Sciences and Humanities committee for climate sustainability. “We are in a situation to be proactive and change in ways that do not compromise the quality of our research or our collaborations,” she says. “We should take that opportunity now and not wait for external regulations.”

If humanity manages to limit worldwide emissions to 300 gigatons of carbon dioxide equivalent (CO2e), there is an 83% chance of not exceeding the 1.5 °C temperature rise above preindustrial levels set in the 2015 Paris Agreement, according to a 2021 Intergovernmental Panel on Climate Change special report. That emissions cap translates to a budget of 1.2 tons of CO2e per person annually through 2050. Estimates for the average emissions by researchers across scientific fields are much higher and range widely, in part because of differing and incomplete accounting approaches, says Eichhorn. She cites values from 7 to 18 tons a year for European scientists.
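
The per-person figure follows from simple division; here is the arithmetic, with round-number assumptions for population and year count.

```python
# Reproducing the per-person carbon budget quoted above. Population and
# year count are round-number assumptions.

budget_t_co2e = 300e9  # remaining global budget, tonnes CO2e
population = 8e9       # assumed world population
years = 30             # roughly 2021 through 2050

print(f"~{budget_t_co2e / (population * years):.2f} t CO2e per person per year")
# ~1.25 t, matching the ~1.2-ton annual budget cited in the article.
```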

Scientists take steps in the lab toward climate sustainability, Toni Feder, Physics Today.

Read more…