applied physics (60)

Bitcoin and Gaia...


"What are the environmental impacts of cryptocurrency?" Written by Paul Kim; edited by Jasmine Suarez Mar 17, 2022, 5:21 PM EDT, Business Insider.

 Image: Ethereum, the second biggest cryptocurrency on the market, plans on changing to proof of stake mining in the future. Rachel Mendelson/Insider

 

Topics: Applied Physics, Computer Science, Cryptography, Economics, Environment, Star Trek, Thermodynamics

In what is now “old school Internet” (or web surfing for fogies), I will get a friend request on Facebook/Meta from someone who is in cryptocurrency. I quote the first paragraph of what I refer to as my “public service announcement”:

I am not INTERESTED in crypto. As someone who worked with cryptography as a matter of national security, holding a TS/SCI clearance, when you start your message with “let me explain to YOU how crypto works,” expect to get blocked.

Invariably, I still get them, which makes me wonder whether they read the PSA or think they will be "the one" to sign me. News flash, pilgrim... I now have another pertinent reason to ignore your blockchain solicitations; actually, several good reasons.

Every time we turn on a light in our homes, there is a thermal budget that we are being charged for (that's how Duke Power makes its money in North Carolina and the Pedernales Electric Cooperative in Texas). Bitcoin/blockchain, I think, caught the imagination because it seemed like a "Federation Credit" from Star Trek, where no one fully explains why a society that is "post-scarcity" somehow still needs some type of currency in utopia. It's kind of like magic carpets: you go with the bit for the sake of the story - warp drive, Heisenberg compensators, Federation credits. What matters is the story, not the physics, so long as you are thoroughly entertained after the denouement.

You might not be extracting anything from the planet directly, but Bitcoin mining has a massive impact on the planet’s environment.

Mining resources from our planet can take a devastating toll on the environment, both local and global. Even beyond this, using the resource could cause disastrous effects on our planet, and dependence on a single resource can wreak havoc on a country’s economy. Yet, many of these resources are needed for our daily lives -- sometimes as a luxury, sometimes as a necessity. Any responsible country or company should always take pause to consider what impact mining of any kind can have on the planet.

It turns out that these days, one type of mining might be the worst for Earth’s environment: bitcoins. Yes, the “mining” of virtual currency makes its mark on our planet. The unequal distribution of Bitcoin mining across the globe means that some countries are making a much larger dent in the planet’s climate and environment than others ... all for a “resource” that is far from necessary for our society.

Bitcoin mining uses a lot of computing power to solve the cryptographic puzzles that lie at the heart of the industry. As of today (October 30, 2023), each Bitcoin is worth over $34,000, and with the multitude of other cryptocoins out there, using computers to unlock more can be a profitable endeavor. Almost half a trillion dollars of the global economy runs on these “virtual currencies.”
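Those "cryptographic puzzles" are proof-of-work searches: a miner repeatedly hashes a block of data with a changing nonce until the hash falls below a difficulty target, and every failed attempt is electricity spent. A toy sketch in Python (the block data and 16-bit difficulty are illustrative only; the real Bitcoin network's target is tens of bits harder):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> tuple[int, str]:
    """Toy proof of work: find a nonce whose double-SHA-256 hash of
    (block_data + nonce) falls below a difficulty target."""
    target = 1 << (256 - difficulty_bits)  # hashes below this value win
    nonce = 0
    while True:
        header = block_data + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1  # every failed guess is dissipated electricity

# 16 difficulty bits => roughly 65,000 expected hashes for this toy example.
nonce, digest = mine(b"toy block", difficulty_bits=16)
```

Each additional difficulty bit doubles the expected number of hashes, which is why the network's energy draw scales with competition among miners rather than with the number of transactions processed.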

Worst Kind of Mining for the Environment? It Might Be Bitcoin. Erik Klemetti, Discover Magazine

 

Read more…

In Medias Res...


Image source: Link below

Topics: Applied Physics, Astrophysics, Computer Modeling, Einstein, High Energy Physics, Particle Physics, Theoretical Physics

In the search for new physics, a new kind of scientist is bridging the gap between theory and experiment.

Traditionally, many physicists have divided themselves into two tussling camps: the theorists and the experimentalists. Albert Einstein theorized general relativity, and Arthur Eddington observed it in action as “bending” starlight; Murray Gell-Mann and George Zweig thought up the idea of quarks, and Henry Kendall, Richard Taylor, Jerome Friedman and their teams detected them.

In particle physics especially, the divide is stark. Consider the Higgs boson, proposed in 1964 and discovered in 2012. Since then, physicists have sought to scrutinize its properties, but theorists and experimentalists don’t share Higgs data directly, and they’ve spent years arguing over what to share and how to format it. (There’s now some consensus, although the going was rough.)

But there’s a missing player in this dichotomy. Who, exactly, is facilitating the flow of data between theory and experiment?

Traditionally, the experimentalists filled this role, running the machines and looking at the data — but in high-energy physics and many other subfields, there’s too much data for this to be feasible. Researchers can’t just eyeball a few events in the accelerator and come to conclusions; at the Large Hadron Collider, for instance, about a billion particle collisions happen per second, which sensors detect, process, and store in vast computing systems. And it’s not just quantity. All this data is outrageously complex, made more so by simulation.

In other words, these experiments produce more data than anyone could possibly analyze with traditional tools. And those tools are imperfect anyway, requiring researchers to boil down many complex events into just a handful of attributes — say, the number of photons at a given energy. A lot of science gets left out.
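The "boiling down" described above is, in effect, feature extraction: each raw event is reduced to a few summary attributes before any physics analysis happens. A hypothetical illustration in Python (the particle records and the 25 GeV threshold are invented for the example, not a real trigger):

```python
from dataclasses import dataclass

@dataclass
class Particle:
    kind: str      # e.g., "photon", "muon"
    energy: float  # GeV

def summarize(event: list[Particle]) -> dict:
    """Boil a complex event down to a handful of attributes --
    the traditional, lossy approach described above."""
    return {
        "n_photons_over_25GeV": sum(
            1 for p in event if p.kind == "photon" and p.energy > 25.0
        ),
        "total_energy_GeV": sum(p.energy for p in event),
    }

event = [Particle("photon", 62.0), Particle("photon", 12.5), Particle("muon", 44.0)]
summary = summarize(event)
# Angles, timing, and correlations are discarded: the science that gets left out.
```

The data-physicist movement the article describes aims to analyze the full event record instead of only summaries like these.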

In response to this conundrum, a growing movement in high-energy physics and other subfields, like nuclear physics and astrophysics, seeks to analyze data in its full complexity — to let the data speak for itself. Experts in this area are using cutting-edge data science tools to decide which data to keep and which to discard and to sniff out subtle patterns.


Opinion: The Rise of the Data Physicist, Benjamin Nachman, APS News

Read more…

Tunnel Falls...


Chip off the old block: Intel’s Tunnel Falls chip is based on silicon spin qubits, which are about a million times smaller than other qubit types. (Courtesy: Intel Corporation)

Topics: Applied Physics, Chemistry, Electrical Engineering, Quantum Computer, Quantum Mechanics

Intel – the world’s biggest computer-chip maker – has released its newest quantum chip and has begun shipping it to quantum scientists and engineers to use in their research. Dubbed Tunnel Falls, the chip contains a 12-qubit array and is based on silicon spin-qubit technology.

The distribution of the quantum chip to the quantum community is part of Intel’s plan to let researchers gain hands-on experience with the technology while at the same time enabling new quantum research.

The first quantum labs to get access to the chip include the University of Maryland, Sandia National Laboratories, the University of Rochester, and the University of Wisconsin-Madison.

The Tunnel Falls chip was fabricated on 300 mm silicon wafers in Intel’s “D1” transistor fabrication facility in Oregon, which can carry out extreme ultraviolet lithography (EUV) and gate and contact processing techniques.

Intel releases 12-qubit silicon quantum chip to the quantum community, Martijn Boerkamp, Physics World.

Read more…

Beyond Attogram Imaging...


When X-rays (blue color) illuminate an iron atom (red ball at the center of the molecule), core-level electrons are excited. X-ray excited electrons are then tunneled to the detector tip (gray) via overlapping atomic/molecular orbitals, which provide elemental and chemical information about the iron atom. Credit: Saw-Wai Hla

Topics: Applied Physics, Instrumentation, Materials Science, Nanomaterials, Quantum Mechanics

A team of scientists from Ohio University, Argonne National Laboratory, the University of Illinois-Chicago, and others, led by Saw Wai Hla, Professor of Physics at Ohio University and scientist at Argonne National Laboratory, has taken the world's first X-ray signal (or signature) of just one atom. This groundbreaking achievement could revolutionize the way scientists detect materials.

Since their discovery by Roentgen in 1895, X-rays have been used everywhere, from medical examinations to security screenings in airports. Even Curiosity, NASA's Mars rover, is equipped with an X-ray device to examine the material composition of the rocks on Mars. An important use of X-rays in science is identifying the type of materials in a sample. Over the years, the quantity of material required for X-ray detection has been greatly reduced thanks to the development of synchrotron X-ray sources and new instruments. To date, the smallest sample one can X-ray is an attogram, which is about 10,000 atoms or more, because the X-ray signal produced by a single atom is extremely weak and conventional X-ray detectors cannot pick it up. According to Hla, X-raying just one atom has been a long-standing dream of scientists, one now realized by the research team he leads.

"Atoms can be routinely imaged with scanning probe microscopes, but without X-rays, one cannot tell what they are made of. We can now detect exactly the type of a particular atom, one atom-at-a-time, and can simultaneously measure its chemical state," explained Hla, who is also the director of the Nanoscale and Quantum Phenomena Institute at Ohio University. "Once we are able to do that, we can trace the materials down to the ultimate limit of just one atom. This will have a great impact on environmental and medical sciences and maybe even find a cure that can have a huge impact on humankind. This discovery will transform the world."

Their paper, published in the scientific journal Nature on May 31, 2023, and gracing the cover of the print version on June 1, 2023, details how Hla and several other physicists and chemists, including Ph.D. students at OHIO, used a purpose-built synchrotron X-ray instrument at the XTIP beamline of the Advanced Photon Source and the Center for Nanoscale Materials at Argonne National Laboratory.

Scientists report the world's first X-ray of a single atom, Ohio University, Phys.org.

Read more…

Straining Moore...


Topics: Applied Physics, Chemistry, Computer Science, Electrical Engineering, Materials Science, Nanotechnology, Quantum Mechanics, Semiconductor Technology

Gordon Moore, the co-founder of Intel who died earlier this year, is famous for forecasting a continuous rise in the density of transistors that we can pack onto semiconductor chips. James McKenzie looks at how “Moore’s law” is still going strong after almost six decades but warns that further progress is becoming harder and ever more expensive to sustain.

When the Taiwan Semiconductor Manufacturing Company (TSMC) announced last year that it was planning to build a new factory to produce integrated circuits, it wasn’t just the eye-watering $33bn price tag that caught my eye. What also struck me is that the plant, set to open in 2025 in the city of Hsinchu, will make the world’s first “2-nanometer” chips. Smaller, faster, and up to 30% more efficient than any microchip that has come before, TSMC’s chips will be sold to the likes of Apple – the company’s biggest customer – powering everything from smartphones to laptops.

But our ability to build such tiny, powerful chips shouldn’t surprise us. After all, the engineer Gordon Moore – who died on 24 March this year, aged 94 – famously predicted in 1965 that the number of transistors we can squeeze onto an integrated circuit ought to double yearly. Writing for the magazine Electronics (38 114), Moore reckoned that by 1975 it should be possible to fit a quarter of a million components onto a single silicon chip with an area of one square inch (6.25 cm2).

Moore’s prediction, which he later said was simply a “wild extrapolation”, held true, although, in 1975, he revised his forecast, predicting that chip densities would double every two years rather than every year. What thereafter became known as “Moore’s law” proved amazingly accurate, as the ability to pack ever more transistors into a tiny space underpinned the almost non-stop growth of the consumer electronics industry. In truth, it was never an established scientific “law” but more a description of how things had developed in the past as well as a roadmap that the semiconductor industry imposed on itself, driving future development.
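Moore's revised forecast compounds simply: N(t) = N0 * 2^(t/T), with a doubling period T of two years. A quick sanity check in Python (the 1975 starting figure of ~65,000 transistors per chip is a round illustrative number):

```python
def moores_law(n0: float, years: float, doubling_period: float = 2.0) -> float:
    """Projected transistor count after `years`, doubling every `doubling_period`."""
    return n0 * 2 ** (years / doubling_period)

# Fifty years at a strict two-year doubling is 25 doublings,
# a factor of about 33.5 million:
projected = moores_law(65_000, years=50)
```

That compounding factor, sustained for half a century, is exactly why the article notes that further progress is becoming harder and ever more expensive.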

Moore's law: further progress will push hard on the boundaries of physics and economics, James McKenzie, Physics World

Read more…

As The Worm Turns...


Schematic diagram of the worm-inspired robot. Credit: Jin et al.

Topics: Applied Physics, Biomimetics, Instrumentation, Mechanical Engineering, Robotics

Bio-inspired robots, robotic systems that emulate the appearance, movements, and/or functions of specific biological systems, could help to tackle real-world problems more efficiently and reliably. Over the past two decades, roboticists have introduced a growing number of these robots, some of which draw inspiration from fruit flies, worms, and other small organisms.

Researchers at China University of Petroleum (East China) recently developed a worm-inspired robot with a body structure based on the Japanese paper-folding art of origami. This robotic system, introduced in Bioinspiration & Biomimetics, is driven by actuators that respond to magnetic forces, compressing and bending its body to replicate the movements of worms.

"Soft robotics is a promising field that our research group has been paying a lot of attention to," Jianlin Liu, one of the researchers who developed the robot, told Tech Xplore. "While reviewing the existing research literature in the field, we found that bionic robots, such as worm-inspired robots, were a topic worth exploring. We thus set out to fabricate a worm-like origami robot based on the existing literature. After designing and reviewing several different structures, we chose to focus on a specific knitting pattern for our robot."

A worm-inspired robot based on an origami structure and magnetic actuators, Ingrid Fadelli, Tech Xplore

Read more…

Solar...


The LRESE parabolic dish: the solar reactor converts solar energy to hydrogen with an efficiency of more than 20%, producing around 0.5 kg of "green" hydrogen per day. (Courtesy: LRESE EPFL)

Topics: Applied Physics, Energy, Environment, Research, Solar Power

A new solar-radiation-concentrating device produces “green” hydrogen at a rate of more than 2 kilowatts while maintaining efficiencies above 20%. The pilot-scale device, which is already operational under real sunlight conditions, also produces usable heat and oxygen, and its developers at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland say it could be commercialized in the near future.

The new system sits on a concrete foundation on the EPFL campus and consists of a parabolic dish seven meters in diameter. This dish collects sunlight over a total area of 38.5 m2, concentrates it by a factor of about 1000, and directs it onto a reactor that comprises both photovoltaic and electrolysis components. Energy from the concentrated sunlight generates electron-hole pairs in the photovoltaic material, which the system then separates and transports to the integrated electrolysis system. Here, the energy is used to “split” water pumped through the system at an optimal rate, producing oxygen and hydrogen.

Putting it together at scale

Each of these processes has, of course, been demonstrated before. Indeed, the new EPFL system, which is described in Nature Energy, builds on previous research from 2019, when the EPFL team demonstrated the same concept at a laboratory scale using a high-flux solar simulator. However, the new reactor’s solar-to-hydrogen efficiency and hydrogen production rate of around 0.5 kg per day is unprecedented in large-scale devices. The reactor also produces usable heat at a temperature of 70°C.
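The reported figures can be cross-checked with back-of-the-envelope arithmetic in Python (the hydrogen heating value of 39.4 kWh/kg and the standard 1 kW/m² irradiance are assumed reference values, not from the article):

```python
HHV_H2 = 39.4        # kWh per kg of hydrogen (higher heating value; assumed)
DISH_AREA = 38.5     # m^2 of collection area, from the article
EFFICIENCY = 0.20    # solar-to-hydrogen efficiency, from the article
DAILY_H2 = 0.5       # kg of hydrogen per day, from the article

energy_in_h2 = DAILY_H2 * HHV_H2          # ~19.7 kWh of chemical energy per day
solar_needed = energy_in_h2 / EFFICIENCY  # ~98.5 kWh of sunlight per day
full_sun_hours = solar_needed / DISH_AREA # ~2.6 h/day at 1 kW/m^2
```

About 2.6 equivalent full-sun hours per day on the 38.5 m² dish is a plausible daily average for Switzerland, so the reported production rate and efficiency are self-consistent.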

The versatility of the new system forms a big part of its commercial appeal, says Sophia Haussener, who leads the EPFL’s Laboratory of Renewable Energy Science and Engineering (LRESE). “This co-generation system could be used in industrial applications such as metal processing and fertilizer manufacturing,” Haussener tells Physics World. “It could also be used to produce oxygen for use in hospitals and hydrogen for fuel cells in electric vehicles, as well as heat in residential settings for heating water. The hydrogen produced could also be converted to electricity after being stored between days or even inter-seasonally.”

Concentrated solar reactor generates unprecedented amounts of hydrogen, Isabelle Dumé, Physics World.

Read more…

Balsa Chips...


Modified wood modulates electrical current: researchers at Linköping University and colleagues from the KTH Royal Institute of Technology have developed the world’s first electrical transistor made of wood. (Courtesy: Thor Balkhed)

Topics: Applied Physics, Biomimetics, Electrical Engineering, Materials Science, Research

Researchers in Sweden have built a transistor out of a plank of wood by incorporating electrically conducting polymers throughout the material to retain space for an ionically conductive electrolyte. The new technique makes it possible, in principle, to use wood as a template for numerous electronic components, though the Linköping University team acknowledges that wood-based devices cannot compete with traditional circuitry on speed or size.

Led by Isak Engquist of Linköping’s Laboratory for Organic Electronics, the researchers began by removing the lignin from a plank of balsa wood (chosen because it is grainless and evenly structured) using a NaClO2 chemical and heat treatment. Since lignin typically constitutes 25% of wood, removing it creates considerable scope for incorporating new materials into the structure that remains.

The researchers then placed the delignified wood in a water-based dispersion of an electrically conducting polymer called poly(3,4-ethylenedioxythiophene)–polystyrene sulfonate, or PEDOT:PSS. Once this polymer diffuses into the wood, the previously insulating material becomes a conductor with an electrical conductivity of up to 69 siemens per meter – a phenomenon the researchers attribute to the formation of PEDOT:PSS microstructures inside the 3D wooden “scaffold.”

Next, Engquist and colleagues constructed a transistor using one piece of this treated balsa wood as a channel and additional pieces on either side to form a double transistor gate. They also soaked the interface between the gates and channels in an ion-conducting gel. In this arrangement, known as an organic electrochemical transistor (OECT), applying a voltage to the gate(s) triggers an electrochemical reaction in the channel that makes the PEDOT molecules non-conducting and therefore switches the transistor off.

A transistor made from wood, Isabelle Dumé, Physics World

Read more…


Fractals are a never-ending pattern that you can zoom in on, and the image doesn’t change. Fractals can occur in two dimensions, like frost on a window, or in three dimensions, like tree limbs. A recent discovery from Purdue University researchers has established that superconducting images, seen above in red and blue, are actually fractals that fill a three-dimensional space and are disorder driven rather than driven by quantum fluctuations as expected. Frost and tree images by Adobe. Superconducting image (center) from "Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x" in Nature Communications. Credit: Nature Communications (2023). DOI: 10.1038/s41467-023-38249-3

Topics: Applied Physics, Civilization, Computer Modeling, Condensed Matter Physics, Materials Science, Solid-State Physics, Superconductors

The world's energy demands are reaching a critical point, and powering the technological age has caused problems globally. It is increasingly important to create superconductors that can operate at ambient pressure and temperature, which would go a long way toward solving the energy crisis.

Advancements with superconductivity hinge on advances in quantum materials. When electrons inside quantum materials undergo a phase transition, the electrons can form intricate patterns, such as fractals. A fractal is a never-ending pattern. When zooming in on a fractal, the image looks the same. Commonly seen fractals can be a tree or frost on a windowpane in winter. Fractals can form in two dimensions, like the frost on a window, or in three-dimensional space, like the limbs of a tree.
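The standard way to quantify how completely a pattern "fills" a space is its box-counting dimension: cover the pattern with boxes of shrinking side eps and fit N(eps) ∝ eps^(-D). A minimal sketch in Python, here in 2D for simplicity, using a plain diagonal line (dimension ≈ 1) as a sanity check; fractals like those in the study would yield a non-integer D between 2 and 3 in the full 3D analysis:

```python
import math

def box_count_dimension(points, sizes):
    """Estimate the box-counting dimension of a 2D point set by fitting
    log N(eps) against log(1/eps) with least squares."""
    logs = []
    for eps in sizes:
        boxes = {(int(x / eps), int(y / eps)) for x, y in points}
        logs.append((math.log(1 / eps), math.log(len(boxes))))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    return sum((x - mx) * (y - my) for x, y in logs) / sum(
        (x - mx) ** 2 for x, _ in logs
    )

# A straight line segment should come out close to dimension 1:
line = [(i / 10_000, i / 10_000) for i in range(10_000)]
dim = box_count_dimension(line, sizes=[0.1, 0.05, 0.02, 0.01])
```

The same counting idea, applied to maps of electron locations at many length scales, is how one decides whether the observed patterns are genuinely fractal.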

Dr. Erica Carlson, a 150th Anniversary Professor of Physics and Astronomy at Purdue University, led a team that developed theoretical techniques for characterizing the fractal shapes that these electrons make in order to uncover the underlying physics driving the patterns.

Carlson, a theoretical physicist, has evaluated high-resolution images of the locations of electrons in the superconductor Bi2-xPbzSr2-yLayCuO6+x (BSCO) and determined that these images are indeed fractal and discovered that they extend into the full three-dimensional space occupied by the material, like a tree filling space.

What were once thought to be random dispersions within the fractal images are in fact purposeful and, shockingly, driven not by an underlying quantum phase transition, as expected, but by a disorder-driven phase transition.

Carlson led a collaborative team of researchers across multiple institutions and published their findings, titled "Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x," in Nature Communications.

The team includes Purdue scientists and partner institutions. From Purdue, the team includes Carlson, Dr. Forrest Simmons, a recent Ph.D. student, and former Ph.D. students Dr. Shuo Liu and Dr. Benjamin Phillabaum. The Purdue team completed their work within the Purdue Quantum Science and Engineering Institute (PQSEI). The team from partner institutions includes Dr. Jennifer Hoffman, Dr. Can-Li Song, Dr. Elizabeth Main of Harvard University, Dr. Karin Dahmen of the University of Illinois at Urbana-Champaign, and Dr. Eric Hudson of Pennsylvania State University.

Researchers discover superconductive images are actually 3D and disorder-driven fractals, Cheryl Pierce, Purdue University, Phys.org.

Read more…

Electrical Wound Care...


New research from Chalmers University of Technology, Sweden, and the University of Freiburg, Germany, shows that wounds on cultured skin cells heal three times faster when stimulated with electric current. The project was recently granted more funding so the research can get one step closer to the market and the benefit of patients. Credit: Science Brush, Hassan A. Tahin

Topics: Applied Physics, Biotechnology, Medicine

Chronic wounds are a major health problem for diabetic patients and the elderly—in extreme cases, they can even lead to amputation. Using electric stimulation, researchers in a project at Chalmers University of Technology, Sweden, and the University of Freiburg, Germany, have developed a method that speeds up healing, making wounds heal three times faster.

There is an old Swedish saying that one should never neglect a small wound or a friend in need. For most people, a small wound does not lead to any serious complications, but many common diagnoses make wound healing far more difficult. People with diabetes, spinal injuries, or poor blood circulation have impaired wound-healing ability. This means a greater risk of infection and chronic wounds—which can lead to serious consequences like amputation in the long run.

Now a group of researchers at Chalmers and the University of Freiburg have developed a method using electric stimulation to speed up the healing process. The study, "Bioelectronic microfluidic wound healing: a platform for investigating direct current stimulation of injured cell collectives," was published in the Lab on a Chip journal.

"Chronic wounds are a huge societal problem that we don't hear much about. Our discovery of a method that may heal wounds up to three times faster can be a game changer for diabetic and elderly people, among others, who often suffer greatly from wounds that won't heal," says Maria Asplund, Associate Professor of Bioelectronics at the Chalmers University of Technology and head of research on the project.

How electricity can heal wounds three times faster, The Chalmers University of Technology

Read more…

Strange Metals II...


Credit: CC0 Public Domain

Topics: Applied Physics, Chemistry, Materials Science, Metamaterials, Quantum Mechanics

The behavior of so-called "strange metals" has long puzzled scientists—but a group of researchers at the University of Toronto may be one step closer to understanding these materials.

Electrons are discrete, subatomic particles that flow through wires like molecules of water flowing through a pipe. The flow is known as electricity, and it is harnessed to power and control everything from lightbulbs to the Large Hadron Collider.

In quantum matter, by contrast, electrons don't behave as they do in normal materials. Their interactions are much stronger, and the four fundamental properties of electrons—charge, spin, orbit, and lattice—become intertwined, resulting in complex states of matter.

"In quantum matter, electrons shed their particle-like character and exhibit strange collective behavior," says condensed matter physicist Arun Paramekanti, a professor in the U of T's Department of Physics in the Faculty of Arts & Science. "These materials are known as non-Fermi liquids, in which the simple rules break down."

Now, three researchers from the university's Department of Physics and Centre for Quantum Information & Quantum Control (CQIQC) have developed a theoretical model describing the interactions between subatomic particles in non-Fermi liquids. The framework expands on existing models and will help researchers understand the behavior of these "strange metals."

Their research was published in the journal Proceedings of the National Academy of Sciences (PNAS). The lead author is physics Ph.D. student Andrew Hardy, with co-authors Paramekanti and post-doctoral researcher Arijit Haldar.

"We know that the flow of a complex fluid like blood through arteries is much harder to understand than water through pipes," says Paramekanti. "Similarly, the flow of electrons in non-Fermi liquids is much harder to study than that in simple metals."

Hardy adds, "What we've done is construct a model, a tool, to study non-Fermi liquid behavior. And specifically, to deal with what happens when there is symmetry breaking, when there is a phase transition into a new type of system."

"Symmetry breaking" is the term used to describe a fundamental process found in all of nature. Symmetry breaks when a system—whether a droplet of water or the entire universe—loses its symmetry and homogeneity and becomes more complex.

Researchers develop new insight into the enigmatic realm of 'strange metals', Chris Sasaki, University of Toronto, Phys.org

Read more…

Green Homing...


Divine light: The Dean of Gloucester Cathedral, Stephen Lake, blesses the cathedral’s solar panels after the solar-energy firm MyPower installed them in November 2016. The array of PV panels generates just over 25% of the building’s electricity. (Courtesy: MyPower)

Topics: Alternate Energy, Applied Physics, Battery, Chemistry, Economics, Solar Power

With energy bills on the rise, plenty of people are interested in ditching the fossil fuels currently used to heat most UK homes. The question is how to make it happen, as Margaret Harris explains.

Deep beneath the flagstones of the medieval Bath Abbey church, a modern marvel with an ancient twist is silently making its presence felt. Completed in March 2021, the abbey’s heating system combines underfloor pipes with heat exchangers located seven meters below the surface. There, a drain built nearly 2000 years ago carries 1.1 million liters of 40 °C water every day from a natural hot spring into a complex of ancient Roman baths.

By tapping into this flow of warm water, the system provides enough energy to heat not only the abbey but also an adjacent row of Georgian cottages used for offices. No wonder the abbey’s rector praised it as “a sustainable solution for heating our beautiful historic church.”

But that wasn’t all. Once efforts to decarbonize the abbey’s heating were underway, officials in the £19.4m Bath Abbey Footprint project turned their attention to the building’s electricity. Like most churches, the abbey runs from east to west, giving its roof an extensive south-facing aspect. At the UK’s northerly latitudes, such roofs are bathed in sunlight for much of the day, making them ideal for solar photovoltaic (PV) panels. Gloucester Cathedral – an hour’s drive north of Bath – has already taken advantage of this favorable orientation, becoming – in 2016 – the UK’s first major ancient cathedral to have solar panels installed on its roof.

To find out if a similar set-up might be suitable at Bath Abbey, the Footprint project worked with Ph.D. students in the University of Bath-led Centre for Doctoral Training (CDT) in New and Sustainable Photovoltaics. In a feasibility study published in Energy Science & Engineering (2022 10 892), the students calculated that a well-designed array of PV panels could supply 35.7% of the abbey’s electricity, plus 4.6% that could be sold back to the grid on days when a surplus was generated. The array would pay for itself within about 13 years and generate a total profit of £139,000 ± £12,000 over its 25-year lifetime.

Home, green home: scientific solutions for cutting carbon and (maybe) saving money, Margaret Harris, Physics World

Read more…

Caveat Super...


A diamond anvil is used to put superconducting materials under high pressure. Credit: J. Adam Fenster/University of Rochester

Topics: Applied Physics, Condensed Matter Physics, Materials Science, Superconductors

Will a possible breakthrough for room-temperature superconducting materials hold up to scrutiny?

This week researchers claimed to have discovered a superconducting material that can shuttle electricity with no loss of energy under near-real-world conditions. But drama and controversy behind the scenes have many worried that the breakthrough may not hold up to scientific scrutiny.

“If you were to find a room-temperature, room-pressure superconductor, you’d have a completely new host of technologies that would occur—that we haven’t even begun to dream about,” says Eva Zurek, a computational chemist at the University at Buffalo, who was not involved in the new study. “This could be a real game changer if it turns out to be correct.”

Scientists have been studying superconductors for more than a century. By carrying electricity without shedding energy in the form of heat, these materials could make it possible to create incredibly efficient power lines and electronics that never overheat. Superconductors also repel magnetic fields. This property lets researchers levitate magnets over a superconducting material as a fun experiment—and it could also lead to more efficient high-speed maglev trains. Additionally, these materials could produce super strong magnets for use in wind turbines, portable magnetic resonance imaging machines, or even nuclear fusion power plants.

The only superconducting materials previously discovered require extreme conditions to function, which makes them impractical for many real-world applications. The first known superconductors had to be cooled with liquid helium to temperatures only a few degrees above absolute zero. In the 1980s, researchers found superconductivity in a category of materials called cuprates, which work at higher temperatures yet still require cooling with liquid nitrogen. Since 2015, scientists have measured room-temperature superconductive behavior in hydrogen-rich materials called hydrides, but these have to be pressed in a sophisticated viselike instrument called a diamond anvil cell until they reach a pressure of about a quarter to half of that found near the center of Earth.

The new material, called nitrogen-doped lutetium hydride, is a blend of hydrogen, the rare-earth metal lutetium, and nitrogen. Although this material also relies on a diamond anvil cell, the study found that it begins exhibiting superconductive behavior at a pressure of about 10,000 atmospheres—roughly 100 times lower than the pressures that other hydrides require. The new material is “much closer to ambient pressure than previous materials,” says David Ceperley, a condensed matter physicist at the University of Illinois at Urbana-Champaign, who was not involved in the new study. He also notes that the material remains stable when stored at a room pressure of one atmosphere. “Previous stuff was only stable at a million atmospheres, so you couldn’t really take it out of the diamond anvil” cell, he says. “The fact that it’s stable at one atmosphere of pressure also means that it’d be easier to manufacture.”
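The pressure comparison is easy to sanity-check. Here is a minimal sketch, assuming 1 atm = 101,325 Pa and roughly 360 GPa for the pressure near Earth's center (reference values of mine, not stated in the article):

```python
# Back-of-the-envelope check of the pressure comparison above.
# Assumed reference values: 1 atm = 101,325 Pa, and ~360 GPa near
# the center of Earth.
ATM_IN_PA = 101_325
EARTH_CENTER_GPA = 360  # approximate

lutetium_hydride_gpa = 10_000 * ATM_IN_PA / 1e9  # ~1.01 GPa
hydride_low_gpa = 0.25 * EARTH_CENTER_GPA   # "a quarter ... of Earth's center"
hydride_high_gpa = 0.50 * EARTH_CENTER_GPA  # "... to half"

print(f"New material: {lutetium_hydride_gpa:.2f} GPa")
print(f"Earlier hydrides: {hydride_low_gpa:.0f}-{hydride_high_gpa:.0f} GPa")
print(f"Ratio: {hydride_low_gpa / lutetium_hydride_gpa:.0f}x "
      f"to {hydride_high_gpa / lutetium_hydride_gpa:.0f}x lower")
```

The ratio works out to roughly 90-180 times lower, consistent with the "roughly 100 times lower" figure quoted above.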

Controversy Surrounds Blockbuster Superconductivity Claim, Sophie Bushwick, Scientific American

Read more…

When Water Outpaces Silicon…

10948713060?profile=RESIZE_710x

On target: Water is fanned out through a specially developed nozzle, and then a laser pulse is passed through it to create a switch. (Courtesy: Adrian Buchmann)

Topics: Applied Physics, Lasers, Materials Science, Photonics, Semiconductor Technology

A laser-controlled water-based switch that operates twice as fast as existing semiconductor switches has been developed by a trio of physicists in Germany. Adrian Buchmann, Claudius Hoberg, and Fabio Novelli at Ruhr University Bochum used an ultrashort laser pulse to create a temporary metal-like state in a jet of liquid water. This altered the transmission of terahertz pulses over timescales of just tens of femtoseconds.

With the latest semiconductor-based switches approaching fundamental upper limits on how fast they can operate, researchers are searching for faster ways of switching signals. One unexpected place to look for inspiration is the curious behavior of water under extreme conditions – like those deep within ice-giant planets or created by powerful lasers.

Molecular dynamics simulations suggest water enters a metallic state at pressures of 300 GPa and temperatures of 7000 K. While such conditions do not occur on Earth, it is possible that this state contributes to the magnetic fields of Uranus and Neptune. To study this effect closer to home, recent experiments have used powerful, ultrashort laser pulses to trigger photo-ionization in water-based solutions – creating fleeting, metal-like states.

Water-based switch outpaces semiconductor devices, described in APL Photonics.

Read more…

Chip Act and Wave Surfing...

10943737673?profile=RESIZE_584x

Massive subsidies to regain the US semiconductor industry's edge are unlikely to succeed unless progress is made in winning the global race of idea flow and monetization.

Topics: Applied Physics, Chemistry, Computer Science, Electrical Engineering, Semiconductor Technology

Intelligent use of subsidies to win the global idea race is a must for gaining and regaining the semiconductor edge.

The US semiconductor industry started with an invention at Bell Labs. It subsequently attained supremacy in semiconductor production through its success in making computers better and cheaper. Notably, the rise of the PC wave made Intel and Silicon Valley seemingly unsinkable technology superpowers. But during the first two decades of the 21st century, America lost that edge. The USA now relies on Asia for imports of the most advanced chips. Its iconic Intel is now a couple of technology generations behind Asia’s TSMC and Samsung.

Furthermore, China’s aggressive move has added momentum to America’s despair, triggering a chip war. But why has America lost the edge? Why does it rely on TSMC and Samsung to supply the most advanced chips to power iPhones, data centers, and weapons? Is it due to Asian governments’ subsidies? Or is it due to America’s failure to understand the dynamics, make prudent decisions, and manage technology and innovation?

Invention and rise and fall of US semiconductor supremacy

In 1947, Bell Labs in the USA invented a semiconductor device: the transistor. Although American companies developed prototypes of transistor radios and other consumer electronic products, they did not immediately pursue them. But American firms were very fast in using the transistor to reinvent computers, replacing the vacuum tube at the technology core. Due to their weight advantage, the US Air Force and NASA found transistors suitable for onboard computers. Besides, the invention of the integrated circuit by Fairchild and Texas Instruments accelerated the weight and size reduction of digital logic circuits. Consequently, the use of semiconductors in building onboard computers kept growing exponentially. Hence, by the end of the 1960s, the US had become a powerhouse in logic circuit semiconductors. But America remained second to Japan in global production, as Japanese companies were winning the consumer electronics race by using transistors.

US Semiconductor–from invention, supremacy to despair, Rokon Zaman, The-Waves.org

Read more…

CEM and SEI...

10928839087?profile=RESIZE_710x

Panel A shows how the native SEI on Li metal is passivating to nitrogen, which means that no reactivity with Li metal is possible. Panel B shows that a proton donor like Ethanol will disrupt the SEI passivation and enable Li metal to react with nitrogen species. Panel C describes 3 potential mechanisms through which the proton donor can disrupt the SEI passivation. Credit: Steinberg et al.

Topics: Applied Physics, Battery, Chemistry, Climate Change, Environment

Ammonia (NH3), the chemical compound made of nitrogen and hydrogen, currently has many valuable uses, for instance, serving as a crop fertilizer, purifying agent, and refrigerant gas. In recent years, scientists have been exploring its potential as an energy carrier to reduce global carbon emissions and help tackle global warming.

Ammonia is produced via the Haber-Bosch process, a carbon-producing industrial chemical reaction that converts nitrogen and hydrogen into NH3. As this process is known to contribute heavily to global carbon emissions, electrifying ammonia synthesis would benefit our planet.

One of the most promising strategies for electrically synthesizing ammonia at ambient conditions is using lithium metal. However, some aspects of these processes, including the properties and role of lithium's passivation layer, known as the solid electrolyte interphase (SEI), remain poorly understood.

Researchers at the Massachusetts Institute of Technology (MIT), the University of California, Los Angeles (UCLA), and the California Institute of Technology have recently conducted a study closely examining the reactivity of lithium and its SEI, as this could enhance lithium-based pathways to electrically synthesize ammonia. Their observations, published in Nature Energy, were collected using a state-of-the-art imaging method known as cryogenic transmission electron microscopy.

Using cryogenic electron microscopy to study the lithium SEI during electrocatalysis, Ingrid Fadelli, Phys.org

Read more…

Caveat Emptor...

10913832662?profile=RESIZE_710x

National Ignition Facility operators inspect a final optics assembly during a routine maintenance period in August. Photo credit: Lawrence Livermore National Laboratory

Topics: Alternate Energy, Applied Physics, Climate Change, Energy, Global Warming, Lasers, Nuclear Fusion

After the heady, breathtaking coverage of pop science journalism, I dove into the grimmer world of the Bulletin of the Atomic Scientists for their take on the first-ever fusion ignition. I can say that I wasn’t surprised. With all this publicity, it will probably get a Nobel Prize nomination (my guess). Cool Trekkie trivia: the National Ignition Facility was the backdrop for the Enterprise's warp core in Star Trek Into Darkness.

*****

This week’s headlines have been full of reports about a “major breakthrough” in nuclear fusion technology that, many of those reports misleadingly suggested, augurs a future of abundant clean energy produced by fusion nuclear power plants. To be sure, many of those reports lightly hedged their enthusiasm by noting that (as The Guardian put it) “major hurdles” to a fusion-powered world remain.

Indeed, they do.

The fusion achievement that the US Energy Department announced this week is scientifically significant, but the significance does not relate primarily to electricity generation. Researchers at Lawrence Livermore National Laboratory’s National Ignition Facility, or NIF, focused the facility’s 192 lasers on a target containing a small capsule of deuterium–tritium fuel, compressing it and inducing what is known as ignition. In a written press release, the Energy Department described the achievement this way: “On December 5, a team at LLNL’s National Ignition Facility (NIF) conducted the first controlled fusion experiment in history to reach this [fusion ignition] milestone, also known as scientific energy breakeven, meaning it produced more energy from fusion than the laser energy used to drive it. This historic, first-of-its-kind achievement will provide the unprecedented capability to support [the National Nuclear Security Administration’s] Stockpile Stewardship Program and will provide invaluable insights into the prospects of clean fusion energy, which would be a game-changer for efforts to achieve President Biden’s goal of a net-zero carbon economy.”
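The gap between "scientific breakeven" and practical power generation is clear from the arithmetic. A minimal sketch using the widely reported shot figures (2.05 MJ of laser energy delivered to the target, 3.15 MJ of fusion yield); the ~300 MJ wall-plug figure for charging the lasers is a commonly cited approximation, not from the press release:

```python
# Rough gain arithmetic for the December 5 NIF shot, using widely
# reported figures. The wall-plug number is an approximate estimate.
laser_energy_mj = 2.05   # MJ of laser energy delivered to the target
fusion_yield_mj = 3.15   # MJ released by the fusion reaction

target_gain = fusion_yield_mj / laser_energy_mj
print(f"Target gain Q = {target_gain:.2f}")  # greater than 1: 'breakeven'

wall_plug_mj = 300  # approximate grid energy used to charge the lasers
facility_gain = fusion_yield_mj / wall_plug_mj
print(f"Facility-level gain = {facility_gain:.3f}")  # far below 1
```

The target gain exceeds 1, which is the milestone; the facility-level gain is around 1%, which is why the achievement is distant from power production.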

Because of how the Energy Department presented the breakthrough in a news conference headlined by Energy Secretary Jennifer Granholm, news coverage has largely glossed over its implications for monitoring the country’s nuclear weapons stockpile. Instead, even many serious news outlets focused on the possibility of carbon-free, fusion-powered electricity generation—even though the NIF achievement has, at best, a distant and tangential connection to power production.

To get a balanced view of what the NIF breakthrough does and does not mean, I (John Mecklin) spoke this week with Bob Rosner, a physicist at the University of Chicago and a former director of the Argonne National Laboratory who has been a longtime member of the Bulletin’s Science and Security Board. The interview has been lightly edited and condensed for readability.

See their chat at the link below.

The Energy Department’s fusion breakthrough: It’s not really about generating electricity, John Mecklin, The Bulletin of the Atomic Scientists, Editor-in-Chief

Read more…

OPVs...

10884770301?profile=RESIZE_710x

V. ALTOUNIAN/SCIENCE

Topics: Alternate Energy, Applied Physics, Chemistry, Materials Science, Solar Power

As ultrathin organic solar cells hit new efficiency records, researchers see green energy potential in surprising places.

In November 2021, while the municipal utility in Marburg, Germany, was performing scheduled maintenance on a hot water storage facility, engineers glued 18 solar panels to the outside of the main 10-meter-high cylindrical tank. It’s not the typical home for solar panels, most of which are flat, rigid silicon and glass rectangles arrayed on rooftops or in solar parks. The Marburg facility’s panels, by contrast, are ultrathin organic films made by Heliatek, a German solar company. In the past few years, Heliatek has mounted its flexible panels on the sides of office towers, the curved roofs of bus stops, and even the cylindrical shaft of an 80-meter-tall windmill. The goal: expanding solar power’s reach beyond flat land. “There is a huge market where classical photovoltaics do not work,” says Jan Birnstock, Heliatek’s chief technical officer.

Organic photovoltaics (OPVs) such as Heliatek’s are more than 10 times lighter than silicon panels and in some cases cost just half as much to produce. Some are even transparent, which has architects envisioning solar panels, not just on rooftops, but incorporated into building facades, windows, and even indoor spaces. “We want to change every building into an electricity-generating building,” Birnstock says.

Heliatek’s panels are among the few OPVs in practical use, and they convert about 9% of the energy in sunlight to electricity. But in recent years, researchers around the globe have come up with new materials and designs that, in small, lab-made prototypes, have reached efficiencies of nearly 20%, approaching silicon and alternative inorganic thin-film solar cells, such as those made from a mix of copper, indium, gallium, and selenium (CIGS). Unlike silicon crystals and CIGS, where researchers are mostly limited to the few chemical options nature gives them, OPVs allow them to tweak bonds, rearrange atoms, and mix in elements from across the periodic table. Those changes represent knobs chemists can adjust to improve their materials’ ability to absorb sunlight, conduct charges, and resist degradation. OPVs still fall short of silicon on those measures. But, “There is an enormous white space for exploration,” says Stephen Forrest, an OPV chemist at the University of Michigan, Ann Arbor.
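To put those efficiency numbers in perspective, here is an illustrative output comparison at the standard test irradiance of 1,000 W per square meter; the ~22% efficiency for typical commercial silicon is my assumption for context, as the article quotes only the OPV figures:

```python
# Rough power output per unit area at standard test irradiance.
# The silicon efficiency is an assumed reference value, not from
# the article.
IRRADIANCE = 1000  # W/m^2, standard test condition

efficiencies = {
    "Heliatek OPV (deployed)": 0.09,
    "lab OPV prototypes": 0.20,
    "typical commercial silicon (assumed)": 0.22,
}
outputs = {name: eff * IRRADIANCE for name, eff in efficiencies.items()}
for name, watts in outputs.items():
    print(f"{name}: {watts:.0f} W per square meter")
```

At 9%, the deployed panels give up less than half the output of silicon per unit area, which is why the markets Heliatek targets are ones where silicon cannot be installed at all.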

Solar Energy Gets Flexible, Robert F. Service, Science Magazine

Read more…

Mirror, Mirror...

10780308878?profile=RESIZE_584x

Various views of a 3D-printed object are captured by a single camera using a dome-shaped array of mirrors. Left: The raw image. Right: closeups of some of the individual views. (Image: Sanha Cheong, SLAC National Accelerator Laboratory)

Topics: Applied Physics, Atomic-Scale Microscopy, Materials Science, Optics

(Nanowerk News) When it goes online, the MAGIS-100 experiment at the Fermi National Accelerator Laboratory and its successors will explore the nature of gravitational waves and search for certain kinds of wavelike dark matter. But first, researchers need to figure out something pretty basic: how to get good photographs of the clouds of atoms at the heart of their experiment.

Researchers at the Department of Energy's SLAC National Accelerator Laboratory realized that task would be perhaps the ultimate exercise in ultra-low light photography.

But a SLAC team that included Stanford graduate students Sanha Cheong and Murtaza Safdari, SLAC Professor Ariel Schwartzman, and SLAC scientists Michael Kagan, Sean Gasiorowski, Maxime Vandegar, and Joseph Frish found a simple way to do it: mirrors. By arranging mirrors in a dome-like configuration around an object, they can reflect more light towards the camera and image multiple sides of an object simultaneously.

And, the team reports in the Journal of Instrumentation ("Novel light field imaging device with an enhanced light collection for cold atom clouds"), there's an additional benefit. Because the camera now gathers views of an object taken from many different angles, the system is an example of “light-field imaging”, which captures not just the intensity of light but also the direction light rays travel. As a result, the mirror system can help researchers build a three-dimensional model of an object, such as an atom cloud.

How do you take a better image of atom clouds? Mirrors - lots of mirrors, SLAC National Accelerator Laboratory

Read more…

ARDP...

10736885455?profile=RESIZE_400x

The design concept of BWXT Advanced Nuclear Reactor. BWX Technologies

Topics: Applied Physics, Alternate Energy, Climate Change, Nuclear Power

According to the US Energy Information Administration, fossil fuel sources generate 60.8% of US electricity, some 2,504 billion kilowatt-hours. Our nuclear share is a paltry 18.9%. The totality of renewable sources (wind, hydropower, solar, biomass, and geothermal) is a little higher: 20.1%. This is the crux of the "Green New Deal."
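Those EIA shares imply a total generation figure. A quick sanity check, reading the 2,504 billion kWh as the fossil-fuel portion (consistent with recent EIA data):

```python
# Sanity check on the EIA shares quoted above. The 2,504 billion kWh
# figure is taken as the fossil-fuel portion, so total generation
# follows from dividing by the fossil share.
fossil_share, nuclear_share, renewable_share = 0.608, 0.189, 0.201
fossil_bkwh = 2_504  # billion kWh from fossil fuels

total_bkwh = fossil_bkwh / fossil_share
print(f"Implied total generation: {total_bkwh:,.0f} billion kWh")

shares_sum = fossil_share + nuclear_share + renewable_share
print(f"Shares sum to {shares_sum:.1%}")  # near 100%, as expected
```

The implied total of roughly 4,100 billion kWh matches the scale of annual US utility generation, and the three shares sum to nearly 100%.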

Though I long for the cleaner, neater version of nuclear power in fusion, it's hard to mimic the pressures and magnetic fields necessary to spark what is essentially a mini sun on the planet. I think the resistance to nuclear fission is cultural: it traces back to the atomic bomb and Oppenheimer quoting the Bhagavad Gita after the first successful test, a classic "what have we done" trope. Popular fiction emphasizes doomsday scenarios and radioactive zombies. Honorable mention: Space: 1999, which, like zombies, I doubt could ever happen, but it kept my attention in my youth. There are also genuine concerns about Chernobyl (still in Ukraine), Three Mile Island, and Fukushima Daiichi that come to the public's mind.

The percentages for fossil fuels are so high because they release extreme amounts of energy, superheating water into steam that drives turbines, which spin magnets at high speed inside copper coils. That is how most of the electricity we consume is made.
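The turbine-and-coil step is just Faraday's law of induction. A minimal illustration; every number below is a made-up generator parameter for the sake of the sketch:

```python
# Illustrative only: peak EMF of a simple rotating-coil generator,
# from Faraday's law, peak_emf = N * B * A * omega. All parameter
# values are assumptions chosen for illustration.
import math

N = 100        # turns of wire in the coil
B = 0.5        # magnetic field strength, tesla
A = 0.02       # coil area, square meters
rpm = 3600     # rotation speed of a 2-pole machine on a 60 Hz grid

omega = rpm / 60 * 2 * math.pi     # angular speed, rad/s
peak_emf = N * B * A * omega       # volts
print(f"Peak EMF: {peak_emf:.0f} V")
```

Whatever the heat source, fossil, fission, or (someday) fusion, this induction step is the common endpoint; the fuels differ only in how they make the steam.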

France currently generates 70% of its electricity from nuclear power plants, with plans to reduce this to 50% as it mixes in renewables. This is proportional to the percentage the US already has in renewables. My only caveat is the lack of an obsolescence plan for solar panels (they have to be doped with impurities to MAKE them conductive, and after twenty years, they could end up in a landfill near humans). Battery-operated vehicles are fine, but lithium has to be mined; mining it requires a lot of water, the indigenous peoples near the mines typically don't profit, and their land and resources are spoiled.

If we truly are going to transition from fossil fuels to "cleaner energy," I think we should realize that power plant designs have improved greatly since the aforementioned disasters.

As an engineer, I always tried to follow this edict from my father: "Experience isn't the best teacher: other people's experiences are the best teacher." In short, learn from others' mistakes, and try not to repeat them. It works in nontechnical areas of life as well.

I assume (fingers crossed) that nuclear power plant design engineers follow something similar to improve future designs for safety and, as the war in Ukraine has made plain, for global energy security.

I'm proposing an "everything on the table" strategy, not Pollyannaism. By the way, our "carbon footprint" appears to be a boondoggle promoted by the industries that caused our current malaise.

The U.S. Department of Energy's Advanced Reactor Demonstration Program, commonly referred to as ARDP, is designed to help our domestic nuclear industry demonstrate its advanced reactor designs on accelerated timelines. This will ultimately help us build a competitive portfolio of new U.S. reactors that offer significant improvements over today’s technology.

The advanced reactors selected for risk-reduction awards are an excellent representation of the diverse designs currently under development in the United States. They range from advanced light-water-cooled small modular reactors to new designs that use molten salts and high-temperature gases to flexibly operate at even higher temperatures and lower pressures.

All of them have the potential to compete globally once deployed. They will offer consumers more access to a reliable, clean power source that can be depended on in the near future to flexibly generate electricity, drive industrial processes, and even provide potable drinking water to communities in water-scarce locations.

5 Advanced Reactor Designs to Watch in 2030, Alice Caponiti, Deputy Assistant Secretary for Reactor Fleet and Advanced Reactor Deployment, Office of Nuclear Energy

Read more…