Applied Physics (55)

PV Caveats...


 Graphical abstract. Credit: Joule (2024). DOI: 10.1016/j.joule.2024.01.025

Topics: Applied Physics, Chemistry, Energy, Green Tech, Materials Science, Photovoltaics


The energy transition is progressing, and photovoltaics (PV) is playing a key role in this. Enormous capacities are to be added over the next few decades. Experts expect several tens of terawatts by the middle of the century. That's 10 to 25 solar modules for every person. The boom will provide clean, green energy. But this growth also has its downsides.
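
As a rough sanity check on that per-person figure, the sketch below reproduces the quoted range from assumed round numbers (30 to 75 TW of installed capacity, 300 to 400 W per module, about 8 billion people; none of these values are from the article):

```python
# Back-of-the-envelope check of the "10 to 25 solar modules per person" figure.
# Assumed values (not from the article): 30-75 TW installed capacity,
# 300-400 W per module, ~8 billion people.
POPULATION = 8e9

for capacity_tw, module_w in [(30, 400), (75, 300)]:
    modules = capacity_tw * 1e12 / module_w      # total number of modules
    per_person = modules / POPULATION
    print(f"{capacity_tw} TW at {module_w} W/module -> "
          f"{per_person:.0f} modules per person")
# 30 TW at 400 W/module -> 9 modules per person
# 75 TW at 300 W/module -> 31 modules per person
```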


Several million tons of waste from old modules are expected by 2050—and that's just for the European market. Even if today's PV modules are designed to last as long as possible, they will end up in landfills at the end of their lives, taking some valuable materials with them.


"Circular economy recycling in photovoltaics will be crucial to avoiding waste streams on a scale roughly equivalent to today's global electronic waste," explains physicist Dr. Marius Peters from the Helmholtz Institute Erlangen-Nürnberg for Renewable Energies (HI ERN), a branch of Forschungszentrum Jülich.


Today's solar modules are only partially suited to such a circular economy. The reason is their integrated—that is, hardly separable—construction, which is a prerequisite for their long service life. Even though recycling is mandatory in the European Union, PV modules therefore remain difficult to reuse in a circular way.


The current study by Dr. Ian Marius Peters, Dr. Jens Hauch, and Prof. Christoph Brabec from HI ERN shows how important it is for the rapidly growing PV industry to recycle these materials. "Our vision is to move away from a design for eternity towards a design for the eternal cycle," says Peters. "This will make renewable energy more sustainable than any energy technology before."


The consequences of the PV boom: Study analyzes recycling strategies for solar modules, Forschungszentrum Jülich


Read more…

Plastics and Infarctions...


Plastic chokes a canal in Chennai, India. Credit: R. Satish Babu/AFP via Getty

Topics: Applied Physics, Biology, Chemistry, Environment, Medicine

People who had tiny plastic particles lodged in a key blood vessel were more likely to experience heart attack, stroke or death during a three-year study.

Plastics are just about everywhere — food packaging, tyres, clothes, water pipes. And they shed microscopic particles that end up in the environment and can be ingested or inhaled by people.

Now, the first data of their kind show a link between these microplastics and human health. A study of more than 200 people undergoing surgery found that nearly 60% had microplastics or even smaller nanoplastics in a main artery. Those who did were 4.5 times more likely to experience a heart attack, a stroke, or death in the approximately 34 months after the surgery than were those whose arteries were plastic-free.

“This is a landmark trial,” says Robert Brook, a physician-scientist at Wayne State University in Detroit, Michigan, who studies the environmental effects on cardiovascular health and was not involved with the study. “This will be the launching pad for further studies across the world to corroborate, extend, and delve into the degree of the risk that micro- and nanoplastics pose.”

But Brook, other researchers and the authors themselves caution that this study, published in The New England Journal of Medicine on 6 March, does not show that the tiny pieces caused poor health. Other factors that the researchers did not study, such as socio-economic status, could be driving ill health rather than the plastics themselves, they say.

Landmark study links microplastics to serious health problems, Max Kozlov, Nature.

Read more…

Limit Shattered...


TSMC is building two new facilities to accommodate 2 nm chip production.

Topics: Applied Physics, Chemistry, Electrical Engineering, Materials Science, Nanoengineering, Semiconductor Technology


Realize that Moore's "law" isn't like Newton's law of gravitation or the three laws of thermodynamics. It's simply an observation based on experience manufacturing silicon processors, and on the desire to keep making money from the endeavor.


As a device engineer, I had heard “7 nm, and that’s it” so often that it became colloquial folklore. TSMC has proven itself a powerhouse once again and, in our faltering geopolitical climate, made itself even more desirable to mainland China in its quest to annex the island, sadly by force if necessary.


Apple will be the first electronics manufacturer to receive chips built by Taiwan Semiconductor Manufacturing Company (TSMC) using a two-nanometer process. According to Taiwan's DigiTimes Asia, inside sources said that Apple is "widely believed to be the initial client to utilize the process." The report noted that TSMC has been increasing its production capacity in response to "significant customer orders." Moreover, the report added that the company has recently established a production expansion strategy aimed at producing 2nm chipsets based on the gate-all-around (GAA) manufacturing process.


The GAA process, also known as gate-all-around field-effect transistor (GAA-FET) technology, defies the performance limitations of other chip manufacturing processes by allowing the transistors to carry more current while staying relatively small in size.


Apple to jump queue for TSMC's industry-first 2-nanometer chips: Report, Harsh Shivam, New Delhi, Business Standard.


Read more…

Boltwood Estimate...


Credit: Public Domain

Topics: Applied Physics, Education, History, Materials Science, Philosophy, Radiation, Research

We take for granted that Earth is very old, almost incomprehensibly so. But for much of human history, estimates of Earth’s age were scattershot at best. In February 1907, a chemist named Bertram Boltwood published a paper in the American Journal of Science detailing a novel method of dating rocks that would radically change these estimates. In mineral samples gathered from around the globe, he compared lead and uranium levels to determine the minerals’ ages. One was a bombshell: A sample of the mineral thorianite from Sri Lanka (known in Boltwood’s day as Ceylon) yielded an age of 2.2 billion years, suggesting that Earth must be at least that old as well. While Boltwood was off by more than 2 billion years (Earth is now estimated to be about 4.5 billion years old), his method undergirds one of today’s best-known radiometric dating techniques.
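
Boltwood's lead-to-uranium comparison anticipates the modern uranium-lead age equation. A minimal sketch follows, using today's value for the uranium-238 half-life (Boltwood's own constants were cruder, and the Pb/U ratio below is chosen only to illustrate his 2.2-billion-year figure):

```python
import math

# Modern form of the uranium-lead age equation that Boltwood's method anticipated:
#   t = (1 / lambda) * ln(1 + Pb/U)
# where Pb/U is the atomic ratio of radiogenic lead to parent uranium-238.
HALF_LIFE_U238 = 4.468e9                # years (modern value)
LAM = math.log(2) / HALF_LIFE_U238      # decay constant, 1/years

def age_from_pb_u(pb_u_ratio):
    """Age in years from the radiogenic Pb/U atomic ratio."""
    return math.log(1 + pb_u_ratio) / LAM

# A ratio of ~0.41 corresponds to roughly Boltwood's 2.2-billion-year figure.
print(f"{age_from_pb_u(0.41) / 1e9:.2f} Gyr")   # ~2.21 Gyr
```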

In the Christian world, Biblical cosmology placed Earth’s age at around 6,000 years, but fossil and geology discoveries began to upend this idea in the 1700s. In 1862, physicist William Thomson, better known as Lord Kelvin, used Earth’s supposed rate of cooling and the assumption that it had started out hot and molten to estimate that it had formed between 20 and 400 million years ago. He later whittled that down to 20-40 million years, an estimate that rankled Charles Darwin and other “natural philosophers” who believed life’s evolutionary history must be much longer. “Many philosophers are not yet willing to admit that we know enough of the constitution of the universe and of the interior of our globe to speculate with safety on its past duration,” Darwin wrote. Geologists also saw this timeframe as much too short to have shaped Earth’s many layers.

Lord Kelvin and other physicists continued studies of Earth’s heat, but a new concept — radioactivity — was about to topple these pursuits. In the 1890s, Henri Becquerel discovered radioactivity, and the Curies discovered the radioactive elements radium and polonium. Still, wrote physicist Alois F. Kovarik in a 1929 biographical sketch of Boltwood, “Radioactivity at that time was not a science as yet, but merely represented a collection of new facts which showed only little connection with each other.”

February 1907: Bertram Boltwood Estimates Earth is at Least 2.2 Billion Years Old, Tess Joosse, American Physical Society

Read more…

On-Off Superconductor...


A team of physicists has discovered a new superconducting material with unique tunability for external stimuli, promising advancements in energy-efficient computing and quantum technology. This breakthrough, achieved through advanced research techniques, enables unprecedented control over superconducting properties, potentially revolutionizing large-scale industrial applications.

Topics: Applied Physics, Materials Science, Solid-State Physics, Superconductors

Researchers used the Advanced Photon Source to verify the rare characteristics of this material, potentially paving the way for more efficient large-scale computing.

As industrial computing needs grow, the size and energy consumption of the hardware needed to keep up with those needs grow as well. A possible solution to this dilemma could be found in superconducting materials, which can reduce energy consumption dramatically. Imagine cooling a giant data center full of constantly running servers down to nearly absolute zero, enabling large-scale computation with incredible energy efficiency.

Breakthrough in Superconductivity Research

Physicists at the University of Washington and the U.S. Department of Energy’s (DOE) Argonne National Laboratory have made a discovery that could help enable this more efficient future. Researchers have found a superconducting material that is uniquely sensitive to outside stimuli, enabling the superconducting properties to be enhanced or suppressed at will. This enables new opportunities for energy-efficient switchable superconducting circuits. The paper was published in Science Advances.

Superconductivity is a quantum mechanical phase of matter in which an electrical current can flow through a material with zero resistance. This leads to perfect electronic transport efficiency. Superconductors are used in the most powerful electromagnets for advanced technologies such as magnetic resonance imaging, particle accelerators, fusion reactors, and even levitating trains. Superconductors have also found uses in quantum computing.

Scientists Discover Groundbreaking Superconductor With On-Off Switches, Argonne National Laboratory

Read more…

Fast Charger...


Significant Li plating capacity from Si anode. a, Li discharge profile in a battery of Li/graphite–Li5.5PS4.5Cl1.5 (LPSCl1.5)–LGPS–LPSCl1.5–SiG at current density 0.2 mA cm⁻² at room temperature. Note that SiG was made by mixing Si and graphite in one composite layer. Inset shows the schematic illustration of stages 1–3 based on SEM and EDS mapping, which illustrate the unique Li–Si anode evolution in solid-state batteries observed experimentally in Figs. 1 and 2. b, FIB–SEM images of the SiG anode at different discharge states (i), (ii), and (iii) corresponding to points 1–3 in a, respectively. c, SEM–EDS mapping of (i), (ii), and (iii), corresponding to SEM images in b, where carbon signal (C) is derived from graphite, oxygen (O) and nitrogen (N) signals are from Li metal reaction with air and fluorine (F) is from the PTFE binder. d, Discharge profile of battery with cell construction Li-1M LiPF6 in EC/DMC–SiG. Schematics illustrate typical Si anode evolution in liquid-electrolyte batteries. e, FIB–SEM image (i) of SiG anode following discharge in the liquid-electrolyte battery shown in d; zoomed-in image (ii). Credit: Nature Materials (2024). DOI: 10.1038/s41563-023-01722-x

Topics: Applied Physics, Battery, Chemistry, Climate Change, Electrical Engineering, Mechanical Engineering

Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a new lithium metal battery that can be charged and discharged at least 6,000 times—more than any other pouch battery cell—and can be recharged in a matter of minutes.

The research not only describes a new way to make solid-state batteries with a lithium metal anode but also offers a new understanding of the materials used for these potentially revolutionary batteries.

The research is published in Nature Materials.

"Lithium metal anode batteries are considered the holy grail of batteries because they have ten times the capacity of commercial graphite anodes and could drastically increase the driving distance of electric vehicles," said Xin Li, Associate Professor of Materials Science at SEAS and senior author of the paper. "Our research is an important step toward more practical solid-state batteries for industrial and commercial applications."

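As a quick plausibility check on the "ten times the capacity" quote, one can compare textbook theoretical specific capacities; the values below are standard reference numbers, not taken from the paper:

```python
# Quick check of the "ten times the capacity" claim, using textbook theoretical
# specific capacities (these numbers are not from the article):
#   graphite anode (LiC6): ~372 mAh/g; lithium-metal anode: ~3860 mAh/g
graphite_mah_per_g = 372
li_metal_mah_per_g = 3860
print(f"Li metal vs. graphite: {li_metal_mah_per_g / graphite_mah_per_g:.1f}x")  # ~10.4x
```
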
One of the biggest challenges in the design of these batteries is the formation of dendrites on the surface of the anode. These structures grow like roots into the electrolyte and pierce the barrier separating the anode and cathode, causing the battery to short or even catch fire.

These dendrites form when lithium ions move from the cathode to the anode during charging, attaching to the surface of the anode in a process called plating. Plating on the anode creates an uneven, non-homogeneous surface, like plaque on teeth, and allows dendrites to take root. When discharged, that plaque-like coating needs to be stripped from the anode, and when plating is uneven, the stripping process can be slow and result in potholes that induce even more uneven plating in the next charge.

Solid-state battery design charges in minutes and lasts for thousands of cycles, Leah Burrows, Harvard John A. Paulson School of Engineering and Applied Sciences, Tech Xplore

Read more…

10x > Kevlar...


Scientists have developed amorphous silicon carbide, a strong and scalable material with potential uses in microchip sensors, solar cells, and space exploration. This breakthrough promises significant advancements in material science and microchip technology. An artist's impression of amorphous silicon carbide nanostrings being tested to the limit of their tensile strength. Credit: Science Brush

Topics: Applied Physics, Chemistry, Materials Science, Nanomaterials, Semiconductor Technology

A new material doesn't just rival the strength of diamonds and graphene: it boasts a yield strength ten times greater than that of Kevlar, renowned for its use in bulletproof vests.

Researchers at Delft University of Technology, led by assistant professor Richard Norte, have unveiled a remarkable new material with the potential to impact the world of material science: amorphous silicon carbide (a-SiC).

Beyond its exceptional strength, this material demonstrates mechanical properties crucial for vibration isolation on a microchip. Amorphous silicon carbide is particularly suitable for making ultra-sensitive microchip sensors.

The range of potential applications is vast, from ultra-sensitive microchip sensors and advanced solar cells to pioneering space exploration and DNA sequencing technologies. The advantages of this material’s strength, combined with its scalability, make it exceptionally promising.


The researchers adopted an innovative method to test this material’s tensile strength. Instead of traditional methods that might introduce inaccuracies from how the material is anchored, they turned to microchip technology. By growing the films of amorphous silicon carbide on a silicon substrate and suspending them, they leveraged the geometry of the nanostrings to induce high tensile forces. By fabricating many such structures with increasing tensile forces, they meticulously observed the point of breakage. This microchip-based approach ensures unprecedented precision and paves the way for future material testing.
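
The nanostring measurement ultimately boils down to stress equals force over cross-sectional area. Here is a minimal sketch with illustrative, assumed geometry and load (not the paper's data):

```python
# Tensile stress in a suspended nanostring is simply load over cross-section:
#   sigma = F / A
# The geometry and load below are illustrative assumptions, not the paper's data.
width_m = 100e-9            # assumed string width: 100 nm
thickness_m = 100e-9        # assumed film thickness: 100 nm
force_n = 100e-6            # assumed tensile load: 100 micronewtons

area_m2 = width_m * thickness_m
sigma_gpa = force_n / area_m2 / 1e9
print(f"stress ~ {sigma_gpa:.0f} GPa")   # 10 GPa for these assumed numbers
```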

Why the focus on nanostrings? “Nanostrings are fundamental building blocks, the foundation that can be used to construct more intricate suspended structures. Demonstrating high yield strength in a nanostring translates to showcasing strength in its most elemental form.”

10x Stronger Than Kevlar: Amorphous Silicon Carbide Could Revolutionize Material Science, Delft University of Technology

Read more…

Scandium and Superconductors...


Scandium is the only known elemental superconductor to have a critical temperature in the 30 K range. This phase diagram shows the superconducting transition temperature (Tc) and crystal structure versus pressure for scandium. The measured results for all five samples studied show consistent trends. (Courtesy: Chinese Phys. Lett. 40 107403)

Topics: Applied Physics, Chemistry, Condensed Matter Physics, Materials Science, Superconductors, Thermodynamics

Scandium remains a superconductor at temperatures above 30 K (−243.15 °C, −405.67 °F), making it the first element known to superconduct at such a high temperature. The record-breaking discovery was made by researchers in China, Japan, and Canada, who subjected the element to pressures of up to 283 GPa – around 2.8 million times the atmospheric pressure at sea level.
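
A quick unit conversion confirms the scale of that pressure:

```python
# Unit check for the pressure quoted above: 283 GPa in standard atmospheres.
pressure_pa = 283e9          # 283 GPa in pascals
atm_pa = 101_325             # one standard atmosphere in pascals
print(f"{pressure_pa / atm_pa / 1e6:.1f} million atm")   # ~2.8 million
```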

Many materials become superconductors – that is, they conduct electricity without resistance – when cooled to low temperatures. The first superconductor to be discovered, for example, was solid mercury in 1911, and its transition temperature Tc is only a few degrees above absolute zero. Several other superconductors were discovered shortly afterward with similarly frosty values of Tc.

In the late 1950s, the Bardeen–Cooper–Schrieffer (BCS) theory explained this superconducting transition as the point at which electrons overcome their mutual electrical repulsion to form so-called “Cooper pairs” that then travel unhindered through the material. But beginning in the late 1980s, a new class of “high-temperature” superconductors emerged that could not be explained using BCS theory. These materials have Tc above the boiling point of liquid nitrogen (77 K), and they are not metals. Instead, they are insulators containing copper oxides (cuprates), and their existence suggests it might be possible to achieve superconductivity at even higher temperatures.

The search for room-temperature superconductors has been on ever since, as such materials would considerably improve the efficiency of electrical generators and transmission lines while also making common applications of superconductivity (including superconducting magnets in particle accelerators and medical devices like MRI scanners) simpler and cheaper.

Scandium breaks temperature record for elemental superconductors, Isabelle Dumé, Physics World

Read more…

Cooling Circuitry...


Illustration of a UCLA-developed solid-state thermal transistor using an electric field to control heat movement. Credit: H-Lab/UCLA

Topics: Applied Physics, Battery, Chemistry, Electrical Engineering, Energy, Thermodynamics

A new thermal transistor can control heat as precisely as an electrical transistor can control electricity.

From smartphones to supercomputers, electronics have a heat problem. Modern computer chips suffer from microscopic “hotspots” with power density levels that exceed those of rocket nozzles and even approach that of the sun’s surface. Because of this, more than half the total electricity burned at U.S. data centers isn’t used for computing but for cooling. Many promising new technologies—such as 3-D-stacked chips and renewable energy systems—are blocked from reaching their full potential by errant heat that diminishes a device’s performance, reliability, and longevity.

“Heat is very challenging to manage,” says Yongjie Hu, a physicist and mechanical engineer at the University of California, Los Angeles. “Controlling heat flow has long been a dream for physicists and engineers, yet it’s remained elusive.”

But Hu and his colleagues may have found a solution. As reported last November in Science, his team has developed a new type of transistor that can precisely control heat flow by taking advantage of the basic chemistry of atomic bonding at the single-molecule level. These “thermal transistors” will likely be a central component of future circuits and will work in tandem with electrical transistors. The novel device is already affordable, scalable, and compatible with current industrial manufacturing practices, Hu says, and it could soon be incorporated into the production of lithium-ion batteries, combustion engines, semiconductor systems (such as computer chips), and more.

Scientists Finally Invent Heat-Controlling Circuitry That Keeps Electronics Cool, Rachel Nuwer, Scientific American

Read more…

Fusion's Holy Grail...


A view of the assembled experimental JT-60SA Tokamak nuclear fusion facility outside Tokyo, Japan. JT-60SA.ORG

Topics: Applied Physics, Economics, Energy, Heliophysics, Nuclear Fusion, Quantum Mechanics

Japan and the European Union have officially inaugurated testing at the world’s largest experimental nuclear fusion plant. Located roughly 85 miles north of Tokyo, the six-story JT-60SA “tokamak” facility heats plasma to 200 million degrees Celsius (around 360 million Fahrenheit) within its circular, magnetically insulated reactor. Although JT-60SA first powered up during a test run back in October, the partner governments’ December 1 announcement marks the official start of operations at the world’s biggest fusion center, reaffirming a “long-standing cooperation in the field of fusion energy.”

The tokamak—an acronym of the Russian-language designation of “toroidal chamber with magnetic coils”—has led researchers' push towards achieving the “Holy Grail” of sustainable green energy production for decades. Often described as a large hollow donut, a tokamak is filled with gaseous hydrogen fuel that is then spun at immensely high speeds using powerful magnetic coil encasements. When all goes as planned, the intense forces ionize the atoms to form helium plasma, much as the sun produces its energy.


Speaking at the inauguration event, EU energy commissioner Kadri Simson referred to the JT-60SA as “the most advanced tokamak in the world,” representing “a milestone for fusion history.”

“Fusion has the potential to become a key component for energy mix in the second half of this century,” she continued.

The world’s largest experimental tokamak nuclear fusion reactor is up and running, Andrew Paul, Popular Science.

Read more…

'Teleporting' Images...


High-dimensional quantum transport enabled by nonlinear detection. In our concept, information is encoded on a coherent source and overlapped with a single photon from an entangled pair in a nonlinear crystal for up-conversion by sum frequency generation, the latter acting as a nonlinear spatial mode detector. The bright source is necessary to achieve the efficiency required for nonlinear detection. Information and photons flow in opposite directions: one of Bob's entangled photons is sent to Alice and carries no information, while a measurement on the other, in coincidence with the upconverted photon, establishes the transport of information across the quantum link. Alice need not know this information for the process to work, while the nonlinearity allows the state to be of arbitrary and unknown dimension and basis. Credit: Nature Communications (2023). DOI: 10.1038/s41467-023-43949-x

Topics: Applied Physics, Computer Science, Cryptography, Cybersecurity, Quantum Computers, Quantum Mechanics, Quantum Optics

Nature Communications published research by an international team from Wits and ICFO – The Institute of Photonic Sciences, which demonstrates the teleportation-like transport of "patterns" of light. This is the first approach that can transport images across a network without physically sending the image, and it is a crucial step towards realizing a quantum network for high-dimensional entangled states.

Quantum communication over long distances is integral to information security and has been demonstrated with two-dimensional states (qubits) over very long distances between satellites. This may seem enough if we compare it with its classical counterpart, i.e., sending bits that can be encoded in 1s (signal) and 0s (no signal), one at a time.

However, quantum optics allow us to increase the alphabet and to securely describe more complex systems in a single shot, such as a unique fingerprint or a face.

"Traditionally, two communicating parties physically send the information from one to the other, even in the quantum realm," says Prof. Andrew Forbes, the lead PI from Wits University.

"Now, it is possible to teleport information so that it never physically travels across the connection—a 'Star Trek' technology made real." Unfortunately, teleportation has so far only been demonstrated with three-dimensional states (imagine a three-pixel image); therefore, additional entangled photons are needed to reach higher dimensions.

'Teleporting' images across a network securely using only light, Wits University, Phys.org.

Read more…

Funny How It's Not Aliens...


The 3D model of Menga was drawn with AutoCAD, showing the biofacies (microfacies) present in the stones. The fourth pillar, currently missing, has been added, while capstones C-2, C-3, C-4, and C-5 have been removed in order to show the interior of the monument (Lozano Rodríguez et al.). (a) Pillar P-3 with examples of biofacies (a1–a3, observed in hand specimen). (b) Orthostat O-15 with examples of biofacies (b1–b4, observed petrographically) and in hand specimen (b5). (c) Orthostat O-8 with examples observed petrographically (crossed polars) (c1, c2). (d) Orthostat O-5 with examples observed through the petrographic microscope (d1, d2). The star-shaped symbol indicates the place where a section was made for the petrographic study—Qtz: quartz (designations after Kretz).

Topics: Applied Physics, Archaeology, Dark Humor, History

Abstract

The technical and intellectual capabilities of past societies are reflected in the monuments they were able to build. Tracking the provenance of the stones utilized to build prehistoric megalithic monuments through geological studies is of utmost interest for interpreting ancient architecture as well as contributing to their protection. According to the scarce information available, most stones used in European prehistoric megaliths originate from locations near the construction sites, which would have made transport easier. The Menga dolmen (Antequera, Malaga, Spain), listed in UNESCO World Heritage since July 2016, was designed and built with stones weighing up to nearly 150 tons, thus becoming the most colossal stone monument built in its time in Europe (c. 3800–3600 BC). Our study (based on high-resolution geological mapping as well as petrographic and stratigraphic analyses) reveals key geological and archaeological evidence to establish the precise provenance of the massive stones used in the construction of this monument. These stones are mostly calcarenites, a poorly cemented detrital sedimentary rock comparable to those known as 'soft stones' in modern civil engineering. They were quarried from a rocky outcrop located at a distance of approximately 1 km. From this study, it can be inferred that the use of soft stone in Menga reveals the human application of new wood and stone technologies, enabling the construction of a monument of unprecedented magnitude and complexity.
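
For a sense of the engineering feat, here is a back-of-the-envelope haulage estimate; the friction coefficient and per-person pull force are illustrative assumptions, not findings from the study:

```python
# Rough order-of-magnitude estimate of the effort to drag a 150-ton stone on a
# lubricated wooden track. The friction coefficient and per-person pull force
# are assumptions for illustration, not values from the paper.
mass_kg = 150_000
g = 9.81                      # gravitational acceleration, m/s^2
mu = 0.2                      # assumed sliding friction on greased timber
pull_per_person_n = 500.0     # assumed sustained pull per hauler, newtons

force_n = mu * mass_kg * g
print(f"drag force ~ {force_n/1e3:.0f} kN -> "
      f"~{force_n / pull_per_person_n:.0f} haulers")
# drag force ~ 294 kN -> ~589 haulers
```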

The provenance of the stones in the Menga dolmen reveals one of the greatest engineering feats of the Neolithic. Scientific Reports, Nature

José Antonio Lozano Rodríguez, Leonardo García Sanjuán, Antonio M. Álvarez-Valero, Francisco Jiménez-Espejo, Jesús María Arrieta, Eugenio Fraile-Nuez, Raquel Montero Artús, Giuseppe Cultrone, Fernando Alonso Muñoz-Carballeda & Francisco Martínez-Sevilla

Read more…

Nano Racetracks...

In this image, optical pulses (solitons) can be seen circling through conjoined optical tracks. (Image: Yuan, Bowers, Vahala, et al.) An animated gif is at the original link below.

Topics: Applied Physics, Astronomy, Electrical Engineering, Materials Science, Nanoengineering, Optics

(Nanowerk News) When we last checked in with Caltech's Kerry Vahala three years ago, his lab had recently reported the development of a new optical device called a turnkey frequency microcomb that has applications in digital communications, precision timekeeping, spectroscopy, and even astronomy.

This device, fabricated on a silicon wafer, takes input laser light of one frequency and converts it into an evenly spaced set of many distinct frequencies that form a train of pulses whose length can be as short as 100 femtoseconds (quadrillionths of a second). (The comb in the name comes from the frequencies being spaced like the teeth of a hair comb.)
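
The relation between the comb's span and its pulse length can be sketched numerically; the repetition rate and tooth count below are illustrative microcomb-scale assumptions, not values from the paper:

```python
# Minimal sketch of the comb picture: teeth at f_n = f_ceo + n * f_rep, and a
# transform-limited Gaussian pulse of duration ~0.44 / bandwidth. The numbers
# are illustrative microcomb-scale values, not taken from the paper.
f_rep = 20e9                    # repetition rate (tooth spacing), Hz
n_teeth = 220                   # assumed number of comb lines
bandwidth_hz = n_teeth * f_rep  # total comb span
pulse_s = 0.44 / bandwidth_hz   # transform-limited pulse duration
print(f"span {bandwidth_hz/1e12:.1f} THz -> pulse ~ {pulse_s*1e15:.0f} fs")  # ~100 fs
```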

Now Vahala, Caltech's Ted and Ginger Jenkins Professor of Information Science and Technology and Applied Physics and executive officer for applied physics and materials science, along with members of his research group and the group of John Bowers at UC Santa Barbara, have made a breakthrough in the way the short pulses form in an important new material called ultra-low-loss silicon nitride (ULL nitride), a compound formed of silicon and nitrogen. The silicon nitride is prepared to be extremely pure and deposited in a thin film.

In principle, short-pulse microcomb devices made from this material would require very low power to operate. Unfortunately, short light pulses (called solitons) cannot be properly generated in this material because of a property called dispersion, which causes light or other electromagnetic waves to travel at different speeds, depending on their frequency. ULL nitride has what is known as normal dispersion, and this prevents waveguides made of ULL nitride from supporting the short pulses necessary for microcomb operation.

In a paper appearing in Nature Photonics ("Soliton pulse pairs at multiple colors in normal dispersion microresonators"), the researchers discuss their development of the new microcomb, which overcomes the inherent optical limitations of ULL nitride by generating pulses in pairs. This is a significant development because ULL nitride is created with the same technology used for manufacturing computer chips. This kind of manufacturing technique means that these microcombs could one day be integrated into a wide variety of handheld devices similar in form to smartphones.

The most distinctive feature of an ordinary microcomb is a small optical loop that looks a bit like a tiny racetrack. During operation, the solitons automatically form and circulate around it.

"However, when this loop is made of ULL nitride, the dispersion destabilizes the soliton pulses," says co-author Zhiquan Yuan (MS '21), a graduate student in applied physics.

Imagine the loop as a racetrack with cars. If some cars travel faster and some travel slower, then they will spread out as they circle the track instead of staying as a tight pack. Similarly, the normal dispersion of ULL means light pulses spread out in the microcomb waveguides, and the microcomb ceases to work.

The solution devised by the team was to create multiple racetracks, pairing them up so they look a bit like a figure eight. In the middle of that '8,' the two tracks run parallel to each other with only a tiny gap between them.

Conjoined 'racetracks' make new optical devices possible, Nanowerk.

Read more…

Microlenses...


Chromatic imaging of white light with a single lens (left) and achromatic imaging of white light with a hybrid lens (right). Credit: The Grainger College of Engineering at the University of Illinois Urbana-Champaign

Topics: 3D Printing, Additive Manufacturing, Applied Physics, Materials Science, Optics

Using 3D printing and porous silicon, researchers at the University of Illinois Urbana-Champaign have developed compact, visible-wavelength achromats that are essential for miniaturized and lightweight optics. These high-performance hybrid micro-optics achieve high focusing efficiencies while minimizing volume and thickness. Further, these microlenses can be constructed into arrays to form larger-area images for achromatic light-field imaging and displays.

This study was led by materials science and engineering professors Paul Braun and David Cahill, electrical and computer engineering professor Lynford Goddard, and former graduate student Corey Richards. The results of this research were published in Nature Communications.

"We developed a way to create structures exhibiting the functionalities of classical compound optics but in highly miniaturized thin film via non-traditional fabrication approaches," says Braun.

In many imaging applications, multiple wavelengths of light are present, e.g., white light. If a single lens is used to focus this light, different wavelengths focus at different points, resulting in a color-blurred image. To solve this problem, multiple lenses are stacked together to form an achromatic lens. "In white light imaging, if you use a single lens, you have considerable dispersion, and so each constituent color is focused at a different position. With an achromatic lens, however, all the colors focus at the same point," says Braun.
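
The classical fix can be summarized with the thin-lens achromatic doublet condition, in which two glasses of different dispersion (Abbe numbers V1, V2) share the total optical power so the chromatic focal shift cancels. Below is a minimal sketch with generic crown and flint values, not the materials used in this work:

```python
# Thin-lens achromatic doublet: two glasses with Abbe numbers V1 and V2 share
# the total power P so that the chromatic focal shift cancels:
#   P1/V1 + P2/V2 = 0   with   P1 + P2 = P
# Textbook relations; the Abbe numbers below are generic crown/flint values.
def achromat_powers(P, V1, V2):
    P1 = P * V1 / (V1 - V2)       # positive crown element
    P2 = -P * V2 / (V1 - V2)      # negative flint element
    return P1, P2

P_total = 10.0                    # diopters, i.e. f = 100 mm
P1, P2 = achromat_powers(P_total, V1=60.0, V2=36.0)
print(f"crown: {P1:.1f} D (f = {1000 / P1:.0f} mm), "
      f"flint: {P2:.1f} D (f = {1000 / P2:.0f} mm)")
# crown: 25.0 D (f = 40 mm), flint: -15.0 D (f = -67 mm)
```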

The challenge, however, is that the stack of lens elements required to make an achromatic lens is relatively thick, which can make a classical achromatic lens unsuitable for newer, scaled-down technological platforms, such as ultracompact visible-wavelength cameras, portable microscopes, and even wearable devices.

A new (micro) lens on optics: Researchers develop hybrid achromats with high focusing efficiencies, Amber Rose, University of Illinois Grainger College of Engineering

Read more…

Anthrobots...


An Anthrobot is shown, depth colored, with a corona of cilia that provides locomotion for the bot. Credit: Gizem Gumuskaya, Tufts University

Topics: Applied Physics, Biology, Biomimetics, Biotechnology, Research, Robotics

Researchers at Tufts University and Harvard University's Wyss Institute have created tiny biological robots, which they call Anthrobots, from human tracheal cells. The bots can move across a surface and have been found to encourage the growth of neurons across a region of damage in a lab dish.

The multicellular robots, ranging in size from the width of a human hair to the point of a sharpened pencil, were made to self-assemble and shown to have a remarkable healing effect on other cells. The discovery is a starting point for the researchers' vision to use patient-derived biobots as new therapeutic tools for regeneration, healing, and treatment of disease.

The work follows from earlier research in the laboratories of Michael Levin, the Vannevar Bush Professor of Biology at Tufts University School of Arts & Sciences, and Josh Bongard at the University of Vermont, in which they created multicellular biological robots from frog embryo cells called Xenobots, capable of navigating passageways, collecting material, recording information, healing themselves from injury, and even replicating for a few cycles on their own.

At the time, researchers did not know if these capabilities were dependent on their being derived from an amphibian embryo or if biobots could be constructed from cells of other species.

In the current study, published in Advanced Science, Levin, along with Ph.D. student Gizem Gumuskaya, discovered that bots can, in fact, be created from adult human cells without any genetic modification, and they are demonstrating some capabilities beyond what was observed with the Xenobots.

The discovery starts to answer a broader question that the lab has posed—what are the rules that govern how cells assemble and work together in the body, and can the cells be taken out of their natural context and recombined into different "body plans" to carry out other functions by design?

Anthrobots: Scientists build tiny biological robots from human tracheal cells, Tufts University

Read more…

Bitcoin and Gaia...


"What are the environmental impacts of cryptocurrency?" Written by Paul Kim; edited by Jasmine Suarez Mar 17, 2022, 5:21 PM EDT, Business Insider.

 Image: Ethereum, the second biggest cryptocurrency on the market, plans on changing to proof of stake mining in the future. Rachel Mendelson/Insider


Topics: Applied Physics, Computer Science, Cryptography, Economics, Environment, Star Trek, Thermodynamics

In what is now “old school Internet” (or web surfing for fogies), I will get a friend request from someone on Facebook/Meta who is in cryptocurrency. I quote myself in the first paragraph of what I refer to as my “public service announcement”:

I am not INTERESTED in crypto. As someone who worked with cryptography as a matter of national security, holding a TS/SCI clearance, when you start your message with “let me explain to YOU how crypto works,” expect to get blocked.

Invariably, I still do, which makes me wonder if they read the PSA or think “they will be the one” to sign me. News flash, pilgrim...I now have another pertinent reason to ignore your blockchain solicitations, actually, several good reasons.

Every time we turn on a light in our homes, there is a thermal budget that we are being charged for (that's how Duke Power makes its money in North Carolina, and Pedernales Electric Cooperative in Texas). Bitcoin/Blockchain (I think) caught the imagination because it seemed like a "Federation Credit" from Star Trek, where no one explains fully how a society that is "post-scarcity" somehow feels the need for some type of currency in utopia. It's kind of like magic carpets: you go with the bit for the story - warp drive, Heisenberg compensators, Federation credits. What matters is the story, and whether you are thoroughly entertained after the denouement, not the physics.

You might not be extracting anything from the planet directly, but Bitcoin mining has a massive impact on the planet’s environment.

Mining resources from our planet can take a devastating toll on the environment, both local and global. Even beyond this, using the resource could cause disastrous effects on our planet, and dependence on a single resource can wreak havoc on a country’s economy. Yet, many of these resources are needed for our daily lives -- sometimes as a luxury, sometimes as a necessity. Any responsible country or company should always take pause to consider what impact mining of any kind can have on the planet.

It turns out that these days, one type of mining might be the worst for Earth’s environment: bitcoins. Yes, the “mining” of virtual currency makes its mark on our planet. The unequal distribution of Bitcoin mining across the globe means that some countries are making a much larger dent into the planet’s climate and environment than others ... all for a “resource” that is far from necessary for our society.

Bitcoin mining uses a lot of computing power to solve the cryptographic puzzles that lie at the heart of the industry. As of today (October 30, 2023), each Bitcoin is worth over $34,000, and with the multitude of other cryptocoins out there, using computers to unlock more can be a profitable endeavor. Almost half a trillion dollars of the global economy runs on these “virtual currencies.”
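
For readers unfamiliar with why mining burns so much electricity, the toy sketch below shows the proof-of-work idea: hash repeatedly until the output clears a difficulty target, so harder targets force exponentially more work. It is a simplification of Bitcoin's actual protocol, with a made-up header and an easy target:

```python
import hashlib

# Toy proof-of-work: find a nonce whose double SHA-256 hash of a block header
# starts with `difficulty` zero hex digits. Real Bitcoin mining works the same
# way in spirit but against a vastly harder numeric target.
def mine(header: str, difficulty: int) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(f"{header}{nonce}".encode()).digest()
        ).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# difficulty=4 needs ~16**4 = 65,536 hashes on average; each extra zero
# multiplies the expected work (and the electricity) by 16.
print(mine("toy-block-header", difficulty=4))
```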

Worst Kind of Mining for the Environment? It Might Be Bitcoin. Erik Klemetti, Discover Magazine


Read more…

In Medias Res...


Image source: Link below

Topics: Applied Physics, Astrophysics, Computer Modeling, Einstein, High Energy Physics, Particle Physics, Theoretical Physics

In the search for new physics, a new kind of scientist is bridging the gap between theory and experiment.

Traditionally, many physicists have divided themselves into two tussling camps: the theorists and the experimentalists. Albert Einstein theorized general relativity, and Arthur Eddington observed it in action as “bending” starlight; Murray Gell-Mann and George Zweig thought up the idea of quarks, and Henry Kendall, Richard Taylor, Jerome Friedman and their teams detected them.

In particle physics especially, the divide is stark. Consider the Higgs boson, proposed in 1964 and discovered in 2012. Since then, physicists have sought to scrutinize its properties, but theorists and experimentalists don’t share Higgs data directly, and they’ve spent years arguing over what to share and how to format it. (There’s now some consensus, although the going was rough.)

But there’s a missing player in this dichotomy. Who, exactly, is facilitating the flow of data between theory and experiment?

Traditionally, the experimentalists filled this role, running the machines and looking at the data — but in high-energy physics and many other subfields, there’s too much data for this to be feasible. Researchers can’t just eyeball a few events in the accelerator and come to conclusions; at the Large Hadron Collider, for instance, about a billion particle collisions happen per second, which sensors detect, process, and store in vast computing systems. And it’s not just quantity. All this data is outrageously complex, made more so by simulation.
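
To get a feel for the scale, here is a back-of-the-envelope data-rate estimate; the event size and trigger fraction are assumed round numbers, not official LHC figures:

```python
# Rough scale of the data problem: a billion collisions per second, of which
# only a tiny fraction is kept. Both rates below are illustrative assumptions.
collisions_per_s = 1e9
event_size_bytes = 1e6        # assumed ~1 MB per recorded event
kept_fraction = 1e-6          # assumed trigger keeps ~1 in a million events

rate = collisions_per_s * kept_fraction * event_size_bytes
print(f"~{rate/1e9:.1f} GB/s to storage even after discarding 99.9999%")  # ~1 GB/s
```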

In other words, these experiments produce more data than anyone could possibly analyze with traditional tools. And those tools are imperfect anyway, requiring researchers to boil down many complex events into just a handful of attributes — say, the number of photons at a given energy. A lot of science gets left out.

In response to this conundrum, a growing movement in high-energy physics and other subfields, like nuclear physics and astrophysics, seeks to analyze data in its full complexity — to let the data speak for itself. Experts in this area are using cutting-edge data science tools to decide which data to keep and which to discard and to sniff out subtle patterns.


Opinion: The Rise of the Data Physicist, Benjamin Nachman, APS News

Read more…

Tunnel Falls...


Chip off the old block: Intel’s Tunnel Falls chip is based on silicon spin qubits, which are about a million times smaller than other qubit types. (Courtesy: Intel Corporation)

Topics: Applied Physics, Chemistry, Electrical Engineering, Quantum Computer, Quantum Mechanics

Intel – the world’s biggest computer-chip maker – has released its newest quantum chip and has begun shipping it to quantum scientists and engineers to use in their research. Dubbed Tunnel Falls, the chip contains a 12-qubit array and is based on silicon spin-qubit technology.

The distribution of the quantum chip to the quantum community is part of Intel’s plan to let researchers gain hands-on experience with the technology while at the same time enabling new quantum research.

The first quantum labs to get access to the chip include the University of Maryland, Sandia National Laboratories, the University of Rochester, and the University of Wisconsin-Madison.

The Tunnel Falls chip was fabricated on 300 mm silicon wafers in Intel's “D1” transistor fabrication facility in Oregon, which can carry out extreme ultraviolet (EUV) lithography and gate and contact processing techniques.

Intel releases 12-qubit silicon quantum chip to the quantum community, Martijn Boerkamp, Physics World.

Read more…

Beyond Attogram Imaging...


When X-rays (blue color) illuminate an iron atom (red ball at the center of the molecule), core-level electrons are excited. X-ray excited electrons are then tunneled to the detector tip (gray) via overlapping atomic/molecular orbitals, which provide elemental and chemical information about the iron atom. Credit: Saw-Wai Hla

Topics: Applied Physics, Instrumentation, Materials Science, Nanomaterials, Quantum Mechanics

A team of scientists from Ohio University, Argonne National Laboratory, the University of Illinois-Chicago, and others, led by Ohio University Professor of Physics and Argonne National Laboratory scientist Saw Wai Hla, has taken the world's first X-ray SIGNAL (or SIGNATURE) of just one atom. This groundbreaking achievement could revolutionize the way scientists detect materials.

Since their discovery by Roentgen in 1895, X-rays have been used everywhere, from medical examinations to security screenings in airports. Even Curiosity, NASA's Mars rover, is equipped with an X-ray device to examine the material composition of the rocks on Mars. An important use of X-rays in science is to identify the types of materials in a sample. Over the years, the quantity of material required for X-ray detection has been greatly reduced thanks to the development of synchrotron X-ray sources and new instruments. To date, the smallest quantity one can X-ray is an attogram, which is about 10,000 atoms or more; the X-ray signal produced by a single atom is so weak that conventional X-ray detectors cannot pick it up. According to Hla, X-raying just one atom has been a long-standing dream of scientists, and it is now being realized by the research team he leads.
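
The "about 10,000 atoms" figure is easy to verify with standard atomic masses, as in the sketch below (the element choices are illustrative):

```python
# Sanity check of "an attogram is about 10,000 atoms or more":
# 1 attogram = 1e-18 g; one atomic mass unit (u) = 1.660539e-24 g.
U_GRAMS = 1.660_539e-24       # one atomic mass unit in grams
attogram = 1e-18              # grams

for element, mass_u in [("Fe", 55.85), ("Si", 28.09)]:
    atoms = attogram / (mass_u * U_GRAMS)
    print(f"{element}: ~{atoms:,.0f} atoms per attogram")
# Fe: ~10,784  Si: ~21,440 -- consistent with "about 10,000 atoms or more"
```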

"Atoms can be routinely imaged with scanning probe microscopes, but without X-rays, one cannot tell what they are made of. We can now detect exactly the type of a particular atom, one atom-at-a-time, and can simultaneously measure its chemical state," explained Hla, who is also the director of the Nanoscale and Quantum Phenomena Institute at Ohio University. "Once we are able to do that, we can trace the materials down to the ultimate limit of just one atom. This will have a great impact on environmental and medical sciences and maybe even find a cure that can have a huge impact on humankind. This discovery will transform the world."

Their paper, published in the scientific journal Nature on May 31, 2023, and gracing the cover of the print edition on June 1, 2023, details how Hla and several other physicists and chemists, including Ph.D. students at OHIO, used a purpose-built synchrotron X-ray instrument at the XTIP beamline of the Advanced Photon Source and the Center for Nanoscale Materials at Argonne National Laboratory.

Scientists report the world's first X-ray of a single atom, Ohio University, Phys.org.

Read more…

Straining Moore...


Topics: Applied Physics, Chemistry, Computer Science, Electrical Engineering, Materials Science, Nanotechnology, Quantum Mechanics, Semiconductor Technology

Gordon Moore, the co-founder of Intel who died earlier this year, is famous for forecasting a continuous rise in the density of transistors that we can pack onto semiconductor chips. James McKenzie looks at how “Moore’s law” is still going strong after almost six decades but warns that further progress is becoming harder and ever more expensive to sustain.

When the Taiwan Semiconductor Manufacturing Company (TSMC) announced last year that it was planning to build a new factory to produce integrated circuits, it wasn’t just the eye-watering $33bn price tag that caught my eye. What also struck me is that the plant, set to open in 2025 in the city of Hsinchu, will make the world’s first “2-nanometer” chips. Smaller, faster, and up to 30% more efficient than any microchip that has come before, TSMC’s chips will be sold to the likes of Apple – the company’s biggest customer – powering everything from smartphones to laptops.

But our ability to build such tiny, powerful chips shouldn't surprise us. After all, the engineer Gordon Moore – who died on 24 March this year, aged 94 – famously predicted in 1965 that the number of transistors we can squeeze onto an integrated circuit ought to double yearly. Writing for the magazine Electronics (38 114), Moore reckoned that by 1975 it should be possible to fit a quarter of a million components onto a single silicon chip with an area of one square inch (6.45 cm²).

Moore’s prediction, which he later said was simply a “wild extrapolation”, held true, although, in 1975, he revised his forecast, predicting that chip densities would double every two years rather than every year. What thereafter became known as “Moore’s law” proved amazingly accurate, as the ability to pack ever more transistors into a tiny space underpinned the almost non-stop growth of the consumer electronics industry. In truth, it was never an established scientific “law” but more a description of how things had developed in the past as well as a roadmap that the semiconductor industry imposed on itself, driving future development.
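
Compounding that revised two-year doubling forward, as in the sketch below, shows how relentless the trend is; the extrapolation is purely illustrative and overshoots real product chips at the far end:

```python
# Moore's 1975 revision: density doubles roughly every two years. Compounding
# forward from his 1965 target for 1975 illustrates the trend; actual chip
# transistor counts vary by product and process node.
start_year, start_count = 1975, 250_000       # components per chip, per the article

for year in (1985, 2005, 2025):
    doublings = (year - start_year) / 2
    print(f"{year}: ~{start_count * 2**doublings:,.0f} components")
# 1985: ~8,000,000   2005: ~8,192,000,000   2025: ~8.4e12 (overshoots reality)
```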

Moore's law: further progress will push hard on the boundaries of physics and economics, James McKenzie, Physics World

Read more…