computer modeling (7)

In Medias Res...


Image source: Link below

Topics: Applied Physics, Astrophysics, Computer Modeling, Einstein, High Energy Physics, Particle Physics, Theoretical Physics

In the search for new physics, a new kind of scientist is bridging the gap between theory and experiment.

Traditionally, many physicists have divided themselves into two tussling camps: the theorists and the experimentalists. Albert Einstein theorized general relativity, and Arthur Eddington observed it in action as “bending” starlight; Murray Gell-Mann and George Zweig thought up the idea of quarks, and Henry Kendall, Richard Taylor, Jerome Friedman and their teams detected them.

In particle physics especially, the divide is stark. Consider the Higgs boson, proposed in 1964 and discovered in 2012. Since then, physicists have sought to scrutinize its properties, but theorists and experimentalists don’t share Higgs data directly, and they’ve spent years arguing over what to share and how to format it. (There’s now some consensus, although the going was rough.)

But there’s a missing player in this dichotomy. Who, exactly, is facilitating the flow of data between theory and experiment?

Traditionally, the experimentalists filled this role, running the machines and looking at the data — but in high-energy physics and many other subfields, there’s too much data for this to be feasible. Researchers can’t just eyeball a few events in the accelerator and come to conclusions; at the Large Hadron Collider, for instance, about a billion particle collisions happen per second, which sensors detect, process, and store in vast computing systems. And it’s not just quantity. All this data is outrageously complex, made more so by simulation.

In other words, these experiments produce more data than anyone could possibly analyze with traditional tools. And those tools are imperfect anyway, requiring researchers to boil down many complex events into just a handful of attributes — say, the number of photons at a given energy. A lot of science gets left out.
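As a toy illustration of that kind of lossy reduction (purely a sketch; the event structure and attribute names below are invented for illustration, not any experiment's actual data format), consider boiling simulated collision events down to just two summary attributes:

```python
# Illustrative sketch (not real LHC software): reducing complex simulated
# "events" to a handful of summary attributes, the kind of lossy
# compression traditional analysis tools rely on.
import random

random.seed(0)

def make_event(n_particles=20):
    """Toy event: a list of (species, energy_GeV) pairs."""
    species = ["photon", "electron", "muon", "hadron"]
    return [(random.choice(species), random.uniform(1, 100))
            for _ in range(n_particles)]

def summarize(event, energy_cut=25.0):
    """Boil an event down to a few attributes, discarding the rest."""
    return {
        "n_photons_above_cut": sum(1 for s, e in event
                                   if s == "photon" and e > energy_cut),
        "total_energy": sum(e for _, e in event),
    }

events = [make_event() for _ in range(1000)]
summaries = [summarize(ev) for ev in events]
print(summaries[0])
```

Everything not captured by the chosen attributes is simply discarded, which is exactly the "science gets left out" problem the new data-driven approaches aim to avoid.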

In response to this conundrum, a growing movement in high-energy physics and other subfields, like nuclear physics and astrophysics, seeks to analyze data in its full complexity — to let the data speak for itself. Experts in this area are using cutting-edge data science tools to decide which data to keep and which to discard and to sniff out subtle patterns.


Opinion: The Rise of the Data Physicist, Benjamin Nachman, APS News

Read more…


A fractal is a never-ending pattern that looks the same however far you zoom in. Fractals can occur in two dimensions, like frost on a window, or in three dimensions, like tree limbs. A recent discovery from Purdue University researchers has established that superconducting images, seen above in red and blue, are actually fractals that fill a three-dimensional space and are disorder-driven rather than driven by quantum fluctuations, as expected. Frost and tree images by Adobe. Superconducting image (center) from "Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x" in Nature Communications. Credit: Nature Communications (2023). DOI: 10.1038/s41467-023-38249-3

Topics: Applied Physics, Civilization, Computer Modeling, Condensed Matter Physics, Materials Science, Solid-State Physics, Superconductors

Meeting the world's energy demands is reaching a critical point, and powering the technological age has created problems around the globe. It is increasingly important to create superconductors that can operate at ambient pressure and temperature, which would go a long way toward solving the energy crisis.

Advancements with superconductivity hinge on advances in quantum materials. When electrons inside quantum materials undergo a phase transition, the electrons can form intricate patterns, such as fractals. A fractal is a never-ending pattern. When zooming in on a fractal, the image looks the same. Commonly seen fractals can be a tree or frost on a windowpane in winter. Fractals can form in two dimensions, like the frost on a window, or in three-dimensional space, like the limbs of a tree.

Dr. Erica Carlson, a 150th Anniversary Professor of Physics and Astronomy at Purdue University, led a team that developed theoretical techniques for characterizing the fractal shapes that these electrons make in order to uncover the underlying physics driving the patterns.

Carlson, a theoretical physicist, evaluated high-resolution images of the locations of electrons in the superconductor Bi2-xPbzSr2-yLayCuO6+x (BSCO) and determined not only that these images are indeed fractal, but that the fractals extend into the full three-dimensional space occupied by the material, like a tree filling space.
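One standard, generic way to quantify whether a pattern is fractal is to estimate its box-counting dimension: for a true fractal, the number N(s) of occupied boxes of side s scales as N(s) ~ s^(-D). The sketch below is a textbook illustration on a known 2-D fractal, not the Purdue team's actual technique:

```python
# Generic box-counting dimension estimate on a 2-D point set (a textbook
# sketch, not the analysis used in the Nature Communications paper).
import math
import random

random.seed(1)

def sierpinski_points(n=20000):
    """Chaos-game points on the Sierpinski triangle (D = log3/log2 ≈ 1.585)."""
    verts = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
    x, y = 0.25, 0.25
    pts = []
    for _ in range(n):
        vx, vy = random.choice(verts)
        x, y = (x + vx) / 2, (y + vy) / 2
        pts.append((x, y))
    return pts

def box_count_dimension(points, sizes=(1/4, 1/8, 1/16, 1/32, 1/64)):
    """Fit the slope of log N(s) vs. log(1/s) by least squares."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(px / s), int(py / s)) for px, py in points}
        xs.append(math.log(1 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

d = box_count_dimension(sierpinski_points())
print(f"estimated dimension ≈ {d:.2f}")
```

A non-integer slope, as here, is the signature of a fractal; the Purdue result is the analogous statement for the electron patterns in three dimensions.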

What were once thought of as random dispersions within the fractal images are in fact purposeful and, shockingly, arise not from an underlying quantum phase transition, as expected, but from a disorder-driven phase transition.

Carlson led a collaborative team of researchers across multiple institutions and published their findings, titled "Critical nematic correlations throughout the superconducting doping range in Bi2-xPbzSr2-yLayCuO6+x," in Nature Communications.

The team includes Purdue scientists and partner institutions. From Purdue, the team includes Carlson, Dr. Forrest Simmons, a recent Ph.D. student, and former Ph.D. students Dr. Shuo Liu and Dr. Benjamin Phillabaum. The Purdue team completed their work within the Purdue Quantum Science and Engineering Institute (PQSEI). The team from partner institutions includes Dr. Jennifer Hoffman, Dr. Can-Li Song, Dr. Elizabeth Main of Harvard University, Dr. Karin Dahmen of the University of Illinois at Urbana-Champaign, and Dr. Eric Hudson of Pennsylvania State University.

Researchers discover superconductive images are actually 3D and disorder-driven fractals, Cheryl Pierce, Purdue University, Phys.org.

Read more…

Catalysis and Energy Savings…


Credit: Pixabay/CC0 Public Domain

Topics: Chemistry, Computer Modeling, Environment, Materials Science

In what they consider a breakthrough in computational chemistry research, University of Wisconsin–Madison chemical engineers have developed a model of how catalytic reactions work at the atomic scale. This understanding could allow engineers and chemists to develop more efficient catalysts and tune industrial processes—potentially with enormous energy savings, given that 90% of the products we encounter in our lives are produced, at least partially, via catalysis.

Catalyst materials accelerate chemical reactions without undergoing changes themselves. They are critical for refining petroleum products and for manufacturing pharmaceuticals, plastics, food additives, fertilizers, green fuels, industrial chemicals, and much more.

Scientists and engineers have spent decades fine-tuning catalytic reactions—yet because it's currently impossible to directly observe those reactions at the extreme temperatures and pressures often involved in industrial-scale catalysis, they haven't known exactly what is taking place on the nano and atomic scales. This new research helps unravel that mystery with potentially major ramifications for the industry.

In fact, just three catalytic reactions—steam-methane reforming to produce hydrogen, ammonia synthesis to produce fertilizer, and methanol synthesis—use close to 10% of the world's energy.

"If you decrease the temperatures at which you have to run these reactions by only a few degrees, there will be an enormous decrease in the energy demand that we face as humanity today," says Manos Mavrikakis, a professor of chemical and biological engineering at UW–Madison who led the research. "By decreasing the energy needed to run all these processes, you are also decreasing their environmental footprint."
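The outsized payoff from a few degrees follows from the exponential temperature dependence of reaction rates. A back-of-the-envelope Arrhenius estimate (the activation energy and temperature below are generic assumed values, not figures from the UW–Madison study) shows how a small reduction in activation energy lets the same rate be reached at a lower temperature:

```python
# Back-of-the-envelope Arrhenius estimate (illustrative numbers, not from
# the UW-Madison study): how far the operating temperature can drop while
# keeping the same rate, if a catalyst lowers the activation energy.
import math

R = 8.314  # J/(mol*K), gas constant

def rate(Ea, T, A=1.0):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

Ea = 100e3   # assumed activation energy, J/mol
T = 700.0    # assumed operating temperature, K
k0 = rate(Ea, T)

# Same rate with activation energy reduced by just 1 kJ/mol:
# solve rate(Ea2, T2) = k0  =>  T2 = Ea2 * T / Ea
Ea2 = 99e3
T2 = Ea2 * T / Ea
print(f"equal rate at {T2:.0f} K instead of {T:.0f} K "
      f"({T - T2:.0f} K cooler)")
```

Even a 1% drop in activation energy buys several degrees at these conditions, and because the processes run at enormous scale, those degrees translate into large absolute energy savings.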

New atomic-scale understanding of catalysis could unlock massive energy savings, Jason Daley, University of Wisconsin–Madison

Read more…

Dinosaurs and Dodos...


Credit: Andrzej Puchta/Alamy Stock Photo

Topics: Asteroids, Astronomy, Astrophysics, Civilization, Computer Modeling

The following article is a little personal: it simulated the destruction of my hometown, two days after my sixtieth birthday.

*****

On August 16, 2022, an approximately 70-meter asteroid entered Earth’s atmosphere. At 2:02:10 P.M. EDT, the space rock exploded eight miles over Winston-Salem, N.C., with the energy of 10 megatons of TNT. The airburst virtually leveled the city and surrounding area. Casualties were in the thousands.

Well, not really. The destruction of Winston-Salem was the storyline of the fourth Planetary Defense Tabletop Exercise, run by NASA’s Planetary Defense Coordination Office. The exercise was a simulation where academics, scientists, and government officials gathered to practice how the United States would respond to a real planet-threatening asteroid. Held February 23–24, the exercise drew more than 200 virtual and in-person participants from 16 different federal, state, and local organizations, hailing from Washington, D.C., the Johns Hopkins Applied Physics Lab (APL) campus in Laurel, Md., Raleigh, and Winston-Salem, N.C. On August 5, the final report came out, and the message was stark: humanity is not yet ready to meet this threat.

On the plus side, the exercise was meant to be hard—practically unwinnable. “We designed it to fall right into the gap in our capabilities,” says Emma Rainey, an APL senior scientist who helped to create the simulation. “The participants could do nothing to prevent the impact.” The main goal was to test the different government and scientific networks that should respond in a real-life planetary defense situation. “We want to see how effective operations and communications are between U.S. government agencies and the other organizations that would be involved, and then identify shortcomings,” says Lindley Johnson, planetary defense officer at NASA headquarters.
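The scenario's headline figure passes a rough sanity check. A back-of-the-envelope kinetic-energy estimate (the density and entry speed below are generic assumptions, not values from the exercise) lands in the same ballpark as the quoted 10 megatons:

```python
# Back-of-the-envelope check of the exercise's ~10-megaton figure for a
# ~70 m asteroid (density and entry speed are assumptions, not exercise data).
import math

diameter_m = 70.0
density = 3000.0     # kg/m^3, typical stony asteroid (assumed)
speed = 14e3         # m/s, plausible atmospheric entry speed (assumed)
MT_TNT_J = 4.184e15  # joules per megaton of TNT

radius = diameter_m / 2
volume = (4 / 3) * math.pi * radius ** 3
mass = density * volume                 # ~5.4e8 kg
energy_j = 0.5 * mass * speed ** 2      # kinetic energy at entry
print(f"kinetic energy ≈ {energy_j / MT_TNT_J:.0f} Mt TNT")
```

Because the energy scales with the cube of the diameter and the square of the speed, even modest uncertainty in those inputs moves the answer by megatons, which is one reason rapid characterization of an incoming object matters so much.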

NASA Asteroid Threat Practice Drill Shows We’re Not Ready, Matt Brady, Scientific American

Read more…

Syncing Fireflies...


Some fireflies have a mystifying gift for flashing their abdomens in sync. New observations are overturning long-accepted explanations for how the synchronization occurs, at least for some species.

Topics: Biology, Biomimetics, Biotechnology, Computer Modeling, Mathematics

In Japanese folk traditions, they symbolize departing souls or silent, ardent love. Some Indigenous cultures in the Peruvian Andes view them as the eyes of ghosts. And across various Western cultures, fireflies, glow-worms, and other bioluminescent beetles have been linked to a dazzling and at times contradictory array of metaphoric associations: “childhood, crop, doom, elves, fear, habitat change, idyll, love, luck, mortality, prostitution, solstice, stars and fleetingness of words and cognition,” as one 2016 review noted.

Physicists revere fireflies for reasons that might seem every bit as mystical: Of the roughly 2,200 species scattered around the world, a handful has the documented ability to flash in synchrony. In Malaysia and Thailand, firefly-studded mangrove trees can blink on the beat as if strung up with Christmas lights; every summer in Appalachia, waves of eerie concordance ripple across fields and forests. The fireflies’ light shows lure mates and crowds of human sightseers, but they have also helped spark some of the most fundamental attempts to explain synchronization, the alchemy by which elaborate coordination emerges from even very simple individual parts.
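The canonical starting point for those attempts is the Kuramoto model, in which each oscillator's phase is nudged toward the population average. The sketch below is that generic textbook model (with assumed parameter values), not the new firefly observations themselves:

```python
# The classic Kuramoto model, the textbook baseline for firefly-style
# synchronization (a generic sketch; parameters are illustrative).
# Each oscillator's phase is pulled toward the population mean phase.
import cmath
import math
import random

random.seed(2)

N, K, dt, steps = 50, 2.0, 0.01, 3000
freqs = [random.gauss(1.0, 0.1) for _ in range(N)]    # natural frequencies
theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def order_parameter(phases):
    """r in [0, 1]: 0 = incoherent flashing, 1 = perfect sync."""
    return abs(sum(cmath.exp(1j * p) for p in phases)) / len(phases)

r_start = order_parameter(theta)
for _ in range(steps):
    mean_field = sum(cmath.exp(1j * p) for p in theta) / N
    r, psi = abs(mean_field), cmath.phase(mean_field)
    # d(theta_i)/dt = w_i + K * r * sin(psi - theta_i), forward Euler step
    theta = [p + dt * (w + K * r * math.sin(psi - p))
             for p, w in zip(theta, freqs)]
r_end = order_parameter(theta)
print(f"coherence: {r_start:.2f} -> {r_end:.2f}")
```

With coupling above a critical threshold, coherence climbs from near zero toward one, which is exactly the emergence-from-simple-parts story that fireflies helped inspire; the new observations probe where real fireflies depart from this idealization.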

Orit Peleg remembers when she first encountered the mystery of synchronous fireflies as an undergraduate studying physics and computer science. The fireflies were presented as an example of how simple systems achieve synchrony in Nonlinear Dynamics and Chaos, a textbook by the mathematician Steven Strogatz that her class was using. Peleg had never even seen a firefly, as they are uncommon in Israel, where she grew up.

“It’s just so beautiful that it somehow stuck in my head for many, many years,” she said. But by the time Peleg began her own lab, applying computational approaches to biology at the University of Colorado and at the Santa Fe Institute, she had learned that although fireflies had inspired a lot of math, quantitative data describing what the insects were actually doing was scant.

How Do Fireflies Flash in Sync? Studies Suggest a New Answer. Joshua Sokol, Quanta Magazine

Read more…

HETs...


FIG. 1. Temporal evolution of chamber pressure assuming nominal operation for 30 s followed by a 40 s interval with flow rate reduced 100×. The colors correspond to 1 kW, 10 kW, 100 kW, and 1 MW power levels. The process is then repeated.

Topics: Applied Physics, Computer Modeling, NASA, Space Exploration, Spaceflight

Abstract

Hall effect thrusters operating at power levels in excess of several hundreds of kilowatts have been identified as enabling technologies for applications such as lunar tugs, large satellite orbital transfer vehicles, and solar system exploration. These large thrusters introduce significant testing challenges due to the propellant flow rate exceeding the pumping speed available in most laboratories. Even with proposed upgrades in mind, the likelihood that multiple vacuum facilities will exist in the near future to allow long-duration testing of high-power Hall thrusters operating at power levels in excess of 100 kW remains extremely low. In this article, we numerically explore the feasibility of testing Hall thrusters in a quasi-steady mode defined by pulsing the mass flow rate between a nominal and a low value. Our simulations indicate that sub-second durations available before the chamber reaches critical pressure are sufficiently long to achieve the steady-state current and flow field distributions, allowing us to characterize thruster performance and the near plume region.

I. INTRODUCTION

Hall effect thrusters (HETs) are spacecraft electric propulsion (EP) devices routinely used for orbit raising, repositioning, and solar system exploration applications. To date, the highest power Hall thruster flown is the 4.5 kW BPT-4000 launched in 2010 aboard the Advanced EHF satellite1 (which the HET helped to deliver to the correct orbit after a failure of the primary chemical booster), although a 13 kW system is being readied for near-term flight operation as part of the Lunar Gateway,2 and thrusters at 50–100 kW power levels have been demonstrated in the laboratory.3,4 Solar cell advancements and a renewed interest in nuclear power have led the aerospace community to consider the use of Hall thrusters operating at even higher power levels. Multi-hundred kW EP systems would offer an economical solution for LEO to GEO orbit raising or for the deployment of an Earth-to-Moon delivery tug, and power levels in excess of 600 kW could be utilized for crewed transport to Mars.5–9 While such power levels could be delivered using existing devices, a single large thruster requires less system mass and has a smaller footprint than a cluster of smaller devices.10
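The pressure dynamics behind the quasi-steady pulsing idea can be sketched with the standard lumped vacuum-chamber balance, V dP/dt = Q − S·P, where Q is the gas throughput and S the pumping speed. All parameter values below are illustrative assumptions, not the paper's numbers:

```python
# Minimal sketch of the pulsed-flow chamber-pressure balance,
# dP/dt = (Q - S*P) / V, integrated with forward Euler.
# Every parameter value here is an illustrative assumption,
# not taken from the Journal of Applied Physics paper.
V = 100.0           # chamber volume, m^3 (assumed)
S = 200.0           # pumping speed, m^3/s (assumed)
Q_on = 1.0          # throughput at nominal flow, Pa*m^3/s (assumed)
Q_off = Q_on / 100  # flow rate reduced 100x, as in the pulsing scheme
dt = 0.01           # time step, s

def simulate(P0, on_s=30.0, off_s=40.0):
    """One nominal/low-flow pulse; returns pressure at end of each phase."""
    P = P0
    for _ in range(int(on_s / dt)):      # 30 s of nominal flow
        P += dt * (Q_on - S * P) / V
    P_on = P
    for _ in range(int(off_s / dt)):     # 40 s at reduced flow
        P += dt * (Q_off - S * P) / V
    return P_on, P

P_on, P_off = simulate(P0=0.0)
print(f"end of nominal phase: {P_on:.2e} Pa; "
      f"end of low-flow phase: {P_off:.2e} Pa")
```

In this toy version the pressure relaxes toward Q/S in each phase with time constant V/S; the paper's point is that for very high-power thrusters the nominal-flow phase must be kept short, since the chamber approaches its critical pressure in under a second.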

Quasi-steady testing approach for high‐power Hall thrusters, Lubos Brieda, Yevgeny Raitses, Edgar Choueiri, Roger Myers, Michael Keidar, Journal of Applied Physics

Read more…

Modeling Spread...


Image Source: Coronavirus and COVID-19: What You Should Know (WebMD)

Topics: Biology, Computer Modeling, COVID-19, Research

TOKYO (Reuters) – A Japanese supercomputer showed that humidity can have a large effect on the dispersion of virus particles, pointing to heightened coronavirus contagion risks in dry, indoor conditions during the winter months.

The finding suggests that the use of humidifiers may help limit infections during times when window ventilation is not possible, according to a study released on Tuesday by research giant Riken and Kobe University.

The researchers used the Fugaku supercomputer to model the emission and flow of virus-like particles from infected people in a variety of indoor environments.

Air humidity of lower than 30% resulted in more than double the amount of aerosolized particles compared to levels of 60% or higher, the simulations showed.

The study also indicated that clear face shields are not as effective as masks in preventing the spread of aerosols. Other findings showed that diners face more risk from people seated beside them than from those across the table, and that choruses should limit the number of singers and space them apart.

Japan supercomputer shows humidity affects aerosol spread of coronavirus, Rocky Swift, Reuters Science

Read more…