
Wages of the Thermal Budget...


 

Topics: Applied Physics, Astrobiology, Astrophysics, Civilization, Climate Change, Existentialism, Exoplanets, SETI, Thermodynamics

 

Well, this firmly puts a kink in the "Fermi Paradox."

 

The Industrial Revolution started in Britain around 1760 to 1840, and there was a colloquial saying that "the sun did not set on the British Empire." The former colony, America, cranked up its own industrial revolution around 1790. Mary Shelley birthed the science fiction genre with the dystopian Frankenstein in 1818, around the time of a climate-induced shift in European weather and a noticeable drop in temperature. It was also a warning about the overconfidence of science, the morality that should be considered when designing new technologies, their impact on the environment, and humans who, sadly, don't think of themselves as part of that environment. Science fiction divides between the dystopian and the Pollyannish: Star Trek mythology struck that delicate balance between its fictional Eugenics Wars, World War III, the "Atomic Horror," and a 21st-century dark age, and the discovery of superluminal space travel and First Contact with benevolent, pointy-eared aliens, leading to utopia once the xenophobia passed. We somehow abandoned countries and currency, and thus previous hierarchical modalities of power and inequality. Roddenberry's dream was a secular version of Asgard, Heaven, Olympus, and Svarga: a notion of continuance for a species aware of its finite existence, buttressed by science and space lasers.

 

If aliens had a similar industrial revolution, they perhaps created currencies that allowed for trade and commerce, hierarchies to decide who would hoard resources, and which parts of their societies were functionally peasantry. They would separate by tribes, complexions, and perhaps stripes if they're aquatic, and fight territorial wars over resources. Those wars would throw a lot of carbon dioxide into their oxygenated atmospheres. Selfishness, hoarding disorder, and avarice would convince the aliens that the weather patterns were "a hoax," and they would pay the equivalent of lawyers to obfuscate the reality of their situation until it was too late to reverse the effects on their worlds. If they were colonizing the stars, it wouldn't be for the altruistic notion of expanding their knowledge by "seeking out new life and new civilizations": they would have exceeded the thermal budgets of their previous planets. Changing their galactic zip codes would only change the locations of their eventual outcomes.

 

Thermodynamics wins, and Lord Kelvin may have answered Enrico Fermi's question. Far be it from me to adjudicate whether or not anyone has had a "close encounter of the third kind," but I don't see starships coming out of this scenario. Cogito ergo sum homo stultus.

 

It may take less than 1,000 years for an advanced alien civilization to destroy its own planet with climate change, even if it relies solely on renewable energy, a new model suggests.

 

When astrophysicists simulated the rise and fall of alien civilizations, they found that, if a civilization were to experience exponential technological growth and energy consumption, it would have less than 1,000 years before the alien planet got too hot to be habitable. This would be true even if the civilization used renewable energy sources, due to inevitable leakage in the form of heat, as predicted by the laws of thermodynamics. The new research was posted to the preprint database arXiv and is in the process of being peer-reviewed.

 

While the astrophysicists wanted to understand the implications for life beyond our planet, their study was initially inspired by human energy use, which has grown exponentially since the 1800s. In 2023, humans used about 180,000 terawatt-hours (TWh), roughly the amount of energy that sunlight delivers to the entire Earth in a single hour. Much of this energy is produced by gas and coal, which is heating up the planet at an unsustainable rate. But even if all that energy were created by renewable sources like wind and solar power, humanity would keep growing, and thus keep needing more energy.
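The scale of those figures is easy to sanity-check with back-of-envelope arithmetic. This is a sketch: the solar constant (1361 W/m²) and Earth's radius are standard reference values, and the 180,000 TWh total is the article's 2023 figure.

```python
# Back-of-envelope check of the energy figures quoted above.
import math

HOURS_PER_YEAR = 8766              # 365.25 days
human_use_twh = 180_000            # global energy use in 2023, TWh

# Average human power draw in terawatts
avg_power_tw = human_use_twh / HOURS_PER_YEAR
print(f"Average human power draw: {avg_power_tw:.1f} TW")

# Solar power intercepted by Earth's cross-section: S * pi * R^2
S = 1361.0                         # solar constant, W/m^2
R = 6.371e6                        # Earth's mean radius, m
solar_tw = S * math.pi * R**2 / 1e12
print(f"Solar power intercepted: {solar_tw:,.0f} TW")

# A year of human energy use is roughly one hour of sunlight on Earth
print(f"Equivalent sunlight: {human_use_twh / solar_tw:.1f} hours")
```

The draw works out to about 20 TW of continuous power against roughly 173,000 TW of intercepted sunlight, which is why a year of civilization's energy use matches about one hour of planetary insolation.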

 

"This brought up the question, 'Is this something that is sustainable over a long period of time?'" Manasvi Lingam, an astrophysicist at Florida Tech and a co-author of the study, told Live Science in an interview.

 

Lingam and his co-author Amedeo Balbi, an associate professor of astronomy and astrophysics at Tor Vergata University of Rome, were interested in applying the second law of thermodynamics to this problem. This law says that there is no perfect energy system, where all energy created is efficiently used; some energy must always escape the system. This escaped energy will cause a planet to heat up over time.

 

"You can think of it like a leaky bathtub," Lingam said. If a bathtub that is holding only a little water has a leak, only a small amount can get out, he explained. But as the bathtub is filled more and more — as energy levels increase exponentially to meet demand — a small leak can suddenly turn into a flooded house.

 

Alien civilizations are probably killing themselves from climate change, bleak study suggests, Sierra Bouchér, Live Science

 

Read more…

Lasers and Plasma...


A researcher holds the scaffolding with tiny copper foils attached. These copper pieces will be struck with lasers, heating them to thousands of degrees Fahrenheit.

Credit: Hiroshi Sawada

Topics: Applied Physics, Lasers, Materials Science, Plasma, Radiation, Thermodynamics

For the first time, researchers monitor the heat progression in laser-created plasma that occurs in only a few trillionths of a second.

A team of researchers supported by the U.S. National Science Foundation has developed a new method of tracking the ultra-fast heat progression in warm, dense matter plasmas — the type of matter created when metals are struck with high-powered lasers. Published in Nature Communications, the results of this study will help researchers better understand not only how plasma forms when metal is heated by high-powered lasers but also what's happening within the cores of giant planets and even aid in the development of fast ignition laser fusion with energy-generating potential here on Earth.

The research team aimed a high-powered laser at very thin strips of copper, which heated to 200,000 degrees Fahrenheit and momentarily shifted to a warm, dense matter plasma state before exploding. At the same time, the researchers used ultrashort-duration X-ray pulses from an X-ray free-electron laser to capture images of the copper's transformation down to a few picoseconds or trillionths of a second. By doing so, the researchers were able to observe the ultra-fast and microscopic transformation of matter.

"These findings shed new light on fundamental properties of plasmas in the warm dense matter state," says Vyacheslav Lukin, NSF program director for Plasma Physics. "The new methods to probe the plasma developed by this international team of researchers may also inform future experiments at extremely high-powered lasers, such as the NSF ZEUS Laser Facility."

Researchers track plasma creation using a novel ultra-fast laser method, National Science Foundation

Read more…

Nano Over Nukes...


Heat trap: the proposed nanoparticle warming method. (Courtesy: Aaron M. Geller, Northwestern Center for Interdisciplinary Exploration and Research in Astrophysics)

Topics: Aerogels, Exoplanets, Mars, Materials Science, Nanomaterials, NASA, Planetary Science, Thermodynamics

Suffice it to say, Mr. Musk's idea of nuking the Martian planet is impractical and a nonstarter, but to show that he's mature about it, he has T-shirts, because that always makes bad ideas palatable, like a spoonful of sugar to help bitter medicine go down (Mary Poppins thought so). The "real-life Tony Stark" he's not.

If humans released enough engineered nanoparticles into the atmosphere of Mars, the planet could become more than 30 K warmer – enough to support some forms of microbial life. This finding is based on theoretical calculations by researchers in the US, and it suggests that “terraforming” Mars to support temperatures that allow for liquid water may not be as difficult as previously thought.

“Our finding represents a significant leap forward in our ability to modify the Martian environment,” says team member Edwin Kite, a planetary scientist at the University of Chicago.

Today, Mars is far too cold for life as we know it to thrive there. But it may not have always been this way. Indeed, streams may have flowed on the red planet as recently as 600,000 years ago. The idea of returning Mars to this former, warmer state – terraforming – has long kindled imagination, and scientists have proposed several ways of doing it.

One possibility would be to increase the levels of artificial greenhouse gases, such as chlorofluorocarbons, in Mars’ currently thin atmosphere. However, this would require volatilizing roughly 100,000 megatons of fluorine, an element that is scarce on the red planet’s surface. This means that essentially all the fluorine required would need to be transported to Mars from somewhere else – something that is not really feasible.

An alternative would be to use materials already present on Mars’ surface, such as those in aerosolized dust. Natural Martian dust is mainly made of iron-rich minerals distributed in particles roughly 1.5 microns in radius, which are easily lofted to altitudes of 60 km and more. In its current form, this dust actually lowers daytime surface temperatures by attenuating infrared solar radiation. A modified form of dust might, however, experience different interactions. Could this modified dust make the planet warmer?

Nanoparticles designed to trap escaping heat and scatter sunlight

In a proof-of-concept study, Kite and colleagues at the University of Chicago, the University of Central Florida, and Northwestern University analyzed the atmospheric effects of nanoparticles shaped like short rods about nine microns long, which is about the same size as commercially available glitter. These particles have an aspect ratio of around 60:1, and Kite says they could be made from readily available Martian materials such as iron or aluminum.

To make Mars warmer, just add nanorods, Isabelle Dumé, Physics World

Read more…

Matrix...


(a) Schematic of the word INFORMATION written on a material in binary code using magnetic recording. Red denotes magnetization pointing out of the plane, and blue denotes magnetization pointing into the plane. (b)–(d) Time evolution of the digital magnetic recording information states simulated using micromagnetic Monte Carlo. (b) Initial random state. (c) INFORMATION is written (t = 0 s). (d) Iteration 930 (t = 1395 s), showing the degradation of information states. Reproduced with permission from M. M. Vopson and S. Lepadatu, AIP Adv. 12, 075310 (2022). Copyright 2022 AIP Publishing.

Topics: Chemistry, DNA, General Relativity, Genetics, Nucleotides, Thermodynamics

Reference: Electronic Orbitals, Chemistry LibreTexts (libretexts.org)

As Morpheus describes, “You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill; you stay in Wonderland. And I show you how deep the rabbit hole goes.” Neo takes the red pill and wakes up in the real world. Source: Britannica Online: Red Pill and Blue Pill Symbolism

The simulation hypothesis is a philosophical theory in which the entire universe and our objective reality are just simulated constructs. Despite the lack of evidence, this idea is gaining traction in scientific circles as well as in the entertainment industry. Recent scientific developments in the field of information physics, such as the publication of the mass-energy-information equivalence principle, appear to support this possibility. In particular, the 2022 discovery of the second law of information dynamics (infodynamics) facilitates new and interesting research tools at the intersection between physics and information. In this article, we re-examine the second law of infodynamics and its applicability to digital information, genetic information, atomic physics, mathematical symmetries, and cosmology, and we provide scientific evidence that appears to underpin the simulated universe hypothesis.

Introduction

In 2022, a new fundamental law of physics was proposed and demonstrated, called the second law of information dynamics, or simply the second law of infodynamics [1]. Its name is an analogy to the second law of thermodynamics, which describes the time evolution of the physical entropy of an isolated system and requires the entropy to remain constant or to increase over time. In contrast, the second law of infodynamics states that the information entropy of systems containing information states must remain constant or decrease over time, reaching a certain minimum value at equilibrium. This surprising observation has massive implications for all branches of science and technology. With the ever-increasing importance of information systems such as digital information storage or biological information stored in DNA/RNA genetic sequences, this new physics law offers an additional tool for examining these systems and their time evolution [2].
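The quantity at stake is Shannon's information entropy, H = −Σ pᵢ log₂ pᵢ, which the second law of infodynamics claims must remain constant or decrease over time. A minimal sketch of how that quantity is computed for a string of recorded states:

```python
# Shannon information entropy of a sequence of symbols:
# H = -sum(p_i * log2(p_i)), in bits per symbol.
from collections import Counter
from math import log2

def shannon_entropy(symbols) -> float:
    counts = Counter(symbols)
    n = len(symbols)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

# A perfectly ordered register carries zero information entropy...
print(shannon_entropy("0000000000"))   # → 0.0
# ...while a balanced mix of 0s and 1s maximizes it at 1 bit per symbol.
print(shannon_entropy("0101010101"))   # → 1.0
```

This is the same entropy measure applied in the figure above to the written word INFORMATION as its magnetic states degrade.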

The second law of infodynamics and its implications for the simulated universe hypothesis, Melvin M. Vopson, AIP Advances

Read more…

Spongy Narks...


Scientists used samples from sclerosponges off the coast of Puerto Rico to calculate ocean surface temperatures going back 300 years. Douglas Rissing/iStockphoto/Getty Images

Topics: Climate Change, Existentialism, Global Warming, Research, Thermodynamics

CNN — Using sponges collected off the coast of Puerto Rico in the eastern Caribbean, scientists have calculated 300 years of ocean temperatures and concluded the world has already overshot one crucial global warming limit and is speeding toward another.

These findings, published Monday in the journal Nature Climate Change, are alarming but also controversial. Other scientists say the study contains too many uncertainties and limitations to draw such firm conclusions and could end up confusing public understanding of climate change.

Sponges — which grow slowly, layer by layer — can act like data time capsules, allowing a glimpse into what the ocean was like hundreds of years ago, long before the existence of modern data.

Using samples from sclerosponges, which live for centuries, the team of international scientists was able to calculate ocean surface temperatures going back 300 years.

They found human-caused warming may have started earlier than currently assumed and, as a result, global average temperature may have already warmed more than 1.5 degrees Celsius above pre-industrial levels. Researchers say the results also suggest global temperature could overshoot 2 degrees of warming by the end of the decade.

Under the 2015 Paris Agreement, countries pledged to restrict global warming to less than 2 degrees above pre-industrial levels, with an ambition to limit it to 1.5 degrees. The pre-industrial era — or the state of the climate before humans started burning large amounts of fossil fuels and warming the planet — is commonly defined as 1850-1900.

Data from centuries-old sea creatures suggest the world is warming faster than scientists thought, Rachel Ramirez, CNN

Read more…

Scandium and Superconductors...


Scandium is the only known elemental superconductor to have a critical temperature in the 30 K range. This phase diagram shows the superconducting transition temperature (Tc) and crystal structure versus pressure for scandium. The measured results for all five samples studied show consistent trends. (Courtesy: Chinese Phys. Lett. 40 107403)

Topics: Applied Physics, Chemistry, Condensed Matter Physics, Materials Science, Superconductors, Thermodynamics

Scandium remains a superconductor at temperatures above 30 K (-243.15 Celsius, -405.67 Fahrenheit), making it the first element known to superconduct at such a high temperature. The record-breaking discovery was made by researchers in China, Japan, and Canada, who subjected the element to pressures of up to 283 GPa – around 2.8 million times the atmospheric pressure at sea level.

Many materials become superconductors – that is, they conduct electricity without resistance – when cooled to low temperatures. The first superconductor to be discovered, for example, was solid mercury in 1911, and its transition temperature Tc is only a few degrees above absolute zero. Several other superconductors were discovered shortly afterward with similarly frosty values of Tc.

In the late 1950s, the Bardeen–Cooper–Schrieffer (BCS) theory explained this superconducting transition as the point at which electrons overcome their mutual electrical repulsion to form so-called “Cooper pairs” that then travel unhindered through the material. But beginning in the late 1980s, a new class of “high-temperature” superconductors emerged that could not be explained using BCS theory. These materials have Tc above the boiling point of liquid nitrogen (77 K), and they are not metals. Instead, they are insulators containing copper oxides (cuprates), and their existence suggests it might be possible to achieve superconductivity at even higher temperatures.

The search for room-temperature superconductors has been on ever since, as such materials would considerably improve the efficiency of electrical generators and transmission lines while also making common applications of superconductivity (including superconducting magnets in particle accelerators and medical devices like MRI scanners) simpler and cheaper.

Scandium breaks temperature record for elemental superconductors, Isabelle Dumé, Physics World

Read more…

Cooling Circuitry...


Illustration of a UCLA-developed solid-state thermal transistor using an electric field to control heat movement. Credit: H-Lab/UCLA

Topics: Applied Physics, Battery, Chemistry, Electrical Engineering, Energy, Thermodynamics

A new thermal transistor can control heat as precisely as an electrical transistor can control electricity.

From smartphones to supercomputers, electronics have a heat problem. Modern computer chips suffer from microscopic “hotspots” with power density levels that exceed those of rocket nozzles and even approach that of the sun’s surface. Because of this, more than half the total electricity burned at U.S. data centers isn’t used for computing but for cooling. Many promising new technologies—such as 3-D-stacked chips and renewable energy systems—are blocked from reaching their full potential by errant heat that diminishes a device’s performance, reliability, and longevity.

“Heat is very challenging to manage,” says Yongjie Hu, a physicist and mechanical engineer at the University of California, Los Angeles. “Controlling heat flow has long been a dream for physicists and engineers, yet it’s remained elusive.”

But Hu and his colleagues may have found a solution. As reported last November in Science, his team has developed a new type of transistor that can precisely control heat flow by taking advantage of the basic chemistry of atomic bonding at the single-molecule level. These “thermal transistors” will likely be a central component of future circuits and will work in tandem with electrical transistors. The novel device is already affordable, scalable, and compatible with current industrial manufacturing practices, Hu says, and it could soon be incorporated into the production of lithium-ion batteries, combustion engines, semiconductor systems (such as computer chips), and more.

Scientists Finally Invent Heat-Controlling Circuitry That Keeps Electronics Cool, Rachel Nuwer, Scientific American

Read more…

Bitcoin and Gaia...


"What are the environmental impacts of cryptocurrency?" Written by Paul Kim; edited by Jasmine Suarez Mar 17, 2022, 5:21 PM EDT, Business Insider.

 Image: Ethereum, the second biggest cryptocurrency on the market, plans on changing to proof of stake mining in the future. Rachel Mendelson/Insider

 

Topics: Applied Physics, Computer Science, Cryptography, Economics, Environment, Star Trek, Thermodynamics

In what is now “old school Internet” (or web surfing for fogies), I will get a friend request from someone on Facebook/Meta who is in cryptocurrency. I quote myself in the first paragraph of what I refer to as my “public service announcement”:

I am not INTERESTED in crypto. As someone who worked with cryptography as a matter of national security, holding a TS/SCI clearance, when you start your message with “let me explain to YOU how crypto works,” expect to get blocked.

Invariably, I still do, which makes me wonder if they read the PSA or think “they will be the one” to sign me. News flash, pilgrim...I now have another pertinent reason to ignore your blockchain solicitations, actually, several good reasons.

Every time we turn on a light in our homes, there is a thermal budget that we are being charged for (that's how Duke Power makes its money in North Carolina, and Pedernales Electric Cooperative in Texas). Bitcoin/Blockchain (I think) caught the imagination because it seemed like a "Federation Credit" from Star Trek, where no one fully explains how a society that is "post-scarcity" somehow feels the need for some type of currency in utopia. It's kind of like magic carpets: you go with the bit for the sake of the story: warp drive, Heisenberg compensators, Federation credits. What matters is the story, and whether you are thoroughly entertained after the denouement, not the physics.

You might not be extracting anything from the planet directly, but Bitcoin mining has a massive impact on the planet’s environment.

Mining resources from our planet can take a devastating toll on the environment, both local and global. Even beyond this, using the resource could cause disastrous effects on our planet, and dependence on a single resource can wreak havoc on a country’s economy. Yet, many of these resources are needed for our daily lives -- sometimes as a luxury, sometimes as a necessity. Any responsible country or company should always take pause to consider what impact mining of any kind can have on the planet.

It turns out that these days, one type of mining might be the worst for Earth’s environment: bitcoins. Yes, the “mining” of virtual currency makes its mark on our planet. The unequal distribution of Bitcoin mining across the globe means that some countries are making a much larger dent into the planet’s climate and environment than others ... all for a “resource” that is far from necessary for our society.

Bitcoin mining uses a lot of computing power to solve the cryptographic puzzles that lie at the heart of the industry. As of today (October 30, 2023), each Bitcoin is worth over $34,000, and with the multitude of other cryptocoins out there, using computers to unlock more can be a profitable endeavor. Almost half a trillion dollars of the global economy runs on these “virtual currencies.”
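Those "cryptographic puzzles" are proof-of-work searches: miners grind through nonces looking for a hash below a target. A toy sketch of the idea (illustrative only; real Bitcoin double-hashes an 80-byte block header with SHA-256 against a vastly harder target):

```python
# Toy proof-of-work: find a nonce whose SHA-256 digest starts with a
# given number of leading hex zeros. Each extra zero multiplies the
# expected work by 16, which is why difficulty translates into electricity.
import hashlib

def mine(data: str, difficulty: int) -> int:
    """Return the first nonce whose digest has `difficulty` leading hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("block payload", 4)   # ~65,000 hash attempts on average
print(nonce, hashlib.sha256(f"block payload{nonce}".encode()).hexdigest())
```

The only way to win is brute-force hashing, so the puzzle's entire security budget is paid in computation, and therefore in heat.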

Worst Kind of Mining for the Environment? It Might Be Bitcoin. Erik Klemetti, Discover Magazine

 

Read more…

TEG...


The new self-powered thermoelectric generator device uses an ultra-broadband solar absorber (UBSA) to capture sunlight, which heats the generator. Simultaneously, another component called a planar radiative cooling emitter (RCE) cools part of the device by releasing heat. Credit: Haoyuan Cai, Jimei University

Topics: Alternate Energy, Battery, Chemistry, Energy, Materials Science, Thermodynamics

Researchers have developed a new thermoelectric generator (TEG) that can continuously generate electricity using heat from the sun and a radiative element that releases heat into the air. Because it works during the day or night and in cloudy conditions, the new self-powered TEG could provide a reliable power source for small electronic devices such as outdoor sensors.

"Traditional power sources like batteries are limited in capacity and require regular replacement or recharging, which can be inconvenient and unsustainable," said research team leader Jing Liu from Jimei University in China. "Our new TEG design could offer a sustainable and continuous energy solution for small devices, addressing the constraints of traditional power sources like batteries."

TEGs are solid-state devices that use temperature differences to generate electricity without moving parts. In the journal Optics Express, Liu and a multi-institutional team of researchers describe and demonstrate a new TEG that can simultaneously generate the heat and cold necessary to create a temperature difference large enough to generate electricity even when the sun isn't out. The passive power source is made of components that can easily be manufactured.
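The effect a TEG exploits is the Seebeck effect: a temperature difference ΔT across a thermoelectric module produces an open-circuit voltage V = S·ΔT. A minimal sketch with illustrative, assumed numbers for a small bismuth-telluride module (not the parameters of the device in the paper):

```python
# Seebeck-effect back-of-envelope for a thermoelectric module.
# All parameter values are illustrative assumptions.
S_module = 0.05    # effective module Seebeck coefficient, V/K (assumed)
R_int = 2.0        # module internal resistance, ohms (assumed)
dT = 10.0          # hot side minus cold side, K

V_oc = S_module * dT             # open-circuit voltage, V = S * dT
# Maximum power transfer occurs into a matched load, R_load = R_int:
P_max = V_oc**2 / (4 * R_int)
print(f"V_oc = {V_oc:.2f} V, P_max = {P_max * 1e3:.2f} mW")
```

Tens of milliwatts from a 10 K difference is ample for the low-power outdoor sensors the article mentions, which is why maintaining ΔT day and night is the whole design problem.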

"The unique design of our self-powered thermoelectric generator allows it to work continuously, no matter the weather," said Liu. "With further development, our TEG has the potential to impact a wide range of applications, from remote sensors to wearable electronics, promoting a more sustainable and eco-friendly approach to powering our daily lives."

New passive device continuously generates electricity during the day or night, Optica/Tech Explore

Read more…

Solid-State Cooling...


Cool stuff: the diagram shows how the temperature of the caloric material was measured. The plot in the center shows the temperature change in the sample when exposed to a magnetic field. The plot on the right shows the change in temperature when the sample is strained. (Courtesy: Peng Wu et al/Acta Materialia 237 118154)

Topics: Global Warming, Green Tech, Materials Science, Solid-State Physics, Thermodynamics

Researchers in China have shown that applying strain to a composite material using an electric field induces a large and reversible caloric effect. This novel way of enhancing the caloric effect without a magnetic field could open new avenues of solid-state cooling and lead to more energy-efficient and lighter refrigerators.

The International Institute of Refrigeration estimates that 20% of all electricity used globally is expended on vapor-compression refrigeration – which is the technology used in conventional refrigerators and air conditioners. What is more, the refrigerants used in these systems are powerful greenhouse gases that contribute significantly to global warming. As a result, scientists are trying to develop more environmentally friendly refrigeration systems.

Cooling systems can also be made from completely solid-state systems, but these cannot currently compete with vapor compression for most mainstream applications. Today, most commercial solid-state cooling systems use the Peltier effect, which is a thermoelectric process that suffers from high cost and low efficiency.

Solid-state cooling is achieved via electric field-induced strain, Hardepinder Singh, Physics World

Read more…

Fourth Signature...


How can you tell if a material is a superconductor? Four classic signatures are illustrated here. Left to right: 1) It conducts electricity with no resistance when chilled below a certain temperature. 2) It expels magnetic fields, so a magnet placed on top of it will levitate. 3) Its heat capacity – the amount of heat needed to raise its temperature by a given amount – shows a distinctive anomaly as the material transitions to a superconducting state. 4) And at that same transition point, its electrons pair up and condense into a sort of electron soup that allows current to flow freely. Now experiments at SLAC and Stanford have captured this fourth signature in cuprates, which become superconducting at relatively high temperatures, and show that it occurs in two distinct steps and at very different temperatures. Knowing how that happens in fine detail suggests a new and very practical direction for research into these enigmatic materials. (Courtesy: Greg Stewart, SLAC National Accelerator Laboratory)

Topics: Condensed Matter Physics, Superconductor, Thermodynamics

Researchers in the US report that they have observed the so-called “fourth signature” of superconducting phase transitions in materials known as cuprates. The result, obtained via photoemission spectroscopy of a cuprate called Bi2212, could shed fresh light on how these materials, which conduct electricity without resistance at temperatures of 77 K or higher, transition into the superconducting state.

The superconducting transition occurs when a material loses all resistance to an electrical current below a certain critical temperature Tc. At this temperature, bulk materials exhibit four characteristic “signatures” – electrical, magnetic, thermodynamic, and spectroscopic – indicating that the transition has occurred. The electrical signature is the development of zero resistance. The magnetic signature is the onset of the Meissner effect – that is, the material expels magnetic fields. And the thermodynamic signature is that the material’s heat capacity (the amount of heat required to increase its temperature by a given value) displays a distinctive anomaly.

Elusive superconducting-transition signature seen for the first time, Isabelle Dumé, Physics World

Read more…

Thermo Limits


A radical reimagining of information processing could greatly reduce the energy use—as well as greenhouse gas emissions and waste heat—from computers. Credit: vchal/Getty Images

Topics: Climate Change, Computer Science, Electrical Engineering, Global Warming, Semiconductor Technology, Thermodynamics

In case you had not noticed, computers are hot—literally. A laptop can pump out thigh-baking heat, while data centers consume an estimated 200 terawatt-hours each year—comparable to the energy consumption of some medium-sized countries. The carbon footprint of information and communication technologies as a whole is close to that of fuel used in the aviation industry. And as computer circuitry gets ever smaller and more densely packed, it becomes more prone to melting from the energy it dissipates as heat.

Now physicist James Crutchfield of the University of California, Davis, and his graduate student Kyle Ray have proposed a new way to carry out computation that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their approach, described in a recent preprint paper, could bring heat dissipation below even the theoretical minimum that the laws of physics impose on today’s computers. That could greatly reduce the energy needed to both perform computations and keep circuitry cool. And it could all be done, the researchers say, using microelectronic devices that already exist.

In 1961 physicist Rolf Landauer of IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y., showed that conventional computing incurs an unavoidable cost in energy dissipation—basically, in the generation of heat and entropy. That is because a conventional computer has to sometimes erase bits of information in its memory circuits in order to make space for more. Each time a single bit (with the value 1 or 0) is reset, a certain minimum amount of energy is dissipated—which Ray and Crutchfield have christened “the Landauer.” Its value depends on ambient temperature: in your living room, one Landauer would be around 10⁻²¹ joule. (For comparison, a lit candle emits on the order of 10 joules of energy per second.)
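The figure is straightforward to reproduce from Landauer's bound, which sets the minimum dissipation per erased bit at k_B·T·ln 2:

```python
# Reproducing the article's "one Landauer" from Landauer's bound.
from math import log

k_B = 1.380649e-23    # Boltzmann constant, J/K (exact in the 2019 SI)
T = 293.0             # a living-room temperature, K

landauer = k_B * T * log(2)
print(f"One Landauer at {T:.0f} K: {landauer:.2e} J")
```

That comes out near 3 × 10⁻²¹ J, matching the order of magnitude quoted above, and it scales linearly with temperature, which is one reason cryogenic computing is attractive.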

‘Momentum Computing’ Pushes Technology’s Thermodynamic Limits, Philip Ball, Scientific American

Read more…

Time...

9817128673?profile=RESIZE_710x

GIF source: article link below

Topics: Applied Physics, Education, Research, Thermodynamics

Also note the Hyper Physics link on the Second Law of Thermodynamics, particularly "Time's Arrow."

"The two most powerful warriors are patience and time," Leo Tolstoy, War and Peace

The short answer

We can measure time intervals — the duration between two events — most accurately with atomic clocks. These clocks produce electromagnetic radiation, such as microwaves, with a precise frequency that causes atoms in the clock to jump from one energy level to another. Cesium atoms make such quantum jumps by absorbing microwaves with a frequency of 9,192,631,770 cycles per second, which then defines the international scientific unit for time, the second.

The answer to how we measure time may seem obvious. We do so with clocks. However, when we say we’re measuring time, we are speaking loosely. Time has no physical properties to measure. What we are really measuring is time intervals, the duration separating two events.
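Since the second is defined as exactly 9,192,631,770 cycles of the cesium transition, measuring a time interval reduces to counting cycles; a minimal sketch of that arithmetic:

```python
# The SI second is defined as exactly this many cycles of the radiation
# absorbed in the cesium-133 hyperfine transition.
CESIUM_CYCLES_PER_SECOND = 9_192_631_770

def interval_seconds(cycles_counted: int) -> float:
    """Convert a count of cesium transition cycles to an elapsed interval in seconds."""
    return cycles_counted / CESIUM_CYCLES_PER_SECOND

# Counting exactly one definition's worth of cycles yields one second.
print(interval_seconds(9_192_631_770))  # 1.0
```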

Throughout history, people have recorded the passage of time in many ways, such as using sunrise and sunset and the phases of the moon. Clocks evolved from sundials and water clocks to more accurate pendulums and quartz crystals. Nowadays when we need to know the current time, we look at our wristwatch or the digital clock on our computer or phone.

The digital clocks on our computers and phones get their time from atomic clocks, including the ones developed and operated by the National Institute of Standards and Technology (NIST).

How Do We Measure Time? NIST

Read more…

Quantum Exorcism...

9786514062?profile=RESIZE_584x

Figure 2. Maxwell’s demon is a hypothetical being that can observe individual molecules in a gas-filled box with a partition in the middle separating chambers A and B. If the demon sees a fast-moving gas molecule, it opens a trapdoor in the partition to let fast-moving molecules into chamber B while leaving slow-moving ones behind. Repeating that action would allow the buildup of a temperature difference between the two sides of the partition. A heat engine could use that temperature difference to perform work, which would contradict the second law of thermodynamics.

Topics: Chemistry, History, Materials Science, Quantum Mechanics, Thermodynamics

Thermodynamics is a strange theory. Although it is fundamental to our understanding of the world, it differs dramatically from other physical theories. For that reason, it has been termed the “village witch” of physics.1 Some of the many oddities of thermodynamics are the bizarre philosophical implications of classical statistical mechanics. Well before relativity theory and quantum mechanics brought the paradoxes of modern physics into the public eye, Ludwig Boltzmann, James Clerk Maxwell, and other pioneers of statistical mechanics wrestled with several thought experiments, or demons, that threatened to undermine thermodynamics.

Despite valiant efforts, Maxwell and Boltzmann were unable to completely vanquish the demons besetting the village witch of physics—largely because they were limited to the classical perspective. Today, experimental and theoretical developments in quantum foundations have granted present-day researchers and philosophers greater insights into thermodynamics and statistical mechanics. They allow us to perform a “quantum exorcism” on the demons haunting thermodynamics and banish them once and for all.
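The demon's sorting protocol in Figure 2 can be caricatured in a few lines of code; a toy sketch (the speed distribution and the median threshold are my illustrative assumptions, not from the article):

```python
import random
import statistics

random.seed(42)

# Toy gas: molecular speeds drawn from an arbitrary distribution (assumption).
speeds = [random.expovariate(1.0) for _ in range(10_000)]
threshold = statistics.median(speeds)

# The demon opens the trapdoor only for fast molecules, herding them into
# chamber B and leaving slow ones behind in chamber A.
chamber_a = [s for s in speeds if s <= threshold]
chamber_b = [s for s in speeds if s > threshold]

# Mean kinetic energy scales with the mean of v^2, so B ends up "hotter".
temp_a = statistics.mean(s * s for s in chamber_a)
temp_b = statistics.mean(s * s for s in chamber_b)
print(f"A: {temp_a:.2f}  B: {temp_b:.2f}")  # B is hotter than A
```

The catch, as Landauer and Bennett later argued, is that the demon's own measurement and memory erasure carry an entropy cost that rescues the second law.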

Loschmidt’s demon and time reversibility

Boltzmann, a founder of statistical mechanics and thermodynamics, was fascinated by one of the latter field’s seeming paradoxes: How does the irreversible behavior demonstrated by a system reaching thermodynamic equilibrium, such as a cup of coffee cooling down or a gas spreading out, arise from the underlying time-reversible classical mechanics?2 That equilibrating behavior only happens in one direction of time: If you watch a video of a wine glass smashing, you know immediately whether the video was in rewind or not. In contrast, the underlying classical or quantum mechanics are time-reversible: If you were to see a video of lots of billiard balls colliding, you wouldn’t necessarily know whether the video was in rewind or not. Throughout his career, Boltzmann pursued a range of strategies to explain irreversible equilibrating behavior from the underlying reversible dynamics. Boltzmann’s friend Josef Loschmidt famously objected to those attempts. He argued that the underlying classical mechanics allow for the possibility that the momenta are reversed, which would lead to the gas retracing its steps and “anti-equilibrating” to the earlier, lower-entropy state. Boltzmann challenged Loschmidt to try to reverse the momenta, but Loschmidt was unable to do so. Nevertheless, we can envision a demon that could. After all, it is just a matter of practical impossibility—not physical impossibility—that we can’t reach into a box of gas and reverse each molecule’s trajectory.
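Loschmidt's point, that reversing every momentum makes a mechanical system exactly retrace its history, is easy to demonstrate for free particles; a minimal sketch (free flight with no collisions is my simplification):

```python
# Free particles in 1D: evolve forward, flip every momentum, evolve for the
# same time, and each particle returns exactly to its starting position.
positions = [0.0, 1.5, -2.0]
velocities = [1.0, -0.5, 2.0]

def evolve(pos, vel, dt):
    """Advance each particle by v * dt (ballistic motion, no interactions)."""
    return [x + v * dt for x, v in zip(pos, vel)], vel

forward_pos, forward_vel = evolve(positions, velocities, dt=3.0)

# Loschmidt's demon reverses every momentum...
reversed_vel = [-v for v in forward_vel]

# ...and the system "anti-equilibrates" back to its initial configuration.
final_pos, _ = evolve(forward_pos, reversed_vel, dt=3.0)
print(final_pos)  # [0.0, 1.5, -2.0]
```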

Technological developments since Loschmidt’s death in 1895 have expanded the horizons of what is practically possible (see figure 1). Although it seemed impossible during his lifetime, Loschmidt’s vision of reversing the momenta was realized by Erwin Hahn in 1950 in the spin-echo experiment, in which atomic spins that have dephased and become disordered are taken back to their earlier state by an RF pulse. If it is practically possible to reverse the momenta, what does that imply about equilibration? Is Loschmidt’s demon triumphant?

The demons haunting thermodynamics, Katie Robertson, Physics Today

Read more…

Breaking Physics...

 

Topics: Quantum Computer, Quantum Mechanics, Thermodynamics

In what could prove to be a momentous accomplishment for fundamental physics and quantum physics, scientists say they’ve finally figured out how to manufacture a scientific oddity called a time crystal.

Time crystals harness a quirk of physics in which they remain ever-changing yet dynamically stable. In other words, they don’t give off energy as they change conformation, making them an apparent violation of the natural law that all things gradually turn towards entropy and disorder.

Now, it seems like it’s possible for these things to exist, after all, Quanta Magazine reports. At least, that’s according to what a massive team of researchers from Stanford, Princeton, and elsewhere working with Google’s quantum computing labs claimed in preprint research shared online last week. Aside from being an incredible scientific discovery in abstract — time crystals represent a new, bizarre phase of matter — the discovery could have profound implications for the finicky world of quantum computing.

“The consequence is amazing: You evade the second law of thermodynamics,” study coauthor and Max Planck Institute for the Physics of Complex Systems director Roderich Moessner told Quanta.

Google Claims To Create Time Crystals Inside Quantum Computer, Dan Robitzski, Futurism

Read more…

Power Density...

9222023659?profile=RESIZE_710x

Optimal size: wind farm efficiency drops as installations become bigger. (Courtesy: iStock/ssuaphoto)

Topics: Alternate Energy, Climate Change, Existentialism, Global Warming, Green Tech, Thermodynamics

Optimizing the placement of turbines within a wind farm can significantly increase energy extraction – but only until the installation reaches a certain size, researchers in the US conclude. This is just one finding of a computational study on wind turbines’ effects on the airflow around them, and consequently the ability of nearby turbines – and even nearby wind farms – to extract energy from that airflow.

Wind power could supply more than a third of global energy by 2050, so the researchers hope their analysis will assist in better designs of wind farms.

It is well known that the efficiencies of turbines in a wind farm can be significantly lower than that of a single turbine on its own. While small wind farms can achieve a power density of over 10 W/m2, this can drop to as little as 1 W/m2 in very large installations. The first law of thermodynamics dictates that turbines must reduce the energy of the wind that has passed through them. However, turbines also inject turbulence into the flow, which can make it more difficult for downstream turbines to extract energy.
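The power-density figures quoted above can be sanity-checked from first principles; a rough sketch (the wind speed, rotor size, spacing, and power coefficient below are illustrative assumptions, not values from the study):

```python
import math

AIR_DENSITY = 1.225  # kg/m^3 at sea level

def turbine_power(rotor_diameter_m: float, wind_speed_ms: float, cp: float) -> float:
    """Power (W) extracted by one turbine: P = 0.5 * rho * A * v^3 * Cp."""
    area = math.pi * (rotor_diameter_m / 2) ** 2
    return 0.5 * AIR_DENSITY * area * wind_speed_ms ** 3 * cp

# Illustrative assumptions: 100 m rotor, 8 m/s wind, Cp = 0.4, and turbines
# spaced 7 rotor diameters apart in each direction of a large array.
power = turbine_power(100.0, 8.0, 0.4)
land_per_turbine = (7 * 100.0) ** 2  # m^2 of land claimed by each turbine
density = power / land_per_turbine
print(f"{density:.1f} W/m^2")  # a few W/m^2, between the quoted extremes
```

Tightening the spacing raises the nominal density, but wake losses cut the effective Cp of downstream turbines, which is the trade-off the study quantifies.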

“People were already aware of these issues,” says Enrico Antonini of the Carnegie Institution for Science in California, “but no one had ever defined what controls these numbers.”

Optimal size for wind farms is revealed by computational study, Tim Wogan, Physics World

Read more…

Quantum Time...

 

weird_time_tunnel_cover_1024.jpg
"Weird Time Tunnel." Image Source Below.

 

Topics: Quantum Computer, Quantum Mechanics, Thermodynamics

It's easy to take time's arrow for granted - but the gears of physics actually work just as smoothly in reverse. Maybe that time machine is possible after all?

An experiment from 2019 shows just how much wiggle room we can expect when it comes to distinguishing the past from the future, at least on a quantum scale. It might not allow us to relive the 1960s, but it could help us better understand why not.

Researchers from Russia and the US teamed up to find a way to break, or at least bend, one of physics' most fundamental laws of energy.

The second law of thermodynamics is less a hard rule and more of a guiding principle for the Universe. It says hot things get colder over time as energy transforms and spreads out from areas where it's most intense.

It's a principle that explains why your coffee won't stay hot in a cold room, why it's easier to scramble an egg than unscramble it, and why nobody will ever let you patent a perpetual motion machine.

Virtually every other rule in physics can be flipped and still make sense. For example, you could zoom in on a game of pool, and a single collision between any two balls won't look weird if you happened to see it in reverse.

On the other hand, if you watched balls roll out of pockets and reform the starting pyramid, it would be a sobering experience. That's the second law at work for you.

Electrons aren't like tiny billiard balls; they're more akin to information that occupies a space. Their details are defined by something called the Schrödinger equation, which represents the possibilities of an electron's characteristics as a wave of chance.

Physicists Have Reversed Time on The Smallest Scale Using a Quantum Computer
Mike McCrae, Science Alert

Read more…

Kondo Effect...

227888.jpg
Daniel Mazzone led the project to explore the mechanism that causes samarium sulphide to expand dramatically when cooled. Credit: Brookhaven National Laboratory

 

Topics: Materials Science, Quantum Mechanics, Research, Thermodynamics

Most metals expand when heated and contract when cooled. A few metals, however, do the opposite, exhibiting what’s known as negative thermal expansion (NTE). A team of researchers led by Ignace Jarrige and Daniel Mazzone of Brookhaven National Laboratory in the US has now found that in one such metal, yttrium-doped samarium sulphide (SmS), NTE is linked to a quantum many-body phenomenon called the Kondo effect. The work could make it possible to develop alloys in which positive and negative expansion cancel each other out, producing a composite material with a net-zero thermal expansion – a highly desirable trait for applications in aerospace and other areas of hi-tech manufacturing.

Even within the family of NTE materials, yttrium-doped SmS is an outlier, gradually expanding by up to 3% when cooled over a few hundred degrees. To better understand the mechanisms behind this “giant” NTE behavior, Mazzone and Jarrige employed X-ray diffraction and spectroscopy to investigate the material’s electronic properties.

The researchers carried out the first experiments at the Pair Distribution Function (PDF) beamline at Brookhaven’s National Synchrotron Light Source II (NSLS-II). They placed their SmS sample inside a liquid-helium cooled cryostat in the beam of the synchrotron X-rays and measured how the X-rays scattered off the electron clouds around the atomic ions. By tracking how these X-rays scatter, they identified the locations of the atoms in the crystal structure and the spacings between them.

“Our results show that, as the temperature drops, the atoms of this material move farther apart, causing the entire material to expand by up to 3% in volume,” says Milinda Abeykoon, the lead scientist on the PDF beamline.
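For context, a 3% volume expansion corresponds to only about a 1% change in each linear dimension, since ΔV/V ≈ 3 ΔL/L for small strains; a quick sketch of that relation:

```python
def volume_strain(linear_strain: float) -> float:
    """Exact volumetric strain for an isotropic linear strain e:
    (1 + e)^3 - 1, which is approximately 3e when e is small."""
    return (1.0 + linear_strain) ** 3 - 1.0

# A ~1% linear expansion yields roughly the 3% volume change reported for SmS.
e = 0.01
print(f"{volume_strain(e):.4f}")  # ~0.03
```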

Kondo effect induces giant negative thermal expansion, Belle Dumé, Physics World

Read more…

Angry Summers...

6E29B90E-E7D9-46AF-9DC660AAA45BF510_source.jpg
Credit: David Gray Getty Images

 

Topics: Climate Change, Existentialism, Global Warming, Thermodynamics


In the U.S., we are past the winter solstice: tilted 23.5 degrees away from the sun, our days are shorter, our nights are longer, and we usually experience precipitation in the forms of rain and snow.

The southern hemisphere is tilted the same 23.5 degrees TOWARDS the sun; thus, it's their summer, typically marked by tourism, lazy beaches, mixed drinks, and, one would assume, selfies of once-in-a-lifetime experiences. That, at least, was the usual and typical.

No hellscape could be penned bleaker than what we're seeing now. A billion living creatures have died, and the survivors are likely barreling headlong toward the endangered species list. The elderly, sick, and disabled are cannon fodder. The prime minister, firmly in the pocket of big coal, is as much a climate change denier as our current lobotomized "leader."

Oh yes, endangered species are not important now, are they (even if it's us)? The "Environmental Protection Agency" is an oxymoron. Climate change is a Chinese hoax, and the Australians just need better "forest management" by sweeping, as advised to California and (not at all) practiced by residents of Finland. If the recent past is prologue, we can only expect a repeat performance in the northern hemisphere once we get past May, especially in states like Texas, where water rationing by zip code is more or less expected, and a spark from a pickup truck's pipe scraping a curb in high heat and drought can cause infernos.

Avarice and abject ignorance will kill us all.

Summer in Australia used to be something we yearned for: long, lazy days spent by the beach or pool, backyard barbecues, and games of cricket with family and friends. But recent summers have become a time of fear: Schools and workplaces are closed because of catastrophic fire danger, while we shelter in air-conditioned spaces to avoid dangerous heat waves and hazardous levels of smoke in the air. Campgrounds have been closed for the summer, and entire towns have been urged to evacuate ahead of “Code Red” fire weather. Welcome to our new climate.

Of course, unusually hot summers have happened in the past; so have bad bushfire seasons. But the link between the current extremes and anthropogenic climate change is scientifically indisputable.

The fires raging across the southern half of the Australian continent this year have so far burned through more than 5 million hectares. To put that in context, the catastrophic 2018 fire season in California saw nearly 740,000 hectares burned. The Australian fire season began this year in late August (before the end of our winter). Fires have so far claimed nine lives, including two firefighters, and destroyed around 1,000 homes. It is too early to tell what the toll on our wildlife has been, but early estimates suggest that around 500 million animals have died so far, including 30 percent of the koala population in their main habitat. And this is all before we have even reached January and February, when the fire season typically peaks in Australia.
Australia’s Angry Summer: This Is What Climate Change Looks Like
Nerilie Abram, Scientific American

Read more…

Internet Carnot...

Smokestacks.JPG
Credit: ALFRED T. PALMER/VICTOR TANGERMANN

 

Topics: Climate Change, Existentialism, Internet, Thermodynamics


The Carnot cycle is the only thermodynamic cycle that is reversible, because compression and expansion of the gas are isentropic (no heat flow), while heating and cooling are isothermal (T does not change, only P and V), meaning that no energy is lost into increasing the system's entropy. Quora
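The reversibility described in that quote is what sets the Carnot efficiency ceiling, η = 1 − T_cold/T_hot, for any heat engine; a minimal sketch (the reservoir temperatures are illustrative):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum efficiency of any heat engine operating between two
    reservoirs at the given absolute temperatures (Kelvin)."""
    if not 0 < t_cold_k < t_hot_k:
        raise ValueError("require 0 < t_cold_k < t_hot_k")
    return 1.0 - t_cold_k / t_hot_k

# Illustrative: a source at 600 K rejecting heat at 300 K can never beat 50%.
print(carnot_efficiency(600.0, 300.0))  # 0.5
```

Real engines, with friction, finite-rate heat transfer, and non-ideal working fluids, always land below this bound, which is the gap between the ideal model and the actual case.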

The world is first modeled using "ideal" circumstances: the Ideal Gas Law also comes to mind. You obviously start with the idealization.

Then, you have to model the reality: the biology, chemistry, and physics of the actual case at hand.

Basing a civilization on a non-renewable resource of dead dinosaurs is a recipe to become museum artifacts ourselves.

As far as environmental damage is concerned, our increasingly-online lives incur a massive toll.

If everything continues on its current course, then the internet is expected to generate about 20 percent of the world’s carbon emissions by 2030, according to The New Republic. That would make its environmental impact worse than any individual country on Earth, except for the U.S., China, or India.

In other words, our internet use is linked to a vicious cycle of environmental devastation, making it increasingly clear that something has to give.

 

In the Face of Climate Change, the Internet is Unsustainable, Dan Robitzski, Futurism

Read more…