Reginald L. Goodwin's Posts (3123)

Half-Life...

Image Source: Hiroshima Peace Media


Topics: Existentialism, Nuclear Physics, Nuclear Power


The fear of entrusting "the nuclear codes" has always been thrown about casually, without much understanding of the stakes.

There's a cartoon understanding of the power of nuclear weapons, even on science-friendly shows like Star Trek. The 22nd, 23rd and 24th Centuries are pristine, clean and pollution-free. Human lifespan is extended by almost one hundred years, and the Third World War was fought in the fictional timeline's 21st Century with a remarkable lack of radiation, fallout or uninhabitable areas of the globe.

We of course, in the real world, entered the nuclear age in World War II with the Enola Gay dropping the first weapon of its kind on Hiroshima, followed days later by Nagasaki. The war ended with this savagery, and we were briefly the dominant, and only, nuclear power.


That of course changed rapidly. Our previous wartime ally - the Soviet Union - developed its own weapons, which ushered in what became known as The Cold War and, along with it, spy statecraft. Popular franchises like Ian Fleming's James Bond 007 movies, The Man From U.N.C.L.E. and the original Jason Bourne novels by Robert Ludlum capitalized on our collective cultural angst over Armageddon.

The creation of nuclear weapons is likely one of the most regrettable sins of physics, and by extension of science. It is often pointed to as an example of science used for evil; fuel for the disdain of acquiring knowledge, encouraging inquiry, trusting facts and reality. Dr. J. Robert Oppenheimer put this regret in words, poignantly quoting the Bhagavad Gita:

"Now I am become Death, the destroyer of worlds."
What Dr. Oppenheimer described was a fission weapon only, which is not to dismiss the destructiveness of "Little Boy" and "Fat Man." The Teller-Ulam design further escalated the possibility of a self-imposed, human extinction-level event by increasing yields to unimaginable, dystopian megaton levels.
Image Source: Thermonuclear Weapon on Wikipedia


Excerpts from The Atomic Archive:


All present nuclear weapon designs require the splitting of heavy elements like uranium and plutonium. The energy released in this fission process is many millions of times greater, pound for pound, than the most energetic chemical reactions. The smaller nuclear weapon, in the low-kiloton range, may rely solely on the energy released by the fission process, as did the first bombs which devastated Hiroshima and Nagasaki in 1945.

The larger yield nuclear weapons derive a substantial part of their explosive force from the fusion of heavy forms of hydrogen--deuterium and tritium. Since there is virtually no limitation on the volume of fusion materials in a weapon, and the materials are less costly than fissionable materials, the fusion, "thermonuclear," or "hydrogen" bomb brought a radical increase in the explosive power of weapons. However, the fission process is still necessary to achieve the high temperatures and pressures needed to trigger the hydrogen fusion reactions. Thus, all nuclear detonations produce radioactive fragments of heavy-element fission, with the larger bursts producing an additional radiation component from the fusion process.

The nuclear fragments of heavy-element fission which are of greatest concern are those radioactive atoms (also called radionuclides) which decay by emitting energetic electrons or gamma particles. (See "Radioactivity" note.) An important characteristic here is the rate of decay. This is measured in terms of "half-life"--the time required for one-half of the original substance to decay--which ranges from days to thousands of years for the bomb-produced radionuclides of principal interest. (See "Nuclear Half-Life" note.) Another factor which is critical in determining the hazard of radionuclides is the chemistry of the atoms. This determines whether they will be taken up by the body through respiration or the food cycle and incorporated into tissue. If this occurs, the risk of biological damage from the destructive ionizing radiation (see "Radioactivity" note) is multiplied.

Probably the most serious threat is cesium-137, a gamma emitter with a half-life of 30 years. It is a major source of radiation in nuclear fallout, and since it parallels potassium chemistry, it is readily taken into the blood of animals and men and may be incorporated into tissue. Other hazards are strontium-90, an electron emitter with a half-life of 28 years, and iodine-131 with a half-life of only 8 days. Strontium-90 follows calcium chemistry, so that it is readily incorporated into the bones and teeth, particularly of young children who have received milk from cows consuming contaminated forage. Iodine-131 is a similar threat to infants and children because of its concentration in the thyroid gland. In addition, there is plutonium-239, frequently used in nuclear explosives. A bone-seeker like strontium-90, it may also become lodged in the lungs, where its intense local radiation can cause cancer or other damage.

Plutonium-239 decays through emission of an alpha particle (helium nucleus) and has a half-life of 24,000 years. To the extent that hydrogen fusion contributes to the explosive force of a weapon, two other radionuclides will be released: tritium (hydrogen-3), an electron emitter with a half-life of 12 years, and carbon-14, an electron emitter with a half-life of 5,730 years. Both are taken up through the food cycle and readily incorporated in organic matter.
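The half-lives quoted above translate directly into a decay law: the fraction of a radionuclide remaining after time t is (1/2)^(t/t½). A minimal Python sketch of that arithmetic, using the half-life values from the excerpt (the 100-year horizon is an arbitrary illustration):

```python
# Fraction of a radionuclide remaining after time t, given its half-life:
# N(t)/N0 = (1/2)**(t / half_life)

half_lives_years = {
    "cesium-137": 30.0,
    "strontium-90": 28.0,
    "iodine-131": 8.0 / 365.25,   # 8 days, converted to years
    "plutonium-239": 24_000.0,
    "tritium": 12.0,
    "carbon-14": 5_730.0,
}

def fraction_remaining(t_years: float, half_life_years: float) -> float:
    """Fraction of the original substance left after t_years."""
    return 0.5 ** (t_years / half_life_years)

horizon = 100.0  # years after detonation (illustrative choice)
for nuclide, t_half in half_lives_years.items():
    print(f"{nuclide:>14}: {fraction_remaining(horizon, t_half):.3e} remaining after {horizon:.0f} y")
```

The spread is the point: iodine-131 is effectively gone within months, while plutonium-239 is essentially undiminished on any human timescale.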

It is sobering that any presidential candidate would openly speculate about using nuclear weapons as a FIRST option. The knife-edge philosophy of M.A.D. (Mutually Assured Destruction) requires sober minds that will use diplomacy first and not salivate for the unthinkable, goaded by a mean-girl tweet. It is breathtaking "conscientious stupidity"*; a modern-day know-nothingness, an arrogant pride in ignorance: it is cartoon physics.



Half-life for the continuation of the human species...is no life at all.

* "Nothing in the world more dangerous than sincere ignorance and conscientious stupidity." Dr. Martin Luther King, Jr.

Atomic Archive: Worldwide Effects of Nuclear War - Radioactive Fallout

Read more…

The Brain on Math...

Image Source: Carnegie Mellon Dietrich College of Humanities and Social Science


Topics: Computer Science, Education, Mathematics, Neuroscience, STEM


Brain Activity Patterns Reveal Distinct Stages of Thinking That Can Be Used To Improve How Students Learn Mathematical Concepts

A new Carnegie Mellon University neuroimaging study reveals the mental stages people go through as they are solving challenging math problems.

In the study, published in Psychological Science, the researchers combined two analytical strategies with functional MRI (fMRI) to identify patterns of brain activity that aligned with four distinct stages of problem-solving.

"How students were solving these kinds of problems was a total mystery to us until we applied these techniques," said John Anderson, the R.K. Mellon University Professor of Psychology and Computer Science and lead researcher on the study. "Now, when students are sitting there thinking hard, we can tell what they are thinking each second."

Carnegie Mellon: Watching the Brain Do Math, Shilo Rea

Read more…

Supercurrent @ Room...

Burkard Hillebrands of the University of Kaiserslautern and colleagues say they have detected the first ever supercurrent at room temperature, but certain peers are sceptical of the results and say the claims are premature.
(Courtesy: iStock/Johan Swanepoel)


Topics: Bose-Einstein Condensate, Particle Physics, Quantum Mechanics


A room-temperature "supercurrent" has been identified in a Bose–Einstein condensate of quasiparticles called magnons. That's the finding of an international team of researchers, which says the work opens the door to using magnons in information processing. Other researchers, however, believe the claim is premature, arguing that less-novel explanations have not been ruled out.

The term "supercurrent" describes the resistance-free current of charged particles in superconductors. It also describes the viscosity-free current of particles in superfluid helium. The common denominator of these systems is that they can be described as Bose–Einstein condensates (BECs) – collections of bosons, such as Cooper pairs or Helium-4, that can be described by a single wavefunction.

Physics World: First ever supercurrent observed at room temperature, Tim Wogan

Read more…

Quantum Gaming...

Artistic rendition of atoms in an optical lattice.
Image Credit: Public Domain


Topics: Computer Science, Quantum Computer, Quantum Mechanics


Quantum computing has been envisioned for decades, but is a difficult task to accomplish. Now, one research group is crowdsourcing human ingenuity to solve the problem—by turning it into a game.

Any computer system requires operations that result in a change in a physical system that leaves that system in a certain physical state. Two important requirements of a physical computing system are the ability to reproduce a physical state, and how long the created state lasts. These two quantities are known as fidelity and lifetime, respectively.

For a quantum computer, the degree of fidelity (how well the physical state can be reproduced) usually must be greater than 99.9%, depending on the physical system. The requirement is based on the ability to correct any errors that occur in the physical system so that a buildup of error does not occur. The requirement that an operation execute faster than the lifetime of the quantum state, or what is typically called the quantum decoherence time, is difficult—if you try to execute an operation too quickly, you lose fidelity. Optimizing these two conditions has led scientists to rely on computer programs—algorithms—to try out many initial states and conditions. The algorithms are good, but there are an extremely large number of possibilities to try.
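To see why the fidelity bar sits near 99.9%, note that per-operation fidelity compounds: a sequence of n operations retains roughly F^n of the target state if each operation has fidelity F. A minimal sketch of that compounding (the operation counts are illustrative, and this ignores error correction, which is what rescues real devices):

```python
# Rough compounding of per-operation fidelity over a sequence of operations:
# F_total ≈ F ** n

def net_fidelity(per_op_fidelity: float, n_ops: int) -> float:
    """Net fidelity after n_ops operations, each with the given fidelity."""
    return per_op_fidelity ** n_ops

for f in (0.99, 0.999, 0.9999):
    for n in (100, 1000):
        print(f"F = {f}, {n} ops -> net fidelity {net_fidelity(f, n):.3f}")
```

At F = 0.99 a thousand operations are hopeless; at F = 0.999 about 37% survives, which is roughly where error-correction schemes can start to take over.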

Physics Central: Quantum Computing, Human Processing, H.M. Doss

Read more…

Whisper to Shout...

Credit: MPI for Gravitational Physics/Simulating Extreme Spacetimes/Airborne Hydro Mapping
Citation: Phys. Today 69, 8, 10 (2016); http://dx.doi.org/10.1063/PT.3.3249

Topics: Astrophysics, Black Holes, General Relativity, Gravitational Waves, Spacetime


On 11 February 2016, the Laser Interferometer Gravitational-Wave Observatory (LIGO) and its sister collaboration, Virgo, announced their earthshaking observation of Albert Einstein’s ripples in spacetime. LIGO had seen the death dance of a pair of massive black holes. As the behemoths circled each other faster and faster, the frequency and amplitude of the spacetime waves they produced grew into a crescendo as the black holes became one. Then the new doubly massive black hole began to ring softer and softer like a quieting bell. The escalating chirp and ringdown is also a metaphor for public information flow about the discovery. It could have unfolded differently.

When scientists make a discovery, they must choose how to disseminate it. A big decision they must make is whether to reveal the results before or after peer review. Reveal before peer review—sometimes even before the paper is written—and the community can use the results right away, but there is an increased risk that problems will be found in a very public way. Reveal after peer review, and the chance of such problems decreases, but there is more time for a competitor to announce first or for rumors to leak. At Physical Review Letters (PRL), where I am an editor, we allow authors to choose when they want to reveal their results. The LIGO collaborators chose to wait.



Just before LIGO’s experimental run began in September 2015, the team held a vote on which journal they would pick if they made a discovery. They picked PRL. Five days after the vote, LIGO’s detectors seemed to hear the universe sing out for the first time.

American Institute of Physics:
Commentary: How gravitational waves went from a whisper to a shout, Robert Garisto

Read more…

Juno to Juice Et Al...

An artist’s rendition of the JUICE spacecraft. (Credit: ESA)


Topics: Astrophysics, Planetary Science, Space Exploration, Spaceflight


Juno (JUpiter Near-polar Orbiter) is the sixth spacecraft to study Jupiter (give or take a few gravity assists), but will be the second to fall into orbit around the gas giant following the Galileo probe in 1995.

It is part of NASA’s New Frontiers space exploration program that specializes in researching the celestial bodies of the solar system. Juno was launched on August 5th, 2011 from Cape Canaveral Air Force Station in Florida and intended to be placed in a polar orbit around Jupiter to study the planet’s composition, magnetic and gravity fields, and the polar magnetosphere. Even though Juno’s scientific mission only lasts for a year, many more spacecraft are headed Jupiter’s way.

The next upcoming Jupiter mission following Juno is the European Space Agency's (ESA) first large-class mission in its Cosmic Vision program, the JUICE (JUpiter ICy moon Explorer). It is planned for launch in 2022 from the Guiana Space Centre in French Guiana and will arrive at Jupiter in 2030. JUICE will then monitor Ganymede, Europa, and Callisto, three of the four Galilean moons, as well as Jupiter for three and a half years. As all three of these worlds are believed to possess significant bodies of water beneath their surfaces, the JUICE mission will explore their habitability in depth.

On December 9th, 2015, ESA and Airbus Defence & Space signed a contract signifying that Airbus would be building the spacecraft at their base in Friedrichshafen, Germany. The scientific instruments on JUICE will be built by scientific and engineering teams from all over Europe, with some participation from the United States and Japan.



Discovery: These Spacecraft Will Visit Jupiter After Juno, Jordan Rice

Read more…

Makemake Moon...

Dwarf planet Makemake and its newly discovered moon, MK 2, found in Hubble data.
Credit: NASA, ESA, A. Parker


Topics: Astronomy, NASA, Planetary Science, Space Exploration, Spaceflight


Once a lonely ice block, now it seems the dwarf planet may have a close-in companion.

In 2005, Caltech astronomers Mike Brown and Chad Trujillo discovered the dwarf planet Makemake, currently believed to be the third largest object in the Kuiper Belt after Pluto and Eris. At the time, astronomers believed it was alone out there on its long path around the Sun. But new data from the Hubble Space Telescope reveal a moon around the tiny world, and offer an explanation as to where it was hiding.

“The satellite that we found was not that faint and not that close to Makemake,” says Alex Parker, principal investigator of the research and a planetary scientist at the Southwest Research Institute. “It popped right out of the data when we looked.”

It turns out it was always there. The newly found object, provisionally called MK 2, orbits Makemake nearly edge-on from our point of view, meaning most of the time it's obscured by the comparatively bright dwarf planet. Makemake is 886 miles (1,434 km) in diameter, while the new object appears to be only 100 miles (161 km) across. Current scenarios also paint it as a dark companion compared to bright Makemake.

Astronomy:
Astronomers Find a Moon Hiding Around Makemake in Hubble Data, John Wenz

Read more…

The Silicon Wall...

Image Source: MIT Technology Review

Topics: Electrical Engineering, Materials Science, Moore's Law, Semiconductor Technology

It was inevitable. I joined the industry after the US Air Force in 1989. The industry's heyday was the nineties. As gate feature sizes shrank, we looked forward to the future, spurred on by two Star Trek series: The Next Generation and Deep Space Nine. This was when the Internet became commercial; flip phones looked an awful lot like Star Trek communicators. I went to my oldest son's school with scrapped wafers and bunny suits at his teachers' request, eager to clone myself in their enthusiastic eyes and lives.

We'll still manufacture semiconductors in some form, like Gate-All-Around FETs. The transition from the old to the new is (for me) poignant, and gives me pause.

In the next five years, it will be too expensive to further miniaturize—but chip makers will innovate in different ways.

Moore’s Law has been slowing for a while. But the U.S. industry that exploits it has finally recognized that it is about to die.

The Semiconductor Industry Association—made up of the likes of Intel, AMD, and Global Foundries—has published the 2015 International Technology Roadmap for Semiconductors. It suggests that, after decades of miniaturization, transistors look set to stop shrinking in size altogether by 2021. After that date, the report claims, it will not be economically efficient to reduce the size of silicon transistors any further.

The prediction is an acknowledgment that Moore’s Law—which states that the number of transistors in an integrated circuit doubles approximately every two years—isn't simply slowing. It’s grinding to a halt.
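Moore's Law as stated is just an exponential with a two-year doubling period, which is why "grinding to a halt" matters so much. A sketch of the projection (the 2016 baseline of two billion transistors is an illustrative round number, not a figure from the article):

```python
# Moore's Law: transistor count doubles roughly every two years.
def transistors(year: int, base_year: int = 2016, base_count: float = 2e9) -> float:
    """Projected transistor count if two-year doubling held indefinitely."""
    return base_count * 2 ** ((year - base_year) / 2)

for year in (2016, 2021, 2031):
    print(f"{year}: ~{transistors(year):.2e} transistors")
```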

MIT Technology Review:
Chip Makers Admit Transistors Are About to Stop Shrinking, Jamie Condliffe

Read more…

To the 22nd Century...

Image Source: YouTube embed below


Topics: Mars, NASA, Planetary Science, Space Exploration, Spaceflight


(July 15, 2016) - The Boeing Co marked its centennial on Friday with plans to sharpen its focus on innovation, including ambitious projects for supersonic commercial flight and a rocket that could carry humans to other planets.

But innovation at Boeing will be "disciplined" and not endanger the future of the world's biggest plane maker, Chief Executive Dennis Muilenburg told reporters at an event marking the company's founding on July 15, 1916.

The enterprise established by William Boeing in a Seattle boathouse has faced numerous "bet the company" moments over its 10 decades to bring out new planes such as the 707 and 747.

"We have won for 100 years because of innovation," Muilenburg said. "The key is disciplined innovation. We'll take risks. We'll invest smartly."

Chicago-based Boeing has managed to stay ahead of European rival Airbus in plane production and is a major defense and space contractor, producing fighter jets, aerial refueling tankers, communications satellites and rockets.

The company is exploring the possibilities of commercial supersonic and hypersonic planes, Muilenburg said. It also is at work on a manned mission to Mars. Though those are perhaps many decades away, "I'm anticipating that person will be riding on a Boeing rocket," Muilenburg said.

Reuters: Boeing aims for supersonics and Mars at outset of second century, Alwyn Scott

Read more…

Physics and History...

Galileo Galilei shows the doge of Venice how to use a telescope in this 1858 fresco by Giuseppe Bertini.
Citation: Phys. Today 69, 7, 38 (2016); http://dx.doi.org/10.1063/PT.3.3235

Topics: Civil Engineering, Economy, Education, History, Physics, Science, STEM

Spoiler alert: I'll sound parental, but hopefully not too pedantic.

A Skype conversation with my youngest son revealed two things: 1) he liked working at his now third Civil Engineering summer internship (he's completing a project for an airbase in Japan); 2) he wished he could just do THAT and not return to school for his last year in the fall. My wife and I, of course, encouraged him: the goal would be to get a job doing just that after graduation, so presumably he would enjoy that too.

He made an observation I think I made at his age: "why do they have you take all these classes that are unnecessary?" As you'd rightly guess, the "unnecessary" classes were those that didn't apply to Civil Engineering.

I told him I appreciated the classes that weren't engineering or physics classes; that sometimes you need "a mental break" from having to do designs and differential equations. It was a respite for me at least.

Plus, part of the entire matriculation experience isn't what you'll GET at the end: it's what you're becoming. The process of that journey changes you from how you started to how you finish at least the undergraduate leg, opening you up to other possibilities. For example, as a freshman I only had an ear for one type of music: Parliament Funkadelic. As a junior studying Thermodynamics, and after a "rude" awakening by Al Jarreau singing "Roof Garden," I suddenly developed an appetite and appreciation for Jazz. Personal research revealed its origins in my own culture and its place as the root art of many popular music forms we take for granted today. If not for art, literature and music we would be stiff and joyless automatons, fulfilling only the whims of an employer; creativity, the fuel of innovation and invention, would be significantly lessened. If nothing else, the trifecta is the stuff of "Star Trek" and "Star Wars." I hope I influenced him to think further on his viewpoint.

This article in Physics Today is somewhat related to our video conference, which, up to my young adulthood, wasn't just impossible without sophisticated video equipment: it was the stuff of science fiction and "The Jetsons" Saturday cartoon show.

But of course, that in and of itself is an appreciation...of history.

Just as physics is not a list of facts about the world, history is not a list of names and dates. It is a way of thinking that can be powerful and illuminating.

Some things about physics aren’t well covered in a physics education. Those are the messy, rough edges that make everything difficult: dealing with people, singly or in groups; misunderstandings; rivals and even allies who won’t fall in line. Physicists often do not see such issues as contributing to science itself. But social interactions really do influence what scientists produce. Often physicists learn that lesson the hard way. Instead, they could equip themselves for the actual collaborative world, not the idealized solitary one that has never existed.

History can help. An entire academic discipline—history of science—studies the rough edges. We historians of science see ourselves as illustrating the power of stories. How a community tells its history changes the way it thinks about itself. A historical perspective on science can help physicists understand what is going on when they practice their craft, and it provides numerous tools that are useful for physicists themselves.

Physics is a social endeavor

Research is done by people. And people have likes and dislikes, egos and prejudices. Physicists, like everyone else, get attached to their favorite ideas and hang on to them perhaps long after they should let them go. A classic case is the electromagnetic ether, an immensely fruitful concept that dominated physics for most of the 19th century. Even as it became clear that ether theory was causing more problems than it solved, physicists continued to use it as a central explanatory tool—even for many years after Einstein’s 1905 theory of special relativity declared it superfluous. The history of physics is littered with beautiful theories that commanded great loyalty.

People come from places too, and physicists want to protect their homes as much as anyone else. It is easy to forget that 100 years ago during World War I, British scientists refused to talk to their German colleagues on the other side of the trenches. Even after the end of the fighting, Germans and their wartime allies were officially forbidden from joining international scientific organizations. During World War II, the specter of an atomic bomb in the hands of Adolf Hitler terrified Allied physicists into opening the Pandora’s box of nuclear weapons. Many of the scientists involved bemoaned their actions afterward, but war and nationalism make for a potent impetus.

Those incidents are not exceptions. Physicists are not disinterested figures without political views, philosophical preferences, and personal feelings. The history of science can help dismantle the myth of the purely rational genius living outside the everyday world. It makes physics more human.

Physics Today: Why should physicists study history? Matthew Stanley

Read more…

Semiconductor Defects...

Configuration coordinate diagram, showing important energies and optical transitions. For this example, E_therm gives the acceptor level relative to the conduction-band minimum (CBM).

Citation: J. Appl. Phys. 119, 181101 (2016); http://dx.doi.org/10.1063/1.4948245


Topics: Education, Nanotechnology, Semiconductor Technology, STEM


Abstract

Point defects affect or even completely determine physical and chemical properties of semiconductors. Characterization of point defects based on experimental techniques alone is often inconclusive. In such cases, the combination of experiment and theory is crucial to gain understanding of the system studied. In this tutorial, we explain how and when such comparison provides new understanding of the defect physics. More specifically, we focus on processes that can be analyzed or understood in terms of configuration coordinate diagrams of defects in their different charge states. These processes include light absorption, luminescence, and nonradiative capture of charge carriers. Recent theoretical developments to describe these processes are reviewed.

Introduction

Every material contains defects; perfect materials simply do not exist. While it may cost energy to create a defect, configurational entropy renders it favorable to incorporate a certain concentration of defects, since this lowers the free energy of the system [1]. Therefore, even in equilibrium, we can expect defects to be present; kinetic limitations sometimes lead to formation of additional defects. Note that all of these considerations also apply to impurities that are unintentionally present in the growth or processing environment. Of course, impurities are often intentionally introduced to tailor the properties of materials. Doping of semiconductors with acceptors and donors is essential for electronic and optoelectronic applications. In the following, we will use the word “defect” as a generic term to cover both intrinsic defects (vacancies, self-interstitials, and antisites) and impurities.
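The configurational-entropy argument above can be made quantitative: in the dilute limit, minimizing the free energy gives an equilibrium defect concentration of roughly c ≈ N_sites · exp(−E_f / k_B T), where E_f is the defect formation energy. A minimal sketch (the 1 eV formation energy and the site density are illustrative assumptions, not values from the tutorial):

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def defect_concentration(e_form_ev: float, temp_k: float, n_sites: float = 5e22) -> float:
    """Equilibrium defect concentration (cm^-3) in the dilute limit."""
    return n_sites * math.exp(-e_form_ev / (K_B * temp_k))

# Illustrative: a 1 eV formation energy at a 1000 K growth temperature...
print(f"at 1000 K: {defect_concentration(1.0, 1000):.2e} cm^-3")
# ...versus the same defect at room-temperature equilibrium.
print(f"at  300 K: {defect_concentration(1.0, 300):.2e} cm^-3")
```

The enormous gap between the two numbers is why defects formed at growth temperature are typically frozen in rather than annealed away.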

Since defects are unavoidable, we must consider the effects they have on the properties of materials. These effects can be considerable, to the point of determining the functionality of the material, as in p- or n-type doping. Point defects play a key role in diffusion: virtually all diffusion processes are assisted by point defects. Defects are often responsible for degradation of a device. Even in the absence of degradation, defects can limit the performance of a device. Compensation by native point defects can decrease the level of doping that can be achieved. Defects with energy levels within the band gap can act as recombination centers, impeding carrier collection in a solar cell or light emission from a light-emitting diode. Sometimes, these effects can be used to advantage: luminescence centers in wide-band-gap materials can be used to emit light at specified wavelengths; or single-spin centers (such as the nitrogen–vacancy (NV) center in diamond) can act as an artificial atom and serve as a qubit in a quantum information system [2,3]. Finally, sometimes, one deliberately wants to grow materials with many defects. Examples are materials for ultrafast optoelectronic switches or semiconductors used to optically generate THz pulses, where defect densities should be large enough so that carrier lifetimes are as short as a few picoseconds [4].

Journal of Applied Physics:
Tutorial: Defects in semiconductors—Combining experiment and theory
Audrius Alkauskas, Matthew D. McCluskey and Chris G. Van de Walle

Read more…

Simpler, Faster, Cheaper...

Prefilling single-wall carbon nanotubes with a desired chemical before separating and dispersing them in water yields a consistently uniform collection, especially important for optical applications.
Credit: Fagan/NIST

Topics: Carbon Nanotubes, Electrical Engineering, Nanotechnology, Semiconductor Technology

Just as many of us might be resigned to clogged salt shakers or rush-hour traffic, those working to exploit the special properties of carbon nanotubes have typically shrugged their shoulders when these tiniest of cylinders fill with water during processing. But for nanotube practitioners who have reached their Popeye threshold and “can’t stands no more,” the National Institute of Standards and Technology (NIST) has devised a cheap, quick and effective strategy that reliably enhances the quality and consistency of the materials—important for using them effectively in applications such as new computing technologies.

To prevent filling of the cores of single-wall carbon nanotubes with water or other detrimental substances, the NIST researchers advise intentionally prefilling them with a desired chemical of known properties. Taking this step before separating and dispersing the materials, usually done in water, yields a consistently uniform collection of nanotubes. In quantity and quality, the results are superior to water-filled nanotubes, especially for optical applications such as sensors and photodetectors.

The approach opens a straightforward route for engineering single-wall carbon nanotubes—rolled-up sheets of carbon atoms arranged like chicken wire or honeycombs—with improved or new properties.

“This approach is so easy, inexpensive and broadly useful that I can’t think of a reason not to use it,” said NIST chemical engineer Jeffrey Fagan.

NIST:
Simpler, Faster and Cheaper: A Full-filling Approach to Making Carbon Nanotubes of Consistent Quality, Mark Bello

Read more…

Wearable Photovoltaics...

Ultra-thin solar cells are flexible enough to bend around small objects, such as the 1mm-thick edge of a glass slide, as shown here.
CREDIT: Juho Kim, et al/APL

Topics: Consumer Electronics, Electrical Engineering, Materials Science, Photovoltaics, Solar Power

WASHINGTON, D.C., June 20, 2016 -- Scientists in South Korea have made ultra-thin photovoltaics flexible enough to wrap around the average pencil. The bendy solar cells could power wearable electronics like fitness trackers and smart glasses. The researchers report the results in the journal Applied Physics Letters, from AIP Publishing.

Thin materials flex more easily than thick ones -- think a piece of paper versus a cardboard shipping box. The reason for the difference: the stress in a material while it's being bent increases farther out from the central plane. Because thick sheets have more material farther out, they are harder to bend.

“Our photovoltaic is about 1 micrometer thick,” said Jongho Lee, an engineer at the Gwangju Institute of Science and Technology in South Korea. One micrometer is much thinner than an average human hair. Standard photovoltaics are usually hundreds of times thicker, and even most other thin photovoltaics are 2 to 4 times thicker.
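The thickness argument can be put in numbers: bending a film of thickness t to a radius of curvature R strains its outer surface by roughly ε = t/(2R). A minimal sketch comparing the 1 µm cell to a thicker one (the pencil radius and the 200 µm "standard" thickness are illustrative assumptions):

```python
def surface_strain(thickness_m: float, bend_radius_m: float) -> float:
    """Peak strain at the outer surface of a film bent to the given radius."""
    return thickness_m / (2 * bend_radius_m)

pencil_radius = 3.5e-3  # ~7 mm diameter pencil (assumption)
for label, t in (("ultra-thin cell, 1 um", 1e-6), ("thicker cell, 200 um", 200e-6)):
    print(f"{label}: {surface_strain(t, pencil_radius) * 100:.3f}% surface strain")
```

The ultra-thin cell sees strains a couple of hundred times smaller for the same bend, which is the whole advantage.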

AIP: Ultra-thin Solar Cells Can Easily Bend Around a Pencil, Catherine Meyers

Read more…

Exciton Condensate...

Figure 1: A Coulomb drag experiment measures the interactions between charges in two closely spaced layers. The experiment entails running a current through the “drive” layer (here, the top layer) and measuring the resulting flow of charge in the “drag” layer (the bottom layer). The panels indicate three (of many) possible drag scenarios associated with two sheets of bilayer graphene (grey). At left, exciton pairs form between holes (red) in the drive layer and electrons (green) in the drag layer, giving rise to a large drag effect. At center, holes drag electrons in the same direction (positive drag) because of momentum transfer between the charges in different sheets. At right, holes drag electrons in the opposite direction (negative drag), an observation in bilayer graphene that is yet to be explained.

Topics: Atomic Physics, Bose-Einstein Condensate, Condensed Matter Physics, Quantum Mechanics

Superfluids (fluids with zero viscosity) and superconductors (materials with zero resistance) have a common ingredient: bosons. These particles obey Bose-Einstein statistics, allowing a collection of them at low temperatures to collapse into a single quantum-mechanical state, or Bose-Einstein condensate. Bosons in superconductors consist of two paired electrons, but the pairing is weak and only occurs at low temperatures. In a quest to build devices that carry electricity with low dissipation at higher temperatures, researchers have therefore explored the possibility of engineering electrical condensates [1] out of strongly bound pairs of electrons and holes, or excitons. Now, two research groups have, independently, fabricated and characterized a graphene-based device that is thought to be a promising platform for realizing an exciton condensate [2, 3]. Neither group has yet found evidence for such a condensate—the ultimate goal of such experiments. But their measurements lay the groundwork for future searches.

Excitons form in semiconductors and insulators. The binding energy between the exciton’s electron and hole can be quite strong, greatly exceeding their thermal energy at room temperature. Unfortunately, excitons recombine quickly, too fast to allow a condensate to form. Although excitons coupled to light confined within a cavity can form hybrid particles (exciton-polaritons) that do live long enough to condense [4], such condensates require a continuous input of light.
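"Greatly exceeding their thermal energy at room temperature" is a quick calculation: k_B T at 300 K is about 26 meV. A sketch of the comparison (the binding energies below are typical order-of-magnitude literature values for illustration, not figures from the Viewpoint):

```python
K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

kT_room = K_B_EV * 300  # thermal energy at room temperature, in eV
print(f"k_B T at 300 K = {kT_room * 1e3:.1f} meV")

# Illustrative exciton binding energies (order-of-magnitude assumptions)
for system, e_bind_mev in (("weakly bound exciton (bulk GaAs)", 4),
                           ("strongly bound exciton", 150)):
    ratio = e_bind_mev / (kT_room * 1e3)
    print(f"{system}: {e_bind_mev} meV, ratio to k_B T = {ratio:.1f}")
```

Only the strongly bound case survives room-temperature thermal agitation, which is why engineered structures are needed.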

APS Viewpoint: Chasing the Exciton Condensate
Michael S. Fuhrer, Alex R. Hamilton

Read more…

A New Migration...

Topics: Climate Change, History, Octavia Butler, Politics, Science Fiction


It's been a breathtaking seven days that puts into context what a president has to do: gather information and calm fears after now a second shooting of police - the first generated by the executions of Alton Sterling and Philando Castile - and a terrorist attack by truck in Nice, France, all against the backdrop of two political conventions poised to pick this president's successor in a volatile world. This election will be a reflection of our fears and our character: who we really are, beyond our own self-deluding mythology.

Some context: "The Great Migration" was of approximately six million African Americans from the rural south to northern cities for opportunities in the budding industrial revolution and (hopefully) AWAY from the De Jure and De Facto segregation, Jim Crow and racial terrorism they were all fleeing. Notable ex-patriots: The ancestors of First Lady Michelle Obama (documented in "The Warmth of Other Suns" by Isabel Wilkerson); James Lee Boggs, deceased husband of Grace Lee Boggs and author of "The American Revolution: Pages from a Negro Worker's Notebook," in which he predicted the impacts of automation and what he referred to at the time "cybernation" that we recognize as the advent of computers in what were once jobs done by humans and less robotics or apps.

Note the plot synopsis from "Parable of the Sower" written by Octavia Butler in 1993:

Set in a future where government has all but collapsed, Parable of the Sower centers on a young woman named Lauren Olamina who possesses what Butler dubbed hyperempathy – the ability to feel the perceived pain and other sensations of others – who develops a benign philosophical and religious system during her childhood in the remnants of a gated community in Los Angeles. Civil society has reverted to relative anarchy due to resource scarcity and poverty. When the community's security is compromised, her home is destroyed and her family murdered. She travels north with some survivors to try to start a community where her religion, called Earthseed, can grow. Wikipedia

Now look at the plot of the US as it relates to a heating climate (I'm sure the same applies overseas as well):


The previous migration was a drive for opportunity and fairness; the next one will be for the first level of Maslow's hierarchy: comfort. The strain on resources will split humanity along tribal and factional lines like never before. Those who "have" will hoard and build walled cities; defended castles to keep their bounty from the hungered herds of "have-nots." For those youth that will still be around (I'm not anticipating I will be), as 2050 approaches they will see how far we've actually migrated...from the caves.

Scientific American: U.S. Cities Are Getting Dangerously Hot [Graphic]
A dramatic rise in “danger days” is underway, Mark Fischetti

Read more…

Scientism...

Source: izquotes.com

Topics: History, Physics, Philosophy, Science


Scientism: It's an old word, so old that you have to add it to your online dictionary almost everywhere you might type it. It also at first glance sounds reasonable, and in my own oft-used urban descriptor: "science-y."

This description at the beginning of the article from The American Association for the Advancement of Science is instructive and concise:

Historian Richard G. Olson defines scientism as “efforts to extend scientific ideas, methods, practices, and attitudes to matters of human social and political concern.” (1) But this formulation is so broad as to render it virtually useless. Philosopher Tom Sorell offers a more precise definition: “Scientism is a matter of putting too high a value on natural science in comparison with other branches of learning or culture.” (2) MIT physicist Ian Hutchinson offers a closely related version, but more extreme: “Science, modeled on the natural sciences, is the only source of real knowledge.” (3) The latter two definitions are far more precise and will better help us evaluate scientism’s merit.

A History of Scientism

The Scientific Revolution


The roots of scientism extend as far back as early 17th century Europe, an era that came to be known as the Scientific Revolution. Up to that point, most scholars had been highly deferent to intellectual tradition, largely a combination of Judeo-Christian scripture and ancient Greek philosophy. But a torrent of new learning during the late Renaissance began to challenge the authority of the ancients, and long-established intellectual foundations began to crack. The Englishman Francis Bacon, the Frenchman Rene Descartes, and the Italian Galileo Galilei spearheaded an international movement proclaiming a new foundation for learning, one that involved careful scrutiny of nature instead of analysis of ancient texts.

Descartes and Bacon used particularly strong rhetoric to carve out space for their new methods. They claimed that by learning how the physical world worked, we could become “masters and possessors of nature.”(4) In doing so, humans could overcome hunger through innovations in agriculture, eliminate disease through medical research, and dramatically improve overall quality of life through technology and industry. Ultimately, science would save humans from unnecessary suffering and their self-destructive tendencies. And it promised to achieve these goals in this world, not the afterlife. It was a bold, prophetic vision.

From these seeds formed the basis for utopia: H.G. Wells was the first science fiction writer to tackle it; Utopia itself was written, I think, before the genre was invented by Mary Shelley ("Frankenstein," fairly dystopian to say the least). Star Trek and the proclivities of Gene Roddenberry (an atheist) embodied it in Mr. Spock and the planet Vulcan: human contact with an entire species of beings supposedly led fully by logic and reason. The Earth - post-Armageddon - surviving its own hubris and learning to cooperate beyond borders, languages, religions and the previous things that separated the human tribe and made "Mutually Assured Destruction" (M.A.D.) possible in a hopefully fictional Trek timeline.

New Thought: It apparently started in the 19th century, originating with Phineas Parkhurst Quimby - imitated ad nauseam by opportunistic others, branching into several realms via modern communications (radio, television, Internet), from faith healers to prosperity gospel, pseudoscience and general quackery. As the link indicates, the enduring appeal is humans feeling empowered in an unpredictable and often cruel cosmos. Many traditional, nondenominational, modern and/or New Age gurus have cashed in on this uncertainty quite lucratively. You can see its sustained and prosperous modern incarnations with a simple exercise of channel-surfing.

I would say scientism in its modern expression would be (a representative, off-the-top-of-my-head trio) Richard Dawkins, Sam Harris and Neil deGrasse Tyson. They ARE scientists, but have made a lucrative living speaking and writing about the virtues of science; how if we all thought more rationally we wouldn't have to wait for heaven on Earth: we could design it ourselves. Sociologist Jeffrey Guhin in New Scientist challenged the idea Tyson forwarded of a nation totally run by logic, reason and science (sound familiar? \\//_). He posits the very simple question that gives one pause: what does "rational" mean? Things that "sounded" rational and science-y, like eugenics, were used to justify wholesale discriminatory behavior by Hitler's Third Reich (you know: concentration camps and gas chambers). If we just "follow the data" of standardized test scores, then the often-debunked thesis behind "The Bell Curve" sounds rational, because one does not have to take into account generations of poverty vis-à-vis slavery; sharecropping (a word that is a contradiction in terms on its own); racial terrorism; Jim Crow; De Facto and De Jure segregation; bank redlining; differentiated education (for me, torn and outdated books supplemented by xeroxed copies my teachers purchased at their own expense) and no career opportunities to climb the economic ladder to a better life. The better correlation is between the wealth of parents and guardians and academic achievement, and most of that wealth happens to reside with the dominant culture.

The National Science Foundation (I think) was right to commission a study on Science Literacy and the public good, as more than anything that will determine the outcome of nations as we share and contest resources on this Earth, or prepare as a species to inhabit other worlds to extend us beyond the fate of the dinosaurs.

The broad brush of "all we need is science" is the proportional equivalent to its antithesis: "all we need is (fill in the blank): Buddha, Chia Pets, Gaia, Jesus, Odin, Mood Rings, Mood Rocks, Positive Thinking, Possibility Thinking, Prayer Cloths, Quantum Physics (since we travel < c, highly doubtful), Holy Water; Thor."

What we could all use is a return to actually teaching civics to our respective populations, and leave proselytizing to family units. Methinks both camps need to step back and consider a true "separation of church and state" (& science). It will benefit both camps to stay in their lanes, without either one harmfully denigrating the other. We need to survive together as a species, or in the words of Dr. King, "perish together as fools." The Earth does not need us to circumnavigate the sun, and the universe, if we were so foolish, wouldn't blink at our hubris...or departure.
Read more…

Neural Networks and H2O...

Schematic showing water molecules in the denser water phase (left) and the ice phase. (Courtesy: Tobias Morawietz)


Topics: Artificial Intelligence, Computer Engineering, Computer Science


Artificial neural networks have been used to simulate interactions between water molecules and provide important clues about the remarkable properties of this life-giving substance. The study has been carried out by physicists in Germany and Austria, who used the networks to perform simulations 100,000 times faster than possible with conventional computers. Their work offers explanations for two key properties of water – its maximum density at 4 °C and its melting temperature – but the technique could be expanded to include other aspects of this ubiquitous substance.

Physicists and chemists have long found water's unusual properties difficult to explain. Its density, for example, peaks at around 4 °C, which means that frozen water floats on liquid water – a property that is vital for aquatic creatures that have to survive in cold climates. Massive computer simulations have shown that hydrogen bonds between water molecules play a key role, but these simulations do not tell the whole story.

One key challenge is understanding the role of van der Waals interactions, which arise from quantum fluctuations in the electrical polarizations of water and other molecules. Van der Waals interactions have traditionally been hard to include in computer simulations, but Tobias Morawietz and colleagues at the Ruhr-Universität Bochum and the University of Vienna have now used artificial neural networks (ANNs) to model them in water. ANNs are computer algorithms that "learn" how to perform a specific task by being fed data related to that task. An ANN could, for example, learn how to recognize an individual's face by being fed photographs of people and being told which images are of the target person.
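In the spirit of that description, here is a minimal sketch of the learn-from-data loop: a tiny one-hidden-layer network fitted by gradient descent to a toy 1D pair potential. Everything here (the Lennard-Jones-style target, the network size, the learning rate) is an illustrative stand-in for the high-dimensional neural-network potentials actually used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: a Lennard-Jones-like pair energy, standing in for the
# ab initio energies a real network potential is trained on.
r = np.linspace(0.9, 2.5, 200).reshape(-1, 1)
energy = 4 * (r**-12 - r**-6)

# One hidden layer, tanh activation: E_pred = tanh(r W1 + b1) W2 + b2
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)

lr = 0.01
for step in range(20_000):
    h = np.tanh(r @ W1 + b1)        # hidden activations
    pred = h @ W2 + b2              # predicted energies
    err = pred - energy
    # Backpropagate mean-squared-error gradients
    gW2 = h.T @ err / len(r); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = r.T @ dh / len(r); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"final RMSE: {np.sqrt((err**2).mean()):.4f}")
```

Once trained, evaluating the network is just two small matrix products, which is the source of the enormous speedup over recomputing energies from first principles.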

Physics World: Neural networks provide deep insights into the mysteries of water
Hamish Johnston

Read more…

Genesis Planet...

Image Source: Daily Galaxy link below


Topics: Astronomy, Astrophysics, Big Bang, Cosmology, White Dwarfs


I took the title from the Daily Galaxy's original post. It seemed apropos and succinct, but I am aware of the strong feelings it may generate.

Science strives mightily to fight "confirmation bias": "the tendency to interpret new evidence as confirmation of one's existing beliefs or theories." The way scientists try to weed out bias is through peer review. Feelings are bruised, but truth is winnowed from social and preconceived chaff. Previous theories once held in high regard are thrown away. As new technology and instruments become available, this disciplined process is repeated. A scientific discovery may or may not confirm preconceived notions; it's usually the latter. Clinging to those notions anyway is not science, but the seeds of the boondoggle, pseudoscience and superstition; it is the natural tendency in an ever-changing world to reach for the comfortable instead of lighting "a candle in the dark" (Carl Sagan).

“There are more things in Heaven and Earth, Horatio, than are dreamt of in your philosophy.”

William Shakespeare, Hamlet

"I would rather have questions I can't answer, than answers I can't question."

Richard Feynman

In 2015, NASA's Hubble Space Telescope precisely measured the mass of the oldest known planet in our Milky Way galaxy. At an estimated age of 13 billion years, the planet is more than twice as old as Earth's 4.5 billion years. It's about as old as a planet can be. It formed around a young, sun-like star barely 1 billion years after our universe's birth in the Big Bang. The ancient planet has had a remarkable history because it resides in an unlikely, rough neighborhood. A few intrepid astronomers have concluded that the most productive place to look for planets that can support life is around dim, dying stars: white dwarfs.

"In the quest for extraterrestrial biological signatures, the first stars we study should be white dwarfs," said Avi Loeb, theorist at the Harvard-Smithsonian Center for Astrophysics (CfA) and director of the Institute for Theory and Computation. Even dying stars could host planets with life - and if such life exists, we might be able to detect it within the next decade.

The ancient planet orbits a peculiar pair of burned-out stars in the crowded core of a cluster of more than 100,000 stars. The new Hubble findings close a decade of speculation and debate about the identity of this ancient world. Until Hubble's measurement, astronomers had debated the identity of this object. Was it a planet or a brown dwarf? Hubble's analysis shows that the object is 2.5 times the mass of Jupiter, confirming that it is a planet. Its very existence provides tantalizing evidence that the first planets formed rapidly, within a billion years of the Big Bang, leading astronomers to conclude that planets may be very abundant in our galaxy.

The Daily Galaxy:
Hubble Space Telescope Reveals "The Genesis Planet" --The Oldest Known Planet in the Milky Way (Today's Most Popular)

Read more…

cQED...

A. Houck/Princeton

Figure 1: Scanning defect microscopy provides a map of photons in a resonator lattice. Houck and colleagues demonstrated the technique using 49 resonators (grey lines) that were coupled together to form a kagome lattice. This configuration consists of a triangular arrangement of three resonators at each point in a honeycomb lattice.

Topics: Electrical Engineering, Nanotechnology, Quantum Electrodynamics


A scanning probe detects the quantum states of photons in a microwave circuit, providing the information needed for quantum simulations.

Quantum mechanics rules the dynamics of light and matter. Yet performing a quantum-mechanical simulation of a material from first principles is practically impossible on a classical computer because the complexity of the simulation increases exponentially with the number of particles involved. The solution, according to Richard Feynman, was to build a machine out of quantum building blocks that could directly emulate the material itself [1]. Prototypes of such quantum simulators that are based on ultracold atoms, ions, photons, and superconducting microwave circuits are now available [2], with the latter, in particular, having attracted Silicon Valley’s interest. The challenge with these circuit-based simulators, however, is that they are 2D, which complicates the readout of their constituent elements. Andrew Houck from Princeton University, New Jersey, and colleagues have now delivered an attractive solution by developing a technique [3], called scanning defect microscopy (Fig. 1), that determines the number of photons occupying each mode of a 2D microwave circuit. It is this information that would serve as the fundamental input and output for certain quantum simulations.

Superconducting microwave circuits combine electronic and photonic degrees of freedom [4, 5]. The main element of the circuit is a transmission line, which is made up of a central superconducting wire separated by a gap from two grounded plates. All of these structures are on a single plane, as if one had taken a 2D slice through a coaxial cable. When truncated, the transmission line becomes a resonator, which can host discrete photon modes within its gaps. Large lattices of resonators can be engineered in various 1D or 2D geometries by coupling two, three, or more resonators together via a capacitive interface. In many ways, photons in such devices behave similarly to electrons in a solid.
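The "photons behave like electrons in a solid" analogy can be made concrete with a tight-binding model: each resonator contributes a mode at a bare frequency ω0, and coupling J between neighbors hybridizes the modes into a band. A minimal sketch for a 1D chain (the chain geometry, ω0, and J are illustrative assumptions; the actual device is a 2D kagome lattice of 49 resonators):

```python
import numpy as np

# Tight-binding model of N coupled microwave resonators (1D chain for simplicity)
N = 49
omega0 = 8.0   # bare resonator frequency, GHz (illustrative)
J = 0.05       # nearest-neighbor coupling, GHz (illustrative)

H = np.diag(np.full(N, omega0))
for i in range(N - 1):
    H[i, i + 1] = H[i + 1, i] = -J   # capacitive coupling between neighbors

freqs, modes = np.linalg.eigh(H)     # photon normal-mode frequencies and profiles
print(f"band spans {freqs.min():.3f} to {freqs.max():.3f} GHz "
      f"(width ~4J = {4 * J:.2f} GHz)")
```

Building the same Hamiltonian on a kagome lattice instead would reproduce that lattice's characteristic flat band alongside two dispersive bands.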

To make microwave circuits that can simulate quantum phenomena faster than a classical computer, however, resonator lattices have to be integrated with superconducting qubits. Such qubits are controlled with electrical currents in a Josephson tunnel junction, and in many respects, they behave like artificial atoms, which couple to the photons in the circuit. As a result, superconducting microwave circuits can be used to explore the coupling between the quantum states of light and matter, the regime of circuit quantum electrodynamics (cQED). Photons in these devices often exhibit striking matter-like behavior [6, 7], providing the basis for the simulation of complex materials. Such circuits can be fabricated on a substrate using standard lithographic techniques, with qubits and resonators that are hundreds of micrometers or even millimeters in size.

APS Viewpoint: A Bird’s Eye View of Circuit Photons
Sebastian Schmidt, Institute for Theoretical Physics, ETH Zurich, 8093 Zurich, Switzerland

Read more…

Fuels and Futures...



Figure 1. The correlation between hydrocarbon-based power consumption and economic output for most countries on Earth. A power-law fit finds that annual GDP per person is G = $10,500 × (C/kW)^0.64, where C is hydrocarbon-based energy consumption per second per person. The tight power-law relationship indicates that economic prosperity is not currently feasible without consumption of hydrocarbon fuels. The power law is reminiscent of scaling laws in biology [15]; the flow of petroleum through economies resembles the flow of blood in mammals. On average, the hydrocarbon power consumed in the US is 8 kW per person, the same as 80 incandescent 100 W bulbs burning continuously. If the US were to rely only on its currently available renewables—biomass cogeneration, wood, hydropower, geothermal, wind, passive solar, and photovoltaics—power consumption would drop to four bulbs per person; eliminating hydropower and biofuels would reduce the number to one or two. The reduction would entail such a change in lifestyle as to make the US unrecognizable [16]. (Data source: Central Intelligence Agency, World Factbook, 2015; DOE/Energy Information Administration, 2015.)
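Reading the fitted power law back out of the caption: a short sketch evaluating G = $10,500 × (C/kW)^0.64 for a few consumption levels (the 8 kW US figure is from the caption; the other consumption values are illustrative):

```python
def gdp_per_person(power_kw: float) -> float:
    """Annual GDP per person from the caption's power-law fit."""
    return 10_500 * power_kw ** 0.64

# 8 kW/person is the US figure quoted in the caption; others are illustrative.
for c in (1.0, 2.0, 8.0):
    print(f"C = {c:.0f} kW/person -> G ≈ ${gdp_per_person(c):,.0f}")
```

The sublinear exponent (0.64) means each additional kilowatt buys proportionally less output, but the fit still implies no prosperity without substantial hydrocarbon consumption.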



Citation: Phys. Today 69, 7, 46 (2016); http://dx.doi.org/10.1063/PT.3.3236



Topics: Alternative Energy, Economy, Green Energy, Green Tech, Politics


President George W. Bush famously said "we're addicted to oil." That's an understatement: hydrocarbons are evidently the underpinning of the planetary economy.

The sad part is, without physics to stage an intervention of sorts, the kind of utopia envisioned by Gene Roddenberry in Star Trek is highly unlikely. We're already showing the strains of automation, globalization and trade deals made without forethought about their impacts on populations at the bottom of societal ladders. That makes way for demagogues in the US, the UK and elsewhere who don't quite have a clue how to solve the problem, but play into xenophobic fears (as evidenced) to their advantage.

To contend with the challenges of fueling modern society, the physics community must collaborate with other disciplines and remain broadly engaged in research and education on energy.

For how long and in what ways can humans sustain the energy-intensive way of life we take for granted? That consequential question is one that physicists must help answer. As we pass the middle of 2016, oil prices are at a 10-year low, partly because of the surge in production of oil and natural gas from fracking. The current fracking boom may ease the transition to a new mix of energy resources. Conversely, it may make us complacent and delay the transition or incite popular resentment and impede the transition.

The physics community must participate in shaping how energy issues play out over the coming decades. The development of fusion reactors, photovoltaic cells, and other potential energy sources clearly requires contributions from physicists. As educators, many of us occupy the central position of teaching students the very definition of energy and the fundamental limits on extraction of free energy from heat. Beyond the classroom, we should all be concerned with the public’s understanding of what energy means. Even in the specific case of fossil fuels, there is room for our increased technical engagement through collaboration.

Physics Today: Physics, fracking, fuel, and the future
Michael Marder, Tadeusz Patzek and Scott W. Tinker

Read more…