Reginald L. Goodwin's Posts (3117)

Electric Realpolitik...

Figure 1. A discharging battery converts chemical potential into electric potential. At the anode, an oxidation reaction frees electrons (e−) from their parent atoms. The electrons pass through an external circuit, where they do work on a load, while the ions they leave behind diffuse through an electrolyte and separator to the cathode. There, the electrons and ions recombine via a reduction reaction. During recharging, the process is reversed, and the anode is restored. In lithium-ion batteries, the electrode materials are typically layered structures, with lithium stored in the gaps between layers.

Topics: Alternative Energy, Electrical Vehicles, Green Tech, Global Warming, Solid State Physics

This post reminded me of the documentary "Who Killed the Electric Car?" and its synopsis that powerful forces - the same ones that fuel climate change denial, as they did obfuscation on the dangers of cigarette smoking - are holding back progress because they want no competition in the "free market" of commerce. Sounds less libertarian and more like targeted socialism for the already well-heeled 1%.

The electric vehicle’s history offers a lesson to the wise: Harvesting the fruits of basic science requires industrial foresight, investment, and a healthy dose of realpolitik.

By 2004 the all-electric vehicle seemed destined for the dustbin of history. General Motors (GM) was recalling and destroying all copies of the EV1, its first-generation electric car, after company officials convinced themselves and regulators that fuel cells, not batteries, were the ultimate power source of the future electric car. Meanwhile, hybrid electrics had begun to proliferate as a more economically viable alternative in the short run. Most batteries were then considered simply too expensive, too heavy, and too weak to power cars on their own. Then came the lithium-ion battery. (See the article by Héctor Abruña, Yasuyuki Kiya, and Jay Henderson, Physics Today, December 2008, page 43.) With higher energy density than older rechargeables—and with the ability to release that energy quickly on demand—the battery is widely viewed as having led a revival of the electric vehicle. Tesla Motors pioneered its use in automobiles with the Roadster, and today most all-electric vehicles have batteries that use some sort of lithium chemistry. Although concerns about safety, cost, and durability linger, few would dispute that the lithium-ion battery has been the chief technological enabler of the renaissance of the all-electric vehicle.

The emergence of the lithium-ion battery did not happen overnight. It was shaped for decades by the influence of materials scientists. It was the product not of a singular eureka moment but of many strands of research tracing back to the rise of the US national security state at the dawn of the Cold War. That’s when John Goodenough, a physicist by training, found himself helping to build a sophisticated air-defense computer for the US military. Although he couldn’t have imagined it at the time, he was about to embark on research that would help found solid-state ionics—the science of inserting and storing ions inside solids without changing their fundamental structures—and contribute to revolutionizing automobile transport.

The many twists and turns that ensued illustrate the unpredictability and contingency of innovation. The story of the long road to lithium-ion power shows how changing social, economic, and environmental conditions after World War II altered the R&D priorities of government and industry. It affords insight into how trends in the energy economy shaped science and engineering over time. And it reveals a hidden history of the shifting fortunes of physics, a discipline that has traditionally relied on state patronage.

Physics Today:
Cold War computers, California supercars, and the pursuit of lithium-ion power
Matthew N. Eisler

Read more…

Laser Plasma...

Simulation of a laser pulse that's created a plasma aperture in a thin foil. (Courtesy: Bruno Gonzalez-Izquierdo et al/Nature Communications)

Topics: Laser, Optical Physics, Plasma Physics, Research


The quality of laser-accelerated proton beams can be improved by controlling the polarization of the incident laser light, researchers in the UK have discovered. The finding could help physicists to create compact sources of proton beams for use in medicine, lithography or even astrophysics.

Beams of protons and other positive ions have a wide range of applications, including particle physics, materials processing and medicine. Proton-beam therapy, for example, is used to destroy some cancerous tumours with a minimum of collateral damage to surrounding healthy tissue. However, the practical use of proton and ion beams is held back by the need for large and expensive particle accelerators to generate high-quality beams.

One way forward is laser-plasma acceleration, in which a high-power laser pulse is fired into a target, creating a plasma in which the electrons separate from the ions. The resulting charge separation produces huge electric fields that are capable of accelerating protons, ions and electrons to very high energies.

Physics World: Laser polarization boosts quality of proton beams, Tim Wogan

Read more…

7 Kilometers...

Calgary has seen quantum communication
brettA/Getty

Topics: Computer Science, Entanglement, Quantum Computer, Quantum Teleportation

A little math for perspective: 7 kilometers x 1,000 meters/1 kilometer x 39.37 inches/1 meter x 1 foot/12 inches x 1 mile/5,280 feet = 4.35 miles. Fiber optic cables typically have a limited range, with stations to repeat/boost the signals. This could be improved, of course, as part of an infrastructure buildup that could spur the education industry - K-12 and post-secondary - to prepare future workers for building a new communications architecture. A lot of automated and outsourced jobs are not coming back, and these new ones likely wouldn't be frustrated by old money, the way the fossil fuel industry frustrates alternative energy.
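A quick sanity check of that conversion chain, in Python (my own illustration):

```python
# Convert 7 km to miles via the chain above: km -> m -> in -> ft -> mi
km = 7
meters = km * 1000            # 7,000 m
inches = meters * 39.37       # ~275,590 in
feet = inches / 12            # ~22,966 ft
miles = feet / 5280           # ~4.35 mi
print(f"{km} km is about {miles:.2f} miles")
```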

A new world record for quantum teleportation has been set, bringing quantum communication networks that can stretch between cities a step closer. Two independent teams have transferred quantum information over several kilometres of fibre optic networks.

Being able to establish teleportation over long distances is a crucial step towards exchanging quantum cryptographic keys needed for encoding data sent over the fibres.

Quantum teleportation is a phenomenon in which the quantum states of one particle can be transferred to another, distant particle without anything physical traveling between them. It relies on a property called entanglement, in which measuring the state of one particle immediately affects the state of its entangled partner, regardless of the distance between them.

Conceptually, one way of doing teleportation involves three participants: say, Alice, Bob and Charlie. In order for Alice and Bob to exchange cryptographic keys, they have to first establish the capacity for teleportation, with Charlie’s help.

First Alice sends a particle (A) to Charlie. Bob, meanwhile, creates a pair of entangled particles (B & C), sends B to Charlie and holds on to C. Charlie receives both A and B, and measures the particles in such a way that it’s impossible to tell which particle was sent by Alice and which by Bob. This so-called Bell state measurement results in the quantum state of particle A being transferred to particle C, which is with Bob.
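A minimal numpy sketch of that Alice-Bob-Charlie protocol (my own toy illustration, not the experiments' actual optical setup): Charlie's Bell-state measurement on A and B leaves Bob's particle C in Alice's original state, up to a Pauli correction that depends on which of the four Bell outcomes Charlie announces.

```python
import numpy as np

# Single-qubit operators and basis states
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

def kron(*vecs):
    out = np.array([1.0])
    for v in vecs:
        out = np.kron(out, v)
    return out

# Alice's unknown state |psi>_A = a|0> + b|1>
rng = np.random.default_rng(7)
a, b = rng.normal(size=2) + 1j * rng.normal(size=2)
a, b = np.array([a, b]) / np.sqrt(abs(a)**2 + abs(b)**2)
psi_A = a * ket0 + b * ket1

# Bob prepares the entangled pair |Phi+>_BC, sends B to Charlie, keeps C
bell_BC = (kron(ket0, ket0) + kron(ket1, ket1)) / np.sqrt(2)

# Joint state, qubit order A, B, C; reshape so rows index (A,B), columns index C
state = kron(psi_A, bell_BC).reshape(4, 2)

# Charlie's four possible Bell outcomes on (A,B), with Bob's Pauli correction
bell_AB = {
    "Phi+": ((kron(ket0, ket0) + kron(ket1, ket1)) / np.sqrt(2), I2),
    "Phi-": ((kron(ket0, ket0) - kron(ket1, ket1)) / np.sqrt(2), Z),
    "Psi+": ((kron(ket0, ket1) + kron(ket1, ket0)) / np.sqrt(2), X),
    "Psi-": ((kron(ket0, ket1) - kron(ket1, ket0)) / np.sqrt(2), Z @ X),
}

for outcome, (proj, correction) in bell_AB.items():
    c_state = correction @ (proj.conj() @ state)   # Bob's qubit C after correction
    c_state /= np.linalg.norm(c_state)
    fidelity = abs(np.vdot(psi_A, c_state))**2     # overlap with Alice's original
    print(f"{outcome}: fidelity = {fidelity:.6f}")
```

Each of the four outcomes prints a fidelity of 1: the state is faithfully transferred to C even though only Charlie's classical two-bit result ever travels to Bob.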

New Scientist:
Quantum teleportation over 7 kilometres of cables smashes record, Anil Ananthaswamy

Read more…

Apathy's Aftermath...

At the City Market in Charleston, S.C., one of the most popular spots in town, shoppers dodged seawater that bubbled up from storm drains during high tide in June.
Credit Hunter McRae for The New York Times


Topics: Economy, Education, Climate Change, Global Warming, Politics


This Is What It Looks Like: That's the post I wrote for the impact of Hurricane Sandy; recalling my experience with Hurricanes Katrina and Rita. Yet Congressman Lamar Smith is not impressed [2]. We're at the point where only the blithely self-ignorant would overlook the signs. How many polar bears on receding ice; how many that turn to cannibalism to momentarily survive shall we ignore...until we cannot?

The problem is we all think short-term: business quarters and targets, election cycles - not generations, not posterity. As long as our hedge fund earns this quarter, or we win an election cycle, we're stupidly callous of the unintended consequences our actions have down the road. The only thing of concern in a self-centered culture is the immediacy of NOW. We have the equivalent of Cray supercomputers in our hip pockets, a search engine that's eviscerated the need to memorize, take notes on or recall ANYTHING, and a disdain for expertise in every sphere of endeavor: we have devolved, and it has obviously affected our politics.

A bigoted birther, bloviating, conspiracy provocateur and reality-TV star is a major party nominee and Keeping Up With the Kardashians is a top-rated show. Before the tragic death of Anna Nicole Smith, Bill Maher reflected on his show: "but, what does she do?" We are living in an epoch where fame is more important than skills one studies for, earns and gains experience in. Young people used to dream of being astronauts, doctors or politicians; now they aspire to be reality TV stars. We are literally at a juncture in history where people are "famous for being famous" and nothing else.


The New York Times: For decades, as the global warming created by human emissions caused land ice to melt and ocean water to expand, scientists warned that the accelerating rise of the sea would eventually imperil the United States’ coastline.

Now, those warnings are no longer theoretical: The inundation of the coast has begun. The sea has crept up to the point that a high tide and a brisk wind are all it takes to send water pouring into streets and homes. [1]

The New Yorker: From climate change and evolution to sex education and vaccination, there has always been tension between scientists and Congress. But Smith, who has been in Congress since 1987 and assumed the chairmanship of the Science Committee in 2013, has escalated that tension into outright war. [Congressman Lamar] Smith has a background in American studies and law, not science. He has, however, received more than six hundred thousand dollars in campaign contributions from the oil-and-gas industry during his time in Congress—more than from any other single industry. With a focus that is unprecedented, he’s now using his position to attack scientists and activists who work on climate change. Under his leadership, the committee has issued more subpoenas than it had during its previous fifty-four-year history. [2]

The Insider, starring Russell Crowe, supplies the obvious analogy to climate change denial: the tobacco industry. The two campaigns tend to use the same lawyers and disinformation tactics.

After seeking the expertise of former "Big Tobacco" executive Jeffrey Wigand (Russell Crowe), seasoned TV producer Lowell Bergman (Al Pacino) suspects a story lies behind Wigand's reluctance to speak. As Bergman persuades Wigand to share his knowledge of industry secrets, the two must contend with the courts and the corporations that stand between them and exposing the truth. All the while, Wigand must struggle to maintain his family life amidst lawsuits and death threats.

I would hate to watch the televised Mea Culpa of Lamar Smith or other like-minded political figures who have pushed this mythology of climate conspiracy - about how they were wrong - as "Super Storm _____" becomes the norm. I would hate to see the blank faces as military naval bases are damaged; as more conflagrations are spurred by weather-driven shortages; the silence that speaks more volumes than the actual answer they cannot say...

"I don't know."

That will not be existentially, "good enough" or expedient.

1. Flooding of Coasts, Caused By Global Warming, Has Already Begun, Justin Gillis
2. The House Science Committee's Anti-Science Rampage, Lawrence M. Krauss

Read more…

Free Money...

Image Source: Technology Review (see Basic Income below)


Topics: Economy, Existentialism, Philosophy


I have the unpleasant distinction of having been on unemployment - I even self-published a book about it. I don't recall those as halcyon days.

My unemployment compensation amounted to $1,320 per month, which the State of Texas initially sent as a live check and later loaded onto a debit card. I had an obligation of a minimum of three job searches per week, which I had to keep logs of and, if asked - unannounced - produce.

Let's say I made $200 for one week, as I did in seasonal work for a shipping company lifting boxes and a sales job with a security company. I had to report that minuscule increase. Then, legally, the state unemployment office adjusted what I received to $1,120/month. In essence, your ceiling was established. Lying was illegal, state and federally, and I did not desire to see life behind bars. You, of course, desired to earn whatever living you had become accustomed to prior to unemployment. It was the financial equivalent of treading water...
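As a toy illustration of that arithmetic (the real offset rules are more involved and vary by state):

```python
# Numbers from the post; the dollar-for-dollar offset here is a simplification
monthly_benefit = 1320    # TX unemployment compensation, $/month
weekly_earnings = 200     # one reported week of seasonal work

adjusted = monthly_benefit - weekly_earnings
print(f"Adjusted monthly benefit: ${adjusted:,}")                   # $1,120

# Without conditions, a full year at the base rate:
print(f"Unconditional 12-month total: ${12 * monthly_benefit:,}")   # $15,840
```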

I read the print version of Technology Review's business issue because of the cover - titled "Free Money," with a cartoon techie next to a sketched Segway handing what looked like a check or money to a presumably out-of-work caricatured individual. The Silicon Valley idea was to give the unemployed $10,000 per year to give them space to "invent" or invest in their own education to retrain for another career.

Caveat 1: Dutchess Community College Tuition and Fees (since this is where I live now)

Tuition for Full-Time Students (over 11 credits)+

New York State Resident† $1,764.00 per semester

Nonresident $3,528.00 per semester

Student Activity Fee $5.00 per credit hour

Technology Fee $13.00 per credit hour

Caveat 2 - Rent: A decent apartment runs about $1,500 - 2,100/month. Add to that FOOD,  gas or transportation; clothing and entertainment - retraining workers aren't monks. I'll leave it to you to do the math.

If the unemployment compensation were without conditions, a 12-month compensation would be $15,840.00, though Texas and many other states only pay it (with job-search conditions) for six months.

So...I'm not a fan of this approach. I agree SOMETHING has to be done, but $10,000 - 15,000 probably doesn't cut it. Something like new entry-level jobs in alternative energy - from positions that don't require much training after high school up to design engineers and researchers - would. There should be a program of continuous training and lifelong education for career advancement and, frankly, for the fact that humans get easily bored. The proverbs about "idle minds" and the work shed of Beelzebub apply.
Image Source: Ibid

However, the current level of income inequality has to be solved (reference graphs), unless we want something in a modern society - first, second or third world - decidedly undesirable, tribal, horribly stratified, weaponized and...dystopian.

MIT Technology Review:
Basic Income: A Sellout of the American Dream, David H. Freedman
What the Great Economists Would Have Thought of a Universal Basic Income
Letter to the Editor

Read more…

Moore or Less...



Topics: Economy, Jobs, Moore's Law, Semiconductor Technology, STEM


I've written about this before, but now - living this pendulum swing and the pending aftermath in the industry - I thought I'd give some perspective on what is occurring that most consumers aren't aware of.

Dimensional perspective: The average human hair is 100,000 nanometers in diameter, or 100,000 x 10^-9 meters (0.0001 meters if you were wondering). Your average smartphone has gates controlling the flow of electrons that are printed at around 35 - 20 nanometers (0.000000035 - 0.000000020 meters), so they're ridiculously small in comparison. Note that shorthand metric notation saves a lot of typing.
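To put a number on that comparison (a quick calculation, nothing more):

```python
hair_diameter = 100_000e-9   # ~100,000 nm, in meters
gate_length = 20e-9          # 20 nm, the small end of the 35-20 nm range

print(f"Hair-to-gate ratio: {hair_diameter / gate_length:,.0f} to 1")  # 5,000 to 1
```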

Process Engineering your chip: It involves Epitaxial Growth, Chemical Vapor Deposition and Physical Vapor Deposition to deposit the film layers needed for patterning; extreme ultraviolet photolithography to mask out patterns in the films, which runs into struggles with standing-wave phenomena and quantum effects; reactive ion etch to transfer those incredibly small features into layers of built-from-the-ground-up circuitry (some wet chemical etch as well, mostly to clean or clear surfaces of defects, but some etching too); ion implantation, which dopes (introduces impurities into) Silicon or Germanium to make the semiconductor conductive; chemical mechanical planarization for topology; and Rapid Thermal Anneal processes to activate certain films. That's done pretty much over and over for greater than 100 infinitesimally thin film layers to put a chip in a plastic package that ends up in your laptop or hip pocket. That is how we fill the market's demand for faster processors to share the ubiquitous cat, dancing turtle and cute puppy videos.
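Purely as a schematic of that repetition (the actual flow differs layer by layer and fab by fab), the loop looks something like this:

```python
# Schematic only: each film layer cycles through some subset of these unit processes
unit_processes = [
    "deposit film (epitaxy / CVD / PVD)",
    "pattern (photolithography)",
    "etch (reactive ion etch, plus wet cleans)",
    "dope (ion implantation)",
    "planarize (CMP)",
    "activate (rapid thermal anneal)",
]

total_steps = 0
for layer in range(100):             # "greater than 100" thin-film layers
    total_steps += len(unit_processes)
print(f"~{total_steps} unit-process steps before the chip reaches your pocket")
```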

Even without a background in physics or chemistry, a little simple math can show you why we've hit this wall and where we're likely to go next.
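For reference, the resistance formula in question is presumably the textbook relation

$$ R = \rho \frac{L}{A}, $$

where R is resistance, ρ (rho) is resistivity, L is the conductor's length and A is its cross-sectional area.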

From the formula above for resistance, "resistivity" is an intrinsic property of the material that's conducting electrons, so it doesn't change (unless you dope it). Notice that length is in the numerator and cross-sectional area is in the denominator: a longer wire has more resistance to conducting electricity than a shorter one. As these devices undergo shrinks following Moore's Law, length and area both decrease, but the decrease in cross-sectional area in particular increases resistance.

Another formula:
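Presumably this is the Joule-heating relation,

$$ P = I^2 R, $$

which says that at a fixed current I, the power P dissipated as heat grows linearly with resistance R.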


It's very simple to see that an increase in resistance - due to the shrinks typified by Moore's Law - results in an increase in power dissipation and thereby heat. Recent news reports provide numerous and quite dramatic examples of this. Heat has to be dissipated with a "heat sink" (thermodynamics), which usually entails a fan to cool your chip - a fan that has to be driven by your device's battery. It's not your imagination that your battery life is shorter after months or years of usage: lithium degrades over time, and using the same battery to power both your device and the fan that cools it is part of why. Plus, consumers are both more savvy and more satisfied: how fast does one need to share a cat video? To coin a phrase, "if it ain't broke, don't fix it!" - or, for that matter, replace it with the exploding variety. We're now at the natural limitations of Moore's Law. The game going forward will likely be memory and battery improvements, which should have transferable benefits in computing, electric vehicles and power storage - newer industries and employment.

That being said: This link from Semi Wiki on "Age, Training and Winning in the Silicon Valley Culture" is so apropos, I think it should be shouted from rooftops. If we're going to create a new economy, we need to be willing to do some things we've done traditionally - like education - differently and lifelong.


The Internet of Things (IoT) and what I like to call "smart car tech" (so-called driverless cars, which have an existing analog already: public transportation, if a national infrastructure were implemented) are a reflection of that, as those device gates are typically larger (65 - 130 nm) than what's needed for cute pet videos and Snapchat updates.

I think cell or mobile phones are ubiquitous enough that they're as much a fixture of modern life as the transistor radio and Walkman (G-d, I just dated myself!) used to be.

The industry is going "Back to the Future," so to speak, on what it will manufacture until something beyond silicon and germanium - and a wholly different application - is discovered.

Tomorrow: Free Money

Related links:

http://news.utexas.edu/2015/04/15/hot-chips-managing-moores-law

http://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338

http://www.slate.com/articles/technology/technology/2005/12/the_end_of_moores_law.html

http://www.trustedreviews.com/opinions/what-is-moore-s-law

http://techland.time.com/2012/05/01/the-collapse-of-moores-law-physicist-says-its-already-happening/

https://www.wired.com/insights/2015/01/the-rise-of-diamond-technology/

http://www.allaboutcircuits.com/textbook/direct-current/chpt-2/calculating-electric-power/

http://www.sengpielaudio.com/calculator-ohm.htm

Read more…

Re-Alerting...

Missile launch officers. (Source: Dept. of Defense)


Topics: Existentialism, Politics


I still have a faded green book, handed out in my ELEMENTARY school classroom, on how to survive a nuclear attack. The reason Putin causes such alarm now is that many in positions of influence are old enough to recall the old "duck-and-cover" days (which did absolutely NOTHING towards survival) - and to recall which side of the Cold War the former KGB master spy was on and rooting for.

This article by the Union of Concerned Scientists intrigued me. I hope it does you, and gives you pause as you read this excerpt on your laptop or mobile device. For more details, click on the link provided below.

We’ve posted previously about the dangers of the US policy of keeping nuclear missiles on hair-trigger alert so that they can be launched quickly in response to warning of attack. There is a surprisingly long list of past incidents in which human and technical errors have led to false warning of attack in both the US and the Soviet Union/Russia—increasing the risk of an accidental nuclear war.

The main reason administration officials give for keeping missiles on alert is the “re-alerting race” and crisis instability. The argument is that if the United States takes its missiles off hair-trigger alert and a crisis starts to brew, it would want to put them back on alert so they would not be vulnerable to an attack. And the act of putting them back on alert—“re-alerting”—could exacerbate the crisis and lead Russia to assume the United States was readying to launch an attack. If Russia had de-alerted its missiles, it would then re-alert them, further exacerbating the crisis. Both countries could have an incentive to act quickly, leading to instability.

This argument gets repeated so often that people assume it’s simply true.

However, the fallacy of this argument is that there is no good reason for the US to re-alert its ICBMs in a crisis. They are not needed for deterrence since, as noted above, deterrence is provided by the submarine force. Moreover, historical incidents have shown that having missiles on alert during a crisis increases the risk of a mistaken launch due to false or ambiguous warning. So having ICBMs on alert in a crisis increases the risk without providing a benefit.

Union of Concerned Scientists:
Nuclear Weapons and the Myth of the “Re-Alerting Race”
David Wright, Physicist and Co-Director, Global Security

Read more…

Freshly Squeezed...

The LIGO gravitational wave detector in Livingston, Louisiana. LIGO detectors could soon be using a new and more efficient scheme for squeezing light.


Topics: Black Holes, General Relativity, Gravitational Waves, Quantum Mechanics


The quantum state of light has been squeezed more than ever before by physicists in Germany, who have developed a new low-loss technique. Squeezed light has been used to increase the sensitivity of gravitational wave detectors, and scientists are planning to deploy the new method on the GEO600 and LIGO gravitational wave detectors.

Detecting gravitational waves – the ripples in spacetime caused by energetic events in the Universe – relies on splitting a laser beam using an interferometer and sending the two halves back and forth along two orthogonal arms. When the two halves of the beam recombine, all the light normally comes out of one port of the interferometer. A passing gravitational wave will change the relative lengths of the two arms, creating an interference pattern and directing some of the light out of the "dark" port. However, by the time they reach Earth, gravitational waves from even the most dramatic events have tiny amplitudes, so sensitivity is crucial. The first confirmed discovery of a gravitational wave, announced by LIGO in February, was produced by the collision and merger of two black holes and changed the 4 km arm lengths by barely 10^-19 m (see "LIGO detects first ever gravitational waves – from two merging black holes").

At such extreme sensitivity, one of the main noise sources in such detectors is uncorrelated photons emerging from the quantum vacuum as a result of its zero-point energy – the energy that Heisenberg's uncertainty principle dictates can never be removed from a system. But, amazingly, even this source of noise can be minimized. The uncertainty principle puts a lower limit on the product of the variance in the amplitude (or number) of photons and the variance in the phase. Vacuum photons naturally have equal variance in both amplitude and phase. It is, however, possible to create a "squeezed state" of light, in which either one of these quantities is minimized (squeezed) and the other is allowed to increase (antisqueezed).
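As an illustrative aside, in one common convention (not taken from the article): writing the field's two quadratures as X₁ and X₂, the uncertainty principle requires the standard deviations to satisfy

$$ \Delta X_1 \, \Delta X_2 \ge \tfrac{1}{4}, $$

which the vacuum saturates symmetrically with ΔX₁ = ΔX₂ = 1/2. A squeezed state trades one against the other, ΔX₁ = e^(-r)/2 and ΔX₂ = e^(+r)/2, so noise in the squeezed quadrature drops below the vacuum level by an amount set by the squeezing parameter r.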

Physics World:
Squeezed light shatters previous record for manipulating quantum uncertainty
Tim Wogan

Read more…

Ultracapacitors...

FastCAP Systems' ultracapacitors (pictured) can withstand extreme temperatures and harsh environments, opening up new uses for the devices across a wide range of industries, including oil and gas, aerospace and defense, and electric vehicles. Credit: FastCAP Systems


Topics: Condensed Matter Physics, Materials Science, Solid State Physics


Devices called ultracapacitors have recently become attractive forms of energy storage: They recharge in seconds, have very long lifespans, work with close to 100 percent efficiency, and are much lighter and less volatile than batteries. But they suffer from low energy-storage capacity and other drawbacks, meaning they mostly serve as backup power sources for things like electric cars, renewable energy technologies, and consumer devices.

But MIT spinout FastCAP Systems is developing ultracapacitors, and ultracapacitor-based systems, that offer greater energy density and other advancements. This technology has opened up new uses for the devices across a wide range of industries, including some that operate in extreme environments.

Based on MIT research, FastCAP's ultracapacitors store up to 10 times the energy and achieve 10 times the power density of commercial counterparts. They're also the only commercial ultracapacitors capable of withstanding temperatures reaching as high as 300 degrees Celsius and as low as minus 110 C, allowing them to endure conditions found in drilling wells and outer space. Most recently, the company developed a AA-battery-sized ultracapacitor with the perks of its bigger models, so clients can put the devices in places where ultracapacitors couldn't fit before.

Phys.org: New applications for ultracapacitors, Rob Matheson

Read more…

The Transcension Hypothesis...

Interstellar Wiki: Cooper in the Tesseract


Topics: Existentialism, Fermi Paradox, Philosophy, Planetary Science, Space Exploration


It's an intriguing hypothesis, albeit a convenient one since at this current juncture, it's kind of hard to prove experimentally or observationally. John G. Messerly said on the site Institute for Ethics and Emerging Technologies (quoting John Smart):

The transcension hypothesis proposes that a universal process of evolutionary development guides all sufficiently advanced civilizations into what may be called “inner space,” a computationally optimal domain of increasingly dense, productive, miniaturized, and efficient scales of space, time, energy, and matter, and eventually, to a black-hole-like destination.

In essence, they've "left the universe," which as you know, is a pretty big place. There are two important questions this generates at least for me: where and how?

Where: Did the aliens evolve to some hyper-dimensional tesseract, as in the movie Interstellar? I can see now where the idea originated. Again, convenient as hyper dimensions are presently undetectable. Or, as the embed below suggests, instead of "falling in love with Siri" we'll eventually become Siri after The Singularity?

How: Another way for intelligent civilizations to be "gone" is, unfortunately...extinction, which can be, as we are starting to see, self-induced.

Knowing the answer to either question can be illuminating and species-extending.



Ever since Enrico Fermi questioned back in the 1950’s why, if a multitude of civilisations are likely to exist in the Milky Way, no sign of their existence in the form of probes or spacecraft has ever been detected, scientists and critical thinkers have struggled to resolve the problem by supplying a host of inventive arguments with mixed reception.

To date one of the most common answers to the Great Silence was simply that life is so rare, so widely distributed, and the scale of the universe so immense, that the probability of contact or communication between any two space-faring civilisations is almost non-existent. Needless to say an outlook which seems like a very lonely, sad and pessimistic state of affairs for intelligent life to find itself in.

BrighterBrains.org:
The Transcension Hypothesis: An Intriguing Answer to the Fermi Paradox?
Owen Nicholas

Read more…

Rosetta's MIDAS...

Courtesy: http://blogs.esa.int/rosetta/2014/03/26/introducing-midas-rosettas-micro-imaging-dust-analysis-system/

Topics: Astrophysics, Comets, ESA, NASA, Rosetta, Space Exploration, Women in Science

There is a movie coming out in 2017 called "Hidden Figures" about the African American women who were "computers," as they were all called at the time. Behind the scenes and kept (purposely) out of the public eye, these scientists were responsible for mankind getting to the moon, despite what your conspiracy-provocateur uncle spouts around the dinner table at Thanksgiving.

Dr. Claudia Alexander was a project scientist on the American portion of the international Rosetta mission. She sadly lost her battle with breast cancer last year. I always try to highlight such achievements since it's obvious, in a societal-structural sense, that negative stereotypes are often forwarded to maintain an inane "status quo" while simultaneously complaining about "bootstraps." I salute Dr. Alexander, a modern Hidden Figure in Science who paved the way for new discoveries by humankind.

Thanks to in situ measurements from MIDAS (the Micro-Imaging Dust Analysis System) on-board the Rosetta spacecraft, researchers have now found out more about the structure of the dust particles on comet 67P/Churyumov-Gerasimenko. The particles are made up of aggregates and cover a range of sizes – from tens of microns to a few hundred nanometres. They also appear to have formed from the hierarchical assembly of smaller constituents and come in a range of shapes, from single grains to larger, porous aggregated particles with some dust grains being elongated. The study could shed more light on the processes that occurred when our Solar System formed nearly five billion years ago.

Planetary systems like our own Solar System started out as dust particles in protoplanetary nebulae – clouds of gas and dust that gave rise to stars and planets. The particles collided and agglomerated to form planetesimals – the building blocks of planets. Comets are leftover planetesimals and are made of ice and dust particles. They range in size from a few hundred metres to tens of kilometres and are mainly found on the outskirts of the Solar System, far from damaging radiation, high temperatures and collisions with other objects.

Nanotech Web:
Rosetta’s MIDAS analyses cometary dust particles, Belle Dumé

Read more…

Past and Future Bennu...

Image Source: SETI Institute, Osiris-REx


Topics: Asteroids, Astrophysics, NASA, Planetary Science, Space Exploration


MOUNTAIN VIEW – NASA’s OSIRIS-REx spacecraft is slated to launch from Cape Canaveral on Thursday, September 8th. Its mission is to rendezvous with asteroid Bennu in 2018, take a sample from its surface, and return that sample to Earth in 2023.

Why are scientists so interested in this ancient lump of rock? First, Bennu is one of the darkest objects in the Solar System, suggesting it is rich in organic materials that might have seeded Earth with the starting blocks of life. We cannot find these materials today, because the organic compounds that first fell to Earth have long since disappeared – processed and endlessly recycled by geology and biology. On Bennu however, these mysterious compounds have been almost perfectly preserved. Bennu is a veritable museum in space that has been waiting 4.5 billion years to open its doors to Earth’s scientists.

Scientists are also interested in the so-called Yarkovsky effect. This is a process whereby solar radiation gently nudges the asteroid, subtly changing its orbit. By being up-close with Bennu, we can better understand how surface properties affect this process. Combined with understanding material properties, this enables us to better predict if and when this asteroid might impact Earth in the 22nd century. It is a potentially dangerous lump of rock, already on our watch list.

SETI.org: Mission To Examine The Past And Safeguard The Future

Read more…

Na vs Li...

Electrochemical characteristics of Na2Ti3O7 and VOPO4 electrodes in the "half-cell format" vs. Na+/Na. Courtesy: G Yu

Topics: Condensed Matter Physics, Nanotechnology, Semiconductor Technology, Solid State Physics

Researchers at the University of Texas at Austin in the US and Nanjing University of Aeronautics and Astronautics in China have developed a high-energy sodium-ion battery based on sodium titanate nanotubes and vanadyl phosphate layered nanosheet materials. The new device, which works over a wide temperature range of –20 to +55 °C, has a high operating voltage of close to 2.9 V and delivers a large reversible capacity of 114 mA h/g. It also boasts a high energy density of 220 Wh/kg, which makes it competitive with state-of-the-art lithium-ion batteries.
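A back-of-the-envelope check on those figures (my own estimate, not from the article): gravimetric energy density is roughly the operating voltage times the specific capacity.

```python
voltage = 2.9            # operating voltage, V
capacity = 114           # reversible capacity, mAh/g

# mWh/g is numerically equal to Wh/kg
energy_density = voltage * capacity
print(f"~{energy_density:.0f} Wh/kg at the active-material level")  # prints ~331 Wh/kg
# The quoted 220 Wh/kg is lower, presumably because it counts the full cell mass.
```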

Sodium-ion batteries are similar to their lithium-ion cousins since they store energy in the same way. They consist of two electrodes – anode and cathode – separated by an electrolyte. When the battery is being charged with electrical energy, metal ions move from the cathode through the electrolyte to the anode, where they are absorbed into the bulk of the anode material. Sodium-based devices are in principle more attractive though since sodium is highly abundant on Earth (its Clarke’s number is 2.64) and is therefore much cheaper than lithium. Sodium is also more environmentally friendly than lithium.

However, the radius of the sodium ion is significantly larger than that of the lithium ion. This makes it difficult to find a host electrode material that allows the ions to be rapidly absorbed and removed. What is more, sodium-ion batteries made thus far suffer from a relatively low working potential, large capacity decay during cycling (which leads to a limited battery life) and poor safety.

Nanotechweb: Sodium-ion device could compete with lithium-ion batteries, Belle Dumé

Read more…

The Anthropocene Epoch...

Nuclear test explosion in Mururoa atoll, French Polynesia, in 1971. The official expert group says the Anthropocene should begin about 1950 and is likely to be defined by the radioactive elements dispersed across Earth by nuclear bomb tests. Photograph: AFP/Getty Images


Topics: Climate Change, Environment, Global Warming


The problem humans have is no appreciation for the vastness of the passage of time at large scales. In other words, we don't believe what we haven't seen physically - climate change, evolution and a ~13.8-billion-year-old universe being noted science examples. It's exacerbated by the Internet and our current notion that information - and thus problems and resolutions - are concluded quickly. It is sobering that this epoch has now been declared to have begun twelve years before I appeared on the planet.

Humanity’s impact on the Earth is now so profound that a new geological epoch – the Anthropocene – needs to be declared, according to an official expert group who presented the recommendation to the International Geological Congress in Cape Town on Monday.

The new epoch should begin about 1950, the experts said, and was likely to be defined by the radioactive elements dispersed across the planet by nuclear bomb tests, although an array of other signals, including plastic pollution, soot from power stations, concrete, and even the bones left by the global proliferation of the domestic chicken were now under consideration.

The current epoch, the Holocene, is the 12,000 years of stable climate since the last ice age during which all human civilisation developed. But the striking acceleration since the mid-20th century of carbon dioxide emissions and sea level rise, the global mass extinction of species, and the transformation of land by deforestation and development mark the end of that slice of geological time, the experts argue. The Earth is so profoundly changed that the Holocene must give way to the Anthropocene.

The Guardian:
The Anthropocene epoch: scientists declare dawn of human-influenced age
Damian Carrington

Read more…

Party of Apocalypse...

This is an essay I posted on Scribd.com. I wasn't going to post it until I heard about the apathy of my millennial niece and her friend back in Texas during this election cycle. I hope she reads this. I hope she's pissed off with Unc to the point she rolls her eyes and doesn't speak to me for a while...it'll mean she's at least listening.

Intro

I realize invoking the word apocalypse is a cultural malapropism, since it actually means "to reveal" rather than the popular association with Armageddon and mass extinction. Mind you, I really, REALLY wasn't going to post this because...Internet. The shiver that could result from the pat-on-the-back, self-congratulatory achievement of "going viral" can be career-limiting in many fields. However, we're on the verge of electing a contrived fiction to the most powerful office ever created and giving him the nuclear codes. I hear of a lot of millennials - among them my own niece and her friend - who aren't voting for either candidate. Rather than say you should review more "Schoolhouse Rock" videos and less "Keeping Up with the Kardashians," I'm going to "keep it 100" and put it in terms you can all understand and hopefully act on before the country you take for granted becomes your favorite dystopian movie. You can wait for the credits that won't be coming.

The first night of the RNC convention could have been a success, with the noted exception of Melania Trump lifting whole cloth parts of now First Lady Michelle Obama's speech to the DNC convention in 2008. An out-of-work journalist was the first to catch and tweet it (a sad indictment of the employed journalists ACTUALLY at the RNC convention) [1]. The last night of the RNC was like "The Dark Knight Returns": the world was essentially a shit show like Gotham, and Batman screamed 75 minutes of incoherent, semi-formed, hand-tossed Word Salad, anointing himself Bruce-Wayne-Almighty-Cheetos-Jesus, savior of the planet by the strength of his will alone (no cool gadgets - just a Galaxy smartphone and a Twitter handle he misspells as he jacks off on it almost daily). The Bat's bravery was previously demonstrated by his selfless sacrifice of five Vietnam deferments, letting others more worthy die in his place.

Link: Party of Apocalypse

Huffington Post:

GOP Operative Lashes Out At Party, Calls Trump 'Cheetos Jesus' In Epic Tweetstorm

Read more…

The Morality of Skynet...

Image Source: NY Times


Topics: Artificial Intelligence, Computer Science, Philosophy, Robotics, Singularity


From Frankenstein to Terminator, the cultural angst is the same: that which we create eventually destroys us. Now we have Siri and driverless vehicles. The Singularity is what Terminator dramatized: when an Artificial Intelligence becomes exponentially smarter than us, we may matter to it (our "children") about as much as gnats matter to us.

I've read that some have projected 2030 as the year of The Singularity. I personally think that is more of a hope than a prediction. I'll be 68, and I expect to be in reasonably good health. Its advent, I'm guessing, won't hurt too much, and will be closer to Data and the Enterprise main computer than to HAL (2001: A Space Odyssey) or the T-1000. If humanity's children are to have any morals, they will have to be the ones we're willing to display towards one another as well as teach. At this current epoch, we're not good examples to emulate.

Isaac Asimov's Three Laws of Robotics:


1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Read more…

Separation Anxiety...

Lithium-7 as a test case was successfully purified by magnetically activated and guided isotope separation with the lab setup shown here. The oven for heating lithium is sitting on the red lab jack to the right. Circular view ports were used for shining lasers to optically pump the isotopes. Inside the rectangular box are the magnetic guides.

THOMAS MAZUR
Citation: Phys. Today 69, 9, 22 (2016); http://dx.doi.org/10.1063/PT.3.3292


Topics: Atomic Physics, Isotopes, Mark G. Raizen, Research, Thermodynamics




Atomic beams, optical pumping, and magnet geometry are the crux of a fledgling method that may help meet the demand for pure isotopes.

Mark Raizen didn’t set out to separate isotopes. But a few years ago the University of Texas at Austin physicist realized that the methods he was using to cool atoms to near absolute zero could be adapted to enrich isotopes, and he had a hunch his approach—magnetically activated and guided isotope separation (MAGIS)—could help satisfy the growing demand for isotopes.

Fundamental research, medicine, energy, and other markets are finding new and growing applications for isotopically enriched materials, both stable and radioactive. “Many isotopes have been expensive and rare. They’re like an untapped natural resource,” says Raizen. It’s not unusual for enriched stable isotopes to cost $50 000 per gram, he notes.

Separation anxiety


For decades, the main instrument for separating stable isotopes has been the calutron, which was first built in 1941 and separates by charge-to-mass ratio (see the article by Bill Parkins, Physics Today, May 2005, page 45). A sample is ionized, accelerated with electric fields, and then deflected with magnetic fields. Because different isotopes of a given element have the same charge but vary in mass, they become separated in a magnetic field, with heavier isotopes deflected less. The US shuttered its last calutrons in the 1990s. Today the bulk of the world’s stable isotopes come from national inventories and from decades-old calutrons in Russia. Radioisotopes are made in reactors and accelerators around the globe.
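A back-of-the-envelope for why heavier isotopes are deflected less: an ion of mass m and charge q accelerated through a potential V and bent by a magnetic field B follows a circular path of radius

$$ r = \frac{mv}{qB} = \frac{1}{B}\sqrt{\frac{2mV}{q}}, $$

so for ions of equal charge, the heavier isotope traces the larger radius - it bends less - and arrives at a separate collector position.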



Physics Today: Can MAGIS work magic for separating stable isotopes? Toni Feder

Read more…

Big Data...

Demand for data scientists is booming. Shown here is the relative growth in US data science job postings. (Data courtesy of Indeed.com.)

Citation: Phys. Today 69, 8, 20 (2016); http://dx.doi.org/10.1063/PT.3.3261


Topics: Computer Science, Economy, Jobs, STEM


A PhD is a heavy commitment, and PhDs - just like Bachelor's- and Master's-prepared STEM graduates - have the same struggles anyone else has in the job market. It's a broad and somewhat inaccurate assumption that a STEM graduate doesn't have concerns about employment. The pendulum swings between massive need and the largest expense on balance sheets: salaries. Despite the fact that my youngest son will have a guaranteed job with his civil engineering firm, he heard over his last lunch with them before the semester started that they've had layoffs, even affecting an employee who had just come back from maternity leave. It was sobering for him, to say the least.

It is important most of all to remember why you entered a science-related field in the first place: the love of discovery, which will never change and which you should never repent of. It is also important, in knowing who you are, to be flexible.

If different people buy the same items at the grocery store, will their taste in movies also strongly overlap? Can a company recognize when someone tries to make a fraudulent payment? Is a home buyer getting a fair price? Those are the sorts of problems that data scientists tackle.

“Data science is the marriage of statistics and computer science,” says Janet Kamin, chief admissions officer at NYC Data Science Academy. “It is the art of finding patterns and insights in large sets of data that allow you to make better decisions or learn things you couldn’t otherwise learn.” The demand for data scientists is booming across industries—retail, automotive, banking, health care, and more. It’s also growing in the nonprofit and government sectors. (See the plot on page 22.)



Physics Today: Data science can be an attractive career for physicists, Toni Feder

Read more…

Quantum Supersolution Techniques...

Figure 1

(a) Two photonic wave functions on the image plane, each coming from a point source. X₁ and X₂ are the point-source positions, θ₁ is the centroid, θ₂ is the separation, and σ is the width of the point-spread function. (b) If photon counting is performed on the image plane, the statistics are Poisson with a mean intensity proportional to Λ(x) = [|ψ₁(x)|² + |ψ₂(x)|²]/2.


Topics: Modern Physics, Particle Physics, Quantum Mechanics


Abstract

Rayleigh’s criterion for resolving two incoherent point sources has been the most influential measure of optical imaging resolution for over a century. In the context of statistical image processing, violation of the criterion is especially detrimental to the estimation of the separation between the sources, and modern far-field superresolution techniques rely on suppressing the emission of close sources to enhance the localization precision. Using quantum optics, quantum metrology, and statistical analysis, here we show that, even if two close incoherent sources emit simultaneously, measurements with linear optics and photon counting can estimate their separation from the far field almost as precisely as conventional methods do for isolated sources, rendering Rayleigh’s criterion irrelevant to the problem. Our results demonstrate that superresolution can be achieved not only for fluorophores but also for stars.

APS Physics: Quantum Theory of Superresolution for Two Incoherent Optical Point Sources
Mankei Tsang, Ranjith Nair, and Xiao-Ming Lu
Phys. Rev. X 6, 031033 – Published 29 August 2016
DOI:http://dx.doi.org/10.1103/PhysRevX.6.031033

Read more…

Jupiter's Extended Family...

Comparing Jupiter with Jupiter-like planets that orbit other stars can teach us about those distant worlds, and reveal new insights about our own solar system's formation and evolution. (Illustration)
Credits: NASA/JPL-Caltech

Topics: Astronomy, Astrophysics, Exoplanets, NASA, Planetary Science, Space Exploration

Our galaxy is home to a bewildering variety of Jupiter-like worlds: hot ones, cold ones, giant versions of our own giant, pint-sized pretenders only half as big around.

Astronomers say that in our galaxy alone, a billion or more such Jupiter-like worlds could be orbiting stars other than our sun. And we can use them to gain a better understanding of our solar system and our galactic environment, including the prospects for finding life.

It turns out the inverse is also true -- we can turn our instruments and probes to our own backyard, and view Jupiter as if it were an exoplanet to learn more about those far-off worlds. The best-ever chance to do this is now, with Juno, a NASA probe the size of a basketball court, which arrived at Jupiter in July to begin a series of long, looping orbits around our solar system's largest planet. Juno is expected to capture the most detailed images of the gas giant ever seen. And with a suite of science instruments, Juno will plumb the secrets beneath Jupiter's roiling atmosphere.

It will be a very long time, if ever, before scientists who study exoplanets -- planets orbiting other stars -- get the chance to watch an interstellar probe coast into orbit around an exo-Jupiter, dozens or hundreds of light-years away. But if they ever do, it's a safe bet the scene will summon echoes of Juno.

NASA: Jupiter's Extended Family? A Billion or More

Read more…