If you could wake up tomorrow and be one of these characters, which one would you choose: Panther's powers or Icon's?
Topics: Alternative Energy, Electric Vehicles, Green Tech, Global Warming, Solid State Physics
This post reminded me of the documentary "Who Killed the Electric Car?" and its synopsis: powerful forces - the same that fuel climate change denial, as they did obfuscation about the dangers of cigarette smoking - are holding back progress because they want no competition in the "free market" of commerce. Sounds less libertarian and more like targeted socialism for the already well-heeled 1%.
The electric vehicle’s history offers a lesson to the wise: Harvesting the fruits of basic science requires industrial foresight, investment, and a healthy dose of realpolitik.
By 2004 the all-electric vehicle seemed destined for the dustbin of history. General Motors (GM) was recalling and destroying all copies of the EV1, its first-generation electric car, after company officials convinced themselves and regulators that fuel cells, not batteries, were the ultimate power source of the future electric car. Meanwhile, hybrid electrics had begun to proliferate as a more economically viable alternative in the short run. Most batteries were then considered simply too expensive, too heavy, and too weak to power cars on their own. Then came the lithium-ion battery. (See the article by Héctor Abruña, Yasuyuki Kiya, and Jay Henderson, Physics Today, December 2008, page 43.) With higher energy density than older rechargeables—and with the ability to release that energy quickly on demand—the battery is widely viewed as having led a revival of the electric vehicle. Tesla Motors pioneered its use in automobiles with the Roadster, and today most all-electric vehicles have batteries that use some sort of lithium chemistry. Although concerns about safety, cost, and durability linger, few would dispute that the lithium-ion battery has been the chief technological enabler of the renaissance of the all-electric vehicle.
The emergence of the lithium-ion battery did not happen overnight. It was shaped for decades by the influence of materials scientists. It was the product not of a singular eureka moment but of many strands of research tracing back to the rise of the US national security state at the dawn of the Cold War. That’s when John Goodenough, a physicist by training, found himself helping to build a sophisticated air-defense computer for the US military. Although he couldn’t have imagined it at the time, he was about to embark on research that would help found solid-state ionics—the science of inserting and storing ions inside solids without changing their fundamental structures—and contribute to revolutionizing automobile transport.
The many twists and turns that ensued illustrate the unpredictability and contingency of innovation. The story of the long road to lithium-ion power shows how changing social, economic, and environmental conditions after World War II altered the R&D priorities of government and industry. It affords insight into how trends in the energy economy shaped science and engineering over time. And it reveals a hidden history of the shifting fortunes of physics, a discipline that has traditionally relied on state patronage.
Physics Today:
Cold War computers, California supercars, and the pursuit of lithium-ion power
Matthew N. Eisler
Simulation of a laser pulse that's created a plasma aperture in a thin foil. (Courtesy: Bruno Gonzalez-Izquierdo et al./Nature Communications)
Topics: Laser, Optical Physics, Plasma Physics, Research
The quality of laser-accelerated proton beams can be improved by controlling the polarization of the incident laser light, researchers in the UK have discovered. The finding could help physicists to create compact sources of proton beams for use in medicine, lithography or even astrophysics.
Beams of protons and other positive ions have a wide range of applications, including particle physics, materials processing and medicine. Proton-beam therapy, for example, is used to destroy some cancerous tumours with a minimum of collateral damage to surrounding healthy tissue. However, the practical use of proton and ion beams is held back by the need for large and expensive particle accelerators to generate high-quality beams.
One way forward is laser-plasma acceleration, in which a high-power laser pulse is fired into a target, creating a plasma in which the electrons separate from the ions. The resulting charge separation produces huge electric fields that are capable of accelerating protons, ions and electrons to very high energies.
Physics World: Laser polarization boosts quality of proton beams, Tim Wogan
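For a feel of just how "huge," here's a back-of-the-envelope sketch - my own illustrative numbers, not the researchers' - using the cold, nonrelativistic wave-breaking field E = mₑcω_p/e, which sets the scale of the accelerating gradient in a laser-plasma accelerator:

```python
import numpy as np

# Order-of-magnitude sketch with textbook constants; the density below is a
# typical laser-plasma value, not a figure from this particular study.
e = 1.602e-19      # electron charge, C
m_e = 9.109e-31    # electron mass, kg
c = 2.998e8        # speed of light, m/s
eps0 = 8.854e-12   # vacuum permittivity, F/m

n_e = 1e24         # plasma electron density, m^-3 (= 10^18 cm^-3)

omega_p = np.sqrt(n_e * e**2 / (eps0 * m_e))  # plasma frequency, rad/s
E_wb = m_e * c * omega_p / e                  # wave-breaking field, V/m

print(f"E ~ {E_wb / 1e9:.0f} GV/m")  # ~96 GV/m, roughly a thousand times an RF cavity
```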
Calgary has seen quantum communication. (Image: brettA/Getty)
Topics: Computer Science, Entanglement, Quantum Computer, Quantum Teleportation
A little math for perspective: 7 kilometers x 1,000 meters/1 kilometer x 39.37 inches/1 meter x 1 foot/12 inches x 1 mile/5,280 feet = 4.35 miles. Fiber optic cables typically have a limited range, with stations to repeat and boost the signals. This could of course be improved, and become part of an infrastructure buildup that could spur K-12 and postsecondary education to prepare future workers for building a new communications architecture. A lot of automated and outsourced jobs are not coming back, and these new ones likely wouldn't be frustrated by old money, the way the fossil fuel industry frustrates alternative energy.
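The same conversion chain, spelled out in a couple of lines for anyone who wants to check the arithmetic:

```python
km = 7
miles = km * 1000 * 39.37 / 12 / 5280  # meters -> inches -> feet -> miles
print(f"{km} km = {miles:.2f} miles")  # 7 km = 4.35 miles
```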
A new world record for quantum teleportation has been set, bringing quantum communication networks that can stretch between cities a step closer. Two independent teams have transferred quantum information over several kilometres of fibre optic networks.
Being able to establish teleportation over long distances is a crucial step towards exchanging quantum cryptographic keys needed for encoding data sent over the fibres.
Quantum teleportation is a phenomenon in which the quantum states of one particle can be transferred to another, distant particle without anything physical traveling between them. It relies on a property called entanglement, in which measuring the state of one particle immediately affects the state of its entangled partner, regardless of the distance between them.
Conceptually, one way of doing teleportation involves three participants: say, Alice, Bob and Charlie. In order for Alice and Bob to exchange cryptographic keys, they have to first establish the capacity for teleportation, with Charlie’s help.
First Alice sends a particle (A) to Charlie. Bob, meanwhile, creates a pair of entangled particles (B & C), sends B to Charlie and holds on to C. Charlie receives both A and B, and measures the particles in such a way that it’s impossible to tell which particle was sent by Alice and which by Bob. This so-called Bell state measurement results in the quantum state of particle A being transferred to particle C, which is with Bob.
New Scientist:
Quantum teleportation over 7 kilometres of cables smashes record, Anil Ananthaswamy
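For the curious, the Alice/Bob/Charlie protocol described above can be simulated in a few lines of state-vector arithmetic. This is a toy sketch of the idealized qubit protocol - not the photonic fibre experiments themselves - and the Bell outcome shown is the one that requires no correction on Bob's side:

```python
import numpy as np

a, b = 0.6, 0.8                             # Alice's unknown qubit |psi> = a|0> + b|1>
psi_A = np.array([a, b], dtype=complex)

# Bob's entangled pair (B, C) in the Bell state |Phi+> = (|00> + |11>)/sqrt(2)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

state = np.kron(psi_A, bell)                # three qubits, ordered (A, B, C)

# Charlie's Bell-state measurement on A and B: project onto |Phi+>_AB.
# (One of four equally likely outcomes; this one needs no correction.)
proj = np.kron(np.outer(bell, bell.conj()), np.eye(2))
collapsed = proj @ state
collapsed /= np.linalg.norm(collapsed)

# The state now factorizes as |Phi+>_AB (x) |psi>_C: read off Bob's qubit C.
psi_C = collapsed.reshape(2, 2, 2)[0, 0] * np.sqrt(2)
print(np.round(psi_C, 6))                   # Alice's amplitudes (0.6, 0.8), now on C
```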
Hello BSFS,
I would like to announce that Incus Interactive (William Smith, member) has been invited to a special game development event in Seattle, Washington, called Steam Dev Days 2016. This press-free event is going to cover many topics, from computer (Steam) hardware development to virtual reality development and a lot more! My invite is good for up to four more people if you're interested; the link is here:
http://steamcommunity.com/devdays.
William Smith Jr.| Incus Interactive Productions
Technical Artist and Steam Hardware OEM
https://incusinteractiveproductions.wordpress.com/
You can now buy hard copy or Kindle versions of my novel, THE BRITTLE RIDERS. If you are a critic interested in reviewing it, let me know and I'll send you a copy.
Topics: Economy, Education, Climate Change, Global Warming, Politics
"This Is What It Looks Like": that's the post I wrote about the impact of Hurricane Sandy, recalling my experience with Hurricanes Katrina and Rita. Yet Congressman Lamar Smith is not impressed [2]. We're at the point where only the blithely self-ignorant would overlook the signs. How many polar bears on receding ice - how many that turn to cannibalism to momentarily survive - shall we ignore...until we cannot?
The problem is we all think short-term: business quarters and targets; election cycles rather than generations; never posterity. As long as our hedge fund earns this quarter, or we win an election cycle, we're stupidly callous about the unintended consequences our actions have down the road. The only thing of concern in a self-centered culture is the immediacy of NOW. We have the equivalent of Cray supercomputers in our hip pockets, a search engine that's eviscerated the need to memorize, take notes on or recall ANYTHING, and a disdain for expertise in every sphere of endeavor: we have devolved, and it has obviously affected our politics.
A bigoted birther, bloviating conspiracy provocateur and reality-TV star is a major party nominee, and Keeping Up With the Kardashians is a top-rated show. Before the tragic death of Anna Nicole Smith, Bill Maher reflected on his show: "but, what does she do?" We are living in an epoch where fame is more important than skills one studies for, earns and gains experience in. Young people used to dream of being astronauts, doctors or politicians; now they aspire to be reality TV stars. We are at a juncture in history where people are "famous for being famous" and nothing else.
The New York Times: For decades, as the global warming created by human emissions caused land ice to melt and ocean water to expand, scientists warned that the accelerating rise of the sea would eventually imperil the United States’ coastline.
Now, those warnings are no longer theoretical: The inundation of the coast has begun. The sea has crept up to the point that a high tide and a brisk wind are all it takes to send water pouring into streets and homes. [1]
The New Yorker: From climate change and evolution to sex education and vaccination, there has always been tension between scientists and Congress. But Smith, who has been in Congress since 1987 and assumed the chairmanship of the Science Committee in 2013, has escalated that tension into outright war. [Congressman Lamar] Smith has a background in American studies and law, not science. He has, however, received more than six hundred thousand dollars in campaign contributions from the oil-and-gas industry during his time in Congress—more than from any other single industry. With a focus that is unprecedented, he’s now using his position to attack scientists and activists who work on climate change. Under his leadership, the committee has issued more subpoenas than it had during its previous fifty-four-year history. [2]
The Insider, starring Russell Crowe, is a ready analogy: climate change denial runs on the same playbook the tobacco industry wrote. They both tend to use the same lawyers and disinformation tactics.
After seeking the expertise of former "Big Tobacco" executive Jeffrey Wigand (Russell Crowe), seasoned TV producer Lowell Bergman (Al Pacino) suspects a story lies behind Wigand's reluctance to speak. As Bergman persuades Wigand to share his knowledge of industry secrets, the two must contend with the courts and the corporations that stand between them and exposing the truth. All the while, Wigand must struggle to maintain his family life amidst lawsuits and death threats.
I would hate to watch the televised mea culpa of Lamar Smith or other like-minded political figures who have pushed this mythology of climate conspiracy - about how they were wrong - as "Superstorm _____" becomes the norm. I would hate to see the blank faces as naval bases are damaged, as more conflagrations are spurred by weather-generated scarcity; the silence that speaks volumes more than the actual answer they cannot say...
"I don't know."
That will not be existentially "good enough," or expedient.
1. Flooding of Coasts, Caused by Global Warming, Has Already Begun, Justin Gillis
2. The House Science Committee's Anti-Science Rampage, Lawrence M. Krauss
Image Source: Technology Review (see Basic Income below)
Topics: Economy, Existentialism, Philosophy
I have the unpleasant distinction of having been on unemployment, and even self-published a book about it. I don't recall it as halcyon days.
My unemployment compensation amounted to $1,320 per month; the State of Texas initially sent me a live check, then populated a debit card. I was obligated to make a minimum of three job searches per week, keep logs of them, and - if asked, unannounced - produce them.
Let's say I made $200 in one week, as I did in seasonal work for a shipping company lifting boxes and in a sales job with a security company. I had to report that minuscule increase, and then, legally, the state adjusted what I received to $1,120/month. In essence, your ceiling was established. Lying was illegal at the state and federal level, and I did not desire to see life behind bars. You, of course, desired to earn whatever living you had become accustomed to prior to unemployment. It was the financial equivalent of treading water...
I read the print version of Technology Review's business issue because of the cover: titled "Free Money," it showed a cartoon techie next to a sketched Segway handing what looked like a check to a presumably out-of-work caricatured individual. The Silicon Valley idea was to give the unemployed $10,000 per year, to give them space to "invent" or to invest in their own education and retrain for another career.
Caveat 1: Dutchess Community College Tuition and Fees (since this is where I live now)
Tuition for Full-Time Students (over 11 credits):
- New York State Resident: $1,764.00 per semester
- Nonresident: $3,528.00 per semester
- Student Activity Fee: $5.00 per credit hour
- Technology Fee: $13.00 per credit hour
Caveat 2 - Rent: A decent apartment runs about $1,500 - 2,100/month. Add to that FOOD, gas or transportation; clothing and entertainment - retraining workers aren't monks. I'll leave it to you to do the math.
If the unemployment compensation were without conditions, a 12-month compensation would be $15,840.00, though Texas and many other states only pay it (with job-seeking conditions) for six months. A rough tally is sketched below.
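Since I offered to leave the math to you, here it is in Python, using the figures above - low-end rent, resident tuition and 12 credits per semester are my assumptions; pick your own:

```python
basic_income = 10_000              # the Silicon Valley proposal, per year
unemployment = 1_320 * 12          # $1,320/month paid for a full 12 months

tuition = 1_764 * 2                # NY resident, two semesters
fees = (5 + 13) * 12 * 2           # activity + technology fees, 12 credits/semester
rent = 1_500 * 12                  # low end of $1,500 - 2,100/month

print(f"Unconditional unemployment: ${unemployment:,}")       # $15,840
print(f"Tuition + fees:             ${tuition + fees:,}")     # $3,960
print(f"Rent alone:                 ${rent:,}")               # $18,000
shortfall = tuition + fees + rent - basic_income
print(f"Shortfall vs. $10k basic income: ${shortfall:,}")     # $11,960
```

And food, transportation, clothing and entertainment aren't even in there yet.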
So...I'm not a fan of this approach. I agree SOMETHING has to be done, but $10,000 - 15,000 probably doesn't cut it. Something like new entry-level jobs in alternative energy, for example - from jobs that don't require much training after high school up to design engineers and researchers. There should be a program of continuous training and lifelong education, for career advancement and, frankly, for the fact that humans get easily bored. Proverbs about "idle minds" and the work shed of Beelzebub apply.
Image Source: Ibid
However, the current level of income inequality has to be solved (reference the graphs), unless we want something in a modern society - first, second or third world - decidedly undesirable: tribal, horribly stratified, weaponized and...dystopian.
MIT Technology Review:
Basic Income: A Sellout of the American Dream, David H. Freedman
What the Great Economists Would Have Thought of a Universal Basic Income
Letter to the Editor
Topics: Economy, Jobs, Moore's Law, Semiconductor Technology, STEM
I've written about this before, but now - living this pendulum swing and the industry's pending aftermath - I thought I'd give some perspective on what is occurring that most consumers aren't aware of.
Dimensional perspective: the average human hair is 100,000 nanometers in diameter, or 100,000 x 10⁻⁹ meters (0.0001 meters, if you were wondering). Your average smartphone has gates controlling the flow of electrons that are printed at around 20 - 35 nanometers (0.000000020 - 0.000000035 meters), so they're ridiculously small in comparison. Note that shorthand metric notation saves a lot of typing.
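A quick sanity check of those scales - the same numbers as above, written in code:

```python
hair = 100_000e-9   # human hair diameter: 100,000 nm, in meters
gate = 20e-9        # a small smartphone transistor gate: ~20 nm

print(f"hair = {hair:.4f} m")              # 0.0001 m, as stated
print(f"hair/gate = {hair / gate:,.0f}x")  # a hair is ~5,000 times wider
```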
Process engineering your chip involves:
- epitaxial growth, chemical vapor deposition and physical vapor deposition to deposit the film layers needed to pattern;
- extreme ultraviolet photolithography to mask out patterns in the films, which runs into struggles with standing-wave phenomena and quantum effects;
- reactive ion etch to transfer those incredibly small features into layers of built-from-the-ground-up circuitry (some wet chemical etch as well, mostly to clean or clear surfaces of defects, but some etching too);
- ion implantation, which dopes (introduces impurities into) silicon or germanium to make the semiconductor conductive;
- chemical mechanical planarization for topology; and
- rapid thermal anneal processes to activate certain films.
That's done over and over, for more than 100 infinitesimally thin film layers, to put a chip in a plastic DIP (dual in-line package) that ends up in your laptop or hip pocket. That is how we fill the market's demand for faster processors to share the ubiquitous cat, dancing turtle and cute puppy videos.
Even without a background in physics or chemistry, a little simple math can show you why we've hit this wall and where we're likely to go next.
Start with the formula for resistance, R = ρL/A. Resistivity ρ is an intrinsic property of the material that's conducting electrons, so it doesn't change (unless you dope the material to change it). Notice that length L is in the numerator and cross-sectional area A is in the denominator: a longer wire has more resistance to conducting electricity than a shorter one. As these devices undergo shrinks following Moore's Law, length and area both decrease, but the decrease in cross-sectional area specifically increases resistance.
Another formula: P = I²R, the power dissipated by a current I flowing through a resistance R.
It's very simple to see that an increase in resistance - due to the shrinks typified by Moore's Law - will result in an increase in power and thereby heat; recent news reports provide numerous and quite dramatic examples. Heat has to be dissipated with a "heat sink" (thermodynamics), which usually entails a fan to cool your chip - a fan powered by your device's battery. It's not your imagination that your battery life is less after months or years of usage: lithium degrades over time, and using the same battery to power your device and the heat sink that cools it is why. Plus, consumers are both savvier and more easily satisfied: how fast does one need to share a cat video? To coin a phrase, "if it ain't broke, don't fix it!" - or, for that matter, replace it with the exploding variety. We're now at the natural limitations of Moore's Law. The game going forward will likely be memory and battery improvements, which should have transferable benefits in computing, electric vehicles and power storage - newer industries and employment.
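Here's the whole argument in a few lines of Python - a minimal sketch with illustrative copper-interconnect numbers (my assumptions, not any real process node): shrink the linear dimensions by the classic 0.7x node step and watch R = ρL/A, and with it P = I²R at fixed current, climb.

```python
rho = 1.68e-8   # resistivity of copper, ohm*m (intrinsic - doesn't shrink)
I = 1e-3        # assume a fixed 1 mA current

def resistance(length, width, thickness):
    return rho * length / (width * thickness)   # R = rho * L / A

for scale in (1.0, 0.7):            # before and after a 0.7x linear shrink
    L = 100e-6 * scale              # interconnect length: 100 um, shrinks
    w = t = 50e-9 * scale           # cross-section: 50 nm square, area shrinks as scale^2
    R = resistance(L, w, t)
    P = I**2 * R                    # dissipated power -> heat
    print(f"scale {scale}: R = {R:,.0f} ohm, P = {P * 1e6:.0f} uW")

# Length falls by 0.7x but area falls by 0.49x, so R and P both rise ~1.4x.
```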
That being said: This link from Semi Wiki on "Age, Training and Winning in the Silicon Valley Culture" is so apropos, I think it should be shouted from rooftops. If we're going to create a new economy, we need to be willing to do some things we've done traditionally - like education - differently and lifelong.
The Internet of Things (IoT) and what I like to call "smart car tech" (so-called driverless cars, which have an existing analog already: public transportation, if a national infrastructure were implemented) are a reflection of that, as their device gates are typically larger (65 - 130 nm) than what's needed for cute pet videos and Snapchat updates.
I think cell or mobile phones are ubiquitous enough that they're as much a fixture of modern life as the transistor radio and Walkman (G-d, I just dated myself!) used to be.
The industry is going "Back to the Future," so to speak, on what it will manufacture until something beyond silicon and germanium - and a wholly different application - is discovered.
Tomorrow: Free Money
Related links:
http://news.utexas.edu/2015/04/15/hot-chips-managing-moores-law
http://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338
http://www.slate.com/articles/technology/technology/2005/12/the_end_of_moores_law.html
http://www.trustedreviews.com/opinions/what-is-moore-s-law
http://techland.time.com/2012/05/01/the-collapse-of-moores-law-physicist-says-its-already-happening/
https://www.wired.com/insights/2015/01/the-rise-of-diamond-technology/
http://www.allaboutcircuits.com/textbook/direct-current/chpt-2/calculating-electric-power/
http://www.sengpielaudio.com/calculator-ohm.htm
Missile launch officers. (Source: Dept. of Defense)
Topics: Existentialism, Politics
I still have a faded green book, handed out in my ELEMENTARY school classroom, on "how to survive a nuclear attack." The reason Putin causes such unease now is that many in positions of influence are old enough to recall the old "duck-and-cover" days (which did absolutely NOTHING towards survival), and to recall which side of the Cold War the former KGB master spy was on and rooting for.
This article by the Union of Concerned Scientists intrigued me. I hope it does you, and gives you pause as you read this excerpt on your laptop or mobile device. For more details, click on the link provided below.
We've posted previously about the dangers of the US policy of keeping nuclear missiles on hair-trigger alert so that they can be launched quickly in response to warning of attack. There is a surprisingly long list of past incidents in which human and technical errors have led to false warning of attack in both the US and Soviet Union/Russia - increasing the risk of an accidental nuclear war.
The main reason administration officials give for keeping missiles on alert is the “re-alerting race” and crisis instability. The argument is that if the United States takes its missiles off hair-trigger alert and a crisis starts to brew, it would want to put them back on alert so they would not be vulnerable to an attack. And the act of putting them back on alert—“re-alerting”—could exacerbate the crisis and lead Russia to assume the United States was readying to launch an attack. If Russia had de-alerted its missiles, it would then re-alert them, further exacerbating the crisis. Both countries could have an incentive to act quickly, leading to instability.
This argument gets repeated so often that people assume it’s simply true.
However, the fallacy of this argument is that there is no good reason for the US to re-alert its ICBMs in a crisis. They are not needed for deterrence since, as noted above, deterrence is provided by the submarine force. Moreover, historical incidents have shown that having missiles on alert during a crisis increases the risk of a mistaken launch due to false or ambiguous warning. So having ICBMs on alert in a crisis increases the risk without providing a benefit.
Union of Concerned Scientists:
Nuclear Weapons and the Myth of the “Re-Alerting Race”
David Wright, Physicist and Co-Director, Global Security
The LIGO gravitational-wave detector in Livingston, Louisiana. LIGO detectors could soon be using a new and more efficient scheme for squeezing light.
Topics: Black Holes, General Relativity, Gravitational Waves, Quantum Mechanics
The quantum state of light has been squeezed more than ever before by physicists in Germany, who have developed a new low-loss technique. Squeezed light has been used to increase the sensitivity of gravitational wave detectors, and scientists are planning to deploy the new method on the GEO600 and LIGO gravitational wave detectors.
Detecting gravitational waves – the ripples in spacetime caused by energetic events in the Universe – relies on splitting a laser beam using an interferometer and sending the two halves back and forth along two orthogonal arms. When the two halves of the beam recombine, all the light normally comes out of one port of the interferometer. A passing gravitational wave will change the relative lengths of the two arms, creating an interference pattern and directing some of the light out of the "dark" port. However, by the time they reach Earth, gravitational waves from even the most dramatic events have tiny amplitudes, so sensitivity is crucial. The first confirmed discovery of a gravitational wave, announced by LIGO in February, was produced by the collision and merger of two black holes and changed the 4 km arm lengths by barely 10⁻¹⁹ m (see "LIGO detects first ever gravitational waves – from two merging black holes").
At such extreme sensitivity, one of the main noise sources in such detectors is uncorrelated photons emerging from the quantum vacuum as a result of its zero-point energy – the energy that Heisenberg's uncertainty principle dictates can never be removed from a system. But, amazingly, even this source of noise can be minimized. The uncertainty principle puts a lower limit on the product of the variance in the amplitude (or number) of photons and the variance in the phase. Vacuum photons naturally have equal variance in both amplitude and phase. It is, however, possible to create a "squeezed state" of light, in which either one of these quantities is minimized (squeezed) and the other is allowed to increase (antisqueezed).
Physics World:
Squeezed light shatters previous record for manipulating quantum uncertainty
Tim Wogan
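To make the squeezing trade-off concrete, here's a toy calculation - mine, not from the article: for a squeezed vacuum with squeezing parameter r, the two quadrature variances scale as e^(∓2r) while their product stays pinned at the Heisenberg minimum.

```python
import numpy as np

for r in (0.0, 0.5, 1.0, 1.5):
    var_sq = np.exp(-2 * r) / 2        # the "squeezed" quadrature shrinks...
    var_anti = np.exp(2 * r) / 2       # ...while its conjugate grows
    dB = 10 * np.log10(var_sq / 0.5)   # noise relative to the bare vacuum, in dB
    print(f"r = {r}: variance product = {var_sq * var_anti:.2f}, "
          f"squeezing = {dB:+.1f} dB")

# The product is always 0.25: squeezing never beats the uncertainty
# principle, it just moves the noise into the quadrature you don't measure.
```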
Topics: Condensed Matter Physics, Materials Science, Solid State Physics
Devices called ultracapacitors have recently become attractive forms of energy storage: They recharge in seconds, have very long lifespans, work with close to 100 percent efficiency, and are much lighter and less volatile than batteries. But they suffer from low energy-storage capacity and other drawbacks, meaning they mostly serve as backup power sources for things like electric cars, renewable energy technologies, and consumer devices.
But MIT spinout FastCAP Systems is developing ultracapacitors, and ultracapacitor-based systems, that offer greater energy density and other advancements. This technology has opened up new uses for the devices across a wide range of industries, including some that operate in extreme environments.
Based on MIT research, FastCAP's ultracapacitors store up to 10 times the energy and achieve 10 times the power density of commercial counterparts. They're also the only commercial ultracapacitors capable of withstanding temperatures reaching as high as 300 degrees Celsius and as low as minus 110 C, allowing them to endure conditions found in drilling wells and outer space. Most recently, the company developed a AA-battery-sized ultracapacitor with the perks of its bigger models, so clients can put the devices in places where ultracapacitors couldn't fit before.
Phys.org: New applications for ultracapacitors, Rob Matheson
Interstellar Wiki: Cooper in the Tesseract
Topics: Existentialism, Fermi Paradox, Philosophy, Planetary Science, Space Exploration
It's an intriguing hypothesis, albeit a convenient one, since at this juncture it's rather hard to prove experimentally or observationally. John G. Messerly said on the site Institute for Ethics and Emerging Technologies (quoting John Smart):
The transcension hypothesis proposes that a universal process of evolutionary development guides all sufficiently advanced civilizations into what may be called “inner space,” a computationally optimal domain of increasingly dense, productive, miniaturized, and efficient scales of space, time, energy, and matter, and eventually, to a black-hole-like destination.
In essence, they've "left the universe," which as you know, is a pretty big place. There are two important questions this generates at least for me: where and how?
Where: Did the aliens evolve to some hyper-dimensional tesseract, as in the movie Interstellar? I can see now where the idea originated. Again, convenient as hyper dimensions are presently undetectable. Or, as the embed below suggests, instead of "falling in love with Siri" we'll eventually become Siri after The Singularity?
How: Another way for intelligent civilizations to be "gone" is unfortunately...extinction, which can be as we are starting to see, self-induced.
Knowing the answer to either question can be illuminating and species-extending.
Ever since Enrico Fermi questioned back in the 1950s why, if a multitude of civilisations are likely to exist in the Milky Way, no sign of their existence in the form of probes or spacecraft has ever been detected, scientists and critical thinkers have struggled to resolve the problem by supplying a host of inventive arguments with mixed reception.
To date, one of the most common answers to the Great Silence was simply that life is so rare, so widely distributed, and the scale of the universe so immense, that the probability of contact or communication between any two space-faring civilisations is almost non-existent. Needless to say, an outlook which seems like a very lonely, sad and pessimistic state of affairs for intelligent life to find itself in.
BrighterBrains.org:
The Transcension Hypothesis: An Intriguing Answer to the Fermi Paradox?
Owen Nicholas
When “Star Trek” premiered 50 years ago today, its reception was colder than the weather outside the Klingon penal colony on Rura Penthe.
“And away we go to another planet for the sci-fi buffs to lick the plate clean,” Variety‘s Sept. 8, 1966 review of the premiere episode, “The Man Trap,” declared. “But there had better be a hefty cargo of them or the Nielsen samplers may come up short.” Predicting doom, it continued, “The opener won’t open up many new frequencies after this sampler.” So not exactly boffo.
The review was typical of the initial response to Gene Roddenberry’s science fiction drama. After a troubled development that saw the initial pilot scrapped and a new one with mostly new characters — the only holdover being Leonard Nimoy’s Spock — created from scratch, “Star Trek” hung on for a short while, renewed for a second, then a third season before being cancelled.
The run was just long enough to create a library that would catch fire years later in syndication, finding a popularity it never achieved in its first window. A TV show that had at best been a moderate success for NBC would spawn four live-action spinoff series — soon to be five with the addition of CBS All Access’ “Star Trek: Discovery” — 13 movies, one animated series, comic books, postage stamps, documentaries, tell-all books, conventions and untold units of prosthetic ears sold. When Nimoy died last year, the White House issued a lengthy statement from President Obama in which he wrote, “I loved Spock.”
The “Star Trek” universe extends far beyond the 79 episodes that aired on NBC from 1966 to 1969. But that series’ impact is still being felt today. For its 50th anniversary, Variety asked several of the stars, writers and fans of “Star Trek” and its offshoots to name their favorite episodes of the original series.
FULL STORY AT:
http://variety.com/2016/tv/news/star-trek-anniversary-favorite-episodes-50-years-1201853965/
Courtesy: http://blogs.esa.int/rosetta/2014/03/26/introducing-midas-rosettas-micro-imaging-dust-analysis-system/
Topics: Astrophysics, Comets, ESA, NASA, Rosetta, Space Exploration, Women in Science
There is a movie coming out in 2017 called "Hidden Figures," about the African American women who were "computers," as they were all called at the time. Behind the scenes and (purposely) out of the public eye, these scientists were responsible for mankind getting to the moon, despite what your conspiracy provocateur uncle spouts around the dinner table at Thanksgiving.
Dr. Claudia Alexander was a project scientist on the American portion of the international Rosetta mission. She sadly lost her battle with breast cancer last year. I always try to highlight such achievements, since it's obvious in a societal, structural sense that negative stereotypes are often forwarded to maintain an inane "status quo" while their keepers simultaneously complain about "bootstraps." I salute Dr. Alexander, a modern Hidden Figure in science who paved the way for new discoveries by humankind.
Thanks to in situ measurements from MIDAS (the Micro-Imaging Dust Analysis System) on-board the Rosetta spacecraft, researchers have now found out more about the structure of the dust particles on comet 67P/Churyumov-Gerasimenko. The particles are made up of aggregates and cover a range of sizes – from tens of microns to a few hundred nanometres. They also appear to have formed from the hierarchical assembly of smaller constituents and come in a range of shapes, from single grains to larger, porous aggregated particles with some dust grains being elongated. The study could shed more light on the processes that occurred when our Solar System formed nearly five billion years ago.
Planetary systems like our own Solar System started out as dust particles in protoplanetary nebulae – clouds of gas and dust that gave rise to stars and planets. The particles collided and agglomerated to form planetesimals – the building blocks of planets. Comets are leftover planetesimals and are made of ice and dust particles. They range in size from a few hundred metres to tens of kilometres and are mainly found on the outskirts of the Solar System, far from damaging radiation, high temperatures and collisions with other objects.
Nanotech Web:
Rosetta’s MIDAS analyses cometary dust particles, Belle Dumé
Image Source: SETI Institute, OSIRIS-REx
Topics: Asteroids, Astrophysics, NASA, Planetary Science, Space Exploration
MOUNTAIN VIEW – NASA’s OSIRIS-REx spacecraft is slated to launch from Cape Canaveral on Thursday, September 8th. Its mission is to rendezvous with asteroid Bennu in 2018, take a sample from its surface, and return that sample to Earth in 2023.
Why are scientists so interested in this ancient lump of rock? First, Bennu is one of the darkest objects in the Solar System, suggesting it is rich in organic materials that might have seeded Earth with the starting blocks of life. We cannot find these materials today, because the organic compounds that first fell to Earth have long since disappeared – processed and endlessly recycled by geology and biology. On Bennu however, these mysterious compounds have been almost perfectly preserved. Bennu is a veritable museum in space that has been waiting 4.5 billion years to open its doors to Earth’s scientists.
Scientists are also interested in the so-called Yarkovsky effect. This is a process whereby solar radiation gently nudges the asteroid, subtly changing its orbit. By being up-close with Bennu, we can better understand how surface properties affect this process. Combined with understanding material properties, this enables us to better predict if and when this asteroid might impact Earth in the 22nd century. It is a potentially dangerous lump of rock, already on our watch list.
SETI.org: Mission To Examine The Past And Safeguard The Future
A far-future Earth had already been visited by an alien race, called the Sominids, who came here for the express purpose of drinking and having sex with everyone they could. When one of their infamous parties resulted in the moon being cut in half, killing everyone who happened to live there, they quietly left.
Their encounter with the Sominids taught the human race many things, primarily that faster-than-light travel didn't exist. Denied the stars, the human race began to dwindle in number and terminated its space programs.
A thousand years later a guy named Edward Q. Rohta circumvented anti-AI laws, which had been on the books for millennia, by creating organic creatures to provide manual labor. Instead of dying after ten years, as promised in the company brochure, they would develop flu-like symptoms and go into hiding. Eventually, fed up with the mistreatment they suffered at the hands of humans, they rose up and killed every man, woman, and child on the planet.
This is the story of what happens next.
The Brittle Riders: apocalypses are funny that way.
Coming out on Azoth Khem Publishing - 2016
Electrochemical characteristics of Na2Ti3O7 and VOPO4 electrodes in the "half-cell" format vs. Na+/Na. (Courtesy: G Yu)
Topics: Condensed Matter Physics, Nanotechnology, Semiconductor Technology, Solid State Physics
Researchers at the University of Texas at Austin in the US and Nanjing University of Aeronautics and Astronautics in China have developed a high-energy sodium-ion battery based on sodium titanate nanotubes and vanadyl phosphate layered nanosheet materials. The new device, which works over a wide temperature range of –20 to +55 °C, has a high operating voltage of close to 2.9 V and delivers a large reversible capacity of 114 mA h/g. It also boasts a high energy density of 220 Wh/kg, which makes it competitive with state-of-the-art lithium-ion batteries.
Sodium-ion batteries are similar to their lithium-ion cousins, since they store energy in the same way. They consist of two electrodes – anode and cathode – separated by an electrolyte. When the battery is being charged with electrical energy, metal ions move from the cathode through the electrolyte to the anode, where they are absorbed into the bulk of the anode material. Sodium-based devices are in principle more attractive, though, since sodium is highly abundant on Earth (its Clarke number is 2.64) and is therefore much cheaper than lithium. Sodium is also more environmentally friendly than lithium.
However, the radius of the sodium ion is significantly larger than that of the lithium ion. This makes it difficult to find a host electrode material that allows ions to be rapidly absorbed and removed. What is more, sodium-ion batteries made thus far suffer from a relatively low working potential, large capacity decay during cycling (which leads to a limited battery life) and poor safety.
Nanotechweb: Sodium-ion device could compete with lithium-ion batteries, Belle Dumé
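A rough sanity check - my arithmetic, not the article's - shows how the quoted voltage and capacity relate to the quoted energy density:

```python
voltage = 2.9     # V, operating voltage
capacity = 114    # mA*h per gram of active material

# Energy per mass = V x Q, and 1 mWh/g is exactly 1 Wh/kg
energy_active = voltage * capacity
print(f"{energy_active:.0f} Wh/kg of active material")  # ~331 Wh/kg

# The quoted 220 Wh/kg cell-level figure is lower because the electrolyte,
# separator and packaging add mass without storing any energy.
```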
Topics: Climate Change, Environment, Global Warming
The problem humans have is no appreciation for the vastness of the passage of time at large scale. In other words, we don't believe what we haven't seen physically - climate change, evolution and a ~13.8-billion-year-old universe being noted science examples. It's exacerbated by the Internet and our current notion that information - and thus problems and their resolutions - can be concluded quickly. It is sobering that this epoch is now declared to have begun twelve years before I appeared on the planet.
Humanity’s impact on the Earth is now so profound that a new geological epoch – the Anthropocene – needs to be declared, according to an official expert group who presented the recommendation to the International Geological Congress in Cape Town on Monday.
The new epoch should begin about 1950, the experts said, and was likely to be defined by the radioactive elements dispersed across the planet by nuclear bomb tests, although an array of other signals, including plastic pollution, soot from power stations, concrete, and even the bones left by the global proliferation of the domestic chicken were now under consideration.
The current epoch, the Holocene, is the 12,000 years of stable climate since the last ice age during which all human civilisation developed. But the striking acceleration since the mid-20th century of carbon dioxide emissions and sea level rise, the global mass extinction of species, and the transformation of land by deforestation and development mark the end of that slice of geological time, the experts argue. The Earth is so profoundly changed that the Holocene must give way to the Anthropocene.
The Guardian:
The Anthropocene epoch: scientists declare dawn of human-influenced age
Damian Carrington