April 8th (TODAY) is the date of opposition, when Mars, Earth, and the sun are arranged in a nearly straight line.
If the orbits of Mars and Earth were perfectly circular, April 8th would also be the date of closest approach. However, planetary orbits are elliptical, that is, slightly oval, so the actual date of closest approach doesn't come until almost a week later.
On April 14th, Earth and Mars are at their minimum distance: 92 million km, a 6+ month flight for NASA's speediest rockets. You won't have any trouble finding Mars on this night. The full Moon will be gliding by the Red Planet in the constellation Virgo, providing a can't-miss "landmark" in the midnight sky.
Remarkably, on the same night that Mars is closest to Earth, there will be a total lunar eclipse. The full Moon of April 14-15 will turn as red as the Red Planet itself.
Try to get your taxes done so you can enjoy the show. I'm working on my write-offs so I can join you...
ROBOTS came into the world as a literary device whereby the writers and film-makers of the early 20th century could explore their hopes and fears about technology, as the era of the automobile, telephone and aeroplane picked up its reckless jazz-age speed. From Fritz Lang’s “Metropolis” and Isaac Asimov’s “I, Robot” to “WALL-E” and the “Terminator” films, and in countless iterations in between, they have succeeded admirably in their task.
Since moving from the page and screen to real life, robots have been a mild disappointment. They do some things that humans cannot do themselves, like exploring Mars, and a host of things people do not much want to do, like dealing with unexploded bombs or vacuuming floors (there are around 10m robot vacuum cleaners wandering the carpets of the world). And they are very useful in bits of manufacturing. But reliable robots—especially ones required to work beyond the safety cages of a factory floor—have proved hard to make, and robots are still pretty stupid. So although they fascinate people, they have not yet made much of a mark on the world.
That seems about to change. The exponential growth in the power of silicon chips, digital sensors and high-bandwidth communications improves robots just as it improves all sorts of other products. And, as our special report this week explains, three other factors are at play.
One is that robotics R&D is getting easier. New shared standards make good ideas easily portable from one robot platform to another. And accumulated know-how means that building such platforms is getting a lot cheaper. A robot like Rethink Robotics’s Baxter, with two arms and a remarkably easy, intuitive programming interface, would have been barely conceivable ten years ago. Now you can buy one for $25,000.
The difficulty of predicting local effects of climate change makes a compelling case for preventing it.
This week the Intergovernmental Panel on Climate Change (IPCC) released a major report focused on what actions might be taken to adapt to climate change. It attempts to describe who and what are especially vulnerable to climate change, and gives an overview of how some are already adapting.
The report makes clear that specific estimates of how climate change will affect places, people, and things are very uncertain. Brought down to a local level, climate change could go in either direction—there are risks that a given area could get drier or wetter, or suffer floods or droughts, or both. This uncertainty makes efforts to prevent climate change even more important.
Specific risks to natural systems are well documented by the report. It finds, for example, that the greatest risks are to ecosystems, people, and property in low-lying coastal areas, because expected sea-level changes are in only one direction: up. The same is true in the Arctic, where the temperature rise is expected to be much greater than the global average. There is good science and unanimous agreement among climate models behind these assertions.
But a frustrating aspect of the report (and a reflection of the difficulty of working in this line of research) is that very few specific risks to humans are quantified in a meaningful way. For example, one might ask: has my risk of death increased because of more hot days? The report says, "Local changes in temperature and rainfall have altered the distribution of some water-borne illnesses and disease vectors (medium confidence)." This seems to state the obvious while giving no indication of whether the alterations have increased or decreased risk, or how large they might be. Given how little the statement actually says, it is hard to see why the confidence is only "medium."
For nearly 50 years, Star Trek has captivated audiences as the crew of the U.S.S. Enterprise explored the galaxy using technological advances such as warp drive, wormholes, beaming technology, and holodecks. Dirk K. Morr, a professor at the University of Illinois at Chicago, joins us to discuss the scientific ideas behind Star Trek technologies. Morr will present his findings at 6:00 pm on Wednesday in the university's Behavioral Sciences Building.
The security of a data connection protected using a flawed U.S. encryption standard promoted by the National Security Agency could be broken in under 16 seconds using a single computer processor. That’s according to the first in-depth study of how easily encryption systems that use the now deprecated Dual_EC random number generator could be defeated by an attacker that had “backdoored” the standard.
The flawed standard has never been widely used to protect Internet communications, even though the security company RSA got $10 million from the NSA to make it the default random number generator in one of its software packages. It is not known whether the NSA or anyone else knows the crucial mathematical relationship needed to exploit the flaw and undo encryption based on Dual_EC.
However, the study conclusively shows that an attacker that did know the key to the Dual_EC backdoor could put it to practical use. Not all of the six different encryption software packages tested could be defeated in seconds: half took a 16-processor cluster between 60 and 80 minutes of work to break. But a national intelligence agency could significantly improve on those times by devoting more computing power to the problem.
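To see why knowing that "crucial mathematical relationship" is so powerful, here is a minimal sketch of a Dual_EC-style state-recovery attack. It uses modular exponentiation in a prime field in place of elliptic-curve point multiplication to keep the code short, and every parameter below is invented for illustration; the real Dual_EC_DRBG works over the NIST P-256 curve and truncates bits from each output, which forces the attacker into a small brute-force search on top of this basic structure.

```python
# Sketch of a Dual_EC-style backdoor, with modular exponentiation standing in
# for elliptic-curve point multiplication. All parameters are invented for
# illustration only; this is not the real standard.
p = 2**61 - 1          # a prime modulus (toy choice)
h = 3                  # first public constant (a base in the group)
d = 123456789          # the secret backdoor value, known only to the attacker
g = pow(h, d, p)       # second public constant: g = h^d, analogous to P = d*Q

def dual_ec_round(state):
    """One simplified generator round: update the state, emit one output."""
    new_state = pow(g, state, p)      # state update uses the first constant
    output = pow(h, new_state, p)     # output uses the second constant
    return new_state, output

# The legitimate generator starts from a secret seed and emits two outputs.
seed = 987654321
s1, out1 = dual_ec_round(seed)
s2, out2 = dual_ec_round(s1)

# The attacker sees only out1 = h^s1 but knows d. Raising it to d gives
# (h^s1)^d = (h^d)^s1 = g^s1, which is exactly the generator's next state.
recovered_state = pow(out1, d, p)
assert recovered_state == s2

# From the recovered state, every future output can be predicted.
predicted_out2 = pow(h, recovered_state, p)
assert predicted_out2 == out2
print("next output predicted from a single observed block")
```

The asymmetry is the whole point: anyone can run the generator, but only someone holding the secret value can walk backwards from its public output to its internal state and predict everything that follows.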
Maps of gamma rays from the center of the Milky Way galaxy, before (left) and after signals from known sources were removed, reveal an excess that is consistent with the distribution of dark matter.
Not long after the Fermi Gamma-ray Space Telescope took to the sky in 2008, astrophysicists noticed that it was picking up a steady rain of gamma rays pouring outward from the center of the Milky Way galaxy. This high-energy radiation was consistent with the detritus of annihilating dark matter, the unidentified particles that constitute 84 percent of the matter in the universe and that fizzle upon contact with each other, spewing other particles as they go. If the gamma rays did in fact come from dark matter, they would reveal its identity, resolving one of the biggest mysteries in physics. But some argued that the gamma rays could have originated from another source.
Now a new analysis of the signal claims to rule out all other plausible explanations and makes the case that the gamma rays trace back to a type of particle that has long been considered the leading dark matter candidate — a weakly interacting massive particle, or WIMP. Meanwhile, a more tentative X-ray signal reported in two other new studies suggests the existence of yet another kind of dark matter particle called a sterile neutrino.
In the new gamma-ray analysis, which appeared Feb. 27 on the scientific preprint site arXiv.org, Dan Hooper and his collaborators used more than five years’ worth of the cleanest Fermi data to generate a high-resolution map of the gamma-ray excess extending from the center of the galaxy outward at least 10 angular degrees, or 5,000 light-years, in all directions.
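As a quick sanity check on that conversion, using roughly 26,000 light-years for the distance to the Galactic Center:

\[
10^\circ \times \frac{\pi}{180^\circ} \times 26{,}000\ \text{ly} \;\approx\; 4{,}500\ \text{ly},
\]

consistent with the quoted figure of roughly 5,000 light-years.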
“The results are extremely interesting,” said Kevork Abazajian, an associate professor of physics and astronomy at the University of California, Irvine. “The most remarkable part of the analysis is that the signal follows the shape of the dark matter profile out to 10 degrees,” he said, explaining that it would be “very difficult to impossible” for other sources to mimic this predicted dark matter distribution over such a broad range.
The findings do not constitute a discovery of dark matter, the scientists said, but they prepare the way for an upcoming test described by many researchers as a “smoking gun”: If the gamma-ray excess comes from annihilating WIMPs, and not conventional astrophysical objects, then the signal will also be seen emanating from dwarf galaxies that orbit the Milky Way — diffuse objects that are rich in dark matter but not in other high-energy photon sources such as pulsars, rotating neutron stars that have been floated as alternative explanations for the excess.
Ozgenur “Ozge” Kahvecioglu Feridun first came to Argonne in 2010, when she was a Visiting Scientist working on a scale-up project, the Ultrafast and Large Scale Boriding project.
A visiting scientist at Argonne in 2010 and a postdoctoral research fellow since 2012, Ozgenur “Ozge” Kahvecioglu Feridun is a metallurgical and materials engineer with the Process Technology Research group in the Energy Systems division.
What do you do at Argonne?
I work on process development and scale-up of advanced cathode materials. We scale processes from bench to pilot scale, identifying and resolving process challenges when producing materials. This reduces the risks associated with the commercialization of new materials.
What made you choose Argonne as the place to continue your postdoc work?
Actually, this is my second time working at Argonne. In 2010, I was here as a Visiting Scientist working on another scale-up project, the Ultrafast and Large Scale Boriding project, under Ali Erdemir. During this project, I learned firsthand how to apply my skills to solve process scale-up problems and how working on a diverse team contributed to the overall success of the project. Everyone brought different expertise to the table, which helped us solve many difficult issues.
On that project, we scaled an advanced heat-treating process from bench to industrial scale. It was subsequently licensed to an industrial partner and won an R&D 100 award in 2012.
After following the BICEP2 announcement via Twitter, I had to board a transcontinental flight, so I had 5 uninterrupted hours to think about what it all meant. Without Internet access or references, and having not thought seriously about inflation for decades, I wanted to reconstruct a few scraps of knowledge needed to interpret the implications of r ~ 0.2.
I did what any physicist would have done … I derived the basic equations without worrying about niceties such as factors of 3 or 2π. None of what I derived was at all original — the theory has been known for 30 years — but I’ve decided to turn my in-flight notes into a blog post. Experts may cringe at the crude approximations and overlooked conceptual nuances, not to mention the missing references. But some mathematically literate readers who are curious about the implications of the BICEP2 findings may find these notes helpful. I should emphasize that I am not an expert on this stuff (anymore), and if there are serious errors I hope better informed readers will point them out.
By tradition, careless estimates like these are called “back-of-the-envelope” calculations. There have been times when I have made notes on the back of an envelope, or a napkin or place mat. But in this case I had the presence of mind to bring a notepad with me.
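For readers who want one concrete number of the kind such an estimate produces, the standard slow-roll relation between the tensor-to-scalar ratio and the energy scale of inflation (quoted here from the literature, not reconstructed from the author's notes) gives:

\[
V^{1/4} \;\simeq\; 1.06\times10^{16}\,\mathrm{GeV}\left(\frac{r}{0.01}\right)^{1/4}
\;\approx\; 2\times10^{16}\,\mathrm{GeV} \quad\text{for } r \sim 0.2,
\]

an inflation energy scale tantalizingly close to the conventional grand-unification scale.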
...and, it's Friday! One more week to geek out on "Winter Soldier." I'd buy tickets on Fandango: after the east coast's nuclear winter, people will be stir-crazy from telecommuting and will need to get out...and see real people again. The analysis below is the reason you shouldn't take a physics person to the movies with you...just kidding, and my wife sadly has no choice in the matter. We'll behave, promise. Just don't ask questions like these.
At the end, Cap throws his shield at the Winter Soldier – because that’s what Captain America does. But wait! The Winter Soldier just catches the shield and throws it right back at Captain America. The really cool part is what happens when Cap catches the shield. The impact is strong enough to push him back a little bit. Is this enough to get an estimate for the mass of the shield? I think so.
Sliding Back
This is really a multi-part problem. First, the shield is thrown by the Winter Soldier. I don’t really care about the throwing motion. Next, the shield moves through the air to Captain America and collides with him. This gives him some recoil velocity. However, Cap is standing on the ground such that his recoiling body is slowed down to a stop by friction.
It might not seem to be the best place to start, but I am going to start backwards. Let’s look at Captain America sliding after the impact with the shield. By estimating the frictional force and the sliding distance, I can get a value for the recoil speed after the impact.
In this first problem, I can just consider Captain America as a block with some initial speed moving across the ground. Here is a force diagram while he is slowing down (after the impact).
The forces in the vertical direction must add up to zero since Cap doesn’t accelerate up or down. This means that I can find the force the ground pushes up on him:

N - mg = 0, which gives N = mg,

where m is Cap's mass and g is the gravitational field strength.
Why do I need this force pushing up (usually called the normal force)? If I use the typical model for sliding friction, the magnitude of the frictional force can be determined by:

F_friction = μ_k N = μ_k mg,

where μ_k is the coefficient of kinetic friction between Cap's boots and the ground.
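To make the chain of reasoning concrete, here is a rough numerical sketch of the full estimate: friction plus sliding distance give the recoil speed, and momentum conservation for the catch then gives the shield's mass. The coefficient of friction, sliding distance, shield speed, and Cap's mass below are my own guesses, not numbers taken from the film or from the original analysis.

```python
from math import sqrt

# Rough numbers only -- every value below is an assumption for illustration.
mu = 0.5          # kinetic friction between boots and pavement (assumed)
g = 9.8           # gravitational field strength, N/kg
d = 0.5           # how far Cap slides back after the catch, m (assumed)
v_shield = 20.0   # speed of the thrown shield, m/s (assumed)
m_cap = 100.0     # Captain America's mass, kg (assumed)

# Work-energy: friction removes the recoil kinetic energy over distance d,
# so the recoil speed just after the catch is v = sqrt(2*mu*g*d).
v_recoil = sqrt(2 * mu * g * d)

# Momentum conservation for the (perfectly inelastic) catch:
#   m_shield * v_shield = (m_cap + m_shield) * v_recoil
m_shield = m_cap * v_recoil / (v_shield - v_recoil)
print(f"recoil speed ~ {v_recoil:.1f} m/s, shield mass ~ {m_shield:.1f} kg")
```

With these guesses the recoil speed comes out around 2 m/s and the shield mass around 12 kg; plug in your own values from the scene and the estimate shifts accordingly.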
The process of producing anti-counterfeit nano-fingerprints based on randomly distributed silver nanowires. [1] Silver nanowires (AgNWs) are prepared by the self-seeding method and an amorphous silica shell is coated on the surface using tetraethyl orthosilicate (TEOS). [2] Fluorescein isothiocyanate (FITC) and rhodamine B-isothiocyanate (RITC) are attached covalently to the surface of the pre-formed silica shell by allowing the formation of covalent bonds between the silica surface and 3-aminopropyltrimethoxysilane (APTMS). [3] A photolithographic process is used to inscribe the direction and target markers on the surface of the PET film. The orientation marker (“KAIST”) can be used to determine the correct direction of the PET film. The target marker (“X”) has an empty space in its central region where AgNWs are loaded to generate fingerprints.
Counterfeiting is a persistent and growing problem that affects nearly every trade and industry. Recognizing the difficulty of distinguishing counterfeit goods from genuine products, researchers are developing new nanoscale technologies to prevent and identify this illegal practice. Using dye-coated one-dimensional (1D) nanowires, researchers at the Korea Advanced Institute of Science and Technology (KAIST) in South Korea have demonstrated that randomly distributed nanowires can generate unique, barcode-like patterns readily applicable to anti-counterfeiting.
Reporting in Nanotechnology, the team generates nanoscale fingerprint patterns by simply casting fluorescent dye-coated silver nanowires onto a transferable, flexible polyethylene terephthalate (PET) film. The direction and target markers ("KAIST" and "X") are patterned by a photolithographic technique to provide positional information for identification, and the nanowires are cast onto the film. The resulting unique fingerprint patterns can then be authenticated with an optical microscope in a simple and straightforward manner, as shown in the figure above.
Photon-shaping technique could lead to "nuclear" quantum computers.
A way of modulating the waveforms of individual, coherent high-energy photons at room temperature has been demonstrated by researchers in the US and Russia. The advance opens the way for new quantum-optics technologies capable of extremely high-precision measurements, as well as the possibility of quantum-information systems based on nuclear processes. The new approach could also be useful for those doing fundamental research in a variety of areas, ranging from the role of quantum phenomena in biological processes to fundamental questions in quantum optics itself.
The technique was developed by Olga Kocharovskaya, Farit Vagizov and colleagues at Texas A&M University and the Kazan Federal University. Their set-up bears some similarity to a Mössbauer spectroscopy experiment. A sample of radioactive cobalt-57 decays to an excited state of iron-57, which then decays by emitting a 14.4 keV "soft" gamma-ray photon. This photon can then be absorbed and re-emitted by a nearby stainless-steel foil containing iron-57. Because of the Mössbauer effect, no energy is lost in the recoil of the stainless-steel lattice and the photon is emitted at 14.4 keV with very little spectral blurring.
As the foil absorbs and re-emits the photons, it is vibrated at megahertz frequencies. By making clever use of the Doppler effect, the team is able to shape a single photon into a double pulse and even a train of ultrashort pulses. This makes it possible to use the gamma-ray photons to encode quantum information in a "time-bin qubit" – quantum bits in which information is encoded in terms of the relative arrival time of pulses.
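The underlying trick can be summarized in one line. If the foil vibrates sinusoidally with amplitude a and frequency Ω, the re-emitted field picks up a periodic Doppler phase, and the Jacobi-Anger expansion turns that phase modulation into a comb of sidebands. This is the generic physics of frequency modulation, sketched here without the experiment's specific parameters:

\[
e^{i\left[\omega t + k a \sin(\Omega t)\right]} \;=\; e^{i\omega t}\sum_{n=-\infty}^{\infty} J_n(k a)\, e^{i n \Omega t},
\]

where k = ω/c and the Bessel-function weights J_n(ka) set how much of the photon ends up in each sideband; tuning the vibration amplitude and phase is what shapes the photon into double pulses or pulse trains.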
A little something to do while vegging on the couch (nothing on the "boob tube" except COSMOS, really).
Controlling a lab from home
The Remote Glow Discharge Experiment (RGDX) is a plasma device that you can control from the comfort of your browser. YOU have control of the entire experiment, including the gas pressure inside the tube, the voltage produced by the power supply that makes the plasma, and the strength of an electromagnet surrounding the plasma. You can perform experiments from any computer anywhere in the world!
In 2002, we began developing plasma sources for educational purposes, and one of our devices won 2nd place in the National Apparatus Competition sponsored by the American Association of Physics Teachers. In 2003, we began controlling our plasma sources by computer for a plasma exhibit in a science museum. That work has since progressed to remote control of a plasma from any location by anyone with an Internet connection. This type of control could serve as an experimental component of an online physics class, or for a school that does not typically have plasma physics equipment.
As with all other Science Education Department labs, the RGDX has been developed in large part by high school and undergraduate interns.
The Remote Glow Discharge Experiment was officially released to the public on March 12, 2014.
Established in 1847, the Children's Home provides a range of services and programs giving hope and healing to abused and neglected children in the Hudson River Region. The Children's Home served 397 children and their family members last year. The Home provides a full range of residential services including campus-based care, community-based group homes and boarding homes, and independent living apartments. It also provides regular foster care and intensive therapeutic foster homes.
Throughout our 167-year history, the underlying mission of the Children’s Home of Poughkeepsie has remained the same: The Home is dedicated to providing a safe and nurturing environment that improves lives and empowers at-risk children and families in the Hudson Valley and surrounding communities.
I did this presentation on Saturday, 22 March at the behest of the local Alpha Kappa Alpha alumni chapter. I have done such presentations before. I was more than happy to do it.
It was a focused audience of five young men and three young women. They participated well, and at least a few of them said they were going to purchase electronics snap kits from "The Shack." I came away encouraged and inspired by the curiosity of these young people in spite of their challenging circumstances. Especially in an era of error and pseudoscience propagated as alternate "truth," more outreach like this is needed: these are the "meek who will inherit the earth," and they will need the tools to manage it.
Physics teachers: all the links in the embed are active, including those in the pictures of slides 8 and 12. If you want the PowerPoint version with all the "bells and whistles" of this embed, email: physics4thecool@gmail.com. Please attribute the source. My reference to Korea on slide 4 was a quote from "The Smartest Kids in the World and How They Got That Way" by Amanda Ripley and not meant to be derisive: the Korean children spend 16 hours a day in school M-F and 8 hours on Saturday. They are brilliant via focus and immersion. I used parts from Electronics 101 and Electronics 303 kits purchased from Radio Shack. Apparently, Radio Shack only carries the Electronics 101 snap kit; I'd try their service number for the more advanced Electronics 303. However, Amazon carries comparable manipulatives: SC-100, SC-300 and SC-750. Good luck.
On 18 June 2009, NASA launched the Lunar Reconnaissance Orbiter (LRO) to map the surface of the Moon and collect measurements of potential future landing sites as well as key science targets. After two and a half years in a near-circular polar orbit, LRO entered an elliptical polar orbit on 11 December 2011 with a periapsis (point where the LRO is closest to the surface) near the south pole, and the apoapsis (point where LRO is furthest from the surface) near the north pole. The increased altitude over the northern hemisphere enables the two Narrow Angle Cameras (NACs) and Wide Angle Camera (WAC) to capture more terrain in each image acquired in the northern hemisphere. As a result, the Lunar Reconnaissance Orbiter Camera (LROC) archive now contains complete coverage from 60°N to the north pole (except of course for areas of permanent shadow) with a pixel scale of 2 meters.
The LROC team assembled 10,581 NAC images, collected over 4 years, into a spectacular northern polar mosaic. The LROC Northern Polar Mosaic (LNPM) is likely one of the largest image mosaics in existence, or at least one of the largest publicly available on the web, with over 680 gigapixels of valid image data covering a region of the Moon (2.54 million km², 0.98 million miles²) slightly larger than the combined area of Alaska (1.72 million km²) and Texas (0.70 million km²) -- at a resolution of 2 meters per pixel! To create the mosaic, each LROC NAC image was map projected onto a 30 m/pixel Lunar Orbiter Laser Altimeter (LOLA) derived Digital Terrain Model (DTM) using a software package called Integrated Software for Imagers and Spectrometers (ISIS). A polar stereographic projection was used in order to limit mapping distortions when creating the 2-D map. In addition, the LROC team used improved ephemerides provided by the LOLA and GRAIL teams and an improved camera pointing model to enable accurate projection of each image in the mosaic to within 20 meters.
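For readers curious what "polar stereographic" means in practice, here is a minimal sketch of the projection for a spherical Moon. The function name, radius value, and sample point are illustrative only; the actual LROC products are generated through ISIS with the team's own map parameters and the LOLA-derived terrain model.

```python
from math import radians, tan, sin, cos, pi

# Minimal sketch of a north-polar stereographic projection for a spherical
# Moon. Illustrative only; not the LROC/ISIS pipeline or its parameters.
MOON_RADIUS_M = 1737400.0          # mean lunar radius in meters

def polar_stereographic(lat_deg, lon_deg, lon0_deg=0.0):
    """Project (lat, lon) in degrees to map x, y in meters (north polar aspect)."""
    lat, dlon = radians(lat_deg), radians(lon_deg - lon0_deg)
    rho = 2.0 * MOON_RADIUS_M * tan(pi / 4.0 - lat / 2.0)   # distance from pole
    return rho * sin(dlon), -rho * cos(dlon)

# Example: a point at 60 deg N lands roughly 931 km from the pole on the map.
x, y = polar_stereographic(60.0, 45.0)
print(f"x = {x/1000:.1f} km, y = {y/1000:.1f} km")
```

The stereographic projection is conformal, so the shapes of small features are preserved near the pole, which is why it is a standard choice for polar mosaics.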