

Cyber threat analysis requires high-speed supercomputers, such as Theta at Argonne’s Leadership Computing Facility, a DOE Office of Science User Facility. (Image by Argonne National Laboratory.)


Topics: Artificial Intelligence, Computer Science, Internet, Mathematical Models, Quantum Computing

"Locks are made for honest people."

Robert H. Goodwin, June 19, 1925 - August 26, 1999 ("Pop")

It is indisputable that technology is now a fundamental and inextricable part of our everyday existence—for most people, our employment, transportation, healthcare, education, and other quality of life measures are fully reliant on technology. Our dependence has created an urgent need for dynamic cybersecurity that protects U.S. government, research and industry assets in the face of technology advances and ever more sophisticated adversaries.

The U.S. Department of Energy’s (DOE) Argonne National Laboratory is helping lead the way in researching and developing proactive cybersecurity, including measures that leverage machine learning, to help protect data and critical infrastructure from cyberattacks.

Machine learning is a category of artificial intelligence that involves training machines to continually learn from and identify patterns in data sets.

“Applying machine learning approaches to cybersecurity efforts makes sense due to the large amount of data involved,” said Nate Evans, program lead for cybersecurity research in the Strategic Security Sciences (SSS) Division. “It is not efficient for humans to mine data for these patterns using traditional algorithms.”

Argonne computer scientists develop machine learning algorithms using large data sets—comprising log data from different devices, network traffic information, and instances of malicious behavior—that enable the algorithms to recognize specific patterns of events that lead to attacks. When such patterns are identified, a response team investigates instances matching those patterns.
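The idea of matching event sequences against previously observed attack patterns can be sketched in a few lines. This is a toy illustration only, with made-up event names; the article does not describe Argonne's actual algorithms or data.

```python
# Toy sketch: "patterns of events that lead to attacks" modeled as
# short event sequences (n-grams) learned from past incidents.
# Event names and patterns here are hypothetical.

def ngrams(events, n=3):
    """Sliding windows of n consecutive log events."""
    return [tuple(events[i:i + n]) for i in range(len(events) - n + 1)]

# Hypothetical event triples observed before past attacks
malicious_patterns = {
    ("port_scan", "auth_fail", "auth_fail"),
    ("auth_fail", "auth_success", "priv_escalation"),
}

def suspicious_windows(log_events, n=3):
    """Return event windows matching a known attack pattern."""
    return [g for g in ngrams(log_events, n) if g in malicious_patterns]

log = ["dns_query", "port_scan", "auth_fail", "auth_fail", "file_read"]
print(suspicious_windows(log))  # -> [('port_scan', 'auth_fail', 'auth_fail')]
```

A real system would learn these patterns statistically from the log data rather than hard-coding them, but the matching step it feeds into looks much like this.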

Following an attack, the response team patches the vulnerability in the laboratory’s intrusion protection systems. Forensic analysis can then lead to changes that prevent similar future attacks.

“We are looking for ways to stop attacks before they happen,” said Evans. “We’re not only concerned with protecting our own lab, we’re also developing methods to protect other national labs, and the country as a whole, from potential cyberattacks.”


Argonne applies machine learning to cybersecurity threats
Savannah Mitchem, Argonne National Laboratory

Read more…

Structured Light...

This image shows the creation of hybrid entangled photons by combining polarization with a "twisted" pattern that carries orbital angular momentum. Credit: Forbes and Nape


Topics: Electrical Engineering, Electromagnetic Radiation, Quantum Computing, Quantum Electrodynamics, Quantum Mechanics

Structured light is a fancy way of describing patterns or pictures of light, and deservedly so: it promises future communications that will be both faster and more secure.

Quantum mechanics has come a long way during the past 100 years, but it still has a long way to go. In AVS Quantum Science, researchers from the University of the Witwatersrand in South Africa review the progress being made in using structured light in quantum protocols to create a larger encoding alphabet, stronger security, and better resistance to noise.

"What we really want is to do quantum mechanics with patterns of light," said author Andrew Forbes. "By this, we mean that light comes in a variety of patterns that can be made unique—like our faces."

Since patterns of light can be distinguished from each other, they can be used as a form of alphabet. "The cool thing is that there are, in principle at least, an infinite set of patterns, so an infinite alphabet is available," he said.

Traditionally, quantum protocols have been implemented with the polarization of light, which has only two values—a two-level system with a maximum information capacity per photon of just 1 bit. But by using patterns of light as the alphabet, the information capacity is much higher. Also, its security is stronger, and the robustness to noise (such as background light fluctuations) is improved.
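The capacity gain from a larger alphabet follows from a standard information-theoretic fact: a d-symbol alphabet carries log2(d) bits per photon. The pattern counts below are illustrative, not from the paper.

```python
import math

# A photon encoded in one of d distinguishable patterns carries
# log2(d) bits, so going beyond two polarization states raises
# the capacity of every photon sent. Numbers below are illustrative.

def bits_per_photon(d):
    """Information capacity of one photon with d distinguishable patterns."""
    return math.log2(d)

print(bits_per_photon(2))     # polarization only: 1.0 bit
print(bits_per_photon(8))     # 8 spatial patterns: 3.0 bits
print(bits_per_photon(1024))  # 1024 patterns: 10.0 bits
```

Since the set of light patterns is in principle infinite, the capacity per photon has no hard ceiling; the practical limit is how many patterns can be generated and distinguished reliably.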

"Patterns of light are a route to what we term high-dimensional states," Forbes said. "They're high dimensional, because many patterns are involved in the quantum process. Unfortunately, the toolkit to manage these patterns is still underdeveloped and requires a lot of work."

Structured light promises path to faster, more secure communications
American Institute of Physics, Phys.org

Read more…

Quantum Google...

Linear computation: montage of a photo of the chip containing the trapped ions and an image of the ions in a 1D array (Courtesy: Christopher Monroe) Physicsworld.com


Topics: Internet, Quantum Computer, Quantum Computing, Quantum Mechanics

Google says it has achieved a breakthrough in quantum computing research: an experimental quantum processor completed, in just a few minutes, a calculation that would take a traditional supercomputer thousands of years.

The findings, published Wednesday in the scientific journal Nature, show that "quantum speedup is achievable in a real-world system and is not precluded by any hidden physical laws," the researchers wrote.

Quantum computing is a nascent and somewhat bewildering technology for vastly sped-up information processing. Quantum computers are still a long way from having a practical application but might one day revolutionize tasks that would take existing computers years, including the hunt for new drugs and optimizing city and transportation planning.

The technique relies on quantum bits, or qubits, which can register data values of zero and one—the language of modern computing—simultaneously. Big tech companies including Google, Microsoft, IBM and Intel are avidly pursuing the technology.
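The "zero and one simultaneously" behavior can be made concrete with a textbook toy model of a single qubit. This is standard quantum mechanics, not a model of Google's processor; amplitudes are kept real for simplicity.

```python
import math
import random
from collections import Counter

# Toy single-qubit model: a state is a pair of real amplitudes
# (a, b) for |0> and |1>, with a^2 + b^2 = 1.

def hadamard(state):
    """Rotate a basis state into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse to 0 or 1 with probabilities a^2 and b^2."""
    a, _ = state
    return 0 if random.random() < a * a else 1

plus = hadamard((1.0, 0.0))  # holds both values until measured
counts = Counter(measure(plus) for _ in range(10_000))
print(counts)  # roughly 5000 zeros and 5000 ones
```

Each measurement yields a definite 0 or 1; the superposition shows up only in the statistics over many runs, which is part of why the technology is "somewhat bewildering."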

"Quantum things can be in multiple places at the same time," said Chris Monroe, a University of Maryland physicist who is also the founder of quantum startup IonQ. "The rules are very simple, they're just confounding."


Google touts quantum computing milestone
Rachel Lerman

Read more…


Credit: Getty Images

Topics: Computer Engineering, Quantum Computing, Quantum Teleportation, Star Trek

For the first time, researchers have teleported a qutrit, a tripartite unit of quantum information. The independent results from two teams are an important advance for the field of quantum teleportation, which has long been limited to qubits—units of quantum information akin to the binary “bits” used in classical computing.

These proof-of-concept experiments demonstrate that qutrits, which can carry more information and have greater resistance to noise than qubits, may be used in future quantum networks.
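The "carries more information" claim is a simple counting argument, not something specific to either team's experiment: a d-level system holds log2(d) bits, and n such systems span d**n basis states.

```python
import math

# One d-level quantum system holds log2(d) bits of information.
qubit_bits = math.log2(2)   # 1.0 bit per qubit
qutrit_bits = math.log2(3)  # ~1.585 bits per qutrit

# The gap compounds: ten qutrits span 3**10 = 59049 basis states,
# versus 2**10 = 1024 for ten qubits.
print(qutrit_bits)
print(3 ** 10, 2 ** 10)
```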

Chinese physicist Guang-Can Guo and his colleagues at the University of Science and Technology of China (USTC) reported their results in a preprint paper on April 28, although that work remains to be published in a peer-reviewed journal. On June 24 the other team, an international collaboration headed by Anton Zeilinger of the Austrian Academy of Sciences and Jian-Wei Pan of USTC, reported its results in a preprint paper that has been accepted for publication in Physical Review Letters. That close timing—as well as the significance of the result—has each team vying for credit and making critiques of the other’s work.

The name quantum teleportation brings to mind a technology out of Star Trek, where “transporters” can “beam” macroscale objects—even living humans—between far-distant points in space. Reality is less glamorous. In quantum teleportation, what is transported is a quantum state—for instance, the spin of an electron—carried between two entangled particles. Even when far apart, entangled particles share a mysterious connection; in the case of two entangled electrons, whatever happens to one’s spin influences that of the other, instantaneously.


“Qutrit” Experiments Are a First in Quantum Teleportation, Daniel Garisto, Scientific American

Read more…

Quantum Robustness...

A study demonstrates that a combination of two materials, aluminum and indium arsenide, forming a device called a Josephson junction could make quantum bits more resilient. Credit: University of Copenhagen image/Antonio Fornieri


Topics: Computer Science, Quantum Computing, Quantum Mechanics

Researchers have been trying for many years to build a quantum computer that industry could scale up, but the building blocks of quantum computing, qubits, still aren't robust enough to handle the noisy environment of a working quantum computer.

A theory developed only two years ago proposed a way to make qubits more resilient through combining a semiconductor, indium arsenide, with a superconductor, aluminum, into a planar device. Now, this theory has received experimental support in a device that could also aid the scaling of qubits.

This semiconductor-superconductor combination creates a state of "topological superconductivity," which would protect against even slight changes in a qubit's environment that interfere with its quantum nature, a renowned problem called "decoherence."

The device is potentially scalable because of its flat "planar" surface – a platform that industry already uses in the form of silicon wafers for building classical microprocessors.

The work, published in Nature, was led by the Microsoft Quantum lab at the University of Copenhagen's Niels Bohr Institute, which fabricated and measured the device. The Microsoft Quantum lab at Purdue University grew the semiconductor-superconductor heterostructure using a technique called molecular beam epitaxy, and performed initial characterization measurements.


New robust device may scale up quantum tech, researchers say, Kayla Wiles, Purdue University

Read more…