computer science (17)

Running on Air...


Running on air: close-up of the air-powered sensing device. (Courtesy: William Grover/UCR)

Topics: Computer Science, Electrical Engineering, Materials Science, Microfluidics

A device containing a pneumatic logic circuit made from 21 microfluidic valves could be used as a new type of air-powered computer that does not require any electronic components. The device could help make a wide range of important air-powered systems safer and less expensive, according to its developers at the University of California at Riverside.

Electronic computers rely on transistors to control the flow of electricity. But in the new air-powered computer, the researchers use tiny valves instead of transistors to control the flow of air rather than electricity. “These air-powered computers are an example of microfluidics, a decades-old field that studies the flow of fluids (usually liquids but sometimes gases) through tiny networks of channels and valves,” explains team leader William Grover, a bioengineer at UC Riverside.

By combining multiple microfluidic valves, the researchers made air-powered versions of standard logic gates. For example, they combined two valves in a row to make a Boolean AND gate. This gate works because air will flow through the two valves only if both are open. Similarly, two valves connected in parallel make a Boolean OR gate. Here, air will flow if either one or the other of the valves is open.
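To make the series/parallel picture concrete, here is a minimal truth-table sketch in Python (purely illustrative; the function names and the open/closed valve model are mine, not the UCR team's):

```python
# Illustrative model of the pneumatic gates described above:
# a valve passes air only when it is open (True).

def series(valve_a: bool, valve_b: bool) -> bool:
    """Two valves in a row: air flows only if both are open (AND gate)."""
    return valve_a and valve_b

def parallel(valve_a: bool, valve_b: bool) -> bool:
    """Two valves side by side: air flows if either is open (OR gate)."""
    return valve_a or valve_b

# Truth tables for the two pneumatic gates
for a in (False, True):
    for b in (False, True):
        print(f"a={a!s:5} b={b!s:5}  AND={series(a, b)!s:5}  OR={parallel(a, b)}")
```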

Air-powered computers make a comeback, Isabelle Dumé, Physics World

Read more…

FHM...


Antiferromagnetically ordered particles are represented by red and blue spheres in this artist’s impression. The particles are in an array of optical traps. Credit: Chen Lei

Topics: Applied Physics, Computer Science, Quantum Computer, Quantum Mechanics

Experiments on the Fermi–Hubbard model can now be made much larger, more uniform, and more quantitative.

A universal quantum computer—capable of crunching the numbers of any complex problem posed to it—is still a work in progress. But for specific problems in quantum physics, there’s a more direct approach to quantum simulation: Design a system that captures the physics you want to study, and then watch what it does. One of the systems most widely studied that way is the Fermi–Hubbard model (FHM), in which spin-up and spin-down fermions can hop among discrete sites in a lattice. Originally conceived as a stripped-down description of electrons in a solid, the FHM has attracted attention for its possible connection to the mysterious physics of high-temperature superconductivity.
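For readers who want the model written out, the standard single-band Fermi–Hubbard Hamiltonian (a textbook expression, not specific to any one experiment discussed here) is

\[
H = -t \sum_{\langle i,j \rangle,\sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow},
\]

where t is the hopping amplitude between neighbouring lattice sites, U is the on-site repulsion paid when a spin-up and a spin-down fermion occupy the same site, and the operators create, annihilate, and count fermions of spin σ on site i.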

Stripped down though it may be, the FHM defies solution, either analytical or numerical, except in the simplest cases, so researchers have taken to studying it experimentally. In 2017, Harvard University's Markus Greiner and colleagues made a splash when they observed antiferromagnetic order—a checkerboard pattern of up and down spins—in their FHM experiment consisting of fermionic atoms in a 2D lattice of 80 optical traps. (See Physics Today, August 2017, page 17.) The high-temperature-superconductor phase diagram has an antiferromagnetic phase near the superconducting one, so the achievement promised more exciting results to come. But the small size of the experiment limited the observations the researchers could make.

A 10 000-fold leap for a quintessential quantum simulator, Johanna L. Miller, Physics Today.

Read more…

'Teleporting' Images...


High-dimensional quantum transport enabled by nonlinear detection. In our concept, information is encoded on a coherent source and overlapped with a single photon from an entangled pair in a nonlinear crystal for up-conversion by sum frequency generation, the latter acting as a nonlinear spatial mode detector. The bright source is necessary to achieve the efficiency required for nonlinear detection. Information and photons flow in opposite directions: one of Bob's entangled photons is sent to Alice and has no information, while a measurement on the other in coincidence with the upconverted photon establishes the transport of information across the quantum link. Alice need not know this information for the process to work, while the nonlinearity allows the state to be of arbitrary and unknown dimension and basis. Credit: Nature Communications (2023). DOI: 10.1038/s41467-023-43949-x

Topics: Applied Physics, Computer Science, Cryptography, Cybersecurity, Quantum Computers, Quantum Mechanics, Quantum Optics

Nature Communications published research by an international team from Wits and ICFO (the Institute of Photonic Sciences) demonstrating the teleportation-like transport of "patterns" of light. This is the first approach that can transport images across a network without physically sending the image, and a crucial step towards realizing a quantum network for high-dimensional entangled states.

Quantum communication over long distances is integral to information security and has been demonstrated with two-dimensional states (qubits) over very long distances between satellites. This may seem enough if we compare it with its classical counterpart, i.e., sending bits that can be encoded in 1s (signal) and 0s (no signal), one at a time.

However, quantum optics allow us to increase the alphabet and to securely describe more complex systems in a single shot, such as a unique fingerprint or a face.
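A simple way to quantify that larger alphabet, assuming a photon prepared in a d-dimensional state (a "qudit"): the information it can carry per photon is

\[
I = \log_2 d \ \text{bits},
\]

so a qubit (d = 2) carries one bit, while a spatial-mode state with, say, d = 256 could in principle carry eight bits in a single shot.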

"Traditionally, two communicating parties physically send the information from one to the other, even in the quantum realm," says Prof. Andrew Forbes, the lead PI from Wits University.

"Now, it is possible to teleport information so that it never physically travels across the connection—a 'Star Trek' technology made real." Unfortunately, teleportation has so far only been demonstrated with three-dimensional states (imagine a three-pixel image); therefore, additional entangled photons are needed to reach higher dimensions.

'Teleporting' images across a network securely using only light, Wits University, Phys.org.

Read more…

Bitcoin and Gaia...


"What are the environmental impacts of cryptocurrency?" Written by Paul Kim; edited by Jasmine Suarez Mar 17, 2022, 5:21 PM EDT, Business Insider.

 Image: Ethereum, the second biggest cryptocurrency on the market, plans on changing to proof of stake mining in the future. Rachel Mendelson/Insider


Topics: Applied Physics, Computer Science, Cryptography, Economics, Environment, Star Trek, Thermodynamics

In what is now "old school Internet" (or web surfing for fogies), I will get a friend request from someone on Facebook/Meta who is in cryptocurrency. I quote myself from the first paragraph of what I refer to as my "public service announcement":

I am not INTERESTED in crypto. As someone who worked with cryptography as a matter of national security, holding a TS/SCI clearance, when you start your message with “let me explain to YOU how crypto works,” expect to get blocked.

Invariably, I still get them, which makes me wonder if they read the PSA or think "they will be the one" to sign me. News flash, pilgrim... I now have another pertinent reason to ignore your blockchain solicitations; actually, several good reasons.

Every time we turn on a light in our homes, there is a thermal budget that we are being charged for (that's how Duke Power makes its money in North Carolina, and the Pedernales Electric Cooperative in Texas). Bitcoin/Blockchain (I think) caught the imagination because it seemed like a "Federation Credit" from Star Trek, where no one explains fully how a society that is "post-scarcity" somehow feels the need for some type of currency in utopia. It's kind of like magic carpets: you go with the bit for the story - warp drive, Heisenberg compensators, Federation credits. What matters is the story, and whether you are thoroughly entertained after the denouement, not the physics.

You might not be extracting anything from the planet directly, but Bitcoin mining has a massive impact on the planet’s environment.

Mining resources from our planet can take a devastating toll on the environment, both local and global. Even beyond this, using the resource could cause disastrous effects on our planet, and dependence on a single resource can wreak havoc on a country’s economy. Yet, many of these resources are needed for our daily lives -- sometimes as a luxury, sometimes as a necessity. Any responsible country or company should always take pause to consider what impact mining of any kind can have on the planet.

It turns out that these days, one type of mining might be the worst for Earth’s environment: bitcoins. Yes, the “mining” of virtual currency makes its mark on our planet. The unequal distribution of Bitcoin mining across the globe means that some countries are making a much larger dent into the planet’s climate and environment than others ... all for a “resource” that is far from necessary for our society.

Bitcoin mining uses a lot of computing power to solve the cryptographic puzzles that lie at the heart of the industry. As of today (October 30, 2023), each Bitcoin is worth over $34,000, and with the multitude of other cryptocoins out there, using computers to unlock more can be a profitable endeavor. Almost half a trillion dollars of the global economy runs on these “virtual currencies.”
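For readers unfamiliar with what those "cryptographic puzzles" look like, here is a toy proof-of-work sketch in Python. It is illustrative only: real Bitcoin mining repeatedly double-SHA-256 hashes an 80-byte block header against a network-set difficulty target, and the enormous number of hashes required is precisely where the energy cost comes from.

```python
# Toy proof-of-work: find a nonce such that SHA-256(data + nonce) starts
# with a given number of zero hex digits. Raising `difficulty` by one
# multiplies the expected work (and energy) by 16.
import hashlib

def mine(data: str, difficulty: int = 4) -> tuple[int, str]:
    """Return the first nonce whose hash has `difficulty` leading zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block")
print(f"nonce={nonce}  hash={digest}")
```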

Worst Kind of Mining for the Environment? It Might Be Bitcoin. Erik Klemetti, Discover Magazine


Read more…

Quantum Slow Down...


Topics: Chemistry, Computer Science, Quantum Computer, Quantum Mechanics

Scientists at the University of Sydney have, for the first time, used a quantum computer to engineer and directly observe a process critical in chemical reactions by slowing it down by a factor of 100 billion times.

Joint lead researcher and Ph.D. student Vanessa Olaya Agudelo said, "It is by understanding these basic processes inside and between molecules that we can open up a new world of possibilities in materials science, drug design, or solar energy harvesting.

"It could also help improve other processes that rely on molecules interacting with light, such as how smog is created or how the ozone layer is damaged."

Specifically, the research team witnessed the interference pattern of a single atom caused by a common geometric structure in chemistry called a "conical intersection."

Conical intersections are known throughout chemistry and are vital to rapid photochemical processes such as light harvesting in human vision or photosynthesis.

Chemists have tried to observe such geometric processes in chemical dynamics directly since the 1950s, but doing so has not been feasible, given the extremely rapid timescales involved.

To get around this problem, quantum researchers in the School of Physics and the School of Chemistry created an experiment using a trapped-ion quantum computer in a completely new way. This allowed them to design and map this very complicated problem onto a relatively small quantum device—and then slow the process down by a factor of 100 billion. Their research findings are published August 28 in Nature Chemistry.

"In nature, the whole process is over within femtoseconds," said Olaya Agudelo from the School of Chemistry. "That's a billionth of a millionth—or one quadrillionth—of a second.

"Using our quantum computer, we built a system that allowed us to slow down the chemical dynamics from femtoseconds to milliseconds. This allowed us to make meaningful observations and measurements.

"This has never been done before."

Joint lead author Dr. Christophe Valahu from the School of Physics said, "Until now, we have been unable to directly observe the dynamics of 'geometric phase'; it happens too fast to probe experimentally.

"Using quantum technologies, we have addressed this problem."

Scientists use a quantum device to slow down simulated chemical reactions 100 billion times. University of Sydney, Phys.org.

Read more…

Straining Moore...


Topics: Applied Physics, Chemistry, Computer Science, Electrical Engineering, Materials Science, Nanotechnology, Quantum Mechanics, Semiconductor Technology

Gordon Moore, the co-founder of Intel who died earlier this year, is famous for forecasting a continuous rise in the density of transistors that we can pack onto semiconductor chips. James McKenzie looks at how “Moore’s law” is still going strong after almost six decades but warns that further progress is becoming harder and ever more expensive to sustain.

When the Taiwan Semiconductor Manufacturing Company (TSMC) announced last year that it was planning to build a new factory to produce integrated circuits, it wasn’t just the eye-watering $33bn price tag that caught my eye. What also struck me is that the plant, set to open in 2025 in the city of Hsinchu, will make the world’s first “2-nanometer” chips. Smaller, faster, and up to 30% more efficient than any microchip that has come before, TSMC’s chips will be sold to the likes of Apple – the company’s biggest customer – powering everything from smartphones to laptops.

But our ability to build such tiny, powerful chips shouldn't surprise us. After all, the engineer Gordon Moore – who died on 24 March this year, aged 94 – famously predicted in 1965 that the number of transistors we can squeeze onto an integrated circuit ought to double yearly. Writing for the magazine Electronics (38 114), Moore reckoned that by 1975 it should be possible to fit a quarter of a million components onto a single silicon chip with an area of one square inch (6.45 cm²).

Moore’s prediction, which he later said was simply a “wild extrapolation”, held true, although, in 1975, he revised his forecast, predicting that chip densities would double every two years rather than every year. What thereafter became known as “Moore’s law” proved amazingly accurate, as the ability to pack ever more transistors into a tiny space underpinned the almost non-stop growth of the consumer electronics industry. In truth, it was never an established scientific “law” but more a description of how things had developed in the past as well as a roadmap that the semiconductor industry imposed on itself, driving future development.
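Stated compactly (my notation, not McKenzie's): if N_0 transistors fit on a chip at a reference year t_0 and the count doubles every T years, then

\[
N(t) = N_0 \, 2^{(t - t_0)/T},
\]

with T = 1 year in Moore's original 1965 forecast and T = 2 years after his 1975 revision.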

Moore's law: further progress will push hard on the boundaries of physics and economics, James McKenzie, Physics World

Read more…

Chiplets...


Source: Semiengineering dot com - Chiplets

Topics: Computer Science, Electrical Engineering, Materials Science, Semiconductor Technology, Solid-State Physics

Depending on who you’re speaking with at the time, the industry’s adoption of chiplet technology to extend the reach of Moore’s Law is either continuing to roll along or is facing the absence of a commercial market. However, both assertions cannot be true. What is true is that chiplets have been used to build at least some commercial ICs for more than a decade and that semiconductor vendors continue to expand chiplet usability and availability. At the same time, the interface and packaging standards that are essential to widespread chiplet adoption remain in flux.

On the positive side of this question are existence proofs. Xilinx, now AMD, has been using 2.5D chiplet technology with large silicon interposers to make FPGAs for more than a decade. The first commercial use of this packaging technology appeared back in 2011 when Xilinx announced its Virtex-7 2000T FPGA, a 2-Mgate device built from four FPGA semiconductor tiles bonded to a silicon interposer. Xilinx jointly developed this chiplet-packaging technology with its foundry, TSMC, which now offers this CoWoS (Chip-on-Wafer-on-Substrate) interposer-and-chiplet technology to its other foundry customers. TSMC customers that have announced chiplet-based products include Broadcom and Fujitsu. AMD is now five generations along the learning curve with this packaging technology, which is now essential to the continued development of bigger and more diverse FPGAs. AMD will be presenting an overview of this multi-generation, chiplet-based technology, including a status update at the upcoming Hot Chips 2023 conference being held at Stanford University in Palo Alto, California, in August.

Similarly, Intel has long been developing and using chiplet technology in its own packaged ICs. The company has been using its 2.5D EMIB (embedded multi-die interconnect bridge) chiplet-packaging technology for years to manufacture its Stratix 10 FPGAs. That technology has now spread throughout Intel’s product line to include CPUs and SoCs. The poster child for Intel’s chiplet-packaging technologies is unquestionably the company’s Ponte Vecchio GPU, which packages 47 active “tiles” – Intel’s name for chiplets – in a multi-chip package. These 47 dies are manufactured by multiple semiconductor vendors using five different semiconductor process nodes, all combined in one package using Intel’s EMIB 2.5D and 3D Foveros chiplet-packaging techniques to produce an integrated product with more than 100 billion transistors – something not currently possible on one silicon die. Intel is now opening these chiplet-packaging technologies to select customers through IFS – Intel Foundry Services – and consequently expanding the size and number of its packaging facilities.

The Chiplet’s Time Is Coming. It’s Here, Or Not. Steven Leibson, Tirias Research, Forbes

Read more…

Caveat Modifier...


The Biofire Smart Gun. Photographer: James Stukenberg for Bloomberg Businessweek

Topics: Biometrics, Biotechnology, Computer Science, Democracy, Materials Science, Semiconductor Technology

Tech Target (Alyssa Provazza, Editorial Director): "A smartphone is a cellular telephone with an integrated computer and other features not originally associated with telephones, such as an operating system, web browsing, and the ability to run software applications." Smartphones, however, have had a detrimental effect on humans regarding health, critical thinking, and cognitive skills, convenient though they are.

I've seen the idea of "smart guns" for decades. Like the fingerprint scan for biometric safes, it's a safeguard that some will opt for but most likely won't unless compelled by legislation, which, in the current "thoughts and prayers" environment (i.e., sloganeering is easier than proposing a law if you continually get away with it), I'm not holding my breath for. A recent, late-20th-century example:

In 1974, the federal government passed the National Maximum Speed Law, which restricted the maximum permissible vehicle speed limit to 55 miles per hour (mph) on all interstate roads in the United States. The law was a response to the 1973 oil embargo, and its intent was to reduce fuel consumption. In the year after the National Maximum Speed Law was enacted, road fatalities declined 16.4%, from 54,052 in 1973 to 45,196 in 1974.

In April of 1987, Congress passed the Surface Transportation and Uniform Relocation Assistance Act, which permitted states to raise the legal speed limit on rural interstates to 65 mph. Under this legislation, 41 states raised their posted speed limits to 65 mph on segments of rural interstates. On November 28, 1995, Congress passed the National Highway Designation Act, which officially removed all federal speed limit controls. Since 1995, all US states have raised their posted speed limits on rural interstates; many have also raised the posted speed limits on urban interstates and non-interstate roads.

Conclusions. Reduced speed limits and improved enforcement with speed camera networks could immediately reduce speeds and save lives, in addition to reducing gas consumption, cutting emissions of air pollutants, saving valuable years of productivity, and reducing the cost of motor vehicle crashes.

Long-Term Effects of Repealing the National Maximum Speed Limit in the United States, Lee S. Friedman, Ph.D., corresponding author Donald Hedeker, Ph.D., and Elihu D. Richter, MD, MPH, National Library of Medicine, National Institutes of Health

Homo sapiens (Latin for "wise men") don't always do smart things.

In an office parking lot about halfway between Denver and Boulder, a 50-foot-long shipping container has been converted into a cramped indoor shooting range. Paper targets with torsos printed on them hang from two parallel tracks, and a rubber trap waits at the back of the container to catch the spent bullets. Black acoustic foam padding on the walls softens the gunshot noise to make the experience more bearable for the shooter, while an air filtration system sucks particulates out of the air. It's a far cry from the gleaming labs of the average James Bond movie, but Q might still be proud.

The weapons being tested at this site are smart guns: They can identify their registered users and won’t fire [for] anyone else. Smart guns have been a notoriously quixotic category for decades. The weapons carry the hope that an extra technological safeguard might prevent a wide range of gun-related accidents and deaths. But making a smart gun that’s good enough to be taken seriously has proved beyond difficult. It’s rare to find engineers with a strong understanding of both ballistics and biometrics whose products can be expected to work perfectly in life-or-death situations.

Some recent attempts have amounted to little more than a sensor or two slapped onto an existing weapon. More promising products have required too many steps and taken too much time to fire compared with the speed of a conventional handgun. What separates the Biofire Smart Gun here in the converted shipping container is that its ID systems, which scan fingerprints and faces, have been thoroughly melded into the firing mechanism. The battery-powered weapon has the sophistication of high-end consumer electronics, but it’s still a gun at its core.

A Smart Gun Is Finally Here, But Does Anyone Want It? Ashlee Vance, Bloomberg Business Week

Read more…

Zero Days...


Image Source: Tech Target

Topics: Computer Science, Cryptography, Cybersecurity, Spyware

Spyware vendors are exploiting zero days and known vulnerabilities in Android, iOS, and Chrome, sparking an increase in "dangerous hacking tools," warned Google's Threat Analysis Group.

In a blog post on Wednesday, Clement Lecigne, a security engineer at Google, detailed two recent campaigns that TAG discovered to be "both limited and highly targeted." The campaigns leveraged zero-day exploits alongside known vulnerabilities, or N days, against unpatched devices on widely used platforms.

In addition to emphasizing an ongoing patching problem, Google said the threat activity showed just how prevalent spyware vendors have become and the dangers they present, especially when wielding zero days.

"These campaigns are a reminder that the commercial spyware industry continues to thrive," Lecigne wrote in the blog post.

TAG currently tracks more than 30 commercial surveillance vendors that sell exploits or spyware programs to various governments and nation-state threat groups. While Google acknowledged spyware use might be legal under national or international laws, such tools have historically been used against targets such as government officials, journalists, political dissidents, and human rights activists. For example, NSO Group's Pegasus spyware was linked to the death of journalist Jamal Khashoggi, who was killed by Saudi government agents in 2018 after being surveilled and tracked via his mobile phone.

While spyware has been used to track high-value targets in the past, Lecigne warned vendors that access to zero days and N days poses an even broader threat.

"Even smaller surveillance vendors have access to 0-days, and vendors stockpiling and using 0-day vulnerabilities in secret pose a severe risk to the internet," Lecigne wrote. "These campaigns may also indicate that exploits and techniques are being shared between surveillance vendors, enabling the proliferation of dangerous hacking tools."

Google: Spyware vendors exploiting iOS, Android zero days, Arielle Waldman, Tech Target News Writer

Read more…

Chip Act and Wave Surfing...


Massive subsidies to help the US semiconductor industry regain its edge will not likely succeed unless progress is made in winning the global race of idea flow and monetization.

Topics: Applied Physics, Chemistry, Computer Science, Electrical Engineering, Semiconductor Technology

Intelligent use of subsidies to win the global idea race is a must for gaining, and regaining, the semiconductor edge.

The US semiconductor industry started with an invention at Bell Labs. Subsequently, it attained supremacy in semiconductor production by making computers better and cheaper. Notably, the rise of the PC wave made Intel and Silicon Valley seem like unsinkable technology superpowers. But during the first two decades of the 21st century, America lost that edge. The USA now relies on imports from Asia for the most advanced chips, and its iconic Intel is now a couple of technology generations behind Asia's TSMC and Samsung.

Furthermore, China's aggressive moves have added momentum to America's despair, triggering a chip war. But why has America lost the edge? Why does it rely on TSMC and Samsung to supply the most advanced chips that power iPhones, data centers, and weapons? Is it due to Asian governments' subsidies? Or is it due to America's failure to understand the dynamics, make prudent decisions, and manage technology and innovation?

Invention, rise, and fall of US semiconductor supremacy

In 1947, Bell Labs in the USA invented a semiconductor device: the transistor. Although American companies developed prototypes of transistor radios and other consumer electronic products, they did not immediately pursue them. But American firms were very fast in using the transistor to reinvent computers by replacing the vacuum tube at their technology core. Owing to its weight advantage, the US Air Force and NASA found the transistor suitable for onboard computers. The invention of the integrated circuit by Fairchild and Texas Instruments then accelerated the reduction in weight and size of digital logic circuits. Consequently, the use of semiconductors in onboard computers kept growing exponentially, and by the end of the 1960s, the US had become a powerhouse in logic-circuit semiconductors. But America remained second to Japan in global production, as Japanese companies were winning the consumer electronics race by using transistors.

US Semiconductor–from invention, supremacy to despair, Rokon Zaman, The-Waves.org

Read more…

QAOA and Privacy…


A quantum computer at IBM’s Thomas J. Watson Research Center.

Credit: Connie Zhou for IBM

Topics: Computer Science, Cryptography, Cybersecurity, Quantum Computer

A team of researchers in China has unveiled a technique that — theoretically — could crack the most commonly used types of digital privacy using a rudimentary quantum computer.

The technique worked in a small-scale demonstration, the researchers report, but other experts are skeptical that the procedure could scale up to beat ordinary computers at the task. Still, they warn that the paper, posted late last month on the arXiv repository, is a reminder of the vulnerability of online privacy.

Quantum computers are known to be a potential threat to current encryption systems. However, the technology is still in its infancy, and researchers typically estimate that it will be many years until it can be faster than ordinary computers at cracking cryptographic keys.

Researchers realized in the 1990s that quantum computers could exploit peculiarities of physics to perform tasks that seem to be beyond the reach of 'classical' computers. Peter Shor, a mathematician now at the Massachusetts Institute of Technology in Cambridge, showed in 1994 how to apply the phenomena of quantum superposition and interference to factoring integer numbers into primes — the integers that cannot be further divided without a remainder.
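Shor's insight reduces factoring an integer N to finding the period r of f(x) = a^x mod N; the quantum computer supplies that period efficiently, and the rest is classical number theory. Below is a minimal Python sketch of the classical part (illustrative only; the brute-force period search here is exactly the step a quantum computer would replace, and it has nothing to do with the Chinese team's approach):

```python
# Classical post-processing of Shor's algorithm for a toy-sized N.
from math import gcd

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a**r % N == 1 (brute force; the 'quantum' step)."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def try_factor(N: int, a: int):
    """Attempt to split N using the period of a modulo N."""
    g = gcd(a, N)
    if g != 1:                  # lucky guess: a already shares a factor with N
        return g, N // g
    r = find_period(a, N)
    if r % 2 == 1:              # need an even period; try another a
        return None
    x = pow(a, r // 2, N)
    if x == N - 1:              # trivial square root of 1; try another a
        return None
    p = gcd(x - 1, N)
    return (p, N // p) if 1 < p < N else None

print(try_factor(15, 7))        # -> (3, 5)
```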

Are quantum computers about to break online privacy? Davide Castelvecchi, Nature

Read more…

Pushing Beyond Moore...


Clean-room technicians at the AIM Photonics NanoTech chip fabrication facility in Albany, New York.  Credit: SUNY Polytechnic Institute

Topics: Computer Science, Electrical Engineering, Materials Science, Nanotechnology, Semiconductor Technology

Over 50 Years of Moore's Law - Intel

GAITHERSBURG, Md. — The U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) has entered into a cooperative research and development agreement with AIM Photonics that will give chip developers a critical new tool for designing faster chips that use both optical and electrical signals to transmit information. Called integrated photonic circuits, these chips are key components in fiber-optic networks and high-performance computing facilities. They are used in laser-guided missiles, medical sensors, and other advanced technologies. 

AIM Photonics, a Manufacturing USA institute, is a public-private partnership that accelerates the commercialization of new technologies for manufacturing photonic chips. The New York-based institute provides small and medium-sized businesses, academics, and government researchers access to expertise and fabrication facilities during all phases of the photonics development cycle, from design to fabrication and packaging.

NIST and AIM Photonics Team Up on High-Frequency Optical/Electronic Chips

Read more…

Thermo Limits


A radical reimagining of information processing could greatly reduce the energy use—as well as greenhouse gas emissions and waste heat—from computers. Credit: vchal/Getty Images

Topics: Climate Change, Computer Science, Electrical Engineering, Global Warming, Semiconductor Technology, Thermodynamics

In case you had not noticed, computers are hot—literally. A laptop can pump out thigh-baking heat, while data centers consume an estimated 200 terawatt-hours each year—comparable to the energy consumption of some medium-sized countries. The carbon footprint of information and communication technologies as a whole is close to that of fuel used in the aviation industry. And as computer circuitry gets ever smaller and more densely packed, it becomes more prone to melting from the energy it dissipates as heat.

Now physicist James Crutchfield of the University of California, Davis, and his graduate student Kyle Ray have proposed a new way to carry out computation that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their approach, described in a recent preprint paper, could bring heat dissipation below even the theoretical minimum that the laws of physics impose on today’s computers. That could greatly reduce the energy needed to both perform computations and keep circuitry cool. And it could all be done, the researchers say, using microelectronic devices that already exist.

In 1961 physicist Rolf Landauer of IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y., showed that conventional computing incurs an unavoidable cost in energy dissipation—basically, in the generation of heat and entropy. That is because a conventional computer has to sometimes erase bits of information in its memory circuits in order to make space for more. Each time a single bit (with the value 1 or 0) is reset, a certain minimum amount of energy is dissipated—which Ray and Crutchfield have christened "the Landauer." Its value depends on ambient temperature: in your living room, one Landauer would be around 10⁻²¹ joule. (For comparison, a lit candle emits on the order of 10 joules of energy per second.)
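For reference, the bound quoted above is Landauer's limit, k_B T ln 2, with k_B Boltzmann's constant and T the ambient temperature. At room temperature (T ≈ 300 K) it works out to

\[
E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\ \mathrm{J/K})(300\ \mathrm{K})(0.693) \approx 2.9 \times 10^{-21}\ \mathrm{J},
\]

consistent with the "around 10⁻²¹ joule" figure in the article.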

‘Momentum Computing’ Pushes Technology’s Thermodynamic Limits, Phillip Ball, Scientific American

Read more…

Fantastic Plastic...


Plastic fantastic: this perovskite-based device can be reconfigured and could play an important role in artificial intelligence systems. (Courtesy: Purdue University/Rebecca McElhoe)

Topics: Artificial Intelligence, Biology, Computer Science, Materials Science

Researchers in the US have developed a perovskite-based device that could be used to create a high-plasticity architecture for artificial intelligence. The team, led by Shriram Ramanathan at Purdue University, has shown that the material’s electronic properties can be easily reconfigured, allowing the devices to function like artificial neurons and other components. Their results could lead to more flexible artificial-intelligence hardware that could learn much like the brain.

Artificial intelligence systems can be trained to perform a task such as voice recognition using real-world data. Today this is usually done in software, which can adapt when additional training data are provided. However, machine learning systems that are based on hardware are much more efficient, and researchers have already created electronic circuits that behave like artificial neurons and synapses.

However, unlike the circuits in our brains, these electronics are not able to reconfigure themselves when presented with new training information. What is needed is a system with high plasticity, which can alter its architecture to respond efficiently to new information.

Device can transform into four components for artificial intelligence systems, Sam Jarman, Physics World

Read more…

Quantum AI...


Rutgers researchers and their collaborators have found that learning - a universal feature of intelligence in living beings - can be mimicked in synthetic matter, a discovery that in turn could inspire new algorithms for artificial intelligence (AI). (Courtesy: Rutgers University-New Brunswick)

Topics: Artificial Intelligence, Computer Science, Materials Science, Quantum Mechanics

Quantum materials known as Mott insulators can “learn” to respond to external stimuli in a way that mimics animal behavior, say researchers at Rutgers University in the US. The discovery of behaviors such as habituation and sensitization in these non-living systems could lead to new algorithms for artificial intelligence (AI).

Neuromorphic, or brain-inspired, computers aim to mimic the neural systems of living species at the physical level of neurons (brain nerve cells) and synapses (the connections between neurons). Each of the 100 billion neurons in the human brain, for example, receives electrical inputs from some of its neighbors and then “fires” an electrical output to others when the sum of the inputs exceeds a certain threshold. This process, also known as “spiking”, can be reproduced in nanoscale devices such as spintronic oscillators. As well as being potentially much faster and energy-efficient than conventional computers, devices based on these neuromorphic principles might be able to learn how to perform new tasks without being directly programmed to accomplish them.
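The threshold behaviour described above is easy to sketch in code (a bare-bones illustration, not the Rutgers team's model or any particular neuromorphic hardware):

```python
# A neuron "spikes" when the weighted sum of its inputs crosses a threshold.
def spikes(inputs, weights, threshold=1.0):
    """Return True if the neuron fires for the given inputs."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return total >= threshold

print(spikes([1, 0, 1], [0.6, 0.4, 0.5]))  # 0.6 + 0.5 = 1.1 >= 1.0 -> fires
```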

Quantum material ‘learns’ like a living creature, Isabelle Dumé, Physics World

Read more…

Tech Authoritarianism...


GIF source: Link below

Topics: Computer Science, Politics, Social Media

To build the metaverse, Facebook needs us to get used to smart glasses.

Last week Facebook released its new $299 “Ray-Ban Stories” glasses. Wearers can use them to record and share images and short videos, listen to music, and take calls. The people who buy these glasses will soon be out in public and private spaces, photographing and recording the rest of us, and using Facebook’s new “View” app to sort and upload that content.

My issue with these glasses is partially what they are, but mostly what they will become, and how that will change our social landscape.

How will we feel going about our lives in public, knowing that at any moment the people around us might be wearing stealth surveillance technology? People have recorded others in public for decades, but it's gotten more difficult for the average person to detect, and Facebook's new glasses will make it harder still since they resemble ordinary glasses and carry the Ray-Ban brand.

That brand’s trusted legacy of “cool” could make Facebook’s glasses appeal to many more people than Snap Spectacles and other camera glasses. (Facebook also has roughly 2 billion more users than Snapchat.) And Facebook can take advantage of the global supply chain and retail outlet infrastructure of Luxottica, Ray-Ban’s parent company. This means the product won’t have to roll out slowly—even worldwide.

Why Facebook is using Ray-Ban to stake a claim on our faces, S.A. Applin, MIT Technology Review

Read more…

Space-Based Quantum Technology...


(Credit: Yurchanka Siarhei/Shutterstock)

Topics: Computer Science, Quantum Computer, Quantum Mechanics

Quantum technologies are already revolutionizing life on Earth. But they also have the potential to change the way we operate in space. With the U.S., China, and Europe all investing heavily in this area, these changes are likely to be with us sooner rather than later.

So how will space-based quantum technologies make a difference?

Now, we get an overview thanks to the work of Rainer Kaltenbaek at the Institute for Quantum Optics and Quantum Information, in Austria, and colleagues throughout Europe, who have mapped out the future in this area and set out the advances that space-based quantum technologies will make possible.

While quantum computing and quantum communication grab most of the headlines, Kaltenbaek and colleagues point out that other quantum technologies are set to have equally impressive impacts. Take, for example, atom interferometry with quantum sensors.

These devices can measure with unprecedented accuracy any change in motion of a satellite in orbit as it is buffeted by tiny variations in the Earth’s gravitational field. These changes are caused by factors such as the movement of cooler, higher-density water flows in the deep ocean, flooding, the movement of the continents, and ice flows.
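The sensitivity comes from how the interferometer phase depends on acceleration. In a standard light-pulse atom gravimeter (a textbook expression; the symbols are not defined in the excerpt: k_eff is the effective laser wavevector, g the local gravitational acceleration, and T the time between pulses),

\[
\Delta\phi = k_{\mathrm{eff}}\, g\, T^{2},
\]

so tiny changes in g show up as measurable phase shifts, and the longer free-fall times available in orbit make the instrument more sensitive still.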

The Future of Space-Based Quantum Technology, Discover/Physics arXiv

Read more…