In a step that brings silicon-based quantum computers closer to reality, researchers at Princeton University have built a device in which a single electron can pass its quantum information to a particle of light. The particle of light, or photon, can then act as a messenger to carry the information to other electrons, creating connections that form the circuits of a quantum computer.
The research, published today in the journal Science and conducted at Princeton and HRL Laboratories in Malibu, California, represents a more than five-year effort to build a robust capability for an electron to talk to a photon, said Jason Petta, a Princeton professor of physics.
Princeton Professor of Physics Jason Petta, from left, and physics graduate students David Zajac and Xiao Mi have built a device that is a step forward for silicon-based quantum computers, which when built will be able to solve problems beyond the capabilities of everyday computers. The device isolates an electron so that it can pass its quantum information to a photon, which can then act as a messenger to carry the information to other electrons to form the circuits of the computer. (Photo by Denise Applewhite, Office of Communications)
“Just like in human interactions, to have good communication a number of things need to work out — it helps to speak the same language and so forth,” Petta said. “We are able to bring the energy of the electronic state into resonance with the light particle, so that the two can talk to each other.”
The discovery will help the researchers use light to link individual electrons, which act as the bits, or smallest units of data, in a quantum computer. Quantum computers are devices that, when realized, will be able to perform calculations beyond the capabilities of everyday computers by using tiny particles such as electrons, which follow quantum rules rather than the physical laws of the everyday world.
Each bit in an everyday computer can have a value of a 0 or a 1. Quantum bits — known as qubits — can be in a state of 0, 1, or both a 0 and a 1 simultaneously. This superposition, as it is known, enables quantum computers to tackle complex questions that today’s computers cannot solve.
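For readers who want a concrete picture of superposition, the idea can be sketched in a few lines of Python. The amplitudes below are a generic textbook illustration, not a description of the Princeton device.

```python
import numpy as np

# Toy illustration of superposition: a qubit state is a pair of complex
# amplitudes (alpha, beta) over the basis states |0> and |1>, with
# |alpha|^2 + |beta|^2 = 1 giving the measurement probabilities.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)     # equal superposition of 0 and 1
state = np.array([alpha, beta], dtype=complex)

probabilities = np.abs(state) ** 2
print("P(0) =", probabilities[0], " P(1) =", probabilities[1])

# A classical bit, by contrast, is always exactly [1, 0] or [0, 1].
```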
Simple quantum computers have already been made using trapped ions and superconductors, but technical challenges have slowed the development of silicon-based quantum devices. Silicon is a highly attractive material because it is inexpensive and is already widely used in today’s smartphones and computers.
The qubit consists of a single electron that is trapped below the surface of a silicon chip (gray). The green, pink and purple wires on top of the silicon structure deliver precise voltages to the qubit. The purple plate reduces electronic interference that can destroy the qubit’s quantum information. By adjusting the voltages in the wires, the researchers can trap a single electron in a double quantum dot and adjust its energy so that it can communicate its quantum information to a nearby photon. (Photo courtesy of the Jason Petta research group, Department of Physics)
The researchers trapped both an electron and a photon in the device, then adjusted the energy of the electron in such a way that the quantum information could transfer to the photon. This coupling enables the photon to carry the information from one qubit to another located up to a centimeter away.
Quantum information is extremely fragile — it can be lost entirely due to the slightest disturbance from the environment. Photons are more robust against disruption and can potentially carry quantum information not just from qubit to qubit in a quantum computer circuit but also between quantum chips via cables.
For these two very different types of particles to talk to each other, however, researchers had to build a device that provided the right environment. First, Peter Deelman at HRL Laboratories, a corporate research-and-development laboratory owned by the Boeing Company and General Motors, fabricated the semiconductor chip from layers of silicon and silicon-germanium. This structure trapped a single layer of electrons below the surface of the chip. Next, researchers at Princeton laid tiny wires, each just a fraction of the width of a human hair, across the top of the device. These nanometer-sized wires allowed the researchers to deliver voltages that created an energy landscape capable of trapping a single electron, confining it in a region of the silicon called a double quantum dot.
The researchers used those same wires to adjust the energy level of the trapped electron to match that of the photon, which is trapped in a superconducting cavity that is fabricated on top of the silicon wafer.
Prior to this discovery, semiconductor qubits could only be coupled to neighboring qubits. By using light to couple qubits, it may be feasible to pass information between qubits at opposite ends of a chip.
The electron’s quantum information consists of nothing more than the location of the electron in one of two energy pockets in the double quantum dot. The electron can occupy one or the other pocket, or both simultaneously. By controlling the voltages applied to the device, the researchers can control which pocket the electron occupies.
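As a rough, illustrative sketch (the paper's actual model and parameter values are not reproduced in this article), a double quantum dot can be treated as a two-level system whose energy splitting is set by a gate-controlled detuning and a tunnel coupling. Sweeping the detuning until the splitting matches an assumed cavity photon frequency shows the kind of resonance condition described above; the numbers below are placeholders.

```python
import numpy as np

# Minimal sketch (not from the paper): a double-quantum-dot charge qubit
# modeled as a two-level system.  The gate voltages set the detuning
# `epsilon` between the two charge pockets; `t_c` is the tunnel coupling.
# All energies are expressed as frequencies (GHz), with h = 1.

def qubit_frequency(epsilon_ghz, t_c_ghz):
    """Energy splitting of H = (epsilon/2)*sigma_z + t_c*sigma_x, in GHz."""
    return np.sqrt(epsilon_ghz**2 + 4.0 * t_c_ghz**2)

cavity_ghz = 6.0          # assumed microwave-cavity photon frequency
t_c_ghz = 2.5             # assumed tunnel coupling

# Sweep the detuning (i.e. the gate voltages) and find where the qubit
# splitting matches the cavity photon, so the electron and the photon
# can exchange quantum information.
detunings = np.linspace(-10, 10, 2001)
splittings = qubit_frequency(detunings, t_c_ghz)
resonant = detunings[np.argmin(np.abs(splittings - cavity_ghz))]
print(f"qubit-cavity resonance near detuning {resonant:.2f} GHz")
```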
“We now have the ability to actually transmit the quantum state to a photon confined in the cavity,” said Xiao Mi, a graduate student in Princeton’s Department of Physics and first author on the paper. “This has never been done before in a semiconductor device because the quantum state was lost before it could transfer its information.”
The success of the device is due to a new circuit design that brings the wires closer to the qubit and reduces interference from other sources of electromagnetic radiation. To reduce this noise, the researchers put in filters that remove extraneous signals from the wires that lead to the device. The metal wires also shield the qubit. As a result, the qubits are 100 to 1,000 times less noisy than the ones used in previous experiments.
Jeffrey Cady, a 2015 graduate, helped develop the filters to reduce the noise as part of his undergraduate senior thesis, and graduate student David Zajac led the effort to use overlapping electrodes to confine single electrons in silicon quantum dots.
Eventually the researchers plan to extend the device to work with an intrinsic property of the electron known as its spin. “In the long run we want systems where spin and charge are coupled together to make a spin qubit that can be electrically controlled,” Petta said. “We’ve shown we can coherently couple an electron to light, and that is an important step toward coupling spin to light.”
David DiVincenzo, a physicist at the Institute for Quantum Information at RWTH Aachen University in Germany who was not involved in the research, is the author of an influential 1996 paper outlining five minimal requirements necessary for creating a quantum computer. Of the Princeton-HRL work, DiVincenzo said: “It has been a long struggle to find the right combination of conditions that would achieve the strong coupling condition for a single-electron qubit. I am happy to see that a region of parameter space has been found where the system can go for the first time into strong-coupling territory.”
Founded in 1746 in Elizabeth as the College of New Jersey, Princeton is one of the nine Colonial Colleges established before the American Revolution as well as the fourth chartered institution of higher education in the American colonies. The university moved to Newark in 1747, then to Princeton in 1756 and was renamed Princeton University in 1896. The present-day College of New Jersey in nearby Ewing Township, New Jersey, is an unrelated institution. Princeton had close ties to the Presbyterian Church, but has never been affiliated with any denomination and today imposes no religious requirements on its students.
Princeton now provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences, and engineering. It does not have schools of medicine, law, divinity, or business, but it does offer professional degrees through the Woodrow Wilson School of Public and International Affairs, the Princeton University School of Engineering and Applied Science, and the School of Architecture. The university has ties with the Institute for Advanced Study, Princeton Theological Seminary, and the Westminster Choir College of Rider University. Princeton has been associated with 35 Nobel Laureates, 17 National Medal of Science winners, and three National Humanities Medal winners. On a per-student basis, Princeton has the largest university endowment in the world.
Princeton University research articles from Innovation Toronto
- More than 1,200 new planets confirmed using new technique for verifying Kepler data – May 11, 2016
- How an artificial protein rescues dying cells – insight into how life can adapt and potentially be reinvented – March 13, 2016
- Army ants’ living bridges span collective intelligence, swarm robotics – November 26, 2015
- 3D-Printed Guide Helps Regrow Complex Nerves After Injury – September 19, 2015
- We Are Entering a “Golden Age” of Animal Tracking – June 13, 2015
- Dirty pool: Soil’s large carbon stores could be freed by increased CO2, plant growth (Nature Climate Change) – December 24, 2014
- Much better, cheaper, brighter and flexible LEDs – September 28, 2014
- ‘Fracking’ in the dark: Biological fallout of shale-gas production still largely unknown – August 3, 2014
- Solar panels light the way from carbon dioxide to fuel – July 2, 2014
- Physicists Find a Link between Wormholes and Spooky Action at a Distance
- Princeton Laser breakthrough will enable sniffing the air at a distance
- “Futurity” service launches to promote university research as traditional science journalism declines
- New Pattern Recognition Makes for Easier Sifting of Big Data
- From slowdown to shutdown — US leadership in biomedical research takes a blow, says ASCB
- Cool heads likely won’t prevail in a hotter, wetter world
- Printable ‘bionic’ ear melds electronics and biology
- Bacterial byproduct offers route to avoiding antibiotic resistance
- Steps toward quantum computing
- New Breakthrough Prize Awards Millions to Life Scientists
- What Will It Take to Solve Climate Change?
- The costs of climate change can be mitigated if economic activity moves in response
- Synthetic fuels could eliminate entire U.S. need for crude oil, create ‘new economy’
- Tiny Structure Gives Big Boost to Solar Power
- Breakthrough offers new route to large-scale quantum computing
- A Chemist Comes Very Close to a Midas Touch
- Forgoing College to Pursue Dreams
- Experts propose ‘cyber war’ on cancer
- New nanoparticle discovery opens door for pharmaceuticals
- Innovation promises to cut massive power use at big data companies in a flash
- Researchers use nanotech to make cancer 3M times more detectable
- Effective World Government Will Be Needed to Stave Off Climate Catastrophe
- Can Fracking and Carbon Sequestration Coexist?
- ‘Storm of the Century’ May Become ‘Storm of the Decade’
- Academic Earth
- Teen’s invention boosts solar panel output 40 percent
- Plan B for Energy: 8 Revolutionary Energy Sources
- Rubber sheets harness body movement to power electrical devices
- Quantum computing researchers achieve control over individual electrons
- Computer Scientists Take Over Electronic Voting Machine With New Programming Technique
- Reverse Combustion: Can CO2 Be Turned Back into Fuel?
- Lower cost solar panels using plastic electronics
- Internet Ideology War: Google’s Spat with China Could Reshape Traditional Online Freedoms
- Unreliable research: Trouble at the lab
- Cultured Beef: Do We Really Need a $380,000 Burger Grown in Petri Dishes?
- Ditch Time-Wasting Meetings By Turning Your Office Into An Ant Colony
- College of Future Could Be Come One, Come All
- Researchers say pharmaceutical ‘innovation crisis’ is a myth
- Smart teeth invention makes cover of New York Times magazine
- Cracking Open the Scientific Process
- M.I.T. Game-Changer: Free Online Education For All
- Exploring open access in higher education: live chat best bits
- Beneficial Biofuels: Leading National Experts Reach Consensus
Researchers at Columbia University, Princeton and Harvard University have developed a new approach for analyzing big data that can drastically improve the ability to make accurate predictions about medicine, complex diseases, social science phenomena, and other issues.
In a study published in the December 13 issue of Proceedings of the National Academy of Sciences (PNAS), the authors introduce the Influence score, or “I-score,” as a statistic correlated with how much variables inherently can predict, or “predictivity”, which can consequently be used to identify highly predictive variables.
“In our last paper, we showed that significant variables may not necessarily be predictive, and that good predictors may not appear statistically significant,” said principal investigator Shaw-Hwa Lo, a professor of statistics at Columbia University. “This left us with an important question: how can we find highly predictive variables then, if not through a guideline of statistical significance? In this article, we provide a theoretical framework from which to design good measures of prediction in general. Importantly, we introduce a variable set’s predictivity as a new parameter of interest to estimate, and provide the I-score as a candidate statistic to estimate variable set predictivity.”
Current approaches to prediction generally include using a significance-based criterion for evaluating variables to use in models and evaluating variables and models simultaneously for prediction using cross-validation or independent test data.
“Using the I-score prediction framework allows us to define a novel measure of predictivity based on observed data, which in turn enables assessing variable sets for, preferably high, predictivity,” Lo said, adding that, while intuitively obvious, not enough attention has been paid to the consideration of predictivity as a parameter of interest to estimate. Motivated by the needs of current genome-wide association studies (GWAS), the study authors provide such a discussion.
In the paper, the authors describe the predictivity for a variable set and show that a simple sample estimation of predictivity does not directly provide usable information for the prediction-oriented researcher. They go on to demonstrate that the I-score can be used to compute a measure that asymptotically approaches predictivity. The I-score can effectively differentiate between noisy and predictive variables, Lo explained, making it helpful in variable selection. A further benefit is that, while the usual approaches require heavy use of cross-validation or testing data to evaluate predictors, the I-score approach does not rely as heavily on such data.
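The paper's exact definition is not reproduced in this article. As a hedged illustration, a partition-based influence-style score can be computed as in the sketch below; the normalization is an assumption and may differ from the published I-score.

```python
import numpy as np

def i_score(X, y):
    """
    Partition-based influence-style score (illustrative; the normalization
    may differ from the published I-score).  X: (n, k) array of discrete
    variables; y: (n,) outcome.  The sample is partitioned by the joint
    levels of the k variables, and each cell contributes
    n_j^2 * (ybar_j - ybar)^2 to the score.
    """
    X = np.asarray(X)
    y = np.asarray(y, dtype=float)
    n, ybar = len(y), y.mean()
    # Group observations by their joint variable values.
    _, labels = np.unique(X, axis=0, return_inverse=True)
    score = 0.0
    for j in np.unique(labels):
        yj = y[labels == j]
        score += len(yj) ** 2 * (yj.mean() - ybar) ** 2
    return score / (n * y.var()) if y.var() > 0 else 0.0

# Toy check: a variable pair that jointly determines y scores far higher
# than a pair of pure noise variables.
rng = np.random.default_rng(0)
x1, x2 = rng.integers(0, 2, 500), rng.integers(0, 2, 500)
noise = rng.integers(0, 2, (500, 2))
y = x1 ^ x2                          # y depends on x1 and x2 jointly
print(i_score(np.column_stack([x1, x2]), y), i_score(noise, y))
```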
“We offer simulations and an application of the I-score on real data to demonstrate the statistic’s predictive performance on sample data,” he said. “These show that the I-score can capture highly predictive variable sets, estimate a lower bound for the theoretical correct prediction rate, and correlate well with the out-of-sample correct rate. We suggest that using the I-score method can aid in finding variable sets with promising prediction rates; however, further research in the avenue of sample-based measures of predictivity is needed.”
The authors conclude that there are many applications for which the I-score would be useful, for example in formulating predictions about diseases from high-dimensional data such as gene datasets, and in the social sciences for predictions about text, terrorism, civil war, elections and financial markets.
“We’re hoping to impress upon the scientific community the notion that for those of us who might be interested in predicting an outcome of interest, possibly with rather complex or high dimensional data, we might gain by reconsidering the question as one of how to search for highly predictive variables (or variable sets) and using statistics that measure predictivity to help us identify those variables to then predict well,” Lo said. “For statisticians in particular, we’re hoping this opens up a new field of work that would focus on designing new statistics that measure predictivity.”
Researchers at Princeton, Columbia and Harvard have created a new method to analyze big data that better predicts outcomes in health care, politics and other fields.
The study appears this week in the journal Proceedings of the National Academy of Sciences. A PDF is available on request.
In previous studies, the researchers showed that significant variables might not be predictive and that good predictors might not appear statistically significant. This posed an important question: how can we find highly predictive variables if not through a guideline of statistical significance? Common approaches to prediction include using a significance-based criterion for evaluating variables to use in models and evaluating variables and models simultaneously for prediction using cross-validation or independent test data.
In an effort to reduce the error rate with those methods, the researchers proposed a new measure called the influence score, or I-score, to better measure a variable’s ability to predict. They found that the I-score is effective in differentiating between noisy and predictive variables in big data and can significantly improve the prediction rate. For example, the I-score improved the prediction rate in breast cancer data from 70 percent to 92 percent. The I-score can be applied in a variety of fields, including terrorism, civil war, elections and financial markets.
“The practical implications are what drove the project, so they’re quite broad,” says lead author Adeline Lo, a postdoctoral researcher in Princeton’s Department of Politics. “Essentially anytime you might be interested in predicting and identifying highly predictive variables, you might have something to gain by conducting variable selection through a statistic like the I-score, which is related to variable predictivity. That the I-score fares especially well in high dimensional data and with many complex interactions between variables is an extra boon for the researcher or policy expert interested in predicting something with large dimensional data.”
Incentives that are designed to enable smarter use of the ocean while also protecting marine ecosystems can and do work, and offer significant hope to help address the multiple environmental threats facing the world’s oceans, researchers conclude in a new analysis.
Whether economic or social, incentive-based solutions may be one of the best options for progress in reducing impacts from overfishing, climate change, ocean acidification and pollution, researchers from Oregon State University and Princeton University say in a new report published this week in Proceedings of the National Academy of Sciences.
And positive incentives – the “carrot” – work better than negative incentives, or the “stick.”
Part of the reason for optimism, the researchers report, is changing awareness, attitudes and social norms around the world, in which resource users and consumers are becoming more informed about environmental issues and demanding action to address them. That sets the stage for economic incentives that can convert near-disaster situations into sustainable fisheries, cleaner water and long-term solutions.
“As we note in this report, the ocean is becoming higher, warmer, stormier, more acidic, lower in dissolved oxygen and overfished,” said Jane Lubchenco, the distinguished university professor in the College of Science and advisor in marine studies at Oregon State University, lead author of the new report, and U.S. science envoy for the ocean at the Department of State.
“The threats facing the ocean are enormous, and can seem overwhelming. But there’s actually reason for hope, and it’s based on what we’ve learned about the use of incentives to change the way people, nations and institutions behave. We believe it’s possible to make that transition from a vicious to a virtuous cycle. Getting incentives right can flip a disaster to a resounding success.”
Simon A. Levin, the James S. McDonnell distinguished university professor in ecology and evolutionary biology at Princeton University and co-author of the publication, had a similar perspective.
“It is really very exciting that what, until recently, was theoretical optimism is proving to really work,” Levin said. “This gives me great hope for the future.”
The stakes are huge, the scientists point out in their study.
The global market value of marine and coastal resources and industries is about $3 trillion a year; more than 3 billion people depend on fish for a major source of protein; and marine fisheries involve more than 200 million people. Ocean and coastal ecosystems provide food, oxygen, climate regulation, pest control, recreational and cultural value.
“Given the importance of marine resources, many of the 150 or more coastal nations, especially those in the developing world, are searching for new approaches to economic development, poverty alleviation and food security,” said Elizabeth Cerny-Chipman, a postdoctoral scholar working with Lubchenco. “Our findings can provide guidance to them about how to develop sustainably.”
In recent years, the researchers said in their report, new incentive systems have been developed that tap into people’s desires for both economic sustainability and global environmental protection. In many cases, individuals, scientists, faith communities, businesses, nonprofit organizations and governments are all changing in ways that reward desirable and dissuade undesirable behaviors.
One of the leading examples of progress is the use of “rights-based fisheries.” Instead of a traditional “race to fish” concept based on limited seasons, this growing movement allows fishers to receive a guaranteed fraction of the catch, benefit from a well-managed, healthy fishery and become part of a peer group in which cheating is not tolerated.
There are now more than 200 rights-based fisheries covering more than 500 species among 40 countries, the report noted. One was implemented in the Gulf of Mexico red snapper commercial fishery, which was on the brink of collapse after decades of overfishing. A rights-based plan implemented in 2007 has tripled the spawning potential, doubled catch limits and increased fishery revenue by 70 percent.
“Multiple turn-around stories in fisheries attest to the potential to end overfishing, recover depleted species, achieve healthier ocean ecosystems, and bring economic benefit to fishermen and coastal communities,” said Lubchenco. “It is possible to have your fish and eat them too.”
A success story used by some nations has been combining “territorial use rights in fisheries,” which assign exclusive fishing access in a particular place to certain individuals or communities, together with adjacent marine reserves. Fish recover inside the no-take reserve and “spillover” to the adjacent fished area outside the reserve. Another concept of incentives has been “debt for nature” swaps used in some nations, in which foreign debt is exchanged for protection of the ocean.
“In parallel to a change in economic incentives,” said Jessica Reimer, a graduate research assistant with Lubchenco, “there have been changes in behavioral incentives and social norms, such as altruism, ethical values, and other types of motivation that can be powerful drivers of change.”
The European Union, based on strong environmental support among its public, has issued warnings and trade sanctions against countries that engage in illegal, unregulated and unreported fishing. In the U.S., some of the nation’s largest retailers, in efforts to improve their image with consumers, have moved toward sale of only certified sustainable seafood.
Incentives are not a new idea, the researchers noted. But they emphasize that their power may have been under-appreciated.
“Recognizing the extent to which a change in incentives can be explicitly used to achieve outcomes related to biodiversity, ecosystem health and sustainability . . . holds particular promise for conservation and management efforts in the ocean,” they wrote in their conclusion.
Neural networks using light could lead to superfast computing.
Neural networks are taking the world of computing by storm. Researchers have used them to create machines that are learning a huge range of skills that had previously been the unique preserve of humans—object recognition, face recognition, natural language processing, machine translation. All these skills, and more, are now becoming routine for machines.
So there is great interest in creating more capable neural networks that can push the boundaries of artificial intelligence even further. The focus of this work is in creating circuits that operate more like neurons, so-called neuromorphic chips. But how to make these circuits significantly faster?
Today, we get an answer of sorts thanks to the work of Alexander Tait and pals at Princeton University in New Jersey. These guys have built the world’s first photonic neuromorphic chip and show that it computes at ultrafast speeds.
Optical computing has long been the great white hope of computer science. Photons have significantly more bandwidth than electrons and so can process more data more quickly. But the advantages of optical data processing systems have never outweighed the additional cost of making them, and so they have never been widely adopted.
That has started to change in some areas of computing, such as analog signal processing, which requires the kind of ultrafast data processing that only photonic chips can provide.
Now neural networks are opening up a new opportunity for photonics. “Photonic neural networks leveraging silicon photonic platforms could access new regimes of ultrafast information processing for radio, control, and scientific computing,” say Tait and co.
The heart of the challenge is to produce an optical device in which each node has the same response characteristics as a neuron. The nodes take the form of tiny circular waveguides carved into a silicon substrate in which light can circulate. When released, this light modulates the output of a laser working at threshold, a regime in which small changes in the incoming light have a dramatic impact on the laser’s output.
Crucially, each node in the system works with a specific wavelength of light—a technique known as wavelength division multiplexing. The light from all the nodes can be summed by total power detection before being fed into the laser. And the laser output is fed back into the nodes to create a feedback circuit with a non-linear character.
An important question is just how closely this non-linearity mimics neural behavior. Tait and co measure the output and show that it is mathematically equivalent to a device known as a continuous-time recurrent neural network. “This result suggests that programming tools for CTRNNs could be applied to larger silicon photonic neural networks,” they say.
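The continuous-time recurrent neural network is a standard mathematical model, so the equivalence can be illustrated by simulating that model directly. The sketch below integrates the CTRNN equations themselves, not the photonic hardware, and the weights and inputs are made up.

```python
import numpy as np

# Minimal CTRNN simulation: tau * dy/dt = -y + W @ sigmoid(y) + u.
# This is the mathematical model the photonic nodes are shown to emulate,
# not a model of the silicon photonic chip itself.
def simulate_ctrnn(W, u, tau=1.0, dt=0.01, steps=2000):
    y = np.zeros(W.shape[0])
    trace = []
    for _ in range(steps):
        dy = (-y + W @ (1.0 / (1.0 + np.exp(-y))) + u) / tau
        y = y + dt * dy               # explicit Euler step
        trace.append(y.copy())
    return np.array(trace)

# Toy 3-node network with random recurrent weights and a constant drive.
rng = np.random.default_rng(1)
W = rng.normal(scale=0.5, size=(3, 3))
trajectory = simulate_ctrnn(W, u=np.array([0.2, -0.1, 0.3]))
print("final node states:", trajectory[-1])
```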
That’s an important result because it means the device that Tait and co have made can immediately exploit the vast range of programming nous that has been gathered for these kinds of neural networks.
They go on to demonstrate how this can be done using a network consisting of 49 photonic nodes. They use this photonic neural network to solve the mathematical problem of emulating a certain kind of differential equation and compare it to an ordinary central processing unit.
The results show just how fast photonic neural nets can be. “The effective hardware acceleration factor of the photonic neural network is estimated to be 1,960 × in this task,” say Tait and co. That’s a speed up of three orders of magnitude.
That opens the doors to an entirely new industry that could bring optical computing into the mainstream for the first time. “Silicon photonic neural networks could represent first forays into a broader class of silicon photonic systems for scalable information processing,” say Tait and co.
Of course much depends on how well the first generation of electronic neuromorphic chips perform. Photonic neural nets will have to offer significant advantages to be widely adopted and will therefore require much more detailed characterization. Clearly, there are interesting times ahead for photonics.
Learn more: World’s First Photonic Neural Network Unveiled
A system that can compare physical objects while potentially protecting sensitive information about the objects themselves has been demonstrated experimentally at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL). This work, by researchers at Princeton University and PPPL, marks an initial confirmation of the application of a powerful cryptographic technique in the physical world.
“This is the first experimental demonstration of a physical zero-knowledge proof,” said Sébastien Philippe, a graduate student in the Department of Mechanical and Aerospace Engineering at Princeton University and lead author of the paper. “We have translated a major method of modern cryptography devised originally for computational tasks into use for a physical system.” Cryptography is the science of disguising information.
This research, supported by funding from the DOE’s National Nuclear Security Administration through the Consortium for Verification Technology, marks a promising first experimental step toward a technique that could prove useful in future disarmament agreements, pending the results of further development, testing and evaluation. While important questions remain, the technique, first proposed in a paper published in 2014 in Nature magazine, might have potential application to verify that nuclear warheads presented for disarmament were in fact true warheads. Support for this work came also from the John D. and Catherine T. MacArthur Foundation and the Carnegie Corporation of New York.
The research, outlined in a paper in Nature Communications on September 20, 2016, was conducted on a set of 2-inch steel and aluminum cubes arranged in different combinations. Researchers first organized the cubes into a designated “true” pattern and then into a number of “false” ones. Next, they beamed high-energy neutrons into each arrangement and recorded how many passed through to bubble neutron detectors produced by Yale University, on the other side. When a neutron interacts with a superheated droplet in the detector, it creates a stable macroscopic bubble.
To avoid revealing information about the composition and configuration of the cubes, bubbles created in this manner were added to those already preloaded into the detectors. The preload was designed so that if a valid object were presented, the sum of the preload and the signal detected with the object present would equal the count produced by firing neutrons directly into the detectors – with no object in front of them.
The experiment found that, for the “true” pattern, the preload plus the measured signal equaled the count produced by beaming neutrons with nothing in front of the detectors, while the totals for the significantly different “false” arrangements clearly did not.
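The bookkeeping behind that comparison can be illustrated with made-up numbers. The sketch below ignores the statistical noise present in real bubble counts and is not the experiment's actual analysis.

```python
# Toy sketch of the preload arithmetic described above (invented counts,
# no statistical noise).  Each entry is a bubble count in one detector.
open_beam = [120, 95, 140, 110]          # neutrons fired with nothing in front
true_signal = [40, 30, 55, 45]           # transmission through the "true" pattern
preload = [o - t for o, t in zip(open_beam, true_signal)]  # loaded in advance

def verify(measured_signal, preload, open_beam, tolerance=0):
    """Valid if preload + measured counts reproduce the open-beam counts."""
    totals = [p + m for p, m in zip(preload, measured_signal)]
    return all(abs(t - o) <= tolerance for t, o in zip(totals, open_beam))

print(verify(true_signal, preload, open_beam))        # True  -> genuine object
spoof_signal = [60, 20, 50, 70]                       # a "false" arrangement
print(verify(spoof_signal, preload, open_beam))       # False -> spoof exposed
```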
“This was an extremely important experimental demonstration,” said Robert Goldston, a fusion scientist and coauthor of the paper who is former director of PPPL and a Princeton professor of astrophysical sciences. “We had a theoretical idea and have now provided a proven practical example.” Joining him as coauthors are Alex Glaser, associate professor in Princeton’s Woodrow Wilson School of Public and International Affairs and the Department of Mechanical and Aerospace Engineering; and Francesco d’Errico, senior research scientist at the Yale School of Medicine and professor at the University of Pisa, Italy.
When further developed for a possible arms control application, the technique would add bubbles from irradiation of a putative warhead to those already preloaded into detectors by the warhead’s owner.
If the total for the new and preloaded bubbles equaled the count produced by beaming neutrons into the detectors with nothing in front of them, the putative weapon would be verified to be a true one. But if the total count for the preload plus warhead irradiation did not match the no-object count, the inspected weapon would be exposed as a spoof. Prior to the test, the inspector would randomly select which preloaded detectors to use with which putative warhead, and which preload to use with a warhead that was, for example, selected from the owner’s active inventory.
In a sensitive measurement, such as one involving a real nuclear warhead, the proposition is that no classified data would be exposed or shared in the process, and no electronic components that might be vulnerable to tampering or snooping would be used. Even statistical noise — or random variation in neutron measurement — would convey no data. Indeed, “For the zero-knowledge property to be conserved, neither the signal nor the noise may carry information,” the authors write. A necessary future step is to assess this proposition fully, and to develop and review a concept of operations in detail to determine actual viability and information sensitivity.
Important questions yet to be resolved include the details of obtaining and confirming a target warhead during the zero-knowledge measurement; specifics of establishing and maintaining the pre-loaded detectors in a way that ensures inspecting party confidence without revealing any data considered sensitive by the inspected party; and feasibility questions associated with safely deploying active interrogation measurement techniques on actual nuclear warheads in sensitive physical environments, in a way that provides confidence to both the inspected and inspecting parties.
Glaser, Goldston and Boaz Barak, a professor of computer science at Harvard University and former Princeton associate professor, first launched the concept for a zero-knowledge protocol for warhead verification in the 2014 paper in Nature magazine. That paper led Foreign Policy magazine to name the authors among its “100 Leading Global Thinkers of 2014,” and prompted other research centers to embark on similar projects. “We are happy to see this important field of research gain new momentum and create new opportunities for collaboration between national laboratories and universities,” Glaser said.
Increased power and slashed energy consumption for data centers
Princeton University researchers have built a new computer chip that promises to boost performance of data centers that lie at the core of online services from email to social media.
Data centers – essentially giant warehouses packed with computer servers – enable cloud-based services, such as Gmail and Facebook, as well as store the staggeringly voluminous content available via the internet. Surprisingly, the computer chips at the hearts of the biggest servers that route and process information often differ little from the chips in smaller servers or everyday personal computers.
By designing their chip specifically for massive computing systems, the Princeton researchers say they can substantially increase processing speed while slashing energy needs. The chip architecture is scalable; designs can be built that go from a dozen processing units (called cores) to several thousand. Also, the architecture enables thousands of chips to be connected together into a single system containing millions of cores. Called Piton, after the metal spikes driven by rock climbers into mountainsides to aid in their ascent, it is designed to scale.
“With Piton, we really sat down and rethought computer architecture in order to build a chip specifically for data centers and the cloud,” said David Wentzlaff, an assistant professor of electrical engineering and associated faculty in the Department of Computer Science at Princeton University. “The chip we’ve made is among the largest chips ever built in academia and it shows how servers could run far more efficiently and cheaply.”
Wentzlaff’s graduate student, Michael McKeown, will give a presentation about the Piton project Tuesday, Aug. 23, at Hot Chips, a symposium on high performance chips in Cupertino, California. The unveiling of the chip is a culmination of years of effort by Wentzlaff and his students. Mohammad Shahrad, a graduate student in Wentzlaff’s Princeton Parallel Group said that creating “a physical piece of hardware in an academic setting is a rare and very special opportunity for computer architects.”
Other Princeton researchers involved in the project since its 2013 inception are Yaosheng Fu, Tri Nguyen, Yanqi Zhou, Jonathan Balkind, Alexey Lavrov, Matthew Matl, Xiaohua Liang, and Samuel Payne, who is now at NVIDIA. The Princeton team designed the Piton chip, which was manufactured for the research team by IBM. Primary funding for the project has come from the National Science Foundation, the Defense Advanced Research Projects Agency, and the Air Force Office of Scientific Research.
The current version of the Piton chip measures six by six millimeters. The chip has over 460 million transistors, each of which is as small as 32 nanometers – too small to be seen by anything but an electron microscope. The bulk of these transistors are contained in 25 cores, the independent processors that carry out the instructions in a computer program. Most personal computer chips have four or eight cores. In general, more cores mean faster processing times, so long as software ably exploits the hardware’s available cores to run operations in parallel. Therefore, computer manufacturers have turned to multi-core chips to squeeze further gains out of conventional approaches to computer hardware.
In recent years, companies and academic institutions have produced chips with many dozens of cores, but Wentzlaff said the readily scalable architecture of Piton can enable thousands of cores on a single chip, with half a billion cores in the data center.
“What we have with Piton is really a prototype for future commercial server systems that could take advantage of a tremendous number of cores to speed up processing,” said Wentzlaff.
The Piton chip’s design focuses on exploiting commonality among programs running simultaneously on the same chip. One method to do this is called execution drafting. It works very much like the drafting in bicycle racing, when cyclists conserve energy behind a lead rider who cuts through the air, creating a slipstream.
At a data center, multiple users often run programs that rely on similar operations at the processor level. The Piton chip’s cores can recognize these instances and execute identical instructions consecutively, so that they flow one after another, like a line of drafting cyclists. Doing so can increase energy efficiency by about 20 percent compared to a standard core, the researchers said.
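As a hedged illustration of the drafting idea (not Piton's actual pipeline logic), the sketch below merges two instruction streams and issues identical instructions back-to-back, so the work of fetching and decoding the instruction is reused.

```python
# Illustrative sketch of execution drafting: when two threads are about to
# execute the same instruction, issue the duplicates consecutively so the
# second one "drafts" behind the first.
def draft_schedule(thread_a, thread_b):
    schedule, i, j = [], 0, 0
    while i < len(thread_a) or j < len(thread_b):
        if i < len(thread_a) and j < len(thread_b) and thread_a[i] == thread_b[j]:
            # Same instruction in both threads: draft them back-to-back.
            schedule.append(("A+B drafted", thread_a[i]))
            i += 1; j += 1
        else:
            if i < len(thread_a):
                schedule.append(("A", thread_a[i])); i += 1
            if j < len(thread_b):
                schedule.append(("B", thread_b[j])); j += 1
    return schedule

# Two users running similar request handlers issue mostly identical code.
a = ["load", "add", "store", "branch", "load"]
b = ["load", "add", "store", "jump", "load"]
for entry in draft_schedule(a, b):
    print(entry)
```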
A second innovation incorporated into the Piton chip parcels out when competing programs access computer memory that exists off the chip. Called a memory traffic shaper, this function acts like a traffic cop at a busy intersection, considering each program’s needs, adjusting memory requests and waving them through appropriately so they do not clog the system. This approach can yield an 18 percent performance jump compared to conventional allocation.
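The traffic-cop analogy can likewise be sketched with a toy arbiter that gives each program a per-cycle budget of off-chip memory requests. The quotas and queue contents below are invented; this is an illustration of the idea, not Piton's actual shaper.

```python
from collections import deque

# Toy illustration of the "traffic cop" idea: each program gets a per-cycle
# quota of off-chip memory requests, so a bursty program cannot monopolize
# the shared memory system.
def shape_traffic(queues, quota_per_cycle):
    cycle, served = 0, []
    while any(queues.values()):
        for prog, q in queues.items():
            for _ in range(quota_per_cycle[prog]):
                if q:
                    served.append((cycle, prog, q.popleft()))
        cycle += 1
    return served

queues = {
    "latency-sensitive": deque(["r1", "r2", "r3"]),
    "bulk-copy":         deque([f"b{i}" for i in range(8)]),
}
quota = {"latency-sensitive": 2, "bulk-copy": 1}   # invented per-cycle budgets
for entry in shape_traffic(queues, quota):
    print(entry)
```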
The Piton chip also gains efficiency by its management of memory stored on the chip itself. This memory, known as the cache memory, is the fastest in the computer and used for frequently accessed information. In most designs, cache memory is shared across all of the chip’s cores. But that strategy can backfire when multiple cores access and modify the cache memory. Piton sidesteps this problem by assigning areas of the cache and specific cores to dedicated applications. The researchers say the system can increase efficiency by 29 percent when applied to a 1,024-core architecture. They estimate that this savings would multiply as the system is deployed across millions of cores in a data center.
The researchers said these improvements could be implemented while keeping costs in line with current manufacturing standards. To hasten further developments leveraging and extending the Piton architecture, the Princeton researchers have made its design open source and thus available to the public and fellow researchers at the OpenPiton website: http://www.
“We’re very pleased with all that we’ve achieved with Piton in an academic setting, where there are far fewer resources than at large, commercial chipmakers,” said Wentzlaff. “We’re also happy to give out our design to the world as open source, which has long been commonplace for software, but is almost never done for hardware.”
Scientists from Princeton University and NASA have confirmed that 1,284 objects observed outside Earth’s solar system by NASA’s Kepler spacecraft are indeed planets. Reported in The Astrophysical Journal on May 10, it is the largest single announcement of new planets to date and more than doubles the number of confirmed planets discovered by Kepler so far to more than 2,300.
The researchers’ discovery hinges on a technique developed at Princeton that allows scientists to efficiently analyze thousands of signals Kepler has identified to determine which are most likely to be caused by planets and which are caused by non-planetary objects such as stars. This automated technique — implemented in a publicly available custom software package called Vespa — computes the chances that the signal is in fact caused by a planet.
The researchers used Vespa to compute the reliability values for over 7,000 signals identified in the latest Kepler catalog, and verified the 1,284 planets with 99 percent certainty. They also independently verified 651 additional planet signals that had already been confirmed as planets by other methods. In addition, the researchers identified 428 candidates as likely “false positives,” or signals generated by something other than a planet.
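Vespa's statistical model is not described in this article. As a generic illustration of the kind of calculation involved, the sketch below combines assumed prior rates and likelihoods for competing explanations of a signal into a probability that the signal is a planet, and checks it against the 99 percent threshold; the scenario names and numbers are placeholders.

```python
# Illustration only (not Vespa's actual model): combine prior rates and
# likelihoods of competing explanations for a transit-like signal into a
# probability that the signal is caused by a planet.
def planet_probability(likelihoods, priors):
    """likelihoods/priors: dicts keyed by scenario, 'planet' among them."""
    weights = {s: likelihoods[s] * priors[s] for s in likelihoods}
    return weights["planet"] / sum(weights.values())

likelihoods = {"planet": 0.8, "eclipsing_binary": 0.1, "background_star": 0.05}
priors      = {"planet": 0.5, "eclipsing_binary": 0.3, "background_star": 0.2}
p = planet_probability(likelihoods, priors)
print(f"P(planet | signal) = {p:.3f}   verified at 99%: {p > 0.99}")
```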
A team of researchers at Princeton University has predicted the existence of a new state of matter in which current flows only through a set of surface channels that resemble an hourglass.
These channels are created through the action of a newly theorized particle, dubbed the “hourglass fermion,” which arises due to a special property of the material. The tuning of this property can sequentially create and destroy the hourglass fermions, suggesting a range of potential applications such as efficient transistor switching.
In an article published in the journal Nature this week, the researchers theorize the existence of these hourglass fermions in crystals made of potassium and mercury combined with either antimony, arsenic or bismuth. The crystals are insulators in their interiors and on their top and bottom surfaces, but perfect conductors on two of their sides where the fermions create hourglass-shaped channels that enable electrons to flow.
Scientists at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) have helped design and test a component that could improve the performance of doughnut-shaped fusion facilities known as tokamaks.
Called a “liquid lithium limiter,” the device has circulated the protective liquid metal within the walls of China’s Experimental Advanced Superconducting Tokamak (EAST) and kept the plasma from cooling down and halting fusion reactions. The journal Nuclear Fusion published results of the experiment in March 2016. The research was supported by the DOE Office of Science.
“We demonstrated a continuous, recirculating lithium flow for several hours in a tokamak,” said Rajesh Maingi, head of boundary physics research and plasma-facing components at PPPL. “We also demonstrated that the flowing liquid lithium surface was compatible with high plasma confinement and with reduced recycling of the hydrogen isotope deuterium to an extent previously achieved only with evaporated lithium coatings. The recirculating lithium provides a fresh, clean surface that can be used for long-lasting plasma discharges.”
A new study from Princeton has revealed how a synthetic protein revives E. coli cells that lack a life-sustaining gene, offering insight into how life can adapt to survive and potentially be reinvented.
Researchers in the Hecht lab discovered the unexpected way in which a synthetic protein called SynSerB promotes the growth of cells that lack the natural SerB gene, which encodes an enzyme responsible for the last step in the production of the essential amino acid serine. The findings were published in the Proceedings of the National Academy of Sciences.
The Hecht group first discovered SynSerB’s ability to rescue serine-depleted E. coli cells in 2011. At that time, they also discovered several other de novo proteins capable of rescuing the deletions of three other essential proteins in E. coli. “These are novel proteins that have never existed on Earth and aren’t related to anything on Earth, yet they enable life to grow where it otherwise would not,” said Michael Hecht, professor of chemistry at Princeton and corresponding author on the article.
Natural proteins are complex molecular machines constructed from a pool of twenty different amino acids. Typically they range from several dozen to several hundred amino acids in length. In principle, there are more possible protein sequences than atoms in the universe, but through evolution Nature has selected just a small fraction to carry out the cellular functions that make life possible.
“Those proteins must be really special,” Hecht said. “The driving question was, ‘Can we do that in the laboratory? Can we come up with non-natural sequences that are that special, from an enormous number of possibilities?’”
To address this question, the Hecht lab developed a library of non-natural proteins guided by a concept called binary design. The idea was to narrow down the number of possible sequences by choosing from eleven select amino acids that were divided into two groups: polar and non-polar. By using only the polar or non-polar characteristics of those amino acids, the researchers could design a plethora of novel proteins to fold into a particular shape based on their affinity to and repulsion from water. Then, by allowing the specific positions to have different amino acids, the researchers were able to produce a diverse library of about one million proteins, each 102 amino acids long.
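As a hedged sketch of the binary-design idea, the code below constrains each position in a pattern to be polar or nonpolar while leaving the specific amino acid free to vary. The residue sets and the short pattern are placeholders, not the paper's actual 102-residue design.

```python
import random

# Sketch of binary patterning: each position is marked polar (P) or
# nonpolar (N); the exact residue at that position is chosen freely from
# the corresponding set.  Sets and pattern here are illustrative only.
POLAR    = list("DEKNQH")     # assumed polar choices (6 residues)
NONPOLAR = list("FILMV")      # assumed nonpolar choices (5 residues)

def sample_sequence(pattern, rng=random):
    return "".join(rng.choice(POLAR if c == "P" else NONPOLAR) for c in pattern)

pattern = "PNPPNNPNNPP" * 2    # toy binary pattern, not the published design
library = {sample_sequence(pattern) for _ in range(5)}
for seq in library:
    print(seq)
```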
“We had to focus on certain subsets of proteins that we knew would fold and search there first for function,” said Katie Digianantonio, a graduate student in the Hecht lab and first author on the paper. “It’s like instead of searching the whole universe for life, we’re looking in specific solar systems.”
Having found several non-natural proteins that could rescue specific cell lines, the researchers focused in this latest work on how SynSerB promotes cell growth. The most obvious explanation, that SynSerB simply catalyzed the same reaction performed by the deleted SerB gene, was ruled out by an early experiment.
To discern SynSerB’s mechanism among the multitude of complex biochemical pathways in the cell, the researchers turned to a technique called RNA sequencing. This technique allowed them to take a detailed snapshot of the serine-depleted E. coli cells with and without their synthetic protein and compare the differences.
“Instead of guessing and checking, we wanted to look at the overall environment to see what was happening,” Digianantonio said. The RNA sequencing experiment revealed that SynSerB induced overexpression of a protein called HisB, high levels of which have been shown to promote the key reaction normally performed by the missing gene. By enlisting the help of HisB, the non-natural protein was able to induce the production of serine, which ultimately allowed the cell to survive.
“Life is opportunistic. Some proteins are going to work by acting similarly to what they replaced and some will find another pathway,” Hecht said. “Either way it’s cool.”
Learn more: How an artificial protein rescues dying cells