Sandia National Laboratories' primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons. The primary campus is located on Kirtland Air Force Base in Albuquerque, New Mexico, and the other is in Livermore, California, next to Lawrence Livermore National Laboratory. Sandia is a National Nuclear Security Administration laboratory.
It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste. Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research, including computational biology, mathematics (through its Computer Science Research Institute), materials science, alternative energy, psychology, MEMS, and cognitive science initiatives. Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until it was decommissioned, and now hosts ASCI Red Storm, originally known as Thor’s Hammer. Sandia is also home to the Z Machine, the largest X-ray generator in the world, designed to test materials under conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data that aids computer modeling of nuclear weapons.
Sandia National Laboratories research articles from Innovation Toronto
- Iron nitride transformers could boost energy storage options – March 27, 2016
- Enormous blades longer than two football fields could lead to more offshore energy in U.S. – January 30, 2016
- Thor’s hammer pulsed-power accelerator to crush materials at 1 million atmospheres – January 10, 2016
- Way cheaper catalyst may lower fuel costs for hydrogen-powered cars – October 8, 2015
- Algae nutrient recycling is a triple win – August 26, 2015
- Storing hydrogen underground could boost transportation, energy security – December 11, 2014
- New Portable Nuclear Device Detector – November 3, 2014
- Magnetized fusion technique produces significant results: ‘Break-even’ point close – September 26, 2014
- Diamond plates create nanostructures through pressure, not chemistry – June 28, 2014
- How to Keep the Lights on after a Superstorm | micro grid
- Triple-Threat Method Sparks Hope for Nuclear Fusion Energy
- Sandia Labs harnessing the sun’s energy with tiny particles
- Sandia probability maps help sniff out food contamination
- Lifelike, cost-effective robotic Sandia Hand can disable IEDs
- Miniature Sandia sensors to Help Climate Research
- Sandia simulation suggests sunny skies for fusion reactors
- Sandia hoppers have robots jumping for joy
- A development that could have profound implications for the future of electronics, sensors, energy conversion and energy storage. | metal-organic framework
- Fusion, Anyone?
- Tiny Detectors Sniff Out Chemical, Biological Threats
- DARPA’s ATLAS humanoid robot gears up for disaster response
- Power for seaports may be the next job for hydrogen fuel cells
- Return of the Hydrogen Car?
- Team observes real-time charging of a lithium-air battery
- Offshore use of vertical-axis wind turbines gets closer look
- US draws up plans for nuclear drones
- U.S. Army Recruiting an Array of Animal-Inspired Robots to Assist Battlefield Troops
- Electric cars could fill up at the MetILs pump
- “Interface scaffolds” could wire prosthetics directly into amputees’ nervous systems
- Gemini-Scout mine rescue robot to lead the way to trapped miners
- World’s Smallest Battery
- New approach could mean break-even nuclear fusion reactions within 2-3 years
- LEDs Promise Brighter Future, Not Necessarily Greener
- Lead-Carbon: A Game Changer for Alternative Energy Storage
- Are Engines the Future of Solar Power?
- Tiny glitter-sized photovoltaic cells could revolutionize solar power
- First Suncatcher solar dishes to be used in Arizona
- Microelectronic Photovoltaics: The Solar Breakthrough We’ve Been Looking For?
Sandia National Laboratories researchers have shown it’s possible to make transistors and diodes from advanced semiconductor materials that could perform much better than silicon, the workhorse of the modern electronics world.
The breakthrough work takes a step toward more compact and efficient power electronics, which in turn could improve everything from consumer electronics to electrical grids. Power electronics are vital for electrical systems because they transfer power from its source to the load, or user, by converting voltages, currents and frequencies. Sandia’s research was published this summer in Applied Physics Letters and Electronics Letters and presented at conferences.
“The goal is to be able to shrink power supplies, power conversion systems,” said electrical engineer Bob Kaplar, who leads a Laboratory Directed Research and Development project studying ultrawide bandgap (UWBG) semiconductor materials. The project explores ways to grow those materials with fewer defects and create different device designs that exploit the properties of these new materials that have significant advantages over silicon.
The project is laying the scientific groundwork for the new UWBG research area, answering such questions as how the materials behave and how to work with them. It also will aid Sandia’s broader work through developments such as compact power conversion using better semiconductor devices. “Understanding the science helps lead toward that second goal,” Kaplar said.
Bandgap is a fundamental materials property that helps determine electrical conductivity and ultimately transistor performance. Wide bandgap (WBG) materials allow devices to operate at higher voltages, frequencies and temperatures, and are starting to have impact on power conversion systems. Emerging ultrawide bandgap materials are even more attractive because they could allow further scaling to devices that operate at even higher voltages, frequencies and temperatures. When made into transistors, the materials have the potential to vastly improve the performance and efficiency of electrical power grids, electric vehicles, computer power supplies and motors for such things as heating, ventilation and air conditioning (HVAC) systems. Faster switching also could lead to smaller capacitors and associated circuit components, miniaturizing the entire power system.
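The advantage a wider bandgap confers on power devices is often summarized by the Baliga figure of merit, which scales as permittivity times mobility times the cube of the critical electric field. The sketch below uses approximate textbook material parameters, chosen only for illustration (they are not values from the Sandia work), to show why even a modest gain in critical field pays off so strongly:

```python
# Baliga figure of merit (BFOM ~ epsilon_r * mu * Ec^3) comparison.
# Material parameters below are rough, illustrative values.
materials = {
    # name: (relative permittivity, electron mobility [cm^2/V·s], critical field [MV/cm])
    "Si":    (11.7, 1400, 0.3),
    "SiC":   (9.7,   950, 2.5),
    "GaN":   (9.0,  1200, 3.3),
    "AlGaN": (8.9,   300, 9.0),  # composition-dependent; rough estimate
}

si_bfom = 11.7 * 1400 * 0.3 ** 3   # silicon as the baseline
for name, (eps, mu, ec) in materials.items():
    bfom = eps * mu * ec ** 3
    print(f"{name:6s} BFOM relative to Si: {bfom / si_bfom:8.1f}")
```

Because the critical field enters cubed, the roughly tenfold higher field of a UWBG material such as AlGaN dominates the figure of merit despite its lower mobility.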
Work demonstrates highest-bandgap transistor
Sandia researchers demonstrated the highest-bandgap transistor ever, a High Electron Mobility Transistor, and published those results in the July 18 edition of Applied Physics Letters. Sandia published papers in June and July in Electronics Letters analyzing the performance of diodes made from gallium nitride (GaN) and aluminum gallium nitride (AlGaN).
“All three of these papers represent progress on the road to more compact and higher-efficiency power converters,” Kaplar said. “They are also very exciting developments in semiconductor materials and device physics in their own right.”
However, he cautioned that the work doesn’t mean UWBG devices are ready for the marketplace.
“There are a lot more improvements that need to be made to the transistor,” he said. “The same with the diodes. There’s a lot more optimization that needs to be done, a lot we don’t understand about their behavior.”
Researchers at Sandia and elsewhere have studied WBG materials, such as silicon carbide (SiC) and GaN, for about two decades. In recent years, Sandia also has looked at next-generation UWBG materials, such as AlGaN. In fact, Sandia coined the term ultrawide bandgap, which has caught on throughout the research community, Kaplar said.
Researchers studying best way to grow new materials
One critical piece of the puzzle is figuring out the best way to grow new semiconductor materials. Researchers also must understand defects in the materials, how to process materials into working devices and find ways to improve passive elements, such as magnetic inductors.
Semiconductor materials are characterized by their efficiency and effectiveness, so it’s easy to assume you could make a power supply 10 times smaller if one material is 10 times better than another. But it’s not that simple. “It depends on other components in the power converter. There’s magnetics, there’s capacitors,” Kaplar said. “We’re starting to look at what is a more realistic scaling.”
He and his colleagues collaborate with Sandia experts in other fields to understand the relationship of semiconductors to other components in a system. “The semiconductor enables the system, but if you have something else that’s limiting it, then you can’t reach the semiconductor’s full potential for shrinking the size of power conversion,” Kaplar said.
Better semiconductor materials would mean higher absolute voltages for such uses as distributing power grid energy. Right now that’s done by stacking devices in series to reach a desired combined voltage. Since UWBG materials have higher voltages than more traditional materials, far fewer devices would be needed in the stack. Kaplar said UWBG materials also could be useful at extreme temperatures or radiation environments — applications of interest for nuclear weapons or satellites.
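The series-stacking point can be made concrete with a little arithmetic. The sketch below assumes hypothetical per-device voltage ratings and a 50 percent derating margin; none of these numbers come from the article:

```python
import math

def devices_in_series(bus_voltage_kv, device_rating_kv, derating=0.5):
    """Number of series-connected devices needed to block a bus voltage,
    with each device run at a fraction of its rating for safety margin."""
    usable_kv = device_rating_kv * derating
    return math.ceil(bus_voltage_kv / usable_kv)

# Hypothetical 13.8 kV distribution feed with assumed device ratings:
for name, rating_kv in [("Si", 6.5), ("SiC", 10.0), ("UWBG (assumed)", 30.0)]:
    n = devices_in_series(13.8, rating_kv)
    print(f"{name:14s} rated {rating_kv:5.1f} kV -> {n} device(s) in series")
```

A higher-voltage UWBG device collapses the stack toward a single device, which is the simplification Kaplar describes.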
Because of the potential impact on so much of Sandia’s work, Kaplar expects UWBG research to continue after the current project ends next September. “We lay the foundation and then we want it to continue to advance, both the science and the eventual applications.”
Learn more: Honey, I shrunk the circuit
High-convergence implosions produce thermonuclear fusion from high-temperature plasma
Using magnetic field thermal insulation to keep plasmas hot enough to achieve thermonuclear fusion was first proposed by the Italian physicist Enrico Fermi in 1945, and independently a few years later by Russian physicist Andrei Sakharov.
An approach known as magneto-inertial fusion uses an implosion of material surrounding magnetized plasma to compress it and thereby generate temperatures in excess of the 20 million degrees required to initiate fusion. But historically, the concept has been plagued by insufficient temperature and stagnation pressure production, due to instabilities and thermal losses in the system. Recently, however, researchers using the Z Machine at Sandia National Laboratories have demonstrated improved control over and understanding of implosions in a Z-pinch, a particular type of magneto-inertial device that relies on the Lorentz force to compress plasma to fusion-relevant densities and temperatures.
The breakthrough was enabled by entirely unexpected physics. The researchers’ approach to fusion relies on laser preheating of the fuel contained within a solid cylindrical metal liner, both of which are pre-magnetized by a magnetic field of 100,000 gauss, a crucial distinction. Driving a current of 20 million amperes through the liner over 100 nanoseconds causes it to implode, compressing the plasma and raising temperatures to 30 million degrees and the magnetic field to 100 million gauss. When the fusion yield is large enough, such an enormous magnetic field is able to trap the heat given off by the fusion reactions and “boot-strap” the fuel to higher temperatures, leading to ignition.
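The field amplification quoted above is consistent with simple flux conservation: in an idealized cylindrical implosion the product B·r² stays constant, so a thousandfold rise in field implies a radial convergence ratio of about 32. A quick back-of-envelope check (this is a textbook scaling argument, not a calculation from the experiment):

```python
import math

B_initial_gauss = 1.0e5   # 100,000 G pre-magnetization (from the article)
B_final_gauss   = 1.0e8   # 100 million G at stagnation (from the article)

# Ideal flux conservation in a cylinder: B * r^2 = const,
# so B_final / B_initial = (r0 / rf)^2.
convergence_ratio = math.sqrt(B_final_gauss / B_initial_gauss)
print(f"Implied radial convergence ratio r0/rf = {convergence_ratio:.1f}")
```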
According to existing theory, however, the imposed magnetic field should not have significantly affected the growth of the instabilities that normally shred the liner and prevent high levels of compression during the implosion. But while fusion plasmas are subject to various forms of instability, referred to as modes, not all of these instabilities are detrimental. The pre-magnetized system demonstrated unprecedented implosion stability due to the unpredicted growth of helical modes, rather than the usual azimuthally correlated modes that are most damaging to implosion integrity. The dominant helical modes replaced the so-called “sausage” modes found in most Z-pinches and grew more slowly, allowing the plasma to be compressed to the thermonuclear fusion-producing temperature of 30 million degrees and one billion times atmospheric pressure.
The origin of the helical modes themselves, however, remained a mystery. Advanced simulations of the system solved the mystery by uncovering the origin of the helical instability growth that enabled high temperatures, magnetic fields, and plasma pressures from such high-convergence implosions. The researchers achieved the critical new breakthrough when they included effects from the plasma and magnetic field in the transmission line that delivers the intense current pulse to the implosion region. They found that the plasma outside the liner participated in an upper hybrid oscillation and bombarded the liner, resulting in a helically correlated perturbation to the liner early in time that overrides other perturbations.
The new perturbation source was also found to be the previously unexplained origin of the ubiquitous “sausage” fundamental mode that has historically dominated and spoiled Z-pinch implosion dynamics in the un-magnetized versions of these systems. Once they included the new physics in the modeling, the researchers were able to reproduce and explain the two dozen observables from the magnetized liner inertial fusion experiments at the Z Machine (Figure 1). The implosions were found to efficiently convert liner kinetic energy into the internal energy of the fusion fuel, confirming that the system behaved as expected and could scale to higher yields on future facilities.
Since the thermonuclear hot spot produced the expected stagnation pressure and was not dominated by 3D instability, it is now thought to provide the basis for a promising route to achieve higher thermonuclear fusion yields in the laboratory.
One of the biggest untapped clean energy sources on the planet — wave energy — could one day power millions of homes across the U.S. But more than a century after the first tests of the power of ocean waves, it is still one of the hardest energy sources to capture.
Now, engineers at Sandia National Laboratories are conducting the largest model-scale wave energy testing of its kind to improve the performance of wave-energy converters (WECs). The project is taking place at the U.S. Navy’s Maneuvering and Sea Keeping facility at the Carderock Division in Bethesda, Maryland, one of the largest wave tanks in the world at 360 feet long and 240 feet wide and able to hold 12 million gallons of water.
Sandia project leads Ryan Coe and Giorgio Bacelli spend long days in the dark wave tank, where minimal lighting reduces the growth of algae in the water. They are collecting data from their numerical modeling and experimental research to benefit wave energy technology with improved methodologies, strategic control systems design and testing practices for wave energy converters.
“Our goal is to improve the economic viability of these devices,” said Coe. “In order to do so, we are working out ways to control the WEC’s generator to increase the amount of power it absorbs. At the same time, we are looking at how to reduce the loads and stresses on these devices in harsh conditions to ultimately lengthen a WEC’s lifespan in the water.”
Coe said numerous initial studies estimate that improving control of the WECs’ generators can dramatically increase energy absorption by as much as 300 percent. Transitioning these simplified studies to more realistic large-scale devices is the challenge at hand.
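The gains available from generator control can be illustrated with the textbook single-degree-of-freedom WEC model, in which the device behaves like a mass-spring-damper and absorbed power peaks when the power take-off (PTO) impedance is the complex conjugate of the device's intrinsic impedance. All parameters below are invented for illustration; this is not Sandia's controller, and in this idealized single-frequency model the gain far exceeds the 300 percent figure, since real controllers face force limits and irregular waves:

```python
# Single-degree-of-freedom wave-energy converter sketch (assumed parameters).
m, k, B = 1.0e5, 2.0e5, 5.0e3   # mass [kg], stiffness [N/m], radiation damping [N·s/m]
F, w = 1.0e4, 0.9               # excitation force amplitude [N], wave frequency [rad/s]

Zi = complex(B, w * m - k / w)  # intrinsic mechanical impedance at frequency w

def absorbed_power(Zpto):
    v = F / (Zi + Zpto)                   # velocity phasor
    return 0.5 * Zpto.real * abs(v) ** 2  # time-averaged power into the PTO

passive = absorbed_power(complex(B, 0.0))   # simple resistive damping
optimal = absorbed_power(Zi.conjugate())    # complex-conjugate (reactive) control
print(f"passive: {passive:.0f} W, optimal: {optimal:.0f} W, "
      f"gain: {optimal / passive:.1f}x")
```

The optimal case reduces to the classic bound P = F²/(8B), which is why control design, not just device size, sets how much power a WEC can absorb.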
To control the dynamics for better, faster results in the wave tank, Coe and Bacelli are using modeling and control methods that have been successful in other industries, such as in the aerospace industry.
“The systems we used have been around for a while, but strangely enough they had never been applied to wave energy converters,” Bacelli said. “So far, we know the techniques we are using are more efficient and cost-effective than existing methods. We are getting more information in a fraction of the time.”
Now that Sandia has completed the first round of analyses in the water, Coe said the goal is to process all the collected data to develop a new, enhanced model that will make sure the next test yields even more valuable results.
“Make no mistake, these are extremely complex machines,” Bacelli said. “They have to be fine-tuned continuously because ocean waves are constantly changing. With this setup at the Navy’s facility, we have a unique opportunity to study the problems and quantify the effects. We want to help the industry by offering solutions to the challenges the wave energy world is facing.”
Sandia explores neural computing to extend Moore’s Law
Computation is stuck in a rut. The integrated circuits that powered the past 50 years of technological revolution are reaching their physical limits.
This predicament has computer scientists scrambling for new ideas: new devices built using novel physics, new ways of organizing units within computers and even algorithms that use new or existing systems more efficiently. To help coordinate new ideas, Sandia National Laboratories helped organize the Institute of Electrical and Electronics Engineers (IEEE) International Conference on Rebooting Computing, held Oct. 17-19.
Researchers from Sandia’s Data-driven and Neural Computing Dept. will present three papers at the conference, highlighting the breadth of potential non-traditional neural computing applications.
“We’re taking a stab at the scope of what neural algorithms can do. We’re not trying to be exhaustive, but rather we’re trying to highlight the kind of application over which algorithms may be impactful,” said Brad Aimone, a computational neuroscientist and co-author of one paper. Historically, neural computing has been seen as approximate and fuzzy, he added; however, Sandia researchers in their papers aim to extend neural algorithms so they incorporate rigor and predictability, which shows they may have a role in high performance scientific computing.
The three papers are entitled “Overcoming the Static Learning Bottleneck — the Need for Adaptive Neural Learning” by Craig Vineyard and Steve Verzi; “Computing with Dynamical Systems” by Fred Rothganger; and “Spiking Network Algorithms for Scientific Computing” by William Severa, Ojas Parekh, Kris Carlson, Conrad James and Aimone.
Troubles and benefits of continuously learning
The brain is continually learning. “While we do learn in school, our learning doesn’t stop when school ends. Instead, our brains are continually adapting through processes, such as synaptic modifications. However, most machine-learning algorithms learn once and are done,” said Vineyard, a computer scientist.
Most machine-learning algorithms have a learning phase and a separate testing and operation phase, which makes retraining slow and expensive. Attempts to develop algorithms that learn continuously are ambitious and challenging, and they run the added risk of the algorithm “learning” something that’s wrong, Vineyard said.
His paper argues for continual learning and suggests the use of game theory — the mathematics of logical decisions, such as when to take out the trash and when to hope your roommate will do it for you — to bring precision to the decision of when an algorithm should learn.
What are dynamical systems anyway?
A dynamical system is an equation that describes how things change with time. A simple dynamical system is a function that describes the movement of a grandfather clock’s pendulum. “The idea behind using dynamical systems for computation is to build a machine such that its dynamics — which has to do with the structure of the machine or the structure of the math — will lead it to the answer based on feeding it the question,” said Rothganger, a computer scientist.
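The pendulum example can be written down directly. A minimal numerical sketch, using semi-implicit Euler integration chosen here purely for simplicity (it is not a method from Rothganger's paper):

```python
import math

# A grandfather-clock pendulum as a dynamical system:
#   d(theta)/dt = omega,  d(omega)/dt = -(g / L) * sin(theta)
g, L = 9.81, 1.0          # gravity [m/s^2], pendulum length [m]
theta, omega = 0.2, 0.0   # initial angle [rad] and angular velocity [rad/s]
dt = 0.001                # time step [s]

for _ in range(10000):    # simulate 10 seconds
    omega += -(g / L) * math.sin(theta) * dt   # update velocity first...
    theta += omega * dt                        # ...then position (semi-implicit)

# For small angles the period is ~2*pi*sqrt(L/g) ≈ 2.0 s, so 10 s is
# close to 5 full swings; the amplitude should still be near 0.2 rad.
print(f"angle after 10 s: {theta:.3f} rad")
```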
Both our brains and, in a way, conventional computers are dynamical systems: They find answers just based on the question and how the computers are constructed, said Rothganger. His paper proposes that if researchers think of a traditional scientific computing problem, matrix decomposition, as a dynamical system, they could solve it rigorously on neuro-inspired systems.
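A classic illustration of a neural dynamical system performing a matrix decomposition is Oja's rule, in which a single neuron's weight vector converges to the principal eigenvector of the input covariance. This is a standard textbook example, not the method in Rothganger's paper:

```python
import random

# Oja's rule: w += eta * y * (x - y * w), with neuron output y = w · x.
# The weight dynamics converge to the top eigenvector of the input
# covariance -- an eigendecomposition done by dynamics, not by explicit
# numerical linear algebra.
random.seed(0)
w = [1.0, 0.0]   # initial weight vector
eta = 0.01       # learning rate

for _ in range(20000):
    a = random.gauss(0, 1)
    # Synthetic data lying mostly along the (1, 1) direction:
    x = [a + 0.1 * random.gauss(0, 1),
         a + 0.1 * random.gauss(0, 1)]
    y = w[0] * x[0] + w[1] * x[1]              # neuron output
    w = [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

print(f"learned direction: ({w[0]:.2f}, {w[1]:.2f})")  # converges toward ~(0.71, 0.71)
```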
“There’s a lot of potential and also a lot of risk in the idea I’m working on,” said Rothganger. If his idea works, “it would provide a point of unification between neural algorithms and traditional numerical algorithms.”
Artisan mathematicians craft spiking network algorithms
The third paper identifies three hand-crafted algorithms that use the careful arrangement of spiking neuron-like nodes to perform precise computations. In the brain, each neuron is connected to many other neurons and uses spikes of electricity to communicate. Severa, a mathematician, and his co-authors took inspiration from these aspects of the brain.
An example of these innovative algorithms is a kind of flow estimation called particle image velocimetry. By taking two pictures of dust motes moving through the air and figuring out how far they moved in the time between photos, researchers can determine the speed of the air and any local eddies. This can be done on a conventional computer using fancy math, but Severa’s method uses the massively parallel nature of neurons to calculate all the possible shifts efficiently, he said.
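The shift search at the heart of that flow estimate is easy to sketch in one dimension: score every candidate displacement by correlating the two frames and keep the best. A spiking-network version would evaluate all the shifts in parallel across neurons; the serial loop below is only meant to make the computation explicit (toy data, not from the paper):

```python
# Two 1-D "photos" of the same dust motes, taken a moment apart.
frame1 = [0, 0, 5, 9, 5, 0, 0, 0, 0, 0]   # motes centered at index 3
frame2 = [0, 0, 0, 0, 0, 5, 9, 5, 0, 0]   # same motes, 3 pixels later

def score(shift):
    # Correlation of frame2 against frame1 displaced by `shift` pixels.
    return sum(frame2[i] * frame1[i - shift]
               for i in range(len(frame2)) if 0 <= i - shift < len(frame1))

best = max(range(-4, 5), key=score)       # exhaustive search over shifts
print(f"estimated displacement: {best} pixels")   # -> 3
```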
“By carefully designing your networks and the properties of your neurons, you can do exact things,” said Severa. “You can push the envelope of what you can expect a neural network to do.”
Whether the future holds neuro-inspired computers in your cellphone that understand phrases like “Show me a cute picture of Fluffy” and “Order my favorite Chinese food,” or if neural computers can also work alongside future quantum computers in solving tough math problems quickly, computing needs to be reinvented, and soon, said Aimone. By bringing together experts in many different disciplines, he said the International Conference on Rebooting Computing aims to nurture new ideas and spur this revolution.
Funding for all the projects was provided by Sandia’s Laboratory Directed Research and Development office. Two projects also were part of the Hardware Acceleration of Adaptive Neural Algorithms (HAANA) Grand Challenge.
The broader rebooting computing effort
Sandia employees are among the organizers of the IEEE’s Rebooting Computing initiative and the conference. Sandia’s Chief Technology Officer Rob Leland will give the conference kickoff talk on the history of innovation in computing. Sandia researchers Erik DeBenedictis and Matt Marinella are members of the conference program committee.
DeBenedictis, Sapan Agarwal, Jeanine Cook and Michael Frank also are presenting four papers on low-energy logic and memory. Christopher DeRose and Tony Lentine are presenting a paper on optical communications.
Learn more: Turning to the brain to reboot computing
Sandia researchers decode metabolic pathway of soil bacterium that thrives on lignin
Abundant, chock full of energy and bound so tightly that the only way to release its energy is through combustion — lignin has frustrated scientists for years. With the help of an unusual soil bacterium, researchers at Sandia National Laboratories believe they now know how to crack open lignin, a breakthrough that could transform the economics of biofuel production.
Lignin is a component of lignocellulosic biomass, the dry plant matter found virtually everywhere. As a biomass source that does not compete with food or feed, lignin is critical to biofuel production. Lignin makes up the fortress-like cell walls of plants to enable water transport against gravity while protecting them from microbial attack and environmental stress. These beneficial traits make lignin hard to break down and even harder to convert into something valuable.
By following the metabolic pathway of an unusual soil bacterium that lives off lignin, Sandia research team members led by principal investigator Seema Singh believe they can develop technologies to break down lignin and extract valuable platform chemicals. High-value chemicals like muconic acid and adipic acid can be derived from these platform chemicals.
“Lignin is an untapped resource,” said Singh. “But as a basis for high-value chemicals, it is of immense value. Those high-value chemicals can be the basis for polyurethane, nylon, and other bioplastics.”
The work is reported in a paper titled “Decoding how a soil bacterium extracts building blocks and metabolic energy from ligninolysis provides road map for lignin valorization,” published Sept. 15 in Proceedings of the National Academy of Sciences. The work is funded by Sandia’s Laboratory Directed Research and Development program.
Chemical production key to biorefinery economics
On their own, biofuels can’t compete with gasoline because their production costs are too high.
But if you add the production of high-value chemicals to the biorefinery business model the economics fall into place — just as with the refinery industry, where crude oil is used to produce high-value chemicals and high-volume polymers used in our daily lives.
“Gasoline is a low-value, high-volume product. This is balanced by the high-value chemicals derived from about 6-10 percent of every barrel of oil,” said Singh.
Lignin is seen as a byproduct of limited use, typically burned for its energy content. Using biomass for chemical production could yield at least 10 times more value, compared to burning it to make electricity.
Living off lignin
For inspiration on how to break down lignin, the researchers looked to nature.
“We know that over a long period of time fungus and bacteria do eventually break down lignin,” explained Singh. “If we can understand this process, we can use what nature already knows for biofuel and chemical production from lignin.”
Since bacteria are easier to engineer for industrial production of desired chemicals, the researchers focused on bacteria. The best candidate was Sphingobium, or SYK-6, found in the lignin-rich waste stream from wood pulp production.
SYK-6 was extremely intriguing because it only feeds on lignin. Microbes generally live off sugar, which is much easier to break down and extract energy from. Imagine a choice between eating a corn kernel or a corn husk.
“In terms of thermodynamics, it doesn’t make sense for this bacteria to go after lignin instead of sugar,” said Singh. “It does not metabolize sugar. So, how does it survive? We knew SYK-6 must have a special mechanism to break down the strong linkages of polymeric lignin.”
Mapping the metabolic pathway
Just as following the money is key to investigating corruption, the researchers set out to follow the carbon to understand how SYK-6 lives off lignin. When the bacterium metabolizes lignin, the carbon ends up, via different pathways, in various metabolites and building blocks. By following the carbon from start to finish through these networks — a method called metabolic flux analysis — the researchers hoped to map the metabolic pathway.
“This was the first time metabolic flux analysis was used to track lignin metabolism in a microbe,” said Singh. “Identifying and locating a labeled source for the carbon substrate that could serve as a realistic surrogate proved very difficult.”
Because of the complexity of metabolic pathways, running the experiments did not yield an immediate answer. Singh describes it as “putting together the pieces of a fascinating puzzle driven by analysis.”
The Sandia team’s paper reports the method used to decipher the metabolic pathway of SYK-6.
Valorizing lignin through chemical production
The next step is to engineer a microbial chassis to harness SYK-6’s metabolic pathway. The trick will be to stop the pathway at the right step to extract a useful product. Platform chemicals, which can be used to derive valuable chemicals like muconic acid and adipic acid, are the goal.
One path forward is to genetically engineer SYK-6 to stop its metabolic process at a point when platform chemicals can be extracted from the lignin. Another path would be to splice the genes responsible for the important desired metabolic process in SYK-6 onto a strong industrial host like E. coli to create a chassis for desired fuels and chemicals. Singh and the other researchers hope to explore both options.
“This understanding casts lignin in a whole new light,” said Singh. “Now that we know how to begin deriving value from lignin, a vast resource opens up. Decoding SYK-6’s metabolic pathway provides a roadmap for lignin valorization.”
Rice physicists probe photon-electron interactions in vacuum cavity experiments
Where light and matter intersect, the world illuminates. Where light and matter interact so strongly that they become one, they illuminate a world of new physics, according to Rice University scientists.
Rice physicists are closing in on a way to create a new condensed matter state in which all the electrons in a material act as one by manipulating them with light and a magnetic field. The effect, made possible by a custom-built, finely tuned cavity for terahertz radiation, shows one of the strongest light-matter coupling phenomena ever observed.
The work by Rice physicist Junichiro Kono and his colleagues is described in Nature Physics. It could help advance technologies like quantum computers and communications by revealing new phenomena to those who study cavity quantum electrodynamics and condensed matter physics, Kono said.
Condensed matter in the general sense is anything solid or liquid, but condensed matter physicists study forms that are much more esoteric, like Bose-Einstein condensates. A Rice team was one of the first to make a Bose-Einstein condensate in 1995 when it prompted atoms to form a gas at ultracold temperatures in which all the atoms lose their individual identities and behave as a single unit.
The Kono team is working toward something similar, but with electrons that are strongly coupled, or “dressed,” with light. Qi Zhang, a former graduate student in Kono’s group and lead author of the paper, designed and constructed an extremely high-quality cavity to contain an ultrathin layer of gallium arsenide, a material they’ve used to study superfluorescence. By tuning the material with a magnetic field to resonate with a certain state of light in the cavity, they prompted the formation of polaritons that act in a collective manner.
“This is a nonlinear optical study of a two-dimensional electronic material,” said Zhang, who based his Ph.D. thesis on the work. “When you use light to probe a material’s electronic structure, you’re usually looking for light absorption or reflection or scattering to see what’s happening in the material. That light is just a weak probe and the process is called linear optics.”
The researchers employed a parameter known as vacuum Rabi splitting to measure the strength of the light-matter coupling. “In more than 99 percent of previous studies of light-matter coupling in cavities, this value is a negligibly small fraction of the photon energy of the light used,” said Xinwei Li, a co-author and graduate student in Kono’s group. “In our study, vacuum Rabi splitting is as large as 10 percent of the photon energy. That puts us in the so-called ultrastrong coupling regime.
“This is an important regime because, eventually, if the vacuum Rabi splitting becomes larger than the photon energy, the matter goes into a new ground state. That means we can induce a phase transition, which is an important element in condensed matter physics,” he said.
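The regimes Li describes are conventionally defined by the ratio of the vacuum Rabi splitting to the photon energy: roughly 10 percent marks the onset of ultrastrong coupling, and a ratio above one marks deep-strong coupling. A sketch of that classification (the threshold values are common literature conventions, not numbers from the paper):

```python
# Classify the cavity-QED coupling regime from the measured vacuum Rabi
# splitting relative to the photon energy. Thresholds follow the
# conventions commonly used in the literature.
def coupling_regime(rabi_splitting, photon_energy):
    ratio = rabi_splitting / photon_energy
    if ratio >= 1.0:
        return "deep-strong"
    if ratio >= 0.1:
        return "ultrastrong"
    return "strong (or weaker)"

# The Rice experiment: splitting ~10% of the photon energy.
print(coupling_regime(0.10, 1.0))    # -> ultrastrong
print(coupling_regime(0.001, 1.0))   # -> strong (or weaker)
```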
Phase transitions are transitions between states of matter, like ice to water to vapor. The specific transition Kono’s team is looking for is the superradiant phase transition in which the polaritons go into an ordered state with macroscopic coherence.
Kono said the amount of terahertz light put into the cavity is very weak. “What we depend on is the vacuum fluctuation. Vacuum, in a classical sense, is an empty space. There’s nothing. But in a quantum sense, a vacuum is full of fluctuating photons, having so-called zero-point energy. These vacuum photons are actually what we are using to resonantly excite electrons in our cavity.
“This general subject is what’s known as cavity quantum electrodynamics (QED),” Kono said. “In cavity QED, the cavity enhances the light so that matter in the cavity resonantly interacts with the vacuum field. What is unique about solid-state cavity QED is that the light typically interacts with this huge number of electrons, which behave like a single gigantic atom.”
He said solid-state cavity QED is also key for applications that involve quantum information processing, like quantum computers. “The light-matter interface is important because that’s where so-called light-matter entanglement occurs. That way, the quantum information of matter can be transferred to light and light can be sent somewhere.
“For improving the utility of cavity QED in quantum information, the stronger the light-matter coupling, the better, and it has to use a scalable, solid-state system instead of atomic or molecular systems,” he said. “That’s what we’ve achieved here.”
The high-quality gallium arsenide materials used in the study were synthesized via molecular beam epitaxy by John Reno of Sandia National Laboratories and John Watson and Michael Manfra of Purdue University, all co-authors of the paper. Wei Pan of Sandia National Laboratories and Rice graduate student Minhan Lou, who participated in sample preparation and in transport and terahertz measurements, are also co-authors.
Experiments at CERN’s Large Hadron Collider generate 15 million gigabytes of data per year. That is a lot of digital data to inscribe on hard drives or beam up to the “cloud.”
Digital data storage degrades and can become obsolete, while old-school books and paper require lots of space. Compared with digital and analog information storage, DNA is more compact and durable and never becomes obsolete. Readable DNA was extracted from the 600,000-year-old remains of a horse found in the Yukon.
Sandia National Laboratories researcher George Bachand was inspired by the recording of all of Shakespeare’s sonnets into 2.5 million base pairs of DNA — about half the genome of the tiny E. coli bacterium. Using this method, the group at the European Bioinformatics Institute could theoretically store 2.2 petabytes of information — 200 times the printed material in the Library of Congress — in one gram of DNA.
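The two figures quoted above invite a quick back-of-envelope check: at a theoretical density of 2.2 petabytes per gram, a year of LHC data (the 15 million gigabytes mentioned earlier) would fit in just a few grams of DNA. A sketch, assuming decimal (SI) units throughout:

```python
# Back-of-envelope arithmetic using the figures quoted in the article:
# CERN's ~15 million GB/year versus a theoretical DNA storage density of
# ~2.2 PB per gram. Decimal (SI) units assumed for GB and PB.

CERN_BYTES_PER_YEAR = 15e6 * 1e9   # 15 million gigabytes
DNA_BYTES_PER_GRAM = 2.2e15        # 2.2 petabytes per gram

grams_needed = CERN_BYTES_PER_YEAR / DNA_BYTES_PER_GRAM
print(f"{grams_needed:.1f} g of DNA per year of LHC data")  # ~6.8 g
```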
Technique for “phase locking” arrays of tiny lasers could lead to terahertz security scanners.
Terahertz radiation — the band of electromagnetic radiation between microwaves and visible light — has promising applications in security and medical diagnostics, but such devices will require the development of compact, low-power, high-quality terahertz lasers.
In this week’s issue of Nature Photonics, researchers at MIT and Sandia National Laboratories describe a new way to build terahertz lasers that could significantly reduce their power consumption and size, while also enabling them to emit tighter beams, a crucial requirement for most practical applications.
The work also represents a fundamentally new approach to laser design, which could have ramifications for visible-light lasers as well.
The researchers’ device is an array of 37 microfabricated lasers on a single chip. Its power requirements are so low because the radiation emitted by all of the lasers is “phase locked,” meaning that the troughs and crests of its waves are perfectly aligned. The device represents a fundamentally new way to phase-lock arrays of lasers.
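Phase locking is what lets an array behave as a single bright source: fields that add in phase give an on-axis intensity scaling as N squared, while randomly phased emitters average only N. The following is a textbook phasor illustration of that scaling, not the paper's model; only the array size (37) comes from the article.

```python
# Illustrative phasor sum: N unit-amplitude emitters added coherently
# (all in phase) versus with random phases. Coherent addition gives an
# on-axis intensity of N**2; random phases give ~N on average.
import cmath
import random

def on_axis_intensity(phases):
    """Intensity of unit-amplitude fields summed as complex phasors."""
    field = sum(cmath.exp(1j * p) for p in phases)
    return abs(field) ** 2

N = 37  # array size of the MIT/Sandia device
locked = on_axis_intensity([0.0] * N)  # all in phase -> N**2 = 1369
random.seed(0)
unlocked = on_axis_intensity([random.uniform(0, 2 * cmath.pi) for _ in range(N)])
print(locked, unlocked)  # locked is 1369; unlocked is typically on the order of N
```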
A group of scientists from Hong Kong University of Science and Technology; the University of California, Santa Barbara; Sandia National Laboratories; and Harvard University was able to fabricate tiny lasers directly on silicon, a huge breakthrough for the semiconductor industry and well beyond.
For more than 30 years, the crystal lattice of silicon and of typical laser materials could not match up, making it impossible to integrate the two materials — until now.
As the group reports in Applied Physics Letters, from AIP Publishing, integrating subwavelength cavities — the essential building blocks of their tiny lasers — onto silicon enabled them to create and demonstrate high-density on-chip light-emitting elements.
To do this, they first had to resolve silicon’s crystal lattice defects to the point where the cavities were essentially equivalent to those grown on lattice-matched gallium arsenide (GaAs) substrates. Nano-patterns created on the silicon confined the defects, making the GaAs-on-silicon template nearly defect-free, and quantum confinement of electrons within quantum dots grown on this template made lasing possible.
The group was then able to use optical pumping, a process in which light, rather than electrical current, “pumps” electrons from a lower energy level in an atom or molecule to a higher level, to show that the devices work as lasers.
“Putting lasers on microprocessors boosts their capabilities and allows them to run at much lower powers, which is a big step toward photonics and electronics integration on the silicon platform,” said professor Kei May Lau, Department of Electronic and Computer Engineering, Hong Kong University of Science and Technology.
By chemically modifying and pulverizing a promising group of compounds, scientists at the National Institute of Standards and Technology (NIST) have potentially brought safer, solid-state rechargeable batteries two steps closer to reality.
These compounds are stable solid materials that would not pose the risks of leaking or catching fire typical of traditional liquid battery ingredients and are made from commonly available substances.
Since discovering their properties in 2014, a team led by NIST scientists has sought to enhance the compounds’ performance further in two key ways: increasing their current-carrying capacity and ensuring that they can operate in a sufficiently wide temperature range to be useful in real-world environments.
Considerable advances have now been made on both fronts, according to Terrence Udovic of the NIST Center for Neutron Research, whose team has published a pair of scientific papers that detail each improvement.
The first advance came when the team found that the original compounds — made primarily of hydrogen, boron and either lithium or sodium — were even better at carrying current with a slight change to their chemical makeup. Replacing one of the boron atoms with carbon improved their ability to conduct charged particles, or ions, which are what carry electricity inside a battery. As the team reported in February in their first paper, the switch made the compounds about 10 times better at conducting.
But perhaps more important was clearing the temperature hurdle. The compounds conducted ions well enough to operate in a battery — as long as it was in an environment typically hotter than boiling water. Unfortunately, there’s not much of a market for such high-temperature batteries, and by the time they cooled to room temperature, the materials’ favorable chemical structure often changed to a less conductive form, decreasing their performance substantially.
One solution turned out to be crushing the compounds’ particles into a fine powder. The team had been exploring particles that are measured in micrometers, but as nanotechnology research has demonstrated time and again, the properties of a material can change dramatically at the nanoscale. The team found that pulverizing the compounds into nanometer-scale particles resulted in materials that could still perform well at room temperature and far below.
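The temperature problem described above can be pictured with a simple two-phase Arrhenius model, where conductivity falls off exponentially with an activation energy that jumps when the material drops into its less conductive low-temperature form. All parameter values below are hypothetical, chosen only to show the shape of the effect; they are not measured NIST data.

```python
# Toy two-phase Arrhenius conductivity model: sigma(T) = sigma0 * exp(-Ea/(kB*T)),
# with a larger activation energy below a hypothetical transition temperature,
# mimicking a material whose favorable conductive phase is lost on cooling.
# All sigma0, Ea, and transition values here are illustrative placeholders.
import math

KB_EV = 8.617e-5  # Boltzmann constant, eV/K

def conductivity(T, transition_K=380.0):
    """Toy conductivity (arbitrary units) at temperature T in kelvin."""
    if T >= transition_K:             # high-T, highly conductive phase (hypothetical)
        sigma0, ea = 1.0e3, 0.2
    else:                             # low-T, poorly conductive phase (hypothetical)
        sigma0, ea = 1.0e3, 0.6
    return sigma0 * math.exp(-ea / (KB_EV * T))

# Conductivity collapses by many orders of magnitude below the transition:
print(conductivity(400.0) / conductivity(300.0))
```

Keeping the high-conductivity phase stable down to room temperature, as the nanoscale particles do, sidesteps exactly this collapse.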
“This approach can remove worries about whether batteries incorporating these types of materials will perform as expected even on the coldest winter day,” said Udovic, whose collaborators on the most recent paper include scientists from Japan’s Tohoku University, the University of Maryland and Sandia National Laboratories. “We are currently exploring their use in next-generation batteries, and in the process we hope to convince people of their great potential.”