Scientists have long believed that graphene may have the innate ability to superconduct. Now Cambridge researchers have found a way to activate that previously dormant potential.
Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor – meaning that it can be made to carry an electrical current with zero resistance.
The finding, reported in Nature Communications, further enhances the potential of graphene, which is already widely seen as a material that could revolutionise industries such as healthcare and electronics. Graphene is a two-dimensional sheet of carbon atoms and combines several remarkable properties; for example, it is very strong, but also light and flexible, and highly conductive.
Since its discovery in 2004, scientists have speculated that graphene may also have the capacity to be a superconductor. Until now, superconductivity in graphene has only been achieved by doping it with, or by placing it on, a superconducting material – a process which can compromise some of its other properties.
But in the new study, researchers at the University of Cambridge managed to activate the dormant potential for graphene to superconduct in its own right. This was achieved by coupling it with a material called praseodymium cerium copper oxide (PCCO).
Superconductors are already used in numerous applications. Because they generate large magnetic fields, they are an essential component of MRI scanners and levitating trains. They could also be used to make energy-efficient power lines and devices capable of storing energy for millions of years.
Superconducting graphene opens up yet more possibilities. The researchers suggest, for example, that graphene could now be used to create new types of superconducting quantum devices for high-speed computing. Intriguingly, it might also be used to prove the existence of a mysterious form of superconductivity known as “p-wave” superconductivity, which academics have been struggling to verify for more than 20 years.
The research was led by Dr Angelo Di Bernardo and Dr Jason Robinson, Fellows at St John’s College, University of Cambridge, alongside collaborators Professor Andrea Ferrari, from the Cambridge Graphene Centre; Professor Oded Millo, from the Hebrew University of Jerusalem, and Professor Jacob Linder, at the Norwegian University of Science and Technology in Trondheim.
“It has long been postulated that, under the right conditions, graphene should undergo a superconducting transition, but can’t,” Robinson said. “The idea of this experiment was, if we couple graphene to a superconductor, can we switch that intrinsic superconductivity on? The question then becomes how do you know that the superconductivity you are seeing is coming from within the graphene itself, and not the underlying superconductor?”
Similar approaches have been taken in previous studies using metallic-based superconductors, but with limited success. “Placing graphene on a metal can dramatically alter the properties so it is technically no longer behaving as we would expect,” Di Bernardo said. “What you see is not graphene’s intrinsic superconductivity, but simply that of the underlying superconductor being passed on.”
PCCO is an oxide from a wider class of superconducting materials called “cuprates”. It also has well-understood electronic properties, and using a technique called scanning tunnelling microscopy, the researchers were able to distinguish the superconductivity in PCCO from the superconductivity observed in graphene.
Superconductivity is characterised by the way the electrons interact: within a superconductor electrons form pairs, and the spin alignment between the electrons of a pair may be different depending on the type – or “symmetry” – of superconductivity involved. In PCCO, for example, the pairs’ spin state is misaligned (antiparallel), in what is known as a “d-wave state”.
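As background (this is standard superconductivity theory, not specific to this study): the Pauli exclusion principle requires the two-electron pair wavefunction to be antisymmetric overall, which is what ties the spin alignment of a pair to its orbital symmetry:

```latex
\Psi_{\text{pair}}(1,2) \;=\; \phi_{\text{orbital}}(\mathbf{r})\,\chi_{\text{spin}},
\qquad
\Psi_{\text{pair}}(1,2) \;=\; -\,\Psi_{\text{pair}}(2,1).
```

An antiparallel (singlet) spin state, $\chi = (\lvert\uparrow\downarrow\rangle - \lvert\downarrow\uparrow\rangle)/\sqrt{2}$, is itself antisymmetric, so it must combine with an even orbital state ($\ell = 0$, “s-wave”, or $\ell = 2$, “d-wave”, as in PCCO). A parallel (triplet) spin state is symmetric, so it must combine with an odd orbital state ($\ell = 1$, i.e. “p-wave”).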
By contrast, when graphene was coupled to superconducting PCCO in the Cambridge-led experiment, the results suggested that the electron pairs within graphene were in a p-wave state. “What we saw in the graphene was, in other words, a very different type of superconductivity than in PCCO,” Robinson said. “This was a really important step because it meant that we knew the superconductivity was not coming from outside it and that the PCCO was therefore only required to unleash the intrinsic superconductivity of graphene.”
It remains unclear what type of superconductivity the team activated, but their results strongly indicate that it is the elusive “p-wave” form. If so, the study could transform the ongoing debate about whether this mysterious type of superconductivity exists, and – if so – what exactly it is.
In 1994, researchers in Japan fabricated a triplet superconductor, using a material called strontium ruthenate (SRO), that may have p-wave symmetry. That symmetry has never been fully verified, however, partly because SRO is a bulky crystal, which makes it challenging to fabricate into the type of devices necessary to test theoretical predictions.
“If p-wave superconductivity is indeed being created in graphene, graphene could be used as a scaffold for the creation and exploration of a whole new spectrum of superconducting devices for fundamental and applied research areas,” Robinson said. “Such experiments would necessarily lead to new science through a better understanding of p-wave superconductivity, and how it behaves in different devices and settings.”
The study also has further implications. For example, it suggests that graphene could be used to make a transistor-like device in a superconducting circuit, and that its superconductivity could be incorporated into molecular electronics. “In principle, given the variety of chemical molecules that can bind to graphene’s surface, this research can result in the development of molecular electronics devices with novel functionalities based on superconducting graphene,” Di Bernardo added.
A computer algorithm for analyzing time-lapse biological images could make it easier for scientists and clinicians to find and track multiple molecules in living organisms. The technique is faster, less expensive and more accurate than current methods — and it even works with cell phone images.
A new image analysis technique makes finding important biological molecules — including tell-tale signs of disease — and learning how they interact in living organisms much faster and far less expensive. Called Hyper-Spectral Phasor analysis, or HySP, it could even be useful for diagnosing and monitoring diseases using cell phone images.
Researchers use fluorescent imaging to locate proteins and other molecules in cells and tissues. It works by tagging the molecules with dyes that glow under certain kinds of light — the same principle behind so-called “black light” images.
Fluorescent imaging can help scientists understand which molecules are produced in large amounts in cancer or other diseases, information that may be useful in diagnosis or in identifying possible targets for therapeutic drugs.
Looking at just one or two molecules in cell or tissue samples is fairly straightforward. Unfortunately, it doesn’t provide a clear picture of how those molecules are behaving in the real world. For that, scientists need to expand their view.
“Biological research is moving toward complex systems that extend across multiple dimensions, the interaction of multiple elements over time,” said postdoctoral fellow Francesco Cutrale, who developed HySP with Scott Fraser, Elizabeth Garrett Chair in Convergent Bioscience and Provost Professor of Biological Science. The work was done at USC’s Translational Imaging Center, a joint venture of USC Dornsife and the USC Viterbi School of Engineering.
“By looking at multiple targets, or watching targets move over time, we can get a much better view of what’s actually happening within complex living systems,” Cutrale said.
Currently, researchers must look at different labels separately, then apply complicated techniques to layer them together and figure out how they relate to one another, a time-consuming and expensive process, Cutrale said. HySP can look at many different molecules in one pass.
“Imagine looking at 18 targets,” Cutrale said. “We can do that all at once, rather than having to perform 18 separate experiments and try to combine them later.”
In addition, the algorithm effectively filters through interference to discern the true signal, even if that signal is extremely weak — very much like finding the proverbial needle in a haystack. Recent technology from NASA’s Jet Propulsion Laboratory can also do this, but the equipment and process are both extremely expensive and time-consuming.
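The core of a spectral phasor analysis such as HySP can be sketched in a few lines: each pixel's emission spectrum is reduced to the two coefficients of its first Fourier harmonic, so spectrally similar signals cluster at the same point in a 2-D "phasor" plane regardless of brightness. This is a minimal illustration of the general phasor idea, not the published HySP code, and the two dye spectra below are synthetic:

```python
import numpy as np

def spectral_phasor(spectrum, harmonic=1):
    """Map an emission spectrum (intensity per wavelength bin) to a
    point (g, s) on the phasor plane: the cosine and sine components
    of its first Fourier harmonic, normalised by total intensity."""
    spectrum = np.asarray(spectrum, dtype=float)
    n = len(spectrum)
    phase = 2 * np.pi * harmonic * np.arange(n) / n
    total = spectrum.sum()
    g = (spectrum * np.cos(phase)).sum() / total
    s = (spectrum * np.sin(phase)).sum() / total
    return g, s

# Two synthetic dyes with overlapping Gaussian emission spectra.
bins = np.arange(32)
dye_a = np.exp(-0.5 * ((bins - 10) / 3.0) ** 2)
dye_b = np.exp(-0.5 * ((bins - 20) / 3.0) ** 2)

print(spectral_phasor(dye_a))  # distinct point for dye A
print(spectral_phasor(dye_b))  # distinct point for dye B
```

Because the phasor of a mixed pixel is an intensity-weighted average of the pure dyes' phasor points, unmixing many overlapping labels reduces to simple geometry in the phasor plane, which is far cheaper than fitting each spectrum separately.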
In research published Jan. 9 online by the scientific journal Nature Methods, Cutrale and Fraser, along with researchers from the Keck School of Medicine, Caltech and the University of Cambridge in the United Kingdom, used zebrafish to test and develop HySP. In this common laboratory model, the system works extremely well. But what about in people?
“In experimental models, we can use genetic manipulation to label molecules, but we can’t do that with people,” said Fraser. “In people, we have to use the intrinsic signals of those molecules.”
Those inherent signals, the natural fluorescence from biomolecules, normally get in the way of imaging, Fraser said. However, using this new computer algorithm, which can effectively find weak signals in a cluttered background, the team can pinpoint their targets in the body.
Different fluorescent light wavelengths reveal features of a zebrafish embryo. Photo courtesy of Francesco Cutrale.
The scientists hope to test the process in the next couple of years with the help of soldiers whose lungs have been damaged by chemicals and irritants they may have encountered in combat. The researchers will extend a light-emitting probe down into the soldiers’ lungs while the probe records images of the fluorescence in the surrounding tissues. They will then use HySP to create what amounts to a fluorescent map and compare it with that of healthy lung tissue to see if they can discern the damage. If so, they hope to further develop the technology so it may one day help these soldiers and other lung patients receive more targeted treatment.
It might also be possible one day for clinicians to use HySP to analyze cell phone pictures of skin lesions to determine if they are at risk of being cancerous, according to Fraser and Cutrale.
“We could determine if the lesions have changed color or shape over time,” Cutrale said. Clinicians could then examine the patient further to be certain of a diagnosis and respond appropriately.
Cutrale and Fraser see the technology as a giant leap forward for both research and medicine.
“Both scientists at the bench and scientists at the clinic will be able to perform their work faster and with greater confidence in the results,” Cutrale said. “Better, faster, cheaper. That’s the payoff here.”
Over a third of new conservation science documents published annually are in non-English languages, despite the assumption that English is the scientific ‘lingua franca’. Researchers find examples of important science missed at the international level, and practitioners struggling to access new knowledge, as a result of language barriers.
English is now considered the common language, or ‘lingua franca’, of global science. All major scientific journals seemingly publish in English, despite the fact that their pages contain research from across the globe.
However, a new study suggests that over a third of new scientific reports are published in languages other than English, which can result in these findings being overlooked – contributing to biases in our understanding.
As well as the international community missing important science, language hinders new findings getting through to practitioners in the field say researchers from the University of Cambridge.
They argue that whenever science is only published in one language, including solely in English, barriers to the transfer of knowledge are created.
The Cambridge researchers call on scientific journals to publish basic summaries of a study’s key findings in multiple languages, and universities and funding bodies to encourage translations as part of their ‘outreach’ evaluation criteria.
“While we recognise the importance of a lingua franca, and the contribution of English to science, the scientific community should not assume that all important information is published in English,” says Dr Tatsuya Amano from Cambridge’s Department of Zoology.
“Language barriers continue to impede the global compilation and application of scientific knowledge.”
The researchers point out an imbalance in knowledge transfer in countries where English is not the mother tongue: “much scientific knowledge that has originated there and elsewhere is available only in English and not in their local languages.”
This is a particular problem in subjects where both local expertise and implementation is vital – such as environmental sciences.
As part of the study, published today in the journal PLOS Biology, those in charge of Spain’s protected natural areas were surveyed. Over half the respondents identified language as an obstacle to using the latest science for habitat management.
The Cambridge team also conducted a litmus test of language use in science. They surveyed the web platform Google Scholar – one of the largest public repositories of scientific documents – in a total of 16 languages for studies relating to biodiversity conservation published during a single year, 2014.
Of the over 75,000 documents, including journal articles, books and theses, some 35.6% were not in English. Of these, the majority were in Spanish (12.6%) or Portuguese (10.3%). Simplified Chinese made up 6%, and 3% were in French.
The researchers also found thousands of newly published conservation science documents in other languages, including several hundred each in Italian, German, Japanese, Korean and Swedish.
Random sampling showed that, on average, only around half of non-English documents also included titles or abstracts in English. This means that around 13,000 documents on conservation science published in 2014 are unsearchable using English keywords.
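The ~13,000 figure follows directly from the survey numbers reported above; a quick back-of-the-envelope check using the article's rounded values:

```python
total_docs = 75_000           # documents surveyed on Google Scholar (2014)
non_english_share = 0.356     # 35.6% were not in English
english_metadata_share = 0.5  # ~half of non-English docs had an English title/abstract

non_english = total_docs * non_english_share              # ~26,700 documents
unsearchable = non_english * (1 - english_metadata_share)  # no English keywords at all
print(round(unsearchable))  # ~13,350, consistent with the "around 13,000" cited
```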
This can result in sweeps of current scientific knowledge – known as ‘systematic reviews’ – being biased towards evidence published in English, say the researchers. This, in turn, may lead to over-representation of results considered positive or ‘statistically significant’, and these are more likely to appear in English language journals deemed ‘high-impact’.
In addition, information on areas specific to countries where English is not the mother tongue can be overlooked when searching only in English.
For environmental science, this means important knowledge relating to local species, habitats and ecosystems – but also applies to diseases and medical sciences. For example, documents reporting the infection of pigs with avian flu in China initially went unnoticed by international communities, including the WHO and the UN, due to publication in Chinese-language journals.
“Scientific knowledge generated in the field by non-native English speakers is inevitably under-represented, particularly in the dominant English-language academic journals. This potentially renders local and indigenous knowledge unavailable in English,” says lead author Amano.
“The real problem of language barriers in science is that few people have tried to solve it. Native English speakers tend to assume that all the important information is available in English. But this is not true, as we show in our study.
“On the other hand, non-native English speakers, like myself, tend to think carrying out research in English is the first priority, often ending up ignoring non-English science and its communication.
“I believe the scientific community needs to start seriously tackling this issue.”
Amano and colleagues say that, when conducting systematic reviews or developing databases at a global scale, speakers of a wide range of languages should be included in the discussion: “at least Spanish, Portuguese, Chinese and French, which, in theory, cover the vast majority of non-English scientific documents.”
The website conservationevidence.com, a repository for conservation science developed at Cambridge by some of the authors, has also established an international panel to extract the best non-English language papers, including Portuguese, Spanish and Chinese.
“Journals, funders, authors and institutions should be encouraged to supply translations of a summary of a scientific publication – regardless of the language it is originally published in,” says Amano. The authors of the new study have provided a summary in Spanish, Portuguese, Chinese and French as well as Japanese.
“While outreach activities have recently been advocated in science, it is rare for such activities to involve communication across language barriers.”
The researchers suggest efforts to translate should be evaluated in a similar way to other outreach activities such as public engagement, particularly if the science covers issues at a global scale or regions where English is not the mother tongue.
Adds Amano: “We should see this as an opportunity as well as a challenge. Overcoming language barriers can help us achieve less biased knowledge and enhance the application of science globally.”
A study published in the journal Oryx finds off-the-shelf drones can be used to guard crops and keep elephants safe along the borders of Tanzanian parks.
A new study finds that low-cost drones have a significant impact in protecting elephants by preventing human-elephant conflict in farmland near Tarangire and Serengeti National Parks in Tanzania. The project, designed by RESOLVE’s Biodiversity and Wildlife Solutions program, in partnership with Tanzanian Wildlife officials and the Mara Elephant Project, works by using the drones to safely shepherd elephants away from farms and communities—where conflict can cause more deaths than poaching.
From April through July, elephants wander out of parks across Tanzania to gorge on maize (corn), watermelon, and sorghum that dot subsistence farm plots. A wild herd can wipe out a maize plot in a single night and leave farmers struggling to feed their families for the rest of the year. Farmers and rangers have to sneak within range of the elephants to throw stones and bang drums to drive them off, or, worse, hurl chili-laced condoms with firecrackers in a futile and often dangerous effort. Angry villagers can also retaliate by provisioning the fields with poisoned fruit or turning a blind eye to poaching gangs targeting the elephants for ivory.
Elephants are not entirely to blame; people are moving into their homelands and traditional movement corridors, planting crops and competing with wildlife for space, water, and food. In certain regions of Africa and across much of the range of the Asiatic elephant, this conflict presents a greater risk to elephants than poaching and has become a high priority for wildlife managers. Now, conservationists may have found an unexpected solution that works in the African bush. Beginning in late 2014, researchers from Biodiversity and Wildlife Solutions, the Tanzanian Wildlife Research Institute (TAWIRI), and the Mara Elephant Project, found that quadcopter unmanned aerial vehicles (UAVs, a.k.a. drones) make elephants flee. This discovery presented a possible new tool to keep elephants out of high-risk areas, but the technique needed more testing to be proclaimed safe for wildlife and people.
In a paper released in the journal Oryx, the research team reported on 51 field trials in farmland bordering Tarangire and Serengeti National Parks. The trials show that rangers using UAVs have been able to consistently move wild elephants out of crops during the day and night. Results from the flights suggest that the UAVs—which currently cost $800 fully equipped—can aid wildlife managers who regularly respond to human-elephant conflict (HEC) in community areas and croplands.
“We’ve stressed the importance of data collection throughout this project. There is sometimes a tendency to overstate the power of new technologies, and we wanted to fairly assess the utility of the drones for moving elephants out of crops and other areas. The results are very positive and show that UAVs can be an effective, flexible way for wildlife managers to deal with human-elephant conflict,” said lead author Nathan Hahn, from Biodiversity and Wildlife Solutions.
Trained ranger teams stationed along the border of these parks have now made over 120 flights in response to calls about elephants on community and farming lands.
“The greater interaction distance the UAVs provide lends a much-needed safety buffer for our rangers, the farmers, and the elephants. Here is a useful piece of technology we didn’t have in our tool kit one year ago,” explained Angela Mwakatobe, head of research management at TAWIRI and co-author on the study.
While some biologists warn that elephants may become habituated to the sound of drones and no longer move from crop fields, rangers have not yet noticed signs of this, even among habitual raiders who have “met” the drones multiple times. Results of this work suggest that small drones offer a new way to reduce negative interactions between people and elephants. The UAVs have also revealed unintended applications. In one instance, rangers used a UAV to move a wounded bull out of dense bush into the open so that a veterinary team could remove a poisoned arrow lodged in his leg.
Loving elephants is easy. Living next to them in harmony requires a little creative engineering to negotiate a peace treaty. “It’s good that we can help the communities,” observed ranger Kateto Ollekashe. “When we can help farmers move the elephants away, we can build relationships and get them on our side. That’s also how we can help stop poaching.”
In the end, scientists and wildlife managers agree that this conflict will not be solved until larger protected areas and safe corridors are established for elephant dispersal; thus, it will not be solved overnight. But at least now there is some hope for peaceful coexistence between farmers and elephants, brokered by a creative use of technology and early adoption by the Tanzanian government.
Photo credit: Lori Price, Biodiversity and Wildlife Solutions, RESOLVE
RenSeq is a method for sequencing Resistance (R) genes, which confer disease resistance in plants.
Each plant typically carries hundreds of potential R gene sequences, encoding NB-LRR proteins, identified by the presence of specific sequence motifs. R genes are often part of families of closely related sequences.
While these shared sequences make it possible to capture R genes, they also make it hard to tell the genes apart and to find the exact gene that enables a plant to survive attack. Longer molecules and sequences of DNA allow easier and more accurate genetic analysis to identify variation.
The NB-LRR gene family enables plants to withstand infection from a suite of diseases, forming a second line of defence. After a pathogen has managed to invade a plant, it deploys ‘effector’ molecules to weaken the plant’s defences – the R gene proteins recognise these effector molecules and signal the plant to activate defence responses, killing cells around the site of infection in an attempt to stop it spreading.
This constant evolutionary arms race between plants and pathogens, whereby the organisms causing disease in plants are mutating to avoid plant defences, causes the plants to evolve through changes in their own genetic makeup. This is where a huge variety of R genes come into play that are highly similar in structure and DNA sequence.
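Candidate NB-LRR genes are flagged by the presence of conserved sequence motifs. As a hypothetical illustration, the motif below (the Walker A / P-loop signature of the NB-ARC domain) is one real example of such a motif, but the function name and the toy sequences are invented; a basic motif scan can be as simple as a regular-expression search:

```python
import re

# Walker A / P-loop motif: G, any 4 residues, then G-K-(S or T).
# Real bait design for RenSeq uses many such conserved signatures;
# this single-motif scan is only a sketch of the idea.
P_LOOP = re.compile(r"G.{4}GK[ST]")

def is_r_gene_candidate(protein_seq):
    """Flag a protein sequence containing the P-loop motif."""
    return bool(P_LOOP.search(protein_seq))

print(is_r_gene_candidate("MAEVLGVSGSGKTTLA"))  # motif present -> True
print(is_r_gene_candidate("MKTAYIAKQR"))        # no motif -> False
```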
Researchers at the Earlham Institute (EI), The Sainsbury Laboratory (TSL) and the James Hutton Institute, have found a new way to decipher these large stretches of DNA to discover and annotate pathogen resistance in plants.
Using PacBio sequencing, which can read longer stretches of DNA in their entirety, together with the NB-LRR gene workflow ‘RenSeq’ (Resistance gene enrichment sequencing), the researchers captured not only the R genes themselves but also the important regulatory regions of DNA – the promoters and terminators that signal when to start making a protein and when to stop.
Dr Matt Clark, Head of Technology Development at EI and lead author of the study, said: “Wild relatives of crops contain a huge repertoire of novel genes that could be used to breed more resistant varieties that need less pesticide treatments. When it comes to identifying key genes it can be very difficult for researchers to find the exact resistance gene due to the sheer similarity of their DNA sequences.
“Typical sequencing methods use short reads, e.g. from the Illumina HiSeq, but these are often too short to prise similar genes apart.
“RenSeq diverges from normal DNA sequencing on the PacBio by focussing the sequencing effort on a specific gene family, i.e. R genes. In this study, by optimising multiple steps in the library construction, we can identify the protein-coding sequences and the neighbouring regulatory regions; indeed in many cases, we can reconstruct the entire DNA region even if it contains many similar genes which normally are too hard to tell apart. This means we can identify the exact gene that confers resistance to a certain infection, which can then be used in breeding programmes.”
Professor Jonathan Jones, Senior Scientist at TSL and co-author, said: “This improvement to the RenSeq method will greatly facilitate building reliable inventories of R genes in multiple plant species, helping us clone additional genes that could protect our crops.”
Dr Ingo Hein, Principal Investigator at the James Hutton Institute and co-author, added: “R genes can control diverse plant diseases including major threats to global crop production. The ability to capture and sequence long genomic DNA fragments that contain full-length R genes enables the rapid identification of novel, functional resistance genes from wild species. These genes, if introgressed into new cultivars via breeding or alternative routes, could significantly reduce the dependency on pesticides for crop production.”
Researchers have discovered a way to remove specific fears from the brain, using a combination of artificial intelligence and brain scanning technology.
Their technique, published in the inaugural edition of Nature Human Behaviour, could lead to a new way of treating patients with conditions such as post-traumatic stress disorder (PTSD) and phobias.
The challenge then was to find a way to reduce or remove the fear memory, without ever consciously evoking it
Fear-related disorders affect around one in 14 people and place considerable pressure on mental health services. Currently, a common approach is for patients to undergo some form of aversion therapy, in which they confront their fear by being exposed to it in the hope they will learn that the thing they fear isn’t harmful after all. However, this therapy is inherently unpleasant, and many choose not to pursue it. Now a team of neuroscientists from the University of Cambridge, Japan and the USA has found a way of unconsciously removing a fear memory from the brain.
The team developed a method to read and identify a fear memory using a new technique called ‘Decoded Neurofeedback’. The technique used brain scanning to monitor activity in the brain, and identify complex patterns of activity that resembled a specific fear memory. In the experiment, a fear memory was created in 17 healthy volunteers by administering a brief electric shock when they saw a certain computer image. When the pattern was detected, the researchers over-wrote the fear memory by giving their experimental subjects a reward.
Dr Ben Seymour, of the University of Cambridge’s Engineering Department, was one of the authors on the study. He explained the process:
“The way information is represented in the brain is very complicated, but the use of artificial intelligence (AI) image recognition methods now allows us to identify aspects of the content of that information. When we induced a mild fear memory in the brain, we were able to develop a fast and accurate method of reading it by using AI algorithms. The challenge then was to find a way to reduce or remove the fear memory, without ever consciously evoking it.
“We realised that even when the volunteers were simply resting, we could see brief moments when the pattern of fluctuating brain activity had partial features of the specific fear memory, even though the volunteers weren’t consciously aware of it. Because we could decode these brain patterns quickly, we decided to give subjects a reward – a small amount of money – every time we picked up these features of the memory.”
The team repeated the procedure over three days. Volunteers were told that the monetary reward they earned depended on their brain activity, but they didn’t know how. By continuously connecting subtle patterns of brain activity linked to the electric shock with a small reward, the scientists hoped to gradually and unconsciously override the fear memory.
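The closed loop described above can be sketched in code. This is not the authors' implementation: the decoder, the threshold, and the reward amount are all hypothetical stand-ins, here using a simple cosine-similarity score in place of the real AI decoder trained on fMRI data.

```python
# Illustrative sketch of the Decoded Neurofeedback loop (all numbers and the
# decoder are hypothetical): score each brain-activity sample for resemblance
# to the fear-memory pattern, and pay a small reward whenever the score
# crosses a threshold.


def decode_fear_likelihood(activity, fear_pattern):
    """Crude stand-in for the AI decoder: cosine similarity to the target pattern."""
    dot = sum(a * f for a, f in zip(activity, fear_pattern))
    norm_a = sum(a * a for a in activity) ** 0.5
    norm_f = sum(f * f for f in fear_pattern) ** 0.5
    return dot / (norm_a * norm_f)


def neurofeedback_session(samples, fear_pattern, threshold=0.8, reward_per_hit=0.1):
    """Total monetary reward earned across a session of resting-state samples."""
    total = 0.0
    for activity in samples:
        if decode_fear_likelihood(activity, fear_pattern) >= threshold:
            total += reward_per_hit  # pay out when the fear pattern partially reappears
    return round(total, 2)


fear_pattern = [1.0, 0.0, 1.0, 0.0]
resting_samples = [
    [0.9, 0.1, 1.1, 0.0],  # strongly resembles the fear pattern -> reward
    [0.0, 1.0, 0.0, 1.0],  # unrelated activity -> no reward
    [1.0, 0.2, 0.8, 0.1],  # partial match above threshold -> reward
]
print(neurofeedback_session(resting_samples, fear_pattern))  # 0.2
```

The key property of the real procedure is preserved here: the subject never sees the feared stimulus; the reward is triggered purely by spontaneous reappearances of the decoded pattern.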
Dr Ai Koizumi, of the Advanced Telecommunications Research Institute International, Kyoto and Centre of Information and Neural Networks, Osaka, led the research:
“In effect, the features of the memory that were previously tuned to predict the painful shock were now being re-programmed to predict something positive instead.”
The team then tested what happened when they showed the volunteers the pictures previously associated with the shocks.
“Remarkably, we could no longer see the typical fear skin-sweating response. Nor could we identify enhanced activity in the amygdala – the brain’s fear centre,” she continued. “This meant that we’d been able to reduce the fear memory without the volunteers ever consciously experiencing the fear memory in the process.”
Although the sample size in this initial study was relatively small, the team hopes the technique can be developed into a clinical treatment for patients with PTSD or phobias.
“To apply this to patients, we need to build a library of the brain information codes for the various things that people might have a pathological fear of, say, spiders,” adds Dr Seymour. “Then, in principle, patients could have regular sessions of Decoded Neurofeedback to gradually remove the fear response these memories trigger.”
Such a treatment could have major benefits over traditional drug-based approaches: patients could avoid the stress of confronting their fears through exposure therapy, as well as the side-effects associated with medication.
Launched in July this year, Pokémon Go has become a global phenomenon, reaching 500 million downloads within two months of release.
The augmented reality game, designed for mobile devices, allows users to capture, battle and train virtual creatures called Pokémon that appear on screen as if part of the real-world environment.
But can the game’s enormous success deliver any lessons to the fields of natural history and conservation?
A new paper by a group of researchers from the universities of Oxford and Cambridge, UNEP World Conservation Monitoring Centre, and University College London (UCL) explores whether Pokémon Go’s success in getting people out of their homes and interacting with virtual ‘animals’ could be replicated to redress what is often perceived as a decline in interest in the natural world among the general public.
Or, could the game’s popularity pose more problems than opportunities for conservation?
Study author Leejiah Dorward, a doctoral candidate in Oxford University’s Department of Zoology, said: ‘When Pokémon Go first came out, one of the most striking things was its similarity with many of the concepts seen in natural history and conservation. The basic facts and information about Pokémon Go make it sound like an incredibly successful citizen science project, rather than a smartphone game.
‘We wanted to explore how the success of Pokémon Go might create opportunities or challenges for the conservation movement.’
Co-author John C Mittermeier, a doctoral candidate in Oxford’s School of Geography and the Environment, said: ‘There is a widespread belief that interest in natural history is waning and that people are less interested in spending time outside and exploring the natural world.
‘Pokémon Go is only one step removed from natural history activities like bird watching or insect collecting: Pokémon exist as “real” creatures that can be spotted and collected, and the game itself has been getting people outdoors. What’s going on here, and can we as conservationists take advantage of it?’
In the paper, the researchers explain that Pokémon Go has been shown to inspire high levels of behavioural change among its users, with people making significant adjustments to their daily routines and to the amount of time spent outside in order to increase their chances of encountering target ‘species’. There is also evidence that users are discovering non-virtual wildlife while playing Pokémon Go, leading to the Twitter hashtag #Pokeblitz that helps people identify ‘real’ species found and photographed during play.
Pokémon Go, the researchers write, exposes users first hand to basic natural history concepts such as species’ habitat preferences and variations in abundance. ‘Grass Pokémon’, for example, tend to appear in parks, while water-related types are more likely to be found close to bodies of water. There are also four regional species, each restricted to a single continent: Tauros to the Americas, Mr Mime to Western Europe, Farfetch’d to Asia, and the marsupial-like Kangaskhan to Australasia. This differentiation captures a fundamental aspect of natural history observation – that exploring new habitats and continents will lead to encounters with different species.
And hundreds of people congregated near New York’s Central Park one night over the summer to try to find a rare Vaporeon – something that will sound familiar to birdwatchers used to similar gatherings to see a rare species.
The authors write: ‘The spectacular success of Pokémon Go provides significant lessons for conservation. Importantly, it suggests that conservation is continuing to lag behind Pokémon in efforts to inspire interest in its portfolio of species.
‘There is clear potential to modify Pokémon Go itself to increase conservation content and impact above and beyond simply bringing gamers into closer physical proximity to non-human wildlife as a by-product of the game. Pokémon Go could be adapted to enhance conservation benefits by: a) making Pokémon biology and ecology more realistic; b) adding real species to the Pokémon Go universe to introduce those species to a huge number of users, and creating opportunities to raise awareness about them; c) deliberately placing Pokémon in more remote natural settings rather than urban areas to draw people to experience non-urban nature; or d) adding a mechanism for users to catalogue real species, building on the popularity of the “Pokeblitz” concept.
‘Less directly, lessons from Pokémon Go could be applied to conservation through the development of new conservation-focused augmented reality (AR) games. Following the model of Pokémon Go, games that encourage users to look for real species could provide a powerful tool for education and engagement. AR could also be used in zoos and protected areas to provide visitors with information about species and their habitats.’
However, the researchers caution that the success of Pokémon Go could also bring challenges: for example, it may be that this type of augmented reality – featuring engaging, brightly coloured fictional creatures – could replace people’s desire to interact with real-world nature, or the focus on catching and battling Pokémon may encourage exploitation of wildlife. There has also been controversy in the Netherlands, where Pokémon Go players have been blamed for damage caused to a protected dune system south of The Hague.
Co-author Dr Chris Sandbrook, a senior lecturer at UNEP World Conservation Monitoring Centre, said: ‘Just getting people outside does not guarantee a conservation success from Pokémon Go. It might actually make things worse – for example, if interest in finding a rare Vaporeon replaces concern for real species threatened with extinction. Real nature could be seen as just a mundane backdrop for more exciting virtual wildlife.’
Leejiah Dorward added: ‘One of the positive things about Pokémon Go is that there’s a very low barrier for entry. As long as you have a smartphone, you can play – and the game itself does a lot of things for you. Finding ways to break down barriers to engagement with real-life nature is a priority for conservation. Pokémon are also relatable “characters”, whereas modern conservation tends to frame itself purely in scientific terms, which may be off-putting to many.
‘There is something called the biophilia hypothesis, which suggests that people have an in-built affinity with nature and a desire to explore the natural world. If that’s one of the reasons Pokémon Go has proved to be so popular – because it’s a natural history proxy – then that could be a huge boost to conservation. It’s possible that the desire to connect with nature is there and to get people to engage with conservation we just need to “sell” it correctly.’
Using the strange properties of tiny particles of gold, researchers have concentrated light down smaller than a single atom, letting them look at individual chemical bonds inside molecules, and opening up new ways to study light and matter.
Single gold atoms behave just like tiny metallic ball bearings in our experiments, with conducting electrons roaming around, which is very different from their quantum life.
For centuries, scientists believed that light, like all waves, couldn’t be focused down smaller than its wavelength, just under a millionth of a metre. Now, researchers led by the University of Cambridge have created the world’s smallest magnifying glass, which focuses light a billion times more tightly, down to the scale of single atoms.
In collaboration with European colleagues, the team used highly conductive gold nanoparticles to make the world’s tiniest optical cavity, so small that only a single molecule can fit within it. The cavity – called a ‘pico-cavity’ by the researchers – consists of a bump in a gold nanostructure the size of a single atom, and confines light to less than a billionth of a metre. The results, reported in the journal Science, open up new ways to study the interaction of light and matter, including the possibility of making the molecules in the cavity undergo new sorts of chemical reactions, which could enable the development of entirely new types of sensors.
According to the researchers, building nanostructures with single atom control was extremely challenging. “We had to cool our samples to -260°C in order to freeze the scurrying gold atoms,” said Felix Benz, lead author of the study. The researchers shone laser light on the sample to build the pico-cavities, allowing them to watch single atom movement in real time.
“Our models suggested that individual atoms sticking out might act as tiny lightning rods, but focusing light instead of electricity,” said Professor Javier Aizpurua from the Center for Materials Physics in San Sebastian in Spain, who led the theoretical section of this work.
“Even single gold atoms behave just like tiny metallic ball bearings in our experiments, with conducting electrons roaming around, which is very different from their quantum life where electrons are bound to their nucleus,” said Professor Jeremy Baumberg of the NanoPhotonics Centre at Cambridge’s Cavendish Laboratory, who led the research.
The findings have the potential to open a whole new field of light-catalysed chemical reactions, allowing complex molecules to be built from smaller components. Additionally, there is the possibility of new opto-mechanical data storage devices, allowing information to be written and read by light and stored in the form of molecular vibrations.
A photoreceptor molecule in plant cells has been found to moonlight as a thermometer after dark – allowing plants to read seasonal temperature changes. Scientists say the discovery could help breed crops that are more resilient to the temperatures expected to result from climate change.
Discovering the molecules that allow plants to sense temperature has the potential to accelerate the breeding of crops resilient to thermal stress and climate change
An international team of scientists led by the University of Cambridge has discovered the ‘thermometer’ molecule that enables plants to develop according to seasonal temperature changes.
Researchers have revealed that molecules called phytochromes – used by plants to detect light during the day – actually change their function in darkness to become cellular temperature gauges that measure the heat of the night.
The new findings, published today in the journal Science, show that phytochromes control genetic switches in response to temperature as well as light to dictate plant development.
At night, these molecules change states, and the pace at which they change is “directly proportional to temperature” say scientists, who compare phytochromes to mercury in a thermometer. The warmer it is, the faster the molecular change – stimulating plant growth.
Farmers and gardeners have known for hundreds of years how responsive plants are to temperature: warm winters cause many trees and flowers to bud early, something humans have long used to predict weather and harvest times for the coming year.
The latest research pinpoints for the first time a molecular mechanism in plants that reacts to temperature – often triggering the buds of spring we long to see at the end of winter.
With weather and temperatures set to become ever more unpredictable due to climate change, researchers say the discovery that this light-sensing molecule moonlights as the internal thermometer in plant cells could help us breed tougher crops.
“It is estimated that agricultural yields will need to double by 2050, but climate change is a major threat to such targets. Key crops such as wheat and rice are sensitive to high temperatures. Thermal stress reduces crop yields by around 10% for every one degree increase in temperature,” says lead researcher Dr Philip Wigge from Cambridge’s Sainsbury Laboratory.
“Discovering the molecules that allow plants to sense temperature has the potential to accelerate the breeding of crops resilient to thermal stress and climate change.”
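The ~10% yield loss per degree quoted above is easy to turn into a projection, though whether losses add or compound per degree is a modelling choice the article doesn't specify. The sketch below shows both variants with illustrative numbers.

```python
# Back-of-envelope arithmetic on the figure quoted above: ~10% yield loss per
# degree of warming. The base yield and the compounding assumption are
# illustrative choices, not values from the study.


def yield_after_warming(base_yield, degrees, loss_per_degree=0.10, compound=True):
    """Projected yield after a given temperature rise, in the same units as base_yield."""
    if compound:
        # Each additional degree removes 10% of what remains.
        return base_yield * (1 - loss_per_degree) ** degrees
    # Each degree removes 10% of the original yield.
    return base_yield * max(0.0, 1 - loss_per_degree * degrees)


print(round(yield_after_warming(100.0, 2), 1))                  # compounding: 81.0
print(round(yield_after_warming(100.0, 2, compound=False), 1))  # linear: 80.0
```

Either way, a two-degree rise wipes out roughly a fifth of the harvest, which is the scale of loss motivating the breeding targets discussed here.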
In their active state, phytochrome molecules bind themselves to DNA to restrict plant growth. During the day, sunlight activates the molecules, slowing down growth.
If a plant finds itself in shade, phytochromes are quickly inactivated – enabling it to grow faster to find sunlight again. This is how plants compete to escape each other’s shade. “Light driven changes to phytochrome activity occur very fast, in less than a second,” says Wigge.
At night, however, it’s a different story. Instead of a rapid deactivation following sundown, the molecules gradually change from their active to inactive state. This is called “dark reversion”.
“Just as mercury rises in a thermometer, the rate at which phytochromes revert to their inactive state during the night is a direct measure of temperature,” says Wigge.
“The lower the temperature, the slower phytochromes revert to inactivity, so the molecules spend more time in their active, growth-suppressing state. This is why plants are slower to grow in winter.
“Warm temperatures accelerate dark reversion, so that phytochromes rapidly reach an inactive state and detach themselves from DNA – allowing genes to be expressed and plant growth to resume.”
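The thermometer mechanism described above can be captured in a minimal model: the active phytochrome fraction decays overnight, with a decay rate that rises with temperature. The Arrhenius-style rate law and all constants below are illustrative assumptions, not measured values from the paper.

```python
import math

# Hedged sketch of "dark reversion" as a thermometer: active phytochrome
# decays overnight at a temperature-dependent rate. The rate law and the
# constants are illustrative, not measured values.


def reversion_rate(temp_c, a=5e4, ea_over_r=4000.0):
    """Dark-reversion rate (per hour), modelled with a simple Arrhenius form."""
    temp_k = temp_c + 273.15
    return a * math.exp(-ea_over_r / temp_k)


def active_fraction_after_night(temp_c, hours=8.0):
    """Fraction of phytochrome still in its active, growth-suppressing state at dawn."""
    return math.exp(-reversion_rate(temp_c) * hours)


cold = active_fraction_after_night(4.0)   # a cold winter night
warm = active_fraction_after_night(22.0)  # a warm summer night
# Colder nights leave more phytochrome active, so growth stays suppressed.
print(cold > warm)  # True
```

This reproduces the qualitative behaviour in the quotes: warm nights drive the active fraction down quickly, releasing growth genes, while cold nights leave the growth-suppressing state largely intact by dawn.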
Wigge believes phytochrome thermo-sensing evolved later than light sensing, co-opting the existing light-response network to act as a thermometer during the downtime of night.
Some plants mainly use day-length as an indicator of the season. Other species, such as daffodils, have considerable temperature sensitivity, and can flower months in advance during a warm winter.
In fact, the discovery of the dual role of phytochromes provides the science behind a well-known rhyme long used to predict the coming season: Oak before Ash we’ll have a splash, Ash before Oak we’re in for a soak.
Wigge explains: “Oak trees rely much more on temperature, likely using phytochromes as thermometers to dictate development, whereas Ash trees rely on measuring day length to determine their seasonal timing.
“A warmer spring, and consequently a higher likelihood of a hot summer, will result in Oak leafing before Ash. A cold spring will see the opposite. As the British know only too well, a colder summer is likely to be a rain-soaked one.”
The new findings are the culmination of twelve years of research involving scientists from Germany, Argentina and the US, as well as the Cambridge team. The work was done in a model system, a mustard plant called Arabidopsis, but Wigge says the phytochrome genes necessary for temperature sensing are found in crop plants as well.
“Recent advances in plant genetics now mean that scientists are able to rapidly identify the genes controlling these processes in crop plants, and even alter their activity using precise molecular ‘scalpels’,” adds Wigge.
“Cambridge is uniquely well-positioned to do this kind of research as we have outstanding collaborators nearby who work on more applied aspects of plant biology, and can help us transfer this new knowledge into the field.”
A new prototype of a lithium-sulphur battery – which could have five times the energy density of a typical lithium-ion battery – overcomes one of the key hurdles preventing their commercial development by mimicking the structure of the cells which allow us to absorb nutrients.
This gets us a long way through the bottleneck which is preventing the development of better batteries.
Researchers have developed a prototype of a next-generation lithium-sulphur battery which takes its inspiration in part from the cells lining the human intestine. The batteries, if commercially developed, would have five times the energy density of the lithium-ion batteries used in smartphones and other electronics.
The new design, by researchers from the University of Cambridge, overcomes one of the key technical problems hindering the commercial development of lithium-sulphur batteries, by preventing the degradation of the battery caused by the loss of material within it. The results are reported in the journal Advanced Functional Materials.
Working with collaborators at the Beijing Institute of Technology, the Cambridge researchers based in Dr Vasant Kumar’s team in the Department of Materials Science and Metallurgy developed and tested a lightweight nanostructured material which resembles villi, the finger-like protrusions which line the small intestine. In the human body, villi are used to absorb the products of digestion and increase the surface area over which this process can take place.
In the new lithium-sulphur battery, a layer of material with a villi-like structure, made from tiny zinc oxide wires, is placed on the surface of one of the battery’s electrodes. This can trap fragments of the active material when they break off, keeping them electrochemically accessible and allowing the material to be reused.
“It’s a tiny thing, this layer, but it’s important,” said study co-author Dr Paul Coxon from Cambridge’s Department of Materials Science and Metallurgy. “This gets us a long way through the bottleneck which is preventing the development of better batteries.”
A typical lithium-ion battery is made of three separate components: an anode (negative electrode), a cathode (positive electrode) and an electrolyte in the middle. The most common materials for the anode and cathode are graphite and lithium cobalt oxide respectively, which both have layered structures. Positively-charged lithium ions move back and forth from the cathode, through the electrolyte and into the anode.
The crystal structure of the electrode materials determines how much energy can be squeezed into the battery. For example, due to the atomic structure of graphite, it takes six carbon atoms to hold a single lithium ion, limiting the maximum capacity of the battery.
Sulphur and lithium react via a multi-electron transfer mechanism, meaning that elemental sulphur can offer a much higher theoretical capacity, resulting in a lithium-sulphur battery with much higher energy density. However, when the battery discharges, the lithium and sulphur interact and the ring-like sulphur molecules transform into chain-like structures, known as poly-sulphides. As the battery undergoes several charge-discharge cycles, bits of the poly-sulphide can escape into the electrolyte, so that over time the battery gradually loses active material.
The Cambridge researchers have created a functional layer which lies on top of the cathode and fixes the active material to a conductive framework so the active material can be reused. The layer is made up of tiny, one-dimensional zinc oxide nanowires grown on a scaffold. The concept was trialled using commercially-available nickel foam for support. After successful results, the foam was replaced by a lightweight carbon fibre mat to reduce the battery’s overall weight.
“Changing from stiff nickel foam to flexible carbon fibre mat makes the layer mimic the way the small intestine works even further,” said study co-author Dr Yingjun Liu.
This functional layer, like the intestinal villi it resembles, has a very high surface area. The material has a very strong chemical bond with the poly-sulphides, allowing the active material to be used for longer, greatly increasing the lifespan of the battery.
“This is the first time a chemically functional layer with a well-organised nano-architecture has been proposed to trap and reuse the dissolved active materials during battery charging and discharging,” said the study’s lead author Teng Zhao, a PhD student from the Department of Materials Science & Metallurgy. “By taking our inspiration from the natural world, we were able to come up with a solution that we hope will accelerate the development of next-generation batteries.”
For the time being, the device is a proof of principle, so commercially-available lithium-sulphur batteries are still some years away. Additionally, while the number of times the battery can be charged and discharged has been improved, it is still not able to go through as many charge cycles as a lithium-ion battery. However, since a lithium-sulphur battery does not need to be charged as often as a lithium-ion battery, it may be the case that the increase in energy density cancels out the lower total number of charge-discharge cycles.
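The trade-off in the previous paragraph is a simple product: total lifetime energy is capacity times cycle count. The sketch below uses illustrative numbers (not figures from the study) to show how a five-fold energy density can offset a much lower cycle life.

```python
# Back-of-envelope comparison implied above: a lithium-sulphur cell with ~5x
# the energy density needs recharging ~5x less often, so a lower cycle count
# can still deliver comparable total lifetime energy. Numbers are illustrative.


def lifetime_energy(capacity_wh, cycles):
    """Total energy delivered over the battery's cycle life, in watt-hours."""
    return capacity_wh * cycles


li_ion = lifetime_energy(capacity_wh=10.0, cycles=1000)  # a typical Li-ion cell
li_s = lifetime_energy(capacity_wh=50.0, cycles=200)     # hypothetical Li-S: 5x density, 1/5 the cycles

print(li_ion == li_s)  # True: equal lifetime energy despite fewer cycles
```

On these assumed numbers the two chemistries break even, which is why the article suggests the energy-density gain may cancel out the lower cycle count.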
“This is a way of getting around one of those awkward little problems that affects all of us,” said Coxon. “We’re all tied in to our electronic devices – ultimately, we’re just trying to make those devices work better, hopefully making our lives a little bit nicer.”
A new design for transistors which operate on ‘scavenged’ energy from their environment could form the basis for devices which function for months or years without a battery, and could be used for wearable or implantable electronics.
If we were to draw energy from a typical AA battery based on this design, it would last for a billion years.
A newly-developed form of transistor opens up a range of new electronic applications including wearable or implantable devices by drastically reducing the amount of power used. Devices based on this type of ultralow power transistor, developed by engineers at the University of Cambridge, could function for months or even years without a battery by ‘scavenging’ energy from their environment.
Using a similar principle to a computer in sleep mode, the new transistor harnesses a tiny ‘leakage’ of electrical current, known as a near-off-state current, for its operations. This leak, like water dripping from a faulty tap, is a characteristic of all transistors, but this is the first time that it has been effectively captured and used functionally. The results, reported in the journal Science, open up new avenues for system design for the Internet of Things, in which most of the things we interact with every day are connected to the Internet.
The transistors can be produced at low temperatures and can be printed on almost any material, from glass and plastic to polyester and paper. They are based on a unique geometry which uses a ‘non-desirable’ characteristic, namely the point of contact between the metal and semiconducting components of a transistor, a so-called ‘Schottky barrier.’
“We’re challenging conventional perception of how a transistor should be,” said Professor Arokia Nathan of Cambridge’s Department of Engineering, the paper’s co-author. “We’ve found that these Schottky barriers, which most engineers try to avoid, actually have the ideal characteristics for the type of ultralow power applications we’re looking at, such as wearable or implantable electronics for health monitoring.”
The new design gets around one of the main issues preventing the development of ultralow power transistors, namely the ability to produce them at very small sizes. As transistors get smaller, their two electrodes start to influence the behaviour of one another, and the voltages spread, meaning that below a certain size, transistors fail to function as desired. By changing the design of the transistors, the Cambridge researchers were able to use the Schottky barriers to keep the electrodes independent from one another, so that the transistors can be scaled down to very small geometries.
The design also achieves a very high level of gain, or signal amplification. The transistor’s operating voltage is less than a volt, with power consumption below a billionth of a watt. This ultralow power consumption makes them most suitable for applications where function is more important than speed, which is the essence of the Internet of Things.
“If we were to draw energy from a typical AA battery based on this design, it would last for a billion years,” said Dr Sungsik Lee, the paper’s first author, also from the Department of Engineering. “Using the Schottky barrier allows us to keep the electrodes from interfering with each other in order to amplify the amplitude of the signal even at the state where the transistor is almost switched off.”
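The battery-lifetime claim above is a statement about average power draw, and the arithmetic is worth making explicit. The AA capacity below is an assumed typical figure, not one given in the paper; the calculation simply shows what average consumption a billion-year lifetime corresponds to.

```python
# Illustrative arithmetic for the quote above: what average power drain would
# let a typical AA battery last a billion years? The ~2.5 Wh capacity is an
# assumed typical figure, not a value from the paper.

SECONDS_PER_YEAR = 365.25 * 24 * 3600


def average_power_for_lifetime(energy_wh, years):
    """Average power draw (watts) that empties the given energy over the given lifetime."""
    energy_joules = energy_wh * 3600.0  # 1 Wh = 3600 J
    return energy_joules / (years * SECONDS_PER_YEAR)


power_w = average_power_for_lifetime(energy_wh=2.5, years=1e9)
print(power_w < 1e-12)  # True: well below a picowatt of average draw
```

A billion-year lifetime therefore corresponds to sub-picowatt average consumption, consistent with a device that spends essentially all of its time in the near-off state being harnessed here.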
“This is an ingenious transistor concept,” said Professor Gehan Amaratunga, Head of the Electronics, Power and Energy Conversion Group at Cambridge’s Engineering Department. “This type of ultra-low power operation is a pre-requisite for many of the new ubiquitous electronics applications, where what matters is function – in essence ‘intelligence’ – without the demand for speed. In such applications the possibility of having totally autonomous electronics now becomes a possibility. The system can rely on harvesting background energy from the environment for very long term operation, which is akin to organisms such as bacteria in biology.”