Researchers at Columbia, Princeton, and Harvard universities have developed a new approach to analyzing big data that can drastically improve the ability to make accurate predictions about medicine, complex diseases, social science phenomena, and other issues.
In a study published in the December 13 issue of Proceedings of the National Academy of Sciences (PNAS), the authors introduce the influence score, or “I-score,” a statistic that tracks how much a set of variables can inherently predict, its “predictivity,” and that can consequently be used to identify highly predictive variables.
“In our last paper, we showed that significant variables may not necessarily be predictive, and that good predictors may not appear statistically significant,” said principal investigator Shaw-Hwa Lo, a professor of statistics at Columbia University. “This left us with an important question: how can we find highly predictive variables then, if not through a guideline of statistical significance? In this article, we provide a theoretical framework from which to design good measures of prediction in general. Importantly, we introduce a variable set’s predictivity as a new parameter of interest to estimate, and provide the I-score as a candidate statistic to estimate variable set predictivity.”
Current approaches to prediction generally rely on either a significance-based criterion for choosing which variables to include in a model, or on evaluating variables and models together using cross-validation or independent test data.
“Using the I-score prediction framework allows us to define a novel measure of predictivity based on observed data, which in turn enables assessing variable sets for, preferably high, predictivity,” Lo said, adding that, while intuitively obvious, not enough attention has been paid to the consideration of predictivity as a parameter of interest to estimate. Motivated by the needs of current genome-wide association studies (GWAS), the study authors provide such a discussion.
In the paper, the authors define the predictivity of a variable set and show that naively estimating it from the sample does not give the prediction-oriented researcher usable information. They go on to demonstrate that the I-score yields a measure that asymptotically approaches predictivity. The I-score can effectively differentiate between noisy and predictive variables, Lo explained, making it helpful in variable selection. A further benefit is that, while the usual approaches lean heavily on cross-validation or testing data to evaluate predictors, the I-score approach depends far less on either.
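A minimal sketch of such a partition-based influence score may help make this concrete. It assumes the general form used in the authors' earlier work, where each distinct combination of discrete predictor values defines a cell, and cells whose local outcome mean deviates from the global mean drive the score; the exact normalization here is illustrative, not the paper's:

```python
import numpy as np
from collections import defaultdict

def i_score(X, y):
    """Sketch of a partition-based influence (I-)score.

    X: (n, k) rows of discrete predictor values; each distinct row is a cell.
    y: (n,) outcomes.
    Cells with many observations whose local mean of y deviates from the
    global mean contribute heavily; pure-noise partitions score near zero.
    """
    X = np.asarray(X)
    y = np.asarray(y, dtype=float)
    n, y_bar, s2 = len(y), y.mean(), y.var()
    cells = defaultdict(list)
    for row, yi in zip(map(tuple, X), y):
        cells[row].append(yi)
    score = sum(len(ys) ** 2 * (np.mean(ys) - y_bar) ** 2
                for ys in cells.values())
    return score / (n * s2)
```

On a toy sample, a variable that determines the outcome scores high, while an unrelated variable whose cells all have the global outcome mean scores zero, illustrating how the statistic separates predictive from noisy variables.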
“We offer simulations and an application of the I-score on real data to demonstrate the statistic’s predictive performance on sample data,” he said. “These show that the I-score can capture highly predictive variable sets, estimates a lower bound for the theoretical correct prediction rate, and correlates well with the out of sample correct rate. We suggest that using the I-score method can aid in finding variable sets with promising prediction rates, however, further research in the avenue of sample-based measures of predictivity is needed.”
The authors conclude that the I-score would be useful in many applications: for example, formulating predictions about diseases from high-dimensional data such as gene datasets, and, in the social sciences, predicting outcomes involving text, terrorism, civil war, elections, and financial markets.
“We’re hoping to impress upon the scientific community the notion that for those of us who might be interested in predicting an outcome of interest, possibly with rather complex or high dimensional data, we might gain by reconsidering the question as one of how to search for highly predictive variables (or variable sets) and using statistics that measure predictivity to help us identify those variables to then predict well,” Lo said. “For statisticians in particular, we’re hoping this opens up a new field of work that would focus on designing new statistics that measure predictivity.”
Columbia University is the oldest institution of higher learning in the State of New York, the fifth oldest in the United States, and one of the country’s nine Colonial Colleges founded before the American Revolution. Today the university operates Columbia Global Centers overseas in Amman, Beijing, Istanbul, Paris, Mumbai, Rio de Janeiro, Santiago and Nairobi.
The university was founded in 1754 as King’s College by royal charter of George II of Great Britain. After the American Revolutionary War, King’s College briefly became a state entity, and was renamed Columbia College in 1784. The University now operates under a 1787 charter that places the institution under a private board of trustees, and in 1896 it was further renamed Columbia University. That same year, the university’s campus was moved from Madison Avenue to its current location in Morningside Heights, where it occupies more than six city blocks, or 32 acres (12.9 ha).
The university encompasses twenty schools and is affiliated with numerous institutions, including Teachers College (which is an academic department of the university, and serves as Columbia’s Graduate School of Education), Barnard College, and the Union Theological Seminary, with joint undergraduate programs available through the Jewish Theological Seminary of America as well as the Juilliard School.
Researchers at Princeton, Columbia and Harvard have created a new method to analyze big data that better predicts outcomes in health care, politics and other fields.
The study appears this week in the journal Proceedings of the National Academy of Sciences.
In previous studies, the researchers showed that significant variables might not be predictive and that good predictors might not appear statistically significant. This posed an important question: how can we find highly predictive variables if not through a guideline of statistical significance? Common approaches to prediction include using a significance-based criterion for evaluating variables to use in models and evaluating variables and models simultaneously for prediction using cross-validation or independent test data.
In an effort to reduce the error rate with those methods, the researchers proposed a new measure called the influence score, or I-score, to better measure a variable’s ability to predict. They found that the I-score is effective in differentiating between noisy and predictive variables in big data and can significantly improve the prediction rate. For example, the I-score improved the prediction rate in breast cancer data from 70 percent to 92 percent. The I-score can be applied in a variety of fields, including terrorism, civil war, elections and financial markets.
“The practical implications are what drove the project, so they’re quite broad,” says lead author Adeline Lo, a postdoctoral researcher in Princeton’s Department of Politics. “Essentially anytime you might be interested in predicting and identifying highly predictive variables, you might have something to gain by conducting variable selection through a statistic like the I-score, which is related to variable predictivity. That the I-score fares especially well in high dimensional data and with many complex interactions between variables is an extra boon for the researcher or policy expert interested in predicting something with large dimensional data.”
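Variable selection with such a score can be sketched as a greedy backward-dropping search, similar in spirit to the authors' earlier work: starting from a candidate set, repeatedly drop the variable whose removal most improves the score, and keep the best-scoring subset seen along the way. The score's form and all names here are illustrative assumptions:

```python
import numpy as np
from collections import defaultdict

def i_score(X, y):
    # Sketch of a partition-based influence score: rows of X define cells.
    y = np.asarray(y, dtype=float)
    n, y_bar, s2 = len(y), y.mean(), y.var()
    cells = defaultdict(list)
    for row, yi in zip(map(tuple, np.asarray(X)), y):
        cells[row].append(yi)
    return sum(len(ys) ** 2 * (np.mean(ys) - y_bar) ** 2
               for ys in cells.values()) / (n * s2)

def backward_drop(X, y, start_cols):
    """Greedy backward-dropping search over a 2-D NumPy array X.

    At each step, drop the column whose removal gives the highest score;
    return the best-scoring column subset encountered, with its score.
    """
    cols = list(start_cols)
    best_cols, best = list(cols), i_score(X[:, cols], y)
    while len(cols) > 1:
        score, drop = max((i_score(X[:, [c for c in cols if c != d]], y), d)
                          for d in cols)
        cols.remove(drop)
        if score > best:
            best, best_cols = score, list(cols)
    return best_cols, best
```

In a small example where only the first column determines the outcome, the search sheds the noise columns and returns the truly predictive one.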
Researchers discover that electrons mimic light in graphene, confirming a 2007 prediction – their finding may enable new low power electronics and lead to new experimental probes.
A team led by Cory Dean, assistant professor of physics at Columbia University, Avik Ghosh, professor of electrical and computer engineering at the University of Virginia, and James Hone, Wang Fong-Jen Professor of Mechanical Engineering at Columbia Engineering, has directly observed—for the first time—negative refraction for electrons passing across a boundary between two regions in a conducting material. First predicted in 2007, this effect has been difficult to confirm experimentally. The researchers were able to observe the effect in graphene, demonstrating that electrons in the atomically thin material behave like light rays, which can be manipulated by such optical devices as lenses and prisms. The findings, which are published in the September 30 edition of Science, could lead to the development of new types of electron switches, based on the principles of optics rather than electronics.
“The ability to manipulate electrons in a conducting material like light rays opens up entirely new ways of thinking about electronics,” says Dean. “For example, the switches that make up computer chips operate by turning the entire device on or off, and this consumes significant power. Using lensing to steer an electron ‘beam’ between electrodes could be dramatically more efficient, solving one of the critical bottlenecks to achieving faster and more energy efficient electronics.”
Dean adds, “These findings could also enable new experimental probes. For example, electron lensing could enable on-chip versions of an electron microscope, with the ability to perform atomic scale imaging and diagnostics. Other components inspired by optics, such as beam splitters and interferometers, could additionally enable new studies of the quantum nature of electrons in the solid state.”
While graphene has been widely explored for supporting high electron speed, it is notoriously hard to turn off the electrons without hurting their mobility. Ghosh says, “The natural follow-up is to see if we can achieve a strong current turn-off in graphene with multiple angled junctions. If that works to our satisfaction, we’ll have on our hands a low-power, ultra-high-speed switching device for both analog (RF) and digital (CMOS) electronics, potentially mitigating many of the challenges we face with the high energy cost and thermal budget of present day electronics.”
Light changes direction – or refracts – when passing from one material to another, a process that allows us to use lenses and prisms to focus and steer light. A quantity known as the index of refraction determines the degree of bending at the boundary, and is positive for conventional materials such as glass. However, through clever engineering, it is also possible to create optical “metamaterials” with a negative index, in which the angle of refraction is also negative. “This can have unusual and dramatic consequences,” Hone notes. “Optical metamaterials are enabling exotic and important new technologies such as super lenses, which can focus beyond the diffraction limit, and optical cloaks, which make objects invisible by bending light around them.”
Electrons travelling through very pure conductors can travel in straight lines like light rays, enabling optics-like phenomena to emerge. In materials, the electron density plays a similar role to the index of refraction, and electrons refract when they pass from a region of one density to another. Moreover, current carriers in materials can either behave like they are negatively charged (electrons) or positively charged (holes), depending on whether they inhabit the conduction or the valence band. In fact, boundaries between hole-type and electron-type conductors, known as p-n junctions (“p” positive, “n” negative), form the building blocks of electrical devices such as diodes and transistors.
“Unlike in optical materials”, says Hone, “where creating a negative index metamaterial is a significant engineering challenge, negative electron refraction occurs naturally in solid state materials at any p-n junction.”
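The optics analogy in the preceding paragraphs can be made concrete with a toy calculation. In this sketch the Fermi wavevector, which in graphene scales as the square root of the carrier density, stands in for the refractive index in Snell's law; reporting the refracted angle at a p-n junction as negative is an illustrative convention for "bends to the same side of the normal," not the paper's own code:

```python
import math

def refraction_angle(theta1_deg, density1, density2, pn_junction=False):
    """Electron-optics analogue of Snell's law at a density step.

    k1 * sin(theta1) = k2 * sin(theta2), with k proportional to
    sqrt(carrier density) in graphene. Across a p-n junction the beam
    emerges on the same side of the normal, represented here as a
    negative angle. Returns degrees, or None past the critical angle.
    """
    k1, k2 = math.sqrt(density1), math.sqrt(density2)
    s = k1 / k2 * math.sin(math.radians(theta1_deg))
    if abs(s) > 1.0:
        return None  # total internal reflection: no transmitted beam
    theta2 = math.degrees(math.asin(s))
    return -theta2 if pn_junction else theta2
```

With equal densities on both sides of a p-n junction, the beam refracts to the mirror-image angle, the symmetric condition behind the Veselago lens idea discussed below.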
The development of two-dimensional conducting layers in high-purity semiconductors such as GaAs (Gallium arsenide) in the 1980s and 1990s allowed researchers to first demonstrate electron optics including the effects of both refraction and lensing. However, in these materials, electrons travel without scattering only at very low temperatures, limiting technological applications. Furthermore, the presence of an energy gap between the conduction and valence band scatters electrons at interfaces and prevents observation of negative refraction in semiconductor p-n junctions. In this study, the researchers’ use of graphene, a 2D material with unsurpassed performance at room temperature and no energy gap, overcame both of these limitations.
The possibility of negative refraction at graphene p-n junctions was first proposed in 2007 by theorists working at both the University of Lancaster and Columbia University. However, observation of this effect requires extremely clean devices, such that the electrons can travel ballistically, without scattering, over long distances. Over the past decade, a multidisciplinary team at Columbia – including Hone and Dean, along with Kenneth Shepard, Lau Family Professor of Electrical Engineering and professor of biomedical engineering, Abhay Pasupathy, associate professor of physics, and Philip Kim, professor of physics at the time (now at Harvard) – has worked to develop new techniques to construct extremely clean graphene devices. This effort culminated in the 2013 demonstration of ballistic transport over a length scale in excess of 20 microns. Since then, they have been attempting to develop a Veselago lens, which focuses electrons to a single point using negative refraction. But they were unable to observe such an effect and found their results puzzling.
In 2015, a group at Pohang University of Science and Technology in South Korea reported the first evidence of focusing in a Veselago-type device. However, the response was weak, appearing only in the derivative of the signal. The Columbia team decided that to fully understand why the effect was so elusive, they needed to isolate and map the flow of electrons across the junction. They utilized a well-developed technique called “magnetic focusing” to inject electrons onto the p-n junction. By measuring transmission between electrodes on opposite sides of the junction as a function of carrier density, they could map the trajectory of electrons on both sides of the p-n junction as the incident angle was changed by tuning the magnetic field.
Crucial to the Columbia effort was the theoretical support provided by Ghosh’s group at the University of Virginia, who developed detailed simulation techniques to model the Columbia team’s measured response. This involved calculating the flow of electrons in graphene under the various electric and magnetic fields, accounting for multiple bounces at edges, and quantum mechanical tunneling at the junction. The theoretical analysis also shed light on why it has been so difficult to measure the predicted Veselago lensing in a robust way, and the group is developing new multi-junction device architectures based on this study. Together the experimental data and theoretical simulation gave the researchers a visual map of the refraction, and enabled them to be the first to quantitatively confirm the relationship between the incident and refracted angles (known as Snell’s Law in optics), as well as confirmation of the magnitude of the transmitted intensity as a function of angle (known as the Fresnel coefficients in optics).
“In many ways, this intensity of transmission is a more crucial parameter,” says Ghosh, “since it determines the probability that electrons actually make it past the barrier, rather than just their refracted angles. The transmission ultimately determines many of the performance metrics for devices based on these effects, such as the on-off ratio in a switch, for example.”
The Science study was supported by Semiconductor Research Corporation’s NRI Center for Institute for Nanoelectronics Discovery and Exploration (INDEX). The collaboration was made possible through the support of the NRI program, which has brought together some of the country’s best groups in the study of 2D materials to focus on novel device architectures that may outperform conventional silicon-based technologies.
In a discovery that could have profound implications for future energy policy, Columbia scientists have demonstrated it is possible to manufacture solar cells that are far more efficient than existing silicon solar cells by using a new kind of material, a development that could help reduce fossil fuel consumption.
The team, led by Xiaoyang Zhu, a professor of Chemistry at Columbia University, focused its efforts on a new class of solar cell ingredients known as Hybrid Organic Inorganic Perovskites (HOIPs). Their results, reported in the prestigious journal Science, also explain why these new materials are so much more efficient than traditional solar cells—solving a mystery that will likely prompt scientists and engineers to begin inventing new solar materials with similar properties in the years ahead.
“The need for renewable energy has motivated extensive research into solar cell technologies that are economically competitive with burning fossil fuel,” Zhu says. “Among the materials being explored for next generation solar cells, HOIPs have emerged as a superstar. Until now no one has been able to explain why they work so well, and how much better we might make them. We now know it’s possible to make HOIP-based solar cells even more efficient than anyone thought possible.”
Solar cells are what turn sunlight into electricity. Also known as photovoltaic cells, these semiconductors are most frequently made from thin layers of silicon that transmit energy across their structure, converting it into direct current.
Silicon panels, which currently dominate the market for solar panels, must have a purity of 99.999 percent and are notoriously fragile and expensive to manufacture. Even a microscopic defect—such as misplaced, missing or extra ions—in this crystalline structure can exert a powerful pull on the charges the cells generate when they absorb sunlight, dissipating those charges before they can be transformed into electrical current.
In 2009, Japanese scientists demonstrated it was possible to build solar cells out of HOIPs, and that these cells could harvest energy from sunlight even when the crystals had a significant number of defects. Because they don’t need to be pristine, HOIPs can be produced on a large scale and at low cost. The Columbia team has been investigating HOIPs since 2014. Their findings could help boost the use of solar power, a priority in the age of global warming.
Over the last seven years, scientists have managed to increase the efficiency with which HOIPs can convert solar energy into electricity, to 22 percent from 4 percent. By contrast, it took researchers more than six decades to create silicon cells and bring them to their current level, and even now silicon cells can convert no more than about 25 percent of the sun’s energy into electrical current.
This discovery, Zhu said, meant that “scientists have only just begun to tap the potential of HOIPs to convert the sun’s energy into electricity.”
Theorists long ago demonstrated that the maximum efficiency silicon solar cells might ever reach, that is, the percentage of the energy in sunlight that can be converted into usable electricity, is roughly 33 percent. It takes hundreds of nanoseconds for energized electrons to move from the part of a solar cell that infuses them with the sun’s energy to the part of the cell that harvests the energy and converts it into electricity that can ultimately be fed into a power grid. During this migration across the solar cell, the energized electrons quickly dissipate their excess energy. But those calculations assume a specific rate of energy loss. The Columbia team discovered that this rate of energy loss is slowed by more than three orders of magnitude in HOIPs, making it possible to harvest the excess electronic energy and increase the efficiency of solar cells.
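A back-of-envelope illustration of what three orders of magnitude buys, using assumed, purely illustrative cooling times (roughly a picosecond for a conventional semiconductor) and a simple exponential-decay model of the carriers' excess energy:

```python
import math

# Illustrative, assumed numbers: hot carriers in a conventional
# semiconductor cool in about a picosecond; slowing cooling by three
# orders of magnitude stretches that to about a nanosecond.
TAU_CONVENTIONAL = 1e-12            # s, assumed cooling time
TAU_SLOW = TAU_CONVENTIONAL * 1e3   # three orders of magnitude slower

def excess_energy_fraction(t, tau):
    """Fraction of a carrier's initial excess energy left after time t,
    modelled as a single exponential decay with time constant tau."""
    return math.exp(-t / tau)

# After 10 ps, a conventional carrier has essentially fully thermalized,
# while the slow-cooling carrier still retains nearly all of its excess
# energy, leaving a window in which that energy could be harvested.
t = 10e-12
conventional = excess_energy_fraction(t, TAU_CONVENTIONAL)
slow = excess_energy_fraction(t, TAU_SLOW)
```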
Related: Columbia Scientists Unlock Big Perovskite Solar Cell Mystery, Clean Technica, Sept 22, 2016
“We’re talking about potentially doubling the efficiency of solar cells,” says Prakriti P. Joshi, a Ph.D. student in Zhu’s lab who is a coauthor on the paper. “That’s really exciting because it opens up a big, big field in engineering.” Adds Zhu, “This shows we can push the efficiencies of solar cells much higher than many people thought possible.”
After demonstrating this, the team turned to the next question: what is it about the molecular structure of HOIPs that gives them their unique properties? How do electrons avoid defects? They discovered that the same mechanism that slows the cooling of electron energy also protects the electrons from bumping into defects. This “protection” makes the HOIPs effectively blind to the defects that are ubiquitous in a material made by room-temperature, solution-based processing, allowing an imperfect material to behave like a perfect semiconductor.
HOIPs contain lead, and are also water soluble, meaning the solar cells could begin to dissolve and leach lead into the environment around them if not carefully protected from the elements.
With the explanation of the mysterious mechanisms that give HOIPs their remarkable efficiencies, Zhu knew, material scientists would likely be able to mimic them with more environmentally-friendly materials.
“Now we can go back and design materials which are environmentally benign and really solve this problem everybody is worried about,” Zhu says. “This principle will allow people to start to design new materials for solar energy.”
Researchers at University of California San Diego’s Big Pixel Initiative are using unique tools to map urban areas around the globe, potentially revolutionizing large-scale analysis of urbanization. Using Google Earth Engine, they developed and tested new machine-learning approaches that use high-resolution satellite data to detect and map settlements around the world.
These methods, detailed in the paper “Detecting the Boundaries of Urban Areas in India: A Dataset for Pixel-Based Image Classification from Google Earth Engine,” will eventually allow for the creation of a high-resolution map of all inhabited locations and for a better understanding of how cities expand and evolve. They provide, for the first time, a reliable and comprehensive open-source dataset for detecting and mapping urban areas through satellite images.
The paper appears in the August 2016 issue of Remote Sensing, one of the top peer-reviewed journals on satellite-based research. Authors are Big Pixel Initiative researchers Ran Goldblatt and Gordon Hanson of the School of Global Policy and Strategy, with UC San Diego Department of Economics doctoral candidate Wei You and Amit K. Khandelwal of Columbia Business School at Columbia University.
Hoping to give other researchers a tool that can identify urbanization and industrialization, the authors found there is currently no reliable open-source dataset for automatically detecting urban areas and validating existing maps. They explain that urbanization is a fundamental force that shapes almost all dimensions of the modern world, from land cover and land use around cities to economics and policy making. However, the rate and magnitude of these changes have not yet been mapped globally with sharp precision.
“With the availability of cloud-based platforms such as [Google Earth Engine], it is now feasible to monitor urbanization in multi-spatial and temporal resolutions and to understand urban dynamics globally,” the authors write. These platforms allow researchers to analyze geospatial data and to understand the rate and magnitude of urban growth, especially in regions and countries where maps of urban areas do not exist.
“Ours is the first to provide comprehensive open-source ground-truth data that can serve as a training set for supervised classification of built-up land cover,” they write. “Understanding the various ecological, environmental, social and economic impacts of these processes is essential for the preservation of a sustainable human society.”
Goldblatt and the team constructed a unique dataset of 21,030 manually classified image samples representing different forms of built-up and not-built-up land cover in India. These samples were then used for supervised image classification designed to detect urban areas, with the analysis performed in the cloud-based Google Earth Engine. Their goal, in part, is to use high-resolution satellite data to create a continuous map of the urbanization process, looking extensively over time and across large areas for the first time.
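The pipeline, manually labeled pixel samples feeding a supervised classifier of per-pixel band values, can be sketched outside Earth Engine. A toy nearest-centroid classifier stands in here for the classifiers evaluated in the paper; all names and numbers are illustrative:

```python
import numpy as np

def train_centroids(samples, labels):
    """Per-class mean spectra from manually labeled pixel samples.

    samples: (n, bands) array of per-pixel band values (e.g. reflectances);
    labels:  (n,) array of 0 (not built-up) / 1 (built-up).
    """
    samples, labels = np.asarray(samples, float), np.asarray(labels)
    return {c: samples[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(pixels, centroids):
    """Assign each pixel to the class with the nearest mean spectrum."""
    pixels = np.asarray(pixels, float)
    classes = sorted(centroids)
    dists = np.stack([np.linalg.norm(pixels - centroids[c], axis=1)
                      for c in classes])
    return np.array(classes)[dists.argmin(axis=0)]
```

Given a handful of labeled spectra, new pixels are tagged built-up or not by spectral proximity; the study's larger point is that assembling the labeled training set at scale is the scarce ingredient, not the classifier itself.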
“Expanding this research frontier creates an urgent need for ground-truth data that can facilitate the development of supervised machine-learning algorithms and enable reliable evaluation and validation,” they said. Although this research was designed to detect urban areas in India, the methodology can easily be applied to other countries and regions, and will have impacts for governments, policy makers, business and property development as well as humanitarian and environmental workers.
Founded by Hanson and Albert Yu-Min Lin of the Qualcomm Institute, the Big Pixel Initiative’s mission is to develop advanced geospatial capacity to address the world’s greatest challenges. The initiative was launched in 2015 by unique, two-year access to DigitalGlobe Foundation data, and then expanded to include analysis on the Google Earth Engine platform.
Working with geoscientists across campus and experts at Google, Hanson is leading the university’s efforts to measure urbanization worldwide using satellite imagery. “We want to be able to measure how cities grow and expand on the whole planet as close to real time as we can, by using the vast amounts of satellite imagery that are coming online,” Hanson said.
Neurons that fire together really do wire together, says a new study in Science, suggesting that the three-pound computer in our heads may be more malleable than we think.
In the latest issue of Science, neuroscientists at Columbia University demonstrate that a set of neurons trained to fire in unison could be reactivated as much as a day later if just one neuron in the network was stimulated. Though further research is needed, their findings suggest that groups of activated neurons may form the basic building blocks of learning and memory, as originally hypothesized by psychologist Donald Hebb in the 1940s.
“I always thought the brain was mostly hard-wired,” said the study’s senior author, Dr. Rafael Yuste, a neuroscience professor at Columbia University. “But then I saw the results and said ‘Holy moly, this whole thing is plastic.’ We’re dealing with a plastic computer that’s constantly learning and changing.”
The researchers were able to control and observe the brain of a living mouse using the optogenetic tools that have revolutionized neuroscience in the last decade. They injected the mouse with a virus containing light-sensitive proteins engineered to reach specific brain cells. Once inside a cell, the proteins allowed researchers to remotely activate the neuron with light, as if switching on a TV.
The mouse was allowed to run freely on a treadmill while its head was held still under a microscope. With one laser, the researchers beamed light through its skull to stimulate a small group of cells in the visual cortex. With a second laser, they recorded rising levels of calcium in each neuron as it fired, thus imaging the activity of individual cells.
Before optogenetics, scientists had to open the skull and implant electrodes into living tissue to stimulate neurons with electricity and measure their response. Even a mouse brain of 100 million neurons, nearly a thousandth the size of ours, was too dense to get a close look at groups of neurons.
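The calcium readout described above is conventionally normalized as a dF/F trace: a rise in fluorescence relative to baseline marks a neuron firing. The following sketch computes dF/F on a synthetic trace; the percentile baseline and the transient shape are standard-practice assumptions for illustration, not details taken from the Science paper:

```python
# Sketch: converting a raw fluorescence trace into dF/F, the usual way
# to read out calcium (and hence spiking) activity of a single neuron.
# The trace is synthetic; one transient is injected at frame 200.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(600)                                   # imaging frames
trace = 100.0 + rng.normal(0.0, 1.0, t.size)         # noisy baseline
trace[200:260] += 30 * np.exp(-(t[200:260] - 200) / 20.0)  # calcium transient

f0 = np.percentile(trace, 20)    # robust baseline estimate (assumed choice)
dff = (trace - f0) / f0          # dF/F
peak_frame = int(np.argmax(dff))
print(f"peak dF/F = {dff.max():.2f} at frame {peak_frame}")
```

Thresholding such dF/F traces is one simple way to decide which neurons in the imaged group fired together on a given trial.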
Dinner in 3D
We’re all accustomed to having appliances on our kitchen counters, from toasters and blenders to coffee makers and microwaves. If Mechanical Engineering Professor Hod Lipson has his way, we’ll soon need to make room for one more—a 3D food printer that could revolutionize the way we think about food and prepare it.
Over the past year, Lipson and his students have been developing a 3D food printer that can fabricate edible items through computer-guided software and the actual cooking of edible pastes, gels, powders, and liquid ingredients—all in a prototype that looks like an elegant coffee machine. The printer is the result of a design project devised by Lipson and his students, led by Drim Stokhuijzen, an industrial design graduate student visiting from Delft University of Technology in the Netherlands, and Jerson Mezquita, an undergraduate student visiting from SUNY Maritime who is now a research associate in Lipson’s Creative Machines Lab (CML).
“Food printers are not meant to replace conventional cooking—they won’t solve all of our nutritional needs, nor cook everything we should eat,” says Lipson, a pioneering roboticist who works in the areas of artificial intelligence and digital manufacturing. “But they will produce an infinite variety of customized fresh, nutritional foods on demand, transforming digital recipes and basic ingredients supplied in frozen cartridges into healthy dishes that can supplement our daily intake. I think this is the missing link that will bring the benefits of personalized data-driven health to our kitchen tables—it’s the ‘killer app’ of 3D printing.”
Researchers achieve real-time single-molecule electronic DNA sequencing at single-base resolution using a protein nanopore array, a future platform for precision medicine that could enable routine medical diagnoses
Researchers from Columbia University, with colleagues at Genia Technologies (Roche), Harvard University and the National Institute of Standards and Technology (NIST) report achieving real-time single molecule electronic DNA sequencing at single-base resolution using a protein nanopore array.
DNA sequencing is the key technology for personalized and precision medicine initiatives, enabling rapid discoveries in biomedical science. An individual’s complete genome sequence provides important markers and guidelines for medical diagnostics, healthcare, and maintaining a healthy life. To date, the cost and speed involved in obtaining highly accurate DNA sequences have been a major challenge. While various advancements have been made over the past decade, the high-throughput sequencing instruments widely used today depend on optics for the detection of the four DNA building blocks: A, C, G and T. To explore alternative measurement capabilities, electronic sequencing of an ensemble of DNA templates has been developed for genetic analysis. Nanopore strand sequencing, in which a single-stranded DNA is threaded through a nanoscale pore under an applied electrical voltage to produce electronic signals for sequence determination at the single-molecule level, has recently been developed; however, because the four nucleotides are very similar in their chemical structures, they cannot easily be distinguished using this method. Researchers are therefore actively pursuing the research and development of an accurate single-molecule electronic DNA sequencing platform, as it has the potential to produce a miniaturized DNA sequencer capable of deciphering the genome to facilitate personalized precision medicine.
A team of researchers at Columbia Engineering, headed by Jingyue Ju (Samuel Ruben-Peter G. Viele Professor of Engineering, Professor of Chemical Engineering and Pharmacology, Director of the Center for Genome Technology & Biomolecular Engineering), with colleagues at Harvard Medical School, led by George Church (Professor of Genetics); Genia Technologies, led by Stefan Roever (CEO of Genia); and John Kasianowicz, the Principal Investigator at NIST, have developed a complete system to sequence DNA in nanopores electronically at the single-molecule level with single-base resolution. This work, entitled “Real-Time Single Molecule Electronic DNA Sequencing by Synthesis Using Polymer Tagged Nucleotides on a Nanopore Array,” is published in the journal Proceedings of the National Academy of Sciences (PNAS) Early Edition.
Previously, researchers from the laboratories of Ju at Columbia and Kasianowicz at NIST reported the general principle of nanopore sequencing by synthesis (SBS), the feasibility of designing and synthesizing polymer-tagged nucleotides as substrates for DNA polymerase, and the detection and differentiation of the polymer tags by a nanopore at the single-molecule level [Scientific Reports 2, 684 (2012), doi: 10.1038/srep00684].
As Carl Fuller, lead author, Adjunct Senior Research Scientist in the Ju Laboratory of the Chemical Engineering Department at Columbia and Director of Chemistry at Genia, points out, “The novelty of our nanopore SBS approach begins with the design, synthesis, and selection of four different polymer-tagged nucleotides. We use a DNA polymerase covalently attached to the nanopore and the tagged nucleotides to perform SBS. During replication of the DNA bound to the polymerase, the tag of each complementary nucleotide is captured in the pore to produce a unique electrical signal. Four distinct polymer tags yielding distinct signatures that are recognized by the electronic detector in the nanopore array chip are used for sequence determination. Thus, DNA sequences are obtained for many single molecules in parallel and in real time. The four polymer tags are designed to offer much better distinctions among themselves, in contrast to the small differences among the four native DNA nucleotides, thereby overcoming the major challenge faced by other direct nanopore sequencing methods.” Moreover, the tags can be further optimized with respect to size, charge, and structure to provide optimal resolution in the nanopore SBS system.
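Because each polymer tag is engineered to produce a distinct electrical signature in the pore, base calling conceptually reduces to classifying each captured signal level. The toy sketch below illustrates that idea with nearest-level classification; the four current levels and the noise magnitude are invented for illustration, not measured values from the paper:

```python
# Toy illustration of the nanopore-SBS read-out idea: each of the four
# polymer tags blocks the pore current to a distinct level, so calling a
# base means classifying the captured level. Levels/noise are assumed.
import numpy as np

tag_levels = {"A": 0.20, "C": 0.40, "G": 0.60, "T": 0.80}  # hypothetical
bases = list(tag_levels)
levels = np.array([tag_levels[b] for b in bases])

rng = np.random.default_rng(2)
true_seq = "GATTACA"
# Simulated noisy current captures, one per incorporated nucleotide.
signals = np.array([tag_levels[b] for b in true_seq]) + rng.normal(0, 0.02, len(true_seq))

# Nearest-level classification recovers the sequence.
called = "".join(bases[int(np.argmin(np.abs(levels - s)))] for s in signals)
print(called)
```

The wide spacing between the assumed tag levels, relative to the noise, is what makes this classification easy; that is exactly the advantage Fuller describes over distinguishing the four chemically similar native nucleotides directly.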
“This exciting project brings together scientists and engineers from both academia and industry with combined expertise in molecular engineering, nanotechnology, genomics, electronics and data science to produce revolutionary, cost-effective genetic diagnostic platforms with unprecedented potential for precision medicine,” says Ju. “We are extremely grateful for the generous support from the NIH that enabled us to make rapid progress in the research and development of the nanopore SBS technology, and the outstanding contributions from all the members of our research consortium.”
According to Ju, the researchers have already pushed beyond what was demonstrated in the PNAS study, where the sequencing data were obtained on an early prototype sequencer based on nanopore SBS. The throughput and performance of the current sequencer have progressed beyond what was reported in the PNAS paper. The feasibility of reaching read lengths of over 1,000 bases of DNA has recently been achieved. Going forward, the collaborative research team will continue to optimize the tags by tweaking the linkers, structure, and charge at the molecular level, and fine-tuning the polymerase and the electronics for the nanopore SBS system, with an aim to accurately sequence an entire human genome rapidly and at low cost, thereby enabling it to be used in routine medical diagnoses.
Columbia Engineers develop the first on-chip RF circulator that doubles WiFi speeds with a single antenna — could transform telecommunications
Last year, Columbia Engineering researchers were the first to invent a technology, full-duplex radio integrated circuits (ICs), that can be implemented in nanoscale CMOS to enable simultaneous transmission and reception at the same frequency in a wireless radio. That system required two antennas, one for the transmitter and one for the receiver. Now the team, led by Electrical Engineering Associate Professor Harish Krishnaswamy, has developed a breakthrough technology that needs only one antenna, enabling an even smaller overall system. This is the first time researchers have integrated a non-reciprocal circulator and a full-duplex radio on a nanoscale silicon chip. The circulator research was published online April 15 in Nature Communications (DOI: 10.1038/ncomms11217), and the paper detailing the single-chip full-duplex radio with the circulator and additional echo cancellation was presented at the 2016 IEEE International Solid-State Circuits Conference on February 2.
“This technology could revolutionize the field of telecommunications,” says Krishnaswamy, director of the Columbia High-Speed and Mm-wave IC (CoSMIC) Lab. “Our circulator is the first to be put on a silicon chip, and we get literally orders of magnitude better performance than prior work. Full-duplex communications, where the transmitter and the receiver operate at the same time and at the same frequency, has become a critical research area and now we’ve shown that WiFi capacity can be doubled on a nanoscale silicon chip with a single antenna. This has enormous implications for devices like smartphones and tablets.”
Krishnaswamy’s group has been working on silicon radio chips for full-duplex communications for several years and became particularly interested in the role of the circulator, a component that enables full-duplex communications in which the transmitter and the receiver share the same antenna. In order to do this, the circulator has to “break” Lorentz Reciprocity, a fundamental physical characteristic of most electronic structures that requires electromagnetic waves to travel in the same manner in forward and reverse directions.
“Reciprocal circuits and systems are quite restrictive because you can’t control the signal freely,” says PhD student Negar Reiskarimian, who developed the circulator and is lead author of the Nature Communications paper. “We wanted to create a simple and efficient way, using conventional materials, to break Lorentz Reciprocity and build a low-cost nanoscale circulator that would fit on a chip. This could open up the door to all kinds of exciting new applications.”
The traditional way of breaking Lorentz Reciprocity and building radio-frequency circulators has been to use magnetic materials such as ferrites, which lose reciprocity when an external magnetic field is applied. But these materials are not compatible with silicon chip technology, and ferrite circulators are bulky and expensive. Krishnaswamy and his team were able to design a highly miniaturized circulator that uses switches to rotate the signal across a set of capacitors to emulate the non-reciprocal “twist” of the signal that is seen in ferrite materials. Aside from the circulator, they also built a prototype of their full-duplex system, a silicon IC that included both their circulator and an echo-cancelling receiver, and demonstrated its capability at the 2016 IEEE International Solid-State Circuits Conference this past February.
“Being able to put the circulator on the same chip as the rest of the radio has the potential to significantly reduce the size of the system, enhance its performance, and introduce new functionalities critical to full duplex,” says PhD student Jin Zhou, who integrated the circulator with the full-duplex receiver that featured additional echo cancellation.
Non-reciprocal circuits and components have applications in many different scenarios, from radio-frequency full-duplex communications and radar to building isolators that prevent high-power transmitters from being damaged by back-reflections from the antenna. The ability to break reciprocity also opens up new possibilities in radio-frequency signal processing that are yet to be discovered. Full-duplex communications is of particular interest to researchers because of its potential to double network capacity, compared to half-duplex communications that current cell phones and WiFi radios use. The Krishnaswamy group is already working on further improving the performance of their circulator, and exploring “beyond-circulator” applications of non-reciprocity.
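A key ingredient of full-duplex operation mentioned above is echo cancellation: the receiver knows its own transmit signal, so it can estimate the echo channel and subtract its own echo to uncover the weak incoming signal. The sketch below illustrates this digitally with a basic least-mean-squares (LMS) adaptive filter; the channel taps, step size, and signal powers are illustrative assumptions, not parameters of the Columbia chip (which performs cancellation in circulator and receiver hardware):

```python
# Sketch of digital echo cancellation for full duplex: adaptively
# estimate the echo channel from the known transmit signal (LMS) and
# subtract the estimated echo from the received signal.
import numpy as np

rng = np.random.default_rng(3)
n = 20000
tx = rng.choice([-1.0, 1.0], size=n)           # known transmit symbols
h_echo = np.array([0.8, -0.3, 0.1])            # unknown echo channel (assumed)
echo = np.convolve(tx, h_echo)[:n]
desired = 0.01 * rng.standard_normal(n)        # weak signal of interest
rx = echo + desired

taps, mu = 3, 0.01                             # LMS filter length and step size
h_hat = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = tx[i - taps + 1:i + 1][::-1]           # most recent sample first
    e = rx[i] - h_hat @ x                      # residual after cancellation
    out[i] = e
    h_hat += mu * e * x                        # LMS tap update

residual = np.mean(out[n // 2:] ** 2)          # power after convergence
raw = np.mean(rx[n // 2:] ** 2)
print(f"echo power suppressed by {10 * np.log10(raw / residual):.1f} dB")
```

After convergence the residual is dominated by the desired signal rather than the echo, which is the behavior a full-duplex receiver needs before it can decode the far-end transmission.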
“What really excites me about this research is that we were able to make a contribution at a theoretically fundamental level, which led to the publication in Nature Communications, and also able to demonstrate a practical RF circulator integrated with a full-duplex receiver that exhibited a factor of nearly a billion in echo cancellation, making it the first practical full-duplex receiver chip and which led to the publication in the 2016 IEEE ISSCC,” Krishnaswamy adds. “It is rare for a single piece of research, or even a research group, to bridge fundamental theoretical contributions with implementations of practical relevance. It is extremely rewarding to supervise graduate students who were able to do that!”
Columbia Engineering researchers develop a deformable lens array and set the stage for thin and flexible sheet cameras
A team led by Shree K. Nayar, T.C. Chang Professor of Computer Science at Columbia Engineering, has developed a novel sheet camera that can be wrapped around everyday objects to capture images that cannot be taken with one or more conventional cameras.
The Columbia team, which includes research engineer Daniel Sims BS’14 and postdoctoral researcher Yonghao Yue, designed and fabricated a flexible lens array that adapts its optical properties when the sheet camera is bent. This optical adaptation enables the sheet camera to produce high quality images over a wide range of sheet deformations. Sims will present the work at the International Conference on Computational Photography (ICCP) at Northwestern University in Evanston, IL, May 13 to 15.
“Cameras today capture the world from essentially a single point in space,” says Nayar. “While the camera industry has made remarkable progress in shrinking the camera to a tiny device with ever increasing imaging quality, we are exploring a radically different approach to imaging. We believe there are numerous applications for cameras that are large in format but very thin and highly flexible.”
If such an imaging system could be manufactured cheaply, like a roll of plastic or fabric, it could be wrapped around all kinds of things, from street poles to furniture, cars, and even people’s clothing, to capture wide, seamless images with unusual fields of view. This design could also lead to cameras the size of a credit card that a photographer could simply flex to control its field of view.
The new “flex-cam” requires two technologies: a flexible detector array and a thin optical system that can project a high-quality image onto the array. One approach would be to attach a rigid lens with a fixed focal length to each detector on the flexible array. In this case, however, bending the camera would result in “gaps” between the fields of view of adjacent lenses. This would cause the captured image to have missing information, or appear “aliased.”
To solve this problem, the Columbia Engineering team developed an adaptive lens array made of elastic material that enables the focal length of each lens in the sheet camera to vary with the local curvature of the sheet in a way that mitigates aliasing in the captured images. This inherent optical adaptation of the lens is passive, avoiding the use of complex mechanical or electrical mechanisms to independently control each lens of the array.
The researchers arrived at their passively adaptive lens array by optimizing its geometry and material properties. They fabricated their prototype lens array using silicone and demonstrated its ability to produce high image quality over a wide range of deformations of the sheet camera. The research was conducted in Nayar’s Computer Vision Laboratory and was funded by the Office of Naval Research.
“The adaptive lens array we have developed is an important step towards making the concept of flexible sheet cameras viable,” Nayar says. “The next step will be to develop large-format detector arrays to go with the deformable lens array. The amalgamation of the two technologies will lay the foundation for a new class of cameras that expand the range of applications that benefit from imaging.”