It comes down to privacy — biomedical research can’t proceed without human genomic data sharing, and genomic data sharing can’t proceed without some reasonable level of assurance that de-identified data from patients and other research participants will stay de-identified after they’re released for research.
Data use agreements that carry penalties for attempted re-identification of participants may be a deterrent, but they’re hardly a guarantee of privacy. Genomic data can be partially suppressed as they’re released, addressing vulnerabilities and rendering individual records unrecognizable, but suppression quickly spoils a data set’s scientific usefulness.
A new study from Vanderbilt University presents an unorthodox approach to re-identification risk, showing how optimal trade-offs between risk and scientific utility can be struck as genomic data are released for research.
The study appears in the American Journal of Human Genetics.
Doctoral candidate Zhiyu Wan, Bradley Malin, Ph.D., and colleagues draw on game theory to simulate the behavior of would-be data privacy adversaries, and show how marrying data use agreements with a more sensitive, scalpel-like data suppression policy can provide greater discretion and control as data are released. Their framework can be used to suppress just enough genomic data to persuade would-be snoops that their best privacy attacks will be unprofitable.
“Experts in the privacy field are prone to assume the worst-case scenario, an attacker with unlimited capability and no aversion to financial losses. But that may not happen in the real world, so you would tend to overestimate the risk and not share anything,” Wan said. “We developed an approach that gives a better estimate of the risk.”
Malin agrees that failure to come to grips with real-world risk scenarios could stifle genomic data sharing.
“Historically, people have argued that it’s too difficult to represent privacy adversaries. But the game theoretic perspective says you really just have to represent all the ways people can interact with each other around the release of data, and if you can do that, then you’re going to see the solution. You’re doing a simulation of what happens in the real world, and the question just becomes whether you’ve represented the rules of the game correctly,” said Malin, associate professor of Biomedical Informatics, Biostatistics and Computer Science.
To date, no one has faced prosecution for attacking the privacy of de-identified genomic data. Privacy experts nevertheless assume a contest of computerized algorithms as de-identified data are released, with privacy algorithms patrolling the ramparts while nefarious re-identification algorithms try to scale them.
Re-identification attacks have occurred, but according to earlier research by Malin and colleagues, the perpetrators appear to be motivated by curiosity and academic advancement rather than by criminal self-interest. They’re sitting at computers just down the hall, so to speak, overpowering your data set’s de-identification measures, then publishing an academic paper saying just how they did it. It’s all very bloodless and polite.
The new study is something different, more tough-minded, situating data sharing and privacy algorithms in the real world, where people go to jail or are fined for violations. Here the envisaged privacy adversary doesn’t wear elbow patches, lacks government backing and is simply out to make a buck through the illicit sale of private information.
De-identified genotype records are linked to de-identified medical, biometric and demographic information. In what the study refers to as “the game,” the attacker is assumed already to have some named genotype data in hand, and will attempt to match this identified data to de-identified genotype records as study data are released.
To bring these prospective attackers out of the shadows, the authors present a detailed case study involving release of genotype data from some 8,000 patients. They painstakingly assign illicit economic rewards for the criminal re-identification of research data. Based on costs for generating data, they also assign economic value to the scientific utility of study data.
On the way to estimating risk and the attacker’s costs, the authors estimate the likelihood that any named individual genotype record already held by the attacker is included in the de-identified data set slated for release; according to the authors, this key estimate is often neglected in re-identification risk assessments.
The authors measure the utility of a study’s genomic data in terms of the frequencies of genetic variants: for a given variant, the greater the difference between its frequency in the study group and its frequency in the general population (based on available reference data), the greater its scientific utility. This approach to utility triumphed recently when Wan and Malin won the 2016 iDASH Healthcare Privacy Protection Challenge. Their winning algorithm proved best at preserving the scientific utility of a genomic data set while thwarting a privacy attack.
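The frequency-based utility idea can be sketched in a few lines. The function names and the exact scoring rule below are illustrative assumptions, not the authors' published formula; the core notion is simply that a variant whose study-group frequency diverges from the reference-population frequency carries more scientific information.

```python
# Sketch of the variant-frequency utility measure described above.
# Names and the scoring rule are illustrative, not the paper's formula.

def variant_utility(study_freq, population_freq):
    """Score one variant: the further its study-group frequency is from
    the reference-population frequency, the more useful it is."""
    return abs(study_freq - population_freq)

def dataset_utility(study_freqs, population_freqs):
    """Total utility of a released data set: sum of per-variant scores,
    counting only variants that are actually released (not suppressed)."""
    return sum(
        variant_utility(s, p)
        for s, p in zip(study_freqs, population_freqs)
        if s is not None  # None marks a suppressed variant
    )
```

Under this kind of measure, suppressing a variant to reduce re-identification risk directly subtracts its score from the data set's utility, which is the trade-off the game is designed to optimize.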
For any genomic data set, before any data are released in a game’s opening move, the sharer can use the game to compare various data sharing policies in terms of risk and utility. In the case study, the game theoretic policy provides the best payoff to the sharer, vastly outperforming a conventional data suppression policy and edging out a data use agreement policy.
No matter where parameters are set regarding illicit financial rewards or information that’s likely to be wielded by an attacker, the authors show that the game theoretic approach generally provides the best payoff to the sharer. They sketch how their approach could serve the release of data from other sources, including the federal government’s upcoming Precision Medicine Initiative.
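The structure of such a game can be illustrated with a toy payoff model. Everything below, including the payoff arithmetic and the parameter names, is an illustrative assumption rather than the study's actual model; the point is only that a rational attacker attacks when the expected profit is positive, and the sharer's payoff depends on anticipating that decision.

```python
# Toy sketch of the sharer-vs-attacker game described above. The payoff
# structure and all numbers are illustrative assumptions, not the study's model.

def attacker_profits(p_success, reward, attack_cost, p_caught, penalty):
    """A rational attacker attacks only when expected profit is positive."""
    return p_success * reward - attack_cost - p_caught * penalty > 0

def sharer_payoff(utility, privacy_loss, p_success, reward,
                  attack_cost, p_caught, penalty):
    """The sharer earns the data's scientific utility, minus an expected
    privacy loss that is incurred only if attacking is profitable."""
    if attacker_profits(p_success, reward, attack_cost, p_caught, penalty):
        return utility - p_success * privacy_loss
    return utility
```

A sharer can compare policies by simulating each one's effect on these parameters: data suppression lowers both `p_success` and `utility`, a data use agreement raises `p_caught` and `penalty`, and a game-theoretic policy tunes both until the attacker's best move is not to attack at all.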
Imagine a “DNA photocopier” small enough to hold in your hand that could identify the bacteria or virus causing an infection even before the symptoms appear.
This possibility is raised by a fundamentally new method for controlling a powerful but finicky process called the polymerase chain reaction. PCR was developed in 1983 by Kary Mullis, who received the Nobel Prize for his invention. It is generally considered one of the most important advances in the field of molecular biology because it can make billions of identical copies of small segments of DNA so they can be used in molecular and genetic analyses.
Vanderbilt University biomedical engineers Nicholas Adams and Frederick Haselton came up with an out-of-the-box idea, which they call adaptive PCR. It uses left-handed DNA (L-DNA) to monitor and control the molecular reactions that take place in the PCR process.
Left-handed DNA is the mirror image of the DNA found in all living things. It has the same physical properties as regular, right-handed DNA but it does not participate in most biological reactions. As a result, when fluorescently tagged L-DNA is added to a PCR sample, it behaves in an identical way to the regular DNA and provides a fluorescent light signal that reports information about the molecular reactions taking place and can be used to control them.
In order to test their idea, Adams and Haselton recruited Research Assistant Professor of Physics William Gabella to create a working prototype of an adaptive PCR machine and then they tested it extensively with the assistance of biomedical engineering undergraduate Austin Hardcastle.
The technique and the test results are described in the paper “Adaptive PCR Based on Hybridization Sensing of Mirror-Image L-DNA,” published in the journal Analytical Chemistry.
Although the technology is generally considered to be mature, PCR machines have proven to be complicated to operate and hypersensitive to small variations in the chemical composition of samples and environmental conditions. That is largely because there has been no direct way to monitor what is taking place at the molecular level.
As a result, the adaptive approach for controlling the PCR process promises to make the machines simpler to operate, improve their reliability, reduce their sensitivity to environmental conditions and shrink them from desktop to handheld size. That could free PCR from the laboratory and allow it to work in the field or at the bedside, where it could be used to identify different diseases by their DNA signatures.
The difficulty lab technicians have operating current PCR technology was captured by Ernesto Llamas, a Ph.D. student at the Center for Research In Agricultural Genomics in Barcelona, who draws cartoons about science on his Sketching Science Facebook page. In a cartoon titled, “PCR Protocol,” he shows a researcher running a sample through a PCR machine. The panels are titled “get the reagents; prepare the mix; set up conditions; analyze the gel; negative result; cry.”
“PCR machines are pretty finicky,” said Adams, giving an example: “We have three commercial PCR machines in our lab. For a while one of them wasn’t working. When we put identically prepared samples in all three machines, two of them worked and one didn’t. As I was discussing this problem on the phone with one of the company’s technicians, she asked me if the problem machine was within eight inches of a wall. It turned out it was. According to the technician, the wall was interfering with the air flow to the machine. She was right: when I moved it away from the wall, it began working properly!”
Laboratory technicians have found ways to compensate for these problems. They keep the machines in temperature-controlled rooms and purify the DNA samples so they have a uniform chemical composition. Even so, it can take operators several weeks to optimize the machines for samples from new sources. And even when the machines are optimized, technicians run samples in triplicate, just in case one fails.
To appreciate Adams and Haselton’s innovation, first you need to understand how PCR works.
There are five core “ingredients” required for PCR: a DNA template to be copied; primers, short stretches of DNA that initiate the PCR reaction that are designed to bind to either side of the section of DNA that you want to copy; DNA nucleotide bases, the building blocks of DNA that are needed to build new strands of DNA; DNA polymerase, a special enzyme that builds new DNA strands; and a buffer that provides the proper chemical conditions for the reaction.
A sample containing these ingredients is first heated almost to its boiling point, the temperature at which double-stranded DNA separates into two single strands: a process called denaturing. Next, the sample is cooled to a temperature where the primers attach to the single strands: a process called annealing. Once the annealing is complete, the temperature begins to rise and the DNA polymerase automatically begins making new strands of double-stranded DNA using the single strands as templates: a process called elongation. The end result is two exact copies of the original double-stranded DNA.
When the process is complete, the cycle repeats: The temperature ramps up again to near the boiling point, causing the two double-stranded DNA copies to separate into four single strands, which act as templates. In this fashion, two copies of DNA become four, four become eight and so on. The cycle is repeated as many as 30 to 40 times, producing more than a billion exact copies of the original DNA segment.
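The doubling arithmetic above is easy to check: each cycle ideally doubles the copy count, so 30 cycles of a single template exceed a billion copies. The function below is a minimal sketch of that ideal-case calculation (the `efficiency` parameter is an illustrative addition, since real reactions copy less than 100% of strands per cycle).

```python
# Ideal-case PCR amplification: each cycle multiplies the copy count
# by (1 + efficiency), where efficiency = 1.0 is perfect doubling.

def pcr_copies(templates, cycles, efficiency=1.0):
    """Copy count after a given number of PCR cycles.
    `efficiency` (0..1) is the fraction of strands copied per cycle;
    real reactions fall short of the ideal 1.0."""
    return templates * (1 + efficiency) ** cycles
```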
Temperature and chemical composition are the critical factors in the PCR process. The requirement to heat the sample to precise temperatures has led to PCR machines with sophisticated heating systems and heavy insulation. Nevertheless, small changes in air circulation and room temperature can throw them out of kilter. Similarly, variations in the chemical composition of the samples can significantly change the temperatures required for the various steps. For example, differences in levels of salts, sugars and alcohols – chemicals often used in sample preparation – all can alter the temperature at which the DNAs attach and dissociate. As a result, technicians must go through an extensive calibration process to optimize the chemicals and cycle temperatures, and then precisely maintain those conditions for each new sample.
Adaptive PCR sidesteps all these variables by relying on the fluorescent L-DNA to determine the ideal cycle temperatures for annealing and denaturing. L-DNA sequences are commercially available. So the first step is to order L-DNA with the same sequence as the right-handed DNA that you want to amplify along with left-handed primers. The L-DNAs are ordered with a fluorescent dye on one strand and a “quencher” on the other strand. The quencher suppresses the fluorescence of the dye. So, as the L-DNA strands separate in the denaturing step, the quencher and dye also separate which causes the fluorescence level in the sample to increase. By analyzing the rate of change of the fluorescent level, a microprocessor can determine when virtually all of the DNA has separated.
Similarly, a dye quencher is attached to the left-handed primers. So as the process moves into the annealing step and the primers attach to the L-DNA strands, the quenchers they carry begin suppressing the fluorescent dye on the L-DNA. This provides a dimming signal that can be analyzed to identify the point when the primers are attached to virtually all the DNA strands. The amount of L-DNA in the sample remains constant from cycle to cycle because it does not participate in the amplification step.
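The control logic this implies can be sketched simply: the microprocessor watches the fluorescence trace and declares a step finished once the signal levels off, meaning virtually all strands have separated (or all primers have annealed). The window size and tolerance below are illustrative assumptions, not values from the paper.

```python
# Sketch of the adaptive-PCR control idea described above: switch phases
# when the L-DNA fluorescence signal plateaus. Window size and tolerance
# are illustrative assumptions.

def reaction_complete(trace, window=3, tolerance=0.01):
    """Return True once the last `window` changes in the fluorescence
    trace are all smaller than `tolerance`, i.e. the signal has leveled off."""
    if len(trace) < window + 1:
        return False
    recent = trace[-(window + 1):]
    return all(abs(b - a) < tolerance for a, b in zip(recent, recent[1:]))
```

Because the controller reacts to the measured plateau rather than to a pre-calibrated temperature, the same logic works regardless of how salts, sugars or alcohols in the sample have shifted the melting and annealing temperatures.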
The researchers report that experiments with the prototype system have demonstrated that the technique duplicates the results of conventional PCR machines in controlled conditions and can efficiently amplify DNA under conditions that cause standard PCR to fail. “These advantages have the potential to make PCR-based diagnostics more accessible outside of well-controlled laboratories, such as point-of-care and field settings that lack the resources to accurately control the reaction temperature or perform high quality sample preparation,” they write.
Vanderbilt University has applied for a patent on adaptive PCR and its representatives are in active discussions with several companies about licensing the technology.
Learn more: DNA duplicator small enough to hold in your hand
Novel Drug May Help Repair Failing Hearts
Cimaglermin, a new experimental drug, may help restore cardiac function after heart failure, according to a first-in-man study published today in JACC: Basic to Translational Science.
Heart failure, characterized by a loss of cardiac function, is among the leading causes of death worldwide. A significant portion of heart failure patients, particularly those with severe left ventricular systolic dysfunction, do not sufficiently respond to current medical therapy.
Researchers examined the safety and efficacy of a single infusion of cimaglermin, which acts as a growth factor for the heart, helping the structural, metabolic and contractile elements of the heart repair themselves following injury. The study enrolled 40 heart failure patients who had been on optimal medical therapy for at least three months prior to the trial. Compared to patients who received a placebo, patients who received a high dose of cimaglermin had a sustained increase in left ventricular ejection fraction, or pumping capacity, through 90 days after dosing, with the maximum increase reached at day 28.
“These findings support continued clinical development of the investigational drug cimaglermin, including further safety evaluations and detailing the potential improvement on clinical heart failure outcome measures,” said Daniel J. Lenihan MD, from the division of cardiovascular medicine at Vanderbilt University and the lead author of the study. “As with all experimental therapeutics, additional studies will be required and subject to regulatory review to determine if the relative risks and benefits of cimaglermin warrant approval.”
The most common side effects were headache and nausea, which were temporally associated with exposure to the drug. One patient receiving the highest planned dose of cimaglermin experienced an adverse reaction that met the stopping criteria of Food and Drug Administration guidance for drug-induced liver injury.
Limitations of this study include the small sample size and the fact that patients only received a single infusion rather than multiple doses.
“Although the results of the study must be regarded as provisional because of the small numbers of patients, the results of this study are nonetheless very exciting,” said Douglas L. Mann, MD, FACC, editor-in-chief of JACC: Basic to Translational Science. “Instead of blocking the fundamental mechanisms that lead to cardiac injury, the early results with cimaglermin suggest that it may also be possible to administer therapeutics that allow the failing heart to repair itself using its own repair mechanisms. If the results of this study can be replicated and translated into improvements in clinical outcomes in larger numbers of patients in phase II and III clinical trials, it will represent a paradigm shift in the way in which clinicians treat patients with heart failure.”
Learn more: Novel Drug May Help Repair Failing Hearts
In a new study, Vanderbilt pharmacologist Jerod Denton, Ph.D., Ohio State entomologist Peter Piermarini, Ph.D., and colleagues report an experimental molecule that inhibits kidney function in mosquitoes and thus might provide a new way to control the deadliest animal on Earth.
The investigators aim their inhibitor, named VU041, at the mosquito Anopheles gambiae, the leading vector for malaria, and Aedes aegypti, a mosquito that transmits Zika virus and other pathogens. The study appears in the journal Scientific Reports.
Over several decades of exposure, mosquitoes have evolved genetic resistance to various insecticides that attack their nervous system. The new study shows for the first time that inducing kidney failure — or, more correctly, Malpighian tubule failure — in mosquitoes can circumvent resistance to conventional insecticides.
“We’re essentially preventing mosquitoes from producing urine after they take a blood meal,” said Denton, associate professor of Anesthesiology and Pharmacology.
According to Denton, in taking a blood meal mosquitoes can double or even triple their body weight.
Besides providing nutrients, blood meals carry toxic salts; the potassium chloride lurking in red blood cells, if not quickly voided, can depolarize cell membrane potentials and kill straightaway.
“So they’ve evolved a rapid diuretic process to very quickly separate the salt water from all the nutrients that they need for egg development. A lot of people don’t realize that mosquitoes have kidneys, and when they take a blood meal from you they also urinate on you almost simultaneously.
“What our compounds do is stop urine production, so they swell up and can’t volume regulate, and in some cases they just pop,” he said.
Conventional mosquitocides cause death of males and females at all stages of mosquito development, and in doing so exert considerable selective pressure for the development of genetic resistance.
“By targeting blood feeding female mosquitoes, we predict that there will be less selective pressure for the emergence of resistant mutations,” Denton said.
The investigators show VU041 to be effective when applied topically, which indicates that it potentially could be adapted as a sprayed insecticide. They also show that it doesn’t harm honeybees.
Arrangements are underway to test VU041 in a spray formulation. If that’s successful, additional safety testing would be needed before deciding about commercial development, Denton said.
“Mood ring materials” could play an important role in minimizing and mitigating damage to the nation’s failing infrastructure.
The LASIR researchers are taking a different tack: incorporating into the material itself fluorescent nanoparticles that react to stress by changing their optical properties, creating a new kind of detection system that can monitor these structures efficiently and cost-effectively.
“So we need to somehow change the materials we are using so they illuminate these tiny cracks.” The team’s initial studies, published last April in the Proceedings of the SPIE Conference on Sensors and Smart Structures Technologies for Civil, Mechanical and Aerospace Systems, have determined that adding a tiny concentration of special nanoparticles to an optically clear polymer matrix produces a distinctive light signature that changes as the material is subjected to a broad range of compressive and tensile loads.
The Vanderbilt group isn’t the only research team using nanoparticles to create smart materials, but they have a special advantage.
The quantum dots the Vanderbilt team uses are unique because they emit white light, whereas other quantum dots emit light only at specific wavelengths.
“The end result is that the strength of the quantum dot emissions gives us a permanent record of the level of stress that a material has experienced,” said Brubaker.
In this fashion, the researchers have verified that the material can act as a new kind of strain gauge, permanently recording the cumulative stress experienced by the material to which it is applied.
In their initial experiments, the engineers have kept the loads relatively modest, under 1,250 pounds, well within the elastic limits that the materials can withstand without permanent damage.
As a result, the material must be shielded from external light.
“There is a lot we have to learn before we can create a smart material that is ready for real world applications, but all the signs are positive,” said Adams.
Researchers at Vanderbilt University Medical Center and Washington University School of Medicine in St. Louis have isolated a human monoclonal antibody that in a mouse model “markedly reduced” infection by the Zika virus.
The antibody, called ZIKV-117, also protected the fetus in pregnant mice infected with the virus, the researchers reported today in the journal Nature. Zika is believed to cause microcephaly, unusually small heads, and other congenital malformations in children born to infected women.
Similar protection studies in primates are warranted, and if the findings hold up, ZIKV-117 could be developed as a protective antibody treatment for pregnant women at risk of Zika infection, the researchers concluded.
The findings may also aid efforts to develop an effective anti-Zika vaccine, said James Crowe Jr., M.D., director of the Vanderbilt Vaccine Center and co-corresponding author of the paper with Michael S. Diamond, M.D., Ph.D., at Washington University.
“These naturally occurring human antibodies isolated from humans represent the first medical intervention that prevents Zika infection and damage to fetuses,” said Crowe, who also is Ann Scott Carell Professor in the Departments of Pediatrics and Pathology, Microbiology & Immunology in the Vanderbilt University School of Medicine.
“We’re excited because the data suggests we may have antibody treatments in hand that could be developed for use in pregnant women,” he said.
“The remarkable potency and breadth of inhibition by ZIKV-117 has great promise,” Diamond said, “as it was able to inhibit infection by strains from both Africa and America in cell culture and in animals, including during pregnancy.”
Diamond is associate director of The Andrew M. and Jane M. Bursky Center for Human Immunology & Immunotherapy Programs at Washington University.
Zika is a mosquito-borne virus that has emerged as a global public health threat. In addition to its association with congenital birth defects, Zika has been linked to Guillain-Barre syndrome, a neurological disorder that can lead to paralysis and death.
Since a major outbreak was reported in Brazil last year, Zika infections transmitted by mosquitoes have been reported throughout Africa, Asia, the Pacific, and the Americas, including Miami-Dade County, Florida.
During the past 15 years, Crowe and his colleagues have developed a high-efficiency method for isolating human monoclonal antibodies that can neutralize a wide range of viruses, from Ebola to HIV.
The Crowe and Diamond laboratories have collaborated recently on several projects including the generation of protective human monoclonal antibodies against Dengue, West Nile, Chikungunya and now Zika viruses.
Monoclonal antibodies are made from a single clone of B cells, a type of white blood cell, that have been fused to myeloma (cancer) cells to form fast-growing “hybridomas.” This allows researchers to quickly generate large quantities of antibodies against specific viral targets.
In the current study, the researchers isolated antibodies from the blood of people who’d been previously infected with the Zika virus in different parts of the world. The antibodies reacted to the envelope or “E” protein on the surface of the virus.
The researchers then generated a variety of monoclonal antibodies. In cell culture studies, they identified one, ZIKV-117, which broadly neutralized several different strains of the virus. In mice infected by the Zika virus, injection of the antibody markedly reduced disease and mortality, and reduced transmission from mother to fetus.
Advance Could Also Work Against Other Viruses
In research published online today in Science, a team of scientists describe a new therapeutic strategy to target a hidden Achilles’ heel shared by all known types of Ebola virus. Two antibodies developed with this strategy blocked the invasion of human cells by all five ebolaviruses, and one of them protected mice exposed to lethal doses of Ebola Zaire and Sudan, the two most dangerous. The team included scientists from Albert Einstein College of Medicine, U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID), Integrated Biotherapeutics, Vanderbilt University Medical Center, and The Scripps Research Institute.
Ebola viruses cause a highly fatal disease for which no approved vaccines or treatments are available. About two dozen Ebola outbreaks have been documented since 1976, when infections first occurred in villages along the Ebola River in Africa. The largest outbreak in history, the 2014–2015 West Africa epidemic, caused more than 11,000 deaths and infected approximately 29,000 people.
Monoclonal antibodies, which bind to and neutralize specific pathogens and toxins, have emerged as the most promising treatments for Ebola patients. A critical problem, however, is that most antibody therapies target only one specific ebolavirus. For example, the most promising experimental therapy—ZMapp™, a cocktail of three monoclonal antibodies—is specific for Ebola virus Zaire, and doesn’t work against the other two viruses (Sudan and Bundibugyo), which have both caused major outbreaks. The broad-spectrum antibodies developed by the research team represent an important advance against one of the world’s most dangerous pathogens.
Exploiting Ebola’s Achilles’ Heel
In 2011, a team that included co-senior authors Kartik Chandran, Ph.D., professor of microbiology & immunology at Einstein, and John M. Dye, Ph.D., chief of viral immunology at USAMRIID, discovered that all filoviruses (the family to which ebolaviruses and the more distantly related Marburg virus belong) have an Achilles’ heel: To infect and multiply in human cells, they must all bind to a host-cell protein called Niemann-Pick C1 (NPC1).
But capitalizing on that knowledge required a completely new approach to targeting viruses: exploiting the fact that Ebola and many other viruses must enter host cell compartments called lysosomes. Once safely inside the lysosomes, the viruses transform and expose key portions of their exterior that the research team successfully targeted using monoclonal antibodies.
To gain entry to cells, filoviruses bind to the host cell’s outer membrane via glycoproteins (proteins to which carbohydrate chains are attached) that bristle from the virus’s surface. (See illustration.) A portion of the cell membrane then surrounds the virus and pinches off, eventually developing into a lysosome—a membrane-bound, intracellular compartment filled with enzymes to digest foreign and cellular components.
Filoviruses then use the host cells’ resources to break out of their lysosomal “prisons” so they can enter the host cell’s cytoplasm to multiply. Enzymes in the lysosome slice a “cap” from the virus’s glycoproteins, unveiling a site that binds to the NPC1 embedded in the lysosome membrane. NPC1, which normally helps transport cholesterol within the cell, offers Ebola virus its only means of escaping the lysosome and multiplying. By fitting its protein “key” into the NPC1 “lock,” the virus fuses itself to the lysosome membrane. (See illustration close-up.) Now the virus can propel its RNA from the lysosome and into the cell’s cytoplasm, where it can finally replicate itself.
Penetrating an Invisibility Cloak
The research team realized that monoclonal antibodies could potentially thwart all filovirus infections by neutralizing the viral protein that binds to NPC1, or by neutralizing NPC1 itself. There was just one problem: Reflecting Ebola’s ingenuity, both targets reside only in lysosomes deep within cells—making them invisible to the immune system and shielded from attack by conventional antibodies.
Dr. Chandran, Dr. Dye and co-senior author Jonathan R. Lai, Ph.D., associate professor of biochemistry at Einstein and an expert in engineering antibodies, devised a clever “Trojan Horse” strategy for overcoming the virus’s invisibility cloak: Just as the citizens of Troy unwittingly pulled a wooden horse filled with Greek soldiers into their walled city, they tricked the viruses into carrying the means of their own destruction along with them into host cells.
To do so, the research team synthesized two types of “bispecific” antibodies, each consisting of two monoclonal antibodies combined into one molecule. One bispecific antibody was devised to neutralize the viral protein that binds to NPC1, the other to target NPC1. Both had one monoclonal antibody in common: antibody FVM09, which binds to the surface glycoproteins of all ebolaviruses while the virus is outside cells, allowing the bispecific antibodies to hitch a ride with the virus into the lysosome. FVM09 was developed by co-senior author M. Javad Aman, Ph.D., at Integrated Biotherapeutics.
Once in the lysosome, the bispecific antibodies are released from the viral surface when enzymes in the lysosome slice off the glycoprotein caps—allowing the business ends of the bispecific antibodies to swing into action.
One bispecific antibody combined FVM09 with antibody MR72, which was isolated from a human survivor of Marburg virus infection by co-senior author James E. Crowe Jr., M.D., director of the Vanderbilt Vaccine Center. MR72 targets the NPC1-binding viral protein that is unveiled by all filoviruses in lysosomes. The second bispecific antibody links FVM09 to antibody mAb-548, developed at Einstein, which zeroes in on NPC1. With one bispecific antibody targeting the “lock” (NPC1) and the other targeting the “key” (the virus’s NPC1-binding protein), both had the potential for preventing Ebola virus from interacting with NPC1 and escaping from the lysosome into the cytoplasm.
Putting Antibodies to the Test
The researchers then tested their bispecific antibodies against ebolaviruses in the lab. They initially used a harmless virus (vesicular stomatitis virus) that had been genetically engineered to display glycoproteins from all five ebolaviruses on its surface. The researchers incubated the bispecific antibodies with the Ebola-like viruses and then added the mixtures to human cells in tissue culture. Both bispecific antibodies successfully neutralized all five viruses. Work in the high-containment facilities at USAMRIID confirmed that these antibodies also blocked infection by the actual Zaire, Sudan, and Bundibugyo ebolaviruses.
Next came studies at USAMRIID to test whether the two bispecific antibodies could protect mice infected with the two most dangerous ebolaviruses, Zaire and Sudan. Researchers, led by Dr. Dye, administered the bispecific antibodies two days after mice were exposed to a lethal dose of virus.
The bispecific antibody that targeted the viral binding protein provided good protection to mice exposed to both viruses. As expected, the bispecific antibody that targeted NPC1 did not protect mice. It was designed to bind specifically to human NPC1, which differs slightly in structure from the NPC1 protein found in mice.
As a next step, both bispecific antibodies will need to be tested in nonhuman primates, the current gold standard for anti-Ebola therapeutics.
A little spark for sharper sight
Stimulating the visual cortex of the brain for 20 minutes with a mild electrical current can improve vision for about two hours, and those with worse vision see the most improvement, according to a Vanderbilt University study published this week in Current Biology.
“It’s actually a very simple idea,” said co-author Geoff Woodman, associate professor of psychology. “This kind of stimulation can improve cognitive processing in other brain areas, so if we stimulate the visual system, could we improve processing? Could we make someone’s vision better—not at the level of the eye, like Lasik or glasses, but directly at the level of the brain?”
Twenty young, healthy subjects with normal or near-normal vision were asked to evaluate the relative position of two identical vertical lines and judge whether they were perfectly aligned or offset. The test is more sensitive than a standard eye chart, and gave the researchers a very precise measurement of each subject’s visual acuity.
The researchers then passed a very mild electric current through the area at the back of the brain that processes visual information. After 20 minutes, the subjects were asked to perform the test again, and about 75 percent showed measurable improvement following the brain stimulation.
Researchers are one step closer to understanding the genetic and biological basis of diseases like cancer, diabetes, Alzheimer’s and rheumatoid arthritis – and identifying new drug targets and therapies – thanks to work by three computational biology research teams from the University of Arizona Health Sciences, University of Pennsylvania and Vanderbilt University.
The researchers’ findings – a method demonstrating that independent DNA variants linked to a disease share similar biological properties – were published online in the April 27 edition of npj Genomic Medicine.
“The discovery of these shared properties offers the opportunity to broaden our understanding of the biological basis of disease and identify new therapeutic targets,” said Yves A. Lussier, MD, FACMI, lead and senior corresponding author of the study and UAHS associate vice president for health sciences and director of the UAHS Center for Biomedical Informatics and Biostatistics (CB2).
The researchers are striving to better understand the common genetic and biological backgrounds that make certain people susceptible to the same disease. They have developed a method to demonstrate how individual, disease-associated DNA variants share similar biological properties that provide a road map for disease origin.
Over the last ten years, genetics researchers have conducted large studies, called genome-wide association studies (GWAS), which analyze DNA variants across thousands of human genomes to identify those that are more frequent in people with a disease. However, the impact of many of these disease-associated variants on the function and regulation of genes remains elusive, making clinical interpretation difficult.
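At its core, the GWAS approach described above asks, for each DNA variant, whether one allele is statistically more frequent in people with the disease than in healthy controls. A minimal sketch of that comparison is shown below, using a hand-rolled 2x2 chi-square test; all allele counts are hypothetical and chosen only for illustration, not drawn from the study.

```python
# Minimal sketch of a single-variant GWAS association test: compare
# allele counts between disease cases and healthy controls with a
# 2x2 chi-square test. All counts below are made up for illustration.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n,  # expected count for cell a
        (a + b) * (b + d) / n,  # expected count for cell b
        (c + d) * (a + c) / n,  # expected count for cell c
        (c + d) * (b + d) / n,  # expected count for cell d
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical allele counts (risk allele vs. other allele):
cases_risk, cases_other = 620, 380        # risk allele enriched in cases
controls_risk, controls_other = 510, 490  # closer to 50/50 in controls

stat = chi_square_2x2(cases_risk, cases_other, controls_risk, controls_other)
print(round(stat, 2))  # → 24.62, well above the ~3.84 threshold (p < 0.05, 1 df)
```

A real GWAS repeats this test (or a regression-based equivalent) across millions of variants and thousands of genomes, which is why stringent genome-wide significance thresholds are used; the point here is only the shape of the per-variant comparison.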
A method to explore the biological impact of these variants and how they are linked to disease was developed through the collaboration of bioinformatics and systems biology researchers Dr. Lussier; Haiquan Li, PhD, research associate professor and director for translational bioinformatics, Department of Medicine, UA College of Medicine – Tucson; Ikbel Achour, PhD, director for precision health, CB2; Jason H. Moore, PhD, director, Institute for Biomedical Informatics, Perelman School of Medicine, University of Pennsylvania; and Joshua C. Denny, MD, MS, FACMI, associate professor of biomedical informatics and medicine, Vanderbilt University, along with their teams.
In their new paper, the researchers demonstrate that DNA risk variants can affect biological activities such as gene expression and cellular machinery, which together provide a more comprehensive picture of disease biology. When DNA risk variants for a given disease were analyzed in combination, similar biological activities were discovered, suggesting that distinct risk variants can affect the same or shared biological functions and thus cause the same disease. More detailed analyses of variants linked to bladder cancer, Alzheimer’s disease and rheumatoid arthritis showed that two variants can contribute to disease independently, but also interact genetically. Therefore, the precise combination of DNA variants of a patient may work to increase or decrease the relative risk of disease.