Scientists from the University of Utah and University of Washington have developed blueprints that instruct human cells to assemble a virus-like delivery system that can transport custom cargo from one cell to another. As reported online in Nature on Nov. 30, the research is a step toward a nature-inspired means for delivering therapeutics directly to specific cell types within the body.
“We’re shifting our perception from viruses as pathogens, to viruses as inspiration for new tools,” says Wesley Sundquist, Ph.D., co-chair of the Department of Biochemistry at the University of Utah School of Medicine. He is also co-senior author on the study with Neil King, Ph.D., an assistant professor at the Institute for Protein Design at the University of Washington.
The carefully designed instructions set forth a series of self-propelled events that mimic how some viruses transfer their infectious contents from one cell to the next.
From the blueprints tumbled out self-assembling, soccer ball-shaped “nanocages”, the structure of which was reported previously. Adding specific pieces of genetic code from viruses caused the nanocages to be packaged within cell membranes and then exported from cells. Like a shuttle leaving Earth to bring goods to a space station, the tiny capsules undocked from one cell, traveled to another and docked there, emptying their contents upon arrival.
In this case, the protective nanocages carried cargo that the scientists used like homing beacons to track the vessels’ journeys. Next steps are to design nanocages that hold drugs or other small molecules.
“We are now able to accurately and consistently design new proteins with tailor-made structures,” says King. “Given the remarkably sophisticated and varied functions that natural proteins perform, it’s exciting to consider the possibilities that are open to us.”
The researchers’ decision to model the microscopic shipping system after viruses was no accident. Viruses have honed their skills to effectively spread their infectious wares to large numbers of cells. Decades of research, including in-depth investigations of the human immunodeficiency virus (HIV) by Sundquist’s team, have led to an understanding of how the pathogens accomplish this goal with such efficiency.
A test of whether you truly understand something is to build it yourself. And that’s what Sundquist and King’s teams have done here. “The success of our system is the first formal proof that this is how virus budding works,” remarks Sundquist.
Viruses taught them that such a delivery system must include three essential properties: the ability to grasp membranes, to self-assemble, and to be released from cells. Introducing coding errors into any one of those steps brought shipments to a halt.
“I was sure that this would need fine-tuning but it was clean from the very beginning,” says lead author Jörg Votteler, Ph.D., a postdoctoral fellow in biochemistry at the University of Utah. When electron microscopist David Belnap, Ph.D. saw that images of the cages aligned closely with computer models, he knew they had made what they set out to design. “When it’s right, you know it,” he says.
The system could be modified as long as the three basic tenets were left intact. For example, the scientists could swap in differently shaped cages, or cause another type of membrane to surround them. Modularity means the vessels can be customized for various applications.
This study is proof of principle that the system works, but more needs to be done before it can be applied therapeutically. Researchers will need to determine whether the capsules can navigate long journeys within living animals, for instance, and whether they can deliver medicines in sufficient quantities.
“As long as we keep pushing knowledge forward we can guarantee there will be good outcomes, though we can’t guarantee what or when,” says Sundquist.
From Harry Potter’s Cloak of Invisibility to the Romulan cloaking device that rendered their warship invisible in “Star Trek,” the magic of invisibility was only the product of science fiction writers and dreamers.
But University of Utah electrical and computer engineering associate professor Rajesh Menon and his team have developed a cloaking device for microscopic photonic integrated devices — the building blocks of photonic computer chips that run on light instead of electrical current — in an effort to make future chips smaller, faster and consume much less power.
Menon’s discovery was published online Wednesday in the latest edition of the science journal, Nature Communications. The paper was co-written by University of Utah doctoral student Bing Shen and Randy Polson, senior optical engineer in the U’s Utah Nanofab.
The future of computers, data centers and mobile devices will involve photonic chips in which data is shuttled around and processed as light photons instead of electrons. The advantages of photonic chips over today’s silicon-based chips are that they will be much faster and will consume less power, and therefore give off less heat. And inside each chip are potentially billions of photonic devices, each with a specific function, in much the same way that billions of transistors have different functions inside today’s silicon chips. For example, one group of devices would perform calculations, another would perform certain processing, and so on.
The problem, however, is that if two of these photonic devices are too close to each other, they will not work because the light leakage between them will cause “crosstalk,” much like radio interference. If they are spaced far enough apart to solve this problem, you end up with a chip that is much too large.
So Menon and his team discovered that you can put a special nanopatterned silicon-based barrier between two of the photonic devices, which acts like a “cloak” and prevents one device from seeing the other.
“The principle we are using is similar to that of the Harry Potter invisibility cloak,” Menon says. “Any light that comes to one device is redirected back as if to mimic the situation of not having a neighboring device. It’s like a barrier — it pushes the light back into the original device. It is being fooled into thinking there is nothing on the other side.”
Consequently, billions of these photonic devices can be packed into a single chip for even more functionality. And since these photonic chips use light photons instead of electrons, which build up heat, to transfer data, these chips potentially could consume 10 to 100 times less power, which would be a boon for places like data centers that use tremendous amounts of electricity.
Menon believes the most immediate application for this technology and for photonic chips in general will be for data centers similar to the ones used by services like Google and Facebook. According to a study from the U.S. Department of Energy’s Lawrence Berkeley National Laboratory, data centers just in the U.S. consumed 70 billion kilowatt hours in 2014, or about 1.8 percent of total U.S. electricity consumption. And that power usage is expected to rise another 4 percent by 2020.
“By going from electronics to photonics we can make computers much more efficient and ultimately make a big impact on carbon emissions and energy usage for all kinds of things,” Menon says. “It’s a big impact and a lot of people are trying to solve it.”
Currently, photonic devices are used mostly in high-end military equipment, and he expects full photonic-based chips will be employed in data centers within a few years.
Utah engineers develop process for electronic devices that stops wasteful power leakage
According to the Natural Resources Defense Council, Americans waste up to $19 billion annually in electricity costs due to “vampire appliances,” always-on digital devices in the home that suck power even when they are turned off.
But University of Utah electrical and computer engineering professor Massood Tabib-Azar and his team of engineers have come up with a way to produce microscopic electronic switches for appliances and devices that can grow and dissolve wires inside the circuitry to instantly connect and disconnect electrical flow. With this technology, consumer products such as smartphones and laptops could run at least twice as long on a single battery charge, and newer all-digital appliances such as televisions and video game consoles could be much more power efficient.
Tabib-Azar’s research was published this week in Solid State Electronics. The paper was co-authored by Intel engineer Pradeep Pai, Omnivision Technologies engineer Yuying Zhang and IM Flash engineer Nurunnahar Islam Mou.
To operate different functions, all electronics have switches that instantaneously turn electrical flow on and off throughout the circuitry, much like turning a light switch on and off. But unlike a mechanical switch, these solid-state switches waste small doses of electricity while they are in a waiting state.
“Whenever they are off, they are not completely off, and whenever they are on, they may not be completely on,” says Tabib-Azar, who also is a professor with the Utah Science Technology and Research (USTAR) initiative. “That uses battery life. It heats up the device, and it’s not doing anything for you. It’s completely wasted power.”
Tabib-Azar and his team have devised a new kind of switch for electronic circuits that uses solid electrolytes such as copper sulfide to literally grow a wire between two electrodes when an electrical current passes through them, turning the switch on. When you reverse the polarity of the electrical current, then the metallic wire between the electrodes breaks down—leaving a gap between them—and the switch is turned off. A third electrode is used to control this process of growing and breaking down the wire.
“The distance between the two electrodes where the wire is grown can be as little as a nanometer long, which is as thin as 1/100,000 of the diameter of a hair,” Tabib-Azar says.
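The grow-and-dissolve behavior described above can be pictured as a simple state machine. The Python sketch below is a toy model for illustration only, not the team’s actual device physics: a positive bias on the control electrode grows the filament (switch on), a negative bias dissolves it (switch off), and an open gap passes essentially no leakage current.

```python
class ElectrolyteSwitch:
    """Toy model of a three-terminal electrochemical switch: a forward
    bias grows a metal filament between two electrodes (on state);
    reversing the polarity dissolves the filament (off state)."""

    def __init__(self):
        self.filament_formed = False  # no wire between the electrodes yet

    def apply_gate_bias(self, polarity):
        """polarity > 0 grows the filament; polarity < 0 dissolves it."""
        if polarity > 0:
            self.filament_formed = True
        elif polarity < 0:
            self.filament_formed = False

    def conducts(self):
        # Unlike a solid-state transistor in its waiting state, an open
        # filament gap in this model leaks nothing at all.
        return self.filament_formed


sw = ElectrolyteSwitch()
sw.apply_gate_bias(+1)   # grow the wire: switch turns on
assert sw.conducts()
sw.apply_gate_bias(-1)   # reverse polarity, dissolve the wire: switch off
assert not sw.conducts()
```

The point of the model is the contrast with conventional switches: in the off state there is a physical gap rather than a merely depleted channel, so the leakage described earlier simply has no path.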
Consequently, billions of these switches could be built onto a computer processor or in solid-state memory chips such as the RAM in a laptop computer. In a smartphone, for example, this technology could be employed in the communications circuitry of the phone, which typically wastes battery power while it is in a state waiting to be used.
Besides better power efficiency, another advantage of this technology is that it would produce less heat in the appliance or device because less electrical current is constantly running through its circuitry. Heat buildup has especially been a problem with laptops and phones and can affect the reliability of components over time.
Tabib-Azar added that this process doesn’t require expensive retooling of manufacturing plants to implement it because these plants already use materials such as copper sulfide in the manufacturing of electronics.
Right now, the only disadvantage to this process is that it is slower than typical switches in regular silicon-based electronics because of the time it takes to grow and break down the wires. But Tabib-Azar expects that to improve as he and his researchers continue to optimize the process. He also said this technology could be used for devices where speed isn’t a priority but battery power is.
“In lots of applications you really don’t utilize the full speed of the silicon anyway,” he says. “Right now, the biggest problem to solve is reducing the power leakage and addressing the energy-efficiency issues.”
A team of physicians and laboratory scientists has taken a key step toward a cure for sickle cell disease, using CRISPR-Cas9 gene editing to fix the mutated gene responsible for the disease in stem cells from the blood of affected patients.
For the first time, they have corrected the mutation in a proportion of stem cells that is high enough to produce a substantial benefit in sickle cell patients.
Mark DeWitt, a researcher with UC Berkeley’s Innovative Genomics Initiative and first author of the new paper, explains the significance of the advance. (UC Berkeley video by Roxanne Makasdjian and Stephen McNally)
The researchers from UC Berkeley, UC San Francisco Benioff Children’s Hospital Oakland Research Institute (CHORI) and the University of Utah School of Medicine hope to re-infuse patients with the edited stem cells and alleviate symptoms of the disease, which primarily afflicts those of African descent and leads to anemia, painful blood blockages and early death.
“We’re very excited about the promise of this technology,” said Jacob Corn, a senior author on the study and scientific director of the Innovative Genomics Initiative at UC Berkeley. “There is still a lot of work to be done before this approach might be used in the clinic, but we’re hopeful that it will pave the way for new kinds of treatment for patients with sickle cell disease.”
In tests in mice, the genetically engineered stem cells stuck around for at least four months after transplantation, an important benchmark to ensure that any potential therapy would be lasting.
“This is an important advance because for the first time we show a level of correction in stem cells that should be sufficient for a clinical benefit in persons with sickle cell anemia,” said co-author Mark Walters, a pediatric hematologist and oncologist and director of UCSF Benioff Oakland’s Blood and Marrow Transplantation Program.
The results were reported in the Oct. 12 issue of the online journal Science Translational Medicine.
Sickle cell disease is a recessive genetic disorder caused by a single mutation in both copies of a gene coding for beta-globin, a protein that forms part of the oxygen-carrying molecule hemoglobin. This homozygous defect causes hemoglobin molecules to stick together, deforming red blood cells into a characteristic “sickle” shape. These misshapen cells get stuck in blood vessels, causing blockages, anemia, pain, organ failure and significantly shortened lifespan. Sickle cell disease is particularly prevalent in African Americans and the sub-Saharan African population, affecting hundreds of thousands of people worldwide.
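Because the disease comes down to a single-letter change, the correction itself is small enough to express in a few lines. The Python sketch below is illustrative only, not the team’s CRISPR protocol: the sickle mutation changes codon 6 of beta-globin from GAG (glutamate) to GTG (valine), and this toy function swaps it back in a short coding fragment (the fragment and numbering here are abbreviated for illustration).

```python
def correct_sickle_codon(seq, codon_index):
    """Replace the mutant GTG codon at codon_index with the healthy GAG.
    Returns the corrected sequence, or the sequence unchanged if the
    codon at that position is not the sickle variant."""
    start = codon_index * 3
    if seq[start:start + 3] == "GTG":
        return seq[:start] + "GAG" + seq[start + 3:]
    return seq


# Abbreviated beta-globin coding fragment; codon 6 (zero-based, counting
# the ATG start codon) reads GTG in the sickle variant:
mutant = "ATGGTGCACCTGACTCCTGTGGAG"
healthy = correct_sickle_codon(mutant, 6)
assert healthy == "ATGGTGCACCTGACTCCTGAGGAG"
```

In the real experiment the edit is made by CRISPR-Cas9 in hematopoietic stem cells, with all the delivery and efficiency challenges that string manipulation conveniently ignores.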
The goal of the multi-institutional team is to develop genome engineering-based methods for correcting the disease-causing mutation in each patient’s own stem cells to ensure that new red blood cells are healthy.
The shape of sickled blood cells causes them to get stuck in vessels and block the flow of blood, which can result in pain and infections in those with sickle cell disease. (Video courtesy of American Society of Hematology)
The team used CRISPR-Cas9 to correct the disease-causing mutation in hematopoietic stem cells — precursor cells that mature into red blood cells — isolated from whole blood of sickle cell patients. The corrected cells produced healthy hemoglobin, which mutated cells do not make at all.
Future pre-clinical work will require additional optimization, large-scale mouse studies and rigorous safety analysis, the researchers emphasize. Corn and his lab have joined with Walters, an expert in developing curative treatments such as bone marrow transplant and gene therapy for sickle cell disease, to initiate an early-phase clinical trial to test this new treatment within the next five years.
Notably, research groups might be able to apply the approach described in this study to develop treatments for other blood diseases such as β-thalassemia, severe combined immunodeficiency (SCID), chronic granulomatous disease, rare disorders like Wiskott-Aldrich syndrome and Fanconi anemia, and even HIV infection.
“Sickle cell disease is just one of many blood disorders caused by a single mutation in the genome,” Corn said. “It’s very possible that other researchers and clinicians could use this type of gene editing to explore ways to cure a large number of diseases.”
“There is a clear path for developing therapies for certain diseases,” said co-senior author Dana Carroll of the University of Utah, who co-developed one of the first genome editing techniques over a decade ago. “It’s very gratifying to see gene editing technology being brought to practical applications.”
The work is the fruit of the Innovative Genomics Initiative, a joint effort between UC Berkeley and UCSF that aims to correct DNA mutations that underlie human disease using CRISPR-Cas9, a pioneering technology co-developed by scientists at UC Berkeley that has made genome editing easier and more efficient than ever before.
The project also leverages the expertise of physicians and scientists at UCSF Benioff Children’s Hospital Oakland, a major center for research and treatment of sickle cell disease, and Carroll’s expertise in the field of genome engineering.
In addition to Corn, Walters and Carroll, other co-authors are Mark DeWitt, Nicolas Bray, Tianjiao Wang and Therese Mitros of UC Berkeley; Wendy Magis, Seok-Jin Heo, Denise Muñoz, Dario Boffelli and David Martin of CHORI; Jennifer Berman of Bio-Rad Laboratories in Pleasanton, California; and Fabrizia Urbinati and Donald Kohn of UCLA.
The research is supported by the National Institutes of Health, the Li Ka Shing Foundation, the Siebel Scholars Fund, the Jordan Family Fund and the Doris Duke Charitable Foundation.
Software may appear to operate without bias because it strictly uses computer code to reach conclusions. That’s why many companies use algorithms to help weed out job applicants when hiring for a new position.
But a team of computer scientists from the University of Utah, University of Arizona and Haverford College in Pennsylvania has discovered a way to find out whether an algorithm used for hiring decisions, loan approvals and comparably weighty tasks could be biased like a human being.
The researchers, led by Suresh Venkatasubramanian, an associate professor in the University of Utah’s School of Computing, have discovered a technique to determine if such software programs discriminate unintentionally and violate the legal standards for fair access to employment, housing and other opportunities. The team also has determined a method to fix these potentially troubled algorithms. Venkatasubramanian presented his findings Aug. 12 at the 21st Association for Computing Machinery’s SIGKDD Conference on Knowledge Discovery and Data Mining in Sydney, Australia.
“There’s a growing industry around doing résumé filtering and résumé scanning to look for job applicants, so there is definitely interest in this,” says Venkatasubramanian. “If there are structural aspects of the testing process that would discriminate against one community just because of the nature of that community, that is unfair.”
Many companies have been using algorithms in software programs to help filter out job applicants in the hiring process, typically because it can be overwhelming to sort through the applications manually if many apply for the same job. A program can do that instead by scanning résumés and searching for keywords or numbers (such as school grade point averages) and then assigning an overall score to the applicant.
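A filter of the kind described above can be sketched in a few lines of Python. The keywords, weights, and applicants below are invented for illustration; real screening systems are far more elaborate, but the scoring idea is the same.

```python
# Invented keyword weights for a hypothetical job posting.
KEYWORD_WEIGHTS = {"python": 3.0, "sql": 2.0, "manager": 1.5}
GPA_WEIGHT = 2.0

def score_resume(text, gpa):
    """Score a resume as weighted keyword hits plus a scaled GPA."""
    words = text.lower().split()
    keyword_score = sum(w for kw, w in KEYWORD_WEIGHTS.items() if kw in words)
    return keyword_score + GPA_WEIGHT * gpa


applicants = {
    "alice": ("Python and SQL developer", 3.8),
    "bob": ("retail manager", 3.1),
}

# Rank applicants from highest score to lowest.
ranked = sorted(applicants,
                key=lambda name: score_resume(*applicants[name]),
                reverse=True)
```

Nothing in such a scorer mentions race or gender, which is exactly why the bias question discussed next is subtle: the chosen keywords and weights can still correlate with protected attributes.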
These programs also can learn as they analyze more data. Known as machine-learning algorithms, they can change and adapt like humans so they can better predict outcomes. Amazon uses similar algorithms to learn the buying habits of customers and target ads more accurately, and Netflix uses them to learn users’ movie tastes when recommending new viewing choices.
But there has been a growing debate on whether machine-learning algorithms can introduce unintentional bias much like humans do.
“The irony is that the more we design artificial intelligence technology that successfully mimics humans, the more that A.I. is learning in a way that we do, with all of our biases and limitations,” Venkatasubramanian says.
Venkatasubramanian’s research determines if these software algorithms can be biased through the legal definition of disparate impact, a theory in U.S. anti-discrimination law that says a policy may be considered discriminatory if it has an adverse impact on any group based on race, religion, gender, sexual orientation or other protected status.
Venkatasubramanian’s research revealed that you can use a test to determine if the algorithm in question is possibly biased. If the test — which ironically uses another machine-learning algorithm — can accurately predict a person’s race or gender based on the data being analyzed, even though race or gender is hidden from the data, then there is a potential problem for bias based on the definition of disparate impact.
“I’m not saying it’s doing it, but I’m saying there is at least a potential for there to be a problem,” Venkatasubramanian says.
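The test described above can be sketched concretely: hide the protected attribute, then see whether a second learner can recover it from the remaining features. The Python sketch below is a stand-in, not the researchers’ actual method; it uses a tiny invented data set and a 1-nearest-neighbor “learner,” where high leave-one-out accuracy flags that the features act as a proxy for the hidden group.

```python
def predict_1nn(train_x, train_y, query):
    """Predict the hidden attribute of `query` from its nearest neighbor."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    best = min(range(len(train_x)), key=lambda i: dist(train_x[i], query))
    return train_y[best]

def recoverability(features, hidden_attr):
    """Leave-one-out accuracy of predicting the hidden attribute from
    the visible features. Values near 1.0 signal a potential problem."""
    hits = 0
    for i in range(len(features)):
        rest_x = features[:i] + features[i + 1:]
        rest_y = hidden_attr[:i] + hidden_attr[i + 1:]
        hits += predict_1nn(rest_x, rest_y, features[i]) == hidden_attr[i]
    return hits / len(features)


# Invented feature vectors (say, zip code and test score) that happen to
# correlate perfectly with the hidden group label:
X = [(1.0, 90.0), (1.1, 88.0), (5.0, 70.0), (5.2, 72.0)]
g = ["A", "A", "B", "B"]
# recoverability(X, g) is 1.0 here, so these features fully proxy for
# the hidden attribute even though it was removed from the data.
```

When recoverability is near chance instead, the visible features carry little information about the protected attribute, which is the desired outcome under the disparate-impact definition.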
As hazard warnings increase, experts urge better decisions on who and when to warn
A group of risk experts is proposing a new framework and research agenda that they believe will support the most effective public warnings when a hurricane, wildfire, toxic chemical spill or any other environmental hazard threatens safety. Effective warnings are a growing need as expanding global populations confront a wide range of hazards.
Right now, “the potential for errors is high” when officials decide when to issue emergency warnings, who to send them to, and what safety measures to urge the public to take, says Thomas Cova, a professor in the University of Utah geography department.
That’s because “researchers tend to focus on one or two of those questions,” Cova says. “But it’s a challenge to think about all three,” which is necessary to avoid such errors as deciding the right time and right action but wrong target group or the right group and right time but wrong protective action, he adds. Emergency managers must contend with uncertainty about how the three components interact, and have to consider how likely and how costly it might be to make “false positive” decisions to issue a warning when hazards don’t occur or “false negative” decisions to continue normally when hazards do occur.
Cova and colleagues have published a paper called “Warning triggers in environmental hazards: Who should be warned to do what and when?” that proposes a way forward in improving emergency warning by thinking constructively and critically about all three issues. The paper, published in the online version of Risk Analysis, a publication of the Society for Risk Analysis, was co-authored by Cova with colleagues Philip E. Dennison, Dapeng Li, and Frank Drews, also of the University of Utah, as well as Laura K. Siebeneck of the University of North Texas and Michael K. Lindell of the University of Washington.
Essential to improving emergency warning practices is research into the most effective methods for alerting the public. But, currently, public warning researchers are each carving out little hazard niches (hurricanes, wildfires, hazmat), as well as single dimensions of the warning problem (timing them, delimiting risk zones, selecting protective actions). “The end result is that no one is taking on the big question of simultaneously asking: who should do what when?” Cova explains. The authors’ goal is to sound a “wake up call” that they hope will lead to an improved understanding of how warnings are formulated and implemented across hazards, which in turn could lead to improved training methods, warning system innovations, and synergy between researchers and practicing emergency managers. “We’re not proposing a new approach to warnings, we’re proposing a new approach to public warning research,” Cova says, but adds, “The results may have beneficial feedbacks into public warning improvements and innovations.”
Today’s guidance on emergency warnings is not optimal. In light of the many global environmental hazards, experts are developing new procedures to simplify the warning process, aiming to prevent casualties and increase transparency about the decision-making process. Widely used “warning triggers” are decision rules that link an environmental condition to “protective action recommendations” for a specified target group, helping answer the question: “Who should take what action and when?” For example, fire occurrence is a common qualitative trigger condition, but a more specific indicator would be a flame front crossing a prominent ridgeline, river, or road toward a community. Rainfall rates and duration can serve to define a threshold value that, once exceeded, results in a warning for flooding or landslides. Triggers aid managers in deciding when to change from “wait and see” to “take immediate action,” thereby helping them stay ahead of the emergency’s advancing curve.
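A quantitative trigger of the rainfall kind can be written down as a small decision rule. The Python sketch below is purely illustrative; the threshold, sustain time, and recommendations are invented, but it shows how a trigger binds together the who, the what, and the when in one rule.

```python
# Invented threshold for an illustrative landslide warning trigger.
RAIN_THRESHOLD_MM_PER_HR = 50.0
MIN_HOURS_SUSTAINED = 2

def landslide_trigger(rain_mm_per_hr, hours_sustained):
    """Map an environmental measurement to a protective-action
    recommendation for a specified target group."""
    if (rain_mm_per_hr >= RAIN_THRESHOLD_MM_PER_HR
            and hours_sustained >= MIN_HOURS_SUSTAINED):
        return {"who": "residents below the slope",
                "what": "evacuate",
                "when": "immediately"}
    # Below threshold: stay in the "wait and see" posture.
    return {"who": "emergency managers",
            "what": "wait and see",
            "when": "reassess hourly"}


assert landslide_trigger(60.0, 3)["what"] == "evacuate"
```

The authors’ point is that each of the rule’s three fields is usually studied in isolation; the research agenda they propose treats the whole rule as the object of study.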
But, even though warning triggers are used often, little research has been done on how emergency managers set them or how effective they are when combined with integrated early warning systems, the authors write. In an overview of key warning trigger issues, the authors discuss the critical importance of an “unambiguous trigger condition” when deciding when to issue a warning, as well as methods for defining the condition, such as directly observed environmental cues or measurements from sensors. They also discuss issues pertaining to deciding which population to warn, including the use of “emergency planning zones” such as “everyone on Manhattan Island,” a physical feature, or “south of Central Park,” a built feature that includes apartment buildings and stores. And they review challenges of deciding on the most effective actions to recommend, such as evacuating or sheltering in place.
The problem of effective warnings “has dimensions that are geographic, temporal, cognitive, and perceptual, particularly in how the public might respond,” says Cova. “So it’s an ideal challenge for interdisciplinary research” that addresses all three of the key systems at work (natural, built, social) and their interactions across different types of hazards. Emergency managers simultaneously deal with who, what, and when issues every day, and therefore so should research to improve warnings, the authors suggest.
Engineers from the University of Utah and the University of Minnesota have discovered that interfacing two particular oxide-based materials makes them highly conductive, a boon for future electronics that could result in much more power-efficient laptops, electric cars and home appliances that also don’t need cumbersome power supplies.
Their findings were published this month in the scientific journal, APL Materials, from the American Institute of Physics.
The team led by University of Utah electrical and computer engineering assistant professor Berardi Sensale-Rodriguez and University of Minnesota chemical engineering and materials science assistant professor Bharat Jalan revealed that when two oxide compounds — strontium titanate (STO) and neodymium titanate (NTO) — interact with each other, the bonds between the atoms are arranged in a way that produces many free electrons, the particles that can carry electrical current. STO and NTO are by themselves known as insulators — materials like glass — that are not conductive at all.
But when they interface, the amount of electrons produced is a hundred times larger than what is possible in semiconductors. “It is also about five times more conductive than silicon [the material most used in electronics],” Sensale-Rodriguez says.
This innovation could greatly improve power transistors — devices in electronics that regulate the electrical current — by making power supplies much more efficient for items ranging from televisions and refrigerators to handheld devices, Sensale-Rodriguez says. Today, electronics manufacturers use a material called gallium nitride for transistors in power supplies and other electronics that carry large electrical currents. But that material has been explored and optimized for many years and likely cannot be made more efficient. In this discovery by the Utah and Minnesota team, the interface between STO and NTO is at least as conductive as gallium nitride and likely can be made much more so in the future.
The University of Utah (also referred to as the U, the U of U, or Utah) is a public coeducational space-grant research university in Salt Lake City, Utah, United States.
As the state’s flagship university, the university offers more than 100 undergraduate majors and more than 92 graduate degree programs. Graduate studies include the S.J. Quinney College of Law and the School of Medicine, Utah’s only medical school. As of autumn 2012, there were 24,840 undergraduate students and 7,548 graduate students, for a total enrollment of 32,388, with 83% coming from Utah and 9% from foreign countries. Just over 10% of students live on campus.
Utah engineers develop new flat, ultralight lens that could change how cameras are designed
University of Utah electrical and computer engineering professor Rajesh Menon holds up the prototype of the first flat thin camera lens that he and his team developed. Menon and his doctoral students, Peng Wang and Nabil Mohamma, have developed a new kind of optical lens that is flat and ultrathin instead of the traditionally curved lens but can still focus all the colors of light to one point.
The new lens can be used in cameras and other devices such as smartphones where the lens does not have to jet out of the body.