In September 2015, a team of astronomers from the National Astronomical Observatory of Japan, University of Michigan, Kyoto Sangyo University, Rikkyo University and the University of Tokyo successfully observed the entire hydrogen coma of the comet 67P/Churyumov-Gerasimenko, using the LAICA telescope onboard the PROCYON spacecraft. They also succeeded in obtaining the absolute rate of water discharge from the comet.
This comet was the target of ESA’s Rosetta mission in 2015. Because the Rosetta spacecraft was operating inside the cometary coma, it could not observe the overall coma structure, and observing conditions from Earth were poor while the comet was visible. Our observations therefore made it possible to test coma models for this comet for the first time.
Comet observation by the PROCYON spacecraft had not been scheduled in the original mission plan. Thanks to the efforts of the spacecraft and telescope operation teams, observations were conducted shortly after we started discussing the possibility, producing results of great scientific importance.
This result is the first scientific achievement by a micro spacecraft for deep space exploration. Moreover, this provides an ideal example where observations by a low-cost mission (e.g., the PROCYON mission) support precise observations by a large mission (e.g., the Rosetta mission). We hope this will become a model case for micro spacecraft observations in support of large missions.
The Rosetta mission and its limits
The 2015 apparition (appearance) of the comet 67P/Churyumov-Gerasimenko was a target of ESA’s Rosetta mission (see Figure 1). In the Rosetta mission, precise observations of the comet were carried out from close to the surface of the nucleus for more than two years, including when the comet passed perihelion (its closest approach to the Sun) on August 13, 2015. However, observing the entire coma was difficult because the Rosetta spacecraft was located inside it.
To extrapolate from Rosetta’s observations of specific areas and estimate the total amount of water released by the comet per second (the water production rate), we need a model of the coma, and the estimate depends strongly on which coma model we use. To test the models, we have to compare the absolute water production rate derived from observations of the entire coma against predictions based on Rosetta’s results and the various coma models. It was therefore valuable to observe the entire coma from a much greater distance with another spacecraft.
Conventionally, the SWAN telescope onboard the SOHO spacecraft has often been used for such observations. Unfortunately, the comet had moved into a star-rich region of the sky, and because of the SWAN telescope’s low spatial resolution it could not distinguish the comet from the background stars.
Our observations with the PROCYON spacecraft
PROCYON, developed by the University of Tokyo and others, is the smallest spacecraft built for deep space exploration, weighing only about 65 kg. LAICA, the telescope that observed the comet, can image emissions from hydrogen atoms; its development was led by Rikkyo University. The telescope’s main objective was wide-field imaging from deep space of the entire geocorona and geotail (a layer of hydrogen gas extending away from the Earth), a complete view last captured 42 years earlier, by Apollo 16 in 1972. Despite its small size, the LAICA telescope has a spatial resolution more than 10 times higher than that of the SWAN telescope, which allowed it to distinguish the comet from the background stars. The PROCYON spacecraft was launched together with the Hayabusa2 spacecraft in December 2014.
Most of the hydrogen atoms in a cometary coma form from water molecules ejected from the cometary nucleus which are then broken apart by solar UV radiation (photo-dissociation; see Diagram 1). By using coma models based on these mechanisms, we can estimate the water release rate from a brightness map of the hydrogen atoms. Because water is the most abundant molecule in cometary ice, it is important for understanding not just the level of cometary activity but also for understanding the process by which molecules were incorporated into comets as they formed in the early Solar System.
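To make the inversion concrete, here is a minimal sketch of the idea in Python. It assumes a steady, spherically symmetric outflow of hydrogen atoms (a simplified, Haser-type picture rather than the detailed coma models used in the study), and every numerical value is an illustrative placeholder, not a parameter from the paper.

```python
import numpy as np

# Minimal sketch of a spherically symmetric coma model (a simplified
# Haser-type picture; the study used more detailed models). Every number
# here is an illustrative placeholder, not a value from the paper.

Q = 1.0e28   # assumed hydrogen production rate [atoms/s]
v = 8.0e3    # assumed typical speed of coma hydrogen atoms [m/s]

def column_density(rho, q=Q, vel=v):
    """Hydrogen column density [atoms/m^2] along a line of sight at
    projected distance rho [m] from the nucleus. For a steady outflow
    n(r) = q / (4*pi*r**2*vel), the line-of-sight integral has the
    closed form q / (4*rho*vel)."""
    return q / (4.0 * rho * vel)

# Lyman-alpha brightness is proportional to column density (through the
# solar excitation "g-factor"), so scaling a model brightness map to the
# observed map yields the absolute production rate.
g_factor = 2.0e-3                 # assumed photons per atom per second
rho = np.logspace(7, 9, 5)        # projected distances, 1e4 to 1e6 km
for r, b in zip(rho, g_factor * column_density(rho)):
    print(f"rho = {r:.1e} m  ->  model brightness ~ {b:.2e} photons/m^2/s")
```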
We performed imaging observations of the entire hydrogen coma of the comet and derived the absolute water production rates near the perihelion in 2015 (see Figure 2). Based on our results, we could test the coma models for the comet (see Diagram 2). Combined with Rosetta’s results, such as water production rates at different distances from the Sun and chemical composition, we could accurately estimate the total ejected mass of the comet in the 2015 apparition.
The story behind the LAICA comet observations and future implications
Although observations of the comet were not scheduled in the original mission plan of the PROCYON spacecraft, discussion of the possibility began after the geocorona observations ended in May 2015. A comet is generally observable for only a short period as it moves through the Solar System, so the observing conditions from the spacecraft (such as the comet’s direction and brightness) change day by day. We were able to observe 67P/Churyumov-Gerasimenko and obtain scientifically significant results in this short timeframe thanks to the wide field of view and high spatial resolution of the LAICA telescope, the pointing control performance of the PROCYON spacecraft, and the hard work of the teams operating the spacecraft and telescope.
This result is the first scientific achievement by a micro spacecraft for deep space exploration, and around the world plans for more micro spacecraft like this are progressing. Moreover, it is an ideal example of a low-cost mission carrying out important observations that a large mission cannot make itself. We hope this result will become a model case for micro spacecraft observations in support of large missions in the future.
Combining optical and electronic technology, Stanford researchers have made a new type of computer that can solve problems that are a challenge for traditional computers.
The processing power of standard computers is likely to reach its maximum in the next 10 to 25 years. Even at this maximum power, traditional computers won’t be able to handle a particular class of problem that involves combining variables to come up with many possible answers, and looking for the best solution.
Now, an entirely new type of computer that blends optical and electrical processing, reported Oct. 20 in the journal Science, could get around this impending processing constraint and solve those problems. If it can be scaled up, this non-traditional computer could save costs by finding more optimal solutions to problems that have an incredibly high number of possible solutions.
“This is a machine that’s in a sense the first in its class, and the idea is that it opens up a sub-field of research in the area of non-traditional computing machines,” said Peter McMahon, postdoctoral scholar in applied physics and co-author of the paper. “There are many, many questions that this development raises and we expect that over the next few years, several groups are going to be investigating this class of machine and looking into how this approach will pan out.”
The traveling salesman problem
There is a special type of problem – called a combinatorial optimization problem – that traditional computers find difficult to solve, even approximately. An example is the “traveling salesman” problem, wherein a salesman must visit a specific set of cities, each exactly once, return to the starting city, and take the most efficient route possible. The problem may sound simple, but the number of possible routes grows factorially as cities are added, and this explosion is what makes the problem so hard to solve.
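To see the explosion concretely, here is a small Python sketch that solves the problem by brute force on made-up city coordinates, then counts how quickly the number of candidate routes grows:

```python
import itertools
import math

# Brute-force traveling salesman on made-up coordinates: enumerate every
# route and keep the shortest. The point is the count, not the answer.

cities = {"A": (0, 0), "B": (1, 5), "C": (4, 2), "D": (6, 6), "E": (3, 7)}

def route_length(route):
    """Total length of a closed tour visiting cities in the given order."""
    return sum(
        math.dist(cities[a], cities[b])
        for a, b in zip(route, route[1:] + route[:1])
    )

names = list(cities)
# Fix the starting city; try every ordering of the remaining cities.
best = min(
    ((names[0],) + rest for rest in itertools.permutations(names[1:])),
    key=route_length,
)
print("best route:", best, "length:", round(route_length(best), 3))

# The catch: fixing the start still leaves (n - 1)! orderings to check.
for n in (5, 10, 15, 20):
    print(f"{n} cities -> {math.factorial(n - 1):,} routes")
```

At 5 cities there are only 24 orderings to check; at 20 there are already more than 10^17, which is why enumeration breaks down so quickly.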
“Those problems are challenging for standard computers, even supercomputers, because as the size grows, at some point, it takes the age of the universe to search through all the possible solutions,” said Alireza Marandi, a former postdoctoral scholar at Stanford and co-author of the study. “This is true even with a supercomputer because the growth in possibilities is so fast.”
It may be tempting to simply give up on the traveling salesman, but solving such hard optimization problems could have enormous impact in a wide range of areas. Examples include finding the optimal path for delivery trucks, minimizing interference in wireless networks, and determining how proteins fold. Even small improvements in some of these areas could result in massive monetary savings, which is why some scientists have spent their careers creating algorithms that produce very good approximate solutions to this type of problem.
An Ising machine
The Stanford team has built what’s called an Ising machine, named for a mathematical model of magnetism. The machine acts like a reprogrammable network of artificial magnets in which each magnet points only up or down and, like a real magnetic system, tends to settle toward a low-energy state.
The theory is that, if the connections among a network of magnets can be programmed to represent the problem at hand, once they settle on the optimal, low-energy directions they should face, the solution can be derived from their final state. In the case of the traveling salesman, each artificial magnet in the Ising machine represents the position of a city in a particular path.
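The principle can be caricatured in a few lines of Python. The sketch below encodes a toy problem in a coupling matrix J and lets a simulated-annealing loop stand in for the physical settling; the actual machine performs this relaxation with light pulses rather than software, and the couplings here are random placeholders rather than a real problem encoding.

```python
import math
import random

# Software caricature of an Ising machine: spins s[i] = +1 or -1,
# couplings J[i][j] encode the problem, and the network settles toward a
# state minimizing E = -sum_{i<j} J[i][j] * s[i] * s[j]. The real machine
# relaxes physically with light pulses; this loop is only a stand-in.

random.seed(0)
N = 8
J = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        J[i][j] = J[j][i] = random.choice([-1.0, 1.0])   # toy problem

def energy(s):
    return -sum(J[i][j] * s[i] * s[j]
                for i in range(N) for j in range(i + 1, N))

s = [random.choice([-1, 1]) for _ in range(N)]
temp = 2.0
for _ in range(5000):
    i = random.randrange(N)
    # Energy change if spin i flips: dE = 2 * s[i] * sum_j J[i][j] * s[j]
    dE = 2 * s[i] * sum(J[i][j] * s[j] for j in range(N) if j != i)
    if dE < 0 or random.random() < math.exp(-dE / temp):
        s[i] = -s[i]
    temp *= 0.999   # cool gradually so the spins settle into low energy

print("final spins:", s, "final energy:", energy(s))
```

Reading off the final up/down pattern of the spins is the software analogue of measuring the output phases of the optical pulses.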
Rather than using magnets on a grid, the Stanford team used a special kind of laser system, known as a degenerate optical parametric oscillator, that, when turned on, will represent an upward- or downward-pointing “spin.” Pulses of the laser represent a city’s position in a path the salesman could take. In an earlier version of this machine (published two years ago), the team members extracted a small portion of each pulse, delayed it and added a controlled amount of that portion to the subsequent pulses. In traveling salesman terms, this is how they program the machine with the connections and distances between the cities. The pulse-to-pulse couplings constitute the programming of the problem. Then the machine is turned on to try to find a solution, which can be obtained by measuring the final output phases of the pulses.
The problem in this previous approach was connecting large numbers of pulses in arbitrarily complex ways. It was doable but required an added controllable optical delay for each pulse, which was costly and difficult to implement.
The latest Stanford Ising machine shows that a drastically more affordable and practical version could be made by replacing the controllable optical delays with a digital electronic circuit. The circuit emulates the optical connections among the pulses in order to program the problem and the laser system still solves it.
Nearly all of the materials used to make this machine are off-the-shelf elements that are already used for telecommunications. That, in combination with the simplicity of the programming, makes it easy to scale up. Stanford’s machine is currently able to solve 100-variable problems with any arbitrary set of connections between variables, and it has been tested on thousands of scenarios.
A group at NTT in Japan that consulted with Stanford’s team has also created an independent version of the machine; its study has been published alongside Stanford’s by Science. For now, the Ising machine still falls short of beating the processing power of traditional digital computers when it comes to combinatorial optimization. But it is gaining ground fast and the researchers are looking forward to seeing what other work will be possible based on this breakthrough.
“I think it’s an exciting avenue of exploration for finding alternative computers. It can get us closer to more efficient ways of tackling some of the most daunting computational problems we have,” said Marandi. “So far, we’ve made a laser-based computer that can target some of these problems, and we have already shown some promising results.”
From trial and error to brute force
A Franco-Japanese research group at the University of Tokyo has developed a new “brute force” technique to test thousands of biochemical reactions at once and quickly home in on the range of conditions where they work best. Until now, optimizing such biomolecular systems, which have applications in areas such as diagnostics, required months or years of trial-and-error experiments; with this new technique, that could be shortened to days.
“We are interested in programming complex biochemical systems so that they can process information in a way that is analogous to electronic devices. If you could obtain a high-resolution map of all possible combinations of reaction conditions and their corresponding outcomes, the development of such reactions for specific purposes like diagnostic tests would be quicker than it is today,” explains Centre National de la Recherche Scientifique (CNRS) researcher Yannick Rondelez at the Institute of Industrial Science (IIS).
“Currently researchers use a combination of computer simulations and painstaking experiments. However, while simulations can test millions of conditions, they are based on assumptions about how molecules behave and may not reflect the full detail of reality. On the other hand, testing all possible conditions, even for a relatively simple design, is a daunting job.”
Rondelez and his colleagues at the Laboratory for Integrated Micro-Mechanical Systems (LIMMS), a 20-year collaboration between the IIS and the French CNRS, demonstrated a system that can test ten thousand different biochemical reaction conditions at once. Working with the IIS Applied Microfluidic Laboratory of Professor Teruo Fujii, they developed a platform that generates a myriad of micrometer-sized droplets containing random concentrations of reagents and sandwiches a single layer of them between glass slides. Fluorescent markers combined with the reagents are read automatically by a microscope, both to determine the precise concentrations in each droplet and to observe how the reaction proceeds.
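The logic of the screen can be illustrated with a short Python sketch: give each simulated droplet a random combination of concentrations, record an outcome per droplet, and pool the results into a conditions-versus-outcome map. The “reaction” below is an invented response surface for illustration, not the group’s actual chemistry, and the reagent names are hypothetical.

```python
import random

# Toy version of the droplet screen: each simulated droplet gets a random
# combination of reagent concentrations, a signal is read per droplet,
# and pooling the droplets maps out where the reaction works best. The
# response surface below is made up purely for illustration.

random.seed(1)

def toy_signal(enzyme, template):
    """Invented response surface with one sweet spot (illustrative only)."""
    return max(0.0, 1.0 - 4.0 * ((enzyme - 0.6) ** 2 + (template - 0.3) ** 2))

# ~10,000 droplets, each with random (normalized) concentrations.
droplets = [{"enzyme": random.random(), "template": random.random()}
            for _ in range(10_000)]
for d in droplets:
    d["signal"] = toy_signal(d["enzyme"], d["template"])

best = max(droplets, key=lambda d: d["signal"])
print("best conditions found:", {k: round(v, 2) for k, v in best.items()})
```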
The University of Tokyo (東京大学, Tōkyō daigaku), abbreviated as Todai (東大, Tōdai), is a research university located in Bunkyo, Tokyo, Japan.
The University has 10 faculties with a total of around 30,000 students, 2,100 of whom are foreign. Its five campuses are in Hongō, Komaba, Kashiwa, Shirokane and Nakano. It is the first of Japan’s National Seven Universities, and is considered the most prestigious university in Japan. It ranks as the highest in Asia and 21st in the world according to the Academic Ranking of World Universities 2013.
The university was chartered by the Meiji government in 1877 under its current name by amalgamating older government schools for medicine and Western learning. It was renamed “the Imperial University” (帝國大學, Teikoku daigaku) in 1886, and then Tokyo Imperial University (東京帝國大學, Tōkyō teikoku daigaku) in 1897 when the Imperial University system was created. In September 1923, an earthquake and the fires that followed destroyed about 700,000 volumes of the Imperial University Library. The losses included the Hoshino Library (星野文庫, Hoshino bunko), a collection of about 10,000 books, mainly on Chinese philosophy and history, that had belonged to Hoshino Hisashi before becoming part of the university library.
In 1947, after Japan’s defeat in World War II, the university re-assumed its original name. With the start of the new university system in 1949, Todai absorbed the former First Higher School (today’s Komaba campus) and the former Tokyo Higher School, which thereafter took on the duty of teaching first- and second-year undergraduates, while the faculties on the Hongo main campus taught third- and fourth-year students.
Although the university was founded during the Meiji period, it has earlier roots in the Astronomy Agency (天文方; 1684), the Shoheizaka Study Office (昌平坂学問所; 1797), and the Western Books Translation Agency (蕃書和解御用; 1811). These institutions were government offices established by the Tokugawa shogunate (徳川幕府; 1603–1867), and played an important role in the importation and translation of books from Europe.
Kikuchi Dairoku, an important figure in Japanese education, served as president of Tokyo Imperial University.
For the 1964 Summer Olympics, the university hosted the running portion of the modern pentathlon event.
On 20 January 2012, Todai announced that it would shift the beginning of its academic year from April to September to align its calendar with the international standard. The shift would be phased in over five years.
According to the Japan Times, the university had 1,282 professors in February 2012. Of those, 58 were women.