The odds of living to 110 or beyond level out at 105, says study

Want to be a supercentenarian? The chances of reaching the ripe old age of 110 are within reach – if you survive the perilous 90s and make it to 105 when death rates level out, according to a study of extremely old Italians led by the University of California, Berkeley, and Sapienza University of Rome.

Researchers tracked the death trajectories of nearly 4,000 residents of Italy who were aged 105 and older between 2009 and 2015. They found that the chances of survival for these longevity warriors plateaued once they made it past 105.

The findings, to be published in the June 29 issue of the journal Science, challenge previous research that claims the human lifespan has a final cut-off point. To date, the oldest human on record, Jeanne Calment of France, died in 1997 at age 122.

“Our data tell us that there is no fixed limit to the human lifespan yet in sight,” said study senior author Kenneth Wachter, a UC Berkeley professor emeritus of demography and statistics. “Not only do we see mortality rates that stop getting worse with age, we see them getting slightly better over time.”

Specifically, the results show that people between the ages of 105 and 109, known as semi-supercentenarians, had a 50/50 chance of dying within the year and an expected further lifespan of 1.5 years. That life expectancy was projected to be the same for 110-year-olds, or supercentenarians, hence the plateau.

The trajectory for nonagenarians is less forgiving. For example, the study found that Italian women born in 1904 who reached age 90 had a 15 percent chance of dying within the next year, and six years, on average, to live. If they made it to 95, their odds of dying within a year increased to 24 percent and their life expectancy from that point on dropped to 3.7 years.
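The arithmetic linking an annual death probability to remaining life expectancy can be checked with a toy constant-hazard model (an illustration only, not the study's demographic method): with annual death probability q and deaths falling mid-year on average, expected further life is (1 − q)/q + 0.5 years.

```python
def remaining_life_expectancy(q):
    """Expected further years of life given a constant annual death
    probability q, with deaths assumed to fall mid-year on average:
    E = sum_{k>=0} q * (1 - q)**k * (k + 0.5) = (1 - q) / q + 0.5."""
    return (1 - q) / q + 0.5

# On the plateau, a 50/50 annual chance of dying gives the reported 1.5 years:
print(remaining_life_expectancy(0.50))   # 1.5
# Holding a 90-year-old's 15% annual risk constant would give about 6.2 years,
# close to the reported six (in reality the risk keeps rising through the 90s):
print(round(remaining_life_expectancy(0.15), 1))  # 6.2
```

The close match at age 90 is partly coincidental, since nonagenarian mortality climbs year over year; on the plateau past 105, where the hazard really is roughly constant, the model applies directly.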

Overall, Wachter and fellow researchers tracked the mortality of 3,836 Italians (supercentenarians and semi-supercentenarians) born between 1896 and 1910, using the latest data from the Italian National Institute of Statistics.

They credit the institute for reliably tracking extreme ages due to a national validation system that measures age at time of death to the nearest day: “These are the best data for extreme-age longevity yet assembled,” Wachter said.

As humans live into their 80s and 90s, mortality rates surge due to frailty and a higher risk of such ailments as heart disease, dementia, stroke, cancer and pneumonia.

Evolutionary demographers like Wachter and study co-author James Vaupel theorize that those who survive do so because of demographic selection and/or natural selection. Frail people tend to die earlier while robust people, or those who are genetically blessed, can live to extreme ages, they say.

Wachter notes that similar lifecycle patterns have been found in other species, such as flies and worms.

“What do we have in common with flies and worms?” he asked. “One thing at least: We are all products of evolution.”

‘Breakthrough’ algorithm exponentially faster than any previous one

What if a large class of algorithms used today — from the algorithms that help us avoid traffic to the algorithms that identify new drug molecules — worked exponentially faster?

Computer scientists at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a completely new kind of algorithm, one that exponentially speeds up computation by dramatically reducing the number of parallel steps required to reach a solution.

The researchers will present their novel approach at two upcoming conferences: the ACM Symposium on Theory of Computing (STOC), June 25-29, and the International Conference on Machine Learning (ICML), July 10-15.

Many so-called optimization problems (problems that seek the best solution from all possible solutions, such as mapping the fastest route from point A to point B) rely on sequential algorithms that haven’t changed since they were first described in the 1970s. These algorithms solve a problem step by step, and the number of steps is proportional to the size of the data. This has led to a computational bottleneck, leaving lines of questioning and areas of research that are just too computationally expensive to explore.

“These optimization problems have a diminishing returns property,” said Yaron Singer, Assistant Professor of Computer Science at SEAS and senior author of the research. “As an algorithm progresses, its relative gain from each step becomes smaller and smaller.”

Singer and his colleague asked: what if, instead of taking hundreds or thousands of small steps to reach a solution, an algorithm could take just a few leaps?

“This algorithm and general approach allows us to dramatically speed up computation for an enormously large class of problems across many different fields, including computer vision, information retrieval, network analysis, computational biology, auction design, and many others,” said Singer. “We can now perform computations in just a few seconds that would have previously taken weeks or months.”

“This new algorithmic work, and the corresponding analysis, opens the doors to new large-scale parallelization strategies that have much larger speedups than what has ever been possible before,” said Jeff Bilmes, Professor in the Department of Electrical Engineering at the University of Washington, who was not involved in the research. “These abilities will, for example, enable real-world summarization processes to be developed at unprecedented scale.”

Traditionally, algorithms for optimization problems narrow down the search space for the best solution one step at a time. In contrast, this new algorithm samples a variety of directions in parallel. Based on that sample, the algorithm discards low-value directions from its search space and chooses the most valuable directions to progress towards a solution.

Take this toy example:

You’re in the mood to watch a movie similar to The Avengers. A traditional recommendation algorithm would sequentially add, at every step, a single movie with attributes similar to those of The Avengers. In contrast, the new algorithm samples a group of movies at random, discarding those that are too dissimilar to The Avengers. What’s left is a batch of movies that are diverse (after all, you don’t want ten Batman movies) but similar to The Avengers. The algorithm continues to add batches at every step until it has enough movies to recommend.

This process of adaptive sampling is key to the algorithm’s ability to make the right decision at each step.
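The sample-prune-batch idea described above can be sketched on a toy coverage objective with diminishing returns. This is a minimal illustration of the general approach, not the authors' published algorithm; the items, attribute sets, and parameters are invented.

```python
import random

def covered(attrs, chosen):
    """Attributes covered by the chosen items; the objective len(covered(...))
    is monotone with diminishing returns (new items add only unseen attributes)."""
    out = set()
    for i in chosen:
        out |= attrs[i]
    return out

def adaptive_sample_select(attrs, k, sample_size=8, keep_frac=0.5, seed=0):
    """Select up to k items by sampling candidates, scoring marginal gains
    (a step that could run in parallel), pruning low-value candidates, and
    adding a whole batch per round rather than one item per step."""
    rng = random.Random(seed)
    solution, remaining = [], set(attrs)
    while len(solution) < k and remaining:
        batch = rng.sample(sorted(remaining), min(sample_size, len(remaining)))
        cov = covered(attrs, solution)
        gains = {i: len(attrs[i] - cov) for i in batch}  # marginal gains
        best = max(gains.values())
        if best == 0:
            break  # no direction improves the solution any further
        for i in batch:
            # Keep only high-gain directions; a fuller implementation would
            # re-score the batch as items are added to avoid redundancy.
            if gains[i] >= keep_frac * best and len(solution) < k:
                solution.append(i)
        remaining -= set(batch)  # sampled candidates are not revisited here
    return solution

# Toy "movies" described by sets of attributes:
attrs = {"A": {1, 2, 3}, "B": {3, 4}, "C": {5}, "D": {1, 2}, "E": {6, 7}}
picks = adaptive_sample_select(attrs, k=3)
print(picks, covered(attrs, picks))
```

The key contrast with one-item-at-a-time greedy selection is that each round scores a whole sample of candidates at once (embarrassingly parallel), discards the weak ones, and commits a batch, so the number of sequential rounds stays small.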

“Traditional algorithms for this class of problem greedily add data to the solution while considering the entire dataset at every step,” said Eric Balkanski, graduate student at SEAS and co-author of the research. “The strength of our algorithm is that in addition to adding data, it also selectively prunes data that will be ignored in future steps.”

In experiments, Singer and Balkanski demonstrated that their algorithm could sift through a data set which contained 1 million ratings from 6,000 users on 4,000 movies and recommend a personalized and diverse collection of movies for an individual user 20 times faster than the state-of-the-art.

The researchers also tested the algorithm on a taxi dispatch problem, where there are a certain number of taxis and the goal is to pick the best locations to cover the maximum number of potential customers. Using a data set of two million taxi trips from the New York City Taxi and Limousine Commission, the adaptive-sampling algorithm found solutions 6 times faster.

“This gap would increase even more significantly on larger scale applications, such as clustering biological data, sponsored search auctions, or social media analytics,” said Balkanski.

Of course, the algorithm’s potential extends far beyond movie recommendations and taxi dispatch optimizations. It could be applied to:

  • designing clinical trials for drugs to treat Alzheimer’s, multiple sclerosis, obesity, diabetes, hepatitis C, HIV and more
  • evolutionary biology to find good representative subsets of different collections of genes from large datasets of genes from different species
  • designing sensor arrays for medical imaging
  • detecting drug-drug interactions from online health forums


“This research is a real breakthrough for large-scale discrete optimization,” said Andreas Krause, professor of Computer Science at ETH Zurich, who was not involved in the research. “One of the biggest challenges in machine learning is finding good, representative subsets of data from large collections of images or videos to train machine learning models. This research could identify those subsets quickly and have substantial practical impact on these large-scale data summarization problems.”

The Singer-Balkanski model and variants of the algorithm developed in the paper could also be used to more quickly assess the accuracy of a machine learning model, said Vahab Mirrokni, a principal scientist at Google Research, who was not involved in the research.

“In some cases, we have a black-box access to the model accuracy function which is time-consuming to compute,” said Mirrokni. “At the same time, computing model accuracy for many feature settings can be done in parallel. This adaptive optimization framework is a great model for these important settings and the insights from the algorithmic techniques developed in this framework can have deep impact in this important area of machine learning research.”

Singer and Balkanski are continuing to work with practitioners on implementing the algorithm.

New spider species found deep in southern Indiana cave

IMAGE: This is a female specimen of the newly described rare spider species Islandiana lewisi.

Credit: Dr. Marc Milne

Spiders are ubiquitous within our forests, fields, and backyards. Although you may be used to seeing the beautiful yellow and black spiders of the genus Argiope in your garden, large ground-scurrying wolf spiders in your yard, or spindly cellar spiders in your basement, this new sheet-web-building spider is probably one you haven’t seen before. The reason is that it’s known from a single cave in the world, Stygeon River Cave, in southern Indiana.

University of Indianapolis assistant professor Dr. Marc Milne described the rare species in the open access journal Subterranean Biology with the help of a University of Indianapolis alumna, Elizabeth Wells, who illustrated the spider for the manuscript.

Sheet weavers, also known as dwarf spiders or money spiders, are minute creatures growing no larger than a few millimetres in length, which makes them particularly elusive. Their peculiar webs are flat and sheet-like, hence their common English name.

The new spider, Islandiana lewisi, is an homage. Milne was shown the spider by a fellow scientist, Dr. Julian Lewis, who noticed the critter on one of his many cave expeditions. In appreciation for his help, Milne and Wells named the spider after Lewis.

This is the fifteenth species in its genus (Islandiana) and the fifth known to live exclusively in caves. It has been over 30 years since a species was last added to this group.

At about 2 mm in size, Islandiana lewisi is thought to feed on even smaller arthropods, such as springtails living in the debris on the cave floor. It is unknown when it reproduces or if it exists anywhere else. The spider is likely harmless to humans.

The collectors of the spider, Milne and Lewis, described the hostile conditions within the cave, which the new species calls home: “because the cave floods from time to time, the insides were wet, muddy, slippery, and dangerous to walk on without the proper equipment.”

Milne and Lewis found the spider in small, horizontal webs between large, mud-caked boulders in the largest room in the cave. It was collected in October 2016 with the permission of the landowner.

Milne suspected he had collected something special: “I didn’t know what the spider was at first; I just thought it was odd that so many were living within this dark cave with no other spider species around.”

After returning to the lab and inspecting the spider under a microscope, Milne initially misidentified the species. However, when he re-examined it months later, he realized that the species was indeed new to science.

NASA infrared data reveals Tropical Storm Emilia is strengthening

IMAGE: On June 28 at 4:59 p.m. EDT (2059 UTC) the AIRS instrument aboard NASA’s Aqua satellite showed powerful storms with very cold cloud top temperatures (purple). Credits: NASA JPL, Heidar Thrastarson

NASA infrared satellite imagery provided the cloud top temperatures of thunderstorms that make up Tropical Storm Emilia. By comparing those NASA temperature readings with another satellite’s data obtained the following day, forecasters determined that Emilia had strengthened.

At NASA’s Jet Propulsion Laboratory in Pasadena, California, infrared data taken of Emilia by the Atmospheric Infrared Sounder or AIRS instrument that flies aboard NASA’s Aqua satellite was made into a false-colored infrared image. That data, from June 28 at 4:59 p.m. EDT (2059 UTC), revealed powerful storms around the center with very cold cloud top temperatures, colder than minus 63 degrees Fahrenheit (minus 53 degrees Celsius).

By Friday, June 29, 2018 the National Hurricane Center noted that those cloud tops had cooled, indicating the uplift in the storm was stronger, and the cloud tops were higher. That means the storm was intensifying. NHC said “Shortwave infrared imagery and an earlier [4:55 a.m. EDT] 0855 UTC polar orbiter (satellite) pass show deep convective bursts, with associated minus 78 degree Celsius [minus 108.4 degrees Fahrenheit] cloud tops, developing near the surface center.”

Emilia is far enough away from land so that there are no coastal watches or warnings in effect.

At 11 a.m. EDT (1500 UTC) on June 29, the center of Tropical Storm Emilia was located near latitude 16.2 degrees north and longitude 116.3 degrees west. That’s about 620 miles (1,000 km) southwest of the southern tip of Baja California, Mexico.

The National Hurricane Center (NHC) said that Emilia is moving toward the west-northwest near 12 mph (19 kph), and this general motion is expected to continue for the next few days. Maximum sustained winds have increased to near 60 mph (95 kph) with higher gusts. Tropical-storm-force winds extend outward up to 80 miles (130 km) from the center. The estimated minimum central pressure is 997 millibars.

NHC said “Some additional strengthening is possible during the next 24 hours before Emilia moves over cool waters and begins to weaken over the weekend.”

How do female blackbucks size up a male by his scent?

At Tal Chhapar, a wildlife sanctuary in the heart of the Thar desert, a strange drama is staged twice every year. In the blistering heat of summer from March to April and the post-monsoon months of September and October, up to a hundred blackbuck males stake out territories on the flat land to entice females to mate with them in a unique assemblage called a lek.

Female blackbuck who visit the lek generally spend large amounts of time evaluating males before choosing one as a mate. A large part of this evaluation seems to be based on sniffing: even when being courted, females are so intent on inspecting odors from the dung piles that they are often oblivious to the males’ antics.

What are these females nosing around for?

To answer this question, Jyothi Nair, a student from Uma Ramakrishnan’s group at the National Centre for Biological Sciences (NCBS), Bangalore, collaborated with Shannon Olsson’s team, also from NCBS, to develop a pipeline for investigating odors in a quick, efficient way. In a publication in the journal Ecology and Evolution, the researchers document their evaluation of different odor collection, identification, and analysis techniques, and describe a protocol optimized for large-scale sampling of odors. Using this protocol, the team has also found that dung piles of males with high mating success seem to be much richer in the chemical meta-cresol than those of less successful males.

“Collecting odor samples from a remote area like Tal Chhapar is an extremely difficult task,” says Nair. This is because most collection methods require many hours to obtain sufficient amounts of odor for successful analysis. Furthermore, depending on the collection method, samples are often unstable and decompose very quickly even when stored at low temperatures; such storage is often impossible at remote field sites where refrigeration facilities are non-existent.

Through trial and error, however, the research team from NCBS found a solution: solid phase extraction. In this technique, odors from fecal samples are absorbed onto tiny tubes made of a silicone polymer called polydimethylsiloxane (PDMS). The odor samples in these PDMS tubes were found to be stable enough that they could be transported safely, without refrigeration, to NCBS, Bangalore, for chemical analyses.

In the laboratory, standard procedures such as thermal desorption (where the odor-laden PDMS tubes are heated to release trapped volatiles), gas chromatography, and mass spectrometry were used to separate and analyze the compounds making up each odor sample.

“During analysis, we faced a lot of problems in identifying compounds,” says V.S. Pragadeesh, who helped Nair with this work. “Compared to plant volatiles, there are comparatively few studies on mammalian volatiles, so we had very little information to help us recognize chemicals in these odors.”

Routine analysis for such data usually involves manual identification and documentation of the detected compounds to create a chemical profile of each odor sample. Conventionally, the process would have taken more than 8 months for all the data in this study. However, the analysis time was reduced to just two weeks through a collaboration with computational biologist, Snehal Karpe, who helped the team develop a semi-automated process that could quickly and efficiently analyze components of each sample with fairly low error rates.

To make sense of all this information, Nair then used a statistical technique called ‘Random Forests’ to compare the chemical profiles of dung piles from different locations within the lek. What emerged was a strong spatial pattern: dung piles of males at the centre of the lek, where mating success was highest, had much higher levels of the chemical meta-cresol than those of males towards the periphery.
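As a far simpler, stdlib-only illustration than the Random Forests analysis of full chemical profiles, a permutation test can check whether a single compound differs between centre and periphery dung piles; the meta-cresol concentrations below are invented for demonstration.

```python
import random
import statistics

def permutation_test(group_a, group_b, n_perm=10000, seed=1):
    """Two-sided permutation test for a difference in mean compound level
    between two groups of samples (e.g. centre vs. periphery dung piles)."""
    observed = statistics.mean(group_a) - statistics.mean(group_b)
    pooled = list(group_a) + list(group_b)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # break any real group structure
        diff = (statistics.mean(pooled[:len(group_a)])
                - statistics.mean(pooled[len(group_a):]))
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_perm  # fraction of shuffles as extreme as the data

# Hypothetical meta-cresol levels (arbitrary units), invented for this sketch:
centre = [8.1, 7.4, 9.0, 8.6, 7.9]
periphery = [3.2, 4.1, 2.8, 3.9, 4.4]
print(permutation_test(centre, periphery))  # small p-value: a real difference
```

Random Forests goes further by weighing all compounds at once and ranking their importance, which is what let the team single out meta-cresol from the full profiles.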

Meta-cresol is a well-known chemical used for communication in many insects and a few mammals such as elephants and horses. The team is now busy testing different chemicals identified in this study, including meta-cresol, on the behavior of captive blackbuck at Mysore zoo.

“This research has been exciting on so many fronts. There are relatively few population studies on chemical communication, and this is the first field-based chemical ecology study for this amazing Indian mammal,” says Dr. Shannon Olsson, who heads the NICE (Naturalist-Inspired Chemical Ecology) laboratory at NCBS, and has been a close collaborator in this study.

“It’s amazing to think that we can map ‘smells’ on the blackbuck lek! This is the first step to better understand whether smells vary across the lek, and potentially, how successful males smell. All thanks to our collaboration with the NICE lab,” says Dr. Uma Ramakrishnan, who is Nair’s mentor at NCBS.

“We hope that our pipeline will inspire more large-scale studies in chemical ecology that can be used to understand the remarkable biodiversity of this country and the world,” adds Olsson.

Self-monitoring diabetes reduces future costs by half: Study

Self-monitoring of type 2 diabetes used in combination with an electronic feedback system results in considerable savings on health care costs especially in sparsely populated areas, a new study from the University of Eastern Finland shows.

Self-monitoring delivers considerable savings on the overall costs of type 2 diabetes care, as well as on patients’ travel costs. Glycated hemoglobin testing is an important part of managing diabetes, and also a considerable cost item.

By replacing half of the required follow-up visits with self-measurements and electronic feedback, the annual total costs of glycated hemoglobin monitoring were reduced by nearly 60 per cent, bringing the per-patient cost down from 280 EUR (300 USD) to 120 EUR (130 USD). With fewer follow-up visits required, the average annual travel costs of patients were reduced by over 60 per cent, from 45 EUR (48 USD) to 17 EUR (18 USD) per patient. The study was published in the International Journal of Medical Informatics.

Carried out in the region of North Karelia in Finland, the study applies geographic information systems (GIS)-based geospatial analysis combined with patient registers. This was the first time the costs of type 2 diabetes follow-up were systematically calculated over a health care district in Finland. The study analysed 9,070 patients diagnosed with type 2 diabetes. Combined travel and time costs amounted to 21 per cent of the total costs of glycated hemoglobin monitoring for patients with type 2 diabetes.

“The societal cost-efficiency of type 2 diabetes care could be improved by taking into consideration not only the direct costs of glycated hemoglobin monitoring, but also the indirect costs, such as patients’ travel costs,” Researcher Aapeli Leminen from the University of Eastern Finland says.

The study used a georeferenced cost model to analyse health care accessibility and different costs associated with the follow-up of type 2 diabetes. Patients’ travel and time costs were analysed by looking at how well health care services could be reached on foot or by bike, or by using a private car, a bus, or a taxi. According to Leminen, the combination of patient registers and GIS opens up new opportunities for research within the health care sector.
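A stripped-down sketch of such a georeferenced cost model might decompose each follow-up visit into direct, travel, and time components by travel mode. All rates, speeds, and costs below are invented placeholders, not figures from the study.

```python
# All parameter values below are invented placeholders for illustration;
# the study's actual cost model and figures differ.
MODE_COST_PER_KM = {"walk": 0.0, "bike": 0.0, "car": 0.25, "bus": 0.10, "taxi": 1.20}
MODE_SPEED_KMH = {"walk": 5, "bike": 15, "car": 60, "bus": 30, "taxi": 60}

def visit_cost(distance_km, mode, direct_cost_eur=20.0, time_value_eur_per_h=10.0):
    """Societal cost of one follow-up visit: direct monitoring cost plus
    round-trip travel cost plus the monetary value of travel time."""
    travel_eur = 2 * distance_km * MODE_COST_PER_KM[mode]  # round trip
    time_eur = (2 * distance_km / MODE_SPEED_KMH[mode]) * time_value_eur_per_h
    return direct_cost_eur + travel_eur + time_eur

# A patient living 15 km from the clinic who drives:
print(visit_cost(15, "car"))  # 32.5 = 20.0 direct + 7.5 travel + 5.0 time
```

Summing such per-visit costs over a patient register, with distances taken from GIS road networks rather than a single number, is the essence of the model: replacing visits with self-measurements zeroes out the travel and time terms for those visits.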

“This cost model we’ve now tested in the eastern part of Finland can easily be used in other places as well to calculate the costs of different diseases, such as cancer and cardiovascular diseases.”

Forests may lose ability to protect against extremes of climate change: Study

Forests, one of the most dominant ecosystems on Earth, harbor significant biodiversity. Scientists have become increasingly interested in how this diversity is enhanced by the sheltering microclimates produced by trees.

A recent University of Montana study suggests that a warming climate in the Pacific Northwest would lessen the capacity of many forest microclimates to moderate climate extremes in the future.

The study was published in Ecography: A Journal of Space and Time in Ecology. It is online at http://bit.ly/2KcO1iC.

“Forest canopies produce microclimates that are less variable and more stable than similar settings without forest cover,” said Kimberley Davis, a UM postdoctoral research associate and the lead author of the study. “Our work shows that the ability of forests to buffer climate extremes is dependent on canopy cover and local moisture availability – both of which are expected to change as the Earth warms.”

She said many plants and animals that live in the understory of forests rely on the stable climate conditions found there. The study suggests some forests will lose their capacity to buffer climate extremes as water becomes limited at many sites.

“Changes in water balance, combined with accelerating canopy losses due to increases in the frequency and severity of disturbance, will create many changes in the microclimate conditions of western U.S. forests,” Davis said.

What makes dog man’s best friend?

From pugs to labradoodles to huskies, dogs are our faithful companions. They live with us, play with us and even sleep with us. But how did a once nocturnal, fearsome wolf-like animal evolve over tens of thousands of years to become beloved members of our family? And what can dogs tell us about human health? Through the power of genomics, scientists have been comparing dog and wolf DNA to try and identify the genes involved in domestication.

Amanda Pendleton, a postdoctoral research fellow in the Michigan Medicine Department of Human Genetics, has been reviewing current domestication research and noticed something peculiar about the DNA of modern dogs: at some places, it didn’t appear to match DNA from ancient dogs. Pendleton and her colleagues in the laboratory of assistant professor Jeffrey Kidd, Ph.D., are working to understand the dog genome to answer questions in genome biology, evolution and disease.

Breed dogs, which mostly arose around 300 years ago, are not fully reflective of the genetic diversity in dogs around the world, she explains. Three-quarters of the world’s dogs are so-called village dogs, who roam, scavenge for food near human populations and are able to mate freely. In order to get a fuller picture of the genetic changes at play in dog evolution, the team looked at 43 village dogs from places such as India, Portugal and Vietnam.

Armed with DNA from village dogs, ancient dogs found at burial sites from around 5,000 years ago, and wolves, they used statistical methods to tease out genetic changes that resulted from humans’ first efforts at domestication from those associated with the development of specific breeds. This new genetic review revealed 246 candidate domestication sites, most of them identified for the first time by their lab.

Now that they’d identified the candidate genes, the question remained: What do those genes do?

‘A good entry point’

Upon closer inspection, the researchers noticed that these genes influenced brain function, development and behavior. Moreover, the genes they found appeared to support what is known as the neural crest hypothesis of domestication. “The neural crest hypothesis posits that the phenotypes we see in domesticated animals over and over again — floppy ears, changes to the jaw, coloration, tame behavior — can be explained by genetic changes that act in a certain type of cell during development called neural crest cells, which are incredibly important and contribute to all kinds of adult tissues,” explains Pendleton. Many of the genetic sites they identified contained genes that are active in the development and migration of neural crest cells.

One gene in particular stood out: RAI1, the study’s highest-ranked gene. In a different lab within the Department of Human Genetics, Michigan Medicine assistant professor of human genetics Shigeki Iwase has been studying this gene’s function and role in neurodevelopmental disorders. He notes that in humans, changes to the RAI1 gene result in one of two syndromes: Smith-Magenis syndrome if RAI1 is missing, or Potocki-Lupski syndrome if RAI1 is duplicated.

Kidd said that they are using these changes that were selected for by humans for thousands of years as a way to understand the natural function and gene regulatory environment of the neural crest in all vertebrates.

Cyclone Prapiroon: NASA’s GPM satellite finds heavy rainfall on southwestern side

When the Global Precipitation Measurement mission or GPM core satellite passed over the Northwestern Pacific Ocean, it saw very heavy rainfall occurring in one part of Tropical Storm Prapiroon.

Tropical Depression 09W was located east of the Philippines when it was upgraded early today, June 29, to Tropical Storm Prapiroon. The tropical storm is in a favorable environment for intensification. Vertical wind shear is low above the tropical cyclone and sea surface temperatures are warm below.

NASA’s GPM core observatory satellite had a good view of Tropical Storm Prapiroon on June 29, 2018 at 0246 UTC (June 28 at 10:46 p.m. EDT). GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

At the time GPM passed overhead, Prapiroon was barely a tropical storm with maximum sustained wind speeds estimated at about 35 knots (40.3 mph). GPM’s Microwave Imager (GMI) and Dual-Frequency Precipitation Radar (DPR) instruments measured precipitation around Prapiroon. GPM showed that the intensifying storm was fairly large, with its most intense rainfall located in the southern part of the storm. GPM’s radar (DPR Ku Band) scanned convective storms in a feeder band on the southwestern side of the tropical storm, where it found that some very intense storms were dropping rain at a rate of over 192 mm (7.6 inches) per hour.

A 3-D view of Tropical Storm Prapiroon’s precipitation, looking toward the southwest, was created at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, using data captured by GPM’s radar (DPR Ku Band). GPM’s DPR probes provided excellent information about the powerful storms in the large rain band wrapping around Prapiroon’s western side. GPM’s radar measured storm tops in that part of the storm reaching heights above 12.5 km (7.8 miles).

On June 29 at 11 a.m. EDT (1500 UTC), Prapiroon was centered near 20.0 degrees north latitude and 129.7 degrees east longitude. That’s about 404 nautical miles south-southeast of Kadena Air Base, Okinawa, Japan. The storm is moving to the northwest at 5 knots (5.7 mph/9.2 kph). Maximum sustained winds were near 45 knots (52 mph/83 kph).

The Joint Typhoon Warning Center (JTWC) predicts that Prapiroon will move toward the north-northwest and intensify into a typhoon on June 30. Prapiroon is expected to continue intensifying and have peak wind speeds of about 75 knots (86 mph/139 kph) as it passes over the East China Sea in a few days.

Prapiroon is predicted by the JTWC to be a minimal typhoon with winds of 65 knots (75 mph/120 kph) as it approaches South Korea on July 2, 2018.

Facing the music of a volcanic eruption? Better listen to it beforehand, says study

A volcano in Ecuador with a deep cylindrical crater might be the largest musical instrument on Earth, producing unique sounds scientists could use to predict its eruption, said a new study.

Cotopaxi volcano in central Ecuador erupted in 2015, changing the shape of its crater. New infrasound recordings show that the now deep, narrow crater makes air reverberate against the crater walls when the volcano rumbles, producing sound waves like those of a pipe organ.

“It’s the largest organ pipe you’ve ever come across,” said Jeff Johnson, a volcanologist at Boise State University in Idaho and lead author of a new study detailing the findings in Geophysical Research Letters, a journal of the American Geophysical Union.

The new findings show that each volcano’s unique “voiceprint” can help scientists better monitor these natural hazards, alerting them to what is going on inside the volcano ahead of an impending eruption, said the study authors.

“Understanding how each volcano speaks is vital to understanding what’s going on,” Johnson said. “Once you realize how a volcano sounds, if there are changes to that sound, that leads us to think there are changes going on in the crater, and that causes us to pay attention.”

The ongoing eruption of Kilauea in Hawaii could be a proving ground for studying how changes to a crater’s shape influence the sounds it makes, according to Johnson.

The lava lake at Kilauea’s summit drained as the magma supplying it flowed downward, which should change the tones of the infrasounds emitted by the crater.

Listening to Kilauea’s infrasound could help scientists monitor the magma depth from afar and forecast its potential eruptive hazards, according to David Fee, a volcanologist at the University of Alaska Fairbanks who was not connected to the new study.

When magma levels at Kilauea’s summit drop, the magma can heat groundwater and cause explosive eruptions, which is believed to have happened at Kilauea over the past several weeks. This can change the infrasound emitted by the volcano.

“It’s really important for scientists to know how deep the crater is, whether the magma level is at the same depth and whether it’s interacting with the water table, which can create a significant hazard,” Fee said.

 

Teleconnection? To forecast winter rainfall in LA, take cue from New Zealand summer

Variability in El Niño cycles was long considered a reliable tool for predicting winter precipitation in the Southwest United States, but its forecasting power has diminished considerably over the years. Scientists at the University of California, Irvine (UCI) have found a new link that can predict wet or dry winter conditions months in advance.

“Influences between the hemispheres promise earlier and more accurate prediction of winter precipitation in California and the Southwest U.S.,” said study co-author Efi Foufoula-Georgiou of UCI. “Knowing how much rain to expect in the coming winter is crucial for the economy, water security, and ecosystem management of the region.”

The researchers call the new ‘teleconnection’ the New Zealand Index (NZI) because the sea surface temperature anomaly that triggers it begins in July and August in the southwestern Pacific Ocean near New Zealand.

The heating or cooling of the sea surface temperature there causes a change in the southern Hadley cell, a zone of atmospheric circulation from the equator to the 30-degree south parallel.

In turn, the Hadley cell change causes an anomaly east of the Philippine Islands, resulting in a strengthening or weakening of the jet stream in the Northern Hemisphere. That directly influences the amount of rain that falls on California from November through March.

In conducting the research, the team performed an analysis of sea surface temperatures and atmospheric pressures from 1950 to 2015 in 1- and 2-degree cells around the globe.
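The general approach described above, correlating a July–August sea surface temperature anomaly index with the following winter’s precipitation, can be sketched with entirely synthetic data. The numbers and coupling strength below are fabricated for illustration and are not the study’s NZI calculation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: one value per year, 1950-2015, mirroring the study's span.
years = np.arange(1950, 2016)
# Fabricated July-August SST anomaly near New Zealand (deg C):
sst_anomaly = rng.normal(0.0, 0.5, years.size)
# Fabricated Nov-Mar precipitation totals (mm) with an assumed coupling,
# purely so the sketch has something to detect:
precip = 300.0 + 80.0 * sst_anomaly + rng.normal(0.0, 40.0, years.size)

# Pearson correlation between the summer index and winter precipitation:
r = np.corrcoef(sst_anomaly, precip)[0, 1]
print(f"correlation r = {r:.2f}")
```

In the real analysis, a persistent positive (or negative) correlation of this kind between the two hemispheres is what makes the index useful for a forecast issued months before the rainy season.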

“With the NZI, we can predict the likelihood of above- or below-normal winter precipitation in the Southwest U.S.,” said lead author Antonios Mamalakis of UCI. “Our research also shows an amplification of this newly discovered ‘teleconnection’ over the past four decades.”

Mamalakis said that the most unexpected result is the discovery of persistent sea surface temperature and atmospheric pressure patterns in the southwestern Pacific that show a strong connection with precipitation in Southern California, Nevada, Arizona and Utah.

In recent years, however, El Niño conditions did not bring heavy rains to California as they had in the past, prompting researchers to look for other predictive links such as this interhemispheric bridge.

The study was published in Nature Communications. 

Weather forecasts to become more accurate with new rainfall estimation technique

Rainfall forecasts have long been prone to error, but researchers from the University of Missouri have developed a system that improves the precision of forecasts by accounting for evaporation in rainfall estimates, particularly for locations 30 miles or more from the nearest National Weather Service radar.

“Right now, forecasts are generally not accounting for what happens to a raindrop after it is picked up by radar,” said Neil Fox of the School of Natural Resources at MU. “Evaporation has a substantial impact on the amount of rainfall that actually reaches the ground. By measuring that impact, we can produce more accurate forecasts that give farmers, agriculture specialists and the public the information they need.”

Fox and doctoral student Quinn Pallardy used dual-polarization radar, which sends out two radar beams polarized horizontally and vertically, to differentiate between the sizes of raindrops. The size of a raindrop affects both its evaporation rate and its motion, with smaller raindrops evaporating more quickly but encountering less air resistance, so distinguishing drop sizes helped the researchers make predictions more accurate.

By combining this information with a model that assessed the humidity of the atmosphere, the researchers were able to develop a tracing method that followed raindrops from the point when they were observed by the radar to when they hit the ground, precisely determining how much evaporation would occur for any given raindrop.
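A highly simplified sketch of such a tracing step is below. The terminal-velocity formula is the well-known empirical fit of Atlas et al. (1973), but the evaporation term is a toy placeholder, not the humidity-based model the researchers actually used:

```python
import math


def terminal_velocity(d_mm: float) -> float:
    """Empirical raindrop terminal velocity (Atlas et al., 1973), in m/s,
    for a drop of diameter d_mm millimeters."""
    return 9.65 - 10.3 * math.exp(-0.6 * d_mm)


def surviving_fraction(d_mm: float, beam_height_m: float,
                       evap_rate: float = 1e-5) -> float:
    """Toy model: fraction of a drop's mass surviving evaporation while it
    falls from the radar beam to the ground. evap_rate is an assumed
    placeholder constant (per second, scaled by 1/diameter), standing in
    for the study's humidity-dependent calculation."""
    fall_time = beam_height_m / terminal_velocity(d_mm)  # seconds aloft
    return math.exp(-evap_rate * fall_time / d_mm)


# A small drop sampled 2 km up falls slowly and loses more mass than a
# large drop observed at the same height:
for d in (1.0, 3.0):
    print(d, round(surviving_fraction(d, 2000.0), 3))
```

The qualitative point matches the article: the higher the beam (i.e., the farther from the radar) and the smaller the drops, the more the radar overestimates rain reaching the ground unless evaporation is modeled.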

Researchers found that this method significantly improved the accuracy of rainfall estimates, especially in locations at least 30 miles from the nearest National Weather Service radar. Radar beams rise higher into the atmosphere as they travel, and as a result, radar that does not account for evaporation becomes less accurate at greater distances because it observes raindrops that have not yet evaporated.

“Many of the areas that are further from the radar have a lot of agriculture,” Fox said. “Farmers depend on rainfall estimates to help them manage their crops, so the more accurate we can make forecasts, the more those forecasts can benefit the people who rely on them.”

Fox said more accurate rainfall estimates also contribute to better weather forecasts in general, as rainfall can affect storm behavior, air quality and a variety of other weather factors.

NASA satellite Aqua captures formation of Tropical Storm Maliksi near the Philippines

Image: NASA’s Aqua satellite captured Tropical Storm Maliksi over the Philippine Sea on June 8. Credit: NASA

Tropical Storm Maliksi formed in the Philippine Sea, off the northeastern coast of the Philippines as NASA’s Aqua satellite passed overhead.

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard Aqua captured an image of the storm on June 8 that showed the circulation center over open waters of the Philippine Sea. Bands of thunderstorms circling the center extended over the northern and central Philippines bringing rainfall and gusty winds.

On June 8 at 5 a.m. EDT (0900 UTC), Tropical Storm Maliksi was located near 19.5 degrees north latitude and 127.2 degrees east longitude. That’s about 443 nautical miles northeast of Manila, Philippines. Maliksi was moving north at 12 knots (14 mph/22 kph). Maximum sustained winds were near 40 knots (46 mph/74 kph).

Maliksi is forecast to move to the northeast and parallel the coast of Japan while remaining several hundred miles off shore.

KCET 2018 Document Verification Postponed, New Dates to be Announced Shortly

The Karnataka Examinations Authority (KEA) has withdrawn the proposed schedule for verification of documents under the Karnataka Common Entrance Test (KCET) 2018 for admission to engineering and allied courses in the state. The new schedule will be announced on its official website, the KEA said in a statement.

“Schedule published for commencement of verification of documents at all the helpline centres from 7-06-2018 for CET-2018 is withdrawn. The document verification will not be commenced from 07-06-2018. The revised document verification schedule will be published shortly on the KEA Website,” said the KCET exam authority KEA.

It further advised candidates to keep visiting the KEA website http://kea.kar.nic.in for the revised document verification schedule for CET-2018. Around 1,25,000 candidates will be called for the counselling process in phases spread over two weeks.

KCET 2018 results for admission to engineering seats in Karnataka were announced on Friday, June 1, days after results of COMEDK, the other entrance test for private engineering colleges, were announced on Monday, May 28.

The KEA usually conducts its counselling first for the government seats in engineering and other government-quota seats in the state’s aided and unaided private colleges.

KCET takes into account 50 percent of Class 12 or state Pre-University Course (PUC) marks and another 50 percent from the marks scored by candidates in the respective subjects in KCET 2018. Now that the final ranking is known, candidates can seek admission to respective colleges at a government-fixed fee of approximately Rs 60,000, less than one-third of the COMEDK fee of about Rs 2,00,000 per annum for these seats.
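The 50/50 weighting described above can be sketched in a couple of lines; the candidate’s marks here are hypothetical:

```python
def kcet_composite(puc_percent: float, kcet_percent: float) -> float:
    """Equal 50/50 weighting of PUC (Class 12) subject marks and KCET
    subject marks, as described for the engineering rank computation."""
    return 0.5 * puc_percent + 0.5 * kcet_percent


# Hypothetical candidate: 92% in the PUC subjects, 78% in KCET.
print(kcet_composite(92.0, 78.0))  # 85.0
```

The final rank list orders candidates by this composite score, so a strong board result can offset a weaker entrance-test performance and vice versa.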

KCET 2018 was held from April 18 to 20 this year, and more than 1.98 lakh candidates in Karnataka appeared for the exam seeking admission to B.Tech or BE programmes in various state universities, government colleges and private institutes. The top college, RV College of Engineering, usually has a cutoff rank of around 74 in computer science, followed by PES University and others.
