NASA Engineer Develops Tiny, High-Powered Terahertz Laser to Find Water on the Moon

Finding water on the Moon could be easier with a Goddard technology that uses an effect called quantum tunneling to generate a high-powered terahertz laser, filling a gap in existing laser technology.

Locating water and other resources is a NASA priority crucial to exploring Earth’s natural satellite and other objects in the solar system and beyond. Previous experiments inferred, then confirmed, the existence of small amounts of water across the Moon. However, most technologies cannot distinguish among water, free hydrogen ions, and hydroxyl, because the broadband detectors they use do not resolve these different volatiles.

Goddard engineer Dr. Berhanu Bulcha said a type of instrument called a heterodyne spectrometer could zoom in on particular frequencies to definitively identify and locate water sources on the Moon. It would need a stable, high-powered, terahertz laser, which was prototyped in collaboration with Longwave Photonics through NASA’s Small Business Innovation Research (SBIR) program.

“This laser allows us to open a new window to study this frequency spectrum,” he said. “Other missions found hydration on the Moon, but that could indicate hydroxyl or water. If it’s water, where did it come from? Is it indigenous to the formation of the Moon, or did it arrive later by comet impacts? How much water is there? We need to answer these questions because water is critical for survival and can be used to make fuel for further exploration.”

As the name implies, spectrometers detect spectra or wavelengths of light in order to reveal the chemical properties of matter that light has touched. Most spectrometers tend to operate across broad sections of the spectrum. Heterodyne instruments dial in to very specific light frequencies such as infrared or terahertz. Hydrogen-containing compounds like water emit photons in the terahertz frequency range — 2 trillion to 10 trillion cycles per second — between microwave and infrared.

Like a microscope for subtle differences within a bandwidth like terahertz, heterodyne spectrometers combine a local laser source with incoming light. Measuring the difference between the laser source and the combined wavelength provides accurate readings between sub-bandwidths of the spectrum.
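The mixing step described above can be sketched numerically. In this toy simulation (all frequencies are illustrative choices, not the instrument’s actual design values), multiplying an incoming 3.010 THz signal by a 3.000 THz local oscillator produces a beat note at the 10 GHz difference frequency, which is slow enough to digitize and measure precisely:

```python
import numpy as np

# Toy heterodyne mixing demo: the product of the incoming signal and the
# local oscillator contains sum- and difference-frequency terms; only
# the low "beat" (difference) term is of interest for readout.

f_sky = 3.010e12   # hypothetical incoming terahertz line, 3.010 THz
f_lo = 3.000e12    # hypothetical local-oscillator laser, 3.000 THz

fs = 2.0e13                     # simulation sampling rate, 20 THz
t = np.arange(0, 2e-9, 1 / fs)  # 2 ns of signal (40,000 samples)
mixed = np.cos(2 * np.pi * f_sky * t) * np.cos(2 * np.pi * f_lo * t)

# A low-pass stage would reject the ~6 THz sum term, so search for the
# dominant spectral peak below 100 GHz.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
low = freqs < 1e11
beat = freqs[low][np.argmax(spectrum[low])]
print(f"beat frequency: {beat / 1e9:.1f} GHz")  # difference of the two inputs
```

In a real receiver the mixer is a physical nonlinear element and the sum term is filtered out in hardware, but the principle is the same: a terahertz line is read out through a much lower difference frequency.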

Traditional lasers generate light by exciting an electron within an atom’s outer shell, which then emits a single photon as it transitions, or returns to its resting energy level. Different atoms produce different frequencies of light based on the fixed amount of energy it takes to excite one electron. However, lasers fall short in a particular portion of the spectrum between infrared and microwave known as the terahertz gap.

“The problem with existing laser technology,” Dr. Bulcha said, “is that no materials have the right properties to produce a terahertz wave.”

This tiny laser capitalizes on quantum-scale effects of materials just tens of atoms across to generate a high-powered beam in a portion of the spectrum where traditional lasers fade in strength. Credit: NASA/Michael Giunto

Electromagnetic oscillators like those that generate radio or microwave frequencies produce low-powered terahertz pulses by using a series of amplifiers and frequency multipliers to extend the signal into the terahertz range. However, this process requires considerable voltage, and the materials used to amplify and multiply the pulse have limited efficiency, so the signal loses power as it approaches terahertz frequencies.

From the other side of the terahertz gap, optical lasers pump energy into a gas to generate photons. However, high-powered, terahertz-band optical lasers are large and power hungry, making them unsuitable for space exploration, where mass and power are limited, particularly in handheld or small-satellite applications. The power of the pulse also drops as optical lasers push toward terahertz bandwidths.

To fill that gap, Dr. Bulcha’s team is developing quantum cascade lasers that produce photons from each electron transition event by taking advantage of some unique, quantum-scale physics of materials layered just a few atoms thick.

In these materials, a laser emits photons in a specific frequency determined by the thickness of alternating layers of semiconductors rather than the elements in the material. In quantum physics, the thin layers increase the chance that a photon can then tunnel through to the next layer instead of bouncing off the barrier. Once there, it excites additional photons. Using a generator material with 80 to 100 layers, totaling less than 10 to 15 microns thick, the team’s source creates a cascade of terahertz-energy photons.
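The layer-thickness dependence can be illustrated with the simplest possible model, an infinite quantum well, in which the spacing of energy levels scales as one over the thickness squared. Real quantum cascade lasers use coupled, band-engineered wells rather than this idealized picture, and the GaAs effective mass below is only a representative assumption:

```python
# Toy model of why layer thickness sets the emission frequency: for an
# idealized infinite quantum well of width L, level energies are
# E_n = n^2 h^2 / (8 m* L^2), so the 2 -> 1 transition energy is 3*E_1.

h = 6.626e-34          # Planck constant, J*s
m_e = 9.109e-31        # electron rest mass, kg
m_eff = 0.067 * m_e    # effective mass of an electron in GaAs (assumed)

def transition_thz(width_m):
    """Frequency of the 2 -> 1 intersubband transition, in THz."""
    e1 = h ** 2 / (8 * m_eff * width_m ** 2)
    return 3 * e1 / h / 1e12

# Thicker layers give lower frequencies, spanning the ~2-10 THz band.
for nm in (20, 30, 40):
    print(f"{nm} nm well -> {transition_thz(nm * 1e-9):.1f} THz")
```

Even this crude model lands in the terahertz range for layers a few tens of nanometers thick, which is why thickness, not the choice of element, sets the output frequency.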

This cascade consumes less voltage to generate a stable, high-powered light. One drawback of this technology is that its beam spreads out at a large angle, dissipating quickly over short distances. Using innovative technology supported by Goddard’s Internal Research and Development (IRAD) funding, Dr. Bulcha and his team integrated the laser on a waveguide with a thin optical antenna to tighten the beam. The integrated laser and waveguide unit reduces this dissipation by 50% in a package smaller than a quarter.

He hopes to continue the work to make a flight-ready laser for NASA’s Artemis program.

The laser’s small size and low power consumption allow it to fit in a 1U CubeSat, about the size of a teapot, along with the spectrometer hardware, processor, and power supply. It could also power a handheld device for use by future explorers on the Moon, Mars, and beyond.

Twitter saga continues after Zatko revelations; Parag ridicules false claims

Twitter’s Indian-origin CEO Parag Agrawal has lashed out at the company’s former security chief, Peiter ‘Mudge’ Zatko, terming his claims false and riddled with inaccuracies.

Reacting to the ongoing bots controversy involving Zatko, who was fired in January, Agrawal said, “We are reviewing the redacted claims that have been published, but what we’ve seen so far is a false narrative that is riddled with inconsistencies and inaccuracies, and presented without important context.”

Zatko claimed that Twitter lied about the actual number of bots on its platform and misled federal regulators about users’ data safety, allegations that bolster Tesla CEO Elon Musk’s case for withdrawing from his takeover bid.

“There are news reports outlining claims about Twitter’s privacy, security, and data protection practices that were made by Mudge Zatko, a former Twitter executive who was terminated in January 2022 for ineffective leadership and poor performance,” Agrawal said in an internal message sent to the staff.

Zatko also alleged that the Indian government forced the micro-blogging platform to hire a “government agent” and allow him access to users’ sensitive data, a claim that was trashed by Twitter.

Agrawal said that this is frustrating and confusing to read, “given Mudge was accountable for many aspects of this work that he is now inaccurately portraying more than six months after his termination”.

“But none of this takes away from the important work you have done and continue to do to safeguard the privacy and security of our customers and their data,” he told employees.

Zatko’s disclosure before SEC

According to Zatko’s disclosure before the US Securities and Exchange Commission (SEC), Twitter has “major security problems that pose a threat to its own users’ personal information, to company shareholders, to national security, and to democracy”.

Agrawal said that given the spotlight on Twitter, “we can assume that we will continue to see more headlines in the coming days — this will only make our work harder. We will pursue all paths to defend our integrity as a company and set the record straight.”

Famous Galileo manuscript at Michigan University turns out to be a forgery

A prized Galileo manuscript at the University of Michigan Library has been found to be a forgery, according to a university investigation prompted by a historian who flagged it. Based on the watermarks, investigators determined that the document is no more than a century old, not from 1609 as claimed.

“It was pretty gut-wrenching when we first learned our Galileo was not actually a Galileo,” admitted Donna L. Hayward, the interim dean of Michigan’s libraries.

A handwritten manuscript believed to be the original work of astronomer Galileo Galilei in 1609 turned out to be a modern forgery. (Image credit: University of Michigan Library)

The piece was donated to the university in 1938 by trustee Tracy McGregor; it had purportedly been authenticated by Cardinal Pietro Maffi (1858-1931), the Archbishop of Pisa.

The manuscript purported to be Galileo’s rough draft of a 1609 letter to the Doge of Venice describing his new invention, the telescope; the final version is held in the State Archive in Venice, Italy. The sheet also records observations of the moons of Jupiter that Galileo discovered and wrote about in his letter.

Credit for discovering the forgery goes to historian Nick Wilding, who found the watermark odd and sought a probe. The university concluded three months later that the manuscript was a forgery and disclosed the finding publicly. It suspects the work of Tobia Nicotra, a notorious Italian forger.

The watermark on the paper belonged to a paper mill that operated after 1770, while Galileo wrote the letter in 1609. “It just kind of jumps out as weird,” Wilding told The New York Times.

 

Emotional AI and gen Z: The attitude towards new technology and its concerns

Artificial intelligence (AI) underpins everything that comes under “smart technology” today. From self-driving cars to voice assistants on our smartphones, AI has a ubiquitous presence in our daily lives. Yet it has lacked one crucial feature: the ability to engage with human emotions.

The scenario is quickly changing, however. Algorithms that can sense human emotions and interact with them are quickly becoming mainstream as they come embedded in existing systems. Known as “emotional AI,” the new technology achieves this feat through a process called “non-conscious data collection”(NCDC), in which the algorithm collects data on the user’s heart and respiration rate, voice tones, micro-facial expressions, gestures, etc. to analyze their moods and personalize its response accordingly.

However, the unregulated nature of this technology has raised many ethical and privacy concerns. In particular, it is important to understand the attitude towards NCDC of the largest current demographic, Generation Z (Gen Z). Making up 36% of the global workforce, Gen Z is likely to be the most vulnerable to emotional AI. Moreover, AI algorithms are rarely calibrated for socio-cultural differences, making their implementation all the more concerning.

“We found that being male and having high income were both correlated with having positive attitudes towards accepting NCDC. In addition, business majors were more likely to be tolerant towards NCDC,” highlights Prof. Ghotbi. Cultural factors, such as region and religion, were also found to have an impact, with people from Southeast Asia, Muslims, and Christians reporting concern over NCDC.

“Our study clearly demonstrates that sociocultural factors deeply impact the acceptance of new technology. This means that theories based on the traditional technology acceptance model by Davis, which does not account for these factors, need to be modified,” explains Prof. Mantello.

The study addressed this issue by proposing a “mind-sponge” model-based approach that accounts for socio-cultural factors in assessing the acceptance of AI technology. Additionally, it also suggested a thorough understanding of the potential risks of the technology to enable effective governance and ethical design. “Public outreach initiatives are needed to sensitize the population about the ethical implications of NCDC. These initiatives need to consider the demographic and cultural differences to be successful,” says Dr. Nguyen.

Overall, the study highlights the extent to which emotional AI and NCDC technologies are already present in our lives and the privacy trade-offs they imply for the younger generation. Thus, there is an urgent need to make sure that these technologies serve both individuals and societies well.

How to detect nanoplastics present in air

Large pieces of plastic can break down into nanosized particles that often find their way into the soil and water. Perhaps less well known is that they can also float in the air. It’s unclear how nanoplastics impact human health, but animal studies suggest they’re potentially harmful. As a step toward better understanding the prevalence of airborne nanoplastics, researchers have developed a sensor that detects these particles and determines the types, amounts and sizes of the plastics using colorful carbon dot films.

The researchers will present their results today at the fall meeting of the American Chemical Society (ACS). ACS Fall 2022 is a hybrid meeting being held virtually and in-person Aug. 21–25, with on-demand access available Aug. 26–Sept. 9. The meeting features nearly 11,000 presentations on a wide range of science topics.

“Nanoplastics are a major concern if they’re in the air that you breathe, getting into your lungs and potentially causing health problems,” says Raz Jelinek, Ph.D., the project’s principal investigator. “A simple, inexpensive detector like ours could have huge implications, and someday alert people to the presence of nanoplastics in the air, allowing them to take action.”

Millions of tons of plastic are produced and thrown away each year. Some plastic materials slowly erode while they’re being used or after being disposed of, polluting the surrounding environment with micro- and nanosized particles. Nanoplastics are so small — generally less than 1-µm wide — and light that they can even float in the air, where people can then unknowingly breathe them in. Animal studies suggest that ingesting and inhaling these nanoparticles may have damaging effects. Therefore, it could be helpful to know the levels of airborne nanoplastic pollution in the environment.

Previously, Jelinek’s research team at Ben-Gurion University of the Negev developed an electronic nose or “e-nose” for monitoring the presence of bacteria by adsorbing and sensing the unique combination of gas vapor molecules that they release. The researchers wanted to see if this same carbon-dot-based technology could be adapted to create a sensitive nanoplastic sensor for continuous environmental monitoring.

Carbon dots are formed when a starting material that contains lots of carbon, such as sugar or other organic matter, is heated at a moderate temperature for several hours, says Jelinek. This process can even be done using a conventional microwave. During heating, the carbon-containing material develops into colorful, and often fluorescent, nanometer-size particles called “carbon dots.” And by changing the starting material, the carbon dots can have different surface properties that can attract various molecules.

To create the bacterial e-nose, the team spread thin layers of different carbon dots onto tiny electrodes, each the size of a fingernail. They used interdigitated electrodes, which have two sides with interspersed comb-like structures. Between the two sides, an electric field develops, and the stored charge is called capacitance. “When something happens to the carbon dots — either they adsorb gas molecules or nanoplastic pieces — then there is a change of capacitance, which we can easily measure,” says Jelinek.

Then the researchers tested a proof-of-concept sensor for nanoplastics in the air, choosing carbon dots that would adsorb common types of plastic — polystyrene, polypropylene and poly(methyl methacrylate). In experiments, nanoscale plastic particles were aerosolized, making them float in the air. And when electrodes coated with carbon-dot films were exposed to the airborne nanoplastics, the team observed signals that were different for each type of material, says Jelinek. Because the number of nanoplastics in the air affects the intensity of the signal generated, Jelinek adds that currently, the sensor can report the amount of particles from a certain plastic type either above or below a predetermined concentration threshold. Additionally, when polystyrene particles in three sizes — 100-nm wide, 200-nm wide and 300-nm wide — were aerosolized, the sensor’s signal intensity was directly related to the particles’ size.
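The readout logic described above can be sketched as a small classifier. The reference signatures and threshold below are invented placeholders, not values from the study; each plastic type is simply assumed to leave a distinct pattern of capacitance change across three carbon-dot film channels:

```python
# Hypothetical post-processing for a carbon-dot capacitance sensor:
# match a raw reading to the nearest reference signature, then report
# whether its magnitude exceeds a preset concentration threshold.

REFERENCE = {                       # normalized response per film channel
    "polystyrene":   (1.00, 0.35, 0.10),
    "polypropylene": (0.30, 1.00, 0.25),
    "PMMA":          (0.15, 0.40, 1.00),
}
THRESHOLD = 0.5                     # arbitrary concentration alert level

def classify(reading):
    """Return (plastic type, 'above'/'below' the threshold)."""
    peak = max(reading)
    norm = tuple(r / peak for r in reading)
    best = min(REFERENCE, key=lambda name: sum(
        (a - b) ** 2 for a, b in zip(norm, REFERENCE[name])))
    return best, ("above" if peak >= THRESHOLD else "below")

print(classify((0.90, 0.30, 0.10)))  # ('polystyrene', 'above')
```

This mirrors the behavior described in the article: the pattern of responses identifies the plastic type, while the signal intensity is compared against a predetermined concentration threshold.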

The team’s next step is to see if their system can distinguish the types of plastic in mixtures of nanoparticles. Just as the combination of carbon dot films in the bacterial e-nose distinguished between gases with differing polarities, Jelinek says it’s likely that they could tweak the nanoplastic sensor to differentiate between additional types and sizes of nanoplastics. The capability to detect different plastics based on their surface properties would make nanoplastic sensors useful for tracking these particles in schools, office buildings, homes and outdoors, he says.

This tiny sensor detects medicine levels from a drop of sweat in 30 seconds

Lithium can alleviate the symptoms of bipolar disorder and depression — if taken in just the right amount. Too little won’t work, while too much can bring on dangerous side effects. To precisely monitor the amount of this medication in the body, patients must undergo invasive blood tests. But today, scientists report the invention of a tiny sensor that detects lithium levels from sweat on the surface of a fingertip in as little as 30 seconds, without a trip to the clinic.

Not only must lithium be taken at a certain dosage, but patients often struggle to take it as prescribed and may miss pills. So, when the medication doesn’t appear to be working, health care providers need to know how much medication the patient is actually swallowing. But current options for monitoring have significant drawbacks. For example, blood draws produce accurate results, but they are invasive and time consuming. Pill counters, meanwhile, don’t directly measure the intake of the medication. To address these limitations, the team turned to another body fluid.

“Although it may not be visible, the human body constantly produces sweat, often only in very small amounts,” says Shuyu Lin, Ph.D., a postgraduate student researcher who is co-presenting the work with graduate student Jialun Zhu at the meeting. “Small molecules derived from medication, including lithium, show up in that sweat. We recognized this as an opportunity to develop a new type of sensor that would detect these molecules.”

“Through a single touch, our new device can obtain clinically useful molecular-level information about what is circulating in the body,” says Sam Emaminejad, Ph.D., the project’s principal investigator, who is at the University of California, Los Angeles (UCLA). “We already interact with a lot of touch-based electronics, such as smart phones and keyboards, so this sensor could integrate seamlessly into daily life.”

Devising a sensor to detect lithium presented some technical challenges, however. Sweat is generally only present in minute amounts, but the electrochemical sensing needed to detect charged particles of lithium required an aqueous, or watery, environment. To provide it, the team engineered a water-based gel containing glycerol. This extra ingredient prevented the gel from drying out and created a controlled environment for the electronic portion of the sensor.

To trap the lithium ions after they traversed the gel, the team used an ion-selective electrode. The accumulating ions generate a difference in electrical potential compared with a reference electrode. The researchers used this difference to infer the concentration of lithium present in sweat. Together, these components comprise a tiny, rectangular sensor that is smaller than the head of a thumbtack and can detect lithium in about 30 seconds. The sensor is still in the preliminary testing phase, but ultimately, the researchers envision incorporating it into a larger, yet-to-be designed system that provides visual feedback to the provider or the patient.
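The potential-to-concentration step follows the textbook Nernst relation for a singly charged ion, roughly a 59 mV shift per tenfold change in concentration at room temperature. A minimal sketch of that idealized relation (a real sensor would be calibrated empirically, and the reference concentration here is an assumption):

```python
import math

# Idealized Nernstian readout for a lithium ion-selective electrode
# (charge z = 1): the electrode potential shifts by (RT/F)*ln(10),
# about 59.2 mV, per tenfold change in Li+ concentration at 25 °C.

R, F, T = 8.314, 96485.0, 298.15    # J/(mol*K), C/mol, kelvin (25 °C)
SLOPE = R * T / F                    # volts per ln-unit, ~25.7 mV

def lithium_molar(delta_v, c_ref=1.0e-3):
    """Concentration (mol/L) inferred from the potential difference
    delta_v (volts) relative to a reference reading at c_ref mol/L."""
    return c_ref * math.exp(delta_v / SLOPE)

print(f"{lithium_molar(0.0592):.4f} mol/L")  # ~10x the 0.001 M reference
```

The logarithmic response is what makes this approach workable for the tiny, variable volumes of sweat involved: the measured voltage tracks the ratio of concentrations rather than an absolute signal size.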

After characterizing the sensor using an artificial fingertip, the team recruited real people to test it, including one person on a lithium treatment regimen. The researchers recorded this person’s lithium levels before and after taking the medication. They found that these measurements fell close to those derived from saliva, which prior research has shown to accurately measure lithium levels. In the future, the researchers plan to study the effects of lotion and other skin products on the sensor’s readings.

This technology also has applications beyond lithium. Emaminejad is developing similar touch-based sensors to monitor alcohol and acetaminophen, a painkiller also known as Tylenol®, while also exploring the possibility of detecting other substances. The complete sensing systems could include additional features, such as encryption secured by a fingerprint, or, for substances prone to abuse, a robotic dispensing system that releases medication only if the patient has a low level in their bloodstream.

The researchers acknowledge support and funding from the National Science Foundation, Brain and Behavior Foundation, Precise Advanced Technologies and Health Systems for Underserved Populations and the UCLA Henry Samueli School of Engineering and Applied Sciences.

AI to help Indian recruiters eliminate bias, speed up hiring

As artificial intelligence (AI) enters office systems of every kind, nearly 50 per cent of recruiters believe that it will become a regular part of their hiring process in the coming years, according to a report by chat-based direct hiring platform Hirect.

A whopping 96.5 per cent of recruiters at startups and small and medium enterprises (SMEs) in India believe that the use of AI will improve the recruitment process and eliminate bias from hiring, said the report released on Tuesday.

Further, 52 per cent of the recruiters said building a diverse workforce is necessary to address the huge disparity in the representation of women in leadership roles; 97.4 per cent believe that skill-based hiring is the future and a necessity; and 87 per cent are “in favour of retaining old employees instead of hiring new ones.”

“In the employee-driven market, the employers must quickly adapt to the current reality of talent acquisition to remain competitive in today’s labour market,” said Raj Das, Global Co-founder and CEO of Hirect India.

Startups often rely on referrals, which is why they formulate referral policies; around 88.2 per cent of recruiters believe that referrals are the best way to hire people with the right talent, the report added.

 

Beware of cash payments at hospitals, you may come under Income-Tax radar

The Income Tax department is scanning transactions at hospitals where bills are paid in cash, since large cash payments violate banking rules and can amount to tax evasion.

The I-T sleuths have decided to monitor cash transactions at hospitals, banquet halls and businesses in a recent move to prevent tax evasion. Under the rules, any cash transaction above Rs. 20,000 comes under the scanner and can land you in trouble. Collecting hand loans or making investments in cash is prohibited under the law, and all such transactions should be routed through banks and accounted for.

To begin with, the department is currently scanning hospitals and the patients who have paid significant sums of money to private medical institutions. Hospitals, political parties and religious institutions cannot accept total cash payments of Rs. 2 lakh or more from a single person, and such payments are not eligible for a tax deduction.


The department is also bringing under its radar professions and businesses where cash payments are common. In health care, PAN card details are to be duly recorded upon a patient’s admission. Several health care facilities have periodically ignored this rule, department officials told media sources.

Banquet halls, high-end marketplaces and architects are currently under the scanner for tax evasion, while others will soon be brought under the net. Once concrete evidence is established, the I-T department is likely to send them notices. Most of these cash transactions are prevalent in small towns, where the department’s limited presence gives the impression that its tax net is not widespread.

 

 

Scientists take a deep dive into how ‘elasmobranchs’ use the ocean depth

Using sophisticated electronic tags, scientists have assembled a large biologging dataset to garner comparative insights on how sharks, rays, and skates – also known as “elasmobranchs” – use the ocean depths. While some species spend their entire lives in shallow waters close to our shores on the continental shelf, others plunge hundreds of meters or more down the slope waters into the twilight zone, beyond where sunlight penetrates. This new understanding of how elasmobranchs use the ocean will give policymakers and resource managers the opportunity to examine the threats these animals face and will guide future management and conservation plans.

A study published Aug. 19 in Science Advances, led by Stanford University and ZSL (Zoological Society of London) researchers, is the largest global investigation of where and when a diverse group of elasmobranchs move vertically. A team of 171 researchers from 135 institutions across 25 countries brought together two decades of data from satellite and archival tags that remotely tracked the movements and behaviors of 38 species in oceans across the globe.

“For the first time, we have a standardized, global database that we used to fill important knowledge gaps about the diving behaviors of sharks and rays,” said Samantha Andrzejaczek, co-lead author of the study and a postdoctoral research fellow at the Hopkins Marine Station of Stanford University. “This will enable better understanding of what fisheries interact with elasmobranchs and how to improve management of many of these long-lived animals.”

Sharpest image ever of universe’s most massive known star

By harnessing the capabilities of the 8.1-meter Gemini South telescope in Chile, which is part of the International Gemini Observatory operated by NSF’s NOIRLab, astronomers have obtained the sharpest image ever of the star R136a1, the most massive known star in the Universe. Their research, led by NOIRLab astronomer Venu M. Kalari, challenges our understanding of the most massive stars and suggests that they may not be as massive as previously thought.

Astronomers have yet to fully understand how the most massive stars — those more than 100 times the mass of the Sun — are formed. One particularly challenging piece of this puzzle is obtaining observations of these giants, which typically dwell in the densely populated hearts of dust-shrouded star clusters. Giant stars also live fast and die young, burning through their fuel reserves in only a few million years. In comparison, our Sun is less than halfway through its 10 billion year lifespan. The combination of densely packed stars, relatively short lifetimes, and vast astronomical distances makes distinguishing individual massive stars in clusters a daunting technical challenge.

By pushing the capabilities of the Zorro instrument on the Gemini South telescope of the International Gemini Observatory, operated by NSF’s NOIRLab, astronomers have obtained the sharpest-ever image of R136a1 — the most massive known star. This colossal star is a member of the R136 star cluster, which lies about 160,000 light-years from Earth in the center of the Tarantula Nebula in the Large Magellanic Cloud, a dwarf companion galaxy of the Milky Way.

Previous observations suggested that R136a1 had a mass somewhere between 250 and 320 times the mass of the Sun. The new Zorro observations, however, indicate that this giant star may be only 170 to 230 times the mass of the Sun. Even with this lower estimate, R136a1 still qualifies as the most massive known star.

Astronomers are able to estimate a star’s mass by comparing its observed brightness and temperature with theoretical predictions. The sharper Zorro image allowed NSF’s NOIRLab astronomer Venu M. Kalari and his colleagues to more accurately separate the brightness of R136a1 from that of its nearby stellar companions, which led to a lower estimate of its brightness and therefore of its mass.
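The logic of the revision can be sketched with a toy mass-luminosity relation. The exponent, scale factor, and luminosities below are all hypothetical, real estimates come from detailed stellar-atmosphere modeling, but the sketch shows why subtracting companion light from a blended measurement lowers the inferred mass:

```python
# Toy illustration: if luminosity grows with mass as a power law, then a
# lower deblended luminosity maps to a lower mass estimate.

def mass_from_luminosity(l_solar, exponent=1.4, scale=4.0e3):
    """Invert a hypothetical mass-luminosity law L = scale * M**exponent."""
    return (l_solar / scale) ** (1.0 / exponent)

blended = 8.0e6      # hypothetical luminosity of the unresolved blend (L_sun)
companions = 3.0e6   # hypothetical flux contributed by nearby stars

print(f"blended estimate:   {mass_from_luminosity(blended):5.0f} M_sun")
print(f"separated estimate: {mass_from_luminosity(blended - companions):5.0f} M_sun")
```

The direction of the effect is the key point: sharper imaging removes light that did not belong to R136a1, and the mass estimate falls with it.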

“Our results show us that the most massive star we currently know is not as massive as we had previously thought,” explained Kalari, lead author of the paper announcing this result. “This suggests that the upper limit on stellar masses may also be smaller than previously thought.”

This result also has implications for the origin of elements heavier than helium in the Universe. These elements are created during the cataclysmically explosive deaths of stars more than 150 times the mass of the Sun in events that astronomers refer to as pair-instability supernovae. If R136a1 is less massive than previously thought, the same could be true of other massive stars, and consequently pair-instability supernovae may be rarer than expected.

The star cluster hosting R136a1 has previously been observed by astronomers using the NASA/ESA Hubble Space Telescope and a variety of ground-based telescopes, but none of these telescopes could obtain images sharp enough to pick out all the individual stellar members of the nearby cluster.

Gemini South’s Zorro instrument was able to surpass the resolution of previous observations by using a technique known as speckle imaging, which enables ground-based telescopes to overcome much of the blurring effect of Earth’s atmosphere. By taking many thousands of short-exposure images of a bright object and carefully processing the data, it is possible to cancel out almost all of this blurring. This approach, as well as the use of adaptive optics, can dramatically increase the resolution of ground-based telescopes, as shown by the team’s sharp new Zorro observations of R136a1.
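The core idea can be conveyed with a minimal shift-and-add simulation (a simplification; production speckle pipelines typically work in the Fourier domain): each short exposure freezes the atmospheric tilt as a random image shift, so recentering every frame before averaging recovers a sharp core that a long exposure would smear out.

```python
import numpy as np

# Shift-and-add speckle sketch: simulate short exposures of a point
# source whose position jitters frame to frame, then compare a plain
# average (a "long exposure") with an average taken after recentering
# each frame on its brightest speckle.

rng = np.random.default_rng(0)
size, n_frames = 33, 500
yy, xx = np.mgrid[:size, :size]

def star(cy, cx, sigma=1.5):
    """Gaussian point-spread blob centered at (cy, cx)."""
    return np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))

frames = []
for _ in range(n_frames):
    dy, dx = rng.integers(-6, 7, size=2)          # random tilt per frame
    frames.append(star(16 + dy, 16 + dx) + rng.normal(0, 0.05, (size, size)))

naive = np.mean(frames, axis=0)                   # blurred long exposure
recentered = []
for f in frames:
    cy, cx = np.unravel_index(np.argmax(f), f.shape)
    recentered.append(np.roll(np.roll(f, 16 - cy, axis=0), 16 - cx, axis=1))
sharp = np.mean(recentered, axis=0)               # shift-and-add result

print(f"peak, naive average: {naive.max():.2f}")
print(f"peak, shift-and-add: {sharp.max():.2f}")  # much closer to 1.0
```

Averaging thousands of recentered frames also beats the per-frame noise down, which is why the technique needs a bright target and many short exposures.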

“This result shows that given the right conditions an 8.1-meter telescope pushed to its limits can rival not only the Hubble Space Telescope when it comes to angular resolution, but also the James Webb Space Telescope,” commented Ricardo Salinas, a co-author of this paper and the instrument scientist for Zorro. “This observation pushes the boundary of what is considered possible using speckle imaging.”

“We began this work as an exploratory observation to see how well Zorro could observe this type of object,” concluded Kalari. “While we urge caution when interpreting our results, our observations indicate that the most massive stars may not be as massive as once thought.”

Zorro and its twin instrument ‘Alopeke are identical imagers mounted on the Gemini South and Gemini North telescopes, respectively. Their names are the Spanish and Hawaiian words for “fox” and reflect the telescopes’ respective locations on Cerro Pachón in Chile and on Maunakea in Hawai‘i. These instruments are part of the Gemini Observatory’s Visiting Instrument Program, which enables new science by accommodating innovative instruments. Steve B. Howell, current chair of the Gemini Observatory Board and senior research scientist at the NASA Ames Research Center in Mountain View, California, is the principal investigator on both instruments.

“Gemini South continues to enhance our understanding of the Universe, transforming astronomy as we know it. This discovery is yet another example of the scientific feats we can accomplish when we combine international collaboration, world-class infrastructure, and a stellar team,” said NSF Gemini Program Officer Martin Still.

Medieval friars were ‘riddled with parasites’, new findings reveal

A new analysis of remains from medieval Cambridge shows that local Augustinian friars were almost twice as likely as the city’s general population to be infected by intestinal parasites.

This is despite most Augustinian monasteries of the period having latrine blocks and hand-washing facilities, unlike the houses of ordinary working people.

Researchers from the University of Cambridge’s Department of Archaeology say the difference in parasitic infection may be down to monks manuring crops in friary gardens with their own faeces, or purchasing fertiliser containing human or pig excrement.

The study, published today in the International Journal of Paleopathology, is the first to compare parasite prevalence in people from the same medieval community who were living different lifestyles, and so might have differed in their infection risk.

The population of medieval Cambridge consisted of residents of monasteries, friaries and nunneries of various major Christian orders, along with merchants, traders, craftsmen, labourers, farmers, and staff and students at the early university.

Cambridge archaeologists investigated samples of soil taken from around the pelvises of adult remains from the former cemetery of All Saints by the Castle parish church, as well as from the grounds where the city’s Augustinian Friary once stood.

Most of the parish church burials date from the 12th–14th centuries, and those interred were primarily of lower socio-economic status, mainly agricultural workers.

The Augustinian friary in Cambridge was an international study house, known as a studium generale, where clergy from across Britain and Europe would come to read manuscripts. It was founded in the 1280s and lasted until 1538 before suffering the fate of most English monasteries: closed or destroyed as part of Henry VIII’s break with the Roman Church.

The researchers tested 19 friars from the friary grounds and 25 locals from All Saints cemetery, and found that 11 of the friars (58%) were infected by worms, compared with just eight of the general townspeople (32%).

They say these rates are likely minimums, and that the actual number of infections would have been higher, because some traces of worm eggs in the pelvic sediment would have been destroyed over time by fungi and insects.

The 32% prevalence of parasites among townspeople is in line with studies of medieval burials in other European countries, suggesting that the townspeople’s rate was not unusually low – rather, the infection rate among the friars was remarkably high.
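The reported counts translate directly into these prevalences and into the “almost twice as likely” figure quoted earlier; a quick sanity check of the arithmetic:

```python
# Counts reported in the study: infected / total sampled per group.
friars_infected, friars_total = 11, 19
town_infected, town_total = 8, 25

friar_prev = friars_infected / friars_total   # ~0.58, i.e. 58%
town_prev = town_infected / town_total        # 0.32, i.e. 32%

# Risk ratio: how many times more likely a friar was to be infected
# than one of the general townspeople.
risk_ratio = friar_prev / town_prev
print(f"friars {friar_prev:.0%}, townspeople {town_prev:.0%}, "
      f"risk ratio {risk_ratio:.2f}")
```

A risk ratio of about 1.8 is the basis for the “almost twice as likely” phrasing.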

“The friars of medieval Cambridge appear to have been riddled with parasites,” said study lead author Dr Piers Mitchell from Cambridge’s Department of Archaeology. “This is the first time anyone has attempted to work out how common parasites were in people following different lifestyles in the same medieval town.”

Cambridge researcher Tianyi Wang, who did the microscopy to spot the parasite eggs, said: “Roundworm was the most common infection, but we found evidence for whipworm infection as well. These are both spread by poor sanitation.”

Standard sanitation in medieval towns relied on the cesspit toilet: holes in the ground used for faeces and household waste. In monasteries, however, running water systems were a common feature – including to rinse out the latrine – although that has yet to be confirmed at the Cambridge site, which is only partly excavated.

Not all people buried in Augustinian friaries were actually clergy, as wealthy people from the town could pay to be interred there. However, the team could tell which graves belonged to friars from the remains of their clothing.

“The friars were buried wearing the belts they wore as standard clothing of the order, and we could see the metal buckles at excavation,” said co-author Craig Cessford of the Cambridge Archaeological Unit.

As roundworm and whipworm are spread by poor sanitation, researchers argue that the difference in infection rates between the friars and the general population must have been due to how each group dealt with their human waste.

“One possibility is that the friars manured their vegetable gardens with human faeces, not unusual in the medieval period, and this may have led to repeated infection with the worms,” said Mitchell.

Medieval records reveal how Cambridge residents may have understood parasites such as roundworm and whipworm. John Stockton, a medical practitioner in Cambridge who died in 1361, left a manuscript to Peterhouse college that included a section on De Lumbricis (‘on worms’).

It notes that intestinal worms are generated by excess of various kinds of phlegm: “Long round worms form from an excess of salt phlegm, short round worms from sour phlegm, while short and broad worms came from natural or sweet phlegm.”

The text prescribes “bitter medicinal plants” such as aloe and wormwood, but recommends they are disguised with “honey or other sweet things” to help the medicine go down.

Another text – Tabula medicine – found favour with leading Cambridge doctors of the 15th century, and suggests remedies as recommended by individual Franciscan monks, such as Symon Welles, who advocated mixing a powder made from moles into a curative drink.

Overall, those buried in medieval England’s monasteries had lived longer than those in parish cemeteries, according to previous research, perhaps due to a more nourishing diet, a luxury of wealth.

Tonga volcano eruption created wave nine times taller than Japan’s 2011 tsunami

New research reveals more about the magnitude of the January eruption, as researchers call for better preparedness.

  • The eruption of the Hunga Tonga-Hunga Ha’apai volcano in January created an initial wave 90 metres high – almost the height of the Statue of Liberty (93m)
  • University of Bath tsunami expert calls for better warning systems to detect volcanic eruptions, saying systems are 30 years behind comparable earthquake detection tools

The initial tsunami wave created by the eruption of the underwater Hunga Tonga-Hunga Ha’apai volcano in Tonga in January 2022 reached 90 metres in height, around nine times taller than that of the highly destructive 2011 Japan tsunami, new research has found.

An international research team says the eruption should serve as a wake-up call for international groups looking to protect people from similar events in future, claiming that detection and monitoring systems for volcano-based tsunamis are ’30 years behind’ comparable tools used to detect earthquake-based events.


Dr Mohammad Heidarzadeh, Secretary-General of the International Tsunami Commission and a senior lecturer in the University of Bath’s Department of Architecture & Civil Engineering, authored the research alongside colleagues based in Japan, New Zealand, the UK and Croatia.

By comparison, the largest tsunami waves due to earthquakes before the Tonga event – recorded following the Tōhoku earthquake near Japan in 2011 and the Chilean earthquake of 1960 – reached around 10 metres in initial height. Those tsunamis were nonetheless more destructive because they originated closer to land, with waves that were much wider.

Dr Heidarzadeh says the Tonga tsunami should serve as a wake-up call for more preparedness and a better understanding of the causes and signs of tsunamis caused by volcanic eruptions. He says: “The Tongan tsunami tragically killed five people and caused large-scale destruction, but its effects could have been even greater had the volcano been located closer to human communities. The volcano is located approximately 70 km from the Tongan capital Nuku’alofa – this distance significantly minimized its destructive power.

“This was a gigantic, unique event and one that highlights that internationally we must invest in improving systems to detect volcanic tsunamis as these are currently around 30 years behind the systems we used to monitor for earthquakes. We are under-prepared for volcanic tsunamis.”

The research was carried out by analysing ocean observation data recordings of atmospheric pressure changes and sea level oscillations, in combination with computer simulations validated with real-world data.

The research team found that the tsunami was unique as the waves were created not only by the water displaced by the volcano’s eruption, but also by huge atmospheric pressure waves, which circled around the globe multiple times. This ‘dual mechanism’ created a two-part tsunami – where initial ocean waves created by the atmospheric pressure waves were followed more than one hour later by a second surge created by the eruption’s water displacement.

The eruption created an initial wave 90 metres high/University of Bath

This combination meant tsunami warning centres did not detect the initial wave as they are programmed to detect tsunamis based on water displacements rather than atmospheric pressure waves.

The research team also found that the January event was among the very few tsunamis powerful enough to travel around the globe – it was recorded in all of the world’s oceans and large seas, from Japan and the United States’ western seaboard in the North Pacific Ocean to the coasts of the Mediterranean Sea.

The paper, co-authored by colleagues from New Zealand’s GNS Science, the Association for the Development of Earthquake Prediction in Japan, the University of Split in Croatia and at London’s Brunel University, was published this week in Ocean Engineering.

Dr Aditya Gusman, Tsunami Modeller at the New Zealand-based geoscience service, says: “The 2018 Anak Krakatau volcano and 2022 Hunga Tonga-Hunga Ha’apai volcano eruptions clearly showed us that coastal areas surrounding volcano islands are at risk of being hit by destructive tsunamis. Although it may be preferable to have low-lying coastal areas completely clear from residential buildings, such a policy may not be practical for some places as volcanic tsunamis can be considered infrequent events.”

Co-author Dr Jadranka Šepić, from the University of Split, Croatia, adds: “What is important is to have efficient warning systems, which include both real-time warnings and education on what to do in a case of a tsunami or warning – such systems save lives. In addition, at volcanic areas, monitoring of volcanic activity should be organized, and more high-quality research into volcanic eruptions and areas at hazard is always a good idea.”

Separate research led by the University of Bath atmospheric physicist Dr Corwin Wright published in June found that the Tonga eruption triggered atmospheric gravity waves that reached the edge of space.

NASA gears up to livestream mega event of Artemis I launch

With the SLS rocket scheduled for launch during a two-hour window that opens at 8:33 a.m. EDT Monday, Aug. 29, from Launch Pad 39B at Kennedy Space Center, NASA is planning to provide wide coverage of prelaunch, launch, and postlaunch activities for Artemis I.

Artemis I will be the first integrated test of NASA’s Orion spacecraft, Space Launch System (SLS) rocket, and the ground systems at the launch center in Florida, heralding future crewed flight tests and human lunar exploration.

The rocket and spacecraft reached the launch pad last week after a nearly 10-hour, four-mile trek from the Vehicle Assembly Building, and a livestream of the rocket and spacecraft at the pad has been made available on the NASA Kennedy YouTube channel.

Live coverage of events will air on NASA Television, the NASA app, and the agency’s website, with prelaunch events starting Monday, Aug. 22. The launch countdown will begin Saturday, Aug. 27, at 10:23 a.m.

 

Artemis I set for launch / NASA

The live broadcast of the launch will include celebrity appearances by Jack Black, Chris Evans, and Keke Palmer, as well as a special performance of “The Star-Spangled Banner” by Josh Groban and Herbie Hancock. It will also feature a performance of “America the Beautiful” by The Philadelphia Orchestra and cellist Yo-Yo Ma, conducted by Yannick Nézet-Séguin.

The first in a series of complex missions, Artemis I is an uncrewed flight test that will provide a foundation for extending human presence to the Moon and beyond. The mission will test the performance of the SLS rocket and Orion’s capabilities over a period of about six weeks, on a journey that takes the spacecraft about 40,000 miles beyond the Moon and back to Earth.

Random acts of kindness make recipients feel elated

Even though they often enhance happiness, acts of kindness such as giving a friend a ride or bringing food for a sick family member can be somewhat rare because people underestimate how good these actions make recipients feel, according to new research from The University of Texas at Austin.

The study by UT Austin McCombs School of Business Assistant Professor of Marketing Amit Kumar, along with Nicholas Epley of the University of Chicago, found that although givers tend to focus on the object they’re providing or action they’re performing, receivers instead concentrate on the feelings of warmth the act of kindness has conjured up. This means that givers’ “miscalibrated expectations” can function as a barrier to performing more prosocial behaviors such as helping, sharing or donating.

The research is online in advance in the Journal of Experimental Psychology: General.

To quantify these attitudes and behaviors, the researchers conducted a series of experiments.

In one, the researchers recruited 84 participants in Chicago’s Maggie Daley Park. Participants could choose whether to give a cup of hot chocolate from the park’s food kiosk to a stranger or keep it for themselves. Seventy-five agreed to give it away.

Researchers delivered the hot chocolate to the stranger and told them the study participant had chosen to give them their drink. Recipients reported their mood, and performers indicated how they thought recipients felt after getting the drink.

Performers underestimated the significance of their act. They expected recipients’ mood ratings to average 2.7 on a scale of -5 (much more negative than normal) to 5 (much more positive than normal), while recipients actually reported an average of 3.5.

“People aren’t way off base,” Kumar said. “They get that being kind to people makes them feel good. What we don’t get is how good it really makes others feel.”

The researchers also performed a similar experiment in the same park with cupcakes. They recruited 200 participants and divided them into two groups. In the control group, 50 participants received a cupcake for participating. They rated their mood, and the other 50 people rated how they thought the receivers felt after getting a cupcake.

For the second group of 100, 50 people were told they could give away their cupcake to strangers. They rated their own mood and the expected mood of the cupcake recipients. The researchers found that participants rated cupcake recipients’ happiness at about the same level whether they got their cupcake through an act of random kindness or from the researchers. What’s more, recipients who received a cupcake through an act of kindness were happier than control group recipients.

“Performers are not fully taking into account that their warm acts provide value from the act itself,” Kumar said. “The fact that you’re being nice to others adds a lot of value beyond whatever the thing is.”

In a lab experiment, Kumar and Epley added a component to assess the consequences of kindness. Participants first either received a gift from the lab store or were gifted one by another participant, then played a game. All participants who received an item were told to divide $100 between themselves and an unknown study recipient.

The researchers found that recipients who received their lab gift through another participant’s random act of kindness were more generous to strangers during the game. They divvied up the $100 more equally, giving away $48.02 on average versus $41.20.

“It turns out generosity can actually be contagious,” Kumar said. “Receivers of a prosocial act can pay it forward. Kindness can actually spread.”

Wheat prices spike due to climate change: Study

Rising temperatures are harmful to wheat yields. However, crop yields do not provide a holistic vision of food security. The impacts of climate change on wheat price, livelihood and agricultural market fundamentals are also important to food security but have been largely overlooked.

An international research team has now estimated the comprehensive impact of climate change and extreme climate events on the global wheat supply and demand chain in a 2 ℃ warmer world by using a novel climate-wheat-economic ensemble modelling approach.

The effect of CO2 fertilization could cancel out temperature stress on crops, resulting in slightly greater wheat yields under 2 ℃ warming. However, increases in global yield do not necessarily result in lower consumer prices. Indeed, the modelling results suggest that global wheat price spikes would become higher and more frequent, placing additional economic pressure on daily livelihoods.

The findings, by scientists from six countries, were published in One Earth on August 19.

“This counterintuitive result is initially driven by uneven impacts geographically. Wheat yields are projected to increase in high-latitude wheat exporting countries but show decreases in low-latitude wheat importing countries,” said lead author ZHANG Tianyi, an agrometeorologist at the Institute of Atmospheric Physics, Chinese Academy of Sciences.

Co-author Karin van der Wiel, a climate scientist at the Royal Netherlands Meteorological Institute, further explained: “This leads to higher demand for international trade and higher consumer prices in the importing countries, which would deepen the traditional trade patterns between wheat importing and exporting countries.”

Earlier researchers pointed out that trade liberalization would help mitigate climate stress via improving market mobility. The current research team revealed that such policies could indeed reduce consumers’ economic burden from wheat products. However, the impact on farmers’ income would be mixed. For example, trade liberalization policy under 2 ℃ warming could stabilize or even improve farmers’ income in wheat exporting countries but would reduce income for farmers in wheat importing countries.

“These results would potentially cause a larger income gap, creating a new economic inequality between wheat importing and exporting countries,” said WEI Taoyuan, co-author and an economic scientist at the CICERO Center for International Climate Research. ZHANG further explained more dependence on imports could lower the wheat self-sufficiency ratio, thus causing a “vicious negative cycle” for wheat importing and less-developed countries in the long term.

“This study highlights that effective measures in trade liberalization policies are necessary to protect grain food industries in importing countries, support resilience, and enhance global food security under climate change,” said Frank Selten, a researcher at the Royal Netherlands Meteorological Institute and co-author of the study.

COVID mRNA vaccines are safe in patients with heart failure

COVID mRNA vaccines are associated with a decreased risk of death in patients with heart failure, according to research presented at ESC Congress 2022. The study also found that the vaccines were not associated with an increased risk of worsening heart failure, venous thromboembolism or myocarditis in heart failure patients.

“Our results indicate that heart failure patients should be prioritised for COVID-19 vaccinations and boosters,” said study author Dr. Caroline Sindet-Pedersen of Herlev and Gentofte Hospital, Hellerup, Denmark. “COVID-19 vaccines will continue to be important for preventing morbidity and mortality in vulnerable patient populations. Thus, studies emphasising the safety of these vaccines are essential to reassure those who might be hesitant and ensure continued uptake of vaccinations.”

Patients with heart failure are at increased risk of hospitalisation, need for mechanical ventilation, and death due to COVID-19. Vaccination reduces the risk of serious illness from COVID-19. However, “Due to perceptions about possible cardiovascular side effects from mRNA vaccines in heart failure patients, this study examined the risk of cardiovascular complications and death associated with mRNA vaccines in a nationwide cohort of patients with heart failure,” said Dr. Sindet-Pedersen.

The study included 50,893 unvaccinated patients with heart failure in 2019 and 50,893 patients with heart failure in 2021 who were vaccinated with either of the two mRNA vaccines (BNT162b2 or mRNA-1273). The two groups were matched for age, sex, and duration of heart failure. The median age of participants was 74 years and 35% were women. The median duration of heart failure was 4.1 years. Participants were followed for 90 days for all-cause mortality, worsening heart failure, venous thromboembolism, and myocarditis, starting from the date of the second vaccination for the 2021 group and the same date in 2019 for the unvaccinated group.

The researchers compared the risk of adverse outcomes in the two groups, after standardising for age, sex, heart failure duration, use of heart failure medications, ischaemic heart disease, cancer, diabetes, atrial fibrillation, and admission with heart failure less than 90 days before the first date of follow up. Dr. Sindet-Pedersen explained: “Standardisation imitates a randomised trial and is a way to obtain a better causal interpretation of the results from observational studies.”

Among 101,786 heart failure patients, the researchers found that receiving an mRNA vaccine was not associated with an increased risk of worsening heart failure, myocarditis or venous thromboembolism but was associated with a decreased risk of all-cause mortality. The standardised risk of all-cause mortality within 90 days was 2.2% in the 2021 cohort (vaccinated) and 2.6% in the 2019 cohort (not vaccinated), showing a significantly lower risk for all-cause mortality in 2021 versus 2019. The standardised risk of worsening heart failure within 90 days was 1.1% in both cohorts. Similarly, no significant differences were found between groups for venous thromboembolism or myocarditis.
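Standardisation, as described above, reweights each cohort’s stratum-specific risks by one common reference distribution, so the two groups are compared as if they had identical covariate mixes. The sketch below is purely illustrative: the strata, weights, and stratum risks are invented (chosen so the totals land on the article’s headline 2.2% vs 2.6%), since the study’s actual strata are not given here.

```python
# Hypothetical direct standardisation. All numbers below are assumed
# for illustration only; they are NOT from the study.
reference_weights = {"age<70": 0.4, "age>=70": 0.6}   # assumed shares

# Assumed stratum-specific 90-day all-cause mortality risks:
risk_2021 = {"age<70": 0.010, "age>=70": 0.030}       # vaccinated cohort
risk_2019 = {"age<70": 0.012, "age>=70": 0.035}       # unvaccinated cohort

def standardised_risk(stratum_risks, weights):
    """Weighted average of stratum risks under one reference population."""
    return sum(stratum_risks[s] * weights[s] for s in weights)

std_2021 = standardised_risk(risk_2021, reference_weights)
std_2019 = standardised_risk(risk_2019, reference_weights)
print(f"standardised risk 2021: {std_2021:.1%}, 2019: {std_2019:.1%}")
```

Because both cohorts are weighted by the same reference distribution, any remaining difference in the standardised risks cannot be explained by the covariates that were standardised for.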

Dr. Sindet-Pedersen concluded: “The study suggests that there should be no concern about cardiovascular side effects from mRNA vaccines in heart failure patients. In addition, the results point to a beneficial effect of vaccination on mortality.”

 

ACS team unveils a more environmentally friendly air conditioner

Summer is in full swing in the U.S., and people are turning up their air conditioners to beat the heat. But the hydrofluorocarbon refrigerants in these and other cooling devices are potent greenhouse gases and major drivers of climate change. Today, scientists report a prototype device that could someday replace existing “A/Cs.” It’s much more environmentally friendly and uses solid refrigerants to efficiently cool a space.

The researchers will present their results today at the fall meeting of the American Chemical Society (ACS). ACS Fall 2022 is a hybrid meeting being held virtually and in-person Aug. 21–25, with on-demand access available Aug. 26–Sept. 9. The meeting features nearly 11,000 presentations on a wide range of science topics.

“Just installing an air conditioner or throwing one away is a huge driver of global warming,” says Adam Slavney, Ph.D., who is presenting this work at the meeting. The refrigerants used in these systems are thousands of times more potent than carbon dioxide and can accidentally leak out of systems when they are being handled or disposed of.

Traditional cooling systems, such as air conditioners, work by causing a refrigerant to cycle between being a gas or a liquid. When the liquid becomes a gas, it expands and absorbs heat, cooling a room or the interior of a refrigerator. A compressor that works at about 70–150 pounds per square inch (psi) turns the gas back into a liquid, releasing heat. In the case of air conditioners, this heat is directed outside the home. Though this cycle is efficient, concerns about climate change and stricter regulations on hydrofluorocarbon refrigerants are spurring the search for more environmentally responsible ones.
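For context on the efficiency any such heat cycle is chasing, thermodynamics sets a hard ceiling: the Carnot coefficient of performance (COP), the maximum heat moved per unit of work. A quick calculation with illustrative (assumed) indoor and outdoor temperatures:

```python
# Idealised upper bound on cooling efficiency for any refrigeration
# cycle, gas-liquid or solid-state. Real systems achieve only a
# fraction of this because of compressor and heat-exchanger losses.
def carnot_cop(t_cold_k: float, t_hot_k: float) -> float:
    """Carnot coefficient of performance: heat removed per unit work."""
    return t_cold_k / (t_hot_k - t_cold_k)

indoor = 295.0    # 22 degC room being cooled (assumed)
outdoor = 308.0   # 35 degC summer air receiving the heat (assumed)
print(f"Carnot COP: {carnot_cop(indoor, outdoor):.1f}")
```

The bound shrinks as the indoor-outdoor temperature gap widens, which is why air conditioners work hardest, and least efficiently, on the hottest days.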

Solid refrigerants could be an ideal solution. Unlike gases, solids won’t leak into the environment from A/C units. One class of solid refrigerants, called barocaloric materials, work similarly to traditional gas-liquid cooling systems. They use pressure changes to go through heat cycles, but in this case, the pressure drives a solid-to-solid phase change. That means the material remains a solid, but the internal molecular structure changes. The key structural aspect of these barocaloric solid materials is that they contain long, flexible molecular chains that are typically floppy and disordered. But under pressure, the chains become more ordered and rigid — a change that releases heat. The process of going from an ordered to a relaxed structure is like melting wax, but without it becoming a liquid, says Jarad Mason, Ph.D., the project’s principal investigator, who is at Harvard University. When the pressure is released, the material reabsorbs heat, completing the cycle.

A disadvantage of barocaloric systems, however, is that most of these materials require massive pressures to drive heat cycles. To produce these pressures, the systems need expensive, specialized equipment that’s not practical for real-world cooling applications. Mason and his team recently reported barocaloric materials that can act as refrigerants at much lower pressures. They’ve now shown that the refrigerants, which are called metal-halide perovskites, can work in a cooling system they’ve built from scratch. “The materials we reported are able to cycle at about 3,000 psi, which are pressures that a typical hydraulics system can work at,” says Slavney.

The team has now built a first-of-its-kind prototype that demonstrates the use of these new materials in a practical cooling system. The device has three main parts. One is a metal tube packed with the solid refrigerant and an inert liquid — water or an oil. Another piece of the device is a hydraulic piston that applies pressure to the liquid. Finally, the liquid helps transfer that pressure to the refrigerant and helps carry heat through the system.

After solving several engineering challenges, the team has shown that the barocaloric materials work as functional refrigerants, turning pressure changes into full temperature-changing cycles. “Our system still doesn’t use pressures as low as those of commercial refrigeration systems, but we’re getting closer,” says Mason. To the team’s knowledge, this is the first working cooling system using solid-state refrigerants that rely on pressure changes.

With the device now in hand, the team plans to test a variety of barocaloric materials. “We’re really hoping to use this machine as a testbed to help us find even better materials,” says Slavney, including ones that work at lower pressures and that conduct heat better. With an optimal material, the researchers believe solid-state refrigerants could become a viable replacement for current air conditioning and other cooling technologies.

The researchers acknowledge support and funding from the Harvard University Materials Science Research and Engineering Center, the Harvard Climate Change Solutions Fund, and the Arnold and Mabel Beckman Foundation.

Super-fast electric car charging is here with a Midas touch

Despite the growing popularity of electric vehicles, many consumers still hesitate as it may take longer to power up an electric car than it does to gas up a conventional one.

Another concern is that frequent charging or speeding up the charging process can damage the battery and reduce its lifespan. Now, scientists have developed superfast charging methods tailored to power different types of electric vehicle batteries in 10 minutes or less without harm.

The researchers will present their results Monday at the fall meeting of the American Chemical Society (ACS) Fall 2022, a hybrid meeting being held virtually and in-person on Aug. 21-25, with nearly 11,000 presentations on a wide range of science topics.

“Fast charging is the key to increasing consumer confidence and overall adoption of electric vehicles,” says Eric Dufek, who is presenting this work at the meeting. “It would allow vehicle charging to be very similar to filling up at a gas station.” Such an advance could help the US reach President Biden’s goal that by 2030, half of all vehicles sold should be electric or hybrid.

When a lithium-ion battery is being charged, lithium ions migrate from one side of the device, the cathode, to the other, the anode. Making the lithium ions migrate faster charges the battery more quickly, but sometimes the ions don’t fully move into the anode. In this situation, lithium metal can build up, which can trigger early battery failure and reduce the lifetime of the battery.

To address these challenges, Dufek and his research team at Idaho National Laboratory used machine learning to create unique charging protocols. By inputting information about the condition of many lithium-ion batteries during their charging and discharging cycles, the scientists trained the machine learning analysis to predict lifetimes. The team then used the analysis to identify and optimize new charging protocols.
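The overall approach — learn a model mapping charging-protocol features to battery lifetime, then rank candidate protocols by predicted lifetime — can be sketched in miniature. Everything below is hypothetical: the features, the invented ground-truth relationship, and the plain least-squares fit are stand-ins for the team’s actual data and machine-learning analysis, whose details aren’t given here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented training data: each "battery" has a fast-charge rate and a
# rest fraction, and a lifetime (in cycles) that degrades with faster
# charging and improves with rest. This relationship is assumed.
n = 200
charge_rate = rng.uniform(1, 6, n)        # C-rate during fast charge
rest_fraction = rng.uniform(0, 0.3, n)    # fraction of time resting
lifetime = (1500 - 180 * charge_rate + 900 * rest_fraction
            + rng.normal(0, 40, n))       # noisy observations

# Fit a linear model by least squares (a toy stand-in for the paper's
# machine-learning analysis).
X = np.column_stack([np.ones(n), charge_rate, rest_fraction])
coef, *_ = np.linalg.lstsq(X, lifetime, rcond=None)

# Score two candidate fast-charging protocols and pick the one with
# the longer predicted lifetime.
candidates = np.array([[1.0, 5.5, 0.05],    # aggressive, little rest
                       [1.0, 4.5, 0.20]])   # gentler, more rest
pred = candidates @ coef
best = candidates[np.argmax(pred)]
print("predicted lifetimes:", pred.round())
```

The real work, of course, is in tying such predictions to the physics of lithium plating and cathode cracking rather than to a toy linear fit, which is the advantage Dufek highlights below.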

“We’ve significantly increased the amount of energy that can go into a battery cell in a short amount of time,” says Dufek. “Currently, we’re seeing batteries charge to over 90% in 10 minutes without lithium plating or cathode cracking.”

Going from a nearly dead battery to one at 90% power in only 10 minutes is a far cry from current methods, which, at best, can get an electric vehicle to full charge in about half an hour. While many researchers are looking for methods to achieve this sort of super-fast charging, Dufek says that one advantage of their machine learning model is that it ties the protocols to the physics of what is actually happening in a battery.

The researchers plan to use their model to develop and design new lithium-ion batteries that are optimized to undergo fast charging.

Early COVID-19 pandemic induced cancer survivors to reduce smoking: Study

A recent study shows that during the first year of the COVID-19 pandemic, the proportion of working-aged American adults without health insurance did not change despite increases in unemployment, and the prevalence of unhealthy behaviors decreased.

The study, published by Wiley online in CANCER, compared individuals with and without a history of cancer. Because cancer survivors often have high health care needs, they are more vulnerable to economic and health care disruptions such as those that occurred during the first year of the COVID-19 pandemic.


Xuesong Han of the American Cancer Society and her colleagues used data from the nationwide, population-based Behavioral Risk Factor Surveillance System, an annual household telephone survey, to examine changes in multiple health-related measures in 2020 among cancer survivors.

Among adults aged 18–64 years, the uninsured rate did not change significantly in 2020 despite huge job losses. The prevalence of unhealthy behaviors, including sleeplessness and smoking, decreased in 2020, and health improved regardless of cancer history, the analysis showed.

Declines in smoking were greater among cancer survivors than among adults without a cancer history, the study noted. “Our findings suggest that the pandemic may have motivated people to adopt certain healthier behaviors, and national and regional policy responses to the pandemic regarding insurance coverage, unemployment benefits, and financial assistance may have contributed to the observed positive changes,” said Han.

Sony opens PlayStation 5 for pre-order in India at 12 noon, here’s how to book

Sony’s PlayStation 5 and PS5 Digital Edition gaming consoles were made available for pre-order in India starting 12PM today at Sony Centre and other online retailers, including Amazon India, Flipkart, Vijay Sales, Croma, Reliance Digital, Games the Shop, and Prepaid Gamer Card.

Available at Sony Centre, the PlayStation 5 console will be sold bundled with the game Horizon Forbidden West at a price of Rs 53,990, while the PS5 Digital Edition console with the game will be available at Rs 43,990.

Sony Centre said pre-bookings will begin at 12PM today, and deliveries of the PlayStation 5 and PlayStation 5 Digital Edition consoles in India will start from September 5, 2022. An EMI option is on offer for both consoles: Rs 8,998 for six months or Rs 17,996 for three months.

Sony PS5 console

Sony’s next-generation gaming console, the PS5, is available in two editions, the Standard and the Digital Edition; only the Standard supports physical Blu-ray discs.

The Standard edition comes with lightning-fast loading via an ultra-high-speed SSD, deeper immersion with support for haptic feedback, adaptive triggers and 3D Audio, and an all-new generation of incredible PlayStation games.

PS5 Digital Edition

Besides lightning-fast loading with an ultra-high-speed SSD, the digital edition provides support for haptic feedback, adaptive triggers and 3D Audio, and an all-new generation of incredible PlayStation games.


The PS5 Digital Edition is an all-digital version of the PS5 console with no disc drive. You have to sign in to your PlayStation Network account and go to the PlayStation Store to buy and download games.

The PS5 Digital Edition comes with 16GB of GDDR6 RAM and 825GB of internal storage, with expansion support via compatible SSDs. It is also cheaper than the standard model and is powered by an octa-core CPU based on the Zen 2 architecture.