UN warns copper shortage risks slowing global energy and technology shift

In its latest Global Trade Update, released this week, UNCTAD describes copper as “the new strategic raw material” at the heart of the rapidly electrifying and digitising global economy.  

But with demand set to rise more than 40 per cent by 2040, copper supply is under severe strain – posing a critical bottleneck for technologies ranging from electric vehicles and solar panels to AI infrastructure and smart grids.

More than just metal

“Copper is no longer just a commodity,” said Luz María de la Mora, Director of the International Trade and Commodities Division at UNCTAD.

Valued for its high conductivity and durability, copper is essential to power systems and clean energy technologies. It runs through homes, cars, data centres and renewable infrastructure.

Yet developing new mines is a slow and expensive process, and fraught with environmental risks – often taking up to 25 years from discovery to operation.

Meeting projected demand by 2030 could require $250 billion in investment and at least 80 new mining projects, according to UNCTAD estimates.

The Democratic Republic of the Congo holds some of the world’s largest copper reserves, yet most of the metal is exported, limiting the country’s ability to benefit fully from this valuable resource.

Uneven geography, unequal gains

Over half of the world’s known copper reserves are concentrated in just five countries – Chile, Australia, Peru, the Democratic Republic of the Congo and Russia.

However, much of the value-added production occurs elsewhere, particularly in China, which now imports 60 per cent of global copper ore and produces over 45 per cent of the world’s refined copper, says the UN.

This imbalance leaves many developing countries stuck at the bottom of the value chain, unable to fully benefit from their resources.

“Digging and shipping copper is not enough,” the report states.

“To move up the ladder, copper-rich developing countries must invest in refining, processing and manufacturing – this means strengthening infrastructure and skills, establishing industrial parks, offering tax incentives and pursuing trade policies that support higher-value production.”

Tariff and trade barriers

UNCTAD also highlights the challenge of tariff escalation, where duties on refined copper are relatively low – typically below two per cent – but can rise to as high as eight per cent for finished products like wires, tubes and pipes.
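As a rough illustration of how tariff escalation shifts the incentives, the sketch below applies the duty ranges cited above to a hypothetical shipment; the shipment value and the exact rates used are assumptions for illustration only, not figures from the report.

```python
# Illustrative only: the duty rates below reflect the ranges cited in the
# UNCTAD report (refined copper typically below 2%, finished products such as
# wires and tubes up to ~8%); the shipment value is a hypothetical figure.

def duty_paid(shipment_value_usd: float, tariff_rate: float) -> float:
    """Return the ad valorem import duty owed on a shipment."""
    return shipment_value_usd * tariff_rate

shipment_value = 1_000_000  # hypothetical $1 million shipment

refined_duty = duty_paid(shipment_value, 0.02)   # refined copper: ~2%
finished_duty = duty_paid(shipment_value, 0.08)  # wires, tubes, pipes: up to ~8%

print(f"Duty on refined copper:    ${refined_duty:,.0f}")    # $20,000
print(f"Duty on finished products: ${finished_duty:,.0f}")   # $80,000
# The steeper duty on finished products makes exporting processed goods less
# attractive than shipping the refined metal - the tariff escalation effect.
```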

These trade barriers discourage investment in higher-value industries and lock countries into roles as raw material suppliers, the report warns.

To address this, UNCTAD is urging governments to streamline permitting, reduce trade restrictions, and develop regional value chains to help developing economies climb the industrial ladder.

Scrappy solution

With new mining projects facing long lead times, recycling is emerging as a vital part of the solution.

In 2023, secondary sources accounted for 4.5 million tonnes – nearly 20 per cent of global refined copper output. The United States, Germany and Japan are the top exporters of copper scrap, while China, Canada and the Republic of Korea are major importers.
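Taken together, those 2023 figures imply a total refined output of roughly 22–23 million tonnes; a quick back-of-the-envelope check (only the two cited figures are inputs, the rest is arithmetic):

```python
# Back-of-the-envelope check based on the 2023 figures cited above;
# everything beyond those two inputs is simple arithmetic.
secondary_output_mt = 4.5   # refined copper from scrap, million tonnes (2023)
secondary_share = 0.20      # "nearly 20 per cent" of global refined output

implied_total_refined_mt = secondary_output_mt / secondary_share
implied_primary_mt = implied_total_refined_mt - secondary_output_mt

print(f"Implied total refined output:   ~{implied_total_refined_mt:.1f} Mt")  # ~22.5 Mt
print(f"Implied primary-sourced output: ~{implied_primary_mt:.1f} Mt")        # ~18.0 Mt
```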

“For developing countries, copper scrap could be a strategic asset,” UNCTAD notes.

“Investing in recycling and processing capacity can reduce import dependence, support value-added trade and advance a more circular, sustainable economy.”

Test case for critical materials

Copper, UNCTAD says, is a likely “test case” for how global trade systems handle rising demand for critical materials amid growing pressures.

“The age of copper has arrived…but without coordinated trade and industrial strategies, supply will remain under strain and many developing countries risk missing out,” the report concludes.

Electronic health records: Quick access to patient’s records improves patient care

When a patient is transferred from a hospital to a nearby specialist or rehabilitation facility, it is often difficult for personnel at the new facility to access the patient’s electronic health records – which include important patient-specific information such as medication history and allergies. This lack of electronic compatibility often leads to wasteful and expensive duplication of tests, X-rays and paperwork that can interfere with the treatment of patients.

A recent study at the University of Missouri highlights how the use of electronic health records has resulted in better quality of care – findings that can guide the next steps of government programs to ensure hospitals use electronic health records in a way that promotes interoperability, or the ability of various health care organizations to quickly access a patient’s records, reduces waste and speeds up decision-making to improve patient health outcomes.

So far, the adoption and implementation of electronic health records have been a bumpy road, said Kate Trout, assistant professor in the MU School of Health Professions and lead author on the study. Electronic health records have widely been cited in the research literature as the most cumbersome technology ever implemented in the health care industry.

“They have the potential to be very helpful, but in practice they tend to be very disruptive because it’s time consuming to train personnel how to use them. They’re expensive, and there’s always new complicated updates and new forms that come out, and there is often a lack of interoperability for the data to be shared among different health care organizations,” Trout said. “Given the massive national investments, we wanted to see if electronic health records are being utilized in a meaningful way to promote interoperability and ultimately improve quality of care.”

More than $30 billion has been invested by the federal government in the adoption and use of electronic health records by health care organizations in an attempt to improve the quality of care delivered to patients.

In 2011, the Centers for Medicare and Medicaid Services established the “Meaningful Use” program – now known as the “Promoting Interoperability Program” – which offers financial incentives to health care providers who effectively utilize electronic health records in a way that promotes information sharing, public health reporting and interoperability.

Trout analyzed the impact of electronic health records on mortality rates for patients with various medical procedures and conditions. More than 5 million patients in 300 U.S. hospitals were included in the study, which merged large datasets from the electronic health records, the American Hospital Association and Centers for Medicare and Medicaid.

Three main categories emerged:

  1. hospitals that meet the “Meaningful Use” requirements with their electronic health records,
  2. hospitals that fully implement electronic health records but not in a way that meets the “Meaningful Use” requirements, and
  3. hospitals that have either none or only partially implemented electronic health records.

Trout found that the hospitals that meet the “Meaningful Use” requirements were able to improve quality of care and reduce patient mortality rates to a greater extent than hospitals in the other two groups. While the results show some optimism, Trout cautions that more still needs to be done, including the need to analyze the impact of interoperability and advanced electronic health record functions on quality of care.
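As a rough illustration of the kind of group comparison described above, the sketch below groups hypothetical patient records by their hospital’s EHR category and compares mortality rates; the column names, categories and data are assumptions for illustration, not the study’s actual datasets or code.

```python
import pandas as pd

# Hypothetical merged dataset: one row per patient admission, with the
# hospital's EHR category and whether the patient died in hospital.
# Column names and values are assumptions, not the study's schema.
patients = pd.DataFrame({
    "hospital_id": [1, 1, 2, 2, 3, 3],
    "ehr_category": [
        "meaningful_use", "meaningful_use",
        "full_ehr_no_mu", "full_ehr_no_mu",
        "partial_or_none", "partial_or_none",
    ],
    "died_in_hospital": [0, 0, 0, 1, 1, 1],
})

# Compare mortality rates across the three hospital groups,
# mirroring the comparison described in the study.
mortality_by_group = (
    patients.groupby("ehr_category")["died_in_hospital"]
    .mean()
    .rename("mortality_rate")
)
print(mortality_by_group)
```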

“This research highlights the importance of using electronic health records in a way that promotes interoperability to streamline processes, speed up decision-making, reduce wasted time and ultimately improve patient health outcomes,” Trout said. “Ideally, the United States could implement one standardized electronic health records system for everyone to ensure compatibility, so policy makers can hopefully benefit from this research.”

Trout added that with the use of data mining and analytics, electronic health records can be better used going forward to potentially identify patient characteristics that put them at higher risks for possible infections or other conditions.

“With this information, are there alerts we can put in after a surgery to ensure we follow up at critical points in time?” Trout said. “Are there certain patient populations that we can use the data to catch them earlier and make sure we give them extra care and not just put them through the same routine protocols as everyone else? That is how we move away from only focusing on implementing the technology and progress toward encouraging innovative ideas that ultimately improve patient health outcomes.”

Trout said this research can be particularly useful for rural hospitals, which historically have fewer resources and lag behind their urban counterparts in adopting health technology like telehealth and electronic health records. There have been many closures of rural hospitals – an issue worsened by the COVID-19 pandemic – and rural patients tend to have more co-morbidities and worse health outcomes.

“I am passionate about helping vulnerable, underserved populations, and our personal health is often tied to where we live and various social determinants of health,” Trout said. “Those ideas are not incorporated into our clinical data yet, but they should be going forward. My overall goal is to harness the data in a way that we can hopefully start to spend less and get more.”

Emotional AI and gen Z: The attitude towards new technology and its concerns

Artificial intelligence (AI) governs everything that comes under “smart technology” today. From self-driving cars to voice assistants on our smartphones, AI has a ubiquitous presence in our daily lives. Yet it has long lacked a crucial feature: the ability to engage human emotions.

The scenario is quickly changing, however. Algorithms that can sense human emotions and interact with them are quickly becoming mainstream as they come embedded in existing systems. Known as “emotional AI,” the new technology achieves this feat through a process called “non-conscious data collection” (NCDC), in which the algorithm collects data on the user’s heart and respiration rates, voice tone, micro-facial expressions, gestures and so on to analyze their mood and personalize its response accordingly.
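To make the idea concrete, here is a minimal sketch of the kind of multimodal signals an NCDC pipeline might collect and combine; the signal names, thresholds and scoring rule are hypothetical illustrations, not the algorithm of any real product.

```python
from dataclasses import dataclass

# Hypothetical sketch of the signals an emotional-AI system might collect
# non-consciously; the fields, weights and scoring rule are illustrative
# assumptions only.

@dataclass
class BiometricSample:
    heart_rate_bpm: float        # e.g. from a wearable or camera-based sensor
    respiration_rate: float      # breaths per minute
    voice_arousal: float         # 0..1 score from voice-tone analysis
    facial_valence: float        # -1..1 score from micro-expression analysis

def estimate_mood(sample: BiometricSample) -> str:
    """Toy rule combining signals into a coarse mood label."""
    arousal = (sample.heart_rate_bpm - 60) / 60 + sample.voice_arousal
    if sample.facial_valence >= 0 and arousal > 0.5:
        return "excited"
    if sample.facial_valence < 0 and arousal > 0.5:
        return "stressed"
    return "calm"

print(estimate_mood(BiometricSample(95, 18, 0.7, -0.3)))  # -> "stressed"
```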

However, the unregulated nature of this technology has raised many ethical and privacy concerns. In particular, it is important to understand the attitude towards NCDC of the largest current demographic, Generation Z (Gen Z). Making up 36% of the global workforce, Gen Z is likely to be the most vulnerable to emotional AI. Moreover, AI algorithms are rarely calibrated for socio-cultural differences, making their implementation all the more concerning.

“We found that being male and having high income were both correlated with having positive attitudes towards accepting NCDC. In addition, business majors were more likely to be tolerant towards NCDC,” highlights Prof. Ghotbi. Cultural factors, such as region and religion, were also found to have an impact, with people from Southeast Asia, Muslims, and Christians reporting concern over NCDC.
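For readers curious how such correlations are typically estimated from survey responses, the sketch below fits a logistic regression of NCDC acceptance on demographic indicators using made-up data; the variables, data and any resulting coefficients are purely illustrative and are not the study’s actual model or findings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical survey rows: [is_male, high_income, business_major] -> accepts NCDC.
# The data and any resulting coefficients are purely illustrative.
X = np.array([
    [1, 1, 1], [1, 1, 0], [1, 0, 1], [0, 1, 0],
    [0, 0, 0], [0, 0, 1], [1, 0, 0], [0, 1, 1],
])
y = np.array([1, 1, 1, 1, 0, 0, 0, 1])  # 1 = positive attitude towards NCDC

model = LogisticRegression().fit(X, y)

# Positive coefficients would indicate factors associated with acceptance,
# analogous to the correlations reported in the study.
for name, coef in zip(["male", "high_income", "business_major"], model.coef_[0]):
    print(f"{name}: {coef:+.2f}")
```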

“Our study clearly demonstrates that sociocultural factors deeply impact the acceptance of new technology. This means that theories based on the traditional technology acceptance model by Davis, which does not account for these factors, need to be modified,” explains Prof. Mantello.

The study addressed this issue by proposing a “mind-sponge” model-based approach that accounts for socio-cultural factors in assessing the acceptance of AI technology. It also called for a thorough understanding of the technology’s potential risks to enable effective governance and ethical design. “Public outreach initiatives are needed to sensitize the population about the ethical implications of NCDC. These initiatives need to consider demographic and cultural differences to be successful,” says Dr. Nguyen.

Overall, the study highlights the extent to which emotional AI and NCDC technologies are already present in our lives and the privacy trade-offs they imply for the younger generation. Thus, there is an urgent need to make sure that these technologies serve both individuals and societies well.

Team Indus gears up for second chance at Lunar XPRIZE

India’s first private sector space company, Team Indus, is gearing up for a second shot at the Lunar XPRIZE, putting the final touches to its stalled mini rover designed for a lunar landing.

Because Team Indus failed to raise funds for the ISRO-backed PSLV launch, it could not meet the March 31, 2018 deadline set by Google. Google eventually withdrew the $30 million prize money and called the competition off.

Now that XPRIZE, which has successfully run similar scientific competitions in the past, has decided to relaunch the Lunar XPRIZE, Bangalore-based Team Indus has emerged with renewed vigour.

Team Indus founder and CEO Rahul Narayan was upbeat. “The Google Lunar XPRIZE served as an excellent early catalyst to get new people, partners and money involved. With the renewed interest in beyond Earth-orbit exploration by multiple large government space agencies, a new Lunar XPRIZE will be a perfectly timed platform with the chances of multiple successful launches being much higher than before,” he said.

Under the terms of the Google Lunar XPRIZE competition, competitors had to soft-land a lunar rover that would traverse at least 500 metres and send high-quality images back to ground control on Earth. However, the parameters will now be redefined, XPRIZE said in a statement.

While Team Indus was seen as a sure-shot winner with its progress in designing the lunar rover and displaying it at several space events in India and abroad, the company failed to raise the required funds to pay for ISRO’s PSLV launcher.

Since 2007, Google Lunar XPRIZE teams have raised over $300 million through corporate sponsorships, government contracts and venture capital. As of January 2017, Team Indus from India, Japan’s HAKUTO, Israel-based SpaceIL, and the American firms Moon Express and Synergy Moon had been selected out of 33 teams from 17 countries for the $1 million initial prize. However, none of them managed to raise the next round of funds, which forced Google to withdraw the $30 million grand prize.

Chanda Gonzales-Mowrer, Senior Director of XPRIZE, said: “These space entrepreneurs are developing long-term business models around lunar transportation and we cannot give up on them now.”

All the Lunar XPRIZE startups remain enthusiastic about participating in the competition even without a monetary prize, and the organisers are hopeful of finding a new sponsor to replace Google. Peter H. Diamandis, XPRIZE founder and executive chairman, said, “XPRIZE is now looking for our next visionary Title Sponsor who wants to put their logo on these teams and on the lunar surface.”

Bob Richards, founder and CEO of Moon Express, welcomed XPRIZE’s decision to renew the competition. “While we plan to win this Moon race and are committed to carrying the Lunar XPRIZE logo, the real opportunity is in opening the lunar frontier and the multibillion dollar industry that follows.”

Takeshi Hakamada, founder and CEO of Japan’s space firm iSpace, which has designed HAKUTO, echoed similar views when he said, “We believe a new competition would again elevate our industry to an even higher level, so we eagerly welcome a new Lunar XPRIZE.”

While all eyes are on a new Title Sponsor, whoever pitches in will have the benefit of having their name and branding incorporated into the competition – and, in the event of success, placed on the surface of the Moon, XPRIZE said.

XPRIZE has previously conducted the $20M NRG COSIA Carbon XPRIZE, the $15M Global Learning XPRIZE, the $10M ANA Avatar XPRIZE, the $7M Shell Ocean Discovery XPRIZE, the $7M Barbara Bush Foundation Adult Literacy XPRIZE, the $5M IBM Watson AI XPRIZE, the $1.75M Water Abundance XPRIZE and the $1M Anu and Naveen Jain Women’s Safety XPRIZE.

New type of supercomputer could be based on ‘magic dust’ combination of light and matter

A team of researchers from the UK and Russia have successfully demonstrated that a type of ‘magic dust’ which combines light and matter can be used to solve complex problems and could eventually surpass the capabilities of even the most powerful supercomputers.

The researchers, from Cambridge, Southampton and Cardiff Universities in the UK and the Skolkovo Institute of Science and Technology in Russia, have used quantum particles known as polaritons – which are half light and half matter – to act as a type of ‘beacon’ showing the way to the simplest solution to complex problems. This entirely new design could form the basis of a new type of computer that can solve problems that are currently unsolvable, in diverse fields such as biology, finance or space travel. The results are reported in the journal Nature Materials.

Our technological progress — from modelling protein folding and behaviour of financial markets to devising new materials and sending fully automated missions into deep space — depends on our ability to find the optimal solution of a mathematical formulation of a problem: the absolute minimum number of steps that it takes to solve that problem.

The search for an optimal solution is analogous to looking for the lowest point in a mountainous terrain with many valleys, trenches, and drops. A hiker may go downhill and think that they have reached the lowest point of the entire landscape, but there may be a deeper drop just behind the next mountain. Such a search may seem daunting in natural terrain, but imagine its complexity in high-dimensional space. “This is exactly the problem to tackle when the objective function to minimise represents a real-life problem with many unknowns, parameters, and constraints,” said Professor Natalia Berloff of Cambridge’s Department of Applied Mathematics and Theoretical Physics and the Skolkovo Institute of Science and Technology, and the paper’s first author.
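As a toy illustration of the hiker’s dilemma described above, the sketch below runs plain gradient descent on a one-dimensional landscape with several valleys: depending on where the search starts, it settles into a local minimum rather than the global one. The function, step size and starting points are arbitrary choices made for illustration.

```python
import numpy as np

# A one-dimensional "landscape" with several valleys; the global minimum
# is near x ~ 1.6, but shallower valleys exist elsewhere.
def f(x):
    return 0.1 * (x - 3.0) ** 2 + np.cos(2.0 * x)

def grad_descent(x0, lr=0.01, steps=2000):
    """Plain gradient descent using a numerical derivative."""
    x, h = x0, 1e-5
    for _ in range(steps):
        grad = (f(x + h) - f(x - h)) / (2 * h)
        x -= lr * grad
    return x

for start in (-4.0, 0.0, 4.0):
    x_min = grad_descent(start)
    print(f"start {start:+.1f} -> ends at x = {x_min:.2f}, f = {f(x_min):.3f}")
# Different starting points land in different valleys: the hiker's dilemma.
```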

Modern supercomputers can only deal with a small subset of such problems when the dimension of the function to be minimised is small or when the underlying structure of the problem allows it to find the optimal solution quickly even for a function of large dimensionality. Even a hypothetical quantum computer, if realised, offers at best the quadratic speed-up for the “brute-force” search for the global minimum.

Berloff and her colleagues approached the problem from an unexpected angle: What if instead of moving along the mountainous terrain in search of the lowest point, one fills the landscape with a magical dust that only shines at the deepest level, becoming an easily detectible marker of the solution?

“A few years ago our purely theoretical proposal on how to do this was rejected by three scientific journals,” said Berloff. “One referee said, ‘Who would be crazy enough to try to implement this?!’ So we had to do it ourselves, and now we’ve proved our proposal with experimental data.”

Their ‘magic dust’ polaritons are created by shining a laser at stacked layers of selected atoms such as gallium, arsenic, indium, and aluminium. The electrons in these layers absorb and emit light of a specific colour. Polaritons are ten thousand times lighter than electrons and may achieve sufficient densities to form a new state of matter known as a Bose-Einstein condensate, where the quantum phases of polaritons synchronise and create a single macroscopic quantum object that can be detected through photoluminescence measurements.

The next question the researchers had to address was how to create a potential landscape that corresponds to the function to be minimised and to force polaritons to condense at its lowest point. To do this, the group focused on a particular type of optimisation problem – one general enough that any other hard problem can be related to it – namely minimisation of the XY model, which is one of the most fundamental models of statistical mechanics. The authors have shown that they can create polaritons at the vertices of an arbitrary graph: as polaritons condense, their quantum phases arrange themselves in a configuration that corresponds to the absolute minimum of the objective function.
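For reference, the XY objective being minimised can be written as H(θ) = −Σ_{i<j} J_ij · cos(θ_i − θ_j) over the phases θ_i of the spins. Below is a minimal brute-force sketch of that minimisation on a tiny hypothetical graph; the couplings J_ij are arbitrary, and exhaustive search like this is only feasible at toy scale, which is exactly the scaling problem an analogue polariton simulator aims to sidestep.

```python
import itertools
import numpy as np

# Toy XY-model minimisation on a small graph.  The couplings J_ij are
# arbitrary illustrative values; real instances encode the problem of interest.
# H(theta) = -sum_{i<j} J_ij * cos(theta_i - theta_j)

J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.8}  # hypothetical couplings

def xy_energy(theta):
    return -sum(Jij * np.cos(theta[i] - theta[j]) for (i, j), Jij in J.items())

# Brute-force search over a coarse grid of phases -- only feasible for a few
# spins, which is exactly the scaling wall an analogue simulator sidesteps.
grid = np.linspace(0, 2 * np.pi, 24, endpoint=False)
best = min(itertools.product(grid, repeat=3), key=xy_energy)
print("best phases:", [f"{t:.2f}" for t in best], "energy:", f"{xy_energy(best):.3f}")
```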

“We are just at the beginning of exploring the potential of polariton graphs for solving complex problems,” said co-author Professor Pavlos Lagoudakis, Head of the Hybrid Photonics Lab at the University of Southampton and the Skolkovo Institute of Science and Technology, where the experiments were performed. “We are currently scaling up our device to hundreds of nodes, while testing its fundamental computational power. The ultimate goal is a microchip quantum simulator operating at ambient conditions.”

 

India Pitches for Cashless, Digital Payments Campaign

As demonetisation laid bare how far rural India lags behind urban centres in banking and payment technology, the Indian government has decided to pitch for a campaign to provide information, education and communication, holding camps to help people transition to digital modes of payment.

The series of measures undertaken includes incentives to district administrations that will give a boost to cashless digital payment systems across districts, talukas and panchayats.

NITI Aayog has prepared a blueprint of incentives for the campaign for district authorities and administrations, which includes incentives for digital payments in day-to-day financial transactions such as buying or selling goods and services and transferring money.

NITI Aayog will provide logistical support for outreach activities at these three levels in the form of seed money of Rs.5 lakh per district administration to promote the seeding of mobile and Aadhaar numbers to bank accounts, the issue of RuPay cards wherever necessary, the issue of PINs, the downloading of the app and, finally, the completion of two successful transactions.

The top ten best performing districts will be awarded the Digital Payment Champions of India award.

The first 50 panchayats to go cashless will be awarded the Digital Payment Award of Honour.

The five digital payment systems are:

  1. Unified Payments Interface (UPI),
  2. USSD (*99# banking),
  3. Aadhaar Enabled Payment System,
  4. Wallets, and
  5. RuPay/debit/credit/prepaid cards.

The Hindi and English versions of the brochure, along with the full set of creative material – presentations, posters, FM radio spots and films – are available on NITI Aayog’s website at www.niti.gov.in/conetent/digital-payments.

In addition, Common Service Networks are being mobilised to support the shift to cashless transactions, and the Ministry of Electronics and Information Technology has announced a cash incentive of Rs.100 for every merchant enabled to transact digitally. Two resource persons have been provided in each district collectorate to coordinate these efforts.

NITI Aayog has also solicited feedback from the districts on the challenges they face, possible solutions and the ways in which they can be supported.