Security Council to meet following Russia’s violation of Polish airspace, as concern mounts over drone warfare

The meeting was requested by Poland after it reported at least 19 violations of its territory by Russian drones overnight into Wednesday, during a large-scale missile and drone strike against Ukraine.

The episode marked the most serious such incursion since the start of Russia’s full-scale invasion of Ukraine in 2022.  

While Poland and its NATO allies reportedly downed several of the drones, the incident has heightened tensions across the region – and put the new threats posed by drone warfare at the heart of diplomatic debate.

Russia’s Defence Ministry said the strikes were aimed at military-industrial targets in Ukraine and that it had not intended for its drones to cross the border.

UN political chief to brief

The UN’s political affairs chief, Rosemary DiCarlo, is expected to brief ambassadors. Poland’s deputy foreign minister will attend, alongside representatives of regional states and the European Union.

The incident has deepened concern that the conflict in Ukraine could spill over into neighbouring countries.

Stay with UN News as we bring you live coverage of the meeting…

Rise of the drone

The reported incursion into Polish airspace highlights the growing role of drones in modern conflict.

Relatively inexpensive and easy to deploy, drones are increasingly supplementing – and in some cases supplanting – conventional military hardware.  

Armies, armed groups and militias worldwide are rapidly adapting to their use, allowing for strikes and reconnaissance with lower risk to personnel.

However, malfunctions, loss of control and human error can lead to unintended strikes or impacts – especially when they’re deployed in towns and cities as opposed to the battlefield.

Analysts also say drones blur the line between traditional military operations and asymmetric warfare, raising the risk of unintended escalation across borders.



UN condemns deadly Russian strikes on Ukrainian capital as civilian toll mounts

According to the UN Human Rights Monitoring Mission in Ukraine (HRMMU), more than 30 locations across seven districts of Kyiv were struck in what it described as “the deadliest attack” on the Ukrainian capital in nearly a year.

“Last night’s attack exemplifies the grave threat posed by the tactic of deploying missiles and large numbers of drones simultaneously into populated areas,” said Danielle Bell, Head of HRMMU.

Humanitarian Coordinator for Ukraine, Matthias Schmale, also strongly condemned the attacks, which extended to Odesa, Zaporizhzhia and other areas.

“The people of Ukraine should not have to take cover in shelters night after night,” he said. “Each day, the war takes a devastating toll on civilians.”

In the southern port city of Odesa, strikes reportedly injured several civilians and damaged a kindergarten and a centre for children with special needs – places where children should feel safe. In Zaporizhzhia, residential buildings were hit.

First responders and humanitarian agencies are already on the ground, providing emergency care and supplies while assessing further needs.

Human toll rising

Russian forces launched 440 long-range drones and 32 missiles in the barrage, of which 175 drones and 14 missiles targeted Kyiv, HRMMU noted in a news release citing information from Ukrainian authorities.

It marked the fourth time this month that more than 400 munitions were fired in a single night – far surpassing the 544 total launched during the entire month of June 2024.

Even before this latest attack, the human toll of such tactics had been rising sharply. HRMMU had already verified at least 29 civilian deaths and 126 injuries from long-range weapons in June alone.

The overall civilian casualty count in the first five months of 2025 is nearly 50 per cent higher than in the same period last year.

Mr. Schmale reiterated that attacks on civilians and civilian infrastructure are prohibited under international humanitarian law.

“Civilians, including children, must never be a target,” he said. “We must not normalize the war.”

Refugee crisis deepens

Meanwhile, the broader humanitarian crisis continues to deepen. The intense conflict, now in its third year since Russia’s full-scale invasion, has driven more than 6.3 million Ukrainians to seek refuge across Europe.

Most are women, children and older persons, many of whom rely on temporary protection extended by hosts such as the European Union (EU) and Moldova, according to a report released on Tuesday by the Office of the UN High Commissioner for Refugees (UNHCR).

Noting the volatile situation in Ukraine, the agency urged the respective governments to maintain legal status for refugees until conditions allow for safe, dignified, and sustainable returns.


As AI evolves, pressure mounts to regulate ‘killer robots’

Every day, we voluntarily give up information about ourselves to machines. This happens when we accept an online cookie or use a search engine. We barely think about how our data is sold and used before clicking “agree” to get to the page we want, dimly aware that it will be used to target us as consumers and convince us to buy something we didn’t know we needed.

But what if the machines were using the data to decide who to target as enemies that need to be killed? The UN and a group of non-governmental organisations are worried that this scenario is close to being a reality. They are calling for international regulation of Lethal Autonomous Weapons Systems (LAWS) to avoid a near-future where machines dictate life-and-death choices.

Large-scale drone warfare unfolding in Ukraine

For several months, the Kherson region of Ukraine has come under sustained attack from weaponised drones operated by the Russian military, principally targeting non-combatants. More than 150 civilians have been killed, and hundreds injured, according to official sources. An independent UN-appointed human rights investigation has concluded that these attacks constitute crimes against humanity.

The Ukrainian army is also heavily reliant on drones and is reportedly developing a “drone wall” – a defensive line of armed Unmanned Aerial Vehicles (UAVs) – to protect vulnerable sections of the country’s frontiers.

Once the preserve of the wealthiest nations that could afford the most high-tech and expensive UAVs, Ukraine has proved that, with a little ingenuity, low-cost drones can be modified to lethal effect. As conflicts around the world mirror this shift, the nature of modern combat is being rewritten.


Creeping ‘digital dehumanisation’

But, as devastating as this modern form of warfare may be, the rising spectre of unmanned drones or other autonomous weapons is adding fresh urgency to ongoing worries about ‘killer robots’ raining down death from the skies, deciding for themselves who they should attack.

“The Secretary-General has always said that using machines with fully delegated power, making a decision to take human life, is just simply morally repugnant,” says Izumi Nakamitsu, the head of the UN Office for Disarmament Affairs. “It should not be allowed. It should be, in fact, banned by international law. That’s the United Nations position.”

Human Rights Watch, an international NGO, has said that the use of autonomous weapons will be the latest, most serious example of encroaching “digital dehumanisation,” whereby AI makes a host of life-altering decisions on matters affecting humans, such as policing, law enforcement and border control.

“Several countries with major resources are investing heavily in artificial intelligence and related technologies to develop air, land and sea-based autonomous weapons systems. This is a fact,” warns Mary Wareham, advocacy director of the Arms Division at Human Rights Watch. “It’s being driven by the United States, but other major countries such as Russia, China, Israel and South Korea have been investing heavily in autonomous weapons systems.”

Advocates for AI-driven warfare often point to human limitations to justify its expansion. Soldiers can make errors in judgment, act on emotion, require rest, and, of course, demand wages – while machines, they argue, improve every day at identifying threats based on behavior and movement patterns. The next step, some proponents suggest, is allowing autonomous systems to decide when to pull the trigger.

A UAV (Unmanned Aerial Vehicle) is pictured airborne over Afghanistan.

There are two main objections to letting the machines take over on the battlefield: firstly, the technology is far from foolproof. Secondly, the UN and many other organisations see the use of LAWS as unethical.

“It’s very easy for machines to mistake human targets,” says Ms. Wareham of Human Rights Watch. “People with disabilities are at particular risk because of the way they move. Their wheelchairs can be mistaken for weapons. There’s also concern that facial recognition technology and other biometric measurements are unable to correctly identify people with different skin tones. The AI is still flawed, and it brings with it the biases of the people who programmed those systems.”

As for the ethical and moral objections, Nicole Van Rooijen, Executive Director of Stop Killer Robots, a coalition campaigning for a new international law on autonomy in weapons systems, says that such weapons would make it very difficult to ascertain responsibility for war crimes and other atrocities.

“Who is accountable? Is it the manufacturer? Or the person who programmed the algorithm? It raises a whole range of issues and concerns, and it would be a moral failure if they were widely used.”

A ban by 2026?

The speed at which the technology is advancing, and evidence that AI-enabled targeting systems are already being used on the battlefield, are adding urgency to calls for international rules governing the technology.

In May, informal discussions were held at UN Headquarters, at which Secretary-General António Guterres called on Member States to agree a legally binding instrument to regulate and ban such weapons by 2026.

Attempts to regulate and ban LAWS are not new. In fact, the UN held the first meeting of diplomats in 2014, at the Palais des Nations in Geneva, where the chair of the four-day expert talks, Ambassador Jean-Hugues Simon-Michel of France, described LAWS as “a challenging emerging issue on the disarmament agenda right now,” even though no autonomous weapons systems were being used in conflicts at the time. The view then was that pre-emptive action was needed to get rules in place in the eventuality that the technology would make LAWS a reality.

Eleven years later, talks are ongoing, but there is still no consensus over the definition of autonomous weapons, let alone agreed regulation on their use. Nevertheless, NGOs and the UN are optimistic that the international community is inching slowly towards a common understanding on key issues.

“We’re not anywhere close to negotiating a text,” says Ms. Van Rooijen from Stop Killer Robots. “However, the current chair of the Convention on Certain Conventional Weapons (a UN humanitarian law instrument to ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately) has put forward a rolling text that is really quite promising and that, if there is political will and political courage, could form the basis of negotiations.”

Ms. Wareham from Human Rights Watch also sees the May talks at the UN as an important step forward. “At least 120 countries are fully on board with the call to negotiate a new international law on autonomous weapons systems. We see a lot of interest and support, including from peace laureates, AI experts, tech workers, and faith leaders.”

“There is an emerging agreement that weapon systems that are fully autonomous should be prohibited,” says Ms. Nakamitsu, from the UN Office for Disarmament Affairs. “When it comes to war, someone has to be held accountable.”
