The House of Lords Artificial Intelligence in Weapons Systems Committee is set to convene for its fifth public evidence session on Thursday, 11 May at 10 am.

The session will examine the implications of artificial intelligence (AI) for warfare and the development of policies to counter its rapid advancement.

Professor Sir Lawrence Freedman, an esteemed war and strategy specialist and Emeritus Professor of War Studies at King’s College London, will testify before the committee.

The session will be accessible to the public via live streaming on Parliamentlive.tv or in person at Committee Room 4A, Palace of Westminster.

Topics for discussion may include the potential for AI to fundamentally alter warfare, the UK’s stance on autonomous weapons systems (AWS), the UK government’s role in international regulatory discussions, and strategies to mitigate the increased risks posed by AWS.

Autonomous weapons systems, which can identify and engage targets without human intervention, have the potential to transform warfare by offering increased speed, accuracy, and resilience compared to existing weaponry. Furthermore, proponents argue that AWS could reduce casualties during armed conflicts.

However, questions have arisen regarding the ethics of deploying such systems, their safe and reliable operation, the possibility of hastening warfare escalation, and compliance with international humanitarian law.

Throughout the inquiry, the committee, chaired by Lord Lisvane, will explore a wide range of issues related to autonomous weapons systems, including:

  • The challenges, risks, and benefits associated with these systems.
  • The technical, legal, and ethical safeguards necessary for their safe, reliable, and accountable use.
  • The adequacy of current UK policy and the state of international policymaking on autonomous weapons systems.
Tom has spent the last 13 years working in the defence industry, specifically military and commercial shipbuilding. His work has taken him around Europe and the Far East; he is currently based in Scotland.
12 Comments
Bloke down the pub
10 months ago

Mines have been used on land and sea for over a hundred years and are autonomous in that once positioned they can activate without further human permission. While they can kill and maim innocent civilians, their use raises few qualms in times of war. I don’t see AI weapons as being any different in that respect.

Dave
10 months ago

The UK does not use mines without human-in-the-loop command detonation. It’s almost as if the people involved in this review had thought about that already and weren’t caught unawares by a bloke from the pub. This issue is vastly different and requires detailed and transparent consideration, as is happening.

Jack
10 months ago

These “great and good” ethical and endless discussions ignore reality. All jaw, jaw whilst potential enemies roll out future weapons systems almost monthly. You are either prepared, or not. No one wants or likes warfare. However, tech is proceeding at such a rapid pace we risk being left far behind. In my time we saw cluster munitions and offensive gas/bio weapons being banned, only to see them all used elsewhere. Perhaps “Their Lordships” need to stop trying to justify their existence by endless and useless debates. They are well past their sell-by date. A bunch of chocolate fireguards.

Bulkhead
10 months ago
Reply to  Jack

🖕

David Lloyd
10 months ago

Hopefully their Lordships will watch the 2013 Tom Cruise film “Oblivion”. The AI drones hunting down humans were terrifying…

Graham Moore
10 months ago

Why does the HoL want to conduct the ‘development of policies to counter AI’s rapid advancement’?

This sounds negative. Smaller armed forces most need to exploit new technology. Use of AI does not have to mean that there is no human involvement in targeting and weapon release.

BobA
10 months ago
Reply to  Graham Moore

I think that is the main point though Graham – AI has the potential to make a really short OODA loop. The issue is whether it is desirable to have a human in the kill chain or not – and unless you drive international agreement on what is and is not acceptable, you better not be on the wrong side of history! So better to debate the policy now before it can be truly autonomous so at least we know what we want to negotiate. As a professional military officer, I don’t want a fair fight – but as a… Read more »

Graham Moore
10 months ago
Reply to  BobA

Thanks Bob. Of course we have long had munitions in service that have no human in the kill (decision) chain once the weapon has been ‘set’ or switched on – mines, CIWS Phalanx/Goalkeeper. But that is somewhat different, so park that one! AI does not have to do everything in the OODA cycle of course – it can Observe, Orientate, Decide (in terms of producing a firing solution to a human) – then the human could Act ie engage the target. I doubt that is remotely controversial. For the fully autonomous AI weapon system that does the Decide itself and… Read more »
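The human-in-the-loop distinction Graham describes can be sketched in code. This is purely an illustrative sketch of the two decision architectures — the names, the confidence threshold, and the approval callback are invented for illustration and reflect nothing in UK doctrine or any real system:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class FiringSolution:
    """Output of the AI's Observe/Orient/Decide steps (illustrative)."""
    target_id: str
    confidence: float  # 0.0–1.0, the system's confidence in the identification

# Illustrative threshold only — not a real doctrinal value.
CONFIDENCE_THRESHOLD = 0.9

def human_in_the_loop_engage(
    solution: FiringSolution,
    operator_approves: Callable[[FiringSolution], bool],
) -> bool:
    """The AI Observes, Orients, and Decides (produces a firing solution),
    but the Act step is gated on explicit human approval."""
    if solution.confidence < CONFIDENCE_THRESHOLD:
        return False  # solution never even reaches the operator
    return operator_approves(solution)

def fully_autonomous_engage(solution: FiringSolution) -> bool:
    """The contested case: the system both Decides and Acts,
    with no human in the kill chain."""
    return solution.confidence >= CONFIDENCE_THRESHOLD
```

The structural difference is a single call: the first function cannot return `True` without a human callback assenting, while the second closes the loop on its own — which is exactly why the latter attracts the legal and ethical scrutiny discussed above.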

George
10 months ago

The words Lords, Committee and intelligence used in the same sentence. Not a combination one sees very often. At least not without “low” and “senile” being included too. Will any of them wake up and take part?

N.
10 months ago
Reply to  George

Methinks you haven’t seen any Lords committees on air. On the rare occasions I have (on other subjects, though), some of those wrinklies were ‘anything but’ senile. Actually, the same applies to their US counterparts. I remember the grilling they gave the US chiefs of intelligence about Ukraine/Russia intel failures, and how the victims squirmed and wriggled — I really felt sorry for them.
