New Technologies in the Global Battlespace: themes emerging from the IPSIG/Law and Future of War workshop held on Friday 13 May 2022

Written by Catherine Thornton and Kirsty McRuvie




(Image: Kevin Hu, Unsplash)

The workshop ‘New Technologies in the Global Battlespace’ was hosted by the International Peace and Security Interest Group (IPSIG) of ANZSIL and the University of Queensland’s Law and Future of War Research Group on 13 May 2022. The purpose of the workshop was to discuss academic papers and practical implications concerning the impact of new technologies on compliance with international humanitarian law (IHL) and public international law more broadly.

Through presentations and Q&A dialogues, the workshop discussed current challenges within IHL concerning, inter alia, autonomy in weapons systems and data processing, accountability of the corporate actors creating autonomous weapons systems, the role of visual evidence in IHL compliance, the applicability of IHL to cyber military operations, efforts to strengthen the Biological Weapons Convention, the legalities of technology-sharing in the maritime environment, and the viability of digital emblems in safeguarding protected persons during cyber-operations.  

The workshop featured contributions by experts from the academic, humanitarian and government sectors, offering a comprehensive range of views on the topic of new technologies. Whilst each speaker covered an independent but related topic, there were four key themes that we identified as resonating across the workshop.


The pace of technological change

The rapid pace of technological change and innovation was frequently referenced throughout the day. While the law is not static, it does not progress at the same rate as technology, and is consequently often playing catch-up. Technologies such as autonomous weapons systems (AWS) and cyber capabilities present many challenges to the existing legal framework. One example is that principles of IHL may, in their current form, have the potential to increase the risk to civilians during hostile military cyber operations. This is primarily due to the inherent interconnectivity of networks and the speed at which militaries are embracing and expanding their cyber capabilities as a method of warfare. In response to the militarisation of the cybersphere, organisations such as the ICRC have begun projects such as the digitisation of the Red Cross emblem as a protective measure against the damage or destruction of critical civilian and medical digital infrastructure.

Considering these and other technological challenges, academic experts suggested that a careful and critical reading of IHL may be necessary to understand where and why the law falls short of regulating such change. Other contributors suggested returning to the underlying principles of IHL as a way to grant the law sufficient flexibility and breadth to adequately regulate emerging technologies in real time.

Additionally, the point was made that it is important to focus on the consequences of the technologies, and perhaps less so on the often-fraught efforts to define the technologies themselves. With respect to the speed at which new technology is developing, the Biological Weapons Convention (BWC) was used as an example of the need to keep pace with the evolution of global threats. COVID-19, for example, was noted as having a somewhat positive effect on the upcoming BWC review conference, as States Parties now recognise the danger of a virus or pathogen that can move around the globe with such ease and speed, causing devastating human casualties and economic damage.


The (de)humanisation of IHL

The (de)humanisation of IHL was another key theme, specifically concerning the moral and ethical dilemmas of decision making and the processing and evaluation of data. Questions subsequently arose regarding the efficacy and limitations of technology. How does removing the ‘human’ affect the way IHL is seen and applied? In some cases, technology was recognised as being beneficial, providing unbiased information with extreme efficiency. Other aspects of human interaction with technology proved problematic, for instance human deference to technology for the collection and processing of data (such as artificial intelligence that processes images) and the unreliability this can introduce. Again, consideration was given to the benefits of such technology, but also to the limitations posed for accuracy of information by the ever-diminishing human filter.

Removing the ‘human’ from the decision-making process was also discussed with respect to autonomous vehicles and platforms, and the complexity involved with programming machines, and ultimately algorithms, to make ethical and legal decisions without human input at the decision-making stage. The ‘human-tech’ paradigm has the potential to erode human judgement, reduce situational awareness and increase invisible subjectivity.   


Accountability of private actors

Accountability was discussed specifically with respect to private actors and their involvement in the development of autonomous weapons systems (AWS) and autonomous vehicles (AV). Big Tech companies are writing legal and ethical decision-making into software designed for use in AWS and/or in the operation of AV. The pre-determined decisions often made by private sector designers or software developers are made either in collaboration with, or independently of, States. Currently, however, these private actors avoid primary accountability for any unlawful act or harm which directly results from AWS or AV software.

A focus in this area is ensuring that autonomous systems technology evolves in conjunction with international law, to ensure compliance with the law and to extend accountability for potential IHL violations in the development and deployment of these systems in armed conflict. It was discussed that, in concert with international law, domestic regulation of AV and AWS development must be reinforced to avoid problematic AV and AWS programming in the early stages. However, given the perception that technology has evolved outside the State apparatus, it may be time for States and private tech companies to design a universal set of international guidelines for AV and AWS development. This ignited discussion around the relationship between States and private actors, and States’ obligation to ensure respect for IHL, particularly the extent to which that obligation encompasses accountability of private organisations.


Soft law and multilateral cooperation

Soft law was raised as a pragmatic tool to regulate emerging technologies in new domains such as cyberspace. The development of new technologies poses a challenge for lawmakers, not only because of its rapid pace and asymmetric developments, but also because of its politicisation within global fora. Soft law was noted as an effective way to establish global norms and promote responsible State behaviour without committing States to legal obligations.

Continued dialogue within multilateral fora is essential, and Australia has made significant contributions to the discussion and drafting of non-legally binding documents through platforms such as the United Nations Group of Governmental Experts (UNGGE) and the Open-Ended Working Group (OEWG) in relation to the use of information and communications technology. The outcome of these multilateral fora was a consensus in 2021 that legal principles associated with IHL apply to cyber military operations in much the same way as to conventional warfare. The theme of global cooperation also arose in discussions regarding the AUKUS alliance and the legal questions surrounding capability-sharing (of nuclear-propulsion and other new technologies).

The benefit of multilateral diplomacy and engagement in the development of new technology is that it lessens the pressure on finite definitions and interpretations of international law in its application to an ever-evolving technology sector. The integration of soft law also allows for broad perspectives and the flexibility to supplement the law where it may be limited in its definitive application. 

This analysis has been compiled by University of Queensland students Catherine Thornton (MIR/MIL) and Kirsty McRuvie (LLB) who have engaged with the Law and the Future of War research group at UQ during their studies.

Catherine Thornton

Catherine Thornton is completing the final year of a dual Master of International Law and Master of International Relations programme at the University of Queensland (UQ). She also holds a Bachelor of International Studies (with majors in international relations and Spanish) from UQ. Catherine has studied international humanitarian law at a postgraduate level and during her undergraduate studies at Tecnologico de Monterrey, Mexico. Her engagement with the Law and Future of War Research Group includes attending the course ‘Technology and National Security’ conducted by the Group. She is also a member of the Australian Red Cross IHL Community Engagement Network in Queensland.

Kirsty McRuvie

Kirsty McRuvie recently graduated from the University of Queensland after completing a dual Bachelor of Laws (Honours) and Bachelor of Arts (extended major in international relations). She is the Research Assistant for Dr Eve Massingham, focusing predominantly on IHL and the obligation to respect and ensure respect for IHL. Kirsty is the Social Media Editor of the Journal of International Humanitarian Legal Studies, volunteers as Convenor of the Queensland Community Engagement Network, and is a member of the Australian Red Cross’ IHL Advisory Committee. She currently works at the University of Queensland TC Beirne School of Law’s Pro Bono Centre.
