ChatGPT: The end of diplomacy as we know it

By Corneliu Bjola and Ilan Manor - 25 April 2023

Corneliu Bjola and Ilan Manor argue that the stage is set for the emergence of AI-driven diplomacy and explore its implications.

Lord Palmerston, a British Prime Minister and Foreign Secretary, famously exclaimed, "My God, this is the end of diplomacy!" when he received the first telegraph cable in 1860. His reaction may have been hyperbolic and uncharacteristically British in its lack of understatement, but it reflected genuine concerns about the potential impact of the telegraph on diplomacy. Palmerston's fears included the possibility of mistakes in coding or decoding cables leading to unintended consequences, the potential curtailment of diplomats' agency by centralized coordination, and the accelerated speed of diplomacy, which could disrupt traditional diplomatic practices that required time.

Fast forward to 2022: the launch of ChatGPT, a powerful conversational AI system, has raised similar concerns about the impact of AI on diplomacy. ChatGPT has demonstrated remarkable sophistication, even passing exams at prestigious law and medical schools. The media's portrayal of ChatGPT as all-knowing has heightened its credibility. The rapid development of newer versions, such as GPT-4 and its anticipated successors, has raised alarms among AI entrepreneurs and researchers, including Elon Musk, who have called for a pause on the most powerful AI systems because of their risks and the need for safety measures. We argue that ChatGPT and the next generation of conversational AI could have a profound impact on diplomacy, disrupting how diplomats communicate, negotiate, and manage crises.

First, AI-generated content could damage a nation's reputation and the credibility of its leaders, and undo previous diplomatic efforts. With the ability to mimic the tone, language, and logic of different leaders, ChatGPT could create more sophisticated and believable misinformation than current forms of disinformation. Imagine a scenario in which a hostile party uses ChatGPT to create fake minutes of secret discussions between the German Chancellor and the French President, expressing doubts about NATO’s resolve and suggesting that a secret deal be struck with Russia to end the Ukraine War. These fake minutes could be leaked to the media, shared online, and picked up by other world capitals, creating confusion and undermining diplomatic efforts. The sheer volume of false information that a single ChatGPT user could generate and share could create "noise" that makes it difficult for nations to distinguish information from misinformation, hampering diplomats' ability to manage crises. This may lead to a situation in which AI systems battle one another, leaving human diplomats "out of the loop."

Second, as Palmerston predicted, ChatGPT may diminish diplomats’ agency. Conversational AIs could author press releases, create content for social media campaigns, draft UN resolutions, and generate speeches before human diplomats even get to work. While automation may be cost-effective, it may not translate into effectiveness in diplomacy, which relies heavily on informal conversations and personal relationships between diplomats. Moreover, the skills required to use conversational AIs differ from those of traditional diplomacy, which requires communicative skill, charm, and the ability to foster personal relationships. Placing more diplomats in front of computer screens and training them to converse with AIs rather than humans may leave them ill-equipped to manage diplomatic relations with other states. Ethical and legal challenges may also arise over responsibility for the actions and outcomes of AI systems in diplomatic settings.

Third, ChatGPT could also be used as a tool for "continuing negotiations," a concept proposed by Cardinal Richelieu for addressing potential crises before they emerge. By developing simulations that explore various paths of escalation or de-escalation in emerging conflicts, diplomats could use ChatGPT to assess the likelihood of future diplomatic crises and devise strategies to prevent or manage them. The recent diplomatic crisis caused by the detection of a Chinese balloon in US airspace serves as a useful case study. When asked to assess the likelihood of a future diplomatic crisis between the United States and China on the basis of this incident, ChatGPT offered two scenarios, one in which a crisis was likely and one in which it was unlikely, depending on the extent of the intelligence gathered by the balloon and the level of public concern. However, diplomatic simulations using AI raise questions about the validity and accuracy of the data and assumptions on which they are based. If the data used to train ChatGPT is biased or incomplete, the resulting simulations may not accurately represent the complexities of real-world diplomatic scenarios.

In conclusion, ChatGPT has the potential to significantly alter the landscape of diplomacy as we know it. Diplomacy has always relied on human skill, judgment, and personal relationships, and while AI may offer new opportunities, it should complement and enhance human diplomacy rather than replace or undermine it. With ChatGPT, diplomatic processes could become faster and more efficient, yet also less reliant on human expertise. This sets the stage for the emergence of AI-driven diplomacy, a prospect that Lord Palmerston would likely have met with disapproval.

 

Corneliu Bjola (University of Oxford) and Ilan Manor (Ben-Gurion University of the Negev).

