1994-1996: Reaching critical mass
CTBT Article IV: Verification
Article IV is a key component of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). It provides for a global verification regime—or global “alarm system”—to monitor all States Parties’ compliance with the Treaty’s provisions.
The main obligations in the CTBT are clear: all nuclear explosions are prohibited, whether conducted for military or peaceful purposes and regardless of yield. The verification regime therefore has to be as tight-knit as possible, making it extremely difficult, if not impossible, to evade the prohibitions.
How this was accomplished during the negotiations of the Treaty in 1994-1996—both technologically and politically—was, according to Ambassador Jaap Ramaker of the Netherlands, who chaired the Treaty negotiations in their final year of 1996, “an example of diplomatic successes erected upon a bedrock of sound science”.
To read more about the negotiations of the Treaty, see Treaty History.
Setting the stage
According to Ramaker, it was clear from the outset of the negotiations in 1994 that none of the five nuclear weapon States – China, France, the Russian Federation, the United Kingdom and the United States – would accept a test-ban treaty unless it included an international verification regime that could monitor the other States’ nuclear testing activities without threatening their own national security.
In 1994 the Ad Hoc Committee established to negotiate the CTBT set up a working group on verification issues. This Working Group (Working Group 1) immediately initiated two years of intensive talks to hammer out what range of technologies (i.e. seismic and non-seismic) would be required to monitor nuclear testing in all environments—the earth, the seas, the air and, initially, also outer space.
Acquiring the experts
Preparations for the scientific work had begun as early as 1976. This work, conducted in the so-called Ad Hoc Group of Scientific Experts (GSE), became continuous in 1982. The group’s expertise proved valuable in deciding which technologies to use and how to dovetail the transmission of data in real time. The frequent presence of experts in seismology and other complementary technologies in the disarmament work in Geneva, even before the actual CTBT negotiations began, proved invaluable to the deliberations on the verification regime.
Selecting the technologies
Thanks to the GSE, there was consensus quite early on that the International Monitoring System (IMS) would have seismic stations at its core and be reinforced by other monitoring technologies. There was also consensus that it was only through the use of complementary technologies that the capability of detecting testing in all environments would be guaranteed. Things began to take more concrete shape during the actual negotiations between 1994 and 1996:
- Seismic: An “alpha” or primary network of some 50 stations, complemented by a “beta” or auxiliary network of about 120 already operating stations providing supplementary data upon request, and later a “gamma” network at the national level. The GSE conducted several technical tests to demonstrate how the system would actually work. The Group of Scientific Experts’ Third Technical Test, GSETT-3 for short, which ran from 1995 to 2000, helped refine the concept and translate it into reality. When the results were presented, delegations could begin to envisage how a global network of high-quality seismic stations could successfully transmit data in real time to a central International Data Centre, and also where these stations could be established and what they would cost.
- Non-seismic: Delegations were then invited to present papers on a variety of other technologies that could be introduced into a synergistic, cost-effective monitoring system.
The wide-ranging proposals tabled during the Working Group’s deliberations included satellite photography, satellite systems to monitor explosions in the atmosphere and in outer space, “flying radionuclide laboratories” for airborne surveillance, and electromagnetic pulse stations that, together with ground-based optical instruments, would detect nuclear explosions in space.
Such proposals were later sidelined, primarily because of their prohibitively high costs (e.g. US$ 3,000 million for 20 dedicated satellites) or technical inadequacies such as an excessively high false-alarm rate. The GSE and Working Group 1 soon turned instead to complementary technologies:
- Radionuclide: Participants were quick to recognize the unique value of radionuclide technology, which could detect the radioactive particles and noble gases that identify an explosion as nuclear, in contrast to other man-made or natural phenomena such as chemical explosions, earthquakes or ocean-floor volcanoes. This technology would require certified laboratories to analyze the data collected. Costs were estimated at between US$ 60,000 and US$ 1 million per station, depending on the type, with annual running costs of US$ 15,000–120,000 and US$ 2-5 million per certified lab.
- Infrasound: This technology could be used to monitor explosions in the atmosphere. The experts foresaw between 30 and 120 stations (US$ 50,000–100,000 each).
- Hydroacoustic: Though expensive, these stations would be very few in number, thanks to their effectiveness in detecting underwater explosions. Equipped with sensors deployed in the oceans or on island shores, these stations could be used in synergy with the seismic network. Total costs ranged from US$ 2 million to US$ 50 million, depending on the sophistication of the system. Running costs would average about US$ 500,000 per year.
The sheer abundance of highly technical information triggered, according to Ramaker, uncertainty and indecision among the delegates, most of whom were not technical experts. Such was the complexity of their task that, in June 1994, Peter Marshall (UK), one of the six “Friends of the Chair” (FOC) appointed to deal with non-seismic techniques in private and open-ended consultations, was asked to present his personal opinion. This presentation eventually formed the substance of this part of the final Treaty.
Marshall endorsed the four technologies later selected—seismic, radionuclide, infrasound and hydroacoustic—plus the International Data Centre (IDC) as a vital analytical component. These would be backed up by on-site inspections.
Marshall ended his presentation by saying that “The system I have described will provide a … significant level of deterrence, at a reasonable cost”. His summary was greeted with a round of applause and the adoption of his suggestions – the stations and technologies that came to form the basis of the CTBT’s International Monitoring System.
To read more about the work of the Group of Scientific Experts from 1982 to 1996, the build-up of the International Monitoring System and other aspects of the verification regime, please read the interview with Ola Dahlman.
Dr Dahlman chaired the Group of Scientific Experts, which worked out the details of the verification regime before and during the negotiations of the CTBT, and the CTBTO Working Group on verification from 1996 to 2006.
Next chapter: 1997-2006: The pioneering decade