General H.R. McMaster recently published an op-ed in the New York Times on the folly of thinking war can be easily won, and on the intellectual gymnastics policymakers perform to maintain that illusion. Reading his analysis, I found many of his observations germane to the drive to “tech-up” peacekeeping operations. McMaster’s critique focuses on the U.S. defense establishment’s recent failure to account for the political and human dynamics of warfare, wrongly assuming that technological superiority would win the day. While the peacekeeping community has recognized that local human and political dynamics affect mission success, there is a similar trend toward pursuing mission efficiency through technology acquisition and application.
McMaster lays out three assumptions that were ignored leading up to and during the Iraq and Afghanistan wars: war is political, war is human, and war is unpredictable. While network-centric warfare, technological superiority, and “shock and awe” air campaigns delivered swift tactical success, the political and human dimensions of war were neglected, and those dimensions created unpredictability and chaos after the Taliban was routed and Saddam was captured. The peacekeeping community has done well to recognize the limitations of large-scale technocratic solutions to post-conflict stability, but McMaster’s three assumptions of warfare could prove useful when trying to balance technological innovation with the political and human aspects of peacekeeping.
Peacekeeping is political. Not just at UN headquarters, either; the politics of post-conflict peacebuilding, which include ceasefires and peace negotiations, are complex. Peacekeeping missions play a major role in monitoring ceasefires and weapons transfers, and must provide a credible commitment to all sides that the others will abide by the ceasefire and the terms of the peace. Dr. Walter Dorn, a professor at the Canadian Forces College who specializes in peacekeeping and surveillance technology, has written extensively on how missions have been under-supplied and which technologies would enhance the surveillance and monitoring capabilities of peacekeepers. This is certainly useful information, but if mission planners forget that the technology must work in support of the mission’s political goals, not vice versa, then night vision, satellite imaging, and the like become tools looking for problems to solve: a distraction rather than a solution.
Peacekeeping is human. This is where much recent critical work has focused. Severine Autesserre has written extensively about the problems that arise when peacekeeping missions focus on national solutions, such as elections, while missing opportunities to support local peacebuilding initiatives. This tendency to focus on big solutions is both economic and systemic. The political economics of peacekeeping mean that the faster a mission can achieve a minimum level of stability in a post-conflict state and get out, the better. Peacekeeping missions are also big systems, built to operate as counterparts to the national host government. They aren’t designed to deal with local micro-level issues; often they aren’t mandated for local peacebuilding, and even when they have the mandate they lack the bureaucratic flexibility and scale to quickly allocate resources across multiple localities. Mission structuring and management need to be reformed to support micro-level peacebuilding; technologies like drones, Ushahidi maps, and Twitter add little value if peacekeepers are unable to act on the data these technologies provide.
Peacekeeping is unpredictable. The peacekeeping environment is not as unpredictable as an invasion, since a mission typically enters a conflict after a ceasefire; there is at least some knowledge of how the conflict has progressed by the time a peacekeeping mission arrives. Can innovative technology alleviate unpredictability further? The short answer is yes, but only if missions invest in skilled analysts to manage all the data coming in from these technologies. Inferential analysis is difficult to do well, and technology does not make it easier; in fact, it can make it harder. Statistical analysis software can do huge amounts of math, but it will not give you the right answer unless you actually know how to do statistics. It will happily do exactly what you tell it to do, even if what you’re telling it to do is wrong. Big data is another realm of analytics where the hype is outpacing the demonstrable value. An analyst should focus on gathering sufficient data to do their work; millions and millions of data points don’t help if they can’t be parsed into useful samples. Indeed, unstructured big data can produce confounded results and problems of endogeneity.
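The point about software happily doing the wrong thing can be made concrete with Simpson’s paradox: a pooled figure can reverse the conclusion that every subgroup supports, and no statistics package will warn you. The sketch below uses hypothetical incident-resolution counts (the scenario names and numbers are illustrative, not drawn from any mission’s data):

```python
# A minimal sketch of Simpson's paradox. The numbers and labels are
# hypothetical: counts of (resolved, total) incidents for two monitoring
# methods, split across two strata of incident severity.
reports = {
    "small incidents": {"patrol": (81, 87),   "remote": (234, 270)},
    "large incidents": {"patrol": (192, 263), "remote": (55, 80)},
}

def rate(resolved, total):
    return resolved / total

# Within EACH stratum, "patrol" resolves a higher share of incidents.
for stratum, methods in reports.items():
    for method, (ok, n) in methods.items():
        print(f"{stratum:15s} {method:7s} {rate(ok, n):.2f}")

# But naively pooling the strata reverses the conclusion: the software
# computes the aggregate you asked for, whether or not it is meaningful.
pooled = {}
for methods in reports.values():
    for method, (ok, n) in methods.items():
        s, t = pooled.get(method, (0, 0))
        pooled[method] = (s + ok, t + n)

for method, (ok, n) in pooled.items():
    print(f"pooled          {method:7s} {rate(ok, n):.2f}")
```

Here “patrol” has the better resolution rate in both strata, yet the pooled totals favor “remote” — the analyst, not the software, has to know that the stratified comparison is the right one.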
McMaster highlighted the hubris behind thinking that we could fight a war quickly, and that our technology would let us ignore the socio-politics of conflict. The same pressures that led U.S. planners to focus on speed and predictability are also felt in the peacekeeping community. Peacekeeping missions, so the policy thinking goes, should have small footprints, tangible goals, and minimal cost. It is tempting to think that some new technologies and enough statistical analysis can support these goals while sidestepping the difficult political and socio-economic aspects of sustainable peacebuilding. If we keep the focus on the politics, people, and unpredictability of peacekeeping, though, then the tech and the analytics can play their proper supporting role and genuinely help missions be more effective.