Learnings from ISA

Another March, another ISA conference. 2014 has been good, especially since the networking and socializing were matched by excellent feedback on what I presented. The highlights:

What I thought was a failed experiment in getting Twitter to love me actually teased out some interesting methodological challenges that other panelists on the Crowdsourcing Violence panel also faced. The core problem is how to encourage participation from the crowd when there isn't an emergency. Whether it was crowdsourcing via Twitter or crowdseeding with trusted reporters, we all struggled to get participants to respond, which makes crowdsourcing and crowdseeding difficult to use as research methods. It'll be interesting to see how we each tackle this challenge in our different papers and projects, and whether incentives or networks can be tapped to get more consistent participation.

My paper on using crowdsourcing to support peacekeeping operations also got some good feedback. The paper was my attempt to think about technology in the context of peacekeeping operations, rather than making peacekeeping responsive to whatever technology is available (i.e., how do we avoid deploying a technology solution in search of a problem?). I'm going to take this in an institutional analysis direction and focus on interviews with peacekeeping staff and experts, since there is a paucity of documentation on the few crowdsourcing and crowdseeding projects that missions have undertaken.

This was an overall excellent week, with solid panels, fascinating topics and good conversation. If you have thoughts or feedback on my papers, feel free to share in the comments section, or shoot me an email!

Kristof, Columbia, and the ‘Public Intellectual-Professor’: Part 2

Earlier this week I wrote the first half of this pair of posts, focusing on the problems in Nicholas Kristof's piece on why professors should be more engaged in the public debate. I came down pretty hard on it, not because I disagree with the general sentiment (my doctoral research and interests are very policy relevant, and I make an effort to be in the policy space as much as the academic one), but because his logic was surprisingly faulty and he didn't seem to have any understanding of the institutional culture and expectations of academia. In effect, he missed an opportunity to discuss the actual problems facing the academy, and how they prevent professors from being more publicly engaged.

Fortunately, Michelle Goldberg wrote an excellent rejoinder about the plight of two highly respected public intellectual-professors who were let go after long careers at Columbia's Mailman School of Public Health for failing to raise 80% of their salaries in external grants. While Kristof went off on confused, ill-informed tangents about what makes an academic or academic field relevant to the public sphere, Goldberg focused on what Kristof should have been writing about: the corporatization of universities, where pulling in external funding is the difference between having a job and not. Doing ground-breaking research counts for nothing, it seems, unless you hit your fundraising target. I'll take this a step further; the problem facing universities and researchers isn't just the outcome of bad business planning at the university level. It's the politicization of research, and by extension of validity and truth.

To start: I don't have a problem with the idea of encouraging professors in research-oriented fields to seek external funding. Part of my dissertation studies was funded by money my dissertation supervisor pulled in; he was able to hire me as a research assistant when the department didn't have funds immediately available for a stipend. At a much larger scale, pulling in something like a National Science Foundation (or National Institutes of Health) grant can give a department the latitude to fund students (saving money in the core budget), hire post-docs, pay visiting scholars, and generally increase its capacity to do research. This can be a useful model for certain projects, especially in the natural sciences, where the costs of equipment and logistics can run into the millions of dollars. But there is a danger in universities demanding that professors raise significant portions of their salaries through external grants while not maintaining core budgets to keep them on during lean times.

The first danger is the business model. The Mailman School, like many large research schools, relies heavily on government research grants. These days it's not good to be relying on federal research grants, especially if you're in the social and behavioral sciences. For any academic institution that hitched its financial wagon to the soft-money strategy of external funding, politics in Washington is currently delivering a harsh lesson in how the political economy of ridiculous budget battles affects university staffing. But this isn't merely a technical budgeting and forecasting issue; budgets and federal spending don't live in an ontological bubble, disconnected from politics and popular sentiment. This is where Kristof fell off the wagon and Goldberg hit the nail on the head. To quote:

“Kristof is right that universities have become inhospitable places for public intellectuals, but he misses the ultimate cause. The real problem isn’t culture. It’s money.”

That was basically the only thing Kristof got right, which is why it's unfortunate that he dedicated only one short paragraph at the beginning of his article to it. But it's not just money; it's the interplay between Congressional politics and how we proxy the public interest with federal (and state) budgets. Congress is the policy representation of our societal id, and as Kristof notes, there's a strong current of anti-intellectualism in that id these days. This is where the political economy of how we define validity, truth, and the public good comes in, and why we're in such a pickle even though at times we've been leaders in natural and social science research.

Let's start with the obvious: a Senator or House member doesn't get elected by telling their rabidly anti-intellectual constituents that they're wrong or ignorant. They also don't get elected by telling their corporate funders that empirical research based on first principles has shown their business model or industry to be god-awful for the public good. Taken together, this is a solid reason for a member of Congress to be unsupportive of federally funded research, since most of it points out uncomfortable truths about our economic system, global warming, poverty, infrastructure, system of government, etc. But let's assume that a member of Congress really wanted to understand what was going on in all that natural and social science research. How many are trained to properly evaluate scientific research, social or natural?

Thanks to Business Week, we can find out. In the House, we have 1 microbiologist, 2 engineers, and 1 physicist out of 433. In the Senate, there's 1 engineer out of 100. That's a grand total of 5 out of 533 members of the Legislative branch. It's important that we know this, since Senator Tom Coburn passed a bill that effectively gives Congress the ability to pick and choose which research gets funded through the National Science Foundation. Essentially, Coburn politicized the process of scientific and empirical inquiry. Don't like research about homeless people because it shows that your anti-poverty policy prescriptions are fanciful lies? You can cut funding for it. Running for office, and your petroleum industry donors find climate research distasteful? No problem, you can eliminate that NSF funding stream. We have allowed politicians, only 1% of whom might be even remotely qualified to evaluate scientific research, to decide what is worthy of scientific inquiry, even if they have no idea what a p-value or collinearity is.

This is what was so infuriating about Kristof's article: while peddling insulting caricatures of zany academics and their ethereal models and theories, it failed to address the real problems facing academia and universities. Goldberg hit on the problem of funding and what it means for the vibrancy of a research community, but the problem goes further than that. As a nation we've allowed ourselves to be duped into believing that we can be world leaders in research, commerce, and foreign policy, among other things, while simultaneously dismantling and defunding the institutions that for the last 70 years have been key to our success.

Fundamentally, this isn't a problem of university funding structures or of academics not doing their jobs; those are just symptoms. At its core, this is a problem of an American society that has given in to cynicism and handed the reins to politicians who prey on fear and ignorance. The only way out of this slump is to regain our national spirit of inquiry, adventure, and critical thinking, the exact things that made us leaders in research and discovery for much of the last 70 years.

Kristof, Columbia, and the ‘Public Intellectual-Professor’: Part 1

This will be a two-parter, since there's a lot in it. It's been interesting reading Nicholas Kristof's initial article on why professors need to be involved in public debate and seeing the rejoinders, particularly Michelle Goldberg's article about Columbia University's decision to let two of its best public health professors go. I'm a doctoral candidate whose research agenda is a hybrid of political science and public policy, and I haven't yet decided whether I want to go into academia or public policy, so I've found this debate interesting. Starting with Kristof, whom I usually enjoy reading: I agreed with his sentiment at a meta level, but found the article generally ill-informed and at times oddly contradictory. Continue reading

New post on the TechChange blog!

I just had a new post go up on the TechChange blog – I haven’t written for them in a while, so it feels good to be writing for them again!

Here’s a brief intro, and you can read the rest here:

"In recent years, mobile phones have drawn tremendous interest from the conflict management community. Given the successful, high-profile uses of mobile phone-based violence prevention during Kenya's 2010 and 2013 votes, what can the global peacebuilding community learn from Kenya's application of mobile technology to promote peace in other conflict areas around the world? What are the social and political factors that explain why mobile phones can have a positive effect on conflict prevention efforts in general?…"

Nancy Ngo, one of the TechChange staff members, helped get it written, so a big thanks to her for getting it up!

Samoa Post: End of semester observations

So I’ve been in Samoa for a semester now, working with the Ministry of Communications and Information Technology and getting things in order to do dissertation fieldwork. I’ll probably post again before the end of the year, but here are a few key themes that have emerged in conversation as I’ve developed relationships with my counterparts.

Continue reading

Social Network Analysis: A cool analysis of how SNA worked during the American Revolution

Lots of people saw Kieran Healy's humorous and thought-provoking post about how some very basic matrix algebra and centrality analysis can be used to identify people within social networks using basic metadata. This article by Shin Kap Han goes into more depth about centrality and the power of weak ties; I found the analysis of the socio-economic stratifications between the various revolutionary groups, which often caused collective action problems, to be the most fascinating part.

It took me a little while to revisit Healy's post, and this time I decided to check out Han's piece as well; it's definitely worth a look if you're into social and political mobilization. It's a bit quantitative and focuses on the American Revolution, but it also has a lot of observations that are germane to modern political organizing.

Disaggregating Peacekeeping Data: A new dataset on peacekeeping contributions

Jacob Kathman at the University at Buffalo has an article in the current issue of Conflict Management and Peace Science about his new dataset on the numbers and nationalities of all peacekeeper contributions by month since 1990. This is a pretty fantastic undertaking, since peacekeeping data is often difficult to find, and no small feat given how challenging it is not only to code a 100,000+ point dataset, but to do it in a way that complements other datasets like Correlates of War and Uppsala/PRIO. I'm particularly excited about this dataset because it highlights something I've been interested in, and will continue to work on throughout my career: gathering and coding historical data on peacekeeping missions so that social scientists and economists can start producing quantitative research to complement the existing case study-oriented research on peacekeeping operations and practice.
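To make the mechanics concrete, here is a minimal sketch (mission names reused for flavor, but all numbers invented, and pandas assumed as the tool) of why contributor-level monthly records in the shape Kathman describes are so useful: they collapse cleanly to mission-month totals, which is exactly the unit of analysis a conflict-side dataset can be joined on.

```python
import pandas as pd

# Illustrative rows: one record per contributing country, per mission,
# per month (all troop figures are made up for this sketch).
contrib = pd.DataFrame({
    "mission": ["UNAMSIL", "UNAMSIL", "UNAMSIL", "MONUC"],
    "month":   ["2000-01", "2000-01", "2000-02", "2000-01"],
    "country": ["Nigeria", "India", "Nigeria", "Uruguay"],
    "troops":  [3000, 1200, 3100, 800],
})

# Collapse to mission-month totals.
totals = contrib.groupby(["mission", "month"], as_index=False)["troops"].sum()

# A toy conflict-side table keyed the same way (values invented).
conflict = pd.DataFrame({
    "mission": ["UNAMSIL", "MONUC"],
    "month":   ["2000-01", "2000-01"],
    "battle_deaths": [40, 75],
})

# The join that case-study-era data never allowed: deployment size
# and conflict intensity side by side, month by month.
merged = totals.merge(conflict, on=["mission", "month"], how="left")
print(merged)
```

The left join deliberately keeps mission-months with no matching conflict record, since gaps in coverage are themselves informative when evaluating how well two datasets line up.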

As Kathman points out, research on peacekeeping has usually favored case study approaches. This makes sense: most of the research is geared toward identifying lessons learned from mission success and failure, and is meant to be easily integrated into operational behavior rather than addressing theoretical issues. It also reflects the ad hoc nature of peacekeeping: a mission gets a mandate to deal with a specific issue, and missions tend to be short (with some exceptions), so the data tends to be mission- and context-specific, which lends itself to case study approaches. As civil wars became the norm in the 1990s, though, missions expanded their roles to include war fighting, humanitarian aid delivery, medical provision, policing, and other aspects of civil society. Peacekeeping missions thus became part of the political, economic, and social fabric of the post-ceasefire environment, and over the last ten years social scientists have started studying the effects of peacekeeping missions on ceasefire duration and economic development, among other things.

One thing that has been lacking, and that Kathman's dataset helps with, is data about the missions themselves. Studies such as Virginia Page Fortna's excellent book on the effect of peacekeeping missions on ceasefire durability tend to rely on conflict start-stop data to make inferences about the impact of peacekeeping. Studies of peacekeeping and economics run into the same issue; researchers have used the baseline effect of peacekeeping missions on GDP, but this is a blunt-instrument approach and suffers from problems of endogeneity. Caruso et al.'s analysis of the UN mission in South Sudan's positive effect on cereal production treats the UN mission as a monolithic entity, but cannot show comparative impacts on food production across missions, since finer-grained mission data isn't readily available.

Given the need, I would suggest pushing forward with datasets that contain not only troop contributions but also mission expenditures, since peacekeeping missions can have positive effects on the local economy. The problem is that those effects might not be visible without finer-grained data on how missions spend their money in the countries where they operate. Do investments in durable infrastructure make a difference to the durability of peace and economic growth? What about focusing on local provision of goods and services where available? At the moment, data on these things is hard to find, but it would be useful to conflict researchers.

Kathman's paper is worth a read, since he gives us a road map for developing further datasets on peacekeeping missions. More datasets like this are important for theorists who do research in the abstract, but they can also help inform better processes for mission mandating, procurement, and staffing. If you want to download the datasets, Kathman has them as zip files on his website.

Syria Update

Yesterday I mentioned the need to be transparent with our intelligence on chemical weapons use in Syria if we wanted to take the moral high ground. Today I read the release outlining the U.S. intelligence findings on the attack. The Huffington Post linked to it, along with a quote from Secretary of State Kerry that "Its findings are as clear as they are compelling." This is not exactly the kind of unimpeachable proof that galvanizes erstwhile allies while shaming Russia and China into a more placable position. Statecraft: fail.

Getting traction in the United Nations on Syria

As I've been following the story of the chemical weapons attacks in Syria, and the resulting moves to prepare for military strikes, I've felt like the U.N. has been an under-utilized resource for dealing with the crisis. A few friends have mentioned that President Obama's 'red line' could be defined as something other than a military strike, and I would posit that alternative 'red lines' could exist at the United Nations. This would require a rethink of how the U.S. uses diplomacy at the U.N., though.

Continue reading

Matrix Math and Paul Revere

This has been a rather stat-oriented week of posts. I blame this on the fact that political economy and peacekeeping have been dominating my official academic life in the form of a comprehensive exam. The silver lining is that I will soon have political economy and peacekeeping content galore.

To keep everyone entertained this Friday, I wanted to revisit what has become one of my favorite little write-ups on social network analysis, matrix math, and why we should be concerned about how our metadata gets used. Kieran Healy is a professor of sociology at Duke, and he wrote this in response to the revelations about the NSA's PRISM program. He does a great job demonstrating the math in an accessible way, while also elegantly showing why we should be wary of talking points such as "…we only capture metadata, not the content of transmissions…"
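For anyone who wants to poke at the idea themselves, here is a tiny numpy sketch of the trick Healy walks through. The names and memberships below are invented for illustration, not his actual Revolutionary-era data: multiply a person-by-organization membership matrix by its transpose, and pure affiliation metadata becomes a person-to-person network.

```python
import numpy as np

# Invented person-by-organization membership matrix (1 = member).
# Rows are people, columns are organizations.
people = ["Adams", "Revere", "Warren", "Church", "Hancock"]
A = np.array([
    [1, 1, 0, 0],   # Adams
    [1, 1, 1, 1],   # Revere
    [0, 1, 1, 0],   # Warren
    [0, 0, 1, 1],   # Church
    [1, 0, 0, 1],   # Hancock
])

# Person-to-person projection: entry (i, j) counts the organizations
# that persons i and j both belong to -- metadata only, no message content.
P = A @ A.T
np.fill_diagonal(P, 0)  # ignore self-co-membership

# A crude centrality score: total co-memberships with everyone else.
scores = P.sum(axis=1)
for name, s in sorted(zip(people, scores), key=lambda t: -t[1]):
    print(name, int(s))
```

Because the invented "Revere" sits in every organization, he tops the list; that broker role popping out of nothing but membership rosters is exactly why "we only capture metadata" is cold comfort.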

Have a great weekend!