Maxwell School

Institute for Qualitative and Multi-Method Research – June 13-24, 2016

Schedule and Reading List  



[Please note that this page provides details from the 2016 institute. While the 2017 institute is expected to be similar, there will be some revisions.]

There are three types of institute sessions: (1) Unified (whole institute) sessions; (2) research design discussion groups; and (3) elective modules. The unified sessions are on the first Monday (6/13).

The research design discussion groups will be held for two hours on most mornings of the institute. A separate schedule will be available.

There are 31 elective modules, of which participants will select nine. That is, for each of the nine days on which there is a choice, participants will select one of the modules offered as triples (e.g. modules 1, 2 or 3) or as quadruples (e.g. 13, 14, 15 or 16).


Monday, June 13

Unified Sessions: Qualitative and Multi-Method Inquiry - Colin Elman, John Gerring, Jason Seawright, James Mahoney, and Lisa Wedeen

Tuesday, June 14

Module 1, Multimethod Research I - Jason Seawright

Module 2, Comparative Methods in Qualitative Research I - James Mahoney and Tulia Falleti

Module 3, Discourse Analysis - Lisa Wedeen

Wednesday, June 15

Module 4, Multimethod Research II - Jason Seawright

Module 5, Causal Mechanisms, Process Tracing, and Counterfactuals - James Mahoney and Tulia Falleti

Module 6, Ethnographic Methods I - Frederic Schaffer and Timothy Pachirat

Thursday, June 16

Module 7, Natural Experiments I - Thad Dunning and Daniel Hidalgo

Module 8, Process Tracing I - Andrew Bennett, Tasha Fairfield, and David Waldner

Module 9, Ethnographic Methods II - Frederic Schaffer and Timothy Pachirat

Friday, June 17

Module 10, Natural Experiments II - Thad Dunning and Daniel Hidalgo

Module 11, Process Tracing II - Andrew Bennett, Tasha Fairfield, and David Waldner

Module 12, Ethnographic Methods III - Frederic Schaffer and Timothy Pachirat

Monday, June 20

Module 13, Qualitative Data Management - Diana Kapiszewski, Sebastian Karcher, Dessi Kirilova

Module 14, Comparative Methods in Qualitative Research II - Gary Goertz

Module 15, Focus Groups - Jennifer Cyr

Module 16, Counterfactuals - Jack Levy

Tuesday, June 21

Module 17, Content Analysis I - William Lowe and James Lo

Module 18, QCA/fs I - Charles Ragin and Carsten Schneider

Module 19, Designing and Conducting Fieldwork I - Diana Kapiszewski and Lauren MacLean

Wednesday, June 22

Module 20, Content Analysis II - William Lowe and James Lo

Module 21, QCA/fs II - Charles Ragin and Carsten Schneider

Module 22, Archives and Elite Interviews - James Goldgeier, Andrew Moravcsik, Elizabeth Saunders

Module 23, Designing and Conducting Fieldwork II - Diana Kapiszewski and Lauren MacLean

Thursday, June 23

Module 24, Mixed‐method Research and Causal Mechanisms I - Nick Weller and Jeb Barnes

Module 25, CAQDAS (atlas.ti) I - Robert Rubinstein

Module 26, Geographic Information Systems (GIS) I - Jonnell Robinson

Module 27, Interpretation and History I - Thomas Dodman and Daragh Grant

Friday, June 24

Module 28, Mixed‐method Research and Causal Mechanisms II - Nick Weller and Jeb Barnes

Module 29, CAQDAS (atlas.ti) II - Robert Rubinstein

Module 30, Geographic Information Systems (GIS) II - Jonnell Robinson

Module 31, Interpretation and History II - Thomas Dodman and Daragh Grant


Choosing Which Modules to Take

While many of the 31 modules can be taken as stand‐alone units, there are some limitations on selections.

Modules with higher-numbered suffixes (e.g. Content Analysis II) can usually only be taken with the first module in the sequence (e.g. Content Analysis I). [That is, while it is often fine to take I and not II in a sequence, it is usually not possible to take II and not I.] The exception to this rule is Module 14, Comparative Methods in Qualitative Research II. (It is also possible to take Module 30, Geographic Information Systems (GIS) II, without Module 26, Geographic Information Systems (GIS) I, but only if you already have some familiarity with GIS.)

Modules 6, 9, and 12 (Ethnographic Methods I, II, and III) should be considered as a single unit and accordingly can only be selected together (i.e. participants cannot take only Ethnographic Methods I, or I and II). Modules 8 and 11 (Process Tracing I and II) should also be considered as a single unit, and can only be selected together (participants may not take only Process Tracing I).

Apart from these formal limitations, we should also note that there are several modules which follow in a natural sequence and/or lend themselves to being taken as a group. For the avoidance of doubt, we outline these informal sequences simply to help you navigate the table above. Beyond the two limitations we mention above, you may take whichever modules you would find most helpful.


Modules 1 and 4 (Multimethod Research I and II), Modules 7 and 10 (Natural Experiments I and II), and Modules 24 and 28 (Mixed‐method Research and Causal Mechanisms I and II).

Modules 2 and 5 (Comparative Methods in Qualitative Research I, and Causal Mechanisms, Process Tracing, and Counterfactuals), Module 14 (Comparative Methods in Qualitative Research II), and Modules 18 and 21 (QCA/fs I and II).

Module 3 (Discourse Analysis), Modules 6, 9, and 12 (Ethnographic Methods I, II, and III), and Modules 27 and 31 (Interpretation and History).

Module 13 (Qualitative Data Management) or Module 15 (Focus Groups), and Modules 19 and 23 (Designing and Conducting Fieldwork I and II).

Books to Purchase or Otherwise Obtain

The reading for some unified sessions and modules includes a book or books that must be purchased, or borrowed from your university library [please note that they are unlikely to be available at the Syracuse University bookstore or library]. You will also see that there is some overlap: some books are used in more than one module.

Manuscripts in Press or in Progress

To the extent possible, IQMR uses the most up‐to‐date readings on the methods covered at the institute. One consequence is that we are often using manuscripts that are either in press or in progress. Please note that the authors are allowing us to use these materials as a courtesy. As with all IQMR materials, they are made available for current attendees’ use only.

Monday, June 13 Unified Sessions, Colin Elman, John Gerring, Jason Seawright, James Mahoney, Lisa Wedeen

U1 8:30am – 9:30am – Introduction

Colin Elman, Syracuse University

U2 9:30am‐10:30am A Criterial Framework for Social Science Methodology

John Gerring, Boston University

  • U.2.1. John Gerring, Social Science Methodology: A Unified Framework, 2nd edition (2012), chapters 1, 13, 14, and postscript.

10:30am ‐ 11:00 am Coffee Break

U3 11:00am – 12:00pm Statistical/multi‐method Approaches

Jason Seawright, Northwestern University

  • U.3.1. David A. Freedman, “On Types of Scientific Enquiry: The Role of Qualitative Reasoning.” In Janet Box‐Steffensmeier, Henry Brady, and David Collier, eds., Oxford Handbook of Political Methodology (Oxford University Press, 2008), pp. 221‐236.
  • U.3.2. Evan Lieberman, “Nested Analysis as a Mixed‐Method Strategy for Comparative Research,” American Political Science Review 99 (August 2005): 435‐452.

12:00pm‐2:00pm Lunch

U4 2:00pm ‐ 3:00pm Logic and Qualitative Methods

James Mahoney, Northwestern University

This session introduces the idea that logic and set theory constitute one important set of tools
used in qualitative research.

  • U.4.1. James Mahoney, “After KKV: The New Methodology of Qualitative Research,” World
    Politics 62:1 (January 2010), pp. 120‐147.
  • U. 4. 2. James Mahoney and Rachel Sweet Vanderpoel, “Set Diagrams and Qualitative
    Research,” Comparative Political Studies 48:1 (January 2015), pp. 65‐100.

3:00pm – 3:30pm Coffee Break

U5 3:30‐4:30 The Interpretive Approach to Qualitative Research

Lisa Wedeen, University of Chicago

  • U.5.1. Clifford Geertz, "Thick Description: Toward an Interpretive Theory of Culture" in The Interpretation of Cultures (Basic Books, 1973), pp. 3‐30
  • U.5.2. Clifford Geertz, "Deep Play: Notes on the Balinese Cockfight" in The Interpretation of Cultures (Basic Books, 1973), pp. 412‐453
  • U.5.3. Michel Foucault, "The Body of the Condemned" in Discipline and Punish: The Birth of the Prison (Vintage Books, 1995). (That's the second edition; the 1979 first edition is fine too), pp. 3‐31.
  • U.5.4. Michel Foucault, “Questions of Method.” In Graham Burchell, Colin Gordon, and Peter Miller, eds., The Foucault Effect: Studies in Governmentality (University of Chicago Press, 1991), pp. 73‐86

U6 4:30 ‐ 5:15pm Roundtable on “How Do We Bring All of this Together?” The Implications of Multiple Approaches to Qualitative and Multi‐Method Research

Lisa Wedeen, James Mahoney, Jason Seawright, Colin Elman


Tuesday, June 14 Module 1 – Multimethod Research I, Jason Seawright

Combining regression and case studies for the purpose of improving causal inference has long been the central focus of research on multi‐method design. In this module, we will carefully analyze this topic, developing appropriate qualitative and quantitative design components as well as case selection strategies to facilitate improved causal inference.

8:45am ‐ 10:15am Regression, Case Studies, and Causal Inference

Multi‐method research design is built around the idea that combining qualitative and quantitative research activities can produce stronger causal inferences than either mode of research can produce alone. To develop this idea for designs that combine regression and case studies, we need a clear idea of exactly what each kind of method actually contributes to causal inference.

  • 1.1.1. Jason Seawright, Multi‐Method Social Science: Combining Qualitative and Quantitative Tools, (Cambridge University Press, 2016), Chapter 2.
  • 1.1.2. Stephen L. Morgan and Christopher Winship, Counterfactuals and Causal Inference: Methods and Principles for Social Research (Cambridge University Press, 2014), Chapter 6.

Recommended:

  • 1.1.3. Andrew Bennett and Jeffrey Checkel, Process Tracing: From Metaphor to Analytic Tool (Cambridge University Press, 2015), Chapters 1, 5, 8, 10, and Appendix.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Combining Regression and Case Studies

Several of the critical assumptions for regression‐based causal inference can be at least partially tested with qualitative design components. We will discuss tests for confounding, for measurement problems, and for the existence of theoretically crucial causal pathways.

  • 1.2.1. Jason Seawright, Multi‐Method Social Science: Combining Qualitative and Quantitative Tools (Cambridge University Press, 2016), Chapter 3.
  • 1.2.2 Lieberman (2005). ``Nested Analysis as a Mixed‐Method Strategy for Comparative Research.'' American Political Science Review: 435‐452.

Recommended:

  • 1.2.3. Small, Mario Luis, 2011. "How to Conduct a Mixed Methods Study: Recent Trends in a Rapidly Growing Literature." Annual Review of Sociology 37: 57‐86.
  • 1.2.4. Weller, Nicholas and Jeb Barnes, 2014. Finding Pathways: Mixed‐Method Research for Studying Causal Mechanisms. Chapter 7.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Multimethod Case Selection

A key hurdle in multi‐method research is figuring out how to make sure that the qualitative and quantitative design components are strongly interrelated. Statistically informed case selection can help establish this connection, and the right choices about case selection can increase the chances of causally relevant discoveries within the qualitative analysis.

  • 1.3.1. Jason Seawright, Multi‐Method Social Science: Combining Qualitative and Quantitative Tools (Cambridge University Press, 2016), Chapter 4.
  • 1.3.2. Seawright and Gerring, 2008, "Case Selection Techniques in Case Study Research: A Menu of Qualitative and Quantitative Options." Political Research Quarterly 61 (June): 294‐308.

Recommended:

  • 1.3.3. Herron, Michael C., Kevin M. Quinn. 2014. “A Careful Look at Modern Qualitative
    Case Selection Methods.” Sociological Methods & Research (October)
  • 1.3.4. Dan Slater and Daniel Ziblatt. 2013. ``The Enduring Indispensability of the Controlled
    Comparison.'' Comparative Political Studies 46 (Oct.): 1301‐27.

Tuesday, June 14 Module 2 – Comparative Methods in Qualitative Research I, James Mahoney and Tulia Falleti

This module introduces core concepts and methods for comparative and case‐study research in the qualitative tradition. The first session explores how comparative and historical researchers use small‐N comparisons and sequential logic to assess hypotheses. The second session discusses how the distinctive strengths of comparative‐historical analysis derive from its core defining orientations and permit the use of analytic tools such as novel concept formation, critical junctures, and path dependence. Finally, the third session introduces the comparative sequential method, its main components, and its applications in the historical institutionalist research tradition.

8:45am ‐ 10:15am – Methods of Comparative‐Historical Analysis (Session led by James Mahoney)

This session focuses on tools of cross‐case and sequential causal inference in the field of comparative‐historical analysis. Both classic methods (e.g., Mill’s methods) and more contemporary methods (e.g., the method of sequence elaboration) are discussed.

  • 2.1.1. James Mahoney, “Strategies of Causal Inference in Small‐N Analysis,” Sociological Methods and Research 28:4 (May 2000), pp. 387‐424 (focus especially on the sections on cross‐case analysis).
  • 2.1.2 James Mahoney, Erin Kimball, and Kendra Koivu, “The Logic of Historical Explanation in the Social Sciences,” Comparative Political Studies 42:1 (January 2009), pp. 114‐146.

Recommended:
  • 2.1.3 Matthew Lange, Comparative‐Historical Methods (London: Sage, 2013).
  • 2.1.4 James Mahoney and Kathleen Thelen, eds., Advances in Comparative‐Historical Analysis (Cambridge: Cambridge University Press, 2015).

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm – Analytical Tools for Comparative‐Historical Analysis (Session led by James Mahoney)

This session has two parts. The first part explores three orientations associated with comparative‐historical analysis: macro‐configurational explanation; case‐based research; and temporally‐oriented analysis. The second part focuses on the meanings, uses, and logic of critical juncture analysis, path dependence, and institutional change in comparative‐historical analysis.

  • 2.2.1. Kathleen Thelen and James Mahoney, “Comparative‐Historical Analysis in Contemporary Political Science,” in James Mahoney and Kathleen Thelen, eds., Advances in Comparative‐Historical Analysis. New York: Cambridge University Press, 2015, 3‐36. (Book
    for purchase.)
  • 2.2.2 James Mahoney, Khairunnisa Mohamedali, and Christoph Nguyen, “Causality and Time in Historical Institutionalism,” in Orfeo Fioretos, Tulia G. Falleti, and Adam Sheingate, eds., The Oxford Handbook of Historical Institutionalism (Oxford: Oxford University Press, 2016), pp. 71‐88.

Recommended:

  • 2.2.3 Paul Pierson, Politics in Time: History, Institutions, and Social Analysis (Princeton: Princeton University Press, 2004).
  • 2.2.4 James Mahoney and Kathleen Thelen, eds., Explaining Institutional Change: Ambiguity, Agency, and Power (New York: Cambridge University Press, 2010).

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm ‐ The Comparative Sequential Method and Historical Institutionalism (Session led by Tulia Falleti)

This session also has two parts. In the first part, we introduce the main components of the Comparative Sequential Method (occurrences, events, sequences, and processes), and provide definitions and examples. In the second part, we analyze the types of comparative sequential arguments that have been advanced in the historical institutionalist research tradition in political science. In studying these arguments, we will focus on issues of causality and the temporal effects that result from the ordering and pacing of events within sequences.

  • 2.3.1. Falleti, Tulia G., and James Mahoney. 2015. "The Comparative Sequential Method." In Advances in Comparative‐Historical Analysis, ed. J. Mahoney and K. Thelen. Cambridge, UK: Cambridge University Press, 211‐39. (Book for purchase.)
  • 2.3.2. Fioretos, Orfeo, Tulia G. Falleti, and Adam Sheingate. 2016. "Historical Institutionalism in Political Science." In The Oxford Handbook of Historical Institutionalism, ed. O. Fioretos, T.G. Falleti and A. Sheingate. Oxford, UK: Oxford University Press, 3‐28.

Recommended:

  • 2.3.3. Hall, Peter A. 2016. "Politics as a Process Structured in Space and Time." In The Oxford Handbook of Historical Institutionalism, ed. O. Fioretos, T. G. Falleti and A. Sheingate. Oxford, UK: Oxford University Press, 31‐50.
  • 2.3.4. Pierson, Paul. 2016. "Power in Historical Institutionalism." In The Oxford Handbook of Historical Institutionalism, ed. O. Fioretos, T. G. Falleti and A. Sheingate. Oxford, UK: Oxford University Press, 124‐41.

Tuesday, June 14 Module 3 – Discourse Analysis, Lisa Wedeen

This module provides students with an introduction to three different modes of discourse analysis. Participants will learn to "read" texts while becoming familiar with contemporary thinking about interpretation, narrative, and social construction. In these three sessions we shall explore the following methods: Foucault’s “interpretive analytics”; Wittgenstein’s understanding of language as activity and its relevance to ordinary language‐use analysis (including theories of “performativity”); and an analysis of the rhetoric of cinema.

8:45am ‐ 10:15am Wittgenstein and Ordinary Language‐Use Analysis

Lisa Wedeen, University of Chicago

This session introduces participants to Ludwig Wittgenstein’s thought and its relationship to ordinary language‐use methods. We shall focus on several key ways in which Wittgensteinian‐inspired methods can be used in ethnographic and analytical research. Among the questions we shall ask are: What is the “value added” of concentrating on language? Why is understanding language as an activity important? How can social scientists grapple with vexed issues of intention? What does “performative” mean, and how do political theories about language as performative differ from discussions of performance? How can social scientists uninterested in taking on new jargon use this kind of political theory to further their theoretical and empirical work?

  • 3.1.1. Hanna Fenichel Pitkin, “Justice, Socrates and Thrasymachus” in Wittgenstein and Justice: On the Significance of Ludwig Wittgenstein for Social and Political Thought (University of California Press, 1972), pp. 169‐192.
  • 3.1.2. Lisa Wedeen, Peripheral Visions: Publics, Power, and Performance in Yemen (University of Chicago Press, 2008), Chapter 2, chapter 3, and conclusion. (Book to purchase)
  • 3.1.3. Ludwig Wittgenstein, The Philosophical Investigations, G. E. M. Anscombe, trans. (Blackwell Publishers, 2001), Paragraphs 1‐33; paragraph 154; pages 194‐195.

10:15‐10:45am Coffee Break

10:45am‐12.30pm Research Design Discussion Sessions

12:30pm – 2:00pm Lunch

2:00pm – 3:30pm Foucauldian Discourse Analysis
Lisa Wedeen, University of Chicago

This session introduces participants to the techniques of Foucauldian discourse analysis, or “interpretive analytics.” Participants will learn how to conduct a discourse analysis, what the underlying assumptions of such an analysis are, and how these techniques can be used to advance political inquiry. The session will consider both the power and limitations of the method, the ways in which it differs from other modes of interpretation, and its advantages over content analysis.

  • 3.2.1. Michel Foucault, “Nietzsche, Genealogy, History” in Language, Counter‐Memory, Practice: Selected Essays and Interviews, Donald F. Bouchard, ed., Donald F. Bouchard and Sherry Simon, trans. (Cornell University Press, 1977), pp. 139‐164.
  • 3.2.2. Michel Foucault, The History of Sexuality, Vol. 1, Robert Hurley, trans. (Vintage Books, 1990), pp. 1‐35 and pp. 92‐114.
  • 3.2.3 Revisit King, Keohane, and Verba’s Designing Social Inquiry and bring this text to class. Gary King, Robert O. Keohane, and Sidney Verba, Designing Social Inquiry: Scientific Inference in Qualitative Research (Princeton University Press, 1994).

Recommended:

  • 3.2.4. Hubert L. Dreyfus and Paul Rabinow, Michel Foucault: Beyond Structuralism and Hermeneutics (University of Chicago Press, 1983), Part Two.

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm Rhetorics of the Visual

In a media environment that has increasingly relied on motion‐picture narratives for our sense of historical reality, both the proximate past and the distant, it is incumbent on contemporary scholarship, not least in the social sciences, to cultivate the requisite analytic skills for making sense of what we see. Films, advertisements, and other visual media can be used as evidence in the social sciences. But what kind of evidence? What theories of language, the gaze, the senses, and affect undergird what we “see” and the “we” that sees? And how do we grapple with vexed issues such as intention and audience reception? Drawing from Roland Barthes' influential Mythologies, as well as from works central to cinema and media studies (including two from political science), this session will give students basic skills for “reading” visual material and justifying its use in political science.

  • 3.3.1. Roland Barthes, Mythologies. READ the following THREE SHORT (2‐page) essays: "In the Ring," "Saponids and Detergents," and "Toys," PLUS Part Two: “Myth Today.”
  • 3.3.2. Laura Mulvey, “Visual Pleasure and Narrative Cinema,” in Visual and Other Pleasures. (In the second edition, pp. 14‐27.)
  • 3.3.3. Michael Paul Rogin, "Kiss Me Deadly: Communism, Motherhood, and Cold War Movies," in Ronald Reagan, The Movie and Other Episodes in Political Demonology (Berkeley, CA: University of California Press), pp. 236‐71.
  • 3.3.4. Michael Rogin, "'Make My Day!': Spectacle as Amnesia in Imperial Politics," Representations 29 (Winter 1990): 99‐123.

Wednesday, June 15 Module 4 – Multimethod Research II, Jason Seawright

While the regression‐type causal inferences discussed in the first module remain widespread in the social sciences, more advanced tools for causal inference and for theory‐building are increasingly mainstream. This module extends ideas about multi‐method research to the contexts of natural and laboratory experiments, and discusses multi‐method designs for discovering and testing theories about complex causation.

8:45am ‐ 10:15am Multimethod Natural Experiments

The popularity of natural experiments in the social sciences has grown alongside that of multimethod designs. While natural experiments are attractive because they rely on different assumptions than regression‐type inferences, they still require strong assumptions. We will explore qualitative design components for testing the key assumptions in classic natural experiments, instrumental‐variables natural experiments, and regression discontinuity designs.

  • 4.1.1. Jason Seawright, Multi‐Method Social Science: Combining Qualitative and Quantitative Tools (Cambridge University Press, 2016), Chapter 6.
  • 4.1.2. Bennett and Checkel 2015: Chapter 8.

Recommended:

  • 4.1.3. Jeremy Ferwerda and Nicholas L. Miller, 2014, ``Political Devolution and Resistance to Foreign Rule: A Natural Experiment.'' American Political Science Review 108 (Aug.): 642‐60.
  • 4.1.4. Matthew Adam Kocher and Nuno P. Monteiro, 2015. "What's in a Line? Natural Experiments and the Line of Demarcation in WWII Occupied
    France." http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2555716

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Multimethod Experiments

True randomized experiments require the fewest and simplest assumptions of any quantitative tool for causal inference, but they still make important assumptions. We will discuss case study designs to test SUTVA and experimental realism. We will also discuss designs using experiments or other quantitative tools as steps in a process‐tracing argument.

  • 4.2.1. Jason Seawright, Multi‐Method Social Science: Combining Qualitative and Quantitative Tools (Cambridge University Press, 2016), Chapters 7‐8.

Recommended:

  • 4.2.2 Elizabeth Levy Paluck. 2010. ``The Promising Integration of Qualitative Methods and Field Experiments.'' The Annals of the American Academy of Political and Social Science 628 (March): 59‐71.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Complexity, Theory Testing, and Theory Building

To date, most work on multi‐method research has emphasized relatively simple causal questions about average treatment effects. Yet more complex causal models are common in the social sciences. We will explore multi‐method strategies for discovering and testing complex theories (with multiple outcomes, interactivity, and/or nonlinearity).

  • 4.3.1. Dafoe, Caughey, and Seawright. Forthcoming. ``Global Tests of Complex Hypotheses: A Nonparametric Framework for Testing Elaborate Theories.'' Journal of Politics.
  • 4.3.2. Breiman, 2001. "Random Forests." Machine Learning 45 (1): 5‐32.

Recommended:

  • 4.3.3. Ziona Austrian. 2000. ``Cluster Case Studies: The Marriage of Quantitative and Qualitative Information for Action.'' Economic Development Quarterly 14 (Feb.): 97‐110.
  • 4.3.4. Carsten Q. Schneider and Ingo Rohlfing. 2013. ``Combining QCA and Process Tracing in Set‐Theoretic Multi‐Method Research.'' Sociological Methods & Research 42 (Nov.): 559‐597.

Wednesday, June 15 Module 5 – Causal Mechanisms, Process Tracing, and Counterfactuals, James Mahoney and Tulia Falleti

This module is divided into sections that focus on three specific core tools and orientations in qualitative research. The first is causal mechanisms, and here attention centers on the definition, uses, and concrete applications of causal mechanisms. The second section focuses on process tracing. We discuss both inductive and deductive uses of process tracing, and we provide many illustrations of this method. Finally, we consider the uses and place of counterfactual analysis in case study and historical research.

8:45am ‐ 10:15am ‐ Causal Mechanisms (Session led by Tulia Falleti)

What are causal mechanisms? Are they different from variables? How do causal mechanisms fit into the recent political science debate on causal inference? This session will address these questions by providing conceptual definitions and working through concrete examples from the political science literature.

  • 5.1.1. Hall, Peter A. 2003. "Aligning Ontology and Methodology in Comparative Politics." In Comparative Historical Analysis in the Social Sciences, ed. J. Mahoney and D. Rueschemeyer. New York: Cambridge UP, 373‐404.
  • 5.1.2. Falleti, Tulia G., and Julia Lynch. 2008. "From Process to Mechanism: Varieties of Disaggregation." Qualitative Sociology 31 (3), 333‐9.

Recommended:

  • 5.1.3. Mahoney, James. 2001. "Beyond Correlational Analysis: Recent Innovations in Theory and Method." Sociological Forum 16 (3), 575‐93.
  • 5.1.4. Falleti, Tulia G., and Julia Lynch. 2009. "Context and Causal Mechanisms in Political Analysis" Comparative Political Studies 42 (9), 1143‐66.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm ‐ Process Tracing (Session led by Tulia Falleti)

This session will focus on the method of process tracing. It will trace the method's origins in the fields of cognitive psychology and comparative historical sociology in order to arrive at the different conceptualizations and uses of the method as developed over the last two decades in political science. We will distinguish between inductive and deductive process tracing (focusing on the former), on the one hand, and between intensive and extensive process tracing, on the other hand.

  • 5.2.1. Bennett, Andrew, and Jeffrey T. Checkel. 2015. "Process tracing: from philosophical roots to best practices." In Process Tracing. From Metaphor to Analytic Tool, ed. A. Bennett and J. T. Checkel. Cambridge, U.K.: Cambridge University Press.
  • 5.2.2 Falleti, Tulia G. 2016. "Process tracing of extensive and intensive processes." New Political Economy, 1‐8.

Recommended:

  • 5.2.3. George, Alexander L., and Andrew Bennett. 2005. Case Studies and Theory Development in the Social Sciences. Cambridge, MA and London, England: MIT Press, Chapter 10: “Process‐Tracing and Historical Explanation,” pp. 205‐232.
  • 5.2.4 Hall, Peter A. 2006. "Systematic Process Analysis: When and How to Use It." European Management Review 3, 24‐31.
  • 5.2.5. Hall, Peter A. 2013. "Tracing the Progress of Process Tracing." European Political Science 12, 20‐30.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm ‐ Process‐Tracing Tests and Counterfactual Analysis (Session led by James Mahoney)

This session will focus on two sets of tools of causal analysis in qualitative research: process‐tracing tests and counterfactual analysis. Both of these tools are developed from the perspective of logic and set theory.

  • 5.3.1. David Collier, “Understanding Process Tracing.” PS: Political Science and Politics 44, No. 4 (October 2011), 823–30.
  • 5.3.2. James Mahoney, “The Logic of Process Tracing Tests in the Social Sciences,” Sociological Methods and Research 41:4 (November 2012), 570‐597.
  • 5.3.3. Levy, Jack S. 2008. “Counterfactuals and Case Studies.” In The Oxford Handbook of Political Methodology, edited by Janet M. Box‐Steffensmeier, Henry E. Brady, and David Collier. Oxford: Oxford University Press, 627‐644.

Wednesday, June 15 Module 6 – Ethnographic Methods I, Frederic Schaffer and Timothy Pachirat

How does sustained attention to meaning making in the research world contribute to the study of politics? What are the promises, and perils, of social research that invites the unruly minutiae of lived experience and conceptual lifeworlds to converse with, and contest, abstract disciplinary
theories and categories? In this practice‐intensive short course, we explore two ethnographic methods ‐ participant observation and ordinary language interviewing ‐ with specific attention to their potential to subvert, generate, and extend understandings of politics and power.

8:45am ‐ 10:15am Introductions

Part A: Introduction to Ethnography [Pachirat]

This part of the session explores the promises and pitfalls of ethnographic approaches to the
political.

  • 6.1.1. Clifford Geertz, "Thick Description: Toward an Interpretive Theory of Culture" in The Interpretation of Cultures (Basic Books, 1973), 3‐30.
  • 6.1.2. Bent Flyvbjerg, “The Power of Example,” in Making Social Science Matter: Why Social Inquiry Fails and How it Can Succeed Again, Steven Sampson, trans. (Cambridge University Press, 2001), 66‐87.
  • 6.1.3. Edward Schatz, “Ethnographic Immersion and the Study of Politics” and “What Kind(s) of Ethnography does Political Science Need?” In Edward Schatz, ed., Political Ethnography: What Immersion Contributes to the Study of Power (University of Chicago Press, 2009).
  • 6.1.4. Timothy Pachirat, "The Political in Political Ethnography: Dispatches from the Kill Floor." In Edward Schatz, ed., Political Ethnography: What Immersion Contributes to the Study of Power (University of Chicago Press, 2009), 143‐161.

Part B: Introduction to Ordinary Language Interviewing [Schaffer]

Ordinary language interviewing is a tool for uncovering the meaning of words in everyday talk. By studying the meaning of words (in English or other languages), the promise is to gain insight into the various social realities these words name, evoke, or realize. This part of the session covers some basic questions about ordinary language interviewing: what it is, what can be discovered through it, and how it is similar to and different from other types of ethnographic interviewing.

  • 6.1.5. Barbara Sherman Heyl, “Ethnographic Interviewing.” In Paul Atkinson, Amanda Coffey, Sara Delamont, John Lofland and Lyn Lofland, eds., Handbook of Ethnography (Sage, 2001), pp. 369‐383.
  • 6.1.6. Frederic Charles Schaffer, Elucidating Social Science Concepts: An Interpretivist Guide (Routledge, 2016). Read the entire book, but pay special attention to pp. 1‐64 and 89‐98. [Book to purchase]
  • 6.1.7. Frederic Charles Schaffer, “Thin Descriptions: The Limits of Survey Research on the Meaning of Democracy.” Polity 46(3) (2014): 303‐330.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

Session 2 (1:40 ‐ 3:30) How to Do an Ordinary Language Interview [Schaffer]

In this session participants will learn how to conduct an ordinary language interview, and practice doing one focusing on words of their own choosing. Participants will also learn and practice different strategies for approaching people to interview. By this time, participants will have selected the sites in which they will do their field exercises. Participants will work with their fieldsite groups during this session’s exercises and in the short course’s subsequent exercises.

Session 3 (3:40 ‐ 6:00) Ordinary Language Interviewing Field Exercise and Write‐Up [Schaffer]

Participants will go to fieldsites (around campus or at the Carousel Center Mall) to conduct ordinary language interviews. They will then write up their main findings.


Thursday, June 16 Module 7, Natural Experiments I, Thad Dunning and Daniel Hidalgo

8:45am ‐ 10:15am Introduction to Natural Experiments

Thad Dunning, University of California, Berkeley and
Daniel Hidalgo, Massachusetts Institute of Technology

What are natural experiments? We introduce the concept of natural experiments and discuss their strengths and limitations through a survey of recent examples from political science and economics.

  • 7.1.1. Thad Dunning, Natural Experiments in the Social Sciences: A Design‐Based Approach (Cambridge University Press, 2012), Chapters 1‐4. (Book to purchase)
  • 7.1.2. Rafael Di Tella, Sebastian Galiani, and Ernesto Schargrodsky, “The Formation of Beliefs: Evidence from the Allocation of Land Titles to Squatters,” Quarterly Journal of Economics 122(1) (February 2007): 209–241.
  • 7.1.3. Daniel Posner, “The Political Salience of Cultural Difference: Why Chewas and Tumbukas are Allies in Zambia and Adversaries in Malawi,” American Political Science Review 98(4) (November 2004): 529‐545.
  • 7.1.4. David Clingingsmith, Asim Ijaz Khwaja, and Michael Kremer, “Estimating the Impact of the Hajj: Religion and Tolerance in Islam's Global Gathering,” Quarterly Journal of Economics 124(3) (August 2009): 1133‐1170.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm – 3:30pm Natural Experiments: Quantitative Methods

Thad Dunning, University of California, Berkeley and
Daniel Hidalgo, Massachusetts Institute of Technology

We discuss the role of statistical models in the analysis of natural experiments and provide an overview of quantitative techniques suitable for estimating causal effects. We emphasize the advantages of simplicity and transparency in the quantitative analysis of natural experiments.

  • 7.2.1. Bjorn Tyrefors Hinnerich and Per Pettersson‐Lidbom, “Democracy, Redistribution, and Political Participation: Evidence from Sweden 1919‐1938.” Econometrica 82(3) (May, 2014).
  • 7.2.2. Thad Dunning, Natural Experiments in the Social Sciences: A Design‐Based Approach (Cambridge University Press, 2012), Chapters 5‐6. (Book to purchase)
  • 7.2.3. F. Daniel Hidalgo, Julio Canello, and Renato Lima de Oliveira. “Can Politicians Police Themselves? Natural Experimental Evidence from Brazil’s Audit Courts.” Read pre‐analysis plan and working paper.
  • 7.2.4. Thad Dunning, Felipe Monestier, Rafael Piñeiro, Fernando Rosenblatt, and Guadalupe Tuñón. “Is Paying Taxes Habit Forming? Experimental Evidence from Uruguay.” Read pre‐analysis plan(s) and working paper.
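
The session's emphasis on simplicity and transparency can be made concrete with a minimal sketch (not part of the assigned readings; the data and variable names are hypothetical): under as‐if random assignment, the difference in mean outcomes between treated and control units is a natural, fully transparent estimator of the average causal effect.

```python
# Minimal illustration: difference-in-means estimation for a natural
# experiment under as-if random assignment. Data are hypothetical.
import statistics


def difference_in_means(outcomes, treated):
    """Estimate the average causal effect as the simple difference in
    mean outcomes between treated and control units."""
    t = [y for y, d in zip(outcomes, treated) if d]
    c = [y for y, d in zip(outcomes, treated) if not d]
    return statistics.mean(t) - statistics.mean(c)


# Hypothetical outcomes for six units assigned as-if at random.
y = [3.0, 2.5, 4.0, 1.5, 2.0, 3.5]
d = [1, 0, 1, 0, 0, 1]

print(difference_in_means(y, d))  # 3.5 - 2.0 = 1.5
```

The point of the sketch is that no model specification is required: the estimator can be read directly off the design.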

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Natural Experiments: Qualitative Methods

We highlight the essential role of qualitative methods in the analysis of natural experiments. We present examples that illustrate how qualitative evidence can bolster the credibility of causal assumptions and aid in the interpretation of quantitative results.

  • 7.3.1. Thad Dunning, Natural Experiments in the Social Sciences: A Design‐Based Approach (Cambridge University Press, 2012), Chapter 7. (Book to purchase)
  • 7.3.2. Jeremy Ferwerda and Nicholas Miller, “Political Devolution and Resistance to Foreign Rule: A Natural Experiment,” American Political Science Review, forthcoming
  • 7.3.3. Matthew Kocher and Nuno Monteiro, “What’s in a Line? Natural Experiments and the Line of Demarcation in WWII Occupied France.” Manuscript, 2015.

Thursday, June 16 Module 8, Process Tracing I, Andrew Bennett, Tasha Fairfield, and David Waldner

This module will introduce the state‐of‐the‐art on process tracing methodology, with an emphasis on both theory and practice. We will elaborate the basic ideas behind process tracing as a method for causal inference in qualitative case‐study research. We will discuss best practices and examples of good process tracing, drawing on research in international relations,
comparative politics, and other subfields.

The module will include online worksheets and group exercises designed to help participants learn and apply current process‐tracing techniques. Participants should bring a laptop to access online worksheets during the sessions.

8:45am ‐ 10:15am Introduction to Process Tracing

Andrew Bennett, Georgetown University
Tasha Fairfield, London School of Economics

This session provides a general introduction to process tracing and practical research applications. We will also present some initial online exercises to explore how scholars reason when working with qualitative case evidence (please be sure to bring a laptop).

  • 8.1.1. Andrew Bennett, and Jeffrey Checkel, eds., Process Tracing in the Social Sciences (Cambridge University Press, 2014), Chapter 1 (Book for purchase)

10:15am ‐ 10:45am Coffee Break

10:45am ‐ 12:30pm Research Design Discussion Sessions

12:30pm – 2:00pm Lunch

2:00pm – 3:30pm Process Tracing Theory and Standards

Tasha Fairfield, London School of Economics
David Waldner, University of Virginia

Participants will break into groups that will explore various aspects of methodological theory and best‐practice recommendations for process tracing.

  • 8.2.1. David Waldner, “What Makes Process Tracing Good: Causal Mechanisms, Causal Inference, and the Completeness Standard in Comparative Politics,” in Andrew Bennett and Jeffrey T. Checkel, eds., Process Tracing: From Metaphor to Analytic Tool (Cambridge
    University Press, 2014): 126‐152. (Book for purchase)
  • 8.2.2. John Owen, “How Liberalism Produces Democratic Peace,” International Security 19 (Autumn 1994): 87‐125.
  • 8.2.3. Tasha Fairfield, “Going Where the Money Is: Strategies for Taxing Economic Elites in Unequal Democracies,” World Development 47 (2013). Skim pp. 42‐45; read the Chilean cases, pp. 47‐49.

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm Process Tracing Exercises

Andrew Bennett, Georgetown University
Tasha Fairfield, London School of Economics

We will work in groups on practical process‐tracing exercises. Participants will receive a short homework exercise to complete before the Friday morning session.

  • 8.3.1. David Collier, “Understanding Process Tracing,” PS: Political Science and Politics 44(4) (October 2011): 823‐830, and associated exercises (online at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1944646)

Thursday, June 16 Module 9, Ethnographic Methods II, Frederic Schaffer and Timothy Pachirat

Session 4 (8:45 ‐10:15) Ordinary Language Interview Debriefing [Schaffer]

First we will discuss the challenges participants encountered in approaching people to interview, conducting ordinary language interviews, and writing up results. Next we will catalogue the different word uses/meanings that participants discovered in doing their fieldsite interviews.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

Session 5 (2:00 ‐ 3:30) Ethics and Praxis in Participant Observation [Pachirat]

An exploration of the practice of participant observation, with special emphasis on jottings, fieldnote writing, and the ethics of fieldwork.

  • 9.2.1. Robert M. Emerson, Rachel I. Fretz, and Linda L. Shaw, Writing Ethnographic Fieldnotes (University of Chicago Press, 1995), Chapters 1‐5. [Book to purchase]

Session 6 (3:40 ‐ 5:30) Participant Observation Fieldwork Exercise [Pachirat]

In their fieldsite groups, participants will conduct participant‐observation exercises in preselected sites.

Session 7 (5:30 ‐ 7:30) Fieldnote Writing

Participants will use this time to write up a set of fieldnotes based on jottings taken in their fieldsites.

Friday, June 17 Module 10, Natural Experiments II, Thad Dunning and Daniel Hidalgo

8:45am ‐ 10:15am Evaluating Natural Experiments

Thad Dunning, University of California, Berkeley and
Daniel Hidalgo, Massachusetts Institute of Technology

We critically assess natural‐experimental research using an evaluative framework based on (1) the plausibility of as‐if random assignment; (2) the credibility of causal and statistical assumptions; and (3) the substantive and theoretical relevance of the intervention. We emphasize the importance of quantitative and qualitative diagnostics and substantive knowledge for building successful natural‐experimental designs.

  • 10.1.1. Thad Dunning, Natural Experiments in the Social Sciences: A Design‐Based Approach (Cambridge University Press 2012), Chapters 8‐10. (Book to purchase)
  • 10.1.2. Devin Caughey and Jasjeet Sekhon, “Elections and the Regression Discontinuity Design: Lessons from Close U.S. House Races, 1942–2008,” Political Analysis 19(4) (October 2011): 385‐408.
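
One of the quantitative diagnostics discussed in this session can be sketched in a few lines (a hypothetical illustration, not part of the assigned readings): if assignment really is as‐if random, pretreatment covariates should be balanced across treatment and control groups, so a large standardized difference in covariate means casts doubt on the design.

```python
# Minimal sketch of a balance diagnostic for as-if random assignment:
# compare pretreatment covariate means across groups, scaled by the
# pooled standard deviation. Data are hypothetical.
import statistics


def standardized_difference(x_treated, x_control):
    """Difference in covariate means between groups, divided by the
    pooled standard deviation; values near zero support balance."""
    diff = statistics.mean(x_treated) - statistics.mean(x_control)
    pooled_sd = statistics.pstdev(x_treated + x_control)
    return diff / pooled_sd


# Hypothetical pretreatment covariate (e.g., baseline income).
income_treated = [10.2, 9.8, 10.5, 10.0]
income_control = [10.1, 9.9, 10.3, 10.1]

print(round(standardized_difference(income_treated, income_control), 3))
```

Balance checks of this kind complement, but do not replace, the qualitative evidence about the assignment process emphasized in the readings.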

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Design Your Own Natural Experiment

Thad Dunning, University of California, Berkeley and
Daniel Hidalgo, Massachusetts Institute of Technology

In this session, we give participants the opportunity to design a natural experiment related to their own work and receive feedback from course participants.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Multi‐Method Research and Natural Experiments

Thad Dunning, University of California, Berkeley and
Daniel Hidalgo, Massachusetts Institute of Technology

We end the course by evaluating the promise and obstacles to the use of multi‐method research in the analysis of natural experiments. Drawing upon the previous sessions and readings, we discuss how qualitative methods can help address some of the criticisms of natural experiments, as well as how natural experiments can bolster the inferences drawn from qualitative evidence.

  • 10.3.1. Thad Dunning, Natural Experiments in the Social Sciences: A Design‐Based Approach (Cambridge University Press 2012), Chapter 11. (Book to purchase)

Further Readings by Topic (for both Modules 7 and 10):

Standard Natural Experiments:

Christopher Blattman, “From Violence to Voting: War and Political Participation in Uganda" American Political Science Review 103(2) (May 2009): 231‐247.

Raghabendra Chattopadhyay and Esther Duflo, “Women as Policy Makers: Evidence from a Randomized Experiment in India,” Econometrica 72(5) (September 2004): 1409‐1443.

Daniel Doherty, Donald Green, and Alan Gerber, “Personal Income and Attitudes toward Redistribution: A Study of Lottery Winners,” Political Psychology 27(3) (June 2006): 441‐458.

Claudio Ferraz and Frederico Finan, “Exposing Corrupt Politicians: The Effect of Brazil’s Publicly Released Audits on Electoral Outcomes,” Quarterly Journal of Economics 123(2) (May 2008): 703‐745.

Susan Hyde, “The Observer Effect in International Politics: Evidence from a Natural Experiment,” World Politics 60(1) (October 2007): 37–63.

Jason Lyall, “Does Indiscriminate Violence Incite Insurgent Attacks? Evidence from Chechnya," Journal of Conflict Resolution 53(3) (June 2009): 331‐362.

Daniel N. Posner, “The Political Salience of Cultural Difference: Why Chewas and Tumbukas Are Allies in Zambia and Adversaries in Malawi,” American Political Science Review 98(4) (November 2004): 529‐545.

Regression‐Discontinuity Designs:

Thad Dunning and Janhavi Nilekani, “Ethnic Quotas and Political Mobilization: Caste, Parties, and Distribution in Indian Village Councils.” Working paper, Department of Political Science, Yale University (2010). Available at http://www.thaddunning.com/research/all‐research.

David S. Lee, “Randomized Experiments from Non‐random Selection in U.S. House Elections," Journal of Econometrics 142(2) (February 2008): 675‐697.

Amy Lerman, “Bowling Alone (With my Own Ball and Chain): The Effects of Incarceration and the Dark Side of Social Capital.” Manuscript, Department of Politics, Princeton University (2008).

Donald L. Thistlewaite and Donald T. Campbell, “Regression‐discontinuity Analysis: An Alternative to the Ex‐post Facto Experiment,” Journal of Educational Psychology 51(6) (December 1960): 309‐317.

Instrumental‐Variables Designs:

Edward Miguel, Shanker Satyanath, and Ernest Sergenti, “Economic Shocks and Civil Conflict: An Instrumental Variables Approach,” Journal of Political Economy 112(4) (August 2004): 725‐753.

Analysis and Design:

Joshua D. Angrist and Alan B. Krueger, “Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments,” Journal of Economic Perspectives 15(4) (Fall 2001): 69‐85.

Henry Brady and David Collier, eds., Rethinking Social Inquiry: Diverse Tools, Shared Standards, 2nd ed. (Rowman & Littlefield, 2010).

Donald T. Campbell and Julian C. Stanley, Experimental and Quasi‐Experimental Designs for Research (Houghton Mifflin Co., 1963).

Thad Dunning, “Improving Causal Inference: Strengths and Limitations of Natural Experiments," Political Research Quarterly 61(2) (June 2008): 282‐293.

Thad Dunning, “Model Specification in Instrumental‐Variables Regression,” Political Analysis 16(3) (July 2008): 290‐302.

Thad Dunning, “Natural and Field Experiments: The Role of Qualitative Methods,” Qualitative Methods Newsletter 6(2) (2008).

David Freedman, Statistical Models: Theory and Practice (Cambridge University Press, 2005).

David Freedman, Robert Pisani, and Roger Purves, Statistics, 4th ed. (W.W. Norton & Co., 2007), Chapter 1 (“Controlled Experiments”) and Chapter 2 (“Observational Studies”).

Donald P. Green, Terence Y. Leong, Holger L. Kern, Alan S. Gerber, and Christopher W. Larimer, “Testing the Accuracy of Regression Discontinuity Analysis Using Experimental Benchmarks,” Political Analysis 17(4) (October 2009): 400‐417.

Allison J. Sovey and Donald P. Green, “Instrumental Variables Estimation in Political Science: A Readers’ Guide,” American Journal of Political Science 55(1) (January 2011): 188‐200.

Qualitative Methods:

Kripa Ananthpur, Kabir Malik, and Vijayendra Rao, “The Anatomy of Failure: An Ethnography of a Randomized Trial to Deepen Democracy in Rural India.” June 2014.

Christopher Blattman, Tricia Gonwa, Julian Jamison, Katherine Rodrigues, and Margaret Sheridan, “Measuring the Measurement Error: A Method to Qualitatively Validate Survey Data.” November 2014.

Elizabeth Levy Paluck, “The Promising Integration of Qualitative Methods and Field Experiments,” Annals of the American Academy of Political and Social Science 628 (March 2010).

Friday, June 17 Module 11, Process Tracing II, Andrew Bennett, Tasha Fairfield, and David Waldner

8:45am ‐ 10:15am Process Tracing Working Sessions

Andrew Bennett, Georgetown University
Tasha Fairfield, London School of Economics
David Waldner, University of Virginia

Participants will break into groups to discuss how they plan to use process tracing in their own research, work on additional process tracing exercises, and discuss the homework assigned on Thursday afternoon. Please bring a laptop to access online exercises during the session.

10:15am ‐ 10:45am Coffee Break

10:45am ‐ 12:30pm Research Design Discussion Sessions

12:30pm – 2:00pm Lunch

2:00pm – 3:30pm Process Tracing Theory and Standards, Continued.

Tasha Fairfield, London School of Economics
David Waldner, University of Virginia

We will continue our discussions of different approaches to understanding the methodological underpinnings of causal analysis in case studies and best practice guidelines for process tracing. Participants will work with a different instructor from the previous day to cover complementary material.

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm Process Tracing Exercises and Concluding Discussion.

Andrew Bennett, Georgetown University
Tasha Fairfield, London School of Economics
David Waldner, University of Virginia

Participants will work on a new set of process‐tracing exercises. The module will conclude with a discussion of how scholars can best improve inference and transparency in process‐tracing research.


Friday, June 17 Module 12, Ethnographic Methods III, Frederic Schaffer and Timothy Pachirat

Session 8 (9:15 ‐ 10:15) Fieldsite Group Reviews of Fieldnotes

Participants exchange and comment on each other’s fieldnotes.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

Session 9 (2:00 ‐ 3:30) Fieldsite Group Discussions and Presentations

Participants combine with other fieldsite groups to discuss the experience of doing participant observation.

Session 10 (4:00 ‐ 5:30) Overall Debriefing (ordinary language interviewing and participant observation) [Pachirat and Schaffer]

In this session, we will reflect together on the following three clusters of questions: (1) How can ordinary language interviewing and participant observation be fruitfully combined when doing ethnographic fieldwork? What are the potential pitfalls of such a combination? (2) To what extent does the method one adopts shape what one apprehends? Specifically, do we learn something different when we access meaning by means of (relatively unstructured) participant observation as opposed to (relatively structured) ordinary language interviewing? (3) Is there anything that you learned about ordinary language interviewing and/or participant observation that might or will inform your own research?


Monday, June 20 Module 13, Managing/Sharing Qualitative Data, Making Qualitative Research Transparent, Diana Kapiszewski, Sebastian Karcher, and Dessi Kirilova

Funding agencies, publishers, and academic associations alike are increasingly requiring that scholars share their research data. In this module, we discuss strategies for managing and sharing qualitative data. We demonstrate that effective data management throughout the research lifecycle is a key prerequisite for successful data sharing. We highlight the benefits of sharing data, including enhanced citation and collaboration, and the catalyzing of secondary analysis. We also consider some perceived barriers to data sharing and demonstrate appropriate techniques for overcoming them. Finally, we discuss how making qualitative research more transparent (i.e., clearly conveying how data were generated and analyzed to produce inferences and interpretations) helps scholars to showcase the strength of their work, and we introduce strategies for achieving research transparency in qualitative inquiry.

The module includes numerous types of exercises and practical applications. Participants will benefit most if they have an actual research project, including its data‐generation issues and challenges, in mind. For those who are neither about to begin nor in the midst of a project, we will provide an example. For the third session, participants will work directly with one of their own research products (i.e., a paper, published article, etc.).

8:45am ‐ 10:15am Managing Data

Dessi Kirilova, Qualitative Data Repository

We introduce the notion of the ‘research lifecycle’ to demonstrate that research data can prove useful far beyond the research project that created them. We consider the role of data in planning and designing research projects and examine the strategies and techniques required to manage data effectively, both for the benefit of the immediate project and to give them a longer life beyond it. We use examples of real research projects to establish what types of protocols might be needed at key stages of the research cycle, and to identify trigger points at which data sharing considerations come into play. Finally, we discuss briefly the role of describing and contextualizing data in order for them to be reusable, and consider the issues that need to be addressed in order to manage data safely. Exercises are used to help consolidate knowledge. Students will also receive guidance on developing, and will have the opportunity to begin to develop, a Data Management Plan (DMP).

  • 13.1.1. Lupia, Arthur and Colin Elman (2014) “Openness in Political Science: Data Access and Research Transparency – Introduction.” PS: Political Science & Politics 47(01): 19‐42.
  • 13.1.2. Van den Eynden, V., Corti, L., Bishop, L. and Woollard, M. (2011, third fully revised edition) Managing and Sharing Research Data: A Guide to Good Practice, Essex: UK Data Archive. ISBN: 1‐904059‐78‐3.
  • 13.1.3. MIT Libraries (2013) ‘Writing an NSF Data Management Plan' (http://libraries.mit.edu/guides/subjects/data‐management/nsf‐dm‐plan.pdf)
  • 13.1.4. ICPSR (2014) ‘Framework for Creating a Data Management Plan’, University of Michigan (http://www.icpsr.umich.edu/icpsrweb/content/datamanagement/dmp/framework.html)

Recommended

  • 13.1.5. Corti, L. and Thompson, P. (2012) 'Secondary analysis of archived data' in J. Goodwin (ed.) SAGE: Secondary Data Analysis London: Sage Publications Ltd (http://repository.essex.ac.uk/2444/)
  • 13.1.6. Corti, L., Van den Eynden, V., Bishop, L. and Woollard, M. (2014) Managing and Sharing Research Data: A Guide to Good Practice, London: Sage. ISBN: 978‐1‐44626‐726‐4.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Sharing Qualitative Data

Diana Kapiszewski, Georgetown University and Qualitative Data Repository
Dessi Kirilova, Qualitative Data Repository

We discuss the benefits of sharing qualitative data, and best practices for addressing the ethical, legal, and logistical challenges of doing so. With regard to human participant concerns, we consider adaptations to the process of soliciting informed consent to enable data sharing at the end of a project; discuss strategies for anonymizing qualitative data aiming to preserve original content while minimizing disclosure risk where confidentiality has been promised; and examine how to select access controls for shared data. We briefly cover issues of rights management – who owns ‘your’ data? – and debate copyright concerns and how they can be addressed; we also consider the notion of “fair use.” We describe suitable venues for sharing data and highlight the advantages of doing so in an institutional venue, including curation and long‐term availability of data; we also introduce the Qualitative Data Repository (www.qdr.org). Students are encouraged to consider questions of ethics and rights in relation to the DMP they began to develop in the first session.

  • 13.2.1. Bishop, L. (2009) 'Ethical Sharing and Re‐Use of Qualitative Data', Australian Journal of Social Issues, 44(3).
  • 13.2.2. Gunsalus, C. K. et al. (2007). “The Illinois White Paper, Improving the System for Protecting Human Subjects: Counteracting IRB ‘Mission Creep.’” Qualitative Inquiry 13(5): 617‐649.
  • 13.2.3. Israel, Mark and Iain Hay. (2006). “Research Ethics for Social Scientists: Avoiding Harm and Doing Good.” In Mark Israel and Iain Hay, eds. Research Ethics for Social Scientists. London: Sage Publications.
  • 13.2.4. Resource on Fair Use http://www.copyright.gov/fair‐use/more‐info.html

Recommended

  • 13.2.5. Clark, A. (2006) ‘Anonymising Research Data’, ESRC National Centre for Research Methods, Working Paper 7/06. (http://eprints.ncrm.ac.uk/480/1/0706_anonymising_research_data.pdf)
  • 13.2.6. Yardley, S. et al. (2014) ‘Ethical Issues in the Reuse of Qualitative Data: Perspectives From Literature, Practice, and Participants’, Qualitative Health Research 24(1): 102–113. (http://qhr.sagepub.com/content/24/1/102.full.pdf+html)
  • 13.2.7. Bailey, C., Baxter, J, Mort, M. and Convery, I. (2006) ‘Community Experiences of the 2001 Foot and Mouth Disease Epidemic in North Cumbria: An Archiving Story’ Methodological Innovations Online, (2006) 1(2) 83-94 (http://www.esds.ac.uk/news/publications/MIOBailey‐pp83‐94.pdf)

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Making Qualitative Research Transparent

Diana Kapiszewski, Georgetown University and Qualitative Data Repository
Sebastian Karcher, Qualitative Data Repository

Research transparency comprises production transparency (clearly describing the processes through which data were generated) and analytic transparency (clearly indicating how data were analyzed and how they support claims, conclusions, inferences and interpretations in scholarship). In this session we consider the ongoing debate in political science over making qualitative research transparent, and discuss the merits and limitations of one transparency technique for qualitative research, Annotation for Transparent Inference (ATI). Participants are called on to think through what increasing the transparency of a piece of their own scholarship would entail. Finally, we consider the challenges of “replicating” qualitative research.

  • 13.3.1. Elman, Colin and Diana Kapiszewski. 2014. “Data Access and Research Transparency in the Qualitative Tradition.” PS: Political Science & Politics 47(01): 43‐47.
  • 13.3.2. Moravcsik, Andrew. 2014. “Transparency: The Revolution in Qualitative Research.” PS: Political Science & Politics 47(01): 48‐53.
  • 13.3.3. Saunders Elizabeth N. (2015) “John F. Kennedy” (92‐131). Leaders at War: How Presidents Shape Military Interventions. 2011. Ithaca, NY: Cornell University Press. Active Citation Compilation, QDR:10048. Syracuse, NY: Qualitative Data Repository [distributor]. http://doi.org/10.5064/F68G8HMM

Recommended

  • 13.3.4. Moravcsik, Andrew, Colin Elman, and Diana Kapiszewski. 2013. “A Guide to Active Citation” Qualitative Data Repository.
  • 13.3.5. Ishiyama, John. 2014. “Replication, Research Transparency, and Journal Publications: Individualism, Community Models, and the Future of Replication Studies.” PS: Political Science & Politics 47(01): 78‐83.
  • 13.3.6. The (DA‐RT) Data Access and Research Transparency Joint Statement (http://www.dartstatement.org)
  • 13.3.7 Saunders, Elizabeth N. 2014. “Transparency without Tears: A Pragmatic Approach to Transparent Security Studies Research.” Security Studies 23 (4): 689–98. doi:10.1080/09636412.2014.970405.
  • 13.3.8 Qualitative Transparency Deliberations (https://www.qualtd.net/)

Monday, June 20 Module 14 – Comparative and Qualitative Methods, Gary Goertz

8:45am ‐ 10:15am Two Cultures: Contrasting Qualitative and Quantitative Research

This session contrasts an approach to qualitative and multimethod research based on the statistical paradigm with one based on within‐case causal analysis and logic.

  • 14.1.1. Gary Goertz and James Mahoney, A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences (Princeton: Princeton University Press, 2012), chapters 4‐6, 9, and 15. (Book to purchase)

Recommended:

  • 14.1.2. Thiem, A. and Baumgartner, M. 2016. Still lost in translation: a correction of three misunderstandings between configurational comparativists and regressional analysts. Comparative Political Studies 49(6): 742‐774.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Social Science Concepts

This session provides basic guidelines for the construction and evaluation of concepts. In particular, it provides a framework for dealing with complex concepts, which are typical in much social science research.

  • 14.2.1. Gary Goertz, Social Science Concepts, chapters 1‐2.
  • 14.2.2 Gary Goertz and James Mahoney, A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences (Princeton: Princeton University Press, 2012), chaps. 11‐13. (Book to purchase)

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Multimethod Research, Causal Mechanisms, and Case Studies

This session will explore the linkage between multimethod research, causal mechanisms and case studies. In particular, it analyses the logic of case selection for multimethod research when the goal is to investigate causal mechanisms.

  • 14.3.1. Gary Goertz, "Multimethod Research, Causal Mechanisms, and Case Studies: The Research Triad," chapters 1‐3 (in production, 2017, Princeton University Press).

Monday, June 20 Module 15 – Focus Groups in the Social Sciences, Jennifer Cyr

This module explains when and how to make rigorous and effective use of focus groups. It specifies when focus groups should be used and what their strengths are for a social science researcher, particularly when it comes to a mixed‐methods research design. A significant portion of each session will be dedicated to simulating the process through which a focus group is conceived, organized, and undertaken. Upon completing the three‐part series the student will be able to assess if focus groups make sense for her research project. She will also have the tools to undertake focus groups on her own.

8:45am ‐ 10:15am. Focus Groups: When to Use Them and Why

This session introduces the focus group as a data collection method. It provides a short history of the use of the method and explains when and why social science researchers may wish to use them. Students will use this knowledge to develop a research design that includes focus groups.

  • 15.1.1. David Morgan. 1996. “Focus Groups.” Annual Review of Sociology 22: 129‐152.
  • 15.1.2. Jennifer Cyr. Unpublished document. “Specifying the Role of Focus Groups in Mixed‐Methods Research”

Recommended:

  • 15.1.3. Robert K. Merton & Patricia L. Kendall. 1946. “The Focused Interview.”
  • 15.1.4. David W. Stewart, Prem N. Shamdasani, & Dennis W. Rook. 2007. Chapter 1: Focus Group History, Theory & Practice. Focus Groups: Theory and Practice.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm. Planning and Carrying out Focus Groups.

This session provides a step‐by‐step guide to how to organize and undertake a focus group. We will address several aspects, including how to: develop an effective question protocol, identify and prepare a moderator, and address specific challenges. As an exercise, students will work in groups to develop a question protocol for a hypothetical research project.

  • 15.2.1. Rosaline Barbour. 2007. “Practicalities of planning and running focus groups.” In Doing Focus Groups. Sage Publications, Ltd.
  • 15.2.2. Edward Fern. “Methodological Issues in Focus Group Research: Representativeness, Independence, Degrees of Freedom, and Theory Confirmation.” In Advanced Focus Group Research. Sage Publications.

Recommended:

  • 15.2.3. David W. Stewart, Prem N. Shamdasani, & Dennis W. Rook. 2007. Chapter 3: “Focus Groups and the Research Toolbox.” Focus Groups: Theory and Practice. Sage Publications.
  • 15.2.4. J. Gothberg, et al. 2013. “Is the Medium Really the Message? A Comparison of Face‐to‐Face, Telephone, and Internet Focus Group Venues.” Journal of Ethnographic & Qualitative Research, 7(3).

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm. Analyzing Focus Groups.

This session addresses how to analyze the data generated from focus groups. It explains how to take full advantage of the data generated by analyzing the transcript at three different levels of analysis: the individual level, the group level, and the level of interaction. The students will then work in groups to analyze a sample focus group transcript.

  • 15.3.1. Jennifer Cyr. 2016. “The Pitfalls and Promise of Focus Groups as a Data Collection Method.” Sociological Methods and Research. 45(2): 231‐59.
  • 15.3.2. Rosaline Barbour. 2007. “Analytical Challenges in Focus Group Research.” In Doing Focus Groups. Sage Publications, Ltd.

Recommended:

  • 15.3.3. Pamela S. Kidd & Mark B. Parshall. 2000. “Getting the focus and the group: enhancing analytical rigor in focus group research.” Qualitative health research, 10(3), 293‐308.
  • 15.3.4. Anthony J. Onwuegbuzie, et al. 2009. “A Qualitative Framework for Collecting and Analyzing Data in Focus Group Research.” International journal of qualitative methods, 8(3), 1‐21.

Monday, June 20 Module 16 – Counterfactual Analysis, Jack S. Levy

All causal arguments have counterfactual implications about what might happen if certain variables were to assume different values. This module examines the utility of counterfactual analysis in assessing the alternative paths that history might have taken, for the purpose of validating causal inferences in historical interpretation. How can we use what did not happen but which might have happened, and maybe should have happened, to help understand what actually did happen? Given the temptation to invoke “counterfactuals of convenience” that bolster one’s preferred historical interpretations or political preferences, what are the rules for evaluating the scientific legitimacy of counterfactuals?

8:45am ‐ 10:15am. Counterfactuals: Uses, Types, and Criteria for Evaluation

This session begins with a brief discussion of the importance of counterfactuals and of various types and uses of counterfactuals. We then develop a system of methodological rules or best practices for using counterfactuals to help assess the validity of causal inferences in historical interpretation. We also consider the relevance of counterfactual analysis for debates about historical contingency and determinism. This discussion will probably continue into the second session.

  • 16.1.1. Jack S. Levy. “Counterfactuals, Causal Inference, and Historical Analysis.” Security Studies 24, 3 (September 2015): 378‐402.
  • 16.1.2. Philip E. Tetlock and Geoffrey Parker, “Counterfactual Thought Experiments: Why We Can’t Live Without Them and How We Must Learn to Live with Them.” In Philip E. Tetlock, Richard Ned Lebow, and Geoffrey Parker, eds., Unmaking the West: “What If?”
    Scenarios That Rewrite World History. Ann Arbor: University of Michigan Press, 2006. Pp. 14‐44.

Recommended:

  • 16.1.3. “Symposium on Counterfactual Analysis.” Security Studies, 24, 3 (September 2015), commentaries by Lebow, Harvey, and Gavin.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Counterfactuals of the Great War

We use the criteria developed in the first session to explore some counterfactuals of the First World War, and simultaneously use these historical counterfactuals to illustrate and elaborate upon our analytic rules for historical counterfactuals. There are as many counterfactuals as there are causal linkages, but we begin with the big one, which spins off countless others: What if Austrian Archduke Franz Ferdinand had not been assassinated? Would the war still have occurred? What were the likely consequences of a 20th century world without the Great War? What if the war had occurred but Britain had not intervened?

  • 16.2.1. Richard Ned Lebow, “Contingency, catalysts and nonlinear change: the origins of World War I.” In Gary Goertz and Jack S. Levy, eds., Explaining War and Peace: Case Studies and Necessary Condition Counterfactuals. New York: Routledge, 2007. Pp. 85‐111.
  • 16.2.2. Jack S. Levy, “Preferences, Constraints, and Choices in July 1914.” International Security, 15, 3 (Winter 1990/91): 151‐86.

Recommended:

  • 16.2.3 William R. Thompson, “Powderkegs, sparks, and World War I.” In Gary Goertz and Jack S. Levy, eds., Explaining War and Peace: Case Studies and Necessary Condition Counterfactuals. New York: Routledge, 2007. Pp. 113‐45.
  • 16.2.4. Paul W. Schroeder, “Embedded Counterfactuals and World War I as an Unavoidable War.” In Schroeder, Systems, Stability, and Statecraft: Essays on the International History of Modern Europe, ed. by David Wetzel, Robert Jervis, and Jack S. Levy. New York: Palgrave, 2004. Pp. 157‐91.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Counterfactuals of the 2003 Iraq War and Other Cases

This session focuses on the counterfactual world defined by the hypothetical election of Al Gore as president of the United States in 2000, centering on Frank Harvey’s counterfactual analysis of the 2003 Iraq War. How valid is Harvey’s counterfactual analysis? What are the implications for an assessment of the relative causal weight of individual‐level variables, external pressures, and social forces in the Bush Administration’s decision for war in 2002‐3?

  • 16.3.1. Frank Harvey, “President Al Gore and the 2003 Iraq War: A Counterfactual Test of Conventional ‘W’isdom.” Canadian Journal of Political Science 45:1 (March 2012): 1–32.

Recommended:

  • 16.3.2. Adeed Dawisha, John Ehrenberg, Bruce Gilley, Stephen M. Walt, and Elizabeth Saunders, “Ideology, Realpolitik, and US Foreign Policy: A Discussion of Frank P. Harvey’s Explaining the Iraq War: Counterfactual Theory, Logic and Evidence.” Perspectives on Politics 11, 2 (June 2013): 578‐92.
  • 16.3.3. Klaus Dodds, “Counter‐Factual Geopolitics: President Al Gore, September 11th and the Global War on Terror.” Geopolitics 13, 1 (December 2008): 73‐99.

Suggestions for Further Reading

Theory/Methodology of Counterfactuals:

Black, Jeremy. What If? Counterfactualism and the Problem of History. London: Social Affairs Unit, 2008.

Bunzl, Martin. “Counterfactual History: A User’s Guide.” American Historical Review 109, 3 (June 2004): 435‐42.

Carr, E. H. What is History? Harmondsworth, UK: Penguin Books, 1964. Chap. 4.

Collins, John, Ned Hall, and L.A. Paul, eds. Causation and Counterfactuals. Cambridge, MA: MIT Press, 2004. Philosophers’ perspectives.

Evans, Richard J. Altered Pasts: Counterfactuals in History. Waltham, MA: Brandeis University Press, 2013.

Fearon, James D. “Counterfactuals and Hypothesis Testing in Political Science.” World Politics 43, 2 (January 1991): 169‐95.

Ferguson, Niall. “Virtual History: Toward a ‘Chaotic’ Theory of the Past.” Pp. 1‐90 in Virtual History: Alternatives and Counterfactuals, ed. N. Ferguson. New York: Basic Books, 1999. Includes literary perspectives.

Goertz, Gary, and Jack S. Levy. “Causal Explanation, Necessary Conditions, and Case Studies.” Pp. 9‐45 in Explaining War and Peace: Case Studies and Necessary Condition Counterfactuals, ed. G. Goertz and J. S. Levy. New York: Routledge, 2007.

Gould, J.D. “Hypothetical History.” The Economic History Review, 2nd series, 22, 2 (1969): 195‐207.

Lebow, Richard Ned. “What’s So Different about a Counterfactual?” World Politics, 52, 3 (July 2000): 550‐85.

Levy, Jack S. “Counterfactuals and Case Studies.” In Janet Box‐Steffensmeier, Henry Brady, and David Collier, eds., Oxford Handbook of Political Methodology. New York: Oxford University Press, 2008. Pp. 627‐44.

Lewis, David. Counterfactuals. Cambridge, MA: Harvard University Press, 1973.

McClelland, Peter D. Causal Explanation and Model Building in History, Economics, and the New Economic History. Ithaca, NY: Cornell University Press, 1975. Chap. IV.

Morgan, Stephen L., and Christopher Winship, Counterfactuals and Causal Inference: Methods and Principles for Social Research (New York: Cambridge University Press, 2007). A quantitative perspective.

Paul, L. A. “Counterfactual Theories.” In Helen Beebee, Christopher Hitchcock, and Peter Menzies, eds., The Oxford Handbook of Causation. New York: Oxford University Press, 2009. Pp.158‐84. Philosophers’ perspectives.

Tetlock, Philip E., and Aaron Belkin, eds. Counterfactual Thought Experiments in World Politics. Princeton: Princeton University Press, 1996.

Wenzlhuemer, Roland, ed. “Counterfactual Thinking as a Scientific Method.” Special Issue, Historical Social Research 34, 2 (2009).

Psychology of Counterfactuals (how do people actually think about counterfactuals?):

Lebow, Richard Ned. Forbidden Fruit: Counterfactuals and International Relations. Princeton NJ and Oxford UK: Princeton University Press and Oxford University Press, 2010. Chap. 5‐6.

Roese, Neal J., and James M. Olson. What Might Have Been: The Social Psychology of Counterfactual Thinking. Mahwah, NJ: Lawrence Erlbaum, 1995.

Tetlock, Philip E. Expert Political Judgment. Princeton: Princeton University Press, 2005.

Applications to Other Historical Cases:

Blight, James G., Janet M. Lang, and David A. Welch. 2009. Vietnam if Kennedy Had Lived: Virtual JFK. Lanham, MD: Rowman & Littlefield.

Chwieroth, Jeffrey. “Counterfactuals and the Study of the American Presidency.” Presidential Studies Quarterly. June 2002. pp. 293‐327.

Cowley, Robert. 1999. What If? New York: G. P. Putnam’s Sons.

Dull, Jonathan R. The Miracle of American Independence: Twenty Ways Things Could Have Turned Out Differently. Lincoln, NE: Potomac Books, 2015.

Fogel, Robert. 1964. Railroads and American Economic Growth: Essays in Econometric History. Baltimore, MD: Johns Hopkins University Press.

Greenfield, Jeff. 43*: When Gore Beat Bush: A Political Fable. Byliner/Amazon e‐book.

Harvey, Frank P. Explaining the Iraq War: Counterfactual Theory, Logic and Evidence. Cambridge, UK: Cambridge University Press, 2011.

Rosenfeld, G.D. The World Hitler Never Made: Alternate History and the Memory of Nazism. New York: Cambridge University Press, 2005.

Roth, Philip. The Plot Against America. Boston: Houghton Mifflin, 2004.

Tuesday, June 21 Module 17 – Computer Assisted Text Analysis I, William Lowe and James Lo

These two modules are about using computers to systematically analyze text, typically as a precursor, successor, or complement to a qualitative analysis. We’ll discuss and practice classical dictionary‐based content analysis and its newer incarnation as topic modeling, consider how to classify documents, and show how to project their content into rhetorical spaces for understanding and visualization.

We’ll presume a grasp of basic mathematical and statistical concepts and a willingness to follow along with the computational parts. The module mostly uses R and its packages. Expertise in R is not required, although some prior experience may be helpful.

If you choose this module you should bring a laptop and be ready and able to install some software beforehand. We’ll circulate a handout with software prerequisites before the course and if you have any problems you can meet with us at 8pm the day before in the lobby of the Sheraton hotel and we’ll try to fix them.

8:45am ‐ 10:15am

In the first session we’ll introduce text analysis as a measurement problem, and then discuss dictionary‐based content analysis in old and new style. We will focus on identifying model assumptions, learn how to deploy the output effectively in subsequent analyses, see how to validate them, and maybe even fix them when they fail. Finally, when the task is beyond the capacities of machines, we’ll consider the mechanics of getting other people to do our content analyses for us.

  • 17.1.1. J. Grimmer and B. Stewart, Text as Data: The Promise and Pitfalls of Automatic Content Analysis Methods for Political Texts. Political Analysis 21(3) (2013): 267‐297
  • 17.1.2. M. Laver, J. Garry, Estimating policy positions from political texts. American Journal of Political Science, 44(3) (2000):619–634

Optional:

  • 17.1.3. K. Benoit, D. Conway, B. E. Lauderdale, M. Laver, and S. Mikhaylov. Crowd‐Sourced Text Analysis: Reproducible and Agile Production of Political Data. American Political Science Review. (in press)
  • 17.1.4. J. Bara, A. Weale, and A. Bicquelet, Analysing parliamentary debate with computer assistance. Swiss Political Science Review, 13(4) (2007):577‐605.
  • 17.1.5. D. T. Young. How Do You Measure a Constitutional Moment? Using Algorithmic Topic Modeling To Evaluate Bruce Ackerman’s Theory of Constitutional Change. 122 Yale Law Journal 1990 (2013)
  • 17.1.6. D. Blei and J. Lafferty. Topic Models. In A. Srivastava and M. Sahami, editors, Text Mining: Classification, Clustering, and Applications. (2009).
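For a concrete sense of what dictionary‐based content analysis computes, here is a minimal sketch in Python (the module itself works in R; the category dictionary below is invented purely for illustration):

```python
from collections import Counter
import re

# Hypothetical category dictionary: each category maps to a word list.
DICTIONARY = {
    "economy": {"tax", "budget", "inflation", "jobs"},
    "security": {"war", "defense", "terror", "army"},
}

def tokenize(text):
    """Lowercase a document and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def category_counts(text):
    """Count how many tokens in `text` fall into each dictionary category."""
    counts = Counter()
    for token in tokenize(text):
        for category, words in DICTIONARY.items():
            if token in words:
                counts[category] += 1
    return dict(counts)

def category_proportions(text):
    """Normalize counts by document length so documents are comparable."""
    tokens = tokenize(text)
    return {cat: n / len(tokens) for cat, n in category_counts(text).items()}
```

Classical content analysis stops at counts or proportions like these; the session's emphasis on validation concerns whether such output actually measures what the dictionary's author intended.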

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm

We’ll show how to use a variety of text analysis tools for dictionary‐based content analysis, and replicate published studies using several types of text source.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm

Computer‐assisted assignment of categories to documents

Document classification methods use labelled examples to automate the process of putting complete texts into categories from a typology, without the need to construct a dictionary or codebook. In this session we discuss how these methods work and consider their advantages and limitations.

  • 17.3.1. W. McIntosh, M. Evans, J. Lin, and C. Cates, Recounting the courts? Applying automated content analysis to enhance empirical legal research. Journal of Empirical Legal Studies, 4(4) (2007): 1041‐1057.
  • 17.3.2. D. Hillard, S. J. Purpura, and S. Wilkerson, Computer assisted topic classification for mixed methods social science research. Journal of Information Technology and Politics, 4(4) (2008):31‐46.
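As a hedged illustration of how labelled examples drive classification, the following Python sketch implements a tiny multinomial naive Bayes classifier, one standard member of this family of methods (the module itself uses R packages, and the toy documents in any usage are invented):

```python
import math
from collections import Counter, defaultdict

def train(labelled_docs):
    """labelled_docs: list of (token_list, label) pairs.
    Returns (class counts, per-class word counts, vocabulary)."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in labelled_docs:
        class_counts[label] += 1
        for t in tokens:
            word_counts[label][t] += 1
            vocab.add(t)
    return class_counts, word_counts, vocab

def classify(model, tokens):
    """Assign the label with the highest log posterior probability,
    using add-one smoothing over the vocabulary."""
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total_docs)  # class prior
        n_words = sum(word_counts[label].values())
        for t in tokens:
            p = (word_counts[label][t] + 1) / (n_words + len(vocab))
            score += math.log(p)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

The key design point the session dwells on is visible even here: the classifier's quality is bounded by the quality and coverage of the labelled training examples.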

Tuesday, June 21 Module 18 – QCA/fs I, Charles Ragin and Carsten Schneider

This module presents the basic principles and practices of set‐analytic methods, in general, and Qualitative Comparative Analysis (QCA), in particular. After introducing the tools of formal logic and set theory that underpin this family of methods, participants learn about the formalized analysis of set relations using truth tables. Particular attention is given to (a) the assessment of set‐theoretic consistency and coverage, and (b) the phenomenon of limited diversity and how QCA enables researchers to employ counterfactual reasoning. Each session will contain some hands‐on exercises familiarizing participants with the two major software packages available for set‐analytic analyses: fsQCA and R (packages QCA, QCAGUI, and SetMethods).

8:45am ‐ 10:15am Introduction to Qualitative Comparative Analysis (QCA)

Charles Ragin, University of California, Irvine

This session introduces QCA, especially its use as a tool for deciphering and unraveling causal complexity. QCA uses set‐analytic procedures that are consistent with common practices in case‐oriented comparative research. The key difference is that with QCA it is possible to examine an intermediate number of cases—too many for conventional case‐oriented analysis, but too few for most conventional quantitative techniques. The basics of QCA are illustrated with a published example of applied QCA.

  • 18.1.1. Charles C. Ragin, Redesigning Social Inquiry: Fuzzy Sets and Beyond. University of Chicago Press, 2008, chapters 1‐3. (book to purchase)
  • 18.1.2. Carsten Q. Schneider and Claudius Wagemann, Set‐Theoretic Methods for the Social Sciences: A Guide to Qualitative Comparative Analysis (Cambridge University Press, 2012), Chapter 2, pp. 42‐55. (book to purchase)
  • 18.1.3. Barbara Vis. 2009. “Governments and Unpopular Social Policy Reform: Biting the Bullet or Steering Clear?” European Journal of Political Research 48: 31–57.

Recommended:

  • 18.1.4. Gary Goertz and James Mahoney, “Mathematical Prelude: A Selective Introduction to Logic and Set Theory for Social Scientists” in A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences (Princeton University Press,
    2012), pp. 16‐38.
  • 18.1.5. Axel Marx, Benoit Rihoux and Charles Ragin, “The origins, development, and application of Qualitative Comparative Analysis: the first 25 years.” European Political Science Review, 2013, pp. 1‐28.
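The set‐theoretic consistency and coverage measures that the module highlights have simple closed‐form definitions for a fuzzy sufficiency relation (condition X sufficient for outcome Y). A minimal Python sketch of the standard formulas follows; the course itself works in fsQCA and R, and the membership scores in any usage are invented for illustration:

```python
def consistency(X, Y):
    """Degree to which cases' memberships in X are a subset of
    their memberships in Y: sum(min(x, y)) / sum(x)."""
    overlap = sum(min(x, y) for x, y in zip(X, Y))
    return overlap / sum(X)

def coverage(X, Y):
    """Degree to which the outcome Y is 'covered' (empirically
    accounted for) by condition X: sum(min(x, y)) / sum(y)."""
    overlap = sum(min(x, y) for x, y in zip(X, Y))
    return overlap / sum(Y)
```

A perfectly consistent sufficiency claim (consistency = 1.0) can still have low coverage, meaning the condition explains only a small share of the outcome; the readings discuss how to interpret the two numbers together.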

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Constructing and Analyzing Truth Tables

Charles Ragin, University of California, Irvine

This session describes the procedures for constructing and analyzing truth tables. Truth tables are at the heart of any QCA. We first explain how not only crisp sets but also fuzzy sets can be represented in a truth table. Then we explain the logic of identifying sufficient terms for the outcome, using logical minimization.

  • 18.2.1. Charles C. Ragin, “Boolean approach to qualitative comparison.” The Comparative Method: Moving Beyond Qualitative and Quantitative Strategies. University of California Press, 1987, Chapter 6, pp. 85‐102.
  • 18.2.2 Carsten Q. Schneider and Claudius Wagemann, Set‐Theoretic Methods for the Social Sciences: A Guide to Qualitative Comparative Analysis (Cambridge University Press, 2012), Chapter 4, pp. 92‐116.

Recommended:

  • 18.2.3. Ragin, Charles and Lisa Amoroso. Constructing Social Research, Second Edition (Pine Forge Press, 2011), Chapter 6, pp. 135‐161.
  • 18.2.4. Rihoux, Benoit and Charles Ragin. Configurational Comparative Methods (Sage, 2009), Chapter 3, pp. 33‐68.
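The core bookkeeping of a crisp‐set truth table can be sketched in a few lines of Python (illustrative only; in the course this is done with fsQCA or the R QCA packages, and the cases in any usage are invented). Each case is a tuple of binary condition values plus a binary outcome; rows group cases by configuration, and a row's consistency is the share of its cases showing the outcome:

```python
from collections import defaultdict

def truth_table(cases):
    """cases: list of (conditions_tuple, outcome) with 0/1 values.
    Returns {conditions: (number_of_cases, consistency)}."""
    rows = defaultdict(list)
    for conditions, outcome in cases:
        rows[conditions].append(outcome)
    return {
        conds: (len(outcomes), sum(outcomes) / len(outcomes))
        for conds, outcomes in rows.items()
    }
```

Rows whose consistency clears the researcher's threshold are then treated as sufficient for the outcome and passed to logical minimization; configurations with no empirical cases are the "limited diversity" remainders discussed in the first session.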

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Counterfactual Analysis: A Set‐Analytic Approach

Charles Ragin, University of California, Irvine and Carsten Schneider, Central European University, Budapest

This session further elaborates truth table analysis. One of the key features of qualitative research is its reliance on counterfactual analysis. Surprisingly, most qualitative researchers are unaware that they conduct counterfactual analysis “on the fly,” so the analytic process remains hidden and implicit. With QCA, counterfactual analysis is made explicit in the form of the distinction among “easy,” “difficult,” and “untenable” counterfactual claims. The examination of counterfactual analysis in QCA illustrates the theory and knowledge dependence of empirical social science.

  • 18.3.1. Charles C. Ragin, Redesigning Social Inquiry: Fuzzy Sets and Beyond. University of Chicago Press, 2008, chapters 8‐9. (book to purchase)
  • 18.3.2. Carsten Q. Schneider and Claudius Wagemann, Set‐Theoretic Methods for the Social Sciences: A Guide to Qualitative Comparative Analysis (Cambridge University Press, 2012), Chapters 7 and 8, pp. 178‐217. (book to purchase)

Recommended:

  • 18.3.3. Charles C. Ragin, Extensions of Boolean methods of qualitative comparison. The Comparative Method: Moving Beyond Qualitative and Quantitative Strategies. University of California Press, 1987, Chapter 7, pp. 103‐124.

Tuesday, June 21 Module 19 – Designing and Conducting Field Research: Preparing for Fieldwork and Operating in the Field, Diana Kapiszewski and Lauren MacLean

This module considers the parameters, design, planning, and execution of field research. We offer strategies for addressing the various intellectual, logistical, and social challenges that carrying out field research involves. A basic premise underlying the module is that fieldwork entails iterating among research design, data collection, and data analysis. Each session is conducted with the understanding that participants have carefully read the assigned materials. The instructors will present key points drawing on the readings and their collective experiences in managing fieldwork’s diverse challenges, and will then facilitate discussion of concepts and ideas in small and large groups.

8:45am ‐ 10:15am Borders and Varieties of Fieldwork

Diana Kapiszewski, Georgetown University
Lauren M. MacLean, Indiana University

In this session we discuss our conception of field research as an iterative process entailing repeated shifts among research design, data collection, and data analysis, and consider some of the implications of this conception. We consider fieldwork’s heterogeneity – how it varies across contexts, projects, and points of time in the same project – and also address issues of
ethics and power in the field.

  • 19.1.1. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Field Research in Political Science: Practices and Principles,” Chapter One in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to purchase)
  • 19.1.2. Elisabeth Wood, “The Ethical Challenges of Field Research in Conflict Zones,” Qualitative Sociology 29(3) (September 2006): 373‐386.
  • 19.1.3. Hauck, Robert, ed. 2008. “Symposium: Protecting Human Research Participants, IRBs and Political Science Redux.” PS: Political Science and Politics 41(3): 475‐511. See in particular contributions by Mitchell Seligson, Dvora Yanow, and Peri Schwartz‐Shea.

Recommended

  • 19.1.4. David Collier, “Data, Field Work and Extracting New Ideas at Close Range,” APSA‐CP Newsletter 10(1) (Winter 1999): 1‐2, 4‐6.
  • 19.1.5. Elisabeth Wood, “Field Research.” In Charles Boix and Susan Stokes, eds., The Oxford Handbook of Comparative Politics (Oxford University Press, 2007), pp. 123‐146.
  • 19.1.6. David Collier, David A. Freedman, James D. Fearon, David D. Laitin, John Gerring, and Gary Goertz, “Symposium: Case Selection, Case Studies, and Causal Inference,” Qualitative & Multi‐Method Research 6(2) (Fall 2008): 2‐16.
  • 19.1.7. Soledad Loaeza, Randy Stevenson, and Devra C. Moehler, “Symposium: Should Everyone Do Fieldwork?,” APSA‐CP Newsletter 16(2) (2005): 8‐18.
  • 19.1.8. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “A Historical and Empirical Overview of Field Research in the Discipline,” Chapter Two in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to
    purchase)

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Preparing for Fieldwork

Diana Kapiszewski, Georgetown University
Lauren M. MacLean, Indiana University

This session addresses pre‐dissertation and other exploratory research, logistical preparations for fieldwork, securing funding, networking to obtain contacts and interviews, negotiating institutional affiliation, and developing a data‐collection plan.

  • 19.2.1. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Preparing for Fieldwork,” Chapter Three in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to purchase)
  • 19.2.2. Adam Przeworski and Frank Salomon. 1995. “On the Art of Writing Proposals: Some Candid Suggestions for Applicants to Social Science Research Council Competitions,” Social Science Research Council.
  • 19.2.3. Micah Altman. “Funding, Funding,” PS: Political Science & Politics, 42(3) (2009): 521‐526.

Recommended

  • 19.2.4. Christopher B. Barrett and Jeffrey W. Cason, “Identifying a Site and Funding Source” in Overseas Research II: A Practical Guide (Johns Hopkins University Press, 2010).
  • 19.2.5. Christopher B. Barrett and Jeffrey W. Cason, “Predeparture Preparations” in Overseas Research II: A Practical Guide (Johns Hopkins University Press, 2010).

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Operating in the Field

Diana Kapiszewski, Georgetown University
Lauren M. MacLean, Indiana University

This session offers practical advice on collecting data and managing inter‐personal relations in the field. We introduce a range of more‐interactive and less‐interactive data‐collection techniques, with a particular emphasis on the latter, consider their strengths and weaknesses, and think about how they can be combined. We discuss various forms of interaction in which fieldworkers engage, including hiring and working with research assistants and collaborating with other researchers.

  • 19.3.1. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Managing in the Field: Logistical, Social, Operational, and Ethical Challenges,” Chapter Four in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to purchase)
  • 19.3.2. Lee Ann Fujii, “Working with Interpreters.” In Layna Mosley, ed., Interview Research in Political Science (Cornell University Press, 2013), pp. 144‐158.

Recommended

  • 19.3.3. Melani Cammett, “Using Proxy Interviewing to Address Sensitive Topics.” In Layna Mosley, ed., Interview Research in Political Science (Cornell University Press, 2013), pp. 125‐143.
  • 19.3.4. Sheila Carapico, Janine A. Clark, Amaney Jamal, David Romano, Jillian Schwedler, and Mark Tessler, “Symposium: Methodologies of Field Research in the Middle East,” PS: Political Science and Politics 39(3) (July 2006).

Wednesday, June 22 Module 20 – Computer Assisted Text Analysis II, William Lowe and James Lo

8:45am ‐ 10:15am

In this session, we show how to use open source text analysis tools for automated document classification.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm

Scaling models try to estimate actors' positions on interesting dimensions using differential word usage. In this session we show how to fit and interpret such models, how to think about the dimensionality of texts, what important discourse features are left out or abstracted away, and what we have to assume about how words are generated in order to be able to apply them. We then consider to what extent those assumptions are reasonable, and also how to square them with the idiosyncratic and often strategically structured institutional contexts in which political language actually appears.

  • 20.1.1. Jonathan B. Slapin and Sven‐Oliver Proksch, A Scaling Model for Estimating Time‐Series Party Positions from Texts. American Journal of Political Science 52(3) 2008: 705‐722
  • 20.1.2. M. Laver, K. Benoit, and J. Garry, Extracting Policy Positions from Political Texts Using Words as Data. American Political Science Review 97(2) 2003: 311‐332

Recommended:

  • 20.1.3. S.‐O. Proksch and J.B. Slapin, Position Taking in European Parliament Speeches. British Journal of Political Science, 2009
  • 20.1.4. S‐O Proksch and J. Slapin, Institutional Foundations of Legislative Speech. American Journal of Political Science 56(3) 2012: 520‐537
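The Laver, Benoit, and Garry “Words as Data” reading above describes the wordscores approach, whose core computation is simple enough to sketch in Python: words inherit scores from reference texts with known positions, and a new (“virgin”) text is placed at the frequency‐weighted mean of its word scores. This is a hedged illustration only; the module uses R, and the toy texts and positions in any usage are invented:

```python
from collections import Counter

def word_scores(reference_texts, positions):
    """reference_texts: list of token lists; positions: their known
    positions. Returns a score per word: the expected reference
    position given that the word was observed."""
    rel_freqs = []
    for tokens in reference_texts:
        counts = Counter(tokens)
        rel_freqs.append({w: c / len(tokens) for w, c in counts.items()})
    vocab = set().union(*(f.keys() for f in rel_freqs))
    scores = {}
    for w in vocab:
        freqs = [f.get(w, 0.0) for f in rel_freqs]
        total = sum(freqs)
        # P(reference r | word w) weighted by reference positions
        scores[w] = sum(f / total * a for f, a in zip(freqs, positions))
    return scores

def score_text(tokens, scores):
    """Position of a virgin text: mean score of its scored words."""
    scored = [scores[t] for t in tokens if t in scores]
    return sum(scored) / len(scored)
```

Words used equally by all reference texts score at the center of the space and carry no positional signal, which is one way the "differential word usage" assumption in the session description becomes concrete.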

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm

In the practical session, we show how to scale texts and visualize their content. Time permitting we may also discuss how to harvest text data from the web, deal with non‐English language content, and solve other practical problems that arise in text analysis.

Wednesday, June 22 Module 21 – QCA/fs II, Charles Ragin and Carsten Schneider

This module discusses various extensions and advanced issues in using set‐analytic methods. After a brief description of the set calibration procedure, we spell out the principles and practices of set‐theoretic multi‐method research and discuss various ways of including the time dimension into set‐analytic approaches. In the final session we replicate and review several applications of set‐analytic studies. The hands‐on exercises in each session aim at further familiarizing participants with the existing software tools for set‐analytic analyses.

8:45am ‐ 10:15am Time and QCA

Charles Ragin, University of California, Irvine and Carsten Schneider, Central European University, Budapest

This session spells out various strategies for integrating the temporal dimension into QCA‐based research. Such strategies range from calibrating sets that contain information on temporal order or duration of events to using panel data. We discuss several strategies and demonstrate the empirical application of panel data diagnostics using an updated version of the R package SetMethods.

  • 21.1.1. Roberto García‐Castro and Miguel A. Arino. 2013. “A General Approach to Panel Data Set‐Theoretic Research.” Compasss Working Paper, WP2013‐76; 1–27.
  • 21.1.2. Neal Caren and Aaron Panofsky. 2005. “TQCA. a Technique for Adding Temporality to Qualitative Comparative Analysis.” Sociological Methods & Research 34(2): 147–72.
  • 21.1.3. Charles C. Ragin and Sarah Strand. 2008. “Using Qualitative Comparative Analysis to Study Causal Order. Comment on Caren and Panofsky (2005).” Sociological Methods & Research 36(4): 431–41.

Recommended:

  • 21.1.4. James Mahoney, Erin Kimball, and Kendra L Koivu. 2009. “The Logic of Historical Explanation in the Social Sciences.” Comparative Political Studies 42(1): 114–46.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Set‐Theoretic Multi‐Method Research

Carsten Schneider, Central European University, Budapest

This session explains the principles and some (computer‐aided) practices of combining the truth table analysis aspect of QCA with follow‐up within‐case analyses of purposefully selected cases. We discuss which cases, based on a cross‐case pattern discerned with QCA, are typical and which ones are deviant. We also spell out which of the potentially many typical and deviant cases should be chosen for either single‐case or comparative within‐case analysis and what the analytic goal of process tracing can (and cannot) be in these different forms of comparison. We will use a set of functions in the updated R package SetMethods.

  • 21.2.1. Carsten Q. Schneider and Ingo Rohlfing. 2013. “Combining QCA and Process Tracing in Set‐Theoretic Multi‐Method Research.” Sociological Methods and Research 42(4): 559–97.
  • 21.2.2. Ingo Rohlfing and Carsten Q. Schneider. 2013. “Improving Research On Necessary Conditions.” Political Research Quarterly 66(1): 220–35.

Recommended:

  • 21.2.3. Charles C. Ragin and Garrett Andrew Schneider. 2011. “Case‐Oriented Theory Building and Theory Testing.” In The SAGE Handbook of Innovations in Social Research Methods, eds. Malcolm Williams and W. Paul Vogt. London: Sage, 150–66.
  • 21.2.4. Rihoux, Benoit, and Bojana Lobe. 2009. “The Case for Qualitative Comparative Analysis (QCA): Adding Leverage for Thick Cross‐Case Comparison.” In Sage Handbook Of Case‐Based Methods, eds. David Byrne and Charles Ragin. London: Sage, 222–42.
  • 21.2.5. Ingo Rohlfing and Carsten Q. Schneider. 2016. “A Unifying Framework for Causal Analysis in Set‐Theoretic Multi‐Method Research.” Sociological Methods & Research.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Examples of Advanced Applied QCA

Charles Ragin, University of California, Irvine and Carsten Schneider, Central European University, Budapest

This session reviews several applications of set‐analytic methods. Our goal is to illustrate the utility and flexibility of the approach, as well as its tight coupling with theoretical concepts. We include a large‐N application to illustrate issues in applying QCA to such data.

  • 21.3.1. Charles C. Ragin, Redesigning Social Inquiry: Fuzzy Sets and Beyond. University of Chicago Press, 2008, chapters 10, 11. (book to purchase)
  • 21.3.2. Charles Ragin and Peer Fiss, Intersectional Inequality: Race, Class, Test Scores and Poverty. Book manuscript, chapters 6 and 7.
  • 21.3.3. Carsten Q. Schneider and Kristin Makszin. 2014. “Forms of Welfare Capitalism and Education‐Based Participatory Inequality.” Socio‐Economic Review 12(2): 437‐62

Recommended:

  • 21.3.4. Corinne Bara, 2014. “Incentives and Opportunities: A Complexity‐Oriented Explanation of Violent Ethnic Conflict.” Journal of Peace Research 51(6): 696–710.
  • 21.3.5. Eva Thomann, 2015. “Customizing Europe: Transposition as Bottom‐up Implementation.” Journal of European Public Policy 22(10): 1368–87.
  • 21.3.6. David Kuehn et al. 2016. “Conditions of Civilian Control in New Democracies: An Empirical Analysis of 28 ‘Third Wave’ Democracies.” European Political Science Review: 1–23.

Wednesday, June 22, Module 22, Archival Research and Elite Interviewing – James Goldgeier, Andrew Moravcsik, and Elizabeth Saunders

Archival and Interview Research with Primary Sources: What Do You Need to Know, How Do You Know Where to Look, and How Do You Get What You Need?

In this module, we will discuss how political scientists decide they need to use primary records of policy‐making—archives, interviews, and published primary sources—in their research. This includes how one prepares for, structures, conducts, and manages the information flow from
archival visits, interviews or structured examination of published materials. We focus on practical research skills scholars can use, and judgments they must make in everyday research. We conclude with a discussion of making qualitative research transparent.

8:45am ‐ 10:15am Selecting and Preparing for Archival and Interview Research

This session highlights the practical trade‐offs between different types of textual and interview research and the ways in which one must prepare for them. It focuses on issues to think about before you start your research. We will talk about different types of repositories, briefly explain how to use the Freedom of Information Act, and discuss strategies for maximizing the output of interviews.

  • 22.1.1. Fred I. Greenstein and Richard H. Immerman, “What Did Eisenhower Tell Kennedy About Indochina? The Politics of Misperception,” Journal of American History 79(2) (September 1992): 568‐587.
  • 22.1.2. Cameron Thies, “A Pragmatic Guide to Qualitative Historical Analysis in the Study of International Relations,” International Studies Perspectives 3(4) (November 2002): 351‐372.
  • 22.1.3. Ian Lustick, “History, Historiography, and Political Science: Multiple Historical Records and the Problem of Selection Bias,” American Political Science Review 90(3) (September 1996): 605‐618.
  • 22.1.4. Marc Trachtenberg, The Craft of International History (Princeton University Press, 2006), Appendix I and Appendix II. Available at
    http://www.sscnet.ucla.edu/polisci/faculty/trachtenberg/methbk/AppendixI.html
    http://www.sscnet.ucla.edu/polisci/faculty/trachtenberg/methbk/AppendixII.html

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Structuring Your Data Collection: Making Sure You Can Use What You Find

This session will address concerns that arise during your research. We will discuss hands‐on electronic strategies for structuring, organizing, and storing your oral and documentary data so that you can easily and systematically access it as you move to the analysis and writing phase of your project. The process of structuring your data begins before you leave for the archives, and informs how you conduct your research in the archives and your analysis of documents when you get home.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm – 5:30pm DART and the Recent Transparency Debate in Political Science

This session addresses concerns that arise when presenting research. Researchers, methodologists, journals, funders and regulators are encouraging scholars to make their evidence, interpretations and research procedures more transparent, as is increasingly the norm for journalists, government officials, policy analysts, bloggers, and scholars in other academic disciplines. Active citation, a system of digitally enabled appendices, is the most viable general strategy to cost‐effectively increase the transparency of qualitative work. Data Access and Research Transparency (DART), an informal initiative designed to enhance the openness of political science research, has generated considerable support in the profession, but also intense criticism and opposition. In this session, we will conduct an open debate on the merits of transparency norms and their optimal form. Participants are encouraged to read carefully so as to engage fully in the debate.

  • 22.3.1. "Guidelines for Data Access and Research Transparency in Qualitative Political Science," (Draft Document of APSA Ethics Committee Report, 2013).
  • 22.3.2. Jeffrey Isaac, "For a More Public Political Science," Perspectives on Politics 13:2 (June 2015), pp. 269‐283.
  • 22.3.3. Andrew Moravcsik, "Qualitative Transparency: Pluralistic, Humanistic and Policy-Relevant," International History and Politics Newsletter (APSA) 1:2 (Winter 2016), pp. 12‐18.

Recommended:

  • 22.3.4. Colin Elman and Arthur Lupia, "Openness in Political Science: Data Access and Research Transparency," PS: Political Science & Politics 47:1 (January 2014), pp. 19‐42.
  • 22.3.5. Andrew Moravcsik, "One Norm, Two Standards: Realizing Transparency in Qualitative Political Science," The Political Methodologist 22:1 (Fall 2014), pp. 3‐9.
  • 22.3.6. Andrew Moravcsik, “Trust, but Verify: The Transparency Revolution and Qualitative International Relations,” Security Studies 23:4 (2014), pp. 663‐688.
  • 22.3.7. Website: “Qualitative Transparency Deliberations on Behalf of the APSA Section for Qualitative and Multi‐Method Research” at https://www.qualtd.net/
  • 22.3.8. “Symposium: Transparency in Qualitative and Multi‐Method Research,” Qualitative and Multi‐Method Research Newsletter 13:1 (Spring 2015).

Wednesday, June 22, Module 23, Designing and Conducting Field Research: Collecting and Analyzing Data – Diana Kapiszewski and Lauren MacLean

This module discusses a range of data‐collection techniques, emphasizes the importance of engaging in data collection and data analysis simultaneously, and offers multiple strategies for engaging in analysis in the field. Each session of this module is conducted with the understanding that participants have carefully read the assigned materials. The instructors will present key points drawing on the readings and their collective experiences in managing fieldwork’s diverse challenges, and will then facilitate discussion of concepts and ideas in small
and large groups. Students will also have an opportunity to practice using the data‐collection techniques discussed and research tools presented.

8:30am ‐ 10:00am More‐Interactive Forms of Data Collection

Diana Kapiszewski, Georgetown University
Lauren M. MacLean, Indiana University

This session considers the differences among, unique features of, and benefits and challenges inherent in employing several more‐interactive forms of data collection including participant observation, ethnography, surveys, and experiments.

  • 23.1.1. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Site‐Intensive Methods: Ethnography and Participant Observation,” Chapter Seven in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to
    purchase)
  • 23.1.2. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Surveys in the Context of Field Research,” Chapter Eight in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to purchase)
  • 23.1.3. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Experiments in the Field,” Chapter Nine in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to purchase)

Recommended

  • 23.1.4. Ellen Pader, “Seeing with an Ethnographic Sensibility: Explorations Beneath the Surface of Public Policies.” In Dvora Yanow and Peregrine Schwartz‐Shea, eds., Interpretation and Method: Empirical Research Methods and the Interpretive Turn (M.E. Sharpe, 2006).
  • 23.1.5. Lisa Wedeen, “Reflections on Ethnographic Work in Political Science,” Annual Review of Political Science 13 (2010): 255‐272.
  • 23.1.6. Jan Kubik, “Ethnography of Politics: Foundations, Applications, Prospects.” In Edward Schatz, ed., Political Ethnography (University of Chicago Press, 2009), pp. 25‐52.
  • 23.1.7. Henry E. Brady, “Contributions of Survey Research to Political Science,” PS: Political Science and Politics 33(1) (March 2000): 47‐57.
  • 23.1.8. Nora Cate Schaeffer and Stanley Presser, “The Science of Asking Questions,” Annual Review of Sociology 29(1) (December 2003): 65‐88.
  • 23.1.9. Seymour Sudman and Norman M. Bradburn, Asking Questions: A Practical Guide to Questionnaire Design (Jossey‐Bass, 1982).
  • 23.1.10. Paluck, Elizabeth Levy, “The Promising Integration of Qualitative Methods and Field Experiments,” The ANNALS of the American Academy of Political and Social Science 628(1) (March 2010): 59‐71.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Interviewing

Diana Kapiszewski, Georgetown University
Lauren M. MacLean, Indiana University

This session explores various types of interviewing including one‐on‐one in‐depth interviews, oral histories, and focus groups. We consider the many challenges and opportunities that conducting interviews in the field entails and offer a range of practical advice.

  • 23.2.1. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Interviews, Oral Histories, and Focus Groups,” Chapter Six in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to purchase)
  • 23.2.2. Erik Bleich and Robert Pekkanen, “How to Report Interview Data.” In Layna Mosley, ed., Interview Research in Political Science (Cornell University Press, 2013), pp. 84‐105.
  • 23.2.3. Joe Soss, “Talking Our Way to Meaningful Explanations: A Practice‐Centered View of Interviewing for Interpretive Research.” In Dvora Yanow and Peregrine Schwartz‐Shea, eds., Interpretation and Method: Empirical Research Methods and the Interpretive Turn (M.E.
    Sharpe, 2006), pp. 127‐149.

Recommended

  • 23.2.4. Beth Leech and Kenneth Goldstein, “Symposium: Interview Methods in Political Science,” PS: Political Science and Politics 35(4) (December 2002): 663‐672.
  • 23.2.5. Susan E. Short, “Focus Groups.” In Ellen Perecman and Sara Curran, eds., A Handbook for Social Science Field Research: Essays & Bibliographic Sources on Research Design and Methods (Sage, 2006), pp. 103‐115.
  • 23.2.6. Herbert Rubin and Irene Rubin, Qualitative Interviewing: The Art of Hearing Data, 2nd ed. (Sage, 2005), Chapters 6‐9.
  • 23.2.7. Oisin Tansey, “Process Tracing and Elite Interviewing: A Case for Non‐Probability Sampling,” PS: Political Science and Politics 40(4) (October 2007): 765‐772.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm Analyzing, Re‐Tooling, and Assessing Progress

Diana Kapiszewski, Georgetown University
Lauren M. MacLean, Indiana University

This session considers various strategies for engaging in data analysis, writing, and presenting initial findings to different audiences while conducting fieldwork. It also considers how fieldworkers can retool their projects in the field and assess their progress toward completing field research.

  • 23.3.1. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Analyzing, Writing, and Retooling in the Field,” Chapter Ten in Field Research in Political Science: Practices and Principles (Cambridge University Press, 2015). (Book to purchase)
  • 23.3.2. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Reconceptualizing Field Research,” Unpublished manuscript.
  • 23.3.3. Robert Emerson, Rachel Fretz, and Linda Shaw, “Processing Fieldnotes: Coding and Memoing” in Writing Ethnographic Fieldnotes (University of Chicago Press, 1995), pp. 142‐168.

Recommended

  • 23.3.4. Gilbert Shapiro and John Markoff, “A Matter of Definition.” In Carl Roberts, ed., Text Analysis for the Social Sciences: Methods for Drawing Statistical Inferences from Texts and Transcripts (Lawrence Erlbaum, 1997).
  • 23.3.5. Rose McDermott et al., “Symposium: Data Collection and Collaboration,” PS: Political Science and Politics 43(1) (January 2010): 15‐58.

Thursday, June 23, Module 24 – Mixed‐Method Research and Causal Mechanisms, Part I, Nick Weller and Jeb Barnes

8:45am to 10:15am Introduction

This session introduces participants to different styles of mixed‐method research, their underlying assumptions, and how these assumptions relate to the study of causal mechanisms.

  • 24.1.1. Nicholas Weller and Jeb Barnes, Finding Pathways: Mixed‐Method Research for Studying Causal Mechanisms (Cambridge University Press, 2014), Chapters 1‐2. (Book to purchase)
  • 24.1.2. John Gerring, “Causal Mechanisms: Yes, But...,” Comparative Political Studies 43(11) (November 2010): 1499‐1526.

Recommended

  • 24.1.3. John Gerring, “Is There a (Viable) Crucial‐Case Method?,” Comparative Political Studies 40(3) (March 2007): 231‐253.

10:15‐10:45am Coffee Break

10:45am‐12.30pm Research Design Discussion Sessions

12:30pm – 2:00pm Lunch

2:00pm – 3:30pm. Selecting Cases for Pathway Analysis

This session will present a set of general steps for selecting cases for pathway analysis, guiding scholars in how to read the relevant literature, how to identify relevant research questions, and how to think about the types of cases that are relevant given the extant literature and research questions.

  • 24.2.1. Nicholas Weller and Jeb Barnes, Finding Pathways: Mixed‐Method Research for Studying Causal Mechanisms (Cambridge University Press, 2014), Chapter 3‐4. (Book to purchase)

Recommended

  • 24.2.2. Derek Beach and Rasmus Brun Pedersen, “Case Selection Techniques in Process Tracing and the Implications of Taking the Study of Causal Mechanisms Seriously.” Working paper (2012).

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm Case selection using regression

This session introduces participants to the use of regression to select cases. We will discuss both the benefits and the pitfalls of this approach, and walk through multiple examples. The examples include both cross‐sectional data and panel data so that we can explore case
selection in both instances.

  • 24.3.1. Nicholas Weller and Jeb Barnes, Finding Pathways: Mixed‐Method Research for Studying Causal Mechanisms (Cambridge University Press, 2014), Chapter 4‐5. Chapter 4 is germane to both sessions 24.2 and 24.3. (Book to purchase)

Recommended

  • 24.3.2. Kenneth F. Schulz and David A Grimes, “Case‐control studies: research in reverse,” Lancet 359(9304) (February 2002): 431‐434.

Thursday, June 23, Module 25, CAQDAS I: Introduction to Atlas.ti – Robert Rubinstein

Overall Description

In this module participants will be introduced to atlas.ti for qualitative data analysis. The module will present the program and the general principles of its design. Using a sample project, participants will use the program to set up a research project for analysis and work with the data management functions of the software. Participants will explore the different ways atlas.ti facilitates the coding of project data, and how atlas.ti supports analysis of research materials. They will practice querying and producing a variety of outputs from a sample project.

On the second day of the module participants will set up an atlas.ti project and begin coding and analysis with their own data sets. Although one can work with sample projects throughout, to get the most out of this module participants should bring with them material from their own research. These can be interview transcripts, observational notes, documents and reports, or focus group documents.

This is not an introduction to a particular style of research or coding, so participants should come with an idea as to which approach to data coding and analysis they plan to use for their project. On the second day of the module participants will work collectively and individually to
create, code, query, and report information from their projects.

The articles listed below are useful background:

  • 25.1.1. Nancy L. Leech and Anthony J. Onwuegbuzie. 2007. “An Array of Qualitative Data Analysis Tools: A Call for Data Triangulation.” School Psychology Quarterly 22(4): 557‐584.
  • 25.1.2. Janis Marshall and Harris L. Friedman. 2012. “Human versus Computer‐Aided Qualitative Data Analysis Ratings: Spiritual Content in Dream Reports and Diary Entries.” The Humanistic Psychologist 40: 329‐342.
  • 25.1.3. Matthew B. Miles, A. Michael Huberman, and Johnny Saldaña. 2014. “Chapter 4: Fundamentals of Qualitative Data Analysis.” In Qualitative Data Analysis: A Methods Sourcebook, 3rd Edition. London, UK: Sage Publications, pp. 69‐104.
  • 25.1.4. Johnny Saldaña. 2013. “Chapter 1: An Introduction to Codes and Coding” and “Appendix A: A Glossary of Coding Methods.” In The Coding Manual for Qualitative Researchers, 2nd Edition. London, UK: Sage Publications, pp. 1‐40 and 261‐268.
  • 25.1.5. David Silverman. 2014. “Chapter 5: Data Analysis.” In Interpreting Qualitative Data, 5th Edition. London, UK: Sage Publications, pp. 110‐137.

8:45am ‐ 10:15am / Session 1: Why CAQDAS / Meet atlas.ti

We will discuss the ways in which computer assisted qualitative analysis both reflects and improves upon traditional approaches to qualitative data analysis. We will then turn to looking at how atlas.ti in particular does this. During this first session we will explore the particular language used by atlas.ti and discuss issues of database management. We will set up a project and enter sample data into it.

What is atlas.ti
Terminology particular to atlas.ti
Organization of an atlas.ti project: The Hermeneutic Unit
Getting data into the HU—Primary Documents (PDs)
Working with PDs

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm‐3:00pm / Session 2

In this portion of the module we will continue working with database entry and management. We will explore the different objects that atlas.ti allows one to create—documents, codes, quotations, and memos. We will practice using the object managers to create primary document groupings, called families. Working with the primary document manager introduces skills that apply to the creation of other kinds of families. We will begin exploring the processes that atlas.ti makes available for classifying data, looking at the underlying logic used, and practicing with these tools.

Creating document families
Coding in atlas.ti
Word Cruncher
The code manager
Free quotations
Codes
Auto coding

3:30pm‐4:00pm / Coffee Break

4:00pm‐5:30pm / Session 3

We continue exploring how codes are implemented in atlas.ti, and discuss and practice organizing codes to facilitate data exploration. During this portion of the module we will begin using some of the tools that atlas.ti provides for investigating patterns within a dataset and supporting the development of analyses, and we will create reports using these tools. We will begin working with the Query Tool, which allows for the construction of complex queries of the dataset. At the end of the day we will conclude by preparing our sample projects so that they can be safely transferred to other computers.

Codes and code books
Structuring codes / coding schemes
Simple retrieval
Creating reports
Primary document‐Code concurrence
Exploring the Query Tool: Complex retrieval
The Copy Bundle

Thursday, June 23, Module 26, Geographic Information Systems I: Introduction to GIS as a Qualitative Research Method – Jonnell Robinson

8:30am ‐ 10:00am Introduction and Case Studies in Qualitative GIS

Jonnell Robinson, Syracuse University

This session will introduce participants to GIS as a tool for qualitative research, present basic GIS terminology and concepts, and survey the basic functions of ESRI’s ArcGIS software suite, particularly those functions most commonly used by social scientists.

  • 26.1.1. Samuel F. Dennis Jr. “Prospects for qualitative GIS at the intersection of youth development and participatory urban planning.” Environment and Planning A 38 (2006): 2039‐2054. Web. 4 Apr. 2015.
  • 26.1.2. National Geographic. “Geographic Information Systems" http://education.nationalgeographic.com/education/encyclopedia/geographic‐informationsystem‐gis/?ar_a=1. Web. 4 Apr. 2015.
  • 26.1.3. Sam Sturgis. “Kids in India Are Sparking Urban Planning Changes by Mapping Slums.” CityLab, The Atlantic. Feb 19, 2015.
  • 26.1.4. Pamela Wridt. “A qualitative GIS approach to mapping urban neighborhoods with children to promote physical activity and child‐friendly community planning.” Environment and Planning B: Planning and Design. 37. (2010): 129‐147. Web. 4 Apr. 2014.

Further:

  • 26.1.5. Meghan Cope and Sarah Elwood. Qualitative GIS: A Mixed Methods Approach. Thousand Oaks, California: SAGE Publications Inc., 2009. Print.
  • 26.1.6. William J. Craig, Trevor M. Harris, and Daniel Weiner. Community Participation and Geographic Information Systems. London/New York: Taylor & Francis, 2002. Print.
  • 26.1.7. Steven J. Steinberg. GIS: Geographic Information Systems for the Social Sciences: Investigating Space and Place. Thousand Oaks, California: SAGE Publications, 2006. Print.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm ‐ 3:30pm Basic GIS Functions

Jonnell Robinson, Syracuse University

This session will explore basic visualization and analytical functions such as building and querying attribute tables, selecting map features, and symbolizing data.

  • 26.2.1. Michael Batty. 2003. “Using Geographical Information Systems.” In Key Methods in Geography, edited by Nicholas J. Clifford and Gill Valentine, 409‐423. London: SAGE Publications.
  • 26.2.2. Juliana Maantay and John Ziegler. 2006. GIS for the Urban Environment. Redlands: ESRI Press. Read Pages 8‐19 and 57‐86.
  • 26.2.3. Andy Mitchell. 1999. The ESRI Guide to GIS Analysis. Volume 1: Geographic Patterns and Relationships. Redlands: Environmental Systems Research Institute, Inc. Read Pages 9‐19.

Recommended

  • 26.2.4. David Allen. GIS Tutorial 2: Spatial Analysis Workbook, 10.1 Edition. Redlands, California: ESRI Press, 2013. Print.
  • 26.2.5. David W. Allen and Jeffery M. Coffey. GIS Tutorial 3: Advanced Workbook, 10.0 Edition. Redlands, California: ESRI Press, 2010. Print.
  • 26.2.6. Wilpen L. Gorr and Kristen S. Kurland. GIS Tutorial 1: Basics Workbook, 10.1 Edition. Redlands, California: ESRI Press, 2013. Print.

3:30pm ‐ 4:00pm Coffee Break.

4:00pm ‐ 5:30pm GIS Data Sources and Data Integration

Jonnell Robinson, Syracuse University

This session will review the types and sources of data that are available for GIS users working in both data rich and data poor settings, the ethics of using mapping in research, how metadata can be used to communicate qualitative information, and data overlay analysis.

  • 26.3.1. Jin‐Kyu Jung and Sarah Elwood. “Extending the qualitative capabilities of GIS.” Transactions in GIS. 14. (2010): 63‐87. Web. 4 Apr. 2015.
  • 26.3.2. Giacomo Rambaldi, Robert Chambers, Mike McCall and Jefferson Fox. 2006. “Practical Ethics for PGIS Practitioners, Facilitators, Technology Intermediaries and Researchers.” In Participatory Learning and Action, 106‐113.
  • 26.3.3. Steven J. Steinberg and Sheila L. Steinberg. 2006. GIS for the Social Sciences: Investigating Place and Space. Thousand Oaks: SAGE Publications. Read Chapter 2.

Recommended

  • 26.3.4. Ian N. Gregory. A Place in History: A Guide to Using GIS in Historical Research. 2nd ed. Belfast, Northern Ireland: Centre for Data Digitisation and Analysis, 2005. Web. 4 Apr. 2014.
  • 26.3.5. Mark Monmonier. How to Lie With Maps. 2nd ed. Chicago, Illinois: University of Chicago Press, 1996. Print.

Thursday, June 23, Module 27: Interpretation and History I: Discourse Analysis and Intellectual History, Thomas Dodman and Daragh Grant

This module introduces students to methods of discourse analysis employed by political theorists and historians of political thought and to critical approaches to intellectual history. Building on earlier modules on discourse analysis, the first session will introduce participants to different approaches to “reading” texts, and will examine debates over meaning, concepts, context, and the explanation of historical change, as well as engaging with ongoing debates about the politics of historiography. We will discuss the techniques of the Cambridge school and the German tradition of Begriffsgeschichte (concept history). Participants will engage in a practical exercise of concept analysis during the second session of the day, and we will discuss their findings, and the methodological challenges they encountered, in the final session of the day.

In both modules on Interpretation and History, we expect students to come to the sessions having completed all of the required readings. These two sessions will be conducted in the style of an academic seminar rather than in lecture form, with a view to allowing your research interests to shape our discussion of the readings.

8:45 am to 10:15am Interpretive debates in intellectual history

This session considers two important traditions in the history of political thought by introducing participants to the work of Quentin Skinner and the Cambridge school of intellectual history, and to Reinhart Koselleck and the techniques of Begriffsgeschichte (or concept history). We will consider, among other things: How does one go about reconstructing the questions that a given author is asking? What are illocutionary acts, and why do they matter? To what extent are texts and the ideas they formulate related to specific historical contexts? And how do texts relate to practices of power and domination? We will also investigate what a concept is, how it comes into being, and in what relation it stands to the social world. In both cases, we will try to ascertain the advantages and limitations of the approach to discourse analysis, a conversation that will continue into the final session of the day.

  • 27.1.1. Quentin Skinner, “Meaning and Understanding in the History of Ideas,” History and Theory, 8 (1969): 3‐53.
  • 27.1.2. Reinhart Koselleck, “Introduction and Prefaces to the Geschichtliche Grundbegriffe,” trans. Michaela Richter, Contributions to the History of Concepts, 6 (2011), 1‐37.
  • 27.1.3. Reinhart Koselleck, “Historical Criteria of the Modern Concept of Revolution,” in Futures Past, trans. Keith Tribe (New York: Columbia University Press, 2004), 43‐57.

10:15‐10:45am Coffee Break

10:45am‐12.30pm Research Design Discussion Sessions

12:30pm – 2:00pm Lunch

2:00pm – 3:30pm Practical exercise (Syracuse University Library)

During this session, you will be asked to work collaboratively (in groups of up to 4 participants) to develop a history of a particular concept. You will use the resources available to you at Syracuse University Library and online to investigate the meaning of the concept in question, how it has changed over time, and the kinds of conceptual challenges that these changes pose for scholars doing historical work. We would ask you to make note not only of this concept history, but also of the challenges you faced when attempting to investigate it. Naturally, the limited time you will have available to complete this task will pose a significant constraint, but the goal is for you to come face to face with some of the challenges of this kind of work. Groups will be able to choose one of four concepts, which we will hand out in the first session of the day. We hope that by the third session the similarities and divergences in your respective experiences will allow for a fruitful debriefing and discussion of the methods of intellectual history.

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm Debrief and further discussion

Participants will have some time at the beginning of this session to finish writing up their findings from the morning. We will then discuss the exercise in the light of the morning’s readings and of Hayden White’s analysis of the various ways in which you might “emplot” your
research findings.

  • 27.3.1. Hayden White, “Introduction: The Poetics of History,” in Metahistory: The Historical Imagination in 19th‐Century Europe (Baltimore: Johns Hopkins University Press, 1973). 1‐42.

Suggested further readings

J.L. Austin, How to do Things with Words (Cambridge: Harvard University Press, 1962).

Pierre Bourdieu, Language and Symbolic Power, ed. by John Thompson, trans. by Gino Raymond and Matthew Adamson (Cambridge: Polity, 1991), 107‐137.

R. G. Collingwood, An Autobiography (Oxford: Oxford University Press, 1951), 27‐43.

Reinhart Koselleck, “Begriffsgeschichte and Social History,” in Futures Past: On the Semantics of Historical Time, trans. by Keith Tribe (New York: Columbia University Press, 2004), 75‐92.

Reinhart Koselleck, The Practice of Conceptual History: Timing History, Spacing Concepts, trans. by Todd Samuel Presner and others (Stanford: Stanford University Press, 2002).

Dominick LaCapra, “Rethinking Intellectual History and Reading Texts,” in Rethinking Intellectual History: Texts, Contexts, Language (Ithaca: Cornell University Press, 1983), 23‐71.

Melvin Richter, “Begriffsgeschichte and the History of Ideas,” Journal of the History of Ideas, 48 (1987): 247‐263.

Quentin Skinner, “The rise of, challenge to, and prospects for a Collingwoodian approach to the history of political thought,” in The History of Political Thought in National Context, eds. Dario Castiglione and Iain Hampsher‐Monk (Cambridge: Cambridge University Press, 2001), 175‐88.

Friday, June 24, Module 28 – Mixed‐Method Research and Causal Mechanisms, Part II, Nick Weller and Jeb Barnes

This module continues with the material from Part 1 (Module 24).

8:45am ‐ 10:15am Case selection using matching

This session introduces participants to the use of matching as a way to select cases for mixed‐method research. We will discuss matching at a general level and then turn to how to use matching to select cases.

  • 28.1.1. Nicholas Weller and Jeb Barnes, Finding Pathways: Mixed‐Method Research for Studying Causal Mechanisms (Cambridge University Press, 2014), Chapter 6. (Book to purchase)

Recommended

  • 28.1.2. Elizabeth Stuart, “Matching Methods for Causal Inference: A Review and a Look Forward,” Statistical Science 25(1) (February 2010): 1‐21.
  • 28.1.3. Richard Nielsen, “Case Selection via Matching,” Sociological Methods and Research (forthcoming).

10:15‐10:45am Coffee Break

10:45am‐12.30pm Research Design Discussion Sessions

12:30pm – 2:00pm Lunch

2:00pm – 3:30pm Contextualizing and extending prior research

This session will discuss how to use large‐N methods to contextualize prior research when scholars are building directly on results from other studies. In particular, we will consider issues related to knowledge accumulation across multiple case studies.

  • 28.2.1. Nicholas Weller and Jeb Barnes, Finding Pathways: Mixed‐Method Research for Studying Causal Mechanisms (Cambridge University Press, 2014), Chapters 7‐8.
  • 28.2.2. Michael Ross, “How Do Natural Resources Influence Civil War? Evidence from Thirteen Cases,” International Organization 58(1) (Winter 2004): 35‐67.

Recommended

  • 28.2.3. Karen Lutfey and Jeremy Freese, “Toward Some Fundamentals of Fundamental Causality: Socioeconomic Status and Health in the Routine Clinic Visit for Diabetes,” American Journal of Sociology 110(5) (March 2005): 1326‐1372.

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm Applying Pathway Analysis

In this session, we will field questions about student projects in light of the material we’ve discussed in the prior sessions.

Friday, June 24, Module 29 – CAQDAS II: Introduction to Atlas.ti — Robert Rubinstein

8:45am ‐ 10:15am / Session 1

Atlas.ti provides a variety of tools for enhancing the analytic engagement with one’s dataset, and we will use some of those tools in this portion of the module. We will continue looking at the Query Tool, and then consider how the process of creating memos can support a researcher’s analytic processes. We will use memos to create a record of our theoretical thinking, explore how memos can be used as a field journal, and discuss other uses of memos. We will explore more complex ways of working with codes, and, if time permits, some of the theory‐building capacities of atlas.ti, especially the ability to visualize data through the construction of semantic networks.

Working with memos
Working with families and codes
Networks and theory building
Advanced features

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm‐3:30pm / Session 3

During the final two sessions of this module, participants will work on their own research projects. Although this is not a workshop on research design or on how to code, it will be useful for participants to discuss their coding approaches and project designs with one another. As they work on their projects, participants will have the opportunity to consult with one another about strategies for implementing Atlas.ti in their research.

Create Atlas.ti projects for individual research projects
Collective discussion of project structures
Approaches to coding

3:30pm‐4:00pm / Coffee Break

4:00pm‐5:30pm / Session 4

Work with individual projects
Create initial reports

Friday, June 24, Module 30 Geographic Information Systems II: Exploring Analytic Capabilities – Jonnell Robinson

8:45am ‐ 10:15am Open Source Mapping Tools

Jonnell Robinson, Syracuse University

This session will introduce open source geovisualization and analysis tools including Open Street Map, Google My Maps, and QGIS.

  • 30.1.1. Mordechai Haklay and Patrick Weber. “OpenStreetMap: User‐Generated Street Maps.” Pervasive Computing. 2008: 7(4) 12‐18.
    http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4653466 (accessed April 2015).
  • 30.1.2. Sophia B. Liu and Leysia Palen. “The New Cartographers: Crisis Map Mashups and the Emergence of Neogeographic Practice.” Cartography and Geographic Information Science 2010: 37(1) 69‐90.
  • 30.1.3. Stefan Steiniger and Erwan Bocher. “An Overview on Current Free and Open Source Desktop GIS Developments.” International Journal of Geographical Information Science. 2009:23(10) 1345‐1370.

Further Readings

  • 30.1.4. Sarah Elwood, Michael F. Goodchild and Daniel Z. Sui. “Researching Volunteered Geographic Information: Spatial Data, Geographic Research, and New Social Practice.” Annals of the Association of American Geographers 2012:102(3) 571‐590.

10:15am ‐ 10:45am Coffee Break.

10:45am ‐ 12:30pm Research Design Discussion Sessions (not part of Module).

12:30pm ‐ 2:00pm Lunch.

2:00pm – 3:30pm GIS Data Collection: Digitizing Archival Maps, Collecting GPS Point Locations, Counter and Sketch Mapping, and Spatial Data Repositories

Jonnell Robinson, Syracuse University

This session will demonstrate valuable data collection techniques for archival research, field work, and participatory and community‐based mapping, as well as the availability and accessibility of spatial data through data repositories. “Heads‐up” digitizing (turning print maps into a digital GIS map), integrating GPS receiver data into GIS, and sketch map digitization will be demonstrated. Downloading spatial data from web‐based repositories for integration into GIS will also be discussed.

  • 30.2.1. Lynn Heasley, “Shifting Boundaries on a Wisconsin Landscape: Can GIS Help Historians Tell a Complicated Story?” Human Ecology. 2003:31(2) 183‐213.
  • 30.2.2. Nancy Lee Peluso. “Whose Woods are These? Counter‐Mapping Forest Territories in Kalimantan, Indonesia.” Antipode 1995:27(4) 383‐406.

Further Readings

  • 30.2.3. William J. Craig, Trevor M. Harris, and Daniel Weiner. Community Participation and Geographic Information Systems. London/New York, New York: Taylor & Francis Inc., 2002. Print.
  • 30.2.4. Ian N. Gregory, A Place in History: A Guide to Using GIS in Historical Research. 2nd ed. Belfast, Northern Ireland: Centre for Data Digitisation and Analysis, 2005. http://www.researchgate.net/profile/Ian_Gregory2/publication/228725974_A_place_in_history_A_guide_to_using_GIS_in_historical_research/links/547726620cf29afed614470b.pdf. (accessed April 2015).
  • 30.2.5. John Pickles, Ground Truth: The Social Implications of Geographic Information Systems. New York, New York: The Guilford Press, 1995. Print.
  • 30.2.6. Denis Wood, The Power of Maps. New York, New York: The Guilford Press, 1992. Print.

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm Map Design

Jonnell Robinson, Syracuse University

This session will provide an overview of basic map design, integrating narrative and photos with GIS, and a discussion about why, how and where to further hone GIS skills.

  • 30.3.1. Aileen Buckley, Kenneth Field, and Esri. “Making a Meaningful Map.” ESRI ‐ GIS Mapping Software, Solutions, Services, Map Apps, and Data. http://www.esri.com/news/arcuser/0911/making‐a‐map‐meaningful.html (accessed April 2015).

Further Readings
  • 30.3.2. Cynthia A. Brewer, Designing better maps: a guide for GIS users. Redlands, California: ESRI Press, Inc., 2005.
  • 30.3.3. Heather MacDonald and Alan Peters. Urban Policy and the Census. Redlands, California: ESRI Press, Inc. 2011. Print.
  • 30.3.4. Andy Mitchell, The ESRI Guide to GIS Analysis, Vol. 1: Geographic Patterns & Relationships. Redlands, California: ESRI Press, Inc., 1999. Print.
  • 30.3.5. Andy Mitchell, The ESRI Guide to GIS Analysis, Vol. 2: Spatial Measurements & Statistics. Redlands, California: ESRI Press, Inc., 2005. Print.
  • 30.3.6. Andy Mitchell, The ESRI Guide to GIS Analysis, Vol. 3: Modeling Suitability, Movement, and Interaction. Redlands, California: ESRI Press, Inc., 2012. Print.
  • 30.3.7. Mark Monmonier, Mapping it Out: Expository Cartography for the Humanities and Social Sciences. Chicago, Illinois: University of Chicago Press, 1993. Print.

Friday, June 24, Module 31: Interpretation and History II: Interpretive Methods for Archival and Historical Research, Thomas Dodman and Daragh Grant

This module introduces students to the challenges of working with materials drawn from different social, cultural, and historical settings, and explores creative interpretive strategies for addressing these challenges. Students will be introduced to the basics of the historical method, and will be encouraged to think about how careful attention to questions of temporality can shape and reveal new avenues in their empirical research. All three sessions will be attentive to the problem of analyzing historical materials from the standpoint of the present. Shifting meanings over time, and transformations in the criteria for judgment, present particular problems for historical researchers. In light of these challenges, students will be invited to think through the strategies available for working in a partial archive, with attention to the virtues and pitfalls of thinking creatively about historical source materials.

8:45am ‐ 10:15am: History as social science: The study of structures and events

This session introduces students to the historical method, highlighting two key challenges to the study of historical events. Students will begin the session by working in groups to identify their own archival challenges, specifically related to two questions. First, how does the problem of temporality enter their work? And second, how do the events they study refashion the very structures of the societies on which their research is centered?

  • 31.1.1. William H. Sewell Jr., “Three Temporalities: Toward an Eventful Sociology,” in Logics of History: Social Theory and Social Transformation (Chicago: University of Chicago Press, 2005), 81‐123.
  • 31.1.2. Marshall Sahlins, “Structure and History,” in Islands of History (Chicago: University of Chicago Press, 1985), 136‐56.

10:15am ‐ 10:45am Coffee Break

10:45am ‐ 12:30pm Research Design Discussion Sessions

12:30pm – 2:00pm Lunch

2:00pm – 3:30pm: Practical challenges of archival research

This session will introduce students to the more mundane practical challenges that scholars face, as well as some of the hidden possibilities that await them in the course of archival research. The readings for this session are designed to give participants a sense of the importance of understanding the production of the archive itself. We will examine the questions of interpretation raised by these readings and explore how fleeting or fragmentary records might nevertheless yield a wealth of historical insights.

To conclude this session, we will invite participants to examine a brief archival fragment. The goal of this exercise will be to attempt to bring some of the discussion of the previous two days to bear on the examination of a historical document.

  • 31.2.1. Carlo Ginzburg, “Clues: Roots of a Scientific Paradigm,” Theory and Society 7 (1979): 273‐88.
  • 31.2.2. Randolph Head, “Knowing the State: The Transformation of Political Knowledge in Swiss Archives, 1450‐1770,” Journal of Modern History 75 (2003): 745‐82. [Participants can skim the more detailed discussions of Swiss history, but are asked to focus on the discussions of archival construction]

3:30pm ‐ 4:00pm Coffee Break

4:00pm ‐ 5:30pm: The Politics of Historical Interpretation

At the core of historical research are questions of evidence, of both the power of the archive and the archive of power. This session explores key debates and controversies that have shaped the considerable body of theoretically informed literature on the shifting coordinates of historical evidence.

  • 31.3.1. Michel‐Rolph Trouillot, “The Power in the Story,” in Silencing the Past: Power and the Production of History (Boston: Beacon Press, 1995), 1‐30.
  • 31.3.2. David Scott, Conscripts of Modernity: The Tragedy of Colonial Enlightenment (Durham: Duke University Press, 2004), 23‐57.

Suggested further readings

Arlette Farge, The Allure of the Archives, trans. Thomas Scott‐Railton (New Haven: Yale University Press, 2013).

Constantin Fasolt, The Limits of History (Chicago: University of Chicago Press, 2004).

Carlo Ginzburg, “Checking the Evidence: The Judge and the Historian,” Critical Inquiry 18 (1991): 79‐92.

Jan E. Goldstein, Hysteria Complicated by Ecstasy: The Case of Nanette Leroux (Princeton: Princeton University Press, 2011).

Jan E. Goldstein, “Toward an Empirical History of Moral Thinking: The Case of Racial Theory in Mid‐Nineteenth‐Century France,” American Historical Review 120 (2015): 1‐27.

Joan W. Scott, “Evidence of Experience,” in Questions of Evidence: Proof, Practice, and Persuasion across the Disciplines, eds. James Chandler, Harry Harootunian and Arnold Davidson (Chicago: University of Chicago Press, 1994), 363‐387.

William H. Sewell Jr., “History, Theory, and Social Science,” in Logics of History, 1‐21.

William H. Sewell Jr., “A Theory of the Event: Marshall Sahlins’s ‘Possible Theory of History,’” in Logics of History, 197‐224.

William H. Sewell Jr., “Historical Events as Transformations of Structures: Inventing Revolution at the Bastille,” in Logics of History, 225‐270.

Carolyn Steedman. “Something She Called a Fever: Michelet, Derrida, and Dust.” American Historical Review 106 (2001): 1159‐80.

Ann Laura Stoler, “Colonial Archives and the Arts of Governance,” Archival Science 2 (2002): 87‐109.

Ann Laura Stoler, Along the Archival Grain: Epistemic Anxieties and Colonial Common Sense (Princeton: Princeton University Press, 2009).


Center for Qualitative and Multi-Method Inquiry | Maxwell School | Syracuse University | 346 Eggers Hall | Syracuse, NY 13244-1090 | 315.443.6198