
Institute for Qualitative and Multi-Method Research – June 19-30, 2017

Schedule and Reading List

[Please note that this page provides details from the 2017 institute. While the 2018 institute is expected to be similar, there will be some revisions.]


There are three types of institute sessions: (1) Unified (whole institute) sessions; (2) research design discussion groups; and (3) elective modules. The unified sessions are on the first Monday (6/19).

The research design discussion groups will be held for two hours on most mornings of the institute. A separate schedule will be available.

There are 24 elective modules, of which participants will select six. That is, on each occasion where a choice is offered, participants select one of the modules on offer.

Choosing Which Modules to Take

While several of the 24 modules can be taken as stand-alone units, there are some limitations on selections.

Modules with higher-numbered suffixes (e.g. Content Analysis II) can usually only be taken together with the first module in the sequence (e.g. Content Analysis I). [That is, while it is often fine to take I and not II in a sequence, it is usually not possible to take II and not I.] The exception to this rule is Module 6 (Qualitative Methods for Causal Analysis II). (It is also possible to take Module 14 Geographic Information Systems (GIS) II without Module 10 Geographic Information Systems (GIS) I, but only if you already have some familiarity with GIS.)

Modules 17, 18, 19 and 20 are four-day modules that should each be considered as a single unit. Participants select one four-day module in week two, and remain in that module for all four days. 

Please note that participants wishing to take module 17 (multi-method research from a potential outcomes perspective) should also opt for modules 9 and 13 from week one (Multi-Method Research I and II).

Apart from these formal limitations, we should also note that several modules follow in a natural sequence and/or lend themselves to being taken as a group. For the avoidance of doubt, we outline these informal sequences simply to help you navigate the module schedule. Beyond the two limitations mentioned above, you may take whichever modules you would find most helpful.

  • Modules 2 and 6 (Qualitative Methods for Causal Analysis I and II) and Module 18 (QCA/fs I and II).
  • Module 4 (Interpretivism I), Module 8 (Interpretivism II), Modules 12 and 16 (Interpretation and History), and Module 20 (Ethnographic Methods).
  • Module 3 (Archives and Elite Interviews), Module 7 (Qualitative Data Management), and Modules 11 and 15 (Designing and Conducting Fieldwork I and II).

Books to Purchase or Otherwise Obtain

The reading for some unified sessions and modules includes a book or books that must be purchased, or borrowed from your university library [please note that they are unlikely to be available at the Syracuse University bookstore or library].  You will also see that there is some overlap:  some books are used in more than one module.

Manuscripts in Press or in Progress

To the extent possible, IQMR uses the most up-to-date readings on the methods covered at the institute. One consequence is that we are often using manuscripts that are either in press or in progress.  Please note that the authors are allowing us to use these materials as a courtesy. As with all IQMR materials, they are made available for current attendees’ use only.

 

Monday, June 19

Unified Sessions

Colin Elman, Gary Goertz, James Mahoney, Lisa Wedeen


U1 8:30am - 9:15am – Introduction

Colin Elman, Syracuse University


U2 9:15am - 10:30am – Within-Case Analysis/Process Tracing

James Mahoney, Northwestern University  

  • U.2.1. Mahoney, J. (2000). Strategies of causal inference in small-N analysis. Sociological Methods & Research, 28(4), 387-424. DOI: 10.1177/0049124100028004001

Recommended

  • U.2.2. George, A. L., & Bennett, A. (2005). Case studies and theory development in the social sciences. MIT Press.

   10:30am - 11:00am – Coffee Break

   

U3 11:00am - 12:15pm – Set Theoretic Approaches

Gary Goertz, University of Notre Dame

This session introduces the idea that logic and set theory constitute one important set of tools used in qualitative research.  

  • U.3.1. Goertz, G., & Mahoney, J. (2012). A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press. Chapters 1-2.  
  • U.3.2. Goertz, G. (In Press). Multimethod research, causal mechanisms, and case studies: an integrated approach. Princeton University Press. Chapter 1.  

Recommended  

  • U.3.3. Mahoney, J., & Vanderpoel, R. S. (2015). Set diagrams and qualitative research. Comparative Political Studies, 48(1), 65-100. DOI: 10.1177/0010414013519410

12:15pm - 2:15pm – Lunch

     

U4 2:15pm - 3:30pm – The Interpretive Approach to Qualitative Research

Lisa Wedeen, University of Chicago  

  • U.4.1. Geertz, C. (1973). Thick Description: Toward an Interpretive Theory of Culture. In The Interpretation of Cultures: Selected Essays by Clifford Geertz. Basic Books. Chapter 1, 3-30.  
  • U.4.2. Geertz, C. (1973). Deep Play: Notes on the Balinese Cockfight. In The Interpretation of Cultures: Selected Essays by Clifford Geertz. Basic Books. Chapter 15, 412-453.
  • U.4.3. Foucault, M. (1995). The Body of the Condemned. In Discipline and Punish: The Birth of the Prison. 2nd edition, Vintage Books. Chapter 1, 3-31.  
  • U.4.4. Foucault, M. (1991) Questions of Method. In Foucault, M., Burchell, G., Gordon, C., & Miller, P. (1991), The Foucault effect: Studies in governmentality. University of Chicago Press, Chapter 3, 73-86.    

3:30pm - 4:00pm – Coffee Break

   

U5 4:00pm - 5:15pm – Research Communities and the Production of Knowledge

Colin Elman, Syracuse University

 

Tuesday, June 20

Module 1 – Computer Assisted Text Analysis I

William Lowe  

These two modules are about using computers to systematically analyze text, typically as precursor, successor, or complement to a qualitative analysis. We’ll discuss and practice classical dictionary-based content analysis and its newer incarnation, topic modeling, consider how to classify large numbers of documents by topic, and show how to project their contents into rhetorical spaces for understanding and visualization.  Along the way we’ll scrape texts from the web, and discuss good ways to integrate text analysis into a variety of research designs.

We’ll presume a grasp of basic mathematical and statistical concepts and a willingness to follow along with the computational parts. The module mostly uses R and its packages. Expertise in R is not required, although some prior experience may be helpful. If there is interest we can also run an introduction to R prior to the course for those who’ve not met it before.  

If you choose this module you should bring a laptop and be ready to install some software beforehand. We’ll circulate a handout with software prerequisites before the course, and if you have any problems you can meet with us at 8pm the day before in the lobby of the Sheraton hotel and we’ll try to sort them out.

8:45am - 10:15am – Session 1   

In the first session we’ll introduce text analysis as a problem of measurement and then discuss dictionary-based content analysis in old and new style. We will focus on identifying model assumptions, learn how to deploy the output effectively in subsequent analyses, see how to validate them, and maybe even fix them when they fail.  

  • 1.1.1. Grimmer, J., & Stewart, B. M. (2013). Text as data: The promise and pitfalls of automatic content analysis methods for political texts. Political Analysis, 267-297.  DOI: 10.1093/pan/mps028
  • 1.1.2. Laver, M., & Garry, J. (2000). Estimating policy positions from political texts. American Journal of Political Science, 619-634. DOI: 10.2307/2669268  

Recommended:  

  • 1.1.3. Benoit K., et al. (2016). Crowd-sourced text analysis: Reproducible and agile production of political data. American Political Science Review.  DOI: 10.1017/S0003055416000058    
  • 1.1.4. Bara, J., Weale, A., & Bicquelet, A. (2007). Analysing parliamentary debate with computer assistance. Swiss Political Science Review, 13(4), 577-605. DOI: 10.1002/j.1662-6370.2007.tb00090.x

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Session 2  

We’ll show how to use a variety of text analysis tools for dictionary-based content analysis by replicating some of the analyses in the readings.
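
The module itself works mainly in R, but the core logic of a dictionary-based analysis is compact enough to sketch in a few lines of code. The Python snippet below is a minimal, illustrative example (the dictionary, category names, and toy documents are invented for the illustration, not taken from the readings): it counts how often each category’s keywords occur in each document and normalizes by document length.

```python
# Minimal dictionary-based content analysis (illustrative toy example).
import re
from collections import Counter

# Invented category dictionary and documents, for illustration only.
DICTIONARY = {
    "economy": {"tax", "budget", "jobs", "inflation"},
    "security": {"defence", "army", "terror", "border"},
}
documents = {
    "speech_a": "We will cut tax and create jobs while the budget shrinks.",
    "speech_b": "The army will secure the border against terror.",
}

def tokenize(text):
    """Lowercase a text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def score(text, dictionary):
    """Return each category's keyword rate per 100 tokens."""
    tokens = tokenize(text)
    counts = Counter(tokens)
    total = len(tokens)
    return {cat: 100 * sum(counts[w] for w in words) / total
            for cat, words in dictionary.items()}

for name, text in documents.items():
    print(name, score(text, DICTIONARY))
```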


3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Session 3  

In this session we’ll introduce topic models, the probabilistic generalization of the content analysis dictionaries in the previous session, and consider their advantages and disadvantages for understanding large bodies of text.  We’ll also introduce document classification methods that automate the process of assigning documents to the categories of a typology using labelled examples instead of a manually constructed dictionary or codebook. (A brief illustrative sketch follows the readings below.)

  • 1.3.1. Blei, D. M. (2012). Probabilistic topic models. Communications of the ACM, 55(4), 77-84. DOI: 10.1145/2133806.2133826
  • 1.3.2. Evans, M., McIntosh, W., Lin, J., & Cates, C. (2007). Recounting the courts? Applying automated content analysis to enhance empirical legal research. Journal of Empirical Legal Studies, 4(4), 1007-1039. DOI: 10.1111/j.1740-1461.2007.00113.x

Recommended:  

  • 1.3.3. Special issue: Poetics 41(6), (December 2013)  
  • 1.3.4. Hillard, D., Purpura, S., & Wilkerson, J. (2008). Computer-assisted topic classification for mixed-methods social science research. Journal of Information Technology & Politics, 4(4), 31-46. DOI: 10.1080/19331680801975367
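
For concreteness, here is a minimal, illustrative sketch of fitting a topic model in Python with scikit-learn; the module itself works mainly in R, and the toy corpus below is invented for the example. The point of contrast with the previous session is that the word groupings are learned from the documents rather than specified in advance in a dictionary or codebook.

```python
# Fit a tiny LDA topic model and print the top words per topic.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Invented toy corpus; in practice you would load real documents.
corpus = [
    "the budget cuts taxes and raises spending on jobs",
    "taxes and the deficit dominate the budget debate",
    "troops deploy to the border as the army expands",
    "the army requests new spending for border security",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(corpus)            # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = weights.argsort()[::-1][:5]             # highest-weight words
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```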

Tuesday, June 20

Module 2 – Qualitative Methods for Causal Analysis I

James Mahoney and Gary Goertz

Modules 2 and 6 cover many classic and standard topics of qualitative methodology, with a special focus on within-case causal inference and multimethod research.  The topics include conceptualization, process tracing, counterfactual analysis, and comparative-historical analysis.  The sessions use logic and set theory as a foundation for discussing and elucidating qualitative methods.

8:45am - 10:15am – Two Cultures: Contrasting Qualitative and Quantitative Research

Gary Goertz, University of Notre Dame  

This session contrasts an approach to qualitative and multimethod research based on the statistical paradigm with one based on within‐case causal analysis and logic.  

  • 2.1.1. Goertz, G., & Mahoney, J. (2012). A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press. Chapters 4‐6, 9, and 15. (book to purchase)  

Recommended:  

  • 2.1.2. Thiem, A., Baumgartner, M., & Bol, D. (2016). Still lost in translation! A correction of three misunderstandings between configurational comparativists and regressional analysts. Comparative Political Studies, 49(6), 742-774. DOI: 10.1177/0010414014565892

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch


2:00pm - 3:30pm – Analyzing Historical Sequences

James Mahoney, Northwestern University  

This session considers: (1) within-case causal inference methods that leverage sequences and over-time processes; and (2) temporal concepts (e.g., critical juncture, path dependence) that are used to frame sequential analyses.  

  • 2.2.1. Mahoney, J., Kimball, E., & Koivu, K. L. (2009). The logic of historical explanation in the social sciences. Comparative Political Studies, 42(1), 114-146. DOI: 10.1177/0010414008325433
  • 2.2.2. Falleti, T. G., & Mahoney, J. (2015). The Comparative Sequential Method. In Advances in Comparative-Historical Analysis. New York: Cambridge University Press.

Recommended:  

  • 2.2.3. Mahoney, J., & Thelen, K. (2015). Advances in Comparative-Historical Analysis. Cambridge: Cambridge University Press.  
  • 2.2.4. Mahoney, J & Rueschemeyer, D. (2003) Comparative Historical Analysis in the Social Sciences. New York: Cambridge University Press.  

3:30pm - 4:00pm - Coffee Break  

4:00pm - 5:30pm – The Logic of Process Tracing

James Mahoney, Northwestern University  

This session provides a framework, based on logic and set theory, for the pursuit of process tracing and the use of process-tracing tests in within-case causal inference.  

  • 2.3.2. Mahoney, J. (2012). The logic of process tracing tests in the social sciences. Sociological Methods & Research, 41(4), 570-597. DOI: 10.1177/0049124112437709  
  • 2.3.3. Barrenechea, R., & Mahoney, J. (forthcoming). A Set-Theoretic Approach to Bayesian Process Tracing. Sociological Methods & Research.

Recommended:  

  • 2.3.4. Goertz, G., & Mahoney, J. (2012). A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press. Chapters 7 & 8. (book to purchase) 
       

Tuesday, June 20

Module 3 – Archival Research and Elite Interviews 

James Goldgeier, Andrew Moravcsik, and Elizabeth Saunders


Archival and Interview Research with Primary Sources: What Do You Need to Know, How Do You Know Where to Look, and How Do You Get What You Need?

In this module, we will discuss how political scientists decide they need to use primary records of policy-making—archives, interviews, and published primary sources—in their research. This includes how one prepares for, structures, conducts, and manages the information flow from archival visits, interviews or structured examination of published materials. We focus on practical research skills scholars can use, and judgments they must make in everyday research. We conclude with a discussion of making qualitative research transparent.  

8:45am - 10:15am – Selecting and Preparing for Archival and Interview Research  

This session highlights the practical trade-offs between different types of textual and interview research and the ways in which one must prepare for them.  It focuses on issues to think about before you start your research.  We will talk about different types of repositories, briefly explain how to use the Freedom of Information Act, and discuss strategies for maximizing the output of interviews.

  • 3.1.1. Greenstein, F. I., & Immerman, R. H. (1992). What did Eisenhower tell Kennedy about Indochina? The politics of misperception. The Journal of American History, 79(2), 568-587. DOI: 10.2307/2080047
  • 3.1.2. Thies, C. G. (2002). A pragmatic guide to qualitative historical analysis in the study of international relations. International Studies Perspectives, 3(4), 351-372. DOI: 10.1111/1528-3577.t01-1-00099
  • 3.1.3. Lustick, I. S. (1996). History, historiography, and political science: Multiple historical records and the problem of selection bias. American Political Science Review, 90(3), 605-618. DOI: 10.2307/2082612
  • 3.1.4. Trachtenberg, M. (2006) The Craft of International History. Princeton University Press. Appendix I and Appendix II. Available at

http://www.sscnet.ucla.edu/polisci/faculty/trachtenberg/methbk/AppendixI.html

http://www.sscnet.ucla.edu/polisci/faculty/trachtenberg/methbk/AppendixII.html

   

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Structuring Your Data Collection: Making Sure You Can Use What You Find  

This session will address concerns that arise during your research.  We will discuss hands-on electronic strategies for structuring, organizing, and storing your oral and documentary data so that you can easily and systematically access it as you move to the analysis and writing phase of your project.  The process of structuring your data begins before you leave for the archives, and informs how you conduct your research in the archives and your analysis of documents when you get home. 

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm – 5:30pm – DART and the Recent Transparency Debate in Political Science  

This session addresses concerns that arise when presenting research. Researchers, methodologists, journals, funders and regulators are encouraging scholars to make their evidence, interpretations and research procedures more transparent, as is increasingly the norm for journalists, government officials, policy analysts, bloggers, and scholars in other academic disciplines. Active citation, a system of digitally enabled appendices, is the most viable general strategy to cost-effectively increase the transparency of qualitative work. Data Access and Research Transparency (DART), an informal initiative designed to enhance the openness of political science research, has generated considerable support in the profession, but also intense criticism and opposition. In this session, we will conduct an open debate on the merits of transparency norms and their optimal form. Participants are encouraged to read carefully so as to engage fully in the debate.  

  • 3.3.1. "Guidelines for Data Access and Research Transparency in Qualitative Political Science," (Draft Document of APSA Ethics Committee Report, 2013).

  
  • 3.3.2. Isaac, J. C. (2015). For a more public political science. Perspectives on Politics, 13(2), 269-283. DOI: 10.1017/S1537592715000031
  • 3.3.3. Moravcsik, A. (2016). Qualitative transparency: pluralistic, humanistic and policy-relevant. International History and Politics Newsletter (APSA) 1(2), 12-18.

  

Recommended:

  

  • 3.3.4. Lupia, A., & Elman, C. (2014). Openness in political science: Data access and research transparency. PS: Political Science & Politics, 47(1), 19-42. DOI: 10.1017/S1049096513001716
  • 3.3.5. Moravcsik, A. (2014). One norm, two standards: realizing transparency in qualitative political science. The Political Methodologist, 22(1), 3-9.  
  • 3.3.6. Moravcsik, A. (2014). Trust, but verify: The transparency revolution and qualitative international relations. Security Studies, 23(4), 663-688. DOI: 10.1080/09636412.2014.970846  
  • 3.3.7. Website: “Qualitative Transparency Deliberations on Behalf of the APSA Section for Qualitative and Multi-Method Research” at https://www.qualtd.net/  
  • 3.3.8. Büthe, T., & Jacobs, A. M. (2015). Symposium: Transparency in Qualitative and Multi-Method Research. Qualitative & Multi-Method Research, 13(1).

   

Tuesday, June 20

Module 4 – Interpretivism I

Lisa Wedeen and William Mazzarella  

This module provides students with an introduction to three different modes of discourse analysis. Participants will learn to "read" texts while becoming familiar with contemporary thinking about interpretation, narrative, and social construction. In these three sessions we shall explore the following methods: Foucault’s “interpretive analytics”; Wittgenstein’s understanding of language as activity and its relevance to ordinary language-use analysis (including theories of “performativity”); and an analysis of the rhetoric of cinema.  

8:45am - 10:15am – Wittgenstein and Ordinary Language-Use Analysis

Lisa Wedeen, University of Chicago  

This session introduces participants to Ludwig Wittgenstein’s thought and its relationship to ordinary language-use methods. We shall focus on several key ways in which Wittgensteinian-inspired methods can be used in ethnographic and analytical research. Among the questions we shall ask are: What is the “value added” of concentrating on language? Why is understanding language as an activity important? How can social scientists grapple with vexed issues of intention? What does “performative” mean, and how do political theories about language as performative differ from discussions of performance? How can social scientists uninterested in taking on new jargon use this kind of political theory to further their theoretical and empirical work?  

  • 4.1.1. Pitkin, H.F. (1972). Wittgenstein and Justice: On the Significance of Ludwig Wittgenstein for Social and Political Thought. University of California Press, 169-192.  
  • 4.1.2. Wedeen, L. (2008). Peripheral Visions: Publics, Power, and Performance in Yemen. University of Chicago Press. Chapter 2, chapter 3, and conclusion. (Book to purchase)  
  • 4.1.3. Wittgenstein, L. (2001). The Philosophical Investigations, G. E. M. Anscombe, trans. Blackwell Publishers. Paragraphs 1-33; paragraph 154; pages 194-195.  

10:15am - 10:45am – Coffee Break

10:45am - 12:30pm – Research Design Discussion Sessions

12:30pm - 2:00pm – Lunch

 

2:00pm – 3:30pm Foucauldian Discourse Analysis

Lisa Wedeen, University of Chicago  

This session introduces participants to the techniques of Foucauldian discourse analysis or “interpretive analytics.” Participants will learn how to conduct a discourse analysis, what the underlying assumptions of such an analysis are, and how these techniques can be used to advance political inquiry. The session will consider both the power and limitations of the method, the ways in which it differs from other modes of interpretation, and its advantages over content analysis.  

  • 4.2.1. Foucault, M. (1977). Nietzsche, Genealogy, History. In Language, Counter-Memory, Practice: Selected Essays and Interviews, ed. DF Bouchard, Cornell University Press, 139-164.  
  • 4.2.2. Foucault, M. (1990). The history of sexuality: An introduction, volume I. Trans. Robert Hurley. New York: Vintage. 1-35 and 92-114.
  • 4.2.3 King, G., Keohane, R. O., & Verba, S. (1994). Designing social inquiry: Scientific inference in qualitative research. Princeton University Press. (Please bring this text to class)  

Recommended  

  • 4.2.4. Dreyfus, H. L., & Rabinow, P. (2014). Michel Foucault: Beyond structuralism and hermeneutics. University of Chicago Press, Part Two.

3:30pm - 4:00pm Coffee Break  

4:00pm - 5:30pm – Ideology

William Mazzarella, University of Chicago  

What is ideology and how does it structure public culture and everyday life? What is the relation between ideology and media, and between ideology and political economy? How does ideology enable or interrupt desire, imagination, and attachment? Is there anything ‘beyond’ or ‘behind’ ideology and, if there isn’t, then what grounds critical analysis (since it might simply be yet another example of ideology)?  

  • 4.3.1. Althusser, L. (2006). Ideology and ideological state apparatuses (notes towards an investigation). The anthropology of the state: A reader, 9(1), 86-98.
  • 4.3.2. Horkheimer, M. (1976). Traditional and critical theory. Critical theory: Selected essays, 188-204.  

Recommended/Further  

  • Roland Barthes, ‘Myth Today’
  • Pierre Bourdieu, ‘Preliminaries,’ from The Field of Cultural Production
  • Michel Foucault, ‘The Subject and Power’ in Michel Foucault: Beyond Structuralism and Hermeneutics
  • Immanuel Kant, ‘What is Enlightenment?’ in Philosophical Writings
  • Claude Lévi-Strauss, ‘The Effectiveness of Symbols’ in Structural Anthropology
  • Georg Lukács, ‘Reification and the Consciousness of the Proletariat’
  • Herbert Marcuse, ‘The Affirmative Character of Culture,’ in The Essential Marcuse
  • Karl Marx, ‘Preface to a Contribution to the Critique of Political Economy’
  • Edward Sapir, ‘Symbolism,’ in Encyclopedia of the Social Sciences
  • Peter Sloterdijk, ‘Part One – Sightings: Five Preliminary Reflections’ in Critique of Cynical Reason
  • Raymond Williams, ‘Hegemony’ and ‘Structures of Feeling’ in Marxism and Literature
  • Slavoj Zizek, The Sublime Object of Ideology

   

Wednesday, June 21

Module 5 – Computer Assisted Text Analysis II

William Lowe


8:45am - 10:15am – Session 1  

In this practical session, we show how to use open source tools to fit and interpret topic models, and suggest a workflow for automated document classification.  
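
As a rough illustration of what such a workflow can look like, the sketch below trains a classifier on a handful of hand-labelled documents and applies it to unlabelled ones, using Python’s scikit-learn (the labels and texts are invented; the tools demonstrated in the session may differ, and a real application would need many more labelled examples plus a held-out validation set).

```python
# Minimal supervised document classification from labelled examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented hand-labelled training documents (toy example).
train_texts = [
    "the budget bill raises taxes on fuel",
    "parliament debates the deficit and spending",
    "soldiers deployed to patrol the border",
    "the navy announces new security exercises",
]
train_labels = ["economy", "economy", "security", "security"]

# Pipeline: tf-idf features followed by a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

# Assign new, unlabelled documents to the categories of the typology.
new_texts = ["a tax cut is proposed in the budget",
             "the army requests more border patrols"]
for text, label in zip(new_texts, model.predict(new_texts)):
    print(label, "<-", text)
```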


10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Session 2  

In this session we look at text scaling models, which try to place texts and their words in a substantively interpretable space based on differential word usage.  We discuss how to fit and interpret such models, how to think about the ‘dimensionality’ of a discourse, and what important discourse features are left out or abstracted away. We will pay particular attention to the extent to which simplifying model assumptions are reasonable, especially given the institutional structures from which documents are often retrieved. (A toy illustration of the general idea follows the readings below.)

  • 5.2.1. Slapin, J. B., & Proksch, S. O. (2008). A scaling model for estimating time-series party positions from texts. American Journal of Political Science, 52(3), 705-722. DOI: 10.1111/j.1540-5907.2008.00338.x
  • 5.2.2. Caliskan, A., Bryson, J. J., & Narayanan, A. (2017). Semantics derived automatically from language corpora contain human-like biases. Science, 356(6334), 183-186. DOI: 10.1126/science.aal4230

Recommended:  

  • 5.2.3. Proksch, S. O., & Slapin, J. B. (2010). Position taking in European Parliament speeches. British Journal of Political Science, 40(3), 587-611. DOI: 10.1017/S0007123409990299
  • 5.2.4. Lowe, W., Benoit, K., Mikhaylov, S., & Laver, M. (2009, September). Scaling Policy Positions From Coded Units of Political Texts. In general conference of the European Consortium of Political Research (ECPR), Potsdam.  
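
The scaling models in these readings (e.g. Wordfish) are Poisson models of word counts estimated by maximum likelihood. Purely as a toy illustration of the general idea -- placing documents on a latent dimension of differential word use -- the sketch below applies a simple principal-components style decomposition to word rates; it is a stand-in for intuition, not the estimator discussed in the readings, and the speeches are invented.

```python
# Toy "scaling": place documents on one axis of differential word use.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

# Invented toy speeches by two hypothetical parties.
speeches = {
    "party_A_2010": "lower taxes free markets cut spending",
    "party_A_2014": "cut taxes deregulate markets shrink spending",
    "party_B_2010": "expand welfare public services raise wages",
    "party_B_2014": "invest in welfare services and higher wages",
}

counts = CountVectorizer().fit_transform(list(speeches.values()))
rates = counts.toarray().astype(float)
rates /= rates.sum(axis=1, keepdims=True)    # word rates per document
centred = rates - rates.mean(axis=0)         # remove the "average" document

# Leading singular vector = main axis of differential word use.
U, S, _ = np.linalg.svd(centred, full_matrices=False)
positions = U[:, 0] * S[0]                   # sign and scale are arbitrary

for name, pos in zip(speeches, positions):
    print(f"{name}: {pos:+.3f}")
```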

3:30pm - 4:00pm – Coffee Break


4:00pm - 5:30pm – Session 3  

In the practical session, we show how to scale texts and visualize their content.  

Time permitting we may also discuss how to harvest text data from the web, deal with non-English language content, and solve other practical problems that arise in text analysis.    
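
For participants who have not harvested web text before, the basic pattern looks like the fragment below, using the widely used Python libraries requests and BeautifulSoup (the URL is a placeholder; any real scraping should respect a site’s terms of use and robots.txt).

```python
# Minimal web-scraping pattern: fetch a page and keep its paragraph text.
import requests
from bs4 import BeautifulSoup

url = "https://example.org/some-speech.html"   # placeholder URL

response = requests.get(url, timeout=30)
response.raise_for_status()                    # fail loudly on HTTP errors

soup = BeautifulSoup(response.text, "html.parser")
paragraphs = [p.get_text(strip=True) for p in soup.find_all("p")]
text = "\n".join(paragraphs)

print(text[:500])                              # preview the harvested text
```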


Wednesday, June 21

Module 6 – Qualitative Methods for Causal Analysis II

James Mahoney and Gary Goertz  

Modules 2 and 6 cover many classic and standard topics of qualitative methodology, with a special focus on within-case causal inference and multimethod research.  The topics include conceptualization, process tracing, counterfactual analysis, and comparative-historical analysis.  The sessions use logic and set theory as a foundation for discussing and elucidating qualitative methods.


8:45am - 10:15am – Social Science Concepts

Gary Goertz, University of Notre Dame  

This session provides basic guidelines for the construction and evaluation of concepts.  In particular, it provides a framework for dealing with complex concepts, which are typical in much social science research.  

  • 6.1.1. Goertz, G. (2006). Social science concepts: A user's guide. Princeton University Press. Chapters 1-2.  
  • 6.1.2. Goertz, G., & Mahoney, J. (2012). A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press. Chapters 11-13. (book to purchase)  


10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Counterfactual Analysis

James Mahoney, Northwestern University  

This session provides a framework, based on logic and set theory, for the use of counterfactual causal analysis in qualitative research.  

  • 6.2.1. Levy, J. S. (2008). Counterfactuals and case studies. In The Oxford handbook of political methodology. Oxford: Oxford University Press, 627-644.  
  • 6.2.2 Goertz, G., & Mahoney, J. (2012). A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press, 115-124.  (book to purchase)    

Recommended:  

  • 6.2.3. Mahoney, J. & Barrenechea, R. “The Logic of Counterfactual Analysis in Case-Study Explanation,” Manuscript, 2017.

3:30pm - 4:00pm – Coffee Break

   

4:00pm - 5:30pm – Case Selection and Multimethod Research Designs

Gary Goertz, University of Notre Dame  

This session offers a general framework -- “the research triad” -- for designing multimethod and multiple case research with the goal of evaluating causal mechanisms. The session develops guidelines and rules for choosing cases that will allow qualitative researchers to achieve maximum leverage for causal inference, in both case-study and multimethod designs.

  • 6.3.1. Goertz, G. (In Press). Multimethod research, causal mechanisms, and case studies: an integrated approach. Princeton University Press. Chapters 2-3, 7-8.  

Recommended:  

  • 6.3.2. Goertz, G., & Mahoney, J. (2012). A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press. Chapter 14. (book to purchase)

   

Wednesday, June 21

Module 7 – Managing and Sharing Qualitative Data, Making Qualitative Research Transparent 

Diana Kapiszewski, Sebastian Karcher, and Dessi Kirilova

Research data management -- developing a data management plan when designing a research project and handling research materials systematically throughout the research lifecycle -- is a critical aspect of empirical research. Effectively managing data makes research more robust and prolongs the period during which data remain useful. It also facilitates sharing data with the broader research community (as funders, publishers, and academic associations increasingly require), and makes research based on them more transparent. This module equips participants with a range of strategies for effectively managing qualitative data.  We also highlight the benefits of sharing data (including enhanced citation and collaboration, and the catalyzing of secondary analysis), consider some perceived barriers to data sharing, and demonstrate appropriate techniques for overcoming them. Finally, we discuss how making qualitative research more transparent (i.e., clearly conveying how data were generated and analyzed to produce inferences and interpretations) helps scholars to showcase the strength of their work, and we introduce strategies for achieving research transparency in qualitative inquiry. The module includes numerous exercises and practical applications to consolidate knowledge and encourage interaction among participants.  Participants will benefit most if they have an actual research project, including its data-generation issues and challenges, in mind.  


8:45am - 10:15am – Managing Data

Dessi Kirilova, Qualitative Data Repository

We introduce the notion of the “research lifecycle” to demonstrate that research data can prove useful far beyond the research project through which they were generated. We consider the importance of planning data management when designing research projects and examine the strategies and techniques required to manage data effectively, both for the benefit of the immediate project and to give them a longer life beyond it. In particular, students will receive guidance on developing a Data Management Plan (DMP). Additionally, we use examples of real research projects to establish what types of protocols are needed at key stages of the research cycle, and to identify trigger points at which data sharing considerations come into play.  Finally, we discuss briefly the role of describing and contextualizing data in order for them to be reusable, and consider the issues that need to be addressed in order to manage data safely.  

  • 7.1.1. Corti, L., Van den Eynden, V., Bishop, L., & Woollard, M. (2014). Managing and sharing research data: a guide to good practice. Sage. Available at: http://www.data-archive.ac.uk/media/2894/managingsharing.pdf  
  • 7.1.2. Qualitative Data Repository. (2017) “Managing Data” (including all sub-screens). Available at: https://qdr.syr.edu/guidance/managing

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Sharing Qualitative Data

Diana Kapiszewski, Georgetown University and Qualitative Data Repository

Dessi Kirilova, Qualitative Data Repository  

We discuss the benefits of sharing qualitative data, and best practices for addressing the ethical, legal, and logistical challenges of doing so.  We consider issues of rights management (who owns “your” data?), including copyright concerns and how they can be addressed; we also discuss “fair use” of copyrighted data.  We describe the advantages of sharing data in an institutional venue, including curation and long-term availability of data; we also introduce the Qualitative Data Repository (www.qdr.org).  Students are encouraged to consider questions of ethics and rights in relation to their own projects.

  • 7.2.1. Bishop, L. (2009). Ethical sharing and reuse of qualitative data. Australian Journal of Social Issues, 44(3), 255.
  • 7.2.2. Tsai, A. C., Kohrt, B. A., Matthews, L. T., Betancourt, T. S., Lee, J. K., Papachristos, A. V., & Dworkin, S. L. (2016). Promises and pitfalls of data sharing in qualitative research. Social Science & Medicine, 169, 191-198. DOI: 10.1016/j.socscimed.2016.08.004
  • 7.2.3. Wutich, A., & Bernard, H. R. (2016). Sharing qualitative data & analysis. With whom and how widely?: A response to ‘Promises and pitfalls of data sharing in qualitative research’. Social Science & Medicine, 169, 199. DOI: 10.1016/j.socscimed.2016.09.041

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Making Qualitative Research Transparent

Diana Kapiszewski, Georgetown University and Qualitative Data Repository

Sebastian Karcher, Qualitative Data Repository

Research transparency comprises production transparency (clearly describing the processes through which data were generated) and analytic transparency (clearly indicating how data were analyzed and how they support claims, conclusions, inferences and interpretations in scholarship).  In this session we consider the ongoing debate in political science over making qualitative research transparent, and discuss the merits and limitations of a new approach to qualitative research transparency, Annotation for Transparent Inquiry (ATI).  Finally, we consider the meaning and challenges of “replicating” qualitative research.

  • 7.3.1. Elman, C., & Kapiszewski, D. (2014). Data access and research transparency in the qualitative tradition. PS: Political Science & Politics, 47(1), 43-47. DOI: 10.1017/S1049096513001777
  • 7.3.2. Moravcsik, A. (2014). Transparency: The revolution in qualitative research. PS: Political Science & Politics, 47(1), 48-53. DOI: 10.1017/S1049096513001789
  • 7.3.3. Saunders, E. N. (2011). Leaders at war: how presidents shape military interventions. Cornell University Press. DOI: 10.5064/F68G8HMM Active Citation Compilation, QDR:10048. Syracuse, NY: Qualitative Data Repository [distributor].

Further Reference

Managing Data

  • ICPSR (2014) ‘Framework for Creating a Data Management Plan’, University of Michigan. Available at: http://www.icpsr.umich.edu/icpsrweb/content/datamanagement/dmp/framework.html  
  • Corti, L., Van den Eynden, V., Bishop, L. and Woollard, M. (2014) Managing and Sharing Research Data: A Guide to Good Practice, London: Sage. ISBN:  978-1-44626-726-4.
  • Corti, L. and Thompson, P. (2012) 'Secondary analysis of archived data' in J. Goodwin (ed.) SAGE: Secondary Data Analysis London: Sage Publications Ltd (http://repository.essex.ac.uk/2444/)

Sharing Data

  • http://methods.sagepub.com/video/what-is-secondary-analysis-of-qualitative-data
  • Clark, A. (2006) “Anonymising Research Data,” ESRC National Centre for Research Methods, Working Paper 7/06. Available at: http://eprints.ncrm.ac.uk/480/1/0706_anonymising_research_data.pdf
  • Schrag, Zachary. (2017) “A social scientist’s guide to the Final Rule”. Institutional Review Blog Post. January 19. Available at: http://www.institutionalreviewblog.com/2017/01/a-social-scientists-guide-to-final-rule.html  
  • Kirilova, Dessi and Sebastian Karcher. (2017) “Rethinking Data Sharing and Human Participant Protection in Social Science Research: Applications from the Qualitative Realm.” Working paper available at: https://osf.io/preprints/socarxiv/9n7w8/

Making Qualitative Research Transparent

  • Qualitative Data Repository. (2017) “Annotations for Transparent Inquiry at a Glance.” Available at: https://qdr.syr.edu/guidance/ati-at-a-glance
  • Handlin, Samuel. (2015) Data for: “The Politics of Polarization: Governance and Party System Change in Latin America, 1990-2010.” Working Paper 401, November 2014. Notre Dame, IN: The Kellogg Institute for International Studies, University of Notre Dame. ATI Project, QDR:10065. Syracuse, NY: Qualitative Data Repository [distributor]. Available at: https://via.hypothes.is/https://kellogg.nd.edu/publications/workingpapers/WPS/401.pdf#annotations:query:user:qdr@hypothes.is
  • The (DA-RT) Data Access and Research Transparency Joint Statement (http://www.dartstatement.org)
  • Saunders, Elizabeth N. 2014. “Transparency without Tears: A Pragmatic Approach to Transparent Security Studies Research.” Security Studies 23 (4): 689–98. doi:10.1080/09636412.2014.970405.
  • Qualitative Transparency Deliberations (https://www.qualtd.net/)

   

Wednesday, June 21

Module 8 – Interpretivism II

William Mazzarella and Joseph Masco

 

8:45am - 10:15am – The Energetics of Mass Society

William Mazzarella, University of Chicago  

Most social science, including the analysis of ideology, grapples with meaning. What do things mean? To whom do they mean what they mean? Talking meaning means talking ‘culture’ and ‘identity.’ But alongside questions of meaning, we need to ask: ‘how is it that we have not just meaning but meaning that matters?’ What makes meaning stick? What makes it resonate? Whether we call it ‘affect,’ ‘aesthetics,’ or something else – social and political analysis needs ways of thinking critically and creatively about the energies that animate social life.

  • 8.1.1. Mazzarella, W. The Mana of Mass Society.  ‘Introduction’ and Chapters 1 and 2. Forthcoming.  

Recommended/Further readings

  • Roland Barthes, ‘The Rhetoric of the Image’ (from Image, Music, Text)
  • Jean Baudrillard, ‘Sign-Function and Class Logic’ (from For a Critique of the Political Economy of the Sign)
  • Walter Benjamin, ‘The Work of Art in the Age of its Technological Reproducibility (Second Version)’ in Selected Writings, Vol 3
  • John Berger, Ways of Seeing
  • Dick Hebdige, Subculture
  • Siegfried Kracauer, ‘The Mass Ornament,’ in The Mass Ornament: Weimar Essays
  • Karl Marx, ‘The 1844 Economic and Philosophical Manuscripts’ in The Marx-Engels Reader
  • W J T Mitchell, Iconology
  • Jacques Rancière, ‘The Aesthetic Revolution and its Outcomes: Emplotments of Autonomy and Heteronomy’ in New Left Review 14 (March-April 2002)
  • Susan Sontag, On Photography
  • Michael Taussig, Defacement  

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

   

2:00pm - 3:30pm Considering Risk and Threat in a National Public Sphere

Joseph Masco, University of Chicago  

This session explores how notions of threat and danger function in mass mediated politics.  Using the U.S. as case study, it considers how Cold War fears of nuclear destruction informed the War on Terror after 2001.  It asks questions about how affect and emotions are mobilized at a national level and attends in particular to the way that negative futures are both politicized and internalized.  Participants will gain tools for thinking about existential dangers, the role of nation-making in an age of technological revolution, and the politics of mass mediated futures.  

  • 8.2.1. Masco, J. (2014). The theater of operations: National security affect from the Cold War to the War on Terror. Duke University Press. Introduction, chapter 1, chapter 3, conclusion.  

Recommended:  

  • 8.2.2. Massumi, B. (2010). The political ontology of threat. The affect theory reader, 52-70.  
  • 8.2.3. Rogin, M. (1987). 'Ronald Reagan'—The movie. Radical History Review, 1987(38), 88-113. DOI: 10.1215/01636545-1987-38-88

3:30pm - 4:00pm – Coffee Break

        

4:00pm - 5:30pm – Predictive Analytics and Elections as Problem Space: Ethnographic and Analytic Approaches

Joseph Masco, University of Chicago

This session considers the imbrication of technological revolution, covert action, and threat assessments in recent election cycles.  Working through recent reporting about the use of social media as a means of influencing national opinion, we will consider how to unpack a controversy and establish a set of questions suitable for ethnographic inquiry.  Of particular focus here will be how to constitute a problem space, how to recognize its internal complexity, and how to select specific dimensions for further study.

Recommended:  

  • 8.3.4.   Dumit, J. (2014). Writing the implosion: Teaching the world one thing at a time. Cultural Anthropology, 29(2), 344-362.  DOI: 10.14506/ca29.2.09

Thursday, June 22

Module 9 – Multimethod Research I

Jason Seawright  

This module discusses the challenge of causal inference, and ways that multi-method research designs can contribute to causal inference. We will differentiate between traditional, triangulation designs that offer relatively little advantage, and integrative multi-method designs that directly strengthen causal inference. All of this will be structured around a discussion of multi-method designs that use regression-type methods as the quantitative component of the causal inference.  

8:45am - 10:15am – Causal Inference in Multi-Method Research  

This session frames problems of multi-method research design in terms of the goal of causal inference. Is there one concept of causation, or are there many? If more than one exist, are there distinctive qualitative and quantitative concepts of causation that do not overlap? If they do overlap, how can qualitative and quantitative tools for causal inference best be aligned to avoid redundancy or irrelevancy?

  • 9.1.1. Brady, H. E. (2008). Causation and explanation in the social sciences. In The Oxford Handbook of Political Methodology. Oxford: Oxford University Press, 217–70.  
  • 9.1.2. Seawright, J. (2016). Multi-method social science: Combining qualitative and quantitative tools. Cambridge University Press. Chapters 1-2 (book to purchase)  

Recommended:  

  • 9.1.3. Lieberman, E. S. (2005). Nested analysis as a mixed-method strategy for comparative research. American Political Science Review, 99(3), 435-452. DOI: 10.1017/S0003055405051762
  • 9.1.4. Freedman, D.A. (2008). On Types of Scientific Enquiry: The Role of Qualitative Reasoning. In The Oxford Handbook of Political Methodology. Oxford: Oxford University Press, 300-18.
     

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

   

2:00pm - 3:30pm – Combining Regression and Case Studies  

This session looks closely at the challenges of combining case studies with the most common quantitative tool in the social sciences, regression. It offers research designs for testing assumptions connected with measurement, confounding, and the existence of a hypothesized causal path.  

  • 9.2.1. Seawright, J. (2016). Multi-method social science: Combining qualitative and quantitative tools. Cambridge University Press. Chapter 3. (book to purchase)  

Recommended:  

  • 9.2.2. Small, M. L. (2011). How to conduct a mixed methods study: Recent trends in a rapidly growing literature. Annual Review of Sociology, 37, 57-86. DOI: 10.1146/annurev.soc.012809.102657
  • 9.2.3. Howard, M. M., & Roessler, P. G. (2006). Liberalizing electoral outcomes in competitive authoritarian regimes. American Journal of Political Science, 50(2), 365-381. DOI: 10.1111/j.1540-5907.2006.00189.x


3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Case Selection  

This session asks how cases should best be selected from a larger population. We will review a range of case-selection rules and evaluate them based on their contribution to the process of case-study discovery.  

  • 9.3.1. Seawright, J. (2016). Multi-method social science: Combining qualitative and quantitative tools. Cambridge University Press. Chapter 4. (book to purchase)  

Recommended:  

  • 9.3.2. Nielsen, R. A. (2016). Case selection via matching. Sociological Methods & Research, 45(3), 569-597. DOI: 10.1177/0049124114547054
  • 9.3.3. Seawright, J., & Gerring, J. (2008). Case selection techniques in case study research: A menu of qualitative and quantitative options. Political Research Quarterly, 61(2), 294-308. DOI: 10.1177/1065912907313077

   

Thursday, June 22

Module 10 – Geographic Information Systems I: Introduction to GIS as a Qualitative Research Method

Jonnell Robinson

 

8:45am - 10:15am – Introduction and Case Studies in Qualitative GIS

This session will introduce participants to GIS as a tool for qualitative research, present basic GIS terminology and concepts, and cover the basic functions of ESRI’s ArcGIS software suite, particularly those functions that are most commonly used by social scientists.

  • 10.1.1. Dennis Jr, S. F. (2006). Prospects for qualitative GIS at the intersection of youth development and participatory urban planning. Environment and Planning A, 38(11), 2039-2054. DOI: 10.1068/a3861
  • 10.1.2. National Geographic. (2015) Geographic Information Systems.  http://education.nationalgeographic.com/education/encyclopedia/geographic‐informationsystem‐gis/?ar_a=1.  
  • 10.1.3. Sturgis, S. (2015). Kids in India are sparking urban planning changes by mapping slums. Atlantic Citylab.
  • 10.1.4. Wridt, P. (2010). A qualitative GIS approach to mapping urban neighborhoods with children to promote physical activity and child-friendly community planning. Environment and Planning B: Planning and Design, 37(1), 129-147. DOI: 10.1068/b35002

Further:  

  • Meghan Cope and Sarah Elwood. Qualitative GIS: A Mixed Methods Approach. Thousand Oaks, California: SAGE Publications Inc., 2009. Print.  
  • William J. Craig, Trevor M. Harris, and Daniel Weiner. Community Participation and Geographic Information Systems. London/New York: Taylor & Francis Inc., 2002. Print.
  • Steven J. Steinberg. GIS: Geographic Information Systems for the Social Sciences: Investigating Space and Place. Thousand Oaks, California: SAGE Publications, 2006. Print.  


10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Basic GIS Functions  

This session will explore basic visualization and analytical functions such as building and querying attribute tables, selecting map features, and symbolizing data. (An illustrative open-source sketch follows the readings below.)

  • 10.2.1. Batty, B. (2003). Using Geographical Information Systems. In Key Methods in Geography. London: SAGE Publications, 409-423.  
  • 10.2.2. Maantay, J., Ziegler, J., & Pickles, J. (2006). GIS for the urban environment. Redlands, CA: Esri Press.  
  • 10.2.3. Mitchell, A. (1999). The ESRI Guide to GIS Analysis, Volume 1: Geographic Patterns and Relationships. Pages 9-19.

Recommended:  

  • David Allen. GIS Tutorial 2: Spatial Analysis Workbook, 10.1 Edition. Redlands, California: ESRI Press Inc., 2013. Print.
  • David W. Allen and Jeffery M. Coffey. GIS Tutorial 3: Advanced Workbook, 10.0 Edition. Redlands, California: ESRI Press Inc., 2010. Print.
  • Wilpen L. Gorr and Kristen S. Kurland. GIS Tutorial 1: Basics Workbook, 10.1 Edition. Redlands, California: ESRI Press Inc., 2013. Print.
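
The hands-on work in this session uses ArcGIS, but the same basic operations can be sketched with open-source tools. Purely as an illustration (the file name and column names below are hypothetical), this Python snippet uses the geopandas library to load a layer, inspect and query its attribute table, select features, and symbolize them on a simple choropleth map.

```python
# Illustrative only: the session itself uses ArcGIS. This sketch shows the
# same basic operations with the open-source geopandas library.
# The file name and column names are hypothetical.
import geopandas as gpd
import matplotlib.pyplot as plt

# Load a polygon layer (e.g., neighborhoods) into a GeoDataFrame.
neighborhoods = gpd.read_file("neighborhoods.shp")

# The attribute table is the GeoDataFrame itself: inspect and query it.
print(neighborhoods.head())
dense = neighborhoods[neighborhoods["pop_density"] > 5000]   # select features

# Symbolize the data: a choropleth of population density,
# with the selected features outlined on top.
ax = neighborhoods.plot(column="pop_density", cmap="OrRd", legend=True)
dense.boundary.plot(ax=ax, color="black", linewidth=1)
plt.title("Population density (selected features outlined)")
plt.show()
```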

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – GIS Data Sources and Data Integration  

This session will review the types and sources of data that are available for GIS users working in both data rich and data poor settings, the ethics of using mapping in research, how metadata can be used to communicate qualitative information, and data overlay analysis.

  • 10.3.1. Jung, J. K., & Elwood, S. (2010). Extending the qualitative capabilities of GIS: computer-aided qualitative GIS. Transactions in GIS, 14(1), 63-87. DOI: 10.1111/j.1467-9671.2009.01182.x
  • 10.3.2. Rambaldi, G., Chambers, R., McCall, M., & Fox, J. (2006). Practical ethics for PGIS practitioners, facilitators, technology intermediaries and researchers. Participatory Learning and Action, 54(1), 106-113.
  • 10.3.3. Steinberg, S. and Steinberg, S. (2006) GIS for the Social Sciences: Investigating Place and Space. Thousand Oaks: SAGE Publications. Chapter 2.  

Recommended:  

  • Ian N. Gregory, A Place in History: A guide to using GIS in historical research. 2nd. Belfast, Northern Ireland: Centre for Data Digitisation and Analysis, 2005. Web. 4 Apr. 2014  
  • Mark Monmonier. How to Lie With Maps. 2. Chicago, Illinois: University of Chicago Press, 1996. Print.

   

Thursday, June 22

Module 11 – Designing and Conducting Field Research I: Preparing for Fieldwork and Operating in the Field

Diana Kapiszewski and Lauren MacLean

This module considers the design, planning, and execution of field research. We offer strategies for addressing the intellectual, logistical, and social challenges that carrying out field research involves. A basic premise underlying the module is that fieldwork entails shifting among research design, data collection, and data analysis.  Each session is conducted with the understanding that participants have carefully read the assigned materials.  The instructors will present key points drawing on the readings, other published work on field research, and the experiences they and others have had with managing fieldwork’s diverse challenges. Interaction and discussion in small and large groups is encouraged.  

 

8:45am - 10:15am – Borders and Varieties of Fieldwork

Diana Kapiszewski, Georgetown University

Lauren M. MacLean, Indiana University  

In this session we discuss our conception of field research as entailing repeated shifts among research design, data collection, and data analysis, consider some of the implications of these shifts, and evaluate the benefits of iterated research design. We consider fieldwork’s heterogeneity – how it varies across contexts, researchers, projects, and points of time in the same project – and also address how ethical challenges in the field go well beyond obtaining approval from your IRB.   

  • 11.1.1. Kapiszewski, D., MacLean, L. M., & Read, B. L. (2015). Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 1. (book to purchase)
  • 11.1.2. Wood, E. J. (2006). The ethical challenges of field research in conflict zones. Qualitative Sociology, 29(3), 373-386. DOI: 10.1007/s11133-006-9027-8
  • 11.1.3. Hauck, R. J., et al. (2008). Symposium on Protecting Human Research Participants, IRBs, and Political Science Redux. PS: Political Science & Politics, 41(3), 475-511. See in particular contributions by Mitchell Seligson, Dvora Yanow, and Peri Schwartz-Shea.

Additional Reference Material  

  • 11.1.4. Collier, D. (1999) Data, Field Work and Extracting New Ideas at Close Range. APSA-CP Newsletter, 10(1), 1-2, 4-6.
  • 11.1.5. Wood, E. (2007). Field Methods. In The Oxford Handbook of Comparative Politics. Oxford: Oxford University Press. Chapter 5.
  • 11.1.6. Collier, D., Freedman D.A., Fearon, J.D., Laitin, D.D., Gerring, J., & Goertz, G. (2008). Symposium: Case Selection, Case Studies, and Causal Inference. Qualitative & Multi-Method Research, 6(2), 2-16.  
  • 11.1.7. Loaeza, S., Stevenson, R., & Moehler, D. C. (2005). Symposium: should everyone do fieldwork?. APSA-CP, 16(2), 8-18.
  • 11.1.8. Kapiszewski, D., MacLean, L.M., Read, B.L. (2015). A Historical and Empirical Overview of Field Research in the Discipline. Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 2. (book to purchase)

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm – 3:30pm – Preparing for Fieldwork

Diana Kapiszewski, Georgetown University

Lauren M. MacLean, Indiana University

This session addresses pre-dissertation and other exploratory research, logistical preparations for fieldwork, securing funding, networking to obtain contacts and interviews, negotiating institutional affiliation, and developing a data-collection plan.

  • 11.2.1. Kapiszewski, D., MacLean, L.M., Read, B.L. (2015). Preparing for Fieldwork. Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 3. (book to purchase)  
  • 11.2.2. Przeworski, A., & Salomon, F. (1995). The art of writing proposals: Some candid suggestions for applicants to Social Science Research Council competitions. Social Science Research Council.
  • 11.2.3. Altman, M. (2009). Funding, funding. PS: Political Science & Politics, 42(3), 521-526. DOI: 10.1017/S1049096509090830

Additional Reference Material  

  • 11.2.4. Barrett, C. B., & Cason, J. (2010). Identifying a Site and Funding Source. Overseas research II: A practical guide. Routledge.  
  • 11.2.5. Barrett, C. B., & Cason, J. (2010). Predeparture Preparations. Overseas research II: A practical guide. Routledge.

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Operating in the Field

Diana Kapiszewski, Georgetown University

Lauren M. MacLean, Indiana University

This session offers practical advice on collecting data and managing inter-personal relations in the field.  We introduce a range of more-interactive and less-interactive data-collection techniques, with a particular emphasis on the latter, consider their strengths and weaknesses, and think about how they can be combined.  We discuss the different types of human interaction fieldwork entails, including hiring and working with research assistants and collaborating with other researchers.    

  • 11.3.1. Kapiszewski, D., MacLean, L. M., & Read, B. L. (2015). Managing in the Field: Logistical, Social, Operational, and Ethical Challenges. Field research in political science: practices and principles. Cambridge University Press. Chapter 1. (book to purchase)  
  • 11.3.2. Ahram, A. I., & Goode, J. P. (2016). Researching authoritarianism in the discipline of democracy. Social Science Quarterly, 97(4), 834-849. DOI: 10.1111/ssqu.12340

Additional Reference Material  

  • 11.3.3. Fujii, L. A. (2013). Working with Interpreters. Interview Research in Political Science. Cornell University Press.
  • 11.3.4. Cammett, M. (2013) Positionality and Sensitive Topics:  Matched Proxy Interviewing as a Research Strategy. Interview Research in Political Science. Cornell University Press.  
  • 11.3.5. Carapico S., Clark, J.A., Jamal, A.,  Romano, D., Schwedler, J. & Tessler, M. (2006). “Symposium: The methodologies of field research in the Middle East,” PS:  Political Science and Politics 39(3).  
  • 11.3.6. Karlan, D., & Appel, J. (2016). Failing in the field: what we can learn when field research goes wrong. Princeton University Press. 17-70.

   

Thursday, June 22

Module 12 – Interpretation and History I: Discourse Analysis and Intellectual History  

Thomas Dodman and Daragh Grant  

This module introduces students to methods of discourse analysis employed by political theorists and historians of political thought and to critical approaches to intellectual history. Building on earlier modules on discourse analysis, the first session will introduce participants to different approaches to “reading” texts, and will examine debates over meaning, concepts, context, and the explanation of historical change, as well as engaging with ongoing debates about the politics of historiography. We will discuss the techniques of the Cambridge school and the German tradition of Begriffsgeschichte (concept history). Participants will engage in a practical exercise of concept analysis during the second session of the day, and we will discuss their findings, and the methodological challenges they encountered in the final session of the day.

In both modules on Interpretation and History, we expect students to come to the sessions having completed all of the required readings. These two sessions will be conducted in the style of an academic seminar rather than in lecture form, with a view to allowing your research interests to shape our discussion of the readings.  


8:45am - 10:15am – Session 1: Interpretive debates in intellectual history

This session considers two important traditions in the history of political thought by introducing participants to the work of Quentin Skinner and the Cambridge school of intellectual history, and to Reinhart Koselleck and the techniques of Begriffsgeschichte (or concept history). We will consider, among other things: How does one go about reconstructing the questions that a given author is asking? What are illocutionary acts and why do they matter? To what extent are texts and the ideas they formulate related to specific historical contexts? And how do texts relate to practices of power and domination? We will also ask what a concept is, how it comes into being, and in what relation it stands to the social world. In both cases, we will try to ascertain the advantages and limitations of this approach to discourse analysis, a conversation that will continue into the final session of the day.

  • 12.1.1. Skinner, Q. (1969). Meaning and understanding in the history of ideas. History and Theory, 8(1), 3-53. DOI: 10.2307/2504188.  
  • 12.1.2. Koselleck, R. (2011). Introduction and prefaces to the Geschichtliche Grundbegriffe. Contributions to the History of Concepts, 6(1), 1-37. DOI: 10.3167/choc.2011.060102    
  • 12.1.3. Koselleck, R. (1969). Historical criteria of the modern concept of revolution. In R. Koselleck, Futures Past: On the Semantics of Historical Time. New York: Columbia University Press, 2004, 43-57.

 

10:15am - 10:45am – Coffee Break

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Session 2: Practical exercise (Syracuse University Library)

During this session, you will be asked to work collaboratively (in groups of up to 4 participants) to develop a history of a particular concept. You will use the resources available to you at Syracuse University Library and online to investigate the meaning of the concept in question, how it has changed over time, and the kinds of conceptual challenges that these changes pose for scholars doing historical work. We would ask you to make note not only of this concept history, but also of the challenges you faced when attempting to investigate it. Naturally, the limited time you will have available to complete this task will pose a significant constraint, but the goal is for you to come face to face with some of the challenges of this kind of work. Groups will be able to choose one of four concepts, which we will hand out in the first session of the day. We hope that by the third session the similarities and divergences in your respective experiences will allow for a fruitful debriefing and discussion of the methods of intellectual history.

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Session 3: Debrief and further discussion

Participants will have some time at the beginning of this session to finish writing up their findings from the afternoon exercise. We will then discuss the exercise in light of the morning’s readings and of Hayden White’s analysis of the various ways in which you might “emplot” your research findings.  

  • 12.3.1. White, H. (1973). Introduction: The Poetics of History. Metahistory: The Historical Imagination in 19th-Century Europe. Baltimore: Johns Hopkins University Press, 1-42.

Suggested further readings

J.L. Austin, How to do Things with Words (Cambridge: Harvard University Press, 1962).

Pierre Bourdieu, Language and Symbolic Power, ed. by John Thompson, trans. by Gino Raymond and Matthew Adamson (Cambridge: Polity, 1991), 107-137.

R. G. Collingwood, An Autobiography (Oxford: Oxford University Press, 1951), 27-43.

Reinhart Koselleck, “Begriffsgeschichte and Social History,” in Futures Past: On the Semantics of Historical Time, trans. by Keith Tribe (New York: Columbia University Press, 2004), 75-92.

Reinhart Koselleck, The Practice of Conceptual History: Timing History, Spacing Concepts, trans. by Todd Samuel Presner and others (Stanford: Stanford University Press, 2002).

Dominick LaCapra, “Rethinking Intellectual History and Reading Texts,” in Rethinking Intellectual History: Texts, Contexts, Language (Ithaca: Cornell University Press, 1983), 23-71.

Melvin Richter, “Begriffsgeschichte and the History of Ideas,” Journal of the History of Ideas, 48 (1987): 247-263.

Quentin Skinner, “The rise of, challenge to, and prospects for a Collingwoodian approach to the history of political thought,” in The History of Political Thought in National Context, eds. Dario Castiglione and Iain Hampsher-Monk (Cambridge: Cambridge University Press, 2001), 175-88.

   

Friday, June 23

Module 13 – Multimethod Research II

Jason Seawright  

This module extends the idea of integrative multi-method research by exploring designs that strengthen causal inferences based on random assignment and on process tracing, as well as designs that increase the value of methods for conceptualization and measurement.

8:45am - 10:15am – Random Assignment and Multi-Method Research   

This session looks at how multi-method research works in the context of random (or as-if random) assignment, exploring how to design case studies in conjunction with experimental or natural-experimental research. It considers assumptions about independence, realism, and the causal history of the treatment variable specifically in the context of these designs.

  • 13.1.1. Seawright, J. (2016). Multi-method social science: Combining qualitative and quantitative tools. Cambridge University Press. Chapters 6-7 (book to purchase)  
  • 13.1.2. Dunning, T. (2012). Natural experiments in the social sciences: A design-based approach. Cambridge University Press. Chapter 11  

Recommended:  

  • 13.1.3. Freedman, D. A. (1991). Statistical models and shoe leather. Sociological methodology, 291-313. DOI: 10.2307/270939

 

10:15am - 10:45am – Coffee Break

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Improving Process-Tracing Arguments with Quantitative Tools

This session asks what tools from statistics and machine learning can add to causal inferences based on process tracing. It considers three designs: using a quantitative component to strengthen a weak link in a process-tracing chain, using comparative experiments to measure the outcome for comparative-historical analysis, and using machine learning to discover more of the relevant range of alternative hypotheses.  
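To make the third design concrete, here is a rough, illustrative Python sketch (not Seawright's own procedure): a flexible learner is fit to invented data on several candidate explanatory variables, and variables with high importance scores are treated as alternative explanations worth probing with process tracing. It assumes scikit-learn and NumPy are installed; all variable names and data are made up.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Invented data: five candidate explanatory variables and a binary outcome.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

    # Fit a random forest and inspect which variables it leans on.
    forest = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
    for name, importance in zip(["var_a", "var_b", "var_c", "var_d", "var_e"],
                                forest.feature_importances_):
        print(name, round(importance, 3))   # high-importance variables suggest rival hypotheses to trace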

  • 13.2.1. Seawright, J. (2016). Multi-method social science: Combining qualitative and quantitative tools. Cambridge University Press.  Chapter 8 (book to purchase)  
  • 13.2.2 Bennett, A. (2015). Appendix: Disciplining Our Conjectures: Systematizing Process Tracing with Bayesian Analysis. Process Tracing: From Metaphor to Analytic Tool. Cambridge: Cambridge University Press.  

Recommended:  

  • 13.2.3. Siroky, D. S. (2009). Navigating random forests and related advances in algorithmic modeling. Statistics Surveys, 3, 147-163. DOI: 10.1214/07-SS033  
  • 13.2.4. Henrich, J., Heine, S., & Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33(2-3), 61-83. DOI: 10.1017/S0140525X0999152X

The published article is accompanied by a collection of wonderful and highly relevant comments by other authors.

 

3:30pm - 4:00pm – Coffee Break

        

4:00pm - 5:30pm – Concepts and Measurement in Multi-Method Research    

This session asks whether there can be value added from multi-method research designs focused on conceptualization and measurement. We discuss assumption-testing case-study designs in conjunction with psychometric measurement models, as well as case studies focused on finding meaning in conjunction with cluster analysis.  

  • 13.3.1. Seawright, J. (2017) “Integrative Multi-Method Measurement: Combining Qualitative and Quantitative Methods for Evaluating Indicators.” Working Paper.  
  • 13.3.2. Seawright, J., & Collier, D. (2014). Rival strategies of validation: Tools for evaluating measures of democracy. Comparative Political Studies, 47(1), 111-138. DOI: 10.1177/0010414013489098  

Recommended:  

  • 13.3.3. Austrian, Z. (2000). Cluster case studies: The marriage of quantitative and qualitative information for action. Economic Development Quarterly, 14(1), 97-110. DOI: 10.1177/089124240001400110  
  • 13.3.4. Adcock, R. & Collier, D. (2001). Measurement validity: A shared standard for qualitative and quantitative research. American Political Science Review, 95(3): 529-546. DOI: 10.1017/S0003055401003100

 

Friday, June 23

Module 14 – Geographic Information Systems II

Jonnell Robinson

 

8:45am - 10:15am – Open Source Mapping Tools 

This session will introduce open source geovisualization and analysis tools including Open Street Map, Google My Maps, and QGIS.  

  • 14.1.1. Haklay, M., & Weber, P. (2008). OpenStreetMap: User-generated street maps. IEEE Pervasive Computing, 7(4), 12-18. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4653466 (accessed April 2015).  
  • 14.1.2. Liu, S. B., & Palen, L. (2010). The new cartographers: Crisis map mashups and the emergence of neogeographic practice. Cartography and Geographic Information Science, 37(1), 69-90. DOI: 10.1559/152304010790588098  
  • 14.1.3. Steiniger, S., & Bocher, E. (2009). An overview on current free and open source desktop GIS developments. International Journal of Geographical Information Science, 23(10), 1345-1370. DOI: 10.1080/13658810802634956  
  • 14.1.4. Goodchild, M. F., & Glennon, J. A. (2010). Crowdsourcing geographic information for disaster response: a research frontier. International Journal of Digital Earth, 3(3), 231-241. DOI: 10.1080/17538941003759255  

Further:  

  • Sarah Elwood, Michael F. Goodchild and Daniel Z. Sui. “Researching Volunteered Geographic Information: Spatial Data, Geographic Research, and New Social Practice.” Annals of the Association of American Geographers 2012:102(3) 571‐590.

 

10:15am - 10:45am – Coffee Break

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)

12:30pm - 2:00pm – Lunch

   

2:00pm - 3:30pm – GIS Data Collection: Digitizing Archival Maps, Collecting GPS Point Locations, Counter and Sketch Mapping, and Spatial Data Repositories

This session will demonstrate valuable data collection techniques for archival research, field work, participatory and community‐based mapping, as well as the availability and accessibility of spatial data through data repositories. “Heads‐up” digitizing, or turning print maps into a digital GIS map, integrating GPS receiver data into GIS, and sketch map digitization will be demonstrated. Downloading spatial data from web‐based repositories for integration into GIS will also be discussed.
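As a small illustration of one of these steps, the following Python sketch (standard library only) converts a hypothetical CSV of GPS waypoints into a GeoJSON file that QGIS and most web-mapping tools can load; the file name and the name/lat/lon column headings are assumptions for the example.

    import csv, json

    features = []
    with open("waypoints.csv", newline="") as f:        # assumed columns: name, lat, lon
        for row in csv.DictReader(f):
            features.append({
                "type": "Feature",
                "properties": {"name": row["name"]},
                "geometry": {"type": "Point",
                             # GeoJSON stores coordinates in lon, lat order
                             "coordinates": [float(row["lon"]), float(row["lat"])]},
            })

    with open("waypoints.geojson", "w") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)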

  • 14.2.1 Heasley, L. (2003). Shifting boundaries on a Wisconsin landscape: Can GIS help historians tell a complicated story? Human Ecology, 31(2), 183-213. DOI: 10.1023/A:1023928728978  
  • 14.2.2. Peluso, N. L. (1995). Whose woods are these? Counter‐mapping forest territories in Kalimantan, Indonesia. Antipode, 27(4), 383-406. DOI: 10.1111/j.1467-8330.1995.tb00286.x  

Further:  

  • William J. Craig, Trevor M. Harris, and Weiner Daniel. Community Participation and Geographic Information Systems. London/ New York, New York: Taylor & Francis Inc., 2002. Print.  
  • Ian N. Gregory, A Place in History: A guide to using GIS in historical research. 2nd. Belfast, Northern Ireland: Centre for Data Digitisation and Analysis, 2005. http://www.researchgate.net/profile/Ian_Gregory2/publication/228725974_A_place_in_history_A_guide_to_using_GIS_in_historical_research/links/547726620cf29afed614470b.pdf.(accessed April 2015).  
  • John Pickles, Ground Truth: The Social Implications of Geographic Information Systems. New York, New York: The Guilford Press, 1995. Print.  
  • Denis Wood, The Power of Maps. New York, New York: The Guilford Press, 1992. Print.

 

3:30pm - 4:00pm – Coffee Break

        

4:00pm - 5:30pm – Map Design    

This session will provide an overview of basic map design, integrating narrative and photos with GIS, and a discussion about why, how and where to further hone GIS skills.  

  • 14.3.1. Buckley, A., Field, K., & Esri. Making a Meaningful Map. ESRI ‐ GIS Mapping Software, Solutions, Services, Map Apps, and Data. http://www.esri.com/news/arcuser/0911/making‐a‐map‐meaningful.html (accessed April 2015).    

Further:  

  • Cynthia A. Brewer, Designing better maps: a guide for GIS users. Redlands, California: ESRI Press, Inc., 2005.  
  • Heather MacDonald and Alan Peters. Urban Policy and the Census. Redlands, California: ESRI Press, Inc. 2011. Print.  
  • Andy Mitchell, The ESRI Guide to GIS Analysis: Geographic Patterns & Relationships. 1. Redlands, California: ESRI Press, Inc., 1999. Print.  
  • Andy Mitchell, The ESRI Guide to GIS Analysis: Spatial Measurements & Statistics. 2. Redlands, California: ESRI Press, Inc., 2005. Print.  
  • Andy Mitchell, The ESRI Guide to GIS Analysis: Modeling Suitability, Movement, and Interaction. 3. Redlands, California: ESRI Press, Inc., 2012. Print.  
  • Mark Monmonier, Mapping it Out: Expository Cartography for the Humanities and Social Sciences. Chicago, Illinois: University of Chicago Press, 1993. Print.

   

Friday, June 23

Module 15 – Designing and Conducting Field Research II: Collecting and Analyzing Data

Diana Kapiszewski and Lauren MacLean

This module discusses a range of data-collection techniques and offers multiple strategies for engaging in analysis in the field. We emphasize that the most productive fieldwork entails data collection, data analysis, and research design. Each session of this module is conducted with the understanding that participants have carefully read the assigned materials.  The instructors will present key points drawing on the readings, other published work on field research, and the experiences they and others have had with managing fieldwork’s diverse challenges. Interaction and discussion in small and large groups is encouraged.  

 

8:45am - 10:15am – More-Interactive Forms of Data Collection

Diana Kapiszewski, Georgetown University

Lauren M. MacLean, Indiana University  

This session considers the differences among, unique features of, benefits of, and challenges inherent in employing several more-interactive forms of data collection including participant observation, ethnography, surveys, and experiments.  

  • 15.1.1. Kapiszewski, D., MacLean, L.M., Read, B.L. (2015). Site-Intensive Methods: Ethnography and Participant Observation. Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 7. (book to purchase)   
  • 15.1.2. Kapiszewski, D., MacLean, L.M., Read, B.L. (2015). Surveys in the Context of Field Research. Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 8. (book to purchase)  
  • 15.1.3. Kapiszewski, D., MacLean, L.M., Read, B.L. (2015). Experiments in the Field. Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 9. (book to purchase)

Additional Reference Material  

  • 15.1.4. Pader, E. (2006). Seeing with an Ethnographic Sensibility: Explorations Beneath the Surface of Public Policies. Interpretation and Method: Empirical Research Methods and the Interpretive Turn. Routledge.
  • 15.1.5. Wedeen, L. (2010). Reflections on ethnographic work in political science. Annual Review of Political Science, 13, 255-272. DOI: 10.1146/annurev.polisci.11.052706.123951  
  • 15.1.6 Kubik, J. (2009). Ethnography of politics: foundations, applications, prospects. Political ethnography: What immersion contributes to the study of power, 25-52.  
  • 15.1.7. Brady, H. E. (2000). Contributions of survey research to political science. PS: Political Science & Politics, 33(01), 47-58. DOI: 10.2307/420775  
  • 15.1.9. Sudman, S., & Bradburn, N. M. (1982). Asking questions: a practical guide to questionnaire design.  
  • 15.1.10. Levy Paluck, E. (2010). The promising integration of qualitative methods and field experiments. The ANNALS of the American Academy of Political and Social Science, 628(1), 59-71. DOI: 10.1177/0002716209351510

 

10:15am - 10:45am – Coffee Break

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Interviewing

Diana Kapiszewski, Georgetown University

Lauren M. MacLean, Indiana University

This session explores various types of interviewing including one-on-one in-depth interviews, oral histories, and focus groups.  We consider the many challenges and opportunities that conducting interviews in the field entails and offer a range of practical advice.  

  • 15.2.1. Kapiszewski, D., MacLean, L.M., Read, B.L. (2015). Interviews, Oral Histories, and Focus Groups. Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 6. (book to purchase)  
  • 15.2.2. Bleich, E. & Pekkanen, R. (2013) How to Report Interview Data. Interview Research in Political Science. Cornell University Press.  
  • 15.2.3. Soss, J. (2006). Talking our way to meaningful explanations. Interpretation and method: Empirical research methods and the interpretive turn, 127-149.

Additional Reference Material  

  • 15.2.4. Leech, B. & Goldstein, K. (2002) Symposium: Interview Methods in Political Science. PS: Political Science and Politics 35(4), 663-672.  
  • 15.2.5. Short, S.E.,  Perecman, E., & Curran S.R. (2006) Focus Groups. A Handbook for Social Science Field Research: Essays & Bibliographic Sources on Research Design and Methods. Sage.  
  • 15.2.6. Rubin, H. & Rubin, I. (2005). Qualitative Interviewing: The Art of Hearing Data, 2nd ed. Sage. Chapters 6-9.  
  • 15.2.7. Tansey, O. (2007). Process tracing and elite interviewing: a case for non-probability sampling. PS: Political Science & Politics, 40(04), 765-772. DOI: 10.1017/S1049096507071211

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Analyzing, Re-Tooling, and Assessing Progress

Diana Kapiszewski, Georgetown University

Lauren M. MacLean, Indiana University  

This session considers various strategies for engaging in data analysis, writing, and presenting initial findings to different audiences while conducting fieldwork. It also considers how to retool a project in the field, and assess progress toward completing field research.  

  • 15.3.1. Kapiszewski, D., MacLean, L.M., Read, B.L. (2015). Analyzing, Writing, and Retooling in the Field. Field Research in Political Science: Practices and Principles. Cambridge University Press. Chapter 10. (book to purchase)  
  • 15.3.2. Diana Kapiszewski, Lauren M. MacLean, and Benjamin L. Read, “Reconceptualizing Field Research,” Unpublished manuscript.  
  • 15.3.3. Emerson, R. M., Fretz, R. I., & Shaw, L. L. (1995). Fieldnotes in Ethnographic Research. University of Chicago Press.  

Additional Reference Material  

  • 15.3.4. Shapiro, G. & Markoff, J. (1997). A Matter of Definition. Text Analysis for the Social Sciences: Methods for Drawing Statistical Inferences from Texts and Transcripts. Lawrence Erlbaum.  
  • 15.3.5. McDermott, R. et al. (2010). Symposium: Data Collection and Collaboration. PS: Political Science and Politics, 43(1), 15-58.

   

Friday, June 23

Module 16 – Interpretation and History II: Interpretive Methods for Archival and Historical Research

Thomas Dodman and Daragh Grant

This module introduces students to the challenges of working with materials drawn from different social, cultural, and historical settings, and explores creative interpretive strategies for addressing these challenges. Students will be introduced to the basics of the historical method, and will be encouraged to think about how a careful attention to questions of temporality can shape and reveal new avenues in their empirical research. All three sessions will be attentive to the problem of analyzing historical materials from the standpoint of the present. Shifting meanings over time, and transformations in the criteria for judgment, present particular problems for historical researchers. In light of these challenges, students will be invited to think through the strategies available for working in a partial archive, with attention to the virtues and pitfalls of creatively thinking about historical source materials.

 

8:45am - 10:15am – Session 1: History as social science: The study of structures and events  

This session introduces students to the historical method, highlighting two key challenges to the study of historical events. Students will begin the session by working in groups to identify their own archival challenges, specifically related to two questions. First, how does the problem of temporality enter their work? And second, how do the events they study refashion the very structures of the societies on which their research is centered?  

  • 16.1.1. Sewell Jr., W.H. (2005) Three Temporalities: Toward an Eventful Sociology. Logics of History: Social Theory and Social Transformation. Chicago: University of Chicago Press, 81-123.  
  • 16.1.2. Sahlins, M. (1985) Structure and History. Islands of History. Chicago, University of Chicago Press, 136-56.

 

10:15am - 10:45am – Coffee Break

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Session 2: Practical challenges of archival research  

This session will introduce students to the more mundane practical challenges that scholars face, as well as some of the hidden possibilities that await them in the course of archival research. The readings for this session are designed to give participants a sense of the importance of understanding the production of the archive itself.  We will examine questions of interpretation raised by these readings as well as exploring how fleeting or fragmentary records might nevertheless yield a wealth of historical insights.  

To conclude this session, we will invite participants to examine a brief archival fragment. The goal of this exercise will be to attempt to bring some of the discussion of the previous two days to bear on the examination of a historical document.    

  • 16.2.1. Ginzburg, C. (1979). Clues. Theory and Society, 7(3), 273-288. DOI: 10.1007/BF00207323  
  • 16.2.2. Stoler, A. L. (2002). Colonial archives and the arts of governance. Archival Science, 2(1-2), 87-109. DOI: 10.1007/BF02435632

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Session 3: The Politics of Historical Interpretation  

At the core of historical research are questions of evidence, of both the power of the archive and the archive of power. This session explores key debates and controversies that have shaped the considerable, theoretically informed literature on the shifting coordinates of historical evidence.    

  • 16.3.1. Goldstein, J. E. (2015). Toward an Empirical History of Moral Thinking: The Case of Racial Theory in Mid-Nineteenth-Century France. The American Historical Review, 120(1), 1-27. DOI: 10.1093/ahr/120.1.1  
  • 16.3.2. Scott, D. (2004). Conscripts of Modernity: The Tragedy of Colonial Enlightenment. Durham: Duke University Press, 23-57.  

Suggested further readings  

Arlette Farge, The Allure of the Archives, trans. Thomas Scott-Railton (New Haven: Yale University Press, 2013).  

Constantin Fasolt, The Limits of History (Chicago: University of Chicago Press, 2004).  

Carlo Ginzburg, “Checking the Evidence: The Judge and the Historian,” Critical Inquiry 18 (1991): 79-92.  

Jan E. Goldstein, Hysteria Complicated by Ecstasy: The Case of Nanette Leroux (Princeton: Princeton University Press, 2011).  

Randolph Head, “Knowing the State: The Transformation of Political Knowledge in Swiss Archives, 1450-1770,” Journal of Modern History 75 (2003): 745-82.  

Joan W. Scott, “Evidence of Experience,” in Questions of Evidence: Proof, Practice, and Persuasion across the Disciplines, eds. James Chandler, Harry Harootunian and Arnold Davidson (Chicago: University of Chicago Press, 1994) 363-387.  

William H. Sewell Jr., “History, Theory, and Social Science,” in Logics of History, 1-21.  

William H. Sewell Jr., “A Theory of the Event: Marshall Sahlins’s ‘Possible Theory of History,’” in Logics of History, 197-224.  

William H. Sewell Jr., “Historical Events as Transformations of Structures: Inventing Revolution at the Bastille,” in Logics of History, 225-270.  

Carolyn Steedman. “Something She Called a Fever: Michelet, Derrida, and Dust.” American Historical Review 106 (2001): 1159-80.  

Ann Laura Stoler, Along the Archival Grain: Epistemic Anxieties and Colonial Common Sense (Princeton: Princeton University Press, 2009).  

Michel-Rolph Trouillot, Silencing the Past: Power and the Production of History (Boston: Beacon Press, 1995).

   

Monday, June 26

Module 17-1 – Natural Experiments

Daniel Hidalgo

 

8:45am - 10:15am – Introduction to Natural Experiments  

What are natural experiments? We review the concept of natural experiments and discuss their strengths and limitations through a survey of recent examples from political science and economics. We introduce a common formal framework for understanding and assessing natural experiments. 

  • 17-1.1.1. Dunning, T. (2012). Natural experiments in the social sciences: A design-based approach. Cambridge University Press. Chapters 1-4. (Book to purchase)  
  • 17-1.1.2. Di Tella, R., Galiani, S., & Schargrodsky, E. (2007). The formation of beliefs: evidence from the allocation of land titles to squatters. The Quarterly Journal of Economics, 122(1), 209-241. DOI: 10.1162/qjec.122.1.209  
  • 17-1.1.3. Hinnerich, B. T., & Pettersson‐Lidbom, P. (2014). Democracy, redistribution, and political participation: Evidence from Sweden 1919–1938. Econometrica, 82(3), 961-993. DOI: 10.3982/ECTA9607  
  • 17-1.1.4 Sances, M. W. (2016). The Distributional Impact of Greater Responsiveness: Evidence from New York Towns. The Journal of Politics, 78(1), 105-119. DOI: 10.1086/683026

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)          

12:30pm - 2:00pm – Lunch

   

2:00pm - 3:30pm – Natural Experiments: Quantitative Methods  

We critically assess natural-experimental research using an evaluative framework based on (1) the plausibility of as-if random assignment and (2) the credibility of causal and statistical assumptions.  We discuss formal tools for assessing designs on these criteria, such as sensitivity analyses, non-parametric bounds, and robustness tests.   
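As a rough illustration of probing as-if randomness, and not a substitute for the design-specific tools (sensitivity analyses, bounds, robustness tests) discussed in the readings, the Python sketch below compares a pre-treatment covariate across treated and control units and uses a permutation test to ask how often chance alone would produce an imbalance at least as large; all values and the assignment vector are invented.

    import random

    covariate = [3.1, 2.9, 3.4, 3.0, 2.7, 3.2, 4.8, 4.5, 4.9, 5.1]   # invented pre-treatment values
    treated   = [1,   0,   1,   0,   0,   1,   1,   0,   1,   0]     # invented "as-if random" assignment

    def mean_diff(cov, assign):
        t = [c for c, d in zip(cov, assign) if d == 1]
        c = [c for c, d in zip(cov, assign) if d == 0]
        return abs(sum(t) / len(t) - sum(c) / len(c))

    observed = mean_diff(covariate, treated)
    rng = random.Random(0)
    draws = 10_000
    as_extreme = sum(mean_diff(covariate, rng.sample(treated, k=len(treated))) >= observed
                     for _ in range(draws))
    print(f"observed imbalance = {observed:.2f}, permutation p = {as_extreme / draws:.2f}")
    # A very small p-value would cast doubt on the claim that assignment was as-if random.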

  • 17-1.2.1. Dunning, T. (2012). Natural experiments in the social sciences: A design-based approach. Cambridge University Press. Chapters 5-6. (Book to purchase)  
  • 17-1.2.2 Rosenbaum, P. (2010). Design of Observational Studies. Springer. Chapter 3.
  • 17-1.2.3 Blattman, C., & Annan, J. (2010). The consequences of child soldiering. The Review of Economics and Statistics, 92(4), 882-898. DOI: 10.1162/REST_a_00036

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Natural Experiments: Qualitative Methods  

We highlight the essential role of qualitative methods in the analysis of natural experiments. We present examples that illustrate how qualitative evidence can bolster the credibility of causal assumptions and aid in the interpretation of quantitative results.  

  • 17-1.3.1. Dunning, T. (2012). Natural experiments in the social sciences: A design-based approach. Cambridge University Press. Chapter 7. (Book to purchase)  
  • 17-1.3.2. Ferwerda, J. & Miller, N. (2014). Political Devolution and Resistance to Foreign Rule: A Natural Experiment. American Political Science Review 108(3), 642-660. DOI: 10.1017/S0003055414000240  
  • 17-1.3.3 Kocher, M. & Monteiro, N. (2015). What’s in a Line? Natural Experiments and the Line of Demarcation in WWII Occupied France. Manuscript.

     

Tuesday, June 27

Module 17-2 – Natural Experiments

Daniel Hidalgo

 

8:45am - 10:15am – Enhancing the Credibility of Natural Experiments  

We discuss how to bolster the credibility of natural experiments at the design stage. In particular, we will focus on the role of “ex-ante” approaches to increasing the credibility of our inferences, such as the use of pre-analysis plans, results-blind review, and sample splitting. How can qualitative methods be integrated into efforts to increase research transparency?
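To fix ideas, here is a minimal Python sketch of the sample-splitting idea: exploratory analysis is confined to one randomly drawn half of the units, and the pre-registered test is then run once on the held-out half. The unit names, seed, and 50/50 split are assumptions for illustration.

    import random

    units = [f"municipality_{i}" for i in range(200)]   # hypothetical unit identifiers

    rng = random.Random(2017)        # fixed seed so the split is reproducible and auditable
    rng.shuffle(units)
    exploration, confirmation = units[:100], units[100:]

    # Explore freely on `exploration`; run the pre-specified analysis once on `confirmation`.
    print(len(exploration), len(confirmation))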

  • 17-2.1.1. Christensen, G. & Miguel, E. (Forthcoming). Transparency, Reproducibility, and the Credibility of Economics Research. Journal of Economic Literature.   
  • 17-2.1.2. Findley, M. G., Jensen, N. M., Malesky, E. J., & Pepinsky, T. B. (2016). Can results-free review reduce publication bias? The results and implications of a pilot study. Comparative Political Studies, 49(13), 1667-1703. DOI: 10.1177/0010414016655539  
  • 17-2.1.3. Hidalgo, F. D., Canello, J., & Lima-de-Oliveira, R. (2016). Can politicians police themselves? Natural experimental evidence from Brazil’s audit courts. Comparative Political Studies, 49(13), 1739-1773. DOI: 10.1177/0010414015626436  
  • 17-2.1.4. Hidalgo, F.D. (2017). Purges: The Legacy of Dictatorship in Brazilian Politics.  Manuscript.

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Design Your Own Natural Experiment  

In this session, we give participants the opportunity to design a natural experiment related to their own work and receive feedback from course participants.

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Multi-Method Research and Natural Experiments    

We end the course by evaluating the promise and obstacles to the use of multi-method research in the analysis of natural experiments. Drawing upon the previous sessions and readings, we discuss how qualitative methods can help address some of the criticisms of natural experiments, as well as how natural experiments can bolster the inferences drawn from qualitative evidence.   

  • 17-2.3.1. Dunning, T. (2012). Natural experiments in the social sciences: A design-based approach. Cambridge University Press. Chapter 11.  (Book to purchase)
     

Further Readings by Topic (for Modules 17-1 and 17-2)

Standard Natural Experiments:

  • Christopher Blattman, “From Violence to Voting: War and Political Participation in Uganda” American Political Science Review 103(2) (May 2009): 231-247.
  • Raghabendra Chattopadhyay and Esther Duflo, “Women as Policy Makers: Evidence from a Randomized Experiment in India,” Econometrica 72(5) (September 2004): 1409-1443.
  • Daniel Doherty, Donald Green, and Alan Gerber, “Personal Income and Attitudes toward Redistribution: A Study of Lottery Winners,” Political Psychology 27(3) (June 2006): 441-458.
  • Claudio Ferraz and Frederico Finan, “Exposing Corrupt Politicians: The Effect of Brazil’s Publicly Released Audits on Electoral Outcomes,” Quarterly Journal of Economics 123(2) (May 2008): 703-745.
  • Susan Hyde, “The Observer Effect in International Politics: Evidence from a Natural Experiment,” World Politics 60(1) (October 2007): 37–63.
  • Jason Lyall, “Does Indiscriminate Violence Incite Insurgent Attacks? Evidence from Chechnya,” Journal of Conflict Resolution 53(3) (June 2009): 331-362.
  • Daniel N. Posner, “The Political Salience of Cultural Difference: Why Chewas and Tumbukas Are Allies in Zambia and Adversaries in Malawi,” American Political Science Review 98(4) (November 2004): 529-545.  

Regression-Discontinuity Designs:

  • Thad Dunning and Janhavi Nilekani, “Ethnic Quotas and Political Mobilization: Caste, Parties, and Distribution in Indian Village Councils.” Working paper, Department of Political Science, Yale University (2010).  Available at http://www.thaddunning.com/research/all-research.
  • David S. Lee, “Randomized Experiments from Non-random Selection in U.S. House Elections,” Journal of Econometrics 142(2) (February 2008): 675-697.
  • Amy Lerman, “Bowling Alone (With my Own Ball and Chain): The Effects of Incarceration and the Dark Side of Social Capital.”  Manuscript, Department of Politics, Princeton University (2008).
  • Donald L. Thistlewaite and Donald T. Campbell, “Regression-discontinuity Analysis: An Alternative to the Ex-post Facto Experiment,” Journal of Educational Psychology 51(6) (December 1960): 309-317.  

Instrumental-Variables Designs:

  • Edward Miguel, Shanker Satyanath, and Ernest Sergenti, “Economic Shocks and Civil Conflict: An Instrumental Variables Approach,” Journal of Political Economy 112(4) (August 2004): 725-753.  

Analysis and Design:

  • Joshua D. Angrist and Alan B. Krueger, “Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments,” Journal of Economic Perspectives 15(4) (Fall 2001): 69-85.
  • Henry Brady and David Collier, eds., Rethinking Social Inquiry: Diverse Tools, Shared Standards, 2nd ed. (Rowman & Littlefield, 2010).
  • Donald T. Campbell and Julian C. Stanley, Experimental and Quasi-Experimental Designs for Research (Houghton Mifflin Co., 1963).
  • Thad Dunning, “Improving Causal Inference: Strengths and Limitations of Natural Experiments,” Political Research Quarterly 61(2) (June 2008): 282-293.
  • Thad Dunning, “Model Specification in Instrumental-Variables Regression,” Political Analysis 16(3) (July 2008): 290-302.
  • Thad Dunning, “Natural and Field Experiments: The Role of Qualitative Methods,” Qualitative Methods Newsletter 6(2) (2008).
  • David Freedman, Statistical Models: Theory and Practice (Cambridge University Press, 2005).
  • David Freedman, Robert Pisani, and Roger Purves, Statistics, 4th ed. (W.W. Norton & Co., 2007), Chapter 1 (“Controlled Experiments”) and Chapter 2 (“Observational Studies”).
  • Donald P. Green, Terence Y. Leong, Holger L. Kern, Alan S. Gerber, and Christopher W. Larimer,   “Testing the Accuracy of Regression Discontinuity Analysis Using Experimental Benchmarks,”  Political Analysis 17(4) (October 2009): 400-417.
  • Allison J. Sovey and Donald P. Green, “Instrumental Variables Estimation in Political Science: A Readers’ Guide,” American Journal of Political Science 55(1) (January 2011): 188-200.  

Qualitative Methods

  • Kripa Ananthpur, Kabir Malik, and Vijayendra Rao, “The Anatomy of Failure: An Ethnography of a Randomized Trial to Deepen Democracy in Rural India.” June 2014
  • Christopher Blattman, Tricia Gonwa, Julian Jamison, Katherine Rodrigues, and Margaret Sheridan. “Measuring the Measurement Error: A Method to Qualitatively Validate Survey Data”. November 2014.
  • Elizabeth Levy Paluck. “The Promising Integration of Qualitative Methods and Field Experiments.” Annals of the American Academy of Political and Social Science 628 (March 2010): 59-71.

   

Wednesday, June 28

Module 17-3 – Causal Inference from Causal Models

Alan M. Jacobs  

This module will explore how we can use causal models to design and implement qualitative and mixed-method empirical strategies of causal inference. A great deal of recent methodological progress in the social sciences has focused on how features of a research design – such as randomization by the researcher or by nature – can allow for causal identification with minimal assumptions.  Yet, for many of the questions of greatest interest to social scientists and policymakers, randomization or its close equivalents are unavailable. We are, in short, often forced to rely on beliefs about how the world works – that is, on models. Based on a book-in-progress by Macartan Humphreys and Alan Jacobs, this module will examine how we can engage in systematic model-based causal inference. Specifically, we will explore how researchers can encode their prior knowledge in a probabilistic causal model (or Bayesian network) and an associated directed acyclic graph (DAG), use the model to make research design choices (including selecting cases and choosing observations), and draw inferences about causation at the level of both individual cases and populations, using both qualitative and quantitative data.  

While this module and the associated book manuscript are grounded in a vast literature on causal models, the required readings will be drawn largely from the manuscript itself, which addresses those concepts from the literature that are most relevant to the module’s objectives.

 

8:45am - 10:15am – What is a Causal Model?  

In this session, we will learn the “nuts and bolts” of causal models and their graphical counterparts, directed acyclic graphs (DAGs). How can we formalize our beliefs about relationships in a given domain in the form of a causal model? What does and does not need to be specified when writing down a causal model? What are the rules for visually representing causal dependencies in a DAG? How can a more detailed causal model underwrite, or imply, a less-detailed one? And how can we represent causal estimands of interest – such as a case-level causal effect, a causal pathway, or an average causal effect – within a causal model?
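For readers who like to see the idea in code, the following Python sketch encodes a toy binary causal model: parent sets define the DAG, structural functions map parents and exogenous “noise” terms into values, and a case-level causal effect is read off by comparing the two interventions on X while holding the exogenous terms fixed. The variables, functional forms, and probabilities are invented and are not drawn from the Humphreys and Jacobs manuscript.

    import random

    # Parent sets (the DAG): X -> M -> Y, plus a direct X -> Y arrow.
    PARENTS = {"X": [], "M": ["X"], "Y": ["M", "X"]}

    def solve(u, do_X=None):
        """Solve the model for one draw of exogenous terms, optionally intervening on X."""
        x = u["uX"] if do_X is None else do_X
        m = x and u["uM"]              # M := f_M(X, uM)
        y = m or (x and u["uY"])       # Y := f_Y(M, X, uY)
        return {"X": x, "M": m, "Y": y}

    # A case-level causal effect: fix the exogenous terms and compare do(X=1) with do(X=0).
    u = {name: random.random() < 0.5 for name in ("uX", "uM", "uY")}
    effect = solve(u, do_X=True)["Y"] - solve(u, do_X=False)["Y"]
    print("Effect of X on Y for this case:", effect)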

  • 17-3.1.1. Humphreys, M. & Jacobs, A. Integrated Inferences. Manuscript in progress. Chapters 1, 2, and 4

Recommended

  • 17-3.1.2. Humphreys, M. & Jacobs, A. Integrated Inferences. Manuscript in progress. Chapter 3.

 

10:15am - 10:45am – Coffee Break    

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – What Can Causal Graphs Tell Us?  

In this session, we will examine what we can learn about research design from a graphical representation of a causal model. In particular, we will explore the property of “d-separation,” which allows one to read relations of conditional independence off of the structure of a properly constructed DAG. We will then assess how understanding relations of conditional independence can help us identify potentially informative pieces of data for a given causal estimand – that is, how causal models can help us figure out what it is we want to observe.   
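The self-contained Python sketch below shows one standard way to check d-separation (via the ancestral-graph-and-moralization route rather than by enumerating paths); the example DAG and variable names are hypothetical.

    from collections import defaultdict, deque

    def ancestors(parents, nodes):
        # All ancestors of `nodes` under the parent mapping, including `nodes` themselves.
        seen, stack = set(nodes), list(nodes)
        while stack:
            n = stack.pop()
            for p in parents.get(n, []):
                if p not in seen:
                    seen.add(p)
                    stack.append(p)
        return seen

    def d_separated(parents, xs, ys, zs):
        # True if every node in xs is d-separated from every node in ys given zs.
        keep = ancestors(parents, set(xs) | set(ys) | set(zs))
        undirected = defaultdict(set)
        for child in keep:
            ps = [p for p in parents.get(child, []) if p in keep]
            for p in ps:                      # keep parent-child edges, undirected
                undirected[p].add(child)
                undirected[child].add(p)
            for i in range(len(ps)):          # "moralize": marry co-parents
                for j in range(i + 1, len(ps)):
                    undirected[ps[i]].add(ps[j])
                    undirected[ps[j]].add(ps[i])
        blocked = set(zs)                     # delete the conditioning set, then
        reached = set(xs) - blocked           # test whether xs can still reach ys
        frontier = deque(reached)
        while frontier:
            n = frontier.popleft()
            for m in undirected[n]:
                if m not in blocked and m not in reached:
                    reached.add(m)
                    frontier.append(m)
        return reached.isdisjoint(ys)

    # Hypothetical DAG: C -> X, X -> M, M -> Y, C -> Y (each key lists its parents).
    parents = {"C": [], "X": ["C"], "M": ["X"], "Y": ["M", "C"]}
    print(d_separated(parents, {"X"}, {"Y"}, set()))        # False: X -> M -> Y is open
    print(d_separated(parents, {"X"}, {"Y"}, {"M", "C"}))   # True: both paths are blocked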

  • 17-3.2.1. Humphreys, M. & Jacobs, A. Integrated Inferences. Manuscript in progress. Chapter 5 - section 5.1.

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Make Your Own Model  

In this session, students will have a chance to write down their own causal models and draw the associated DAGs, formally encoding their own beliefs about causal relationships in a domain of interest to them. This will be an opportunity to work through some of the choices that researchers confront when constructing causal models.

   

Thursday, June 29

Module 17-4 – Causal Inference from Causal Models

Alan M. Jacobs

 

8:45am - 10:15am – Process Tracing from a Causal Model  

In this session, we will learn how we can carry out process tracing with causal models. We will see how we can use within-case information, together with a model, to draw inferences about what would or did cause the outcome in a given case. We will see how a model-based approach to process tracing provides an explicit and theoretically disciplined procedure for determining which pieces of within-case evidence are informative and how their observation should shift causal beliefs. Moreover, while the “process tracing” metaphor implies the examination of a causal chain between X and Y, we will see that informative observations may come from many different parts of a causal network. We will work through a substantive application of the approach to the question of inequality’s effect on democratization, drawing on theoretical arguments by Boix (2003), Acemoglu and Robinson (2005), and Ansell and Samuels (2014) and on data from Haggard and Kaufman (2012). 
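A toy Bayes-rule calculation conveys the basic logic of letting a within-case clue shift a case-level causal belief; the probabilities below are invented for illustration, whereas in the model-based approach covered in this session they would be derived from the causal model itself.

    prior_H = 0.5          # prior that X caused Y in this case
    p_clue_if_H = 0.8      # assumed probability of observing the clue if H is true
    p_clue_if_notH = 0.2   # assumed probability of observing the clue if H is false

    posterior_H = (p_clue_if_H * prior_H) / (
        p_clue_if_H * prior_H + p_clue_if_notH * (1 - prior_H))
    print(f"Posterior belief that X caused Y, given the clue: {posterior_H:.2f}")   # 0.80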

  • 17-4.1.1. Humphreys, M. & Jacobs, A. Integrated Inferences. Manuscript in progress. Chapter 5 (remainder) and Chapter 7.  
  • 17-4.1.2. Ansell, B. W., & Samuels, D. J. (2014). Inequality and Democratization. Cambridge University Press. Chapter 4.

  • 17-4.1.3. Haggard, S., & Kaufman, R. R. (2012). Inequality and regime change: Democratic transitions and the stability of democratic rule. American Political Science Review, 106(03), 495-516. DOI: 10.1017/S0003055412000287

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)      

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Mixed-Method Inference from a Causal Model  

In this session, we will see how mixed-method inference can be grounded in a causal-model-based approach. The session will examine how we can use causal models to draw inferences about population-level causal relations (such as average causal effects) from any combination of qualitative and quantitative data. This session will address the approach to mixing methods presented in Humphreys and Jacobs (2015) and how that approach can itself be grounded in a causal-model framework.  

  • 17-4.2.1. Humphreys, M., & Jacobs, A. M. (2015). Mixing methods: A Bayesian approach. American Political Science Review, 109(04), 653-673. DOI: 10.1017/S0003055415000453  
  • 17-4.2.2 Humphreys, M. & Jacobs, A. Integrated Inferences. Manuscript in progress. Chapter 8.

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Causal Models and Research Design Choices  

In this session, we will explore the variety of ways in which causal models can help us make research-design choices. These include figuring out which pieces of within-case evidence to examine (i.e., identifying from which observations we can learn the most), selecting cases for process tracing, and striking the optimal balance between quantitative breadth and qualitative depth in a mixed-method project.  

  • 17-4.3.1. Humphreys, M. & Jacobs, A. Integrated Inferences. Manuscript in progress. Chapter 6, Chapter 9, and Chapter 10  

Recommended:  

  • 17-4.3.2. Lieberman, E. S. (2005). Nested analysis as a mixed-method strategy for comparative research. American Political Science Review, 99(03), 435-452. DOI: 10.1017/S0003055405051762

  • 17-4.3.3. Seawright, J., & Gerring, J. (2008). Case selection techniques in case study research: A menu of qualitative and quantitative options. Political Research Quarterly, 61(2), 294-308. DOI: 10.1177/1065912907313077  
  • 17-4.3.4. Herron, M. C., & Quinn, K. M. (2016). A careful look at modern case selection methods. Sociological Methods & Research, 45(3), 458-492. DOI: 10.1177/0049124114547053

   

Monday, June 26

Module 18-1 – Basics of Set Methods and QCA

Carsten Schneider  

This module presents the basic principles and practices of set-analytic methods, in general, and Qualitative Comparative Analysis (QCA), in particular. After introducing the tools of formal logic and set theory that underpin this family of methods, participants learn about the formalized analysis of set relations using truth tables. Throughout the module, we will use examples of applied QCA in order to illustrate our points.

 

8:45am - 10:15am – Introduction to Qualitative Comparative Analysis (QCA): Causal Complexity, Set Calibration  

This session introduces QCA, especially its use as a tool for deciphering and unraveling causal complexity. QCA uses set-analytic procedures that are consistent with common practices in case-oriented comparative research. Since QCA, essentially, is about identifying set relations in your data, a necessary first step is to assign membership scores of cases in sets, i.e. to calibrate sets. The basics of QCA are illustrated with a published example of applied QCA.
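As a concrete illustration of calibration, here is a short Python sketch of the “direct method” discussed in the Ragin reading, which maps a raw score into fuzzy-set membership using three substantive anchors (full non-membership, crossover, full membership). The anchors and example values are invented, and the module itself uses dedicated software rather than hand-rolled code.

    import math

    def calibrate(value, full_out, crossover, full_in):
        """Direct-method sketch: raw score -> fuzzy membership in [0, 1]."""
        if value >= crossover:
            log_odds = (value - crossover) * 3.0 / (full_in - crossover)
        else:
            log_odds = (value - crossover) * 3.0 / (crossover - full_out)
        return math.exp(log_odds) / (1 + math.exp(log_odds))

    # Hypothetical anchors for membership in the set "wealthy country"
    # (GDP per capita in $1,000s): fully out at 2.5, crossover at 12.5, fully in at 25.
    for gdp in (2.5, 10, 12.5, 20, 25, 40):
        print(gdp, round(calibrate(gdp, 2.5, 12.5, 25), 2))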

  • 18-1.1.1. Ragin, C. C. (2008). Redesigning social inquiry: Fuzzy sets and beyond. Chicago: University of Chicago Press. Chapters 4-5. (book to purchase)  
  • 18-1.1.2. Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press. Introduction, chapters 1-2. (book to purchase)  
  • 18-1.1.3. Vis, B. (2009). Governments and unpopular social policy reform: Biting the bullet or steering clear? European Journal of Political Research, 48(1), 31-57. DOI: 10.1111/j.1475-6765.2008.00783.x  

Recommended:

  • 18-1.1.4. Goertz, G., & Mahoney, J. (2012). Mathematical Prelude: A Selective Introduction to Logic and Set Theory for Social Scientists. A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press, 16-38.  
  • 18-1.1.5. Marx, A., Rihoux, B., & Ragin, C. (2014). The origins, development, and application of Qualitative Comparative Analysis: the first 25 years. European Political Science Review, 6(01), 115-142. DOI: 10.1017/S1755773912000318

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)          

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Set Relations (Necessity, Sufficiency, INUS, SUIN) and Parameters of Fit  

This session spells out the logic of set relations and their link to the notions of necessity and sufficiency and their derivatives INUS and SUIN conditions. We contrast set relations with correlations and also introduce the so-called parameters of fit (consistency and coverage) that are needed if and when the data at hand deviates (slightly) from perfect set relations.  
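To make the parameters of fit concrete, the following Python sketch computes fuzzy-set consistency and coverage for the sufficiency claim “X is sufficient for Y,” using the standard formulas discussed in the recommended Ragin (2006) piece; the membership scores are invented.

    X = [0.9, 0.7, 0.6, 0.2, 0.1]    # invented fuzzy memberships in the condition
    Y = [1.0, 0.8, 0.4, 0.6, 0.3]    # invented fuzzy memberships in the outcome

    overlap = sum(min(x, y) for x, y in zip(X, Y))
    consistency = overlap / sum(X)   # how nearly X is a subset of Y
    coverage = overlap / sum(Y)      # how much of Y the condition accounts for

    print(f"consistency = {consistency:.2f}, coverage = {coverage:.2f}")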

  • 18-1.2.1. Ragin, C. C. (2008). Redesigning social inquiry: Fuzzy sets and beyond. Chicago: University of Chicago Press. Chapter 3. (book to purchase)  
  • 18-1.2.2. Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press. Introduction, Chapters 3, 5. (book to purchase)  

Recommended:  

  • 18-1.2.3. Ragin, C. C. (2006). Set relations in social research: Evaluating their consistency and coverage. Political Analysis, 14(3), 291-310. DOI: 10.1093/pan/mpj019

 

3:30pm - 4:00pm – Coffee Break

        

4:00pm - 5:30pm – Constructing and Analyzing Truth Tables  

This session describes the procedures for constructing and analyzing truth tables. Truth tables are at the heart of any QCA. We first explain how not only crisp, but also fuzzy set data can be represented in a truth table. Then we explain how the logical minimization of a truth table works.  
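A toy crisp-set example may help: the Python sketch below assigns cases to truth table rows and performs the single minimization step of merging two sufficient rows that differ on exactly one condition. Conditions, cases, and outcome codes are invented, and a real analysis would of course use the dedicated software introduced later in the sequence.

    from itertools import product

    # Rows are (A, B, C) membership triples; 1 = outcome present. Unobserved
    # combinations of conditions would be the "logical remainders".
    cases = {
        "Case1": ((1, 1, 1), 1),
        "Case2": ((1, 1, 0), 1),
        "Case3": ((0, 1, 0), 0),
        "Case4": ((0, 0, 1), 0),
    }

    sufficient_rows = sorted({row for row, outcome in cases.values() if outcome == 1})

    def merge(r1, r2):
        """Merge two rows differing on one condition; '-' marks the dropped condition."""
        diffs = [i for i in range(len(r1)) if r1[i] != r2[i]]
        if len(diffs) == 1:
            merged = list(r1)
            merged[diffs[0]] = "-"
            return tuple(merged)
        return None

    for r1, r2 in product(sufficient_rows, repeat=2):
        if r1 < r2 and (m := merge(r1, r2)):
            print(f"{r1} and {r2} minimize to {m}")   # A*B*~C and A*B*C reduce to A*B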

  • 18-1.3.1. Ragin, C. (1987). Boolean approach to qualitative comparison. The Comparative Method: Moving Beyond Qualitative and Quantitative Strategies. University of California Press. Chapter 6.  
  • 18-1.3.2. Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press. Chapter 4. (book to purchase)  

Recommended:  

  • 18-1.3.3. Ragin, C. & Amoroso, L. (2011). Constructing Social Research, Second Edition. Pine Forge Press. Chapter 6.  
  • 18-1.3.4. Rihoux, B. & Ragin, C. (2009). Configurational Comparative Methods. Sage. Chapter 3.

   

Tuesday, June 27

Module 18-2 – Set Theory Meets Noisy Data

Carsten Schneider and Ingo Rohlfing  

This module spells out the problems that arise when the neat formal logic introduced in the previous module meets noisy social science data, and some solutions to them. The first problem, inconsistent set relations, has already been discussed in the previous module. The second problem is so-called limited diversity, that is, the empirical non-existence of logically possible cases. We discuss the use of counterfactual reasoning as a tool for handling such logical remainders. The last session will contain hands-on exercises familiarizing participants with the R packages QCA (Dusa 2007) and SetMethods (Medzihorsky et al. 2016). For these exercises, we will use data from published articles using QCA.

 

8:45am - 10:15am – Counterfactual Analysis: A Set-Analytic Approach

Carsten Schneider, Central European University, Budapest  

This session further elaborates truth table analysis. One of the key features of qualitative research is its reliance on counterfactual analysis. Surprisingly, most qualitative researchers are unaware that they conduct counterfactual analysis “on the fly,” and the analytic process remains hidden and implicit. With QCA, counterfactual analysis is made explicit in the form of the distinction between “easy” versus “difficult” versus “untenable” counterfactual claims. The examination of counterfactual analysis in QCA illustrates the theory and knowledge dependence of empirical social science.

  • 18-2.1.1. Ragin, C. C. (2008). Redesigning social inquiry: Fuzzy sets and beyond. Chicago: University of Chicago Press. Chapters 8 & 9. (book to purchase)  
  • 18-2.1.2. Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press. Chapters 4, 6 and 8. (book to purchase)  

Recommended:  

  • 18-2.1.3. Ragin, C. (1987). Boolean approach to qualitative comparison. The Comparative Method: Moving Beyond Qualitative and Quantitative Strategies. University of California Press. Chapter 7.

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

   

2:00pm - 3:30pm – Examples of Applied QCA

Ingo Rohlfing, University of Cologne, Carsten Schneider, Central European University, Budapest

This session reviews several applications of set-analytic methods. Our goal is to illustrate the utility and flexibility of the approach, as well as its tight coupling with theoretical concepts. We include a large-N application to illustrate issues in applying QCA to such data.  

  • 18-2.3.1. Schneider, C. Q., & Makszin, K. (2014). Forms of welfare capitalism and education-based participatory inequality. Socio-Economic Review, 12(2), 437-62. DOI: 10.1093/ser/mwu010  
  • 18-2.3.2. Kuehn, D., Croissant, A., Kamerling, J., Lueders, H., & Strecker, A. (2016). Conditions of civilian control in new democracies: An empirical analysis of 28 ‘third wave’ democracies. European Political Science Review, 1-23. DOI: 10.1017/S1755773916000011  

Recommended:  

  • 18-2.3.3. Bara, C. (2014). Incentives and opportunities: A complexity-oriented explanation of violent ethnic conflict. Journal of Peace Research, 51(6), 696-710. DOI: 10.1177/0022343314534458  
  • 18-2.3.4. Hinterleitner, M., Sager, F., & Thomann, E. (2016). The politics of external approval: Explaining the IMF's evaluation of austerity programmes. European Journal of Political Research, 55(3), 549-567. DOI: 10.1111/1475-6765.12142

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Computer Exercises

Ingo Rohlfing, University of Cologne, Carsten Schneider, Central European University, Budapest  

This session will familiarize students with the R software packages QCA and SetMethods. We introduce the functions with which all of the analytic steps covered so far can be performed. Because the packages needed for running QCA in R are developed on a continuous basis, we assign as readings the online manuals and documentation for the packages we need. If you are not familiar with R, you can find many excellent introductions to R on the internet. The manual by Eva Thomann (reading 18-2.2.1) also gives a short introduction to basic commands in R.

Note: Adrian Dusa and Alrik Thiem published a book on Qualitative Comparative Analysis with R in 2013, but the rapid development of the software has since rendered it outdated.  


Wednesday, June 28

Module 18-3 – Integrating QCA with Process Tracing

Carsten Schneider and Ingo Rohlfing  

This module presents two extensions to the analysis of truth tables, both of which aim at enhancing the combination of QCA with follow-up case studies. We spell out the principles and practices of set-theoretic Multi-Method Research (MMR) and introduce the notion of set-theoretic theory evaluation, as opposed to hypothesis testing. The relevant functions in the R package SetMethods are introduced in the last session, again using data from published QCA research.

 

8:45am - 10:15am – Set-Theoretic Multi-Method Research

Ingo Rohlfing, University of Cologne, Carsten Schneider, Central European University, Budapest  

This session explains the principles and some (computer-aided) practices of combining the truth table analysis aspect of QCA with follow-up within-case analyses of purposefully selected cases. We discuss which cases, based on a cross-case pattern discerned with QCA, are typical and which ones are deviant. We also spell out which of the potentially many typical and deviant cases should be chosen for either single-case or comparative within-case analysis and what the analytic goal of process tracing can (and cannot) be in these different forms of comparison. We will use a set of functions in the updated R package SetMethods.
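A deliberately simplified Python sketch of the case-classification step follows (in the module itself this is done with the SetMethods functions in R, and the published typology in the assigned readings is finer-grained, distinguishing, for example, deviance in kind from deviance in degree); the membership scores are invented.

    # Each case: (membership in the solution term X, membership in the outcome Y).
    cases = {"A": (0.9, 0.8), "B": (0.8, 0.3), "C": (0.2, 0.7), "D": (0.1, 0.2)}

    def classify(x, y):
        if x > 0.5 and y > 0.5:
            return "typical (member of X and Y)"            # candidate for confirmatory process tracing
        if x > 0.5 and y <= 0.5:
            return "deviant for consistency (X without Y)"
        if x <= 0.5 and y > 0.5:
            return "deviant for coverage (Y without X)"
        return "individually irrelevant (neither X nor Y)"

    for name, (x, y) in cases.items():
        print(name, classify(x, y))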

  • 18-3.1.1. Schneider, C. Q., & Rohlfing, I. (2013). Combining QCA and Process Tracing in Set-Theoretic Multi-Method Research. Sociological Methods and Research, 42(4), 559-597. DOI: 10.1177/0049124113481341  
  • 18-3.1.2. Schneider, C. Q. & Rohlfing, I. (2017). The Importance of Test Severity and Conjunctions for Case Selection in Set-theoretic Multimethod Research. Typescript.  

Recommended:  

  • 18-3.1.3. Ragin, C. C., & Schneider, G. A. (2011). Case-oriented theory building and theory testing. The Sage handbook of innovation in social research methods, 150-66.  
  • 18-3.1.4. Rohlfing, I., & Schneider, C. Q. (2013). Improving Research On Necessary Conditions: Formalized Case Selection for Process Tracing after QCA. Political Research Quarterly, 66(1), 220-235. DOI: 10.1177/1065912912468269

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)          

12:30pm - 2:00pm – Lunch        

 

2:00pm - 3:30pm – Set-theoretic Multi-Method Research & Theory Evaluation

Carsten Schneider, Central European University, Budapest  

At the beginning of this session, we will finish the discussion of how to combine QCA with process tracing. The second half of the session introduces the notion of theory evaluation and juxtaposes it with that of hypothesis testing. Theory evaluation is a tool both for updating theoretical hunches held prior to running a QCA and for identifying cases for within-case analysis.   

  • 18-3.2.1. Ragin, C. C. (2014). The comparative method: Moving beyond qualitative and quantitative strategies. University of California Press. 118–21.  
  • 18-3.2.2 Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press. Chapter 11.3. (book to purchase)  

Recommended:  

  • 18-3.2.3. Schneider, C. Q., & Maerz, S. F. (2017). Legitimation, cooptation, and repression and the survival of electoral autocracies. Zeitschrift für Vergleichende Politikwissenschaft, 1-23. DOI: 10.1007/s12286-017-0332-2  
  • 18-3.2.4. Thomann, E. (2015). Customizing Europe: transposition as bottom-up implementation. Journal of European Public Policy, 22(10), 1368-1387. DOI: 10.1080/13501763.2015.1008554

 

3:30pm - 4:00pm – Coffee Break        

 

4:00pm - 5:30pm – Computer Exercises for Set-Theoretic Multi-Method Research and Theory Evaluation

Ingo Rohlfing, University of Cologne  

This session uses the empirical example of the study by Schneider and Maerz (2017). We use the analysis to illustrate the idea behind theory evaluation and solution-based case selection.  

  • 18-3.3.1. Schneider, C. Q., & Maerz, S. F. (2017). Legitimation, cooptation, and repression and the survival of electoral autocracies. Zeitschrift für Vergleichende Politikwissenschaft, 1-23. DOI: 10.1007/s12286-017-0332-2

   

Thursday, June 29

Module 18-4 – Set Methods, Robustness and Data Structures

Ingo Rohlfing  

A truth table analysis requires empirical researchers to make many modeling decisions that potentially affect the resulting QCA solution. We first introduce different strategies for assessing what ‘correctness’ and ‘robustness’ mean in a truth table analysis. Subsequently, we will discuss on an analytical level how different design decisions affect the robustness of QCA results.

 

8:45am - 10:15am – Robustness and Sensitivity Analyses  

This session reviews the emerging debate about the simulation-based, computational assessment of the quality of QCA.  

  • 18-4.1.1. Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press. Chapter 11.2. (book to purchase)  
  • 18-4.1.2. Rohlfing, I. (2015). Mind the gap: A review of simulation designs for Qualitative Comparative Analysis. Research and Politics, 2(4), 1-4. DOI: 10.1177/2053168015623562  

Recommended:  

  • 18-4.1.3. Krogslund, C., Choi, D.D., & Poertner, M. (2014). Fuzzy Sets on Shaky Ground: Parametric and Specification Sensitivity in fsQCA. Political Analysis, 23(1), 21-41. DOI: 10.1093/pan/mpu016  
  • 18-4.1.4. Baumgartner, M. & Thiem, A. (2015). Model Ambiguities in Configurational Comparative Research. Sociological Methods & Research. DOI: 10.1177/0049124115610351

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

   

2:00pm - 3:30pm – Data Structures in QCA  

This session focuses on the rather common situation in applied comparative analysis in which the cases at hand cluster around specific features. This occurs, for instance, when the data has a panel structure or when cases are compared across different world regions, sectors of the economy, etc. We introduce a computer-assisted diagnostic tool for detecting whether the QCA results obtained from the pooled data also hold for each of the clusters in the data.  
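The logic of such a diagnostic can be conveyed with a small Python sketch that compares pooled consistency with within-cluster consistency for a sufficiency claim; the clusters, membership scores, and the masking pattern they produce are all invented, and the module's own diagnostics run in R.

    data = {   # cluster -> (memberships in X, memberships in Y)
        "region1": ([0.9, 0.8, 0.7], [1.0, 0.9, 0.8]),
        "region2": ([0.8, 0.6, 0.7], [0.4, 0.5, 0.3]),
    }

    def consistency(X, Y):
        return sum(min(x, y) for x, y in zip(X, Y)) / sum(X)

    pooled_X = [x for X, _ in data.values() for x in X]
    pooled_Y = [y for _, Y in data.values() for y in Y]
    print("pooled:", round(consistency(pooled_X, pooled_Y), 2))
    for cluster, (X, Y) in data.items():
        print(cluster, round(consistency(X, Y), 2))   # a large gap flags cluster-specific results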

  • 18-4.2.1. Castro, R. G., & Ariño, M. A. (2016). A General Approach to Panel Data Set-Theoretic Research. Journal of Advances in Management Sciences & Information Systems, 2, 63-76. DOI: 10.6000/2371-1647.2016.02.06  
  • 18-4.2.2. Baumgartner, M. (2013). Detecting causal chains in small-n data. Field Methods, 25(1), 3-24. DOI: 10.1177/1525822X12462527  

Recommended:  

  • 18-4.2.3. Caren, N., & Panofsky, A. (2005). TQCA: A technique for adding temporality to qualitative comparative analysis. Sociological Methods & Research, 34(2), 147-172. DOI: 10.1177/0049124105277197  
  • 18-4.2.4. Thiem, A. (2016). Analyzing multilevel data with QCA: yet another straightforward procedure. Quality & Quantity, 50(1), 121-128. DOI: 10.1007/s11135-014-0140-6
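
To make the clustered-data diagnostic concrete, here is a minimal Python sketch that compares the pooled consistency of one sufficiency claim with its consistency within each cluster. The cluster labels and membership scores are invented, and a real analysis would rely on dedicated QCA software rather than this toy calculation.

    # Minimal sketch; the diagnostic logic is simply "recompute by cluster".

    def consistency(x, y):
        """Consistency of 'X is sufficient for Y': sum(min(x, y)) / sum(x)."""
        return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

    # Invented data: (cluster label, membership in X, membership in Y).
    data = [
        ("Europe",        0.9, 0.8), ("Europe",        0.7, 0.9), ("Europe",        0.8, 0.8),
        ("Latin America", 0.9, 0.3), ("Latin America", 0.6, 0.4), ("Latin America", 0.8, 0.5),
    ]

    pooled_x = [x for _, x, _ in data]
    pooled_y = [y for _, _, y in data]
    print("pooled consistency:", round(consistency(pooled_x, pooled_y), 2))

    for cluster in sorted({c for c, _, _ in data}):
        xs = [x for c, x, _ in data if c == cluster]
        ys = [y for c, _, y in data if c == cluster]
        print(cluster, "consistency:", round(consistency(xs, ys), 2))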

 

3:30pm - 4:00pm – Coffee Break

        

4:00pm - 5:30pm – Computer Exercises for Robustness Tests and Clustered Data Structures  

The major part of this session deals with robustness tests and sensitivity analyses of a truth table analysis. At present, no packages implement such tests, which is why we do not assign specific readings for this session. The exercise will be based on code and functions that we will share with participants in due time.

     

Monday, June 26

Module 19-1 – Concepts and Typological Theories: Case Study Research Design and Typological Theorizing

Andrew Bennett  

This module covers key aspects of small-n case study research, including designs for single and comparative case studies, typological theorizing and case selection, counterfactual reasoning, and process tracing.

 

8:45am - 10:15am – Research Designs for Single and Comparative Case Studies

Andrew Bennett, Georgetown University  

This session provides an overview of case study research design.

  • 19-1.1.1. George, A. L., & Bennett, A. (2005). Case studies and theory development in the social sciences. MIT Press. Chapter 4  (book to purchase)  
  • 19-1.1.2. Seawright, J., & Gerring, J. (2008). Case selection techniques in case study research: A menu of qualitative and quantitative options. Political Research Quarterly, 61(2), 294-308. DOI: 10.1177/1065912907313077

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Typological Theories

Andrew Bennett, Georgetown University  

This session introduces typological theorizing and offers practical advice on building typological theories, which can help in the process of choosing cases for process tracing.  

  • 19-1.2.1. George, A. L., & Bennett, A. (2005). Case studies and theory development in the social sciences. MIT Press. Chapter 11  (book to purchase)  
  • 19-1.2.2. Bennett, A. (2013). Chapter excerpt on typological theory from Checkel, J. T. (Ed.). (2013). Transnational dynamics of civil war. Cambridge University Press.
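
As a small illustration of the mechanics behind typological theorizing, the sketch below enumerates the property space implied by a set of dichotomized variables, that is, the 2^k candidate types from which a typological theory is built. The variable names are invented placeholders, not drawn from the readings.

    # Minimal sketch: listing every combination of dichotomized variables,
    # i.e., the property space a typological theory starts from.
    from itertools import product

    variables = ["strong_state", "mobilized_opposition", "external_support"]  # invented

    for i, combo in enumerate(product([0, 1], repeat=len(variables)), start=1):
        label = ", ".join(f"{name}={value}" for name, value in zip(variables, combo))
        print(f"type {i}: {label}")   # 2**3 = 8 candidate types in total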

 

3:30pm - 4:00pm – Coffee Break

          

4:00pm - 5:30pm – Exercises and Examples of Typological Theories

Andrew Bennett, Georgetown University  

This session discusses several examples of typological theories.  

  • 19-1.3.1. Edelstein, D. M. (2004). Occupational hazards: Why military occupations succeed or fail. International Security, 29(1), 49-91 (read pages 49-56, 80-91).  DOI: 10.1162/0162288041762913   
  • 19-1.3.2. Marsden, S. V. (2016). A social movement theory typology of militant organisations: contextualising terrorism. Terrorism and Political Violence, 28(4), 750-773.  DOI: 10.1080/09546553.2014.954039  
  • 19-1.3.3. Ziaja, S. Political regimes and civil conflict onset: A reassessment with a new empirical typology. Research Center for Distributional Conflict and Globalization, Heidelberg University (poster): http://unige.ch/sciences-societe/speri/files/9814/5313/1653/Sebastian_Ziaja_-_Political_regimes_and_civil_conflict_onset.pdf  
  • 19-1.3.4. Grävingholt, J., Ziaja, S., & Kreibaum, M. (2015). Disaggregating state fragility: a method to establish a multidimensional empirical typology. Third World Quarterly, 36(7), 1281-1298. DOI: 10.1080/01436597.2015.1038340

   

Tuesday, June 27

Module 19-2 – Counterfactual Analysis, Student Examples of Typological Theories      

Jack Levy and Andrew Bennett  

This module addresses the question of whether and how we can use what did not happen but which might have happened and maybe should have happened to help understand what actually did happen. We explore the utility of counterfactual analysis in helping to assess the alternative paths that history might have taken, for the purposes of validating causal inferences in historical interpretation. Given the temptation to invoke “counterfactuals of convenience” that bolster one’s preferred historical interpretations or political preferences, what are the rules for evaluating the scientific legitimacy and utility of counterfactuals?   

In the final afternoon session, students will present and discuss typological theories from their own work.

 

8:45am - 10:15am – Counterfactual Analysis

Jack Levy, Rutgers University  

This session begins with a brief discussion of the importance of counterfactuals, various types and uses of counterfactuals, and methodological problems inherent in evaluating counterfactuals. We then develop a system of methodological rules or best practices for using counterfactuals to help assess the validity of causal inferences in historical interpretation. We discuss criteria relating to clarity, the plausibility of the antecedent, the conditional plausibility of the consequent, and comparative counterfactual analysis. 

  • 19-2.1.1. Levy, J. S. (2015). Counterfactuals, Causal Inference, and Historical Analysis. Security Studies, 24(3), 378-402. DOI: 10.1080/09636412.2015.1070602  

Recommended:

  • 19-2.1.3. Gavin, F. J. (2015). What If? The Historian and the Counterfactual. Security Studies, 24(3), 425-430. DOI: 10.1080/09636412.2015.1070610

 

10:15am - 10:45am – Coffee Break    

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)          

12:30pm - 2:00pm – Lunch


2:00pm - 3:30pm – Counterfactual Analysis Exercises and Examples

Jack Levy, Rutgers University  

In this session we illustrate and elaborate upon our analytic rules for counterfactual analysis by applying them to two historical cases: the First World War and the 2003 Iraq War. There are as many counterfactuals as there are causal linkages on the road to war, but we begin with the big one, which spins off countless others: What if Austrian Archduke Franz Ferdinand had not been assassinated? Would the war still have occurred? For the Iraq War, we examine the counterfactual world defined by the hypothetical election of Al Gore as president of the United States in 2000, focusing on Frank Harvey’s counterfactual analysis.    

  • 19-2.2.1. Lebow, R.N. (2007). Contingency, catalysts and nonlinear change: the origins of World War I. In Goertz, G., & Levy, J. S. (Eds.). (2007). Explaining war and peace: Case studies and necessary condition counterfactuals. London, UK: Routledge,  85-111.  
  • 19-2.2.2. Harvey, F. P. (2012). President Al Gore and the 2003 Iraq War: A Counterfactual Test of Conventional “W”isdom. Canadian Journal of Political Science, 45(01), 1-32. DOI: 10.1017/S0008423911000904

  Recommended:  

  • 19-2.2.3. Levy, J. S. (1990). Preferences, constraints, and choices in July 1914. International Security, 15(3), 151-186. DOI: 10.2307/2538910  
  • 19-2.2.4. Dawisha, A., Ehrenberg, J., Gilley, B., Walt, S.M., & Saunders, E. (2013). Ideology, Realpolitik, and US Foreign Policy: A Discussion of Frank P. Harvey’s Explaining the Iraq War: Counterfactual Theory, Logic and Evidence. Perspectives on Politics, 11(2), 578-592.

 

3:30pm - 4:00pm – Coffee Break

        

4:00pm - 5:30pm – Student Examples of Typological Theories

Andrew Bennett, Georgetown University  

This session will focus on discussing the work of students who volunteer to talk about how they might apply typological theorizing in their own research projects.

   

Wednesday, June 28

Module 19-3 – Process Tracing I: A Bayesian Perspective

Andrew Bennett and Tasha Fairfield

This module examines the inferential logic of process tracing, which is used extensively in qualitative case studies.  We identify Bayesian probability as the foundation for causal inference in process tracing, which entails assessing which hypothesis or theory provides the best explanation for the evidence at hand.  We will present practical advice for conducting process-tracing research as well as best practices for applying Bayesian reasoning in case study analysis.   

NOTE: No prior knowledge of Bayesian analysis is required for this module.

 

8:45am - 10:15am – Practical Advice on Process Tracing and Introduction to Bayesianism

Andrew Bennett, Georgetown University  

This session provides a general introduction to the method of process tracing, practical advice for research projects that employ process tracing, and an introduction to Bayesianism.

  • 19-3.1.1.  Bennett, A., & Checkel, J. T. (Eds.). (2014). Process tracing: from metaphor to analytic tool. Cambridge University Press, Chapter 1, Chapter 10, and appendix (Book for purchase)

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Bayesian Process Tracing: Methodological Foundations and Applications

Tasha Fairfield, London School of Economics  

Bayesian probability provides a rigorous methodological foundation for qualitative research that mirrors how we intuitively use evidence to develop and evaluate alternative explanations.  We will introduce the fundamentals of Bayesian probability and explain how Bayesianism differs from the frequentist framework that underpins causal inference in most large-N research.  We will then discuss how Bayesian analysis can be explicitly applied in qualitative research, as well as prospects for improving traditional case study narratives with heuristic Bayesian reasoning.       

  • 19-3.2.1. Fairfield, T., & Charman, A. E. (2017). Explicit Bayesian analysis for process tracing: guidelines, opportunities, and caveats. Political Analysis, 1-18. DOI: 10.1017/pan.2017.14  
  • 19-3.2.2. Fairfield, T. (2013). Going where the money is: Strategies for taxing economic elites in unequal democracies. World Development, 47, 42-57. DOI: 10.1016/j.worlddev.2013.02.011     NOTE: Please skim pp. 42–45 and read only the Chilean cases, pp. 47–49.    

Recommended:  

  • 19-3.2.3.  “Appendix. Evidence and Alternative Explanations: A Bayesian Approach to Process Tracing,” in Fairfield, T., & Garay, C. (2017). Redistribution under the right in Latin America: electoral competition and organized actors in policymaking. Comparative Political Studies, 0010414017695331. NOTE: This appendix provides an introduction to the Bayesian thought process using minimal mathematics.  It may be helpful to skim the article before reading the appendix.
  • 19-3.2.4.  Fairfield, T., & Charman, A. E. (2017). “Appendix A: Explicit Bayesian Analysis in Qualitative Case Research: An Empirical Example,” in Fairfield, T., & Charman, A. E. (2017). Explicit Bayesian analysis for process tracing: guidelines, opportunities, and caveats. Political Analysis, 1-18.  NOTE: This appendix provides a highly detailed explicit Bayesian analysis of Fairfield’s (2013) Chilean tax reform case.
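
For participants who have never seen Bayes’ rule applied, the following toy Python sketch shows the single updating step that explicit Bayesian process tracing repeats for each piece of evidence. The prior and the likelihoods are invented numbers, not values from the readings.

    # Minimal sketch of Bayes' rule for one hypothesis (H) against a rival (~H).

    prior_H = 0.5                # prior probability that H is the best explanation
    p_E_given_H = 0.8            # probability of observing the evidence if H is true
    p_E_given_notH = 0.2         # probability of observing the evidence if ~H is true

    posterior_H = (p_E_given_H * prior_H) / (
        p_E_given_H * prior_H + p_E_given_notH * (1 - prior_H)
    )
    print(f"posterior probability of H: {posterior_H:.2f}")   # -> 0.80

    # Equivalent odds form, often used in process tracing:
    # posterior odds = prior odds * likelihood ratio.
    prior_odds = prior_H / (1 - prior_H)
    likelihood_ratio = p_E_given_H / p_E_given_notH
    print(f"posterior odds of H vs. ~H: {prior_odds * likelihood_ratio:.1f}")   # -> 4.0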

 

3:30pm - 4:00pm – Coffee Break

         

4:00pm - 5:30pm – Bayesian Process Tracing: Exercises and Examples

Tasha Fairfield, London School of Economics  

Participants will break into groups and practice applying Bayesian reasoning to examples that use hypotheses and evidence drawn from well-known qualitative case studies.  We will examine the potential for Bayesian reasoning to help scholars pinpoint disagreements and build consensus on causal inferences.   

Recommended:  

  • 19-3.3.1.  Slater, D. (2009). Revolutions, Crackdowns, and Quiescence: Communal Elites and Democratic Mobilization in Southeast Asia. American Journal of Sociology, 115(1), 203-254. DOI: 10.1086/597796   NOTE: Read only pp. 229-34 on the Philippines. Highlight what you think are the key pieces of evidence that support the author’s causal argument—Appeals to nationalist and religious sentiments spark and sustain popular collective action against dictatorship—and cast doubt on the rival hypothesis—Economic decline erodes the dictatorship’s ‘performance’ legitimacy and motivates popular collective action against the regime.

   

Thursday, June 29

Module 19-4 – Process Tracing II

Andrew Bennett and David Waldner  

This module discusses the philosophy of science behind process tracing and introduces Directed Acyclic Graphs as a way to model arguments and assess the completeness of process tracing.

 

8:45am - 10:15am – Philosophy of Science of Causal Mechanisms and Process Tracing

David Waldner, University of Virginia  

This session introduces process-tracing students to the “Completeness Standard,” composed of causal graphs, event-history maps, and invariant causal mechanisms. We will consider the extent to which successful execution of this standard supports valid unit-level causal inferences.

  • 19-4.1.1. Waldner, D. (2014). What makes process tracing good? Causal mechanisms, causal inference, and the completeness standard in comparative politics. In Bennett, A., & Checkel, J. T. (Eds.). (2014). Process tracing: from metaphor to analytic tool. Cambridge University Press, chapter 5. (book to purchase)  
  • 19-4.1.2. Waldner, D. (2016). Invariant Causal Mechanisms. Qualitative & Multi-Method Research 14(1/2), 28–34. DOI:  10.5281/zenodo.569316
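
As a rough illustration of how a causal graph can be represented and checked for an unbroken chain from cause to outcome, here is a minimal Python sketch. The nodes and edges are invented, and the sketch is not meant to capture the full Completeness Standard, only the simple graph bookkeeping behind it.

    # Minimal sketch: a causal graph as an adjacency list, plus a check that
    # the hypothesized cause is connected to the outcome through a complete
    # chain of intermediate mechanisms (no missing links).

    graph = {            # invented chain: X -> M1 -> M2 -> Y
        "X": ["M1"],
        "M1": ["M2"],
        "M2": ["Y"],
        "Y": [],
    }

    def has_path(graph, start, goal):
        """Depth-first search for a directed path from start to goal."""
        stack, seen = [start], set()
        while stack:
            node = stack.pop()
            if node == goal:
                return True
            if node not in seen:
                seen.add(node)
                stack.extend(graph.get(node, []))
        return False

    print(has_path(graph, "X", "Y"))   # True: the chain from cause to outcome is unbroken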

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)  

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Process Tracing Exercises and Examples

Andrew Bennett, Georgetown University  

This session analyzes and critiques published examples of process tracing and carries out process tracing exercises.  

Read and prepare the “Homework Exercises” and “Six in Class Exercises”   

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – The Completeness Standard in Process Tracing, Exercises and Examples

David Waldner, University of Virginia  

This session covers applications of the Completeness Standard.  We will collectively review one important example and how it can be modified to better meet the standard.  We will then discuss how students can employ the standard in their own work.  

  • 19-4.3.1. Owen, J. M. (1994). How liberalism produces democratic peace. International Security, 19(2), 87-125. DOI: 10.2307/2539197  
  • 19-4.3.2. Waldner, D. (2015). Process tracing and qualitative causal inference. Security Studies, 24(2), 239-250. DOI: 10.1080/09636412.2015.1036624

   

Monday, June 26

Module 20-1 – Ethnography I

Timothy Pachirat and Fred Schaffer  

How does sustained attention to meaning making in the research world contribute to the study of politics? What are the promises, and perils, of social research that invites the unruly minutiae of lived experience and conceptual lifeworlds to converse with, and contest, abstract disciplinary theories and categories? In this practice-intensive four-day short course, we explore two ethnographic methods - participant observation and interviewing - with specific attention to their potential to subvert, generate, and extend understandings of politics and power.

 

8:45am - 10:15am – Introduction to Ethnography

Timothy Pachirat, University of Massachusetts, Amherst  

This session explores the promises and pitfalls of ethnographic approaches to the political.

  • 20-1.1.1. Geertz, C. (1973). Thick description: Toward an interpretive theory of culture. In The Interpretation of Cultures. Basic Books.  
  • 20-1.1.2. Schatz, E. (2009). Ethnographic immersion and the study of politics, and What kind(s) of ethnography does political science need? In Schatz, E. ed., Political Ethnography: What Immersion Contributes to the Study of Power. University of Chicago Press, 1-22, 303-318.

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)         

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – What is Ethnographic Interviewing?

Fred Schaffer, University of Massachusetts, Amherst  

In this session, we examine the family of practices that characterize ethnographic interviewing and explore in more depth one type of ethnographic interviewing: ordinary language interviewing. Ordinary language interviewing is a tool for uncovering the meaning of words in everyday talk. By studying the meaning of words, the promise is to gain insight into the various social realities these words name, evoke, or realize.   

  • 20-1.2.1. Heyl, B.S. (2001). Ethnographic Interviewing. In Paul Atkinson, Amanda Coffey, Sara Delamont, John Lofland and Lyn Lofland, eds., Handbook of Ethnography. Sage, 369-383.

  • 20-1.2.2. Schaffer, F.C. (2016). Elucidating Social Science Concepts: An Interpretivist Guide. Routledge. Read the entire book, but pay special attention to pp. 1-64 and 89-98. [Book to purchase]  
  • 20-1.2.3. Schaffer, F.C. (2014). Thin Descriptions: The Limits of Survey Research on the Meaning of Democracy. Polity, 46(3), 303-330. DOI: 10.1057/pol.2014.14

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Ordinary Language Interviewing I

Fred Schaffer, University of Massachusetts, Amherst  

Participants learn how to conduct a basic ordinary language interview and practice doing one focusing on words of their own choosing.

   

Tuesday, June 27

Module 20-2 – Ethnography II

Timothy Pachirat and Fred Schaffer

 

8:45am - 10:15am – Ordinary Language Interviewing II

Fred Schaffer, University of Massachusetts, Amherst  

Participants learn about and practice using additional types of ordinary-language questions as well as strategies for approaching people to interview. By this time, participants have selected the sites in which they will do their field exercises. Participants work with their fieldsite groups during this session’s exercises and in the short course’s subsequent exercises.

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm – 1:30pm – Lunch

 

1:30pm - 4:00pm – Interviewing Fieldwork Exercise and Write-Up  

Participants go to fieldsites (around campus or at the Carousel Center Mall) to conduct ordinary language interviews. They then write up their main findings.

 

4:00pm - 4:30pm – Break

        

4:30pm – 6:00pm – Interviewing Debriefing

Fred Schaffer, University of Massachusetts, Amherst  

In this session, we discuss the challenges that participants encountered in approaching people to interview, conducting ordinary language interviews, and writing up results. We also discuss what participants discovered substantively in doing their fieldsite interviews.

   

Wednesday, June 28

Module 20-3 – Ethnography III

Timothy Pachirat and Fred Schaffer

 

8:45am - 10:15am – Ethics and Praxis in Participant Observation I

Timothy Pachirat, University of Massachusetts, Amherst  

Part One of an exploration of the practice of participant observation, with special emphasis on jottings, fieldnote writing, and the ethics of fieldwork.

  • 20-3.1.1. Emerson, R.M., Fretz, R.I., & Shaw, L.L. (1995). Writing Ethnographic Fieldnotes.  University of Chicago Press. (book to purchase)  
  • 20-3.1.2. Pachirat, T. (2013). Every Twelve Seconds: Industrialized Slaughter and the Politics of Sight. Yale University Press. (book to purchase)

 

10:15am - 10:45am – Coffee Break  

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)        

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Ethics and Praxis in Participant Observation II

Timothy Pachirat, University of Massachusetts, Amherst  

Part Two of an exploration of the practice of participant observation, with special emphasis on jottings, fieldnote writing, and the ethics of fieldwork.  Instructions and discussion of fieldwork exercise.  

  • 20-3.2.1. Emerson, R.M., Fretz, R.I., & Shaw, L.L. (1995). Writing Ethnographic Fieldnotes.  University of Chicago Press. (book to purchase)  
  • 20-3.2.2 Pachirat, T. (2013). Every Twelve Seconds: Industrialized Slaughter and the Politics of Sight. Yale University Press. (book to purchase)

 

3:30pm - 3:40pm – Coffee Break

   

3:40pm - 6:00pm – Participant Observation Fieldwork Exercise  

In their fieldsite groups, participants conduct participant-observation exercises in pre-selected sites.

 

6:00pm - 8:30pm – Fieldnote Writing  

Participants use this time to write up a set of fieldnotes based on jottings taken in their fieldsites.

   

Thursday, June 29

Module 20-4 – Ethnography IV

Timothy Pachirat and Fred Schaffer

 

9:15am - 10:15am – Fieldsite Group Review of Fieldnotes  

Participants exchange and comment on each other’s fieldnotes.

 

10:15am - 10:45am – Coffee Break

 

10:45am - 12:30pm – Research Design Discussion Sessions (not part of Module)       

 

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Fieldsite Group Discussions and Presentations

Timothy Pachirat, University of Massachusetts, Amherst  

Participants combine with other fieldsite groups to discuss the experience of doing participant observation.

 

3:30pm - 4:00pm – Coffee Break       

 

4:00pm - 5:30pm – Overall Debriefing (interviewing and participant observation)  

In this session, we reflect together on the following three clusters of questions: (1) How can participant observation, lifeworld interviewing, and ordinary language interviewing be fruitfully combined when doing ethnographic fieldwork? What are the potential pitfalls of such a combination? (2) To what extent does the method one adopts shape what one apprehends? Specifically, do we learn something different when we access meaning by means of (relatively unstructured) participant observation as opposed to (relatively structured) interviewing? (3) Is there anything that you learned about participant observation and/or interviewing that might or will inform your *own* research?

   

Friday, June 30

Module 21 – Multi-method Research

Jason Seawright  

This module looks at how to productively combine qualitative and quantitative methods when the overall goal is causal inference. We will discuss qualitative and statistical ideas about causation, explore the roles of regression, natural experiments, and process tracing in multi-method design, and consider multi-method tools for theory-building and for formalizing the causal inference produced by an entire research project.

 

8:45am - 10:15am – Multi-Method Designs with Regression and Process Tracing

Jason Seawright, Northwestern University   

This session looks at the theory of causal inference and how it has been used in triangulation, nested inference, and other common multi-method designs involving regression analysis and process tracing.  

  • 21.1.1. Seawright, J. (2016). Multi-method social science: Combining qualitative and quantitative tools. Cambridge University Press, Chapters 1-3. (book to purchase)  

Recommended:  

  • 21.1.2. Small, M. L. (2011). How to conduct a mixed methods study: Recent trends in a rapidly growing literature. Annual Review of Sociology, 37, 57-86. DOI: 10.1146/annurev.soc.012809.102657  
  • 21.1.3. Howard, M. M., & Roessler, P. G. (2006). Liberalizing electoral outcomes in competitive authoritarian regimes. American Journal of Political Science, 50(2), 365-381.  DOI: 10.1111/j.1540-5907.2006.00189.x

   

10:15am - 10:45am – Coffee Break

   

10:45am - 12:30pm – Multi-Method Designs with Natural Experiments

Jason Seawright, Northwestern University   

This session considers the assumptions needed to get a causal inference out of a randomized intervention in the world, and asks what qualitative and multi-method designs can add to the process.  

  • 21.2.1. Dunning, T. (2012). Natural experiments in the social sciences: A design-based approach. Cambridge University Press, Chapter 7. (Book to purchase)  
  • 21.2.2. Ferwerda, J., & Miller, N. L. (2014). Political devolution and resistance to foreign rule: A natural experiment. American Political Science Review, 108(03), 642-660.  DOI: 10.1017/S0003055414000240  
  • 21.2.3. Kocher, M., & Monteiro, N. (2015). What’s in a Line? Natural Experiments and the Line of Demarcation in WWII Occupied France. Manuscript.

  

12:30pm - 2:00pm – Lunch

   

2:00pm - 3:30pm – Formalizing Results and Building Theory in Multi-Method Research

Jason Seawright, Northwestern University   

This session explores tools for quantifying qualitative findings and for explicitly combining the results of the different components of a multi-method design, and thus shows the final steps in a multi-method analysis. It also demonstrates multi-method research designs that can help at the opposite end of the process, exploring multi-method approaches to theory-building.  

  • 21.3.1. Humphreys, M., & Jacobs, A. M. (2015). Mixing methods: A Bayesian approach. American Political Science Review, 109(04), 653-673. DOI: 10.1017/S0003055415000453  
  • 21.3.2. Bennett, A. (2015) “Appendix: Disciplining Our Conjectures: Systematizing Process Tracing with Bayesian Analysis.” In Bennett, A., & Checkel, J. T. (Eds.). (2014). Process tracing: from metaphor to analytic tool. Cambridge University Press.  

Recommended:  

  • 21.3.3. Siroky, D. S. (2009). Navigating random forests and related advances in algorithmic modeling. Statistics Surveys, 3, 147-163. DOI: 10.1214/07-SS033

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Institute Conclusion

   

Friday, June 30

Module 22 – Qualitative Comparative Analysis and Fuzzy Sets

Charles Ragin

This module provides an overview of Qualitative Comparative Analysis (QCA) and fuzzy sets, including instruction in the use of the fsQCA software (a free download from www.fsqca.com).  Topics include: necessary/sufficient causation, causal complexity, counterfactual analysis, and crisp-set and fuzzy-set configurational analysis using truth tables. Particular attention is given to the phenomenon of limited diversity and how QCA enables researchers to employ counterfactual reasoning as a way to address limited diversity.

 

8:45am - 10:15am – Session 1   

This session introduces QCA, especially its use as a tool for deciphering and unraveling causal complexity. QCA uses set-analytic procedures that are consistent with common practices in case-oriented comparative research. The key difference is that with QCA it is possible to examine an intermediate number of cases—too many for conventional case-oriented analysis, but too few for most conventional quantitative techniques.

  • 22.1.1. Ragin, C. C. (2008). Redesigning social inquiry: Fuzzy sets and beyond. Chicago: University of Chicago Press, Chapters 1-3. (book to purchase)  

Recommended:  

  • 22.1.2. Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press, Chapter 2, pp. 42-55.  
  • 22.1.3. Goertz, G., & Mahoney, J. (2012). “Mathematical Prelude: A Selective Introduction to Logic and Set Theory for Social Scientists” in Goertz, G., & Mahoney, J. (2012). A tale of two cultures: Qualitative and quantitative research in the social sciences. Princeton University Press, pp. 16‐38.

   

10:15am - 10:45am – Coffee Break

 

10:45am - 12:30pm – Session 2         

This session describes procedures for constructing and analyzing truth tables. Truth tables are at the heart of any application of QCA. One of the key features of truth table analysis is its reliance on counterfactuals. Most qualitative researchers conduct counterfactual analysis “on the fly,” often without being aware of it, so the analytic process remains hidden and implicit. With QCA, counterfactual analysis is made explicit in the form of the distinction between “easy” and “difficult” counterfactual claims. The examination of counterfactual analysis in QCA illustrates the theory and knowledge dependence of empirical social science.  

  • 22.2.1. Ragin, C. C. (2008). Redesigning social inquiry: Fuzzy sets and beyond. Chicago: University of Chicago Press, chapters 6-9. (book to purchase)  

Recommended:  

  • 22.2.2. Rihoux, B., & Ragin, C. C. (2009). Configurational comparative methods: Qualitative comparative analysis (QCA) and related techniques. Sage. Chapter 3, pp. 33-68. Full text available online: http://dx.doi.org/10.4135/9781452226569  
  • 22.2.3. Ragin, C. C. (2000). Fuzzy-set social science. University of Chicago Press, Chapters 3-5.
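
To preview the construction step in code, here is a minimal Python sketch that builds a crisp-set truth table from a small invented data set by grouping cases with identical configurations and recording how consistently each configuration shows the outcome. Real applications would use the fsQCA software introduced in the next session.

    # Minimal sketch; condition values and outcomes are invented.
    from collections import defaultdict

    # Each case: (A, B, C, outcome), all coded 0/1.
    cases = [
        (1, 1, 0, 1), (1, 1, 0, 1), (1, 0, 1, 0),
        (0, 1, 1, 1), (0, 1, 1, 0), (0, 0, 0, 0),
    ]

    rows = defaultdict(list)
    for *conditions, outcome in cases:
        rows[tuple(conditions)].append(outcome)

    print("A B C | n cases | outcome consistency")
    for config, outcomes in sorted(rows.items(), reverse=True):
        share = sum(outcomes) / len(outcomes)
        print(*config, "|", len(outcomes), "|", f"{share:.2f}")
    # Configurations with no observed cases (logical remainders) are where
    # counterfactual analysis enters the truth table procedure.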

 

12:30pm - 2:00pm – Lunch         

 

2:00pm - 3:30pm – Session 3  

This session offers practical advice in the use of fsQCA software for both crisp set and fuzzy set applications. Participants should bring their laptop computers; both PC and Mac versions are supported, with identical user interfaces. Software and sample data files can be downloaded from www.fsqca.com.  

  • 22.3.1. Ragin, C. C. (2008). Redesigning social inquiry: Fuzzy sets and beyond. Chicago: University of Chicago Press, Chapters 4-5. (book to purchase)  
  • 22.3.2. Ragin, C. C. (2014). The comparative method: Moving beyond qualitative and quantitative strategies. University of California Press. Chapters 6-8.  

Recommended  

  • 22.3.3. Schneider, C. Q., & Wagemann, C. (2012). Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis. Cambridge University Press, Chapter 4, pp. 92-116.
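
Because this session works hands-on with fuzzy sets, the sketch below shows in plain Python the log-odds logic usually described for the direct method of calibration (three substantive anchors followed by a logistic transformation). The anchors and raw scores are invented, and the exact routine implemented in the fsQCA software may differ in its details.

    # Minimal sketch of direct calibration, assuming three anchors:
    # full non-membership, crossover point, and full membership.
    import math

    def calibrate(raw, full_non, crossover, full_mem):
        """Map a raw score onto a fuzzy membership in [0, 1]."""
        if raw >= crossover:
            log_odds = 3.0 * (raw - crossover) / (full_mem - crossover)
        else:
            log_odds = 3.0 * (raw - crossover) / (crossover - full_non)
        return 1.0 / (1.0 + math.exp(-log_odds))

    # Invented example: calibrating GDP per capita (thousands of USD)
    # into membership in the set of "developed economies".
    for raw in (2, 5, 10, 20, 40):
        print(raw, "->", round(calibrate(raw, full_non=2.5, crossover=10, full_mem=30), 2))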

   

Friday, June 30

Module 23 – Within-Case and Small-N Analysis (crossover)

Andrew Bennett and David Waldner

 

8:45am - 10:15am – Counterfactuals

David Waldner, University of Virginia  

This session introduces students to counterfactuals, their role in causal inference, and methods for their evaluation.

  • 23.1.1. Levy, J. S. (2015). Counterfactuals, Causal Inference, and Historical Analysis. Security Studies, 24(3), 378-402. DOI: 10.1080/09636412.2015.1070602  
  • 23.1.2. Runhardt, R. (2016). Tracing the productive continuity of social mechanisms. Qualitative & Multi-method Research, 14(1/2), 22–28. DOI: 10.5281/zenodo.569309

 

10:15am - 10:45am – Coffee Break

 

10:45am - 12:30pm – Process Tracing

Andrew Bennett, Georgetown University  

  • 23.2.1. Bennett, A., & Checkel, J. T. (Eds.). (2014). Process tracing: from metaphor to analytic tool. Cambridge University Press. Chapter 1, Chapter 10, and appendix (Book to purchase)

     

12:30pm - 2:00pm – Lunch

 

2:00pm - 3:30pm – Typological Theory

Andrew Bennett, Georgetown University  

  • 23.3.1. Andrew Bennett, chapter excerpt on typological theory from Jeff Checkel, ed., Transnational Dynamics of Civil War.

 

3:30pm - 4:00pm – Coffee Break

 

4:00pm - 5:30pm – Institute Conclusion

   

Friday, June 30

Module 24 – Ethnography

Timothy Pachirat and Fred Schaffer    

How does sustained attention to meaning making in the research world contribute to the study of politics? What are the promises, and perils, of social research that invites the unruly minutiae of lived experience and conceptual lifeworlds to converse with, and contest, abstract disciplinary theories and categories?  This module explores the promises and pitfalls of ethnographic approaches to the political with specific attention to their potential to subvert, generate, and extend understandings of politics and power.

 

8:45am - 10:15am  Introduction to Ethnography [Timothy Pachirat]  

This session explores the promises and pitfalls of ethnographic approaches to the political.

  • 24.1.1. Clifford Geertz, “Thick Description: Toward an Interpretive Theory of Culture” in The Interpretation of Cultures (Basic Books, 1973).  
  • 24.1.2. Edward Schatz, “Ethnographic Immersion and the Study of Politics” and “What Kind(s) of Ethnography does Political Science Need?” In Edward Schatz, ed., Political Ethnography: What Immersion Contributes to the Study of Power (University of Chicago Press, 2009).

   

10:15am - 10:45am Coffee Break.

   

10:45am - 12:15pm What is Ethnographic Interviewing? [Fred Schaffer]  

In this session, we examine the family of practices that characterize ethnographic interviewing and explore in more depth one type of ethnographic interviewing: ordinary language interviewing. Ordinary language interviewing is a tool for uncovering the meaning of words in everyday talk. By studying the meaning of words, the promise is to gain insight into the various social realities these words name, evoke, or realize.   

  • 24.2.1. Barbara Sherman Heyl, “Ethnographic Interviewing.” In Paul Atkinson, Amanda Coffey, Sara Delamont, John Lofland and Lyn Lofland, eds., Handbook of Ethnography (Sage, 2001), pp. 369-383.
  • 24.2.2. Frederic Charles Schaffer, Elucidating Social Science Concepts: An Interpretivist Guide (Routledge, 2016) pp. 1-54 and 89-98.
  • 24.2.3. Frederic Charles Schaffer, “Thin Descriptions: The Limits of Survey Research on the Meaning of Democracy.” Polity (2014) 46,3: 303-30.

 

12:15pm - 2:00pm  Lunch.

   

2:00pm - 3:30pm  How to Do an Ordinary Language Interview [Fred Schaffer]  

In this session, participants will learn how to conduct an ordinary language interview and practice doing one focusing on words of their own choosing.

 

3:30pm - 4:00pm Coffee Break.

          

4:00pm - 5:30pm   Institute Conclusion