2019 | Vol. 10 | No. 2 | pp. 42-55
Article title

Evaluation of Public Interventions in a Complex Environment: Developing Generalizable Knowledge from Case Studies

Abstract
Purpose: In the debate on how to increase the effectiveness of public policy instruments, learning-oriented evaluation attracts considerable attention. The focus of interest has shifted from 'what' questions to 'why' and 'how' questions (i.e. instead of asking what works/does not work, we want to know why a particular public intervention works/does not work). However, public interventions are implemented in a complex environment characterised by feedback loops and by adaptation on the part of both those delivering and those receiving the intervention, which makes it impossible to establish universal truths that apply anywhere, anytime. On the contrary, context matters and human agency cannot be taken for granted. Thus, we are more specific in our inquiry, asking 'what works for whom in what circumstances' (the stance of the realistic evaluation approach). Case studies, which have explanatory power, do not necessarily have to serve only one-off, discrete evaluations. The aim of this article is to address the dilemma of developing generalizable knowledge from case study research and, on the basis of the extant evaluation literature, to suggest approaches that enhance its external validity and enable the formulation of middle-range theories, i.e. 'law-like' regularities delimited in time and space, which can be used for learning beyond a particular case.
Methodology: The article has been written following a careful review of the leading literature on the subject as well as a review of evaluation reports from the Science and Innovation Policy Evaluations Repository (the SIPER database) to provide insights into evaluation practice.

Findings: The case study approach is well recognised in evaluation practice in the field of research, development and innovation; however, its full potential has not been exploited in terms of drawing lessons for future public interventions.

Originality/value: Given the complexity surrounding numerous public interventions, the article suggests wider utilisation of the case study approach in evaluation, along with techniques to enhance the generalisability of knowledge gained from case study research. (original abstract)
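The 'what works for whom in what circumstances' question is often operationalised with Qualitative Comparative Analysis (QCA), one of the techniques discussed in the literature reviewed here (e.g. Befani, 2013). The following is only a minimal crisp-set sketch in Python, using entirely hypothetical case data and condition names, showing how cases can be grouped into configurations and checked for consistency with an outcome:

```python
# Minimal crisp-set QCA-style truth table (illustrative only; real QCA
# additionally involves Boolean minimisation, consistency and coverage
# measures, and treatment of limited diversity).
from collections import defaultdict

# Hypothetical cases: each maps binary context conditions to a binary
# outcome (1 = intervention worked, 0 = it did not).
cases = {
    "Case A": ({"network": 1, "funding": 1, "experience": 1}, 1),
    "Case B": ({"network": 1, "funding": 1, "experience": 0}, 1),
    "Case C": ({"network": 0, "funding": 1, "experience": 0}, 0),
    "Case D": ({"network": 1, "funding": 0, "experience": 1}, 1),
    "Case E": ({"network": 0, "funding": 0, "experience": 1}, 0),
}

def truth_table(cases):
    """Group cases by configuration; report the share showing the outcome."""
    rows = defaultdict(list)
    for name, (conds, outcome) in cases.items():
        key = tuple(sorted(conds.items()))  # configuration as a hashable key
        rows[key].append(outcome)
    return {key: sum(v) / len(v) for key, v in rows.items()}

table = truth_table(cases)
# Configurations whose cases all show the outcome (consistency == 1.0)
sufficient = [dict(k) for k, consistency in table.items() if consistency == 1.0]
```

Even this truth-table step illustrates the article's point: what is extracted from a small set of cases is not a universal effect but a regularity delimited by configuration, i.e. by the circumstances under which the intervention works.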
Affiliation
  • Poznan University of Technology, Poland
References
  • Astbury, B. (2013), "Some reflections on Pawson's Science of Evaluation: A Realist Manifesto", Evaluation, Vol. 19 No. 4, pp. 383-401.
  • Astbury, B., Leeuw, F. (2010), "Unpacking Black Boxes: Mechanisms and Theory Building in Evaluation", American Journal of Evaluation, Vol. 31 No. 3, pp. 363-381.
  • Aus, J. (2005), "Conjunctural Causation in Comparative Case-Oriented Research. Exploring the Scope Conditions of Rationalist and Institutionalist Causal Mechanism", ARENA Working Paper No. 28, November 2005.
  • Barnes, M., Matka, E., Sullivan, H. (2003), "Evidence, Understanding and Complexity: Evaluation in Non-Linear Systems", Evaluation, Vol. 9 No. 3, pp. 265-284.
  • Befani, B. (2013), "Between complexity and generalization: Addressing evaluation challenges with QCA", Evaluation, Vol. 19 No. 3, pp. 269-283.
  • Befani, B. (2016), "Causal frameworks for assessing the impact of development programmes", UEA seminar, 9 March 2016.
  • Blamey, A., Mackenzie, M. (2007), "Theories of Change and Realistic Evaluation: Peas in a Pod or Apples and Oranges?", Evaluation, Vol. 13 No. 4, pp. 439-455.
  • Blatter, J., Haverland, M. (2012), "Two or three approaches to explanatory case study research?", paper prepared for presentation at the Annual Meeting of the American Political Science Association, New Orleans, August 30-September 2, 2012.
  • Byrne, D. (2013), "Evaluating complex social interventions in a complex world", Evaluation, Vol. 19 No. 3, pp. 217-228.
  • Granger, R., Maynard, R. (2015), "Unlocking the Potential of the "What Works" Approach to Policymaking and Practice: Improving Impact Evaluations", American Journal of Evaluation, Vol. 36 No. 4, pp. 558-569.
  • Hedström, P., Swedberg, R. (1998), Social mechanisms: An analytical approach to social theory, Cambridge University Press, Cambridge.
  • Marchal, B., van Belle, S., van Olmen, J., Hoerée T., Kegels G. (2012), "Is realist evaluation keeping its promise? A review of published empirical studies in the field of health systems research", Evaluation, Vol. 18 No. 2, pp. 192-212.
  • Maxwell, J. (2004), "Using Qualitative Methods for Causal Explanation", Field Methods, Vol. 16 No. 3, pp. 243-264.
  • Patton, M. (2011), Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, SAGE, Thousand Oaks.
  • Pawson, R. (2003), "Nothing as Practical as Good Theory", Evaluation, Vol. 9 No. 4, pp. 471-490.
  • Pawson, R. (2006), Evidence-based policy: A realist perspective, SAGE, London.
  • Pawson, R., Greenhalgh, T., Harvey, G., Walshe, K. (2004), "Realist synthesis: an introduction", ESRC Research Methods Programme, University of Manchester, RMP Methods Paper No. 2.
  • Pawson, R., Tilley, N. (2007), Realistic Evaluation, SAGE, London.
  • Ragin, C. (1987), The Comparative Method: Moving Beyond Qualitative and Quantitative Strategies, University of California Press, Berkeley.
  • Riege, A. (2003), "Validity and reliability tests in case study research: a literature review with 'hands-on' applications for each research phase", Qualitative Market Research: An International Journal, Vol. 6 No. 2, pp. 75-86.
  • Schmitt, J., Beach, D. (2015), "The contribution of process tracing to theory-based evaluations of complex aid instruments", Evaluation, Vol. 21 No. 4, pp. 429-447.
  • Simons, H. (2015), "Interpret in context: Generalizing from the single case in evaluation", Evaluation, Vol. 21 No. 2, pp. 173-188.
  • Skocpol, T. (1979), States and Social Revolutions: A Comparative Analysis of France, Russia, and China, Cambridge University Press, Cambridge.
  • Solmeyer, A., Constance, N. (2015), "Unpacking the 'Black Box' of Social Programs and Policies: Introduction", American Journal of Evaluation, Vol. 36 No. 4, pp. 470-474.
  • Van der Knaap, P. (2004), "Theory-based Evaluation and Learning: Possibilities and Challenges", Evaluation, Vol. 10 No. 1, pp. 16-34.
  • Verweij, S., Gerrits, L. (2012), "Understanding and researching complexity with Qualitative Comparative Analysis: Evaluating transportation infrastructure projects", Evaluation, Vol. 19 No. 1, pp. 40-55.
  • Weiss, C. (1997), "Theory-based evaluation: Past, present and future", New Directions for Evaluation, No. 76, Jossey-Bass, San Francisco, pp. 41-55.
  • Westhorp, G. (2013), "Developing complexity-consistent theory in a realist investigation", Evaluation, Vol. 19 No. 4, pp. 364-382.
  • Woolcock, M. (2013), "Using case studies to explore the external validity of 'complex' development interventions", Evaluation, Vol. 19 No. 3, pp. 229-248.
  • Yin, R. (2013), "Validity and generalisation in future case study evaluations", Evaluation, Vol. 19 No. 3, pp. 321-332.
  • Yin, R. (2018), Case study research and applications: design and methods, SAGE.
  • Ylikoski, P. (2019), "Mechanism-based theorizing and generalization from case studies", Studies in History and Philosophy of Science, Vol. 78, pp. 14-22.