
Course Detail

Course Name: Intervention Design and Impact Evaluation
Course Code: 25SDS532
Program: M.Sc. in Social Data Science & Policy
Semester: 3
Credits: 3
Campus: Faridabad

Syllabus

Unit 1

From Idea to Theory of Change: Framing the problem and objectives. Mapping assumptions, activities, outputs, and outcomes. Defining key indicators.

Unit 2

Choosing an Evaluation Strategy: Comparing RCTs, quasi-experimental, non-experimental, and mixed-methods designs. Practical, ethical, and budget trade-offs. Basics of sampling and power.
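The sampling-and-power basics covered in this unit can be previewed with a short calculation. The sketch below, using only the Python standard library, applies the standard two-arm sample-size approximation for comparing means; the function name and default values are illustrative, not part of the course materials.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(mde_sd, alpha=0.05, power=0.80):
    """Approximate participants needed per arm in a two-arm RCT comparing
    means, for a minimum detectable effect (mde_sd) expressed in
    standard-deviation units, a two-sided test at level alpha, and the
    desired statistical power."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # critical value for a two-sided test
    z_beta = z(power)           # quantile corresponding to target power
    return ceil(2 * (z_alpha + z_beta) ** 2 / mde_sd ** 2)

# Detecting a 0.2 SD effect at 5% significance with 80% power:
print(sample_size_per_arm(0.2))  # → 393 participants per arm
```

Halving the detectable effect quadruples the required sample, which is why the "practical, ethical, and budget trade-offs" above bind so tightly on design choice.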

Unit 3

Measurement & Instrumentation: Designing surveys and administrative-data tools. Piloting and revision. Incorporating qualitative modules. Data management and transparency protocols.

Unit 4

Implementation Monitoring & Field Management: Building a monitoring plan linked to the theory of change. Tracking fidelity and context. Quality-control procedures and field-team supervision.

Unit 5

Analysis, Interpretation & Reporting: Cleaning and documenting data. Estimating impacts with the chosen method. Robustness checks. Translating findings into concise briefs and presentations for decision-makers.

Unit 6

Capstone Project: Based on a proposed project idea, students co-design a full evaluation blueprint: theory of change, monitoring plan, design and power specifications, instruments, and a concise policy pitch.

Text Books / References

Textbooks and Papers:

  • Gerber, A. S., & Green, D. P. (2012). Field experiments: Design, analysis, and interpretation. W. W. Norton & Company.
  • Banerjee, A. V., & Duflo, E. (Eds.). (2017). Handbook of field experiments (Vol. 1). North-Holland.
  • Karlan, D., & Appel, J. (2016). Failing in the field: What we can learn when field research goes wrong. Princeton University Press.
  • Leeuw, F., & Vaessen, J. (2009). Impact evaluations and development: NONIE guidance on impact evaluation. Network of Networks for Impact Evaluation (NONIE).
  • Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B., & Vermeersch, C. M. J. (2016). Impact evaluation in practice (2nd ed.). World Bank. https://doi.org/10.1596/978-1-4648-0779-4

Reference Books:

  • Glennerster, R., & Takavarasha, K. (2013). Running randomized evaluations: A practical guide. Princeton University Press.
  • Stoecker, R. (2013). Research methods for community change: A project-based approach (2nd ed.). SAGE Publications.
  • United Nations Evaluation Group. (2013). Impact evaluations: UNEG guidance document. United Nations.
  • Whetten, D. A. (1989). What constitutes a theoretical contribution? Academy of Management Review, 14(4), 490. https://doi.org/10.5465/amr.1989.4308371

Introduction

Prerequisite: Research Methods I & II, Foundations of Development Policy

Summary: This applied course guides you from a policy idea all the way to an impact study. You will learn to craft a theory of change that maps assumptions to measurable outcomes, weigh the trade-offs among randomized, quasi-experimental, non-experimental, and mixed-methods designs to choose an evaluation approach that fits real-world constraints and the research questions, and build an implementation-monitoring plan that keeps a pulse on fidelity and context while data are collected. Along the way we cover instrument design, piloting, ethics, transparency, core analytical techniques, and clear policy reporting, so that by the end you can chart a defensible evaluation strategy, supervise reliable fieldwork, interpret results, and communicate evidence to decision-makers.

Objectives and Outcomes

Course Objectives:

  1. To enable students to translate policy problems into rigorous theories of change, mapping assumptions, activities, outputs, outcomes, and measurable indicators.
  2. To develop the capacity to select and justify the most appropriate evaluation design (randomized, quasi-experimental, non-experimental, or mixed methods) and to produce defensible sampling and power calculations.
  3. To train students to design, pilot, and refine quantitative and qualitative measurement instruments, supported by transparent data-management, ethics, and documentation protocols.
  4. To build competence in constructing and executing field-monitoring and quality-control systems that track implementation fidelity, contextual shifts, and data accuracy.
  5. To equip students with the skills to clean, analyze, and interpret impact-evaluation data; conduct robustness checks; and translate findings into concise, decision-oriented communications.
  6. To integrate all of the above competencies in a capstone project that delivers a complete evaluation blueprint including theory of change, design and power specifications, monitoring plan, instruments, and policy pitch ready for stakeholder review.

Course Outcomes:

CO1: Ability to develop a rigorous theory-of-change map that links activities, outputs, outcomes, and measurable indicators for a real-world intervention.

CO2: Ability to select and justify the most appropriate evaluation design (RCT, quasi-experimental, non-experimental, or mixed methods) and to produce a defensible sampling and power plan within ethical, practical, and budget constraints.

CO3: Capacity to design, pilot, and refine quantitative and qualitative measurement instruments, complemented by transparent data-management and documentation protocols.

CO4: Capacity to implement a field monitoring and quality-control plan that tracks implementation fidelity, contextual shifts, and data accuracy throughout the study.

CO5: Ability to clean and analyze evaluation data, perform robustness checks, and translate findings into concise briefs and presentations for policy decision-makers.

CO6: Capacity to integrate all components (theory of change, design and power specifications, monitoring plan, instruments, and policy pitch) into a complete evaluation blueprint ready for stakeholder review.

Skills:

  • Theory-of-Change Mapping: Ability to translate policy problems into causal pathways with clearly specified assumptions, indicators, and testable hypotheses.
  • Evaluation Design & Power Analysis: Capacity to compare experimental, quasi-experimental, and mixed-methods designs and to calculate sample sizes and minimum-detectable effects under real-world constraints.
  • Measurement & Instrumentation: Skill in crafting, piloting, and refining quantitative surveys and qualitative modules, accompanied by transparent data-management and ethics protocols.
  • Field Monitoring & Quality Control: Competence in designing and executing implementation-fidelity checks, context tracking, and real-time data-quality assurance for multi-site studies.
  • Impact Analysis & Evidence Communication: Ability to clean and analyze evaluation data, run robustness checks, and convert technical findings into concise, decision-oriented briefs and presentations for stakeholders.
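The minimum-detectable-effect calculation named in the skills above can also be run in reverse: given a fixed per-arm sample, how small an effect could the study reliably detect? A minimal sketch using only the Python standard library (the function name and defaults are illustrative):

```python
from math import sqrt
from statistics import NormalDist

def mde_in_sd_units(n_per_arm, alpha=0.05, power=0.80):
    """Smallest true effect (in standard-deviation units) that a two-arm
    comparison of means can detect with the given per-arm sample size,
    two-sided significance level, and target power."""
    z = NormalDist().inv_cdf
    return sqrt(2 / n_per_arm) * (z(1 - alpha / 2) + z(power))

# With 400 participants per arm, effects below roughly 0.2 SD are
# unlikely to be detected at 5% significance and 80% power:
print(round(mde_in_sd_units(400), 3))  # → 0.198
```

Running this kind of check before fieldwork begins shows whether the budgeted sample can answer the research question at all, which is the practical point of the power-analysis skill.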

Program Outcomes (PO) – Course Outcomes (CO) Mapping

PO1

PO2

PO3

PO4

PO5

PO6

PO7

PO8

PO9

CO1

X

CO2

X

CO3

X

CO4

CO5

X

CO6

X

Program Specific Outcomes (PSO) – Course Outcomes (CO) Mapping

PSO1

PSO2

PSO3

PSO4

PSO5

PSO6

PSO7

PSO8

PSO9

PSO10

CO1

X

CO2

X

CO3

X

CO4

X

CO5

X

CO6

X

Evaluation Pattern

Assessment                     Internal    External
Midterm Exam                   20          –
*Continuous Assessment (CA)    30          –
End Semester                   –           50

*CA can include quizzes, assignments, projects, reports, and seminars.

