Course Learning Objectives
- Define and distinguish between monitoring and evaluation.
- Develop a program logic model to communicate an evidence-based program theory.
- Develop an M&E plan to track progress of program activities toward objectives and assess program effectiveness.
- Develop quantitative and qualitative indicators and targets for an M&E plan.
- Use relevant qualitative and quantitative data collection and analysis methods to track and evaluate program progress.
- Identify the qualities of effective qualitative and quantitative data collection tools.
- Describe how program data can be used for decision-making.
- Apply ethical guidelines for data collection and reporting.
Module Learning Objectives
Module 1: An Introduction to Monitoring and Evaluation in Global Health
- Define monitoring and evaluation.
- Distinguish between monitoring and evaluation.
- Explain why M&E is important.
- Identify monitoring best practices.
- Explain how key M&E activities fit into a typical program cycle.
- Describe strategies to address common concerns about program evaluation.
Module 2: Program Theory and Frameworks
- Define what a program theory is.
- Identify three program frameworks.
- List the five main components of a logic model.
- Develop evidence-based program outcomes that align with program impact.
- Develop program outputs that align with program activities and outcomes.
Module 3: The M&E Plan
- Describe what an M&E plan is and why it is an important aspect of program success.
- Explain the relationship between logic models and M&E plans.
- Define the key components of an M&E plan.
- Write SMART objectives.
- Name and explain the qualities of effective program indicators.
- Develop indicators and targets for an M&E plan according to specified criteria.
- Describe the six steps involved in developing and implementing an M&E plan.
Module 4: Program Monitoring
- Describe the basic steps to conducting effective program monitoring.
- List three potential data sources for program monitoring.
- Conduct descriptive analysis to summarize data for program monitoring.
- Apply data visualization principles in preparing tables and figures.
- Describe three methods for visualizing data for action (a brief sketch follows this list).
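The course's analysis and visualization activities are completed in Microsoft Excel, but as an optional preview, the same kind of descriptive summary can be sketched in a few lines of Python. This is a minimal, hypothetical example: the districts and patient counts are invented purely for illustration.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical monitoring data: one row per clinic reporting period,
# with districts and counts invented purely for illustration.
df = pd.DataFrame({
    "district": ["North", "North", "South", "South", "East", "East"],
    "patients_seen": [120, 135, 90, 110, 150, 140],
})

# Descriptive analysis: count, mean, and total patients seen per district.
summary = df.groupby("district")["patients_seen"].agg(["count", "mean", "sum"])
print(summary)

# A simple visualization for action: totals by district as a bar chart.
summary["sum"].plot(kind="bar", title="Patients seen by district")
plt.ylabel("Total patients seen")
plt.tight_layout()
plt.show()
```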
Module 5: Designing and Conducting Program Evaluations
- Describe the main steps to conducting a program evaluation.
- Explain when the five types of program evaluations are used.
- Develop relevant program evaluation questions.
- Describe three program evaluation methodologies.
- Describe two quantitative designs commonly used in program evaluation.
- Name one key element of successful dissemination of evaluation findings.
Module 6: Setting and Participant Selection
- Define the terms evaluation setting and evaluation participants.
- Explain how inclusion and exclusion criteria are used to select the evaluation setting and participants.
- Distinguish between population and sample.
- Describe the three broad sampling approaches of convenience, probability, and purposive sampling (contrasted in the sketch after this list).
- Explain the criteria used to inform sample size for purposive sampling.
- Describe seven commonly used purposive sampling methods.
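As a preview of how convenience and probability sampling differ in practice, here is a minimal Python sketch. The participant list and sample size are invented for illustration; purposive sampling selects cases deliberately against criteria rather than at random, so it appears only as a comment.

```python
import random

# Hypothetical sampling frame: the population of eligible participants.
population = [f"participant_{i}" for i in range(1, 201)]

# Convenience sampling: take whoever is easiest to reach, e.g. the
# first 20 people on the list. Fast, but prone to selection bias.
convenience_sample = population[:20]

# Probability (simple random) sampling: every member of the population
# has an equal, known chance of selection.
random.seed(42)  # fixed seed so the example is reproducible
probability_sample = random.sample(population, 20)

# Purposive sampling, by contrast, is not a random draw: cases are
# chosen deliberately because they meet criteria of interest.

print(convenience_sample[:3])
print(probability_sample[:3])
```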
Module 7: Data Collection: Part 1
- Define quantitative and qualitative data.
- Describe four characteristics of high-quality data.
- Describe the main steps to prepare for data collection.
- Explain how to collect data through document review.
- Explain how to collect data through data abstraction.
Module 8: Data Collection: Part 2
- List the two key concepts that should guide data collection tool design.
- Describe the four best practices for overall data collection tool design.
- Apply the four best practices for developing questions for data collection tools.
- Differentiate between closed- and open-ended questions.
- Recognize common question types used in surveys.
- Define a Likert scale.
- Explain how to collect data through surveys, observations, interviews, and focus groups.
- Explain the overall structure of interview and focus group discussion guides.
Module 9: Data Analysis, Validation, and Dissemination
- Describe four key data processing practices.
- Explain two essential data quality checks to perform on quantitative data.
- Differentiate between descriptive and inferential analysis.
- Distinguish between statistically significant and programmatically meaningful differences (see the sketch after this list).
- Describe the basic steps involved in thematic analysis.
- Describe elements to include in a codebook and why codebooks are important.
- List guidelines for writing up qualitative findings.
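As an optional preview of the quantitative objectives above, the following minimal Python sketch walks through two basic data quality checks, a descriptive summary, and a simple inferential test. The dataset, the plausible-age cutoffs, and the choice of a two-sample t-test are all illustrative assumptions, not prescriptions from the course.

```python
import pandas as pd
from scipy import stats

# Hypothetical evaluation dataset: ages and a score for two groups.
df = pd.DataFrame({
    "group": ["intervention"] * 5 + ["control"] * 5,
    "age":   [24, 31, 130, 45, 28, 33, 29, 41, 27, 36],  # 130 is an entry error
    "score": [72, 68, 75, 80, 71, 65, 63, 70, 66, 62],
})

# Quality check 1: range check, flagging implausible ages.
out_of_range = df[(df["age"] < 0) | (df["age"] > 110)]
print("Out-of-range ages:\n", out_of_range)

# Quality check 2: completeness check, counting missing values.
print("Missing values per column:\n", df.isna().sum())

# Descriptive analysis: summarize scores by group.
print(df.groupby("group")["score"].describe())

# Inferential analysis: test whether group means differ (two-sample t-test).
intervention = df.loc[df["group"] == "intervention", "score"]
control = df.loc[df["group"] == "control", "score"]
t_stat, p_value = stats.ttest_ind(intervention, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value indicates statistical significance; whether the difference
# is programmatically meaningful is a separate, substantive judgment.
```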
Module 10: Ethics
- Explain what human subjects protections are and why they are important.
- Name and define the Belmont Report’s three fundamental principles of ethics.
- Explain what informed consent means and describe the key elements of a consent process.
- Distinguish between anonymity, confidentiality, and privacy and describe methods to protect each.
- Describe procedures that evaluators can adopt to minimize participant vulnerability.
- Identify the four categories of safeguards for ethical data management and give examples of each.
- Describe key recommendations to promote ethical reporting, dissemination, and use of findings.
Course Activities
During the course, participants will be expected to:
- Analyze problem statements and develop outcomes
- Work with logic models
- Write SMART objectives and indicators
- Complete data analysis and visualization activities (in Microsoft Excel)
- Assess evaluation questions
- Analyze qualitative methods
- Choose sampling methods
- Create open-ended questions
- Work on an M&E plan