
Course Outline


Scoping and Planning Program Evaluations


Program evaluation: An overview

  • Definition of evaluation
  • The place of evaluation in the policy process
  • Outcome and process evaluation
  • Evaluation and other research initiatives
  • Linking evaluation to evidence-based policy development, best practice interventions, creating organisational report cards, and organisational learning

Understanding a program


Evaluators must balance the quest for objectivity with the need to gain intimate knowledge of why a program was established, how stakeholders make sense of the program, how the program works, and what it delivers for the target population.

  • Characterising the program
  • Getting to know the program
  • Identifying program goals
  • Program theory and implementation theory
  • Building a program theory
  • Comparing program theory to actual developments

Identifying issues and formulating questions

  • Assessing the need for a program
  • Good evaluation questions
  • Devising specific questions for the evaluation
  • Prioritising questions to be explored

Assessing and monitoring the program process

  • Setting criteria for judging program process
  • Common forms of program process evaluation
  • Assessing service utilisation
  • Assessing organisational functions

Assessing and monitoring program outcomes


The question ‘Is this program working?’ can be most readily answered by assessing and monitoring program outcomes. Various approaches to identifying and assessing program outcomes are reviewed; a brief worked illustration follows the list below.

  • Identifying relevant outcomes
  • Establishing baseline and current outcomes
  • Considering unintended outcomes
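
As a simple illustration of comparing baseline and current outcomes, the following Python sketch computes the change in one outcome measure between a baseline period and a follow-up period. All scores and variable names here are invented for illustration, not course material:

    import statistics

    # Hypothetical outcome scores for the same service users,
    # recorded at baseline and again at follow-up.
    baseline = [42, 38, 51, 45, 39, 47, 44, 50]
    follow_up = [48, 41, 55, 47, 46, 52, 49, 53]

    # Per-person change on the outcome measure.
    changes = [after - before for before, after in zip(baseline, follow_up)]

    print(f"Mean at baseline:  {statistics.mean(baseline):.1f}")
    print(f"Mean at follow-up: {statistics.mean(follow_up):.1f}")
    print(f"Mean change:       {statistics.mean(changes):.1f}")

On its own, a before-and-after comparison of this kind cannot attribute the change to the program; establishing attribution is the task of the impact designs discussed in the following sections.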

Assessing program impact: The classic experimental design


The classic experimental design is often upheld as the ‘gold standard’ of evaluation designs; a minimal sketch of its logic follows the list below:

  • Random assignment
  • Planned variation
  • The classic experimental design reviewed
  • Analysing experimental results
  • Coping with difficulties and limitations that arise
  • Conditions that make randomised experiments difficult
  • Criticisms
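
To make the logic of random assignment concrete, here is a minimal Python sketch. All participant data are simulated and the built-in effect size is invented; this illustrates the design's logic, not any particular course exercise:

    import random
    import statistics

    random.seed(1)  # fixed seed so the illustration is reproducible

    # Hypothetical pool of eligible participants (IDs only).
    participants = list(range(40))

    # Random assignment: every participant has the same chance of
    # landing in the treatment group or the control group.
    random.shuffle(participants)
    treatment, control = participants[:20], participants[20:]

    # Simulated post-program outcome scores, with a built-in
    # effect of +5 points for the treatment group.
    outcome = {pid: random.gauss(50, 10) + (5 if pid in treatment else 0)
               for pid in participants}

    # With random assignment, the difference in group means is an
    # unbiased estimate of the average program effect.
    effect = (statistics.mean(outcome[pid] for pid in treatment)
              - statistics.mean(outcome[pid] for pid in control))
    print(f"Estimated program effect: {effect:.1f}")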

Alternative strategies for program evaluation


Acknowledgement of the practical limitations of the classic experimental design has led researchers to consider alternative strategies for program evaluation, such as the following (a small worked example follows the list):

  • Quasi-experimental research designs
  • Qualitative methods
  • Cost-benefit and cost-effectiveness analysis
  • Meta-analysis
  • Triangulation and replication
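
As one concrete instance of a quasi-experimental design, the sketch below illustrates a difference-in-differences estimate, which uses the change observed at an untreated comparison site to stand in for what would have happened at the program site without the program. All figures are invented for illustration:

    # Hypothetical mean outcome scores, before and after the program.
    program_before, program_after = 42.0, 50.0        # program site
    comparison_before, comparison_after = 41.0, 44.0  # comparison site

    program_change = program_after - program_before            # 8.0
    comparison_change = comparison_after - comparison_before   # 3.0

    # The comparison site's change approximates the counterfactual,
    # so the difference of the two changes estimates the effect.
    did_estimate = program_change - comparison_change
    print(f"Difference-in-differences estimate: {did_estimate:.1f}")  # 5.0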

Managing Program Evaluations


Working with program stakeholders


Program evaluation always takes place in a politicised environment, so stakeholders must be identified, consulted, and kept informed throughout.

  • Identifying key individuals and groups associated with the program
  • Developing a communication plan
  • Making effective use of an advisory committee
  • Gathering advice, acting upon it, and reporting back

Measurement issues


Program evaluators seek to make strong claims about the nature and impacts of the programs they study. Approaches are discussed for transforming concepts into collectable, good-quality data; a small sketch of a composite measure follows the list below.

  • Dependent and independent variables
  • Desirable characteristics of variables
  • Concepts and measures
  • Measuring variables
  • Using multiple measures
  • Markers of progress
  • Program inputs, resources, and environments
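
As a small illustration of using multiple measures for a single concept, the sketch below standardises two differently scaled indicators and averages them into a composite score per site. The indicator names and all data are hypothetical:

    import statistics

    # Two hypothetical indicators of one concept ("service quality"),
    # measured on different scales for the same five sites.
    wait_days = [12, 5, 9, 20, 7]             # lower is better
    satisfaction = [3.8, 4.5, 4.1, 2.9, 4.3]  # 1-5 scale, higher is better

    def standardise(values, reverse=False):
        """Convert raw scores to z-scores; flip the sign where lower is better."""
        mean, sd = statistics.mean(values), statistics.stdev(values)
        sign = -1 if reverse else 1
        return [sign * (v - mean) / sd for v in values]

    # Composite: the average of the standardised indicators per site.
    composite = [(w + s) / 2
                 for w, s in zip(standardise(wait_days, reverse=True),
                                 standardise(satisfaction))]
    print([f"{c:+.2f}" for c in composite])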

Techniques for data collection

  • Sources of data
  • Sampling
  • Interviewing
  • Coding responses
  • Utilising existing statistical data
  • Merging primary and secondary data

Techniques for data analysis and interpretation


Evaluators must choose analytic techniques that suit their data and can yield insights into program effectiveness, such as the following (a minimal regression sketch follows the list):

  • Regression analysis
  • Working with limited dependent variables
  • Working with censored data
  • Time series analysis
  • Choosing case studies based on analysis of quantitative evidence
  • Working with qualitative data
  • Good practice in data presentation and interpretation
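
As a taster for the regression analysis item above, the sketch below fits an ordinary least squares line relating an outcome score to program ‘dosage’. The data and variable names are invented; a real evaluation would typically use multiple regression and check the model's assumptions:

    # Hypothetical data: program dosage (sessions attended) and an
    # outcome score for eight participants.
    sessions = [2, 4, 5, 7, 8, 10, 12, 14]
    outcome = [35, 41, 44, 49, 50, 55, 58, 63]

    n = len(sessions)
    mean_x = sum(sessions) / n
    mean_y = sum(outcome) / n

    # Ordinary least squares estimates for a single predictor.
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(sessions, outcome))
    sxx = sum((x - mean_x) ** 2 for x in sessions)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x

    print(f"outcome ≈ {intercept:.1f} + {slope:.2f} * sessions")

An association of this kind supports an impact claim only when it is combined with one of the designs discussed earlier in the course.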

Presenting and utilising evaluation findings


Strategies are considered for increasing the odds that evaluation findings will have an impact on decision-making and inform broader policy conversations.

  • Meeting expectations of the immediate client
  • Supporting change processes
  • Serving multiple audiences
  • Exploring opportunities for broader dissemination

Aspects of ethical practice

  • Understanding the program and its place
  • Maintaining high technical quality
  • Using balance and judgment
  • Adopting a utilisation focus from the outset
  • Behaving with integrity when working with others
  • Surviving under difficult circumstances

Continuing to build your evaluation research capabilities

  • Inviting feedback and constructive criticism
  • Lesson drawing from other evaluation work
  • Joining professional associations
  • Improving your professional reading habits
  • Integrating an evaluation mindset into your everyday work

Course review and evaluation


An interactive discussion focusing on topics covered in the course, issues that arose as it progressed, what participants have gained, how they will integrate what they have learned into their work practices, and how the course could be improved or extended in the future.
