Generations: multi-generational approach to evaluation
May 1 to 4, 2011
English / Français
These interactive workshops provide participants with hands-on opportunities to gain knowledge and expertise in evaluation methods, concepts and tools. Full and half day workshops are available.
Full-day Workshops - Sunday, May 1, 9:00 AM to 4:00 PM
1. Handling data: From logic model to final report
Gail Barrington, Barrington Research Group, Inc.
This full-day workshop will discuss ways of identifying, collecting, and analyzing program evaluation data that are feasible for the evaluator and meaningful to the client. Based on her more than twenty-five years of consulting experience, Gail will share some hard-won lessons about how to interact with stakeholders, ask the right questions, collect the right data and analyze and present findings in useful ways.
You will have the opportunity to work together in small groups to tackle some common data collection and data handling problems, particularly in complex studies. Actual work samples will be provided. At the end of the workshop, you will take away some fresh ideas and a number of useful tools and techniques for application in your program evaluation context.
You will learn how to:
2. Evaluation project management 101: Intro to evaluation management practice
Nicole Michaud, Social Sciences and Humanities Research Council
Against a backdrop of demanding technical requirements and a dynamic political environment within the current Canadian context, the goal and challenge of evaluation management is to develop, with available resources and time, valid and useful evaluations. This is certainly true in the evaluation enterprise where recent shifts in the policy and program environments have resulted in significantly more complex projects. This workshop will focus on building and increasing the evaluation management knowledge and skills of those who conduct evaluations, or those who have oversight responsibility, based on an adaptation of project management tools and methodologies in real-world evaluation settings.
Building on a presentation on this topic at the CES 2009 conference, it will introduce participants to the principles of project management (PM) and project portfolio management (PPM) based on the Project Management Institute's Project Management Body of Knowledge (PMBOK), including the benefits of applying well-established and internationally accepted PM methodologies to the field of program evaluation.
Using a combination of short presentations, seminar discussions, and case examples in small group exercises, participants will have the opportunity to enhance their skills in directly managing or overseeing each phase of an evaluation (i.e., planning, conducting, monitoring and closing). Practical suggestions will focus on how common problems can be avoided or resolved. Key cross-cutting topics will also be addressed, such as the interrelation of evaluation management phases. Other examples of evaluation management challenges and potential solutions using PM methods will be provided by the instructors, as well as drawn from participants' experiences.
This interactive workshop will provide participants with a basic understanding of:
Half-day Workshops - Sunday, May 1, 9:00 AM to 12:00 PM
3. Practical approaches to managing ethical risk in evaluation projects
Linda Barrett-Smith, Alberta Innovates - Health Solutions
To address these gaps, the Alberta Research Ethics Community Consensus Initiative (ARECCI), with the assistance of CES Alberta Chapter, developed a practical, "on the ground" course that prepares evaluation and quality improvement leaders to consistently manage ethical risks in projects that involve people or their information.
The Project Ethics Workshop Description
Some of the concepts addressed in the workshop are:
The workshop features two tested ethics decision-support tools:
Workshop Learning Objectives
4. A wide-angle view of program success: When are performance indicators meaningful, what are their limitations, and how can we create alternative approaches?
Tammy Horne, WellQuest Consulting Ltd.
Reporting progress on indicators has become central to evaluators' work over the past 20 years, as funders' emphasis on performance measurement for accountability has grown. However, simply monitoring a program's performance indicators can lead to describing change without exploring the hows, whys, and implications of that change.
Over-emphasis on, or misuse of, pre-determined performance indicators has been questioned by prominent evaluation theorists such as Robert Stake and Michael Patton, in their writings on responsive evaluation and developmental evaluation, respectively. Canadians Burt Perrin and Sarah Bowen have shared concerns around use and misuse of indicators in practice. Critiques include: (a) indicators may be poorly developed, (b) indicators may be chosen based on convenience rather than on evidence and relevance to evaluation questions, and (c) overemphasis on a few pre-determined indicators misses program context and complexity, stakeholder perceptions/experiences, and issues that emerge during evaluation.
Workshop learning objectives: Participants will learn to (1) distinguish between well-developed and poorly-developed indicators, (2) write well-constructed indicators (both quantitative and qualitative), (3) distinguish between performance measurement/monitoring and program evaluation, (4) determine when indicators are most likely to provide useful data for decision-making, or not, (5) develop alternative approaches that could complement or replace pre-determined indicators, depending on an evaluation's focus.
This workshop will be extensively hands-on, as follows:
By sharing experiences and perspectives with their peers, participants can return to their work with skills/ideas for (a) improving performance indicators when they are required/useful, and (b) developing complementary or alternative approaches. They will be able to contribute to evaluation capacity building among others with whom they work by further sharing their workshop learning.
Improving indicator development skills may help participants better align their work with the Program Evaluation Standards, particularly A3-A7 on monitoring and data gathering. Moving beyond over-emphasizing indicators may further enhance alignment with the Standards (e.g., U3, U7, P1, A2).
Though the workshop will be paperless, I will e-mail presentation materials to participants following the workshop, for future reference.
5. Waawiyeyaa (Circular) evaluation tool certification
Andrea L. K. Johnston, Johnston Research Inc.
Learning Objective 1: To demonstrate the Waawiyeyaa (Circular) Evaluation Tool video and manual, introducing participants to options beyond the logic model for evaluation.
Learning Objective 2: To demonstrate the power of oral and visual education via video, and to showcase by example how this tool can not only evaluate participant outcomes but also further participants' healing through the sharing of stories.
Learning Objective 3: To certify workshop participants in the use and implementation of the tool. Participants leave with a DVD and Facilitator's Manual.
Developed by Johnston Research Inc., this holistic evaluation tool, grounded in Anishnawbe traditional knowledge, was created for program providers. It is a self-evaluation tool that allows programs to document both meaningful processes and outcomes over time. It is also a learning tool that promotes growth and self-development among program participants. By applying the tool at various program milestones, a full picture of participants' personal journeys can be documented in a systematic manner. The traditional knowledge tool provides a framework to which program participants can easily relate. Participants like the tool because the storytelling is driven by them, through their own eyes and at their own pace. We will review the manual, watch the 20-minute video, complete the paper-and-crayon exercise, and incorporate our stories into an evaluation report. You will take home your story, as well as certification in the use of the Tool. The DVD and Manual are for sale; however, they are free to interested First Nations.
Johnston Research Inc. is a forerunner in using new technology and media to develop culture-based evaluation tools that can properly assess and improve culture-based social programming. Our latest releases include the Waawiyeyaa Evaluation Tool, which uses a video to show program providers and participants how to document their stories, and an electronic survey tool whose simple question builder launches a survey that is instantly integrated into a database over the Internet. Other methods can be viewed on our website.
6. Evaluating environmental, resource and conservation programs
Andy Rowe, ARCeconomics Inc.
Demand for evaluations of resource, conservation and environmental interventions is growing rapidly. However, evaluation thinking and the capacity to work in these settings are still nascent. Currently, the majority of these evaluations are undertaken by domain specialists from the natural and physical sciences with no evaluation training or experience.
The distinguishing characteristic of evaluation in environmental settings is an evaluand that involves two systems: human and natural. A two-system evaluand stretches contemporary evaluation approaches and requires evaluators to modify their approaches and methods. Thus, in providing a frame and approaches for evaluating environmental interventions, the workshop also addresses important issues in systems thinking, as well as evaluation in complex settings where developmental evaluation has an important role.
This workshop introduces the key concepts and techniques for evaluating environmental, conservation and resource interventions. Participants will also be introduced to approaches addressing the unique methodological challenges and the difficulty of achieving use in science settings. The workshop will include small group and interactive sessions interspersed with tutorials based on the presenters' extensive experience in this area and is most appropriate for those with basic evaluation knowledge.
7. La conduite d'évaluations complexes (French session, maximum 20 participants)
Simon Roy, Goss Gilroy Inc.
Une évaluation peut s'avérer complexe pour plusieurs raisons. La complexité peut découler du type de programme à l'étude, de la structure du programme, de la gouvernance ou des sources d'information. De nombreuses conséquences malheureuses peuvent s'ensuivre, comme des délais additionnels ou des résultats, conclusions ou recommandations inappropriés. Pour cet atelier, Simon Roy présentera des stratégies et des conseils qui peuvent être utilisés pour quatre types d'évaluations complexes : évaluations horizontales ou multi-programmes; évaluations de programmes ou de sujets complexes; évaluations à délais serrés; et évaluations à sujets délicats. Les participant(e)s apprendront quelques stratégies pour surmonter les obstacles associés à ces difficultés, notamment celles associées à la gouvernance des évaluations, l'utilisation d'experts, la participation d'intervenants, la méthodologie, et l'analyse et la présentation de résultats. L'atelier sera présenté dans un format interactif qui permettra également aux participants d'apprendre les uns des autres.
8. Writing for action
Rochelle Zorzi, Cathexis Consulting Inc.
Do you consider yourself a good writer, but still wish your reports had more impact? Then join us! This hands-on workshop provides a set of tools you can use to make every report a springboard for action.
This will be a hands-on workshop, including large-group discussion, paired activities, and individual activities. Participants will consider a report or other communication that they have written or are about to write. With guidance from the facilitators, they will:
This workshop addresses the following Competencies for Canadian Evaluators:
Half-day Workshops - Sunday, May 1, 1:00 to 4:00 PM
9. Developmental evaluation: The experience and reflections of early adopters
Mark Cabaj, Tamarack Institute
This half-day session is a presentation-and-discussion workshop aimed at people interested in learning more about their colleagues' experience in using developmental evaluation, an approach designed for situations of social innovation, high complexity, program replication, crisis situations and ongoing/radical program design. Michael Quinn Patton, a well-known evaluation expert and the person credited with "developing" the approach, first mentioned the concept in 1994 and released his comprehensive book on developmental evaluation in July 2010. This workshop will explore the results of a six-month investigation into the experience of 18 "early adopters" of developmental evaluation, the first known research of its kind on this topic. The findings are organized into the following areas: distinguishing developmental evaluation from formative and summative evaluation; the 'niches' in which developmental evaluation is employed; the conditions for effective developmental evaluations; coping with 'imperfect' conditions for developmental evaluations; planning and budgeting for emergent interventions; methodological issues in developmental evaluations; accountability in developmental evaluation; expertise and competencies of evaluators in developmental evaluation; and implications and recommendations for evaluation theory and practice. It is recommended that participants have a basic understanding of developmental evaluation.
10. Evaluating development results in peace-precarious situations
Catherine Elkins, RTI International and Duke University
Peace-precarious situations (Elkins 2006; 2008) are characterized by recent, ongoing, or chronic violent power disputes, yet increasingly also host major international development interventions. Contemporary projects and programs are undertaken where wars have barely ended or have yet truly to conclude, and where other security or crime concerns dominate community development priorities. Intervening amidst intermittent or ongoing friction that unpredictably yields further sporadic violence, with core institutions not obliterated but significantly destabilized, complicates all of our implementation and learning challenges.
Our theory and expectations for programs expected to work in peace-precarious situations require critical examination, from design assumptions to empirical results. While undertaking development in these unusually stressed circumstances generates vast amounts of new and needed information, much of it tends to dissipate unused. Political sensitivities or humanitarian urgency can inhibit complete and accurate documentation; turnover further undermines accountability; and goals, scope, and resource levels can shift dramatically and repeatedly. In a volatile context, the theories of change effectively in play multiply throughout program implementation. Successfully navigating peace-precarious environments to produce measurable development results requires innovative strategies, analysis, and tools.
This half-day workshop consists of participatory presentation including individual and group exercises on case studies. The workshop uses a theoretical framework with field examples to explore the state of the art in development and M&E/evaluation related to peace-precarious situations, critiquing methods and approaches within a framework for participants to assess strengths, weaknesses, and persisting gaps in the field.
Learning Objectives: Participants will increase their professional knowledge of:
Presentations, discussion, and cases cover M&E/evaluation in this complex landscape. We will examine results-relevant characteristics of these settings and their development projects from an evaluation perspective, exploring analytical and M&E/evaluation strategies to deliver sound and useful findings. Evaluation and M&E practitioners interested in development, especially in conflict/post-conflict/fragile or analogous settings, will participate in small-group exercises using case studies, and contribute to collaborative learning.
11. Designing and advancing evaluation quality
Cheryl Poth, University of Alberta
The Program Evaluation Standards, Third Edition (2010), developed by the Joint Committee on Standards for Educational Evaluation (JCSEE), has been approved by the American National Standards Institute and adopted by the Canadian Evaluation Society (CES). Six years in development, the Standards document is a product of needs assessments, reviews of the scholarship and practice literature from the last four decades, and suggestions from numerous American, Canadian and international meetings on what the third edition should accomplish. As drafts were prepared, more than 75 national and international reviewers provided comments and suggestions in an open review process. More than 50 practitioners as well as program users tried out these standards and provided comments prior to final editing and approval.
The Program Evaluation Standards, Third Edition, is an integrated guide for evaluating programs designed to foster human learning and performance. These standards apply to a wide variety of settings, such as schools, communities, and governmental, health-care, private and non-profit organizations. In the 16 years since the publication of the second edition, much has been learned about
A feature of the third edition is the intention to have the standards also serve the needs of evaluation users.
This half-day workshop will introduce participants to the five dimensions and 30 standards that, as a set, define quality evaluation. The session will begin by applying one or more dimensions of evaluation quality to a practical problem in evaluation. Participants will situate themselves within a team of evaluators and program administrators negotiating 'next steps' in an evaluation. This experience will form the foundation for examining how standards can work both independently and interactively to guide evaluation decision-making. The workshop will also deal explicitly with metaevaluation and its role in evaluation quality improvement and accountability. Attendees will also have the opportunity to report on their own evaluation dilemmas and explore, in small and large groups, how the application of program evaluation standards may serve to increase and balance dimensions of evaluation quality, such as utility, feasibility, propriety, accuracy, and evaluation accountability.
Attendees will receive handouts to support reflective practice in their future evaluations and evaluation-related work.
12. Social return on investment: An emerging tool for the evaluator's toolkit
Stephanie Robertson, SiMPACT Strategy Group
Learning Goals & Objectives:
In addition to methodology-specific skill development, participants will learn about the development of The SROI Network (international body), the SROI Canada Network, opportunities to become accredited as an SROI practitioner, and how to access and share research on financial proxies.
The SROI methodology is a principles-based approach that values change for people and the environment: changes that are at risk of being undervalued, misunderstood or ignored altogether because their value is not universally understood. As a result, an SROI analysis assigns financial proxies to social and environmental change wherever possible. Most importantly, the monetized value presented is then combined with non-monetizable indicators of change, so that a complete expression of value creation is communicated.
This session will draw upon three years of case development in Calgary and a recent province-wide exercise that trained 107 people representing 55 Alberta-based organizations, including their evaluators, in how to use the SROI framework to forecast value created and thereby map out their value-creation evaluation strategy. It will illustrate how an effective approach to SROI analysis ensures that the monetizable and non-monetizable sources of value created are presented together, so that the SROI result tells the entire value-creation story.
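The combination of financial proxies and non-monetizable indicators described above can be illustrated with a minimal sketch. This is a hypothetical example with invented figures and proxy values, not SiMPACT's actual model: monetized outcomes are discounted to present value over a forecast period and compared with the investment, while qualitative change is reported alongside the ratio rather than dropped.

```python
# Hypothetical SROI-style ratio calculation. All dollar figures, proxy
# values, discount rate and forecast period below are invented for
# illustration; they do not come from any real SROI case.

def present_value(annual_value: float, years: int, discount_rate: float) -> float:
    """Discount a constant annual outcome value over the forecast period."""
    return sum(annual_value / (1 + discount_rate) ** t for t in range(1, years + 1))

def sroi(investment: float, annual_proxied_outcomes: dict,
         years: int = 3, discount_rate: float = 0.035) -> float:
    """Ratio of discounted monetized value (via financial proxies) to investment."""
    total_pv = sum(present_value(v, years, discount_rate)
                   for v in annual_proxied_outcomes.values())
    return total_pv / investment

# Invented example: a $100,000 program whose outcomes are proxied at
# $60,000 per year over a three-year forecast.
proxies = {
    "reduced social-assistance payments": 35_000.0,
    "avoided emergency-shelter nights":   25_000.0,
}
ratio = sroi(100_000.0, proxies)
print(f"SROI ratio: {ratio:.2f} : 1")  # about 1.68 : 1 at a 3.5% discount rate

# Non-monetizable indicators travel with the ratio, not instead of it.
qualitative_indicators = [
    "increased sense of belonging",
    "stronger family relationships",
]
```

The point of the sketch is the last step: the monetized ratio is only part of the result, and a complete SROI report presents the qualitative indicators alongside it.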
How the workshop will engage participants:
13. Program evaluation and organizational development: An integrative framework
Wendy Rowe, Royal Roads University
Integrating evaluation methods and OD processes (E-OD) into an organizational or program change framework leverages the power of these two disciplines to help organizations achieve outcomes and sustain continuous learning and development. This session will describe the stages of the E-OD change model, beginning with mobilizing commitment to a desired change through visioning, planning and needs assessment processes, and ending with using evaluation results within the organization and with all stakeholder groups to direct and sustain continued change and development.
Using small discussion groups, participants will apply these evaluation concepts to a change initiative in their own organization or an organization they are working with. Participants will be introduced to group process techniques, such as the world café and appreciative inquiry, that generate evaluation data while also creating the conditions for change to occur.
By the end of the workshop, participants will have acquired greater understanding of the role that evaluation can play in facilitating organizational development and change, as well as developed some familiarity with methods and tools for carrying out this kind of integrative evaluation and OD function.
14. Conducting complex evaluations (English session, maximum 20 participants)
Simon Roy, Goss Gilroy Inc.
Evaluations can be highly complex for a number of reasons. They can be made complex by the subject matter (or program type), program structure, evaluation governance structure and/or the sources of information. This complexity can lead to a number of undesired outcomes, including delayed timelines and inappropriate findings, conclusions or recommendations. For this workshop, Dr. Roy will present key strategies and tips for four types of challenging evaluations: horizontal and multi-program evaluations, evaluations of highly complex subject matters/programs, evaluations with tight timelines, and evaluations of sensitive programs. Participants will learn about practical strategies and tools to help them deal with these complexities. Strategies will relate to the governance of evaluations, the use of experts, stakeholder participation, methodology, and the analysis and presentation of results. The workshop will be presented in an interactive format that will allow participants to learn from the experiences of other participants.
The conference organizing committee invites proposals for professional development workshops to be held at the annual 2011 conference. These half-day or full-day workshops will be offered on Sunday, May 1, 2011.
The deadline to submit a workshop proposal was October 25, 2010. Please note that no proposals will be accepted after this date.
Presentation proposals are now being accepted for the CES conference Generations: a multi-generational approach to evaluation. Presentations may be in the format of a poster or oral presentation.
The deadline for presentation abstracts is November 15, 2010. Please note that no presentation abstracts will be accepted after this date.
Canadian Evaluation Society Annual Conference