
Evaluation Support

Support for Evaluation, Measurement and Planning

Facilitated processes, workshops and storyboarding sessions to identify, clarify and write:

  • logic models, theory of change and complex system/environment mapping
  • evaluation narratives: consolidated evidence and findings
  • reports and reporting formats for specific audiences and purposes

Taking the time to think through how your work leads to results is one of the most valuable things you can do. In facilitated sessions, you can work through your theory of change and your logic model: that is, how your organization effects change. You can trace processes forward from resources through activities and outputs to the outcomes they are meant to cause or support, or work backwards and ask how what you have done has contributed to what you see, in the context of a complex world. Sound facilitation technique engages participants in mapping these interactions and captures the results.

With a well-specified logic model and theory of change in hand, supported by research, an organization is well placed to measure performance and evaluate results, both for reporting and as a management tool. A clear model of how activities and outputs are expected to lead to or support outcomes, together with an accounting of external factors, also supports a plausible account of the contribution that a program or initiative has made in the presence of other programs and influences. Establishing this “attribution” is an important step toward a meaningful evaluation or measurement of a program’s effectiveness.

I have worked with many logic models and theories of change in my evaluation and performance measurement projects. There are many ways to produce a model, from a program manager or evaluator drawing one up in a hurry, never to be used again, to participatory processes involving management, staff and stakeholders. In the more effective approaches, organizations and stakeholders identify and validate their intended or expected outcomes and, through an agreed theory of change (or one accepted pending further experience), determine how their initiative can be expected to deliver results and how to tell whether that expectation is in fact being borne out by evidence.

The process of developing program logic and expectations for change will usually, and generally should, begin with research on existing expressions of program logic and related theory. With this in hand, facilitated sessions or rounds of commentary and refinement help to revise, clarify and validate the model and to achieve an acceptable level of consensus.

Evaluation narratives: consolidated evidence and findings

Program evaluations generally produce a range of findings and recommendations, of varying immediate and longer-term importance and implications, supported by multiple lines of evidence. People with differing perspectives, even within an evaluation team, will often read different meanings into the same results. Findings, moreover, and the lines of evidence supporting them, can be complex and difficult for any one person to assemble into an overall picture, or story. Facilitated workshops involving evaluators and, where appropriate, stakeholders can surface, prioritize and link key findings and evidence into a story that can be supported from differing perspectives.

The narrative can then be expressed in textual and visual form, using key data graphics in “infographic” or annotated dashboard formats, and in “one-pagers” where clarity and brevity are the focus. This lets your audiences quickly understand what is important about what you have found.

Development of reports and reporting formats for specific audiences and purposes

The section of this website on “effective visual data design, reporting and presentation” (LINK) already discusses effective reporting formats, including:

  • digital/printable dashboards and annotated displays
  • digital/printable “slidedoc” reports
  • communication-focused “one-pagers”
  • interactive websites

I am glad to see that visual and digitally-deliverable reporting formats are catching on. It’s long past time that decision-makers stopped being subjected to 100-page evaluation reports.

“Dashboard”, or annotated visual display-style, report formats allow audiences to focus squarely on the information they need to consider. “Infographic”-style reporting does the same thing, with added flexibility for incorporating qualitative findings and relationships, timelines and geographic information. When I say something supportive about infographics, however, I mean infographics that make the fullest possible use of principles for the clear display of data and information, and that avoid the pitfalls of the brightly-coloured, marketing-style infographics we have so often, unfortunately, seen advocated. My clients have agreed with me on that.

My most recent clients for dashboard and infographic reporting include:

  • Circum Network, in a report for the Canadian Music Publishers Association and the Association des professionnels de l’édition musicale;
  • J Birch-Jones Consulting Inc, working for Ringette Canada; and
  • Global Affairs Canada.

My work has previously appeared in reporting for the Girl Guides of Canada and in financial/operational management dashboards for the Agency for Co-operative Housing.

The section of this website on “effective visual data design, reporting and presentation” also discusses “slidedocs”. A slidedoc is a digital document, built in PowerPoint or other presentation software, that presents a highly visual summary of key points. It is meant to be read on a computer screen and can contain interactive features, such as hyperlinks that let a reader “drill down” to more detail, as well as links for navigating around the document. Slidedocs are a great way to brief people before a meeting and to reduce reliance on decks of printed material. They are quick to prepare, and when distributed in advance they take away the pressure to include everything you have in your slide presentation: you can simply carry the data graphics and succinct key points over from the slidedoc to the onscreen presentation.

I have worked on slidedocs recently for Global Affairs Canada and for the Atlantic Canada Opportunities Agency.

Along with dashboards/visual displays and slidedocs, evaluators and others who need to communicate results succinctly and clearly should be using communication-focused “one-pagers” (sometimes, in practice, a one-pager plus a supporting evidence page). My approach to this highly effective briefing vehicle is based on experience writing press briefings during government relations work involving the federal and municipal governments, where instant communication of the point is paramount. Recent work with the Atlantic Canada Opportunities Agency included developing a template for effective one-page briefings on evaluation results.

Purposefully-designed presentations

One place where I do not yet see much progress is in purposefully-designed presentations.

What do I mean by “purposefully-designed presentations”? If you are presenting at a conference or to a similar audience, you are trying to make a point. To be effective at this, you need to design your presentation. Although this does not yet seem to be widely known, there are research- and practice-based principles and guidelines for designing presentations so that audiences become engaged, remain engaged, and walk away with a clear message. I don’t just mean designing slides, either: I mean designing the arc, turning points and resolution of the story being told, building engagement and interest as it goes.

I attend a lot of conferences and public events, and most of the time I see presentations that drone on with no such structure, while audiences spend most of their time looking at their phones. I used to deliver that kind of presentation too, until I learned that there was more to know and took training with Duarte Design of California. Duarte is a leader in the field of presentation design, and its founder, Nancy Duarte, is a prolific author on the subject.

The first thing I did after attending their training was to design a presentation on avoiding pitfalls in visual data and presentation, entitled “Avoiding Chartjunk and Slideuments”. The presentation was built as a story, with an arc, turning points, tension, resolution and a challenging finish. It was, by a country mile, the best-received presentation I have ever delivered, and I have used these techniques ever since.

When you think about how important presentations can be for your organization’s image and for engagement in your initiatives, you should question the wisdom of throwing presentations together at the last minute. I’d love to help you learn to design great presentations, whether by working with you on your next major presentation or by introducing you to the principles and practices through my course. Either way, your audiences will thank you.

Complex system/environment mapping

One problem with even the most thoughtfully constructed logic models is that they usually leave out external influences and feedback effects, even when these are likely to be important, because including them would make the model “too complex”. It is good to simplify, but ignoring important influences on program success when planning a program or an evaluation is poor strategy. Failing to consider potentially important external influences and other complexities essentially pins hopes for success on a best-case scenario. It can also lead evaluators to miss important data and to misinterpret program results.

Nor is it helpful to try to embrace complexity by simply drawing a dense web of boxes and arrows: such a diagram is too complex to use and explain, will drive your audience away, and will probably come only from the mind of the evaluator or program manager, easily missing important external influences and other complexities.

Fortunately, the same technology used in social network analysis, along with more standard outcome-mapping facilitation techniques, is well suited to uncovering and highlighting the patterns and structure of complex systems. Using existing research and/or facilitated sessions with experts and stakeholders, you can map out the elements of cause and effect for evaluation, research and planning.

The advantage of a network analysis-based methodology is that it can build logic models and theory-of-change mappings that untangle complex relationships and external effects in ways that would otherwise not be possible or practical, for policy planning, performance measurement and program evaluation.

In a simple demonstration version, network mapping technology shows the links between a program, its resources and activities, its outcomes and external factors as parts of one system, and makes the centrality of certain parts of the system clearly visible. Once the relationships are understood and validated, the model can also be presented in a more standard form if you prefer.
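
To make the idea concrete, here is a minimal sketch of a logic model treated as a directed graph, written in Python with the networkx library (an assumption for illustration only, not the tooling described on this page, and with hypothetical program elements):

```python
# A minimal sketch: a logic model as a directed graph.
# Assumptions: Python with the networkx library (illustrative only --
# not the tooling described above), and hypothetical program elements.
import networkx as nx

G = nx.DiGraph()

# Edges read "source drives target": resources -> activities -> outputs
# -> outcomes, plus external factors a linear logic model would omit.
G.add_edges_from([
    ("funding", "training sessions"),
    ("staff", "training sessions"),
    ("training sessions", "participants trained"),
    ("participants trained", "skills applied on the job"),
    ("skills applied on the job", "improved service quality"),
    ("labour market conditions", "participants trained"),   # external factor
    ("partner programs", "skills applied on the job"),      # external factor
])

# Betweenness centrality highlights elements lying on many causal
# pathways -- candidates for close measurement and monitoring.
for node, score in sorted(nx.betweenness_centrality(G).items(),
                          key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {node}")
```

The same graph can be redrawn in standard logic-model form once its relationships have been validated, which is the point made above.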

My approach is to develop a first-draft mapping from existing material, which can then be validated in a facilitated session with staff, experts or stakeholders, as appropriate. I bring a varied set of tools to help illustrate the connections between elements of a system or logic model, including visuals that can easily be revised during an expert or stakeholder validation session, ranging from sticky-note-based affinity maps and storyboards to interactive, meeting-oriented software such as Kumu.

The data and visuals are then also used in reporting, for example as an interactive matrix displaying strengths of connection (read the causality as “y-axis drives x-axis”) produced with real-time-revisable network analysis software. The interactive matrix is used onscreen alongside a network mapping to help guide validators through potentially complex chains of influence and effect.
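
As a small, self-contained illustration of that matrix convention (again assuming Python, here with pandas; the element names and the 0–3 strength scale are hypothetical), the matrix can be built so that each row drives each column:

```python
# A sketch of the connection-strength matrix described above.
# Assumptions: Python with pandas; hypothetical elements and a 0-3
# strength scale. Read the causality as "y-axis (row) drives x-axis (column)".
import pandas as pd

elements = [
    "training sessions",
    "participants trained",
    "skills applied on the job",
    "improved service quality",
]

# Hypothetical strengths of connection, keyed (driver, driven).
links = {
    ("training sessions", "participants trained"): 3,
    ("participants trained", "skills applied on the job"): 2,
    ("skills applied on the job", "improved service quality"): 2,
}

matrix = pd.DataFrame(0, index=elements, columns=elements)
for (driver, driven), strength in links.items():
    matrix.loc[driver, driven] = strength

print(matrix)  # rows drive columns
```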

Mapping of dynamic systems is equally powerful for understanding complex webs of organizations, functions and results, such as my recent mapping of Ontario’s immunization system. This approach allowed the client to focus on key connections within a large and complex set of factors.

As the 2016 Conference Chair of the American Evaluation Association’s Topical Interest Group on Social Network Analysis, I was very happy to give a presentation entitled “Getting comfortable with complexity: a network analysis approach to program logic, design and evaluation”, covering this subject in detail.

My dynamic system mapping clients include the Ministry of Health and Long-Term Care, Ontario; the United Way of Calgary and Area; and Human Services, Alberta.