
#EvalTuesdayTip: Evaluation results in game format
Evaluations can bring people together to share ideas in a safe space, making room for diverse experiences, new insights and collective recommendations. But how can
M&E language can easily become convoluted and abstract. That’s why this indicator guide, developed by the UK’s National Health Service, presents a refreshing way to
All evaluators talk about mixed methods, but most find it difficult to integrate their qualitative and quantitative data analyses. This Clustering Methods with Qualitative Data
Engaging people forms an essential part of collecting data for MEL, so why not make that process more fun and interactive? A useful idea Khulisa
What to do with big data? Evaluators are often required to tell impact stories over time but struggle to find systematic, yet compelling ways to
Selecting the best font family for an evaluation report may sound like a low priority. However, it’s an important factor to consider if you want
QuIP, an acronym for the Qualitative Impact Protocol, is especially useful for making qualitative data more transparent. In March 2019, James Copestake, Marlies Morsink and
Is a partnership on course, or is it ineffective? Does it require intervention, and if it does, what kind? The Connective Impact Partnership Impact
When conducting evaluations, we sometimes need to collect data in places with no connectivity. Khulisa recently participated in an online discussion about offline (not internet
Do you know how to get your data ready and start visualizing? Or why visuals work better than text for communicating your findings and recommendations?