Identifying Indicators
After the Theory of Change, the next stage of the Impact Management Framework is identifying indicators to measure your impact.
This section covers the work involved in developing a framework for gathering data. The section will also discuss the importance of assessing how robust the data needs to be for a given purpose by considering who your audience is.
This is done by identifying indicators for each of the priority outcomes that were identified in the Theory of Change.
For example:
There are certain questions to ask when trying to understand which indicator to select for a particular outcome:
It is recommended to do both quantitative and qualitative data collection, because:
QUANTITATIVE INDICATORS
Good quantitative indicators are:
Make sure your quantitative indicators are SMART:
Specific
Measurable
Achievable
Relevant
Timebound
QUALITATIVE INDICATORS
Some changes are very hard to measure quantitatively. The effort required to collect the data might outweigh its usefulness, or the change might be inherently hard to quantify, like innovation. In these cases you will want to use qualitative indicators: you can gather stories from people about the impact an activity has had on them.
An Indicator Library can be a good starting point for ideas.
These could be from:
There are a number of different approaches to or methods for collecting data.
Clear logic + observable changes + surveying stakeholders + using existing data/research
Weigh the anticipated benefits of an investment against its costs.
Social return on investment (SROI), for example, is an expected-return method that provides a framework to compare an investment’s present social value of impact with the value of its inputs.
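As a minimal sketch of the SROI idea (all numbers hypothetical), you can discount each year's estimated social value back to the present and divide by the value of the inputs:

```python
# Illustrative SROI sketch (hypothetical numbers): discount each year's
# estimated social value to present value, then divide by the input cost.

def present_value(cashflows, rate):
    """Discount a list of yearly values (year 1, 2, ...) back to today."""
    return sum(v / (1 + rate) ** year for year, v in enumerate(cashflows, start=1))

social_value_per_year = [10_000, 10_000, 10_000]  # estimated social value, years 1-3
inputs = 20_000                                   # total value of inputs (investment)
discount_rate = 0.05                              # assumed discount rate

sroi = present_value(social_value_per_year, discount_rate) / inputs
print(f"SROI ratio: {sroi:.2f}")
```

A ratio above 1 suggests the discounted social value exceeds the value of the inputs; real SROI analyses involve much more careful valuation of outcomes than this sketch shows.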
Measure the execution of strategy against the project’s mission and end goals over time, using rubrics such as scorecards to monitor and manage key performance metrics on operational performance, organizational effectiveness, finances, and social value.
After-the-fact evaluations that use randomised control trials or other counterfactual approaches to determine the impact of an intervention compared to the situation if the intervention had not taken place.
Methods such as the Most Significant Change or Story-Based Evaluations solicit the perceptions of constituents about the performance of an intervention. Their feedback is then benchmarked against related interventions to demonstrate impact.
Brings together organisations across sectors to solve social problems by building a common agenda and shared measures of success. They develop maps to understand complex, nonlinear, and adaptive systems; identify strategic leverage points for interventions within these systems; and then develop indicators to assess whether these interventions work, while recognising that systems-level outcomes are often unpredictable.
There are a range of different tools for data collection, each with their own pros and cons.
If possible, it’s great for your ventures to try to measure what would have happened without them. Would things have improved anyway?
Control groups are groups of entrepreneurs with similar characteristics to those your venture works with, but who don’t receive its support. Your venture collects data on the control group that can be compared with the data collected from its programme participants.
Control groups can be time-consuming and expensive to run, and challenging to manage. It is difficult not to offer some entrepreneurs what you are offering others!
Think about your audience and whether or not you NEED a control group.
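At its simplest, a control-group comparison is a difference in average outcomes between participants and the control group. The sketch below uses made-up numbers for an illustrative outcome (e.g. revenue growth in percentage points); a real evaluation would also check statistical significance:

```python
# Hypothetical sketch: compare average outcomes for programme participants
# against a control group of similar entrepreneurs (illustrative numbers).

participants = [12.0, 8.5, 15.0, 9.0, 11.5]  # outcome per participant
control      = [6.0, 7.5, 5.0, 9.0, 6.5]     # outcome per control-group member

def mean(xs):
    return sum(xs) / len(xs)

# The naive estimate of the programme's effect is the gap between the groups.
estimated_effect = mean(participants) - mean(control)
print(f"Estimated programme effect: {estimated_effect:+.1f} percentage points")
```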
Some alternatives to a control group are:
The lean data approach is a useful way to approach data collection when working with limited time and limited resources. The lean data approach is:
CUSTOMER FIRST
The idea that you are collecting data to create additional value for your customers or clients. Collecting their feedback and analysing data frequently allows you to implement changes faster and keep your customers engaged.
USES LOW-COST TECHNOLOGY
The idea that you can collect useful data using low-tech and low-cost technology. For example, free online surveys and text surveys are a cheaper option than a research programme. This means it’s well suited to social enterprises, which are usually strapped for cash.
DECISION-DRIVEN
Focus on collecting feedback at multiple, continuous stages so it can rapidly inform decisions. This allows you to move quickly and change direction on both impact and the business model.
How is It Useful?
Core Principles of Lean Data
1. SETTING UP
Decide what you want to know and why. Create a clear hypothesis.
– Think about what data will come back from the questions you are asking
– Make sure questions are relevant to the data you want to collect
Be clear on who you want data from and what that might mean for how you construct your survey
– Language
– Prior experience with surveys
Make your survey ethical and enjoyable
– Introduce yourself
– Be respectful of people’s knowledge and time. You are asking them for something; what are you giving back to them? Can you explain what the data will be used for, and undertake to share the survey results with them? Remember: you might be being paid to do the survey, but they are unlikely to be being paid to complete it.
– Be careful not to include triggers in your survey that could cause stress or anxiety for vulnerable people. If you are working with a potentially vulnerable group, you could first test the survey with people who work with that group, or with a smaller sample group, and ask for feedback before using it more broadly.
– Less is more
2. FRAMING YOUR QUESTIONS
Questions need to be objective so that the answers can inform the organisation’s decision-making
Mix up quantitative and qualitative questions
– People respond well to open-ended questions. They feel listened to and the quality of their responses is higher
– Qualitative questions can take longer to answer, so be mindful of the survey’s duration
Avoid “brain pain.” Use plain language
– Think about your audience and if they have any barriers (language/cost/electricity/tech etc.)
– Don’t ask complex questions
– Run it past a family member to test it first
Examples:
AVOID: “What was the state of cleanliness of the hospital room?”
ASK: “How clean was your room?”
AVOID: “Do you possess bovine livestock?”
ASK: “Do you own a cow?”
– Avoid loaded questions – don’t inject your values and assumptions (this could influence the responses you receive)
Examples:
AVOID: “In the past week, how much money did you waste on cigarettes?”
ASK: “In the past week, how much money did you spend on cigarettes?”
– Avoid asking double-barreled questions (where there are two questions in one and someone could just respond “yes” or “no” and you won’t know to which part of the question they are referring).
Examples:
AVOID: “Do you read and sing to your kids every day?”
ASK: “Do you read to your kids every day?”
ASK: “Do you sing to your kids every day?”
– Avoid asking broad questions
Examples:
AVOID: “What do you think of our cookstove?”
ASK: “What is the most useful feature of our cookstove?”
3. NET PROMOTER SCORE
Net Promoter Score asks one simple question: “How likely are you to recommend us to a friend?” Respondents pick a numerical score between 0 and 10. The higher the score, the more likely they are to recommend you; the lower the score, the less likely.
This is a good measure of whether the customer saw value in your service, which can be an early-stage indicator of positive outcomes. It can be used with people benefiting from the service or with the venture’s customers. Because it is frequently used in mainstream business, it is also useful for comparing the venture with other businesses.
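The standard NPS calculation groups respondents into promoters (scores 9-10), passives (7-8), and detractors (0-6), and subtracts the percentage of detractors from the percentage of promoters. A short sketch with hypothetical responses:

```python
# Standard NPS calculation: 9-10 = promoter, 7-8 = passive, 0-6 = detractor.
# NPS = % promoters - % detractors (a number between -100 and +100).

def net_promoter_score(scores):
    promoters  = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

responses = [10, 9, 8, 7, 6, 10, 9, 3, 8, 10]  # hypothetical survey responses
print(f"NPS: {net_promoter_score(responses):.0f}")
```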
4. CUSTOMER EFFORT SCORE
The customer effort score is another simple tool that measures how much effort the customer had to go to to fix a problem. Like NPS, it is used in mainstream business, so it can be useful for comparing with other businesses, and it also demonstrates customer satisfaction.
Examples:
Q: Have you experienced any challenges using the [product/service]?
A: Yes/No
Q: Has the issue been resolved?
A: Yes/No
Q: Do you agree/disagree with the statement: Overall, this product has made it easy for me to handle/resolve my issue.
Options: strongly disagree, disagree, somewhat disagree, neutral, somewhat agree, agree, strongly agree.
Source: Akina
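One way to score the 7-point agreement question above (an illustrative sketch, not a standard from the source) is to map each answer to a number from 1 to 7 and average across respondents; a higher average suggests less customer effort:

```python
# Sketch: map the 7-point agreement labels from the example to 1-7 and average.
# Hypothetical answers; higher average = customers found it easier.

SCALE = {
    "strongly disagree": 1, "disagree": 2, "somewhat disagree": 3,
    "neutral": 4, "somewhat agree": 5, "agree": 6, "strongly agree": 7,
}

def customer_effort_score(answers):
    return sum(SCALE[a] for a in answers) / len(answers)

answers = ["agree", "strongly agree", "somewhat agree", "neutral", "agree"]
print(f"Average score: {customer_effort_score(answers):.1f} out of 7")
```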
Use consistent scales so the survey is easier for the respondent to complete.
For example, you could use the Likert scale.
Thinking about the past month, how do you feel about the following statements?
“I have felt supported at work”
Answers: 5-point scale: (strongly disagree, disagree, undecided, agree, strongly agree).
“I have had enough food at home”
Answers: 5-point scale: (strongly disagree, disagree, undecided, agree, strongly agree)
Ask demographic questions so you can compare survey answers (age, geography, etc.; e.g. under-25s prefer xx compared to over-50s).
Ask these questions last so you can build rapport beforehand
Date the survey to show change over time and to keep track of your surveys once you are finished.
Code your surveys so they can be anonymous; you are more likely to get honest answers from people who know their name will not be shared.
For example, use a numerical or letter system that you write on the survey along with the survey taker’s name and date of survey.
Aim for around 7-10 questions
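The demographic-comparison tip above can be sketched in a few lines: group Likert answers (1-5) to one question by age band and compare the averages. All data here is hypothetical:

```python
# Hypothetical sketch: group Likert scores (1-5) for one survey question
# by age band, then compare averages across bands.
from collections import defaultdict

responses = [
    {"age_band": "under 25", "score": 4},
    {"age_band": "under 25", "score": 5},
    {"age_band": "over 50",  "score": 2},
    {"age_band": "over 50",  "score": 3},
    {"age_band": "25-50",    "score": 4},
]

by_band = defaultdict(list)
for r in responses:
    by_band[r["age_band"]].append(r["score"])

averages = {band: sum(s) / len(s) for band, s in by_band.items()}
for band, avg in sorted(averages.items()):
    print(f"{band}: {avg:.1f}")
```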
There are a number of different ways to use the data that you have collected.
Using data to tell a story about or report on your program can help to:
In order to achieve Minimal Viable Reporting, ask yourself:
Effective reporting:
There are many different ways you might want to report your impact.
Think about the format that best suits your information, and your audience.
Different audiences need different levels of impact reporting. This could influence your data collection methods so take that into account when deciding what data you are going to collect.
Possible audience groups are:
When thinking about your audience’s needs, ask yourself:
Before you dive into all of the data you have collected across your program, think about:
Put all of your data into a spreadsheet so that you can group and connect them.
Check that you are consistent in the language you use.
For example, don’t refer to participants as students in one piece of data and children in another.
INTERPRETING QUANTITATIVE DATA
INTERPRETING QUALITATIVE DATA
Deductive: Based on a theory or hypothesis (ToC) that you have predetermined
Inductive: Emergent. Used when there is little research about your area of focus. As you go through your results, you identify patterns or trends.
The reader is able to understand a coherent narrative that connects the aims, plans, activities, and results.
You present relevant information with plain language and use a range of formats (video, text-heavy or visually accessible).
Be full, open and honest. Try to hold yourself accountable to your community stakeholders as well. Doing so will increase trust.
Provide reassurance and make sure your data is verifiable, so the information is there for people to examine.
Data: Seems clear cut and provides justification for decisions.
Intuition: Combines multiple sources of data and experience to make a decision.
After analysing your data you will have insights that can inform the future direction of your work. There will be a lot to explore.
As you do your analysis, create a list of potential changes that you want to explore further as an organisation. Prioritise them based on criteria such as cost, evidence, and speed.
A platform for data visualisation
An impact measurement and management platform
Impact management software
Tools to support large-scale data projects
A tool for making informative explainer videos
A tool for creating maps and diagrams
A tool for creating infographics
Tools for visualising your data
How you as incubators and accelerators can measure your own impact