Interactive resources for incubators and accelerators

Gathering and Using Data

This section covers the work involved in developing a framework for gathering data. It also discusses how robust your data needs to be for a given purpose, which depends on who your audience is.

Identifying Indicators

After the Theory of Change, the next stage of the Impact Management Framework is identifying indicators to measure your impact.

This is done by identifying indicators for each of the priority outcomes that were identified in the Theory of Change.

Components of an Indicator

There are certain questions to ask when trying to understand which indicator to select for a particular outcome:

  • Who or what is changing? (e.g. parents spending time with their kids)
  • How much is it changing? (e.g. number of hours of positive time)
  • Is there a goal, and what is it? (e.g. at least four hours a week)
  • By when does this outcome need to happen? (e.g. within six weeks of the program)

 

It is recommended to collect both quantitative and qualitative data, because:

  • Some audiences want numbers, which they perceive as more robust
  • For other audiences, stories (qualitative data) resonate more than numbers
  • Sometimes collecting numbers (quantitative data) would create too much of a data capture burden, and you will have to rely on stories
  • Different stakeholders will value one type over the other

Quantitative and Qualitative Indicators

QUANTITATIVE INDICATORS

Good quantitative indicators are:

  • Valid: an accurate measure of the behaviour, practice, or task that is the expected output or outcome of the intervention
  • Reliable: consistently measurable over time, in the same way by different observers
  • Precise: defined in clear terms
  • Measurable: quantifiable using available tools and methods
  • Timely: provides a measurement at time intervals relevant and appropriate in terms of programme goals and activities
  • Relevant: linked to the programme or to achieving the programme objectives

Practical Tip

Make sure your quantitative indicators are SMART:

  • Specific
  • Measurable
  • Achievable
  • Relevant
  • Timebound

QUALITATIVE INDICATORS

Some changes are very hard to measure in a quantitative way. The effort required to collect the data might outweigh the usefulness, or it might be something that can be hard to measure like innovation. In these cases you will want to use qualitative indicators. You can gather stories from people about the impact an activity has had on them.

Practical Tip

An Indicator Library can be a good starting point for ideas.

These could be from:

  • An international framework (e.g. the SDGs)
  • A national framework (e.g. Indicators Aotearoa)
  • Government departments

Approaches to Data Collection

There are a number of different approaches to or methods for collecting data.

  • Clear Logic

    Clear logic + observable changes + surveying stakeholders + using existing data/research

  • Expected Return Method

    Weigh the anticipated benefits of an investment against its costs.

    Social return on investment (SROI), for example, provides a framework to calculate the present social value of an investment’s impact compared to the value of its inputs.

  • Mission Alignment Method

    Measure the execution of strategy against the project’s mission and end goals over time, using rubrics such as scorecards to monitor and manage key performance metrics on operational performance, organizational effectiveness, finances, and social value.

  • Experimental Method (and quasi-experimental methods)

    After-the-fact evaluations that use randomised control trials or other counterfactual approaches to determine the impact of an intervention compared to the situation if the intervention had not taken place.

  • Participatory Method

    Methods such as the Most Significant Change or Story-Based Evaluations solicit the perceptions of constituents about the performance of an intervention. Their feedback is then benchmarked against related interventions to demonstrate impact.

  • Systems Mapping and Models of Collective Impact

    Brings together organisations across sectors to solve social problems by building a common agenda and shared measures of success. They develop maps to understand complex, nonlinear, adaptive systems; identify strategic leverage points for interventions within these systems; and then develop indicators to assess whether those interventions work, while recognising that systems-level outcomes are often unpredictable.

Tools for Data Collection

There are a range of different tools for data collection, each with their own pros and cons.

  • Text message surveys

    • Great for lower-tech environments where beneficiaries might not have access to a computer
    • Fast and effective (have high response rates)
    • Cheap to run
    • Work best for simple questions where you aren’t asking for a lot of detail (e.g. “How many hours did you spend reading to your kids in the last week?”)
    • Not so good for gaining a deep understanding
    • Need to also think about context (e.g. whether people have to pay to reply, whether women have access to phones, network coverage, etc.)
  • Interactive Voice Response (IVR)

    • A system that calls people, gathers information and then responds based on their responses
    • Great for areas with lower literacy, but that have phone access
    • Similar to text message – good for simple questions, not so good for complex ones
    • Easier than text message for customers to provide detailed responses
    • Can feel like it’s lacking in care or personal connection
  • Online survey

    • Assumes the recipient has a computer or phone capable of doing the survey
    • Can gather more in-depth responses, but can risk asking too much
    • Lots of easy free tools available to do this
    • May pair with a paper survey for those who don’t have tech
    • Response bias can occur if respondents opt in willingly rather than being randomly selected
  • Face-to-face interviews

    • Lets you really understand the person you are talking to
    • Allows you to follow the conversation to generate deeper insights and follow leads
    • Great for collecting qualitative data (stories of change)
    • Less efficient for gathering quantitative data (numbers)
    • More time consuming than surveys
    • Have to look at context as well (e.g. male participants may not want to speak to women interviewers etc.)
  • Focus groups

    • Allows you to gather insights from multiple people at once
    • Risk that a small number of voices dominate the conversation
    • Better for gathering qualitative information than quantitative
    • Can be paired with a quantitative survey
  • Observation

    • Using staff or volunteers to document change through observation
    • Risk of bias and interpretation
    • Can work well where there are ethical or capability issues, such as working with children
    • Also works well where the change is not happening in people, such as environmental change
  • Internal systems

    • Venture systems may already be collecting some indicators (e.g. the comment section on your online store, or data you are already collecting for an investor)
    • These normally only collect output indicators, such as number of units sold, number of clients, or number of people attending workshops. That output data may still be useful, however, for revealing something about a change for someone
    • Efficient because it doesn’t require extra work
    • The data you get out is only as good as the data going in
  • External data sources (secondary data)

    • Government datasets
    • Academic research
    • Databases

Control Groups

If possible, it’s great for your ventures to try and measure what would have happened without them. Would things have improved anyway?

Control groups are groups of entrepreneurs with similar characteristics to those your venture works with, but who don’t receive their support. Your venture collects data on the control group that can be compared with the data collected from their program participants.

Control groups can be time-consuming, expensive to run and challenging to manage. It is difficult to not offer some entrepreneurs what you are offering others!

Practical Tip

Think about your audience and whether or not you NEED a control group.

Some alternatives to a control group are:

  • Establishing a baseline so you can measure conditions before and after your program
  • Asking participants about the impact that they experienced and what they believe contributed to the change
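The baseline alternative above can be sketched very simply: collect the same indicator before and after the programme and compare averages. A minimal Python sketch, where the indicator and scores are hypothetical:

```python
# Hypothetical baseline comparison: the same indicator measured for each
# participant before and after the programme (no control group needed).
def average(values):
    return sum(values) / len(values)

def baseline_uplift(before, after):
    """Return the average change between baseline and follow-up scores."""
    return average(after) - average(before)

# Example: self-reported confidence scores (1-10) for five participants.
before_scores = [4, 5, 3, 6, 4]
after_scores = [7, 6, 5, 8, 6]

uplift = baseline_uplift(before_scores, after_scores)
print(f"Average uplift: {uplift:.1f} points")  # 6.4 - 4.4 = 2.0 points
```

A positive uplift shows conditions improved after the programme; it does not by itself prove the programme caused the change, which is why pairing this with participants' own accounts (the second alternative) is useful.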

Lean Data Approach

The lean data approach is a useful way to approach data collection when working with limited time and limited resources. The lean data approach is:

 

CUSTOMER FIRST

The idea that you are collecting data to create additional value for your customers or clients. Collecting their feedback and analysing data frequently allows you to implement changes faster and keep your customers engaged.

 

USES LOW-COST TECHNOLOGY

The idea that you can collect useful data using low-tech and low-cost technology. For example, free online surveys and text surveys are a cheaper option than a research programme. This means it’s perfect for social enterprises who are usually strapped for cash.

 

DECISION-DRIVEN

Focus on collecting feedback at multiple, continuous stages that can rapidly inform decisions. Allows moving quickly to change direction for both impact and the business model.

 

How is It Useful?

  • Fast
  • Prevents survey fatigue for your customers
  • Only collects relevant data
  • Respects your customers (e.g. time – no long term commitments)
  • Improves decision-making
  • Cheaper

 

Core Principles of Lean Data

  • Bottom-up
  • Useful
  • Iterative
  • Light touch
  • Dynamic

Other Considerations with Data Gathering

  • The speed and frequency of data collection
  • The resources (financial, human) required
  • The stage of impact measurement where they are useful
  • The rigor to prove causality
  • The statistical significance and reliability of results
  • The level of impact they take into account (e.g. customer, company, or system) – how big do you want to go?
  • The subject (person, environment) – what does your audience want?

TOOL / EXERCISE

Creating a Survey

    1. SETTING UP

    Decide what you want to know and why. Create a clear hypothesis.

    – Think about what data will come back from the questions you are asking

    – Make sure questions are relevant to the data you want to collect

     

    Be clear on who you want data from and what that might mean for how you construct your survey

    – Language

    – Prior experience with surveys

     

    Make your survey ethical and enjoyable

    – Introduce yourself

     

    – Be respectful of people’s knowledge and time. You are asking them for something, so think about what you are giving back. Can you explain what the data will be used for and undertake to share the survey results with them? Remember: you might be being paid to run the survey, but they are unlikely to be paid to complete it.

     

    – Be careful to not include triggers in your survey that could cause stress or anxiety for vulnerable people. If you are working with a potentially vulnerable group, you could test the survey with people who work with the group first, or with a smaller sample group to ask for feedback before using more broadly.

     

    – Less is more

     

    2. FRAMING YOUR QUESTIONS

     

    Questions need to be objective so that the answers can inform the organisation’s decision-making

     

    Mix up quantitative and qualitative questions

    – People respond well to open-ended questions. They feel listened to and the quality of their responses is higher

    – Qualitative questions can take longer to answer, so be mindful of survey duration

     

    Avoid “brain pain.” Use plain language

    – Think about your audience and if they have any barriers (language/cost/electricity/tech etc.)

    – Don’t ask complex questions

    – Run it past a family member to test it first

     

    Examples:

    AVOID: “What was the state of cleanliness of the hospital room?”

    ASK: “How clean was your room?”

    AVOID: “Do you possess bovine livestock?”

    ASK: “Do you own a cow?”

     

    – Avoid loaded questions – don’t inject your values and assumptions (this could influence the responses you receive)

     

    Examples:

    AVOID: “In the past week, how much money did you waste on cigarettes?”

    ASK: “In the past week, how much money did you spend on cigarettes?”

     

    – Avoid asking double-barreled questions (where there are two questions in one and someone could just respond “yes” or “no” and you won’t know to which part of the question they are referring).

     

    Examples:

    AVOID: “Do you read and sing to your kids every day?”

    ASK: “Do you read to your kids every day?”

    ASK: “Do you sing to your kids every day?”

     

    – Avoid asking broad questions

     

    Examples:

    AVOID: “What do you think of our cookstove?”
    ASK: “What is the most useful feature of our cookstove?”

     

    3. NET PROMOTER SCORE

     

    Net Promoter Score asks one simple question: “How likely are you to recommend us to a friend?” Respondents pick a numerical score between 0 and 10. The higher the score, the more likely they are to recommend you; the lower the score, the less likely.

     

    This is a good measure for determining whether the customer saw value in your service, which can be an early-stage indicator of positive outcomes to come. It can be used with the people benefiting from the service or with the venture’s customers. It is frequently used in mainstream business, so it is also useful for comparing the venture with other businesses.

     

    4. CUSTOMER EFFORT SCORE

     

    The customer effort score is another simple tool that measures how much effort a customer had to go to to resolve a problem. It is also used in mainstream business, so it can be useful for comparisons with other businesses, and it can demonstrate customer satisfaction.

     

    Examples:

     

    Q: Have you experienced any challenges using the [product/service]?

    A: Yes/No

     

    Q: Has the issue been resolved?

    A: Yes/No

     

    Q: To what extent do you agree or disagree with the statement: “Overall, this product has made it easy for me to handle/resolve my issue.”

    Options: strongly disagree, disagree, somewhat disagree, neutral, somewhat agree, agree, strongly agree.

     

    Source: Akina
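The Net Promoter Score described in step 3 has a standard calculation: respondents scoring 9-10 are promoters, 0-6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch with hypothetical responses:

```python
def net_promoter_score(scores):
    """NPS = % promoters (scores 9-10) minus % detractors (scores 0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical answers to "How likely are you to recommend us to a friend?"
responses = [10, 9, 8, 7, 9, 6, 10, 3, 8, 9]
print(net_promoter_score(responses))  # 50% promoters - 20% detractors = 30
```

NPS ranges from -100 (all detractors) to +100 (all promoters), which is what makes it easy to benchmark the venture against mainstream businesses.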

Survey Design Tips

  • Use consistent scales

    Use consistent scales so the survey is easier for the person completing it.

    For example, you could use the Likert scale.

    Thinking about the past month, how do you feel about the following statements?

    “I have felt supported at work” 

    Answers: 5-point scale: (strongly disagree, disagree, undecided, agree, strongly agree).

    “I have had enough food at home” 

    Answers: 5-point scale: (strongly disagree, disagree, undecided, agree, strongly agree)

  • Ask demographic questions

    Ask demographic questions so you can compare survey answers across groups (age, geography, etc. e.g. under-25s prefer xx compared to over-50s).

    Ask these questions last so you can build rapport with respondents beforehand

    • Always ask for age as the year they were born, not how old they are
    • When asking about income, ask for a range rather than an exact figure, and ask at the household level
  • Date the survey

    Date the survey to show change over time and to keep track of your surveys once you are finished.

  • Code the survey

    Code your surveys so they can be anonymous; you are more likely to get honest answers from people who know their name will not be shared.

    For example, use a numerical or letter system that you write on the survey along with the survey taker’s name and date of survey.

  • Keep it brief

    Aim for around 7-10 questions
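The consistent-scale tip above also pays off at analysis time: one answer-to-number mapping scores every question the same way. A minimal Python sketch using the 5-point Likert scale shown earlier; the responses are hypothetical:

```python
# Map the 5-point Likert scale onto numbers so every question scores the same.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "undecided": 3,
    "agree": 4,
    "strongly agree": 5,
}

def average_score(answers):
    """Average numeric score for one question's Likert answers."""
    scores = [LIKERT[a.lower()] for a in answers]
    return sum(scores) / len(scores)

# Hypothetical responses to "I have felt supported at work"
answers = ["agree", "strongly agree", "undecided", "agree"]
print(average_score(answers))  # (4 + 5 + 3 + 4) / 4 = 4.0
```

Because every question shares the same scale, the same function works for "I have felt supported at work", "I have had enough food at home", and any other statement, which makes comparisons across questions and across survey rounds straightforward.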

Using Data

There are a number of different ways to use the data that you have collected.

  • For storytelling
  • For Reporting
  • As a learning and strategic tool for business decision making
  • To embed the impact process and a learning culture across all stakeholders in the business

Analysis and Reporting

Using data to tell a story about or report on your program can help to:

  • Demonstrate the impact you are having
  • Inform the people who benefit
  • Inspire your staff
  • Show your donors what you have done. Donors want to give to organisations that meet needs, not to organisations that have needs.
  • Look more closely at your data and draw conclusions

Practical Tip

In order to achieve Minimum Viable Reporting, ask yourself:

  • What is the best place to start with reporting?
  • What do you need to focus on to meet your own learning needs and the needs of your funders and partners?

Effective reporting:

  • Puts the beneficiaries’ voice at the centre of the impact by using case studies and data
  • Focuses on outcomes. It can include activity measures but make sure they are not the sole focus of your reporting.

Practical Tip

There are many different ways you might want to report your impact.

  • Presentations / slide decks
  • Written report
  • Interactive webpage
  • Video

Think about the format that best suits your information and your audience.

  • Be comfortable that your data may not be perfect, and be willing to communicate what is solid and what isn’t. Don’t overclaim. Transparency is key.

Identifying Your Audiences

Different audiences need different levels of impact reporting. This could influence your data collection methods so take that into account when deciding what data you are going to collect.

Possible audience groups are:

  • People the venture is benefiting. Ventures should hold themselves accountable to the people they seek to benefit. Generally it is good practice to see this group as their primary stakeholder. Impact reporting can show how the venture is supporting people to achieve their goals.
  • Management (complex, frequent)
  • Board (complex, robust)
  • Funders (combination, outcomes data, and stories)
  • Media (stories)

Practical Tip

When thinking about your audience’s needs, ask yourself:

  • What are they most interested in? (e.g. funders want to see evidence)
  • Are you asking your audience to make a decision based on your information?
  • What are the key pieces of information they will need to make their decision?

Gathering, Cleaning and Interpreting Your Data

Before you dive into all of the data you have collected across your program, think about:

  • Which data needs to be collated, and which doesn’t
  • All the different sources of data you have
  • How to collate/normalise the data so you can group or connect

Practical Tip

Put all of your data into a spreadsheet so that you can group and connect them.

Check that you are consistent in the language you use.

For example, don’t refer to participants as students in one piece of data and children in another.
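Normalising labels like this can be a one-step mapping before you collate anything. A minimal Python sketch; the label variants and the canonical term are hypothetical:

```python
# Collapse inconsistent labels onto one canonical term before collating data.
CANONICAL = {
    "student": "participant",
    "students": "participant",
    "child": "participant",
    "children": "participant",
    "participant": "participant",
}

def normalise(records):
    """Rewrite the 'group' field of each record onto the canonical label."""
    return [
        {**r, "group": CANONICAL.get(r["group"].lower(), r["group"])}
        for r in records
    ]

rows = [{"group": "Students", "n": 12}, {"group": "children", "n": 8}]
print(normalise(rows))
# [{'group': 'participant', 'n': 12}, {'group': 'participant', 'n': 8}]
```

The same idea works as a find-and-replace pass in a spreadsheet; the point is that grouping and connecting data only works once the labels agree.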

INTERPRETING QUANTITATIVE DATA

  • Find an Excel expert or a data analyst to get the best results
  • In Excel, learn to use tools like Pivot Tables to focus information
  • Remember that specialist tools need specialist skills
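The grouping a Pivot Table performs can also be sketched in plain Python: average an answer per demographic group. The survey rows below are hypothetical:

```python
from collections import defaultdict

def pivot_mean(rows, group_key, value_key):
    """Average of value_key per group_key: what a simple Pivot Table computes."""
    sums = defaultdict(lambda: [0.0, 0])
    for row in rows:
        acc = sums[row[group_key]]
        acc[0] += row[value_key]
        acc[1] += 1
    return {group: total / count for group, (total, count) in sums.items()}

# Hypothetical survey rows: satisfaction score by age band.
rows = [
    {"age_band": "under 25", "score": 4},
    {"age_band": "under 25", "score": 5},
    {"age_band": "over 50", "score": 3},
]
print(pivot_mean(rows, "age_band", "score"))
# {'under 25': 4.5, 'over 50': 3.0}
```

This is the same operation as dragging a demographic field into the rows of an Excel Pivot Table and averaging a score column, which is why the demographic questions in your survey matter at this stage.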

 

INTERPRETING QUALITATIVE DATA

  • Qualitative data is more time intensive. Set some time aside!
  • Remember to grab quotes as you go
  • Identify key words, phrases and themes about impact and document them
  • ‘Code’ or store the key information
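Coding qualitative data can start as a simple keyword tally per theme. A minimal Python sketch; the themes, keywords, and quotes are hypothetical:

```python
# Tally hypothetical impact themes across open-ended survey responses.
THEMES = {
    "confidence": ["confident", "confidence"],
    "connection": ["friends", "community", "connected"],
}

def code_responses(responses):
    """Count how many responses mention each theme's keywords."""
    counts = {theme: 0 for theme in THEMES}
    for text in responses:
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            if any(k in lowered for k in keywords):
                counts[theme] += 1
    return counts

quotes = [
    "I feel more confident pitching my business.",
    "The programme connected me with a community of founders.",
]
print(code_responses(quotes))  # {'confidence': 1, 'connection': 1}
```

A keyword tally is only a starting point; reading the responses and refining the theme list as patterns emerge is what turns the counts into real qualitative coding.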

Deductive vs Inductive Approach

Deductive: Based on a theory or hypothesis (e.g. your Theory of Change) that you have predetermined

Inductive: Emergent. Used when there is little research about your area of focus. As you go through your results, you identify patterns or trends.

Key Principles of Good Impact Reporting

  • Clarity

    The reader is able to understand a coherent narrative that connects the aims, plans, activities, and results.

  • Accessibility

    You present relevant information with plain language and use a range of formats (video, text-heavy or visually accessible).

  • Transparency

    Be full, open and honest. Try to hold yourself accountable to your community stakeholders as well. Doing so will increase trust.

  • Accountability

    Provide reassurance by making sure your data is verifiable, so that people can assess the information for themselves.

  • Don’t be afraid to tell people what didn’t work and what you learnt.

Data Driven vs Data Informed

Data-driven: Decisions are justified by data alone, which can seem clear cut.

Data-informed: Decisions combine multiple sources of data with experience and intuition.

Act on Your Data

After analysing your data you will have insights that can inform the future direction of your work. There will be a lot to explore.

As you do your analysis, create a list of potential changes that you want to explore further as an organisation. Prioritise them based on criteria such as cost, evidence, and speed.

  • Impact reporting isn’t something you ‘do’, it is something you use.

RESOURCES

  • Viz for Social Good

    A platform for data visualisation

  • Sopact

    An impact measurement and management platform

  • Socialsuite

    Impact management software

  • SocialCops

    Tools to support large-scale data projects

  • Biteable

    A tool for making informative explainer videos

  • Plectica

    A tool for creating maps and diagrams

  • Infogram

    A tool for creating infographics

  • Google Data Studio

    Tools for visualising your data


Next:

Measuring the Impact of Accelerators

How you as incubators and accelerators can measure your own impact