Measurement in a digital age
This insights page is regularly updated and highlights what Strive Community is learning about measurement in a digital age. Do you have best practices or insights to share about this topic? Reach out to us.
Introduction
Few businesses are immune to the impact of COVID-19. Both large and small businesses experienced the effects of the pandemic on day-to-day operations and their bottom lines. To mitigate these effects, organizations that support small businesses quickly digitized their processes to continue to deliver much-needed training and support.
In addition to changing how support was delivered to small businesses, the pandemic challenged traditional approaches to designing and implementing monitoring, evaluation, and learning (MEL) systems that support solution optimization, learning, and impact measurement. For example, travel restrictions meant that organizations that previously relied on in-person measurement methods were left to adapt to digital tools to learn about their users and improve their work.
While digital data collection has been a viable option for many years, the pandemic accelerated its importance. As the disruption of the pandemic subsides and in-person evaluations resume, many digital data collection approaches are—rightfully—here to stay. This insight brief focuses on organizations that support small businesses through digital capacity building, covers the benefits of digital measurement, and offers best practices for enabling it within capacity-building programs. We will update this brief as we continue to encounter best practices through our partners and our program.
Benefits and considerations of measurement in the digital age
In many ways, digital technology is revolutionizing the types of data that can be collected for measurement and how it is stored, processed, and used to improve learning. For organizations that support small businesses, digital measurement and digital data can be advantageous in many ways, including:
- A more cost-effective and greener option
- Greater accuracy, with more efficient analysis and data-driven decision-making, through the use of data analytics and visualization
- Real-time data collection that supports adaptive programming
- Greater reach to previously inaccessible populations
Ethical and methodological considerations
However, digital measurement raises ethical and methodological questions, particularly around exclusion and privacy issues.
- Exclusion. Digital (big) data sets have the potential to exclude individuals—often the most vulnerable—within a target population. Similarly, using unsupervised machine learning for analytics may introduce unknown bias. Organizations need to critically and systematically reflect on the ways a particular data set or mode may introduce bias, consider the consequences of that bias, and reflect on emerging practices to reduce identified risks.
- Privacy. Organizations that collect personally identifiable data can leave participants open to data privacy compromises. To mitigate these risks, the Active Learning Network for Accountability and Performance (ALNAP)—a global network focused on improving responses to humanitarian crises—recommends that organizations minimize the amount of personally identifiable and sensitive data collected. Data protection processes should be established across the data lifecycle and comply with data privacy regulations such as the General Data Protection Regulation. Similarly, the MERL Tech community introduced a series of guides focused on responsible data governance for MEL in Africa. Lastly, obtaining active and informed consent is paramount in a privacy-first approach.
Insights on digital measurement for capacity-building programs
Our partners use digital solutions to support small businesses to increase their productivity, resilience, and growth. And, given our digital- and data-first approach, digital measurement is central to all of Strive Community’s projects. Below, we highlight some insights and best practices for implementing responsible digital measurement processes.
Strive for lean data collection
Every active request to collect data from a small business takes time away from their livelihood. Lean data collection practices result in more mindful and efficient requests, minimizing the use of small business owners’ time. Lean data collection can be achieved by:
Measuring what matters
Organizations should focus on collecting data that is most valuable for understanding their target users' needs and preferences, improving project delivery, optimizing their specific solutions, and understanding user impact. Every data point should be scrutinized before collection to ensure it is absolutely necessary—a good practice in data minimization.
- For example, Strive Community has selected seven core metrics as the most critical and insightful for understanding the progress and impact of our overall program (see box below).
- Organizations should review existing datasets and collection practices to ensure efforts aren’t being duplicated.
- Additionally, for donor-funded programs that require data collection for accountability purposes, negotiating with donors about how much data is truly necessary may help reduce the collection burden.
Strive Community’s core measurement metrics
To understand the overall progress and impact of the Strive Community program on small businesses, we’ve identified seven core metrics that are most critical and insightful for our work. These include the number of small businesses:
- Reached: that viewed a minimum threshold of material with impact potential
- Engaged: that engaged with a minimum threshold of material, sufficient to have some impact
- That saved time
- That saved money
- That improved financial resilience
- That increased revenue
- That created more work opportunities
Collecting system data
The sheer volume of available digital data creates countless opportunities for measurement. Where possible, organizations should collect data generated by systems, such as usage data from training apps or platforms. System data can be collected in the background without active end-user engagement in a passive form of data collection. Such data is not only more accurate and often available in real time, but also less resource-intensive to gather for both organizations and small businesses. The most ubiquitous form comes from web and mobile applications and can include demographic, location, time of use, and behavioral data. System data can reveal how someone performs an action and provide insight into whether an intervention is having its intended effect.
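As a minimal sketch of what this passive collection can look like in practice, the example below logs usage events in the background each time a learner interacts with a training app. The event names, fields, and local file storage are illustrative assumptions rather than a prescribed schema.

```python
# Illustrative background logging of usage events from a training app.
# Event names, fields, and the local file store are assumptions for this sketch.
import json
import time
from pathlib import Path

EVENT_LOG = Path("usage_events.jsonl")  # a real system would typically write to an analytics backend

def log_event(user_id: str, event: str, properties: dict | None = None) -> None:
    """Append a usage event without interrupting the learner."""
    record = {
        "user_id": user_id,              # pseudonymous ID, not a name or phone number
        "event": event,                  # e.g. "lesson_opened", "quiz_completed"
        "properties": properties or {},
        "timestamp": time.time(),
    }
    with EVENT_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Recorded automatically as the learner uses the app:
log_event("sb-0142", "lesson_opened", {"lesson": "pricing_basics"})
log_event("sb-0142", "quiz_completed", {"lesson": "pricing_basics", "score": 4, "out_of": 5})
```

Because events like these are captured as a by-product of normal app use, they cost the business owner no extra time and can be aggregated into engagement metrics in near real time.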
Several Strive Community implementation partners are optimizing for passive system data collection:
- Kenya-based partner Arifu automatically collects system data about the channels they use to deliver training to small businesses. This data—which includes the number of messages accessed by small businesses; learning progress through quiz scores; and Arifu’s proprietary knowledge scores to understand levels of learning by topic—supports Arifu’s project delivery and solution optimization.
- Partners like WISE in Vietnam and UKM in Indonesia collect system data from their interactive learning management system on user behavior and preferences. This enables their teams to quickly adjust content and design, and support greater user engagement.
Digital, short, and simple self-reported data
Where system data is unavailable, organizations can perform active data collection to capture self-reported data from their target users. This data can also provide qualitative insights and feedback to complement system data.
Given that small businesses are often time-poor, organizations should be sensitive to their availability to provide this data. Self-reported data collection should utilize digital modes (as appropriate to the technologies accessible to small businesses) and be streamlined into existing processes. Organizations could consider using existing capacity-building interactions with small businesses to capture data, such as short surveys via SMS or WhatsApp messages.
- Strive Community partner Shujaaz Inc will deploy SMS surveys to more than 10,000 small business participants to gather feedback and impact data on its training. Additionally, Shujaaz Inc noted that focus group discussions conducted via WhatsApp enable organic, open, and authentic discussion about the participants’ experience.
- Organizations should consider the digital capabilities, availability, and comfort levels of participants. Impact measurement firm 60 Decibels has developed a remote survey toolkit, offering their best practices on remote phone surveys, in addition to a cheat sheet for choosing the most appropriate remote survey technology. Similarly, ALNAP suggests considering interactive voice response or phone surveys for participants with lower literacy levels. Further, using a female voice for recorded messages can make women more comfortable participating. Organizations can also conduct testing to determine which time of day gets the best response rate from small businesses.
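As an illustration of how a short self-reported survey can be streamlined into an existing digital channel, the sketch below sends a single-question SMS pulse survey after a training module using Twilio's Python SDK. The credentials, phone numbers, and question wording are placeholders, and any SMS or WhatsApp gateway could play the same role.

```python
# Minimal SMS pulse survey sent after a training touchpoint (pip install twilio).
# Credentials, numbers, and the question are placeholders for illustration.
from twilio.rest import Client

client = Client("ACCOUNT_SID", "AUTH_TOKEN")  # placeholder credentials

QUESTION = (
    "Thanks for completing the bookkeeping module! "
    "On a scale of 1-5, how useful was it for your business? Reply with a number."
)

def send_pulse_survey(participant_number: str) -> None:
    """Send one short survey question through a channel the participant already uses."""
    client.messages.create(
        to=participant_number,
        from_="+15550006789",   # the program's registered SMS number (placeholder)
        body=QUESTION,
    )

send_pulse_survey("+254700000000")
```

Keeping the survey to one or two questions, sent through a familiar channel at a tested time of day, respects the time constraints described above.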
Leveraging other digital data sources
To supplement system and self-reported data, organizations can also capture relevant data from other partners, organizations, and third parties. Strive Community partners work with other organizations that provide access to small businesses and can provide their own data to complement the data our partners collect.
- An FMCG (fast-moving consumer goods) organization may provide data on changes in small business transaction frequency and volume following engagement with the partner's capacity-building program. This would deepen and widen the partner's understanding of the potential impact of their program.
- When using secondary data, it is important to ensure that data is de-identified (no personally identifiable information is revealed), to assess whether informed consent was obtained, and to confirm the analysis will not enable re-identification (a minimal check is sketched below).
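A minimal sketch of the kind of de-identification check described in the last point, assuming the secondary data arrives as a pandas DataFrame; the column names are illustrative, not a standard.

```python
# Illustrative de-identification step before analyzing secondary data.
# Column names are assumptions; adapt them to the actual dataset.
import hashlib
import pandas as pd

PII_COLUMNS = ["owner_name", "phone_number", "national_id"]

def deidentify(df: pd.DataFrame) -> pd.DataFrame:
    """Drop direct identifiers and replace the business ID with a one-way hash."""
    out = df.drop(columns=[c for c in PII_COLUMNS if c in df.columns])
    out["business_id"] = out["business_id"].astype(str).map(
        lambda v: hashlib.sha256(v.encode("utf-8")).hexdigest()[:12]
    )
    return out

transactions = pd.read_csv("partner_transactions.csv")  # hypothetical extract
safe_transactions = deidentify(transactions)
```

Dropping and hashing identifiers is only a first step; combinations of quasi-identifiers such as location and dates can still enable re-identification, so the analysis plan itself should be reviewed as well.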
Cultivate continuous learning practices
The real-time and often continuous nature of digital data can support measurement practices that facilitate constant learning, enabling organizations to analyze information from a range of sources to inform decisions and adapt programs. This can be achieved by:
- Capturing operational metrics. In addition to metrics that capture a program's impact on participants, organizations can track metrics on program implementation itself. These enable iterative improvements to program delivery.
- Supporting rapid feedback to test and iterate on program design and delivery. Nimble experimentation methods can support rapid iteration and optimization of digital capacity-building programs. For example, Strive Community partner TechnoServe utilized a rapid, user-centered design process to develop e-commerce training content by gathering user feedback and testing the content. This ensured that the training met participants’ needs and preferences. Similarly, Arifu is adopting an iterative process for updating their learner experience throughout their Strive Community project via A/B testing to gather small business feedback and preferences.
- Determining appropriate metric frequency. The continuous availability of system data means that metrics can be analyzed more frequently to support real-time decision-making and enable A/B testing. Organizations should determine an appropriate collection frequency that optimizes the ability to use insights and detect change. A simple A/B comparison is sketched after this list.
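A simple sketch of the kind of A/B comparison described above: two versions of a lesson are delivered to different groups of users, and their completion rates are compared with a two-proportion z-test. The counts are invented for illustration, and real programs would also consider sample size and how often the comparison is run.

```python
# Minimal two-proportion comparison for an A/B test on lesson completion.
# The counts below are invented for illustration.
from math import sqrt

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Return the z statistic comparing completion rates of variants A and B."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Variant A: original lesson; variant B: shorter, more visual version.
z = two_proportion_z(success_a=180, n_a=400, success_b=228, n_b=410)
print(f"z = {z:.2f}")  # |z| greater than 1.96 suggests a real difference at the 5% level
```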
Establish strong data protection principles
While digital measurement offers many benefits to organizations that support small businesses, it can increase risks around data privacy for end users. For instance, one report found that data compromises (breaches, exposures, or leaks) in the US increased by 68% in 2021 from the previous year.
To better manage these risks, organizations should review and ensure adequate data security along the entire life cycle, from collection to retention and disposal. For end users, organizations should establish transparent, easily understood, and ethical processes to explain how personal data is used and obtain informed consent. When working with partners or third parties for data collection or sharing, ample due diligence should be conducted to ensure adequate data protection procedures are in place. For instance, Caribou Digital has established and adopted Data Protection Principles that inform its collection, usage, and disposal of data.
Caribou Digital’s Data Protection Principles
- Develop clear informed consent protocols including specifying the current and potential future use cases for the data.
- Obtain informed consent for all data collection.
- Avoid collecting personally identifiable information unless absolutely necessary.
- Do not use the data for purposes other than those stated to the respondents.
- Collect what you need and no more.
- Limit the number of team members who can access the data.
- Lock down the folders where data is stored.
- Delete the data after it has been used.
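As a minimal sketch of how the access and retention principles above could be enforced for locally stored survey exports, the example below restricts folder permissions and deletes files after an agreed retention period. The directory name and 90-day window are assumptions; in practice these controls would sit inside an organization's broader data governance and backup arrangements.

```python
# Illustrative enforcement of "lock down the folders" and "delete after use"
# for locally stored survey exports; paths and retention period are assumptions.
import time
from pathlib import Path

DATA_DIR = Path("survey_exports")
RETENTION_DAYS = 90

def lock_down(directory: Path) -> None:
    """Restrict the folder and its files to the owner only (POSIX permissions)."""
    directory.chmod(0o700)
    for path in directory.iterdir():
        path.chmod(0o600)

def purge_expired(directory: Path, retention_days: int = RETENTION_DAYS) -> None:
    """Delete files once they have passed the agreed retention period."""
    cutoff = time.time() - retention_days * 24 * 3600
    for path in directory.iterdir():
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()

if DATA_DIR.exists():
    lock_down(DATA_DIR)
    purge_expired(DATA_DIR)
```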
Looking ahead
The COVID-19 pandemic led many organizations supporting small businesses to implement digital tools and processes to continue their programming. Equally, it was a transformative moment for measurement. Given the potential of digital measurement, this trend is poised to grow as more organizations increase their capabilities around the use of digital data and digital modes of data gathering and analysis. This growth will also bring a greater understanding of the harms of unchecked digital measurement and push practitioners to develop best practices to avoid them. As we learn from our partners about measurement in a digital age, we will continue to share our insights.