
Blazing a Trail (How to Know If You’re Succeeding)

Field Guide for Workforce Technology Solutions 


Measuring and Validating Impact

When launching new technology partnerships, discussions on impact measurement and data collection should happen early to ensure that partner organizations are aligned on data definitions, desired impacts, and existing data policies, sources, and processes. A useful framing exercise is to consider what outcomes you wish to achieve and what data you’ll need in order to tell a compelling success story. Then, determine which metrics each partner can provide and what types of data usage or data sharing agreements must be developed to ensure safe and secure use of the data. 

A good impact story is informed not just by program staff or participants, but by all of the stakeholders involved in an initiative. It requires a diverse collection of narratives and perspectives, and those narratives and perspectives should, wherever possible, be supported by reliable outcomes data. 

Once you’ve engaged your stakeholders in determining your desired impacts and the potential stories that may result from the initiative, you should coordinate with your data team and your partners’ data teams to determine whether you have the data needed to support those stories. It’s important to document this process, along with the team’s anticipated data assets and needs, because these should be included in any data use or sharing agreements required for the partnership. Once a project launches, it can be difficult to amend executed data agreements. Finally, consider the capacity and funding needs of each organization to ensure that there are adequate resources to support data management and a strong evaluative analysis of the initiative. 

Specifically, remember to do the following:

  • Write data requirements into contracts with all parties
  • Establish data sharing agreements so that you know how and where data is flowing

To help shape your data story, consider some of the questions and sample answers provided here as examples of the foundations of impact stories that could potentially result from a tech-driven partnership:

  1. Q: What is the story you want to tell? What are the most important narratives and how do they coalesce around certain perspectives?
    A: Understanding the supportive services required to help people overcome barriers to completing digital skills training. 

  2. Q: Who is part of telling that story? What stakeholders should provide input?
    A: Community-based organizations that serve individuals who face significant barriers to educational access and persistence, workforce center case managers, local community colleges, and learners. 

  3. Q: How will you tell the story? What data points are important to gather? Through what mechanism will you collect them?
    A: Program persistence rate (source: training platform), barriers to employment (source: instructor/career coach, learner survey), number of learners receiving a service and levels of service (source: service providers/community-based organizations). 

Keep in mind that these are strictly examples of potential questions and answers. Depending on the nature of your partnership, you may need to include additional questions to fully encompass the depth and breadth of your data needs. In addition, these questions may require multiple conversations and iterations before you arrive at a final decision that enables you to tell a story that accurately reflects the outcomes of the initiative and the impact it made in your community. 
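To make a metric like the sample “program persistence rate” above concrete, here is a minimal sketch of how it might be computed from training-platform records. The field names and the definition of “persisting” (active or completed, as opposed to withdrawn) are illustrative assumptions, not a standard prescribed by this guide; your partners would need to agree on the exact definition in the data agreement.

```python
# Hypothetical training-platform export: one record per enrolled learner.
# The "status" vocabulary here is an assumption for illustration only.

def persistence_rate(records):
    """Share of enrolled learners who are still active or have completed."""
    enrolled = [r for r in records if r["status"] in ("active", "completed", "withdrawn")]
    if not enrolled:
        return 0.0
    persisting = [r for r in enrolled if r["status"] in ("active", "completed")]
    return len(persisting) / len(enrolled)

sample = [
    {"learner_id": 1, "status": "completed"},
    {"learner_id": 2, "status": "active"},
    {"learner_id": 3, "status": "withdrawn"},
    {"learner_id": 4, "status": "completed"},
]
print(f"Persistence rate: {persistence_rate(sample):.0%}")  # 75%
```

Whatever definition you settle on, writing it down this precisely (which statuses count, as of what date) is exactly the kind of detail that belongs in the data sharing agreement.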


The success of innovative technology partnerships is not always measured in quantitative data or standard outcomes. In fact, as is the case with any new venture, you may end up falling far short of your goals when you try to forge a working relationship with a partner from beyond the circles you usually operate within. But if a particular partnership doesn’t deliver preplanned success metrics, don’t immediately write the program off as a failure. Instead, take the time to understand what lessons you can draw from the effort, and look for things you can improve upon. 

As the nature of work evolves, as the economic environment changes, and as technology’s impact on our lives grows, there will be moments of confusion and apparent disappointment, but all of that is part of the process of building a resilient and forward-thinking organization and community. In this fast-moving world, the long-term opportunity cost of standing still is far greater than the risk of trying something new and failing.


When operationalizing a data collection plan, make sure you have a clear and consistent methodology for all partners to follow. For example, if you’re collecting learner data via an online survey tool, recognize that asking respondents to answer open-ended questions will yield a very different data set—in terms of quality, consistency, and your ability to categorize similar responses—than the one you’d get if you asked people to answer a list of multiple-choice questions. Similarly, if you use a survey tool to gather outcomes data in one region but rely on one-on-one interviews with case managers in another, you’ll get inconsistent and potentially unusable outcomes data across the two regions.  
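One lightweight way to enforce that kind of consistency is a shared “data dictionary” check that every partner runs before submitting data. The sketch below is illustrative only: the field names and allowed values are hypothetical, but it shows how a common schema catches free-text or regional variations before they reach the combined data set.

```python
# Hypothetical shared schema for learner survey submissions.
# Field names and allowed values are assumptions for illustration.
REQUIRED_FIELDS = {"learner_id", "region", "employment_barrier"}
ALLOWED_BARRIERS = {"transportation", "childcare", "digital_access", "other"}

def validate_record(record):
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    barrier = record.get("employment_barrier")
    if barrier is not None and barrier not in ALLOWED_BARRIERS:
        problems.append(f"unrecognized barrier value: {barrier!r}")
    return problems

# A multiple-choice response conforms; a free-text response is flagged.
ok = validate_record({"learner_id": "A1", "region": "north",
                      "employment_barrier": "childcare"})
bad = validate_record({"learner_id": "A2", "region": "south",
                       "employment_barrier": "my car broke down"})
print(ok)   # []
print(bad)  # one problem: unrecognized barrier value
```

Running the same check in every region keeps the combined data set comparable, which is the whole point of a consistent methodology.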

