Discover Continuous Evidence Building

Learning Objectives

After completing this unit, you’ll be able to:

  • List the three elements of continuous evidence building.
  • Explain the goals of continuous evidence building.
  • Discuss the foundations of a continuous evidence building strategy.

Collecting Your Evidence

Now that you’re familiar with tactics for carrying out your Impact Management, let’s take a closer look at continuous evidence building. After all, it’s evidence, or data, that will ultimately inform your future decisions and evaluations. Continuous evidence building is an ongoing process that, to be effective, must be deeply ingrained in routine data management at your organization.

Because organizations vary in why and how they deliver on their missions, the data they collect for Impact Management varies too. Your data collection goals and strategy should therefore be tailored to your mission and services. Typically, impact metrics are a mix of programmatic indicators (like positive feedback rates) and operational and financial measures (like staffing or program costs). Dashboards, customized to align with your chosen metrics, enable at-a-glance data analysis. We’ll take a closer look at choosing those metrics in Unit 4.

Elements of Continuous Evidence Building

With a number of different measurement and evaluation frameworks to choose from, it can be helpful to step back for a clear view of the why, how, and what of the three elements of continuous evidence building: performance monitoring, program improvement, and impact measurement.


Performance monitoring

Verifying program delivery and integrity using process metrics

  • Frequent data collection and monitoring
  • Ongoing output reporting
  • Actionable insights for decision-making
  • Change-over-time assessments
  • Goal setting
  • Performance management
  • Dashboard development and use

Program improvement

Learning how programmatic changes affect outcomes 

  • Systematically using data to improve
  • Low-risk test-and-learn cycles
  • Isolating concrete, incremental changes
  • Directional learning about program and operations changes
  • Improvement science
  • Rapid-cycle evaluations
  • Continuous quality improvement
  • Change management

Impact measurement

Validating programmatic effects on core outcomes

  • Rigorous, often external, causal analysis
  • Validation of promising, lower-stakes assessments
  • Informing high-stakes decisions about delivery, scaling, and funding
  • Impact evaluation methods (experimental and quasi-experimental)
  • Data science
  • Econometrics
  • Measurement instrument selection

Goals of Continuous Evidence Building

Social impact leaders and practitioners—nonprofits, school districts, social enterprises, and public and philanthropic funders—build evidence in order to achieve many different programming and impact goals.

Effectiveness: Achieve impact for beneficiaries.

  • Determine whether and how participants (individuals, groups, or organizations) are benefiting from a program or service.
  • Understand how well the organization is achieving its defined mission.
  • Demonstrate the cost-effectiveness of a program.

Program improvement: Optimize offerings.

  • Better match participants’ service needs with service and program offerings.
  • Modify an intervention given participants’ needs.
  • Evaluate modifications to program design or implementation, or population or geography served.
  • Allocate organizational resources efficiently and equitably.

Scale: Expand program reach.

  • Identify new populations or communities that may benefit from an intervention.
  • Attract new resources or partners to achieve regional or national scale.
  • Identify strategically positioned program delivery or advocacy coalition partners.
  • Scale to a new geography.

Funding: Access resources for organizational sustainability.

  • Articulate the total cost of program delivery, including evidence-building activities, to existing or prospective funders.
  • Attract additional sources of philanthropic funding.
  • Catalyze and attract joint public-private funding opportunities.
  • Diversify philanthropic and public funding sources.

Resource allocation: Distribute resources wisely.

  • Shift funding, personnel, and other resources toward the most mission-effective and cost-effective programs and services.
  • Allocate funding to a portfolio of grantees that will best achieve strategic goals.

Communications: Tell your organization’s story.

  • Provide a voice for program participants.
  • Raise your organization’s profile based on the strength of its programs and the value you place on a culture of learning.
  • Support public advocacy efforts.
  • Communicate to beneficiaries how interventions and programs can help their communities.
  • Report to board and funders on organizational impact.

Policy: Drive government support toward evidence generation and effective and efficient program delivery.

  • Attract strategic public advocates and legislative champions.
  • Access, increase, and protect publicly available funds.
  • Achieve legislative and regulatory change.
  • Ensure programmatic alignment with federal, state, and local legislation and regulation.

Now that we’ve taken a deeper look at the continuous evidence building process and its potential outcomes and benefits, let’s review some templates that can help you create your own strategic evidence plan.
