Evaluation is a powerful tool for improving a program and increasing its ability to serve people more efficiently and effectively. It gives programs an opportunity to test their interventions, adjust services to best meet community needs, and collect data to support their work.

For questions about the evaluation resources and how they can apply to your organization, please email the Office of Research and Evaluation.

Featured Resources

  • Use the new SCALER tool to assess whether an intervention is ready to successfully scale and extend its impact. Note: For optimal performance, we recommend launching the tool in a web browser other than Internet Explorer.
  • Use the Organizational Capacity Assessment Tool (OCAT) to assess your organization's strengths, clarify perceptions, and plan strategies to enhance capacity in identified areas.
Evaluation cycle diagram

Planning

Investing time in planning an evaluation helps to ensure that your evaluation will produce useful information for program improvement.

Laying the Groundwork Before Your First Evaluation

Laying the Groundwork Before Your First Evaluation: This presentation covers five foundational activities that programs should undertake during their first grant cycle so that they are ready to do an evaluation when they receive recompete funding for a second grant cycle.

  • Laying the Groundwork Before Your First Evaluation Slides (PDF) (PPT)
  • Laying the Groundwork Before Your First Evaluation Assessment Tool (PDF)
  • Laying the Groundwork Before Your First Evaluation Assessment Handout (PDF)
  • Laying the Groundwork Before Your First Evaluation - First Six Years Handout (PDF)
  • Access the recording of the Laying the Groundwork Before Your First Evaluation presentation held on January 28, 2016 here. (Description of audio.)  

Logic Models

Logic Models: This course introduces the key components of a logic model and discusses how logic models can be used to support daily program operations and evaluation planning.

Additional Resources:

  • College Preparation example (PDF)

Developing Research Questions

Developing the Right Research Questions: This presentation describes the importance of research questions in overall program evaluation design, identifies the four basic steps for developing research questions, and demonstrates how to write strong research questions for different types of evaluation designs (i.e., process evaluation and outcome evaluation).

  • Developing the Right Research Questions Slides (PDF) (PPT)
  • Research Questions Handout (PDF) (DOC)
  • Access the recording of the Developing the Right Research Questions presentation held on April 16, 2015 here. (Description of audio.)

Designing an Evaluation

Overview of Evaluation Design: This course explains the different types of evaluation designs, the differences between them, the key elements of each, and considerations in selecting a design for your AmeriCorps program evaluation.

Additional Resources:

  • SIF Evaluation Plan (SEP) Guidance (PDF)
  • Subgrantee Evaluation Plan Getting Started Tutorials (web page). The tutorials contain the following modules:
    • Developing your Portfolio Evaluation Strategy (PES)
    • Introducing the Subgrantee Evaluation Plan (SEP)
    • Working with Evaluation Professionals
    • Planning for SEP Development

Writing an Evaluation Plan

Best Practices in Writing an Evaluation Plan (2019): This webinar, led by CNCS's evaluation technical assistance provider, NORC, provides attendees with an overview of how to develop an evaluation plan that meets CNCS's evaluation requirements. Key components of the plan are reviewed, and examples of best practices and common challenges are presented. The webinar also describes the resources available to competitive grantees, subgrantees, and potential AmeriCorps applicants to support evaluation plan development and implementation.

Evaluation Plans and Reports for Grantees with CNCS Share Less than $500,000 (2013): This session was intended for grantees who already understand the AmeriCorps evaluation requirements. Participants reviewed the essential components of evaluation plans and reports.

  • How to Write an Evaluation Plan and Report Slides (PDF)
  • Evaluator Qualifications and Independence (PDF)
  • Sample Evaluation Plan (PDF)
  • Sample Evaluation Plan Checklist (PDF)
  • Sample Logic Model (PDF)

Evaluation Plans and Reports for Grantees with CNCS Share Greater than $500,000 (2013): This session was intended for grantees who already understand the AmeriCorps evaluation requirements. Participants reviewed the essential components of evaluation plans and reports.

  • How to Write an Evaluation Plan and Report Slides (PDF)
  • Evaluator Qualifications and Independence (PDF)
  • Sample Evaluation Plan (PDF)
  • Sample Evaluation Plan Checklist (PDF)
  • Sample Logic Model (PDF)

How to Write an Evaluation Plan (2015): This course explains the purpose of an evaluation plan and outlines the key sections of the plan and what should be included in each section.

  • How to Write an Evaluation Plan Slides (PDF) (PPT)
  • How to Write an Evaluation Plan Slides February 26 (PDF)
  • Handout packet for how to write an evaluation plan (PDF)
  • Sample evaluation plan checklist (PDF)
  • Example Impact Evaluation Plan (PDF)
  • Example Impact Evaluation Plan Annotated (PDF) (Accessible Version)
  • Example Process Evaluation Plan (PDF)
  • Example Process Evaluation Plan Annotated (PDF) (Accessible Version)
  • Access the recording of the How to Write an Evaluation Plan presentation held on February 26, 2015 here. (Description of audio.)

Additional Resources:

  • SIF Evaluation Plan (SEP) Guidance (PDF)

Budgeting for an Evaluation

Budgeting for Evaluation: This course discusses the key components of an evaluation budget and approaches for creating an evaluation budget.

  • Budgeting for Evaluation slides (PDF) (PPT)
  • Access the recording of the Budgeting for Evaluation presentation held on December 11, 2014 here. (Description of audio.)

Recruiting and Managing an Evaluator

Managing an External Evaluation: This presentation describes how to manage an external evaluation and is intended for grantees that are considering or will be undertaking an external evaluation of their program.

  • Managing an External Evaluation Slides (PDF) (PPT)
  • Example Statement of Work (PDF) (DOC)
  • Sample Evaluation RFP (PDF) (DOC)
  • Sample Evaluator Assessment Form (PDF) (DOC)
  • Hiring the Right Evaluator (PDF)
  • Group Exercise (PDF) (DOC)
  • Access the recording of the How to Manage an External Evaluation presentation held on March 19, 2015 here. (Description of audio.)

Additional Resources:

  • Evaluator Screening Tips (PDF)

Conducting a Needs or Readiness Assessment

  • Impact Evaluability Assessment Tool (PDF)
  • Assessing Levels of Evidence (PDF)

Additional Resources

  • Assessing and Building Local Evaluation Networks Webinar (PPT) (PDF)
    (The content of the Assessing and Building Local Evaluation Networks Webinar is contained in the PowerPoint slides. There is no accompanying audio recording.)
  • Administrative Data (web page)
  • Examples of Effective SEPs (web page)
  • How program managers can use low-cost experiments to improve results: A video overview (web page)
  • Working with Institutional Review Boards (PDF)

Implementation

Implementing an evaluation involves executing the steps detailed in your evaluation plan.

Basic Steps of an Evaluation

Basic Steps in Conducting an Evaluation: This course describes the basic steps for conducting an evaluation including planning, collecting and analyzing data, and communicating and applying findings.

  • Evaluation Basic Steps Slides (PDF) (PPT)
  • Evaluation Basic Steps Prework (PDF)
  • Evaluation Basic Steps Handouts with Examples (PDF)
  • Access the recording of the Basic Steps of Evaluation presentation held on November 18, 2014 here. (Description of audio.)

Data Collection

Data Collection: This course will address key questions to consider prior to selecting a data collection method; the importance of selecting appropriate methods; the advantages and disadvantages of a variety of methods; and the difference between quantitative and qualitative methods and their roles in process and outcome evaluations.

Family Educational Rights and Privacy Act (FERPA): These notes summarize a presentation made to School Turnaround AmeriCorps grantees about obtaining and sharing data with schools while adhering to FERPA privacy laws. The handout, from the Privacy Technical Assistance Center (PTAC) at the Department of Education, provides more detailed information on navigating FERPA.

  • Presentation to School Turnaround Grantees (PDF)
  • Handout from the PTAC (PDF)

Additional Resources:

  • SIF Secondary/Administrative Data Use: Accessing Restricted-Use Data (PDF)

Managing an Evaluation

Managing an External Evaluation: This presentation describes how to manage an external evaluation and is intended for grantees that are considering or will be undertaking an external evaluation of their program.

  • Managing an External Evaluation webinar (PDF) (PPT)
  • Example Statement of Work (PDF) (DOC)
  • Sample Evaluation RFP (PDF) (DOC)
  • Sample Evaluator Assessment Form (PDF) (DOC)
  • Group Exercise (PDF) (DOC)
  • Access the recording of the How to Manage an External Evaluation presentation held on March 19, 2015 here. (Description of audio.)

Additional Resources:

  • Evaluation Implementation Monitoring Tool (PDF)

Analysis and Reporting

Analysis is a key evaluation step that begins to make meaning of the evaluation data you have collected. Reporting the subsequent evaluation results is an important step in documenting findings and staying accountable to stakeholders.

Reporting

Reporting and Using Evaluation Results: This course will help AmeriCorps State and National programs understand the importance of communicating and disseminating evaluation results to stakeholders; write an evaluation report and become familiar with other key reporting tools; and determine meaningful programmatic changes based on evaluation findings and learn how to implement them.

  • Reporting and Using Evaluation Results Slides (PDF) (PPT)
  • Pre-work Handout (PDF)
  • Dissemination Plan Example (PDF) (XLS)
  • Access the recording of the Reporting and Using Evaluation Results presentation held on June 18, 2015 here. (Description of audio.)

Additional Resources:

  • SIF Evaluation Reporting Guidance (PDF)
  • SIF Propensity Score Matched (PSM) Reporting Checklist (PDF)
  • SIF Secondary/Administrative Data Use & Reporting Checklist (PDF)
  • SIF Implementation Reporting Checklist (PDF)
  • SIF Outcomes/Impact Reporting Checklist (PDF)
  • SIF Assessing Levels of Evidence (PDF)

Using Evaluation Results for Action and Improvement

The final step in the evaluation process is to use the results of your evaluation to make meaningful program improvements.

Using Results for Program Improvement

Reporting and Using Evaluation Results: This course will help AmeriCorps State and National programs understand the importance of communicating and disseminating evaluation results to stakeholders; write an evaluation report and become familiar with other key reporting tools; and determine meaningful programmatic changes based on evaluation findings and learn how to implement them.

  • Reporting and Using Evaluation Results Slides (PDF) (PPT)
  • Pre-work Handout (PDF)
  • Dissemination Plan Example (PDF) (XLS)
  • Access the recording of the Reporting and Using Evaluation Results presentation held on June 18, 2015 here. (Description of audio.)

Creating a Long-Term Research Agenda

Developing a Long-Term Research Agenda: This course will help participants recognize the importance of building a long-term research agenda; identify the various stages in building evidence of a program’s effectiveness; and understand the key questions to consider prior to developing a long-term research agenda.

  • Developing a Long-Term Research Agenda Slides (PDF) (PPT)
  • Developing a Long-Term Research Agenda Handout (PDF)
  • Access the recording of the Developing a Long-Term Research Agenda presentation on July 30, 2015 here. (Description of audio.)

Evaluation Examples

  • Using Evaluation Results to Learn and Improve Webinar (PDF) (PPT)
    (Note: The content of this presentation is contained in the PowerPoint slides. There is no accompanying audio recording.)

Find more evaluation examples on the Evidence Exchange.

Scaling Evidence-Based Models (SEBM) Project

The Office of Research and Evaluation initiated the Scaling Evidence-Based Models project to support the scaling of effective interventions. The project includes guides, research reports, case studies, and tools that contribute to the study and application of scaling effective interventions.

Guides:

  • Scaling an Intervention: Recommendations and Resources: The guide provides five key recommendations that will help funders like AmeriCorps, other government agencies, and philanthropic organizations identify which funded interventions are effective, enhance their knowledge base on scaling them, and pursue scaling.
  • Baseline Equivalence: What it is and Why it is Needed: This guide is designed to help practitioners and researchers work together to design an impact study with baseline equivalence and, in turn, to determine whether an impact study is likely to produce meaningful results.
  • What Makes for a Well-Designed, Well-Implemented Impact Study: This guide is intended to help practitioners ensure that their evaluators produce high-quality impact studies.
  • How to Structure Implementation Supports: This guide will help practitioners develop formal strategies (also known as implementation supports) to help consistently deliver an intervention as it was designed, which is especially helpful for organizations scaling an intervention and assessing implementation fidelity.
  • Build Organizational Capacity to Implement an Intervention: This guide will help practitioners prepare to implement their desired intervention through building organizational capacity, which involves establishing the organizational structure, workforce, resources, processes, and culture to enable success.
  • How to Fully Describe an Intervention: This guide is intended to help practitioners thoroughly describe their intervention and communicate it clearly to potential funders and stakeholders.
  • Making the Most of Data: This guide will help practitioners maximize the use of their intervention data to help their organizations improve program implementation and provide evidence to funders about effectiveness.

Tools:

  • Scaling Checklists: Assessing Your Level of Evidence and Readiness (SCALER): This report describes a framework that identifies how organizations can improve both their readiness to scale an intervention and the intervention’s readiness to be scaled, so that intervention services are best positioned to improve outcomes for a larger number of participants. Each checklist in the SCALER provides summary scores to reflect how ready an intervention and organization might be for scaling.