Key Messages Funders Seek

Like many agencies that award grant funding, the Pennsylvania Commission on Crime and Delinquency (PCCD) has an application process that requires applicants to:

  • Explain how the proposed program matches data-documented community needs
  • Communicate a clear data collection plan for demonstrating the proposed program’s impact

Below are excerpts from actual Requests for Proposals and examples of messages that use data to successfully convey need, value, and credibility.


Example One: Problem Statement

Applicants must describe the problem, provide data to support the existence of the problem, and demonstrate how the proposed program will address the problem.

Points to Remember
When selecting a program for implementation, choose one whose goals and intended impacts match your community's elevated risk factors and identified problem behaviors. You must convince the reader that a need for the selected program exists. To do so, clearly link the evidence of the program's effectiveness to the data that reflect your community's needs.

Use available program research data and community-level indicators to:

  • Create for the reader a mental image of your community's risks
    (poverty levels, unemployment rates, crime rates, child abuse rates, prioritized risk and protective factors, Pennsylvania Youth Survey (PAYS) data)
    • Response Example: 
      Anytown County completed a risk and resource assessment in 2011 using PAYS data. The county prioritized Family Attachment (score of 40) as a protective factor to be enhanced and Poor Family Management (score of 60) as a targeted risk factor. The need to address family domain risks is also reflected in elevated child abuse rates: Anytown's 2010 child abuse substantiation rate was 42%, while the statewide rate was 14.9%.
  • Demonstrate the existence of local youth problem behaviors through data
    (PAYS substance use rates, county juvenile justice or out-of-home placement rates, and school reports of disruptive behaviors, suspension, academic failure, or truancy)
    • Response Example: 
      In Anytown County, 6th-grade students completed the PAYS in 2011, and the data show a rate of past-30-day alcohol use that is higher than the statewide average: 8% in Anytown County vs. the PA statewide average of 5%.
  • Explain how local needs match the intended short- and long-term impacts of the selected program
    (Highlight the outcomes demonstrated through the developer’s research.)
    • Response Example:  
      The Strengthening Families Program: For Parents and Youth 10-14 (SFP 10-14) is recognized as a Blueprints Promising Program. In a randomized controlled trial, significant impacts on alcohol initiation and misuse were demonstrated. The long-term impacts on substance use were mediated through short-term impacts such as family bonding, empathetic caregiver-child communication, supportive parental involvement, youth life skills, and defined rules and expectations for substance use. By enhancing parenting styles and building life skills in youth, such as peer pressure resistance, SFP 10-14 is expected to decrease Anytown's elevated child abuse and youth alcohol use rates.


Example Two: Previous Funding and Implementation

If the applicant has previously been or is currently funded to implement this program, provide a summary describing the date, source, and amount of funding, the population targeted, program challenges and successes, and whether, and to what extent, the program is currently operating.

Points to Remember
If an agency has previously funded a program, it is important to convey that its funding had an impact and that an additional investment will be used judiciously to expand upon existing efforts for even greater good.

Funders' confidence in a project's ability to continue to grow and achieve outcomes is bolstered by:

  • Explaining the data collection process
  • Highlighting the participant impacts that have been demonstrated through previously collected data
    • Response Example:
      Any agency received grant funding from X Source in July of 2008 in the amount of X dollars to deliver the LifeSkills Training (LST) program to 600 youth in grades 6-8. For four years, data have been collected through LSTQ-MS pre- and post-surveys. Complete data sets were collected for 575 youth, and analysis showed the following positive impacts: a 90% increase in drug refusal skills, an 80% increase in assertiveness skills, and a 73% increase in self-control skills.
  • Sharing lessons learned that will be used to enhance outcome measurement or program quality
    • Response Example:
      Any agency has implemented Promoting Alternative THinking Strategies (PATHS) at three elementary schools since October of 2010. Using a program monitoring checklist, a PATHS coach completes monthly observations of each teacher. Individual and aggregate observation data are used to identify targeted training topics. For example, we have learned that teachers need support in helping youth use vocabulary to express their emotions. By providing training on effectively using feeling faces, we have enhanced the teachers' ability to generalize the concepts in the curriculum and have improved program quality.

Note: Funders seek honest responses about your challenges. It is important to demonstrate that past barriers have been overcome and that your organization can identify and remedy current and future problems.


Example Three: Impacts and Outcomes

Internal monitoring must be conducted by each applicant and a description of how data will be collected is required.

Points to Remember
Funders choose to invest in programs with the expectation that their funding will have an impact on outcomes that align with their mission and goals. They seek evidence that the applicant understands the program’s intended impacts (theory of change) and is capable of developing strong data collection and analysis systems to report outcomes.

To reflect the project’s ability to demonstrate impacts:

  • Describe how the proposed project's impact will be measured
    (what data will be collected, how the data will be collected, who will collect the data, and how it will be analyzed)
  • Note: Specific programmatic tools should be identified by name and fully explained.
  • Explain how program outcome data will be shared with stakeholders, referral sources, staff, and the funder
    • Response Example:
      Any agency will use the required tools to measure program impact, including administering pre- and post-surveys to all participants and assessing fidelity through observations of six program components for each program delivered. Data will be entered into a spreadsheet analysis tool by the program coordinator. To generate program support, verbal and written reports will be provided quarterly to key stakeholders, referral sources, and the collaborative board.
  • Highlight the program's researched effectiveness in preventing, treating, or reducing behaviors related to the funder's goals
    • Response Example:
      A goal outlined in the funding announcement is to engage and motivate youth and families to reduce and/or eliminate negative attitudes or behaviors related to substance abuse. SFP 10-14 is recognized as a Blueprints Promising Program. In a randomized controlled trial, impacts on youth alcohol and drug initiation/misuse were demonstrated up to 6 years following program participation. Short-term indicators showed that youth participants had less favorable attitudes toward alcohol and increased peer pressure resistance skills. Caregivers had an increased ability to clearly communicate expectations and consequences for substance use.
  • Share outcome data from previous program implementation (see the response example in Example Two)