Evaluation is a critical component of prevention, helping to demonstrate results, improve effectiveness, and inform decision-making. Yet even among practitioners who know and appreciate its value, the prospect of designing and implementing an evaluation can feel daunting.
So when New York’s Office of Alcoholism and Substance Abuse Services (OASAS) rolled out a new set of reporting requirements last spring, the agency saw a unique opportunity to empower and engage prevention practitioners in evaluation planning. To help do so, it drew on the resources of Prevention Solutions@EDC (PS@EDC).
Throughout the summer, PS@EDC evaluation specialists, working in partnership with OASAS staff, delivered five day-long trainings to nearly 150 prevention practitioners from across the state. Sessions were designed to prepare participants to develop evaluation plans that were concrete and actionable, and that they could use to communicate the benefits of prevention efforts to funders and community members.
But the trainings also went a step further.
“We could have simply provided instructions for completing their annual reports,” says PS@EDC’s Gisela Rots. “Instead, we incorporated the new requirements into a broader discussion of evaluation. We placed the requirements in context.”
For example, training content looked at the feasibility of different evaluation approaches, given real-world constraints such as limited time and money. It explored challenges such as how to measure change in the absence of baseline data, how to decide which programs to evaluate when faced with several possibilities, and what to do when evaluation findings reveal inconsistencies in implementation.
And because they were learning in a group setting, participants had a chance to problem-solve collectively and learn from the experiences of others. “We had people who have been working in prevention forever and people who have been in the field for less than three years,” says Rots. “Each brought a unique perspective to the table.”
According to Rots, practitioners also benefitted from the involvement of trainers from outside the OASAS system. “It allowed [the participants] to be a little more transparent about the challenges they faced, and maybe a little more open to exploring ‘outside the box’ solutions.”
Rots acknowledges that most practitioners attended the training with a solitary goal in mind—to find out how to complete their annual reports. But she was thrilled to see that most left visibly excited about their evaluation plans.
“People had a chance to think about what they hope to accomplish through their programs, and how they could use evaluation to get there,” says Rots. “It was gratifying to see their excitement grow as they realized that they weren’t just collecting data to report to the state, but to improve prevention practice.”
On August 22, Prevention Solutions@EDC delivered the webinar Effective Evaluation: A Prevention Provider’s What- and How-To Guide for practitioners who were unable to attend the in-person trainings. A recording of the webinar will be available on the OASAS website: https://www.oasas.ny.gov/prevention/index.cfm.
To learn more about Prevention Solutions@EDC’s evaluation work in New York State, contact Gisela Rots at firstname.lastname@example.org.