Lessons Learned from Evaluating a Case Management Program

Jessica Broome

A client of mine is a social service agency. One of its smallest programs (serving about 40 clients at any given time) offers case management for individuals involved with the criminal justice system. The typical program participant has recently been released after serving a long prison sentence, is drug-addicted, and suffers from a chronic health condition (most often HIV, but also hepatitis, diabetes, and cancer). The audience is hard to engage, service staff are chronically overworked, and, largely because the number of participants is so small, the quarterly evaluation has sometimes not shown much progress. After almost four years of evaluating this program, I’ve learned a few lessons that I hope can help other evaluators assess their own approaches:

  1. Use clear and relevant objectives to track progress: This may seem obvious, but it has been a bumpy road! The objectives put forth by the program’s funder were broad and, initially, impossible to track. (“Reduce the spread of HIV in served communities” was one objective we simply could not collect data to measure.) Instead of trying to force our program into this rubric, we came up with a few relevant sub-objectives that we COULD track, like education of community members and decreased viral loads among program participants.
  2. Involve service providers in selecting evaluation techniques: The counselors and case managers who are on the “front lines” of client service have been my best resource when it comes to designing data collection mechanisms and deciding which outcomes should be tracked. For example, a counselor pointed out that program participants often have limited cognitive abilities and are simply not able to reliably answer questions rating their own health. Requiring this also placed a heavy burden on already-overworked staff, who had to collect the data from participants at regular intervals. Instead of a self-assessment of health, we started collecting lab results directly from medical providers, which gave us a more objective measure and reduced the burden on both participants and line staff.
  3. Keep data sources evolving along with data needs: For several years, I conducted focus groups and client satisfaction surveys with current clients, who consistently gave the program rave reviews. Ultimately, I realized that we weren’t learning anything new from these approaches: the participants who responded to the survey and attended the focus groups were the ones who came to the program every day, while those who were unhappy with the services they received had stopped coming, or came only rarely. This year, we’re implementing a “track down” effort, diverting all the resources that were used for the focus groups to locating and interviewing individuals who have stopped attending the program.

It’s important to point out that these lessons can be applied to almost any program evaluation! This program works with a very specific target audience, but I will take these lessons with me when I work on other evaluations.

Jessica’s question to you
What other lessons have you learned from program evaluations?
