Information to help your mission collect outcomes data


AGRM’s Think Outcomes initiative has launched, and some missions are already using the tool to collect outcomes data. The seven-question survey tool is designed to gather data in four key areas: Relationships (with God and others), Health (which includes sobriety), Housing, and Economics.

Here are some points to help you collect data uniformly: 

  • You can begin collecting data at any time, using whatever software you’re comfortable with. Once the central repository is in place, we’ll provide instructions on how to get your data to us.

  • Missions are encouraged to survey recovery participants at program entry, at exit, and six to 12 months after exit. Some missions measure at six months, some at 12, some at 18, and some at all of these intervals, but we’re asking for only one follow-up, at six to 12 months.

  • “Exit,” for these purposes, means completion of your mission’s recovery program. The questions assume a period of transition, whether at the mission or elsewhere, so even if the “graduate” enters a transitional living program at your mission, please consider them as having “exited” the program.

  • Questions four and five of the survey tool deal with physical health. Leave these two questions out when clients are surveyed at entry, but include them when the client exits the recovery program and again at follow-up. We believe clients will be better able to evaluate their health history and current health after they’ve been through a mission program. Otherwise, it may appear as though a client’s physical health declined during the program, when in reality the client may simply have become more informed and educated about his or her health after completing it.

  • Some missions have clients fill out the survey themselves; others use an interviewer. Your mission has the latitude to continue whatever process it currently employs, as long as every mission collects data using the same questions.

  • The survey tool doesn’t include fields for the mission’s name or address, but that information needs to be collected in some way so that missions can run comparative reports without including their own data in the comparison pool. Having each mission’s address will also allow reports to be run by region.

  • The survey tool includes numbers along the side that should be used as the values assigned to each response. The goal is to show a baseline condition and an indication of improvement in the areas of (1) Relationships with God and others, (2) Health, especially sobriety, (3) Economics, and (4) Housing. We’re hoping the instrument will demonstrate improvement from program entry to graduation, then continued improvement, or at least stability, six to 12 months later. For example, the results may show that, association-wide, clients moved from a “0” on average in sobriety to a “2.5” on average, or progressed from a “1” on average in housing to a “3” on average.
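To illustrate the averaging described above, here is a minimal sketch (the records, area names, and survey points are hypothetical, not part of the actual tool) of how the side-column values could be averaged at each survey point:

```python
# Hypothetical records: each entry is (survey_point, area, assigned value),
# where the value comes from the numbers along the side of the survey tool.
surveys = [
    ("entry",     "sobriety", 0),
    ("entry",     "sobriety", 0),
    ("exit",      "sobriety", 2),
    ("exit",      "sobriety", 3),
    ("follow_up", "sobriety", 3),
    ("entry",     "housing",  1),
    ("exit",      "housing",  3),
]

def average(point, area, records):
    """Mean assigned value for one area at one survey point."""
    values = [v for p, a, v in records if p == point and a == area]
    return sum(values) / len(values) if values else None

print(average("entry", "sobriety", surveys))  # 0.0
print(average("exit", "sobriety", surveys))   # 2.5
```

Comparing the entry average with the exit and follow-up averages gives the baseline and the indication of improvement the tool is meant to show.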

  • Missions should consider ways to minimize selection bias in their outcomes reporting. The clients you’re able to maintain contact with are more likely to report positive results, so a foundation may suspect that the results are not as positive as indicated. There are several ways to address this: (1) Report findings, but footnote the percentage of graduates who could not be reached for a particular timeframe; (2) Attempt to follow up with all graduates, then randomly sample the surveys, including those graduates with whom the mission could not maintain contact, and count anyone who could not be contacted as unsuccessful (“unsuccessful” meaning the number value did not increase); or (3) Count everyone with whom the mission could not maintain contact as unsuccessful (the most conservative, and probably least attractive, option).
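The three options can be sketched in a few lines of code. This is an illustration only, with made-up graduate records; “improved” here means the client’s value increased from exit to follow-up, and a missing follow-up value means the graduate could not be reached:

```python
import random

# Hypothetical graduate records for one outcome area.
graduates = [
    {"exit": 2, "follow_up": 3},     # reached, improved
    {"exit": 2, "follow_up": 2},     # reached, value did not increase
    {"exit": 1, "follow_up": None},  # could not be reached
    {"exit": 3, "follow_up": 3},     # reached, value did not increase
    {"exit": 0, "follow_up": 2},     # reached, improved
]

reached = [g for g in graduates if g["follow_up"] is not None]
improved = [g for g in reached if g["follow_up"] > g["exit"]]

# Option 1: report results among those reached, footnoting the unreached share.
pct_unreached = 100 * (len(graduates) - len(reached)) / len(graduates)
print(f"Improved: {len(improved)} of {len(reached)} reached; "
      f"{pct_unreached:.0f}% of graduates could not be reached")

# Option 2: randomly sample ALL graduates (reached or not); anyone who
# could not be contacted counts as unsuccessful.
sample = random.sample(graduates, k=3)
sample_successes = sum(
    1 for g in sample
    if g["follow_up"] is not None and g["follow_up"] > g["exit"]
)

# Option 3 (most conservative): every unreached graduate counts as
# unsuccessful in the overall rate.
conservative_rate = len(improved) / len(graduates)
print(f"Conservative success rate: {conservative_rate:.0%}")
```

With this sample data, option 1 reports 2 of 4 reached graduates improved with 20% unreached, while option 3 reports a 40% success rate across all five graduates.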


Next Steps for AGRM:

  • AGRM is working with a software provider to develop a beta module to collect and transmit the data to AGRM’s central repository. After AGRM has had an opportunity to evaluate the software and make adjustments as necessary, we’ll encourage other software vendors to develop outcomes modules.

  • AGRM and the software provider are recruiting missions to be part of a beta test group. We’re selecting missions from a cross-section of the membership, varying the location, size, program types, etc. These missions will provide input on the questions, the process, the software module, and other factors. We do not anticipate any significant change in the survey questions themselves, unless a strong, widely felt concern is voiced.

  • After a sufficient amount of data has been collected and deposited into a central repository, we anticipate that missions will be able to run reports and even compare their data to averages within their region and nationally. Neither missions nor their clients will be connected to their data by name.

  • AGRM will keep members up-to-date on this process.


About AGRM’s Process to Develop the Survey Tool:


  • AGRM assembled a cross-section of mission staff for an outcomes task force. The group came up with the four broad areas for measurement, plus a laundry list of possible questions. The four areas of measurement are: Relationships (with God and others), Health (including sobriety), Economics, and Housing.

  • After further discussion with the task force members, AGRM combined their questions into 40 potential questions.

  • In the next phase, AGRM conferred with a number of task force members and other mission leaders to narrow the questions to 23. AGRM then developed a survey, asking the task force to rate the value of each question. The top five questions from this survey were preserved.

  • AGRM enlisted the expertise of Baylor University researcher William Wubbenhorst (learn more about William at http://www.baylorisr.org/scholars/w/william-wubbenhorst). William visited a number of missions to learn about their programs and the scope of their work within their respective communities.

  • With input from William, AGRM further polished the questions, taking many factors into account. AGRM added two questions about physical health and developed the initial scoring system.

  • Next, AGRM held a session at our 2016 annual convention to present the questions and process, and to gather further information. More than 220 mission staff members attended the session. William was present as well. View a PDF of the on-screen presentation.

  • After getting input from William to help synthesize what we’d heard from members, AGRM tweaked the questions and re-arranged the scoring system to better reflect best practices.

  • AGRM has since sent the final questions to a few missions, and no further changes have emerged.

  • AGRM is working with a software company to develop a module to collect outcomes data based on this measurement tool. The company and AGRM are working together to assemble a test group to evaluate the software tool.

Questions about the process may be directed to AGRM Director of Member Services Justin Boles. Feel free to email Justin at jboles@agrm.org.