
The Use of Ripple Effect Mapping’s Rippling and Theming Approach as an Evaluation Tool for Extension Programming: A Case Study

ID

ALCE-297NP

Authors as Published

Authored by Clare Lillard, Extension Agent, Orange and Madison Counties, Family and Consumer Sciences; Dr. Karen A. Vines, Assistant Professor and Continuing Professional Education Specialist, Department of Agricultural, Leadership, and Community Education, Virginia Tech; Dr. Melissa Chase, Consumer Food Safety Program Manager, Department of Food Science and Technology, Virginia Tech; Dr. Tiffany Drape, Assistant Professor, Department of Agricultural, Leadership, and Community Education; and Lisa Ellis McCormick, MS Student, Department of Agricultural, Leadership, and Community Education, Virginia Tech.

Introduction

Ripple Effect Mapping (REM) is described as “a group participatory evaluation method engaging program participants and community stakeholders to retrospectively and visually map the chain of effects resulting from a program or complex collaboration” (Chazdon et al., 2017, p. 6). REM was designed to document the results of program efforts within complex, real-life settings (Chazdon et al., 2017). The three most common reasons for using REM are to document program impacts in a way that provides insight from multiple perspectives, to generate enthusiasm and energy for continued work, and to help participants connect their efforts with those of others (Chazdon et al., 2017). There are varied approaches to REM; however, each contains four core elements: appreciative inquiry (AI), a participatory approach, interactive group interviewing and reflection, and radiant thinking (RT). REM uses these core elements to guide participants in reflection on visually mapped program impacts, both intended and unintended (Chazdon et al., 2017). Data collected through REM are beneficial for program evaluation and planning because of the deeper understanding they provide into why a program works and the awareness they create of alternative approaches that can be used to build on strengths and overcome program challenges. This article reflects on the experience of using the theming and rippling approach of REM to evaluate the Virginia Cooperative Extension Stone Soup Rural Workforce Training Program (Stone Soup).

Stone Soup Rural Workforce Training Program

Stone Soup was established to address the need for workforce training identified through community needs assessments conducted by the Culpeper, Madison, and Orange County Virginia Cooperative Extension (VCE) offices in Central Virginia. Stone Soup collaborates with the George Washington Carver Food Enterprise Center and Culpeper County Human Services. The program provides food service workforce training for low-income audiences with two goals in mind: increased employment opportunities, and increased health for both participants and family members achieved through nutrition training. A professional chef volunteers his time and expertise to conduct the six-session program, which focuses on food safety, customer service, basic nutrition, and budgeting the food dollar. The program is supported by Extension agents, Family Nutrition Program assistants, Master Food Volunteers, and community volunteers from Rural Madison organizations. At the time of the REM exercise, a total of 64 individuals had graduated from six Stone Soup programs. Stone Soup has served low-income individuals as well as individuals with intellectual and/or developmental disabilities and mental illnesses. Graduates are offered additional assistance such as resume writing and interviewing skills.

REM was selected for the Stone Soup program for several reasons. Although evaluations had been conducted on Stone Soup to assess short-term outcomes, there was a growing interest in gathering impact data. Over time, the purpose of providing workforce preparation for low-income audiences shifted to include increasing self-sufficiency for audiences with other challenges. The program team was interested in knowing whether there were other unintended outcomes of the program and ways the program might be improved.

Using the REM Approach

As a first step, this project was submitted to the Virginia Tech Institutional Review Board for determination of human subjects research. The project was deemed not to be human subjects research since its primary purpose was program evaluation. The project then moved forward as an evaluation process.

The program logic model (Figure 1) was used to develop open-ended inquiry questions, based on the intended outcomes, for use with participants and stakeholders in REM. Using open-ended questions avoids the leading questions that program administrators might otherwise ask. An agenda and script were developed using examples provided in A Field Guide to Ripple Effects Mapping (Chazdon et al., 2017), as included in Appendix C.

Image: Logic model depicting program inputs, outputs, and outcomes for the Stone Soup project.

Figure 1. Logic model for the Stone Soup program.

Program participants and stakeholders were recruited to participate in the REM session via email or phone. The mapping session was held in a central location near where many of the Stone Soup classes had previously been conducted. The theming-and-rippling approach was selected as the REM method (Emery et al., 2015). With this method, participants assist with the identification of overall themes. The method was selected because it gives all participants the opportunity to express their thoughts and see them displayed on an initial mind map (Chazdon et al., 2017).

The REM session was facilitated by a third party who was not involved in the program. The facilitator followed the script developed by the program director, using open-ended questions. An additional Extension agent served as the “mapper” to record and map the information provided by participants using XMind mapping software (www.xmind.net). The mapper asked participants to clarify their responses as needed during the session. The facilitator and mapper received copies of the script prior to the session.

The session began with an overview of the program and general guidelines presented by the facilitator and researcher. The participants introduced themselves and were then asked to pair up, preferably with someone they did not know well. Once paired, they took turns serving as the interviewer, asking the open-ended inquiry questions developed by program administrators. They were instructed not to deviate from the AI questions. The facilitator was available during the interviews to answer any questions and checked in on each pair to ensure the protocol was followed. Following the interview protocol increased the rigor of the interviews, especially with participants who had limited experience conducting interviews (Chazdon et al., 2017).

After completion of the interviews, the participants came together as a large group to create a group “mind map” (Figure 2). Group mapping gives participants ownership of the process and helps them see connections among the effects they are describing (University of Minnesota Extension, 2019). Participants shared stories while key phrases were entered into the map. As the findings were mapped, they were grouped according to emergent themes, and participants were asked whether the findings were grouped appropriately under the assigned theme. In some cases, findings were associated with multiple themes. Member checking occurred as individuals confirmed that what they said had been captured and categorized correctly. While completing the group mind map, participants reviewed and reported the outcomes they had experienced or observed from the Stone Soup program while the “mapper” entered the data into the mind mapping software. Ripples were the unintended program consequences. When completed, the mind map visually depicted the outcomes associated with Stone Soup.

Mapping continued until the group was satisfied that the map captured everything known to have happened as a result of the program.
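For readers who want to work with the map outside the mapping software, the following is a minimal sketch of one way the map’s structure (the program at the center, themes as branches, and individual effects or “ripples” as leaves) could be represented in Python. The class names and example entries are hypothetical illustrations, not the actual Stone Soup data.

# Hypothetical sketch: representing a REM mind map in code.
# The program is the central node, themes are branches, and
# effects ("ripples") are leaves. Example entries are invented.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Effect:
    text: str                 # key phrase captured from a participant's story
    unintended: bool = False  # ripples that go beyond the intended outcomes

@dataclass
class Theme:
    name: str
    effects: List[Effect] = field(default_factory=list)

@dataclass
class MindMap:
    program: str
    themes: List[Theme] = field(default_factory=list)

stone_soup_map = MindMap(
    program="Stone Soup",
    themes=[
        Theme("Employment", [Effect("Graduate obtained a food service job")]),
        Theme("Community partnerships",
              [Effect("New collaboration with a local agency", unintended=True)]),
    ],
)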

Diagram: Mind map created through Ripple Effect Mapping for the Stone Soup program.

Figure 2. An example of the XMind map.

The REM session ended by asking participants to identify what they found interesting about the map and how they felt the REM process would benefit future Stone Soup programming. Participants identified the full scope of their collaborative efforts, encouraging continued collaboration and leading to the identification of additional community partners. Participants also appreciated having a visual means to recap what had been accomplished through Stone Soup, allowing for easy identification of the program’s successes as well as areas that could be built on to strengthen future Stone Soup programming.

After the session, four participants and one stakeholder who were unable to attend the REM session were contacted to gather additional information using the questions from the REM session. Their responses were then added to the mind map. The map was finalized by the researcher.

The final map was exported into an Excel spreadsheet. The program administrators reviewed the data from the mind map and assigned codes based on the intended short-, medium-, and long-term outcomes from the project’s logic model. Emerging themes were matched to outcomes to determine the extent to which the Stone Soup program met its intended goals and to identify any unintended impacts that resulted from the program. The data were then coded using the Community Capitals Framework (Emery and Flora, 2006) to describe the impact of Stone Soup on the community. This framework identifies resources and characteristics associated with successful and sustainable communities in terms of built, cultural, human, financial, natural, political, and social capital (Emery and Flora, 2006).
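As a concrete illustration of this coding step, the following minimal Python sketch shows how each exported map item might be tagged with a logic-model outcome level and one or more Community Capitals Framework codes, then tallied. The item text, labels, and counts are hypothetical examples, not the actual coded Stone Soup data.

# Hypothetical sketch of the coding step: each item exported from the
# mind map is tagged with a logic-model outcome level (short, medium,
# long) and one or more Community Capitals Framework codes
# (Emery & Flora, 2006). Entries are invented for illustration.
from collections import Counter

CAPITALS = {"built", "cultural", "human", "financial",
            "natural", "political", "social"}

coded_items = [
    {"effect": "Graduate obtained a food service job",
     "outcome_level": "medium", "capitals": ["human", "financial"]},
    {"effect": "New collaboration with a local agency",
     "outcome_level": "long", "capitals": ["social"]},
]

# Check each code against the framework and tally how often each capital appears
tally = Counter()
for item in coded_items:
    assert item["outcome_level"] in {"short", "medium", "long"}
    assert set(item["capitals"]) <= CAPITALS
    tally.update(item["capitals"])

print(tally)  # Counter({'human': 1, 'financial': 1, 'social': 1})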

Evidence of Efficacy of Using REM for Program Evaluation

The REM process produced qualitative data that showed the positive program impacts of Stone Soup. Through REM, we were able to identify numerous intended and unintended impacts of the program. All of the intended program outcomes were met, as identified in the discussion and resulting mind map. Ripples that extended the reach of the intended outcomes were also identified through stories told by participants during the REM process. These impacts of Stone Soup were then shared with the public to show the value of the program.

Identified ripples included people moving forward, obtaining jobs, receiving certifications, and increasing life skills. In addition, graduates increased their awareness of others and became successful entrepreneurs. Through REM we also learned that there was greater relationship building among community agencies, that we reached new audiences, and that job skills increased. Finally, through the comments we learned that Stone Soup helped improve community perceptions of the relevance of Extension.

Benefits and challenges associated with using REM were identified through the process. One benefit was the increased engagement and energy observed in participants during the REM session. All individuals who took part in the REM session enjoyed the process and felt it was worthwhile, as reported during the reflection period. Participants agreed that the participant-centered approach of REM encouraged good conversation and connected participants. Participants built on the stories of others and uncovered far-reaching impacts through these conversations. One participant reported that he enjoyed sharing with his interview partner during the appreciative interviews and now has a better understanding of the role his partner plays in the community. Participants were excited to see the far-reaching impacts of their work with Stone Soup represented on the visual map. Participants felt encouraged to continue their work with the program after seeing the results of their efforts and participation. Stone Soup administrators who participated in the REM session reported that they enjoyed reconnecting with those who helped implement the program as well as with program participants.

Challenges primarily centered on the difficulty of recruiting enough people to participate in the REM session.

Summary, Conclusion, and Results

REM is a useful evaluation tool that can be applied to any Extension program. REM collects qualitative data that can be supported by quantitative data collected during the program. Both short-term and medium-term goals were shown to have been met. Data collected through this process were entirely qualitative; Extension educators could use surveys or questionnaires if quantitative data were needed.

Challenges associated with the Stone Soup REM process included having a small number of participants; Chazdon et al. (2017) recommend 8-15 participants. The small number of participants, however, allowed everyone’s opinions to be shared without exceeding the time allotted for the session, and the numbers were sufficient to evaluate the REM process. REM sessions are cost-effective. The only cost associated with implementing the REM session was for refreshments, which are suggested to help frame the REM session as a program celebration (Hansen, Higgins, & Sero, 2018). There was no cost associated with the meeting facility in the Extension office.

A major concern was the low participation in the REM session. Possible reasons for low participation were the time required to complete the process, the passage of time from completion of the course to the REM session, and the loss of relationships with participants once the program was completed. REM sessions generally take two hours to complete, and potential participants may not have been able to dedicate two hours to the process. Program administrators were not in regular contact with participants during this time. Participation might increase if the REM session were held closer to the completion of the program. A class “reunion” could be combined with a REM session to celebrate program success and provide follow-up training. One challenge related to participation in this specific program may be that participants with mental illness did not feel comfortable working with individuals they had not previously met. Holding the session in a familiar setting would create a safe environment that would encourage greater participation from this sector of program participants.

There was a risk of selection bias in data collection given the low number of participants (North Central Regional Center for Rural Development, 2012). The map produced largely represented the viewpoints of stakeholders. This limitation could be overcome by conducting interviews with other program participants and stakeholders (Kollock et al., 2012). It would be interesting to conduct a REM session that contained only participants, or one with a more balanced sample of participants and stakeholders, for comparison to this project.

This session used the theming and rippling approach of REM (Chazdon et al., 2017), but there are numerous approaches to conducting REM. In this approach, the mapper records the data on a visual map using mind mapping software or a large sheet of paper. In our session, the map was projected on a wall-mounted projection screen but was not large enough for all participants to read. In the future, we recommend enlarging the image when using the XMind mapping software.

A reporter was not used in this session but is planned for future sessions to capture direct quotes from participants. A reporter would help ensure that direct comments are captured, leading to clarification of the map itself, and would provide personal statements from participants for use in reporting the impacts of the program to community stakeholders. The REM process is very intense, so another recommendation is to schedule an additional meeting for group reflection after completion of a REM session. This would help reduce fatigue among participants.

References

Burkhart-Kreisel, C. (2015). Ripple Effect Mapping: A Tool to Document Change. Cornhusker Economics, 817.

Chazdon, S., Emery, M., Hansen, D., Higgins, L., & Sero, R. (2017). A Field Guide to Ripple Effects Mapping. University of Minnesota Libraries Publishing.

Coughlan, A.T., Preskill, H., & Catsambas, T.T. (2003). An Overview of Appreciative Inquiry in Evaluation. New Directions for Evaluation, 100, 5-22. https://doi.org/10.1002/ev.96

Mattos, D. (2015). Community Capitals Framework as a Measure of Community Development. Cornhusker Economics, 811.

Cooperrider, D., & Whitney, D. (2006). Appreciative Inquiry: A Positive Revolution in Change. In Holman, P., Devane, T., Cady, S., et al., The Change Handbook (Chapter 29, pp. 19-33). Berrett-Koehler.

Cooperrider, D.C., Whitney, D., & Stavros, J.M. (2008). Appreciative Inquiry Handbook: For Leaders of Change. Crown Custom; Berrett-Koehler.

Daniels, C.H., Chalker-Scott, L., & Martini, N. (2016) Uncovering transdisciplinary project outcomes through ripple effect mapping. Journal of Extension, 54(5), Article V54-5TT.

Darger, M. (2014). Capturing the ripples from community-driven business retention and expansion programs. Journal of Extension, 52(2), Article 2TOT6.

Emery, M., Fey, S., & Flora, C. (2006). Using Community Capitals to Develop Assets for Positive Community Change. CD Practice, Issue 13. http://srdc.msstate.edu/fop/levelthree/trainarc/socialcapital/communitycapitalstodevelopassets-emeryfeyflora2006.pdf

Emery, M., & Flora, C. (2006). Spiraling-Up: Mapping Community Transformation with Community Capitals Framework. Journal of the Community Development Society, 37(1), 19-35. https://www.uvm.edu/rsenr/rm230/costarica/Emery-Flora-2006.pdf

Emery, M., Higgins, L., Chazdon, S., & Hansen, D. (2015). Using ripple effect mapping to evaluate program impact: choosing or combining the methods that work best for you. Journal of Extension, 53(2), Article 2TOT1.

Enhancing Rural Capacity. (2015, July 17). Using Ripple Effect Mapping to Determine Program Outcome. [Video]. YouTube. https://www.youtube.com/watch?v=6rFdZVSETPU

Community-Development. (2019, July 12). Ripple Effects Mapping: An Effective Tool for Identifying Community Development Program Impacts. Enhancing Rural Capacity. https://community-development.extension.org/ripple-effects-mapping-an-effective-tool-for-identifying-community-development-program-impacts/

Hansen, D., Higgins, L., & Sero, R. (2018) Advanced facilitator guide for in-depth ripple effects mapping. Pullman, Washington: Washington State University Extension. https://hdl.handle.net/2376/13110

Kollock, D. H., Flage, L., Chazdon, S., Paine, N., & Higgins, L. (2012). Ripple effect mapping: A “radiant” way to capture program impacts. Journal of Extension, 50(5), Article 5TOT6.

Nathaniel, K.C., & Kinsey, S.B., (2013). Contributions of youth engagement to the development of social capital through community mapping. Journal of Extension, 51(1). Article 1TOT7.

North Central Regional Center for Rural Development (2015, October 20). Ripple Effect Mapping of Extension. [Video]. YouTube. https://www.youtube.com/watch?v=tgzrIcvrDV8

Rosciano, A., (2015). The effectiveness of mind mapping as an active learning strategy among associate degree nursing students. Teaching and Learning in Nursing, 10(2), 93-99.

University of Minnesota Extension. (n.d.). Ripple Effect Mapping Makes Waves in the World of Evaluation. University of Minnesota. Retrieved from https://extension.umn.edu/community-development/ripple-effect-mapping

University of Wisconsin Extension. (2003). Enhancing Program Performance with Logic Models [MOOC]. https://fyi.extension.wisc.edu/programdevelopment/files/2016/03/lmcourseall.pdf

Taylor-Powell, E., Jones, L., & Henert, E. (2002). Enhancing Program Performance with Logic Models [Online Course]. University of Wisconsin Extension Services. https://lmcourse.ces.uwex.edu

Taylor-Powell, E. (2006). Participatory Evaluation [Fact Sheet]. University of Wisconsin Extension. https://fyi.extension.wisc.edu/programdevelopment/files/2016/04/Tipsheet35.pdf

Taylor-Powell, E., & Camino, L. (2006). Probing Questions in Interviews [Fact Sheet]. University of Wisconsin. https://fyi.extension.wisc.edu/programdevelopment/files/2016/04/Tipsheet34.pdf

Van Lelyveld, G. (2018, July 13). Ripple Effect mapping: Peninsula Food Coalition. [Video]. YouTube. https://www.youtube.com/watch?v=uI9pei1AU-I

Vines, K. A. (2018). Exploration of Engaged Practice in Cooperative Extension and Implications for Higher Education. The Journal of Extension, 56(4), Article 24.

XMind. (2022). XMind 2022 (Version 11.1.2) [Computer software]. https://www.xmind.net/download/xmind


Virginia Cooperative Extension materials are available for public use, reprint, or citation without further permission, provided the use includes credit to the author and to Virginia Cooperative Extension, Virginia Tech, and Virginia State University.

Virginia Cooperative Extension is a partnership of Virginia Tech, Virginia State University, the U.S. Department of Agriculture, and local governments. Its programs and employment are open to all, regardless of age, color, disability, sex (including pregnancy), gender, gender identity, gender expression, genetic information, ethnicity or national origin, political affiliation, race, religion, sexual orientation, or military status, or any other basis protected by law.

Publication Date

April 20, 2022