New Year, New Approach: One Organization’s Re-SEL-utions for Overcoming SEL Measurement Challenges

By: Linda Galib, Urban Initiatives

Social emotional learning. Such a complex concept wrapped up into three words.

As the Research and Evaluation Director for Urban Initiatives, a nonprofit, sports-based positive youth development organization that partners with 55 Chicago Public Schools and serves nearly 17,000 K-12 youth, my job is to ensure that our evaluation efforts accurately and systematically capture the change our program staff sees day in and day out in our participants. We use the power of sport and play to empower Chicago youth during recess and out-of-school-time programming. When it comes to social emotional learning specifically, we believe the hands-on, experiential nature of this sports-based approach allows us to strategically impact social emotional growth with long-term outcomes. Social emotional learning is critical to our mission and is intentionally incorporated into all of our programming curricula and program evaluation strategies.

Despite being so central to our programming approach, social emotional learning is by far our most difficult outcome to measure. We know we are not alone in this sentiment: With 136 different social emotional frameworks to choose from, it’s clear that others have felt similarly.

Our programming staff know firsthand the power of SEL. It enables students to develop positive relationships with each other and with adults, to engage in positive play, and to succeed in the classroom. Our staff have countless anecdotes of participants’ social emotional growth and learning. So, why is SEL so difficult to systematically measure?

As we move into 2019, Urban Initiatives is resolving to put the “SEL” in our new year’s “Re-SEL-utions.” Outlined below are two of the largest challenges we’ve faced in measuring SEL as a service provider organization, steps we’ve taken so far, and our plans to continue addressing these challenges in 2019:

Resolution #1: Utilize one SEL framework for all programming and evaluation efforts that best meets the developmental needs of Urban Initiatives’ multiple programs. Currently, each of our evidence-based program models is based on a different framework that incorporates SEL in a slightly different way. For example, the SEL curriculum for our 5-8th grade leadership development program is based on the Search Institute’s Developmental Assets Framework, whereas the SEL components of our college and career readiness curriculum for 9th-12th grade students are derived from MHA Labs’ Building Blocks. Even if we continue using different evidence sources to inform our curriculum development, having one overarching SEL framework that accounts for the different developmental stages of SEL for children and youth would be incredibly helpful when evaluating and understanding our theory of change and impact as an organization.

To fulfill this resolution, we plan to create an organizational SEL theory of change and map out all SEL components of our program curricula to identify overlap and program-specific SEL focal points. In doing so, we’ll use available resources (for example, CASEL’s criteria for selecting SEL frameworks and CCSR’s developmental framework) to review and identify existing SEL frameworks that are empirically based, that account for the developmental stages of youth while acknowledging the potential for SEL growth in adults, and that account for systemic factors such as inequity and oppression, which are central to the experience of many of the youth we serve but often overlooked in the more individualized context of SEL.

Resolution #2: Work with external stakeholders to identify the most effective methods for measuring SEL in action. While we’ve extensively incorporated SEL measurement into each of our program evaluation strategies (through a combination of participant, parent, teacher, and coach surveys), it still feels like we’re missing something. While surveys have many advantages, their limitations are evident in our evaluation planning, implementation, and analysis. We want to use validated survey measures, but they often either don’t align with our program curriculum or are so lengthy that they don’t account for the needs of diverse learners. Reference bias may also be a challenge for our participant surveys: as students learn more about social emotional development throughout the program year, they may rate themselves lower at the end of the year than they did at the beginning.[1] Low response rates also make it difficult to generalize our survey results across our programs.

Whenever I see our programming in action, however, I see firsthand how impactful adults can be in intentionally modeling SEL. During recess observations, for example, I see our recess coaches facilitating inclusion on the playground by intentionally engaging with students sitting on the sidelines and supporting them to join in play with their classmates. In another example, while participating in a session of our out-of-school-time soccer programming, I watched in awe as our adult coach guided their 5-8th grade team captains to lead their younger K-4th grade teammates in warm-ups and games. The captains used call-and-response techniques to get their team’s attention and effectively addressed behavioral challenges at practice without ever needing to raise their voices or use a whistle. These are rich qualitative examples of adult modeling of SEL that I would love to quantify through evaluation in some way. How best to do this?

If there’s one thing I’ve learned in my past 2.5 years at Urban Initiatives, it’s that going it alone as an internal evaluation department is neither glamorous nor recommended. In 2019, we are excited to build upon existing partnerships with entities such as the University of Illinois at Chicago (UIC) and the Thrive Data Partnership to collaborate with researchers and other service-providing organizations and learn more about possibilities for evaluating SEL in action. Last year we piloted a workforce development partnership with UIC researchers to intentionally develop the social emotional skills of our recess staff. As that partnership continues, we will also be working with the UIC team to build our internal technical capacity for evaluation, ensuring we are using the most effective evaluation strategies and analyses possible for truly identifying SEL progress across all of our programs. We also plan to collaborate with our program staff to learn more about their experiences and opinions on how best to capture SEL in action.

The bottom line: At Urban Initiatives, one of our favorite programming cheers is TEAM (“Together Everyone Achieves More”) and this could not be more true when measuring SEL. We look forward to working toward these evaluation resolutions in 2019 and beyond, and would love to hear from researchers and other service-providing organizations to explore how we can collaborate on this SEL measurement journey.


Linda Galib is Research and Evaluation Director at Urban Initiatives. She can be reached at linda.galib@urbaninitiatives.org.


References/Links included in blog (in order that they appear):

Berg, et al. (2017). Identifying, defining, and measuring social and emotional competencies. American Institutes for Research: Washington, DC. Accessed January 2, 2019 at https://www.air.org/sites/default/files/downloads/report/Identifying-Defining-and-Measuring-Social-and-Emotional-Competencies-December-2017-rev.pdf.

[1] West, et al. (2016). Promise and paradox: Measuring students’ non-cognitive skills and the impact of schooling. Educational Evaluation and Policy Analysis, 38(1): 148–170.

Search Institute. “The Developmental Assets Framework.” https://www.search-institute.org/our-research/development-assets/developmental-assets-framework/.

MHA Labs. “Skill Building Blocks.” http://mhalabs.org/skill-building-blocks/.

Blyth, D.A. & Borowski, T. (2018). “Ten Criteria for Describing and Selecting SEL Frameworks.” CASEL: Chicago, IL. Accessed January 2, 2019 at https://measuringsel.casel.org/wp-content/uploads/2018/09/Framework-A.3.pdf.

Nagaoka, et al. (2015). Foundations for young adult success: a developmental framework. The University of Chicago Consortium on Chicago School Research: Chicago, IL. Accessed January 2, 2019 at https://consortium.uchicago.edu/sites/default/files/publications/Foundations%20for%20Young%20Adult-Jun2015-Consortium.pdf.


Urban Initiatives. (2018). “Workforce development isn’t optional: Here’s how we are prioritizing it.” http://www.urbaninitiatives.org/news/2018/09/workforce-development-isnt-optional-heres-how-we-are-prioritizing-it/.
