Observational Data to Inspire SEL Practice

By Charles Smith, QTurn LLC

 

Over the past decade we have learned a lot about how to use data to transform educational settings and improve student outcomes in areas such as social and emotional learning (SEL). In my experience, “using data to inspire SEL practice” is the right way to think about it. Transformation through inspiration requires circumstances where educators experience both organizational support and positive motivation to demonstrate and/or improve their practice. From a measurement perspective, we’re asking a question about the consequences of producing the SEL data: Do the data inspire teachers and youth workers to demonstrate their best practices or to work on improving them?

For obvious reasons, much of the action in the field of SEL measurement is focused on assessment of student skills using survey questions, formative tasks, computer integrated tests, and teacher ratings.[1] These measures of what students know and do are important because students’ mental skills are necessary for learning new behavioral skills, which are typically the outcomes we seek in SEL work. It is easy to see how these kinds of measures could be used to inspire practice. However, data from observations of teacher and youth worker practice can also support transformation through inspiration. Here are some reasons why:

First, SEL practices are generally a challenge to implement. This is true both for explicit interventions, where the steps are defined in a manual and confined to a lesson, and for more embedded approaches, where teachers are prepared to support students to practice SEL skills anytime the going gets tough. In my years of experience with the High/Scope preschool curriculum – a powerful evidence-based SEL intervention design when implemented well – fidelity to the instructional practices was frequently a challenge for inexperienced organizations and unsupported teachers. Further, knowing what’s happening in classrooms from observation allows us to ask “Is this classroom ready to support these students?” rather than the reverse, which can put the SEL deficit frame on the students. Although it may be significant to know that some percentage of students gained SEL skills that help them succeed in school settings, that does not mean that those students get to learn in classrooms that are free of disruption or withdrawal. Data from observation can tell this story.

Second, observational data describe the prevalence of teachers’ responsive practices[2] like modeling, scaffolding, facilitating, and coaching that occur in the moment. Like all skills, SEL skills are learned in part through mimicry of models and, particularly in the emotional realm, vicariously through reading the face and body language of important adults and peers. For students who have difficult SEL histories, access to responsive practice from adults is a source of co-regulation[3] when students bring fast-moving, emotionally charged, attention-directing schema (i.e., sensory-affective-motor schema[4]) into the education setting. However, because immediate positive responsivity to emotional distress, withdrawal, or exuberance can be counter-cultural in education settings, and because people are often inaccurate in describing their own behavior, it is good to have a third-party observer doing the counting.

Third, observational data on teacher practice is a powerful foundation for effective performance feedback. A growing evidence base[5] is establishing the utility of teacher observation data as performance feedback that causes instructional improvement and student-level skill change. There are several reasons for this: fine-grained observational items for teacher behavior are actionable (i.e., item-level data are directive about what to do more of and who is doing it well), and the conditions of assessment engender trust (i.e., how scores were produced is transparent to both rater and teacher). Indeed, trust in measurement[6] is a critical support for the use of data.

Fourth, observational assessment of SEL practice supports the identification of local expertise. We’ve learned that variability in the quality of SEL practice is typical among teachers in the same school or in the same afterschool program. This means that despite the counter-cultural status of SEL in prior years, many teachers and youth workers have been iterating toward effective practice, already customized for use with “our kids.” By observing teacher practice, local expertise can be identified, and the energy that comes from recognition of best practice can be released.

To conclude, data from well-designed observational measures for the quality of SEL instructional practices can have transformative and inspirational effects; that is, the data can support organizations to intentionally grow students’ SEL skills. However, a profound irony in current policy is that over the past several years states have made huge investments in teacher evaluation systems that produce high-quality observational data but put those data to the wrong use: higher-stakes individual teacher evaluation conversations focused on eliminating weak teachers. Because these policies emphasize the psychometric value of measure stability over multiple occasions of observation (i.e., reliability) for the purpose of eliminating teachers with “low” performance, the transformative potential in these data is often missed. By creating lower-stakes[7] conditions of data use – where individual teachers’ and youth workers’ experiences of their own performance data entail challenge but not threat – motivational energy around SEL practice is likely to grow.

 

How can you know when an education setting is ready to support optimal SEL skill use? What are responsive practices that make a setting more ready?

What builds staff trust in the process of using SEL performance measures? How do you define the right stakes (e.g., consequences, incentives) for staff when using SEL performance data?

 

References 

[1] These are highlighted because they could link to examples from prior blogs focused on these subjects, e.g., Peckel on surveys, assessment competition winners on formative tasks, McKown on computer integrated tests, LeBuffe on teacher ratings.

[2] https://www.selpractices.org/

[3] https://www.acf.hhs.gov/sites/default/files/opre/report_1_foundations_paper_final_012715_submitted_508.pdf

[4] https://www.selpractices.org/resource/preparing-youth-to-thrive-methodology-and-findings-from-the-sel-challenge

[5] http://socialinnovationcenter.org/wp-content/uploads/2018/03/CSI-turnarounds.pdf

[6] https://www.rand.org/pubs/research_briefs/RB9549/index1.html

[7] http://cypq.org/content/moving-needle-moving-needle

Disclaimer: The Assessment Work Group is committed to enabling a rich dialogue on key issues in the field and seeking out diverse perspectives. The views and opinions expressed in this blog are those of the authors and do not necessarily reflect the official policy or position of the Assessment Work Group, CASEL or any of the organizations involved with the work group.
