By: Nicole Russo-Ponsaran and Ashley Karls, Rush University
Wouldn’t it be great if educators had a standardized set of measures to evaluate students’ social-emotional learning skills? We know that millions of children in the United States have social-emotional challenges that are related to behavioral and academic difficulties. To address this, many states have adopted social-emotional learning standards, and schools are implementing programs that teach children how to interact positively with others. However, teachers often report lacking tools to effectively measure social-emotional skills in their students.
How does social information processing relate to social emotional learning?
Social information processing skills encompass the cognitive and emotional steps an individual takes to navigate challenging social situations. These skills include identifying a social challenge, interpreting others’ intent, recognizing and regulating our emotional response to a situation, generating a preferred outcome for the situation, identifying potential solutions, and recognizing our ability to enact our preferred solutions. While these skills are a targeted aspect of broader social-emotional learning, related skills are also employed – e.g., social perspective taking and self-regulation. Social information processing skills are critical for effective peer interactions and the development of friendships, as well as healthy participation in group learning activities. Additionally, accurately measuring each component step within an individual can help practitioners pinpoint where a deficit may be present.
Ever wonder how a child successfully navigates the social world?
To help answer this question, the research team at Rush NeuroBehavioral Center (RNBC) teamed up with Soar Technology, Inc. to develop VESIP™, an interactive web-based assessment designed to measure students’ social information processing skills. VESIP is based on the theoretical social information processing model proposed by Crick and Dodge. Accordingly, VESIP measures problem identification, goal preference, solution preference, emotion response, hostile attribution bias, and self-efficacy. Students customize an avatar character and watch the character encounter a range of challenging situations in various school environments. Students respond to prompts from a friend character, expressing what they think about the situation and what they would want to do.
We are already learning a lot from the use of VESIP in schools. For example, initial VESIP data from a middle school sample suggest that children’s ability to identify effective, pro-social solutions is related to both social skills and problem behaviors as reported by teachers. Students’ emotional responses to situations are also related to problem behaviors, such that less aggressive interpretation of situations is related to fewer problem behaviors. Finally, there is a robust relationship between students’ perceived self-efficacy for preferred solutions and problem behaviors, such that greater self-efficacy is related to fewer problem behaviors.
Here’s where you come in!
We know that these skills are important to social development and even academic performance. It would be great if we could easily measure these skills and know how children are performing relative to their peers. We’ve collected data from a large group of students locally in the Midwest. Now we need a nationally representative sample to learn how students across the country perform on VESIP. Once enough students take the assessment, we can create standardized scores that will allow schools to automatically interpret their students’ scores in a meaningful way.
Throughout the 2019-2020 school year, schools that would like to partner with RNBC to achieve this goal will be able to use VESIP for free as part of their programming, along with instruction, training, and support provided at no cost. Schools can automatically generate score reports summarizing student, classroom, and district performance.
If you are interested in participating in this study or would like more information, please contact Principal Investigator, Nicole Russo-Ponsaran, PhD (firstname.lastname@example.org, 847-763-7927) or Study Coordinator, Ashley Karls, BA (email@example.com, 847-763-7963).
Disclaimer: The Assessment Work Group is committed to enabling a rich dialogue on key issues in the field and seeking out diverse perspectives. The views and opinions expressed in this blog are those of the authors and do not necessarily reflect the official policy or position of the Assessment Work Group, CASEL or any of the organizations involved with the work group.