Design Challenge: Year 2 Winners and Lessons Learned

By: Clark McKown, xSEL Labs


This spring, the Assessment Work Group mounted a second design challenge to identify direct assessment tools to supplement districts’ current reliance on teacher and student surveys for measuring SEL competencies. We will release a brief detailing the winners, the lessons learned, and how those lessons build on the key design principles for effective assessments that we established in the first Design Challenge. Read it here later this week. In the meantime, blog readers can get an early look at our findings from the competition…

For the second design challenge, we wished to increase practitioner input so that we could shine a light on assessments that respond to clearly expressed practitioner needs. To that end, the challenge included two phases. In the first phase, we asked practitioners to submit brief descriptions of the kinds of assessments that would most benefit them in their work with students, and how those assessments would help them. In response, we received over 60 submissions. We selected the 10 best submissions that applied to a universal group of students within preK–12th grade and provided a clear and actionable description of a need that could be addressed through a direct assessment of SE competencies. For example, several practitioners wanted assessments that could help them determine whether the programs and practices they are using are having an impact on the student competencies being targeted. Other practitioners work in high schools and want to measure their students’ college and career readiness. Most practitioners were focused on measuring both the inter- and intrapersonal competencies of their students. The winners of the first phase can be seen here.

In the second phase of this year’s design challenge, we issued a call for submissions of assessments specifically designed to address the needs articulated by one of the winning practitioner submissions. We asked submitters to identify one of the practitioner submissions to respond to and to describe their assessment and how it would address the stated practitioner need. We emphasized this year that submissions could be designs, prototypes, or fully developed assessments. In response to this call, we received 11 submissions—six from practitioners, two from researchers, two from test developers, and one from a consultant. Six submissions were in the design phase; four reflected working prototypes; and one was a fully developed assessment.


Judges selected five submissions as winners. In brief, the winning submissions included:

  • A design for a brief assessment of SE competencies among elementary-aged students using a game-like computer platform in which students respond to hypothetical vignettes.
  • A design for a text-based decision game for high school in which students walk through scenarios and earn points for actions they take in response to specific situations.
  • A fully developed web-based assessment of social information processing for the elementary and middle school grades that uses an interactive and immersive format in which students customize and adopt the role of an avatar who experiences challenging social situations.
  • A rubric design for evaluating the extent to which high school students demonstrate critical SE competencies during classroom time.
  • A prototype problem-solving performance assessment for high school.


Each winning submission demonstrated how its assessment connected to a practitioner need identified in the first phase of the design challenge. As with last year’s challenge, each winner will be featured on our blog in the coming weeks. See our website for additional details on the winning assessments.

Based on lessons learned from this year’s design competition, we added to the list of design principles that we think should guide the development of new SEL assessments. Specifically, we believe that SEL assessments should have the following characteristics, with the final three principles new this year:

  • Usability and feasibility in authentic educational settings
  • Clarity about the purpose for which the assessment is designed
  • Developmental and cultural appropriateness
  • Useful data reporting
  • Potential to be used at large scale
  • Technical soundness
  • A clear connection to a specific practitioner need
  • Development in collaboration with those who will use the assessment
  • Methodologies appropriate to achieving the assessment goal, including direct assessment where appropriate and other methods where they are better suited to that goal

For those interested in learning more, here’s how you can join the conversation.

  • Read our brief when it’s released later in the week detailing lessons learned from the second year of the Design Challenge, including key design principles to help drive the next generation of assessments.
  • Join the webinar! On August 23rd at 12pm EST, three Design Challenge winners will discuss their winning assessment proposals, including what social-emotional competencies they assess and how. Register here! Note: If you’re not able to attend the live event, please register to receive an email with a recording of the webinar.
  • Read our weekly blog which will feature a post every Wednesday from each winner throughout the next two months. We encourage you to engage with them by sharing your reactions and questions in the comments.
  • Comment below to let us know how you are using direct assessments, or respond to one of these questions:
  • What has been your experience with trying direct performance assessments? What did you learn from the experience?
  • What SE competencies would you like to see measured more directly?