By: Tara Laughlin, PAIRIN
The world is full of problems. Not to be a downer, but I’m setting the stage here. On a personal, professional, local, national, and international level, it can sometimes feel like everywhere you turn, there’s a new problem waiting to be solved. Given this sheer volume of problems, I think it’s safe to say the world needs an equally impressive corps of effective problem solvers.
How do we help develop the next generation of problem solvers? And how do we know when someone has reached the status of master problem solver?
Answering those questions is not as hard as it may seem. Just as the best way to determine whether someone is an effective driver is to put them behind the wheel for their driver’s test, the best way to assess someone’s problem solving prowess is to give them an authentic problem to solve. Provide feedback on their progress and, based on how they perform, coach them toward success. This approach is known as performance assessment (Darling-Hammond & Adamson, 2010).
As an Assessment Design Challenge winner for CASEL, PAIRIN has developed a practitioner-friendly problem solving performance assessment, designed for use with students in grades 9-12. The assessment begins with an authentic, real-world problem for students to solve. This problem is purposely broad enough for learners to apply it to their own life context, thus making the assessment more developmentally and culturally appropriate. Students then work their way through a series of problem solving stages: collecting relevant information; identifying stakeholders and the problem’s impact; developing and evaluating possible alternatives; and finally, selecting and justifying the best solution for the problem.
A descriptive, four-point problem solving rubric is included with the performance assessment, intended for use in two ways: (1) for students to self-assess their work, and (2) for educators to evaluate student work and determine their level of proficiency.
Here’s the nitty-gritty on the assessment:
PAIRIN’s problem solving performance assessment incorporates several research-based elements in its design:
- The problem solving scenario is structured around the GRASPS model: (G)oal; (R)ole; (A)udience; (S)ituation; (P)roduct; (S)tandards (Wiggins & McTighe, 2004).
- It is flexible, or adaptive to different students and contexts (Greenstein, 2012; Reeves, 2010).
- It values the process students take to arrive at a conclusion, rather than just the conclusion itself (DuFour & DuFour, 2010; Greenstein, 2012; Marzano & Heflebower, 2012).
- It is aligned both to a learning objective and a rubric (Wiggins & McTighe, 2005).
- It is responsive, providing students with feedback on their performance relative to the expected level of performance, defined by the rubric (Kay & Greenhill, 2013).
- It is authentic, meaning students need to apply their skills in a novel situation (McTighe & Seif, 2010).
- It is metacognitive, having students reflect upon and self-assess their work before turning it in (Marzano & Heflebower, 2012; Pearlman, 2010).
VALIDITY & RELIABILITY
Several steps have been taken to increase the validity and reliability of this assessment. First, the assessment materials include a scoring guide for educators, outlining an inter-rater reliability process to ensure scores across different scorers are consistent. Educators are also provided with exemplars (model student responses) which illustrate all four proficiency levels on the rubric.
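As a rough illustration of the kind of consistency check an inter-rater reliability process involves (the specific procedure in PAIRIN’s scoring guide is not detailed here, and the scores below are hypothetical), here is a minimal sketch that computes percent agreement and Cohen’s kappa for two educators scoring the same student work on a four-point rubric:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Share of students the two scorers rated identically."""
    matches = sum(a == b for a, b in zip(r1, r2))
    return matches / len(r1)

def cohens_kappa(r1, r2, levels=(1, 2, 3, 4)):
    """Agreement corrected for chance, for two raters using the same rubric."""
    n = len(r1)
    po = percent_agreement(r1, r2)  # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # expected agreement if raters assigned levels independently
    pe = sum((c1[level] / n) * (c2[level] / n) for level in levels)
    return (po - pe) / (1 - pe)

# Hypothetical rubric scores from two educators on ten student responses
rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
print(percent_agreement(rater_a, rater_b))          # 0.8
print(round(cohens_kappa(rater_a, rater_b), 2))     # 0.71
```

Kappa is the more conservative of the two figures because it discounts the agreement raters would reach by chance alone, which is why reliability processes typically prefer it over raw percent agreement.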
In addition, the descriptors on the rubric are calibrated with The PAIRIN Survey, a scientifically validated psychometric SEL assessment. All students taking the performance assessment are granted access to The PAIRIN Survey for the purpose of establishing criterion-related validity. This survey can be used as an external instrument, collecting pre- and post-measures to compare against data collected from the performance assessment.
Not only can The PAIRIN Survey be used as a pre- and post-measure, but the performance assessment also includes an alternate assessment for use as a pre- or formative problem solving assessment, so that instructors have another indicator of student growth over time. After students take the performance assessment, this data can be used to determine which students need extra time and support in problem solving, as well as which students may benefit from enrichment.
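To make that triage concrete, here is a small hypothetical sketch (the grouping rules and names are illustrative, not PAIRIN’s) that compares pre- and post-assessment rubric scores, flagging students who showed no growth as candidates for extra support and students already at the top rubric level for enrichment:

```python
def triage(scores, top_level=4):
    """Split students into support / on-track / enrichment groups
    based on (pre, post) rubric scores on a 1-4 scale."""
    support, on_track, enrichment = [], [], []
    for name, (pre, post) in scores.items():
        if post >= top_level:
            enrichment.append(name)   # already proficient: extend the challenge
        elif post <= pre:
            support.append(name)      # no growth: needs extra time and support
        else:
            on_track.append(name)
    return support, on_track, enrichment

# Hypothetical (pre, post) rubric scores for four students
scores = {"Ana": (2, 2), "Ben": (1, 3), "Cam": (3, 4), "Dee": (3, 2)}
support, on_track, enrichment = triage(scores)
print(support)     # ['Ana', 'Dee']
print(enrichment)  # ['Cam']
```

In practice an educator would of course weigh more than two data points, but the pattern is the same: a pre-measure plus a post-measure yields a growth signal per student, which can then drive support and enrichment decisions.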
In addition, to track this growth, the PAIRIN team has completed initial designs for an app which would allow educators to input and view data on students’ performance on this and our other SEL performance assessments. In the app, instructors would be able to look at scores (and provide feedback) on a particular assessment, as well as on SEL assessments over time, for both an individual student and groups of students. We will continue to solicit feedback on these designs and hope to release this product in the future.
The practitioner need this assessment addresses, as identified in the Assessment Design Challenge, was for an assessment aligned with curriculum to help fill skill gaps, and we provide just that. In fact, PAIRIN provides an entire arsenal of SEL curriculum for educators to target students’ needs. This includes 300 online micro-lessons, 54 classroom lessons, and more than 400 combined exercises, handouts, and rubrics, making it the most comprehensive SEL development program in existence. From problem solving to self-control to leadership, PAIRIN’s curriculum includes SEL development tools for the skills that are most critical for students to learn in order to be successful in their future careers and life.
An Assessment with Purpose
In conclusion, PAIRIN’s problem solving performance assessment is an effective vehicle for determining students’ level of mastery and guiding them to reach new heights. There will always be new problems to solve, so let’s ensure our students are prepared to face them head-on.
REFERENCES
Darling-Hammond, L., & Adamson, F. (2010). Beyond basic skills: The role of performance assessment in achieving 21st century standards of learning.
DuFour, R., & DuFour, R. (2010). The role of professional learning communities in advancing 21st century skills. In J. Bellanca & R. Brandt (Eds.), 21st century skills: Rethinking how students learn (pp.77-95). Bloomington, IN: Solution Tree Press.
Greenstein, L. (2012). Assessing 21st century skills: A guide to evaluating mastery and authentic learning. Thousand Oaks, CA: Corwin.
Kay, K., & Greenhill, V. (2013). The leader’s guide to 21st century education: 7 steps for schools and districts. Upper Saddle River, NJ: Pearson.
Lai, E. R., & Viering, M. (2012). Assessing 21st century skills: Integrating research findings. Retrieved from http://images.pearsonassessments.com/images/tmrs/Assessing_21st_Century_Skills_NCME.pdf
Marzano, R., & Heflebower, T. (2012). Teaching and assessing 21st century skills. Bloomington, IN: Marzano Research Laboratory.
McTighe, J., & Seif, E. (2010). An implementation framework to support 21st century skills. In J. Bellanca & R. Brandt (Eds.), 21st century skills: Rethinking how students learn (pp.149-174). Bloomington, IN: Solution Tree Press.
Pearlman, B. (2010). Designing new learning environments to support 21st century skills. In J. Bellanca & R. Brandt (Eds.), 21st century skills: Rethinking how students learn (pp.117-148). Bloomington, IN: Solution Tree Press.
Reeves, D. (2010). A framework for assessing 21st century skills. In J. Bellanca & R. Brandt (Eds.), 21st century skills: Rethinking how students learn (pp. 305-326). Bloomington, IN: Solution Tree Press.
Wiggins, G., & McTighe, J. (2004). Understanding by design professional development workbook. Alexandria, VA: Association for Supervision and Curriculum Development.
Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.
World Economic Forum. (2016, January). The future of jobs: Employment, skills, and workforce strategy for the fourth Industrial Revolution. Retrieved from http://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf
Disclaimer: The Assessment Work Group is committed to enabling a rich dialogue on key issues in the field and seeking out diverse perspectives. The views and opinions expressed in this blog are those of the authors and do not necessarily reflect the official policy or position of the Assessment Work Group, CASEL or any of the organizations involved with the work group.