Building Assessments Based on a Competency Model and Learning Progressions

The CBAL science assessments are being developed based on the science competency model and its related provisional learning progressions (LPs). Below we elaborate several essential task features that support such alignment and present examples using a sample formative assessment prototype targeting the core idea of matter. This prototype is meant to measure and improve the teaching and learning of the matter core idea at Grade 6. In this formative assessment prototype, we are also interested in measuring students’ ability to engage in the practices of modeling and constructing scientific explanations.

We have used the provisional LP for matter to inform the design of our prototype task. In particular, we have used the LP to identify the mental model of matter that we would expect students to develop at early middle school, which also has implications for the kinds of phenomena that students can adequately explain and model. Our provisional LP suggests that students at the early middle school level should be developing a particle model (see Table 1). This model of matter posits the existence of tiny particles (although not necessarily atoms and molecules) that are in constant motion and that cannot be seen with the naked eye. In this model the type of particles, their motion, and their arrangement determine the macroscopic properties of the material that they make up. Using this particle model, students should be able to predict, explain, and model the behavior of solids, liquids, and gases. Without a particle model, students cannot adequately explain or predict the behavior of gases or phenomena that involve gases, such as evaporation, boiling, and condensation. For this reason, we have targeted phenomena that involve evaporation and condensation in our prototype task.
Many items in the exemplar formative assessment prototype are developed around the LP for the properties and structure of matter and the LP for changes in and conservation of matter. For example, some multiple-choice item options are aligned with different levels of the LPs to differentiate student understanding. In addition, we have developed items that require students to construct and describe graphic models of matter using a digital modeling tool (see Figure 9). We expect these items to elicit rich evidence of students’ initial and developing conceptions of matter addressed in our LP. These models can be scored for the presence of particles and for students’ conceptions of the nature, arrangement, and behavior of those particles. For instance, some students’ models may stay at Level 1 or 2 during the course of the task, reflecting only macroscopic conceptions of matter. Other students’ models may stay at, or move to, Level 3, including some particle representations along with partially incorrect ideas (e.g., a mix of macroscopic ideas and missing elements of particle behavior and arrangement). Still other students’ models may stay at, or move to, Level 4 or 5, representing and describing particle ideas with greater accuracy about the behavior and arrangement of particles.
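To make the level assignments above concrete, the logic of mapping scorable model features to provisional LP levels can be sketched as follows. This is a minimal illustrative sketch, not the operational CBAL rubric: the feature names (`has_particles`, `motion_correct`, `arrangement_correct`, `macro_residue`) and the cut-offs between levels are assumptions introduced here for exposition.

```python
def score_matter_model(has_particles: bool,
                       motion_correct: bool,
                       arrangement_correct: bool,
                       macro_residue: bool) -> int:
    """Map scorable features of a student's drawn model of matter to an
    illustrative LP level (feature set and cut-offs are hypothetical)."""
    if not has_particles:
        return 2          # Level 1-2 band: macroscopic conceptions only
    n_correct = sum([motion_correct, arrangement_correct])
    if macro_residue or n_correct == 0:
        return 3          # Level 3: particles present, plus partially incorrect ideas
    if n_correct == 1:
        return 4          # Level 4: particle ideas with mostly accurate detail
    return 5              # Level 5: accurate particle motion and arrangement
```

A model with particles drawn but lingering macroscopic ideas would land at Level 3 under this sketch, while a fully particulate and accurate model would land at Level 5.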
In addition, we have used our provisional LP for constructing scientific explanations to guide the design of the task. For example, our LP suggests that if students do not have sufficient knowledge about the structure of a good explanation, they may not be able to express what they know efficiently. Therefore, we provide a structure for students to enter the critical parts of a scientific explanation (i.e., their claim, evidence, and reasoning), helping them articulate their reasoning and thus externalize their mental models.
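The claim–evidence–reasoning scaffold described above amounts to collecting a structured response rather than free text. A minimal sketch of such a record, with field names assumed here for illustration (the actual prototype's response fields may differ), could look like:

```python
from dataclasses import dataclass


@dataclass
class ScientificExplanation:
    """Structured fields scaffolding the critical parts of a scientific
    explanation (field names are illustrative, not the prototype's schema)."""
    claim: str      # the student's answer to the driving question
    evidence: str   # observations or data the student cites in support
    reasoning: str  # scientific principles linking the evidence to the claim
```

Capturing the three components separately makes each one individually scorable, rather than requiring raters to parse them out of an unstructured paragraph.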
In addition, we will use the LP for constructing scientific explanations to guide the scoring of students’ written scientific explanations collected from the task. The LP will allow us to anticipate and categorize the quality of students’ explanations as we collect them and to assign them to particular levels of the LP. For example, we expect a number of students to state a claim in their written explanations but provide insufficient evidence (Level 3), or to provide inappropriate and insufficient scientific principles (Level 4).
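The level assignments just described can be sketched as a simple decision rule over the quality of each explanation component. This is an illustrative sketch under assumed inputs: the boolean component judgments and the mapping for the lower levels are hypothetical, while the Level 3 and Level 4 assignments follow the examples given above.

```python
def explanation_level(has_claim: bool,
                      sufficient_evidence: bool,
                      appropriate_reasoning: bool) -> int:
    """Map claim/evidence/reasoning quality judgments to an illustrative
    LP level (component flags and lower-level mapping are hypothetical)."""
    if not has_claim:
        return 2          # lower-level band: no identifiable claim
    if not sufficient_evidence:
        return 3          # Level 3: claim stated, but insufficient evidence
    if not appropriate_reasoning:
        return 4          # Level 4: inappropriate/insufficient scientific principles
    return 5              # complete claim-evidence-reasoning explanation
```

In practice, the component judgments themselves would come from human raters or automated scoring, with the LP supplying the shared level definitions.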

Other Design Principles of CBAL Science Assessments

In this section, we present several task design principles used in the exemplar formative assessment prototype. In addition, we elaborate on the importance of including these design principles in our task design by connecting them with existing research in the learning sciences.

We present the following task features in the exemplar formative assessment prototype: