CAASPP HSS Meeting

I was invited to participate in a stakeholder meeting on the California Assessment of Student Performance and Progress (CAASPP). The purpose of the meeting is to gather stakeholder input on a new History-Social Science content assessment, which will be aligned with the Common Core State Standards. The meeting is hosted by the California Department of Education (CDE) and the Educational Testing Service (ETS).

California should move toward an accountability system with locally designed components so that schools prepare students for regional colleges and employment sectors. Schools and districts can use state funds to create new assessments for college and career readiness. Testing vendors need to be more responsive to local needs, or the opt-out movement in standardized testing will grow. California’s Local Control and Accountability Plan (LCAP) requires districts to develop, adopt, and annually update, with parent and community involvement, a three-year accountability plan that identifies goals and measures progress across multiple indicators of both opportunities and outcomes. Local districts can add their own indicators to those required by the state.

Research shows that engaging teachers in jointly scoring student work improves student outcomes, builds professional norms, and deepens content knowledge. Likewise, engaging students in peer review requires them to compare their work against the standards and produces powerful learning experiences. Next-generation assessments should incorporate both practices, and further improvements in testing may come from artificial intelligence, computer adaptive testing, and what are colloquially known as “robo-graders” that provide instant feedback to students during the testing process.

Now that California assessments are moving beyond paper and pencil, the technology behind the testing systems needs to be updated as well. This year, my free fantasy football league added a new feature: each week it uses artificial intelligence to run if/then calculations on your individual team. If you had played Tom Brady instead of Peyton Manning, you would have come in first place instead of fifth. The computer reports on four or five of your key players, and you kick yourself for the rest of the week. Honestly, I was blown away by how light, conversational, and even funny the writing in this narrative was, because I knew it was all computer-generated. It was very user-friendly, much like the products that Spotlight Education develops.
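For what it’s worth, the if/then calculation behind that report is not exotic. Here is a minimal sketch in Python of a counterfactual lineup report; every player name, point total, and standing in it is hypothetical, invented only to illustrate the swap-and-rerank idea, and it is not the league’s actual code.

```python
# A minimal sketch of a counterfactual "what if" lineup report.
# All player names, point totals, and standings below are hypothetical.

weekly_points = {
    "Peyton Manning": 9.0,    # the player who was actually started
    "Tom Brady": 35.5,        # the player who was left on the bench
}

actual_total = 98.6                         # team total with Manning starting
other_teams = [121.0, 115.3, 104.8, 101.2]  # the rest of the league that week

def counterfactual_total(total, started, benched):
    """Recompute the team total as if the benched player had started."""
    return total - weekly_points[started] + weekly_points[benched]

def rank(total, field):
    """1-based finish if this team had scored `total` against `field`."""
    return sorted(field + [total], reverse=True).index(total) + 1

what_if = counterfactual_total(actual_total, "Peyton Manning", "Tom Brady")
print(f"Actual finish: {rank(actual_total, other_teams)}")            # 5
print(f"If you had started Tom Brady: {rank(what_if, other_teams)}")  # 1
```

The hard part is not the arithmetic; it is wrapping the result in the light, conversational narrative that made the report worth reading.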

Regardless of what the assessment experience is like, it’s going to produce some form of data, and that data should be understandable and actionable by families and educators. After pondering this, I started wondering why our educational leaders aren’t using this technology to make testing more interactive and interesting. Aren’t they talking to anyone in Silicon Valley? Don’t they realize that gamification offers powerful assessment technologies and engages students in deeper learning activities? If these new tests are going to be worth the effort, shouldn’t we make them cutting-edge and restore California’s reputation as an innovator?

The next-generation assessments in History/Social Studies should:

  1. Give students some choice about the topics they feel they know best (e.g., the Gold Rush, WWI, the Vietnam War).
  2. Provide a library of documents and a menu of different performance tasks like the CWRA or New Hampshire’s PACE system.
  3. Measure how well students respond to feedback from a computer adaptive engine.

For example, students construct a response arguing whether the eugenics movement was positive or negative for mankind. The computer “reads” their response and gives them a revision list; the students then improve their writing and resubmit. A teacher or subject-area expert may still be needed to assess the level of content knowledge in the student’s answer, but we are nearing the time when computers can assess a student’s historical reasoning. This type of test would measure not only what students know, but what they can do with what they know. It is similar to what happens in the workplace, where a piece of writing has to incorporate the diverse opinions of a team or committee.
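To make that revise-and-resubmit loop concrete, here is a minimal sketch that assumes a hypothetical feedback function; the rubric checks are invented stand-ins for whatever a real automated scoring engine (such as the ones Smarter Balanced has piloted) would actually use.

```python
# A minimal sketch of a revise-and-resubmit loop driven by automated feedback.
# The rubric checks are toy heuristics, not a real scoring engine.

def generate_revision_list(response: str) -> list[str]:
    """Return feedback items for a draft response to the eugenics prompt."""
    feedback = []
    if len(response.split()) < 150:
        feedback.append("Develop your argument further and cite at least two documents.")
    if "because" not in response.lower():
        feedback.append("Explain the reasoning behind your claim.")
    if "eugenics" not in response.lower():
        feedback.append("Tie your argument back to the eugenics movement itself.")
    return feedback

def feedback_loop(draft: str, revise, max_rounds: int = 3) -> str:
    """Alternate automated feedback with student revision until no items remain."""
    for _ in range(max_rounds):
        revision_list = generate_revision_list(draft)
        if not revision_list:
            break  # nothing left to flag; hand off to a teacher for content review
        draft = revise(draft, revision_list)  # the student revises and resubmits
    return draft
```

The point of the loop is the hand-off: the machine delivers quick, repeatable revision prompts, while the teacher or subject-area expert reviews the final draft for content and historical reasoning.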

Critics of this approach will demand evidence that students have learned all the standards. Surely measurement experts can design an approach that samples different populations within schools and assigns them different standards; otherwise we will get new tests that are no different from the 60-80 question fill-in-the-bubble tests that provide two to four questions per standard and cause testing fatigue in our students. Allowing students to select a portion of the content will create more engagement in the testing process. If technology can provide high-quality analysis and advice to fantasy football players for free, why can’t ETS provide similarly rich and rewarding experiences when testing California students? We need tests that teach students to reflect on their choices and make better decisions. Further, these tests need to be engaging (and dare I say fun?) enough that students look forward to the challenge instead of dreading it. We have the technology. We have the know-how. We just need the consensus. Please get involved with your local schools and provide input on their LCAP proposals. Demand more than a fill-in-the-bubble testing experience for your child.
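Returning to the sampling question above: matrix sampling is one well-established way to cover all the standards without asking every student to answer items on every one of them. The sketch below is a loose illustration under hypothetical numbers (20 standards, 100 students, 5 standards per student), not a psychometrically spiraled design.

```python
# A loose sketch of matrix sampling: each student answers items for only a
# handful of standards, but the rotation guarantees every standard is covered
# somewhere in the school. Standard labels and group sizes are hypothetical.

standards = [f"HSS-{i}" for i in range(1, 21)]        # 20 hypothetical standards
students = [f"student_{i}" for i in range(1, 101)]    # 100 hypothetical students

def assign_forms(students, standards, per_student=5):
    """Round-robin assignment: a simple stand-in for a properly spiraled design."""
    forms = {}
    for i, student in enumerate(students):
        start = (i * per_student) % len(standards)
        forms[student] = [standards[(start + j) % len(standards)]
                          for j in range(per_student)]
    return forms

forms = assign_forms(students, standards)
covered = {std for form in forms.values() for std in form}
print(f"Standards covered across the school: {len(covered)} of {len(standards)}")  # 20 of 20
```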

Sources

Darling-Hammond, L., Wilhoit, G., & Pittenger, L. (2013). Accountability for College and Career Readiness: Developing a New Paradigm. Stanford Center for Opportunity Policy in Education, Palo Alto, CA. Accessed March 15, 2015, at http://dx.doi.org/10.14507/epaa.v22n86.2014.

Leather, P. K., & Barry, V. M. (2014). New Hampshire Performance Assessment of Competency Education: An Accountability Pilot Proposal to the United States Department of Education. November 21, 2014. Concord, NH. Accessed March 12, 2015, at http://www.education.nh.gov/assessment-systems/pace.htm.

Smarter Balanced (2014). Smarter Balanced Pilot Automated Scoring Research Studies. Accessed at http://www.smarterbalanced.org/wordpress/wp-content/uploads/2014/09/Pilot-Test-Automated-Scoring-Research-Studies.pdf.
