Assemblyman Patrick O'Donnell previously taught middle and high school in Paramount Unified.

Changes are underway to fix flaws in tests designed to help teachers pinpoint student weaknesses before they take Common Core-aligned assessments each spring.

The tests, known as “interim assessments,” are similar to the end-of-the-year Smarter Balanced assessments that are used to measure the achievement and progress of students, as well as of their schools and districts, in math and English language arts. More than 3 million California students take the Smarter Balanced assessments each year.

Many teachers have given the optional interim tests to their students during the school year to gauge how they are doing, hoping to adjust what or how they teach in advance of the final assessments that are used to fulfill state and federal accountability requirements.

But a panel of three teachers and a school district administrator told the Assembly Education Committee at a hearing in Sacramento earlier this month that they couldn’t get a clear picture of students’ progress because the reports they received on how students did on the interim assessments lacked enough detail to be useful. Specifically, the reports didn’t include any of the questions on the interim tests, students’ responses or the specific standards they were tested on.

The reports are part of the California Assessment of Student Performance and Progress, or CAASPP – a statewide system launched three years ago to test students on new Common Core standards in math and English language arts, as well as science. California is part of a multi-state group called the Smarter Balanced Assessment Consortium, which developed the Common Core-aligned tests.

Come this fall, the reports that teachers receive on the interim assessments are expected to include questions, student answers, and information about the questions’ alignment to specific Common Core standards, said Tony Alpert, executive director of the Smarter Balanced Assessment Consortium.

“What took so long?” asked Assemblyman Patrick O’Donnell, D-Long Beach, chairman of the Education Committee, pointing out that some other organizations created tests and reports with the features teachers want much more quickly.

The state Legislature approved the new testing system nearly four years ago, stating that its intent was to provide tests “to improve teaching and learning.” But educators and community members told the committee April 5 that the interim tests and reports are not meeting this goal.

Frustrated by the tests’ drawbacks, many districts have spent significant amounts of their own money on other interim tests whose more detailed reports teachers find more helpful before their students take the end-of-the-year test, said Paula Heupel, assistant superintendent for educational services in the Merced City School District.

O’Donnell was especially concerned about school districts having to spend funds for these purposes, which wouldn’t have been necessary if the interim tests provided by the consortium had met teachers’ needs. He noted a “redundancy of effort that’s gone on all over the state” by districts seeking to come up with alternatives to the interim tests on their own.

A former teacher, O’Donnell called the hearing to gather information for his bill, AB 1035, which seeks to improve the interim tests and the reports sent to schools and districts on their students’ performance.

“This is something that should have been taken a look at much sooner,” he said.

Alpert said there were many possible reasons the issues raised by teachers had not been addressed earlier, noting that the members of the consortium needed time to reach consensus about what was most important.

Bill Lucia, president and CEO of the nonprofit education advocacy group EdVoice, urged the committee and Legislature to “get to the bottom of why the tools promised to teachers and kids” were still not available four years later.

O’Donnell plans to amend AB 1035 based on what he heard, in preparation for a committee hearing on the bill April 26, said Rick Pratt, chief consultant to the committee, in a phone interview. Pratt said that if the Education Committee approves the amended bill, it will move on to the Appropriations Committee.

California won’t pay any additional costs for the new test report features, since the consortium contracted directly with the vendor, said Peter Tira, spokesman for the state Department of Education, in an email. So far this year, California teachers have administered about 4.5 million interim assessments, he added.

The state has hired the Human Resources Research Organization, or HumRRO, to conduct two separate studies related to the assessments. In one, the group is surveying teachers about interim assessment hand-scoring workshops provided by the Educational Testing Service, or ETS, which administers the tests. In the other, it is surveying teachers about how useful they find the assessments.

Alpert said in a phone interview that California pays $9.95 million for its membership in the consortium, which gives it access to the end-of-the-year tests, interim tests and “formative resources,” including a Digital Library of online lesson plans. In addition, the state has entered into a multi-year, multi-million dollar contract with ETS for test administration and training.

The consortium had not widely released details about the improvements before the hearing because it wanted to finalize them first, Alpert said. Once Fairway Technologies, the firm that won the contract to revise the test reports, finalizes its plans, he said, the consortium will release more information so the state can train districts and teachers in how to use the new reports.

Comments (3)

  1. Joseph C Antone, 6 years ago

    To be fair, while the computer-scored portions tend to be all but useless, I have found the hand-scored portions to be somewhat informative. But nowhere near the analytics of what we used to have.

  2. Kristoffer Kohl, 6 years ago

    I’m curious about the headline for this article characterizing teachers’ efforts as ‘complaints’ rather than advocacy, testimony, etc. The article seems to indicate that teachers and district leaders are calling for improved assessments that better enable them to gauge what students know and transform their instruction accordingly.

    Why is that expressed as complaining? Seems to me that teachers are informing the discussion with classroom experience that is all too often neglected when policies are formulated and implemented by those who are far from the classroom.

    EdSource should be celebrating these teachers for speaking up on behalf of their students and colleagues. Unfortunately, this headline feeds the negative narrative about teachers.

  3. Doug McRae, 6 years ago

    The Smarter Balanced interim tests have been problematic from the beginning; the plan to have teachers provide the labor for scoring open-ended responses without compensation doomed the program from the get-go. I’ve asked for information from the California Department of Education and the State Board of Education on how many of the Smarter Balanced interim tests initiated over the past 3 school years [cited by the post as 4.5 million tests] have actually been scored to yield results for potential instructional purposes, but to date neither CDE nor Smarter Balanced has provided this simple operational information. It is likely that only a small percentage of the administered interim tests requiring human scoring have actually generated results for potential use, and the utility of those results is now being seriously questioned by teachers who have attempted to use them for instructional purposes.

    More important, however, is information that suggests the Smarter Balanced bank of test questions that supports the interim testing program has neither the depth nor breadth necessary for a decent interim testing effort. The cream of the test question item bank field-tested in 2014 for both the mandatory end-of-year summative tests and the non-mandatory interim tests had an acknowledged problem: there were not enough questions to adequately measure the low end of the achievement spectrum for underserved groups of students. This information was buried in the Smarter Balanced technical report submitted to the feds for peer review the summer of 2016, based on spring 2015 test administration.

    Smarter Balanced said it had a program to remedy this fatal flaw for any large scale statewide testing program, but to date neither Smarter Balanced nor CDE has released information whether that flaw in the initial operational Smarter Balanced tests has been addressed to any substantial degree. If the Smarter Balanced end-of-year summative tests have an inadequate item bank, then clearly the lower priority non-secure interim testing item bank will have the same problem.

    The whole notion of having a test publisher for a statewide accountability test also supplying instructional tests (the interims) has an unsavory “teaching to the test” aroma to it. Instructional tests can be valuable for teachers, when appropriately designed and produced with adequate banks of test questions to measure a full range of achievement. But, those tests are more appropriately produced by instructional materials publishers, and aligned with local district instructional efforts, than by an end-of-year statewide testing program vendor. In effect, the Smarter Balanced tests embraced by the CDE and the State Board in California are efforts to leverage the mandatory statewide state testing program to influence instructional strategies that should appropriately be left to local control.

    Rather than attempting to correct the Smarter Balanced interim test flaws per AB 1035, the legislature would be wise to flush this aspect of CA’s Smarter Balanced program from the statewide assessment budget, to allow local districts to implement their own interim tests more closely aligned to their own curriculum and instruction efforts. This would allow the statewide assessment program to focus on correcting the flaws that compromise the results now being generated by the Smarter Balanced end-of-year summative tests.