Students take Smarter Balanced practice tests at Bayshore Elementary School in Daly City. (Laurie Udesky/EdSource Today)

Although parents were originally supposed to receive their children’s scores on the new Smarter Balanced tests over the summer, most school districts received the reports to send to parents much later than anticipated.

Some educators say they are frustrated that parents did not receive the reports in time to discuss them at back-to-school and other beginning-of-the-year events.

Pam Slater, spokeswoman for the California Department of Education, said the delays were due in part to a desire to ship all the reports at once, instead of piecemeal.

“Additionally, the deployment of the new and complex reporting system required quality control measures be added that resulted in the delay of the reports,” she said. On Oct. 2, Slater said all reports had been shipped to districts.

Districts had received the scores electronically earlier, but not the more detailed printed reports, which explain to parents what the scores mean. The reports were prepared and mailed to districts by the Educational Testing Service, which administered the new Common Core-aligned tests. Once districts receive the reports, they are responsible for mailing them to parents.

Districts must send the reports to parents within 20 days of receiving them and were initially expected to send them “no later than the first 20 working days” of the next school year, according to the state’s education code.

The delay frustrated some district officials, who say it hampered their ability to discuss the scores with parents. Some districts, such as Garden Grove Unified in Orange County – which received its reports just last week – are preparing their own letters to parents explaining the results and planning special parent meetings to discuss the scores.

“The report the state has provided for parents is not the easiest to comprehend,” said Garden Grove Superintendent Gabriela Mafi.

Statewide, average scores were lower than parents were used to seeing on the previous paper-and-pencil California Standards Tests in math and English language arts, which schools administered for about 15 years, through the spring of 2013. Thirty-three percent of students in grades 3-8 and 11 met or exceeded standards in math, as defined by the Smarter Balanced consortium that drew up the tests, and 44 percent met or exceeded English language arts standards on the new tests.

The Smarter Balanced tests – which are part of the California Assessment of Student Performance and Progress, or CAASPP – were administered online, assessed critical thinking and problem-solving skills, and required more writing than previous tests.

Because the tests are new and the results are presented in unfamiliar formats, some schools in the Aspire charter school network have made presentations to parents about how to interpret the scores, said Elise Darwish, Aspire’s chief academic officer.

Aspire’s letter to parents stresses that the tests assessed students on new, more rigorous academic standards — and that the scores reflected those higher standards.

“If your student receives low scores, it does not mean that students have fallen behind, learned less or will be held back from moving to the next grade,” the letter from Aspire Bay Area Superintendent Kimi Kean states. “It means that we raised expectations for our scholars to ensure that (they) are more prepared for college.”

Like those in Aspire schools, average student scores in Visalia Unified in the Central Valley were lower than statewide averages. The district received its student reports earlier this week and plans to send them out by Monday, said Superintendent Craig Wheaton.

It also plans to train its teachers to discuss the results during parent-teacher conferences in November.

“I was surprised they’ve taken so long to get here,” Wheaton said last month. “I’m sure that teachers will need to be prepared to give information and answer questions, when you start talking about, ‘How’s my child doing?’ I can just imagine some parents saying, ‘What’s this? My child’s not at grade level?’ We’re going to have to educate our educators about this.”

The reports include overall scores in English language arts/literacy and math that fall into four achievement levels: standard exceeded, standard met, standard nearly met or standard not met. They also show achievement levels in subcategories such as reading, writing, listening and research/inquiry in English, and communicating reasoning, concepts and procedures, and problem solving and modeling/data analysis in math.

But the subcategories list only three achievement levels – above standard, at or near standard and below standard – and they don’t include any scores or show how students compared with others statewide.

Mafi said the “at or near standard” level is so broad that it’s difficult to interpret.

“As a parent, it’s very challenging,” she said. “We’re doing a cover letter to our parents to explain that this is a new test. We don’t know a lot about it yet. In the interim, we’re still doing classroom assessments and district-wide assessments.”

In addition to the student score reports in English, the state is providing districts with a two-page guide to the scores that is available in English, Spanish and nine other languages, including two versions of Chinese. Sample letters in English and Spanish that could be sent out with the reports are also available.

However, these documents merely explain the contents of the report, without going into details about the tests themselves. To give parents a better idea of the kinds of questions their children were asked, the state Department of Education teamed up with the California PTA to create parent guides in English that explain elementary, middle and high school results.

The GreatSchools organization has also created an interactive online tool, the GreatKids California State Test Guide for Parents at CaliforniaTestGuide.org, that explains the Common Core standards assessed on the tests for grades 3-8, available in English or Spanish. Bill Jackson, chief executive officer of GreatSchools, said his organization created guides for every state administering Common Core-aligned tests to help parents understand what the scores mean and how they can help their children improve.

“You can see the kinds of questions the test used to assess your kid,” he said, adding that parents can help build their children’s skills and knowledge in everyday interactions with them.

For example, if children didn’t meet 8th-grade reading standards, the guide advises parents to have discussions with them about books, films or magazines and to work on building their academic vocabularies. It also recommends talking with teachers about students’ skills to identify strengths and weaknesses and asking how to help.

But in some districts, such as Fresno Unified, parents may not receive their reports in time to discuss them at parent-teacher conferences. The district’s first quarter ends Oct. 9, and elementary parent-teacher conferences will take place Oct. 12-23.

Fresno Unified spokesman Jedidiah Chernabaeff said the district plans to send the reports out sometime this month, but he could not confirm whether parents would receive them before the conferences. Students in the district scored well below the state average, with fewer than 30 percent meeting or exceeding standards in math and English language arts.

Superintendent Michael Hanson said last month that the state’s delay in sending out the reports hindered the district’s ability to effectively explain scores to parents before their children started the new school year.

“The primary use should be so we can help a student understand where they are and to help a teacher understand that,” Hanson said, after the state released district and school results Sept. 9. “For us not to be able to communicate that – that’s a problem.”

 


Comments


  1. Gary Ravani 8 years ago

    A few points:

    The term “educator” casts a rather broad net here. It would have been interesting, and perhaps more to the point, if actual teachers or teachers’ representatives had been quoted to find out if there is actual frustration in the classrooms about delayed test scores. The perspective from the classroom is typically different from those at the 10K feet level of the district administration buildings. I would suggest the classroom view is more relevant too.

    Let’s also recall that the term “proficient,” along with the other test categorization terminology, was borrowed from the NAEP. The NAEP has some positive aspects to it, but the categories were roundly criticized by just about every professional body that looked at them, including the Government Accountability Office (for statistical reasons). They were very arbitrary and seemed intended to give a picture of lower school achievement than is the reality. Scholars have looked at those categories and found that, when cross-referenced to international test scores, no country on Earth would have a majority of its students ranking at “proficient” (except maybe France, as I recall) on the NAEP or equivalent.

    Now, this was all known when the SBE of the time (with Reed Hastings in charge?) adopted the NAEP categories for use with the API and AYP. CDE staff informed the SBE that this would result, eventually, in the majority of schools in CA being labeled, and unfairly labeled, as underachieving (which came to be known as Program Improvement) if the SBE went ahead with this plan. Needless to say, the SBE did go ahead, and with great enthusiasm as I recall. (The majority of schools labeled as failures? Full speed ahead!)

    I see the usual folks continue to be sour on the idea of trying to improve the assessment system so that it no longer gives a false picture of CA’s schools. Better good information late than bad information delivered on schedule.

    We should also recall, historically, that “grade level” is something of a contrivance. It is either an “average” level of performance on some test or an arbitrarily agreed-upon “cut score.” If the former, then mathematically half the students will be above the average and half below. Plenty of ammunition there for the usual suspects and school critics. If the latter, then you get to the political machinations of that SBE of olden times, where the schools are set up by policy makers for failure. Of course, we can all yearn for the schools of mythical Lake Woe-B-Gone, where all the kids are “above average.” Hey! Isn’t that what the school critics are demanding now?

  2. Roxana Marachi 8 years ago

    Dr. Pedro Noguera describes the situation best… “We’ve created an accountability system that holds those with the most power the least accountable.”

    For readers interested in additional research and critical questions about the experimental computerized assessments, please read the following Open Letter to the CA State Board of Education on Release of [False] SBAC Scores: http://eduresearcher.com/2015/09/08/openletter/ and the accompanying 10 Critical Questions.

    It is abundantly evident that there have been serious breaches of contracts. What has been promised for hundreds of millions of dollars has clearly not been delivered. More important than the delay of (false) “score reports,” however, would be an investigation into the original Race to the Top grant proposal to document point-by-point the egregious failures of the new assessments to meet stated industry standards for quality. Has EdSource investigated why the State Board has failed to respond to the 30+ page SBAC invalidation report published by SR Education that documents egregious errors and technological barriers in the new assessments? Or might the shared funding sources that support/promote both SBAC and EdSource silence such an inquiry?

  3. Doug McRae 8 years ago

    This post does not adequately address the reason why the parent reports were delivered late.

    The reasons for the late delivery given by CDE – that is, a desire to ship all reports at the same time, and/or the need for quality control measures – contradict both statute for the CAASPP program and the vendor (ETS) contract approved by the State Board. Statute calls for parent reports to be shipped to districts within 8 weeks of completion of testing for each district, i.e., on a “rolling basis,” not all at once. The contract calls for the same, and the contract clearly indicates it is ETS’s responsibility to provide complete and accurate scores (e.g., to have quality control measures in place in time) within the 8-week requirement.

    The vast majority of districts, schools, and students (99+% of students) completed the 2014-15 school year by June 12 (same as the end of the CAASPP testing window), which translates to a requirement to ship parent reports to districts no later than August 7. ETS missed the August 7 deadline entirely, and began shipping parent reports to districts the week of August 17 [on a rolling basis when an entire district’s scores were determined to be complete and accurate] with final shipments to be made by the end of September, according to previous CDE information published by EdSource.

    Why was ETS not ready to ship all parent reports within the 8-week statutory and contractual deadline? K-12 testing vendors in past years in many states have been fined heavily for missing score report distribution deadlines, and have even had to reimburse local districts for expenses caused by delays in return of results. CA statewide assessment vendors have also been docked substantial amounts for providing late reports [Harcourt in 1999, ETS in the 2006-08 time period as I recall]. Did ETS fail to perform? Or did the CDE/SSPI/SBE give ETS a “pass” on this contractual obligation this year? The vendor finances aspect of these delays, however, is relatively minor compared to the frustrations endured by the end users of the testing information – the “hampering” and “hindering” of local district ability to effectively explain scores to parents before kids started school in August or early September. The CDE needs to be transparent in explaining why the parent reports were shipped late by the vendor, and the parties responsible for the delays need to be held accountable for their performance. Anything less would be a failure to be effective stewards of CA taxpayer dollars and a lack of respect for the information and services needed by local districts to adequately play their role implementing the CAASPP statewide assessment program.

    Replies

    • Don 8 years ago

      Doug, you hit the nail on the head, once again. Now we have to pin the tail on the donkey.

    • Theresa Harrington 8 years ago

      Doug, after asking the CA Dept. of Education about the delays repeatedly for more than a week, I received the reasons stated in this response. When I tried to go straight to ETS, they referred me back to the CDE. I was told that all reports had been shipped to all but nine districts by Sept. 25. The nine were: Antioch Unified, IvyTech Charter, Oak Grove Elementary School District, one.Charter, Patterson Joint Unified, Southern Kern Unified, Upland Unified, Five Keys Charter (SF Sheriff), and San Jose Conservation Corps Charter.

      • Doug McRae 8 years ago

        Theresa — Yeah, there is no question the CDE media office has not been forthcoming with a full explanation for the delay in delivery of paper parent results. I do not doubt the explanation that it took longer to get “complete and accurate” results for each district; the question not addressed is “Why?” For the 45 years I’ve been in the K-12 testing arena, the industry standard time for scoring has been 14-21 calendar days for machine-scored tests, and 4-6 weeks for human-scored tests, with penalties for failure to meet these timelines. Both statute (approved Oct 2013) and the ETS contract (approved July 2014 by the State Board) allow for a generous 8 weeks. Actual deliveries have been 10 to 16 weeks. Districts and parents and the public deserve an honest and complete answer to the question “Why so late?”

  4. Manuel 8 years ago

    Can someone more informed than me show me where in the Ed Code or the SBAC information it categorically states that SBAC tests can determine whether a student is “on grade level,” as Visalia Superintendent Craig Wheaton implies?

    Thank you in advance.

    Replies

    • Doug McRae 8 years ago

      Manuel — The folks who develop K-12 large-scale tests have never promised scores that reflect the description “on grade level,” but educators and parents and others have frequently used that terminology to reflect either an average score obtained by a student at a given grade level (for older norm-referenced tests) or the agreed-upon target score (for newer standards-based tests). Thus, scores of “proficient” for CA’s prior STAR CSTs and now scores of “standard met” for the new Smarter Balanced tests are frequently described as “on grade level” by teachers and administrators and parents, but test makers and official language (e.g., statutes) typically do not use this informal description. The terminology “on grade level” does not have a definition in the context of large-scale tests, to my knowledge.

    • Don 8 years ago

      Manuel and Doug, you’ve probably noticed that the breakdown “Areas” are categorized as “above standard,” “at or near standard” and “below standard.” If grade-level status is implied by “met standard,” how then would teachers interpret the Area information, which commingles both “met” and “near” standard? That’s a problem.

      • Theresa Harrington 8 years ago

        Don, That’s what Garden Grove Supt. Gabriela Mafi was pointing out.

        This also came up at the last state Board of Education meeting, where trustees were raising the same questions. The executive director of Smarter Balanced said that was the first time he had heard that this was confusing and he would work to make it more clear in the future.

      • Doug McRae 8 years ago

        Don — Yes, it is a problem, and it has been pointed out before at the State Board meetings (in March and May, when mock-ups of Parent Reports were discussed) that the word “standard” has two different meanings for those two sections of the Parent Report, and that using the same word to mean two different things would be confusing to the end users.