Credit: Alison Yin for EdSource Today

As educators eagerly await the results of the new standardized assessments aligned with the Common Core standards that more than 3 million students took in the spring, state officials now say they plan to release the scores in early September, later than originally projected.

Parents can expect to start receiving their children’s scores about the same time.

As recently as last month, at the State Board of Education’s most recent meeting, California Department of Education officials anticipated that results of the Smarter Balanced assessments would be released to the public sometime in August.

Officials say that because this is the first time results on the new assessments will be released, they want to take extra care to make sure everything is accurate and complete before the official release in September. A date has yet to be announced.

“We are taking an abundance of caution to ensure that the substantial amount of data we are receiving is properly collected and placed in new files on a new site,” said department spokeswoman Pam Slater. “Additionally, the department will be launching a new website to display the results and needs sufficient time to test the new site.”

The Educational Testing Service, which administered the new assessments on behalf of the state, plans to start sending parent reports to districts in mid-August, Slater said. Districts then have 20 days to mail the reports to parents.

In California, the Smarter Balanced assessments are part of the California Assessment of Student Performance and Progress, or CAASPP, which replaces the old STAR program and its multiple-choice paper-and-pencil tests that students took each spring. The STAR results were usually unveiled in August.

One selling point of the new assessments, which are administered online, was that they could be scored more quickly than the old tests and would be available sooner to both parents and teachers.

“The tests are taken online, and results are available to teachers, schools and school districts much more quickly than results from previous tests,” the CDE website promised.

Another argument in favor of the Smarter Balanced assessments was that they could provide more information about a student’s academic abilities than the old California Standards Tests, and could be used to inform actual classroom instruction.

As the California Department of Education website explained, “the tests provide timely and actionable student information so that teachers and schools can adjust and improve teaching to ensure students have the knowledge and skills they need to succeed in school and beyond.” But this year, at least, many schools will open their doors before they receive final test scores, so teachers will likely have to postpone making use of the results for at least the first few weeks of the academic year.

Next year, however, school districts are likely to get individual student results much more quickly – three to six weeks after completing the math and English language arts assessments, officials say.

Although official scores aren’t out yet, districts are not entirely in the dark about how their students have done. Most already have received preliminary results. Since May, the California Department of Education has been uploading individual student results roughly four weeks after students finished taking the tests, as scoring of their responses is completed.

Those scores are considered preliminary because districts haven’t necessarily received scores for all of their students, and for technical and other reasons. The department has advised districts that “preliminary results should not be shared with the public.”

School districts vary in how they are dealing with the preliminary results they have already received from the state. Among six school districts that EdSource Today is tracking as they implement the Common Core, some have allowed only principals to see the preliminary results. Testing coordinators at other districts have said they will wait to distribute scores to principals and teachers until all of the scores are in.

In Visalia Unified, principals and teachers will start discussing their schools’ preliminary scores this week to make plans for the school year, which starts Thursday, said Phil Black, the district’s assessment coordinator.

As of Monday, Visalia Unified had received between 96 and 99 percent of its scores, depending on the grade level and subject. “The thing that’s an advantage this time around is, we have a good set of scores before school starts,” Black said.

But Santa Ana and Fresno Unified officials say they won’t share results with individual schools until they receive a complete set from the state.

Michele Cunha, Santa Ana’s coordinator of student achievement, research and evaluation, said she is printing out schools’ scores for district administrators, but the district will not share the information with schools until the official results are out. Santa Ana, where schools open on Sept. 1, has between 96 and 100 percent of its preliminary scores, depending on the grade and subject.

“We’re at the whim of (the California Department of Education) when they officially release them,” Cunha said.

Garden Grove Unified, where schools open Sept. 8, also plans to wait to give the scores to teachers, said John Marsh, the district’s testing administrator. For now, without state or county scores, it’s hard to explain what the scores mean because the district has nothing to compare them to.

“I think the biggest challenge is there is not any context to look at the score reports,” Marsh said.

San Jose Unified School District officials have only shared the preliminary results among the central staff. School started this week.

“While we eagerly await the results, we understand and appreciate the (California Department of Education’s) commitment to delivering a comprehensive and complete data set,” said Jorge Quintana, a district spokesman.

Catherine Foster, a spokeswoman for Aspire Public Schools, which has 35 California campuses, said that while school officials are grateful for the preliminary results, they were expecting to get their own results earlier. So far, Aspire has about 98 percent of its results. School resumed this week.

“(The) delays made it impossible for teachers to use this data to get a sense of their current students’ strengths and challenges on (Smarter Balanced Assessments) before school started or reflect on their performance from the prior year,” Foster said in an email.

 Staff writer John Fensterwald contributed to this report.

Sarah Tully covers the Common Core and early education.

 


Comments (34)

  1. Handy 8 years ago

    As educators eagerly await the results of the new standardized assessments aligned with the Common Core standards that more than 3 million students took in the spring, state officials now say they plan to release the scores in early September, later than originally projected. Use the Common Core.

  2. Manybugs 8 years ago

    This test is the first stage for privatizing the public school system. The goal is to eliminate all public school teaching positions. There are many private firms that are vying for this new opportunity. Many of these companies will employ one “master” teacher for each subject and grade level. These master teachers will record their daily lessons and these video recordings will be watched by the students (one teacher for thousands of students). The former classroom teachers will have the choice to keep their jobs, but only as classroom monitors and daily work facilitators (essentially, making sure the kids don’t burn the classroom down; glorified security guards). These sorry saps’ pay will also be greatly reduced to just above the federal minimum wage (from what I am hearing, around $10 to $12 per hour). This is the future of public education. Get ready for it.

  3. Don 8 years ago

    Navigio, the WAPO article states that opt-out data was published this year as opposed to the previous two, giving more precise data to analyze. The assumption is that a highly respected authority in the person of Carol Burris, a noted and respected principal and researcher, is not cooking the books. However, she was only putting a fine point on the 1-2 percentage achievement gains. That is to say, opt-outs likely contributed to a negligibly positive rather than negative aggregate statewide result, that result being horrendous either way. If your point is that there is a story in who opts out, I wouldn’t disagree. (Typically more knowledgeable and anti-reform parents will be the early adopters of efforts against corporate reforms like CCSS and the testing mania.) But the NYT graph shows non-participation across the board, though focused more in the middle and tending toward the lower FRPM. At the same time, it should be noted that New York has lower total FRPM totals than CA. I didn’t really look any deeper, because opt-out population dynamics was a side issue compared to the overall growing movement, which is an outgrowth of Common Core, testing mania and the corporate focus on statistics regardless of the veracity of the data, particularly as concerns teacher performance.

    Replies

    • navigio 8 years ago

      My point was to try to understand the reason for the different characterizations of the issue. Anyway, I took a look at the data and now I understand what’s going on.
      Essentially both characterizations are correct. Opt outs happened disproportionately in more affluent districts but even in those districts it was most often the lower performers who opted out. So the eventual impact on overall scores may be different than the impact on subgroup scores.
      Note that opt out numbers that allowed calculating prior year level performance, by definition, can’t include previous year’s opt outs. And without that info any guesses as to the impact of opt outs on scores can’t be known (why the affluence part may be relevant).
      Also note that in some districts opt outs were 100% special Ed students.

    • navigio 8 years ago

      “Department data show that students who did not take the 2015 Grades 3-8 ELA and Math Tests and did not have a recognized, valid reason for not doing so were more likely to be White, more likely to be from a low or average need district, and slightly more likely to have scored at Levels 1 or 2 in 2014. Students who did not take the test in 2015 and did not have a recognized, valid reason for doing so were [much] less likely to be economically disadvantaged and less likely to be an ELL.”
      – NYSED 2015 3-8 Test Result Press Release

      In other words, just as likely as it is for the opt-outs to cause an overall lower proficiency rate, so too is it likely that they are also to blame for not only the persistence of the achievement gap, but even a worsening of it–even in spite of equal or improved yoy results for black, hispanic, ell and sed. Whites had the most improvement of any significant subgroup, especially in math, likely due to the decision by many of their lower performers to not participate.

      Interestingly–or perhaps conspiratorially–there was negligible opt-out in ny charter schools. And assuming the serviced population is representative, charters would thus show an increased overall proficiency rate and a lower achievement gap due solely to traditional public school opt-outs. Man, how ironic is that? Especially given many opt-outs claimed to be doing so as a way to show solidarity with public school teachers.

      • Don 8 years ago

        Navigio, the excerpt you quoted said “slightly more likely to have scored levels 1 or 2 in 2014”, slightly being the operative word. Why is a slight statistical difference a rationale for why low-performing whites are dragging down the achievement gap? It is a matter of minor differences. There’s a lack of proportion in this argument, one that the NY DOE intentionally promoted by a press release that chose to report, in that instance, on an insignificance.

        You said, “Whites had the most improvement of any significant subgroup, especially in math, likely due to the decision by many of their lower performers to not participate.” I don’t think that is the reason. Higher-performing subgroups have fared better on the new tests across the nation. The slight difference in aggregate subgroup performance due to the opt-out differential created a commensurately slight increase in performance, but that is not the main reason why higher performers continue to outperform. These tests adversely impact ELLs, and underperformers in general. Moreover, teachers had 3 years to get up to speed on CCSS instruction, a fact which should make people notice that the test results are not just related to premature evaluation.

        Also, related to your comment about charter performance v. TPS:

        From Chalkbeat New York:

        “The city’s charter schools continued to outdo citywide proficiency rates in math, but lag in English. Proficiency rates at charter schools increased from 28 percent to 29.3 percent in English, and from 43.9 percent to 44.2 percent in math. Those rates have improved at a slower rate than the district-school averages.”

        Based on the above, your critique of Burris is off point. Whites outperform as a subgroup, as do Asians, but they outperform with or without the opt-out movement, and significantly so, whether or not these tests are an accurate measure of student achievement or teacher effectiveness. Overemphasized test culture is part of the problem.

        • navigio 8 years ago

          Wait, now you’re arguing that there weren’t enough level one and level two opt outs to affect the achievement gap but there were enough to affect the aggregate result? Seriously? Of course it’s not the reason they ‘continue to outperform’, but it is the reason they continue to outperform in relative terms (i.e., the achievement gap). She could have mentioned that and chose not to. The rest of her wording makes it clear why. Opinion rants are fun. They’re not ‘news’ though.
          The blurb from the NY DOE was not the focus of the release. In fact it was one of the last paragraphs in it. Burris felt it important enough to quote for the point of being able to criticize overall bad scores, but not important enough to use to explain something as critical as the achievement gap. Gee.
          The City of New York’s scores are something very different than the overall state’s, which is what these reports are talking about (they do separate out NYC, but it’s still different).
          Regardless, I don’t think Burris even mentioned charters. Not sure why you’d include that in your defense of her ‘information’.

          • Don 8 years ago

            Navigio, I’ll try it again. Burris simply stated the results were bad with a catchy headline. That the results were within the margin of error and that she pointed out how bad they were is not a flagrant obfuscation of fact. Given the way political entities spin the news, the slight positive related to the opt-out factor was a point worth mentioning. It was only by virtue of popular antipathy toward CCSS and the test, a phenomenon DeBlasio rejects, that he was able to point to a positive number, even if he didn’t mention the irony. Anyone who doesn’t know any better might take that number for good news, and Burris was not buying his spin along with millions of others. And the state Chancellor, what’s her name, looks ridiculous attempting to enforce federal law on an angry mob engaged in civil disobedience. Some local families want to opt out of the test, so all the families should lose their Title One? I don’t think that’s going to fly in federal court. You cannot stop the opt-out movement by force because it is a grassroots effort – a rejection of Common Core and high stakes testing. These people look like dictators and their antics only reinforce the narrative of brutalizing children as pawns in a bigger game.

            Anyway, Burris wasn’t providing news. She was giving her opinion on the mess that NY finds itself in. Obviously, a negligible increase in the third year of CCSS is not what was hoped for, particularly with the achievement gap widening. Now the test results appear less and less valid, and consequently the regimen by which teachers are evaluated is in question, the primary focus of teacher revolt. That is to say, the corporate reform agenda is on the ropes, and if opt outs grow for a couple more years the whole thing will become politically unviable. It’s a shame that students will have to take the brunt of it.

            • navigio 8 years ago

              Yeah, a shame that she places anti-test rhetoric above the real takeaway from the data. I agree the kids will bear the brunt of that. From an ‘expert’ no less.

  4. Don 8 years ago

    Just in this week – for the third year in a row NY posted dismal Common Core test scores and their achievement gap grew. Their opt outs quadrupled last year, and had they not done so the results would have been even worse, as most of the opt-outers were low performers. Here in CA we have the largest percentage of students in poverty in the nation. It is going to be extremely ugly when they find the black box and we find out what happened. I suspect CA will follow Washington state’s embarrassing example and will lower graduation requirements below “college and career ready”. It’s embarrassing to be an adult nowadays, but it beats being a public school student.

    Replies

    • navigio 8 years ago

      I keep hearing this claim that mostly low performers opted out, but the nyt did a piece a couple of months ago that not only showed a breakdown by district affluence in which the highest opt out rates were in districts with nfrm rates between 5% and 30%, but even that, with the exception of below 5% nfrm, opt out rate was inversely correlated with nfrm rate.
      The one thing that does seem curious is most of these stories indicate the meager sampling rate as an invalidator of results, then go right on to say what the results were anyway.

      • Don 8 years ago

        Navigio, I was using the information about opt-outs from Strauss’ WAPO Answer Sheet piece written by Carol Burris.

        http://www.washingtonpost.com/blogs/answer-sheet/wp/2015/08/13/n-y-common-core-test-scores-flop-yet-again/

        “Over 200,000 students opted out of the tests. Remarkably, opt outs helped fuel the small overall increases. If the 20 percent of potential test takers had opted in, the tiny increases in proficiency rates would have likely been smaller still. Opt outs were disproportionately students who had scored at levels 1 or 2 (below proficiency) during the prior year.”

        Can’t vouch for the accuracy, but Burris is no hack.

      • CarolineSF 8 years ago

        I’ve read all over the place that it was almost entirely low-poverty districts that had high opt-out rates in New York (which correlates closely with higher-achieving districts).

        • Don 8 years ago

          This year the opt-out numbers quadrupled and the official stats have been only recently released.

          From the NYT article Navigio linked earlier:

          “The State Education Department also noted that students who scored at Levels 1 and 2 last year were more likely to sit out this year than students who had scored at Levels 3 (which is considered passing) and 4, a sign that the increasing difficulty of the tests might have factored into some parents’ decisions.”

          I take that to mean that more than half the opt-outs were previous low performers. Much of the news until now was about previous years, Carolyn. But there’s no doubt about it, whatever the spread, the movement sprouted wings this year. I think it is true that it generated from the middle class and that some of the underprivileged have been reluctant to side-step what they view as an accountability mechanism that they believe gives them a voice. I personally don’t believe that’s true, but I don’t begrudge their caution either.

          I made a response earlier that I forgot to submit as a reply so it’s at the top.

          Carolyn, I would imagine you are sympathetic to the views of Dr. Carol Burris.

          • navigio 8 years ago

            Not that it matters much, but I don’t see how Burris’ piece could be taken to be anything other than an opinion piece. I’ll quote just the first line of her statement, introduced by Strauss as a person who is ‘the author of numerous articles, books and blog posts about the botched school reform efforts in her state.’

            “Once again, New York State Common Core test scores are a flop.”

            She goes on to point out the increase in white performance as a cause of the increased achievement gap, but conveniently leaves out the point that it was likely the opt-outs that caused that in the first place, even though it was laid out in black and white in the press release.

            And she finishes off with “…Elia can’t fix this mess alone. Elia needs a new course and a new boss.” And until that happens, no matter what the threats, opt outs will continue to grow.

            So I’ve heard of ‘new math’, but is this the ‘new news’?

            • Don 8 years ago

              I should add that while I don’t disagree that Burris’ piece was opinion, which by the way is what the Answer Sheet does, it is hard to argue the NY results were not a flop in their third year, unless one believes that margin-of-error variations toward the positive are a sign of success.

  5. Don 8 years ago

    It is possible that the reporting delay has more to do with public relations and politics than any caution regarding accuracy. If accuracy were at the top of the CDE’s concerns, it would have done a whole lot of things differently up to now, especially as concerns the establishment of cut scores. In fact, this sudden concern for accuracy seems entirely out of character. With the school year about to begin over the next three weeks and the resulting media spotlight on millions of students heading back to school, the projected bad news of poor test results, results that could possibly be even worse than expected, would be better delivered after the headlines of the new school year have faded. Imagine a bleak report on California student achievement coinciding with the smiling faces of students heading off to school. No, that doesn’t make good copy.

    Replies

    • navigio 8 years ago

      They can’t really control that. Some schools have already started. Others won’t start for a while. Unless they delay to October, the release will always align with someone’s school start.
      Plus looks like they just posted the cut scores. If they were still mucking with perceptions they wouldn’t have done that yet.
      The interesting thing will be what happens with EL results. Based on field data, those level three or higher (I won’t call it ‘proficient’ yet) scoring rates were around 7%. That’s lower than special education students. Interestingly, the field test demographics were quite unlike CA’s so the overall level three rates are probably going to be lower than those estimates as well.
      When it comes to technology related implementation, not much happens on time as it is.

      • Don 8 years ago

        When it comes to large government releases of information, it is standard practice to exercise discretion to minimize bad news. Given the lack of any rationale other than “to make sure everything is accurate and complete” (I mean, come on – do you really believe that line?) it would seem the CDE is vying for time. And to what end? Accuracy at this juncture has already passed. What could be done now to assure greater accuracy? All the data has been collected and crunched. Only the matter of exactly how to report it remains to be seen. When in doubt delay and pick the best moment for media exposure. If the news was great they would have a big press release in the next week to coincide with the back-to-school media hoopla.

  6. Peter 8 years ago

    ABC Unified School District has its results, while the other districts have no clue and do not care about the results.

  7. Doug McRae 8 years ago

    I wonder if the delayed release is due to the large number of “incomplete” tests as of the end of the testing windows (end of school years) for each district. A news release from CDE on June 15 indicated 3.2 million students had initiated at least one Smarter Balanced test by June 12 [districts representing about 99 percent of students in CA finished their school year by June 12], but only 2.7 million had completed E/LA tests and 2.8 million had completed Math tests. Of the 3.2 million students enrolled in grades 3-8 & 11, the June 15 news release indicates somewhere between 500,000 and 900,000 incomplete tests that needed to be processed . . . . some complete enough to generate scores, some only complete enough to generate the lowest obtainable score, indicating insufficient participation for a meaningful interaction with the test material. In any case, these incomplete tests amount to roughly 15 percent of the total number of tests administered, far more than the typical occurrence of perhaps 1 or 2 percent for statewide testing programs, and represent a considerable challenge for the vendor to resolve adequately by the August reporting deadlines.
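    The back-of-the-envelope arithmetic behind that “roughly 15 percent” can be sketched as follows. The counts come from the June 15 CDE news release cited above; the variable names and the exact way incompletes are tallied here are my own illustrative assumptions, not CDE’s accounting.

```python
# Figures from the June 15 CDE news release (per the comment above).
initiated = 3_200_000       # students who started at least one test by June 12
completed_ela = 2_700_000   # students with completed E/LA tests
completed_math = 2_800_000  # students with completed Math tests

incomplete_ela = initiated - completed_ela    # 500,000
incomplete_math = initiated - completed_math  # 400,000
incomplete_total = incomplete_ela + incomplete_math

# Two tests per student, so roughly 6.4 million test administrations in all.
total_administered = 2 * initiated
share = incomplete_total / total_administered

print(f"{incomplete_total:,} incomplete of {total_administered:,} "
      f"tests ({share:.1%})")
# 900,000 incomplete of 6,400,000 tests (14.1%)
```

That 14.1% is consistent with the “roughly 15 percent” figure, and an order of magnitude above the 1-2 percent typical of statewide programs.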

    The vendor (ETS) contract for scoring and reporting was approved by the SBE at their July 2014 meeting. It called for individual student paper reports no later than 8 weeks after the close of the testing window for each district, which for 99 percent of the students being tested was August 7. The contract also called for launch of the statewide reporting website on August 17. Clearly these deadlines are not being met. It is unknown whether ETS has been granted extensions for these deadlines by the CDE, or if so what the reasons and exact conditions for any extensions may be.

    Replies

    • navigio 8 years ago

      “There are inconsistencies in student information that require additional research before reporting.”
      – July 22 CAASPP Update Newsletter

      The CDE also posted the scale score ranges it will be using. They are almost identical to the graphs EdSource published with the story about performance thresholds. I don’t know whether that is intended to imply the results were in line with expectations.

      • Doug McRae 8 years ago

        Navigio — I took that line in the CDE 7/22 update to mean inconsistencies in demographic info rather than test score results. But, it is possible the actual test results could prompt consideration of changes in the scale score ranges recommended by the consortium — that is possible for member states, and in fact recently Washington state moved away from consortium recommended cut scores for their grade 11 results. However, I’ve heard nada from Sacto whether this is being considered behind closed doors . . . . .

    • Don 8 years ago

      Doug, thank you for this info. Can you tell me whether individual and aggregate scores will be released and by aggregate I’m thinking districtwide, schoolwide, grade-wide, subgroups etc. As scores are expected to be lower than with STAR, less aggregate score reporting could be used to hide unpopular results.

      • Doug McRae 8 years ago

        Don — My understanding is no individual student results will be released due to privacy considerations. For aggregate data, my understanding is all district, school, and subgroup aggregates will be released by grade, essentially the same as the statewide STAR results release from prior years.

        • Don 8 years ago

          Once again thanks for your expert account of the requirements of the vendor. Let me ask you this: We knew on June 12th that the number of incomplete scores was extremely high. Yet, only a few weeks ago the results were projected to be on time. What happened in the interim? Certainly ETS knew the protocols and difficulties inherent in presenting the data by late Aug. Did they underestimate the amount of time it would take? Has there been any clarification as to the exact nature of the delay above and beyond the expressed concern for accuracy?

          • Doug McRae 8 years ago

            Don — I’ve got no clue what may have transpired after June 12 between ETS and CDE folks on this topic before the CDE media ofc indicated last week that the results would not be released until September. I’m just speculating that the large volume of incomplete test records as of June 12 may be a contributing reason for the delay, with that speculation informed by experience of having dealt with statewide assessment scoring issues in a previous life . . . . .

      • navigio 8 years ago

        According to the file layout being used for score reporting the following will be possible:
        – Obviously grade, school (charter or no), district, county
        – Gender
        – Performance Task or Classroom Activity (two parts of ‘the test’)
        – If paper and pencil test
        – special circumstances: whether the student was absent, cheated, tested elsewhere, opted out, used an individualized aid, didn’t answer any or only a few questions, special ed status, embedded or non-embedded supports (ASL, Braille, CC, magnification, etc.), adult cheated, inappropriate test prep
        – Lang proficiency: EL level (5 levels), migrant, LEP, Lang status (EL, IFEP, RFEP, EO (sign language)), dates entered and exited (can be used to calc LTEL, though no explicit LTEL classification)
        – Spec ed: SED, sp-ed type, NPT
        – Ethnicity: Hisp, Native Am, Asian, Pac Isl, Fili, Black, White, 2 or more, PEL (same levels as before)
        I don’t see foster or homeless, even though there was some discussion about reporting all of these somewhere.
        Obviously the state may or may not make all of this public, but it will have access to this level of detail. Most of this is equivalent to what was available in most recent STAR reports.
        As far as school level detail, it’s not possible to know from this data file how that might be restricted for subgroups like it was in STAR, but I’m guessing that might be part of the contract language.
        And I’m guessing if the results are too chaotic or troubling the state might not do full disaggregation in the first year.
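        The kind of subgroup disaggregation the list above allows, including STAR-style suppression of small subgroups, could be sketched roughly like this. The field names, the achievement levels, and the size-11 suppression cutoff are my assumptions for illustration, not the actual CAASPP file specification or contract terms.

```python
from collections import defaultdict

MIN_GROUP_SIZE = 11  # STAR-style privacy suppression cutoff (assumed)

def subgroup_rates(records, field):
    """Percent scoring at achievement level 3 or above, per value of
    `field`, suppressing (None) any group too small to report."""
    totals = defaultdict(int)
    met = defaultdict(int)
    for r in records:
        totals[r[field]] += 1
        if r["achievement_level"] >= 3:
            met[r[field]] += 1
    return {g: (round(100 * met[g] / n) if n >= MIN_GROUP_SIZE else None)
            for g, n in totals.items()}

# Toy data: 11 EL students at level 2, 11 EO students at level 3.
records = ([{"lang_status": "EL", "achievement_level": 2}] * 11 +
           [{"lang_status": "EO", "achievement_level": 3}] * 11)
print(subgroup_rates(records, "lang_status"))
# {'EL': 0, 'EO': 100}
```

A group of fewer than 11 students would come back as None, i.e. suppressed from public reporting, which is the restriction speculated about above.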

    • ann 8 years ago

      Stay with us, Doug. We need an honest interpreter of the bureaucratic line.

    • Don 8 years ago

      Doug, when you say “these incomplete tests amount to roughly 15 percent of the total number of tests administered … and represent a considerable challenge for the vendor to resolve in an adequate manner by the August reporting deadlines,” you seem to be alluding to the idea that the delay is intended to provide more time to figure out how to make incomplete test data more reliable. Can you see any verifiable, scientifically sound method to skirt the lack of reliable raw data?

      • Doug McRae 8 years ago

        Don — A complete set of scoring rules for any large-scale test includes nitty-gritty fine print: a rule for what constitutes “participation,” how many tests had sufficient participation to generate a score (even if it is the lowest obtainable score), how many items have to be attempted to yield a score above the lowest obtainable score, and other rules for disqualifying a test record for invalid response patterns [such as all answers the same, multiple-choice answers constituting irrelevant or even vulgar patterns, random responses where words are required, etc.].

        Ideally all of these rules have been identified in advance, and in fact many if not most of them have to be consistent with the rules used for the field test if one wants to use the threshold scores for the different reporting categories generated from field test data and/or the measurement scales generated from the field test, or to compare the operational results to the projected percents for each reporting category that Smarter Balanced released based on field test data. In other words, there are some messy nitty-gritty things that have to be attended to when dealing with incomplete tests, or even some tests with complete records. All this is part of the scoring vendor’s task to complete by the reporting deadlines in the contract.

        The testing vendor is not unlike a building contractor erecting an $80 million building each year: lots of nitty-gritty stuff hidden behind the walls that has to be done correctly. If there are 10 times more incomplete test records at the end of the testing window than the contract estimated, that may well require adjustments to both the deadlines and the costs associated with the contract . . . . just like time extensions or cost revisions for an $80 million building project.
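        A toy version of such scoring rules might look like the sketch below. The thresholds, the rule names, and the single invalid-pattern check are hypothetical stand-ins for the real contract's fine print, which is not public.

```python
# Hypothetical participation and validity thresholds (assumptions, not ETS's).
MIN_ITEMS_FOR_PARTICIPATION = 1   # attempted anything at all
MIN_ITEMS_FOR_SCORE = 10          # enough to score above the floor

def classify_record(items_attempted, responses):
    """Decide how a test record would be handled under rules like these."""
    if items_attempted < MIN_ITEMS_FOR_PARTICIPATION:
        return "non-participant"
    # One example of an invalid response pattern: every answer identical.
    if responses and len(set(responses)) == 1:
        return "invalidated"
    if items_attempted < MIN_ITEMS_FOR_SCORE:
        return "lowest obtainable score"
    return "scorable"

print(classify_record(0, []))                         # non-participant
print(classify_record(25, ["A"] * 25))                # invalidated
print(classify_record(5, ["A", "B", "C", "D", "A"]))  # lowest obtainable score
print(classify_record(40, list("ABCD") * 10))         # scorable
```

The point of the sketch is only that each incomplete record has to fall through rules like these before anything can be reported, which is why an unusually large pile of them slows the whole pipeline down.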

  8. FloydThursby1941 8 years ago

    Usually those who delay release are under pressure from people like Gary to never release it at all. I will feel lucky if I get it at all.

  9. Roxana Marachi 8 years ago

    What may have been a well-intentioned effort to develop a modern, state-of-the-art assessment has unfortunately resulted in a deeply flawed product that fails to meet even the most basic standards of testing and accountability. Students, parents, taxpayers, board members, and education policymakers should consider asking the following questions about the new computerized assessments and soon-to-be-released scores: http://eduresearcher.com/2015/07/06/critical-questions-computerized-testing-sbac/