Committee Liaisons & Representatives: DAS Advisory Committee / February 2016
Type: Open Committee
Focus Areas: High Quality Student Evaluations
Upcoming dates or events to share with general membership
March 1, 2016
MDE Updates: Vanessa Keesler, Andy Middlestead, Pat King
Made a lot of changes for this testing year
Cut testing time in all grades
Computer adaptive in ELA and math grades 3-8
won’t greatly impact testing time
blueprint is the same; same number of items, just changes the content
based on content standards of the current grade level, not above or below grade level
deviated from SBA and PARCC by eliminating the performance task; concept is the same; algorithm is the same
Went against “expertise” in this reduction of time (state is not pretending to be the experts)
Admin. footprint should be less
Data files and reports back to districts prior to start of next school year
Hearing that the field doesn’t know how to analyze or interpret the data
Investing efforts into parent outreach and communicating what the data mean; messaging around the data is an area of improvement
Teacher evaluations will have to use state measure in 2018-2019 (not new info?)
Changes in reports for upcoming year
Claim score reporting: know that this is not great, looking for ways to improve this and make it more clear (fewer items; “giant” error range)
Psychometrics and standard deviations are being reviewed
Want to put out more data in the data files
Preliminary looks at where students may score; not publishing reports on this data though
48-hour data turnaround (hand-scored items will obviously take longer)
Having difficulty in labeling parts of the reports: red X, checkmarks, etc.
Claims: possibly 3 “buckets”; preliminary scaled scores difficult to label (use for short-cycle instructional decisions per Vanessa)
More detail potentially in the “yellow” as to where the students actually score; near benchmark, middle, etc.
Discussion around who the reports are tailored to; Wendy talked about other assessments that report “on grade level”, “near grade level”
Not a normal distribution, no bell curve, flat distribution across the board
Median SGP (student growth percentile)? What is the appropriate range?
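For context on the median-SGP question: a group’s median SGP is simply the median of its students’ individual growth percentiles. A minimal sketch, using made-up percentile values (not state data):

```python
from statistics import median

# Hypothetical student growth percentiles (1-99); values are illustrative only.
sgps = [12, 33, 35, 44, 50, 55, 61, 67, 78, 90]

# The group's median SGP: half the students grew faster than this percentile,
# half grew slower.
print(median(sgps))  # → 52.5
```

Because SGPs are percentiles, a group near the statewide norm should have a median near 50, which is one way to frame the “appropriate range” question.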
Don’t have to have the same test (this year’s data goes back to the last MEAP)
Vanessa wants to shift SGPs to ed. evaluation timelines; more effort into communication about these starting next year
Set their benchmark scores nationwide; MDE does not want to set Michigan-specific scores, wants to use national benchmarks; not using cut scores unique to MI
Rationale: simplicity, statistical methods
Norms will be re-calculated after period of time given MI data
Not looking at proficiency, rather college readiness
Accountability: don’t have to use proficiency; looking at alternative reporting options (quadrant method: growth and status scaled score vs. proficient); not clear right now, no decision has been made at this point
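The “quadrant method” mentioned above can be sketched as a two-way classification on growth and status. The thresholds, labels, and function below are illustrative assumptions, not MDE policy:

```python
# Hypothetical sketch of a quadrant classification: growth (median SGP)
# crossed with status (mean scaled score vs. a proficiency cut).
# All cut values here are placeholders for illustration.

def quadrant(median_sgp: float, mean_scale: float,
             sgp_cut: float = 50.0, score_cut: float = 2500.0) -> str:
    high_growth = median_sgp >= sgp_cut    # growing faster than the norm
    high_status = mean_scale >= score_cut  # at/above the proficiency cut
    if high_growth and high_status:
        return "high growth / high status"
    if high_growth:
        return "high growth / low status"
    if high_status:
        return "low growth / high status"
    return "low growth / low status"

# A school growing quickly but still below the status cut:
print(quadrant(62.0, 2430.0))  # → high growth / low status
```

The appeal of this framing is that a low-status school with strong growth lands in a different quadrant than a low-status school with weak growth, which a pure proficiency rate cannot distinguish.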
this year is a “wait and see” year due to the test being new
doing away with top to bottom ranking eventually
When discussing sanctions, the measurement quality begins to become a larger conversation
Discussion surrounds different ways to report data to parents vs. the districts to promote better understanding and “buy-in” from a community perspective
Stick with what was decided under ESEA for accountability for now?
No teeth around ESSA until the 2017-18 school year
Implement something transitional for next year?
Not sure what will happen with the lowest 5% at this point
TS Gold 3-year pilot; can we request that much testing time with that tool? Not advisable at this point
Talk about an entry status from preschool; stakeholder group will be devised; shift funding to some
Nobody is required to use TS Gold measure
K-2 benchmarks: need statewide comparable data sooner than 3rd grade
MDE wants feedback on whether to require the evaluation or make it optional at this point
Do we need comparable data on every student? State is examining this question currently
Leave the tool open and let people choose to use the tool and only report on those who have been assessed?
State is ready to go if the direction is to implement the assessment; standing rec. is to make it optional and let districts choose
Reimbursements for K-3: some have purchased tools and some have used the state tool; no answer at this point
Sample items have been installed on DRC; schools can have students practice if they choose
Moving toward continuing the use of the Essential Elements and building a state test aligned with the EEs and the range of complexities
This is somewhat of a “field test” and will be less of a high stakes year as people become comfortable with it
Comments: range of complexity does not align with learning progressions of students with CIs; overly simplistic
Consider public comments for next year’s assessment
Asking for the possibility to test across levels or between levels: assess students where they are
Guidance on test selection
Bring in coalitions/interest groups to discuss assessment/accountability
Feds’ guidance was that they are aiming for November, but there could be a 6-month hold after the elections
MDE will look at law and then make decisions from there; potentially moving forward without guidance from the feds