Educational assessment articles and books across this country in the last five years have referenced hogs, chickens, and cows. The question has been this: if you want a healthier animal, do you weigh it more often, or do you improve the quality of the feed? For the last decade it has been continually suggested that improving our students’ achievement requires breaking the pattern of being data-rich but information-poor.
This has been the case in my district over the last few years. We were using an outside vendor to track student progress, and we were doing a great job at weighing students. However, the assessments were simply being given; they consumed valuable instructional time while we did nothing with the data. Add in a number of parents who refused to have their students take these assessments, and we had an assessment system that wasn’t working for anyone!
As Director of Educational Services, I made it my number one priority this past year to build capacity in my staff to convert data into information: the tools teachers need to probe for causes when students underperform or exceed expectations, to analyze the conditions that contribute to trends in student achievement, and to develop intervention and enrichment strategies that support those analyses.
Given the educational shifts required by our state’s academic content standards, I spent considerable time working with my teachers to develop assessments built on evidence-centered design (see the embedded image for more information on this type of design). Evidence-centered design begins with the inferences we want to make about student learning, connected to standards, and follows with a collection of evidence (i.e., an assessment) that shows how we know students are making progress toward doing what we claim they can do.
I knew that if I wanted buy-in from staff on using data to drive instruction, we had to create our own assessments. From my perspective, this meant we either had to learn how to write good questions quickly or find them in vetted resources. Using resources from Achieve the Core and Illustrative Mathematics, and doing some work with assessment blueprinting, we created assessments on Edcite that my principals and I felt were worthy of kids’ time and that would also provide us with valuable information with which to adjust instruction.
Edcite’s new platform, Edcite Schools, fit our needs: it was cost-effective, it generated reports in a number of ways (standards-based, classroom-based, student-based, etc.), and it let our teachers provide feedback to students through the electronic platform. The system allowed us to search question banks that we vetted using an assessment vetting tool from Achieve the Core. It had the added advantage that it could be customized to give students experience in a viewer very similar to our state’s assessment system, AIR.
After we administered our first assessment, we met in grade-level teams to analyze results. Using the reports in Edcite Schools and following a data protocol, in which we set norms and then focused on what students could do, where they were struggling, and trends across assessments, we were able to have professional conversations about how we were going to improve our instruction. The quality of the feed improved, and we saw growth among students from assessment to assessment.
Fast-forward to June, which in my state is every curriculum director’s nightmare: the release of state achievement data. I learned that the work we did with Edcite Schools was actually predictive. When I compared our internal Edcite Schools data to our state achievement data, I found that, with 99% predictive accuracy, I had been able to determine which students were in danger of not meeting grade-level benchmarks. We are now data rich.
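That comparison is simple arithmetic: for each student, check whether our internal flag (at risk / on track) agreed with the state result, then divide the number of agreements by the number of students. A minimal sketch of the idea, using invented student IDs and made-up flags rather than any real district data:

```python
# Hypothetical sketch: comparing local benchmark flags to state results.
# All IDs and values are invented for illustration only.

# 1 = flagged internally as at risk of missing the grade-level benchmark
local_flags = {"s01": 1, "s02": 0, "s03": 1, "s04": 0, "s05": 0}

# 1 = did not meet the benchmark on the state assessment
state_results = {"s01": 1, "s02": 0, "s03": 1, "s04": 0, "s05": 1}

# Count students where the internal flag matched the state outcome.
matches = sum(local_flags[s] == state_results[s] for s in local_flags)
accuracy = matches / len(local_flags)

print(f"Predictive accuracy: {accuracy:.0%}")  # 4 of 5 agree -> 80%
```

In practice this would be run over an exported roster of real scores, and a district might also look separately at the students flagged at risk who still met the benchmark (and vice versa), since those two kinds of misses call for different responses.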
As we plan for the coming year, it is crucial that we consistently use the information we are collecting to plan further interventions for our students who are struggling. I know, as do my teams, that there are going to be mistakes. Will we have it 100% right next year? No. The key is to plan the administration of each assessment knowing that, whatever data comes back, we have to do better for our students. They deserve it, so that we can consistently focus on the feed rather than the weighing.
Bryan R. Drost is the Director of Curriculum and Instruction for the ESC of Summit County, Ohio. He holds a Master of Education in Educational Foundations with an emphasis in Standards-Based Instruction as well as a Ph.D. in Curriculum and Instruction and Assessment, both from Kent State. Bryan holds a variety of roles at the state and national levels: chairperson of the Ohio Foreign Language Association Technology Integration Committee, an ODE Network Regional Leader, a member of ODE’s Fairness and Test Use Committee, a steering committee member of the Northeast Ohio TALK Network, a RESA master coder, a national supervisor for edTPA, a consultant for the National Board, part of NCME’s Standards and Test Use Committee, one of Ohio’s Core Advocates, and a Battelle for Kids Roster Verification trainer. He has presented throughout the state and country on various topics related to instructional shifts, assessment, and technology integration.