Start the Second Semester off Right: Vision to Assessment Success

If only they would release more assessment items! If I just knew what the test looked like. It’s always a moving target—it always changes! Who are those writers anyways? They are just biased and don’t know anything about kids. If only the kids had a way of practicing….

Have you figured out what I’m describing yet? Those statements are all complaints I have heard about Ohio’s assessment system, all within the last week. While no state assessment system is flawless, and while ours arguably has issues that need to be addressed, there is one thing I am positive of. It’s time to get on the proverbial professional development school bus and adjust that mindset…you can conquer the test.

Before I show you how, I want to remind you about the heart of what we do as educators—student learning—which is anchored in three interrelated areas. Curriculum / assessment / instruction always exist in relationship to one another; they are not separate, and they can’t exist without each other (Pellegrino, Chudowsky, & Glaser, 2001). For any Deweyan scholars out there, this is the transactional relationship at its best (Ryan, 2011). For the non-nerds out there, it’s the concept of a store without customers. The store does not exist because it has no customers, and the customers don’t exist because there is no store. The same holds in the classroom—we can’t teach without finding out what kids know, and if we don’t know what kids know, we can’t teach. Assessments, no matter what their form (an oral question, for example), are proof that somebody (a student) knows something (that the teacher has taught).

Since assessment is integral to what we do in classrooms, we must have a vision for it— in other words, if we want to conquer the test, we need to start with vision (that’s Step 1 to conquering the test). Vision, as Manasse (1986) describes it, is the force that gives meaning and purpose to the work of an organization. It inspires commitment to the future, explains who is involved and what they plan to accomplish, and makes the case for why the growth matters. As Pejza (1985) cogently put it, it’s a “hunger to see improvement.”

I remember this like it was yesterday. It was almost four years ago to the day. I was walking with one of my thought partners (Stanny, 2012) down the Short North in Columbus during a lunch break on one of the coldest days of that winter. We were shooting the breeze about everything educational, and this friend and colleague was challenging my thinking about where I wanted my district to go. I was shopping for assessment products, as we had had a bit of an assessment crisis in my district, and I realized very quickly that I needed to envision what I wanted our team to accomplish. What was going to be the purpose of our assessment system? How would it relate to curriculum? What instructional strategies did we as a district value that would align to our assessments? What did I want teachers to do with the data? It really wasn’t about the platform, of course, but rather the vision we had for how we were going to work systematically across our district to improve student learning (see this entry for additional information). In other words, my goal was (and still is!) to help teachers understand how to write and analyze good assessment items: teach a teacher to fish, and you feed them for a lifetime.

It wasn’t until recently that I really came to understand how crucial that conversation was and how important the envisioning process is. We say all the time in Admin 101 classes that vision is gold, and we teach teachers on a regular basis that there must be clearly defined learning targets (their vision for a lesson). The same is true for the assessment systems within our districts. No matter what role you play in your district (curriculum director, principal, or teacher), start with a vision that addresses the relationship between your classroom assessments and state assessments, and then the relationship between your formatives and summatives within your context. What is your goal in assessing? What are you hoping your assessments show? How do you know that your assessments prove what students should know and are able to do?

Step 2 in conquering the test comes through group discussion and utilizing the plethora of resources that exist in our state. For starters, I encourage everyone to review the blueprints for what we hope our students can do. Test blueprints are outlines of the content and skills to be measured. They help teachers understand what the test looks like and also help reassure us that the target isn’t moving. Our blueprints have stayed the same for the last three years, and it’s a great exercise to compare them to curriculum maps and lesson plans and to the state standards (ELA, Math, Science, Social Studies).
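To make that comparison exercise concrete, here is a minimal Python sketch of checking a curriculum map against a blueprint’s weighting. The reporting categories and point ranges below are invented placeholders, not Ohio’s actual blueprint values; the point is simply the mechanics of the comparison.

```python
# Minimal sketch: comparing a curriculum map against a test blueprint.
# Categories and point ranges are illustrative placeholders, not the
# actual values from Ohio's blueprints.

blueprint = {
    "Reading Informational Text": (18, 22),  # (min points, max points)
    "Reading Literary Text": (14, 18),
    "Writing": (18, 22),
}

# Planned points of emphasis per category in a (hypothetical) curriculum map.
curriculum_map = {
    "Reading Informational Text": 20,
    "Reading Literary Text": 10,   # under-emphasized relative to blueprint
    "Writing": 21,
}

for category, (low, high) in blueprint.items():
    planned = curriculum_map.get(category, 0)
    if planned < low:
        print(f"{category}: planned {planned} pts vs. {low}-{high} -- under-covered")
    elif planned > high:
        print(f"{category}: planned {planned} pts vs. {low}-{high} -- over-covered")
    else:
        print(f"{category}: planned {planned} pts -- aligned with blueprint")
```

Running this flags any category where the map’s emphasis drifts from the blueprint’s weighting, which is exactly the conversation to have in a team meeting.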

Next should come analysis of released items; these come out on a regular basis per Ohio Revised Code and provide insight into how different concepts can be assessed. (A caution: released items are different from practice items. A practice item is designed for practicing technology skills. In other words, a Grade 8 math practice item might not be aligned to Grade 8 math, but it does show students how to manipulate the technologies they will see on the test.) Once you’ve analyzed those items in relationship to your standards, start to create your own items: items that measure the depth required by the standards.

Speaking of depth, my favorite tool is the performance level descriptors, which can be used to move students forward (this becomes the intersection of curriculum / instruction / assessment). PLDs are, as ODE describes them, the “link between Ohio’s Learning Standards and performance standards and help to define what students should know or be able to do at each performance level.” The power of this tool emerges when we look at a collection of student work over the course of a week, a month, or a unit and compare that work to the PLDs. For example, a 3rd Grade Basic student can “Determine the main idea of a text and identify key details to recount the main idea;” this is different from the Proficient student, who can “Determine the main idea of a text and recount key details and explain how they support the main idea.” Now we know that we have to teach the student how to explain how details support the main idea. This can become the vision (i.e., learning target) for instruction.
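Here is a small, illustrative Python sketch of that move from PLD to learning target. The Basic and Proficient descriptors are the ones quoted above, and the ladder lists Ohio’s named performance levels for ordering only; treat the structure itself as a hypothetical, not a tool ODE provides.

```python
# Sketch: using performance level descriptors (PLDs) to set the next
# learning target. Only the Basic and Proficient descriptors below come
# from the quoted Grade 3 example; the ladder is listed for ordering.

PLD_LADDER = ["Limited", "Basic", "Proficient", "Accelerated", "Advanced"]

plds = {
    "Basic": ("Determine the main idea of a text and identify key details "
              "to recount the main idea."),
    "Proficient": ("Determine the main idea of a text and recount key details "
                   "and explain how they support the main idea."),
}

def next_learning_target(current_level: str) -> str:
    """Return the descriptor one level up -- the vision for instruction."""
    i = PLD_LADDER.index(current_level)
    if i + 1 >= len(PLD_LADDER):
        return "Student is at the top level; extend with enrichment."
    next_level = PLD_LADDER[i + 1]
    return plds.get(next_level, f"(no descriptor on file for {next_level})")

print(next_learning_target("Basic"))
# -> the Proficient descriptor: teach the student to explain how key
#    details support the main idea.
```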

The last step comes in learning more about assessments. Learn how Ohio writes its assessments — interestingly enough, the writers are not individuals in fedoras, but teachers and administrators like you and me (see this to learn about the item development process). Learn about other myths that exist in relationship to Ohio’s testing system.

Then learn how to write your own items that will help support the instruction and curriculum happening in your classrooms.  

Now, if only there were a place to learn more about all of these concepts, and ways to get more examples….

Ohio educators can see Dr. Drost present at the Ohio Assessment Literacy Conference on January 27, 2018 at Summit ESC.

Bryan R. Drost is the Director of Curriculum and Instruction for the Summit ESC, Ohio. He holds a Master of Education in Educational Foundations with an emphasis in Standards-Based Instruction, as well as a Ph.D. in Curriculum and Instruction and Assessment, both from Kent State. Bryan holds a variety of roles at the state and national levels: an ODE Network Regional Leader, a member of ODE’s Fairness and Test Use Committee, Content Advisory Committee member, NAEP assessment member, a steering committee member of the Northeast Ohio TALK Network, a RESA master coder, a national supervisor for edTPA, a consultant for the National Board, part of NCME’s Standards and Test Use Committee, the mathematics lead for Ohio’s Core Advocates, and Regional Data Lead for State Support Team Regions 4 & 8. He has presented throughout the state and country on various topics related to instructional shifts, assessment, and technology integration.

5 New Improvements to Our Interface (Teachers Tools)

We are so Edcited to release our new interface! Feedback from our community helped us improve the teacher experience across our platform. Keep reading to learn more about these exciting updates.


Improving the Quality of the Feed: Electronic Common Assessments

Educational assessment articles and books across this country have, over the last five years, referenced hogs, chickens, and cows. The question has been: if you want a healthier animal, do you weigh it more often, or do you improve the quality of the feed? It has been continually suggested for the last decade that improving our students’ achievement requires breaking the pattern of being data-rich but information-poor.

This has been the case in my district over the last few years. We were using an outside vendor to track student progress and doing a great job at weighing students. However, assessments were just being given and were taking up valuable instructional time when we weren’t doing anything with the data. Add a number of parents who refused to have their students take these assessments, and we had an assessment system that wasn’t working for anyone!

As Director of Educational Services, my number one priority this last year was to build my staff’s capacity to convert data into information: information that gives teachers the tools they need to probe for causes where students are underperforming or exceeding expectations, analyze the conditions that contribute to trends in student achievement, and develop intervention and enrichment strategies based on those analyses.

Intervention and Enrichment

In relationship to the educational shifts required by our state’s academic content standards, I spent considerable time helping my teachers develop assessments built on evidence-centered design (see the image below for more information on this type of design). Evidence-centered design begins with the inferences we want to make about student learning connected to standards, and follows with a collection of evidence (i.e., an assessment) that shows how we know students are making progress toward doing what we claim they can do.

(Image: evidence-centered design diagram)
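One way to picture the chain that diagram describes is as a simple data structure linking a claim about learning to its evidence and tasks. Here is a minimal Python sketch with placeholder content, not a real district item:

```python
# Sketch of the evidence-centered design chain: a claim about learning,
# the evidence that would support it, and the task that elicits that
# evidence. The content below is an invented placeholder example.

from dataclasses import dataclass, field

@dataclass
class AssessmentDesign:
    standard: str          # the standard the inference connects to
    claim: str             # what we want to be able to say students can do
    evidence: str          # what student work would count as proof
    tasks: list = field(default_factory=list)  # items eliciting the evidence

design = AssessmentDesign(
    standard="Grade 3 ELA: main idea and key details",
    claim="The student can explain how key details support the main idea.",
    evidence="A constructed response that names the main idea and ties at "
             "least two details to it.",
    tasks=["Item 4: short constructed response on an informational passage"],
)

print(design.claim)
```

The design choice worth noticing is the direction: the claim and evidence come first, and the tasks exist only to produce that evidence, never the other way around.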

I knew that if I was going to get buy-in from staff on using data to drive instruction, we had to create our own assessments. From my perspective, this meant we either had to learn how to write good questions quickly or find them in vetted resources. Using resources from Achieve the Core and Illustrative Mathematics, and doing some work with assessment blueprinting, we created assessments on Edcite that my principals and I felt were worthy of kids’ time and that would also provide us with valuable information for adjusting instruction.


Edcite’s new platform, Edcite Schools, fit our needs: it was cost-effective, allowed us to generate reports in a number of ways (standards-based, classroom-based, student-based, etc.), and allowed our teachers to provide feedback to students through the platform. The system let us search question banks that we vetted using an assessment vetting tool from Achieve the Core. It also had the extra advantage of being customizable to give students experience in a viewer very similar to our state’s assessment system, AIR.


After we administered our first assessment, we met in grade-level teams to analyze results. Utilizing the reports in Edcite Schools and following a data protocol in which we set norms, focused on what students could do and what they were struggling with, and looked for trends across assessments, we were able to have professional conversations about how we were going to improve our instruction. The quality of the feed improved, and we saw growth among students from assessment to assessment.
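For readers who want to see the mechanical half of such a protocol, here is a hedged pandas sketch that rolls item-level results up by standard to surface focus areas. The column names, standards, and scores are invented for illustration; this is not Edcite’s actual export format.

```python
# Sketch: rolling item-level results up by standard to feed a data protocol.
# Column names and scores are invented placeholders, not Edcite's export.

import pandas as pd

results = pd.DataFrame({
    "student":  ["A", "A", "B", "B", "C", "C"],
    "standard": ["3.RI.2", "3.RI.4", "3.RI.2", "3.RI.4", "3.RI.2", "3.RI.4"],
    "earned":   [2, 1, 1, 2, 0, 2],
    "possible": [2, 2, 2, 2, 2, 2],
})

# Aggregate earned vs. possible points per standard.
by_standard = results.groupby("standard")[["earned", "possible"]].sum()
by_standard["pct"] = by_standard["earned"] / by_standard["possible"]

# Standards below a working threshold become the focus of the conversation.
print(by_standard.sort_values("pct"))
print("Focus standards:", by_standard.index[by_standard["pct"] < 0.6].tolist())
```

A table like this does not replace the protocol’s norms and conversation; it just gets the weighing out of the way so the meeting can be about the feed.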


Fast-forward to June, which in my state is every curriculum director’s nightmare: the release of state achievement data. I learned that the work we did with Edcite Schools was actually predictive. When I compared our internal Edcite Schools data to our state achievement data, I was able to determine, with 99% predictive accuracy, which students were in danger of not meeting grade-level benchmarks. We are now information-rich, not just data-rich.
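As a rough illustration of the comparison behind a number like that, here is a minimal Python sketch that checks how often an internal on-track call matched the state result. The students and outcomes are invented placeholders, not our district’s data.

```python
# Sketch: checking how well internal benchmark calls predicted state results.
# Students and outcomes below are invented placeholders.

internal_on_track    = {"A": True, "B": False, "C": True, "D": False}
state_met_benchmark  = {"A": True, "B": False, "C": True, "D": True}

# Predictive accuracy: share of students where the internal call
# matched the state outcome.
matches = sum(
    internal_on_track[s] == state_met_benchmark[s] for s in internal_on_track
)
accuracy = matches / len(internal_on_track)
print(f"Predictive accuracy: {accuracy:.0%}")  # -> 75% for this toy data

# Students flagged internally who then missed the state benchmark are the
# ones the internal assessments caught early enough to intervene.
caught = [s for s in internal_on_track
          if not internal_on_track[s] and not state_met_benchmark[s]]
print("At-risk students correctly flagged:", caught)
```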

As we plan for the coming year, it is crucial that we consistently use the information we are collecting to plan interventions for our students who are struggling. I know, as do my teams, that there will be mistakes. Will we have it 100% right next year? No. The key is to plan the administration of each assessment knowing that, whatever data comes back, we have to do better for our students. They deserve it, and it keeps us consistently focused on the feed rather than the weighing.


Bryan R. Drost is the Director of Curriculum and Instruction for the ESC of Summit County, Ohio. He holds a Master of Education in Educational Foundations with an emphasis in Standards-Based Instruction, as well as a Ph.D. in Curriculum and Instruction and Assessment, both from Kent State. Bryan holds a variety of roles at the state and national levels: chairperson for the Ohio Foreign Language Association Technology Integration Committee, an ODE Network Regional Leader, a member of ODE’s Fairness and Test Use Committee, a steering committee member of the Northeast Ohio TALK Network, a RESA master coder, a national supervisor for edTPA, a consultant for the National Board, part of NCME’s Standards and Test Use Committee, one of Ohio’s Core Advocates, and a Battelle for Kids Roster Verification trainer. He has presented throughout the state and country on various topics related to instructional shifts, assessment, and technology integration.