Preparing Students for Ohio End-of-Course Exams

This week we are sharing stories from Ohio teachers who use Edcite to create AIR®-aligned assessments. Read our Q & A below to learn how a biology teacher in Bellefontaine, Ohio, has been using Edcite.

What is your name?

Joslin Lee

What is your job title? Where do you work?

HS Biology teacher, Benjamin-Logan High School

How long have you been using Edcite?

Edcite had only been around for a year when I heard about it at a professional development meeting for the upcoming biology AIR® test. I started creating questions and tests that summer to use in my class so that my students would be prepared for the Biology End-of-Course Exam given by Ohio. I am going on four years using Edcite, and I love it.

(Image: Pre-Test 17-18, Question 15)

What problem does Edcite solve for you?

Edcite allows me to quickly see which questions students are having trouble with. I can then check whether the issue was how I wrote the question or their knowledge. I also use Edcite for pre-tests so that I have pre- and post-assessment data.

(Image: Pre-Test 17-18, Question 18)
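For readers who want to try this kind of per-question pre/post analysis with their own data, here is a minimal sketch in Python. The file names and columns (question_id, is_correct) are hypothetical; this illustrates the general idea only, not Edcite's actual export format or reporting tools.

```python
# A rough sketch (assumed file names and columns, not Edcite's export format)
# for comparing per-question results between a pre-test and a post-test.
import csv
from collections import defaultdict

def correct_rates(path):
    """Return {question_id: fraction of students who answered correctly}."""
    correct = defaultdict(int)
    total = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):   # assumed columns: question_id, is_correct (0/1)
            qid = row["question_id"]
            total[qid] += 1
            correct[qid] += int(row["is_correct"])
    return {qid: correct[qid] / total[qid] for qid in total}

pre = correct_rates("pretest_17_18.csv")     # hypothetical export of the pre-test
post = correct_rates("posttest_17_18.csv")   # hypothetical export of the post-test

# List questions from weakest to strongest on the pre-test, with the pre-to-post gain.
for qid in sorted(pre, key=pre.get):
    gain = post.get(qid, 0.0) - pre[qid]
    print(f"Question {qid}: pre {pre[qid]:.0%} -> post {post.get(qid, 0.0):.0%} (gain {gain:+.0%})")
```

A question that stays low on both administrations may point to a problem with how the item is written, while one that starts low and rises points to knowledge the class gained, which mirrors the distinction Joslin describes above.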

What else would you like to share?

Edcite has allowed me to reach my goals for students scoring higher on end-of-course exams without “teaching to the test.”

Thank you, Joslin, for your Edcite Story! Teachers, you can check out some of Joslin’s assessments below. Make copies for your class or use them as inspiration to create your own assessment on Edcite!

Assessment Links:

Pre-Test 17-18

The Cell

Genetics

Interested in using Edcite for assessments across your school or district? Joslin’s team uses Edcite Schools to give assessments with an AIR®-aligned assessment viewer. Learn more by visiting www.edcite.com/edcite-schools-assessment-platform.

Joslin shared her Edcite Story with us, and so can you! We publish Edcite Stories so that educators can learn about different ways Edcite is used in schools and school districts. If you want to submit your own Edcite Story, fill out our form.

Start the Second Semester off Right: Vision to Assessment Success

“If only they would release more assessment items!” “If I just knew what the test looked like.” “It’s always a moving target; it always changes!” “Who are those writers anyways? They are just biased and don’t know anything about kids.” “If only the kids had a way of practicing….”

Have you figured out what I’m describing yet? Those statements are all complaints that I have heard regarding Ohio’s assessment system, all within the last week. While I believe no state system out there is flawless, and while ours arguably has issues that need to be addressed, there is one thing I am positive of. It’s time to get on the proverbial professional development school bus and adjust that mindset: you can conquer the test.

Before I show you how, I want to remind you about the heart of what we do as educators: student learning, which is anchored in three interrelated areas. Curriculum / assessment / instruction always exist in relationship to one another; they are not separate from one another, and they can’t exist without each other (Pellegrino, Chudowsky, & Glaser, 2001). For any Deweyan scholars out there, this is the transactional relationship at its best (Ryan, 2011). For the non-nerds out there, it’s the concept of a store without customers. The store does not exist because it has no customers, and the customers don’t exist because there is no store. The same holds in the classroom: we can’t teach without finding out what kids know, and if we don’t know what kids know, we can’t teach. Assessments, no matter what their form (an oral question, for example), are proof that somebody (a student) knows something (that the teacher has taught).

Since assessment is integral to what we do in classrooms, we must have a vision for it. In other words, if we want to conquer the test, we need to start with vision (that’s Step 1 to conquering the test). Vision, as Manasse (1986) describes it, is the force that gives meaning and purpose to the work of an organization. It inspires commitment for the future as it explains who is involved, what they plan to accomplish, and why that growth matters. As Pejza (1985) cogently stated, it’s a “hunger to see improvement.”

I remember this like it was yesterday. It was almost four years ago to the day. I was walking with one of my thought partners (Stanny, 2012) through the Short North in Columbus during a lunch break, on one of the coldest days of that winter. We were shooting the breeze about everything educational, and I remember that this friend and colleague was challenging me on my thinking about where I wanted my district to go. I was shopping for assessment products, as we had had a bit of an assessment crisis in my district, and I realized very quickly that I needed to envision what it was that I wanted our team to accomplish. What was going to be the purpose of our assessment system? How would it relate to curriculum? What instructional strategies did we as a district value that would align to our assessments? What did I want teachers to do with the data? It really, of course, wasn’t about the platform, but rather the vision we had for how we were going to work systematically across our district to improve student learning (see this entry for additional information). In other words, my goal was (and still is!) to help teachers understand how to write and analyze good assessment items; by teaching a teacher to fish, I feed them for a lifetime.

It wasn’t until recently that I really came to understand how crucial that conversation was and how important the envisioning process is. We say in Admin 101 classes all the time that vision is gold, and we teach teachers on a regular basis that there must be clearly defined learning targets (their vision for a lesson). The same is true for our assessment systems within our districts. No matter what role you play in your district (curriculum director, principal, or teacher), start with a vision that addresses the relationship between your classroom assessments and state assessments, and then the relationship between your formatives and summatives within your context. What is your goal in assessing? What are you hoping your assessments show? How do you know that your assessments prove what students should know and are able to do?

Step 2 in conquering the test comes through group discussion and through utilizing the plethora of resources that exist across our state. For starters, I encourage everyone to review the blueprints for what we are hoping our students can do. Test blueprints are outlines of the content and skills that are to be measured. They specifically help teachers understand what the test looks like and also help reassure us that the target isn’t moving. Our blueprints have stayed the same for the last three years, and it’s a great exercise to compare them to curriculum maps and lesson plans, as well as to the state standards (ELA, Math, Science, Social Studies).

Next should come analysis of released items; these items come out on a regular basis per Ohio Revised Code and provide insight into how different concepts can be assessed. (A caution to note is that released items are different from practice items. A practice item is designed for practicing technology skills; in other words, a Grade 8 math practice item might not be aligned to Grade 8 math, but it does show students how to manipulate the technologies they will see on the test.) Once you’ve analyzed those items in relationship to your standards, start creating your own items, items that measure the depth required by the standards.

Speaking of depth, my favorite tool is the performance level descriptors (PLDs), which can be used to start moving students forward (this becomes the intersection of curriculum / instruction / assessment). PLDs are, as ODE describes, the “link between Ohio’s Learning Standards and performance standards and help to define what students should know or be able to do at each performance level.” The power of this tool comes when we look at a collection of a student’s work over the course of a unit, month, or week and compare that work to the PLDs. For example, a 3rd Grade Basic student can “Determine the main idea of a text and identify key details to recount the main idea;” this is different from the Proficient student, who can “Determine the main idea of a text and recount key details and explain how they support the main idea.” Now we know that we have to teach the student how to explain how details support the main idea. This can become the vision (i.e., the learning target) for instruction.

The last step comes in learning more about assessments. Learn how Ohio writes its assessments; interestingly enough, the writers are not individuals in fedoras, but rather teachers and administrators like you and me (see this to learn about the item development process). Learn about the other myths that exist about Ohio’s testing system.

Then learn how to write your own items that will help support the instruction and curriculum happening in your classrooms.  

Now, if only there were a place to learn more about all of these concepts, and ways to get more examples….

Ohio educators can see Dr. Drost present at the Ohio Assessment Literacy Conference on January 27, 2018 at Summit ESC.

Bryan R. Drost is the Director of Curriculum and Instruction for the Summit ESC, Ohio. He holds a Master of Education in Educational Foundations with an emphasis in Standards-Based Instruction as well as a Ph.D. in Curriculum and Instruction and Assessment, both from Kent State. Bryan holds a variety of roles at the state and national levels: an ODE Network Regional Leader, a member of ODE’s Fairness and Test Use Committee, a Content Advisory Committee member, an NAEP assessment member, a steering committee member of the Northeast Ohio TALK Network, a RESA master coder, a national supervisor for edTPA, a consultant for the National Board, part of NCME’s Standards and Test Use Committee, the mathematics lead for Ohio’s Core Advocates, and Regional Data Lead for State Support Team Regions 4 & 8. He has presented throughout the state and country on various topics related to instructional shifts, assessment, and technology integration.