Am I getting what I need to support instructional adjustments?

You know the old saying: garbage in, garbage out. Good decisions rely on good input, and in education, that input comes in part from a comprehensive assessment system you can use to effectively track—and improve—student achievement. According to the Data Quality Campaign, “Empowered with education data, stakeholders are better poised to improve system performance, increase transparency, and most important, improve student achievement.”[1]

So what might that quality data look like? Well, according to Dr. Marylou Caldarera, Supervisor of Assessment for Calcasieu Parish School District, “It’s one thing to say, okay, this is how the child is doing now, but when you look at a seventh grader, and you…look at his history all the way back to kindergarten, all in one spreadsheet…that is so empowering.” Dr. Caldarera knows from experience: after working with a cross-disciplinary team for more than a year, her district chose Scantron’s Performance Series® in 2006 and is still using it today.

Does your assessment tool align student results with the most granular information a classroom teacher might need: the actual standard that describes a specific skill? Or does it merely align to a strand/sub-domain, leaving your teachers to guess exactly what aspect of that strand is causing the student difficulty?

Further, does your assessment tool provide a means to access instructional resources that can help classroom educators, students, and even parents address these specific learning insights? To be truly useful in the classroom, assessment results should not only be granular but also connected, in readily accessible ways, to instructional resources that help close the gap and support individualized learning.

Another factor to consider is exactly how growth or gains are measured. Does your assessment solution provide only tracking against a target score, or does it also let you view the degree or level of growth? For example, a student may be below a district-identified grade-level score in a subject at the beginning of the year and still be below that score at the end of the year. Does your system tell you how close the student is to closing that gap? Or does the system only tell you that he or she did not measure up? The difference between the two is the difference between a student who might give up and a student who will keep trying. As Dr. Caldarera notes, “I’ve seen children actually being able to articulate how they’ve done on these standards, and voice it not only to their teacher, but also to their peers and to their parents. If I could have only one assessment tool to replace everything else I’ve ever worked with, it would be Performance Series.”

Questions to Ask Yourself

  • Does your assessment solution align student results with the most granular information a classroom teacher might need: the actual standard that describes a specific skill? Or does it merely align to a strand/sub-domain, leaving your teachers to guess exactly what aspect of that strand is causing the student difficulty?
  • Does your assessment solution provide a means to access instructional resources that can help classroom educators, students, and parents address these learning insights?
  • Does your assessment solution provide only tracking against a target score, or does it also let you view the degree or level of growth? Does your solution tell you how close the student is to closing that gap?

How easy is it, really, to use the system?

Everyone will tell you the system they provide is easy to use. Everyone. Even when it isn’t. Maybe especially when it isn’t. The secret that vendors typically won’t share is that you get to decide whether a system is easy to use. But remember, true ease-of-use can’t be found in a demo; it must be lived, day in and day out, as your classroom educators experience the system while they are trying to educate. Ensuring data is easily accessible helps both teachers and students. Dr. Caldarera maintains that it’s crucial to “show where a student is and [provide] a map for the next steps. If I’m a parent or a teacher, I can see a student is making progress and that the gap is closing.”

How easy is it to access reports in your assessment system? Can you group and organize your data from different directions? Recognizing the patterns in data is a big part of using it to make strong decisions. And sometimes you have to shift the kaleidoscope a bit to see new and amazing patterns. Your assessment solution should enable you to aggregate and disaggregate data according to categories you choose. The more easily you can do this, the more flexible and effective the interventions classroom educators create will be, whether Individual Education Plans or student study groups.

An often-overlooked aspect of ease-of-use is customization. Can you see your data the way you want to see it? In other words, how much can you customize the appearance of reports? Results delivered by most state benchmark exams use color-coded performance bands, so anyone reading reports can see at a glance exactly where student scores fall. Does your assessment solution allow you to define custom performance bands with color-coded cut scores that match the state bands, and then apply those bands to your own district and classroom assessments? Making sure bands match between high-stakes state scores and your own ongoing formative and summative assessments is a great way to unify your assessment efforts.

The system’s back end is certainly another factor to consider. Is the system truly Software as a Service, delivered entirely over the Internet, or do you need to set up and maintain your own servers for teachers to connect to? How much training is required to prepare teachers to create and/or administer tests? How difficult is it to connect the data in your Student Information System with your assessment solution? And how often can that data be refreshed to ensure that it is current and accurate?

Questions to Ask Yourself

  • How easy is it to access reports in your assessment solution?
  • Can you group and organize your data from different directions?
  • Can you see your data the way you want to see it? In other words, how much can you customize the appearance of reports?
  • Does your assessment solution allow you to define custom performance bands with color-coded cut scores that match your state’s bands, and apply those bands to your own district and classroom assessments?
  • Do you need to set up your own servers, or is the solution hosted and available over the Internet?
  • How much training is required to prepare teachers to create and/or administer tests?
  • How difficult is it to connect the data in your Student Information System with your assessment solution? And how often can that data be refreshed to ensure that it is current and accurate?

Am I getting the best value for my money?

As consumers, we all know that it’s not always what you spend, but what you get for what you spend. That’s just as true whether you’re buying peanut butter or an enterprise assessment solution—you need to receive the best value for your budget dollars. When you choose an assessment solution, it pays to consider the whole picture.

Some of the questions we’ve already asked also address value. For example, if your teachers can’t—or won’t—use a system because it’s too difficult or because it doesn’t deliver the results in a useful way, that system is too expensive, no matter what you’re paying for it. A system with additional hardware costs, staff maintenance requirements, and extensive required training may cost more in the long run than a system with a higher up-front per-student charge. A system you cannot customize to suit your needs and work with your current processes simply isn’t a “great deal.”

But value goes beyond the questions already asked. You should also consider what other aspects of your assessment strategy the solution could serve.

Questions to Ask Yourself

  • Can you create your own targeted tests for any subject you offer, or are you restricted to the core subjects covered by high-stakes exams?
  • How difficult is it to create your own content or import content from other sources? Can you import outside content at all?
  • Can the solution support a spectrum of technology capabilities to present a complete picture of student assessment results, regardless of how the test is delivered (e.g. paper vs. online)?
  • Can the provider offer help beyond the testing mechanism itself? That is, do they offer system training, professional development above and beyond system training, consulting, implementation, customization, research, or item and assessment bank content?
  • No one provider can offer everything. What partners does your assessment provider work with to deliver expanded solutions or functionality? Do they regularly add partners who help provide a better overall solution (e.g., partners who provide automated prescriptions or connections between assessment and curriculum)?

In the End…

In the end, it’s up to you. As with any relationship, the time may come when you have to ask yourself if you’re getting as much out of it as you’re giving. And if not, is that OK with you, or is it time to explore a more rewarding possibility?

[1] “Who We Are.” Data Quality Campaign, n.d. Web. 11 July 2014. <http://www.dataqualitycampaign.com/who-we-are>.