Technology Information Center for Administrative Leadership

 

How A Data-Driven Organization Deals With Aligning Assessment Measures to Local Standards
(One of the greatest barriers to becoming a proactive Data-Driven Organization)

Jim Cox
Educational Consultant

 

 

In the current accountability frenzy, how often we make the statement, "This test doesn't measure what we teach!" This refers, of course, to the perceived misalignment between one's curriculum and whatever test the students are taking. Usually when such a statement is made, the test is a high stakes test, perhaps given statewide, whose results carry enormous weight.

 

What, then, does a "data-driven" school do when confronted with this apparent conflict?

 

I suggest two things...

 

Know How Well the Two are Aligned
Do not say that a test is misaligned unless you know that it is! A data-driven organization knows the extent to which local standards and any high stakes test are aligned. In California, the Stanford Achievement Test (SAT 9) is the high stakes test to which I refer. I often use a Venn diagram to picture the alignment between the two.

 

In the figure below, Area 1 is the portion of one's standards not included on the test; Area 3 is the content on the test but not in the standards; and Area 2 is the content common to both the standards and the test. If the figure as shown below depicted reality, then a great deal of misalignment would be present.
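The three areas map directly onto simple set operations. The following is a minimal sketch of that idea; the standard and test topics listed are hypothetical labels invented purely for illustration, not drawn from any actual standards document or test blueprint:

```python
# Hypothetical topic labels -- for illustration only.
standards = {"main idea", "inference", "vocabulary", "oral fluency", "listening"}
test_content = {"main idea", "inference", "vocabulary", "decoding"}

area_1 = standards - test_content   # in the standards but not on the test
area_3 = test_content - standards   # on the test but not in the standards
area_2 = standards & test_content   # common to both

print(sorted(area_1))  # ['listening', 'oral fluency']
print(sorted(area_3))  # ['decoding']
print(sorted(area_2))  # ['inference', 'main idea', 'vocabulary']
```

In this toy picture, a large Area 1 relative to Area 3 would illustrate the pattern discussed below: most of what the test covers is taught, but much of what is taught goes unmeasured.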

 

 

If, on the other hand, the figure looked like the one below, the two would be relatively well aligned.

 

IT IS ESSENTIAL THAT A DATA-DRIVEN ORGANIZATION KNOWS PRECISELY HOW WELL ITS STANDARDS AND ITS HIGH STAKES MEASURES ARE ALIGNED.

 

 

I have assisted many schools and districts in California in determining the alignment between their local standards and the content of SAT 9. In almost every case, for reading and math in grades 2-8, the figure looks like the one just above. There is very little on the test that is not a part of the curriculum. In other words, when a school staff automatically declares, "This test doesn't test what we teach," what they are really saying (without realizing it) is that the issue is not Area 3; the issue is Area 1. That is, "We teach so much more that the test doesn't touch!"

 

It is really important to avoid making unproven statements and using clichés when referring to the critical issue of alignment. Let's not exacerbate this already delicate issue.

 

Focus On Measuring Area 1, Not Finding New Ways to Measure Area 2

When I meet with school folks regarding what measures they are currently using to assess student progress, often they cite multiple means of measuring Area 2. For example, let's say we're dealing with reading comprehension. They will cite a high stakes test like SAT 9; then they may indicate that they are also using a district-developed test or some other published test to get at the same content that is in Area 2. I can only conclude from this that, in the eyes of the educators, the mandated measure is insufficient for a good assessment, and, therefore, something else is needed. That's all well and good, assuming a staff is willing to deal with these multiple measures logistically, but something is sorely missing. That is, there is little or no effort to measure the content of Area 1.

 

What we have under these circumstances are several standards that a school has identified as important for students to attain, yet for which there is no measure to assess progress. A good example of this is the oral language development portion of language arts. We assess reading; we assess writing; but how many of us formally assess progress in speaking and listening? And if we do not, how serious are we about including these skills as part of our "higher standards"? A data-driven organization doesn't focus on just getting more data from more sources. A data-driven organization focuses on getting data in the places where they are needed.

 

 
