Dr. Walt Cooper
On several occasions I have written about how Colorado assessments are changing in response to the updated Colorado Academic Standards. Previously, I shared thoughts and information about the technical requirements for these assessments, the amount of time required for full test administration, the changing nature of the material tested, and the manner in which the tests are administered (computer vs. paper-and-pencil).
We knew as far back as early 2013 that the changes were going to be significant, so we practiced, we trained, we prepared, and we learned many things through these exercises. But until now, almost all of the information I’ve shared has been largely hypothetical; it was information based on the best data available and our best understanding of the expectations at the time. Over the course of the last two weeks, however, we have finally been able to grasp much more about the future of the testing (though we still have much to learn) as we implemented the first set of CMAS tests in our elementary and junior high schools. So, what have we learned?
From a technical standpoint, we wondered whether the technology would be hard for students to use when taking the assessment. We learned that the new assessments, although different, are built in a way that is actually rather easy for students to learn and use. We also learned that we could administer the assessment on a range of devices — the same technology that students use in classrooms throughout the year — so the experience is familiar to them.
Logistically, we questioned whether our network and bandwidth capabilities would be sufficient to allow us to test large numbers of students at the same time. We learned that our infrastructure across the district easily meets the requirements to effectively deliver the test content.
We also learned that the assessments are certainly not free of glitches. We ran into technical difficulties at the school level when some computers would not display selected items correctly. Problems on a national level surfaced when the test publisher’s technical support network became unavailable during a critical testing period. These glitches caused school and district personnel to adjust some daily schedules and react “on the fly” to meet our testing obligations, but students and staff did so admirably.
Beyond the technical components, we have been advised all along that the content of the assessments themselves would be much more rigorous and demanding at all levels. Based on sample items and practice tests early on and now from student feedback after actual testing, we know this to be true. The assessments ask students to answer a wider variety of questions, show their work in greater detail, and explain their reasoning. They assess critical-thinking and problem-solving skills in an in-depth manner and ask students to back up their answers with information from the text instead of just offering their opinions.
All along we anticipated that students would respond more favorably to the computer-based assessments (rather than their paper-and-pencil predecessors) because they would allow for more engaging and innovative assessment items that mirror the engagement technology adds to our everyday classrooms. We were right. Many students offered unsolicited comments supporting our assumption.
From the students’ perspective, the most common feedback was that the assessments were much “harder.” I trust their perspective and see this as evidence that the bar has been raised for student achievement in Colorado. However, simply increasing the difficulty of an assessment does not help students learn to a higher level. Increasing student achievement is the result of very intentional and focused instruction toward a more rigorous set of standards, and that is now the most pressing challenge for us to undertake.