By Robert Nichols, Communications director, College of Arts and Sciences at Lehigh University
I was speaking with my neighbor the other evening, chatting from our porches as neighbors do. She has two daughters, one in second grade and one in sixth. She is always willing to share her thoughts, so I asked her what she thought about the PSSA, Pennsylvania’s standardized achievement test.
“Don’t get me started,” she exclaimed. “I’ve never dealt with such angst in the kids. I’m tempted to opt out next time.”
I then had this brief moment when I was so glad my children are out of school. It’s not as though I am a bad parent who could not wait for my children to grow up. My wife and I actively participated in our two children’s educations, but for nine years, we questioned the need for standardized testing. I must admit that at times I’m relieved I no longer have to participate in a testing system I find troublesome.
The PSSA, the Pennsylvania System of School Assessment, is an annual assessment administered throughout the state in grades 3 through 8 to test English language arts and mathematics. Grades 4 and 8 are also tested in science. It was seemingly an annual spring ritual in my home. Practice tests would come home. Teachers posted practice problems on their websites. My children would go to school, sit down, fill in the bubbles and score well on the tests. Yet I never felt the assessments provided me with a complete picture of how my children were progressing.
I understand the logic behind standardized testing. Yes, we need criterion-oriented assessments that provide students, parents, educators and the public with an understanding of student and school performance. But I watched my children prepare for these assessments and I started to veer from the norm. I began to question the validity of such assessments. It was particularly fascinating to watch my son. He had a reputation for not progressing as quickly as his peers, yet he knew how to take multiple choice examinations. I posit that standardized testing appraises a student’s performance on one particular day and does not consider external influencers. What happens if a student has a death in the family just before the assessment? What if a student is frequently bullied and is fearful of attending school? Factors outside school can influence a student’s ability to successfully take the test.
With standardized exams, all students answer the same questions under the same conditions. The belief is one can assess progress from a bubble sheet. These tests are full of questions that may not have the same meaning to all students, even if the developers do attempt to eliminate bias. They do not measure the ability to think deeply or creatively in all fields. My son learned what topics were on the test, and was usually prepared by his teachers to know the content on the test. But did he deeply understand the content? For some of the material, I doubt it. If I relied on PSSA results, I would have had an incomplete picture of his progress.
Parallel to this difficulty is my belief that standardized testing only evaluates the individual performance of the student instead of the overall growth of that student over the course of the year. It is a snapshot, no more. The PSSA only assessed my children’s proficiency for that particular March or April. Not every student progresses at the same rate. A better gauge would have been to assess their progress over the course of the entire academic year.
I am not isolated in my views. During the 2015 PSSA test, more than 4,500 students statewide opted out of the math and English language arts portions of the test — more than 770,000 students did take those portions. More than 1,000 students statewide opted out of the science portion, with more than 258,000 students taking that part of the exam.
One of the arguments against opting out is that it skews test results. Students opting out of the 2015 tests represented a little more than 0.5 percent of the students who took the test. In New Jersey, the opt-outs numbered 115,000, or 14 percent of the exam-taking pool. In New York, 200,000 third through eighth graders sat out assessments last year of the 1.1 million test-takers. FairTest: The National Center for Fair & Open Testing notes that across the country, an estimated 640,000 students opted out last year in the 14 states that reported. Save Our Schools NJ, a grass-roots advocacy group promoting opt out, now has approximately 30,000 members.
I question whether the groundswell of opt-out advocacy can be sustained. Many advocates focus on criticizing the tool used to assess student achievement without offering possible alternatives.
I also do not believe there is a short-term answer to how we assess student progress. I do believe that evaluation can be used as an instrument. Working for a university, I understand the need for tests as an evaluation tool. But standardized tests such as the PSSA do nothing more than place intense pressure on teachers and schools, teach students to learn what is on a test rather than the overarching content, and give the public an incomplete and inaccurate picture of the academic health of their local schools. And that is not good for education.