How Bad Is the Test? English Teachers Count the Ways

Recently, the New York City Department of Education rolled out a secondary English Language Arts test called “Measures of Student Learning (MOSL) Performance Assessment.” The stated purpose of the test is to measure improvement in student writing in order to evaluate how well we as teachers do our jobs.

The new teacher evaluation system dictates that these and other state assessments are worth 40 percent of our overall rating. The exam asks high school students to write an “argument essay.” The prompt poses a simplistic question that could easily be answered on its own, yet the guidelines direct students to support their responses with evidence from two short texts found on the following pages. The test, developed by Stanford University, Teachers College, and the D.O.E., loosely aligns with the Common Core learning standards.

In our opinion, however, the test falls short as a measure of both students and teachers for several reasons.

To begin, we lost two days of instructional time giving this pre-test to our students. We continue to lose tutoring and professional development time to grade the test on seven often overlapping components. We will lose two more days to give the post-test on an unknown date in the spring. Four lost instructional days rob the students of valuable learning opportunities and treat them as guinea pigs.

Not only is this test a waste of students’ time and taxpayers’ money, it is also an invalid way to evaluate teachers. It was given to schools across the city at different times. Each school received only one copy of each grade-level exam and had to make photocopies for its students. Exam security was virtually nonexistent, implying that the test was unimportant. Furthermore, while most high school class periods run between 45 and 60 minutes, the testing period was 90 minutes. In many schools, students therefore read on the first day and wrote on the second, making it possible for them to discuss the exam among themselves and plan out their responses before writing.

The validity of the test results is further damaged by the fact that we, the teachers the test is meant to evaluate, are grading the tests administered to our own students by our colleagues within our own schools. This is a clear conflict of interest. The D.O.E. did not indicate how much the pre-test should count toward students’ grades, if at all; those decisions were left to the discretion of the schools and, in most cases, individual teachers. Without system-wide uniformity in how seriously students take the test, the results cannot fairly be used to reflect upon teachers.

Our grievance is not with our Assistant Principal of English or our Principal, both of whom we respect. It is with the D.O.E., which has taken a one-size-fits-all approach to education. Our training as teachers taught us to differentiate for our students’ various needs and abilities. If we were to take this same one-size-fits-all approach to the 170 students each of us teaches in a day, we would be rated “Unsatisfactory,” or, in today’s parlance, “Ineffective.”

Schools, like students, come in different shapes and sizes with different needs. In specialized schools like Brooklyn Technical High School, gifted students will perform well on both the pre- and post-tests, showing statistically negligible improvement because their scores start out high. It is not clear how much we are expected to improve already strong students, how many students must improve, or what the consequences are if they do not improve enough.

We understand that we will be held accountable for what our students learn, but a poorly written and grossly mismanaged test is not the way to begin.

Instead, the D.O.E. should use existing tests, such as the SAT, the ACT, and Advanced Placement exams, or explore other models. Authentic assessments, such as portfolios, provide multiple examples of student development over time. Post-graduation interviews or surveys that capture which skills students found most valuable in college and career may also improve our teaching. These methods require time and patience, qualities for which there is no room in the current data-driven, instant-results business model of education.

From the undersigned members of the English Department at Brooklyn Technical High School:

1. Laura DeWitt

2. Danny Schott

3. Justyna Kret

4. Anastasia Visbal

5. Giancarlo Malchiodi

6. Shelley Zipper

7. Dan Baldwin

8. Patricia Quilliam

9. David Lo

10. Phyllis Witte

11. Marie Manuto-Brown

12. Rebecca Rendsburg

13. Jonathan Scolnick

14. Emily Tuckman

15. Tanya Green

16. Robert Grandt

17. Chris Rabot

18. Sonia Laudi

19. Monica Rowley

20. Christina Massie

21. Meredith Dobbs

22. Stephen Harris

23. Debra Rothman

24. Melissa Goodrum

25. Timothy Ree

26. Emilie Baser