The Writers Room® Program

Community-based Writing Coaches


Measuring the effectiveness of the program

The Evaluation Team is now working on the last stage of this year’s project, reviewing first and second drafts of middle school persuasive essays and high school expository essays to determine how students responded to coaching notes.


Until now, evidence of the program's effectiveness has come from several sources:

  • Our own evaluation, such as the two-step evaluation conducted in the middle schools during the 1999-2000 school year. Each grade wrote an essay in September and another in May, and both were scored. Every student improved, some dramatically, in essay or story writing.
  • Writing portfolios that show improvement from one draft to the next and from the beginning of the year to the end.
  • Teachers’ responses to the coaches and to the program as a whole.
  • Scores on state assessments, especially the movement from proficient to advanced proficient.
  • The growth of the program, not only in Montclair, but in other cities as well.

From the desk of Ellen Kolba

From the day in 1993 that we were handed our first grant money to start a writing center at Montclair High School, we have been asked to demonstrate our effectiveness. It is, of course, not unreasonable to be held accountable; in fact, we relished the opportunity to show what a good idea we had and what a welcome service we were providing. The big question was — and still is — How?

At the start, when Sheila and I were doing everything and basically making it up as we went along (after all, we had little in the way of useful models), we kept a notebook on the desk for students to sign in when they arrived at our cramped office. Before long, though, we had moved into the classrooms, creating the model that we still use. Coaches were scheduled into individual classes as close to once a week as we could manage, and the easiest way to keep track of how many students we worked with was to hang onto the weekly schedules, see how many coaches worked in each period of the day, assume that each coach saw three to four students, then do the math.

We also had our yellow sheets—the duplicates of the coaches’ responses—filed neatly by class, and we could count those. If we were able to snag copies of the students’ first and second drafts, we could do something even more important. We could look at the nature of the revisions. Did students use the coaches’ suggestions to help them make their writing clearer and more complete? Did some students go beyond the suggested revisions and begin to re-see and re-think what they had written? Or were some students stuck, unable to do more than make a few superficial corrections?

For a long time, even as The Writers Room Program expanded into the middle schools and then the elementary schools, our Board of Education was content with the head counts we provided and approved our budget annually. We could also supplement our count of students coached with a count of coaches recruited and trained. And our growing connection with Montclair State University and its teacher-education program gave us additional evidence of effectiveness.

Six or seven years ago, when the first inquiries about hard data came from the Board of Education, we turned to the scores on New Jersey’s Language Arts/Literacy Assessment. Montclair has had consistently good results on these tests, given originally in grades 4, 8, and 11 and now in grades 3-8 and grade 11. But could The Writers Room Program claim sole credit for this? Obviously not.

Little by little, we’ve come to the realization that we need a true evaluation, conducted by an outside research group, with a focus on both quantitative and qualitative data. Last year at this time, I wrote a grant proposal, and The Writers Room Program received money from The Schumann Fund for New Jersey to hire ActKnowledge, an independent research and assessment group located at the Graduate Center of the City University of New York. Their study is underway, and I will be reporting on both the process and the results in this space every month. So stay tuned!