This study explored how instructors teaching in the same EFL program rated the content of student compositions. Twenty-one writing samples on three different topics were rated by five teacher-raters. Transcripts of the teacher-raters' comments showed that 1) their rating behaviors varied considerably in how they commented on and viewed the content of the compositions, 2) a composition's failure to address the assigned topic negatively affected the assessment of its overall quality, and 3) the teacher-raters made fewer comments on the content of formal essays than on the content of e-mails. Suggestions are offered concerning the need to establish written rating criteria for students' essays and to provide rater training.