Networked Peer Assessment in Writing: Copyediting Skills Instruction in an ESL Technical Writing Course

Ted Knoy, San-Ju Lin, Zhi-Feng Liu, Shyan-Min Yuan

National Chiao Tung University
Hsinchu, Taiwan

Submitted to ICCE 2001 (Seoul, Korea), November 11-14, 2001

Abstract

Although the effectiveness of peer assessment in writing is well documented, the feasibility of applying networked peer assessment to writing has not been examined. In addition to outlining the procedures of networked peer assessment in an ESL (English as a Second Language) technical writing course at two universities in northern Taiwan, this study applies a networked peer assessment system known as NetPeas to improve the copyediting skills of Taiwanese graduate students. The instructor's and students' perceptions of the merits of using networked peer assessment to improve writing skills are also discussed.

Keywords: Peer assessment, networked peer assessment, ESL technical writing

Introduction

Innovative assessment methods developed in recent years include the extensive use of peer, cooperative, self and portfolio assessment (1, 8, 27, 29, 32). This study attempts to reinforce the copyediting skills taught in a graduate-level ESL technical writing course at two universities in northern Taiwan by applying a networked peer assessment system, known as NetPeas, developed at National Chiao Tung University.

Peer assessment, a natural process used from childhood onwards to critically evaluate peers, has been extensively studied in higher education in recent decades (27, 29). Higher education researchers who have explored the validity, reliability and practicalities of peer assessment have generally concurred on its acceptability (10, 11, 16, 23, 30). However, to our knowledge, the feasibility of applying networked peer assessment to writing courses has never been explored.

Our previous studies developed a networked peer assessment system known as NetPeas (5, 18). Liu adopted NetPeas for peer assessment activities in an undergraduate Operating Systems course (17). In this study, NetPeas is applied to improve the copyediting skills of Taiwanese graduate students performing peer assessment in an ESL technical writing course.

Pertinent Literature
Peer assessment
Topping (29) defined a "peer" as a student with similar educational qualifications or knowledge who grades or offers suggestions concerning another student's work. Peer assessment has been used in several higher education subjects such as writing composition, civil engineering, the sciences, electrical engineering, information science, the arts, and the social sciences. Many researchers have found that peer assessment, as a formative assessment method and as part of the learning process, can enable students to become more involved in learning and in the assessment process; students also tend to view peer assessment as fair and accurate (27). Falchikov (9) and Freeman (11) indicated a fairly high level of agreement between the marks given by peers and those given by the teacher. Cheng and Warren (4) confirmed that evaluating a particular task could improve students' assessment skills when assessing a similar task. Despite its merits, peer assessment can be compromised by friendship marking and decibel marking (24, 28).

Peer assessment of writing and ESL (English as a Second Language) writing instruction
Extensively studied for nearly three decades, peer assessment of writing has been applied to diverse curricula such as composition, technical and business writing, psychology, social science, engineering, geography and computing. In his thorough review of peer assessment research, Topping (30) concluded that peer assessment of writing appears to yield outcomes at least as effective as, and occasionally better than, teacher assessment. Richer (26) compared the effects of peer group discussion of essays with teacher discussion and feedback. Holistic scoring of 174 pre- and post-test essays from 87 university freshmen indicated that the writing proficiency of the peer feedback group was significantly higher than that of the teacher feedback group (p = 0.009). Comparing teacher, peer and self-assessment of the written reports of pharmacology practicals, Hughes (13) found the three equally effective.

Of particular relevance to this study is peer assessment in ESL (English as a Second Language) writing instruction, since the participants in this study were non-native English speakers. Related research has cited peer assessment in this area of writing as having the following merits: a) bringing a genuine sense of audience into the writing classroom (14, 22), b) facilitating the development of students' critical reading and analysis skills (3, 14), and c) encouraging students to focus on their intended meaning by discussing alternative points of view that can lead to the development of those ideas (7, 19, 21). In a writing class, Graner (12) compared the peer assessment effects on two groups, one that mutually exchanged feedback and another that gave feedback without receiving peer feedback. After receiving initial peer assessment, subjects in both groups handed in revised essays. Teacher assessment revealed that both groups improved significantly over two rounds of peer assessment, indicating that providing feedback without receiving it still enhanced writing performance. Related studies concurred that, through peer assessment, students can occasionally be more attuned than teachers to the fact that a peer's work is truly in progress, whereas teachers tend to judge the work as a finished product (2, 6). Mendonca and Johnson (21) found, through interviews, that all students in their study considered peer review helpful with regard to audience perspective and idea development. As to whether students use peer feedback in their revisions, the same study found that 53% of the revisions made in students' essays resulted from peer comments being incorporated into the essays. Stanley (28) similarly found that when students were trained in how to respond effectively to peer comments, the number of revisions they made increased. Mangelsdorf and Schlumberger (20) found that most students adopted a "prescriptive" rather than "collaborative" stance when responding to their peers, making it necessary for teachers to train students in successful peer review techniques and to provide opportunities for effective peer interaction.

Despite the considerable attention paid to peer assessment of writing, to our knowledge this study is the first to describe networked peer assessment in this area. As mentioned in other related studies (Liu et al., in press), networked peer assessment has several advantages over traditional peer assessment, such as a higher assurance of anonymity, increased freedom of time and location for learners, and the ability of students to modify their work more efficiently. These advantages will hopefully contribute to the effectiveness of peer assessment of writing.

Research Design
Sample

A total of 101 graduate-level students in engineering and science from two technical writing courses of the Communication Engineering Department and the Information Science and Engineering Department at two research-oriented universities in northern Taiwan participated in four rounds of peer assessment using a networked system known as NetPeas. The students also met individually with the instructor to correct the assignments and discuss the challenges of using this novel system. The same students completed 52 copyediting exercises in class over a fourteen-week period (3 exercises weekly). While the classroom exercises aimed to orient students toward basic copyediting skills, the four rounds of networked peer assessment provided students with a more realistic environment in which to submit their assignments, anonymously review other classmates' assignments, and modify their own assignments based on peer comments.
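This paper does not describe how NetPeas allocates submissions to anonymous reviewers, so the following is only a minimal illustrative sketch of how such a review round could be organized, not a description of the actual system: submissions are shuffled into a ring and each student is assigned a fixed number of other students' papers, with author identities withheld. All function and parameter names below are hypothetical.

    import random

    def assign_reviews(student_ids, reviews_per_paper=2, seed=0):
        """Return a {reviewer: [authors to review]} plan in which every paper
        receives the same number of reviews and no student reviews their own
        work (assumes reviews_per_paper < number of students)."""
        ring = list(student_ids)
        random.Random(seed).shuffle(ring)   # hide any ordering that could hint at identities
        n = len(ring)
        plan = {student: [] for student in ring}
        for offset in range(1, reviews_per_paper + 1):
            for i, author in enumerate(ring):
                # a fixed nonzero offset on the ring never maps a student to themselves
                reviewer = ring[(i + offset) % n]
                plan[reviewer].append(author)
        return plan

    if __name__ == "__main__":
        demo = assign_reviews(["S%02d" % k for k in range(1, 7)], reviews_per_paper=2)
        for reviewer, authors in sorted(demo.items()):
            print(reviewer, "reviews papers by", authors)

In a scheme of this kind, reviewers would see only anonymous paper identifiers rather than author names, which parallels the higher assurance of anonymity reported as a merit of NetPeas in the Results section.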

Procedures and criteria
The students attended a weekly technical writing course and submitted eight homework assignments through four rounds of networked peer assessment (Fig. 1), with each round lasting three weeks. The assignments focused on instructing students in how to organize and write a research paper for publication in international journals. In particular, each student had to submit the following assignments: (1) research title, engineering/scientific objective, engineering/scientific motivation, and personal motivation; (2) outline of the technical argument; (3) technical argument; (4) description of the engineering/scientific need for their research, problem statement and hypothesis statement; (5) an abstract; (6) an introduction; (7) a conclusion; and (8) twelve technical correspondence letters.

Figure 1: Demonstration of submitting homework to NetPeas (adapted from (17))

Students were also taught copyediting skills through 52 copyediting exercises in class (with each exercise containing roughly 10 to 12 sentences for revision) over a fourteen-week period (3 exercises weekly). The copyediting exercises focused on helping students identify and correct writing style errors related to
a) Conciseness, i.e., using the active voice frequently, using verbs instead of nouns, creating strong verbs, avoiding sentences that begin with It and There, and deleting redundant and needless phrases; and
b) Clarity, i.e., ensuring subject and verb agreement, ensuring that pronoun references are clear in meaning, creating sentences parallel in structure and meaning, eliminating modifier problems, double-checking for faulty comparisons and omissions, and avoiding unnecessary shifts within a sentence.

Before the first round of peer assessment, students were instructed in how to use the NetPeas system to simulate the actual process of submitting research findings to an international journal and to refine the copyediting skills taught in the class exercises by anonymously reviewing homework assignments for writing style errors. Additionally, the students and the instructor discussed the following criteria for grading and correcting peers' homework assignments:
a) Structure: Each class assignment had to follow the structure outlined in the course syllabus. For instance, a student could not submit an abstract that was not structured according to the syllabus requirements.
b) Style: Each class assignment had to be written in clear and concise English, omitting any grammatical, general writing style, or Chinese colloquial errors.
c) Usability: Each class assignment not only had to have a particular audience in mind, but also had to attempt to link the technical information with the particular reader's interest(s).

Results
In-depth interviews with students and attitudinal surveys revealed that using a networked system to facilitate peer assessment has the following merits over traditional peer assessment:
1. Higher assurance of anonymity.
2. Increased freedom of time and location for the instructor and student.
3. Cross-platform tools for hypertext access.
4. Increased student-student interaction and feedback.

In another effort to employ networked peer assessment, Kwok and Ma utilized a Group Support System (GSS) so that teachers and students could participate in the assessment process collaboratively (15). Students were encouraged to formulate evaluation criteria and weightings together with the teacher. In addition to collaborative assessment, students were allowed to assess their peers as well as themselves. Students in the GSS-supported collaborative assessment achieved better grades than those in face-to-face collaborative assessment. Similarly, Rada supervised three small classes using the Many Using and Creating Hypermedia (MUCH) system to facilitate peer assessment, in which computer science students attempted to solve exercise problems and submitted their solutions for peer review (25). According to those results, students learned effectively only if they were highly motivated and the grading policy made peer assessment mandatory.

In this study, despite their initial unfamiliarity with the NetPeas system, a majority of the students agreed that networked peer assessment provides a more effective means of learning copyediting skills than classroom exercises alone. They attributed this difference to the fact that networked peer assessment offers a more realistic environment in which to refine their skills, as opposed to a traditional paper-and-pencil examination, which could be passed by memorization without understanding the context of the textbook.

Conclusions
This study has examined the feasibility of applying networked peer assessment to two graduate-level ESL technical writing courses at two universities in northern Taiwan. Applying the same method to other subjects would not necessarily yield the same results, particularly for undergraduate students lacking the necessary academic skills and motivation. Results for native English-speaking graduate students may also differ from those in this study. Nevertheless, this study provides a valuable reference for instructors seeking innovative ways of incorporating the computer into the ESL writing classroom.

Acknowledgments
The authors would like to thank the National Science Council of the Republic of China for financially supporting this research under Contract Nos. NSC 89-2520-S-009-013 and NSC 89-2520-S-009-016.

References

  1. Boud, D. (1990) Assessment and the promotion of academic values. Studies in Higher Education 15, 101-113.
  2. Caulk, N. (1994). Comparing teacher and student responses to written work. TESOL Quarterly, 28, 181-188.
  3. Chaudron, C. (1984). The effects of feedback on students' composition revisions. RELC Journal 15(2), 1-14.
  4. Cheng, W. and Warren, M. (1999). Peer and teacher assessment of the oral and written tasks of a group project. Assessment & Evaluation in Higher Education, 24(3), 301-314.
  5. Chiu, C.H., Wang, W.R., and Yuan, S.M. (1998) Web-based Collaborative Homework Review System. Proceedings of ICCE'98, 2, 474-477.
  6. Devenney, R. (1989). How ESL teachers and peers evaluate and respond to student writing. RELC Journal 20, 77-90.
  7. DiPardo, A., & Freedman, S.W. (1988). Peer response groups in the writing classroom: theoretic foundations and new directions. Review of Educational Research, 58, 119-149.
  8. Dochy, F.J.R.C. and McDowell, L. (1997) Assessment as a tool for learning, Studies in Educational Evaluation 23, 279-298.
  9. Falchikov, N. (1993) Group process analysis: self and peer assessment of working together in a group, Educational and Training Technology International, 30, 275-284.
  10. Falchikov, N. and Magin, D. (1997) Detecting gender bias in peer marking of students' group process work. Assessment and Evaluation in Higher Education 22(4),385-396.
  11. Freeman, M. (1995) Peer assessment by groups of group work. Assessment and Evaluation in Higher Education. 20(3), 289-300.
  12. Graner, M.H. (1985). Revision techniques: peer editing and the revision workshop. Dissertation Abstracts International, 47, 109.
  13. Hughes, I.E. (1995) Peer assessment, Capability, 1(3), pp. 39-43.
  14. Keh, C.L. (1990) Feedback in the writing process: a model and methods for implementation. ELT Journal 44, 294-304.
  15. Kwok, R.C.W., & Ma, J. (1999). Use of a group support system for collaborative assessment. Computers and Education, 32(2), 109-125.
  16. Lin, S.S.J. and Liu, E.Z.F. (2000) "The Learning Effects of Web-based Peer Assessment for Various Thinking Styles Students," Paper presented at the annual meeting of the American Educational Research Association, New Orleans, Louisiana.
  17. Liu, E.Z.F. (1999) Networked peer assessment system: an analysis of student segments. Master's thesis, Department of Computer and Information Science, National Chiao Tung University.
  18. Liu, E.Z.F., Chiu, C.H., Lin, S.S.J., and Yuan, S.M. (1999) Student participation in computer science courses via the Networked Peer Assessment System (NetPeas). Proceedings of ICCE '99, 1, 774-777.
  19. Mangelsdorf, K. (1992) Peer reviews in the ESL composition classroom: what do the students think? ELT Journal 46, 274-284.
  20. Mangelsdorf, K., & Schlumberger, A. (1992) ESL student response stances in a peer review task. Journal of Second Language Writing, 1, 235-254.
  21. Mendonca, C.O., & Johnson, K.E. (1994). Peer review negotiations: revision activities in ESL writing instruction. TESOL Quarterly 28, 745-769.
  22. Mittan, R. (1989). The peer review process: harnessing students' communicative power. In D. Johnson &D. Roen (Eds.), Richness in Writing: Empowering ESL Students (pp. 207-219). New York: Longman.
  23. Orsmond, P., Merry, S., and Reiling, K. (1996) The importance of marking criteria in the use of peer assessment. Assessment and Evaluation in Higher Education, 21, 239-249.
  24. Paulus, T.M. (1999) The effect of peer and teacher feedback on student writing, Journal of Second Language Writing, 8(3), pp. 265-289.
  25. Rada, R. (1998). Efficiency and effectiveness in computer-supported peer-peer learning. Computers and Education, 30(3/4), 137-146.
  26. Richer, D.L. (1992) The effects of two feedback systems on first-year college students' writing proficiency. Dissertation Abstracts International, 53, 2722.
  27. Sluijsmans, D., Dochy, F., and Moerkerke, G. (1999) Creating a learning environment by using self-, peer- and co-assessment. Learning Environments Research, 1(3), 293-319.
  28. Stanley, J. (1992). Coaching student writers to be effective peer evaluators. Journal of Second Language Writing, 1, 217-233.
  29. Topping, K.J. (1998) Peer assessment between students in colleges and universities. Review of Educational Research, 68(3), 249-276.
  30. Topping, K.J., Smith, E.F., Swanson, I., Elliot, A. (2000) Formative Peer Assessment of Academic Writing Between Postgraduate Students. Assessment & Evaluation in Higher Education, 25(2), p. 149.
  31. Williams, E. (1992) Student attitudes towards approaches to learning and assessment. Assessment and Evaluation in Higher Education, 17(1), 45-58.
  32. Wilcox, B. and Tomei, L. (1999) Professional Portfolios for Teachers: A guide for learners, experts, and scholars. Christopher-Gordon Publishers.

 
