Learning From Peer Feedback and Peer Assessment

Learning From Peer Assessment

Peer assessment (ratings) and peer feedback (comments) are increasingly used as a pedagogical strategy because of the learning they can produce and because it is often efficient for instructors. From a research perspective, it is wonderfully complex in terms of what cognitive processes it invokes and all the ways it might influence learners (e.g., changing the audience, seeing models of varying quality, reflecting on qualities of good writing, being persuaded by multiple reviewers, learning to be a feedback provider). Web-based peer assessment/feedback can also be a supercollider for research.

0. A start

Peer assessment/feedback can transform all of your teaching: make every course, regardless of level, size, or supporting resources, feel like a graduate seminar with highly motivated students; eliminate inequities; raise performance; capture the hearts of your students. How can that work? Because peer feedback/assessment enables you to totally change what you ask students to do and how you assess them. Check out this short video.


But start simple: one assignment, one round of peer review, medium complexity.


Talk to your students about why you are doing this: to improve their experience, not to save your time; mention that this approach has a ton of supporting research (see below for studies to cite, including many meta-analyses).


Show your students a short online tutorial on how to give good feedback.

1. Is peer assessment reliable and valid?

With clear, student-friendly rubrics, peer assessment is generally valid and reliable at all grade levels, in all disciplines

Li, H., Xiong, Y., Zang, X., Kornhaber, M. L., Lyu, Y., Chung, K. S., & Suen, H. K. (2016). Peer assessment in the digital age: A meta-analysis comparing peer and teacher ratings. Assessment & Evaluation in Higher Education, 41(2), 245-264. pdf

Xiong, Y., Schunn, C. D., & Wu, Y. (in press). What predicts variation in reliability and validity of online peer assessment? A large-scale cross-context study. Journal of Computer Assisted Learning. 10.1111/jcal.12861 pdf


Simple training on the rubrics can be helpful

Li, H., Xiong, Y., Zang, X., Kornhaber, M. L., Lyu, Y., Chung, K. S., & Suen, H. K. (2016). Peer assessment in the digital age: A meta-analysis comparing peer and teacher ratings. Assessment & Evaluation in Higher Education, 41(2), 245-264. pdf


Requiring students to give helpful comments can improve rating reliability

Patchan, M. M., Schunn, C. D., & Clark, R. J. (2018). Accountability in peer assessment: Examining the effects of reviewing grades on peer ratings and peer feedback. Studies in Higher Education, 43(12), 2263-2278. 10.1080/03075079.2017.1320374 pdf


Peers rarely give erroneous comments, probably because they tend to focus their comments on the elements they are most confident about

Wu, Y. & Schunn, C. D. (2020). When peers agree, do students listen? The central role of feedback quality and feedback frequency in determining uptake of feedback. Contemporary Educational Psychology, 62, 101897. 10.1016/j.cedpsych.2020.101897 pdf


See this explanation for how Peerceptiv supports accurate ratings.

2. Do students learn from peer feedback?

Multiple recent meta-analyses have found that peer feedback improves learning: positive effects on various learning assessments, writing quality, second-language writing, and student attitudes; positive effects in elementary, secondary, and tertiary contexts and in all examined disciplines; stronger effects from providing than from receiving peer feedback; stronger effects when supported by a web-based system; and positive effects similar to those found for self-assessment.

Double, K. S., McGrane, J. A., & Hopfenbeck, T. N. (2020). The impact of peer assessment on academic performance: A meta-analysis of control group studies. Educational Psychology Review, 32, 481-509. html

Li, H., Xiong, Y., Hunter, C. V., Guo, X., & Tywoniw, R. (2020). Does peer assessment promote student learning? A meta-analysis. Assessment & Evaluation in Higher Education, 45(2), 193-211. pdf

Zheng, L., Zhang, X., & Cui, P. (2020). The role of technology-facilitated peer assessment and supporting strategies: A meta-analysis. Assessment & Evaluation in Higher Education, 45(3), 372-386. pdf

Thirakunkovit, S., & Chamcharatsri, B. (2019). A meta-analysis of effectiveness of teacher and peer feedback: Implications for writing instructions and research. Asian EFL Journal, 21(1), 140-170. pdf

Huisman, B., Saab, N., van den Broek, P., & van Driel, J. (2019). The impact of formative peer feedback on higher education students’ academic writing: a Meta-Analysis. Assessment & Evaluation in Higher Education, 44(6), 863-880. html

Vuogan, A., & Li, S. (2022). Examining the effectiveness of peer feedback in second language writing: A meta-analysis. TESOL Quarterly. pdf

Li, H., Bialo, J. A., Xiong, Y., Hunter, C. V., & Guo, X. (2021). The effect of peer assessment on non-cognitive outcomes: A meta-analysis. Applied Measurement in Education, 34(3), 179-203. pdf

Van Popta, E., Kral, M., Camp, G., Martens, R. L., & Simons, P. R. J. (2017). Exploring the value of peer feedback in online learning for the provider. Educational Research Review, 20, 24-34. html

Yan, Z., Lao, H., Panadero, E., Fernández-Castilla, B., Yang, L., & Yang, M. (2022). Effects of self-assessment and peer-assessment interventions on academic performance: A pairwise and network meta-analysis. Educational Research Review, 100484. html

Sanchez, C. E., Atkinson, K. M., Koenka, A. C., Moshontz, H., & Cooper, H. (2017). Self-grading and peer-grading for formative and summative assessments in 3rd through 12th grade classrooms: A meta-analysis. Journal of Educational Psychology, 109(8), 1049. html


Students learn more from peer feedback when they provide more and longer comments, include explanations in their comments, and use the feedback they receive to revise their documents

Wu, Y. & Schunn, C. D. (2021). The effects of providing and receiving peer feedback on writing performance and learning of secondary school students. American Educational Research Journal, 58(3), 492-526. 10.3102/0002831220945266 pdf

Zong, Z., Schunn, C. D., & Wang, Y. (2021). What aspects of online peer feedback robustly predict growth in students’ task performance? Computers in Human Behavior, 124, 106924. 10.1016/j.chb.2021.106924 pdf

Yu, Q. & Schunn, C. D. (2023). Understanding the what and when of peer feedback benefits for performance and transfer. Computers in Human Behavior, 147, 107857. 10.1016/j.chb.2023.107857 pdf

Wu, Y. & Schunn, C. D. (2023). Passive, active, and constructive engagement with peer feedback: A revised model of learning from peer feedback. Contemporary Educational Psychology, 73, 102160. 10.1016/j.cedpsych.2023.102160 pdf

3. How can I support students to give higher quality feedback?

Hold them accountable for providing helpful feedback (part of many online peer feedback systems, including Peerceptiv)

Patchan, M. M., Schunn, C. D., & Clark, R. J. (2018). Accountability in peer assessment: Examining the effects of reviewing grades on peer ratings and peer feedback. Studies in Higher Education, 43(12), 2263-2278. 10.1080/03075079.2017.1320374 pdf


Give some brief training on how to give good feedback. It might just involve showing students this tutorial. For more in-depth approaches, see this paper:

Cui, Y., Schunn, C. D., Gai, X., Jiang, Y., & Wang, Z. (2021). Effects of trained peer vs. teacher feedback on EFL students’ writing performance, self-efficacy, and internalization of motivation. Frontiers in Psychology, 12, 6659 10.3389/fpsyg.2021.788474 html


Students who initially give lower-quality feedback or give too much feedback will naturally norm-set across assignments

Zong, Z., Schunn, C. D., & Wang, Y. (2022). Do experiences of interactional inequality predict lower depth of future student participation in peer review? Computers in Human Behavior, 127, 107056. 10.1016/j.chb.2021.107056 pdf

4. Can lower performing students give accurate ratings and useful feedback?

Lower performing students are not more likely to be lenient; in fact, they are slightly more likely to be too harsh in their ratings

Xiong, Y. & Schunn, C. D. (2021). Reviewer, Essay, and Reviewing Process Characteristics that Predict Errors in Web-based Peer Review. Computers & Education, 166, 104146. 10.1016/j.compedu.2021.104146 pdf


Lower performing students are just as able to detect problems in documents

Wu, Y. & Schunn, C. D. (2023). Assessor writing performance on peer feedback: Exploring the relation between assessor writing performance, problem identification accuracy, and helpfulness of peer feedback. Journal of Educational Psychology, 115(1), 118–142. 10.1037/edu0000768 pdf


Lower performing students give equally helpful and useful comments, except in the case of very specific problems they also have in their own documents

Wu, Y. & Schunn, C. D. (2023). Assessor writing performance on peer feedback: Exploring the relation between assessor writing performance, problem identification accuracy, and helpfulness of peer feedback. Journal of Educational Psychology, 115(1), 118–142. 10.1037/edu0000768 pdf

5. What makes for a good peer feedback task?

Rubrics that are student friendly and focus directly on what you are hoping students will learn in your course

Schunn, C. D., Godley, A. J., & DeMartino, S. (2016). The reliability and validity of peer review of writing in high school AP English classes. Journal of Adolescent & Adult Literacy, 60(1), 13–23. pdf


Rubrics that are aligned to the performance range of the students in the class (i.e., there is real variation on the scale)

Xiong, Y., Schunn, C. D., & Wu, Y. (in press). What predicts variation in reliability and validity of online peer assessment? A large-scale cross-context study. Journal of Computer Assisted Learning. 10.1111/jcal.12861 pdf


Comment prompts that provide clear guidance for what to look for / comment upon

Mu & Schunn, under review


Comment prompts that ask students to clearly define a problem, explain why it is a problem, and give suggestions for how to improve it (rather than briefly listing many different problems)

Jing & Schunn, under review

6. Any other pragmatic advice?

Don’t use the basic peer feedback tool built into an LMS. These tools lack fundamental features that ensure higher quality feedback. There are now many more advanced systems that integrate directly with the most commonly used Learning Management Systems (Canvas, D2L/Brightspace, Moodle, Sakai, Blackboard)


Consider assigning group projects. These allow for more interesting assignments (more interesting to produce and more interesting to read). Make sure the project has distinct parts so that no single student can take over the whole project; use extensive rubrics so that each part of the project gets attention. Consider using team evaluation (available in Peerceptiv and CATME) to hold students accountable for equal contributions. Have students do the reviewing individually (this can be a lighter load)


Look at rating disagreement cases and/or low-reliability rubric dimensions (information available in Peerceptiv and some other online systems) to learn what to discuss in class and how to improve your assignment instructions/rubrics over time


Give at least 5 days to complete reviewing and at least 3 days to complete feedback on reviews


Allow students to complete extra reviews for bonus points (more reviews = more learning)


Use benchmark grading rather than curved grading (available in Peerceptiv and some other online systems): not a lot of work, gives insight into student performance, discourages dysfunctional competition among students, and is a more equitable teaching practice

Peerceptiv

Peerceptiv is a web-based reciprocal peer assessment system that emphasizes 1) carefully structured end-comments and rubric ratings, 2) anonymity to encourage honesty, and 3) grade accountability algorithms for reviewing accuracy and helpfulness (see video). Iteratively studied and improved since 2002, it has been used by hundreds of thousands of students across disciplines and educational levels around the world. It was previously researched under the name SWoRD, and many different research teams continue to study the data it produces.