M-powering teachers: natural language processing powered feedback improves 1:1 instruction and student outcomes


Conference paper


Dorottya Demszky, Jing Liu
ACM Conference on Learning, ACM, Copenhagen, Denmark, Jul 2023, pp. 59–69


Cite

APA
Demszky, D., & Liu, J. (2023). M-powering teachers: natural language processing powered feedback improves 1:1 instruction and student outcomes. In ACM Conference on Learning (pp. 59–69). Copenhagen, Denmark: ACM. https://doi.org/10.26300/s8xh-zp45


Chicago/Turabian
Demszky, Dorottya, and Jing Liu. “M-Powering Teachers: Natural Language Processing Powered Feedback Improves 1:1 Instruction and Student Outcomes.” In ACM Conference on Learning, 59–69. Copenhagen, Denmark: ACM, 2023.


MLA
Demszky, Dorottya, and Jing Liu. “M-Powering Teachers: Natural Language Processing Powered Feedback Improves 1:1 Instruction and Student Outcomes.” ACM Conference on Learning, ACM, 2023, pp. 59–69, doi:10.26300/s8xh-zp45.


BibTeX

@inproceedings{demszky2023a,
  title = {M-powering teachers: natural language processing powered feedback improves 1:1 instruction and student outcomes},
  author = {Demszky, Dorottya and Liu, Jing},
  booktitle = {ACM Conference on Learning},
  year = {2023},
  month = jul,
  pages = {59--69},
  publisher = {ACM},
  address = {Copenhagen, Denmark},
  doi = {10.26300/s8xh-zp45}
}

Abstract

Although learners are being connected 1:1 with instructors at an increasing scale, most of these instructors do not receive effective, consistent feedback to help them improve. We deployed M-Powering Teachers, an automated tool based on natural language processing that gives instructors feedback on dialogic instructional practices, including their uptake of student contributions, talk time, and questioning practices, in a 1:1 online learning context. We conducted a randomized controlled trial on Polygence, a research mentorship platform for high schoolers (n = 414 mentors), to evaluate the effectiveness of the feedback tool. We find that the intervention improved mentors' uptake of student contributions by 10%, reduced their talk time by 5%, and improved students' experience with the program as well as their relative optimism about their academic future. These results corroborate existing evidence that scalable, low-cost automated feedback can improve instruction and learning in online educational contexts.
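As an illustration only, and not the authors' implementation, the sketch below shows the kind of surface-level metric such a feedback tool can report: the mentor's share of words spoken in a session transcript. The transcript format and function name are hypothetical; the paper's uptake and questioning measures rely on NLP models that are not reproduced here.

    # Toy talk-time calculation (illustrative; not the M-Powering Teachers pipeline).
    def talk_time_share(transcript):
        """Return the fraction of words spoken by the mentor.

        `transcript` is a list of (speaker, utterance) pairs, where speaker
        is either "mentor" or "student".
        """
        mentor_words = sum(len(u.split()) for s, u in transcript if s == "mentor")
        total_words = sum(len(u.split()) for _, u in transcript)
        return mentor_words / total_words if total_words else 0.0

    if __name__ == "__main__":
        session = [
            ("mentor", "What did you find when you ran the survey?"),
            ("student", "Most respondents preferred the second design."),
            ("mentor", "Interesting, so what do you think explains that preference?"),
        ]
        print(f"Mentor talk-time share: {talk_time_share(session):.0%}")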
