On the Effect of Explainable AI on Programming Learning: A Case Study of using Gradient Integration Technology
Feng Hsu Wang, Ming Chuan University, Taiwan
Abstract
AI-based learning technologies, especially deep learning, hold significant promise for enhancing students' learning experiences in educational systems. However, providing accurate predictions or answers to students' learning problems through high-performance deep learning models is not sufficient for students to achieve effective learning. This study explores the use of Explainable Artificial Intelligence (XAI) to reduce students' cognitive load and improve learning outcomes in object-oriented programming education. Specifically, this study examines the application of Gradient Integration to generate coloured code segments associated with code errors predicted by a Performer-based deep learning classification model for debugging tasks. Thirty-six participants took part in a controlled experiment assessing students' cognitive load and learning performance through the XAI system. They were randomly assigned to a control group (N=18) and an experiment group (N=18). The independent-samples Wilcoxon-Mann-Whitney test results revealed that the coloured code segments significantly reduced students' cognitive load (p=0.006) and improved their exam scores (p=0.006). This study contributes an application of XAI techniques that can reduce students' cognitive load and improve learning outcomes in educational settings.
Keywords
Explainable Artificial Intelligence, Deep Learning Technology, Human-Computer Collaborative Learning, Programming Education
Full Text : https://aircconline.com/csit/papers/v...
Abstract URL : https://aircconline.com/csit/abstract...
Volume URL : https://airccse.org/csit/V14N12.html
#explainableai #deeplearning #programmingeducation #artificialintelligence #machinelearning
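The "Gradient Integration" technique named in the abstract is widely known as Integrated Gradients: it attributes a model's prediction to individual input features by integrating the model's gradient along a straight path from a baseline input to the actual input, and those per-feature attributions are what can drive the coloured code segments. Below is a minimal NumPy sketch of the idea using a toy differentiable classifier with an analytic gradient; the model, weights, and feature vectors are illustrative assumptions, not the paper's Performer model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def model(x, w):
    # Toy "error classifier": probability from a linear score.
    return sigmoid(w @ x)

def grad_model(x, w):
    # Analytic gradient of sigmoid(w @ x) with respect to x.
    p = model(x, w)
    return p * (1.0 - p) * w

def integrated_gradients(x, baseline, w, steps=100):
    # Midpoint Riemann-sum approximation of the path integral
    # of the gradient from the baseline to the input.
    alphas = (np.arange(steps) + 0.5) / steps
    total = np.zeros_like(x)
    for a in alphas:
        total += grad_model(baseline + a * (x - baseline), w)
    return (x - baseline) * total / steps

# Hypothetical per-token features for a code snippet (illustrative values).
w = np.array([0.5, -1.2, 2.0, 0.1])
x = np.array([1.0, 0.3, 0.8, -0.5])
baseline = np.zeros_like(x)

attr = integrated_gradients(x, baseline, w)
# Completeness axiom: attributions should sum to f(x) - f(baseline).
print(attr, attr.sum(), model(x, w) - model(baseline, w))
```

Tokens with large positive attributions are those the model's predicted error class depends on most, which is what a colouring scheme over code segments would highlight.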
Published on 1403/04/17 (Solar Hijri calendar).