
Scalable Two-Minute Feedback: Digital, Lecture-Accompanying Survey as a Continuous Feedback Instrument


Core Concepts
A digital, lecture-accompanying survey can provide continuous, scalable feedback to improve teaching and gain insights into student workload and understanding.
Abstract
The article describes a digital, formative feedback approach called the "two-minute feedback (2MF) survey" that was used continuously throughout the term at two different educational institutions. The 2MF survey allows for the efficient collection and quick analysis of feedback on the course, providing insights into students' learning progress and workload. The key findings include:

- The 2MF survey reached around 17.5% (UAS) and 30.4% (Uni) of the respective course sizes, with a steady decrease in participation over the term.
- The feedback covered topics related to lecture content, organization, exercises, and general comments. Responses also included students' self-reflections on their work ethic.
- Statistical analysis showed correlations between perceived stress, feeling overwhelmed, and the ability to follow the lecture.
- Using a large language model (ChatGPT) to summarize the open-ended feedback proved promising, providing a quick overview of the main topics without hallucinating.

The article discusses the advantages of the digital implementation for teachers, as well as the ethical implications of collecting data on student workload and stress. It concludes that the 2MF survey can be a valuable tool for improving teaching, especially in large courses, but that further research is needed to increase participation and optimize the use of AI for summarization.
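The article does not publish the exact prompt or model configuration used for summarization. As a minimal sketch of how open-ended 2MF responses could be passed to an LLM for a topic overview, the snippet below assumes the OpenAI Python client; the model name and prompt wording are illustrative placeholders, not details from the paper.

```python
# Minimal sketch: summarizing open-ended 2MF responses with an LLM.
# Assumptions: the OpenAI Python client is installed and OPENAI_API_KEY is set;
# the model name and prompt are illustrative, not taken from the paper.
from openai import OpenAI

client = OpenAI()

def summarize_feedback(responses: list[str]) -> str:
    """Return a short thematic summary of free-text feedback."""
    joined = "\n".join(f"- {r}" for r in responses)
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "Summarize the student feedback into its main topics. "
                        "Only use information contained in the feedback."},
            {"role": "user", "content": joined},
        ],
    )
    return completion.choices[0].message.content

# Example usage with made-up responses:
# print(summarize_feedback(["Please upload slides earlier.",
#                           "The exercise sheet was too long this week."]))
```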
Stats
"I feel stressed." and "I feel overwhelmed by my studies." have a large statistically significant linear correlation (r(720) = 0.75, p < 0.001). There is a slightly negative linear correlation between "stressed"/"overwhelmed" and "could follow" (r(704) = -0.20, p < 0.001 and r(706) = -0.27, p < 0.001). The median number of exercise sheet submissions is 7 for students who provided feedback, and 4 for those who did not (U = 39.938, p < 0.001).
Quotes
"Do you actually read the feedback?" "Forty submissions from almost 900 participants [...] are not a good result [...] a submission rate of 20 % [is] pathetic..."

Deeper Inquiries

How can the participation in the 2MF survey be increased, for example through gamification or student dashboards for self-reflection?

Participation in the 2MF survey could be increased through strategies such as gamification and student dashboards for self-reflection.

Gamification incorporates game-like elements such as points, badges, leaderboards, and rewards to make the survey more engaging and enjoyable. A competitive element and incentives for participation can create a sense of accomplishment and motivate students to provide feedback regularly (a rough sketch of a points and streak calculation follows after this answer).

Student dashboards for self-reflection can complement this. Such dashboards can display personalized data on a student's responses, trends in their feedback over time, and comparisons with their peers. By visualizing their feedback and progress, students gain insights into their own learning experience and see the impact of their feedback, which can motivate consistent participation and ownership of their learning journey.

Combining gamification elements with student dashboards can therefore enhance participation in the 2MF survey, leading to more comprehensive and valuable feedback for teachers to improve their courses.
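As a rough illustration (an assumption, not a mechanism described in the article), the sketch below awards base points per submitted survey plus a bonus for consecutive weeks, the kind of score a hypothetical gamified 2MF dashboard could display.

```python
# Minimal sketch (assumption, not from the article): a participation streak
# and points score for a hypothetical 2MF gamification layer.
from dataclasses import dataclass

@dataclass
class StudentStats:
    points: int
    current_streak: int

def score_participation(weekly_participation: list[bool],
                        points_per_week: int = 10,
                        streak_bonus: int = 5) -> StudentStats:
    """Award base points per submitted survey and a bonus for consecutive weeks."""
    points, streak = 0, 0
    for participated in weekly_participation:
        if participated:
            streak += 1
            points += points_per_week + (streak - 1) * streak_bonus
        else:
            streak = 0
    return StudentStats(points=points, current_streak=streak)

# Example: a student who skipped week 3.
print(score_participation([True, True, False, True, True]))
```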

How can the ethical implications of collecting data on student workload and stress be addressed?

Collecting data on student workload and stress raises important ethical considerations that must be addressed to protect the well-being and privacy of students. Several measures can mitigate these implications:

- Informed consent: Students should be fully informed about the purpose of collecting data on their workload and stress, how the data will be used, and their rights regarding the information provided. Obtaining explicit consent before collecting sensitive data is crucial.
- Anonymity and confidentiality: Data should be aggregated and reported in a way that individual students cannot be identified; protecting student privacy is paramount (a minimal sketch of such aggregation follows after this list).
- Data security: Robust security measures, such as secure platforms and encryption, should safeguard student information from unauthorized access or breaches.
- Data use and transparency: Clearly communicating how the data will be used, who will have access to it, and how it will benefit students and the institution builds trust.
- Ethics review: An institutional review board or ethics committee should review the data collection process to ensure it adheres to ethical standards and guidelines.

By addressing these considerations and implementing appropriate safeguards, data on student workload and stress can be collected in a responsible and ethical manner.
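As one concrete illustration of the anonymity and aggregation points above (an assumption, not a procedure described in the article), the sketch below pseudonymizes student identifiers with a salted hash and suppresses any group smaller than a minimum reporting size before aggregates are shared.

```python
# Minimal sketch (assumption, not from the article): pseudonymization with a
# salted hash plus suppression of small groups before reporting aggregates.
import hashlib
from collections import Counter

SALT = "replace-with-a-secret-salt"   # stored separately from the reported data
MIN_GROUP_SIZE = 5                     # groups below this size are not reported

def pseudonymize(student_id: str) -> str:
    """Replace a student ID with an irreversible salted hash."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

def report_stress_levels(responses: list[tuple[str, int]]) -> dict[int, int]:
    """Aggregate stress ratings; drop rating levels chosen by too few students."""
    counts = Counter(rating for _student, rating in responses)
    return {rating: n for rating, n in counts.items() if n >= MIN_GROUP_SIZE}

# Example with made-up data.
print(pseudonymize("student42"))       # stable pseudonym instead of the raw ID
data = [(f"student{i}", 3) for i in range(7)] + [("student99", 5)]
print(report_stress_levels(data))      # {3: 7}; the single rating of 5 is suppressed
```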

How can the automatic summarization of open-ended feedback be further improved to provide teachers with even more valuable insights?

Several improvements could make the automatic summarization of open-ended feedback more valuable to teachers:

- Natural language processing (NLP) techniques: Advanced techniques such as sentiment analysis, topic modeling, and entity recognition can extract key information from the feedback and summarize it more accurately.
- Contextual understanding: Giving the model access to the course material, previous responses, and the specific topics discussed can lead to more relevant and insightful summaries.
- Customization and personalization: Allowing teachers to set the summarization criteria according to their priorities and areas of interest provides more targeted insights.
- Feedback clustering: Clustering algorithms can group similar responses, helping to identify common themes, issues, or suggestions across many answers and giving a holistic view of the feedback (see the sketch after this list).
- Interactive summarization: Letting teachers rate or comment on the AI-generated summaries creates a feedback loop that refines the summarization process and improves the quality of the insights.

With these enhancements, the automatic summarization of open-ended feedback can offer teachers deeper and more valuable insights for course improvement and student engagement.
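The clustering idea above could, for example, be prototyped with TF-IDF features and k-means. The snippet below is a minimal sketch under that assumption, using scikit-learn and made-up feedback strings; it is not a method evaluated in the article.

```python
# Minimal sketch (assumption, not from the article): grouping open-ended
# feedback into themes with TF-IDF features and k-means clustering.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

feedback = [
    "Please upload the slides before the lecture.",
    "The exercise sheet took much longer than two hours.",
    "Slides were hard to read from the back rows.",
    "Exercise 3 was unclear, an example would help.",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(feedback)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Print each cluster with its comments so a teacher can scan themes quickly.
for cluster_id in range(kmeans.n_clusters):
    print(f"Cluster {cluster_id}:")
    for comment, label in zip(feedback, kmeans.labels_):
        if label == cluster_id:
            print(f"  - {comment}")
```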