Leveraging AI to Support Design Education: Insights from a Cross-Disciplinary Study on Assessment and Feedback Practices


Core Concepts
AI-based design analytics and dashboards can complement human instructors' efforts in providing transparent, on-demand assessment and feedback to support students' learning in design education.
Abstract
The researchers conducted a qualitative study with 11 design instructors across 5 fields (architecture, interactive art & design, mechanical engineering, landscape architecture, and computer science) to understand their assessment and feedback practices in design education.

Key findings:
- Instructors use rubrics of criteria to assess student design work, but these rubrics function on the "family resemblance principle": no single criterion is necessary or sufficient; rather, a combination of characteristics tends to indicate good design.
- Assessing individual contributions in team-based design projects is challenging for instructors.
- Instructors provide feedback through various modalities, such as redlining, verbal comments, and markup, but often find that students fail to consistently incorporate it.

The researchers propose a "situating analytics" methodology for developing AI-based design creativity analytics that align with instructors' situated practices and assessment criteria. They suggest integrating these analytics into design learning environments through interactive dashboards that:
- Make the meaning and derivation of the analytics transparent to instructors and students
- Enable instructors and students to validate the analytics and provide feedback to refine them
- Support instructors in providing timely, learning-objectives-based assessment and feedback

The researchers develop a set of situated design creativity analytics, including Fluency, Flexibility, Visual Consistency, Multiscale Organization, and Legible Contrast, which correspond to instructors' assessment criteria across the studied design education contexts.
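Two of these analytics, Fluency and Flexibility, descend from classic divergent-thinking measures: fluency counts the ideas produced, while flexibility counts the distinct categories those ideas span. A minimal sketch of how such counts could be derived from tagged design-work elements (the data shape and category tags are illustrative assumptions, not the paper's implementation):

```python
def fluency(elements):
    """Fluency: the number of design elements (ideas) produced."""
    return len(elements)

def flexibility(elements):
    """Flexibility: the number of distinct categories the elements span."""
    return len({e["category"] for e in elements})

# Hypothetical design-board elements, each tagged with a category.
board = [
    {"id": 1, "category": "sketch"},
    {"id": 2, "category": "sketch"},
    {"id": 3, "category": "photo"},
    {"id": 4, "category": "annotation"},
]

print(fluency(board))      # 4 ideas produced
print(flexibility(board))  # 3 distinct categories
```

A dashboard built on such analytics could then show students not just a score but the counts behind it, supporting the transparency goal described above.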
Stats
"The goal is quantity." "You only got 2 drawings at the [larger] scale where you got 35 at the [smaller] scale, [hence] you need to do more analysis of the larger scale." "[The computer could tell] where high contrast is vs. where there is less contrast. [This matters] because contrast takes human visual attention."
Quotes
"There's a kind of disconnect between [students] turning in a [design] and they getting a number [back]...Why is it a 'B'?...[We] need to have a better tool communicating [the assessment] to the students." "Yeah that would be cool I think if we could develop some metrics to build into a consistent rubric. It will spit out the rubric scores and then the professor can say, well that's right or wrong."

Deeper Inquiries

How can the "situating analytics" methodology be extended to other educational domains beyond design?

The "situating analytics" methodology can be extended to other educational domains by adapting the principles of aligning analytics with situated practices to fit the specific needs and contexts of those domains. This involves understanding the unique assessment and feedback challenges within each educational field and developing AI-based analytics that support the learning objectives and practices of instructors and students. By engaging stakeholders in the co-design process and continuously refining the analytics based on feedback, the methodology can be applied to various educational settings to enhance teaching and learning outcomes.

What are the potential ethical considerations and risks in deploying AI-based assessment and feedback systems in educational settings?

Deploying AI-based assessment and feedback systems in educational settings raises several ethical considerations and risks. One major concern is the potential for bias in the algorithms used for assessment, which could lead to unfair evaluations and reinforce existing inequalities. There is also the risk of data privacy and security breaches, as these systems often collect and analyze sensitive student information. Additionally, there may be concerns about the transparency and accountability of AI systems, as students and instructors may not fully understand how decisions are made. It is crucial to address these ethical considerations through transparent design, regular audits, and clear guidelines for data usage and protection.

How might the integration of AI-based analytics and dashboards impact the social dynamics and power structures within design education classrooms?

The integration of AI-based analytics and dashboards in design education classrooms could reshape social dynamics and power structures in several ways. First, it may shift instructors from sole authority figures toward facilitators of learning, since students can receive immediate feedback and guidance from AI systems; this could empower students to take more ownership of their learning process. However, reliance on AI systems risks marginalizing students who lack access to, or comfort with, the technology. The use of analytics and dashboards could also create new hierarchies based on data literacy and technological proficiency, potentially reinforcing existing power dynamics within the classroom. It is essential to consider these implications and ensure that the integration of AI technologies promotes inclusivity and equitable learning opportunities for all students.