Core Concepts
AI-based design analytics and dashboards can complement human instructors' efforts in providing transparent, on-demand assessment and feedback to support students' learning in design education.
Abstract
The researchers conducted a qualitative study involving 11 design instructors across 5 fields (architecture, interactive art & design, mechanical engineering, landscape architecture, and computer science) to understand their practices of assessment and feedback in design education.
Key findings:
Instructors assess student design work against rubrics of criteria, but these rubrics operate on a "family resemblance" principle: no single criterion is necessary or sufficient; rather, a combination of characteristics tends to indicate good design.
Assessing individual contributions in team-based design projects is challenging for instructors.
Instructors provide feedback through various modalities, such as redlining, verbal comments, and markup, but often find that students do not consistently incorporate it.
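The family-resemblance idea above can be sketched in code: a design is judged good when enough criteria are satisfied in combination, so any single criterion can be missed. The criterion names and the 0.6 threshold are illustrative assumptions, not values from the study.

```python
# Hypothetical "family resemblance" scoring: no single criterion is
# necessary or sufficient; a design passes when a large enough
# fraction of criteria are satisfied together.

CRITERIA = ["fluency", "flexibility", "visual_consistency",
            "multiscale_organization", "legible_contrast"]

def family_resemblance_score(satisfied, threshold=0.6):
    """Return True when the fraction of satisfied criteria meets the threshold."""
    met = sum(1 for c in CRITERIA if satisfied.get(c, False))
    return met / len(CRITERIA) >= threshold

# A design can miss any one criterion and still pass:
print(family_resemblance_score({
    "fluency": True, "flexibility": True,
    "visual_consistency": True, "multiscale_organization": True,
    "legible_contrast": False,
}))  # → True
```

The threshold makes the "no single necessary criterion" property explicit: dropping any one of the five still leaves 4/5 = 0.8 of the criteria met.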
The researchers propose a "situating analytics" methodology to develop AI-based design creativity analytics that align with instructors' situated practices and assessment criteria. They suggest integrating these analytics into design learning environments through interactive dashboards to:
Make the meaning and derivation of the analytics transparent to instructors and students
Enable instructors and students to validate the analytics and provide feedback to refine them
Support instructors in providing timely, learning-objectives-based assessment and feedback to students
The researchers develop a set of situated design creativity analytics, including Fluency, Flexibility, Visual Consistency, Multiscale Organization, and Legible Contrast, which correspond to instructors' assessment criteria across the studied design education contexts.
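Two of the named analytics have common readings in creativity research that can be sketched directly: Fluency as the count of design elements produced, and Flexibility as the count of distinct categories those elements span. The element list and its categories below are hypothetical; the paper's actual feature extraction is not reproduced here.

```python
# Illustrative Fluency and Flexibility, under standard
# creativity-research definitions (assumed, not the paper's code).

def fluency(elements):
    """Fluency: how many design elements the student produced."""
    return len(elements)

def flexibility(elements):
    """Flexibility: how many distinct categories those elements span."""
    return len({e["category"] for e in elements})

sketches = [
    {"id": 1, "category": "site plan"},
    {"id": 2, "category": "site plan"},
    {"id": 3, "category": "section"},
    {"id": 4, "category": "perspective"},
]
print(fluency(sketches), flexibility(sketches))  # → 4 3
```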
Stats
"The goal is quantity."
"You only got 2 drawings at the [larger] scale where you got 35 at the [smaller] scale, [hence] you need to do more analysis of the larger scale."
"[The computer could tell] where high contrast is vs. where there is less contrast. [This matters] because contrast takes human visual attention."
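The instructor's point about contrast suggests a simple computation: split a grayscale image into tiles and report each tile's luminance range, flagging where attention-drawing high contrast sits. The tile size and the max-minus-min measure are assumptions for illustration, not the paper's analytic.

```python
# Minimal per-tile contrast map (assumed measure: max minus min
# luminance within each tile of a grayscale image).
import numpy as np

def contrast_map(image, tile=8):
    """Return a (rows, cols) array of per-tile luminance ranges."""
    h, w = image.shape
    rows, cols = h // tile, w // tile
    out = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = image[r*tile:(r+1)*tile, c*tile:(c+1)*tile]
            out[r, c] = block.max() - block.min()
    return out

# Synthetic example: flat left half, checkerboard right half.
img = np.zeros((16, 16))
img[:, 8:] = np.indices((16, 8)).sum(axis=0) % 2
cm = contrast_map(img, tile=8)
print(cm)  # left tiles have ~0 contrast, right tiles ~1
```

On the synthetic image, the two left tiles score 0 and the two right tiles score 1, which is the "where high contrast is vs. where there is less contrast" distinction the instructor describes.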
Quotes
"There's a kind of disconnect between [students] turning in a [design] and they getting a number [back]...Why is it a 'B'?...[We] need to have a better tool communicating [the assessment] to the students."
"Yeah that would be cool I think if we could develop some metrics to build into a consistent rubric. It will spit out the rubric scores and then the professor can say, well that's right or wrong."