Digitizing Assessment of Creative Aptitude: A Human-Centred Design Approach
Assessing creative aptitude is an inherent criterion in entrance examinations for Design education. The assessment of creative aptitude in Design entrance examinations proceeds from two major perspectives: 1) identifying the characteristics of questions framed by Design pedagogues that test students' creative aptitude, and 2) evaluating students' creative responses (Aburas & Nurunnabi, 2019). Formulating creative questions plays a significant role in Design entrance examinations, as such questions have the potential to instigate creative responses among students (Constantinou, 2021; Rashid & Qaisar, 2016). Pedagogues remain ever-inquisitive about whether the questions they frame are truly capable of instigating creative responses. While framing questions intended to trigger creativity in students, pedagogues rely solely on their individual experiences. To minimize the interference of individual biases among Design pedagogues, which affects the quality of the questions they formulate, this research study investigates ways in which an optimized system of support for pedagogues can be designed. To address this situation, a computational design model is proposed that assesses the questions framed by pedagogues and analyses them to check whether they would instigate creative responses from students. While the type of question plays a significant role in instigating creative responses, evaluating these responses is a major challenge. During assessment of these responses, novelty is the most important factor Design pedagogues look for; it is a significant criterion in synthesizing creative responses (Jagtap, 2019; Liberati et al., 2018; Saaksjarvi & Goncalves, 2018; Sarkar & Chakrabarti, 2011). Assessing novelty in creative responses requires subjective evaluation, which generally depends on experts' knowledge, choice, and persuasion (Ma et al., 2017).
Presently, in Design entrance examinations in India, evaluation of novelty is conducted by pedagogues possessing expertise in assessing creative skills. During this evaluation process, they are confronted with multiple challenges, such as individual stress and limited evaluation time, which often lead to inconsistencies in the evaluation process and reduce pedagogues' self-confidence (Gonzalez et al., 2017; Richards et al., 2017). Creative responses are predominantly subjective and are generally evaluated against each pedagogue's own frame of reference, leading to inconsistent evaluation across different pedagogues. In such situations, students' trust in the evaluation procedure may erode. To mitigate these challenges, multiple computational design models are proposed in this thesis with the objective of automating the evaluation of novelty in creative responses expressed through various response patterns: descriptive, labelled-image, and annotated-image-based responses. The research investigation reported here focuses on addressing challenges faced by the Design educator community and can be viewed as a contribution to Design Praxiology as proposed by Cross (1999) and Gasparski (1979). The proposals made in this research work embody an approach that intends to prepare the Design education community, specifically Design pedagogues, to embrace changes in existing ways of framing creative questions and assessing students' responses. This study addresses a Design problem specifically concerned with human errors in the process of assessing creative responses. Furthermore, the studies involved in this research would reduce evaluation time, enable faster dissemination of results, and cut logistical overheads such as paper and storage space for education practitioners.
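The core idea behind automating novelty assessment can be illustrated with a minimal sketch: score each response by its dissimilarity from a corpus of previously seen responses. Everything below (the bag-of-words vectorizer, the `novelty_score` function, and the toy corpus) is a hypothetical illustration, not the deep-learning models proposed in this thesis; a real system would replace the word-count vectors with learned embeddings of descriptive or image-based responses.

```python
import math
from collections import Counter


def vectorize(text):
    """Toy bag-of-words term-frequency vector for a textual response
    (a stand-in for the learned embeddings a trained model would use)."""
    return Counter(text.lower().split())


def cosine_similarity(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def novelty_score(response, corpus):
    """Novelty as mean dissimilarity (1 - cosine) from prior responses:
    a response echoing the corpus scores near 0, an unusual one near 1."""
    if not corpus:
        return 1.0  # nothing to compare against: maximally novel by convention
    vec = vectorize(response)
    sims = [cosine_similarity(vec, vectorize(c)) for c in corpus]
    return 1.0 - sum(sims) / len(sims)


# Hypothetical prior responses to a "design a seat" prompt.
corpus = ["a chair with four legs", "a wooden chair with legs"]
print(novelty_score("a chair with four legs", corpus))    # low novelty
print(novelty_score("a floating magnetic seat", corpus))  # high novelty
```

The design choice worth noting is that novelty here is defined relative to a reference corpus, mirroring the abstract's point that human judgements of novelty depend on each evaluator's frame of reference: the automated score makes that frame explicit and shared.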
Supervisor: Dhar, Debayan
Computational Creativity, Human-Centred Design Approach, Deep Learning