Designing for complex creative task solving

   Creative problems, such as those in writing or design, are challenging to solve because they are open-ended and typically demand substantial time and effort, with high-quality results reached only through trial and error. Feedback plays a critical role in this process because it helps people recognize and fix errors, leading to better results. However, high-quality feedback is difficult to obtain because the pool of available experts is limited. To address these issues of scalability and cost, we aim to harness the wisdom of the crowd to collect feedback and thereby enable on-demand problem-solving support.
 
   Crowdsourcing has proven useful for solving complex problems that a single machine cannot solve on its own. Recent work has collected timely feedback from crowds to help novice designers improve their designs. However, most research focuses on improving the content of the feedback itself and neglects the most important aspect: how to help problem solvers learn from that feedback and achieve high-quality outcomes. In this work, we aim to leverage the power of crowds and machines to build intelligent systems that generate useful feedback and guide people to learn and solve creative tasks effectively within a flexible workflow.
 
   This work proposes an iterative feedback framework that enables collaboration between authors and feedback providers (see Figure 1). In this framework, authors learn to solve problems and improve their results based on the diverse feedback they receive, while feedback providers learn to evaluate the quality of outcomes and give more effective feedback to the authors.
 
Figure 1. Iterative feedback framework for supporting creative task solving.
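 
   To make the loop concrete, the Python sketch below models one iteration of this framework as data plus a single revision step. It is a minimal illustration only; the class names, fields, and the apply_feedback hook are hypothetical and are not part of the systems described in this article.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Feedback:
    provider_id: str
    comment: str
    addressed: bool = False   # set once the author has acted on the comment

@dataclass
class Draft:
    version: int
    text: str
    feedback: List[Feedback] = field(default_factory=list)

def revise(draft: Draft, apply_feedback: Callable[[str, Feedback], str]) -> Draft:
    # The author (a human) decides how each comment changes the text;
    # apply_feedback stands in for that decision and is purely a placeholder here.
    new_text = draft.text
    for fb in draft.feedback:
        new_text = apply_feedback(new_text, fb)
        fb.addressed = True
    return Draft(version=draft.version + 1, text=new_text)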
 
   The goal of this work is to explore ways to support not only feedback generation but also feedback integration, focusing on writing tasks. Drawing on insights from learning science and crowdsourcing, we designed and developed intelligent systems that leverage the power of crowds and machines to help writers obtain effective feedback and adopt good revision behaviors in the writing process.
 
   First, we designed and implemented a crowd-powered feedback system, StructFeed, which generates useful feedback to help novice writers detect and diagnose high-level structural writing issues [1]. With the support of StructFeed, novice writers can develop a unified essay writing style and achieve high-quality results. In our experiment, participants who received feedback from our system even outperformed those who received expert feedback. Although all participants found the expert feedback helpful, some could not improve their writing during revision. This led us to explore ways of supporting people in making full and effective use of feedback and revision.
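 
   To illustrate one way crowd judgments can be turned into structural feedback of this kind, the sketch below aggregates hypothetical worker votes about a paragraph's topic sentence and its off-topic sentences by majority voting. The vote format, threshold, and function name are assumptions made for illustration and do not describe StructFeed's actual pipeline.

from collections import Counter
from typing import Dict, List

def aggregate_structural_feedback(
    topic_votes: List[int],               # one sentence index per worker
    off_topic_votes: List[List[int]],     # sentence indices flagged by each worker
    n_workers: int,
    agreement: float = 0.5,               # assumed agreement threshold
) -> Dict:
    # Majority vote for the topic sentence
    topic_sentence = Counter(topic_votes).most_common(1)[0][0]
    # A sentence is flagged if enough workers marked it as off-topic
    off_topic_counts = Counter(i for votes in off_topic_votes for i in votes)
    flagged = [i for i, c in off_topic_counts.items() if c / n_workers >= agreement]
    return {"topic_sentence": topic_sentence, "possibly_off_topic": sorted(flagged)}

# Example: 5 workers reviewing a 6-sentence paragraph
print(aggregate_structural_feedback(
    topic_votes=[0, 0, 1, 0, 0],
    off_topic_votes=[[4], [4, 5], [4], [], [4]],
    n_workers=5,
))
# -> {'topic_sentence': 0, 'possibly_off_topic': [4]}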
 
   To support effective revision, we present Feedback Orchestration, which uses structured feedback to guide writers in making use of feedback at different levels and supports flexible revision workflows. We designed and implemented an intelligent system and evaluated it with twelve novice writers in a field experiment. The results showed that structured feedback helped individuals discover their weaknesses and promoted deep reflection. Furthermore, enabling flexible revision workflows raised novices' awareness of their own revision process and helped them develop good revision strategies.
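 
   The sketch below shows one possible shape for such structured feedback: each item carries a level tag, and the writer, not the system, chooses the order in which levels are addressed. The level names and function are assumptions for illustration, not the system's actual taxonomy or interface.

from dataclasses import dataclass
from typing import List

LEVELS = ["organization", "paragraph", "sentence"]   # assumed feedback levels

@dataclass
class FeedbackItem:
    level: str        # one of LEVELS
    location: str     # e.g. "paragraph 2"
    comment: str

def plan_revision(items: List[FeedbackItem], preferred_order: List[str]) -> List[FeedbackItem]:
    # Order feedback items by the writer's chosen level priority (flexible workflow).
    rank = {level: i for i, level in enumerate(preferred_order)}
    return sorted(items, key=lambda it: rank.get(it.level, len(rank)))

items = [
    FeedbackItem("sentence", "paragraph 1", "This sentence is hard to parse."),
    FeedbackItem("organization", "whole essay", "The conclusion repeats the introduction."),
]
# A writer who prefers to fix global issues first:
for it in plan_revision(items, ["organization", "paragraph", "sentence"]):
    print(it.level, "-", it.comment)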
 
   In conclusion, this work presents an iterative feedback framework that enables collaboration between writers and online feedback providers to support creative task solving. Two intelligent systems have demonstrated the effectiveness of this framework, successfully supporting novice writers in developing writing skills and improving their writing. Ultimately, this work contributes a new perspective on how humans can collaborate with machines and crowds to accomplish complex creative tasks.
 
Reference
[1] Huang, Y., Huang, J., and Hsu, J. Y. (2017). Supporting ESL writing by prompting crowdsourced structural feedback. In Proceedings of the Fifth AAAI Conference on Human Computation and Crowdsourcing (HCOMP-2017), Québec City, Canada.
 
Yi-Ching Huang
Graduate Institute of Networking and Multimedia
 
Jane Yung-jen Hsu
Professor, Department of Computer Science and Information Engineering
Director, Intel-NTU Connected Context Computing Center

