Course Overview
Listed as: SCU COEN 296B
Instructor: Max Kreminski
Offered: Winter (Jan-Mar) 2023
Design, development, and evaluation of software systems intended to support human creativity. Students will read, write responses to, and discuss research papers on creativity support tools (CSTs), human-AI interaction, and related topics; work in small groups to create a software tool that supports artists, writers, designers, musicians, or other creative practitioners; and write a final project report that could serve as the seed for a future peer-reviewed conference or journal publication. This course is an HCI-focused complement to COEN 291 Computational Creativity, which approaches creativity from an AI-focused perspective.
Readings
Week 1: What are CSTs?
- Design Principles for Tools to Support Creative Thinking (Resnick et al. 2005). Key takeaways: a preliminary definition of CSTs, and the narrower genre of “composition tools”; low threshold, high ceiling, wide walls; evaluating creativity as an open problem.
- Mapping the Landscape of Creativity Support Tools in HCI (Frich et al. 2019). Key takeaways: different parts of the creative process that CSTs can target; bias toward studying novice users in CST research; evaluating creativity as a still-open problem.
Week 2: Understanding creativity
- Creativity in a Nutshell (Boden 2004). Key takeaways: P-creativity vs. H-creativity; novelty, surprise, and value; conceptual spaces.
- Design as a Reflective Conversation with the Situation (Schön 1984). Key takeaways: the design-as-conversation metaphor; moves, sometimes called design moves; close observation of human creative processes as a research method.
- Fixation and Commitment While Designing and Its Measurement (Gero 2011). Key takeaways: fixation and commitment; one potential strategy for visualizing and quantifying a creativity-related phenomenon.
Week 3: Design metaphors for CSTs
- Meanings of Tools, Support, and Uses for Creative Design Processes (Nakakoji 2006). Key takeaways: running shoes, dumbbells, and skis as design metaphors for CSTs with three different kinds of high-level aims: improving what users create, improving users themselves, and enabling entirely new kinds of creative activity.
- Computer of a Thousand Faces: Anthropomorphizations of the Computer in Design (1965-1975) (Vardouli 2012). Key takeaways: anthropomorphic roles as design metaphors for CSTs, including clerk, partner, wizard, surrogate, and accountant.
- Can Computers Foster Human Users’ Creativity? Theory and Praxis of Mixed-Initiative Co-Creativity (Liapis et al. 2016). Key takeaways: mixed-initiative interaction patterns, in which the computer proactively intervenes in the creative process.
- Optional bonus readings: two alternative taxonomies of anthropomorphic roles that the computer might play. Do we ever want the computer to be a “nanny”, “coach”, or “manager”?
Week 4: Experience-focused CST design
- Casual Creators (Compton and Mateas 2015). Key takeaways: autotelic creativity (as opposed to goal-oriented creativity); casual creators as a genre of CSTs focused on making creativity easy and pleasurable; design patterns for casual creators.
- Reflective Creators (Kreminski and Mateas 2021). Key takeaways: alternative experiential goals for CSTs, beyond ease and pleasure; reasons that a CST might deliberately slow down user interaction, or add friction to it; design patterns for reflective creators.
- Optional bonus playing: BECOME A GREAT ARTIST IN JUST 10 SECONDS (download required), a canonical casual creator discussed by Compton and Mateas; Redactionist, a canonical reflective creator discussed by Kreminski and Mateas.
Week 5: Prototyping CSTs
- Computational Caricatures: Probing the Game Design Process with AI (Smith and Mateas 2011). Key takeaways: CSTs can be designed as “computational caricatures”, with explicit claims (to be recognized and tested), oversimplifications (to be overlooked), and abstractions (to be reused in later work).
- Prototyping Tools and Techniques (Beaudouin-Lafon and Mackay 2009). Key takeaways: Wizard-of-Oz (WOz) prototyping, i.e., having a human stand in for unimplemented parts of your software to enable rapid testing of new interaction paradigms. (You can skip sections 4.2, 5 and 6 of this reading; these sections mostly deal with specific software tools for prototyping and are more dated than the rest of the chapter.)
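To make the WOz idea concrete, here is a minimal sketch (all names hypothetical, not from any of the readings) of how a prototype might expose the same interface a future automated suggestion engine would have, while a hidden human "wizard" actually supplies the responses:

```python
from collections import deque

class WizardOfOzSuggester:
    """Stands in for an unimplemented suggestion engine: a hidden human
    'wizard' supplies the responses, but the interface matches what the
    real automated component would eventually expose."""

    def __init__(self, wizard_responses=None):
        # In a live study the wizard types responses in real time; here
        # that channel is modeled as a queue the wizard pre-fills.
        self._responses = deque(wizard_responses or [])

    def suggest(self, user_draft: str) -> str:
        """Same signature the eventual automated component would have."""
        if self._responses:
            return self._responses.popleft()
        return "(wizard is thinking...)"

# The rest of the prototype calls suggest() exactly as it would call the
# real system, so the interaction design can be tested before any AI exists.
tool = WizardOfOzSuggester(["Try ending the scene on the minor character."])
print(tool.suggest("Draft: The hero walks away."))
```

Because users interact only with the `suggest()` interface, swapping the wizard out for a real implementation later requires no changes to the surrounding tool.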
Week 6: Evaluating CSTs
- Quantifying the Creativity Support of Digital Tools through the Creativity Support Index (Cherry and Latulipe 2014). Key takeaways: the Creativity Support Index (CSI), an attempt at a standardized and psychometrically validated survey instrument for evaluating creativity support tools.
- Evaluating Creativity Support Tools in HCI Research (Remy et al. 2020). Key takeaways: everything in the “Discussion” section; the diversity of methods used for CST evaluation; the fact that no clear consensus exists on the “best way” to evaluate CSTs.
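As a sketch of how the CSI combines its six factors into a single 0-100 score: each factor gets agreement ratings plus a weight from 15 paired comparisons, and the weighted sum is divided by 3. The item scale assumed below (each agreement item rated 0-10) is an assumption for illustration; consult Cherry and Latulipe (2014) for the exact instrument wording and scales.

```python
def csi_score(factor_sums, factor_counts):
    """Combine the six CSI factors into a single 0-100 score.

    factor_sums:   per-factor sum of the two agreement items
                   (assuming each item is rated 0-10, each sum is 0-20).
    factor_counts: per-factor win counts from the 15 paired
                   comparisons (the six counts always sum to 15).
    """
    assert len(factor_sums) == len(factor_counts) == 6
    assert sum(factor_counts) == 15
    weighted = sum(s * c for s, c in zip(factor_sums, factor_counts))
    return weighted / 3.0  # maximum: 20 * 15 / 3 = 100

# A participant who maxes out every agreement item scores 100
# regardless of how the paired-comparison weights are distributed:
print(csi_score([20] * 6, [5, 4, 3, 2, 1, 0]))  # -> 100.0
```

The paired-comparison counts weight each factor by how much that participant cares about it, so the same agreement ratings can yield different scores for participants with different priorities.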
Week 7: Expressive range analysis
- Evaluating Mixed-Initiative Creative Interfaces via Expressive Range Coverage Analysis (Kreminski et al. 2022). Key takeaways: expressive range analysis (ERA) as a visualization-based technique for understanding the texture of a computationally creative system’s output; extending expressive range analysis to account for human interactors.
- Danesh: Interactive Tools for Understanding Procedural Content Generators (Cook et al. 2021). Key takeaways: integrating ERA into a CST for designers of procedural content generators, enabling them to rapidly visualize the impact of changes.
- Optional background reading: Analyzing the Expressive Range of a Level Generator (Smith and Whitehead 2010). This was the original paper that introduced ERA and might be helpful if the brief introduction to ERA in the first reading doesn’t go deep enough for you.
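The core ERA recipe is simple: sample many artifacts from a generator, score each on a small number of metrics, and visualize the density of artifacts in metric space. Here is a minimal sketch with a toy generator and two hypothetical metrics (all names invented for illustration, not drawn from the readings):

```python
import random
import numpy as np

def generate_artifact(rng):
    # Stand-in generator: a real system would emit levels, poems, etc.
    return [rng.random() for _ in range(50)]

def metric_a(artifact):
    # Hypothetical metric 1, e.g. "density": mean element value.
    return sum(artifact) / len(artifact)

def metric_b(artifact):
    # Hypothetical metric 2, e.g. "variation": spread of values.
    return max(artifact) - min(artifact)

rng = random.Random(0)
points = [(metric_a(a), metric_b(a))
          for a in (generate_artifact(rng) for _ in range(1000))]

xs, ys = zip(*points)
# 2D histogram over the metric space; rendering it as a heatmap
# (e.g. with matplotlib's imshow) gives the classic ERA figure.
hist, _, _ = np.histogram2d(xs, ys, bins=20, range=[[0, 1], [0, 1]])
print(int(hist.sum()))  # every artifact lands in exactly one bin -> 1000
```

Clusters and gaps in the resulting heatmap reveal the generator's biases; the extension in the first reading additionally asks how much of this space human interactors can actually reach through the tool.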
Week 8: CSTs for collaborative creativity
- Artist Support Networks: Implications for Future Creativity Support Tools (Chung et al. 2022). Key takeaways: different relationship types in artist support networks (e.g., as summarized in Fig. 3); recommendations for CST design (Section 5.2). Also worth considering in light of how CSTs can affect not just individual artists but broader artist support networks: for instance, how tools like ChatGPT have recently resulted in major science fiction magazines like Clarkesworld being flooded with low-quality submissions.
- Designing for Creativity in Computer-Supported Cooperative Work (Farooq et al. 2008). Key takeaways: divergent vs. convergent thinking in ideation; the importance of dyads to creativity; support for individual, dyadic, and group brainstorming; preserving minority dissent.
- Bursting Scientific Filter Bubbles: Boosting Innovation via Novel Author Discovery (Portenoy et al. 2022). Key takeaways: leveraging knowledge of collaboration graphs to help users discover new sources of inspiration; CSTs for scientists/researchers.
Week 9: CSTs as coaches / teachers
- Creativity Support for Novice Digital Filmmaking (Davis et al. 2013). Key takeaways: CSTs can give feedback on “rules” of a creative form (for instance, filmmaking) that a novice creator’s work has violated.
- Toward Automated Critique for Student-Created Interactive Narrative Projects (Mahajan et al. 2019). Key takeaways: CSTs can show users where an artifact (for instance, an interactive story) falls in relation to a larger space of artifacts, and suggest very similar and very dissimilar artifacts as reference points or inspiration.
- Interactive Guidance Techniques for Improving Creative Feedback (Ngoon et al. 2018). Key takeaways: CSTs can scaffold learning about the creative process, for instance by helping student designers learn to critique one another’s work more effectively.
Week 10: Show and tell
- No readings this week. Students were instead asked to submit an existing CST for discussion. During class, we played around with and talked about these CSTs.