TaleBrush: Sketching Stories with Generative Pretrained Language Models

Cited: 78
Authors
Chung, John Joon Young [1 ]
Kim, Wooseok [2 ]
Yoo, Kang Min [3 ]
Lee, Hwaran [3 ]
Adar, Eytan [1 ]
Chang, Minsuk [3 ]
Affiliations
[1] Univ Michigan, Ann Arbor, MI 48109 USA
[2] Korea Adv Inst Sci & Technol, Daejeon, South Korea
[3] Naver AI LAB, Seongnam, South Korea
Keywords
story writing; sketching; creativity support tool; story generation; controlled generation; plot
DOI
10.1145/3491102.3501819
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
摘要
While advanced text generation algorithms (e.g., GPT-3) have enabled writers to co-create stories with an AI, guiding the narrative remains a challenge. Existing systems often leverage simple turn-taking between the writer and the AI in story development. However, writers remain unsupported in intuitively understanding the AI's actions or steering the iterative generation. We introduce TaleBrush, a generative story ideation tool that uses line sketching interactions with a GPT-based language model for control and sensemaking of a protagonist's fortune in co-created stories. Our empirical evaluation found our pipeline reliably controls story generation while maintaining the novelty of generated sentences. In a user study with 14 participants with diverse writing experiences, we found participants successfully leveraged sketching to iteratively explore and write stories according to their intentions about the character's fortune while taking inspiration from generated stories. We conclude with a reflection on how sketching interactions can facilitate the iterative human-AI co-creation process.
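The abstract describes steering a protagonist's fortune by sketching a line that conditions a GPT-based generator. Purely as an illustrative sketch (not the authors' released pipeline), the Python below shows one way a sketched fortune curve might be discretized into per-sentence control tags prepended to the generation prompt; the five-level tag vocabulary, the sampling scheme, and the generate() stub are all assumptions for exposition.

```python
# Illustrative only: a minimal sketch of curve-to-control-tag story steering.
# The tag names, the five-level scale, and the generator stub are assumptions,
# not the TaleBrush authors' released pipeline.
from typing import Callable, List

# Hypothetical fortune scale from <f1> (misfortune) to <f5> (good fortune).
FORTUNE_TAGS = ["<f1>", "<f2>", "<f3>", "<f4>", "<f5>"]


def discretize_curve(curve: List[float], n_sentences: int) -> List[int]:
    """Sample the sketched curve (values in [0, 1]) once per sentence and
    map each sample to a discrete fortune level."""
    step = (len(curve) - 1) / max(n_sentences - 1, 1)
    levels = []
    for i in range(n_sentences):
        v = curve[round(i * step)]
        levels.append(min(int(v * len(FORTUNE_TAGS)), len(FORTUNE_TAGS) - 1))
    return levels


def generate_story(prompt: str, curve: List[float], n_sentences: int,
                   generate: Callable[[str], str]) -> str:
    """Grow the story one sentence at a time, prefixing each request with
    the control tag for that sentence's target fortune level. A model
    fine-tuned on such tags would condition on them; a vanilla GPT would
    largely ignore them."""
    story = prompt
    for level in discretize_curve(curve, n_sentences):
        sentence = generate(f"{FORTUNE_TAGS[level]} {story}")
        story = f"{story} {sentence}"
    return story


if __name__ == "__main__":
    # Stub generator; swap in a real language-model call here.
    def echo(p: str) -> str:
        return f"[sentence conditioned on {p.split()[0]}]"

    arc = [0.9, 0.7, 0.3, 0.1, 0.4, 0.8]  # fall-then-rise fortune arc
    print(generate_story("Alice set out at dawn.", arc, n_sentences=4,
                         generate=echo))
```

In a real deployment the stub would be replaced by a call to a language model trained to respect such control signals; the point of the sketch is only the curve-to-control mapping, not the generation itself.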
Pages: 19
Related Papers
50 records in total
  • [1] TaleBrush: Visual Sketching of Story Generation with Pretrained Language Models
    Chung, John Joon Young
    Kim, Wooseok
    Yoo, Kang Min
    Lee, Hwaran
    Adar, Eytan
    Chang, Minsuk
    EXTENDED ABSTRACTS OF THE 2022 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, CHI 2022, 2022
  • [2] Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale
    Hu, Xiang
    Ji, Pengyu
    Zhu, Qingyang
    Wu, Wei
    Tu, Kewei
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024: 2640-2657
  • [3] Constructing Chinese taxonomy trees from understanding and generative pretrained language models
    Guo, Jianyu
    Chen, Jingnan
    Ren, Li
    Zhou, Huanlai
    Xu, Wenbo
    Jia, Haitao
    PEERJ COMPUTER SCIENCE, 2024, 10
  • [4] A Survey of Pretrained Language Models
    Sun, Kaili
    Luo, Xudong
    Luo, Michael Y.
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT II, 2022, 13369: 442-456
  • [5] A Heideggerian analysis of generative pretrained transformer models
    Floroiu, Iustin
    Timisica, Daniela
    ROMANIAN JOURNAL OF INFORMATION TECHNOLOGY AND AUTOMATIC CONTROL-REVISTA ROMANA DE INFORMATICA SI AUTOMATICA, 2024, 34 (01): 13-22
  • [6] Sketching Process Models by Mining Participant Stories
    Ivanchikj, Ana
    Pautasso, Cesare
    BUSINESS PROCESS MANAGEMENT FORUM, BPM FORUM 2019, 2019, 360: 3-19
  • [7] Geographic Adaptation of Pretrained Language Models
    Hofmann, Valentin
    Glavas, Goran
    Ljubesic, Nikola
    Pierrehumbert, Janet B.
    Schuetze, Hinrich
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2024, 12: 411-431
  • [8] Generating Datasets with Pretrained Language Models
    Schick, Timo
    Schuetze, Hinrich
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021: 6943-6951
  • [9] Investigating Transferability in Pretrained Language Models
    Tamkin, Alex
    Singh, Trisha
    Giovanardi, Davide
    Goodman, Noah
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020: 1393-1401