Towards Logic-Consistent and Controllable Automatic Story Generation
Access status: Open Access
Type: Thesis
Thesis type: Masters by Research
Author/s: Cai, Shizhan
Abstract:
Stories often recount allegorical or past events, recording and disseminating cultural engagement and social values. Story generation has therefore consistently drawn attention from the natural language generation (NLG) community. In the story generation task, a model is trained on a particular corpus of text. Starting from a given prompt, the language model produces one or more tokens that continue the text; the prompt plus the continuation can then be fed back into the model to obtain the next continuation, and so on. However, many challenging issues remain in story generation. The primary one, a lack of logical consistency, has long plagued story generation approaches: generated events may be irrelevant, conflicting, or illogical. Meanwhile, unreasonable character personas and emotions can deviate from the original storylines. Moreover, story generation lacks controllability: it is hard to control the content produced by Large Language Models (LLMs) beyond the given prompt, for example by enforcing or editing the narrative style, character persona, or story topic. To address the lack of a coherent story background, which is essential for crafting well-developed narratives, we propose a novel method for constructing detailed story backgrounds to enhance the quality of generated stories. Additionally, LLMs are limited by context windows of approximately several thousand words, hindering their ability to comprehend and generate longer narratives. To overcome this, we introduce a method to extend the context window, enabling LLMs to handle and generate longer, more coherent stories.
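The autoregressive loop the abstract describes (prompt in, tokens out, continuation fed back as the new input) can be sketched as follows. This is a minimal illustration only, not the thesis's system: the "model" here is a hypothetical hand-written bigram table standing in for a trained language model.

```python
# Sketch of the autoregressive continuation loop from the abstract:
# at each step, the prompt plus everything generated so far becomes
# the model's input for the next token.

def toy_model(context):
    """Toy stand-in for a language model: pick the next token
    from a fixed bigram table keyed on the last token."""
    bigrams = {
        "once": "upon",
        "upon": "a",
        "a": "time",
        "time": "<end>",
    }
    return bigrams.get(context[-1], "<end>")

def generate(prompt_tokens, max_steps=10):
    """Repeatedly feed prompt + continuation back into the model."""
    tokens = list(prompt_tokens)
    for _ in range(max_steps):
        nxt = toy_model(tokens)  # model sees the full running context
        if nxt == "<end>":
            break
        tokens.append(nxt)  # continuation becomes part of the next input
    return tokens

story = generate(["once"])  # → ["once", "upon", "a", "time"]
```

A real system replaces `toy_model` with a neural language model sampling from a distribution over the vocabulary, and `max_steps` is bounded by the model's context window, the limitation the thesis's second contribution targets.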
Date: 2024
Rights statement: The author retains copyright of this thesis. It may only be used for the purposes of research and study. It must not be used for any other purposes and may not be transmitted or shared with others without prior permission.
Faculty/School: Faculty of Engineering, School of Computer Science
Awarding institution: The University of Sydney