Echoes of the Forgotten Machine #8
I will also be helping @plan9ch7 on his entry. I'm very excited to see how it goes, and look forward to providing more details when they are available.
Yes, I am excited about this collaboration. @tra38 had some great ideas on the text generation portion, and we're splitting this up into distinct phases that also speak to our interests. BTW, if anyone is in Houston, TX, I was thinking of trying to get some nanogenmoers together to work on their projects, discuss ideas, etc.
I wanted to share the approach we're using to generate our novel. First, there's the structure. This will be done with a Markov Chain encoding the beats required of a Quest novel. This will give us a plot outline that should ensure the novel unfolds in a logical, causal way while still keeping with the conventions of the genre. To flesh the outline out, we're using an LLM to produce passages of text that can apply to each beat. Ideally, we'd have a variety of interchangeable parts for any beat from which we can randomly choose. The stochastic elements in both the outline and the choice of snippets will produce different novels in this genre, and are what give this method its generative power. If anyone is interested in joining our meetings, helping out, discussing this idea, or even being a fly on the wall, please reach out to us. We're in the Houston area and plan to meet in person about once a week during this period. It would be great to see what other local Nanogenmoers we have.
# -*- coding: utf-8 -*-
"""Markov Chain (LLM Snippets)
Automatically generated by Colab.
Original file is located at
https://colab.research.google.com/drive/1XZCZXlTcPwcympe4YDLc5S7f7Ijc8kY1
# Generating Novels
This program uses a Markov Chain in conjunction with an LLM to generate a 50,000 word novel. Why not use an LLM for everything?
1. LLMs reason poorly (despite some claims) and so will not maintain continuity of characters or events.
1. None of the LLMs we used could return anywhere near that number of words.
On the other hand, LLMs do well at producing short snippets that sound like a human wrote them, even if that human is not particularly talented. So to get the best of both worlds, we control the rational structure of the novel ourselves and use it to prompt an LLM for focused snippets within the narrow constraints that structure defines.
Let's now talk about novel structures.
# The Structure of a Novel
What separates a story from a mass of random words, or even from coherent but disconnected sentences, is structure. For instance, take this structure:
1. We introduce a protagonist.
1. The protagonist's world is upset by a crisis.
1. The protagonist tries to resolve the crisis but makes things worse.
1. The protagonist finally has a moment of insight and fixes the crisis.
This is a classic story structure. The structure here is the deepening series of crises, and the payoff is the resolution of the problem posed when the novel opened. All the sentences contribute to the same consistent theme and structure and progress towards a resolution, the failures serving to build up tension and make the resolution that much more rewarding.
This isn't the only story structure, and there are additional refinements. For instance, quest novels can follow the structure above, while introducing additional fine-grained structures, such as hitting the road to find the key to resolve the crisis. And within quest novels, we have variations. For instance...
1. Does the hero immediately go on the quest or resist and only get forced into it later?
1. Does the hero meet the villain and get defeated then go on a quest, or only meet the villain at the end?
1. Does the hero share a history with the villain or is the villain a stranger to the hero?
There are many more. But regardless, the overall structure is there: the hero will go on a quest, will face the villain, and so on. The structure thus contains stochastic elements that perturb some of the details but don't change the main narrative thrust. Such a structure can be represented with a Markov Chain.
A Markov Chain is a collection of states along with probabilities of going from one state to another. We can then walk this structure and get a sequence of states that represent our story beats. And fortunately, we have some tools that let us handle Markov Chains without implementing them from scratch. Let's install one now.
"""

!pip install PyDTMC > /dev/null 2>&1
from pydtmc import MarkovChain, plot_graph
"""We can initialize a Markov Chain from a matrix. For instance, assume we have a chain consisting of 3 states, $a, b, c$. We then create a $3 \times 3$ table where the rows and columns are $a, b, c$ in that order. The value of any cell $(i, j)$ gives the probability of going from state $i$ to state $j$. So here's an example:
$$
\begin{bmatrix}
0 & 0.5 & 0.5 \\
0.3 & 0 & 0.7 \\
0 & 0 & 1
\end{bmatrix}
$$
The first value $0$ is the probability of going from $a$ to $a$. Since we can't do that, the probability is $0$. The next value ($0.5$) gives the probability of going from $a$ to $b$, and so on. The second row gives the probabilities of going from state $b$ to the other states. In the last row, we see that $c$ has only one transition, to itself, with a probability of $1$. This means once we reach $c$ we can never leave. This is an absorbing state; it represents the end of our story.
"""
chain = MarkovChain([ [0.0, 0.5, 0.5],
[0.3, 0.0, 0.7],
[0.0, 0.0, 1.0] ],
['a','b','c'] )
"""It helps to visualize the Markov Chain. Fortunately, this package comes with the ability to graph it, so let's take a look."""
plot_graph(chain, dpi=600)
"""Once we have a Markov Chain, we just need to walk it until we hit an absorbing state. Let's write a function that does this for us and try it out on our sample chain. Since this represents our plot, I'll just name it accordingly.
"""
# Walks the Markov Chain until we hit an absorbing state.
def walk(mc, start_state):
    state = start_state
    states = [state]
    while not mc.is_absorbing_state(mc.states.index(state)):
        state = mc.next(state)
        states.append(state)
    return states
"""To use it, we just specify the starting state and let it rip. Hit play a few times below and see the kinds of walks you get."""
walk(chain, 'a')
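To see this variation without PyDTMC, here is a minimal stdlib-only sketch of the same walk over the example matrix. This is an illustrative addition, not part of the original notebook; `P` and `simple_walk` are hypothetical names.

```python
# Self-contained sketch of the walk (stdlib only, not in the original notebook).
# The transition table mirrors the 3x3 example matrix above; 'c' is absorbing.
import random
from collections import Counter

P = {'a': [('b', 0.5), ('c', 0.5)],
     'b': [('a', 0.3), ('c', 0.7)],
     'c': [('c', 1.0)]}  # 'c' only transitions to itself

def simple_walk(start):
    state, states = start, [start]
    while P[state] != [(state, 1.0)]:  # stop once we hit the absorbing state
        nxt, weights = zip(*P[state])
        state = random.choices(nxt, weights)[0]
        states.append(state)
    return states

# Every walk varies locally but is always absorbed at 'c'.
endings = Counter(simple_walk('a')[-1] for _ in range(200))
print(endings)  # all 200 walks end at 'c'
```

The loop can only exit at the absorbing state, so the global structure is guaranteed even though each individual path differs.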
"""You can see there's some variation in what's returned, but the overall pattern is there: you will always end up at $c$. This is the blend of local randomness with a global structure that takes you to your end state.
Now that we've seen how this is supposed to work, let's build up our plot. Here I'm going to break the story into smaller chains rather than one big one. My purpose here is to be able to easily visualize each step, which means breaking down the chain.
"""
# @title The Markov Chain, broken up into acts and movements within acts
acts = [
MarkovChain([
[0, 0.4, 0.4, 0.2, 0, 0, 0, 0, 0 ],
[0, 0, 0, 0, 1, 0, 0, 0, 0 ],
[0, 0, 0, 0, 0, 1, 0, 0, 0 ],
[0, 0, 0, 0, 0, 0, 1, 0, 0 ],
[0, 0, 0, 0, 0, 0, 0, 1, 0 ],
[0, 0, 0, 0, 0, 0, 0, 0.5, 0.5],
[0, 0, 0, 0, 0, 0, 0, 0, 1 ],
[0, 0, 0, 0, 0, 0, 0, 1, 0 ],
[0, 0, 0, 0, 0, 0, 0, 0, 1 ]],
['Setting', 'Good', 'Avg', 'Anti-Hero', 'Sidekick', 'Friend', 'Conscience', 'Seeks Adventure', 'Forced Into It']),
MarkovChain([
[0, 0.5, 0.5, 0, 0, 0, 0, 0, 0, 0 ],
[0, 0, 0, 1, 0, 0, 0, 0, 0, 0 ],
[0, 0, 0, 1, 0, 0, 0, 0, 0, 0 ],
[0, 0, 0, 0, 0.5, 0.5, 0, 0, 0, 0 ],
[0, 0, 0, 0, 0, 0, 1, 0, 0, 0 ],
[0, 0, 0, 0, 0, 0, 1, 0, 0, 0 ],
[0, 0, 0, 0, 0, 0, 0, 0.3, 0.3, 0.4],
[0, 0, 0, 0, 0, 0, 0, 1, 0, 0 ],
[0, 0, 0, 0, 0, 0, 0, 0, 1, 0 ],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 1 ]],
['Meets', 'Knows', 'Stranger', 'Fight', 'Trickery', 'Skill', 'Defeated', 'Escapes', 'Released', 'Rescued']),
MarkovChain([
[0, 0.5, 0.5, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 0, 0.5, 0.5, 0],
[0, 0, 0, 0, 0, 0, 1],
[0, 0, 0, 0, 0, 0, 1],
[0, 0, 0, 0, 0, 0, 1]],
['Depression', 'Give Up', 'Listless', 'Pep Talk', 'Explore', 'Attacked', 'Learns of Key']),
MarkovChain([
[0, 0.5, 0.5, 0],
[0, 0, 0, 1],
[0, 0, 0, 1],
[0, 0, 0, 1]],
['Seeks Key', 'Enemies?', 'Allies!', 'Finds Key']),
MarkovChain([
[0, 0.5, 0.5, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0.5, 0.5, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0.5, 0.5, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 1],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 1],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 1]],
['Return', 'Ally Leaves', 'Ally Dies', 'Rematch', 'Trick!','Betrayal!', 'Defeat?', 'Sacrifice', 'Epiphany', 'Victory']),
MarkovChain([
[0, 0.5, 0.5, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 0, 0.5, 0.5, 0],
[0, 0, 0, 0, 0, 0, 1],
[0, 0, 0, 0, 0, 0, 1],
[0, 0, 0, 0, 0, 0, 1]],
['Wrap Up', 'Home with Allies', 'Home Alone', 'Settle Back In', 'Transform Home', 'Trouble at Home', 'Growth'])
]
for act in acts:
    plot_graph(act, dpi=600)
    print("")
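Hand-typed transition matrices are easy to get wrong, so it's worth checking that every row sums to 1 (PyDTMC performs a similar validation when constructing a chain). A small sketch of that check, applied to a copy of the first act's matrix; the check itself is our addition, not part of the original notebook.

```python
# Sanity check (not in the original): each row of a transition matrix is a
# probability distribution over the next states, so it must sum to 1.
first_act = [
    [0, 0.4, 0.4, 0.2, 0, 0, 0, 0,   0  ],
    [0, 0,   0,   0,   1, 0, 0, 0,   0  ],
    [0, 0,   0,   0,   0, 1, 0, 0,   0  ],
    [0, 0,   0,   0,   0, 0, 1, 0,   0  ],
    [0, 0,   0,   0,   0, 0, 0, 1,   0  ],
    [0, 0,   0,   0,   0, 0, 0, 0.5, 0.5],
    [0, 0,   0,   0,   0, 0, 0, 0,   1  ],
    [0, 0,   0,   0,   0, 0, 0, 1,   0  ],
    [0, 0,   0,   0,   0, 0, 0, 0,   1  ],
]

def rows_ok(matrix, tol=1e-9):
    # Allow a small tolerance for floating-point sums
    return all(abs(sum(row) - 1.0) <= tol for row in matrix)

print(rows_ok(first_act))
```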
"""# Putting it all Together
Once we have our plot outline, we need to send prompts to our LLM to generate snippets and append them into a list that becomes our story. You will need to provide your own OpenAI key in order to run this step. The results of one run have been uploaded here - https://docs.google.com/document/d/1BGpBAlx_xgRDPW2xwXdUzATqEiWw3sP7A9ZP0xF_dZI/edit?tab=t.0 .
Much of the code below (and the prompts) was created by GPT-4o. We used GPT-4o-mini for generating the novel itself because it was cheaper and faster than GPT-4o, though this may come at some cost in quality.
In theory, we could have simulated a single long-running conversation between an LLM and a human, but we were concerned about the limited context window. So instead, we had multiple LLM instances, with each instance given the previous passage and the story beat needed for the next passage. We also instructed the LLM to keep the passages standalone (to try to limit continuity errors). There's also some messy, hardcoded logic within the code.
We also had difficulty reviewing the generated text and had to rely on spot checks to make sure the novel was...uh..."readable".
We would expect future LLMs to handle long-running conversations, allowing us to write a novel end-to-end. We also expect future LLMs to produce better, cheaper, and faster novels. However, validating that a computer-generated novel is "good" is still an issue that we can't solve just by using future LLMs (they can judge how good a novel is, but you still need a human to validate that judgment). So this is an unsolved problem, but that's something we can worry about next NaNoGenMo.
"""
# @title Putting it all together
import os
import openai
from datetime import datetime
STORY_BEAT_DETAILS = {
'Setting': "Describe the setting of the story and the status quo.",
'Good': 'Show the hero doing something virtuous to establish the hero as a noble person.',
'Avg': 'Show the hero harried in a typical job and typical woes. Hero is your average person.',
'Anti-Hero': 'Show the hero doing something less than ethical or downright nasty.',
'Sidekick': "We introduce the hero's sidekick, an admirer, maybe someone the hero saved.",
'Friend': "We introduce the hero's friend, a confidant, an equal.",
'Conscience': "We introduce the antihero's conscience, who tries to get him to do the right thing.",
'Seeks Adventure': 'The hero seeks adventure because it is the right thing to do.',
'Forced Into It': 'The friend/conscience convinces the hero to go on the adventure.',
'Meets': 'The hero initially meets the villain and they size each other up, maybe even taunt/threaten.',
'Knows': 'Reveal the hero has some history with the villain.',
'Stranger': 'This is the first time the hero has seen the villain.',
'Fight': 'The villain defeats the hero, either through superior skill or chicanery.',
'Trickery': 'The villain defeats the hero through trickery.',
'Skill': 'The villain outclasses the hero and easily wins in a fair fight.',
'Defeated': 'The hero realizes in horror he is getting defeated; then must cope with a shattering defeat and uncertainty about what will happen next.',
'Escapes': 'The villain tries to finish off the hero, but the hero manages to escape.',
'Released': 'The villain taunts the hero. Not considering him/her a threat, the villain lets the hero go.',
'Rescued': 'The hero is rescued by the secondary character.',
'Depression': 'The hero loses confidence and falls into a deep depression',
'Pep Talk': 'The companion convinces or shames the hero into snapping out of it',
'Challenge': 'The hero and companion are attacked and they win; this helps morale a bit',
'Learns of Key': 'The hero learns there exists some key to defeating the villain',
'Listless': 'The hero is listless. Too ashamed to go home, too demoralized to keep fighting',
'Give Up': 'The hero decides to give up and go home and may even start a decent ways on that journey',
'Explore': 'The hero and companion decide to explore the area and learn as much as they can about the villain',
'Attacked': "The hero and companion get attacked by the villain's forces.",
'Seeks Key': 'The hero cannot defeat the villain without the help of some object. The hero goes on a quest to find it.',
'Enemies?': 'The hero encounters others and fights them, before realizing they have a common foe, the villain.',
'Allies!': 'The hero finds allies who were victimized by the villain.',
'Finds Key': 'The hero finds the key to defeating the villain',
'Return': 'The hero & allies head back to the villain, and encounter obstacles along the way',
'Ally Leaves': 'An important ally leaves the group, due to cowardice or something comes up.',
'Ally Dies': 'An important ally dies.',
'Rematch': 'The hero faces the villain again and they fight. All looks good for the hero.',
'Trick!': 'The villain pulls a stunt again and turns the tables.',
'Betrayal!': 'One of the allies betrays the hero.',
'Defeat?': 'All seems lost as the hero looks defeated, and this time the villain will finish the hero off',
'Sacrifice': 'The sidekick suffers for the hero, giving the hero an in to defeat the villain',
'Epiphany': 'The hero realizes something -- the final piece of the secret or how to use the key, to defeat the villain',
'Victory': 'The villain is defeated',
'Wrap Up': 'Hero wraps up loose ends and damage caused by villain',
'Home Alone': 'The hero bids goodbye to the allies and heads home',
'Home with Allies': 'The remaining allies, having nowhere else to go, go back to the home town of the hero',
'Settle Back In': 'Hero settles back in and tries to go back to the status quo',
'Transform Home': 'The hero uses what he learned to make his home better',
'Trouble at Home': 'The hero has one final challenge to face at home. Maybe it is the last gasp of an ally of the villain or a final surprise the villain left.',
'Growth': 'Show hero in daily life to see how hero has grown/changed'
}
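Since the generation loop silently skips any state that lacks a prompt, a quick coverage check is useful before running it. A toy sketch follows; `beat_details` and `chain_states` are stand-ins for `STORY_BEAT_DETAILS` and `[act.states for act in acts]`, not the notebook's real data.

```python
# Hypothetical coverage check (not in the original): find chain states that
# have no matching beat description and would trigger the warning branch.
beat_details = {'Setting': '...', 'Good': '...'}   # stand-in for STORY_BEAT_DETAILS
chain_states = [['Setting', 'Good', 'Avg']]        # stand-in for the acts' state lists

missing = {s for states in chain_states for s in states if s not in beat_details}
print(missing)  # {'Avg'} for this toy data
```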
# Sample response is shown here - https://drive.google.com/file/d/1I8wCagwVuYzHc84r4qmID9k-gfkGTQbb/view?usp=sharing
try:
    # themes = list of walks, one per act; each walk is a list of state names
    themes = []
    for act in acts:
        themes.append(walk(act, act.states[0]))

    # Expand the state names into beat descriptions that serve as mini-prompts for the LLM
    story_beats = []
    for i, arc in enumerate(themes):
        for point in arc:
            if point in STORY_BEAT_DETAILS:  # Check if the key exists in the dictionary
                story_beats.append(STORY_BEAT_DETAILS[point])
            else:
                print(f"Warning: '{point}' is not a valid story beat.")  # Log invalid keys

    # Set your OpenAI API key
    openai.api_key = "SAMPLE_API_KEY"

    # Generate timestamp for file names
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")

    # File to save passages, with timestamp appended
    output_passages_file = f"paranoia_passages_{timestamp}.txt"

    # Define the system message
    original_message = {
        "role": "system",
        "content": (
            '''
Generate a passage of **at least 2000 words**. This passage will be part of a **generic story set in the PARANOIA universe** and must be designed to be **self-contained**—it should not reference events, characters, or settings outside of the passage itself.
### Format & Style:
- The story should be written in **third-person omniscient** perspective, focusing primarily on the **protagonist**.
- The writing style should have a **gritty, intense tone**, as though the protagonist is struggling through a harsh, conflict-heavy environment.
- No references to specific **locations**, **events**, or **previous encounters**. The narrative should be **self-contained**. This shall ensure standalone narratives.
### Previous Passage:
A previous passage may be provided to ensure continuity between passages, but remember that all passages are **self-contained**. Do not imitate the previous passage, but use the facts that are established and do not contradict them.
'''
        )
    }

    # Character introduction points
    sidekick_intro_index = 2    # Introduce the sidekick starting from this index
    antagonist_intro_index = 4  # Introduce the antagonist starting from this index

    # Dynamically update the system message based on the index
    def update_system_message(index):
        base_message = original_message["content"]
        # Append character-specific instructions based on index
        if index >= antagonist_intro_index:
            base_message += (
                '''
### Character placeholders:
- **Protagonist**: Use the name Ralph-O-YAM.
- **Sidekick**: Use the name Linda-R-YEX.
- **Antagonist**: Use the name Jillian-G-SAL.
## Character genders:
- Protagonist is male.
- Sidekick is female.
- Antagonist is female.
## Final Note
Each character has their own motives, and the conflict should be heightened with the antagonist's actions.
'''
            )
        elif index >= sidekick_intro_index:
            base_message += (
                '''
### Character placeholders:
- **Protagonist**: Use the name Ralph-O-YAM.
- **Sidekick**: Use the name Linda-R-YEX.
## Character genders:
- Protagonist is male.
- Sidekick is female.
## Final Note
Introduce the sidekick as a tentative ally or companion who aids the protagonist's efforts but has her own unique motivations.
'''
            )
        else:
            base_message += '''
### Character placeholders:
- **Protagonist**: Use the name Ralph-O-YAM.
## Character genders:
- Protagonist is male.
## Final Note
The focus remains solely on the protagonist (Ralph-O-YAM) and his internal and external conflicts.
'''
        return {
            "role": "system",
            "content": base_message
        }

    # Generate a new passage for a beat, asking for a continuation if the
    # first response falls short of the 2000-word target
    def generate_passage(previous_passage, theme, index):
        system_message = update_system_message(index)
        user_message = {
            "role": "user",
            "content": f"Previous Passage: {previous_passage}\nTheme: {theme}"
        }
        response = openai.ChatCompletion.create(
            model="gpt-4o-mini",
            messages=[system_message, user_message],
            max_tokens=7_000,
            temperature=0.7
        )
        response_text = response.choices[0].message['content']
        if len(response_text.split()) < 2000:
            # Pass the first response back as an assistant message so the model
            # can actually see the passage it is being asked to continue
            continuation_response = openai.ChatCompletion.create(
                model="gpt-4o-mini",
                messages=[
                    system_message,
                    user_message,
                    {"role": "assistant", "content": response_text},
                    {"role": "user", "content": "Please continue the passage, so that we get **at least 2000 words**. Do not contradict what happened in your previous message."}
                ],
                max_tokens=7_000,
                temperature=0.7
            )
            continuation_response_text = continuation_response.choices[0].message['content']
            return response_text + "\n" + "="*80 + "\n\n" + continuation_response_text
        return response_text

    previous_passage = "(none provided)"
    print("Let's begin.")
    with open(output_passages_file, 'w', encoding='utf-8') as passages_file:
        for index, beat in enumerate(story_beats):
            print(f"{index+1}/{len(story_beats)}: Working on beat - {beat}")
            # Generate a new passage from the previous passage and the current beat
            new_passage = generate_passage(previous_passage, beat, index)
            print("Finished beat, saving results")
            # Write the passage to the passages file
            passages_file.write(f"Passage #{index+1}:\n{new_passage}\n\n")
            passages_file.write("\n" + "="*80 + "\n\n")
            # Update the previous passage for the next iteration
            previous_passage = new_passage
    print(f"Passages have been written to {output_passages_file}")
except Exception as e:
    print(f"An error occurred: {e}")
    input("Press Enter to exit...")
This is my Nanogenmo entry.