Experiment 14 — Biochem in Python


LLMs for Engineering

Check out the code

I was having a conversation the other day with a friend who said LLMs were all trained on stolen data. I said yes — this is called “primitive accumulation” — it is the origin of capitalism.

But I am not really interested in LLM content generation for our future when we all sleep in pods, Matrix-style, hooked up to VR headsets watching AI-generated entertainment that will probably be better than some entertainment today (reality TV, for example).

What I AM interested in is the relationship between LLMs and engineering, and how LLMs can insert themselves into the engineering process. Build me a component that does X. This can be an electrical component, a chemical component, a mechanical component, a programming component. My experiments have been programming components, since this is what I understand. But I have also been studying phytochemistry – not much – so I really don't know anything.

But I was looking at methylation and wanted to experiment with that.

I asked:
> create an interactive diagram and process methylation — use tryptophan as an example — use python to write the program and use bio chem python packages

The result was a junky, non-interactive diagram generated by Python.

I prompted
>this is not interactive

And you know what — I got a really sort of fun, but sort of useless, interactive diagram. It took a few prompts to fix because it was using Jupyter notebooks and I was not, but FUN!

Take a look at the HTML code in the git repo — I am into this.

I also wanted to generate the actual methylation process — ideally as an animation with bonds coming and going — but this was not as successful and I gave up.

The amino acid tryptophan is crucial for serotonin production in the gut.

What is interesting is the step-by-step discussion of methylation, which I'll put here. But I don't know if this is an AI hallucination.

>The methylation of tryptophan to form methylated tryptophan involves the addition of a methyl group (CH3) to one of the carbon atoms in the tryptophan molecule. Specifically, the process involves the methylation of the indole ring in tryptophan. Here’s an intermediate step-by-step explanation of how this methylation process happens:

1. **Tryptophan Structure**:
The chemical structure of tryptophan contains an indole ring, which is a hexagonal ring fused to a five-membered ring, along with various functional groups. Here’s a simplified representation:

```
H
|
N — C — C — C — C
| | | |
H H H H
```

2. **Methyl Group Addition**:
The methylation process involves adding a methyl group (CH3) to one of the carbon atoms in the indole ring. The exact location of the methylation can vary depending on the enzyme or reaction conditions, but for the sake of illustration, let’s consider methylation at the 7th carbon (C7) of the indole ring. Here’s the structure after methylation:

```
H
|
N — C — C — C — C
| | | |
H H H H
|
CH3
```

3. **Methylated Tryptophan**:
The result is the formation of methylated tryptophan, where the indole ring now has a methyl group attached to one of its carbons:

```
H
|
N — C — C — C — C
| | | |
H H H H
|
CH3
```

In a biological context, the methylation of tryptophan is catalyzed by specific enzymes. The exact mechanism and location of methylation can vary depending on the specific enzyme and the organism in which it occurs. Methylation reactions are essential for various biological processes, including the modification of amino acids, nucleotides, and other biomolecules, which can have significant impacts on cellular function and gene regulation.
OpenAI. (2023). _ChatGPT_ (September 25 Version) [Large language model]. https://chat.openai.com

When I asked it to generate Python I got this addition:
>In this code:

1. We create a molecule object for tryptophan using its SMILES notation.

2. We identify the carbon atom (C7) in the indole ring where we want to add the methyl group. In a real biological context, the methylation site would be determined by specific enzymes.

3. We create a methyl group molecule using its SMILES notation.

4. We merge the tryptophan molecule with the methyl group using `Chem.CombineMols`.

5. We add a single bond between the methyl group and the chosen carbon atom to complete the methylation process.

6. Finally, we use RDKit to draw and display the chemical structures of tryptophan and methylated tryptophan.

Please note that this is a simplified representation of the methylation process. In a biological context, methylation is a complex enzymatic process that can involve specific enzymes and co-factors. This code only illustrates the addition of a methyl group to a specific carbon atom for demonstration purposes.
OpenAI. (2023). _ChatGPT_ (September 25 Version) [Large language model]. https://chat.openai.com
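The generated code itself didn't make it into this post, so here is my own minimal RDKit sketch of the six steps above (not the code ChatGPT produced; the SMILES string and the choice of methylation site are just illustrative):

```python
from rdkit import Chem

# Step 1: tryptophan from its SMILES notation (illustrative standard SMILES)
trp = Chem.MolFromSmiles("NC(Cc1c[nH]c2ccccc12)C(=O)O")

# Step 2: pick an aromatic C-H carbon in the six-membered (benzene) part of
# the indole ring. This index choice is illustrative; in a cell the site is
# determined by the enzyme.
ring_info = trp.GetRingInfo()
target = next(a.GetIdx() for a in trp.GetAtoms()
              if a.GetIsAromatic() and a.GetSymbol() == "C"
              and a.GetTotalNumHs() == 1
              and ring_info.IsAtomInRingOfSize(a.GetIdx(), 6))

# Steps 3-5: build a methyl fragment, merge the two molecules, and add a
# single bond between the methyl carbon and the chosen ring carbon.
methyl = Chem.MolFromSmiles("C")
combo = Chem.RWMol(Chem.CombineMols(trp, methyl))
combo.AddBond(target, combo.GetNumAtoms() - 1, Chem.BondType.SINGLE)
product = combo.GetMol()
Chem.SanitizeMol(product)

# Step 6 in the write-up drew images; printing SMILES keeps this runnable
# without a display.
print(Chem.MolToSmiles(product))
```

This is just the bookkeeping of "attach a CH3 to one ring carbon", nothing like the enzymatic mechanism.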

## Other Packages
I also asked what other Python packages are available for chemistry and got the following:

>There are several Python packages and libraries that are widely used for biochemical synthesis, analysis, and visualization. These packages provide tools for various tasks related to biochemistry, from structural analysis to data visualization. Here are some of the most popular ones:

1. **RDKit**: The RDKit is a comprehensive cheminformatics library that allows you to work with chemical structures, fingerprints, and molecular data. It’s widely used in drug discovery and biochemical analysis. It provides functions for chemical structure handling, substructure searching, and more.

- Website: [https://www.rdkit.org/](https://www.rdkit.org/)
2. **Biopython**: Biopython is a collection of tools, libraries, and scripts for computational biology and bioinformatics. It provides modules for sequence analysis, structural biology, and more.

- Website: [https://biopython.org/](https://biopython.org/)
3. **Bioconductor**: Bioconductor is a collection of R packages for the analysis and comprehension of high-throughput genomic data. It includes a wide range of tools for bioinformatics and statistical analysis.

- Website: [https://www.bioconductor.org/](https://www.bioconductor.org/)
4. **PySCeS**: PySCeS is a package for modeling and simulation of biochemical systems. It provides tools for metabolic control analysis, sensitivity analysis, and dynamic modeling of biochemical networks.

- Website: [http://pysces.sourceforge.net/](http://pysces.sourceforge.net/)
5. **BioPandas**: BioPandas is a library for working with molecular and structural data, similar to Pandas for tabular data. It allows you to manipulate and analyze biological data in a structured way.

- GitHub: [https://github.com/rasbt/biopandas](https://github.com/rasbt/biopandas)
6. **MDAnalysis**: MDAnalysis is a library for analyzing molecular dynamics (MD) simulation trajectories. It’s commonly used for studying protein structures and dynamics.

- Website: [https://www.mdanalysis.org/](https://www.mdanalysis.org/)
7. **PyMOL**: PyMOL is a molecular visualization system that can produce high-quality 3D molecular structures and animations. While it’s not a Python library itself, it has a Python API for scripting and automating tasks.

- Website: [https://pymol.org/](https://pymol.org/)
8. **Matplotlib and Plotly**: These are popular data visualization libraries that can be used for visualizing biochemical data, including plotting protein structures, molecular interactions, and more.

- Matplotlib: [https://matplotlib.org/](https://matplotlib.org/)
- Plotly: [https://plotly.com/python/](https://plotly.com/python/)
9. **NetworkX**: NetworkX is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks. It’s useful for analyzing biological networks such as protein-protein interaction networks and metabolic pathways.

- Website: [https://networkx.github.io/](https://networkx.github.io/)
10. **PySCeNe**: PySCeNe is a Python library for the analysis and visualization of cellular and subcellular compartments in spatial single-cell RNA sequencing (scRNA-seq) data.

- GitHub: [https://github.com/Teichlab/PySCeNe](https://github.com/Teichlab/PySCeNe)

These libraries cover a wide range of tasks in biochemistry, from structural analysis and molecular dynamics to bioinformatics and network analysis. Depending on your specific needs, you can choose the libraries that best suit your projects.

> any more?

ChatGPT

Certainly, there are more Python packages for biochemical synthesis, analysis, and visualization. Here are a few additional libraries and tools that you might find useful:

11. **ChemPy**: ChemPy is a Python library for modeling and simulating chemical kinetics. It’s particularly useful for modeling chemical reactions and reaction networks.

- GitHub: [https://github.com/bjodah/chempy](https://github.com/bjodah/chempy)
12. **COBRApy**: COBRApy is a Python library for constraint-based reconstruction and analysis (COBRA) of metabolic networks. It’s used in systems biology for analyzing and simulating metabolic pathways.

- GitHub: [https://opencobra.github.io/cobrapy/](https://opencobra.github.io/cobrapy/)
13. **Pybel**: Pybel is a Python wrapper for the Open Babel cheminformatics toolkit. It provides tools for chemical file format conversion and molecular descriptor calculations.

- GitHub: [https://github.com/OpenBabel/pybel](https://github.com/OpenBabel/pybel)
14. **BioPython-Structural**: This is an extension of the Biopython library that focuses on structural biology. It allows you to work with protein structures, perform structural alignments, and more.

- GitHub: [https://github.com/biopython/biopython-struct](https://github.com/biopython/biopython-struct)
15. **Chemoinformatics Jupyter Notebooks**: A collection of Jupyter notebooks with examples and tutorials on various chemoinformatics tasks, including chemical data analysis and visualization.

- GitHub: [https://github.com/chemoinformatics/chemoinformatics-jupyter-notebooks](https://github.com/chemoinformatics/chemoinformatics-jupyter-notebooks)
16. **Cytoscape Automation**: Cytoscape is a popular tool for visualizing and analyzing biological networks. Cytoscape Automation provides a Python API for interacting with Cytoscape and automating network analysis.

- GitHub: [https://github.com/cytoscape/automation](https://github.com/cytoscape/automation)
17. **BiGGER**: BiGGER (Biological Graph and General Entity Renderer) is a Python library for visualizing biological networks and pathway diagrams. It’s designed for creating publication-quality figures.

- GitHub: [https://github.com/ryaneus/gets](https://github.com/ryaneus/gets)

Experiment 13 – The Chemical Wedding


I started today trying to get blender to work!!! To no avail. Then I thought I would ask chat to animate a person travelling through the Bardo. The Bardo is the realm between life and death. But really it is not just life and death it is all transition. One of our purposes in life is to become good at transition – this is for the final transition that we know about – death.

Some people are so fluid – so good at transition that it is all bardo – they flit in and out of time, of existence, like I imagine subatomic particles, like quarks.

I am so tired right now. So much, everywhere. I said to a friend once, I think if everyone meditated every day we would have peace. Really what I meant was that we would have peace in the 30 min that everyone sat in meditation. You can’t do anything when you are in meditation. I don’t remember what we concluded. Maybe that I was wrong. I told this to Mary Ellen too – I said we would have peace if everyone just meditated every day. She said yes, we should do that. I said yes too. Maybe I will just have a Twitch stream where we all just meditate – every day. I used to do a yoga sadhana every morning at 4:40, during the pandemic. It was never more than 10 of us. We sang the Japji and moved into asanas and chanted.

Really this post should be called the chemical wedding, but for some reason I was writing the alchemical wedding. The chemical wedding is a concept from alchemy – it is a symbolic story about the combination of opposites – of self-realization. But is it? I mean I guess to someone it is. This concept really influenced Jung, the father of analytic psychology – and much of his work is about integrating the different aspects of ourselves in our process of individuation – the animus and anima (male and female). This feels outdated today.

I have tattoos of mandalas – circles. My kid Cass drew pages and pages of circles in his notebook. He said he was practicing drawing circles. Jung says when someone draws circles they are moving toward wholeness, healing. I told this to a friend who said – that is so patriarchal! Women, when they individuate, make squares. I said maybe they make Xs. This is what I thought today when I generated this image with the LLM.

Prompts:

create a python animation of the alchemical wedding

put this on a loop

it is still not on a loop – how do I know if the wedding has taken place

generate_symbol is not defined

Experiment 12 — LLMs & Fun With Pipelines


I could use a day off to reflect on my practice with LLMs, but it is just not happening right now, so I persevere and sort of phone it in. In phoning it in I am generating some interesting experiments.

Today I asked:

create a gitlab-ci pipeline that does something creative with code

I am obsessed with pipelines. A pipeline is a way to divide up code by stage – development, testing, and deployment – and to apply certain guardrails and checks to each stage. I have a whole pipeline metaphysics that I will at some point discuss. The technological society – which is our current condition – operates under pipeline metaphysics.

The Result

Creating a creative GitLab CI/CD pipeline largely depends on your specific project and goals, but I can provide an example of a creative pipeline that performs code analysis, generates ASCII art from your code, and then deploys it to a static website. This is a fun and unique way to visualize your code.

1. First, make sure your repository has the following structure:

```
project-root/
├── .gitlab-ci.yml
├── src/
│   ├── your_code.py
├── scripts/
│   ├── generate_ascii_art.sh
├── public/
```

2. Create a `.gitlab-ci.yml` file in the project’s root directory:

```yaml
stages:
  - build
  - deploy

variables:
  ARTIFACTS_DIR: artifacts
  PUBLIC_DIR: public

generate_ascii_art:
  stage: build
  script:
    - mkdir -p $ARTIFACTS_DIR
    - ./scripts/generate_ascii_art.sh src/your_code.py > $ARTIFACTS_DIR/ascii_art.txt
  artifacts:
    paths:
      - $ARTIFACTS_DIR/ascii_art.txt

deploy_to_website:
  stage: deploy
  only:
    - master
  script:
    - cp $ARTIFACTS_DIR/ascii_art.txt $PUBLIC_DIR/index.html
  artifacts:
    paths:
      - $PUBLIC_DIR
```

This pipeline has two stages: `build` and `deploy`.

3. Create the `generate_ascii_art.sh` script in the `scripts/` directory:

```bash
#!/bin/bash

code_file=$1

python3 -c "
from art import *
art = text2art(open('$code_file').read(), 'block')
print(art)
" > ascii_art.txt
```

This script uses the `art` library in Python to generate ASCII art from your code and saves it in `ascii_art.txt`.

4. Make sure you have the `art` library installed in your Python environment. You can install it with:

```bash
pip install art
```

5. Create a simple HTML template in `public/index.html` to display the generated ASCII art:

```html
<!DOCTYPE html>
<html>
<head>
<title>Generated ASCII Art</title>
</head>
<body>
<pre>
<!-- Include ASCII art here -->
$(cat ascii_art.txt)
</pre>
</body>
</html>
```

Now, when you push code changes to the `master` branch, GitLab CI/CD will automatically generate ASCII art from your code and deploy it to a static website. You can access the generated ASCII art by opening the website hosted in the `public` directory of your repository.

This is just one example of a creative GitLab CI/CD pipeline. You can adapt and expand upon it based on your project’s needs and the creative ideas you want to implement.

OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com

The Commentary

This did not work out of the box — as is the case with most LLM work. But it 95% worked. And I fixed it mostly by asking follow-up questions, not editing the code myself. What is most impressive is that the LLM created a series of pieces: the pipeline code, the directory structure, and the code run by the pipeline.

The ASCII art does not generate. I suspect it has something to do with my Python program that generates a Fibonacci sequence. But I am not going to debug that. Instead I am really amazed at what did happen, and maybe in the future I’ll go back and revisit these early experiments.
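One likely culprit I noticed later: the `$(cat ascii_art.txt)` placeholder in the HTML template is shell syntax, and a static page never executes it, so the art has to be inlined by a build step. A stdlib-only sketch of such a step (my own illustration; the paths and the sample art are hypothetical stand-ins matching the example layout):

```python
import html
import pathlib

# Hypothetical stand-in for the pipeline's generated art file.
pathlib.Path("artifacts").mkdir(exist_ok=True)
pathlib.Path("artifacts/ascii_art.txt").write_text("  /\\\n /  \\  sample art\n")

# Read the art and inline it into the page, escaping <, >, & so the
# characters render literally inside <pre>.
art_text = pathlib.Path("artifacts/ascii_art.txt").read_text()
page = (
    "<!DOCTYPE html>\n<html>\n<head><title>Generated ASCII Art</title></head>\n"
    "<body>\n<pre>\n{art}\n</pre>\n</body>\n</html>\n"
).format(art=html.escape(art_text))

pathlib.Path("public").mkdir(exist_ok=True)
pathlib.Path("public/index.html").write_text(page)
```

In the pipeline this would replace the bare `cp` in the deploy job.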

This project and pipeline exist in GitLab.

Experiment 11 — LLM Generates a Website to do Astrology

art

https://www.youtube.com/watch?v=KpQ5oj-qbCs

# Generating a Flask Web App
Today I generated a web app using a few lines of ChatGPT prompts. I don’t think it would run on a server, since it writes to the local file system and so has some permission / security flaws — but it is still pretty darn cool!

First I had the LLM generate a program that calculated an astrological chart. Then I had it generate a visualization of the chart, which was not that great. Then I had the LLM generate music. I asked that the note and the tempo be linked to the zodiac and house of the different planets — I am not sure this happened with any sort of finesse, but it happened.

I then asked the LLM to allow the user to input birth data, and finally I asked the LLM to create a Flask web app and a webpage that accepted the birth data — and this is the app you see.

There were a bunch of things that did not work out of the box — modules were not imported, as I said it has major security issues, and it looks pretty bad — but for a few lines of prompt, and something that was generated in about 10 minutes, this is pretty amazing.

Check out the code.

Multimodal AI


What is multimodal AI? It is AI that takes input from different modes – like audio, video, or other sensors/senses.

It is the next hot thing in AI.

Implicit in multimodal AI is this concept of mapping or translation. How do we translate sound to text, or text to numbers?

For 20+ years I have been involved in this mapping question. I have called it at various times mapping, translation, or transduction. These days I prefer transduction. We are in the age of transduction – which is translation between different scales of things – or different phases of things. If we imagine the past as the science of transformation – how does one thing transform into something else – transduction is how we map the parts of the transformation.

That may make no sense – I just think I confused myself.

In any case, for years I have played with translating/transducing chess into other modes (sound, audio, dance). Chess has 8 rows and 8 columns, like the 8 notes in the western scale. 8s show up again and again in various cultures and it would be interesting to play a bit more with this – I think of the 8 of the ba gua – the 8 trigrams – as well as ba gua the martial art. But I digress.

So today, after a number of false starts, I had ChatGPT generate a Python program to generate a musical composition based on a chess game.

ChatGPT used the music program LilyPond, which I have never used before. LilyPond generates musical notation and MIDI. This is cool.

I don’t necessarily agree with the mappings – why were they selected? I asked ChatGPT, which said
“The mappings I provided between chess positions and musical notes in the earlier examples were selected for simplicity and as a starting point for the demonstration.” OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com

I then asked “could you regenerate this by considering each piece as representing a different rhythm and each square as representing a different note – the horizontal being notes within an octave and the vertical as different timbres (frequencies)”

This only seemed to generate differences in rhythm – the notes were all the same. Also, the musical notation does not seem to match the MIDI.
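For what it's worth, the square-to-note half of the mapping I was asking for can be sketched in a few lines of Python (my own toy mapping, not the LilyPond code ChatGPT generated):

```python
# Toy mapping: file (a-h) -> scale degree within an octave,
# rank (1-8) -> a register index standing in for timbre/frequency band.
# Real timbre would need an instrument assignment in LilyPond or MIDI.
NOTES = ["c", "d", "e", "f", "g", "a", "b", "c'"]  # 8 files -> 8 degrees

def square_to_event(square):
    """Map a chess square like 'e4' to a (note, register) pair."""
    file_ch, rank_ch = square[0].lower(), square[1]
    note = NOTES[ord(file_ch) - ord("a")]
    register = int(rank_ch)  # illustrative stand-in for a timbre choice
    return note, register

# A few opening moves become a little event stream:
for sq in ["e4", "e5", "g1", "f3"]:
    print(sq, "->", square_to_event(sq))
```

The piece-to-rhythm half would be another lookup table keyed on the piece letter, which is what the generated version seemed to do.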

In any case – I found this interesting.

Experiment 9 – Audio Generation


It seems I forgot to post Experiment 8 to this blog – find it on Medium – it generates ASCII art.

Today was a bit of a GPT fail. I decided I was going to install the Blender Python library on my Linux machine and do that – but as of now the install is still failing! In the meantime I returned to my Csound puttering, which I will now officially give up on since I have not been able to generate anything that works in a week. I’ll move on to Tidal Cycles. What I did was generate the sound of the lifecycle of milkweed.

What did I do

Roughly, I asked GPT to generate a Csound file that represents the lifecycle of milkweed, and when the Csound file did not work I asked it to use a different programming language. It picked Python, and the result is in Experiment 9 in my GitHub repo.

The result is also in that file because I am too lazy to upload it to SoundCloud.
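As a stand-in for the kind of raw-audio Python GPT produced, here is a minimal stdlib-only sketch that writes a sine tone to a WAV file (the filename and parameters are my own illustration, not the repo code):

```python
# Minimal stdlib audio sketch: render a short sine tone to a mono 16-bit WAV.
import math
import struct
import wave

RATE = 44100  # samples per second

def write_tone(path, freq=440.0, secs=0.5, volume=0.5):
    """Write `secs` seconds of a sine wave at `freq` Hz to `path`."""
    n = int(RATE * secs)
    frames = b"".join(
        struct.pack("<h", int(32767 * volume * math.sin(2 * math.pi * freq * t / RATE)))
        for t in range(n)
    )
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(frames)

write_tone("tone.wav")  # hypothetical output filename
```

A "lifecycle" piece is then just a sequence of such tones with changing frequency and duration.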

Reflections

I also attempted to generate a visualization of this that looked like an oscilloscope. None of my efforts worked. And then I asked GPT to just generate a video – this also did not work. Sometimes experiments fail.

I do feel super burnt out. But I am pushing through on the daily experiments. I believe there is something on the other side of this!

Experiment 7 – 3D


So I was trying to get the LLM Blender code integration to work, but I think there is a bug with OSX. I am going to try with my Ubuntu machine tomorrow.

Instead I have been thinking about GIS and geographical data.

Prompt

i create a simulation of a volcano from us geological data

Code

Creating a realistic simulation of a volcano using US Geological data is a complex and data-intensive task that typically involves geographic information systems (GIS), computational fluid dynamics, and geological data sources. However, I can provide a simplified Python example of how to create a basic visual representation of a volcano using 3D graphics with the VPython library. This simulation is not based on real geological data but can serve as a starting point for visualization:

from vpython import *

# Create the 3D scene
scene = canvas(title="Volcano Simulation", width=800, height=600)

# Create a simple volcano cone
volcano_cone = cone(pos=vector(0, 0, 0), axis=vector(0, 1, 0), radius=5, color=color.red)

# Create a lava flow
lava = cylinder(pos=vector(0, 0, 0), axis=vector(0, 5, 0), radius=0.5, color=color.orange)  # radius must be nonzero to be visible

# Create a ground plane
ground = box(pos=vector(0, -5, 0), length=20, height=0.1, width=20, color=color.green)

# Simulation loop
t = 0
dt = 0.01
while True:
    rate(100)  # Adjust the frame rate as needed

    # Make the lava flow down
    lava.pos.y -= 0.1
    t += dt

    # Restart lava flow when it reaches the ground
    if lava.pos.y < ground.pos.y:
        lava.pos.y = 0
        t = 0

In this code, we use the VPython library to create a basic 3D scene with a volcano cone and lava flow. The lava flows down and restarts when it reaches the ground. Keep in mind that this is a simplified visual representation for educational purposes and is not based on real geological data.

Creating a scientifically accurate volcano simulation would require access to geological data, including terrain data, eruption dynamics, and possibly a GIS platform for visualizing the data. This goes far beyond the scope of a simple Python script and would typically involve collaboration with experts in the field of geology and volcanology.

Conclusion

I did not use my template from yesterday. I regret that. However, I was fascinated that the LLM created a website to view 3D data. I had never used VPython before.

Generating Documentation with Python and LLMs


So today I am going to generate something in the 3D program Blender. I really want a day to reflect on what I am doing — maybe this weekend. For now I am just keeping up the pace.

I had to do some basic modifications on my system — namely installing the Blender Python module, which I had to build from source — ugh — pain.

It is taking too long so I am going another direction. I am generating a python script to help me document these experiments!

Prompt

generate a python script that creates a template to document daily experiments with chatgpt generating code

Code

Creating a Python script to document daily experiments with ChatGPT generating code is a helpful way to keep track of your work and the results you obtain. Below is a simple Python script template that you can use as a starting point. This script uses the input() function to take notes on your experiments and stores them in a text file for future reference:

```python
import os
from datetime import datetime

# Define the path to the experiment log file
log_file = 'experiment_log.txt'

# Check if the log file exists, create it if not
if not os.path.exists(log_file):
    with open(log_file, 'w') as f:
        f.write("Experiment Log\n")
        f.write("----------------\n")

def log_experiment():
    # Get the current date and time
    current_time = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

    # Prompt the user for an experiment description
    experiment_description = input("Enter a description of the experiment: ")

    # Append the experiment details to the log file
    with open(log_file, 'a') as f:
        f.write(f"\nDate and Time: {current_time}\n")
        f.write(f"Experiment Description: {experiment_description}\n")
        f.write("-" * 40 + "\n")

if __name__ == "__main__":
    while True:
        print("Options:")
        print("1. Log an experiment")
        print("2. Quit")
        choice = input("Select an option: ")
        if choice == '1':
            log_experiment()
        elif choice == '2':
            break
        else:
            print("Invalid choice. Please select a valid option.")
```

This script provides a menu where you can choose to log an experiment or quit the program. When you log an experiment, it records the current date and time along with your experiment description into the experiment_log.txt file. You can customize the log file’s name and structure to suit your preferences.

To use the script:

  1. Save it as a Python (.py) file.
  2. Run the script.
  3. Follow the prompts to log your daily experiments and notes.

This simple template can be expanded to include more features like tagging, categories, or additional metadata to make it more comprehensive based on your needs.

Update

I wanted to modify this script to collect more data and output as markdown — I am writing this in obsidian or nvim.

Prompt 2

modify this script to output markdown and to collect the following data about an experiment: an image, a github link, an optional youtube link, a list of prompts and results (formated with code markdown), and initial summary and inspiration, and a final analysis and conclusion.

Code

To modify the script to output Markdown and collect more detailed information about an experiment, including an image, GitHub link, YouTube link, prompts, results, and additional descriptions, you can use the following modified script:

````python
import os
from datetime import datetime

# Define the path to the experiment log file
log_file = 'experiment_log.md'

# Check if the log file exists, create it if not
if not os.path.exists(log_file):
    with open(log_file, 'w') as f:
        f.write("# Experiment Log\n")

def log_experiment():
    # Get the current date and time
    current_time = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

    # Prompt the user for experiment details
    experiment_title = input("Enter a title for the experiment: ")
    experiment_description = input("Enter a brief description: ")
    github_link = input("Enter the GitHub link (if any): ")
    youtube_link = input("Enter the YouTube link (if any): ")

    # Create a Markdown section for the experiment
    markdown_content = f"""
## Experiment: {experiment_title}
- **Date and Time:** {current_time}
- **Description:** {experiment_description}
- **GitHub Link:** {github_link}
- **YouTube Link:** {youtube_link}
"""

    # Collect prompts and results
    prompts = []
    results = []
    while True:
        prompt = input("Enter a prompt (or 'q' to finish): ")
        if prompt == 'q':
            break
        result = input("Enter the result (formatted with code markdown): ")
        prompts.append(prompt)
        results.append(result)

    # Collect additional descriptions
    initial_summary = input("Enter an initial summary: ")
    inspiration = input("Enter inspiration for the experiment: ")
    final_analysis = input("Enter a final analysis and conclusion: ")

    # Append the experiment details to the log file
    with open(log_file, 'a') as f:
        f.write(markdown_content)
        for prompt, result in zip(prompts, results):
            f.write(f"**Prompt:** {prompt}\n\n")
            f.write(f"**Result:**\n\n```\n{result}\n```\n\n")
        f.write(f"**Initial Summary:** {initial_summary}\n\n")
        f.write(f"**Inspiration:** {inspiration}\n\n")
        f.write(f"**Final Analysis and Conclusion:** {final_analysis}\n\n")

if __name__ == "__main__":
    while True:
        print("Options:")
        print("1. Log an experiment")
        print("2. Quit")
        choice = input("Select an option: ")
        if choice == '1':
            log_experiment()
        elif choice == '2':
            break
        else:
            print("Invalid choice. Please select a valid option.")
````

This script will collect and organize more detailed information for each experiment, including prompts, results, initial summary, inspiration, and final analysis and conclusion. It generates a Markdown file (experiment_log.md) for documentation.

Remember to install any necessary libraries for rendering images or including links to your GitHub and YouTube pages, as those functionalities are not included in the basic script.

OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com

Conclusion

Take a look at the experiment log in the GitHub repository for experiment 5. It needs some work, but I think it will really help me save time in documenting these experiments!

Day 5 of the LLM Experiments


I am sort of sticking this in at the end of my day, like a rushed cup of coffee or push-ups and sit-ups between meetings. But each experiment I do teaches me something new – to make a truly banal comment.

I am thinking about creating a format for these writeups so they are less idiosyncratic – but I am not doing that today. The code is available on GitHub along with animations and photos.

Today I started out trying to generate music again. My music experiments don’t really turn out – oftentimes a wave form is not generated. I did a prompt to improvise on chants of Japji – the Sikh chant – and I got something in Korean via Google Translate, which was strange. It made me think about doing an audio piece with many different languages. But text-to-speech sounds pretty bad. We need some FFTs or something to add some personality.

I decided to prompt the LLM to make a game. This is new and visual.

It is based on the Aesop fable The Crow and the Pitcher. The crow adds pebbles to a pitcher of water – as the stones are added, the water rises and the crow can then drink from the pitcher.

Lots of fables and fairytales feel like they have video game logic to me.

Prompt

generate a game in python where the player tries to stack rocks in a cup so that a crow can drink the water in the cup like the aesop fable crow and pitcher

Output

Creating a Python game where the player attempts to stack rocks in a cup for a crow to drink the water, inspired by Aesop’s fable “The Crow and the Pitcher,” is a fun project. Here’s a simple text-based version of such a game:

import random

# Initialize game variables
cup_water = 0
rocks_in_cup = 0
crow_thirst = 100
game_over = False

print("Welcome to 'The Crow and the Cup' Game!\n")
print("The crow is thirsty, and you need to stack rocks in the cup to raise the water level.")

while not game_over:
    print("\nOptions:")
    print("1. Stack a rock in the cup")
    print("2. Let the crow drink")
    print("3. Quit the game")

    choice = input("Enter your choice (1/2/3): ")

    if choice == "1":
        added_water = random.randint(1, 5)
        cup_water += added_water
        rocks_in_cup += 1
        crow_thirst -= added_water
        print(f"You added {added_water} units of water to the cup. The crow's thirst is now {crow_thirst}.")
    elif choice == "2":
        if cup_water >= 20:
            print("The crow is happy and quenched its thirst! You win!")
            game_over = True
        else:
            print("The water level is too low for the crow to drink. Keep stacking rocks.")
    elif choice == "3":
        print("You quit the game. The crow is still thirsty.")
        game_over = True
    else:
        print("Invalid choice. Please select 1, 2, or 3.")

    # Check for crow's condition
    if crow_thirst <= 0:
        print("The crow's thirst reached zero. The crow is satisfied, and you win!")
        game_over = True
    elif rocks_in_cup >= 20:
        print("You've stacked enough rocks, but the crow is still thirsty. You didn't add enough water.")
        game_over = True

print("Thanks for playing 'The Crow and the Cup' Game!")

In this text-based game, you have three options: adding rocks to the cup, allowing the crow to drink, or quitting the game. The goal is to stack enough rocks to raise the water level so the crow can quench its thirst. If you successfully raise the water level to 20 units or satisfy the crow’s thirst, you win the game.

You can expand this game by adding a graphical interface, more complex mechanics, and visual elements to make it more engaging.
OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com

Comment

![[crowandcup.png]]

The output was a text-based game. Something I did not expect. I was really surprised and inspired. It made me want to experiment more in this genre – or maybe specify a package like Twine.

This also does have game logic: your goal is to add enough rocks so that the crow can drink before it dies.

But I wanted an audiovisual experience.

Prompt

make the game a video animation

please add some generated sound effects

Code

import pygame
import random
import sys
from pydub import AudioSegment
from pydub.generators import Sine

# Initialize Pygame
pygame.init()

# Initialize PyDub audio
pygame.mixer.init()

# Constants
# (Your previous constants here...)

# Load sound effects using PyDub
rock_drop_sound = Sine(440).to_audio_segment(duration=200)   # Example rock drop sound
crow_drink_sound = Sine(880).to_audio_segment(duration=500)  # Example crow drink sound

# Game variables
# (Your previous game variables here...)

def update_game_state():
    global cup_water, rocks_in_cup, crow_thirst

    # Generate sound effects
    rock_drop_sound.play()
    pygame.time.delay(200)  # Add a delay for sound effect
    crow_drink_sound.play()

    added_water = random.randint(1, 5)
    cup_water += added_water
    rocks_in_cup += 1
    crow_thirst -= added_water

    # (Your previous code here...)

def main():
    # (Your previous code here...)

    while not game_over:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                game_over = True
            if event.type == pygame.KEYDOWN:
                if event.key == pygame.K_s:
                    update_game_state()
                if event.key == pygame.K_d:
                    if cup_water >= 20:
                        crow_thirst = 0

        # (Your previous code here...)

if __name__ == "__main__":
    main()


Comment

I had to do some debugging on this – there were global variables that were not defined, and other issues. But I do have an audiovisual experience: I press the s and d keys and something happens. I am not sure if there is game logic. It is not apparent from playing the game – I have to look in the code.
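One gotcha in the generated code is that a pydub AudioSegment has no `.play()` method, so the sound-effect lines need rework anyway. A dependency-free alternative (a sketch using only the standard library; the filenames and parameters are my own choices, not from the generated code) is to synthesize the same sine tones and write them as WAV files, which `pygame.mixer.Sound` can then load:

```python
import math
import struct
import wave

def write_sine_wav(path, freq_hz, ms, rate=44100, amp=0.5):
    """Write a mono 16-bit PCM sine tone of the given length to a WAV file."""
    n_frames = int(rate * ms / 1000)
    frames = bytearray()
    for i in range(n_frames):
        sample = amp * math.sin(2 * math.pi * freq_hz * i / rate)
        frames += struct.pack("<h", int(sample * 32767))  # little-endian 16-bit
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# Same pitches and durations as the Sine(440)/Sine(880) calls above
write_sine_wav("rock_drop.wav", freq_hz=440.0, ms=200)
write_sine_wav("crow_drink.wav", freq_hz=880.0, ms=500)
```

After that, `pygame.mixer.Sound("rock_drop.wav").play()` works without pydub at all.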

I am sort of inspired by the text element however I am going to explore that more in the future.

GN

Experiment 4 – Literature Diagrams

art
    
I was talking with my friend Nicolas about LLMs and he said something about generating diagrams of the relationships between the characters in Hamlet.

The code is on GitHub.

I am sort of interested in seeing the interactions between all the characters in Hamlet as a dance – how often do characters talk to each other, in which scenes, how does this change over time, where do the scenes take place, etc.
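The "how often do characters share a scene" part of that is mostly counting. A minimal sketch (standard library only; the scene data below is a made-up stand-in for an actual parsed play text) counts every unordered pair of characters that appear in a scene together:

```python
from collections import Counter
from itertools import combinations

# Toy stand-in for real parsed scene data: who appears in each scene.
scenes = [
    {"Hamlet", "Horatio", "Marcellus"},
    {"Hamlet", "Gertrude", "Claudius"},
    {"Hamlet", "Ophelia", "Polonius"},
    {"Hamlet", "Gertrude", "Claudius", "Polonius"},
]

pair_counts = Counter()
for scene in scenes:
    # Each unordered pair present in the same scene counts as one interaction;
    # sorting makes ("Claudius", "Hamlet") and ("Hamlet", "Claudius") one key.
    for pair in combinations(sorted(scene), 2):
        pair_counts[pair] += 1
```

The resulting counts map straight onto weighted edges in a NetworkX graph, and rerunning the count per act gives the over-time view.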

I was also interested to see if ChatGPT could create a prompt that could create a script – TLDR – I was unsuccessful. I was also unsuccessful in creating anything more specific. Perhaps I need to work on my prompts. I am also a bit tired.

Prompt

create a python script that shows how the characters in hamlet interact with each other overtime make an animated network diagram and animate the diagram so that we see how the character relationships change in each scene

Response

Creating an animated network diagram of how the characters in “Hamlet” interact with each other over time is a complex task that requires a combination of text processing, data visualization, and animation libraries. Here’s a high-level outline of how you might approach this task:

  1. Data Preparation:
  • First, you need to extract the data from the play. You can use a digital text of "Hamlet" and write a Python script to parse the text to identify which characters are in each scene together. You will also need to note the emotional connections between characters.
  2. Data Representation:
  • Create a data structure to represent the relationships between characters in each scene. You might use a graph or network structure where characters are nodes, and interactions are edges.
  3. Data Visualization:
  • Utilize a data visualization library like NetworkX and Matplotlib to create static network diagrams for each scene. You can assign different colors or styles to edges to represent the nature of interactions (e.g., positive, negative, neutral).
  4. Animation:
  • For animation, you can use a library like NetworkX or Plotly to create animated network diagrams. You will need to update the network structure for each scene and save it as a sequence of images or a video.

Here’s a simplified example of how you can create a static network diagram using NetworkX and Matplotlib:

import networkx as nx
import matplotlib.pyplot as plt

# Create a sample network graph
G = nx.Graph()
G.add_edge("Hamlet", "Ophelia")
G.add_edge("Hamlet", "Gertrude")
G.add_edge("Hamlet", "Claudius")
G.add_edge("Ophelia", "Polonius")
G.add_edge("Laertes", "Ophelia")
G.add_edge("Polonius", "Ophelia")
G.add_edge("Polonius", "Hamlet")
G.add_edge("Rosencrantz", "Hamlet")
G.add_edge("Guildenstern", "Hamlet")
G.add_edge("Horatio", "Hamlet")

# Plot the network graph
pos = nx.spring_layout(G)
nx.draw(G, pos, with_labels=True, node_color="skyblue", node_size=1000, font_size=10, font_color="black")
edge_labels = {("Hamlet", "Ophelia"): "Love", ("Hamlet", "Gertrude"): "Mother-Son", ("Hamlet", "Claudius"): "Conflict", ("Polonius", "Hamlet"): "Conflict"}
nx.draw_networkx_edge_labels(G, pos, edge_labels=edge_labels, font_color="red")

plt.title("Character Interactions in 'Hamlet'")
plt.axis('off')
plt.show()

For the animation part, you can extend this by iterating through scenes and updating the graph structure for each scene. You can then use a library like matplotlib’s FuncAnimation or Plotly to create an animated sequence.

Creating a complete animated network diagram with changing character relationships throughout the entire play is a substantial project that requires time and effort. You may also need to create a data structure to track how the character relationships change over time as the play progresses.

OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com

Thoughts

This worked pretty much out of the box but did not animate. The graphic was sort of cool. The relationships are not super nuanced, but it is something.

I want to get more complex and nuanced interactions. I am thinking of training my own ML model – or having my own graph DB of word embeddings populated with my own documents. Maybe I should have ChatGPT use C++ or Haskell or Rust or something.

Making it animate

I added this prompt

come on – you can add an animation to show how this relationship changes between act 1 act2 and act 3

Result

Creating a complete animated network diagram with scene-by-scene changes in character relationships in a play as complex as “Hamlet” is a substantial project that goes beyond the scope of a single text-based response. However, I can provide you with a simplified example of how you might create an animation to show changes in character relationships between Act 1, Act 2, and Act 3 using NetworkX and Matplotlib.

Here’s a basic Python script to create a simple animated network diagram:

import networkx as nx
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

# Create a sample network graph for Act 1
G_act1 = nx.Graph()
G_act1.add_edge("Hamlet", "Ophelia")
G_act1.add_edge("Hamlet", "Gertrude")

# Create a sample network graph for Act 2
G_act2 = nx.Graph()
G_act2.add_edge("Hamlet", "Ophelia")
G_act2.add_edge("Hamlet", "Gertrude")
G_act2.add_edge("Hamlet", "Polonius")

# Create a sample network graph for Act 3
G_act3 = nx.Graph()
G_act3.add_edge("Hamlet", "Ophelia")
G_act3.add_edge("Hamlet", "Gertrude")
G_act3.add_edge("Hamlet", "Polonius")
G_act3.add_edge("Laertes", "Ophelia")

# Create a list of network graphs for each act
acts = [G_act1, G_act2, G_act3]

# Create a function to update the plot for each act
def update_plot(frame):
    plt.clf()
    G = acts[frame]
    pos = nx.spring_layout(G)
    nx.draw(G, pos, with_labels=True, node_color="skyblue", node_size=1000, font_size=10, font_color="black")
    plt.title(f"Act {frame+1}")
    plt.axis('off')

# Create an animation
ani = FuncAnimation(plt.gcf(), update_plot, frames=len(acts), repeat=False)

# Display the animation
plt.show()

This code sets up a basic animation that displays the network diagram for each act (Act 1, Act 2, and Act 3) with changing character relationships. You can extend this approach by adding more acts and more complex relationships as needed.

Please note that this is a simplified example. In a real-world project, you would need to parse the play’s text to extract character interactions for each scene, which is a non-trivial task.

OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com

Upgrades

Changing repeat to True makes this loop. I do think this is an interesting pedagogical tool for learning code – not so much for understanding Hamlet.
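A cheap way to make the act-to-act change explicit without any animation is plain set arithmetic on the edge lists. A sketch (restating the edges from the G_act1/G_act2/G_act3 graphs above as Python sets, so no plotting library is involved):

```python
# Edges per act, restated as sets of pairs (same data as the graphs above).
act1 = {("Hamlet", "Ophelia"), ("Hamlet", "Gertrude")}
act2 = act1 | {("Hamlet", "Polonius")}
act3 = act2 | {("Laertes", "Ophelia")}

acts = [act1, act2, act3]
for i in range(1, len(acts)):
    # Set difference in each direction shows what appeared and what dropped out.
    added = acts[i] - acts[i - 1]
    removed = acts[i - 1] - acts[i]
    print(f"Act {i} -> Act {i + 1}: added {sorted(added)}, removed {sorted(removed)}")
```

The same diff, printed onto each animation frame's title, would also make the FuncAnimation version easier to read.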