The thing about making art with your computer is that you really have to know what you are doing beforehand. You must have a vision and then the determination to execute that vision. Executing an artistic vision with code is hard. In painting or drawing, if you don't execute your vision you still have a painting or a drawing; with code, if you don't execute your vision, or execute it poorly, you may not even have something that runs, something that creates anything. This is similar to architecture: if I can't execute my vision, or execute it poorly, maybe I have a house that falls down, or one that leaks (I am thinking of Frank Lloyd Wright).
I just typed into chat >let there be light (make an app)
No one really interacted with my write up for the last experiment — the queer art of failure — my inability to get my rasa bot to integrate with openai.
So be it!
Today I finally generated a twine game. Actually I generated two twine games and it was AMAZING.
First off, I have not used twine since 2015. I became obsessed with twine when I read Anna Anthropy's Rise of the Videogame Zinesters, which I bought at Bluestockings before going to one of the amazing yoga classes at what is now Fierce Grace on Allen Street.
The book is amazing — I feel like I might have once pitched a twine project on the work of octavia butler or the octavia butler archive.
> create a twine game about the methylation of tryptophan
Then I said: make this more colorful with choices and SVG-generated graphics (none of the buttons work).
Chat also generated this
>Creating a twine game about the methylation of tryptophan can be a unique and educational way to explore the topic. Below is an outline for such a game:
**Title:** Methylation of Tryptophan: A Molecular Adventure
**Introduction:** You are about to embark on a molecular adventure where you’ll explore the methylation of tryptophan, a vital process in the body. As a curious scientist, you’ve been miniaturized and injected into the body to witness this process firsthand.
**Scene 1: The Microscopic World** *You find yourself inside the bloodstream, surrounded by red and white blood cells. Your journey begins as you navigate the microscopic world.*
– **Choice 1:** Explore the bloodstream. – **Choice 2:** Follow the chemical signals.
**Scene 2: Tryptophan Encounter** *You follow the chemical signals and arrive at the site where tryptophan molecules are located. They’re about to undergo methylation.*
– **Choice 1:** Observe the tryptophan molecules. – **Choice 2:** Interact with an enzyme involved in methylation.
**Scene 3: Methylation Process** *You’ve chosen to observe the tryptophan molecules. They’re about to undergo the methylation process.*
– **Choice 1:** Watch the methylation process. – **Choice 2:** Learn more about the importance of methylation.
**Scene 4: Interaction with an Enzyme** *You’ve chosen to interact with an enzyme involved in the methylation process. This enzyme will explain its role.*
– **Choice 1:** Listen to the enzyme’s explanation. – **Choice 2:** Ask the enzyme questions.
**Scene 5: The Importance of Methylation** *You’ve chosen to learn more about the importance of methylation. Discover the significance of this process in the body.*
– **Choice 1:** Learn about the biological functions. – **Choice 2:** Understand the role in gene regulation.
**Scene 6: The Outcome** *You’ve completed your journey, and now it’s time to reflect on your adventure.*
– **Choice 1:** Return to the macroscopic world. – **Choice 2:** Review what you’ve learned.
**Conclusion:** You’ve successfully completed your adventure through the methylation of tryptophan. This vital process plays a crucial role in regulating gene expression and maintaining various biological functions. Your newfound knowledge may have a positive impact on your future scientific endeavors.
This twine game provides an interactive and educational experience for players interested in learning about the methylation of tryptophan in a fun and engaging way. You can add more details, images, and explanations to make the experience even more immersive.
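The outline above is really a passage graph. Here is a hedged python sketch of that structure (the passage names follow the outline; the data shape and helper are my own):

```python
# Hedged sketch: the game outline as a passage graph, mapping each
# passage to its text and outgoing choices. Passage names follow the
# outline above; the data shape is my own invention.
PASSAGES = {
    "The Microscopic World": (
        "You find yourself inside the bloodstream.",
        ["Explore the bloodstream", "Tryptophan Encounter"],
    ),
    "Tryptophan Encounter": (
        "Tryptophan molecules are about to undergo methylation.",
        ["Methylation Process", "Interaction with an Enzyme"],
    ),
    "Methylation Process": (
        "A methyl group is added to the indole ring.",
        ["The Importance of Methylation", "The Outcome"],
    ),
    "The Outcome": ("Time to reflect on your adventure.", []),
}

def choices(passage: str) -> list[str]:
    """Return the choice links leading out of a passage."""
    return PASSAGES[passage][1]
```

In Twine each key would become a passage and each choice a `[[link]]`; the dict is just the skeleton.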
The big thing people are talking about with AI is that we are going to have AI agents. This will be a new programming and user interaction paradigm. Instead of people wandering around searching for things or talking, agents trained on custom data sets will act like “people” with intention, acting without prompts.
My first bot
create a python nlp chat bot for creating video game prompts
I am really interested in prompt engineering and in programs that create prompts that create programs or components. I tried to do that here — create a program to create prompts!
It was not super successful — but I feel like it could be a weird sort of poetry.
I asked for an NLP chatbot based on a decision tree
and I got this
I looked at the code (it's chat3.py) and it doesn't really work.
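For flavor, here is a hypothetical sketch (not the actual chat3.py) of what a decision tree that assembles video game prompts can look like:

```python
# Hypothetical sketch (not the actual chat3.py): a tiny decision tree
# that turns two answers into a video game prompt.
TREE = {
    "genre": {"rpg": "role-playing game", "puzzle": "puzzle game"},
    "mood": {"dark": "brooding", "light": "playful"},
}

def build_prompt(genre: str, mood: str) -> str:
    """Walk both branches and assemble a prompt string."""
    return f"Design a {TREE['mood'][mood]} {TREE['genre'][genre]}."
```

A real version would ask the questions interactively and branch deeper, but the weird-poetry potential is already visible.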
Surely you can do better, I told the chat, and then I got a Rasa-powered version. I was then introduced to the Rasa framework. Rasa is a Sanskrit word that means taste (or essence, or aesthetic).
I was amazed to find this library! Fun! But I am not entirely sure how it works. How does it work? Chat said…
Rasa is an open-source framework for building conversational AI chatbots and virtual assistants. It is designed to handle natural language understanding, dialogue management, and language generation to create interactive and intelligent chatbots. Here’s how Rasa works:
Natural Language Understanding (NLU):
Rasa begins by processing user inputs (text messages or speech) to understand the user’s intent and extract relevant entities (pieces of information) using NLU models. Rasa NLU supports various NLP and machine learning libraries, including spaCy and scikit-learn, to perform tasks such as tokenization, part-of-speech tagging, and entity recognition.
Domain and Intents:
In Rasa, you define a domain.yml file that specifies the intents (what users might say) and the responses (what the chatbot should say). You define intents for different user actions or requests (e.g., “greet,” “book_flight,” “get_weather”). This domain file also contains information about slots (user input that needs to be remembered) and actions (what the chatbot can do).
Dialogue Management:
Rasa uses a dialogue management system to decide how the chatbot should respond to user inputs based on intents and entities. Dialogue management is often handled through stories, which define conversation flows. You create stories in a stories.yml file, specifying sequences of intents, entities, and chatbot responses to create interactive conversations.
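As a concrete illustration, minimal `domain.yml` and `stories.yml` fragments might look like this (the intent and response names are my own examples, not from the chat):

```yaml
# domain.yml (fragment); intent and response names are examples
intents:
  - greet
  - get_weather

responses:
  utter_greet:
    - text: "Hello! How can I help?"
  utter_weather:
    - text: "Let me look that up."
```

```yaml
# stories.yml (fragment)
stories:
  - story: greet then ask for weather
    steps:
      - intent: greet
      - action: utter_greet
      - intent: get_weather
      - action: utter_weather
```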
Machine Learning Models:
Rasa employs machine learning models to predict the next action in a conversation based on the dialogue management system and the current conversation state. The dialogue management can be rule-based, but it can also use reinforcement learning techniques.
Custom Actions:
Rasa allows you to define custom actions that your chatbot can perform. These actions can include calling external APIs, looking up information in a database, or performing any other custom functionality. You can create custom actions in Python code and use them in your stories to enable more complex interactions.
Training Data:
To make predictions and understand user inputs, Rasa needs training data. You provide training data in the form of labeled conversations (stories) and NLU data (examples of user messages with their corresponding intents and entities). Rasa trains machine learning models based on this data.
Deployment:
Once your chatbot is trained and ready, you can deploy it on various platforms, including your website, messaging platforms like Slack or Facebook Messenger, or as a voice assistant.
Integration:
You can integrate Rasa chatbots with various channels and services to make them accessible to users where they are, such as web chats, mobile apps, or voice interfaces. Rasa provides connectors and SDKs for such integrations.
Iteration:
Building a chatbot with Rasa is an iterative process. You can continually improve your chatbot by adding more training data, refining stories, and enhancing your NLU models.
Rasa’s architecture and flexibility make it suitable for a wide range of conversational AI applications, from simple FAQ chatbots to complex virtual assistants. It offers a comprehensive set of tools to develop, train, deploy, and maintain chatbots that can engage in meaningful and interactive conversations with users.
I went to see harry smith at the Whitney the other night. I love harry smith. I am not sure how I first heard about him, but I became obsessed with his collection of string figures and string figures in general and string figures as the first animations and string figures as stories and string figures as constellations.
One of my favorite parts of the exhibition at the Whitney was sitting and listening to his Folkways recordings and reading how he analyzed each song. I loved reading his taxonomies and classifications.
Harry Smith also created his own notation for music and dance.
I decided let's see if chat can generate some notation for music. I picked my favorite, hildegard von bingen, and I picked a chant.
Create a diagram that is a form of notation that explores hidden and obscure features of a piece of music, be inspired by the notation of harry smith and john cage and others like this. for this example create a diagram notation for O Viridissima Vergo by Hildegard von bingen
I asked chat gpt to show various elements of the music, and chat wrote: Certainly, you can create a more detailed graphical score notation to show repetition and other musical elements in “O Viridissima Virgo” by Hildegard von Bingen. Some musical concepts and elements you can represent in a graphical score notation include:
Repetition: You can use symbols or lines to indicate repeated musical phrases, motifs, or themes. For example, you might draw an arrow pointing to a section that is repeated.
Dynamics: Use symbols such as “p” for piano (soft) and “f” for forte (loud) to indicate changes in volume. Arrows can show crescendos and decrescendos.
Articulation: Markings like staccato (short and detached) or legato (smooth and connected) can be represented using different symbols.
Tempo Changes: Arrows can indicate changes in tempo, such as accelerando (getting faster) or ritardando (slowing down).
Key Changes: Mark key changes with key signatures or use different colors to represent different keys.
Time Signatures: Show changes in time signatures (e.g., 3/4 to 4/4) to represent rhythmic variations.
Instrumentation: Use symbols or text to represent which instruments or voices are playing at a given time.
Harmony: You can indicate harmonies by showing the relationships between different pitches in the same musical passage.
Melodic Contours: Use lines to show the rising and falling of melody lines.
Counterpoint: Show different melodic lines played simultaneously, emphasizing the polyphonic nature of the composition.
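The melodic-contour idea is easy to sketch in python: map a pitch sequence to rise/fall marks. A toy, not Harry Smith's notation:

```python
# Toy sketch: render a pitch sequence (MIDI note numbers) as a melodic
# contour string: '/' for a rise, '\' for a fall, '-' for a repeat.
def contour(pitches: list[int]) -> str:
    marks = []
    for prev, cur in zip(pitches, pitches[1:]):
        marks.append("/" if cur > prev else "\\" if cur < prev else "-")
    return "".join(marks)
```

For example, `contour([60, 62, 62, 59])` yields `/-\`: up a step, hold, then fall.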
I was having a conversation the other day with a friend who said LLMs were all trained on stolen data. I said yes — this is called “primitive accumulation” — it is the origin of capitalism.
But I am not really interested in LLM content generation for our future when we all sleep in pods, matrix style, hooked up to vr headsets watching AI generated entertainment that will probably be better than some entertainment today (reality tv for example).
What I AM interested in is the relationship between LLMs and engineering, and how LLMs can insert themselves into the engineering process. Build me a component that does X. This can be an electrical component, a chemical component, a mechanical component, a programming component. My experiments have been programming components, since this is what I understand. But I have also been studying phytochemistry (not much), so I really don't know anything.
But I was looking at methylation and wanted to experiment with that.
I asked: > create an interactive diagram and process methylation — use tryptophan as an example — use python to write the program and use bio chem python packages
The result was a junky, non-interactive diagram generated by python.
I prompted >this is not interactive
And you know what — I got a really sort of fun, but sort of useless, interactive diagram. It took a few prompts to fix, because it was using Jupyter notebooks and I was not, but FUN!
Take a look at the html code in the git repo — I am into this.
I also wanted to generate the actual methylation process — ideally as an animation with bonds coming and going but this was not as successful and I gave up.
The amino acid tryptophan is crucial for serotonin production in the gut.
What is interesting is the step-by-step discussion of methylation — I'll put it here. But I don't know if this is an AI hallucination.
>The methylation of tryptophan to form methylated tryptophan involves the addition of a methyl group (CH3) to one of the carbon atoms in the tryptophan molecule. Specifically, the process involves the methylation of the indole ring in tryptophan. Here’s an intermediate step-by-step explanation of how this methylation process happens:
1. **Tryptophan Structure**: The chemical structure of tryptophan contains an indole ring, which is a hexagonal ring fused to a five-membered ring, along with various functional groups. Here’s a simplified representation:
```
H
|
N — C — C — C — C
    |   |   |   |
    H   H   H   H
```
2. **Methyl Group Addition**: The methylation process involves adding a methyl group (CH3) to one of the carbon atoms in the indole ring. The exact location of the methylation can vary depending on the enzyme or reaction conditions, but for the sake of illustration, let’s consider methylation at the 7th carbon (C7) of the indole ring. Here’s the structure after methylation:
```
H
|
N — C — C — C — C
    |   |   |   |
    H   H   H   H
                |
               CH3
```
3. **Methylated Tryptophan**: The result is the formation of methylated tryptophan, where the indole ring now has a methyl group attached to one of its carbons:
```
H
|
N — C — C — C — C
    |   |   |   |
    H   H   H   H
                |
               CH3
```
In a biological context, the methylation of tryptophan is catalyzed by specific enzymes. The exact mechanism and location of methylation can vary depending on the specific enzyme and the organism in which it occurs. Methylation reactions are essential for various biological processes, including the modification of amino acids, nucleotides, and other biomolecules, which can have significant impacts on cellular function and gene regulation. OpenAI. (2023). _ChatGPT_ (September 25 Version) [Large language model]. https://chat.openai.com
When I asked it to generate python I got this addition >In this code:
1. We create a molecule object for tryptophan using its SMILES notation.
2. We identify the carbon atom (C7) in the indole ring where we want to add the methyl group. In a real biological context, the methylation site would be determined by specific enzymes.
3. We create a methyl group molecule using its SMILES notation.
4. We merge the tryptophan molecule with the methyl group using `Chem.CombineMols`.
5. We add a single bond between the methyl group and the chosen carbon atom to complete the methylation process.
6. Finally, we use RDKit to draw and display the chemical structures of tryptophan and methylated tryptophan.
Please note that this is a simplified representation of the methylation process. In a biological context, methylation is a complex enzymatic process that can involve specific enzymes and co-factors. This code only illustrates the addition of a methyl group to a specific carbon atom for demonstration purposes. OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com
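Those steps can be sketched with RDKit roughly like this (a hedged reconstruction, not the code chat produced; the ring-carbon index I bond to is arbitrary, since a real methylation site is enzyme-determined):

```python
# Hedged RDKit sketch of the steps above. The ring-carbon index chosen
# for the new bond is arbitrary; in biology the site is enzyme-determined.
from rdkit import Chem

# 1-2. Tryptophan and a methyl group, each from SMILES notation.
trp = Chem.MolFromSmiles("c1ccc2c(c1)c(c[nH]2)CC(C(=O)O)N")
methyl = Chem.MolFromSmiles("C")

# 3. Merge the two molecules into one (still disconnected) editable mol.
combo = Chem.RWMol(Chem.CombineMols(trp, methyl))

# 4. Bond the methyl carbon (last atom) to an aromatic ring carbon.
combo.AddBond(0, combo.GetNumAtoms() - 1, Chem.BondType.SINGLE)

# 5. Finalize and recompute valences / implicit hydrogens.
methylated = combo.GetMol()
Chem.SanitizeMol(methylated)
```

From here `rdkit.Chem.Draw` can render both structures, which is roughly step 6 of the chat's description.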
## Other Packages

I also asked chat what other packages are available for chemistry and got the following
>There are several Python packages and libraries that are widely used for biochemical synthesis, analysis, and visualization. These packages provide tools for various tasks related to biochemistry, from structural analysis to data visualization. Here are some of the most popular ones:
1. **RDKit**: The RDKit is a comprehensive cheminformatics library that allows you to work with chemical structures, fingerprints, and molecular data. It’s widely used in drug discovery and biochemical analysis. It provides functions for chemical structure handling, substructure searching, and more.
— Website: [https://www.rdkit.org/](https://www.rdkit.org/)

2. **Biopython**: Biopython is a collection of tools, libraries, and scripts for computational biology and bioinformatics. It provides modules for sequence analysis, structural biology, and more.

— Website: [https://biopython.org/](https://biopython.org/)

3. **Bioconductor**: Bioconductor is a collection of R packages for the analysis and comprehension of high-throughput genomic data. It includes a wide range of tools for bioinformatics and statistical analysis.

— Website: [https://www.mdanalysis.org/](https://www.mdanalysis.org/)

7. **PyMOL**: PyMOL is a molecular visualization system that can produce high-quality 3D molecular structures and animations. While it’s not a Python library itself, it has a Python API for scripting and automating tasks.

— Website: [https://pymol.org/](https://pymol.org/)

8. **Matplotlib and Plotly**: These are popular data visualization libraries that can be used for visualizing biochemical data, including plotting protein structures, molecular interactions, and more.
These libraries cover a wide range of tasks in biochemistry, from structural analysis and molecular dynamics to bioinformatics and network analysis. Depending on your specific needs, you can choose the libraries that best suit your projects.
> any more?
Certainly, there are more Python packages for biochemical synthesis, analysis, and visualization. Here are a few additional libraries and tools that you might find useful:
11. **ChemPy**: ChemPy is a Python library for modeling and simulating chemical kinetics. It’s particularly useful for modeling chemical reactions and reaction networks.
I started today trying to get blender to work!!! To no avail. Then I thought I would ask chat to animate a person travelling through the Bardo. The Bardo is the realm between life and death. But really it is not just life and death it is all transition. One of our purposes in life is to become good at transition – this is for the final transition that we know about – death.
Some people are so fluid – so good at transition that it is all bardo – they flit in and out of time, of existence, like I imagine subatomic particles, like quarks.
I am so tired right now. So much, everywhere. I said to a friend once, i think if everyone meditated everyday we would have peace. Really what I meant was that we would have peace in the 30 min that everyone sat in meditation. You can’t do anything when you are in meditation. I don’t remember what we concluded. Maybe that I was wrong. I told this to Mary Ellen too – I said we would have peace if everyone just meditated every day. She said yes, we should do that. I said yes too. Maybe I will just have a twitch stream where we all just meditate – every day. I used to do a yoga sadhana every morning at 4:40, during the pandemic. It was never more than 10 of us. We sang the Japji and moved into asanas and chanted.
Really this post should be called the chemical wedding, but for some reason I was writing the alchemical wedding. The chemical wedding is a concept from alchemy – a symbolic story about the combination of opposites, about self-realization. But is it? I mean, I guess to someone it is. This concept really influenced Jung, the father of analytic psychology – and much of his work is about integrating the different aspects of ourselves in our process of individuation – the animus and anima (male and female). This feels outdated today.
I have tattoos of mandalas – circles. My kid Cass drew pages and pages of circles in his notebook. He said he was practicing drawing circles. Jung says when someone draws circles they are moving toward wholeness, healing. I told this to a friend who said – that is so patriarchal! Women, when they individuate, make squares. I said maybe they make Xs. This is what I thought today when I generated this image with the LLM.
create a python animation of the alchemical wedding
put this on a loop
it is still not on a loop – how do I know if the wedding has taken place
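One way to literally put an animation on a loop in python is matplotlib's `FuncAnimation` with `repeat=True`. Here is a hedged sketch; the two circles and their motion are my own stand-in for the wedding:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; no display needed
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

fig, ax = plt.subplots()
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.set_aspect("equal")

# Two principles approaching each other and parting, forever.
sun = plt.Circle((0.3, 0.5), 0.1, color="gold")
moon = plt.Circle((0.7, 0.5), 0.1, color="silver")
ax.add_patch(sun)
ax.add_patch(moon)

def update(frame):
    t = abs((frame % 40) - 20) / 20  # goes 1 -> 0 -> 1 over 40 frames
    sun.center = (0.3 + 0.2 * (1 - t), 0.5)
    moon.center = (0.7 - 0.2 * (1 - t), 0.5)
    return sun, moon

# repeat=True is what puts the wedding on a loop
anim = FuncAnimation(fig, update, frames=40, interval=50, repeat=True)
```

Whether the wedding has taken place is the frame where the two centers coincide; with `repeat=True` it keeps taking place.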
I could use a day off to reflect on my practice with LLMs, but it is just not happening right now, so I persevere and sort of phone it in. In phoning it in I am generating some interesting experiments.
Today I asked:
create a gitlab-ci pipeline that does something creative with code
I am obsessed with pipelines. A pipeline is a way to divide up code by stage (development, testing, and deployment) and to apply certain guide rails and checks to each stage. I have a whole pipeline metaphysics that I will at some point discuss. The technological society, which is our current condition, operates under pipeline metaphysics.
Creating a creative GitLab CI/CD pipeline largely depends on your specific project and goals, but I can provide an example of a creative pipeline that performs code analysis, generates ASCII art from your code, and then deploys it to a static website. This is a fun and unique way to visualize your code.
1. First, make sure your repository has the following structure:
This pipeline has two stages: `build` and `deploy`.
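A two-stage pipeline like this might be sketched as follows (the job names and paths are my guesses, with GitLab Pages handling the deploy):

```yaml
stages:
  - build
  - deploy

ascii_art:
  stage: build
  image: python:3.11
  script:
    - pip install art
    - bash scripts/generate_ascii_art.sh
  artifacts:
    paths:
      - ascii_art.txt

pages:
  stage: deploy
  script:
    - mkdir -p public
    - cp ascii_art.txt public/
  artifacts:
    paths:
      - public
  only:
    - master
```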
3. Create the `generate_ascii_art.sh` script in the `scripts/` directory:
```bash
python3 -c "
from art import *
art = text2art(open('$code_file').read(), 'block')
print(art)
" > ascii_art.txt
```
This script uses the `art` library in Python to generate ASCII art from your code and saves it in `ascii_art.txt`.
4. Make sure you have the `art` library installed in your Python environment. You can install it with:
```bash
pip install art
```
5. Create a simple HTML template in `public/index.html` to display the generated ASCII art:
```html
<!DOCTYPE html>
<html>
<head>
  <title>Generated ASCII Art</title>
</head>
<body>
<pre>
<!-- Include ASCII art here -->
$(cat ascii_art.txt)
</pre>
</body>
</html>
```
Now, when you push code changes to the `master` branch, GitLab CI/CD will automatically generate ASCII art from your code and deploy it to a static website. You can access the generated ASCII art by opening the website hosted in the `public` directory of your repository.
This is just one example of a creative GitLab CI/CD pipeline. You can adapt and expand upon it based on your project’s needs and the creative ideas you want to implement.
This did not work out of the box — as is the case with most LLM work. But it 95% worked. And I fixed it mostly by asking follow-up questions, not by editing the code myself. What is most impressive is that the LLM created a series of pieces: the pipeline code, the directory structure, and the code run by the pipeline.
The ascii art does not generate. I suspect it has something to do with my python program that generates a Fibonacci sequence. But I am not going to debug that. Instead I am really amazed by what did happen, and maybe in the future I'll go back and revisit these early experiments.
# Generating a Flask Web App

Today I generated a web app using a few lines of chat gpt prompts. I don't think it would run on a server, since it writes to the local file system and so has some permission / security flaws — but it is still pretty darn cool!
First I had the LLM generate a program that calculated an astrological chart. Then I had it generate a visualization of the chart, which was not that great. Then I had the LLM generate music. I asked that the note and the tempo be linked to the zodiac sign and house of the different planets — I am not sure this happened with any sort of finesse, but it happened.
I then asked the LLM to allow the user to input birth data, and finally I asked the LLM to create a Flask web app and a webpage that accepted the birth data, and this is the app you see.
There were a bunch of things that did not work out of the box: modules were not imported, as I said it has major security issues, and it looks pretty bad. But for a few lines of prompt, and something generated in about 10 minutes, this is pretty amazing.
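For flavor, a minimal sketch of the shape of the generated app (the route and field names are my invention, not the actual generated code):

```python
# Hypothetical sketch of the generated app's shape; route and field
# names are my invention. The real generated app also wrote files to
# the local file system, which is the security flaw mentioned above.
from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["GET", "POST"])
def chart():
    if request.method == "POST":
        # Birth data from the form; the real app computed a chart
        # and generated music from these values.
        name = request.form.get("name", "")
        date = request.form.get("date", "")
        return f"Chart for {name}, born {date}"
    return (
        '<form method="post">'
        '<input name="name" placeholder="name">'
        '<input name="date" placeholder="birth date">'
        "<button>Cast chart</button></form>"
    )
```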
What is multimodal AI? It is AI that takes input from different modes – like audio, video, or other sensors/senses.
It is the next hot thing in AI.
Implicit in multimodal AI is this concept of mapping or translation. How do we translate sound to text, or text to numbers?
For 20+ years I have been involved in this mapping question. I have called it at various times mapping, translation, or transduction. These days I prefer transduction. We are in the age of transduction – which is translation between different scales of things, or different phases of things. If we imagine the past as the science of transformation (how does one thing transform into something else?), transduction is how we map the parts of the transformation.
That may make no sense – I just think I confused myself.
In any case, for years I have played with translating/transducing chess into other modes (sound, audio, dance). Chess has 8 rows and 8 columns, like the 8 notes in the western scale. 8s show up again and again in various cultures and it would be interesting to play a bit more with this – I think of the 8 of the ba gua – the 8 trigrams – as well as ba gua the martial art. But I digress.
So today after a number of false starts I had GPT chat generate a python program to generate a musical composition based on a chess game.
chat gpt used the musical program LilyPond, which I have never used before. LilyPond generates musical notation and MIDI. This is cool.
I don't necessarily agree with the mappings – why were they selected? I asked chat gpt, which said “The mappings I provided between chess positions and musical notes in the earlier examples were selected for simplicity and as a starting point for the demonstration.” OpenAI. (2023). ChatGPT (September 25 Version) [Large language model]. https://chat.openai.com
I then asked “could you regenerate this by considering each piece as representing a different rhythm and each square as representing a different note – the horizontal being notes within an octave and the vertical as different timbres (frequencies)”
This only seemed to generate differences in rhythm – the notes were all the same. Also, the musical notation does not seem to match the MIDI.
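A toy version of the mapping I asked for, assuming the 8 files (a–h) map to notes within an octave and the 8 ranks stand in for octaves (my stand-in for the “timbres” of the prompt):

```python
# Toy sketch of a chess-to-music mapping: the 8 files (a-h) become the
# 8 notes of a scale, the 8 ranks become octaves. Purely illustrative;
# this is not what chat gpt generated.
NOTES = ["C", "D", "E", "F", "G", "A", "B", "C"]

def square_to_note(square: str) -> tuple[str, int]:
    """Map a square like 'e4' to a (note, octave) pair."""
    file, rank = square[0], int(square[1])
    note = NOTES[ord(file) - ord("a")]
    octave = 2 + rank  # ranks 1-8 -> octaves 3-10
    return note, octave
```

Feeding a game's move list through this gives a melody; adding a piece-to-rhythm table on top would recover the version of the mapping I asked for.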