Broadly speaking, The Lifecycle of Software Objects by Ted Chiang is about how to educate virtual-world avatars (digients), the emotional connections they form with humans and with each other, and how those connections relate to their utility and education. This is not the same as my presentation, but a brief discussion of the economics of virtual objects and their relation to mental states as represented in The Lifecycle of Software Objects.
“Teleology and Motivation”
Running through this novella is a contrast between purposefulness and purposelessness, and the dialectic between the two. Ana, the avatar trainer, begins her relationship with the digients because she needs a job, as capitalist societies require in order to participate fully in the social system. The end of the job is to make money; she is not terribly interested in the work itself, but she needs the money. When the company behind the Neuroblast engine folds, she still desires to work with the avatars. The teleology has shifted from an economic or capitalist basis to one based on pleasure and desire. There is then the final loop in the circuit: Ana is motivated to get a job not because of her role in the capitalist social system, but because she desires to give her digient Jax a “full life.” Is her free time playing with and educating Jax a commodification of her leisure time, or leisure without economic benefit?
There is the tangential issue of how we motivate people to work or to learn: do we make the object of learning appealing, do we modify brain chemistry via drugs, or are economic incentives alone enough? This is not so unlike the relation between knowledge and eros in the Socratic dialogues, and the idea that knowledge exists in a wider ecosystem rather than on a rarefied logical plane.
On a personal level, this is something I think a lot about with my company, Print All Over Me: the different incentives for community engagement and how they shape feelings about the system as a whole. For example, people are not paid to post on Facebook, yet Facebook makes money off this free labor. Sites regularly pop up offering monetized versions of this, yet people are not interested in using a system where their communications and relationships are so obviously commodified (e.g., Steemit).
“It's Like Fashion – It Never Ends”
I misquoted this from the movie The Social Network. The title of Chiang's story is “The Lifecycle of Software Objects.” The lifecycle itself is an interesting problem to consider in light of various situations introduced in the novella. Unlike a physical item, which undergoes no further state changes and has no desires of its own once it is created, a digital object will continue to change as long as it is receiving inputs. How do we square this with the second economic property of digital objects, namely that they have zero marginal cost? It costs me nothing to make a copy. In the world of the digients, however, a copy is not exactly a copy, since it changes over time and becomes individuated. It would be interesting to explore Simondon here.
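The tension between zero-marginal-cost copying and individuation can be sketched in a few lines. This is a toy illustration only; the `Digient` class and its events are invented for the example, not taken from the novella.

```python
import copy

class Digient:
    """Toy stand-in for a software object whose state changes with input.
    (Hypothetical class, invented for illustration.)"""
    def __init__(self):
        self.memory = []

    def experience(self, event):
        # Every input permanently alters the object's state.
        self.memory.append(event)

# Copying the object has effectively zero marginal cost...
original = Digient()
original.experience("hatched")
clone = copy.deepcopy(original)

# ...but the copy individuates the moment its inputs diverge.
original.experience("learned to read")
clone.experience("suspended in the archive")

assert original.memory != clone.memory  # no longer "the same" object
```

The point of the sketch: the copy is bit-identical and free at the instant of copying, but because the object's state is a function of its input history, two copies with different histories are different individuals.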
The question of the archive becomes central. This is what I imagine the siloed Data Earth to be. What is the metaphysics of the archive? Is it a real space, and how is that space different from the space of the market (or the community at large)? What is the ethics of the archive? Chiang provides a rejection of the archive: it is useless for a software object, or perhaps for a cybernetic object; a human, perhaps, is not suited to live in an archive. I am thinking of Rhizome's net art archiving project, where net art programs are preserved on the systems in which they were created, the Mosaic browser, for example.
“The Time Value of Money”
This is something I picked up from my days on Wall Street: because of interest, a dollar today is worth more than a dollar two weeks from now. How much more is the time value of money. Digients and humans have their own time value. Throughout the novella I could not help but notice the repetition of time: a year had passed, months had passed. I sort of want to autogenerate some sort of diagram of this. There is the time value of love, as Derek's and Ana's relationships to each other and to other people change. There is the very real time value of money as different types of digients are hacked to learn at hothouse speeds.

In computational theory I am very interested in the idea that certain processes take certain amounts of time; complexity classes are ontologies that divide one class of computation from another. In Newtonian physics there is no content to time: it is just a measurement, and when we measure the distance a car travels, when it travels does not matter. The Proustian/Bergsonian notion of time, by contrast, holds that events unfold in time, and that time is not uniform and measurable in the Newtonian way. Other perturbations to the system alter the effect of time. I am thinking of the experiment where digients were run to learn from each other and simply became feral. As time progressed, the impact on the digients changed depending on whom they interacted with. And just as in Proust, the experience of love or other emotions changes the nature of what is learned, experienced, or remembered as time passes.
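The time value of money at the top of this section reduces to a simple discounting formula. Here is a minimal sketch; the 5% annual rate is my own illustrative assumption, not a figure from the text.

```python
def present_value(future_value: float, annual_rate: float, years: float) -> float:
    """Discount a future cash flow back to today.

    A dollar promised later is worth less now; the difference
    between the two amounts is the time value of money.
    """
    return future_value / (1 + annual_rate) ** years

# $100 received two weeks from now, assuming a 5% annual rate:
pv = present_value(100.0, 0.05, 14 / 365)
# pv comes out slightly under 100; the shortfall is two weeks' worth of interest.
```

Note the limit case: at `years = 0` the discount vanishes and the present value equals the future value, which is exactly the Newtonian "time as mere measurement" picture the paragraph above contrasts with Bergson's.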
I feel like I should say something now about the unconscious, or neurosis. I am interested in what would constitute a digient or AI unconscious, in a Jungian sense. What is the AI unconscious trying to communicate to the AI ego that would allow the AI to live a more conscious, and in Jung's terms freer, life? In this sense, psychology is bound up with notions of freedom, free will, and autonomy. In the novella there does seem to be a robot unconscious. Why does Jax like choreography? Why do Marco and Polo like their dramatic narratives? Why is Marco okay with becoming a sex object? There is a notion of fulfillment and purpose that is related to social interaction, and when these are denied the digients, they fall into psychological crises. Jax asks to be suspended until he can be moved into a more populated world. What are the unconscious complexes that structure Jax and the other digients? Does it even make sense to talk like this? Do we need a different framework to understand AI (non-human psychological states)? For the digients, these breakdowns cannot be fixed via simple tweaks in software; they must either be reverted to an earlier stage, or learn their way out of it. How they can do that is a mystery.