30 Mar 2009

I will repeat everything you say

I've been experimenting over the last few days with feeding speech recognition interpretations of a paragraph from H. G. Wells's The War of the Worlds back into speech recognition systems, to see what effect this has. I'm using a system called 'talkback', which is part of the Microsoft Speech Development Kit. I'd open several talkback windows, read a few sentences and then let the system speak back to itself over and over.
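For anyone wanting to try something similar without the Microsoft tools, here's a minimal sketch of the feedback loop in Python. It assumes the pyttsx3 text-to-speech library and the speech_recognition library with the offline pocketsphinx backend are installed; the seed sentence and number of generations are arbitrary, and this only approximates the talkback setup rather than reproducing it.

```python
import pyttsx3
import speech_recognition as sr

def speak_to_file(text, path):
    """Synthesise `text` to an audio file with the local TTS engine."""
    engine = pyttsx3.init()
    engine.save_to_file(text, path)  # note: writes .aiff on macOS, .wav elsewhere
    engine.runAndWait()

def listen_to_file(path):
    """Transcribe an audio file with the offline pocketsphinx recogniser."""
    recogniser = sr.Recognizer()
    with sr.AudioFile(path) as source:
        audio = recogniser.record(source)
    return recogniser.recognize_sphinx(audio)

# Seed the loop with a sentence, then feed each transcription back in.
text = "No one would have believed in the last years of the nineteenth century."
for generation in range(10):
    speak_to_file(text, "loop.wav")
    text = listen_to_file("loop.wav")
    print(f"{generation}: {text}")
```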

I'm a little unsure what to make of my experiments. I came to using speech recognition systems creatively with a mind to exploring what James Guetti calls 'sentence sounds' - based on the premise that, since speech recognition systems look at the words surrounding a word to calculate what it is, they might take in a sentence as a whole in a way that is at least usefully dis-analogous to how we grasp a sentence aesthetically.

But what my play has thrown up is a decaying discourse, in which the only constants - "I heard" before the system gives its interpretation, and "when you said" after it - get increasingly repeated. It gets stuck on distortions of these - for instance, "when you said" gets transformed into "when you shouldn't", "when you used" and so on. Occasionally it throws up something suddenly strange and poignant (for instance, "and we want to understand" in one long stretch of rambling).

Clearly my methodology is not appropriate to my original interests, but I wonder if I've stumbled upon something of potential interest here. I'm not sure there's any direct knowledge or insight to be had, but perhaps it could be manipulated to serve as an interestingly dis-analogous model of some other process? Something of Bernard Stiegler pops into my head - was it 'circuits of trans-individuation', perhaps? I'll have to take a look.

In any case, I hope to mix a track out of it for the next Fuselit, so with luck I can at least make something enjoyable to listen to.

15 Mar 2009

Wittgenstein and Digital Animals

“The human body is the best picture of the human soul” – that much-misused and over-quoted line of Wittgenstein's is interesting on a number of levels. Part of what it does (in the context where it appears) is draw a red line across our questioning of the existence of other minds. However removed or transcendent the soul might be, the body (which I take to include behaviour, not just physical form) is our point of reference for understanding something’s mind.

Now, the question of whether other minds besides one’s own exist at first glance seems like a typical philosophical indulgence – spending an age answering questions that only get us back to where we were before we started asking them. But it’s of crucial significance when it comes to looking at non-human beings – if we’re going to deny that something has a ‘mind’, to deny that it’s worthy of consideration in our ethics, then we’d better be damn sure we’re setting the bar at a level that doesn’t rule out the human race in its entirety.

Arguments go back and forth about the ethical status of animals – I won’t go into them here, except to mention an interest in Daniel Dennett’s idea that true suffering requires a measure of mental complexity – under his scheme, a dog suffers more than a shrimp. I’ll return to this at a later point (I hope!). The point is that what type of mind an animal has, and which attributes of mind are of ethical interest, are the crucial questions here.

Now, I think we’re very much still at the stage of talking about digital animals of one form or another – whether computer viruses, Creatures or whatever; but what I’m interested in is the essential differences between a digital animal and an “analogue” one (ick, sorry about that formulation!) with the aim of coming up with a pretty decent list. I’ll start exploring a few obvious ones, which will probably mirror the normal digital versus analogue contrasts. Note that I’m not interested in the difference between biological and pseudo-biological (e.g. androids) but between biological and digital proper (e.g. computer programs as life). I won’t worry for now if the differences end up a bit blurry:

1. The only decay that necessarily affects a digital creature is the decay of its environment. An animal in the woods is threatened not just by the deterioration of its environment to the point where it can no longer sustain itself or reproduce successfully, but also by the inevitable decay of its own body. For a digital creature, a failure of infrastructure could be fatal (a hard disk error, for example), but there is no logical reason why it should necessarily deteriorate and die. Of course, artificial life simulators like Creatures do introduce an artificial ageing process and decay, and it is also possible that a creature’s code could tend towards entropy; however, neither the artificial limitation nor the entropy is logically essential. Let’s call this the no necessary decay principle for now. We’ll leave alone for now the possibility that normal biological creatures might not necessarily decay – that it might be a beneficial evolutionary feature for a species to decay and die, just for the sake of efficiency!

2. Related to this: a frozen copy of a digital life form can be taken at any point, and as such it is always possible to restore a digital life form to a previous state, provided copies are kept (see the sketch after this list). Can we do the same with single-celled life? Perhaps. But it is certainly beyond us to do this with more complicated animal life, and it’s the more complicated animals which are of ethical interest. Let’s call this the backup principle.

3. The ‘personal identity’ of a digital lifeform is far less clear than that of a biological one. We know that Dolly the sheep, although genetically identical, was not the same sheep as the one she was cloned from. The fundamental difficulty for digital lifeforms is that moving is the same operation as cloning: it only counts as moving if the copy at the original location is eliminated. So we’ll call this the movement through cloning principle.

4. Digital life would occupy a very different geography – but a geography nonetheless. An unplugged network cable could form an impassable barrier, as could a firewall. Actually, if you think of computer viruses as digital life, is there a great difference in the kind of geography they occupy? There are physical barriers, threats, safer hiding places and so forth. They can adapt, form defensive behaviours and so on. Digital geographies can change radically, however – think of a hard disk format. But then the physical world has floods, volcanoes and earthquakes. Perhaps it relates to the above in that you have restorable geographies. Think of a computer switched off, then switched on again. Perhaps this is like land cut off by the tides. We could also call these flickering geographies. At the least, digital geography is a distinctly exaggerated version of normal geography.
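To make the backup and movement through cloning principles concrete, here's a minimal sketch in Python. The Creature class, its fields and the host dictionaries are all hypothetical stand-ins, not a real artificial life system; the point is just that snapshot, restore and move-by-copy-and-delete all fall out of ordinary serialisation.

```python
import pickle

class Creature:
    """A hypothetical digital creature: nothing here ages or decays by itself."""
    def __init__(self, name, energy):
        self.name = name
        self.energy = energy

    def __repr__(self):
        return f"Creature({self.name!r}, energy={self.energy})"

def snapshot(creature):
    """The backup principle: a frozen copy can be taken at any point."""
    return pickle.dumps(creature)

def restore(frozen):
    """Restore a creature to a previous state, provided a copy was kept."""
    return pickle.loads(frozen)

def move(creature, source_host, dest_host):
    """The movement through cloning principle: moving is cloning plus
    deleting the copy at the original location."""
    dest_host[creature.name] = restore(snapshot(creature))
    del source_host[creature.name]

# Two 'hosts' - stand-ins for separate machines or disks.
host_a, host_b = {}, {}
host_a["norn"] = Creature("norn", energy=100)

backup = snapshot(host_a["norn"])   # frozen copy at full health
host_a["norn"].energy = 5           # the creature suffers some mishap
host_a["norn"] = restore(backup)    # ...and is restored as if nothing happened

move(host_a["norn"], host_a, host_b)
print(host_a, host_b)               # {} {'norn': Creature('norn', energy=100)}
```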

There’s a lot more to be said, and a lot that’s wrong with the above. It would take some time to unpick, but it’s worth laying out, if only to sort it out later.

We can see that the extent to which geographies and beings can be restored creates a very different ethical situation – imagine if we could torture a person, then flick a switch and it’s as if nothing had happened. We’d want to say it’s still wrong, but my feeling is that something about it messes with our ethical sense. Again, Wittgenstein is of use here:

If a man’s bodily expression of sorrow and of joy alternated, say with the ticking of a clock, here we should not have the characteristic formation of the pattern of sorrow and the pattern of joy.
And we can conceive of doing exactly this with a digital lifeform. The extent to which we could manipulate such a being is limited only by the complexity of the apparatus we could create to do so. And it could only get away from us if digital life developed complexity beyond human intervention; or alternatively, if the source code for this complexity was destroyed.
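As a toy illustration of the point - all names hypothetical, and obviously nothing here amounts to real sorrow or joy - a digital creature's outwardly expressed state really can be alternated with the ticking of a clock:

```python
import itertools
import time

# A hypothetical creature whose outward 'expression' we control completely.
creature = {"name": "norn", "expression": "neutral"}

# Alternate sorrow and joy with each tick, exactly as Wittgenstein imagines.
for expression in itertools.islice(itertools.cycle(["sorrow", "joy"]), 6):
    creature["expression"] = expression
    print(f"tick: {creature['name']} expresses {creature['expression']}")
    time.sleep(1)  # one tick of the clock
```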

This is just a doodling of ideas that I’m not going to try to bring to any firm conclusion yet. It does seem a visually rich way of exploring networked digital technology, but whether it has any solid implications needs further exploration.