Monday 5 July 2010

Thoughts on Consciousness

I've killed several insects recently. For some reason this has left me with a strange, but small, sense of guilt. It's probably very revealing about the inner workings of my psyche that this event and emotion have made me all philosophical.

First I placed myself in the 'shoes' of the mosquito I'd just killed: happily flying around, looking forward to its next meal - probably a nice quart of my fresh blood. Of course, this 'empathy' was deeply flawed. The mosquito cannot be 'happy', because it lacks the neural machinery needed to feel that emotion. Nor can it 'look forward' to anything.

If I remember correctly, woodlice move their legs not through 'thought' or 'feeling' but simply because light speeds up the chemical reactions in their legs, making them move more quickly in bright places and therefore, on average, away from the light. It's a simple mechanism that I imagine keeps them hidden from predators and closer to their food.

It might be fair to say that the woodlouse is no more complicated than a simple machine. In fact, we humans have built far more complicated machines than this woodlouse; the mechanism I described wouldn't be out of place in a child's toy.
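You could sketch that sort of mechanism in a few lines of code. Here's a rough toy model (nothing like real woodlouse biology - the environment, the numbers and the whole setup are made up purely for illustration): the 'woodlouse' never decides anything, its speed is just a function of the light at its current position, and yet it still ends up sitting in the darkest spot.

    import random

    def light_level(position):
        # Made-up environment: brightest far from zero, darkest at zero.
        return abs(position)

    def step(position):
        # Light 'speeds up the legs': step size is proportional to brightness.
        speed = 0.1 * light_level(position)
        direction = random.choice([-1, 1])   # no steering, no goal, just wandering
        return position + direction * speed

    position = 5.0
    for _ in range(10000):
        position = step(position)

    print(position)   # almost always ends up vanishingly close to 0, the darkest spot

No thought, no preference, no 'avoiding' anything - just a response curve.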

As always, technological advances force us to redefine our understanding of the world around us. We have created synthetic machines that are, in many respects, more advanced than basic lifeforms. Doesn't that mean that one day we will be able to create machines more advanced than ourselves? And where does that leave the greatest mystery of human existence: what is consciousness?

I would not say the woodlouse is a conscious being, and I would not say a child's toy is either. Knowing this, I would have no qualms about 'killing' either of them arbitrarily. Equally, I would feel no guilt about destroying the most advanced computer in the world (beyond the cost of the damage, of course). I would not feel that a life had been taken.

I would feel this guilt if, for instance, I killed a fellow mammal. A friend showed me this rather brutal video (which is possibly the most NSFW thing I've seen since 2girls1cup). It does feel bad seeing animals in distress.

This is, of course, due to the very noticeable and very 'human' expression of distress. However, if there were a completely new (possibly alien) species that we knew had 'consciousness' equal to a cow's (or any mammal's) but did not show its distress (perhaps it doesn't even feel any), I would still feel more guilty killing one than I would destroying a computer.

Why is this? Why do organisms have more value than synthetic beings, even though their ability to 'think' is very similar? In fact, you could argue that everyday computers are far better at certain kinds of 'thinking' than we are, and I'm fairly sure there are supercomputers with more raw processing power than the human brain (which, by the way, is remarkable for its size - evolution FTW). Yet by comparison these machines are worthless in and of themselves. Their 'life', their 'thought', has no value.

After killing the aforementioned mosquito, I sat on the toilet and thought hard about what does have value and why, eventually reaching the following conclusion: the difference between the bug, the computer, the mammal and us humans is the complexity and diversity of our sensations, thoughts and responses.

In a sense, the computer and the insect are the same. They have no desires, no wishes. They are both machines built on very simple architectures. The computer has vast processing power, capable of sums no human can do, but it cannot choose to do anything. It simply receives a stimulus and acts directly upon it; the relationship between stimulus and response is hard-wired or soft-wired.

Humans and animals are slightly more complicated. Whilst, to a degree, we are also hard-wired and soft-wired, we appear not to be, largely because of the complexity of our 'hardware'. Any given stimulus has so many possible outcomes that, with the limitations of our own brains, we cannot enumerate them (though perhaps there is a species somewhere in the universe that will one day look upon us the way we look at insects?).

Anyway, the point is that our complexity comes from our ability to take in data from a wide range of 'sources' that are often in conflict. The insect does not think in the way we do because it never has to choose between, say, biting someone (or something) and running away; its response is determined entirely by direct stimuli.

Humans receive 'inputs' (I use the term for lack of a better word) in the form of desires, personality, emotions, sensations and thoughts, and decisions are made on the basis of these (and probably many other) things. Even a cow has to choose whether to keep looking for food or lie down because it's tired or moody. These might not be choices in the same way that human choices are, but they're certainly far closer to it than a computer ever gets.
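To put that contrast crudely in code (again, the functions, names and numbers here are entirely made up - this is just the shape of the difference I mean, not a claim about how cows or insects actually work): the insect maps a stimulus straight to a response, whereas the mammal has several conflicting 'inputs' competing before anything happens.

    def insect_response(light_detected):
        # One input, one hard-wired output; there is nothing to weigh up.
        return "scuttle away" if light_detected else "stay put"

    def mammal_response(hunger, tiredness, fear):
        # Several conflicting 'inputs' compete; whichever drive is strongest wins.
        drives = {"keep grazing": hunger, "lie down": tiredness, "run away": fear}
        return max(drives, key=drives.get)

    print(insect_response(True))             # scuttle away
    print(mammal_response(0.4, 0.7, 0.1))    # lie down - tiredness wins this time

Even that second function is laughably simple compared with a real brain, but at least there is a competition to resolve, which is more than the insect or the computer ever has.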

If computers eventually develop the same level of complexity in their thinking, with needs, aspirations and sensations of their own, then I believe I would feel guilty about ending a computer's 'life'.

Hey, it might all be bollocks. I'm currently very tired (writing this at 2am) and I suspect this whole post is very poorly written. I'm also very aware of how debatable each aspect of the argument is, particularly because, to my knowledge, the language needed for much of this discussion doesn't currently exist in the mainstream (and more importantly, in my vocabulary). I'll probably read this back tomorrow and think it's all ridiculous. Either way, I can say I had a eureka moment on the toilet, so the whole experience was worthwhile.


ADDENDUM:

Here's a simplified, tabulated version of what I was trying to say before. Hope it makes some sense.


Update: 29 July 2010

I reread this and here is my self-commentary:

Is this an attempt to rationalise the irrational? The guilt is an emotion triggered (supposedly) by my connection to other sentient beings. Why do I even give this kind of life value above others? Most likely because I, like many humans, define myself mostly by my sentience: "I think, therefore I am". My concept of identity and individuality is centred on consciousness. I have placed value in these things, and so I project that value onto other sentient beings. I connect with them. I feel for them in a way I cannot for non-sentient beings, which place no value in these things.

It is somewhat simplistic to say that all action is selfishly motivated - mostly because it implies negativity - but it does serve as a reminder that the only way we can ultimately judge the consequences of our actions is by the emotions they make us feel. I myself am guilty of forgetting this. Trying to quantify the value of a life by its resemblance to the 'human experience' is deeply flawed, and in this attempt I was fairly oblivious to the ridiculousness of the endeavour.

That's not to say, however, that the system doesn't have its merits. In fact, I still believe much of what I wrote, and much of it will most likely prove robust enough to serve its purpose as a personal guideline. Nevertheless, I am aware that, in all likelihood, millennia from now this view will be looked back upon as inhumane.

Essentially, what I'm trying to say is that morality in this form is an attempt to use something we can process [thought] to make sense of something we cannot easily process [emotion]. In this case, I was not trying to find 'truth' exactly, but rather to work through my guilt at killing the insect. Without the emotion, I would not feel the need to compare the value of lives, and my only way of judging whether my comparison was 'correct' is the alleviation of that guilt.

In the end, morality is still irrational, and no amount of thinking can change that. Of course, it is deeply important from a personal perspective as well as a sociological one. It's just something I feel is important to be aware of...