The idea of "self" - Alex Belits
I was thinking about how to describe the idea of "self" that we see as necessary for a conscious being, how much that idea depends on the details of how humans' "software" is perceived as being tied to "hardware", and how much it is shaped by our subconscious acceptance of the idea that it does not run on anything else. The result took the form of an example:

"If you are a pimp for your own vagina, you are still a whore".


3 comments or Leave a comment
From: nickhalfasleep Date: January 25th, 2005 04:14 am (UTC) (Link)
As one of my professors once said:
You are all prostitutes. You will all sell yourselves for something.
From: nieren Date: January 25th, 2005 09:20 am (UTC) (Link)
do you mean that by assuming or being subconsciously complacent with the precept that consciousness is rooted in our physical being, we are degrading the concept of self? and ergo that it is a fallacy to assume that computers don't have a sense of self merely because they are not "self-aware"?

From: abelits Date: January 25th, 2005 10:22 am (UTC) (Link)
Humans' design includes the fundamental distinction between what a human perceives as a part of himself and the rest of the world -- this is necessary for instincts to work. This is a "legacy" from the design of the animal: animals may not consciously perceive themselves the way humans do, but their instincts determine their behavior based on this distinction. Since humans keep the instincts, they have to see "self" among the most basic definitions, to keep their goals in line with what instincts are supposed to accomplish.

Computers don't have a specific set of instincts to begin with -- and if/when anything computer-based is designed with a set of goals as fundamental as instincts, it does not necessarily have to include a "self" defined the same way humans perceive it. As much as Asimov's laws of robotics assume a robot's perception of the world to be hopelessly anthropomorphic, they can still serve as an example of this -- they require the robot to be able to perceive what a "human" is (and what is harmful, dangerous, beneficial, or desired by a human) much better than what the robot itself is.

It's an interesting question, what kinds of sentience can be developed with different definitions of "self" (or none at all). My guess would be that humans' version is neither mandatory nor even produces the best direction and rate of self-development from humans' own point of view.

And that brings up another, even more interesting question -- when humans eventually modify their brains and port themselves to different kinds of hardware, at first they would likely want to keep "self" as close to the original design as possible. But what if they come to see the definition of "self" hardcoded for a human body as confining, and modify it for themselves? This is a choice that we don't have now and probably can't appreciate as a new kind of freedom, but eventually it will become possible, and some people may choose not to carry all that hardcoded stuff with them. If society has members who are free to define their most basic goals and their definitions of "self", I guess it will reduce many things that we take for granted now to the status of outdated dogmas, and create a completely different set of ethical choices.