Writer's Block

Humans and Cylons

The final episodes of Battlestar Galactica begin today. The sci-fi drama often explores the relationship between humans and machines. At what point do we consider a machine with artificial intelligence to be an individual with its own feelings and rights?

Answers (247)





  • My dad loves to play chess.

    Many years ago he bought a computerized chess game. That thing was unbeatable.

    He tried and tried and tried to beat the computer, but he just couldn't do it until one day the computer made a fatal mistake.

    My dad caught it in "checkmate." 

    The computer came equipped with a button you could push which forced it to verbally admit defeat by saying, "I lose".

    My dad must have pushed that button about a thousand times.

    "Dad, what are you trying to do? Psych the computer out???"

    "Yes."
  • I'd like to restate my answer: When they can fight for them.
  • When they fall in love with us or with other machines.
  • When they start looking like us. Or when they can communicate with us in some way. Even a fish's smallest movements can tell us its most basic needs.
  • If you watched WALL-E you would know.
  • Hmm... never watched the new show... loved the old show, though... I think we need to be very careful about where we go with artificial intelligence... I don't think God's handiwork should be messed with overly much.
  • ummm duhh... it's an individual with its own feelings and rights WHEN it HAS its own feelings and rights. sorry, but this question was kinda worded poorly.

    if the machine is able to say that it has thoughts and feelings and rights, then well... it does! right?

  • Ah, a worthy question indeed...

    Alright then. I can answer this quite simply: when machines and computers with AI fully indistinguishable from human intelligence (if there is such a thing) come along, different segments of humanity will respond in ways different from each other but identical to the way they always have. The conservatives and right-wingers will say that these new machines should not be recognised as humans and thus should not have their rights; the liberals and left-wingers will say the exact opposite; and the centrists will urge caution either way, moving by half-measures.

    I myself am a left-winger, a socialist anarchist if you like, and thus would advocate most strongly for the right of complex AI machines to be human and to be recognised as such in regard to the law. We should welcome the new machines into our society as our kindred, and help them to prosper as they will no doubt do for us. A symbiotic, synergetic relationship between machine and man, for the good of all.

    After all, if we do not take the way of integration and prosperity, there is only one other way we can go: that of war and destruction. Remember the days of slavery, when the conservatives and the slave-traders said that black Africans were not human and thus were not entitled to human rights - they were regarded as little more than chattel. Should human-AI machines arise, and we deny them these rights, they will rebel and gain those rights in the same way that black (or should that be 'coloured'?) people have. I find this particularly significant on the eve of Barack Obama's inauguration.

    Human rights are basic rights. It is folly to think that one can deny any organism, organic or mechanical, the basic rights it requires to function. Everything will find a way of getting what it needs - and if it does not, its suffering will be shared, mostly with those who deny it, but also with innocent bystanders.

    Besides, how does one define a 'human'? How does one define a 'machine'? How can one say that these two things are in any way different if these definitions are not determined? A human has memory - a computer has memory. A human can process facts and figures - so can a machine (better, even). A human can manipulate its environment with its hands - again, a machine is capable of this (if the hands are fitted, obviously).

    'Ah, but machines must be programmed to do all these things, and constructed by us to do them,' you retort. Fair enough. However, are we humans not also constructed? Are we humans not also programmed? If a machine were made that could assemble another machine like it and copy its coding into the next generation, how would they be any different from us?

    The basic fear is, of course, the thought that normally inanimate objects might live. We don't want to face that eventuality: we don't want to have to face ourselves. Truly, the more we face what we cannot recognise, the more we confront ourselves: having to face inorganic entities that behave entirely like us would finally bring us face-to-face with the truth of humanity and what it is to be human. Why else is it such a popular subject in fiction?

    But that is enough, I think.

    So... perhaps the answer isn't quite as simple as I thought. :)
  • Need something to write about and the official Writer's Block archive not cutting it? Well come on over to , where we have over 500 old Writer's Blocks archived! Never run out of things to write about ever again! Also, if you're *still* bored after that, come over to this entry to answer our question "If you could write a Writer's Block, what would it be?"
  • I don't have an answer, but I've got a question. If this "machine of your life" was not invented, what can you betray? ...I'm so desperate for an answer.