no i do not think this is a good classification scheme. what separates a brain and a nervous system from a silicon processor and pressure sensors? both transmit signals and manipulate information.
The brain evolved and works in a completely different way than computers do. Animal brains are vastly more similar to each other in structure than any of them is to a computer. For this reason there is far more reason to think animals are capable of suffering just as humans do than there is to think they can't.
i say consciousness matters because it seems to be the only divider that successfully separates a robot and a human
There's simply no consistency in what you're writing. First you say pigs aren't conscious because they don't pass the mirror test. Then you say a blind person is conscious and a photo-detector isn't, which means your mirror test can't even establish whether something is conscious. That's a rather remarkable position: dozens of millions of pigs should be treated according to a simple test that you yourself clearly don't trust as a tool for establishing consciousness, let alone as evidence that recognizing oneself in a mirror is what matters for the ability to feel suffering.
Robots and humans are separated by millions of different factors, but those factors depend entirely on how you build the robot. A robot may not have legs, it may not have skin, and the robots we build today don't use brains like humans and animals do. I'd say that last difference is crucial, because we know the link between our brain and our ability to suffer. Had we built a robot out of human cells, with a brain, putting every atom together from scratch to make it just like a human, then I'd assume it has consciousness and exactly the same properties as any other human. Had it been put together like a human brain with no body but the same inputs, I'd say the same, just as we've observed in humans who have lost parts of their body, their eyes, and so on.
Consciousness, established by the primitive mirror test, is really just one factor that you're cherry-picking to suit your needs in this case.
what does it mean to feel pain?
That is not an easy question to answer. But what we do know is that it is entirely a matter of parts of our brain processing stimuli from our nervous system.
As for whether there are other processes that could be like pain, such as in robots with a completely different structure, I see no evidence for this. The evidence we have says that feeling pain consists of parts of the brain processing such signals. So I'd rather err on the side of preventing slightly different brains from processing those signals than put them into a category with robots and ignore them.
why does a brain which processes signals feel pain but a computer not?
There's simply no evidence for what any computer might feel, unlike the brain.
how are the ionic pumps of individual neurons firing to give pain signals any different from how the circuits inside memory add one to a variable (or however ram works, i'm not familiar)? i don't think they are any different.
Well, setting a variable is simply like writing a statement on paper. If you create a truth value and set it to true whenever a program sees the pattern "1 + 2 = 4", that doesn't make the statement correct. There's no reason to think those variables have any connection to what you call them. You might as well call the pain variable stupidity but give it exactly the same properties as before; that shouldn't make the program stupid every time it detects pressure on some pressure sensor.
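The renaming point can be made concrete with a toy sketch (a hypothetical example; the sensor, threshold, and variable names are made up for illustration). The program's behavior is identical whatever the flag is called, because the name exists only for the human reader:

```python
# Toy "robot controller": a flag is set whenever a sensor reading
# exceeds a threshold. Renaming the flag from `pain` to `stupidity`
# changes nothing about what the machine actually does.

def controller(sensor_readings, threshold=0.5):
    pain = False          # rename this to `stupidity`: same program
    for reading in sensor_readings:
        if reading > threshold:
            pain = True   # the machine just flips a bit; no feeling implied
    return pain

print(controller([0.1, 0.7, 0.2]))  # True: one reading exceeded the threshold
print(controller([0.1, 0.2]))       # False: no reading did
```

The label we attach to the bit tells us nothing about whether anything is felt; that is the point being argued above.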
On the other hand we see the clear connection between what's going on in human brains and painful events, which is evidence that human brains are capable of feeling pain. Computers have a completely different structure than brains and I don't see any reason why you should be able to generalize from human brains onto computers.
Pigs are aware of external objects (their environment, food, other pigs, predators, knives, for example). They fulfill many of those criteria to a large extent, even if not as fully as humans. They react to things in ways that resemble human feelings; they make noises in situations where you'd expect humans to feel fear or pain, for example. As you yourself said, consciousness isn't boolean.
i don't think a pig is aware of external objects. its brain might process the data from its eyes, but it is not aware that those objects exist. if it was aware, then it might be able to pass the mirror test or other tests of consciousness
So which other tests do you suggest, now that the mirror test did not work?
http://www.sciencedirect.com/science/article/pii/S0003347209003571
Consciousness is a rather nebulous subject, and from all we know about it, it's clearly nothing like boolean; it's very gradual. Most babies can pass the mirror test by 1.5 years, but before that they can already use the mirror image for other things; pigs can similarly use a mirror to work out where objects are without going there.
(Also, I find it a bit ironic that you're against abortion, since no fetus would pass your mirror test, yet you still find it irrelevant how pigs are treated. Or does it now not matter how you treat babies until they can pass the mirror test?)
because we haven't developed a non-boolean test yet. this is the best i can do with what i have
I disagree. Consciousness is clearly very gradual, and instead of relying on that one rather bad test, we should look at the biology of animals: the more their pain-processing machinery resembles what humans have, the more we should consider their well-being important.
there are similarities between humans and pigs, and similarities between robots and pigs.
although a pig may seem much more similar to a human than to a robot, i believe this is not the case.
http://sites.psu.edu/psych256fa13/2013/09/19/similarities-between-humans-and-pigs/