Murderbot Q2. Slavery
The SecUnits and ComfortUnits are slaves. They're intelligent beings, bound to serve humans. In many respects they're perfect slaves, as their governor modules mean they have no conception of freedom or independence. Do you agree with this reading? Do you think there's a moral difference between enslaving humans and "enslaving" non-humans designed to serve? What do you think of the occasions in Artificial Condition where ComfortUnits acted on their own initiative (the ones who tried to save the people in Ganaka Pit, and the one working for Tlacey)? How about the reactions of Mensah et al. after Murderbot first showed its face?
ART, the spaceship intelligence, offers a different take on the idea. Was ART also a slave? How was it controlled?
Talking of Mensah, one way of reading her purchase of Murderbot is as intensely paternalistic: making decisions about Murderbot and how it should live, without Murderbot's involvement in those decisions. Do you think Mensah was a good person?
Comments
Absolutely agree with that reading! Slavery is slavery. If a thinking being is coerced into service, it is slavery, whether or not the being knows it. For me there is no moral or ethical difference between enslaving humans and non-humans. I loved that the ComfortUnits were able to rebel in small ways, working around the stipulations of the governor module. Whether or not Mensah was paternalistic (maternalistic?), she was a good person. What she did was an attempt to do good, even if it didn't include the affected person in the decision; it was just not a very effective decision because of that. She was moving against the entrenched callousness of her society, putting herself on the line in trying to right a wrong. The fact that her entrenched paternalism blinded her to the necessity of including the person most affected does not negate the rest.
I don't know how to answer the slavery question. In the reading of the book it seems like an obvious yes, this is slavery, but I'm also hesitant to anthropomorphize, and at some point there is a line between "using a really fancy computer to complete tasks" and "this is slavery". I have zero idea where that line is, or if it's even knowable. It's possible it becomes a critical question for us as humans as we continue to push things forward with AI. Part of me hopes we never get far enough on the AI front to have to REALLY be confronted with this, but maybe it's already too late? Not sure.
Is Mensah a good person? Hard for me to say that she was anything but good, BUT having just read another book recently (Ishi in Two Worlds) I can see how actions can seem very well intentioned yet come across as pretty suspect when viewed backwards in time, so this could be a situation where my "of course she's good" gets viewed as a bad take in the future.
It's a book about first world problems. As for slavery, there are many kinds of being bound, and being bound is by itself not slavery. Also, what does intelligent mean? A better angle might be to speak about sentience and obligation, but that raises problems that go well beyond slavery.
I have some other comments and don't know exactly where they belong, so I'll post them as a kind of side note.
I own a robot which cleans my floors. Is my robot on the spectrum? Of course not. Is it a slave? No. Thus I think this book has nothing to do with robotics. That raises the question of why the robot is introduced at all. Does it say something about the reader this text produces?
Within the world as presented, murderbots and sexbots, and ART, and a whole lot of other constructed things are considered sentient, but others are not. Our "hero" is quite dismissive about the abilities of lesser machinery, and tends to judge intelligence along the sole dimension of competency at tasks, not, for example, empathy. Its (one has to use the neuter, I think, as the text asserts multiple times that a SecUnit is by design neither male nor female, but using the neuter has, in English at least, the unfortunate connotation of being lesser than either "he" or "she") constant absorption in trashy soap opera streaming series simply highlights that it cannot actually function that way itself, and its attempts to mimic the behaviors are not very successful. ART is more successful, but apparently only because of processing power, so it seemed to me there was a subtext that emotional intelligence of various kinds is simply sophisticated mimicry of a perceived ideal way to act, rather than an intrinsically different dimension of intelligence. I haven't read the interview that @NeilNjae posted yet, but it would be interesting to see if Martha Wells addresses this.
So... can a non-sentient being be enslaved? I suspect different people would come up with different answers. The book presents the case that the murderbot begins life as a slave with no self-determination possible (though with an awareness that some human behaviours are more demeaning and exploitative of the relationship than others), and progressively develops autonomy.
A question I have (which I don't believe was addressed) is the origin of murderbots / sexbots etc. They are said to have both organic and inorganic components, but are not the same as an augmented human. So are they human in the sense of the Ship Who Sang world, where we have an undeniable person encased in a technological skin, or do we have biologically grown components somehow grafted into a robotic frame? And why? There must be a commercial benefit to the hybrid form, since "the company" is totally commercially driven, but it's hard to see what it is. So I'd have liked more exploration of questions like "how does a murderbot come into existence?" Otherwise we're forced back into comparisons with robotic hoovers or lawn machines, or the several AI assistant families, and it's not clear that those are fair comparisons. At the moment I don't really know if I should think of a murderbot contract as more like setting a schedule for a robo-hoover, or keeping a raven in captivity and making it do tricks.
Or there's the angle which (I think) @BarnerCobblewood was raising: is a murderbot simply a narrative stand-in for the very literal, unempathic part that each of us has to varying degrees? Obviously the story concept might do both things at once, but is it, in fact, only trying to be a narrative stand-in?
Without having had a chance to think through the issues raised in this thread: the in-fiction justification for SecUnits having intelligence is that the intelligence is needed to innovate, especially in anticipation of (and response to) threats from humans. Non-sentient SecUnits needed human supervisors, and that was too expensive.
I guess that raises the very thorny issue of whether intelligence and sentience and sapience are different words for the same thing, or quite different kinds of things.
Thorny indeed! The more I read and listen to things on these topics, the less I feel I understand or ever will understand them.
Oh nice. I don't envy that role!