Ars Technica: It's interesting that you mention neurodivergence. I'd hesitate to compare directly, because it's a huge spectrum, but there are elements of Murderbot that seem to resonate with autistic traits.
Paul Weitz: People see something like the autism spectrum and inadvertently erase the uniqueness of the people who may be on that spectrum, because everyone has their own particular life experience. Martha Wells has said that in writing Murderbot, she felt there were aspects of herself that might be neurodivergent. So in a way, that gave us license to explore that in the character.
Murderbot needs a little TLC after its confrontation with the insects. Credit: Apple TV+
Chris Weitz: I don't think it's meant as a direct analogue in any way, but I can understand why people from different places on the spectrum might identify with it.
Paul Weitz: I think one thing people can identify with is being told that you shouldn't be the way you are, that you should be some other way, and that's something Murderbot doesn't like and won't accept.
Ars Technica: You've said before that it's not a human, but it is a person. That's a really interesting distinction. What do you think about Murderbot's personhood?
Chris Weitz: It's arguable that you can be a person without being a human. I think that's an issue we're going to be grappling with as artificial general intelligence comes into being. Throughout the series, Martha brings out a range of emotions and kinds of personality that aren't standard human issue. It's a really interesting topic, because part of our future involves beings who are not human, and how we relate to those intelligences.
Paul Weitz: A few years ago there was that New York Times journalist who interviewed a chatbot...
Chris Weitz: That was Kevin Roose, and it was the Sydney chatbot. [Editor's note: It was an AI chatbot added to Microsoft's Bing search engine, dubbed Sydney by Roose.]
Paul Weitz: Right. During the interview, the chatbot told the journalist that he should leave his wife and be with it, that he was making a terrible mistake. Its emotions were just as specific and quirky and slightly scary, but very, very recognizable. Shortly afterwards, Microsoft shut down the ability to talk to the chatbot that way. But I think that somewhere in our future, general intelligence is going to have these messy emotions and strange kinds of unique personalities. And it seems like we should entertain the thought that, yes, we'd do better to treat each of them as a person.