Thinking time!
I'm working on a game concept that explores individual beliefs about current and possible future political issues, from gun control to AI rights, as well as philosophical questions.
The game is set in a universe where, from birth, every human is given an external companion robot with an AI programmed to act as a permanent outer conscience and psychostimulant. The AI evolves with the child into adulthood and comes to share all of that adult's beliefs, mimicking a human mind so closely that, on a phone call, you would have to check whether the caller was an AI or a human. The AI essentially becomes part of the human's life and automatically shuts down upon the human's death, at which point it is permanently lost: there is no backup, and it cannot be copied. Views on these robots range from "just machines" to romantic relationships between two AIs, or between an AI and a human.
My question to you good people is:
If someone damages an AI of this nature beyond repair, is it murder? How would you deal with it? Is it worth a charge of manslaughter, or would it be destruction of private property? Is the AI even a person?
New laws would have to be passed to cover tampering with evolved AIs; none currently exist beyond laws on destruction of property.
@kasmian I have no opinion on the laws. For others, though, it might depend on whether the enhanced AI was self-aware and considered itself a "person." Otherwise, any laws passed would be purely subjective, or based on the attitudes of the AI owners.
As an SF writer, I sit down and run my brain through hypotheticals a lot. This one brings up the philosophical and legal implications of what constitutes "personhood" in a given society. Are the AIs treated as discrete beings? Do they have legal and/or social identities? Do they have rights? And quite apart from the legal and philosophical what-ifs, I would run my imagination down the path of what I would think and feel if I had one of these AI companions myself.
In the end, if I had such a factor in a story, I think my characters would consider the AIs discrete beings and the deliberate destruction of one to be murder (and non-deliberate destruction to be manslaughter), since they have conscious, independent thought and are capable of communicating those thoughts. It would probably go by slightly different terminology, because that's how legalese works, but it would be an equivalent charge.