Robot Rights Flashcards
What is a moral agent?
To be a moral agent is to perform morally significant actions,
and to be morally responsible for them.
What is a moral patient?
To be a moral patient is to be affected by morally significant actions, and to have rights that deserve consideration.
Name a few examples where a moral patient is not a moral agent.
Children, animals, or people with cognitive impairments.
What is an Ethical Impact Agent?
An agent that performs tasks with morally significant outcomes.
What do you call an agent that behaves in ways that adhere to moral principles?
Implicit ethical agents
What do you call an agent that encodes and follows moral principles?
Explicit Ethical Agents
What is meant by full ethical agents?
Agents that encode and follow moral principles, and are deemed to be moral agents.
Why is a consequentialist approach to encoding moral principles hard to do?
Because of the difficulties of measuring the relative goodness of a consequence and of reasoning over long causal chains.
Explain what deontological principles are and why they might not be a good solution for encoding moral principles in an agent.
Deontological theories propose that certain actions are morally right or wrong independent of their consequences. In other words, the moral status of an action is determined by the nature of the action itself, rather than by its outcomes. These principles are easy to encode but may be too abstract (leaving too much room for interpretation), and different principles can conflict with one another.
Instead of encoding ethical theory directly, what other solution can be thought of?
Emulating human decision-making through the use of machine learning. Maybe they won’t be better than humans, but they might be good enough.
What are the requirements for moral responsibility? And wherein lies the problem when talking about robots as moral actors?
- Act voluntarily
- Are able to predict and evaluate the consequences of acting
- Are able to exert reasonable control over the action
- Are able to justify the action
Requirements 2, 3, and 4 can be met by following moral principles, but the question is whether robots act voluntarily. We could assume that a robot following a predetermined algorithm does not act voluntarily, but do humans then also not act voluntarily, since their actions are determined by the laws of nature? Instead, you might define voluntary action as wanting to act. In that case, even if an AI system's actions are determined by an algorithm, it acts voluntarily just in case those actions are consistent with its short-term and long-term goals.
Keywords: voluntary action, control, predict, evaluate, justify, wanting
Explain the ‘property view’ of robot moral patiency.
You could attribute moral patiency to something if it possesses some property which might be deemed sufficient for moral consideration. Basically: X has property P -> X has rights. Examples of such properties might be P: is intelligent; P: is able to resist being harmed; P: is able to express the desire to possess rights.
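As an illustration, the property view's inference pattern (possessing at least one sufficient property P entails moral patiency) can be sketched in a few lines of Python. The specific properties and entities below are hypothetical examples chosen for illustration, not part of any established framework:

```python
# Sketch of the property view: an entity is granted moral patiency
# (rights) if it possesses at least one property deemed sufficient.
# The property set and example entities are illustrative assumptions.

SUFFICIENT_PROPERTIES = {
    "is_intelligent",
    "can_resist_harm",
    "can_express_desire_for_rights",
}

def has_moral_patiency(entity_properties: set) -> bool:
    """X has a sufficient property P -> X has rights."""
    return bool(entity_properties & SUFFICIENT_PROPERTIES)

# A robot that can express a desire for rights qualifies;
# an entity with only irrelevant properties does not.
print(has_moral_patiency({"can_express_desire_for_rights"}))  # True
print(has_moral_patiency({"is_heavy"}))                       # False
```

Of course, the hard philosophical work lies in deciding which properties belong in the sufficient set, which this sketch simply takes as given.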
Name one original property that a being might have so that it can get rights.
Example:
A property that could justify granting robots rights is autonomy, or self-determination: the robot would be able to make its own decisions and take actions based on its own goals and desires. This property is important because it is widely accepted that autonomous beings have moral rights, including the right to make decisions about their own lives and the right to be treated with respect and dignity. We could make a gradient scale of autonomy or intelligence that determines at which level a being gets rights: e.g. stones get no rights and humans all rights, dogs slightly fewer rights than humans (no voting rights), and ants fewer still. Autonomy as a property for rights does, however, raise many questions about the rights of other beings.
Explain the relational view for Robot moral patiency.
This approach argues that robots should be granted moral consideration based on the role they play in human society and the relationship they have with humans. For example, a robot that is used to care for the elderly or assist with medical treatments may be granted more moral consideration than a robot that is used for menial tasks or entertainment.
Keywords: role, society, relationship, humans, moral consideration
Name an argument that robots cannot have rights and therefore never will.
Robots are just tools, not much different from a toaster: they are not alive and exist only to serve human purposes.