Robot Rights Flashcards

1
Q

What is a moral agent?

A

To be a moral agent is to perform morally significant actions,
and to be morally responsible for them.

2
Q

What is a moral patient?

A

To be a moral patient is to be affected by morally significant actions, and to have rights that deserve consideration.

3
Q

Name a few examples of moral patients that are not moral agents.

A

Children, animals, or people with cognitive impairments.

4
Q

What is an Ethical Impact Agent?

A

An agent that performs tasks with morally significant outcomes.

5
Q

What do you call an agent that behaves in ways that adhere to moral principles?

A

An implicit ethical agent.

6
Q

What do you call an agent that encodes and follows moral principles?

A

An explicit ethical agent.

7
Q

What is meant by full ethical agents?

A

Agents that encode and follow moral principles, and are deemed to be moral agents.

8
Q

Why is a consequentialist approach to encoding moral principles hard to implement?

A

Because of the difficulties in measuring the relative goodness of a consequence and calculating over long causal chains.
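
A minimal sketch (not from the deck) of why this is hard in code: scoring an action means recursively weighting every downstream consequence, and the invented utilities and probabilities below stand in for exactly the quantities that are difficult to measure in practice.

    # Each action maps to consequences: (probability, utility, further consequences).
    # All numbers are made-up placeholders; assigning them is the hard part.
    consequences = {
        "tell_truth": [(0.8, +1.0, []), (0.2, -2.0, [(0.5, -1.0, [])])],
        "lie":        [(0.9, +0.5, []), (0.1, -5.0, [])],
    }

    def expected_utility(outcomes):
        """Recursively sum probability-weighted utility over the causal chain."""
        total = 0.0
        for prob, utility, downstream in outcomes:
            total += prob * (utility + expected_utility(downstream))
        return total

    # Longer causal chains mean deeper recursion and compounding uncertainty.
    best = max(consequences, key=lambda a: expected_utility(consequences[a]))
    print(best)  # -> 'tell_truth' under these invented numbers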

9
Q

Explain what deontological principles are and why they might not be a good solution for encoding moral principles in an agent.

A

Deontological theories propose that certain actions are morally right or wrong independently of their consequences. In other words, the moral status of an action is determined by the nature of the action itself rather than by its outcomes. Such principles are easy to encode, but they may be too abstract (leaving too much room for interpretation) and may conflict with one another, as the toy sketch below illustrates.
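
A toy sketch (my illustration, with invented rule and action names) of the conflict problem: each deontological rule is trivial to encode as a predicate over the action itself, yet nothing in the encoding says what to do when every available action violates some rule.

    # Each rule permits or forbids an action based on the action itself, not its outcomes.
    rules = {
        "do_not_lie":  lambda action: not action["involves_lying"],
        "do_not_harm": lambda action: not action["causes_harm"],
    }

    candidates = {
        "tell_harmful_truth": {"involves_lying": False, "causes_harm": True},
        "tell_white_lie":     {"involves_lying": True,  "causes_harm": False},
    }

    # Every candidate violates at least one rule; the rule set offers no tie-breaker.
    for name, action in candidates.items():
        violated = [r for r, permits in rules.items() if not permits(action)]
        print(name, "violates", violated)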

10
Q

Instead of encoding an ethical theory directly, what other solution can be thought of?

A

Emulating human decision-making through the use of machine learning, as sketched below. Such systems may not be better than humans, but they might be good enough.
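
A minimal sketch of this route, using toy data I invented: rather than encoding a theory, fit a model to human verdicts and let it generalize to new cases.

    from sklearn.tree import DecisionTreeClassifier

    # Features: [causes_harm, involves_lying, saves_lives]; label 1 = humans judged it permissible.
    # Five hand-made examples stand in for a large corpus of real human judgments.
    X = [[0, 0, 0], [1, 0, 1], [0, 1, 1], [1, 0, 0], [0, 1, 0]]
    y = [1, 1, 1, 0, 0]

    model = DecisionTreeClassifier().fit(X, y)

    # An unseen case: harmful AND dishonest, but life-saving. The model mimics,
    # rather than derives, the human judgment.
    print(model.predict([[1, 1, 1]]))  # -> [1] on this toy data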

11
Q

What are the requirements for moral responsibility? And where does the problem lie when talking about robots as moral actors?

A
To be morally responsible, an agent must:
  1. Act voluntarily
  2. Be able to predict and evaluate the consequences of acting
  3. Be able to exert reasonable control over the action
  4. Be able to justify the action

Requirements 2, 3, and 4 are achieved by following moral principles, but the question is whether robots act voluntarily. We could assume that a robot following a predetermined algorithm does not act voluntarily, but do humans then also not act voluntarily, since their actions are determined by the laws of nature? Instead, you might define voluntary action as the want to act. In that case, even if an AI system's actions are determined by an algorithm, it acts voluntarily just in case those actions are consistent with its short-term and long-term goals.

Keywords: voluntary action, control, predict, evaluate, justify, wanting

12
Q

Explain the ‘property view’ of robot moral patiency.

A

You could attribute moral patiency to something if it possesses some property deemed sufficient for moral consideration. Basically: X has property P -> X has rights (a schematic sketch follows below). Examples of such properties: p: is intelligent, p: is able to resist being harmed, p: is able to express the desire to possess rights.
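
A schematic sketch of the property view's "X has P -> X has rights" form, with illustrative entities and property names of my own choosing:

    # Any one of these properties is taken as sufficient for moral consideration.
    sufficient_properties = {"is_intelligent", "can_resist_harm", "can_demand_rights"}

    entities = {
        "human":   {"is_intelligent", "can_resist_harm", "can_demand_rights"},
        "robot":   {"is_intelligent"},
        "toaster": set(),
    }

    for name, props in entities.items():
        # X has some sufficient property -> X is a moral patient.
        print(name, "-> moral patient:", bool(props & sufficient_properties))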

13
Q

Name one property of your own choosing that a being might have in order to qualify for rights.

A

Example:
A property that could justify granting robots rights is autonomy, or self-determination. This means the robot would be able to make its own decisions and take actions based on its own goals and desires. This property is important because it is widely accepted that autonomous beings have moral rights, including the right to make decisions about their own lives and the right to be treated with respect and dignity. We could construct a gradient scale of autonomy or intelligence that determines the level of rights a being gets: stones get no rights, humans get all rights, dogs get somewhat fewer rights than humans (no voting rights), and ants fewer still (a toy version of this scale is sketched below). Autonomy as a property for rights does raise many questions about the rights of other beings.
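
A toy version of that gradient scale, with invented scores and thresholds; choosing them is precisely the contested part.

    def rights_tier(autonomy: float) -> str:
        # Thresholds and example beings are arbitrary placeholders for illustration.
        if autonomy >= 0.9:
            return "full rights (humans)"
        if autonomy >= 0.5:
            return "most rights, but e.g. no vote (dogs)"
        if autonomy >= 0.1:
            return "minimal rights (ants)"
        return "no rights (stones)"

    for being, score in {"human": 0.95, "dog": 0.6, "ant": 0.2, "stone": 0.0}.items():
        print(being, "->", rights_tier(score))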

14
Q

Explain the relational view of robot moral patiency.

A

This approach argues that robots should be granted moral consideration based on the role they play in human society and the relationship they have with humans. For example, a robot that is used to care for the elderly or assist with medical treatments may be granted more moral consideration than a robot that is used for menial tasks or entertainment.

Keywords: role, society, relationship, humans, moral consideration

15
Q

Name an argument that robots cannot have rights and will therefore never have rights.

A

Robots are just tools, not much different from a toaster. They are not alive and exist only to serve.

16
Q

Name an argument that robots can have rights but should not have rights.

A

Robots should be slaves. They are made to serve humans, and in the interest of humans they should never have rights in the first place.

17
Q

Name an argument that defends that robots can have rights and also should have rights.

A

Robots that have capabilities similar to our own should have rights, because the difference between them and us is not large.

18
Q

Provide an argument for why robots cannot have rights but should have rights nonetheless.

A

Robots that can be humanized should have rights, because treating robots inhumanely can make people inhumane.