Exam 1 Flashcards

1
Q

Subjectivism (Moral relativism)

A

Morality is determined by the individual

2
Q

Conventionalism (Moral relativism)

A

Morality is determined by the culture

3
Q

Objectivism

A

There is a core of universal moral principles; the rest is determined by other factors

4
Q

Pojman’s Point about Tolerance

A

Ethical relativism and tolerance do not get along; the combination is a contradiction, since tolerance would itself have to be a universal principle

5
Q

Problems with Ethical Relativism (Ethical relativism is subjectivism)

A

People do have rules they should follow, which subjectivism cannot account for

6
Q

Ethical Egoism

A

It is always morally right to act in our long-term self-interest

7
Q

Psychological Egoism

A

“I think people think in an ethical egoist way; not to say that it is right or wrong, I just think people behave in their own self-interest.” Abraham Lincoln’s argument (from guilt) supports this

8
Q

Strongest Desires Argument

A

Motivation comes from your strongest desire; in acting on that motivation you are pursuing your self-interest; therefore you are always acting in your self-interest

9
Q

Shafer-Landau Response

A

Nothing shows that your strongest desire is always in your self-interest; the argument begs the question (it assumes what it is trying to prove)

10
Q

Nagel’s Objective Concern Argument

A

I can recognize the concern I feel for myself, so I can recognize that others have concerns of their own, and therefore I have reason to respect them

11
Q

Virtue Ethics

A

Asks “What kind of person am I?” or “What kind of person do I want to be?”; it is concerned with virtues and vices

12
Q

Revised Golden Rule (verbatim)

A

Do unto others as you would consent to have them do unto you in the same circumstances

13
Q

Literal Rule

A

Unlike the revised rule, it lacks the “same circumstances” clause and reads “want” as mere desire

14
Q

Platinum Rule

A

Treat others how they would like to be treated

15
Q

Golden Rule and Competition

A

Consent to the conditions of the game

16
Q

Kantian Ethics (verbatim)

A

Act only on those maxims that one can will to become universal laws (memorize bold)

17
Q

Contradiction in Conception

A

Implies a logical issue that would not allow such a law to come to pass; if you extrapolate this maxim into a universal law, it would be impossible to operate in the world (example: false promises)

18
Q

Contradiction in Will

A

Implies that it would be bad for such a law to come to pass; the maxim can be conceived as a universal law, but if you extrapolate it, you could not rationally will to live under it (example: “I only help my brother”)

19
Q

Formula of Humanity as an End (verbatim)

A

Treat humanity, in oneself and others, always as an end and never merely as a means.

20
Q

Wide imperfect duties

A

E.g., help others; the duty is vague and open-ended rather than explicitly followed or not followed, i.e., not binary

21
Q

Narrow perfect duties

A

E.g., do not lie; usually negative prohibitions that are simply either followed or violated

22
Q

Utilitarianism and Consequentialism

A

The morally correct action is the one that maximizes good for the greatest number of people

23
Q

Objections to Utilitarian Consequentialism (a form of objectivism)

A

Personal rights (e.g., it could permit slavery) and over-demandingness (recreational activities would be impossible to justify)

24
Q

Consequence factors (Utilitarianism)

A

Scope, Intensity, Duration, Probability

25
Q

Prima Facie Duties

A

Non-injury, Beneficence, Gratitude, Fidelity, Veracity, Reparations, Justice, Self-Improvement

26
Q

Characteristics of Prima Facie Duties

A

Always morally relevant, self-evident, non-absolute

27
Q

Actual vs. Prima Facie Duties

A

Actual duties are not self-evident and are harder to determine

28
Q

Common Morality Rules

A

Don't kill, cause pain, deprive of freedom or pleasure, disable, cheat, or deceive; do your duty, obey the law, keep your promises

29
Q

Gert's point about moral ideals and impartiality

A

Moral rules must be followed impartially, but moral ideals need not be followed impartially (no one can follow them with respect to everyone)

30
Q

Gert's 2-step process

A

Identify which moral rules are involved, and estimate the consequences if this kind of rule-breaking were publicly allowed, i.e., everyone knows it is permitted

31
Q

Unavoidable differences in consequence evaluation (remember at least two)

A

Who counts as a moral agent? (e.g., fetuses); differences in rankings of harms and benefits

32
Q

Deontological Asymmetry

A

It is worse to do harm than to allow it to happen

33
Q

Chinese Room Argument (Searle)

A

A man who does not know Chinese follows rules in a room and gives responses that make it seem as if he does; likewise, a strong AI is not a person but a complex math problem

34
Q

Sentience vs. Sapience

A

Sentience is the capacity for pain or pleasure; sapience is higher executive function, like self-awareness and high-level reasoning

35
Q

Most moral theories are

A

Forms of objectivism

36
Q

Intend/Foresee Distinction

A

The gap between what you intend and the unavoidable consequences you merely foresee; foreseeing is knowing those consequences will occur, but intention matters more morally

37
Q

Trolley Problem

A

Many people who would pull the switch to divert the trolley, killing one person, would still not push one person onto the tracks to stop it, even though in essence that accomplishes the same thing

38
Q

The Technological Singularity

A

The point where technological growth becomes uncontrollable and irreversible, with potentially catastrophic consequences for humans