Week 7 Flashcards

1
Q

Main social and ethical concerns with personalized filtering

A
  1. the privacy intrusion that is inherent in building and tuning user behavior profiles
  2. the lack of transparency about which data are used, how they are gathered, and how the algorithms work
  3. the risk of covert manipulation of user behavior
2
Q

Three mechanisms that can explain privacy-related decisions

A
  1. uncertainty
  2. context-dependence
  3. malleability and manipulation
3
Q

Uncertainty (three mechanisms)

A

People experience considerable uncertainty about whether, and to what degree, they should be concerned about privacy.
This is due to:
1. lack of knowledge about the collection and usage of personal data
2. uncertainty about one's own preferences: the privacy paradox

4
Q

Definition Privacy Paradox

A

People say that they care about their privacy, but their actual online behavior does not reflect these concerns

5
Q

Context-dependence (three mechanisms)

A

Whether people find privacy important depends on the context
–> culture, situation, and motivations all influence our beliefs about what is private and what is public

6
Q

Malleability and manipulation (three mechanisms)

A

Various (subtle) factors can be used to activate or suppress privacy concerns
–> default settings
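
The default-settings effect can be sketched as a toy calculation (an illustrative assumption, not from the course material): if most users never change a default, the choice of opt-out vs. opt-in largely determines how many people end up sharing data. The 90% "stick with the default" rate below is an assumed figure.

```python
# Illustrative sketch: how a default setting shapes outcomes when most
# users never change it. The stick_with_default rate is an assumption.

def consent_rate(default_share: bool, stick_with_default: float = 0.9) -> float:
    """Fraction of users who end up with data sharing enabled,
    assuming users who deviate from the default flip the setting."""
    if default_share:   # opt-out design: sharing is on by default
        return stick_with_default
    else:               # opt-in design: sharing is off by default
        return 1.0 - stick_with_default

print(consent_rate(default_share=True))   # opt-out design
print(consent_rate(default_share=False))  # opt-in design
```

Under these assumptions, flipping the default swings effective consent from roughly 90% to roughly 10%, without any change in users' stated preferences.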

7
Q

Protection motivation model

A

Privacy protection behavior

  1. threat appraisal
    - perceived severity
    - perceived susceptibility
  2. coping appraisal
    - self-efficacy
    - response-efficacy
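The two-appraisal structure above can be sketched as a toy scoring function. The multiplicative combination and the [0, 1] scales are illustrative assumptions, not the model's canonical formulation.

```python
# Illustrative sketch of the protection motivation model: motivation to
# protect one's privacy as a function of threat appraisal (perceived
# severity x perceived susceptibility) and coping appraisal
# (self-efficacy x response-efficacy). Multiplying the components is an
# assumption made for illustration only.

def protection_motivation(severity: float, susceptibility: float,
                          self_efficacy: float,
                          response_efficacy: float) -> float:
    """All inputs in [0, 1]; returns a score in [0, 1]."""
    threat_appraisal = severity * susceptibility
    coping_appraisal = self_efficacy * response_efficacy
    return threat_appraisal * coping_appraisal

# High threat appraisal but low self-efficacy still yields low motivation.
print(protection_motivation(0.9, 0.9, 0.2, 0.7))
```

The point of the sketch: because both appraisals enter the score, a severe and likely threat alone is not enough — people also need confidence that protective responses work and that they can carry them out.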
8
Q

Outcome protection motivation model

A
  1. threat appraisal is high: people perceive the collection, usage and sharing of personal information online as a severe problem to which they are susceptible
  2. coping appraisal is mixed: people have little confidence in their own ability (self-efficacy) to protect their personal information online, but they do believe that some responses can effectively limit the collection, usage and sharing of personal information online (response-efficacy)
9
Q

Conclusion protection motivation model

A
  1. people are aware of the threats to their privacy online
  2. but their knowledge and confidence are limited
  3. when helping people, we should address both the severity of the problem and the efficacy of responses
10
Q

Definition privacy fatigue

A

People feel that they cannot control their personal information, feel powerless, and mistrust the platforms and companies handling their data. They give up and resign themselves to the situation

11
Q

Transparency

A

People need to give consent to the use of their personal information
But:
1. terms and conditions and privacy statements are rarely read and understood
2. we tend to agree to, close, or ignore such statements (due to present bias)

12
Q

Why transparency?

A
  1. transparency is generally desired because algorithms that are hard to predict or explain are difficult to control, monitor and correct
  2. the lack of transparency also causes information asymmetry: an imbalance in knowledge and decision-making power that favors data processors
13
Q

Bias in algorithms

A
  1. algorithms are not neutral in nature
  2. humans develop them and affect the way they work once implemented
  3. algorithms cannot be divorced from the conditions under which they are developed and deployed
14
Q

Algorithms are not neutral

A

Algorithms are created for purposes that are often far from neutral

  1. to create value and capital
  2. to nudge behavior and structure preferences in a certain way
  3. and to identify, sort and classify people
15
Q

Mitigating bias

A
  1. create more diverse work teams
  2. use more representative training data
  3. technology should not be designed in a vacuum, but should account for potential disparities in its platform and execution; consequently, algorithms that discriminate should be fixed or abandoned
  4. incorrect social inferences can be mitigated by giving consumers control over their digital footprint
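Point 2 ("use more representative training data") can be made concrete with a small sketch. The function name, tolerance threshold, and example data below are illustrative assumptions, not a standard fairness API.

```python
# Illustrative sketch: a quick representativeness check that compares
# each group's share of the training data against its share of a
# reference population, flagging under-represented groups.
from collections import Counter

def underrepresented(train_labels, population_shares, tolerance=0.05):
    """Return groups whose share in the training data falls more than
    `tolerance` below their share in the reference population."""
    counts = Counter(train_labels)
    total = sum(counts.values())
    flagged = []
    for group, pop_share in population_shares.items():
        train_share = counts.get(group, 0) / total
        if pop_share - train_share > tolerance:
            flagged.append(group)
    return flagged

# Example: group "b" is 50% of the population but only 20% of the data.
print(underrepresented(["a"] * 8 + ["b"] * 2, {"a": 0.5, "b": 0.5}))  # → ['b']
```

A check like this only surfaces sampling imbalance; it does not by itself fix biased labels or biased deployment conditions, which the other mitigation points address.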