Week 7 Flashcards
Main social and ethical concerns with personalized filtering
- the privacy intrusion that is inherent in building and tuning user behavior profile models
- the lack of transparency concerning the data that is used, how it is gathered and how algorithms work
- the risk of covert manipulation of user behavior
Three mechanisms that can explain privacy-related decisions
- uncertainty
- context-dependence
- malleability and manipulation
Uncertainty (three mechanisms)
People experience considerable uncertainty about whether, and to what degree, they should be concerned about privacy.
This is due to:
1. lack of knowledge about collection and usage of personal data
2. uncertainty about preferences: privacy paradox
Definition Privacy Paradox
People say that they care about their privacy but their actual online behavior does not reflect these concerns
Context-dependence (three mechanisms)
Whether people find privacy important depends on the context
–> culture, situation and motivations all influence our beliefs about what is private and what is public
Malleability and manipulation (three mechanisms)
Various (subtle) factors can be used to activate or suppress privacy concerns
–> default settings
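The default-settings effect can be sketched as a toy model. This is a hypothetical illustration, not from the source: the function name and the assumed 10% rate of users who actively change a default are made up for the example.

```python
# Hypothetical sketch of how defaults steer outcomes: with an opt-out
# default, most users end up tracked simply because few change settings.
# "change_rate" (assumed at 10%) is the fraction of users who act at all.

def tracked_share(default_on: bool, change_rate: float = 0.1) -> float:
    """Share of users who end up tracked under a given default setting."""
    if default_on:
        # Opt-out design: tracking applies unless the user switches it off
        return 1.0 - change_rate
    # Opt-in design: tracking applies only if the user switches it on
    return change_rate

print(tracked_share(default_on=True))   # most users remain tracked
print(tracked_share(default_on=False))  # few users opt in
```

The behavior is identical in both designs; only the default differs, yet the resulting share of tracked users flips from roughly 90% to roughly 10%, which is why defaults count as a (subtle) manipulation of privacy choices.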
Protection motivation model
Privacy protection behavior
- threat appraisal
  - perceived severity
  - perceived susceptibility
- coping appraisal
  - self-efficacy
  - response-efficacy
Outcome protection motivation model
- threat appraisal is high: people perceive the collection, usage and sharing of personal information online as a severe problem to which they are susceptible
- coping appraisal is mixed: people have little confidence in their own efficacy to protect their personal information online, but they do believe that some responses can effectively limit the collection, usage and sharing of personal information online
Conclusion protection motivation model
- people are aware of the threats to their privacy online
- but their knowledge and confidence are limited
- when helping people, we should address the severity of the problem and the efficacy of responses
Definition privacy fatigue
People feel that they cannot control their personal information, feel powerless and mistrust the platforms and companies handling their data. They give up and resign themselves to the situation
Transparency
People need to give consent to the use of their personal information
But:
1. terms and conditions, privacy statements are rarely read and understood
2. we tend to agree just to close such statements, or ignore them altogether (due to present bias)
Why transparency?
- transparency is generally desired because algorithms that are poorly predictable or explainable are difficult to control, monitor and correct
- the lack of transparency also causes information asymmetry and an imbalance in knowledge and decision-making power favoring data processors
Bias algorithms
- algorithms are not neutral in nature
- humans develop them and affect the way they work once implemented
- algorithms cannot be divorced from the conditions under which they are developed and deployed
Algorithms are not neutral
Algorithms are created for purposes that are often far from neutral
- to create value and capital
- to nudge behavior and structure preferences in a certain way
- and to identify, sort and classify people
Mitigating bias
- create more diverse work teams
- use more representative training data
- technology should not be designed in a vacuum, but rather account for all of the potential disparities in its platform and execution. Consequently, algorithms that discriminate should be dealt with, fixed or abandoned
- incorrect social inferences can be mitigated by giving consumers control over their digital footprint