Law and Technology- fundamental rights and AI Flashcards

1
Q

what is the first AI system?

A

Theseus (1950)- Claude Shannon's maze-solving robotic mouse

2
Q

what is the name of the AI system developed by Google?

A

PaLM- a large language model that can explain jokes

3
Q

which domains now use AI?

A

AI is now used when booking a flight, assisting the pilot when flying, and deciding whether you get a loan; governments also use AI systems for surveillance and oppression

4
Q

what is the name of the AI systems which determine what you see on social media?

A

recommender systems. they also determine which videos you get recommended on YouTube
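a minimal sketch of how a recommender system might score and rank content, assuming a simple dot-product model over interest vectors; every name and number here is illustrative, not any platform's actual algorithm:

```python
# Toy embedding-based recommender: score = dot(user_vector, item_vector).
# The vectors and video names are invented for illustration; real platforms
# learn such embeddings from engagement data at massive scale.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

user = [0.9, 0.1, 0.0]                    # strong interest in topic A
items = {
    "video_A": [1.0, 0.0, 0.0],           # pure topic A
    "video_B": [0.0, 1.0, 0.0],           # pure topic B
    "video_C": [0.5, 0.5, 0.0],           # mix of A and B
}

scores = {name: dot(user, vec) for name, vec in items.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # ['video_A', 'video_C', 'video_B']
```

the ranking simply surfaces whatever best matches the inferred interests — which is why such systems shape what each user ends up seeing.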

5
Q

currently, what has the doubling time of training computation shortened to?

A

6 months
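the arithmetic behind that figure can be sketched as follows; the 6-month doubling time is the only number taken from the card, the time horizons are illustrative:

```python
# Growth in training compute given a fixed doubling time.
# A 6-month doubling time (from the card) implies 4x growth per year.

def growth_factor(years, doubling_time_years=0.5):
    """How many times compute multiplies over `years`."""
    return 2 ** (years / doubling_time_years)

print(growth_factor(1))   # 4.0    -> 4x in one year
print(growth_factor(5))   # 1024.0 -> ~1000x in five years
```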

6
Q

in Daniel Solove’s taxonomy what are the 6 privacy conceptualisations?

A

1) privacy is the right to be let alone, free from state interference
2) privacy is limited access to the self
3) privacy is secrecy
4) privacy is people’s control over their personal information
5) privacy is the right to protect one’s personhood
6) privacy is intimacy

7
Q

where is the right to privacy recognised?

A

art 12 UDHR
art 17 ICCPR
art 7 EU Charter of Fundamental Rights
art 8 ECHR

8
Q

what obligations does the right to privacy impose on the state?

A

positive and negative obligations- the state must refrain from infringing individuals' privacy (negative obligation) and must ensure that privacy is protected from violations (positive obligation).

9
Q

why was data protection developed?

A

as a response to the rise of data processors in the 1970s

10
Q

what is data protection?

A

protection of information relating to an identifiable natural person

11
Q

who are the 2 key actors in cases of infringement?

A
  1. controller- determines the purposes and means of the processing and must comply with all GDPR principles.
  2. data processor- entrusted with processing data on behalf of the controller
12
Q

what are the main principles related to the processing of personal data under the GDPR which controllers and data processors must keep in mind?

A

lawfulness
fairness
transparency
accuracy
integrity
confidentiality
accountability

13
Q

what does the fairness requirement imply?

A

personal data should not be processed in a way that goes against the data subjects’ reasonable expectations and adversely impacts them. complying with the fairness requirement implies that controllers ensure that AI models are statistically accurate

14
Q

what does the transparency requirement mean?

A

data subjects have the right to be informed

15
Q

what is the problem with transparency rights?

A

data subjects have the right to obtain information. however, providing this information may be complicated and processors may be unable to explain how a model reached its output. this issue is known as the black box phenomenon.

16
Q

what is purpose limitation?

A

data can only be processed for the purposes explicitly specified to data subjects

17
Q

what is the data minimisation principle?

A

personal data must be adequate and relevant for the purposes specified

18
Q

what is the storage limitation principle?

A

it imposes on the controller and processor the obligation to store personal data only for as long as necessary. the data must be erased once the model is fully trained

19
Q

what is the accuracy requirement?

A

it creates a right to a correct representation of oneself. so the personal data must be correct and up to date. this also means that data subjects have the right to erasure and rectification

20
Q

what are the 3 additional obligations imposed on controllers and processors?

A

they must ensure that the data processing is done with security, integrity and confidentiality. they have the duty not to disclose private information.

21
Q

what is the accountability requirement?

A

it is a ‘meta-principle’, which renders data collectors responsible for demonstrating compliance with all the principles

22
Q

what has gained increased relevance due to the unpredictability of AI applications?

A

Data Protection Impact Assessments- they incentivise companies to take accountability for the risks that their systems might pose

23
Q

what lies at the core of AI development?

A

privacy and data considerations. for AI to be compliant with the law, these considerations cannot be disregarded

24
Q

what can the digitisation of social protection programmes be traced back to?

A

e-health programmes in the 1990s

25
Q

what are ADMS?

A

automated decision-making systems- innovative tools that can bolster welfare systems with shrinking budgets

26
Q

what does integrating ADMS in the welfare workflow require?

A

the recruitment of profiles that are not typically found in the public sector, such as data scientists

27
Q

what do ADMS perform?

A

automated administrative processes

28
Q

how can ADMS affect the citizens?

A

they can affect citizens' rights and determine whether poor people get the state support needed to build a decent life

29
Q

what are the 3 components of ADMS that are relevant for the delivery of equitable welfare services to women?

A

1) datasets- gendered datasets are lacking, meaning the possibility to serve female citizens becomes undermined. Biased datasets produce biased predictions.
2) decision-making models
3) design- public services targeting women are not designed with their involvement, which can contribute to a negative user experience

30
Q

what is a negative of ADMS?

A

few case studies about ADMS examine gender inequalities

31
Q

what % of women live in poverty compared to men?

A

20% women
18% men

32
Q

what % of single parents are female?

A

90%

33
Q

what is the exception to the 2-child policy?

A

the ‘‘rape clause’’- the mother must prove to the government that the child was born out of rape.

34
Q

what is a negative of the rape clause?

A

it does not apply if the mother continues to live with her rapist, which fails to recognise that most rape occurs within abusive relationships

35
Q

what does the Equal Treatment Act forbid?

A

unequal treatment based on socio-economic variables such as gender

36
Q

from 2016-2017, what % of single parent families were headed by women?

A

82%

37
Q

what is ParentsNext?

A

a programme for parents already receiving state support, designed to help them achieve education and employment goals

38
Q

what do the 3 case studies present?

A

different forms of automated inequality

39
Q

what are the 3 main pitfalls of the case studies?

A

1) a faulty approach to data
2) a lack of gender impact assessment
3) the absence of co-design

40
Q

what is principle 1 of the Gender Equality Principles for the Digital Welfare State?

A

gender-relevant datasets and statistics. algorithms reproduce their creators' biases. if gender data is missing, discrimination cannot be identified and addressed. building a representative dataset includes contextualising data within broader socio-economic realities

41
Q

what is principle 2 of the Gender Equality Principles for the Digital Welfare State?

A

gender mainstreaming in planning.

42
Q

what is principle 3 of the Gender Equality Principles for the Digital Welfare State?

A

co-design, oversight and feedback.

43
Q

what is principle 4 of the Gender Equality Principles for the Digital Welfare State?

A

equality by default. the system must consider data that is representative of the situation.

44
Q

what has digital technology enabled?

A

pathways for false information to be created and disseminated at a scale and reach never known before

45
Q

what are some of the consequences of disinformation?

A

it hinders people from exercising their human rights and destroys trust in governments and institutions

46
Q

how has the European Commission described disinformation?

A

as verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, and which may cause public harm.

47
Q

how have academics described disinformation?

A

as false information that is shared with the intention to cause harm.

48
Q

what are the 3 vectors which disinformation consists of, according to academics?

A

manipulative actors, deceptive behaviour and harmful content

49
Q

what reduces the effectiveness of the responses to the spread of disinformation?

A

the lack of clarity and agreement on what constitutes disinformation.

50
Q

what is disinformation a consequence of?

A

societal crisis and the breakdown of public trust in institutions

51
Q

what has disinformation been used for in several countries?

A

to undermine the right to free and fair elections.

52
Q

what does international human rights law provide?

A

a framework for formulating responses to the negative impact of disinformation.

53
Q

what does art 19 of the UDHR guarantee?

A

the right to hold opinions without interference. it is absolute and permits no restrictions.

54
Q

which human right can be restricted?

A

freedom of expression

55
Q

what is an essential element of freedom of opinion?

A

the right to form one’s opinion

56
Q

what is a violation of freedom of opinion?

A

punishment, harassment and intimidation for holding an opinion, including coercive or involuntary manipulation of the thinking process to develop an opinion

57
Q

what are the 2 points worth noting in the context of the right to freedom of expression?

A

1) freedom of expression applies to all kinds of information, including those that may shock or offend.
2) the free flow of information is a critical element of freedom of expression.

58
Q

when can freedom of expression be restricted?

A

only in accordance with article 19(3) of the International Covenant on Civil and Political Rights.

59
Q

what have state-led disinformation campaigns sought to do?

A

influence elections, control the narrative of public debates or curb protests against governments.

60
Q

what has the Human Rights Council condemned?

A

the use of internet shutdowns. they hinder voters from accessing information about elections.

61
Q

what do many ‘‘false news’’ laws fail to meet?

A

the 3-pronged test of legality, necessity and legitimate aims set out in article 19(3) of the International Covenant on Civil and Political Rights. they fail because they don’t define what constitutes false information

62
Q

what have some US social media platforms adopted?

A

policies and tools which ban ‘‘fake news’’. however, they are an insufficient response to the challenges posed by disinformation.

63
Q

what do companies continue to fail to do?

A

to provide remedies for wrongful actions taken on the basis of disinformation.

64
Q

what are the major failings of companies in relation to disinformation?

A

lack of transparency and access to data.

65
Q

what has the pandemic exposed in relation to disinformation?

A

the imperative of upholding the right and the challenges of confronting disinformation and misinformation.

66
Q

what does disinformation destroy?

A

people's trust in democratic institutions.

67
Q

what is the fundamental challenge for states, companies and the media in relation to disinformation?

A

to restore public trust.

68
Q

what should companies publish?

A

detailed transparency reports.

69
Q

what is the major role of the UN human rights system and the Human Rights Council in relation to disinformation?

A

in ensuring that all efforts to address disinformation are grounded firmly in international human rights law, including respect for freedom of opinion and expression.

70
Q

what is the threat that democracies are facing?

A

disinformation.

71
Q

what can be detrimentally affected by the use of AI?

A

freedom of elections in the EU

72
Q

what are social bots?

A

they are automated or semi-automated social media accounts controlled by algorithms. one of the most important features of bots is that they can achieve scalability, which enables them to spread information.

73
Q

what proportion of all Twitter accounts are estimated to be bots?

A

a quarter of all accounts

74
Q

how can AI exacerbate the threats of disinformation in democratic political systems?

A

1) the spread of disinformation through bots and micro-targeting reaches a greater number of people and appeals to their fears.
2) political deepfakes- they can manipulate voters and damage a political candidate’s reputation.

75
Q

what has the European Parliament called for with regard to political advertising?

A

more transparency

76
Q

what do deepfake videos raise concerns over?

A

over who holds the intellectual property rights to a deceased person's likeness.

77
Q

what are the 2 key generative neural network architectures for creating deepfake videos?

A

variational autoencoders and generative adversarial networks (GANs)
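a sketch of the adversarial idea behind the second architecture, GANs: a discriminator D scores how likely an input is to be real, while a generator G tries to produce samples that fool it. the D and G below are trivial made-up stand-ins for neural networks, used only to show how the objective is computed:

```python
import math

# Toy GAN objective for one real sample and one generated sample.
# D outputs the probability that its input is real; G maps a random
# code z to a fake sample. Both are hypothetical stand-ins here.

def D(x):
    return 1 / (1 + math.exp(-x))        # sigmoid "discriminator"

def G(z):
    return 0.5 * z                        # trivial "generator"

real, z = 2.0, -1.0
fake = G(z)

# The discriminator trains to maximise this (classify real vs fake):
d_objective = math.log(D(real)) + math.log(1 - D(fake))
# The generator trains to minimise log(1 - D(fake)), i.e. make fakes
# that the discriminator scores as real.
g_loss = math.log(1 - D(fake))
print(d_objective, g_loss)
```

training alternates the two updates until generated samples become hard to distinguish from real ones — which is exactly what makes deepfake video convincing.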

78
Q

what are large language models?

A

a type of deep learning model that can perform a variety of natural language processing tasks. ChatGPT is a chatbot based on a large language model.

79
Q

what is the role of ChatGPT in deep fake creation?

A

it saves time and effort by writing the dialogue. it also expands the scope of the crime and is cost-effective, as there is no need for professional help and anyone can create a deep fake.

80
Q

what does Bill (HF1370) criminalise?

A

non-consensual sharing of deep fake pornography and political misinformation

81
Q

what is the Regulation for privacy and data protection?

A

the General Data Protection Regulation (GDPR)

82
Q

what is the difference between intimacy and privacy?

A

intimacy- intimate information about us and our loved ones, e.g. health conditions or race.
privacy- any data that can identify us, e.g. previous work experience.

83
Q

what is web tracking?

A

a way for websites to identify visitors and collect information about them.

84
Q

what did the court find about The Netherlands SyRI case?

A

the system violated the right to respect for private and family life (art 8 ECHR)

85
Q

what is pseudonymisation?

A

processing of personal data in such a way that it is impossible to attribute it to a particular person without using additional information.
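a minimal sketch of keyed pseudonymisation, assuming a secret key held separately by the controller plays the role of the "additional information" (the name, key, and record fields are invented for illustration):

```python
import hashlib
import hmac

# The "additional information": held separately from the dataset,
# e.g. by the controller only. Value invented for illustration.
SECRET_KEY = b"held-separately-by-the-controller"

def pseudonymise(identifier):
    """Replace an identifier with a keyed hash. Whoever holds the key
    can re-derive the pseudonym and re-link the record; without the
    key, attribution to a person is not possible."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:12]

record = {"person": pseudonymise("Jane Doe"), "diagnosis": "asthma"}
print(record)  # pseudonym instead of a name

# True anonymisation, by contrast, would destroy any such link
# permanently (e.g. by deleting the key and any mapping).
```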

86
Q

how is Anonymization different from pseudonymization?

A

pseudonymised data can still be attributed to a specific person by using the additional information; anonymised data cannot

87
Q

which categories of sensitive data are in principle prohibited from processing?

A

data revealing your racial or ethnic origin, political opinions or religious beliefs, and genetic or biometric data

88
Q

what are the 5 conditions for when data processing is lawful? (only 1 needs to be met)

A

1) a person gave consent
2) necessary to carry out a contract with a person
3) necessary to comply with a legal obligation that the controller is subject to
4) needed to protect the interests of a person
5) needed to carry out a task related to public interest
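the "only 1 needs to be met" rule can be sketched as a simple membership check (the basis names below are shorthand labels, not the GDPR's exact wording):

```python
# Lawfulness check in the style of the card's list: processing is
# lawful if AT LEAST ONE recognised basis applies.
LAWFUL_BASES = {
    "consent",            # 1) the person gave consent
    "contract",           # 2) necessary to carry out a contract
    "legal_obligation",   # 3) necessary to comply with a legal obligation
    "vital_interests",    # 4) needed to protect a person's interests
    "public_interest",    # 5) needed for a task in the public interest
}

def processing_is_lawful(applicable):
    """True if any claimed basis is among the recognised lawful bases."""
    return bool(set(applicable) & LAWFUL_BASES)

print(processing_is_lawful({"consent"}))               # True
print(processing_is_lawful(set()))                     # False
print(processing_is_lawful({"marketing_preference"}))  # False
```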

89
Q

what burden is on the controller when data processing?

A

to prove that there is consent. the person has to be informed about what their data is needed for, how long it will be used, and who will have access to it. the person has the right to withdraw their consent at any time.

90
Q

what are the rights of a person over their person data?

A

1) information rights
2) right to erasure
3) right to data portability
4) right to object

91
Q

what are the 6 reasons why individual rights can be restricted?

A

1) national security
2) defence
3) public security
4) investigation/prosecution of crimes
5) public health
6) protection of judicial independence

92
Q

what is the data protection officer (DPO)?

A

the person within a company responsible for everything related to data protection and data breaches.

93
Q

what are the tasks of the DPO?

A

1) inform and advise the controller and processor
2) provide feedback in cases when the data protection impact assessment is needed
3) monitor legal compliance

94
Q

what is the definition of AI in The AI Act?

A

‘‘an AI system is a machine-based system designed to operate with varying levels of autonomy… and that… infers, from the input it receives, how to generate outputs such as predictions, content, recommendations’’.

95
Q

what is weak AI?

A

AI-based applications that are good at performing one single function but unable to be multi-functional like humans are.

96
Q

what is strong AI?

A

AI that we have not yet succeeded in building and which would be similar to our intelligence.

97
Q

how can AI be dangerous to personal data and privacy?

A

1) constant surveillance- your moves are being tracked.
2) you are no longer in control of your data and you have no way of knowing how the companies use your data.
3) high risk of manipulation.
4) your health data can be used against you- if an insurer knows what you suffer from, it may refuse to cover it in your health insurance.
5) use of children as tools to manipulate parents.

98
Q

according to some EU data protection authorities what is meaningful information?

A

personal data used in making decisions.

99
Q

what is facial emotion recognition (FER)?

A

FER reveals information about a person’s emotional state from photos and videos.

100
Q

what is the use of facial emotion recognition (FER)?

A

1) personalised services- Netflix may recommend a comedy film if you are sad
2) healthcare- detecting depression or other mental disorders
3) traffic- detection of drug or alcohol impairment
4) law enforcement- analysis of videos from crime scenes
5) education- measuring student engagement

101
Q

which articles of the UDHR relate to equality and non-discrimination?

A

articles 1 and 2

102
Q

what is the problem with Predpol?

A

low level crimes are mixed with high level crimes and the majority of low level crimes are discovered in poor neighbourhoods, thus feeding the system with data that links poverty with criminality.

103
Q

how does AI discriminate on the basis of gender?

A

1) AI is best at recognising white male faces and least good at recognising black female faces.
2) AI is trained using male voices.
3) SIRI and Alexa are better at recognising male voices.
4) AI in medicine uses medical data of men, not women. the research on the side effects of drugs rarely include female conditions.

104
Q

what is Cross Check/XCheck?

A

a program based on AI which analyses content to detect whether it violates Facebook's rules. however, it exempts high-profile users from FB rules. example: Neymar posted nude pictures of a woman; such behaviour from a normal user would have provoked sanctions from FB. these VIP users essentially have more freedom of speech/expression than normal users.

105
Q

what is the goal of California’s Anti-Bot Law?

A

to make people aware of when they are interacting with humans and when with bots. however, thanks to AI, it is now less clear when we are talking to bots.

106
Q

what are deep fakes?

A

artificially generated and manipulated fake video content that is difficult to distinguish from authentic footage. it usually targets famous people.

107
Q

what legal effects can deep fakes have?

A

1) human rights- privacy and data protection
2) stock market- if a CEO appears in a deep fake saying the company is in trouble, the company's stock value drops.

108
Q

what is generative AI?

A

ability to create sophisticated text, video or audio.

109
Q

what are prediction algorithms (COMPAS)?

A

they are used to predict the risk of recidivism on the basis of a person's data: will this person commit a crime again? judges in the US use prediction algorithms to decide whether to release a person from jail or how long one should spend in prison.

110
Q

what is the problem with predictive algorithms?

A

if a judge relies on it:
1) she is delegating a decision about a human life to a machine
2) what is the point of having a judge?

if the judge ignores it:
1) if the person commits a crime again, in society's eyes the judge knew they were dangerous but ignored it.

111
Q

what is a case law on prediction algorithms?

A

State v Loomis- the court held that the judge's use of COMPAS infringed no rights.