Commercial and law enforcement purposes Flashcards
What does Oostveen's article, “Identifiability and the applicability of data protection to big data”, say?
Problems with big data
- Purpose limitation and data minimisation go directly against the logic of big data (the more data, the better). Big data also typically requires data for unspecified purposes.
- Big data processing lacks the transparency required by the GDPR
- Combining data sets can create new data that the subject is not aware of; subjects can thus not give consent to its processing.
- Predictive powers
- Blame is placed on the computer (the algorithms), and liability is thereby evaded
- Big data is built on statistics and finding patterns, so there is a risk of finding coincidental patterns and building a false picture of reality.
- Not as objective as claimed, since the algorithm is initially created by a person with a specific point of view that will influence it.
Is big data even covered by the GDPR?
The GDPR covers personal data relating to identified and identifiable persons (whether directly or indirectly identifiable).
It is difficult to distinguish between indirectly identifiable and non-identifiable data. The WP29 has a very broad definition of indirectly identifiable - all data that can single out a natural person.
Many controllers claim to process non-identifiable data, which seems highly incompatible with the WP29's interpretation. The assessment is, however, left to the controller.
Phases of big data
- Acquisition: personal data is collected
- Analysis: the data is often de-identified (see the sketch below)
- Application: two possible outcomes, one based on personal data and one not based on personal data
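To make the de-identification step in the analysis phase concrete, here is a minimal, hypothetical Python sketch. It is not from Oostveen's article; the records, field names and the pseudonymise helper are invented for illustration. It shows how direct identifiers might be replaced with pseudonyms, and why the result can still "single out" a person under the WP29's broad reading of indirect identifiability.

import hashlib

# Acquisition phase: personal data (hypothetical example records).
records = [
    {"name": "Alice", "postcode": "1017", "birth_year": 1985, "purchases": 12},
    {"name": "Bob",   "postcode": "2511", "birth_year": 1990, "purchases": 3},
]

def pseudonymise(record: dict) -> dict:
    """Analysis phase: drop the name and replace it with a hash-based pseudonym."""
    pseudonym = hashlib.sha256(record["name"].encode()).hexdigest()[:8]
    return {"id": pseudonym, **{k: v for k, v in record.items() if k != "name"}}

deidentified = [pseudonymise(r) for r in records]
print(deidentified)

# Even without names, a unique combination of postcode and birth year can
# still single out one individual, which is why such "de-identified" data
# may remain indirectly identifiable under the broad WP29 reading.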
What can be done to solve the problems of big data?
More algorithmic transparency would increase the general understanding of how big data decisions are made, what information they are based upon and what effects they might have.
What does Elizabeth E. Joh’s article, “The undue influence of surveillance technology companies on policing”, say?
Problem:
Because the police do not produce surveillance technologies themselves, they have to buy them from private companies. These companies largely determine the design of the technology (e.g. whether or not body cameras include face recognition) without the police or the courts having a chance to influence those decisions.
Additionally, the companies often have the police sign non-disclosure agreements, meaning that the algorithms used to process the collected data must be kept secret, even in courtrooms. This creates a lack of transparency and leaves no possibility of challenging or correcting the algorithm, and data subjects cannot gain access to the logic behind the decisions/profiling.
What is the solution to the problems of surveillance technologies?
Oversight - city or county ordinances should require the police to inform the local government about, and seek its approval for, the surveillance technologies they want to purchase.
Guidelines - developed by the individual police departments, describing how the technology will be used in practice and how the collected data will be stored.
Annual reporting
The author also proposes the use of public records requests as a form of oversight, since civil liberties groups, journalists and private citizens can help pierce the secrecy surrounding these technologies.