Week 8 Flashcards

1
Q

Deepfakes

A
  1. portmanteau of “deep learning” and “fake”
  2. hyper-realistic digital falsification of video and audio
  3. created by leveraging machine-learning algorithms
  4. result: making someone appear to say or do something they never did
  5. early stage (!) of technology
2
Q

Societal benefits of deepfakes

A
  • education: historical figures speaking to students
  • art: satirize, parody, and critique public figures

3
Q

Social costs of deepfakes

A
  • for individuals:
    1. individual exploitation
    2. reputational sabotage
  • for society:
    1. distortion of democratic discourse
    2. manipulation of elections
    3. increasing social divisions
    4. undermining journalism
4
Q

Algorithmic solutions to overcome deepfakes

A

An algorithm can detect deepfakes based on unnatural blinking patterns. Such a detector could then be linked with social media platforms.

Limitations:

  1. does not prevent making and spreading deepfakes
  2. detection patterns can be corrected in the next algorithm iteration
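The blink-based idea can be sketched as a toy heuristic. This is an illustrative sketch, not the actual detector: the eye-openness scores, the blink threshold, and the minimum blink rate are all invented assumptions (humans blink roughly 15–20 times per minute).

```python
# Toy sketch of blink-based deepfake screening. In practice the
# per-frame eye-openness scores would come from a face-landmark model;
# here they are just floats between 0.0 (closed) and 1.0 (open).

def count_blinks(eye_openness, threshold=0.2):
    """Count blinks: each dip below the threshold counts once."""
    blinks, in_blink = 0, False
    for value in eye_openness:
        if value < threshold and not in_blink:
            blinks += 1
            in_blink = True
        elif value >= threshold:
            in_blink = False
    return blinks

def looks_like_deepfake(eye_openness, fps=30, min_blinks_per_minute=6):
    """Flag a clip whose blink rate is implausibly low for a human."""
    minutes = len(eye_openness) / (fps * 60)
    if minutes == 0:
        return False
    return count_blinks(eye_openness) / minutes < min_blinks_per_minute
```

A one-minute clip with no dips in eye openness would be flagged, while one with a normal blink rate would not. This also illustrates limitation 2: a generator retrained to produce natural blinking defeats the heuristic.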
5
Q

Digital provenance solutions to overcome deepfakes

A

A digital watermark specifying when the content was captured. The tag is imprinted by the device capturing or creating the image (at the point of creation).

Limitations:

  1. all devices would need to be equipped with watermarking technology
  2. social media platforms must require these digital watermarks to post
  3. watermarks make it easier to identify outspoken people in authoritarian regimes
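A point-of-creation tag could look something like the sketch below. This is a hypothetical scheme using a shared-secret HMAC purely for illustration; real provenance standards (e.g. C2PA) use public-key signatures embedded in the file.

```python
import hashlib
import hmac
import time

def tag_capture(image_bytes, device_key, captured_at=None):
    """Device-side: bind a hash of the pixels to the capture time."""
    if captured_at is None:
        captured_at = int(time.time())
    payload = hashlib.sha256(image_bytes).hexdigest() + str(captured_at)
    mac = hmac.new(device_key, payload.encode(), hashlib.sha256).hexdigest()
    return {"captured_at": captured_at, "mac": mac}

def verify_tag(image_bytes, tag, device_key):
    """Platform-side: check the tag before allowing a post."""
    payload = hashlib.sha256(image_bytes).hexdigest() + str(tag["captured_at"])
    expected = hmac.new(device_key, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag["mac"])
```

Any edit to the pixels invalidates the tag, which is exactly why limitation 1 bites: every capturing device needs the key material and the tagging step built in.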
6
Q

Privacy solutions to overcome deepfakes

A
  1. legal ban on creating and sharing deepfakes online for nefarious purposes
    Challenges:
    a. when do you consider something a deepfake?
    b. how do you identify the source?
    c. what if the source is in a foreign country?
  2. holding platforms accountable for monitoring and removing deepfakes
    Challenges:
    a. over-censoring to avoid fines?
    b. how should platforms detect deepfakes?
    c. what about freedom of expression?
7
Q

General challenges for solutions to overcome deepfakes

A
  1. not tackling the problem at its roots
  2. how can you reach the whole of society with literacy initiatives?
8
Q

Definition Conversational agents

A

Software that accepts natural language as input and generates natural language as output, engaging in a conversation with the user.

  1. virtual assistant (VA, voice-based)
  2. chatbot (CB, text-based)
9
Q

Five steps conversational agents

A
  1. speech recognition (VA)
  2. natural language understanding (VA & CB)
  3. dialogue management (VA & CB)
  4. natural language generation (VA & CB)
  5. text-to-speech (VA)
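The three steps shared by both agent types (2–4) can be sketched as a minimal text chatbot. The intents, canned replies, and escalation rule below are all made up for illustration; real systems use trained models at each stage.

```python
# Minimal sketch of the chatbot (CB) subset of the pipeline:
# understanding -> dialogue management -> generation.

def understand(utterance):
    """Natural language understanding: map raw text to an intent."""
    text = utterance.lower()
    if "hour" in text or "open" in text:
        return "opening_hours"
    if "refund" in text:
        return "refund"
    return "unknown"

def manage(intent, state):
    """Dialogue management: choose the next action given intent and state."""
    state["turns"] = state.get("turns", 0) + 1
    if intent == "unknown" and state["turns"] > 2:
        return "handover"  # escalate to a human after repeated failures
    return intent

def generate(action):
    """Natural language generation: render the chosen action as text."""
    replies = {
        "opening_hours": "We are open 9:00-17:00, Monday to Friday.",
        "refund": "I can start a refund for you. What is your order number?",
        "handover": "Let me connect you with a human colleague.",
        "unknown": "Sorry, I didn't catch that. Could you rephrase?",
    }
    return replies[action]

def chatbot_turn(utterance, state):
    return generate(manage(understand(utterance), state))
```

A voice assistant (VA) would wrap the same core with speech recognition in front and text-to-speech behind.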
10
Q

Opportunities of conversational agents

A

Consumers:

  1. fast, immediate support, 24/7
  2. meeting consumers on the right platform
  3. conversational nature (increases engagement)

Organizations:

  1. substituting employees
  2. making employees more efficient (by taking over repetitive tasks)
  3. using logged data to increase organizational performance
11
Q

Challenges research algorithms

A
  1. access/black box
  2. heterogeneous and embedded
  3. ontogenetic, performative and contingent
12
Q

Approaches research algorithms

A
  1. examining pseudo-code/source code
  2. reflexively producing code
  3. reverse engineering
  4. interviewing designers or ethnography of a team
  5. unpacking the full socio-technical assemblage
  6. examining how algorithms do work in the world
13
Q

Examining pseudo-code/source code approach & limitations

A
  1. deconstructing the pseudo-code of algorithms
  2. looking at how code is rewritten/tweaked
  3. looking at how code for a specific task is translated into various languages and runs across different platforms

Limitations:

  1. code is never straightforward (code jungle)
  2. expertise needed
14
Q

Reflexively producing code approach and limitations

A
  1. critical reflection on own experiences of formulating an algorithm
  2. analysis of:
    a. translating a task
    b. writing and revising a code
    c. influence of socio-technical factors

Limitations:

  1. hard to detach oneself / maintain critical distance
  2. focus is not on algorithms that have real concrete effects on people
15
Q

Reverse engineering approach and limitations

A
  1. focus on input and output
  2. examining what data are fed into an algorithm and what output is produced

Limitations:

  1. not able to draw very specific conclusions
  2. only a fuzzy glimpse of the algorithm
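A minimal sketch of this input/output probing, with a toy ranking function standing in for an inaccessible platform algorithm (the feature names and weights are invented for illustration):

```python
def toy_ranker(post):
    """Stand-in black box: we pretend we cannot read this code."""
    return 2.0 * post["likes"] + 0.5 * post["recency"]

def probe(black_box, base_input, feature, values):
    """Vary one input feature at a time and log the resulting outputs."""
    results = []
    for v in values:
        variant = dict(base_input, **{feature: v})
        results.append((v, black_box(variant)))
    return results

base = {"likes": 10, "recency": 4}
observations = probe(toy_ranker, base, "likes", [0, 10, 20])
# Output rising with likes suggests likes are weighted positively --
# a fuzzy glimpse of the algorithm, not its actual code.
```

The researcher only sees the (input, output) pairs, which is why the conclusions stay coarse: many different internal algorithms are consistent with the same observations.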
16
Q

Interviewing designers or ethnography of a team approach and limitations

A
  1. insights into the intent of the designers
  2. how and why an algorithm was designed
  3. questioning the coders of algorithms
  4. observing the work of the coders

Limitations:

  1. lack of specificity and detail

17
Q

Examining how algorithms do work in the world approach and limitations

A
  1. how do algorithms perform in practice?
  2. how do algorithms perform in a given context?
  3. how do they shape user behavior? how do people engage with algorithms?
  4. ethnography?
18
Q

The future of algorithms: two claims

A
  1. People will not trust algorithms to make important decisions for them
  2. For many life decisions, no successful algorithm will be able to be developed
19
Q

Claim 1: alignment of incentives

A
  1. encouraging users to buy products
  2. frustrating users so that they continue to use the application
  3. telling users what they want to hear
  4. shocking users with surprising information
  5. keeping users addicted to the app
20
Q

Claim 1

A
  1. no insight into your data
  2. algorithms are not open for scrutiny
  3. important life decisions are irreversible
  4. no reason to trust algorithms with major life decisions
21
Q

Alternative claim

A

Algorithms will optimize our “environment of choice”

  → rather than replacing human decision-making
  → algorithms will serve to shape our environment of choice
  → but we don’t have to rely on them
  → thus: humans remain the agents of decision and choice (humans are still in the driver’s seat)
22
Q

Claim 2

A
  1. the data must be predictive (available ahead of time)
  2. big data is needed for making decisions
    a. but not enough data exists for many decisions
    b. many decisions are too narrow/specific
  3. no company will have access to all the data
  4. hard to measure the success of algorithmic predictions