tech enablers - AI/ML Flashcards

1
Q

development stages of AI

A
  • narrow artificial intelligence
  • general artificial intelligence
  • super artificial intelligence
2
Q

supervised learning

A
  • create training data (annotated by humans)
  • use an ML algorithm to train a model on the labelled examples
  • apply the model to unseen data (see the sketch below)
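A minimal sketch of the three steps, assuming scikit-learn and its built-in iris dataset purely for illustration:

```python
# Supervised-learning sketch: labelled data -> model -> predictions on unseen data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# 1. Training data annotated by humans (here: the labelled iris dataset).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# 2. Use an ML algorithm to create a model from the labelled examples.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# 3. Apply the model to unseen data.
print("accuracy on unseen data:", model.score(X_test, y_test))
```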
3
Q

microworkers (pre-read)

A
  • perform tasks that machines cannot
  • provide the data for the machine learning algorithms that are the basis of AI, adding the human element
  • often educated individuals; the work is largely unregulated
  • e.g. Amazon's Mechanical Turk platform
4
Q

environmental sustainability of AI?

A

training an AI model leaves a high carbon-emissions footprint

5
Q

what is deep learning

A
  • deep layers of a neural network
  • data: very important, but good features are hard to derive by hand
  • representation learning: raw data -> feature learning -> model -> perform actions (see the sketch below)
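A tiny sketch of "deep layers" doing representation learning, assuming PyTorch; the layer sizes are arbitrary:

```python
# Representation-learning sketch: raw data -> feature learning (hidden layers) -> output.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),  # early layers learn low-level features
    nn.Linear(256, 64), nn.ReLU(),   # deeper layers learn higher-level features
    nn.Linear(64, 10),               # final layer maps features to actions/classes
)

x = torch.randn(32, 784)   # a batch of raw inputs (e.g. flattened 28x28 images)
logits = model(x)          # features are learned inside the hidden layers
print(logits.shape)        # torch.Size([32, 10])
```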
6
Q

pre-trained models and transfer learning

A
  • train from scratch, or
  • fine-tune a pre-trained model (transfer learning) - see the sketch below
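A sketch of the second option, assuming torchvision (0.13+ weights API) and a hypothetical 5-class downstream task:

```python
# Transfer-learning sketch: reuse ImageNet weights, retrain only the final layer.
import torch.nn as nn
from torchvision import models

# Start from a model pre-trained on ImageNet instead of random weights.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained feature extractor...
for param in model.parameters():
    param.requires_grad = False

# ...and replace just the classification head for the new (hypothetical) 5-class task.
model.fc = nn.Linear(model.fc.in_features, 5)
# Training now updates only model.fc, so far less data and compute are needed.
```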
7
Q

data-centric vs model-centric AI

A
  • Labelling consistency is key: a lack of consistency can degrade the outcome.
  • Systematically improving data quality on a basic model is better than chasing state-of-the-art models with low-quality data.
  • With a data-centric view, there is significant room for improvement on problems with smaller datasets (<10k examples).
  • When working with smaller datasets, tools and services that promote data quality are critical.
8
Q

GAN

A

generative adversarial networks
- two networks, a generator and a discriminator, are trained against each other so that the generator learns to mimic any distribution of data (see the sketch below)
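A minimal sketch of the adversarial setup, assuming PyTorch; the 1-D Gaussian "real" data and the tiny network sizes are just for illustration:

```python
# GAN sketch: a generator G learns to mimic a 1-D Gaussian; a discriminator D tells real from fake.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # noise -> fake sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # sample -> P(real)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 2 + 3            # "real" data drawn from N(3, 2)
    fake = G(torch.randn(64, 8))                 # generator's attempt to mimic it

    # Train D to separate real from fake.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train G to fool D.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())     # drifts toward ~3 as G mimics the data
```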

9
Q

generative AI

A
  • textual prompt -> novel content
  • powered by foundation models: AI models trained on vast quantities of unlabelled data at scale, then adapted to a wide range of downstream tasks
  • e.g. ChatGPT is trained with a technique called reinforcement learning from human feedback (tuning/teaching the model toward human-preferred responses)
10
Q

concerns of AI (2)

A
  • difficult to distinguish real from fake
  • replacing humans
11
Q

Working with smart machines: Insights on the future of work (pre-reading)

A

ecosystems for supporting AI applications
1. technology-based ecosystem
   - platforms: exploration support / transaction support / automated decision platforms
   - intelligent case management systems: workflow management / prioritisation / recommendations
2. job role-based ecosystem
   - new job specialisations / hybridisations
12
Q

aspects of AI readiness index framework

A
  • organisational readiness (AI literacy / talent / governance / management support)
  • business value readiness (business use cases)
  • data readiness (data quality / reference data)
  • infrastructure readiness (ML / data)
13
Q

AI biases

A
  • learnt from training data and amplified by the model
  • e.g. in healthcare, hiring, and policing; along gender and race lines
14
Q

retail AI example

A

Domino's

obj: use AI for consistent quality and speedier delivery
tech:
- pizza checker (image analysis and ML)
- order processing via voice (NLP)
- autonomous delivery vehicles
results: investing further to fit all kitchens with the pizza checker

15
Q

transport AI example

A

Tesla - aiming for level 5 (full) autonomy

obj: minimise accidents and deaths on the road
tech:
- IoT, sensors, cameras (computer vision)
- cloud computing to analyse all driving data
- Siri-style AI assistant (voice control, NLP)
results: Autopilot (2) cut the rate of airbag deployment from 1.3 to 0.8 per million miles driven

16
Q

products and privacy AI example

A

Apple

obj: pioneer of on-device AI technology
tech:
- Neural Engine inside the iPhone X (custom chip)
- AI ecosphere (algorithms built into products)
- smarter apps with AI functionality (e.g. HomeCourt, computer vision)
results: prioritises user privacy by processing data on-device rather than in the cloud; exclusivity of Apple apps and the AI ecosphere

17
Q

healthcare AI example

A

Tencent – AI to power WeChat and Healthcare

obj: capitalise on AI across all industries
tech:
- Tencent Miying: an AI medical imaging and diagnosis platform (deep learning for image recognition, NLP on medical documents and case files)
results:
- leverages WeChat for appointment booking and treatment payment
- Miying can identify symptoms of more than 700 diseases

18
Q

sustainable AI

A

Alphabet's AI-powered camera system

19
Q

entertainment AI

A

Spotify

obj: help discover new talent and tracks
tech:
- AI fills the role of a DJ (recommender)
- collaborative filtering (deduce a user's interests from similar users)
- audio analysis (tempo, pitch)
- NLP (lyrics and reviews on the web)
results: subscriber base grew by 8 million and share price rose by 25% in the 3 months after the IPO

20
Q

opportunities and risks of AI (4)

A
  • social & ethical
  • legal & regulatory
  • technological & implementation-oriented
  • economic
  • refer to docx for more details
21
Q

types of recommenders (2)

A
  1. collaborative filtering (sketched below)
    - users A and B have read similar articles
    - so an article read by user A is recommended to user B
  2. content-based filtering
    - articles similar to ones a user has already read are recommended to that user
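A toy sketch of the collaborative-filtering case, assuming numpy; the users, articles, and read-flags are made up:

```python
# Collaborative-filtering sketch on a made-up read/not-read matrix.
import numpy as np

# Rows = users A, B, C; columns = articles 1-4; 1 = read, 0 = not read.
reads = np.array([
    [1, 1, 0, 1],   # user A
    [1, 1, 0, 0],   # user B
    [0, 0, 1, 0],   # user C
])

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# User B looks much more like user A than like user C...
print("similarity B-A:", cosine(reads[1], reads[0]), " B-C:", cosine(reads[1], reads[2]))

# ...so recommend to B the article that A read but B has not (article 4, index 3).
print("recommend to B (0-indexed):", np.where((reads[0] == 1) & (reads[1] == 0))[0])
```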
22
Q

complexity of language

A
  • computers mainly treat documents as “bags of words”: good at finding statistical patterns but not at capturing meaning and context
23
Q

text pre-processing

A

source sentence -> remove stop words -> stemming -> tokens
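A rough sketch of that pipeline, assuming NLTK's stop-word list and Porter stemmer (the sentence is made up, and simple whitespace splitting stands in for a proper tokeniser):

```python
# Pre-processing sketch: tokenise, remove stop words, stem.
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer

nltk.download("stopwords", quiet=True)   # fetch the stop-word list on first run

sentence = "The runners were quickly running towards the finish line"
tokens = sentence.lower().split()                                    # crude tokenisation
tokens = [t for t in tokens if t not in stopwords.words("english")]  # remove stop words
stems = [PorterStemmer().stem(t) for t in tokens]                    # stemming
print(stems)  # ['runner', 'quickli', 'run', 'toward', 'finish', 'line']
```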

24
Q

text representation

A
  • convert text to numbers
  • term frequency / word counts
  • TF-IDF (term frequency weighted by inverse document frequency) - see the sketch below
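A sketch with two toy documents, assuming scikit-learn's vectorisers:

```python
# Text-representation sketch: word counts vs TF-IDF weights for two toy documents.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["the cat sat on the mat", "the dog chased the cat"]

counts = CountVectorizer().fit_transform(docs)   # raw term frequencies (word counts)
tfidf = TfidfVectorizer().fit_transform(docs)    # counts re-weighted by inverse document frequency

print(counts.toarray())
print(tfidf.toarray().round(2))
```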
25
Q

similarity calculation

A
  1. euclidean distance
  2. cosine similarity
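A small worked example, assuming numpy; the vectors are chosen so the two measures disagree:

```python
# Two toy document vectors: far apart in Euclidean distance, identical in direction.
import numpy as np

a = np.array([1.0, 2.0, 0.0])
b = np.array([2.0, 4.0, 0.0])

euclidean = np.linalg.norm(a - b)                          # distance: 0 means identical vectors
cosine = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))   # similarity: 1 means same direction

print(euclidean, cosine)   # ~2.24 and 1.0
```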
26
Q

word embedding

A
  • the neural network's internal representation of the word
  • each word is represented as a vector, and semantically similar words have similar vectors
  • e.g. the Word2Vec model is trained so that the probability of a word depends on its neighbouring words (see the sketch below)
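A toy sketch assuming gensim (4.x API); the three-sentence corpus is far too small to learn good vectors and only illustrates the calls:

```python
# Word2Vec sketch: each word gets a vector learned from its neighbouring words.
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "dog", "chases", "the", "ball"],
]

model = Word2Vec(sentences, vector_size=16, window=2, min_count=1, epochs=50)

print(model.wv["king"][:4])                   # first few dimensions of the word's vector
print(model.wv.similarity("king", "queen"))   # similar contexts -> similar vectors
```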
27
Q

pre-trained models - e.g. BERT

A

Bidirectional Encoder Representations from Transformers

  • text preprocessing is no longer a must
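A sketch of using a pre-trained BERT on raw text, assuming the Hugging Face transformers library:

```python
# BERT sketch: raw text in, contextual token vectors out - no stop-word removal or stemming.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("AI readiness depends on data quality.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)   # one contextual vector per token
```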
28
Q

caution when using pretrained models

A
  • fine-tune with task-related data
  • can be resource-intensive (GPU needed) - one way to reduce the cost is sketched below
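One common way to keep fine-tuning affordable is to freeze the pre-trained encoder and train only a small task head; a sketch assuming Hugging Face transformers and a hypothetical 2-label task:

```python
# Cheaper fine-tuning sketch: freeze the pre-trained encoder, train only the small task head.
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

for param in model.bert.parameters():   # .bert is the encoder of this particular checkpoint
    param.requires_grad = False
# Only the lightweight classification head is updated with the task-related data,
# which trains faster and fits on a modest GPU.
```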