SageMaker Flashcards

1
Q

Training options

A
  • Built-in training algorithms
  • Spark MLlib
  • Custom Python TensorFlow/MXNet code
  • PyTorch, scikit-learn, RLEstimator
  • XGBoost, Hugging Face, Chainer
  • Your own Docker image
  • Algorithm purchased from the AWS Marketplace (see the sketch below for the built-in option)
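
A minimal sketch of the first option (a built-in algorithm), assuming the SageMaker Python SDK; the role ARN, bucket names, and hyperparameter values are placeholders, not from the card:

```python
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/MySageMakerRole"  # placeholder role ARN

# Built-in algorithm: resolve the registry image URI for this region.
image_uri = sagemaker.image_uris.retrieve(
    "xgboost", session.boto_region_name, version="1.7-1"
)

estimator = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/output",  # placeholder bucket
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="reg:squarederror", num_round=100)
estimator.fit({"train": "s3://my-bucket/train"})  # placeholder channel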
2
Q

Model deployment options

A
  • Persistent endpoint for making predictions on demand
  • SageMaker Batch Transform for offline inference on whole datasets
  • Inference Pipelines for more complex processing (chained containers)
  • SageMaker Neo for compiling models to run on edge devices
  • Elastic Inference for accelerating deep learning inference
  • Automatic scaling (horizontal scaling of the number of instances behind an endpoint)
  • Shadow testing of a new model variant against production traffic (see the sketch below for the first two options)
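
A hedged sketch of the first two options, continuing from an `Estimator` like the one above; instance types, names, and S3 paths are placeholders:

```python
# Option 1: persistent HTTPS endpoint for on-demand predictions.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
    endpoint_name="my-endpoint",  # placeholder name
)
result = predictor.predict("5.1,3.5,1.4,0.2")  # payload format depends on the model

# Option 2: Batch Transform for offline scoring of an S3 dataset.
transformer = estimator.transformer(
    instance_count=1,
    instance_type="ml.m5.large",
    output_path="s3://my-bucket/batch-output",  # placeholder
)
transformer.transform("s3://my-bucket/batch-input", content_type="text/csv")
transformer.wait()

# Delete the endpoint when done to stop incurring charges.
predictor.delete_endpoint()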
3
Q

Word2Vec models

A

Neural models that learn word-embedding vectors by training on word-prediction tasks; the learned weights become the embeddings

Continuous Bag of Words (CBOW): predict the target word given the surrounding context words, maximising the posterior log probability of the correct word; the weights updated in the process are used as the embedding vectors

Skip-gram: the opposite of CBOW; predict the context words given the target word

More details here: https://arxiv.org/abs/1411.2738
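
Not SageMaker-specific, but as a quick illustration of how the two modes differ in practice, a gensim sketch (the toy corpus is invented for illustration):

```python
from gensim.models import Word2Vec

# Toy corpus: one tokenized sentence per list entry (illustrative only).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

# sg=0 trains CBOW (predict target from context); sg=1 trains skip-gram.
cbow = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)
skipgram = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# The learned weights are the embedding vectors.
print(cbow.wv["cat"][:5])
print(skipgram.wv.most_similar("cat", topn=2))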

4
Q

BlazingText

A

AWS implementation of the Word2Vec and text classification algorithms, optimized for multi-core CPUs and single-GPU training

Not parallelizable across instances, except batch skip-gram, which can scale out over multiple CPU instances

Modes
Single CPU instance: skip-gram, cbow, batch skip-gram | supervised
Single GPU instance: skip-gram, cbow | supervised
Multiple CPU instances: batch skip-gram only

Input

Unsupervised (Word2Vec):
Text file with one training sentence per line, space-separated tokens

Supervised (text classification):
One sentence per line
The first “word” on each line is the string __label__<label>

Example:

__label__4 linux ready for prime time

__label__2 the cat jump over the lazy fox
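
A minimal sketch of writing this supervised format; the file name is a placeholder and the label/text pairs are just the card's two examples:

```python
# Write (label, text) pairs in the supervised BlazingText format:
# one sentence per line, prefixed with __label__<label>.
samples = [
    (4, "linux ready for prime time"),
    (2, "the cat jump over the lazy fox"),
]
with open("train.txt", "w") as f:  # placeholder path
    for label, text in samples:
        f.write(f"__label__{label} {text}\n")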

Hyperparameters:

Word2Vec:
  • mode (skipgram, batch_skipgram, cbow)
  • learning_rate
  • window_size
  • vector_dim
  • negative_samples

Text classification:
  • epochs
  • learning_rate
  • word_ngrams
  • vector_dim
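
A hedged sketch of wiring these up with the SageMaker Python SDK; the role ARN, S3 paths, and hyperparameter values are placeholders:

```python
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()
image_uri = sagemaker.image_uris.retrieve(
    "blazingtext", session.boto_region_name, version="1"
)

bt = Estimator(
    image_uri=image_uri,
    role="arn:aws:iam::123456789012:role/MySageMakerRole",  # placeholder
    instance_count=1,  # >1 is only valid for batch_skipgram mode
    instance_type="ml.c5.4xlarge",
    output_path="s3://my-bucket/blazingtext-output",  # placeholder
    sagemaker_session=session,
)
# Word2Vec hyperparameters from the list above.
bt.set_hyperparameters(
    mode="skipgram",
    learning_rate=0.05,
    window_size=5,
    vector_dim=100,
    negative_samples=5,
)
bt.fit({"train": "s3://my-bucket/corpus.txt"})  # placeholder channel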
