TensorFlow Flashcards

1
Q

Explain the softmax function

A

Turns scores (logits, the raw numeric outputs of a neural net) into probabilities that sum to 1
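As a sketch, the same computation can be written in NumPy (a hand-rolled equivalent of what tf.nn.softmax computes, not TF's actual implementation):

```python
import numpy as np

def softmax(logits):
    """Convert raw scores (logits) into probabilities that sum to 1."""
    exps = np.exp(logits - np.max(logits))  # subtract max for numerical stability
    return exps / np.sum(exps)

probs = softmax(np.array([2.0, 1.0, 0.2]))  # largest logit gets largest probability
```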

2
Q

What are logits?

A

Raw scores/numbers. For a neural net, the logits are the result of the matrix multiplication of the weights and inputs, plus the bias. Logits are the inputs to the softmax function.
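A toy NumPy illustration of where logits come from (all values here are invented for the example):

```python
import numpy as np

x = np.array([1.0, 2.0])        # input features
W = np.array([[0.5, -0.5],
              [1.0,  0.0]])     # weights: one row per output unit
b = np.array([0.1, -0.1])       # bias

logits = W @ x + b              # raw scores; these feed into softmax
```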

3
Q

Explain a TensorFlow Session

A
  • an environment for running a graph
  • in charge of allocating the operations to GPU(s) and/or CPU(s), including on remote machines
  • Sessions do not create tensors; that is done outside the session. Instead, Session instances EVALUATE tensors and return the results
4
Q

Explain tf.placeholder

A
  • Tensor whose value is fed in at run time, so the same graph can be used with different datasets and parameters. The tensor itself can’t be modified.
  • Use the feed_dict parameter of tf.Session.run() to set the value of a placeholder tensor
  • the tensor is still created outside the Session instance
5
Q

How to set multiple tf.placeholder values

A

Pass one key/value pair per placeholder in the same feed_dict dictionary given to session.run()
6
Q

What happens if the data passed to the feed_dict doesn’t match the tensor type and can’t be cast into the tensor type?

A

ValueError: invalid literal for

7
Q

How to cast a value to another type

A

tf.subtract(tf.cast(tf.constant(2.0), tf.int32), tf.constant(1))  # casts 2.0 to int32, so this computes 2 - 1

8
Q

Explain tf.Variable

A
  • remember: it’s a capital V
  • creates a tensor with an initial value that can be modified, much like a normal Python variable
  • stores its state in the session
  • call tf.global_variables_initializer() to get an operation that initializes all variables, then run that operation inside a session before using the tensor (either via a variable you assigned it to, or by calling it directly in the session instance)
9
Q

tf.truncated_normal

A
  • The tf.truncated_normal() function returns a tensor with random values from a normal distribution whose magnitude is no more than 2 standard deviations from the mean.
  • Since the weights are already helping prevent the model from getting stuck, you don’t need to randomize the bias
10
Q

Softmax function call?

A

x = tf.nn.softmax([2.0, 1.0, 0.2])

12
Q

Explain steps to one-hot encode labels

A
  1. import preprocessing from sklearn
  2. create an encoder
  3. encoder finds classes and assigns one-hot encoded vectors
  4. transform labels into one-hot encoded vectors
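The steps above can be sketched with scikit-learn’s LabelBinarizer (one of the sklearn preprocessing encoders; the label values here are invented for illustration):

```python
from sklearn.preprocessing import LabelBinarizer  # 1. import from sklearn

labels = ['cat', 'dog', 'bird', 'dog']

lb = LabelBinarizer()           # 2. create an encoder
lb.fit(labels)                  # 3. find the classes, assign one-hot vectors
one_hot = lb.transform(labels)  # 4. transform labels into one-hot vectors
```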
13
Q

Basic concept of cross entropy

A

Measures the distance between two vectors, usually a one-hot encoded label vector and a softmax output vector. The basic idea is to reduce this distance during training.

14
Q

Describe the process of calculating cross entropy

A
  1. take the natural log of the softmax output vector (the prediction probabilities)
  2. multiply, element-wise, by the one-hot encoded label vector
  3. sum the results and take the negative
  4. since the one-hot encoded vector is zero everywhere except at the true label/class, the formula simplifies to the negative natural log of the predicted probability for the true class
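A worked NumPy example of the steps above (the probabilities are made up for illustration):

```python
import numpy as np

softmax_out = np.array([0.7, 0.2, 0.1])  # prediction probabilities
one_hot = np.array([1.0, 0.0, 0.0])      # true label is class 0

# D = -sum(one_hot * ln(softmax_out))
cross_entropy = -np.sum(one_hot * np.log(softmax_out))
# one_hot zeroes out every term except the true class,
# so this equals -ln(0.7)
```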
15
Q

Does the cross entropy function output a vector of values?

A

No, just a single value which represents distance

16
Q

Quiz - Cross Entropy

A
17
Q

How to implement mini-batching in TF

A
  • use range to specify starting, ending and step size
  • identify end of each batch
  • select data and loop
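A minimal pure-Python sketch of such a batching helper (the name `batches` and its signature are assumptions for illustration, not a TF API):

```python
def batches(batch_size, features, labels):
    """Split features and labels into batches of at most batch_size."""
    assert len(features) == len(labels)
    out = []
    # range gives the starting index, ending index, and step size
    for start in range(0, len(features), batch_size):
        end = start + batch_size  # identify the end of each batch
        out.append([features[start:end], labels[start:end]])
    return out
```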
18
Q

Quiz - Set features, labels, weights and biases

A
19
Q

What is a tensor

A

Any n-dimensional collection of values

scalar = 0-dimensional tensor

vector = 1-dimensional tensor

matrix = 2-dimensional tensor

Anything with more than 2 dimensions is just called an n-dimensional tensor, where n is its rank

20
Q

Describe a 3 dimensional tensor

A

An image can be described by a 3-dimensional tensor (e.g. height × width × color channels). This looks like a list of matrices.
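The ranks above can be checked with NumPy’s ndim attribute (the array values and the 28×28×3 image shape are illustrative):

```python
import numpy as np

scalar = np.array(5.0)             # 0-dimensional tensor
vector = np.array([1.0, 2.0])      # 1-dimensional tensor
matrix = np.array([[1.0, 2.0],
                   [3.0, 4.0]])    # 2-dimensional tensor
image = np.zeros((28, 28, 3))      # 3-dimensional: height x width x channels
```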

21
Q

tf.constant

A

Creates a tensor whose value never changes

22
Q

Best practice to initialize weights

A

Initialize weights with random values from a truncated normal distribution; tf.truncated_normal() takes the shape as a tuple/list

23
Q

Best practice to initialize bias

A

Since the randomized weights already keep the model from getting stuck, the bias doesn’t need to be randomized — initialize it to zeros, e.g. tf.Variable(tf.zeros(n_labels)), where n_labels is the number of output classes
24
Q

What does “None” allow us to do?

A

None is a placeholder dimension, which allows us to feed batches of different sizes

26
Q

How to implement Epochs in TF

A

Nest the loop over batches inside a loop over training cycles (epochs)
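A pure-Python sketch of the nested loops (the data, batch size, and epoch count are made up; the TF training call is only indicated in a comment):

```python
data = list(range(10))
batch_size = 4
epochs = 3

steps = 0
for epoch in range(epochs):                        # loop over training cycles
    for start in range(0, len(data), batch_size):  # loop over batches
        batch = data[start:start + batch_size]
        # session.run(optimizer, feed_dict=...) would go here in TF
        steps += 1
```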

27
Q

How to save your trained weights and biases in TF

A

create an instance of the tf.train.Saver class

run the save method on this instance, passing the session and a file path

28
Q

When saving weights and biases, what are some considerations when applying a name?

A

Basically, if the weights and biases are created in a different order within the pipeline and no names were specified when saving, TF will throw an error when loading. Giving each tf.Variable an explicit name avoids this.

29
Q

How to one-hot encode labels using a TF function

A

tf.one_hot(labels, depth), where labels is a tensor of class indices and depth is the number of classes
30
Q

template code to calculate model softmax, cross_entropy, specify loss and optimize

A
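A NumPy sketch of what such a template computes — logits, softmax, cross entropy, and the mean loss (the optimizer step is omitted; all names and values here are illustrative, not TF code):

```python
import numpy as np

def softmax(logits):
    exps = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exps / exps.sum(axis=1, keepdims=True)

# Toy batch: 2 examples, 3 features, 3 classes
features = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
labels = np.array([[1.0, 0.0, 0.0],    # one-hot targets
                   [0.0, 0.0, 1.0]])
weights = np.zeros((3, 3))
bias = np.zeros(3)

logits = features @ weights + bias                        # model
prediction = softmax(logits)                              # softmax
cross_entropy = -np.sum(labels * np.log(prediction), axis=1)
loss = np.mean(cross_entropy)                             # mean over the batch
```

In TF the last step would be handed to an optimizer that minimizes the loss.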