TensorFlow Flashcards
Explain the SoftMax function
Turns scores (logits, the raw outputs of a neural net) into probabilities that sum to 1
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/797/a_image_thumb.png?1511924524)
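The softmax math can be sketched in NumPy (a hand-rolled version for illustration, not TensorFlow's own implementation):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability; the result is unchanged
    exps = np.exp(logits - np.max(logits))
    return exps / exps.sum()

probs = softmax(np.array([2.0, 1.0, 0.2]))
# probs sums to 1, and the largest logit gets the largest probability
```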
What are logits?
Scores/numbers. For neural nets, they are the result of the matmul of weights and inputs, plus the bias. Logits are the inputs to a softmax function
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/800/a_image_thumb.png?1511924804)
Explain a TensorFlow Session
- an environment for running a graph
- in charge of allocating the operations to GPU(s) and/or CPU(s), including remote machines
- Sessions do not create the tensors; this is done outside the session. Instead, Session instances EVALUATE tensors and return results
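A minimal sketch of running a graph in a session (TF 1.x-style API; the tf.compat.v1 import makes it also runnable under TF 2.x):

```python
import tensorflow.compat.v1 as tf  # plain `import tensorflow as tf` on TF 1.x
tf.disable_eager_execution()       # only needed when running under TF 2.x

# The tensor is created outside the session...
hello = tf.constant('Hello World!')

# ...the session only evaluates it and returns the result
with tf.Session() as sess:
    output = sess.run(hello)
```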
Explain tf.placeholder
- A tensor whose value is fed in at run time, which lets the same graph run against different datasets and parameters. The tensor itself can't be modified.
- Use the feed_dict parameter of tf.Session.run() to set the value of the placeholder tensor
- tensor is still created outside the Session instance
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/803/a_image_thumb.png?1511925812)
How to set multiple tf.placeholder values
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/805/a_image_thumb.png?1511925898)
What happens if the data passed to feed_dict doesn't match the tensor type and can't be cast to that type?
ValueError: invalid literal for
How to cast a value to another type
tf.subtract(tf.cast(tf.constant(2.0), tf.int32), tf.constant(1))  # casts 2.0 to int32 so the types match, then computes 2 - 1
Explain Tf.Variable
- remember it's a capital V (tf.Variable)
- creates a tensor with an initial value that can be modified, much like a normal Python variable
- stores its state in the session
- call tf.global_variables_initializer() to get an op that initializes all variables, then run that op inside a session before using the variables (either store the op in a variable first, or call it directly inside sess.run())
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/810/a_image_thumb.png?1515931611)
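A sketch of creating and initializing variables (TF 1.x API via tf.compat.v1; the shapes here are made up):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# tf.Variable (capital V) creates a modifiable tensor with an initial value
n_features, n_labels = 120, 5
weights = tf.Variable(tf.truncated_normal((n_features, n_labels)))
bias = tf.Variable(tf.zeros(n_labels))

init = tf.global_variables_initializer()  # op that initializes all variables

with tf.Session() as sess:
    sess.run(init)          # variables hold no value until this runs
    w = sess.run(weights)   # now the weights can be evaluated
```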
Explain tf.truncated_normal
- The tf.truncated_normal() function returns a tensor with random values from a normal distribution whose magnitude is no more than 2 standard deviations from the mean.
- Since the randomized weights are already helping prevent the model from getting stuck, you don't need to randomize the bias
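The truncation behaviour can be sketched in NumPy (a hand-rolled approximation, not TensorFlow's implementation): any draw more than 2 standard deviations from the mean is redrawn.

```python
import numpy as np

def truncated_normal(shape, mean=0.0, stddev=1.0, seed=0):
    # Draw normals, then redraw any value more than 2 stddevs from the mean
    rng = np.random.default_rng(seed)
    out = rng.normal(mean, stddev, size=shape)
    mask = np.abs(out - mean) > 2 * stddev
    while mask.any():
        out[mask] = rng.normal(mean, stddev, size=int(mask.sum()))
        mask = np.abs(out - mean) > 2 * stddev
    return out

w = truncated_normal((120, 5))
```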
Softmax function call?
x = tf.nn.softmax([2.0, 1.0, 0.2])
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/813/q_image_thumb.png?1512146794)
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/813/a_image_thumb.png?1512146922)
Explain steps to one-hot encode labels
- import preprocessing from sklearn
- create an encoder
- encoder finds classes and assigns one-hot encoded vectors
- transform labels into one-hot encoded vectors
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/819/a_image_thumb.png?1512147453)
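The steps above can be sketched with scikit-learn's LabelBinarizer (the labels here are made up):

```python
import numpy as np
from sklearn import preprocessing  # step 1: import preprocessing from sklearn

labels = np.array([1, 5, 3, 2, 1, 4, 2, 1, 3])

# step 2: create an encoder
lb = preprocessing.LabelBinarizer()
# step 3: the encoder finds the classes and assigns one-hot encoded vectors
lb.fit(labels)
# step 4: transform labels into one-hot encoded vectors
one_hot = lb.transform(labels)
```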
basic concept of cross entropy
Measures the distance between two vectors, usually comparing a one-hot encoded label vector with the softmax output vector. The basic idea is to minimize this distance
Describe the process of calculating cross entropy
- take natural log of softmax outputs vector (prediction probabilities)
- Next, multiply by one hot encoded vector
- Sum together, take negative
- Since the one-hot encoded vector is zero everywhere except at the true label/class, the formula simplifies to the negative natural log of the predicted probability for the true class
![](https://s3.amazonaws.com/brainscape-prod/system/cm/298/905/822/a_image_thumb.png?1512149377)
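The steps above in NumPy (a hand-rolled sketch with made-up prediction values):

```python
import numpy as np

softmax_output = np.array([0.7, 0.2, 0.1])  # prediction probabilities
one_hot = np.array([1.0, 0.0, 0.0])         # true class is index 0

# Multiply the one-hot vector by the natural log of the predictions,
# sum the terms, and take the negative
cross_entropy = -np.sum(one_hot * np.log(softmax_output))
# The one-hot vector zeros out every term except the true class,
# so this simplifies to -log(0.7)
```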
Does the cross entropy function output a vector of values?
No, just a single scalar value that represents the distance