ML Flashcards
tf: In TensorFlow you imagine
everything you are computing as a graph. Nodes are the transformations on the data, the functions you run on it, and they can have multiple inputs and outputs. The edges (the connections between nodes) are the data.
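A minimal sketch of the node/edge idea in the TF1-style API these cards use (the values are illustrative):

import tensorflow as tf  # TF1-style API, matching the cards below

a = tf.constant(2.0)      # edges: tensors carrying the data
b = tf.constant(3.0)
c = tf.add(a, b)          # node: a transformation with two inputs, one output
d = tf.multiply(c, a)     # node consuming the tensor (edge) produced by c

with tf.Session() as sess:
    print(sess.run(d))    # evaluates the graph: (2 + 3) * 2 = 10.0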
dl: backpropagation is
looking at the output of a deep neural network and comparing it to the desired output. Based on the difference between the prediction and the correct answer, you adjust the weights of the last layer. Then, based on the error propagated back to the second-to-last layer, you adjust its weights, and so on backwards through the network.
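A minimal NumPy sketch of one backward pass, assuming a one-hidden-layer network with sigmoid hidden units and squared error (the variable names are mine, not from the cards):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # 4 samples, 3 features
t = rng.normal(size=(4, 1))          # desired outputs
W1 = rng.normal(size=(3, 5))         # earlier layer weights
W2 = rng.normal(size=(5, 1))         # last layer weights

for _ in range(100):
    h = sigmoid(x @ W1)              # forward pass: hidden layer
    y = h @ W2                       # forward pass: prediction
    err = y - t                      # compare prediction to desired output

    grad_W2 = h.T @ err                   # adjust the last layer from the output error
    err_h = (err @ W2.T) * h * (1 - h)    # propagate the error one layer back
    grad_W1 = x.T @ err_h                 # adjust the earlier layer from that error

    W2 -= 0.01 * grad_W2             # gradient descent step
    W1 -= 0.01 * grad_W1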
ml: The points plotted near the decision boundary are called support vectors because
they are the points that force the decision boundary to be where it is.
ml: A Support Vector Machine is similar to a Nearest Neighbors because
both classify based on stored training points. The difference is that an SVM keeps only the points that define the decision boundary (the support vectors), while Nearest Neighbors keeps every training point, whether or not it influences the boundary.
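The contrast is easy to see with scikit-learn (an illustration, not part of the cards; a fitted SVC exposes the kept points as support_vectors_, while a nearest-neighbors model must store the whole training set):

import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

X = np.array([[0., 0.], [1., 1.], [2., 2.], [8., 8.], [9., 9.], [10., 10.]])
y = np.array([0, 0, 0, 1, 1, 1])

svm = SVC(kernel="linear").fit(X, y)
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

print(len(svm.support_vectors_))  # only the boundary-defining points (here 2)
print(len(X))                     # KNN must keep all 6 training points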
tf: A tensor is
a typed ndarray
tf: A one hot vector is
a vector with a 1 in exactly one column and 0 everywhere else. The column the 1 is in indicates the class the sample belongs to.
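A tiny NumPy sketch (the helper name one_hot is mine, for illustration):

import numpy as np

def one_hot(label, num_classes):
    v = np.zeros(num_classes)  # zero in every column...
    v[label] = 1.0             # ...except the one for this class
    return v

print(one_hot(3, 10))  # [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]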
ml: MNIST is a
computer vision dataset with images of handwritten digits and their labels
ml: softmax is
multinomial logistic regression, i.e. logistic regression generalized to more than two classes
ml: Softmax is good for
when you need the probability of a record belonging to each class
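A minimal NumPy sketch of softmax itself, turning arbitrary scores into class probabilities that sum to 1:

import numpy as np

def softmax(logits):
    z = logits - np.max(logits)  # shift for numerical stability; output unchanged
    e = np.exp(z)
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p, p.sum())  # roughly [0.66 0.24 0.10], summing to 1.0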
tf: A bias is used to
add a per-class offset that is independent of the input, e.g. to tell the model that a certain class is more frequent in general
tf: To create the table that hold all your samples, type
x = tf.placeholder(tf.float32, [None, 784])
tf: in x = tf.placeholder(tf.float32, [None, 1000]), None means
that that dimension can be of any length, e.g. any number of samples in a batch
tf: in x = tf.placeholder(tf.float32, [None, 1000]), 1000 is
the number of columns, i.e. the number of features per sample
tf: To create the weights variable, type
W = tf.Variable(tf.zeros([784, 10]))
tf: To create the biases variable, type
b = tf.Variable(tf.zeros([10]))
tf: To create a softmax model, type
y = tf.nn.softmax(tf.matmul(x, W) + b)
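Putting the last few cards together, the whole TF1-style model reads (a sketch, using the same API as the cards):

import tensorflow as tf  # TF1-style API

x = tf.placeholder(tf.float32, [None, 784])  # any batch size, 784 pixels per image
W = tf.Variable(tf.zeros([784, 10]))         # one weight column per digit class
b = tf.Variable(tf.zeros([10]))              # per-class bias
y = tf.nn.softmax(tf.matmul(x, W) + b)       # predicted class probabilities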
tf: a good cost function is called
cross-entropy, which measures how far the predicted probability distribution is from the true distribution
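In the same TF1 style, cross-entropy against one-hot labels can be sketched like this (y_ is a label placeholder I am introducing for illustration):

y_ = tf.placeholder(tf.float32, [None, 10])  # one-hot true labels
cross_entropy = tf.reduce_mean(
    -tf.reduce_sum(y_ * tf.log(y), axis=1))  # penalizes confident wrong predictions
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)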