TensorFlow Flashcards
How does TensorFlow get its name?
TensorFlow gets its name from tensors, which are arrays of arbitrary dimensionality. Using TensorFlow, you can manipulate tensors with a very high number of dimensions.
What is a scalar, a vector, and a matrix?
- A scalar is a single value (a 0th-order tensor), e.g. 5 or “hello”.
- A vector is a list, or array (a 1st-order tensor), e.g. [2, 3, 5, 7, 11].
- A matrix is a 2-dimensional array (a 2nd-order tensor), e.g. [[1, 3], [2, 4]], as sketched below.
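For illustration, a minimal sketch (assuming the TF 1.x API used throughout these cards; the values are just examples) building one tensor of each order:

```python
import tensorflow as tf

scalar = tf.constant(5)                     # 0th-order tensor (scalar)
vector = tf.constant([2, 3, 5, 7, 11])      # 1st-order tensor (vector)
matrix = tf.constant([[1, 3], [2, 4]])      # 2nd-order tensor (matrix)

# Shapes are known statically, so no session is needed to inspect them.
print(scalar.get_shape())   # ()
print(vector.get_shape())   # (5,)
print(matrix.get_shape())   # (2, 2)
```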
What are operations and graphs in TensorFlow?
An operation creates, destroys, or manipulates tensors. A graph is a graph data structure whose nodes are operations and whose edges are tensors. Tensors flow through the graph, manipulated by operations.
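As a minimal sketch of the idea (using the TF 1.x graph/session style these cards assume; the values are illustrative), each call below adds an operation node to the default graph, and the tensors it produces flow along the edges to the next operation:

```python
import tensorflow as tf

x = tf.constant(3)       # a constant operation producing a tensor
y = tf.constant(4)       # another constant operation
total = tf.add(x, y)     # an add operation; x and y flow in along edges

# The graph is only evaluated when run inside a session.
with tf.Session() as sess:
    print(sess.run(total))   # 7
```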
What are constants and variables?
Constant tensors never change their values; variable tensors can. Both are just operations in a graph that return a constant or assigned tensor.
What does TensorFlow’s “lazy execution model” mean?
Nodes are only computed when needed, based on the needs of the nodes that depend on them; nothing runs until a result is fetched.
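A small sketch of what this means in practice (TF 1.x style; the constants are just examples): only the node you fetch, plus its dependencies, is computed.

```python
import tensorflow as tf

a = tf.constant(2)
b = tf.constant(3)
sum_node = tf.add(a, b)
product_node = tf.multiply(a, b)   # defined in the graph, but never needed below

with tf.Session() as sess:
    # Only sum_node and its dependencies (a and b) are computed here;
    # product_node is never evaluated because nothing fetches it.
    print(sess.run(sum_node))   # 5
```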
How do you define constant or variable tensors and assign values?
```python
# Use the tf.constant operator to define a constant.
x = tf.constant(3.14)

# Variables can be assigned to after creation.
y = tf.Variable([5])
y.assign([6])
```
What is a TensorFlow “session”?
Graphs must be run within a session, which holds the state for the graphs it runs.
What are the 2 basic steps of TensorFlow programming?
- Assemble constants, variables, and operations into a graph.
- Evaluate them within the context of a session.
What might the typical imports at the top of a TensorFlow program look like?
```python
import tensorflow as tf
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
```
How would you typically use a graph and a session in Python to ensure they are cleaned up?
Use the “with” statement, which, similar to C#’s “using”, places a guard around the block to ensure enter/exit happens, e.g.
```python
import tensorflow as tf

with tf.Graph().as_default():  # establish a default graph
    x = tf.constant([2, 3], name='x_const')
    # etc. -- assemble the graph
    with tf.Session() as sess:
        # etc. -- evaluate the graph
        print(sess.run(x))
```
What are tensor “shapes”?
They express the number and size of the dimensions of a tensor, e.g.
```python
# A matrix with 2 rows and 3 columns.
matrix = tf.zeros([2, 3])
```
What is “broadcasting”?
A concept borrowed from NumPy: it allows smaller tensors to be used with larger ones in element-wise mathematical operations (such as tf.add and tf.equal). Each dimension of the smaller tensor must either match the larger tensor’s dimension, be 1, or be missing (e.g. a [3,1,5] or [6,5] tensor can be used with a [3,6,5] tensor). Size-1 and missing dimensions are conceptually copied to match, e.g.
```python
# Create a six-element vector (1-D tensor).
primes = tf.constant([2, 3, 5, 7, 11, 13], dtype=tf.int32)

# Create a constant scalar with value 1.
ones = tf.constant(1, dtype=tf.int32)

# Add the two tensors. The resulting tensor is a six-element vector.
just_beyond_primes = tf.add(primes, ones)
```
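To also illustrate broadcasting of a size-1 dimension (not just a scalar), here is a sketch in the same style; the shapes and values are just examples:

```python
import tensorflow as tf

# A [3, 1] column is broadcast across the columns of a [3, 4] matrix.
column = tf.constant([[1], [2], [3]], dtype=tf.int32)        # shape [3, 1]
matrix = tf.constant([[0, 0, 0, 0],
                      [10, 10, 10, 10],
                      [20, 20, 20, 20]], dtype=tf.int32)     # shape [3, 4]

broadcast_sum = tf.add(matrix, column)                       # shape [3, 4]

with tf.Session() as sess:
    print(sess.run(broadcast_sum))
    # [[ 1  1  1  1]
    #  [12 12 12 12]
    #  [23 23 23 23]]
```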
What are the requirements on matrices for multiplication?
The number of columns of the first matrix must match the number of rows of the second, e.g. a [3,4] matrix times a [4,2] matrix is valid (and results in a [3,2] matrix), but a [4,2] and a [3,4] cannot be multiplied.
The resulting matrix has the number of rows of A and the number of columns of B. Each cell is the dot product of the corresponding row of A and column of B: the similarly indexed cells are multiplied pairwise and the products summed.
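For example, a sketch multiplying a [3,4] matrix by a [4,2] matrix with tf.matmul (the values are arbitrary and just for illustration):

```python
import tensorflow as tf

a = tf.constant([[1, 2, 3, 4],
                 [5, 6, 7, 8],
                 [9, 10, 11, 12]])    # shape [3, 4]
b = tf.constant([[1, 0],
                 [0, 1],
                 [1, 0],
                 [0, 1]])             # shape [4, 2]

product = tf.matmul(a, b)             # valid: 4 columns match 4 rows -> shape [3, 2]

with tf.Session() as sess:
    print(sess.run(product))
    # [[ 4  6]
    #  [12 14]
    #  [20 22]]
```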
How can you deal with matrices whose shapes are not suitable for multiplying?
Reshape them using tf.reshape.
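A sketch of the idea (with illustrative values): two [2,3] matrices cannot be multiplied directly, but reshaping the second to [3,2] makes the inner dimensions match:

```python
import tensorflow as tf

a = tf.constant([[1, 2, 3],
                 [4, 5, 6]])          # shape [2, 3]
b = tf.constant([[7, 8, 9],
                 [10, 11, 12]])       # shape [2, 3] -- cannot be multiplied with a

# Reshape b to [3, 2] so a's 3 columns match b's 3 rows.
reshaped_b = tf.reshape(b, [3, 2])    # [[7, 8], [9, 10], [11, 12]]
product = tf.matmul(a, reshaped_b)    # shape [2, 2]

with tf.Session() as sess:
    print(sess.run(product))
    # [[ 58  64]
    #  [139 154]]
```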
What must you do to use variable tensors within a graph?
Initialize them, and run the initializer within the session, e.g.
```python
with tf.Session() as sess:
    initialization = tf.global_variables_initializer()
    sess.run(initialization)
```
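A fuller sketch of the whole workflow, under the same TF 1.x assumptions as the rest of these cards (variable names and values are illustrative): define the variable, run the initializer, then assign and read it within the session.

```python
import tensorflow as tf

with tf.Graph().as_default():
    v = tf.Variable([5])               # a variable tensor with an initial value
    assignment = tf.assign(v, [6])     # assignment is itself an operation in the graph

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())  # must run before reading v
        print(sess.run(v))             # [5]
        sess.run(assignment)           # apply the assignment op
        print(sess.run(v))             # [6]
```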