Main classes

| Class | Description |
| --- | --- |
| tf.Graph() | A TensorFlow computation, represented as a dataflow graph. |
| tf.Operation() | Represents a graph node that performs computation on tensors. |
| tf.Tensor() | Represents one of the outputs of an Operation: a handle to a multidimensional array of values. |
Some useful functions

| Function | Description |
| --- | --- |
| tf.get_default_graph() | Returns the default graph for the current thread. |
| tf.reset_default_graph() | Clears the default graph stack and resets the global default graph. |
| tf.device("/cpu:0") | A context manager that specifies the default device for newly created ops. |
| tf.name_scope(value) | A context manager for use when defining a Python op. |
| tf.convert_to_tensor(value) | Converts the given value to a Tensor. |
| ops.reset_default_graph() | Clears the default graph stack and resets the global default graph. |
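A minimal sketch of how these classes and functions fit together (this assumes the TF 1.x-style graph API, reachable in TF 2.x via `tf.compat.v1`; the tensor values are arbitrary):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()      # use graph mode, as in TF 1.x
tf.reset_default_graph()          # clear any previously built graph

g = tf.get_default_graph()
with tf.name_scope("inputs"):
    a = tf.convert_to_tensor([1.0, 2.0])
    b = tf.convert_to_tensor([3.0, 4.0])

c = a + b                         # adds an Operation whose output is a new Tensor
assert c.graph is g               # new ops are recorded in the default graph

with tf.Session() as sess:
    result = sess.run(c)          # evaluates to [4.0, 6.0]
```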
TensorFlow Optimizers

| Optimizer | Description |
| --- | --- |
| GradientDescentOptimizer | Optimizer that implements the gradient descent algorithm. |
| AdadeltaOptimizer | Optimizer that implements the Adadelta algorithm. |
| AdagradOptimizer | Optimizer that implements the Adagrad algorithm. |
| MomentumOptimizer | Optimizer that implements the Momentum algorithm. |
| AdamOptimizer | Optimizer that implements the Adam algorithm. |
| FtrlOptimizer | Optimizer that implements the FTRL algorithm. |
| RMSPropOptimizer | Optimizer that implements the RMSProp algorithm. |
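The update rules behind the first two optimizers above can be sketched in plain Python (an illustrative sketch with hypothetical helper names, not the TensorFlow API; the learning-rate and momentum values are arbitrary):

```python
def gradient_descent_step(w, grad, lr=0.1):
    # GradientDescentOptimizer: w <- w - lr * grad
    return w - lr * grad

def momentum_step(w, v, grad, lr=0.1, mu=0.9):
    # MomentumOptimizer: v <- mu * v + grad, then w <- w - lr * v
    v = mu * v + grad
    return w - lr * v, v

# Minimize f(w) = w^2 (gradient 2w) starting from w = 1.0
w = 1.0
for _ in range(50):
    w = gradient_descent_step(w, 2.0 * w)
# w has decayed toward the minimum at 0
```

The other optimizers listed follow the same pattern but keep additional per-variable state (e.g. Adam tracks running first and second moments of the gradient).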
Reduction

| Function | Description |
| --- | --- |
| reduce_sum | Computes the sum of elements across dimensions of a tensor. |
| reduce_prod | Computes the product of elements across dimensions of a tensor. |
| reduce_min | Computes the minimum of elements across dimensions of a tensor. |
| reduce_max | Computes the maximum of elements across dimensions of a tensor. |
| reduce_mean | Computes the mean of elements across dimensions of a tensor. |
| reduce_any | Computes the "logical or" of elements across dimensions of a tensor. |
| accumulate_n | Returns the element-wise sum of a list of tensors. |
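The semantics of these reductions mirror NumPy's axis-wise aggregations, which makes for a convenient sketch (NumPy stand-ins for illustration, not the TF calls themselves; the sample matrix is arbitrary):

```python
import numpy as np

x = np.array([[1, 2, 3],
              [4, 5, 6]])

total = x.sum()              # like reduce_sum over all dims -> 21
col_max = x.max(axis=0)      # like reduce_max along dim 0 -> [4 5 6]
row_mean = x.mean(axis=1)    # like reduce_mean along dim 1 -> [2. 5.]
any_pos = (x > 4).any(axis=0)  # like reduce_any -> [False True True]
```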
Activation functions

| Function | Description |
| --- | --- |
| tf.nn | Module of wrappers for primitive Neural Net (NN) operations. |
| relu | Computes rectified linear: max(features, 0). |
| relu6 | Computes rectified linear 6: min(max(features, 0), 6). |
| elu | Computes exponential linear: exp(features) - 1 if features < 0, features otherwise. |
| softplus | Computes softplus: log(exp(features) + 1). |
| softsign | Computes softsign: features / (abs(features) + 1). |
| dropout | Applies dropout to the input. |
| bias_add | Adds bias to value. |
| sigmoid | Computes sigmoid of x element-wise. |
| tanh | Computes hyperbolic tangent of x element-wise. |
| sigmoid_cross_entropy_with_logits | Computes sigmoid cross entropy given logits. |
| softmax | Computes softmax activations. |
| log_softmax | Computes log softmax activations. |
| softmax_cross_entropy_with_logits | Computes softmax cross entropy between logits and labels. |
| sparse_softmax_cross_entropy_with_logits | Computes sparse softmax cross entropy between logits and labels. |
| weighted_cross_entropy_with_logits | Computes a weighted cross entropy. |
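The elementwise formulas quoted above translate directly into NumPy, which can help when checking intuitions (a sketch of the math, not the tf.nn implementations; the sample inputs are arbitrary):

```python
import numpy as np

def relu(x):     return np.maximum(x, 0)                  # max(features, 0)
def relu6(x):    return np.minimum(np.maximum(x, 0), 6)   # min(max(features, 0), 6)
def softplus(x): return np.log(np.exp(x) + 1)             # log(exp(features) + 1)
def softsign(x): return x / (np.abs(x) + 1)               # features / (abs(features) + 1)
def sigmoid(x):  return 1 / (1 + np.exp(-x))

def softmax(x):
    e = np.exp(x - np.max(x))  # shift by the max for numerical stability
    return e / e.sum()

x = np.array([-2.0, 0.0, 8.0])
r = relu(x)     # [0. 0. 8.]
r6 = relu6(x)   # [0. 0. 6.]
```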