TensorFlow Functions

Main classes

tf.Graph() - A TensorFlow computation, represented as a dataflow graph.
tf.Operation() - Represents a graph node that performs computation on tensors.
tf.Tensor() - Represents a multidimensional array of data, typically produced as the output of an Operation.
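
A minimal sketch of how these three classes fit together, assuming the TensorFlow 1.x graph-mode API (tf.Session): a Graph holds Operations, and each Operation produces Tensors.

import tensorflow as tf  # TensorFlow 1.x API

g = tf.Graph()
with g.as_default():
    a = tf.constant(3.0, name="a")    # a and b are tf.Tensor objects
    b = tf.constant(4.0, name="b")
    c = tf.add(a, b, name="c")        # creates an "Add" tf.Operation named "c"

print(c.op.name)                      # "c": the Operation producing this Tensor
print([op.name for op in g.get_operations()])   # ["a", "b", "c"]

with tf.Session(graph=g) as sess:
    print(sess.run(c))                # 7.0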

Some useful functions

tf.get_default_graph() - Returns the default graph for the current thread.
tf.reset_default_graph() - Clears the default graph stack and resets the global default graph.
tf.device("/cpu:0") - A context manager that sets the default device for newly created ops.
tf.name_scope(value) - A context manager for use when defining a Python op.
tf.convert_to_tensor(value) - Converts the given value to a Tensor.
ops.reset_default_graph() - Clears the default graph stack and resets the global default graph (the same function, exposed through the internal ops module).
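
A short sketch combining these helpers, again assuming TensorFlow 1.x graph mode; the scope and tensor values are illustrative only.

import tensorflow as tf  # TensorFlow 1.x API

tf.reset_default_graph()                      # clear the default graph stack

with tf.device("/cpu:0"):                     # pin newly created ops to the CPU
    with tf.name_scope("preprocess"):         # prefix op names with "preprocess/"
        x = tf.convert_to_tensor([1.0, 2.0, 3.0])   # Python list -> tf.Tensor
        y = tf.square(x)

print(y.name)                                 # e.g. "preprocess/Square:0"
print(tf.get_default_graph() is y.graph)      # True: y lives in the default graph

with tf.Session() as sess:
    print(sess.run(y))                        # [1. 4. 9.]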

TensorFlow Optimizers

GradientDescentOptimizer - Optimizer that implements the gradient descent algorithm.
AdadeltaOptimizer - Optimizer that implements the Adadelta algorithm.
AdagradOptimizer - Optimizer that implements the Adagrad algorithm.
MomentumOptimizer - Optimizer that implements the Momentum algorithm.
AdamOptimizer - Optimizer that implements the Adam algorithm.
FtrlOptimizer - Optimizer that implements the FTRL algorithm.
RMSPropOptimizer - Optimizer that implements the RMSProp algorithm.
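
These optimizers live under tf.train in TensorFlow 1.x. A toy sketch fitting a single variable with GradientDescentOptimizer; the problem and learning rate are made up for illustration, and any optimizer above could be swapped in with its own hyperparameters.

import tensorflow as tf  # TensorFlow 1.x API

# Toy problem: find w such that 3 * w is close to 6.
w = tf.Variable(0.0)
loss = tf.square(3.0 * w - 6.0)

# e.g. tf.train.AdamOptimizer(0.1) or tf.train.RMSPropOptimizer(0.1) also work here.
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train_op)
    print(sess.run(w))    # approaches 2.0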

Reduction

reduce_sum - Computes the sum of elements across dimensions of a tensor.
reduce_prod - Computes the product of elements across dimensions of a tensor.
reduce_min - Computes the minimum of elements across dimensions of a tensor.
reduce_max - Computes the maximum of elements across dimensions of a tensor.
reduce_mean - Computes the mean of elements across dimensions of a tensor.
reduce_any - Computes the "logical or" of elements across dimensions of a tensor.
accumulate_n - Returns the element-wise sum of a list of tensors.
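
A brief sketch of the reduction ops on a 2x3 matrix, showing whole-tensor and per-axis reductions (TensorFlow 1.x graph mode assumed; the values are illustrative).

import tensorflow as tf  # TensorFlow 1.x API

x = tf.constant([[1.0, 2.0, 3.0],
                 [4.0, 5.0, 6.0]])

total = tf.reduce_sum(x)               # 21.0 (all elements)
col_sums = tf.reduce_sum(x, axis=0)    # [5. 7. 9.]
row_means = tf.reduce_mean(x, axis=1)  # [2. 5.]
largest = tf.reduce_max(x)             # 6.0
tripled = tf.accumulate_n([x, x, x])   # element-wise sum of a list of tensors

with tf.Session() as sess:
    print(sess.run([total, col_sums, row_means, largest]))
    print(sess.run(tripled))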

Activation functions

tf.nn - Wrappers for primitive Neural Net (NN) Operations.
relu - Computes rectified linear: max(features, 0).
relu6 - Computes Rectified Linear 6: min(max(features, 0), 6).
elu - Computes exponential linear: exp(features) - 1 if < 0, features otherwise.
softplus - Computes softplus: log(exp(features) + 1).
softsign - Computes softsign: features / (abs(features) + 1).
dropout - Applies Dropout to the input.
bias_add - Adds bias to value.
sigmoid - Computes sigmoid of x element-wise.
tanh - Computes hyperbolic tangent of x element-wise.
sigmoid_cross_entropy_with_logits - Computes sigmoid cross entropy given logits.
softmax - Computes softmax activations.
log_softmax - Computes log softmax activations.
softmax_cross_entropy_with_logits - Computes softmax cross entropy between logits and labels.
sparse_softmax_cross_entropy_with_logits - Computes sparse softmax cross entropy between logits and labels.
weighted_cross_entropy_with_logits - Computes a weighted cross entropy.
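
All of these live under tf.nn. A small sketch, assuming TensorFlow 1.x graph mode, contrasting a plain activation, softmax, and the fused softmax cross-entropy op; the logits and labels are made-up example values.

import tensorflow as tf  # TensorFlow 1.x API

x = tf.constant([-1.0, 0.5, 3.0])
logits = tf.constant([[2.0, 1.0, 0.1]])
labels = tf.constant([[1.0, 0.0, 0.0]])       # one-hot target for one example

activated = tf.nn.relu(x)                     # [0. 0.5 3.]
probs = tf.nn.softmax(logits)                 # each row sums to 1
# Fused op: applies softmax to the logits and computes cross entropy in one
# numerically stable step, so raw logits (not probabilities) are passed in.
loss = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run(activated))
    print(sess.run(probs))
    print(sess.run(loss))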
