Layers
While the set of TensorFlow operations is quite extensive, developers
of neural networks typically think of models in terms of higher-level
concepts like "layers", "losses", "metrics", and "networks". A layer,
such as a Convolutional Layer, a Fully Connected Layer, or a BatchNorm
Layer, is more abstract than a single TensorFlow operation and
typically involves several operations. Furthermore, a layer usually
(but not always) has variables (tunable parameters) associated with
it, unlike more primitive operations. For example, a Convolutional
Layer in a neural network is composed of several low-level operations:
- Creating the weight and bias variables.
- Convolving the weights with the input from the previous layer.
- Adding the biases to the result of the convolution.
- Applying an activation function.
Using only plain TensorFlow code, this can be rather laborious:
input = ...
with tf.name_scope('conv1_1') as scope:
    kernel = tf.Variable(tf.truncated_normal([3, 3, 64, 128], dtype=tf.float32,
                                             stddev=1e-1), name='weights')
    conv = tf.nn.conv2d(input, kernel, [1, 1, 1, 1], padding='SAME')
    biases = tf.Variable(tf.constant(0.0, shape=[128], dtype=tf.float32),
                         trainable=True, name='biases')
    bias = tf.nn.bias_add(conv, biases)
    conv1 = tf.nn.relu(bias, name=scope)
To alleviate the need to duplicate this code repeatedly, TF-Slim
provides a number of convenient operations defined at the more
abstract level of neural network layers. For example, compare the code
above to an invocation of the corresponding TF-Slim code:
input = ...
net = slim.conv2d(input, 128, [3, 3], scope='conv1_1')
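To see what such a layer operation bundles together, here is a minimal, hypothetical numpy sketch of the same pattern for a fully connected layer. This is an illustration only, not TF-Slim's implementation: one call creates the weight and bias variables, applies the linear transform, and runs the activation, just as slim.conv2d does for the four convolutional steps listed above.

```python
import numpy as np

def fully_connected(inputs, num_outputs, activation=np.tanh, rng=None):
    """Sketch of a 'layer' abstraction: variable creation, the core op,
    bias addition, and the activation are bundled into one call."""
    rng = rng or np.random.default_rng(0)
    num_inputs = inputs.shape[-1]
    # Variable creation (weights and biases), normally done once per layer.
    weights = rng.normal(scale=0.1, size=(num_inputs, num_outputs))
    biases = np.zeros(num_outputs)
    # Core op + bias add + activation, analogous to conv2d/bias_add/relu.
    return activation(inputs @ weights + biases)

x = np.ones((4, 16))
net = fully_connected(x, 32)
print(net.shape)  # (4, 32)
```

The point is the same as in the slim.conv2d call: the caller specifies only the input and the layer's shape, and the layer operation takes care of the lower-level details.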
These operators give you some clever abstractions so you don't have to worry about all of TensorFlow's details, which is a nice addition if you ask me. However, this still seems to be under active development, so I would recommend reading up on it a bit more before using it heavily in (future) development.