Adding regularization to skflow


I recently moved from tensorflow to skflow. In tensorflow, we would add lambda*tf.nn.l2_loss(weights) to the loss. Now I have the following code in skflow:

def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)
    return preds, loss

def exp_decay(global_step):
    return tf.train.exponential_decay(learning_rate=0.01,
                                      global_step=global_step,
                                      decay_steps=1000,
                                      decay_rate=0.005)

deep_cd = skflow.TensorFlowEstimator(model_fn=deep_psi,
                                    n_classes=2,
                                    steps=10000,
                                    batch_size=10,
                                    learning_rate=exp_decay,
                                    verbose=True,)
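
As an aside on the schedule above: tf.train.exponential_decay follows the documented formula learning_rate * decay_rate ** (global_step / decay_steps). A plain-Python sketch of that formula (ignoring the staircase option), with the parameters used here:

```python
# Plain-Python sketch of tf.train.exponential_decay's documented formula:
# decayed = learning_rate * decay_rate ** (global_step / decay_steps)
def exponential_decay(learning_rate, global_step, decay_steps, decay_rate):
    return learning_rate * decay_rate ** (global_step / float(decay_steps))

# With the parameters above, the rate after exactly 1000 steps:
print(exponential_decay(0.01, 1000, 1000, 0.005))  # 0.01 * 0.005 = 5e-05
```

With decay_rate=0.005 the learning rate drops by a factor of 200 every 1000 steps, which is a very aggressive schedule.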

Where should I add the regularizer? Illia gave some hints here, but I couldn't work them out.

1 Answer

You can still add extra components to the loss; just retrieve the weights from dnn / logistic_regression and add them to the loss:

def regularize_loss(loss, weights, lam):
    # `lambda` is a reserved word in Python, so the scale is named `lam` here
    for weight in weights:
        loss = loss + lam * tf.nn.l2_loss(weight)
    return loss


def deep_psi(X, y):
    layers = skflow.ops.dnn(X, [5, 10, 20, 10, 5], keep_prob=0.5)
    preds, loss = skflow.models.logistic_regression(layers, y)

    weights = []
    for layer in range(5): # n layers you passed to dnn
        weights.append(tf.get_variable("dnn/layer%d/linear/Matrix" % layer))
        # biases are also available at dnn/layer%d/linear/Bias
    weights.append(tf.get_variable('logistic_regression/weights'))

    return preds, regularize_loss(loss, weights, lam)  # lam: your chosen regularization strength
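
Since tf.nn.l2_loss(t) is defined as sum(t ** 2) / 2, regularize_loss simply adds a scaled half-sum-of-squares penalty per weight matrix to the base loss. A plain-Python numeric sketch of that computation, using lists in place of tensors:

```python
# Numeric sketch of what regularize_loss computes, assuming
# tf.nn.l2_loss semantics: l2_loss(t) = sum(t**2) / 2.
def l2_loss(weights):
    return sum(w * w for w in weights) / 2.0

def regularize_loss(loss, weight_tensors, lam):
    # Add lam * l2_loss(w) for each weight tensor to the base loss.
    for w in weight_tensors:
        loss = loss + lam * l2_loss(w)
    return loss

base_loss = 1.0
weights = [[1.0, 2.0], [3.0]]
print(regularize_loss(base_loss, weights, 0.1))
# 1.0 + 0.1*(1+4)/2 + 0.1*9/2 = 1.7
```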

Note that the variable paths can be found here.

Additionally, we want to add regularization support for all layers that have variables (such as dnn, conv2d, fully_connected), so next week's Tensorflow nightly builds may have something like this: dnn(..., regularize=tf.contrib.layers.l2_regularizer(lambda)). I'll update this answer when that happens.
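
A regularizer factory in the style of tf.contrib.layers.l2_regularizer(scale) is essentially a closure: it captures the scale and returns a function that maps a weight tensor to a penalty term, which the layer then adds to the loss. A plain-Python sketch of that pattern (the numbers are illustrative):

```python
# Sketch of the regularizer-factory pattern: l2_regularizer(scale) returns
# a function that computes scale * sum(w**2) / 2 for a given weight tensor.
def l2_regularizer(scale):
    def penalty(weights):
        return scale * sum(w * w for w in weights) / 2.0
    return penalty

reg = l2_regularizer(0.01)
print(reg([3.0, 4.0]))  # 0.01 * (9 + 16) / 2 = 0.125
```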


dnn(.., regularize=tf.contrib.layers.l2_regularizer(lambda)) would be loved. - maininformer
