Imagine a fully connected neural network whose last two layers look like this:
[Dense]
units = 612
activation = softplus
[Dense]
units = 1
activation = sigmoid
The network's output is 1, but I'd like to know what the input x to the sigmoid was (it must be a large number, since sigm(x) equals 1 here).
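As a side note, when the printed output lies strictly between 0 and 1, x can be recovered analytically with the logit (the inverse of the sigmoid); an output printed as exactly 1.0 usually means the sigmoid has saturated past floating-point resolution, in which case the inversion blows up. A minimal NumPy sketch:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def logit(y):
    """Inverse of the sigmoid; only defined for 0 < y < 1."""
    return np.log(y / (1.0 - y))

y = sigmoid(11.7)
print(y)             # very close to, but not exactly, 1
print(logit(y))      # recovers ~11.7
print(sigmoid(40.0)) # rounds to exactly 1.0 in float64, so logit would be inf
```

So a printed `array([[ 1.]])` alone does not tell you what x was, only that it is large.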
Following indraforyou's answer, I managed to retrieve the outputs and weights of the Keras layers:
import numpy as np
from keras import backend as K

# one backend function per output of the last two layers
outputs = [layer.output for layer in model.layers[-2:]]
functors = [K.function([model.input] + [K.learning_phase()], [out]) for out in outputs]

test_input = np.array(...)
layer_outs = [func([test_input, 0.]) for func in functors]  # 0. = test phase
print(layer_outs[-1][0])  # -> array([[ 1.]])

dense_0_out = layer_outs[-2][0]                            # shape (612, 1)
dense_1_weights = model.layers[-1].weights[0].get_value()  # shape (1, 612)
dense_1_bias = model.layers[-1].weights[1].get_value()

x = np.dot(dense_0_out, dense_1_weights) + dense_1_bias
print(x)  # -> -11.7
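For comparison, here is how the pre-activation of a final Dense(1) layer is computed in plain NumPy. The shapes below are small hypothetical stand-ins (4 hidden units instead of 612), but the orientation follows the Keras convention: a Dense kernel has shape (input_dim, units), and the previous layer's activations for one sample have shape (1, input_dim). If either array is transposed, np.dot silently computes something else entirely:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical stand-ins for the real tensors (4 units instead of 612)
prev_act = rng.standard_normal((1, 4))  # previous layer output, shape (1, input_dim)
kernel = rng.standard_normal((4, 1))    # Dense kernel, shape (input_dim, units)
bias = rng.standard_normal((1,))

x = prev_act @ kernel + bias            # pre-activation, shape (1, 1)
y = 1.0 / (1.0 + np.exp(-x))            # what the sigmoid layer would output
print(x, y)

# transposing both arrays yields a (4, 4) outer product, not the pre-activation
wrong = prev_act.T @ kernel.T
print(wrong.shape)  # (4, 4)
```

Checking the shapes of the two arrays before the dot product is the quickest way to see whether you really computed a single scalar pre-activation.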
How can x be negative? In that case, the output of the last layer should be a number closer to 0.0 than to 1.0. Is something wrong with the outputs in dense_0_out or the weights in dense_1_weights?
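A quick numeric check of the contradiction: if the pre-activation really were -11.7, the sigmoid output would be on the order of 1e-5, nowhere near the observed 1.0:

```python
import numpy as np

x = -11.7
y = 1.0 / (1.0 + np.exp(-x))
print(y)  # roughly 8.3e-06, i.e. essentially 0
```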
Comments:
- Did you check x = np.dot(dense_0_out, dense_1_weights) + dense_1_bias? – Marcin Możejko
- Yet layer_outs[-1] shows 1... – johk95
- dense_1_weights.shape = (1, 612) and dense_0_out.shape = (612, 1). To be sure, you can do x = numpy.sum(dense_1_weights.flatten() * dense_0_out.flatten()), which gives the same result. – johk95
- Could you post model.summary()? – Marcin Możejko
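The flatten-and-sum check suggested in the comments works because, with a single output unit, the matrix product collapses to an element-wise product summed over the 612 terms. A tiny sketch with made-up 6-element arrays in place of the real ones:

```python
import numpy as np

w = np.arange(6.0).reshape(1, 6)  # stand-in for dense_1_weights, shape (1, 6)
a = np.arange(6.0).reshape(6, 1)  # stand-in for dense_0_out, shape (6, 1)

dot_style = np.dot(w, a).item()                 # (1,6) @ (6,1) -> single value
flat_style = np.sum(w.flatten() * a.flatten())  # element-wise product, summed
print(dot_style, flat_style)  # both 55.0
```

Note this equivalence ignores the bias term, so add dense_1_bias afterwards when comparing against the full pre-activation.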