I want to add a skip connection between residual blocks in Keras. This is my current implementation, but it doesn't work because the tensor shapes differ.
The function looks like this:
def build_res_blocks(net, x_in, num_res_blocks, res_block, num_filters, res_block_expansion, kernel_size, scaling):
    net_next_in = net
    for i in range(num_res_blocks):
        net = res_block(net_next_in, num_filters, res_block_expansion, kernel_size, scaling)
        # net tensor shape: (None, None, 32)
        # x_in tensor shape: (None, None, 3)
        # Error here: net_next_in should have shape (None, None, 32) to be fed into the next layer
        net_next_in = Add()([net, x_in])
    return net
But I get:
ValueError: Operands could not be broadcast together with shapes (None, None, 32) (None, None, 3)
How can I add or merge these tensors into the correct shape (None, None, 32)? And if this is not the right approach, how would you achieve the intended result?
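One common way to make the shapes compatible is to project x_in to the same number of channels with a 1x1 convolution before the Add(), e.g. Add()([net, Conv2D(num_filters, 1, padding='same')(x_in)]). This is one option among several (concatenation is another), not necessarily the intended architecture. The NumPy sketch below, with hypothetical concrete shapes standing in for the (None, None, ...) dims, shows the channel-axis logic of that projection:

```python
import numpy as np

# Hypothetical concrete shapes matching the question's channel counts.
net = np.zeros((2, 8, 8, 32))   # res_block output: 32 channels
x_in = np.zeros((2, 8, 8, 3))   # network input: 3 channels

# Adding net + x_in directly fails: channel dims 32 vs 3 cannot broadcast.
# A 1x1 convolution projecting x_in to 32 channels fixes this; for a 1x1
# kernel it is just a matrix multiply over the channel axis.
w = np.zeros((3, 32))           # stand-in for a Conv2D(32, 1) kernel
x_proj = x_in @ w               # shape (2, 8, 8, 32)
out = net + x_proj              # shapes now match
```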
This is what res_block looks like:
def res_block(x_in, num_filters, expansion, kernel_size, scaling):
    x = Conv2D(num_filters * expansion, kernel_size, padding='same')(x_in)
    x = Activation('relu')(x)
    x = Conv2D(num_filters, kernel_size, padding='same')(x)
    x = Add()([x_in, x])
    return x
tensorflow.python.framework.errors_impl.InvalidArgumentError: Incompatible shapes: [16,46,46,8] vs. [16,48,48,8]
Any ideas? - atlas
Set padding='same' in the convolution and/or pooling layers inside res_block. - mrks
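The mismatch mrks points to can be checked by hand: with 'valid' padding a stride-1 k x k convolution shrinks each spatial dimension from n to n - k + 1, while padding='same' keeps it at n. A small sketch of that arithmetic (assuming a 3x3 kernel, which is consistent with the 48 -> 46 shrink in the error above):

```python
# Output spatial size of a stride-1 2D convolution for each padding mode.
def conv_out_size(n, k, padding):
    if padding == 'valid':
        return n - k + 1   # no zero-padding: the kernel must fit inside the input
    if padding == 'same':
        return n           # zero-padded so the output keeps the input size
    raise ValueError(padding)

# 48x48 input, 3x3 kernel: 'valid' gives 46x46, which is what triggers
# "Incompatible shapes: [16,46,46,8] vs. [16,48,48,8]" inside Add().
valid_out = conv_out_size(48, 3, 'valid')  # 46
same_out = conv_out_size(48, 3, 'same')    # 48
```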