2x nested Tensorflow custom layers results in zero trainable parameters

I'm creating a series of custom Tensorflow (version 2.4.1) layers and am running into an issue where the model summary shows zero trainable parameters. Below is a series of examples showing that everything is fine until I add the last custom layer.

Here are the imports and custom classes:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import (BatchNormalization, Conv2D, Input, ReLU, 
                                     Layer)

class basic_conv_stack(Layer):
    def __init__(self, filters, kernel_size, strides):
        super(basic_conv_stack, self).__init__()
        self.conv1 = Conv2D(filters, kernel_size, strides, padding='same')
        self.bn1 = BatchNormalization()
        self.relu = ReLU()

    def call(self, x):
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)
        return x
    
class basic_residual(Layer):
    def __init__(self, filters, kernel_size, strides):
        super(basic_residual, self).__init__()
        self.bcs1 = basic_conv_stack(filters, kernel_size, strides)
        self.bcs2 = basic_conv_stack(filters, kernel_size, strides)

    def call(self, x):
        x = self.bcs1(x)
        x = self.bcs2(x)
        return x
    
class basic_module(Layer):
    def __init__(self, filters, kernel_size, strides):
        super(basic_module, self).__init__()
        self.res = basic_residual
        self.args = (filters, kernel_size, strides)
    
    def call(self, x):
        for _ in range(4):
            x = self.res(*self.args)(x)
        return x

Now, if I do the following, everything works fine and I get 300 trainable parameters:

input_layer = Input((128, 128, 3))
conv = basic_conv_stack(10, 3, 1)(input_layer)

model = Model(input_layer, conv)
print (model.summary())

Likewise, if I do the following, I get 1,230 trainable parameters:

input_layer = Input((128, 128, 3))
conv = basic_residual(10, 3, 1)(input_layer)

model = Model(input_layer, conv)
print (model.summary())

However, if I try the basic_module class, I get zero trainable parameters:

input_layer = Input((128, 128, 3))
conv = basic_module(10, 3, 1)(input_layer)

model = Model(input_layer, conv)
print (model.summary())

Does anyone know why this is happening?

Edit to add:

I've found that the layers used in call have to be initialized in the class's __init__ for things to work properly. So if I change basic_module to:

class basic_module(Layer):
    def __init__(self, filters, kernel_size, strides):
        super(basic_module, self).__init__()
        self.clayers = [basic_residual(filters, kernel_size, strides) for _ in range(4)]

    def call(self, x):
        for idx in range(4):
            x = self.clayers[idx](x)
        return x

everything works fine. I don't know why that is, so I'll leave this question open in case someone can explain the reason behind it.

You have to initialize an instance of the class with the required parameters, e.g. filters, kernel_size, strides, to predefine basic_module. In your original code, self.res = basic_residual stores the class itself rather than an instance, so brand-new sublayers are constructed on every forward pass inside call and Keras never tracks their weights. Also, note that these hyperparameters are tied to the trainable weight attributes.

# >>> a = basic_module
# >>> a
# <class '__main__.basic_module'>
# >>> a = basic_module(10, 3, 1)
# >>> a
# <__main__.basic_module at 0x7f6123eed510>
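The snippet above is the class-versus-instance distinction at the heart of the problem. A minimal pure-Python sketch (this basic_module is a simplified stand-in, not the real Keras layer):

```python
class basic_module:  # simplified stand-in, not the real Keras layer
    def __init__(self, filters, kernel_size, strides):
        self.args = (filters, kernel_size, strides)

a = basic_module             # just a reference to the class object
b = basic_module(10, 3, 1)   # an actual instance, arguments bound

print(a is basic_module)            # True
print(isinstance(b, basic_module))  # True
# Calling the class again builds a brand-new object each time:
print(a(10, 3, 1) is a(10, 3, 1))   # False
```

So a loop that does self.res(*self.args)(x) constructs a fresh object, with fresh state, on every iteration.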

class basic_module(Layer):
    def __init__(self, filters, kernel_size, strides):
        super(basic_module, self).__init__()
        self.res = basic_residual  # <--- stores the class, not an instance
        self.args = (filters, kernel_size, strides)
    
    def call(self, x):
        for _ in range(4):
            x = self.res(*self.args)(x)
        return x
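As for why initializing in __init__ matters: Keras only tracks sublayers that are assigned as attributes on a Layer (typically in __init__); objects constructed inside call are never assigned to an attribute, so their weights are invisible to the model summary and are re-created on every forward pass. A toy sketch of that tracking idea in plain Python (not Keras's actual implementation):

```python
class Layer:
    """Toy stand-in for tf.keras.layers.Layer: tracks sublayers
    assigned as attributes, roughly the way Keras's __setattr__ does."""
    def __setattr__(self, name, value):
        if isinstance(value, Layer):
            # Register sublayer instances so their weights can be collected.
            self.__dict__.setdefault("_sublayers", []).append(value)
        object.__setattr__(self, name, value)

    @property
    def trainable_weights(self):
        weights = list(self.__dict__.get("_weights", []))
        for sub in self.__dict__.get("_sublayers", []):
            weights += sub.trainable_weights
        return weights

class Dense(Layer):
    def __init__(self):
        self._weights = ["kernel", "bias"]

class Tracked(Layer):
    def __init__(self):
        self.d = Dense()      # instance assigned in __init__ -> tracked

class Untracked(Layer):
    def __init__(self):
        self.d_cls = Dense    # only the class is stored -> nothing tracked

    def call(self, x):
        return self.d_cls()   # fresh, untracked instance on every call

print(len(Tracked().trainable_weights))    # 2
print(len(Untracked().trainable_weights))  # 0
```

This mirrors your results: basic_residual instances assigned in __init__ contribute their parameters, while the class stored in self.res contributes none.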