Keras activation layer does not work properly

First, I built the following model:

from tensorflow.keras.layers import (Dense, Flatten, Conv2D, Dropout, BatchNormalization,
                                     AveragePooling2D, ReLU, Activation)
from tensorflow.keras import Model

class MyModel(Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.conv = Conv2D(4, (3, 3), padding='same', activation='linear',
                           input_shape=x_train.shape[1:])
        self.bn = BatchNormalization()
        self.RL = ReLU()
        self.FL = Flatten()
        self.d1 = Dense(4, activation = 'relu')
        self.d2 = Dense(100, activation = 'softmax')
    def call(self, x):
        x = self.conv(x)
        x = self.bn(x)
        x = self.RL(x)
        x = self.FL(x)
        x = self.d1(x)
        return self.d2(x)
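
For reference, the model can be trained on CIFAR100 with a standard setup like the one below (a minimal sketch; the Adam optimizer, sparse categorical cross-entropy loss, and epoch count are assumptions for a plain classification setup and were not part of the original snippet):

import tensorflow as tf
from tensorflow.keras.datasets import cifar100

# Load CIFAR100 and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = cifar100.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = MyModel()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',  # integer class labels 0..99
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))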

However, this model does not work well. The accuracy stays at about 1%, which means it learns nothing. (I trained it on CIFAR100; the model is kept simple just to check the code.) But when I changed the code as follows, it worked.

class MyModel(Model):
    def __init__(self):
        super(MyModel, self).__init__()
        self.conv = Conv2D(4, (3, 3), padding='same', activation='linear',
                           input_shape=x_train.shape[1:])
        self.bn = BatchNormalization()

        # The only change: ReLU() -> Activation('relu')
        self.RL = Activation('relu')

        self.FL = Flatten()
        self.d1 = Dense(4, activation = 'relu')
        self.d2 = Dense(100, activation = 'softmax')
    def call(self, x):
        x = self.conv(x)
        x = self.bn(x)
        x = self.RL(x)
        x = self.FL(x)
        x = self.d1(x)
        return self.d2(x)
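
What puzzles me is that, with default arguments, ReLU() and Activation('relu') appear to compute the same element-wise max(x, 0), so checking them in isolation (a standalone snippet; the random input is only for illustration) prints identical values:

import numpy as np
import tensorflow as tf

x = tf.constant(np.random.randn(2, 3).astype('float32'))
print(tf.keras.layers.ReLU()(x))              # element-wise max(x, 0)
print(tf.keras.layers.Activation('relu')(x))  # same values as above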

Why does this happen? I cannot figure out the cause. Thank you for reading.