Keras — "inconsistent numbers of samples" error when generating predictions for the test inputs with a test generator?

As the title describes, when I generate predictions from my CNN model for the test data, which is loaded through a custom data generator (namely, test_gen), I get the error: ValueError: Found input variables with inconsistent numbers of samples: [1697, 1536].

Here are the data shapes for each subset:

Training sets:
X1_train.shape: (7919, 39)
X2_train.shape: (7919,)
X3_train.shape: (7919, 2)
y_train.shape: (7919,)

Validation sets:
X1_valid.shape: (1698, 39)
X2_valid.shape: (1698,)
X3_valid.shape: (1698, 2)
y_valid.shape: (1698,)

Test sets:
X1_test.shape: (1697, 39)
X2_test.shape: (1697,)
X3_test.shape: (1697, 2)
y_test.shape: (1697,)

The data generators are constructed as follows:

train_gen = TripleInputGenerator(X1_train, X2_train, X3_train, y_train, COUGHVID_DIR, batch_size=n_batch_size,
                                 target_size=(64, 64), shuffle=True)
val_gen = TripleInputGenerator(X1_valid, X2_valid, X3_valid, y_valid, COUGHVID_DIR, batch_size=n_batch_size,
                               target_size=(64, 64), shuffle=True)
test_gen = TripleInputGenerator(X1_test, X2_test, X3_test, y_test, COUGHVID_DIR, batch_size=n_batch_size,
                                target_size=(64, 64))
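For context, a minimal sketch of what such a three-input batch generator can look like (this is a hypothetical class, not the actual TripleInputGenerator; the image-loading path via COUGHVID_DIR and target_size is omitted). The key detail is `__len__`: with ceiling division it covers the final partial batch, whereas floor division would silently drop it.

```python
import math
import numpy as np

class TripleInputSequence:
    """Hypothetical sketch of a three-input batch generator.
    The real TripleInputGenerator also loads spectrogram images;
    that part is left out here for brevity."""

    def __init__(self, X1, X2, X3, y, batch_size=32, shuffle=False):
        self.X1, self.X2, self.X3, self.y = X1, X2, X3, y
        self.batch_size = batch_size
        self.indices = np.arange(len(y))
        if shuffle:
            np.random.shuffle(self.indices)

    def __len__(self):
        # Ceiling division: the last partial batch is included.
        # A floor division here (len(self.y) // self.batch_size)
        # would make predict() return fewer samples than y_test has.
        return math.ceil(len(self.y) / self.batch_size)

    def __getitem__(self, idx):
        ix = self.indices[idx * self.batch_size:(idx + 1) * self.batch_size]
        return (self.X1[ix], self.X2[ix], self.X3[ix]), self.y[ix]
```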

The model (based on open-source code):

# First Model
inp1 = Input(shape=(39,))
lay1 = Dense(units=512, activation='relu', kernel_initializer='glorot_uniform')(inp1)
lay2 = Dropout(0.4)(lay1)
lay3 = Dense(units=256, activation='relu', kernel_initializer='glorot_uniform')(lay2)
lay4 = Dropout(0.2)(lay3)

# Second Model
inp2 = Input(shape=(64, 64, 3))
lay1_ = Conv2D(32, (3, 3), strides=(2, 2))(inp2)
lay2_ = AveragePooling2D((2, 2), strides=(2, 2))(lay1_)
lay3_ = BatchNormalization()(lay2_)
lay4_ = Activation('relu')(lay3_)

lay5_ = Conv2D(64, (3, 3), padding="same")(lay4_)
lay6_ = AveragePooling2D((2, 2), strides=(2, 2))(lay5_)
lay7_ = BatchNormalization()(lay6_)
lay8_ = Activation('relu')(lay7_)

lay9_ = Conv2D(64, (3, 3), padding="same")(lay8_)
lay10_ = AveragePooling2D((2, 2), strides=(2, 2))(lay9_)
lay11_ = BatchNormalization()(lay10_)
lay12_ = Activation('relu')(lay11_)

lay13_ = Flatten()(lay12_)
lay14_ = Dense(units=256, activation='relu', kernel_initializer='glorot_uniform')(lay13_)
lay15_ = Dropout(rate=0.5)(lay14_)

# Third model
inp3 = Input(shape=(2,))
lay31 = Dense(units=16, activation='relu', kernel_initializer='glorot_uniform')(inp3)
lay32 = Dropout(0.4)(lay31)
lay33 = Dense(units=64, activation='relu', kernel_initializer='glorot_uniform')(lay32)
lay43 = Dropout(0.2)(lay33)

# merge input models
merge = concatenate([lay15_, lay4, lay43])

# interpretation model
hidden1 = Dense(64, activation='relu')(merge)
hidden2 = Dense(64, activation='relu')(hidden1)
output = Dense(1, activation='sigmoid')(hidden2)
ensemble_model = Model(inputs=[inp1, inp2, inp3], outputs=output)

return ensemble_model

Finally, I generate the network's predictions as follows:

y_pred = (ensemble_model.predict(test_gen) > 0.5).astype("int32")
y_pred = y_pred.reshape(-1)

The issue is that y_pred contains 1,536 predictions even though the test subset has 1,697 samples. What could be the reason behind this mismatch? Your help is really appreciated.
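A quick check of the batch arithmetic (using 256 as a purely illustrative batch size, since the value of n_batch_size is not shown above) shows how a floor-based batch count would drop the final partial batch:

```python
import math

n_test = 1697     # number of test samples
batch_size = 256  # illustrative value only; n_batch_size is not shown in the post

# Floor division (a common __len__ implementation) yields only full batches,
# so the remainder (n_test % batch_size samples) never reaches predict().
full_batches = n_test // batch_size
print(full_batches * batch_size)   # number of samples actually predicted

# Ceiling division includes the last partial batch and covers every sample.
all_batches = math.ceil(n_test / batch_size)
print(all_batches)
```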

P.S. Please feel free to ask for any further information about the network.