Attention on Conv1D?

My supervisor asked me to implement an Attention layer for a CNN (applied to text), but I am fairly sure this doesn't work with a Conv1D layer.

Using Keras, I have a pretty straightforward model: an embedding layer followed by a convolutional layer.

Since the input to Conv1D is (#documents, words, embedding_size), and it is followed by a MaxPooling layer, I considered dropping the max-pooling and inserting Attention at that point, but I really don't know what my query and value inputs would be in this case.

I know Keras has a built-in Attention layer (tf.keras.layers.Attention), but can it be applied here? Or do I need some kind of self-attention?
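
To make the query/value question concrete, here is a minimal, self-contained sketch of how that layer is called (the self-attention reading is my assumption, and the shapes are made up):

import tensorflow as tf

# Toy tensor standing in for the Conv1D output: (batch, steps, channels).
conv_out = tf.random.normal((4, 128, 2))

# tf.keras.layers.Attention takes [query, value] (and optionally a key);
# passing the same tensor twice gives dot-product self-attention.
# return_attention_scores needs a reasonably recent TensorFlow (>= 2.4).
context, scores = tf.keras.layers.Attention()(
    [conv_out, conv_out], return_attention_scores=True)

print(context.shape)  # (4, 128, 2)   -- attention-weighted sequence
print(scores.shape)   # (4, 128, 128) -- per-position attention weights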

What I need, in the end, is to see which input (word) is most responsible for the resulting classification. My current model:

from tensorflow.keras.layers import (Input, Conv1D, MaxPooling1D,
                                     Dropout, Flatten, Dense)
from tensorflow.keras.models import Model
from tensorflow.keras import regularizers
from tensorflow.keras.metrics import (TruePositives, TrueNegatives,
                                      FalseNegatives, FalsePositives)

# embedding_layer and MAX_SEQUENCE_LENGTH are defined earlier in the script.
sequence_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
embedded_sequences = embedding_layer(sequence_input)

# (batch, MAX_SEQUENCE_LENGTH, embedding_size) -> (batch, MAX_SEQUENCE_LENGTH, 2)
conv1 = Conv1D(filters=2, kernel_size=2, padding='same')(embedded_sequences)
conv1 = MaxPooling1D(pool_size=32)(conv1)
conv1 = Dropout(0.2)(conv1)

# Flatten so the Dense head emits one prediction per document
# (without it, Dense(1) would produce one output per remaining time step).
x = Flatten()(conv1)
x = Dense(50, activation='relu',
          kernel_regularizer=regularizers.l2(0.01),
          bias_regularizer=regularizers.l2(0.01))(x)
x = Dropout(0.3)(x)

preds = Dense(1, activation='sigmoid', name='output')(x)
model = Model(sequence_input, preds)

model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=[TruePositives(name='true_positives'),
                       TrueNegatives(name='true_negatives'),
                       FalseNegatives(name='false_negatives'),
                       FalsePositives(name='false_positives')])
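
For completeness, this is the kind of modification I have in mind: purely a sketch under the self-attention assumption above, not a confirmed recipe (attn_out, attn_scores and score_model are names I made up). The attention scores would be what I inspect to see which words drive the prediction:

from tensorflow.keras.layers import Attention, GlobalAveragePooling1D

# Reuses embedding_layer, MAX_SEQUENCE_LENGTH and the imports from above.
sequence_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')
embedded_sequences = embedding_layer(sequence_input)
conv1 = Conv1D(filters=2, kernel_size=2, padding='same')(embedded_sequences)

# Self-attention in place of the max-pool: query = value = conv output.
attn_out, attn_scores = Attention()([conv1, conv1],
                                    return_attention_scores=True)

# Pool over time so the Dense head sees one vector per document.
x = GlobalAveragePooling1D()(attn_out)
x = Dense(50, activation='relu')(x)
x = Dropout(0.3)(x)
preds = Dense(1, activation='sigmoid', name='output')(x)

model = Model(sequence_input, preds)
# A second model sharing the same weights exposes the attention map,
# so after training I could look at which words get the most weight.
score_model = Model(sequence_input, attn_scores)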