- How self-attention works (a minimal NumPy sketch of the score computation follows the Keras demo below)
- Using self-attention in Keras
- Installation

```bash
pip install keras-self-attention
```
- Usage demo
```python
import keras
from keras_self_attention import SeqSelfAttention

model = keras.models.Sequential()
# mask_zero=True propagates a padding mask so attention ignores padded steps
model.add(keras.layers.Embedding(input_dim=10000,
                                 output_dim=300,
                                 mask_zero=True))
# return_sequences=True keeps per-timestep outputs for the attention layer
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=128,
                                                       return_sequences=True)))
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(keras.layers.Dense(units=5))
model.compile(
    optimizer='adam',
    loss='categorical_crossentropy',
    metrics=['categorical_accuracy'],
)
model.summary()
```
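A quick way to sanity-check the demo is to run a prediction on random data; the batch size and sequence length below (32 and 20) are arbitrary values chosen for illustration, and `model` refers to the model built above:

```python
import numpy as np

# Random token ids in [1, 10000); id 0 is reserved for padding (mask_zero=True)
x_batch = np.random.randint(1, 10000, size=(32, 20))

y_pred = model.predict(x_batch)
print(y_pred.shape)  # (32, 20, 5): the Dense layer is applied at every timestep
```

Note that every layer in the stack returns a sequence, so the model produces one 5-way output per timestep; for training, the targets would need shape `(batch, timesteps, 5)`.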
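For the "how self-attention works" item above, the following is a minimal NumPy sketch of additive (Bahdanau-style) attention, the default scoring scheme in keras-self-attention. The weight names (`W_t`, `W_x`, `W_a`) and the tanh/softmax choices follow the standard additive-attention formulation and are illustrative, not the library's exact internals (the demo's `attention_activation='sigmoid'`, for example, changes the activation applied to the raw scores):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def additive_self_attention(x, units=32, seed=0):
    """Additive self-attention over a single sequence x of shape (T, d).

    e[t, u] = W_a . tanh(x[t] @ W_t + x[u] @ W_x)  # how much step t attends to u
    a[t]    = softmax(e[t])                        # weights over all steps, sum to 1
    out[t]  = sum_u a[t, u] * x[u]                 # context vector for step t
    """
    rng = np.random.default_rng(seed)
    T, d = x.shape
    W_t = rng.standard_normal((d, units)) / np.sqrt(d)   # "query" projection
    W_x = rng.standard_normal((d, units)) / np.sqrt(d)   # "key" projection
    W_a = rng.standard_normal(units) / np.sqrt(units)    # score vector

    q = x @ W_t                                       # (T, units)
    k = x @ W_x                                       # (T, units)
    e = np.tanh(q[:, None, :] + k[None, :, :]) @ W_a  # (T, T) pairwise scores
    a = softmax(e, axis=-1)                           # (T, T) attention weights
    return a @ x, a                                   # (T, d) outputs, plus weights

x = np.random.default_rng(1).standard_normal((5, 8))  # toy sequence: 5 steps, dim 8
out, attn = additive_self_attention(x)
print(out.shape, attn.shape)  # (5, 8) (5, 5)
```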
Disclaimer

The content of this post comes mainly from https://pypi.org/project/keras-self-attention/