@shwangdev
shwangdev / attention_lstm.py
Created June 12, 2019 15:23 — forked from mbollmann/attention_lstm.py
My attempt at creating an LSTM with attention in Keras
from keras import backend as K
from keras.layers import LSTM


class AttentionLSTM(LSTM):
    """LSTM with an attention mechanism.

    This is an LSTM incorporating an attention mechanism into its hidden
    states. The context vector calculated from the attended vector is fed
    into the model's internal states, closely following the model by
    Xu et al. (2016, Sec. 3.1.2), using a soft attention model following
    Bahdanau et al. (2014).
    The layer expects two inputs instead of the usual one:
        1. the "normal" layer input, and
        2. a 3D tensor to attend over.
    """
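    # --- Illustrative sketch, not part of the original gist -----------------
    # The docstring above describes computing a context vector by soft
    # attention over the attended sequence (Bahdanau et al., 2014) and
    # feeding it into the LSTM's internal states. The helper below shows
    # that computation in isolation with Keras backend ops; its name and
    # the weight tensors (W_a, U_a, v_a) are illustrative assumptions,
    # not attributes defined by the actual gist.
    @staticmethod
    def _soft_attention_context(h, attended, W_a, U_a, v_a):
        """Return a Bahdanau-style soft attention context vector.

        h:        (batch, units)              current hidden state
        attended: (batch, timesteps, att_dim) sequence to attend over
        W_a: (units, att_dim), U_a: (att_dim, att_dim), v_a: (att_dim, 1)
        """
        # Additive alignment scores: e_t = v_a^T tanh(W_a h + U_a x_t)
        energy = K.tanh(K.expand_dims(K.dot(h, W_a), axis=1)
                        + K.dot(attended, U_a))
        scores = K.squeeze(K.dot(energy, v_a), axis=-1)  # (batch, timesteps)
        alpha = K.softmax(scores)                        # weights sum to 1
        # Context vector: attention-weighted average of the attended vectors
        return K.sum(K.expand_dims(alpha, axis=-1) * attended, axis=1)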