try:
    import faiss
except ImportError:
    # Install the CPU build in a notebook, then retry the import
    !pip install faiss-cpu -q
    import faiss

# Build the index over the (n, d) float32 `embeddings` array
d = embeddings.shape[1]  # embedding dimension
index = faiss.IndexFlatL2(d)
index.add(embeddings)  # add the vectors so they can be searched
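To make the behaviour of `IndexFlatL2` concrete without requiring faiss to be installed, here is a minimal NumPy sketch of the same brute-force squared-L2 search; `flat_l2_search` and the toy `embeddings` array are illustrative stand-ins, not faiss APIs.

```python
import numpy as np

def flat_l2_search(embeddings, queries, k):
    # Brute-force squared-L2 nearest neighbours, mirroring what
    # faiss.IndexFlatL2 computes for index.search(queries, k).
    d2 = ((queries[:, None, :] - embeddings[None, :, :]) ** 2).sum(-1)
    ids = np.argsort(d2, axis=1)[:, :k]
    dists = np.take_along_axis(d2, ids, axis=1)
    return dists, ids

embeddings = np.arange(12, dtype="float32").reshape(4, 3)
dists, ids = flat_l2_search(embeddings, embeddings[:1], k=2)
print(ids[0])    # -> [0 1]: the query matches itself first
print(dists[0])  # -> [ 0. 27.]
```

A query vector that is already in the index returns itself at distance 0, which is a quick sanity check after `index.add`.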
bet02024 / Attention.py
Created September 18, 2017 21:35 — forked from cbaziotis/Attention.py
Keras Layer that implements an Attention mechanism for temporal data. Supports masking. Follows the work of Raffel et al. [https://arxiv.org/abs/1512.08756]
from keras import backend as K

def dot_product(x, kernel):
    """
    Wrapper for dot product operation, in order to be compatible with both
    Theano and Tensorflow
    Args:
        x: input tensor
        kernel: weights vector
    Returns:
        The dot product of x and kernel, with backend-specific handling.
    """
    if K.backend() == 'tensorflow':
        # TF's K.dot needs the kernel expanded to 2-D, then the
        # singleton axis squeezed away again
        return K.squeeze(K.dot(x, K.expand_dims(kernel)), axis=-1)
    else:
        return K.dot(x, kernel)
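The layer built on this helper follows the feed-forward attention of Raffel et al.: score each timestep, softmax the scores over time, and return the attention-weighted sum of the inputs. A minimal NumPy sketch of that computation for a single (unbatched) sequence, with hypothetical names `raffel_attention`, `W`, and `b` not taken from the gist:

```python
import numpy as np

def raffel_attention(x, W, b):
    # x: (timesteps, features); W: (features,); b: scalar bias.
    # Sketch of feed-forward attention (Raffel et al., arXiv:1512.08756);
    # the Keras layer does the same per batch element with backend ops.
    e = np.tanh(x @ W + b)             # unnormalised score per timestep
    a = np.exp(e) / np.exp(e).sum()    # softmax over timesteps
    return (x * a[:, None]).sum(0)     # attention-weighted sum of features

x = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W = np.array([0.5, -0.5])
out = raffel_attention(x, W, 0.0)
print(out.shape)  # (2,)
```

The output has the feature dimension only; the time axis is collapsed by the weighted sum, which is why this layer is typically placed after a recurrent layer returning full sequences.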