I am trying to implement Poincaré embeddings as discussed in a paper by Facebook (Link) for my hierarchical data. You may find a more accessible explanation of Poincaré embeddings here.
Based on the paper I have found some implementations for Tensorflow here and here as well as tfa.layers.PoincareNormalize in Tensorflow Addons. The latter even had a link to the paper mentioned above, which makes me believe it could be a good starting point for me. However, I had no luck implementing tfa.layers.PoincareNormalize so far and also could not find any documentation except some generic information on the API page that I linked.
Does anyone know how this layer is supposed to be used to provide the embedding in hyperbolic space discussed in the paper? My starting point is an implementation with a standard Embedding layer as presented below (it is actually an entity embedding of a categorical variable).
from tensorflow.keras.layers import Input, Embedding, Reshape, Dense, Activation

input = Input(shape=(1, ))
model = Embedding(input_dim=my_input_dim,
                  output_dim=embed_dim, name="my_feature")(input)
model = Reshape(target_shape=(embed_dim, ))(model)
model = Dense(1)(model)
model = Activation('sigmoid')(model)
Simply replacing the Embedding layer with tfa.layers.PoincareNormalize does not work due to the different inputs. I assume it could be placed somewhere after the embedding layer so that, for the back-propagation step, the "values" are projected into hyperbolic space on each iteration, but I had no luck with that so far either.
Poincaré Embeddings
Poincaré embeddings allow you to create hierarchical embeddings in a non-Euclidean space. Vectors near the boundary of the Poincaré ball are lower in the hierarchy than those near the center.
The model operates on the open d-dimensional unit ball, where the Euclidean metric tensor is mapped to a Riemannian metric tensor by a conformal factor: g_x = (2 / (1 − ‖x‖²))² g^E.
Distances between two vectors u and v in this non-Euclidean space are calculated as

d(u, v) = arcosh(1 + 2‖u − v‖² / ((1 − ‖u‖²)(1 − ‖v‖²)))
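As a quick sanity check of the formula, it can be computed directly (a hypothetical helper written for illustration, not part of any library):

```python
import math

# Hypothetical helper: Poincaré distance between two points that lie
# strictly inside the open unit ball.
def poincare_distance(u, v):
    sq_norm = lambda x: sum(c * c for c in x)
    diff = sq_norm([a - b for a, b in zip(u, v)])
    denom = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + 2.0 * diff / denom)

# Distances blow up near the boundary: the same Euclidean gap of 0.05
# yields a much larger distance near the rim than near the center.
print(poincare_distance((0.0, 0.0), (0.05, 0.0)))
print(poincare_distance((0.9, 0.0), (0.95, 0.0)))
```

This boundary behavior is what lets the ball encode hierarchy: leaves sit near the rim, far from everything except their ancestors along the path to the center.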
The research paper for Poincaré embeddings is wonderfully written, and you will find solid implementations of them in popular libraries as well. Needless to say, they are under-rated.
Two implementations that you can use are found in -

- tensorflow_addons.PoincareNormalize
- gensim.models.poincare

Tensorflow Addons implementation
According to the documentation, for a 1-D tensor with axis = 0, tfa.layers.PoincareNormalize computes the following output:

output = x * min((1 - epsilon) / ||x||, 1)

That is, vectors whose norm exceeds 1 - epsilon are rescaled to lie just inside the unit ball, while all other vectors pass through unchanged. For a higher-dimensional tensor, it independently normalizes each 1-D slice along the dimension axis.
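The per-slice computation can be sketched in plain NumPy (this reimplements the behavior described above for illustration; the epsilon value matches the layer's default but is otherwise arbitrary):

```python
import numpy as np

# Sketch of the PoincareNormalize computation: clip each slice so its
# norm is at most 1 - epsilon, i.e. project vectors that fall outside
# the unit ball back onto (just inside) its boundary.
def poincare_normalize(x, axis=-1, epsilon=1e-5):
    norm = np.sqrt(np.sum(np.square(x), axis=axis, keepdims=True))
    scale = np.minimum((1.0 - epsilon) / norm, 1.0)
    return x * scale

v = np.array([3.0, 4.0])   # norm 5, outside the unit ball -> rescaled
w = np.array([0.3, 0.4])   # norm 0.5, already inside -> unchanged
print(np.linalg.norm(poincare_normalize(v)))
print(poincare_normalize(w))
```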
This transformation can simply be applied to an embedding of n dims. Let's create a 5-dim embedding for each element of the time series. The dimension axis=-1 in this case is the one mapped from a Euclidean space to a non-Euclidean space.
Gensim implementation
Another implementation of Poincaré embeddings can be found in Gensim. It's very similar to what you would use when working with Word2Vec from Gensim.
The process would be to prepare the hierarchical relations as (child, parent) pairs, train a PoincareModel on them, and then query the learned vectors.
More details on training and saving Poincare embeddings can be found here.