TransA: An adaptive approach for knowledge graph embedding

Proposed TransA, an adaptive metric approach for knowledge graph embedding. The argument is that the loss metric used by most translation-based models is oversimplified and, as a result, cannot flexibly fit complex knowledge graphs. TransA replaces the traditional spherical equipotential surfaces with elliptical ones, which can be seen as weighting the transformed feature dimensions.
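A minimal NumPy sketch of this adaptive, Mahalanobis-style score, assuming embeddings h, r, t and a relation-specific symmetric non-negative weight matrix W_r; the function and variable names are illustrative, not from the paper's code:

```python
import numpy as np

def transa_score(h, r, t, W_r):
    """TransA-style adaptive score: |h + r - t|^T W_r |h + r - t|.

    W_r is a relation-specific symmetric, non-negative weight matrix,
    so the equipotential surfaces become elliptical rather than
    spherical. A lower score means (h, r, t) is more plausible.
    """
    d = np.abs(h + r - t)          # element-wise absolute translation error
    return float(d @ W_r @ d)      # weighted (Mahalanobis-like) distance

# Toy usage with random embeddings (illustrative only).
rng = np.random.default_rng(0)
dim = 4
h, r, t = rng.normal(size=(3, dim))
A = rng.normal(size=(dim, dim))
W_r = A @ A.T                      # symmetric positive semi-definite weight
print(transa_score(h, r, t, W_r))
```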

Embedding entities and relations for learning and inference in knowledge bases

Proposed a general unified framework that can be applied to most multi-relational embedding models, where entities are represented as low-dimensional vectors and relations as linear/bilinear mapping functions. The paper also empirically evaluates different entity and relation representations on the link prediction task and shows that a simple bilinear model achieves SOTA results. Lastly, the paper uses the learned embeddings to extract logical rules!
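A minimal NumPy sketch of the bilinear scoring idea, assuming the general form h^T M_r t with the diagonal restriction (DISTMULT) as the special case highlighted in the paper; names are illustrative:

```python
import numpy as np

def bilinear_score(h, M_r, t):
    """General bilinear score h^T M_r t, where M_r is a relation matrix."""
    return float(h @ M_r @ t)

def distmult_score(h, r, t):
    """DISTMULT special case: M_r restricted to a diagonal matrix, which
    reduces the score to a three-way element-wise product."""
    return float(np.sum(h * r * t))

# Toy usage: the diagonal case matches the general form by construction.
rng = np.random.default_rng(0)
dim = 4
h, r, t = rng.normal(size=(3, dim))
print(distmult_score(h, r, t))            # diagonal bilinear score
print(bilinear_score(h, np.diag(r), t))   # identical result
```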

Future Work

  1. Exploit deep neural networks

  2. Apply tensor constructs to deep learning architectures; such constructs and architectures can help improve multi-relational learning and inference

Quaternion knowledge graph embedding

Proposed quaternion embeddings, where each embedding is a vector in the hypercomplex space with one real and three imaginary components (vs. one real and one imaginary component in complex space). The quaternion embeddings also come with a new scoring function that uses the relational quaternion embedding to rotate the head entity, followed by an inner product with the tail entity (a minimal sketch of this scoring function follows the list below). There are four reasons for the quaternion embeddings:

  1. Greater expressiveness through 4 different dimensions (one real and three imaginary)

  2. Good for smooth rotation and spatial transformations in vector space

  3. The QuatE framework subsumes the ComplEx method and can model relation patterns such as symmetry, anti-symmetry, and inversion

  4. Fewer parameters while outperforming previous methods (scalability and accuracy)
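A minimal NumPy sketch of the rotate-then-inner-product scoring described above, assuming each embedding is a length-k array of quaternions stored as (real, i, j, k) components; shapes and names are illustrative:

```python
import numpy as np

def hamilton_product(a, b):
    """Hamilton product of quaternion arrays, each of shape (k, 4)
    holding (real, i, j, k) components per embedding dimension."""
    a0, a1, a2, a3 = a.T
    b0, b1, b2, b3 = b.T
    return np.stack([
        a0*b0 - a1*b1 - a2*b2 - a3*b3,   # real part
        a0*b1 + a1*b0 + a2*b3 - a3*b2,   # i part
        a0*b2 - a1*b3 + a2*b0 + a3*b1,   # j part
        a0*b3 + a1*b2 - a2*b1 + a3*b0,   # k part
    ], axis=1)

def quate_score(h, r, t):
    """QuatE-style score: rotate the head by the unit relation quaternion
    (Hamilton product), then take the inner product with the tail."""
    r_unit = r / np.linalg.norm(r, axis=1, keepdims=True)  # unit quaternions
    rotated = hamilton_product(h, r_unit)
    return float(np.sum(rotated * t))

# Toy usage with random embeddings (illustrative only).
rng = np.random.default_rng(0)
k = 8                                    # quaternion components per embedding
h, r, t = rng.normal(size=(3, k, 4))
print(quate_score(h, r, t))
```

Normalizing the relation quaternion removes its scaling effect, so the Hamilton product acts as a pure rotation, which is what gives the scoring function its geometric interpretation.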
