Graph Neural Networks: A Review of Methods and Applications

Contribution

A survey paper covering graph neural networks and their many variants!

A Survey on Knowledge Graphs: Representation, Acquisition and Applications

Contribution

A survey paper on knowledge graphs covering four main pathways: knowledge graph representation, knowledge acquisition, temporal knowledge graphs, and knowledge-aware applications.

Knowledge graph embedding via dynamic mapping matrix

Contribution

Proposed TransD, an improvement of TransR / CTransR, which uses two vectors to represent each entity and relation. One vector represents the meaning of the entity or relation, and the other is used to construct the mapping matrix dynamically. TransD considers the diversity of BOTH relations AND entities, and it is highly scalable because it has fewer parameters and needs no matrix multiplication operations.
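As a rough sketch (not the authors' code), TransD's dynamic mapping matrix is M_rh = r_p h_p^T + I, so the projection M_rh h reduces to vector operations only, which is exactly why the paper can claim no matrix multiplication. Names below are my own illustrative choices:

```python
import numpy as np

def transd_project(e, e_p, r_p):
    """Project an entity into relation space without building M_re = r_p e_p^T + I.
    Equivalent to M_re @ e, computed as r_p * (e_p . e) + pad(e)."""
    m, n = len(r_p), len(e)
    padded = np.zeros(m)
    k = min(m, n)
    padded[:k] = e[:k]              # effect of the (m x n) identity block on e
    return r_p * (e_p @ e) + padded

def transd_score(h, h_p, t, t_p, r, r_p):
    """Lower score = more plausible triple (h, r, t)."""
    h_perp = transd_project(h, h_p, r_p)
    t_perp = transd_project(t, t_p, r_p)
    return np.linalg.norm(h_perp + r - t_perp)
```

Note how each projection costs only a dot product and a vector scaling, versus a full d x k matrix-vector product in TransR.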

Future Work

  1. Not all new facts can be estimated from existing facts in knowledge graphs. Look for methods to complete KGs using new facts derived from text data!

Knowledge graph embedding by translating on hyperplanes

Contribution

Proposed TransH, which models a relation as a hyperplane AND a translation operation on it. TransH solves a limitation of TransE, which cannot capture different mapping properties such as reflexive, 1-to-many, many-to-1, and many-to-many relations. It also utilises the mapping property of a relation to reduce false-negative labelling!
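As a minimal sketch (my own illustrative code, not the authors'), the TransH score projects head and tail onto the relation's hyperplane with unit normal w_r, then translates by d_r on that hyperplane:

```python
import numpy as np

def transh_score(h, t, w_r, d_r):
    """TransH score ||h_perp + d_r - t_perp||.
    Lower score = more plausible triple (h, r, t)."""
    w_r = w_r / np.linalg.norm(w_r)       # unit normal of the relation hyperplane
    h_perp = h - (w_r @ h) * w_r           # project h onto the hyperplane
    t_perp = t - (w_r @ t) * w_r           # project t onto the hyperplane
    return np.linalg.norm(h_perp + d_r - t_perp)
```

Because many distinct entities can share the same projection on a relation's hyperplane, one relation can translate a single head to many tails, which is how the 1-to-many and many-to-1 cases are handled.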

Learning entity and relation embeddings for knowledge graph completion

Contribution

Proposed TransR (and CTransR), which model entity and relation embeddings in separate semantic spaces. The idea is that an entity has multiple different aspects, and different relations focus on different aspects of entities! In short, TransR learns embeddings by first projecting entities from the entity space into the corresponding relation space, then building translations between the projected entities! It shows significant improvements over TransE and TransH. CTransR is an extension of TransR that clusters similar head-tail entity pairs together and learns a distinct relation vector for each group!
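A hedged sketch of the TransR scoring step (illustrative names, not the authors' code): a relation-specific matrix M_r maps entities from the k-dimensional entity space into the d-dimensional relation space, where the translation r applies:

```python
import numpy as np

def transr_score(h, t, r, M_r):
    """TransR score ||M_r h + r - M_r t|| with M_r of shape (d, k).
    Lower score = more plausible triple (h, r, t)."""
    h_r = M_r @ h                 # project head into relation space
    t_r = M_r @ t                 # project tail into relation space
    return np.linalg.norm(h_r + r - t_r)
```

Each relation carries its own d x k matrix, which is what makes TransR more expressive but also heavier than TransE/TransH, and is the parameter cost that TransD later removes.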

Future Work

  1. We only consider relational facts separately; however, relations tend to correlate with each other (e.g. transitivity)

  2. Explore a unified embedding model that takes in both text and knowledge graph information. Currently, the two are trained separately and combined with a weighted average

  3. Explore more of CTransR and look into the internal correlations within each relation type

Ryan

Data Scientist