Neural relation extraction via inner-sentence noise reduction and transfer learning
Proposes a word-level noise-reduction approach for distantly supervised relation extraction, consisting of two steps:
Build Sub-Tree Parse (STP) to remove noisy words that are irrelevant to relations within sentences
Construct a neural network that takes the STP as input and uses entity-wise attention to identify important semantic features of relational words. To increase robustness, the model is initialised with prior knowledge learned from an entity classification task
The two steps above handle word-level noise within sentences and increase the robustness of relation extraction against noisy supervision.
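The STP step can be sketched as follows. This is a minimal, illustrative implementation assuming a dependency parse given as head indices (root's head = -1): it keeps only the subtree rooted at the parent of the two entities' lowest common ancestor and discards everything else. Function names and the input format are my own, not from the paper.

```python
def ancestors(heads, i):
    """Return the path of token indices from token i up to the root (inclusive)."""
    path = [i]
    while heads[i] != -1:
        i = heads[i]
        path.append(i)
    return path

def sub_tree_parse(tokens, heads, e1, e2):
    """Keep only tokens in the subtree rooted at the parent of LCA(e1, e2)."""
    a1, a2 = ancestors(heads, e1), set(ancestors(heads, e2))
    lca = next(n for n in a1 if n in a2)             # lowest common ancestor
    root = heads[lca] if heads[lca] != -1 else lca   # its parent (or itself if LCA is root)
    # a token survives iff `root` lies on its path to the sentence root
    keep = [t for t in range(len(tokens)) if root in ancestors(heads, t)]
    return [tokens[i] for i in keep]
```

For example, in "He confirmed that reports said Obama was born in Hawaii" with entities Obama and Hawaii, the LCA is "born", its parent is "said", so the outer clause "He confirmed" is pruned away.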
Incorporates the Shortest Dependency Path (SDP) together with STP to obtain shorter, higher-quality sentences
Open question: how to better utilise entity information when assigning initial parameters to the relation extractor
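The entity-wise attention idea can be sketched as below: each word representation is scored against both entity embeddings, and the resulting weights form a convex combination of word vectors. This is a simplified dot-product variant under my own assumptions; the paper's exact scoring function may differ.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def entity_wise_attention(H, e1, e2):
    """H: (n, d) word representations; e1, e2: (d,) entity embeddings.
    Attend to words by similarity to each entity, then average the two
    attention distributions and pool the words into one sentence vector."""
    a1 = softmax(H @ e1)          # attention w.r.t. head entity
    a2 = softmax(H @ e2)          # attention w.r.t. tail entity
    alpha = (a1 + a2) / 2.0       # combined weights, still sum to 1
    return alpha @ H              # (d,) attended sentence representation
```

Transfer learning then amounts to pre-training the word/entity encoders on entity classification and reusing those weights to initialise the relation extractor.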
Generative adversarial zero-shot relational learning for knowledge graphs
Proposes a GAN-based framework for zero-shot relational learning on knowledge graphs, which aims to learn relational embeddings of unseen relations from their text descriptions. The GAN bridges the text and knowledge-graph domains: the generator learns to produce relation embeddings from noisy text descriptions. There are two challenges:
Transferring knowledge from the text semantic space to the knowledge-graph semantic space
Suppressing noise in the text descriptions, since natural-language expressions often include words irrelevant to the target relation
The framework is model-agnostic, so any KG embedding method can be plugged in, and it is the first to consider zero-shot learning for knowledge graph completion (KGC). Two new datasets for zero-shot KGC are also presented.
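A minimal sketch of the zero-shot pipeline, under my own assumptions: a single linear "generator" maps a text-description embedding plus Gaussian noise to a KG relation embedding (the paper trains this adversarially against a discriminator), and a TransE-style distance is used as the downstream triple scorer (one possible choice, since the approach is model-agnostic over KG embeddings). All dimensions and names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d_text, d_kg = 8, 4   # illustrative text / KG embedding sizes

# Hypothetical generator weights (one linear layer + tanh for illustration).
W = rng.normal(size=(d_text + d_kg, d_kg))

def generate_relation(text_emb, noise):
    """Map a text-description embedding + noise to a KG relation embedding."""
    return np.tanh(np.concatenate([text_emb, noise]) @ W)

def score_triple(h, r, t):
    """TransE-style plausibility: higher (less negative) = more plausible."""
    return -np.linalg.norm(h + r - t)
```

At test time, an unseen relation never observed in the KG gets an embedding from its description alone, and candidate tails are ranked with `score_triple`.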