Temporal knowledge graphs incorporate temporal information into representation learning. This area is significantly less explored than static knowledge graphs, but it's extremely important: a structured knowledge graph is only valid for a particular time period, since facts evolve over time. Research here falls into four fields:
Temporal information embedding
Entity dynamics
Temporal relational dependency
Temporal logical reasoning
Temporal Information Embedding
We can infuse temporal information by extending the original triple (h, r, t) into a temporal quadruple (h, r, t, τ), where τ provides additional temporal information about the fact. τ could be a timestamp embedding, representing the period of time for which the fact is valid. With these timestamp embeddings, you have a dynamic knowledge graph, and a static knowledge graph can be generated by supplying a specific timestamp!
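To make the quadruple idea concrete, here is a minimal TTransE-style sketch in NumPy: the timestamp embedding is added into the usual translational score, so a fact is scored differently at different times. The entity, relation, and timestamp names are invented, and the embeddings are random rather than learned.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Toy embedding tables; in a real model these are learned parameters.
entities = {name: rng.normal(size=dim) for name in ["Paris", "France"]}
relations = {"capitalOf": rng.normal(size=dim)}
timestamps = {"2020": rng.normal(size=dim)}

def score(h, r, t, tau):
    """TTransE-style score: the timestamp embedding joins the translation,
    f(h, r, t, tau) = ||h + r + tau - t||. Lower = more plausible."""
    return np.linalg.norm(
        entities[h] + relations[r] + timestamps[tau] - entities[t]
    )

s = score("Paris", "capitalOf", "France", "2020")
```

Supplying a different timestamp embedding re-scores every fact, which is exactly how a static snapshot of the graph falls out of the dynamic one.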
The state of entities is constantly changing based on real-world events, which in turn affect the relations between entities. In the embedding approach, we can feed both the entity and a timestamp into an entity embedding function to capture the temporal characteristics of an entity at any given time. For example, Know-Evolve explores the knowledge evolution phenomenon of entities and relations.
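A time-dependent entity embedding function can be sketched very simply: concatenate the entity's static vector with a time feature and project. This is a toy stand-in for the idea only, not Know-Evolve's actual recurrent architecture; the entity name, dimensions, and projection are all made up.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 16

base = {"Alice": rng.normal(size=dim)}   # static entity embedding
W = rng.normal(size=(dim, dim + 1))      # projection mixing entity and time

def entity_at(name, t):
    """Return a time-dependent embedding: the static vector plus a scalar
    time feature, passed through a (here random) projection."""
    x = np.concatenate([base[name], [t]])
    return np.tanh(W @ x)

e1 = entity_at("Alice", 0.0)
e2 = entity_at("Alice", 5.0)
# Same entity, different timestamps -> different representations.
```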
Temporal Relational Dependency
This means that there are temporal dependencies in relational chains. For example:
wasBornIn → graduateFrom → workAt → diedIn. Ideally, we want our model and knowledge graph to capture this temporal order and information!
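One model-agnostic way to exploit such a chain is to check that the timestamps of its facts are non-decreasing. The facts below are invented for illustration.

```python
# Hypothetical facts: (head, relation, tail, year). Facts along the chain
# wasBornIn -> graduateFrom -> workAt -> diedIn should be time-ordered.
facts = [
    ("Turing", "wasBornIn", "London", 1912),
    ("Turing", "graduateFrom", "Cambridge", 1934),
    ("Turing", "workAt", "Bletchley Park", 1939),
    ("Turing", "diedIn", "Wilmslow", 1954),
]

CHAIN = ["wasBornIn", "graduateFrom", "workAt", "diedIn"]

def chain_consistent(facts, chain):
    """True if the facts' timestamps respect the chain's temporal order."""
    years = {r: y for _, r, _, y in facts}
    ordered = [years[r] for r in chain if r in years]
    return all(a <= b for a, b in zip(ordered, ordered[1:]))
```

A check like this can serve as a validation filter, or be softened into a training-time penalty on chains that violate the expected order.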
Temporal Logical Reasoning
We could also apply logical rules for temporal reasoning! For example, a rule can require that a person's wasBornIn fact precedes their diedIn fact.
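A tiny forward-chaining sketch of one such rule, using invented facts: wasBornIn(x, ·, t1) ∧ diedIn(x, ·, t2) ⇒ t1 < t2, and from the two facts we infer a lifespan interval.

```python
# Toy temporal rule application over (head, relation, tail, year) facts.
facts = [
    ("Turing", "wasBornIn", "London", 1912),
    ("Turing", "diedIn", "Wilmslow", 1954),
]

def infer_lifespans(facts):
    """Apply the rule: if x was born at t1 and died at t2 with t1 < t2,
    infer the lifespan interval (t1, t2)."""
    born = {h: t for h, r, _, t in facts if r == "wasBornIn"}
    died = {h: t for h, r, _, t in facts if r == "diedIn"}
    out = {}
    for person in born.keys() & died.keys():
        if born[person] < died[person]:      # the temporal constraint
            out[person] = (born[person], died[person])
    return out
```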
Natural Language Understanding
Knowledge graphs can enhance language representations with structured facts. This can help with event categorisation, modelling inter-slot relations in spoken language understanding, and building knowledge-aware language models. K-BERT is an example of infusing a knowledge graph into the BERT contextual encoder.
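As a toy illustration of the injection idea only: splice KG facts into the token sequence next to the entities that mention them. (K-BERT actually builds a sentence tree with soft positions and a visibility matrix; none of that is shown here, and the KG entries are invented.)

```python
# Minimal knowledge-injection sketch: append each entity's KG triples
# directly after the entity token.
kg = {"Apple": [("Apple", "isA", "Company")]}

def inject(tokens, kg):
    """Expand a token list with relation/tail tokens from matching triples."""
    out = []
    for tok in tokens:
        out.append(tok)
        for _, rel, tail in kg.get(tok, []):
            out.extend([rel, tail])   # splice the fact next to its entity
    return out
```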
Question Answering
Answering questions using facts from knowledge graphs. There are two levels of this application:
Single-fact – deals with simple factoid QA, where we answer a simple question using a single knowledge graph fact. The difficulty here is that fusing the KG and QA together introduces model complexity.
Multi-hop reasoning – a more complex problem that requires multi-hop common-sense reasoning. Structured knowledge can provide informative observations and act as a relational inductive bias. This leads to combining the symbolic and semantic spaces for multi-hop reasoning.
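The single-fact level can be sketched with a deliberately naive pipeline: map the question to a (head entity, relation) pair, then look the answer up as one KG fact. The patterns and facts are invented; real systems learn this mapping rather than hand-writing regexes.

```python
import re

# Toy KG: (head, relation) -> answer entity.
kg = {("France", "capital"): "Paris", ("Einstein", "bornIn"): "Ulm"}

PATTERNS = [
    (re.compile(r"what is the capital of (\w+)", re.I), "capital"),
    (re.compile(r"where was (\w+) born", re.I), "bornIn"),
]

def answer(question, kg):
    """Answer a factoid question with a single KG lookup, or None."""
    for pattern, relation in PATTERNS:
        m = pattern.search(question)
        if m:
            return kg.get((m.group(1), relation))
    return None
```

Multi-hop reasoning would instead chain several such lookups, which is where the symbolic structure of the KG starts to pay off.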
Recommender Systems
Recommender systems frequently face sparsity and cold-start problems. By integrating a knowledge graph as external information, recommender systems can perform common-sense reasoning.
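A minimal sketch of why this helps with cold start: a brand-new item has no interaction history, but it does have KG attributes, so we can score it by attribute overlap with items the user already liked. Items and attributes are invented, and the Jaccard overlap stands in for a learned KG-aware recommender.

```python
# KG-derived attribute sets per item (hypothetical).
item_attrs = {
    "Inception": {"director:Nolan", "genre:scifi"},
    "Interstellar": {"director:Nolan", "genre:scifi"},
    "Notebook": {"genre:romance"},
}

def kg_score(liked_items, candidate, item_attrs):
    """Jaccard overlap between liked-item attributes and the candidate's;
    needs no interaction history for the candidate (cold start)."""
    liked = set().union(*(item_attrs[i] for i in liked_items))
    cand = item_attrs[candidate]
    return len(liked & cand) / len(liked | cand)

s1 = kg_score(["Inception"], "Interstellar", item_attrs)
s2 = kg_score(["Inception"], "Notebook", item_attrs)
# A never-seen item still gets a sensible score from shared KG attributes.
```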