HyperFormer: Enhancing Entity and Relation Interaction for Hyper-Relational Knowledge Graph Completion

Hyper-relational knowledge graph completion (HKGC) aims at inferring unknown triples while taking their qualifiers into account.

GitHub Link

The GitHub link is https://github.com/zhiweihu1103/hkgc-hyperformer

Introduction

The repository "HKGC-HyperFormer" contains the source code and data for the paper "HyperFormer: Enhancing Entity and Relation Interaction for Hyper-Relational Knowledge Graph Completion", presented at CIKM 2023. The code depends on PyTorch 1.8.1 and fastmoe 0.2.0. Several datasets are provided, and training is run via scripts with customizable parameters; to reproduce specific results, set `--train_mode` accordingly. The repository also includes citation information and acknowledges the code of CoLE.
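To make the notion of qualifiers concrete, here is a minimal illustrative sketch (not code from the repository): a hyper-relational fact is a main triple plus key-value qualifier pairs, and models like HyperFormer score candidate completions conditioned on both. The entity and relation names below are hypothetical examples.

```python
# Illustrative sketch of a hyper-relational fact (names are hypothetical,
# not taken from the repository's datasets).
main_triple = ("Marie_Curie", "educated_at", "University_of_Paris")
qualifiers = {
    "academic_degree": "Master_of_Science",
    "academic_major": "Physics",
}

def as_statement(triple, quals):
    """Flatten a hyper-relational fact into one token sequence:
    (head, relation, tail, qual_key_1, qual_value_1, ...)."""
    head, rel, tail = triple
    tokens = [head, rel, tail]
    for key, value in sorted(quals.items()):
        tokens.extend([key, value])
    return tokens

statement = as_statement(main_triple, qualifiers)
# HKGC would infer a missing element of the main triple (e.g. the tail)
# given the rest of the statement, including its qualifiers.
```

The point of the sketch is only that qualifiers refine the main triple: "educated_at" alone is ambiguous, while the degree and major qualifiers pin the fact down.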

Content

Taking the WD50K dataset as an example, you can run the provided training script. For other datasets, only a few parameters need to be modified; all other parameters are kept the same across datasets. If you find this code useful, please consider citing the paper. The authors build on the code of CoLE and thank its contributors.
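As a rough sketch of what such an invocation might look like, the snippet below assembles a command line. Note that the script name and every flag except `--train_mode` are illustrative assumptions, not the repository's real interface; consult the repository's README for the actual scripts and parameters.

```shell
# Hypothetical training invocation -- the script name (train.py) and the
# --dataset flag are illustrative assumptions; only --train_mode is
# mentioned by the repository's description.
DATASET="wd50k"
TRAIN_MODE="default"   # set according to the result you want to reproduce

CMD="python train.py --dataset ${DATASET} --train_mode ${TRAIN_MODE}"
echo "Would run: ${CMD}"
```

Per the description above, switching datasets should only require changing the dataset-specific parameters while leaving the rest untouched.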

Alternatives & Similar Tools

LongLLaMA - handle very long text contexts, up to 256,000 tokens

LongLLaMA is a large language model designed to handle very long text contexts, up to 256,000 tokens. It's based on OpenLLaMA and uses a technique called Focused Transformer (FoT) for training. The repository provides a smaller 3B version of LongLLaMA for free use. It can also be used as a replacement for LLaMA models with shorter contexts.