We're happy to announce the 0.1.3.2 release. You can install it with `pip install dicee`.
## Some of the Important New Features
+ Inductive Neural Link Prediction
+ AllvsAll Scoring Technique
+ Stochastic Weight Averaging and Adaptive Stochastic Weight Averaging
+ Qdrant Vector Database
## Inductive Neural Link Prediction
We designed a technique to transform most (if not all) transductive knowledge graph embedding models into inductive knowledge graph embedding models.
With our technique (research paper under review), a knowledge graph embedding model can operate on triples involving unseen entities and/or relations.
By using `byte_pair_encoding`, a knowledge graph embedding model can compute an unnormalized log-likelihood of a triple involving an unseen entity, an unseen relation, or a literal.
```python
from dicee.executer import Execute
from dicee.config import Namespace
from dicee import KGE

args = Namespace()
args.dataset_dir = 'KGs/UMLS'
args.model = 'Keci'
args.byte_pair_encoding = True
result = Execute(args).start()
pre_trained_kge = KGE(path=result['path_experiment_folder'])
# A plausible triple should score at least as high as a nonsensical one.
assert pre_trained_kge.predict(h="alga", r="isa", t="entity", logits=True) >= pre_trained_kge.predict(h="Demir", r="loves", t="Embeddings", logits=True)
```
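To see why byte-pair encoding makes unseen surface forms tractable, here is a toy illustration in plain Python. The subword vocabulary, merges, and embedding vectors below are made up for demonstration and are not dicee's actual BPE vocabulary: an unseen word decomposes into known subword tokens, each of which already has an embedding.

```python
# Made-up subword vocabulary with 2-dimensional embeddings.
subword_vocab = {"al": [1.0, 0.0], "ga": [0.0, 1.0], "is": [0.5, 0.5],
                 "a": [0.2, 0.8], "en": [0.9, 0.1], "ti": [0.3, 0.7],
                 "ty": [0.6, 0.4]}

def tokenize(word, vocab):
    """Greedy longest-match segmentation into known subwords."""
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in vocab:
                tokens.append(word[i:j])
                i = j
                break
        else:
            raise ValueError(f"cannot segment {word!r}")
    return tokens

def embed(word, vocab):
    """Average the subword embeddings to represent an unseen word."""
    tokens = tokenize(word, vocab)
    dims = len(next(iter(vocab.values())))
    return [sum(vocab[t][d] for t in tokens) / len(tokens) for d in range(dims)]

# "alga" never appears as a whole token, yet it still gets a vector.
print(tokenize("alga", subword_vocab))  # ['al', 'ga']
print(embed("alga", subword_vocab))     # [0.5, 0.5]
```

A model trained over subword embeddings can therefore score any triple whose entity and relation names can be segmented, even if the full names were never observed during training.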
## AllvsAll Scoring Technique
AllvsAll is a new scoring technique for training knowledge graph embedding models, although it is computationally more expensive (e.g., the size of the training data becomes the number of entities times the number of relations).
A knowledge graph embedding model can be trained with the `NegSample | 1vsAll | KvsAll | AllvsAll` techniques.
```bash
dicee --model Keci --scoring_technique AllvsAll
dicee --model Pykeen_MuRE --scoring_technique AllvsAll
```
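To make the growth in training-set size concrete, here is a back-of-the-envelope sketch in plain Python. The UMLS counts and the negative-sampling ratio below are assumptions for illustration, not values read out of dicee:

```python
# Assumed statistics for the UMLS benchmark (for illustration only).
num_entities = 135
num_relations = 46
num_triples = 5216          # observed training triples (assumed)
neg_ratio = 1               # one sampled negative per positive in NegSample

# NegSample: each positive triple is paired with sampled negatives.
negsample_examples = num_triples * (1 + neg_ratio)

# AllvsAll: every (entity, relation) combination becomes a query,
# regardless of whether it appears among the observed triples.
allvsall_examples = num_entities * num_relations

print(negsample_examples)  # 10432
print(allvsall_examples)   # 6210
```

On a small graph like this the two are comparable, but since AllvsAll scales with the product of entity and relation counts, it grows much faster on larger graphs.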
## Stochastic Weight Averaging and Adaptive Stochastic Weight Averaging
Read [Averaging Weights Leads to Wider Optima and Better Generalization](https://arxiv.org/abs/1803.05407) to learn more about Stochastic Weight Averaging (SWA).
```bash
dicee --model ComplEx --swa
```
We designed a technique called Adaptive SWA that combines SWA with the early stopping technique.
```bash
dicee --model Keci --adaptive_swa
```
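The core idea behind SWA can be sketched in a few lines of plain Python (this is a conceptual illustration, not dicee's implementation): instead of keeping only the final weights, maintain a running average of the weights visited along the training trajectory.

```python
def swa_update(avg_weights, new_weights, num_averaged):
    """Incrementally update the running average after one more snapshot."""
    n = num_averaged
    return [a + (w - a) / (n + 1) for a, w in zip(avg_weights, new_weights)]

# Hypothetical weight snapshots taken at the end of successive epochs.
snapshots = [[2.0, 0.0], [0.0, 2.0], [1.0, 1.0]]

avg = snapshots[0]
for n, w in enumerate(snapshots[1:], start=1):
    avg = swa_update(avg, w, n)

print(avg)  # [1.0, 1.0], the mean of all snapshots
```

Averaging points along the trajectory tends to land in flatter regions of the loss surface, which is the source of the generalization benefit reported in the paper linked above.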
## Qdrant Vector Database
Train a KGE model and store it in `CountryEmbeddings`.
```bash
dicee --dataset_dir KGs/Countries-S1 --path_to_store_single_run CountryEmbeddings --model Keci --p 0 --q 1 --embedding_dim 32 --adaptive_swa
```
Create a vector database.
```bash
diceeindex --path_model "CountryEmbeddings" --collection_name "dummy" --location "localhost"
```
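Conceptually, the indexing step stores each entity's embedding vector under its entity name so that it can later be retrieved by nearest-neighbour search. Here is a toy sketch of that idea in plain Python; this is not the Qdrant API, and the names and vectors are made up:

```python
import math

# Made-up entity embeddings keyed by entity name.
index = {
    "germany": [0.9, 0.1],
    "france":  [0.8, 0.2],
    "ocean":   [0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nearest(query_name, index, k=2):
    """Return the k entities most similar to the query entity."""
    q = index[query_name]
    others = [(name, cosine(q, vec)) for name, vec in index.items()
              if name != query_name]
    return sorted(others, key=lambda x: -x[1])[:k]

print(nearest("germany", index, k=1))  # "france" is the closest neighbour
```

A real vector database performs the same kind of similarity search, but with approximate-nearest-neighbour indexing so it scales to millions of vectors.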
Launch the webservice.
```bash
diceeserve --path_model "CountryEmbeddings" --collection_name "dummy" --collection_location "localhost"
```
Query the vector database.
```bash
curl -X 'GET' 'http://0.0.0.0:8000/api/get?q=germany' -H 'accept: application/json'
```