This demo is built with Gradle.
BertTranslator.java and HuggingFaceQaInference.java are stored in the directory "src/main/java".
bert-base-cased-vocab.txt and trace_cased_bertqa.pt are stored in the directory "src/main/resources".
The two resource files can be downloaded from:
trace_cased_bertqa.pt: https://mlrepo.djl.ai/model/nlp/question_answer/ai/djl/pytorch/bertqa/trace_cased_bertqa/0.0.1/trace_cased_bertqa.pt.gz
bert-base-cased-vocab.txt: https://mlrepo.djl.ai/model/nlp/question_answer/ai/djl/pytorch/bertqa/trace_cased_bertqa/0.0.1/bert-base-cased-vocab.txt.gz
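Note that both files are served gzip-compressed, so they must be decompressed after downloading. Assuming curl and gunzip are available, the two resources can be fetched and placed in the resources directory with commands like these (the URLs are the ones listed above):

```shell
# Create the resources directory if it does not exist yet.
mkdir -p src/main/resources

BASE=https://mlrepo.djl.ai/model/nlp/question_answer/ai/djl/pytorch/bertqa/trace_cased_bertqa/0.0.1

# Download each .gz archive and decompress it on the fly;
# the extracted files keep the names the code expects.
curl -sL "$BASE/trace_cased_bertqa.pt.gz"     | gunzip > src/main/resources/trace_cased_bertqa.pt
curl -sL "$BASE/bert-base-cased-vocab.txt.gz" | gunzip > src/main/resources/bert-base-cased-vocab.txt
```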
See the main text for more info.
Note that, when running this code, the default engine may also need to be specified with the VM option -Dai.djl.default_engine=PyTorch, so that the engine is compatible with the traced model and the tokenizer.
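If passing a VM option is inconvenient, the same property can also be set programmatically, as long as this happens before DJL resolves its default engine. A minimal sketch (the class name is hypothetical; only the property name comes from the VM option above):

```java
public class EngineSelection {
    public static void main(String[] args) {
        // Equivalent to passing -Dai.djl.default_engine=PyTorch on the
        // command line; must run before any DJL engine is loaded.
        System.setProperty("ai.djl.default_engine", "PyTorch");
        System.out.println(System.getProperty("ai.djl.default_engine"));
    }
}
```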
To all readers:
Any issue related to this example can also be raised in the DJL GitHub repository, where it may be noticed and answered faster. The corresponding example in DJL is at
~/examples/src/main/java/ai/djl/examples/inference/BertQaInference.java