This demo is built with Gradle.
BertTranslator.java and HuggingFaceQaInference.java
are stored in the directory "src/main/java".
bert-base-cased-vocab.txt and trace_cased_bertqa.pt
are stored in the directory "src/main/resources".
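A minimal Gradle build for this layout might look like the sketch below. The DJL version, the BOM coordinates, and the main class name are assumptions based on the files listed above; adjust them to match your actual project.

```
// build.gradle — illustrative sketch, not the demo's actual build file
plugins {
    id 'java'
    id 'application'
}

repositories {
    mavenCentral()
}

dependencies {
    // Version is an assumption; pin whichever DJL release you target
    implementation platform('ai.djl:bom:0.20.0')
    implementation 'ai.djl:api'
    // PyTorch engine is required by the traced model (see the note below)
    runtimeOnly 'ai.djl.pytorch:pytorch-engine'
}

application {
    // Assumes HuggingFaceQaInference has a main method in the default package
    mainClass = 'HuggingFaceQaInference'
    applicationDefaultJvmArgs = ['-Dai.djl.default_engine=PyTorch']
}
```

Files placed under src/main/resources end up on the runtime classpath, so the vocabulary and the traced model can be loaded from there.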
The two resources are downloaded (as gzip archives that must be extracted) from:
trace_cased_bertqa.pt
: https://mlrepo.djl.ai/model/nlp/question_answer/ai/djl/pytorch/bertqa/trace_cased_bertqa/0.0.1/trace_cased_bertqa.pt.gz
bert-base-cased-vocab.txt
: https://mlrepo.djl.ai/model/nlp/question_answer/ai/djl/pytorch/bertqa/trace_cased_bertqa/0.0.1/bert-base-cased-vocab.txt.gz
See the main text for more info.
Note that, when running this code, the default engine may also need to be specified with the VM option -Dai.djl.default_engine=PyTorch,
since only the PyTorch engine is compatible with the traced model and the tokenizer.
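Assuming the project is run directly with java, the option is passed as a standard JVM system property before the main class; the classpath placeholder below is illustrative and must be replaced with your own:

```
# Replace <classpath> with your compiled classes plus the DJL jars
java -Dai.djl.default_engine=PyTorch -cp <classpath> HuggingFaceQaInference
```

If you launch through Gradle's application plugin instead, note that a -D flag on the gradlew command line sets a property on the Gradle JVM, not on the application's JVM; set it via applicationDefaultJvmArgs in build.gradle so it reaches the program itself.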
@hakanai
Your log shows that you are using the MXNet engine. You need to specify PyTorch as the default engine for the example to work. This is done with the VM option:
-Dai.djl.default_engine=PyTorch