qabot:
- fixed cosine similarity bug; values now correct
- lstm + cos sim + margin loss: loss ok, top-1 accuracy 4%~10%
- lstm + cos sim + pooling + margin loss: loss not ok, top-1 accuracy not ok
- pytorch: lstm + cos sim + pooling + margin loss: loss ok
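The setup in the items above can be sketched as follows. This is a minimal, hypothetical PyTorch version of the lstm + cos sim + pooling + margin pipeline; the class name, dimensions, and margin value are illustrative assumptions, not taken from the actual qabot code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QAEncoder(nn.Module):
    """Encode a token-id sequence with an LSTM, then max-pool over time."""
    def __init__(self, vocab_size=1000, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, ids):
        out, _ = self.lstm(self.embed(ids))  # (batch, seq, hidden)
        return out.max(dim=1).values         # max-pool over the time axis

def margin_loss(q, pos, neg, margin=0.2):
    """Hinge loss on cosine similarity: push sim(q, pos) above sim(q, neg) by `margin`."""
    sim_pos = F.cosine_similarity(q, pos)
    sim_neg = F.cosine_similarity(q, neg)
    return F.relu(margin - sim_pos + sim_neg).mean()

enc = QAEncoder()
q = enc(torch.randint(0, 1000, (8, 12)))    # question batch
pos = enc(torch.randint(0, 1000, (8, 12)))  # correct-answer batch
neg = enc(torch.randint(0, 1000, (8, 12)))  # wrong-answer batch
loss = margin_loss(q, pos, neg)
```

The hinge form makes the loss zero once every positive pair beats its negative pair by the margin, which is one common way to combine cosine similarity with a margin objective.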
kint:
- follow the numpy convention: int tensor + int tensor returns an int tensor, otherwise return float
- check input types and cast to float in GenTensorScalarFn when necessary - PR merged
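The promotion rule above can be sketched in numpy terms. `add_with_promotion` is a hypothetical helper for illustration only, not the actual GenTensorScalarFn code:

```python
import numpy as np

def add_with_promotion(a, b):
    """Sketch of the kint rule: int + int stays int; if either operand is
    floating point, cast the integer side to float before the op."""
    if np.issubdtype(a.dtype, np.integer) and np.issubdtype(b.dtype, np.integer):
        return a + b  # int + int -> int
    if np.issubdtype(a.dtype, np.integer):
        a = a.astype(np.float32)  # cast int operand to float
    if np.issubdtype(b.dtype, np.integer):
        b = b.astype(np.float32)
    return a + b  # any float operand -> float result

int_sum = add_with_promotion(np.array([1], np.int32), np.array([2], np.int32))
mixed_sum = add_with_promotion(np.array([1], np.int32), np.array([2.0], np.float32))
```

Checking the dtype up front and casting only the integer side mirrors how numpy itself promotes mixed int/float operands.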
c++: print stack trace on crash:
- tried https://github.com/bombela/backward-cpp
- pros: easy to integrate, pretty output; example: https://gist.github.com/dcslin/e3c36d546e7610a61b10ac7df4a6339b
- cons: need to check how it works with SWIG