--------------------------------------------------
DeepSpeed C++/CUDA extension op report
--------------------------------------------------
NOTE: Ops not installed will be just-in-time (JIT) compiled at
runtime if needed. Op compatibility means that your system
meets the required dependencies to JIT install the op.
--------------------------------------------------
JIT compiled ops require ninja
ninja .................. [OKAY]
--------------------------------------------------
op name ................ installed .. compatible
--------------------------------------------------
[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] async_io: please install the libaio-dev package with apt
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
async_io ............... [NO] ....... [NO]
cpu_adagrad ............ [NO] ....... [OKAY]
cpu_adam ............... [NO] ....... [OKAY]
fused_adam ............. [NO] ....... [OKAY]
fused_lamb ............. [NO] ....... [OKAY]
quantizer .............. [NO] ....... [OKAY]
random_ltd ............. [NO] ....... [OKAY]
[WARNING] please install triton==1.0.0 if you want to use sparse attention
sparse_attn ............ [NO] ....... [NO]
spatial_inference ...... [NO] ....... [OKAY]
transformer ............ [NO] ....... [OKAY]
stochastic_transformer . [NO] ....... [OKAY]
transformer_inference .. [NO] ....... [OKAY]
utils .................. [NO] ....... [OKAY]
--------------------------------------------------
DeepSpeed general environment info:
torch install path ............... ['/opt/conda/lib/python3.7/site-packages/torch']
torch version .................... 1.13.1+cu117
deepspeed install path ........... ['/opt/conda/lib/python3.7/site-packages/deepspeed']
deepspeed info ................... 0.8.0, unknown, unknown
torch cuda version ............... 11.7
torch hip version ................ None
nvcc version ..................... 11.0
deepspeed wheel compiled w. ...... torch 1.13, cuda 11.7
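
The two warnings above point to the missing pieces for the `async_io` and `sparse_attn` ops. A minimal sketch of the remediation, assuming a Debian/Ubuntu system with apt (the package and pin versions are taken verbatim from the warnings; adjust for your distro):

```shell
# Provide the libaio .so and headers so the async_io op can JIT-compile
apt-get install -y libaio-dev

# Install the triton version the sparse_attn op expects
pip install triton==1.0.0

# Re-run DeepSpeed's environment report to confirm both ops
# now show [OKAY] in the "compatible" column
ds_report
```

If libaio was built from source instead, point the compiler at it via `CFLAGS`/`LDFLAGS` as the warning suggests, rather than installing the apt package.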