@fs-eire
Last active October 11, 2023 15:26
How to build and consume WebGPU (2023-08-19)

How to build

Prerequisites

Build using the build_jsep.bat

Use the file js\build_jsep.bat; see its content for usage instructions.

C:\code\onnxruntime>js\build_jsep d st

If this is your first build, perform a clean build:

C:\code\onnxruntime>js\build_jsep d st clean

Build manually

  1. Use the following command line to build wasm:

    C:\code\onnxruntime>.\build.bat --config Release --build_wasm --enable_wasm_simd --use_jsep --target onnxruntime_webassembly --skip_tests

  2. Prepare js/web:

    C:\code\onnxruntime>cd js
    C:\code\onnxruntime\js>npm ci
    C:\code\onnxruntime\js>cd common
    C:\code\onnxruntime\js\common>npm ci
    C:\code\onnxruntime\js\common>cd ..\web
    C:\code\onnxruntime\js\web>npm ci
    C:\code\onnxruntime\js\web>npm run pull:wasm
    C:\code\onnxruntime\js\web>copy /Y ..\..\build\Windows\Release\ort-wasm-simd.js .\lib\wasm\binding\ort-wasm-simd.jsep.js
    C:\code\onnxruntime\js\web>copy /Y ..\..\build\Windows\Release\ort-wasm-simd.wasm .\dist\ort-wasm-simd.jsep.wasm

To run the npm tests:

C:\code\onnxruntime\js\web>npm test -- model <path-to-ort-model-test-folder> -b=webgpu --wasm-number-threads=1 --debug

To build web artifacts:

C:\code\onnxruntime\js\web>npm run build

The following files are built and needed for distribution:

  • ort.webgpu.min.js
  • ort.webgpu.min.js.map (optional, for debugging)
  • ort-wasm-simd.jsep.wasm
  • ort-wasm-simd-threaded.jsep.wasm

To run E2E tests, build an NPM package and consume it:

C:\code\onnxruntime\js\common>npm pack
C:\code\onnxruntime\js\web>npm pack

Two package files are needed for npm install. Install them together:

  • js/common/onnxruntime-common-*.tgz
  • js/web/onnxruntime-web-*.tgz
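For example, both tarballs can be installed in a single command so that npm resolves onnxruntime-common from the local file instead of the registry (a sketch; the actual file names include version numbers, represented here by a shell glob, and paths are relative to the onnxruntime repo root):

```shell
# Install both local tarballs together in the consuming project.
# Version numbers in the file names will vary with your checkout.
npm install ./js/common/onnxruntime-common-*.tgz ./js/web/onnxruntime-web-*.tgz
```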

How to consume

Make sure the files mentioned above are placed in the dist folder.

  • Consume from <script> tag:

    <script src="./dist/ort.webgpu.min.js"></script>
  • Consume from package:

    import {Tensor, InferenceSession} from "onnxruntime-web/webgpu";
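Either way, usage is the same once the library is loaded. A minimal sketch is shown below; the model path "./model.onnx", its input name "input", and the tensor shape are placeholders for your own model. When loading via the script tag, the same API is available on the global `ort` object instead of the import:

```javascript
// Sketch: run inference on the WebGPU backend.
// "./model.onnx" and the input name "input" are hypothetical.
import { Tensor, InferenceSession } from "onnxruntime-web/webgpu";

const session = await InferenceSession.create("./model.onnx", {
  executionProviders: ["webgpu"],
});

// Example input: a 1x3x224x224 float tensor (adjust to your model).
const input = new Tensor("float32", new Float32Array(1 * 3 * 224 * 224), [1, 3, 224, 224]);
const results = await session.run({ input });
console.log(results);
```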