Stream: git-wasmtime

Topic: wasmtime / issue #9506 wasi-nn: WinML failure in CI


Wasmtime GitHub notifications bot (Oct 23 2024 at 23:11):

abrown added the bug label to Issue #9506.

Wasmtime GitHub notifications bot (Oct 23 2024 at 23:11):

abrown opened issue #9506:

Test Case

@alexcrichton pointed out that the WinML backend of wasi-nn managed to segfault on CI.

Steps to Reproduce

It's unclear how we'll reproduce this.
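
A possible starting point, though it is not known to reproduce the crash reliably, is re-running the command the CI job invoked (visible in the log below) on a Windows machine where WinML is available:

    cargo test -p wasmtime-wasi-nn --features winml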

Versions and Environment

Wasmtime version or commit: main

Operating system: Windows (Microsoft Windows Server 2022)

Architecture: x64

Wasmtime GitHub notifications bot (Oct 23 2024 at 23:12):

abrown commented on issue #9506:

cc: @jianjunz

Wasmtime GitHub notifications bot (Oct 23 2024 at 23:20):

alexcrichton commented on issue #9506:

Snippets of the log (since it'll go away in a few months) are:

...
     Running tests\test-programs.rs (target\debug\deps\test_programs-d43fa5ca1c8e41fa.exe)
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> WinML learning device is available: Ok(LearningModelDevice(IUnknown(0x23ce2bc8460)))
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> WinML learning device is available: Ok(LearningModelDevice(IUnknown(0x23ce2bc8460)))

running 10 tests
test nn_wit_image_classification_onnx            ... ignored
test nn_wit_image_classification_pytorch         ... > WinML learning device is available: Ok(LearningModelDevice(IUnknown(0x23ce2bc87e0)))
ignored
test nn_witx_image_classification_onnx           ... ignored
test nn_witx_image_classification_pytorch        ... ignored
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> downloading: "curl" "--location" "https://github.com/onnx/models/raw/bec48b6a70e5e9042c0badbaafefe4454e072d08/validated/vision/classification/mobilenet/model/mobilenetv2-10.onnx?download=" "--output" "D:\\a\\wasmtime\\wasmtime\\target\\debug\\build\\wasmtime-wasi-nn-b2acdb39530e4cba\\out\\fixtures\\model.onnx"
> downloading: "curl" "--location" "https://github.com/intel/openvino-rs/raw/72d75601e9be394b3e8c7ff28313d66ef53ff358/crates/openvino/tests/fixtures/mobilenet/mobilenet.bin" "--output" "D:\\a\\wasmtime\\wasmtime\\target\\debug\\build\\wasmtime-wasi-nn-b2acdb39530e4cba\\out\\fixtures\\model.bin"
> downloading: "curl" "--location" "https://github.com/intel/openvino-rs/raw/72d75601e9be394b3e8c7ff28313d66ef53ff358/crates/openvino/tests/fixtures/mobilenet/mobilenet.xml" "--output" "D:\\a\\wasmtime\\wasmtime\\target\\debug\\build\\wasmtime-wasi-nn-b2acdb39530e4cba\\out\\fixtures\\model.xml"
> downloading: "curl" "--location" "https://github.com/intel/openvino-rs/raw/72d75601e9be394b3e8c7ff28313d66ef53ff358/crates/openvino/tests/fixtures/mobilenet/tensor-1x224x224x3-f32.bgr" "--output" "D:\\a\\wasmtime\\wasmtime\\target\\debug\\build\\wasmtime-wasi-nn-b2acdb39530e4cba\\out\\fixtures\\tensor.bgr"
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\model.bin
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\model.xml
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\tensor.bgr
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\model.bin
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\model.xml
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\tensor.bgr
[nn] created wasi-nn execution context with ID: GraphExecutionContext#0
[nn] set input tensor: 602112 bytes
[nn] executed graph inference in 12 ms
[nn] retrieved output tensor: 4004 bytes
found results, sorted top 5: [InferenceResult(886, 0.3958254), InferenceResult(905, 0.36464655), InferenceResult(85, 0.010480323), InferenceResult(912, 0.0082290955), InferenceResult(742, 0.007244849)]
test nn_witx_image_classification_openvino       ... ok
> found openvino version: 2024.4.0-16579-c3152d32c9c-releases/2024/4
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\model.bin
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\model.xml
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\tensor.bgr
[nn] created wasi-nn execution context with ID: GraphExecutionContext { handle: Resource { handle: 1 } }
[nn] set input tensor: 602112 bytes
[nn] executed graph inference in 11 ms
[nn] retrieved output tensor: 4004 bytes
found results, sorted top 5: [InferenceResult(886, 0.3958254), InferenceResult(905, 0.36464655), InferenceResult(85, 0.010480323), InferenceResult(912, 0.0082290955), InferenceResult(742, 0.007244849)]
test nn_wit_image_classification_openvino_named  ... ok
> WinML learning device is available: Ok(LearningModelDevice(IUnknown(0x23ce6d92150)))
> using cached artifact: D:\a\wasmtime\wasmtime\target\debug\build\wasmtime-wasi-nn-b2acdb39530e4cba\out\fixtures\model.onnx
[nn] created wasi-nn execution context with ID: GraphExecutionContext { handle: Resource { handle: 1 } }
[nn] set input tensor: 602112 bytes
[nn] executed graph inference in 11 ms
[nn] retrieved output tensor: 4004 bytes
found results, sorted top 5: [InferenceResult(886, 0.3958254), InferenceResult(905, 0.36464655), InferenceResult(85, 0.010480323), InferenceResult(912, 0.0082290955), InferenceResult(742, 0.007244849)]
test nn_wit_image_classification_openvino        ... ok
[nn] created wasi-nn execution context with ID: GraphExecutionContext { handle: Resource { handle: 1 } }
[nn] set input tensor: 602112 bytes
[nn] executed graph inference in 104 ms
[nn] retrieved output tensor: 4000 bytes
found results, sorted top 5: [InferenceResult(963, 14.325485), InferenceResult(923, 13.550032), InferenceResult(762, 13.373666), InferenceResult(118, 12.3188925), InferenceResult(926, 11.933905)]
test nn_wit_image_classification_winml_named     ... ok
[nn] created wasi-nn execution context with ID: GraphExecutionContext#0
[nn] set input tensor: 602112 bytes
[nn] created wasi-nn execution context with ID: GraphExecutionContext#0
[nn] set input tensor: 602112 bytes
[nn] executed graph inference in 18 ms
[nn] retrieved output tensor: 4004 bytes
found results, sorted top 5: [InferenceResult(886, 0.3958254), InferenceResult(905, 0.36464655), InferenceResult(85, 0.010480323), InferenceResult(912, 0.0082290955), InferenceResult(742, 0.007244849)]
test nn_witx_image_classification_openvino_named ... ok
[nn] executed graph inference in 24 ms
[nn] retrieved output tensor: 4000 bytes
found results, sorted top 5: [InferenceResult(963, 14.325485), InferenceResult(923, 13.550032), InferenceResult(762, 13.373666), InferenceResult(118, 12.3188925), InferenceResult(926, 11.933905)]
error: test failed, to rerun pass `-p wasmtime-wasi-nn --test test-programs`

Caused by:
  process didn't exit successfully: `D:\a\wasmtime\wasmtime\target\debug\deps\test_programs-d43fa5ca1c8e41fa.exe` (exit code: 0xc0000005, STATUS_ACCESS_VIOLATION)
D:\a\_temp\a500644d-a4c9-4428-a896-9463bc38ded7.sh: line 1:  1116 Segmentation fault      cargo test -p wasmtime-wasi-nn --features winml
...
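
If anyone wants to dig into this locally, running the suite single-threaded (--test-threads=1 is the standard libtest flag, not anything specific to this suite) should at least keep the per-test output from interleaving the way it does above:

    cargo test -p wasmtime-wasi-nn --features winml -- --test-threads=1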

