Stream: git-wasmtime

Topic: wasmtime / issue #11296 Repeatedly returning -1 from `mem...


view this post on Zulip Wasmtime GitHub notifications bot (Jul 23 2025 at 15:17):

alexcrichton edited issue #11296:

Describe the bug

Hello, I ran into the following performance anomaly while using wasmtime. The details are as follows:

[Image: table of execution times per runtime] https://github.com/user-attachments/assets/b2447f47-b354-431e-97b9-02562124977d

The data are in seconds; each value is the average of ten runs.

Test Case

test_case.zip

Steps to Reproduce

# wasm2wat or wat2wasm
wasm2wat -f test_case.wasm -o test_case.wat
wat2wasm test_case.wat -o test_case.wasm
# Execute the wasm file and collect data
perf stat -r 10 -e 'task-clock' /path/to/wasmer run test_case.wasm
perf stat -r 10 -e 'task-clock' /path/to/wasmtime test_case.wasm
perf stat -r 10 -e 'task-clock' /path/to/wasmedge --enable-jit test_case.wasm
perf stat -r 10 -e 'task-clock' /path/to/build_fast_jit/iwasm test_case.wasm
perf stat -r 10 -e 'task-clock' /path/to/build_llvm_jit/iwasm test_case.wasm

Expected and actual Results

test_case.wasm causes large execution-time differences between several runtimes, with wasmtime and wasmer being particularly pronounced: wasmer's execution time is about 5x-6x, and wasmtime's about 7x-9x.
I analyzed test_case.wat and found that when I deleted code like the following at lines 50-52, the execution time is as shown for modified.wasm, and at that point the results of each runtime are relatively normal. From this I suspect the following instructions may be causing the performance anomaly in both runtimes.

          (drop
            (memory.grow
              (i32.const 1)))

Versions and Environment

All runtimes are release builds running in JIT mode.
Wasmer uses the Cranelift backend; its LLVM backend's execution time on this test case is essentially the same as Cranelift's.

Extra Info

I also submitted an issue about the phenomenon to wasmer.
If you need any other relevant information, please let me know and I will do my best to provide it. Looking forward to your reply! Thank you!

view this post on Zulip Wasmtime GitHub notifications bot (Jul 23 2025 at 15:17):

alexcrichton added the performance label to Issue #11296.

view this post on Zulip Wasmtime GitHub notifications bot (Jul 23 2025 at 15:18):

alexcrichton commented on issue #11296:

Thanks for digging in further, I've updated the issue title to reflect these findings.

view this post on Zulip Wasmtime GitHub notifications bot (Jul 24 2025 at 01:57):

gaaraw commented on issue #11296:

> please do not post information in bugs as screenshots (your initial table and your later terminal screenshot).

Thank you so much for the reminder! It will help me file better reports, and I will keep it in mind!

view this post on Zulip Wasmtime GitHub notifications bot (Jul 24 2025 at 18:25):

primoly commented on issue #11296:

I’ve got two simple tests. Both call memory.grow on an ungrowable memory 2^32 times.
The first grows by zero pages, so it always succeeds; the second grows by one page, so it always fails.

grow0.wat

(module
  (func (export "test")
    (local i32)
    loop
      i32.const 0
      memory.grow
      drop
      local.get 0
      i32.const 1
      i32.add
      local.tee 0
      br_if 0
    end
  )
  (memory 0 0)
)

grow1.wat

(module
  (func (export "test")
    (local i32)
    loop
      i32.const 1
      memory.grow
      drop
      local.get 0
      i32.const 1
      i32.add
      local.tee 0
      br_if 0
    end
  )
  (memory 0 0)
)

I get strangely diverging performance by different engines:

Engine     Test   Time
Wasmtime   grow0  27 sec
Firefox    grow0  3 min
Safari     grow0  33 sec
Chrome     grow0  8 min
Wasmtime   grow1  38 sec
Firefox    grow1  38 sec
Safari     grow1  12 sec
Chrome     grow1  39 sec
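
For the wasmtime numbers, a minimal embedding harness like the following can be used, sketched under the assumption of a recent wasmtime crate with its default wat feature enabled; the file names refer to the grow0.wat / grow1.wat modules above.

use std::time::Instant;
use wasmtime::{Engine, Instance, Module, Store};

fn main() -> wasmtime::Result<()> {
    // Pass either grow0.wat or grow1.wat on the command line.
    let path = std::env::args()
        .nth(1)
        .expect("usage: harness <grow0.wat|grow1.wat>");
    let engine = Engine::default();
    // Module::from_file accepts .wat text as well as .wasm binaries.
    let module = Module::from_file(&engine, &path)?;
    let mut store = Store::new(&engine, ());
    let instance = Instance::new(&mut store, &module, &[])?;
    let test = instance.get_typed_func::<(), ()>(&mut store, "test")?;

    // Time the exported function, which calls memory.grow 2^32 times.
    let start = Instant::now();
    test.call(&mut store, ())?;
    println!("{path}: {:?}", start.elapsed());
    Ok(())
}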
