Stream: git-wasmtime

Topic: wasmtime / issue #10802 Panic during component instantiation


view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 08:52):

moldhouse opened issue #10802:

Test Case

Work in progress. Currently we only witness this in our production code base; we are working on a minimal example. Still, we figured it would be valuable to share the backtrace with you up front.

Steps to Reproduce

See above.

Expected Results

Instantiating the component without a panic.

Actual Results

We encounter a panic.

thread 'tokio-runtime-worker' panicked at /root/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/rustix-1.0.5/src/backend/linux_raw/param/auxv.rs:302:68:
called `Option::unwrap()` on a `None` value
stack backtrace:
   0: __rustc::rust_begin_unwind
   1: core::panicking::panic_fmt
   2: core::panicking::panic
   3: core::option::unwrap_failed
   4: rustix::backend::param::auxv::init_auxv_impl
   5: rustix::backend::param::auxv::init_auxv
   6: <wasmtime::runtime::vm::instance::allocator::on_demand::OnDemandInstanceAllocator as wasmtime::runtime::vm::instance::allocator::InstanceAllocatorImpl>::allocate_fiber_stack
   7: wasmtime::runtime::component::instance::InstancePre<T>::instantiate_async::{{closure}}
   8: pharia_kernel::skills::v0_3::skill::SkillPre<_T>::instantiate_async::{{closure}}
   9: <pharia_kernel::skills::v0_3::skill::SkillPre<engine_room::LinkerImpl<alloc::boxed::Box<dyn pharia_kernel::csi::CsiForSkills+core::marker::Send>>> as pharia_kernel::skills::Skill>::run_as_function::{{closure}}
  10: pharia_kernel::skill_runtime::SkillRuntimeActor<C,S>::run::{{closure}}::{{closure}}
  11: <futures_util::stream::stream::select_next_some::SelectNextSome<St> as core::future::future::Future>::poll
  12: tokio::runtime::task::core::Core<T,S>::poll
  13: tokio::runtime::task::raw::poll
  14: tokio::runtime::scheduler::multi_thread::worker::Context::run_task
  15: tokio::runtime::task::raw::poll

Versions and Environment

We see the panic with wasmtime 32, but not with 31. We do not see the problem on all platforms.

We saw it on:

We did not see it on:

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 08:52):

moldhouse added the bug label to Issue #10802.

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 08:54):

moldhouse commented on issue #10802:

@pacman82 @markus-klein-aa

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 10:00):

bjorn3 commented on issue #10802:

The error happens at https://github.com/bytecodealliance/rustix/blob/cb01fbe4660844b67fdd4eee2a5f769518f6a655/src/backend/linux_raw/param/auxv.rs#L302, which indicates that one of the auxv entries for the process may be incorrect. By the way, are you running an arm64 version of Wasmtime or an x86_64 version on your Mac?
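
A minimal diagnostic sketch along those lines (an illustration, assuming the libc crate on a Linux target; the chosen keys are examples, not an exhaustive list): spot-check the auxv entries handed to the process with getauxval, which returns 0 for keys that are absent, so a zero AT_PAGESZ in the failing container would support this theory.

fn main() {
    let entries = [
        ("AT_PAGESZ", libc::AT_PAGESZ),
        ("AT_HWCAP", libc::AT_HWCAP),
        ("AT_CLKTCK", libc::AT_CLKTCK),
    ];
    for (name, key) in entries {
        // getauxval returns 0 when the requested entry is not present.
        let value = unsafe { libc::getauxval(key) };
        println!("{name} = {value}");
    }
}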

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 12:16):

markus-klein-aa commented on issue #10802:

On CI we build for x86_64. Locally, on our macOS developer machines, we build with:

podman build . --tag pharia-kernel --platform linux/arm64

I am walking back my claim that we witness the same error on CI; we still need to validate that. It might be a different issue.

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 12:18):

markus-klein-aa commented on issue #10802:

Verified: it also fails locally on our dev machines if we build for x86_64.

Yet it does not fail if we build natively, without a container around it.

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 12:53):

alexcrichton commented on issue #10802:

I'm not sure why this is panicking, as I'm not familiar with auxv or how rustix is calculating the host page size, but I've submitted https://github.com/bytecodealliance/wasmtime/pull/10803 to remove calls to this function, which will somewhat indirectly "fix" this insofar as Wasmtime won't panic at that location any more.

@markus-klein-aa if you're able to reduce this, I believe the rustix project would be thankful to have an issue about this panic on their issue tracker.
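
For context, a hedged sketch (not necessarily what PR #10803 does): the classic libc route to the host page size goes through sysconf rather than auxv parsing, which sidesteps the code path that panicked here.

fn main() {
    // Illustrative alternative: ask libc for the page size via sysconf
    // instead of the auxv-backed lookup that produced the panic above.
    let page = unsafe { libc::sysconf(libc::_SC_PAGESIZE) };
    println!("page size via sysconf: {page}");
}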

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 12:57):

markus-klein-aa commented on issue #10802:

Yeah, absolutely, we want a minimal example. Yet this takes some effort, and we thought there might be value in sharing the stack trace up front.

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 13:14):

alexcrichton commented on issue #10802:

FWIW, the reproduction will likely be just invoking this function, and that's pretty much it. The main thing to reproduce is your environment, which triggers this panic.
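
A reproduction along those lines might be as small as the following sketch (assuming, per the backtrace, that the panicking call is rustix's auxv-backed page-size query; rustix is pulled in with its param feature enabled):

fn main() {
    // In an affected environment (e.g. the x86_64 container image run
    // under emulation) this is expected to hit the same unwrap() panic
    // inside rustix's init_auxv; elsewhere it just prints the page size.
    println!("page size: {}", rustix::param::page_size());
}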

view this post on Zulip Wasmtime GitHub notifications bot (May 19 2025 at 13:32):

markus-klein-aa commented on issue #10802:

Hey, thanks for the hint. I'll give it a try!

view this post on Zulip Wasmtime GitHub notifications bot (May 21 2025 at 12:10):

markus-klein-aa commented on issue #10802:

@alexcrichton You were correct that we only need to call the function to reproduce it. Opened an issue in rustix. Thanks again.

view this post on Zulip Wasmtime GitHub notifications bot (May 22 2025 at 00:11):

alexcrichton closed issue #10802.

