dbezhetskov opened issue #3999:
Test Case
https://user-images.githubusercontent.com/5621716/161971426-a7599e7d-886f-4101-b379-8f5c20a38a8e.mov
(I couldn't upload a .wasm file, so I changed the extension to .mov; it is actually a .wasm file.)
Steps to Reproduce
RUST_BACKTRACE=1 gdb --args wasmtime run -g preinitialized.wasm
Expected Results
debug break
Actual Results
crash:
thread 'main' panicked at 'assertion failed: `(left < right)`
  left: `9699`,
 right: `7415`', crates/cranelift/src/debug/transform/expression.rs:690:13
stack backtrace:
0: rust_begin_unwind
at /rustc/9d1b2106e23b1abd32fce1f17267604a5102f57a/library/std/src/panicking.rs:498:5
1: core::panicking::panic_fmt
at /rustc/9d1b2106e23b1abd32fce1f17267604a5102f57a/library/core/src/panicking.rs:116:14
2: wasmtime_cranelift::debug::transform::expression::ValueLabelRangesBuilder::process_label
3: wasmtime_cranelift::debug::transform::expression::CompiledExpression::build_with_locals
4: wasmtime_cranelift::debug::transform::simulate::generate_simulated_dwarf
5: wasmtime_cranelift::debug::transform::transform_dwarf
6: wasmtime_cranelift::debug::write_debuginfo::emit_dwarf
7: <wasmtime_cranelift::compiler::Compiler as wasmtime_environ::compilation::Compiler>::emit_obj
8: core::ops::function::impls::<impl core::ops::function::FnMut<A> for &F>::call_mut
9: <core::iter::adapters::map::Map<I,F> as core::iter::traits::iterator::Iterator>::try_fold
10: <rayon::iter::fold::FoldFolder<C,ID,F> as rayon::iter::plumbing::Folder<T>>::consume_iter
11: rayon::iter::plumbing::bridge_producer_consumer::helper
12: <rayon::vec::IntoIter<T> as rayon::iter::IndexedParallelIterator>::with_producer
13: <rayon::iter::while_some::WhileSome<I> as rayon::iter::ParallelIterator>::drive_unindexed
14: rayon::iter::collect::<impl rayon::iter::ParallelExtend<T> for alloc::vec::Vec<T>>::par_extend
15: rayon::result::<impl rayon::iter::FromParallelIterator<core::result::Result<T,E>> for core::result::Result<C,E>>::from_par_iter
16: wasmtime::module::Module::build_artifacts
17: core::ops::function::FnOnce::call_once
18: wasmtime_cache::ModuleCacheEntry::get_data_raw
19: wasmtime::module::Module::from_binary
20: wasmtime::module::Module::from_file
21: wasmtime_cli::commands::run::RunCommand::load_module
22: wasmtime_cli::commands::run::RunCommand::load_main_module
23: wasmtime_cli::commands::run::RunCommand::execute
24: wasmtime::main
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
Versions and Environment
Wasmtime version or commit: wasmtime 0.35.2
Operating system: Ubuntu 20.04
Architecture: x86_64
dbezhetskov labeled issue #3999.
bjorn3 commented on issue #3999:
How was the wasm file produced?
dbezhetskov commented on issue #3999:
It is a compiled C++ program and it was preinitialized with Wizer (https://github.com/bytecodealliance/wizer).
Btw, without `-g`, wasmtime works as expected with the .wasm module.
dbezhetskov edited a comment on issue #3999:
@bjorn3
It is a compiled C++ program and it was preinitialized with Wizer (https://github.com/bytecodealliance/wizer).
Btw, without `-g`, wasmtime works as expected with the .wasm module.
bjorn3 commented on issue #3999:
It may be that wasmtime mishandles debug info that is actually correct, or it may be that Wizer causes the debug info to get corrupted.
abrown commented on issue #3999:
cc: @fitzgen?
SuperTails commented on issue #3999:
I compiled a C program _without_ using Wizer and encountered the same crash. Here is a zip archive containing the WASM file that causes the crash:
wasmtime_crash_testcase.zip
SuperTails edited a comment on issue #3999:
I compiled a C program _without_ using Wizer and encountered the same crash. Here is a zip archive containing the WASM file that causes the crash:
wasmtime_crash_testcase.zip
I was using the `wasmtime` crate rather than the CLI as well.
SuperTails edited a comment on issue #3999:
I compiled a C program _without_ using Wizer and encountered the same crash. Here is a zip archive containing the WASM file that causes the crash:
wasmtime_crash_testcase.zip
I am using the `wasmtime` crate directly. It only panics if `debug_info(true)` is set when creating the `Engine`.
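A minimal reproduction sketch of that setup, assuming only the `wasmtime` and `anyhow` crates (the module file name is a placeholder for the attached test case):

```rust
use wasmtime::{Config, Engine, Module};

fn main() -> anyhow::Result<()> {
    let mut config = Config::new();
    // Per the report above, the panic only occurs with debug info enabled.
    config.debug_info(true);
    let engine = Engine::new(&config)?;
    // Merely compiling the module is enough to reach the DWARF transform;
    // no instantiation or execution is required.
    let _module = Module::from_file(&engine, "wasmtime_crash_testcase.wasm")?;
    Ok(())
}
```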
alexcrichton pinned issue #3999.
dbanks12 commented on issue #3999:
I am running into this as well! It looks like if I revert to `wasmtime 1.0.2` or earlier I do not see this assertion failure; `wasmtime 2.0.0` and later result in the assertion failure.
I see the progress in https://github.com/bytecodealliance/wasmtime/pull/5553 and am watching closely! I am not a wasmtime power user, but if I can help make progress here I am happy to.
dbanks12 edited a comment on issue #3999:
I am running into this as well! It looks like if I revert to `wasmtime 1.0.2` or earlier I do not see this assertion failure; `wasmtime 2.0.0` and later result in the assertion failure in my case.
I see the progress in https://github.com/bytecodealliance/wasmtime/pull/5553 and am watching closely! I am not a wasmtime power user, but if I can help make progress here I am happy to.
jameysharp commented on issue #3999:
> It looks like if I revert to `wasmtime 1.0.2` or earlier I do not see this assertion failure; `wasmtime 2.0.0` and later result in the assertion failure in my case.

I'm surprised to hear that. Since Wasmtime 1.0 was released months after the first time this issue was reported, I'd have expected you'd encounter the same issue in that version.
adv-sw commented on issue #3999:
I think this is the same issue I've just encountered.
This seems to be the cause of the regression:
Revision: ce67e7fcd1d2d6da1899d2b46cc41ef877bd9462
Author: Amanieu d'Antras <amanieu@gmail.com>
Date: 02/12/2021 11:53:04
Message: Fix ownership in *_vec_new functions in the C API
These functions are specified to take ownership of the objects in the given slice, not clone them.
Modified: crates/c-api/src/vec.rs
The following resolves the breakpoint regression I've encountered. Perhaps for you too.
cfallin commented on issue #3999:
@adv-sw the bug described here can occur even without using the C API, so it doesn't make sense that a change in the `c-api` crate would have caused it. Can you say more about why you think this is the case?
adv-sw commented on issue #3999:
You're perhaps right, Chris, that another debugger regression issue I've identified is separate. This issue presents around wasmtime 1.0.2, whereas the one I found presents around wasmtime 0.31.0 / 0.32.0.
Apologies, an assumption too far. Separate issues.
dbanks12 commented on issue #3999:
> I'm surprised to hear that. Since Wasmtime 1.0 was released months after the first time this issue was reported, I'd have expected you'd encounter the same issue in that version.

I was surprised to see that as well... I can try to provide more context shortly.
adv-sw deleted a comment on issue #3999.
adv-sw deleted a comment on issue #3999.
abrown unpinned issue #3999.
SingleAccretion commented on issue #3999:
I have run into this issue (along with another, much simpler one) and investigated the cause a bit.
This is not a bug in the DWARF-related code, since the instruction offset data is fed to it by the code generator. The code generator, in turn, obtains this data when emitting instructions in a linear walk. It turns out that in this walk, prior recorded offsets can become invalidated by branch shortening (`optimize_branches`), leading to invalid `start > end` ranges:
```
MachBuffer: use_label_at_offset: offset 95 label MachLabel(3) kind JmpRel32
emitting block Block(3)
MachBuffer: bind label MachLabel(3) at offset 99
enter optimize_branches:
 b = [MachBranch { start: 78, end: 84, target: MachLabel(2), fixup: 1, inverted: Some([15, 132, 0, 0, 0, 0]), labels_at_this_branch: [] }, MachBranch { start: 89, end: 94, target: MachLabel(3), fixup: 2, inverted: None, labels_at_this_branch: [] }, MachBranch { start: 94, end: 99, target: MachLabel(3), fixup: 3, inverted: None, labels_at_this_branch: [MachLabel(2)] }]
 l = [MachLabel(3)]
 f = [MachLabelFixup { label: MachLabel(4), offset: 16, kind: JmpRel32 }, MachLabelFixup { label: MachLabel(2), offset: 80, kind: JmpRel32 }, MachLabelFixup { label: MachLabel(3), offset: 90, kind: JmpRel32 }, MachLabelFixup { label: MachLabel(3), offset: 95, kind: JmpRel32 }]
optimize_branches: last branch MachBranch { start: 94, end: 99, target: MachLabel(3), fixup: 3, inverted: None, labels_at_this_branch: [MachLabel(2)] } at off 99
branch with target == cur off; truncating
truncate_last_branch: truncated MachBranch { start: 94, end: 99, target: MachLabel(3), fixup: 3, inverted: None, labels_at_this_branch: [MachLabel(2)] }; off now 94
optimize_branches: last branch MachBranch { start: 89, end: 94, target: MachLabel(3), fixup: 2, inverted: None, labels_at_this_branch: [] } at off 94
branch with target == cur off; truncating
truncate_last_branch: truncated MachBranch { start: 89, end: 94, target: MachLabel(3), fixup: 2, inverted: None, labels_at_this_branch: [] }; off now 89
optimize_branches: last branch MachBranch { start: 78, end: 84, target: MachLabel(2), fixup: 1, inverted: Some([15, 132, 0, 0, 0, 0]), labels_at_this_branch: [] } at off 89
purge_latest_branches: removing branch MachBranch { start: 78, end: 84, target: MachLabel(2), fixup: 1, inverted: Some([15, 132, 0, 0, 0, 0]), labels_at_this_branch: [] }
leave optimize_branches:
 b = []
 l = [MachLabel(3), MachLabel(2)]
 f = [MachLabelFixup { label: MachLabel(4), offset: 16, kind: JmpRel32 }, MachLabelFixup { label: MachLabel(2), offset: 80, kind: JmpRel32 }]

Recording debug range for VL8 in Reg(p2i): [i13..i14) [95..90) ; Invalid range

; DI: i12 at 89
  jmp     label3
block2:
; DI: i13 at 94
  jmp     label3
block3:
; DI: i14 at 89
  movl    8(%r9,%r10,1), %ecx
```
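A toy model of the failure mode just described, with illustrative offsets and sizes (this is not MachBuffer's real API; every name here is made up for the sketch):

```rust
fn main() {
    // Buffer position just before two fallthrough `jmp label3` instructions
    // (numbers loosely follow the log above).
    let mut cur_offset: u32 = 89;

    // i12 and i13: two 5-byte jumps are emitted during the linear walk.
    cur_offset += 5; // i12 emitted at 89
    cur_offset += 5; // i13 emitted at 94

    // A debug range for a value label is opened at the current offset.
    let range_start = cur_offset; // recorded as 99

    // label3 is then bound exactly here, so both jumps are branches to the
    // next instruction; branch chomping deletes them and rewinds the buffer,
    // but the offset recorded above is never revisited.
    cur_offset -= 10; // back to 89

    // i14 is emitted at the rewound offset and the range is closed there.
    let range_end = cur_offset; // recorded as 89

    println!("debug range [{range_start}..{range_end})"); // [99..89): inverted

    // The DWARF transform later asserts `start < end` on such ranges; this
    // assertion fails, mirroring the panic in the reported backtrace.
    assert!(range_start < range_end, "invalid range");
}
```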
cfallin commented on issue #3999:
@SingleAccretion that seems like a plausible explanation -- it's entirely possible that we missed a debug-info update when chomping branches. Would you be willing to make an attempt at fixing this? We record instruction offsets here for debug purposes, and then those are cross-correlated with `debug_value_labels` (which contains instruction-index ranges). It's possible that we just need to do a post-pass or in-place update to ensure monotonicity in this sequence (i.e., clamp `inst_offsets[i]` to be less than or equal to `inst_offsets[i + 1]`)...
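A minimal sketch of the post-pass idea, assuming a flat list of per-instruction offsets (only the name `inst_offsets` comes from the comment above; the helper itself is hypothetical, not Wasmtime's actual code):

```rust
/// Clamp each recorded offset so `inst_offsets[i] <= inst_offsets[i + 1]`,
/// repairing entries that branch chomping left pointing past the final code.
fn clamp_inst_offsets(inst_offsets: &mut [u32]) {
    for i in (0..inst_offsets.len().saturating_sub(1)).rev() {
        let next = inst_offsets[i + 1];
        if inst_offsets[i] > next {
            inst_offsets[i] = next;
        }
    }
}

fn main() {
    // Offsets as emitted, where 95 was recorded before two jumps got chomped.
    let mut offsets = vec![78, 89, 95, 90, 99];
    clamp_inst_offsets(&mut offsets);
    assert_eq!(offsets, vec![78, 89, 90, 90, 99]);
}
```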
cfallin commented on issue #3999:
The disassembly bit is intentional: it's meant to be a dump of the VCode, which stays in N-target branch form, rather than an exact correspondence to the machine code. VCode "pseudo-instructions" are similarly slightly different. One can think of the MachBuffer branch chomping (and branch-target editing: target labels will be updated and conditional polarities will be flipped sometimes) as another layer of lowering.
This is also why @elliottt added a Capstone-based disassembly check to the filetests a while back (and why `clif-util` has `-D`, which disassembles using Capstone): both are useful, for slightly different purposes.
SingleAccretion commented on issue #3999:
@cfallin thank you for the quick response! Yes, I am looking at this right now. The branching logic (if I am reading it correctly) only ever edits the instruction stream to entirely remove the last branch instruction, so it looks possible to do in-place updating.
Side note: now that I consider this, the disassembly is also incorrect because of this after-the-fact removal. That `jmp label3` will not exist in the actual emitted code.
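A sketch of what that in-place alternative might look like, under the assumption that the recorded debug offsets are reachable at truncation time (the hook name and signature are hypothetical, not MachBuffer's actual fields):

```rust
/// Hypothetical hook run right after a `truncate_last_branch`-style edit
/// shortens the buffer to `new_end`: rewind any recorded offsets that now
/// point past the new end, so no inverted ranges reach the DWARF transform.
fn rewind_offsets_after_truncate(inst_offsets: &mut [u32], new_end: u32) {
    for off in inst_offsets.iter_mut().rev() {
        if *off <= new_end {
            // Offsets were non-decreasing before this truncation, so
            // everything earlier is already within bounds.
            break;
        }
        *off = new_end;
    }
}

fn main() {
    let mut offsets = vec![78, 89, 94, 95];
    rewind_offsets_after_truncate(&mut offsets, 89);
    assert_eq!(offsets, vec![78, 89, 89, 89]);
}
```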
cfallin closed issue #3999.
Last updated: Nov 22 2024 at 16:03 UTC