I am working on a little scripting interface that I want the users of my program to have.
I am using AssemblyScript, which is compiled to wasm, and I am defining host functions in Rust using wasmtime.
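For context, the host side is set up roughly like this. This is only a minimal sketch assuming wasmtime's Engine/Module/Linker APIs; the wasm path and the WasiCtx store data are placeholders, and the real code also registers the WASI imports and the "env" host functions shown below.

use anyhow::Result;
use wasmtime::{Engine, Linker, Module};
use wasmtime_wasi::WasiCtx;

fn setup() -> Result<(Engine, Module, Linker<WasiCtx>)> {
    let engine = Engine::default();
    // Placeholder path to the compiled AssemblyScript module.
    let module = Module::from_file(&engine, "scripts/index.wasm")?;

    // The WASI imports and the "env" host functions shown below get registered
    // on this linker (omitted here; the exact wasmtime_wasi call to add WASI
    // depends on the wasmtime version).
    let linker: Linker<WasiCtx> = Linker::new(&engine);

    Ok((engine, module, linker))
}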
To test it out, I am defining a Vector2 class like this in AssemblyScript:
export declare class Vector2 {
  public x: number;
  public y: number;
  constructor(x: number, y: number);
  public ToString(): string;
}
And this is how I define the ToString function in the host:
linker.func_wrap("env", "Vector2#ToString", |mut caller: Caller<'_, WasiCtx>, ptr: i32| {
let memory = caller.get_export("memory").unwrap().into_memory().unwrap();
let mut x_bytes = [0u8; 8];
let mut y_bytes = [0u8; 8];
memory
.read(caller.as_context(), ptr as _, &mut x_bytes)
.unwrap();
memory
.read(caller.as_context(), (ptr + 8) as _, &mut y_bytes)
.unwrap();
let x = f64::from_le_bytes(x_bytes);
let y = f64::from_le_bytes(y_bytes);
let result_str = format!("[{}, {}]", x, y);
let len = result_str.as_bytes().len() as i32;
println!("Result str {} len {}", result_str, len);
len
}).unwrap();
Everything is going well until I try to console.log the value like this:
import { Vector2 } from "./env";

function main(): void {
  let newVec = new Vector2(1, 2);
  let val = newVec.ToString();
  console.log(val);
}

main();
Which gives an error like this:
called `Result::unwrap()` on an `Err` value: error while executing at wasm backtrace:
    0: 0x186 - <unknown>!~lib/rt/common/OBJECT#get:rtSize
    1: 0x191 - <unknown>!~lib/string/String#get:length
    2: 0xc42 - <unknown>!~lib/wasi_process/writeString
    3: 0xda7 - <unknown>!~lib/wasi_process/WritableStream#write<~lib/string/String>
    4: 0xdb6 - <unknown>!~lib/wasi_console/wasi_console.log
    5: 0xde4 - <unknown>!scripts/index/main
    6: 0xe00 - <unknown>!start:scripts/index
    7: 0xe92 - <unknown>!~start

Caused by:
    0: memory fault at wasm address 0x100000002 in linear memory of size 0x10000
    1: wasm trap: out of bounds memory access
I am sure this has to do with how I am accessing the memory, but what is weird is that if I don't console.log the value, everything works.
I would appreciate any insights on this issue!
It looks like the error here is happening before your host function is invoked, so this may be a bug in the AssemblyScript runtime or compiler. Removing console.log or changing the source can do all sorts of things to compiled output, so if it's a compiler bug then that makes sense as a possible trigger and/or cover-up.
Hey @Alex Crichton, thanks for your reply. The host function is actually invoked, because I can see the println statement printing something right before the error:
Result str [1, 2] len 6
thread 'main' panicked at pixa/src/scripting/mod.rs:171:14:
called `Result::unwrap()` on an `Err` value: error while executing at wasm backtrace:
    0: 0x186 - <unknown>!~lib/rt/common/OBJECT#get:rtSize
    1: 0x191 - <unknown>!~lib/string/String#get:length
    2: 0xc42 - <unknown>!~lib/wasi_process/writeString
    3: 0xda7 - <unknown>!~lib/wasi_process/WritableStream#write<~lib/string/String>
    4: 0xdb6 - <unknown>!~lib/wasi_console/wasi_console.log
    5: 0xde4 - <unknown>!scripts/index/main
    6: 0xe00 - <unknown>!start:scripts/index
    7: 0xe92 - <unknown>!~start

Caused by:
    0: memory fault at wasm address 0x100000002 in linear memory of size 0x10000
    1: wasm trap: out of bounds memory access
It looks like something goes wrong just when it tries to get the length of the console.log parameter. Also worth adding: logging a normal string works, and so does accessing each of the vector components:
let newVec = new Vector2(1, 2);
// Works fine, prints 1.0
console.log(newVec.x.toString());
// Works fine, prints 2.0
console.log(newVec.y.toString());
// Works as well
console.log("Hello world");
// Gives memory trap error
console.log(newVec.ToString());
Maybe there is something wrong with the host function definition?
Also worth adding is the constructor host function, which I defined like this:
self.linker
    .func_wrap(
        "env",
        "Vector2#constructor",
        |mut caller: Caller<'_, ScriptStore>, _ptr: i32, x: f64, y: f64| {
            let memory = caller.get_export("memory").unwrap().into_memory().unwrap();
            let offset = allocate_memory(16); // 2*8
            let mut store_context = caller.as_context_mut();
            memory
                .write(&mut store_context, offset, &x.to_le_bytes())
                .expect("Failed to write x to memory");
            memory
                .write(&mut store_context, offset + 8, &y.to_le_bytes())
                .expect("Failed to write y to memory");
            offset as i32
        },
    )
    .unwrap();
allocate_memory is a function to keep track of the memory offset:
use lazy_static::lazy_static;
use std::sync::Mutex;

lazy_static! {
    static ref MEMORY_OFFSET: Mutex<usize> = Mutex::new(0);
}

// Naive bump allocator: returns the current offset and advances it by `size`.
fn allocate_memory(size: usize) -> usize {
    let mut offset = MEMORY_OFFSET.lock().unwrap();
    let current_offset = *offset;
    *offset += size;
    current_offset
}
I finally fixed it. For anybody else wondering, I stumbled across this issue: https://github.com/AssemblyScript/assemblyscript/issues/2099
Basically, I was not running the exported _start function, which, as I understand it, initializes all the memory.
So this issue is indeed not related to wasmtime, only to AssemblyScript specifics.
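For reference, this is roughly what the fix looks like on the host side. It is a minimal sketch, not my exact code: it assumes a Linker that already has WASI and the "env" host functions registered, and a WasiCtx built elsewhere; all names are placeholders.

use anyhow::Result;
use wasmtime::{Engine, Linker, Module, Store};
use wasmtime_wasi::WasiCtx;

fn instantiate_and_start(
    engine: &Engine,
    module: &Module,
    linker: &Linker<WasiCtx>,
    wasi: WasiCtx,
) -> Result<()> {
    let mut store = Store::new(engine, wasi);
    let instance = linker.instantiate(&mut store, module)?;

    // The missing piece: call the module's exported `_start` so the
    // AssemblyScript/WASI runtime initializes its memory before any managed
    // objects (like the string handled by console.log) are touched. In the
    // script above, main() is called at the top level, so it runs as part
    // of `_start`.
    let start = instance.get_typed_func::<(), ()>(&mut store, "_start")?;
    start.call(&mut store, ())?;

    Ok(())
}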