Hi, new to compiler engineering. I just cloned the wasmtime repo and followed the instructions in the README for `islec`:

```
cargo build --release
```

but ran into the following build issues. Am I missing something?

```
Compiling libc v0.2.153
Compiling anstyle-query v1.0.0
Compiling rustix v0.38.31
Compiling memchr v2.5.0
Compiling proc-macro2 v1.0.81
Compiling log v0.4.17
Compiling anstyle-parse v0.2.1
Compiling clap_lex v0.5.0
Compiling strsim v0.10.0
Compiling heck v0.4.0
error: failed to run custom build command for `libc v0.2.153`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/libc-c70185b4d15e25fb/build-script-build` (signal: 9, SIGKILL: kill)
warning: build failed, waiting for other jobs to finish...
error: failed to run custom build command for `rustix v0.38.31`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/rustix-8f0bd8469e02e97b/build-script-build` (signal: 9, SIGKILL: kill)
error: failed to run custom build command for `proc-macro2 v1.0.81`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/proc-macro2-e25cbce0be9ed81c/build-script-build` (signal: 9, SIGKILL: kill)
error: failed to run custom build command for `memchr v2.5.0`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/memchr-f9a14b593813b5ba/build-script-build` (signal: 9, SIGKILL: kill)
error: failed to run custom build command for `log v0.4.17`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/log-9127a2af714571ef/build-script-build` (signal: 9, SIGKILL: kill)
```
That looks like you may be running out of memory. You can try running with less parallelism, which may reduce the amount of memory needed to build, by using `cargo build --release -jN`, where `N` is the amount of parallelism you want, e.g. `-j4` for using 4 cores. IIRC the M1 has 8 cores, so it defaults to `-j8`.
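If capping `-j` on the command line helps, the same limit can be made permanent in a Cargo config file, so it applies to every build (a sketch; `build.jobs` is a standard Cargo setting, and `~/.cargo/config.toml` is the usual per-user location):

```toml
# ~/.cargo/config.toml — applies to all cargo invocations for this user.
[build]
jobs = 4   # cap parallel compilation jobs to reduce peak memory use
```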
I actually have an M1 Max with 10 cores, but I’ll try the above and report back.
So I tried this, but I don't think that's the issue:

```
cargo build --release -j4
Compiling libc v0.2.153
Compiling rustix v0.38.31
Compiling proc-macro2 v1.0.81
Compiling memchr v2.5.0
error: failed to run custom build command for `libc v0.2.153`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/libc-c70185b4d15e25fb/build-script-build` (signal: 9, SIGKILL: kill)
warning: build failed, waiting for other jobs to finish...
error: failed to run custom build command for `rustix v0.38.31`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/rustix-8f0bd8469e02e97b/build-script-build` (signal: 9, SIGKILL: kill)
error: failed to run custom build command for `memchr v2.5.0`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/memchr-f9a14b593813b5ba/build-script-build` (signal: 9, SIGKILL: kill)
error: failed to run custom build command for `proc-macro2 v1.0.81`
Caused by:
  process didn't exit successfully: `/Users/nihal.pasham/devspace/compiler/wasmtime/target/release/build/proc-macro2-e25cbce0be9ed81c/build-script-build` (signal: 9, SIGKILL: kill)
```
It kind of seems similar to this: https://users.rust-lang.org/t/error-failed-to-run-custom-build-command-for-libc-v0-2-150/103006/7
Nope, a restart did not work.
I can confirm that debug builds seem to work just fine while release builds fail to run custom build commands/scripts (on Apple Silicon).
Not sure if I'm missing something, but I presume a debug build of `islec` is good enough for learning:

```
cargo build
Compiling memchr v2.7.2
Compiling regex-syntax v0.8.3
Compiling libc v0.2.155
Compiling cranelift-isle v0.110.0 (/Users/nihal.pasham/devspace/compiler/wasmtime/cranelift/isle/isle)
Compiling codespan-reporting v0.11.1
Compiling syn v2.0.66
Compiling aho-corasick v1.1.3
Compiling is-terminal v0.4.12
Compiling regex-automata v0.4.6
Compiling clap_derive v4.5.5
Compiling regex v1.10.4
Compiling env_logger v0.10.2
Compiling clap v4.5.6
Compiling islec v0.0.0 (/Users/nihal.pasham/devspace/compiler/wasmtime/cranelift/isle/islec)
Finished `dev` profile [unoptimized + debuginfo] target(s) in 3.01s
```
Sorry, if that didn't fix it I don't know what the issue is; someone more familiar with macOS and wasmtime will need to help.
Do you have a third-party binutils from Nix in your `$PATH` by chance? I had this issue once where binaries were not signed (every binary on aarch64 macOS needs an ad-hoc signature) and were thus immediately killed by the kernel.
I think this may be the issue, although I'm not sure how to solve it. Usually we get a prompt asking the user whether they want to continue with a custom build script, but that does not happen in my case.
I use Homebrew for package installation (i.e. I have not used Nix so far). What's weird is that debug builds work without any such issues.
Nihal Pasham said:
> I think this may be the issue, although I'm not sure how to solve it. Usually we get a prompt asking the user whether they want to continue with a custom build script, but that does not happen in my case.
I’m not sure I’ve ever seen a “prompt asking the user” from Cargo in this case? The build script is necessary: it generates part of Cranelift.
Are you able to build and run an ordinary rust binary (e.g. hello world) in your environment?
yes, a simple hello-world
release build works
cargo build --release
Compiling hello-world v0.1.0 (/Users/nihal.pasham/devspace/rust/projects/exp/hello-world)
Finished `release` profile [optimized] target(s) in 14.17s
Ok. I saw your answer about Homebrew, but didn’t see anything specifically answering my question: do you have a custom binutils (or C compiler) in your `$PATH`? If so, can you run without it?
Also: any custom linker configuration, e.g. in your system-wide cargo configuration?
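One quick way to audit this (a sketch in plain POSIX shell; `strip` is just the example here, the same check applies to `ld`, `ar`, and friends):

```shell
#!/bin/sh
# Print every `strip` reachable via $PATH, in resolution order.
# The first line is the copy that build tools will actually invoke.
echo "$PATH" | tr ':' '\n' | while read -r dir; do
  if [ -x "$dir/strip" ]; then
    echo "$dir/strip"
  fi
done
```

`which -a strip` gives a similar ordered listing on most systems.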
Oh, I have one custom ref to gcc binutils in my `$PATH` and some Homebrew utils refs. Here is the output for `$PATH`:

```
$PATH
zsh: no such file or directory
: /Users/nihal.pasham/.modular/pkg/packages.modular.com_mojo/bin
:/Users/nihal.pasham/.wasmtime/bin
:/Users/nihal.pasham/.cargo/bin
:/opt/homebrew/opt/binutils/bin
:/opt/homebrew/opt/openssl@3/bin
:/opt/homebrew/opt/make/libexec/gnubin
:/Users/nihal.pasham/.local/bin
:/opt/homebrew/opt/openjdk/bin
:/Users/nihal.pasham/Library/xPacks/@xpack-dev-tools/arm-none-eabi-gcc/10.3.1-2.3.1/.content/bin
:/opt/local/bin
:/opt/local/sbin
:/opt/homebrew/bin
:/opt/homebrew/sbin
:/usr/local/bin
:/System/Cryptexes/App/usr/bin
:/usr/bin
:/bin
:/usr/sbin
:/sbin
:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/local/bin
:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/bin
:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/appleinternal/bin
:/Library/Apple/usr/bin
:/Users/nihal.pasham/.cargo/binexport
```
Should I remove them and try compiling? Why would this be a problem?
Ok, this worked. I removed all references to the custom Homebrew binutils in my `$PATH`.

PS: This was not a Cranelift build issue. Any Cargo install relying on a custom build script was failing. Apparently, having binutils in one's `$PATH` can cause issues when building in release mode: we could end up stripping debug info using `strip` from Homebrew's binutils instead of picking the right one from /usr/bin.
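The shadowing itself is easy to reproduce in miniature (a sketch; the temp directories below are made-up stand-ins for /opt/homebrew/opt/binutils/bin and /usr/bin):

```shell
#!/bin/sh
# Two directories each provide a `strip`; whichever directory appears
# first in $PATH wins the lookup, exactly like the Homebrew-vs-system case.
demo=$(mktemp -d)
mkdir -p "$demo/homebrew" "$demo/system"
printf '#!/bin/sh\necho homebrew-strip\n' > "$demo/homebrew/strip"
printf '#!/bin/sh\necho system-strip\n'   > "$demo/system/strip"
chmod +x "$demo/homebrew/strip" "$demo/system/strip"

# Resolve `strip` with the Homebrew-style dir first: the shadowing copy wins,
# so this prints the path under $demo/homebrew, not $demo/system.
(PATH="$demo/homebrew:$demo/system"; command -v strip)

rm -rf "$demo"
```

Any tool that looks up `strip` by bare name (as build scripts and linker drivers typically do) resolves it the same way, which is why reordering or removing the Homebrew binutils entry fixed the build.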
Maybe `cargo-component` should have a `doctor` sub-command that can check things about the environment. It seems that what can be found in `$PATH`, and its effects on the LLVM linker, has caused `cargo` builds to fail for years. This is more of a `cargo` issue than a `cargo-component` issue, but what are the odds of getting a change into `cargo`?
There is a cargo issue for it at least (closed, alas): https://github.com/rust-lang/cargo/issues/11641
The core issue is that the third-party binutils doesn’t understand how to sign macOS binaries, so in some sense it’s actually a binutils bug.
(The `strip` binary specifically.)
@Chris Fallin You've helped others in the past month or so with an LLVM linker issue where it silently picked up a different binary to use in one of its steps if it found that binary in `$PATH`, but worked just fine without it in the path. That one may not have been macOS-specific, but it was also something Rust didn't have direct control over, because it was an LLVM issue that had been open for a while with no clear path to getting changed.
Ah, the wasm-opt nondeterminism! Yes, like you I don’t have a bunch of hope that upstream will change easily, or at least not without more energy than I have at the moment. Would be good to continue to note the issues to upstream nevertheless…
Last updated: Nov 22 2024 at 16:03 UTC