alexcrichton commented on issue #3983:
Thanks for this! I don't think that CI can run the tests since I believe it's x86_64 hardware, but we should probably be able to add a builder so long as it doesn't take unduly long in CI. I think, though, that the build-tarballs.sh script needs to be updated for the new target.
cfallin commented on issue #3983:
This seems fine to me as long as we have some testing on the releases -- unfortunately GitHub Actions still does not support macOS/aarch64 (see actions/virtual-environments#2187) so we can't run our tests in CI. (This has been the blocker for M1 support so far.) Last I heard, Embark Studios folks were running their own internal CI on M1 systems (@bnjbvr can you confirm this is still active?) -- if we have that, and if #3955 goes in so we have two weeks between branching a release and publishing it, then we could rely on Embark to notify us if there is some breakage with a given release. Not optimal, but it's something. Thoughts?
cfallin edited a comment on issue #3983:
This seems fine to me as long as we have some testing on the releases -- unfortunately GitHub Actions still does not support macOS/aarch64 (see actions/virtual-environments#2187) so we can't run our tests in CI. (This has been the blocker for adding official M1 releases so far, though we've had unofficial support in-tree thanks to @bnjbvr.) Last I heard, Embark Studios folks were running their own internal CI on M1 systems (@bnjbvr can you confirm this is still active?) -- if we have that, and if #3955 goes in so we have two weeks between branching a release and publishing it, then we could rely on Embark to notify us if there is some breakage with a given release. Not optimal, but it's something. Thoughts?
ricochet commented on issue #3983:
If we have a release milestone, we can do manual testing on an M1 and sign off for releases. Not ideal, but workable. I would gladly volunteer until an automated solution is put in place.
A different option would be to use a self-hosted runner. Now that AWS and other providers have aarch64-darwin VMs, we could kick off a workflow for verifying the M1 binary. This issue seems a little closer to completion than a hosted VM, but that's just a guess: https://github.com/actions/runner/issues/805.
alexcrichton commented on issue #3983:
I think even with a custom runner our hands are sort of tied because GitHub's own official recommendation is that for public repositories you shouldn't use self-hosted runners for security reasons. If you're ok manually verifying that builds are ok that may be the best way to go. With the release process from https://github.com/bytecodealliance/wasmtime/pull/3955 there will be a 2-week window between when a release branch is created and when it's actually published which should provide a good opportunity to test on non-CI-tested-platforms.
If you'd like it could also be set up to notify you. We could apply a tag to the PR and then using our labeling bot you'd get auto-cc'd on any PRs related to a major version bump
bnjbvr commented on issue #3983:
Confirming that we still do have some nightly CI job running the test suite (using the checked-in scripts) every day, using the latest commit on main at the time of running the CI job. Results are public, but only we at Embark Studios can start jobs etc.
ricochet commented on issue #3983:
That's fantastic. Then this might be ready to merge? I added a reference to Embark's CI so that it will be part of the initial PR for releases.
alexcrichton commented on issue #3983:
Oh, I think CI is still failing on this; the build-tarballs.sh script needs an update to use the right path to the artifacts (right now it assumes target/release/* since macOS previously didn't cross-compile, but now it's sometimes in target/aarch64-apple-darwin/release/*).
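The fix boils down to picking the cargo output directory based on whether the build was a cross-compile. A minimal sketch of that logic (variable names are illustrative, not the actual build-tarballs.sh contents):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Cross-compiled builds put artifacts under target/<triple>/release,
# while native builds use target/release. Prefer the triple-specific
# directory when it exists.
target_triple="${1:-}"   # e.g. aarch64-apple-darwin, or empty for native

if [ -n "$target_triple" ] && [ -d "target/$target_triple/release" ]; then
  bin_dir="target/$target_triple/release"
else
  bin_dir="target/release"
fi

echo "collecting artifacts from $bin_dir"
```

This mirrors cargo's own layout rule: passing --target always routes output through the triple-named subdirectory, even when the triple matches the host.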
alexcrichton commented on issue #3983:
I think the matrix entry for the test build may have snuck back in? Otherwise though the script changes look good to me, thanks!
To double-check, can you confirm the release on this page works locally for you? This should be the direct link there
ricochet commented on issue #3983:
> I think the matrix entry for the test build may have snuck back in? Otherwise though the script changes look good to me, thanks!
Ack!
> To double-check, can you confirm the release on this page works locally for you? This should be the direct link there
LGTM; tried the example and ran a compile on one of my wasm modules.
./wasmtime --version
wasmtime 0.37.0
Last updated: Dec 23 2024 at 12:05 UTC