Conversation

@Lucino772 (Contributor) commented Nov 21, 2025

Description

I am getting the following error when running this action on an Intel macOS runner:

Received 26451860 of 26451860 (100.0%), 19.3 MBs/sec
Cache Size: ~25 MB (26451860 B)
/usr/local/bin/gtar -xf /Users/runner/work/_temp/c8303197-5442-4088-a86a-22e1197bea7b/cache.tzst -P -C /Users/runner/work/REPOSITORY_NAME/REPOSITORY_NAME --delay-directory-restore --use-compress-program unzstd
Cache restored successfully
Found Tailscale 1.90.6 in cache: /Users/runner/hostedtoolcache/tailscale/1.90.6/macOS-amd64
▶️ copy tailscale from cache
▶️ copy tailscaled from cache
▶️ chmod tailscale
▶️ chmod tailscaled
Starting tailscaled daemon...
Waiting for tailscaled daemon to become ready...
▶️ get tailscale status
node:internal/child_process:421
    throw new ErrnoException(err, 'spawn');
          ^

Error: spawn Unknown system error -86
    at ChildProcess.spawn (node:internal/child_process:421:11)
    at Object.spawn (node:child_process:796:9)
    at ToolRunner.<anonymous> (/Users/runner/work/_actions/tailscale/github-action/v4/dist/index.js:4822:34)
    at Generator.next (<anonymous>)
    at /Users/runner/work/_actions/tailscale/github-action/v4/dist/index.js:4436:71
    at new Promise (<anonymous>)
    at __webpack_modules__.6665.__awaiter (/Users/runner/work/_actions/tailscale/github-action/v4/dist/index.js:4432:12)
    at /Users/runner/work/_actions/tailscale/github-action/v4/dist/index.js:4804:53
    at new Promise (<anonymous>)
    at ToolRunner.<anonymous> (/Users/runner/work/_actions/tailscale/github-action/v4/dist/index.js:4804:20) {
  errno: -86,
  code: 'Unknown system error -86',
  syscall: 'spawn'
}

Node.js v24.10.0

After doing some research, I discovered that both ARM and Intel macOS runners use the same cache key:
tailscale/1.90.6/macOS-amd64.

Because of this, the first run on an ARM runner built and cached an ARM-compiled Tailscale binary. On a subsequent run, an Intel runner restored that cached binary, but since it was built for a different architecture, it failed to run. (On macOS, errno -86 is EBADARCH, "Bad CPU type in executable", which is exactly the failure you would expect when spawning a binary compiled for the wrong architecture.)

macos-13 (Intel):

Found Tailscale 1.90.6 in cache: /Users/runner/hostedtoolcache/tailscale/1.90.6/macOS-amd64

macos-15 (ARM64):

Found Tailscale 1.90.6 in cache: /Users/runner/hostedtoolcache/tailscale/1.90.6/macOS-amd64
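For illustration, a minimal sketch of the collision; the variable names here are assumptions inferred from the restored tool-cache path above, not the action's actual source:

const version = "1.90.6";
const platform = "macOS";
const arch = "amd64"; // bug: ARM macOS runners also report "amd64"
const cacheKey = `tailscale/${version}/${platform}-${arch}`;
// => "tailscale/1.90.6/macOS-amd64" on both Intel and ARM runners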

Solution

This PR fixes the getTailscaleArch function so that ARM macOS runners return the correct architecture, which in turn makes the generated cache key architecture-specific. On macOS runners specifically, config.arch is only ever used when generating the cache key; it isn't passed to the Go build command at all.
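A minimal sketch of the kind of mapping involved, assuming the function translates Node's os.arch() values into Tailscale's naming (illustrative only, not the PR's exact diff):

import os from "node:os";

// Map Node's architecture names to Tailscale's. Before the fix, macOS
// runners effectively always reported "amd64", so ARM and Intel runners
// shared one cache key.
function getTailscaleArch(): string {
  switch (os.arch()) {
    case "x64":
      return "amd64";
    case "arm64":
      return "arm64";
    default:
      return os.arch();
  }
}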

PS: As a temporary workaround, disabling the cache on macOS runners allows the workflow to run successfully.
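For example, in a workflow step (assuming the action exposes a use-cache input, as earlier versions did; check the action's README for the exact input name):

- uses: tailscale/github-action@v4
  with:
    oauth-client-id: ${{ secrets.TS_OAUTH_CLIENT_ID }}
    oauth-secret: ${{ secrets.TS_OAUTH_SECRET }}
    tags: tag:ci
    use-cache: 'false' # assumed input name; skips the broken macOS cache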

@Lucino772 Lucino772 marked this pull request as ready for review November 21, 2025 11:28
@Lucino772 Lucino772 changed the title from "fix: use correct arch on macOs (WIP)" to "fix: use correct arch on macOs for cache key" Nov 21, 2025
@mpminardi (Member) left a comment:

Thank you for this @Lucino772, and apologies for the miss here!

@mpminardi mpminardi merged commit fee37da into tailscale:main Dec 11, 2025
3 of 16 checks passed
mpminardi added a commit that referenced this pull request Dec 11, 2025
Running make build was missed on #235 and subsequently missed by me
when reviewing that PR (whoops).

Updates #cleanup

Signed-off-by: Mario Minardi <[email protected]>
@Lucino772 Lucino772 deleted the fix/tailscale-arch-macos branch December 12, 2025 10:18