Build Information
Failed to build SwiftLlama, reference main (792181), with Swift 6.1 for Wasm on 26 Aug 2025 09:41:11 UTC.
Build Command
bash -c docker run --pull=always --rm -v "checkouts-4606859-3":/host -w "$PWD" -e JAVA_HOME="/root/.sdkman/candidates/java/current" -e SPI_BUILD="1" -e SPI_PROCESSING="1" registry.gitlab.com/finestructure/spi-images:wasm-6.1-latest swift build --swift-sdk wasm32-unknown-wasi 2>&1
Build Log
========================================
RunAll
========================================
Builder version: 4.67.1
Interrupt handler set up.
========================================
Checkout
========================================
Clone URL: https://github.com/ShenghaiWang/SwiftLlama.git
Reference: main
Initialized empty Git repository in /host/spi-builder-workspace/.git/
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: git branch -m <name>
From https://github.com/ShenghaiWang/SwiftLlama
* branch main -> FETCH_HEAD
* [new branch] main -> origin/main
HEAD is now at 7921814 Merge pull request #22 from gpotari/main
Cloned https://github.com/ShenghaiWang/SwiftLlama.git
Revision (git rev-parse @):
792181492beff9edba52314a040032cefb19edd6
SUCCESS checkout https://github.com/ShenghaiWang/SwiftLlama.git at main
========================================
Build
========================================
Selected platform: wasm
Swift version: 6.1
Building package at path: $PWD
https://github.com/ShenghaiWang/SwiftLlama.git
https://github.com/ShenghaiWang/SwiftLlama.git
WARNING: environment variable SUPPRESS_SWIFT_6_FLAGS is not set
{
  "dependencies" : [
    {
      "identity" : "llama.cpp",
      "requirement" : {
        "revision" : [
          "b6d6c5289f1c9c677657c380591201ddb210b649"
        ]
      },
      "type" : "sourceControl",
      "url" : "https://github.com/ggerganov/llama.cpp.git"
    }
  ],
  "manifest_display_name" : "SwiftLlama",
  "name" : "SwiftLlama",
  "path" : "/host/spi-builder-workspace",
  "platforms" : [
    {
      "name" : "macos",
      "version" : "15.0"
    },
    {
      "name" : "ios",
      "version" : "18.0"
    },
    {
      "name" : "watchos",
      "version" : "11.0"
    },
    {
      "name" : "tvos",
      "version" : "18.0"
    },
    {
      "name" : "visionos",
      "version" : "2.0"
    }
  ],
  "products" : [
    {
      "name" : "SwiftLlama",
      "targets" : [
        "SwiftLlama"
      ],
      "type" : {
        "library" : [
          "automatic"
        ]
      }
    }
  ],
  "targets" : [
    {
      "c99name" : "SwiftLlamaTests",
      "module_type" : "SwiftTarget",
      "name" : "SwiftLlamaTests",
      "path" : "Tests/SwiftLlamaTests",
      "sources" : [
        "SwiftLlamaTests.swift"
      ],
      "target_dependencies" : [
        "SwiftLlama"
      ],
      "type" : "test"
    },
    {
      "c99name" : "SwiftLlama",
      "module_type" : "SwiftTarget",
      "name" : "SwiftLlama",
      "path" : "Sources/SwiftLlama",
      "product_dependencies" : [
        "llama"
      ],
      "product_memberships" : [
        "SwiftLlama"
      ],
      "sources" : [
        "LlamaModel.swift",
        "Models/Batch.swift",
        "Models/Chat.swift",
        "Models/Configuration.swift",
        "Models/Prompt.swift",
        "Models/Session.swift",
        "Models/StopToken.swift",
        "Models/SwiftLlamaError.swift",
        "Models/TypeAlias.swift",
        "Swiftllama.swift",
        "SwiftllamaActor.swift"
      ],
      "target_dependencies" : [
        "LlamaFramework"
      ],
      "type" : "library"
    },
    {
      "c99name" : "LlamaFramework",
      "module_type" : "BinaryTarget",
      "name" : "LlamaFramework",
      "path" : "remote/archive/llama-b5046-xcframework.zip",
      "product_memberships" : [
        "SwiftLlama"
      ],
      "sources" : [
      ],
      "type" : "binary"
    }
  ],
  "tools_version" : "6.0"
}
Running build ...
bash -c docker run --pull=always --rm -v "checkouts-4606859-3":/host -w "$PWD" -e JAVA_HOME="/root/.sdkman/candidates/java/current" -e SPI_BUILD="1" -e SPI_PROCESSING="1" registry.gitlab.com/finestructure/spi-images:wasm-6.1-latest swift build --swift-sdk wasm32-unknown-wasi -Xswiftc -Xfrontend -Xswiftc -stats-output-dir -Xswiftc -Xfrontend -Xswiftc .stats 2>&1
wasm-6.1-latest: Pulling from finestructure/spi-images
Digest: sha256:eb0758f51dbd6991fb9e51dedbfbcbec142ffc0d3b9b8ad91fa19d35e5136f0a
Status: Image is up to date for registry.gitlab.com/finestructure/spi-images:wasm-6.1-latest
Fetching https://github.com/ggerganov/llama.cpp.git
[1/218368] Fetching llama.cpp
Fetched https://github.com/ggerganov/llama.cpp.git from cache (67.74s)
Creating working copy for https://github.com/ggerganov/llama.cpp.git
Working copy of https://github.com/ggerganov/llama.cpp.git resolved at b6d6c5289f1c9c677657c380591201ddb210b649
Downloading binary artifact https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip
[16375/74944281] Downloading https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip
Downloaded https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip (8.11s)
Building for debugging...
[0/13] Compiling ggml-alloc.c
[1/13] Write swift-version-24593BA9C3E375BF.txt
[2/13] Compiling ggml-aarch64.c
In file included from /host/spi-builder-workspace/.build/checkouts/llama.cpp/ggml/src/ggml.c:29:
/root/.swiftpm/swift-sdks/swift-wasm-6.1-RELEASE-wasm32-unknown-wasi.artifactbundle/6.1-RELEASE-wasm32-unknown-wasi/wasm32-unknown-wasi/WASI.sdk/include/wasm32-wasi/signal.h:2:2: error: "wasm lacks signal support; to enable minimal signal emulation, compile with -D_WASI_EMULATED_SIGNAL and link with -lwasi-emulated-signal"
2 | #error "wasm lacks signal support; to enable minimal signal emulation, \
| ^
/host/spi-builder-workspace/.build/checkouts/llama.cpp/ggml/src/ggml.c:170:10: fatal error: 'pthread.h' file not found
170 | #include <pthread.h>
| ^~~~~~~~~~~
2 errors generated.
[3/13] Compiling ggml.c
[3/13] Compiling llama.cpp
[3/13] Compiling ggml-backend.cpp
[3/13] Compiling unicode-data.cpp
[3/13] Compiling llama-grammar.cpp
[3/13] Compiling llama-sampling.cpp
[3/13] Compiling unicode.cpp
[3/13] Compiling llama-vocab.cpp
Running build ...
bash -c docker run --pull=always --rm -v "checkouts-4606859-3":/host -w "$PWD" -e JAVA_HOME="/root/.sdkman/candidates/java/current" -e SPI_BUILD="1" -e SPI_PROCESSING="1" registry.gitlab.com/finestructure/spi-images:wasm-6.1-latest swift build --swift-sdk wasm32-unknown-wasi 2>&1
wasm-6.1-latest: Pulling from finestructure/spi-images
Digest: sha256:eb0758f51dbd6991fb9e51dedbfbcbec142ffc0d3b9b8ad91fa19d35e5136f0a
Status: Image is up to date for registry.gitlab.com/finestructure/spi-images:wasm-6.1-latest
[0/1] Planning build
Building for debugging...
In file included from /host/spi-builder-workspace/.build/checkouts/llama.cpp/ggml/src/ggml.c:29:
/root/.swiftpm/swift-sdks/swift-wasm-6.1-RELEASE-wasm32-unknown-wasi.artifactbundle/6.1-RELEASE-wasm32-unknown-wasi/wasm32-unknown-wasi/WASI.sdk/include/wasm32-wasi/signal.h:2:2: error: "wasm lacks signal support; to enable minimal signal emulation, compile with -D_WASI_EMULATED_SIGNAL and link with -lwasi-emulated-signal"
2 | #error "wasm lacks signal support; to enable minimal signal emulation, \
| ^
/host/spi-builder-workspace/.build/checkouts/llama.cpp/ggml/src/ggml.c:170:10: fatal error: 'pthread.h' file not found
170 | #include <pthread.h>
| ^~~~~~~~~~~
2 errors generated.
[0/11] Compiling ggml.c
[0/11] Compiling llama.cpp
[0/11] Compiling unicode-data.cpp
[0/11] Compiling llama-grammar.cpp
[0/11] Compiling llama-vocab.cpp
[0/11] Compiling unicode.cpp
[0/11] Compiling llama-sampling.cpp
[0/11] Write swift-version-24593BA9C3E375BF.txt
BUILD FAILURE 6.1 wasm
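Both build attempts fail on the same two diagnostics in the llama.cpp dependency: the WASI SDK's signal.h requires signal emulation to be enabled, and ggml.c unconditionally includes pthread.h, which the wasm32-unknown-wasi sysroot does not provide. The signal error's own message names the flags it wants. As a sketch only, not something attempted in this build, the package name and target layout below are hypothetical; it shows how those flags could be expressed in a SwiftPM manifest for the affected C target:

```swift
// swift-tools-version:6.0
// Hypothetical manifest fragment illustrating the flags suggested by the
// signal.h diagnostic. The target name "ggml" is illustrative; the real
// fix would have to land in llama.cpp's own manifest, not SwiftLlama's.
import PackageDescription

let package = Package(
    name: "ggml-wasi-sketch",
    targets: [
        .target(
            name: "ggml",
            cSettings: [
                // Enable WASI's minimal signal emulation, per the #error message.
                .define("_WASI_EMULATED_SIGNAL", .when(platforms: [.wasi])),
            ],
            linkerSettings: [
                // Link the matching emulation library, also per the #error message.
                .linkedLibrary("wasi-emulated-signal", .when(platforms: [.wasi])),
            ]
        )
    ]
)
```

Note that this would address only the signal error; the missing pthread.h would additionally require a threads-capable WASI sysroot, which the diagnostics in this log do not cover.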