Swift Package Index

Build Information

Successful build of SwiftLlama, reference v0.4.0 (469310), with Swift 6.1 for macOS (SPM) on 30 Jul 2025 09:30:37 UTC.

Swift 6 data race errors: 0

Build Command

env DEVELOPER_DIR=/Applications/Xcode-16.3.0.app xcrun swift build --arch arm64 -Xswiftc -Xfrontend -Xswiftc -stats-output-dir -Xswiftc -Xfrontend -Xswiftc .stats

Build Log

========================================
RunAll
========================================
Builder version: 4.64.0
Interrupt handler set up.
========================================
Checkout
========================================
Clone URL: https://github.com/ShenghaiWang/SwiftLlama.git
Reference: v0.4.0
Initialized empty Git repository in /Users/admin/builder/spi-builder-workspace/.git/
From https://github.com/ShenghaiWang/SwiftLlama
 * tag               v0.4.0     -> FETCH_HEAD
HEAD is now at 4693100 Use precompiled framework
Cloned https://github.com/ShenghaiWang/SwiftLlama.git
Revision (git rev-parse @):
469310012506c307f3f6800d9c5f31e670ff068d
SUCCESS checkout https://github.com/ShenghaiWang/SwiftLlama.git at v0.4.0
Fetching https://github.com/ggerganov/llama.cpp.git
[1/204354] Fetching llama.cpp
Fetched https://github.com/ggerganov/llama.cpp.git from cache (117.48s)
Creating working copy for https://github.com/ggerganov/llama.cpp.git
Working copy of https://github.com/ggerganov/llama.cpp.git resolved at master (b6d6c52)
Downloading binary artifact https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip
[16375/74944281] Downloading https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip
Downloaded https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip (4.41s)
========================================
ResolveProductDependencies
========================================
Resolving dependencies ...
{
  "identity": ".resolve-product-dependencies",
  "name": "resolve-dependencies",
  "url": "/Users/admin/builder/spi-builder-workspace/.resolve-product-dependencies",
  "version": "unspecified",
  "path": "/Users/admin/builder/spi-builder-workspace/.resolve-product-dependencies",
  "dependencies": [
    {
      "identity": "swiftllama",
      "name": "SwiftLlama",
      "url": "https://github.com/ShenghaiWang/SwiftLlama.git",
      "version": "unspecified",
      "path": "/Users/admin/builder/spi-builder-workspace/.resolve-product-dependencies/.build/checkouts/SwiftLlama",
      "dependencies": [
      ]
    }
  ]
}
Fetching https://github.com/ShenghaiWang/SwiftLlama.git
[1/359] Fetching swiftllama
Fetched https://github.com/ShenghaiWang/SwiftLlama.git from cache (0.72s)
Creating working copy for https://github.com/ShenghaiWang/SwiftLlama.git
Working copy of https://github.com/ShenghaiWang/SwiftLlama.git resolved at v0.4.0 (4693100)
Fetching binary artifact https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip from cache
Fetched https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip from cache (3.60s)
warning: '.resolve-product-dependencies': dependency 'swiftllama' is not used by any target
Found 0 product dependencies
========================================
Build
========================================
Selected platform:         macosSpm
Swift version:             6.1
Building package at path:  $PWD
https://github.com/ShenghaiWang/SwiftLlama.git
{
  "dependencies" : [
    {
      "identity" : "llama.cpp",
      "requirement" : {
        "branch" : [
          "master"
        ]
      },
      "type" : "sourceControl",
      "url" : "https://github.com/ggerganov/llama.cpp.git"
    }
  ],
  "manifest_display_name" : "SwiftLlama",
  "name" : "SwiftLlama",
  "path" : "/Users/admin/builder/spi-builder-workspace",
  "platforms" : [
    {
      "name" : "macos",
      "version" : "15.0"
    },
    {
      "name" : "ios",
      "version" : "18.0"
    },
    {
      "name" : "watchos",
      "version" : "11.0"
    },
    {
      "name" : "tvos",
      "version" : "18.0"
    },
    {
      "name" : "visionos",
      "version" : "2.0"
    }
  ],
  "products" : [
    {
      "name" : "SwiftLlama",
      "targets" : [
        "SwiftLlama"
      ],
      "type" : {
        "library" : [
          "automatic"
        ]
      }
    }
  ],
  "targets" : [
    {
      "c99name" : "SwiftLlamaTests",
      "module_type" : "SwiftTarget",
      "name" : "SwiftLlamaTests",
      "path" : "Tests/SwiftLlamaTests",
      "sources" : [
        "SwiftLlamaTests.swift"
      ],
      "target_dependencies" : [
        "SwiftLlama"
      ],
      "type" : "test"
    },
    {
      "c99name" : "SwiftLlama",
      "module_type" : "SwiftTarget",
      "name" : "SwiftLlama",
      "path" : "Sources/SwiftLlama",
      "product_memberships" : [
        "SwiftLlama"
      ],
      "sources" : [
        "LlamaModel.swift",
        "Models/Batch.swift",
        "Models/Chat.swift",
        "Models/Configuration.swift",
        "Models/Prompt.swift",
        "Models/Session.swift",
        "Models/StopToken.swift",
        "Models/SwiftLlamaError.swift",
        "Models/TypeAlias.swift",
        "Swiftllama.swift",
        "SwiftllamaActor.swift"
      ],
      "target_dependencies" : [
        "LlamaFramework"
      ],
      "type" : "library"
    },
    {
      "c99name" : "LlamaFramework",
      "module_type" : "BinaryTarget",
      "name" : "LlamaFramework",
      "path" : "remote/archive/llama-b5046-xcframework.zip",
      "product_memberships" : [
        "SwiftLlama"
      ],
      "sources" : [
      ],
      "type" : "binary"
    }
  ],
  "tools_version" : "6.0"
}
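The manifest dump above maps onto a `Package.swift` along the following lines. This is a hypothetical reconstruction, not the repository's actual file: the `binaryTarget` checksum in particular is a placeholder, and the real declaration may differ in detail.

```swift
// swift-tools-version: 6.0
// Hypothetical reconstruction of the manifest described by the JSON dump above.
import PackageDescription

let package = Package(
    name: "SwiftLlama",
    platforms: [
        .macOS(.v15), .iOS(.v18), .watchOS(.v11), .tvOS(.v18), .visionOS(.v2)
    ],
    products: [
        .library(name: "SwiftLlama", targets: ["SwiftLlama"])
    ],
    dependencies: [
        // Declared as a branch dependency, but no target depends on it --
        // hence the "dependency 'llama.cpp' is not used by any target" warning.
        // The code links against the precompiled xcframework instead.
        .package(url: "https://github.com/ggerganov/llama.cpp.git", branch: "master")
    ],
    targets: [
        .target(name: "SwiftLlama", dependencies: ["LlamaFramework"]),
        .binaryTarget(
            name: "LlamaFramework",
            url: "https://github.com/ggml-org/llama.cpp/releases/download/b5046/llama-b5046-xcframework.zip",
            checksum: "<checksum omitted>"  // placeholder, not the real value
        ),
        .testTarget(name: "SwiftLlamaTests", dependencies: ["SwiftLlama"])
    ]
)
```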
Running build ...
env DEVELOPER_DIR=/Applications/Xcode-16.3.0.app xcrun swift build --arch arm64 -Xswiftc -Xfrontend -Xswiftc -stats-output-dir -Xswiftc -Xfrontend -Xswiftc .stats
Building for debugging...
[0/3] Write sources
[1/3] Copying llama.framework
[2/3] Write swift-version-2F0A5646E1D333AE.txt
[4/14] Compiling SwiftLlama Session.swift
[5/14] Compiling SwiftLlama TypeAlias.swift
[6/14] Compiling SwiftLlama Configuration.swift
[7/14] Compiling SwiftLlama Chat.swift
[8/14] Compiling SwiftLlama SwiftLlamaError.swift
[9/14] Compiling SwiftLlama Prompt.swift
[10/14] Compiling SwiftLlama StopToken.swift
[11/14] Compiling SwiftLlama Swiftllama.swift
[12/14] Compiling SwiftLlama LlamaModel.swift
/Users/admin/builder/spi-builder-workspace/Sources/SwiftLlama/LlamaModel.swift:28:27: warning: 'llama_load_model_from_file' is deprecated: use llama_model_load_from_file instead
 26 |         model_params.n_gpu_layers = 0
 27 |         #endif
 28 |         guard let model = llama_load_model_from_file(path, model_params) else {
    |                           `- warning: 'llama_load_model_from_file' is deprecated: use llama_model_load_from_file instead
 29 |             throw SwiftLlamaError.others("Cannot load model at path \(path)")
 30 |         }
/Users/admin/builder/spi-builder-workspace/Sources/SwiftLlama/LlamaModel.swift:32:29: warning: 'llama_new_context_with_model' is deprecated: use llama_init_from_model instead
 30 |         }
 31 |         self.model = model
 32 |         guard let context = llama_new_context_with_model(model, configuration.contextParameters) else {
    |                             `- warning: 'llama_new_context_with_model' is deprecated: use llama_init_from_model instead
 33 |             throw SwiftLlamaError.others("Cannot load model context")
 34 |         }
/Users/admin/builder/spi-builder-workspace/Sources/SwiftLlama/LlamaModel.swift:40:42: warning: 'llama_sampler_init_softmax()' is deprecated: will be removed in the future (see https://github.com/ggml-org/llama.cpp/pull/9896#discussion_r1800920915)
 38 |         self.sampler = llama_sampler_chain_init(llama_sampler_chain_default_params())
 39 |         llama_sampler_chain_add(sampler, llama_sampler_init_temp(configuration.temperature))
 40 |         llama_sampler_chain_add(sampler, llama_sampler_init_softmax())
    |                                          `- warning: 'llama_sampler_init_softmax()' is deprecated: will be removed in the future (see https://github.com/ggml-org/llama.cpp/pull/9896#discussion_r1800920915)
 41 |         llama_sampler_chain_add(sampler, llama_sampler_init_dist(1234))
 42 |         try checkContextLength(context: context, model: model)
/Users/admin/builder/spi-builder-workspace/Sources/SwiftLlama/LlamaModel.swift:47:27: warning: 'llama_n_ctx_train' is deprecated: use llama_model_n_ctx_train instead
 45 |     private func checkContextLength(context: Context, model: Model) throws {
 46 |         let n_ctx = llama_n_ctx(context)
 47 |         let n_ctx_train = llama_n_ctx_train(model)
    |                           `- warning: 'llama_n_ctx_train' is deprecated: use llama_model_n_ctx_train instead
 48 |         if n_ctx > n_ctx_train {
 49 |             throw SwiftLlamaError.others("Model was trained on \(n_ctx_train) context but tokens \(n_ctx) specified")
/Users/admin/builder/spi-builder-workspace/Sources/SwiftLlama/LlamaModel.swift:73:12: warning: 'llama_token_is_eog' is deprecated: use llama_vocab_is_eog instead
 71 |         let newToken =  llama_sampler_sample(sampler, context, batch.n_tokens - 1)
 72 |
 73 |         if llama_token_is_eog(model, newToken) || generatedTokenAccount == n_len {
    |            `- warning: 'llama_token_is_eog' is deprecated: use llama_vocab_is_eog instead
 74 |             temporaryInvalidCChars.removeAll()
 75 |             ended = true
/Users/admin/builder/spi-builder-workspace/Sources/SwiftLlama/LlamaModel.swift:135:9: warning: 'llama_kv_cache_clear' is deprecated: use llama_kv_self_clear instead
133 |         tokens.removeAll()
134 |         temporaryInvalidCChars.removeAll()
135 |         llama_kv_cache_clear(context)
    |         `- warning: 'llama_kv_cache_clear' is deprecated: use llama_kv_self_clear instead
136 |     }
137 |
/Users/admin/builder/spi-builder-workspace/Sources/SwiftLlama/LlamaModel.swift:141:9: warning: 'llama_free_model' is deprecated: use llama_model_free instead
139 |         llama_batch_free(batch)
140 |         llama_free(context)
141 |         llama_free_model(model)
    |         `- warning: 'llama_free_model' is deprecated: use llama_model_free instead
142 |         llama_backend_free()
143 |     }
[13/14] Compiling SwiftLlama Batch.swift
[14/14] Emitting module SwiftLlama
[15/15] Compiling SwiftLlama SwiftllamaActor.swift
Build complete! (5.97s)
warning: 'spi-builder-workspace': dependency 'llama.cpp' is not used by any target
Build complete.
Done.
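The deprecation warnings in the log all follow llama.cpp's recent API renames, and each warning names its replacement. A minimal migration sketch, assuming the b5046 headers bundled in the xcframework (signatures should be verified against those headers before adopting):

```swift
// Sketch of the renamed llama.cpp calls flagged by the warnings above.

// 'llama_load_model_from_file' -> 'llama_model_load_from_file'
guard let model = llama_model_load_from_file(path, model_params) else {
    throw SwiftLlamaError.others("Cannot load model at path \(path)")
}

// 'llama_new_context_with_model' -> 'llama_init_from_model'
guard let context = llama_init_from_model(model, configuration.contextParameters) else {
    throw SwiftLlamaError.others("Cannot load model context")
}

// 'llama_sampler_init_softmax' is slated for removal with no direct replacement;
// 'llama_sampler_init_dist' already normalizes, so the softmax stage can be dropped.
llama_sampler_chain_add(sampler, llama_sampler_init_temp(configuration.temperature))
llama_sampler_chain_add(sampler, llama_sampler_init_dist(1234))

// 'llama_n_ctx_train' -> 'llama_model_n_ctx_train'
let n_ctx_train = llama_model_n_ctx_train(model)

// 'llama_token_is_eog' -> 'llama_vocab_is_eog', which takes the vocab, not the model
let vocab = llama_model_get_vocab(model)
if llama_vocab_is_eog(vocab, newToken) { /* end of generation */ }

// 'llama_kv_cache_clear' -> 'llama_kv_self_clear'
llama_kv_self_clear(context)

// 'llama_free_model' -> 'llama_model_free'
llama_model_free(model)
```

Since the package pins the prebuilt framework at release b5046 rather than tracking llama.cpp's master branch, the deprecated calls still compile; the renames only become mandatory once the binary artifact is bumped past the removal point.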