
llama : add xcframework build script #11996

Merged: 8 commits merged on Mar 5, 2025

Conversation

@danbev (Collaborator) commented Feb 21, 2025

This commit adds a script to build an XCFramework for the Apple
iOS, macOS, visionOS, and tvOS platforms.

The generated XCFramework can then be added to a project and used in
the same way as a regular framework. The llama.swiftui example project
has been updated to use the XCFramework and can be started using the
following command:

$ open examples/llama.swiftui/llama.swiftui.xcodeproj/

Refs: #10747
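
For illustration, the core of such a script is the xcodebuild -create-xcframework step that merges the per-platform builds into a single bundle. This is only a sketch; the framework paths are assumptions, not necessarily the layout the script actually uses:

# Minimal sketch of the final packaging step (paths are illustrative).
xcodebuild -create-xcframework \
    -framework build-ios-device/framework/llama.framework \
    -framework build-ios-sim/framework/llama.framework \
    -framework build-macos/framework/llama.framework \
    -output build-apple/llama.xcframework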

Package.swift (outdated): comment on lines 17 to 21
.binaryTarget(
name: "llama",
path: "build-ios/llama.xcframework"
),
//.systemLibrary(name: "llama", pkgConfig: "llama"),
Member


Something that is not clear to me yet: now that we build an XCFramework, do we still need this SPM package? Could we simply add the framework to the project and skip the package?

@danbev (Collaborator Author), Feb 21, 2025


The way we are currently using SPM, where we expect llama.cpp to be installed on the local system, my understanding is that these are two ways of doing the same thing, and perhaps we should just use the framework, which seems to be pretty simple.

The option of using SPM for source code distribution (as opposed to binary distribution, or whatever it might be called when using a system library) would enable end users to include llama.cpp in their projects with something like this in their project's Package.swift:

dependencies: [
    .package(url: "https://github.com/ggerganov/llama.cpp.git", from: "1.0.0")
]

SPM would then download the source code, compile it, etc. But I'm not sure how much work it would take to get this working and to maintain it.

Member


Yes, so we want to move away from the workflow where the SPM package builds llama.cpp from source, for two reasons:

  • We would need to maintain a second build system (i.e. the SPM package)
  • It will not work when ggml becomes a submodule in the future (SPM does not support submodules)

So it seems to me that we should probably remove the Package.swift altogether and use the framework approach.

@danbev (Collaborator Author)


So it seems to me that we should probably remove the Package.swift altogether and use the framework approach.

This sounds good to me.

@danbev (Collaborator Author)


@ggerganov I've removed Package.swift now as it was causing issues with xcodebuild and also produced an error in Xcode (though it was still possible to build/run). I've updated the CI build to remove the install of llama.cpp, as it should no longer be required, and also added a FRAMEWORK_FOLDER_PATH for the xcodebuild commands.
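
For reference, xcodebuild accepts arbitrary SETTING=VALUE overrides on the command line, so the CI can pass the framework location roughly like this (a sketch only; the scheme name and path are assumptions):

# Hypothetical invocation: FRAMEWORK_FOLDER_PATH is the custom setting
# mentioned above; the project, scheme, and path values are placeholders.
xcodebuild -project examples/llama.swiftui/llama.swiftui.xcodeproj \
    -scheme llama.swiftui \
    FRAMEWORK_FOLDER_PATH=../../build-apple/llama.xcframework \
    build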


@danbev (Collaborator Author)

I'll remove spm-headers and Sources shortly. (Package.swift has already been removed in llama : remove Package.swift.)

(comment from another participant)

Package dependency management is not just an easy way to integrate dependencies.

It's also there to manage them. SPM actually makes use of the Git version tags, so, when using SPM, one can explicitly define a version they want (need) to use, and distribute that knowledge to others in a team.

If you remove Package.swift, that capability gets lost.

Also, if there is ever support for executing build scripts on installation of SPM dependencies, this build script could be referenced there.

(cough In that regard, I would like to promote CocoaPods again, as that is capable of exactly such a feature…)

Currently, the nicest way would be to have a GitHub Action build the xcframework during a release and add it to the GitHub release page.

That could be referenced from a Package.swift, and suddenly people would have a really easy time integrating it without needing a lot of knowledge of how to compile C++ things.
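
As a sketch of what that release step could run (the tag, the zip name, and the use of the gh CLI are assumptions for illustration):

# Hypothetical release job: build, package, and attach the XCFramework.
TAG=b4777                                  # example release tag
./build-xcframework.sh
zip -r "llama-${TAG}-xcframework.zip" build-apple/llama.xcframework
gh release upload "${TAG}" "llama-${TAG}-xcframework.zip"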

@danbev (Collaborator Author), Feb 25, 2025


Just to make sure I've understood this correctly: as a user, I would be able to add the following to my project's Package.swift:

dependencies: [
    .package(url: "https://github.com/ggml-org/llama.cpp.git", from: "b4777")
]

And the Swift Package Manager would then clone the repository and look for a Package.swift file in the root of llama.cpp. This Package.swift could then look something like the following:

import PackageDescription

let package = Package(
    name: "llama",
    platforms: [
        .iOS(.v14),
        .macOS(.v10_15),
        .tvOS(.v14),
        .visionOS(.v1)
    ],
    products: [
        .library(name: "llama", targets: ["llama"])
    ],
    targets: [
        .binaryTarget(
            name: "llama",
            url: "https://github.com/ggml-org/llama.cpp/releases/download/b4777/llama-b4777-xcframework.zip",
            checksum: "the-sha256-checksum-here"
        )
    ]
)

Without Swift Package (Manual Integration):

  1. User has to manually go to our GitHub releases page
  2. Find the correct version of the xcframework.zip
  3. Download it
  4. Extract the zip file
  5. Drag the XCFramework into their Xcode project
  6. Configure build settings (embedding, signing, etc.)
  7. Repeat all these steps whenever they want to update to a newer version

With Swift Package (Automated Integration):

  1. User adds a single line to their dependencies
  2. Swift Package Manager automatically downloads the right binary
  3. Proper linking and integration happens automatically
  4. Updating is as simple as changing a version number

A very basic example can be found here

(reply from the same participant)

Exactly.
To be completely honest, though, I should mention this one:

The checksum is mandatory and is only known after creating the xcframework.zip.

So, someone or something would need to update that in the Package.swift file after creating the new xcframework and, at best, then tag the release accordingly, or force move an already tagged release.

If that is too much hassle, the Package.swift part could be done in another repo which releases separately. Like https://github.com/srgtuszy/llama-cpp-swift.
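
For what it's worth, SwiftPM can compute that checksum itself once the zip exists; a small sketch (the zip name is illustrative):

# Prints the SHA-256 that goes into the binaryTarget's `checksum` field.
swift package compute-checksum llama-b4777-xcframework.zip
# The resulting value replaces "the-sha256-checksum-here" in Package.swift,
# after which the release can be tagged (or the tag force-moved).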

@github-actions bot added the devops label (improvements to build systems and github actions) on Feb 21, 2025
@danbev marked this pull request as ready for review on February 21, 2025, 13:00
@ggerganov (Member)

Tagging people who participated in recent iOS/Swift build discussions - please provide feedback about the proposed build changes in this PR. The new XCFramework approach proposed here should resolve all build issues for 3rd-party Swift projects depending on llama.cpp - let us know if this works for you.

@jiabochao @nvoter @pgorzelany @Animaxx @MrMage @jhen0409 @hnipps @yhondri @tladesignz

@Animaxx commented Feb 23, 2025

With the xcframework I am able to build the Swift example project, but when running it I got this error:

dyld[20228]: Library not loaded: @rpath/llama.framework/llama
  Referenced from: <442FB784-7FA0-3D1E-AAF6-7E1E06ECEEA6> /private/var/containers/Bundle/Application/CBDDD71E-76A1-4AF8-8695-24E8ADCAD70A/llama.swiftui.app/llama.swiftui.debug.dylib
  Reason: tried: '/private/var/containers/Bundle/Application/CBDDD71E-76A1-4AF8-8695-24E8ADCAD70A/llama.swiftui.app/llama.framework/llama' (no such file), '/private/var/containers/Bundle/Application/CBDDD71E-76A1-4AF8-8695-24E8ADCAD70A/llama.swiftui.app/llama.framework/llama' (no such file)

When I try to use it in another project I get this error:

product being built is not an allowed client of it
Undefined symbols for architecture arm64:
  "_ggml_time_us", referenced from:
      LlamaContext.bench(pp: Swift.Int, tg: Swift.Int, pl: Swift.Int, nr: Swift.Int) -> Swift.String in LibLlama.o
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)

@danbev (Collaborator Author) commented Feb 23, 2025

@Animaxx Thanks for trying this out!

With the xcframework I am able to build the Swift example project, but when running it I got this error:

I'm trying to figure this out. Can you tell me if you are running this using the iPhone simulator in Xcode, or some other way, so I can reproduce this?

When I try to use it in another project I get this error:

I've been able to reproduce this and looking into it now. Thanks!

Update: I've been looking at this issue and have updated the script to use static libraries, which I should probably have done in the first place (a sketch of the library-combining step follows below). With these changes, using the generated XCFramework I'm able to run the llama.swiftui example, and I also created a standalone example that imports the llama module and tries to access a few functions:

import SwiftUI
import llama

struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .padding()
        .onAppear {
            // Test llama framework functionality
            let params = llama_context_default_params()
            print("Llama context params initialized:")
            print("n_ctx: \(params.n_ctx)")
            print("n_threads: \(params.n_threads)")
            let start = ggml_time_us()
            print("start: \(start)")
        }
    }
}

#Preview {
    ContentView()
}

I ran out of time today and I'll be out Mon-Tue, but I'll revisit the issue @Animaxx reported when I'm back on Wednesday.
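
For context, a minimal sketch of the static-library combining step referenced above: the per-target archives from one platform build are merged into a single archive with Apple's libtool (the archive names and paths are illustrative, not the script's exact layout):

# Merge the per-target static libraries of one platform build into one archive.
libtool -static -o build-ios-device/libllama-combined.a \
    build-ios-device/src/Release-iphoneos/libllama.a \
    build-ios-device/ggml/src/Release-iphoneos/libggml.a \
    build-ios-device/ggml/src/Release-iphoneos/libggml-cpu.a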

@Animaxx commented Feb 23, 2025

Hi @danbev thank you for the PR!

The error I got is from debugging on a real device; not sure if that is related to xcframework signing.

@Animaxx commented Feb 23, 2025

With the latest commit it works on device now! Trying to validate the archive, I got this error:

  "errors" : [ {
    "id" : "90d7ef88-ded7-4ef2-b999-2c28db0bf716",
    "status" : "409",
    "code" : "STATE_ERROR.VALIDATION_ERROR",
    "title" : "Validation failed",
    "detail" : "Invalid Bundle. The bundle SwiftUI.app/Frameworks/llama.framework does not support the minimum OS Version specified in the Info.plist."
  } ]

@Animaxx commented Feb 23, 2025

Also, when I use the xcframework to build a macOS app, I get this error:

./llama.xcframework:1:1 While building for macOS, no library for this platform was found in './llama.xcframework'.

with an "unsigned" alert (see attached screenshot)

@danbev (Collaborator Author) commented Feb 24, 2025

Also, when I use the xcframework to build a macOS app, I get this error:

Ah, I missed building for macOS; it currently only builds for iOS. I'll try adding a build for macOS.
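
As a quick way to verify which platforms an XCFramework actually contains, its top-level Info.plist can be printed (the path below is illustrative):

# Lists the slices in the bundle; each entry in AvailableLibraries shows its
# SupportedPlatform (ios, macos, ...) and SupportedArchitectures.
plutil -p build-apple/llama.xcframework/Info.plist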

@tladesignz commented Feb 24, 2025

Thank you so much, @danbev, for starting this!

I took your work for a test drive:

  • I modified the llama.swiftui example a little, so it would also run on macOS.
  • I used a DeepSeek-R1-Distill-Qwen-1.5B-Q8_0.gguf model.
  • I compiled it straightforwardly using your build script:
% ./build-xcframework.sh

It ran on the following devices:

  • MacBook Pro 16" 2021 with ARM M1 Pro running macOS Sequoia 15.3.1: in the iOS simulator and as native macOS app
  • iPhone 15 Pro
  • MacBook Pro 15" 2018 with Intel i9 and a Radeon Pro 560X running macOS Sonoma 14.7.4: In the iOS simulator and as native macOS app (with abysmal benchmark results)

I didn't try out visionOS, since I don't have a device and I was too lazy to download the simulator.

I stumbled over various minor things, however, so I suggest taking care of the following:

  • I suggest adding visionOS support, too, since the demo app supports it. watchOS and tvOS are probably negligible. I doubt there are a lot of folks out there who want to run llama.cpp there, but visionOS is definitely a strong candidate, since the device is pretty powerful, and when you're already doing AR, AI isn't far, from my point of view...

  • There is a build issue when combining iphonesimulator and macos with the x86_64 platform, which is especially weird, because it seems to work, nevertheless:

+ combine_static_libraries build-ios-sim Release-iphonesimulator
…
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/libtool: for architecture: x86_64 file: /Users/berhart/workspace/dna/llama.cpp.danbev/build-ios-sim/ggml/src/Release-iphonesimulator/libggml-cpu.a(amx.o) has no symbols
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/libtool: for architecture: x86_64 file: /Users/berhart/workspace/dna/llama.cpp.danbev/build-ios-sim/ggml/src/Release-iphonesimulator/libggml-cpu.a(mmq.o) has no symbols
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/libtool: for architecture: x86_64 file: /Users/berhart/workspace/dna/llama.cpp.danbev/build-ios-sim/ggml/src/Release-iphonesimulator/libggml-cpu.a(ggml-cpu-hbm.o) has no symbols
  • In the Info.plist file, the MinimumOSVersion is hardcoded for all platforms. I'm pretty sure that needs to change depending on whether it is iOS, macOS or visionOS:
…
    <key>MinimumOSVersion</key>
    <string>14.0</string>
</dict>
  • This one looks super fishy to me when compiling for iOS. Is that CMake's way to define the iOS version when cross-compiling? Brrrrr.
DEPLOYMENT_TARGET=14.0
…
# Common options for all builds
COMMON_CMAKE_ARGS=(
    -DIOS=ON
    -DCMAKE_SYSTEM_NAME=iOS
    -DCMAKE_OSX_DEPLOYMENT_TARGET=${DEPLOYMENT_TARGET}
  • Since you have config variables for BUILD_SHARED_LIBS etc, I would either apply them here, too, or drop them:
# Build for macOS
cmake -B build-macos -G Xcode \
    -DBUILD_SHARED_LIBS=OFF \
    -DLLAMA_BUILD_EXAMPLES=OFF \
    -DLLAMA_BUILD_TESTS=OFF \
    -DLLAMA_BUILD_SERVER=OFF \
    -DGGML_METAL=ON \
    -DGGML_METAL_EMBED_LIBRARY=ON \
    -DGGML_BLAS_DEFAULT=ON \
    -DGGML_METAL_USE_BF16=ON \
    -DCMAKE_OSX_ARCHITECTURES="arm64;x86_64" \
    -DIOS=OFF \
    -DCMAKE_OSX_DEPLOYMENT_TARGET=10.15 \
    -S .
cmake --build build-macos --config Release
  • I suggest renaming the folder build-ios to build-apple to reflect the true nature of the xcframework.

  • In order to make knowledge explicit and lower the entry barrier for new developers, instead of mentioning it in some README file, I would suggest adding brew install calls to install all build dependencies, or at least testing for (at a minimum one of) them and printing instructions on how to get them if they are not installed (a sketch of such a check follows below).
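
A small sketch of such a check at the top of the build script (the tool names and install hints are assumptions):

# Fail early with an install hint instead of halfway through the build.
for tool in cmake xcodebuild libtool; do
    if ! command -v "$tool" >/dev/null 2>&1; then
        echo "Error: '$tool' not found. Install it (e.g. 'brew install cmake'" \
             "or the Xcode command line tools) and re-run." >&2
        exit 1
    fi
done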

@blaineam

I'm not sure what issue I'm running into on my M4 Pro MacBook running the latest version of Sequoia. I have all of the Xcode dependencies updated and I cannot get past this. I ran into it the other day when I tried too, but the same commit that had issues that day worked once today when I tried again; after pulling in the latest macOS changes from your repo I'm back to the same issues.

CMake Error at ggml/src/CMakeLists.txt:335 (target_compile_features):
  target_compile_features no known features for C compiler

  ""

  version .


CMake Error at ggml/src/CMakeLists.txt:335 (target_compile_features):
  target_compile_features no known features for C compiler

  ""

  version .


CMake Error at cmake/common.cmake:23 (ggml_get_flags):
  ggml_get_flags Function invoked with incorrect arguments for function
  named: ggml_get_flags
Call Stack (most recent call first):
  src/CMakeLists.txt:1 (llama_add_compile_flags)


CMake Error at src/CMakeLists.txt:33 (target_compile_features):
  target_compile_features no known features for CXX compiler

  ""

  version .


CMake Error at cmake/common.cmake:23 (ggml_get_flags):
  ggml_get_flags Function invoked with incorrect arguments for function
  named: ggml_get_flags
Call Stack (most recent call first):
  common/CMakeLists.txt:5 (llama_add_compile_flags)


CMake Error at common/CMakeLists.txt:140 (target_compile_features):
  target_compile_features no known features for CXX compiler

  ""

  version .


-- Configuring incomplete, errors occurred!


@tladesignz

@blaineam, that looks like your C compiler toolchain (LLVM) is broken. Run this:

xcode-select --install

@tladesignz

🫶

@blaineam

@blaineam, that looks like your C compiler toolchain (LLVM) is broken. Run this:

xcode-select --install

Yeah, that didn't fix it; I already had that installed. I needed to run brew uninstall rust and maybe brew uninstall llvm too.

The build made it farther, however it did end up failing on:

** BUILD FAILED **


The following build commands failed:
        CompileC /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-quants.o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/gg

@tladesignz

The build made it farther, however it did end up failing on:

** BUILD FAILED **


The following build commands failed:
        CompileC /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-quants.o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/gg

Unfortunately, this log line doesn't tell how it failed.

Anyway, this brings me to an idea:

Yeah, that didn't fix it; I already had that installed. I needed to run brew uninstall rust and maybe brew uninstall llvm too.

Did you just upgrade from Sonoma to Sequoia? In that case, a lot of brew-installed build tools are broken. You might want to run

brew reinstall installed

(That'll take a lot of time, depending on the stuff you installed via brew. Maybe cmake is the only thing necessary.)

@blaineam


It's not happy about something, I'm just not sure what. It seems cmake fails to find the C and CXX compiler identifications sometimes, and if either of those fails it cannot build. On rare occasions it finds them, though... I had one almost-successful run and then it failed too. Then subsequent builds fail each time. No clue what is wrong on my MacBook to cause such intermittent issues, but something is.

I tried running the following:

brew list | xargs brew reinstall
sudo rm -rf /Library/Developer/CommandLineTools; 
brew install cmake
sudo xcodebuild -license accept
sudo xcodebuild -runFirstLaunch
sudo xcode-select --reset
sudo xcode-select --install
CLI Output

Ari/llama.cpp-temp/llama.cpp on  xcframework-build-10747 took 8s
➜ export SDKROOT=$(xcrun --sdk macosx --show-sdk-path)

Ari/llama.cpp-temp/llama.cpp on  xcframework-build-10747
➜ ./build-xcframework.sh
Checking for required tools...

  • rm -rf build-apple
  • rm -rf build-ios-sim
  • rm -rf build-ios-device
  • rm -rf build-macos
  • rm -rf build-visionos
  • COMMON_CMAKE_ARGS=(-DCMAKE_XCODE_ATTRIBUTE_CODE_SIGNING_REQUIRED=NO -DCMAKE_XCODE_ATTRIBUTE_CODE_SIGN_IDENTITY="" -DCMAKE_XCODE_ATTRIBUTE_CODE_SIGNING_ALLOWED=NO -DBUILD_SHARED_LIBS=OFF -DLLAMA_BUILD_EXAMPLES=${LLAMA_BUILD_EXAMPLES} -DLLAMA_BUILD_TESTS=${LLAMA_BUILD_TESTS} -DLLAMA_BUILD_SERVER=${LLAMA_BUILD_SERVER} -DGGML_METAL_EMBED_LIBRARY=${GGML_METAL_EMBED_LIBRARY} -DGGML_BLAS_DEFAULT=${GGML_BLAS_DEFAULT} -DGGML_METAL=${GGML_METAL} -DGGML_METAL_USE_BF16=${GGML_METAL_USE_BF16})
  • cmake -B build-ios-sim -G Xcode -DCMAKE_XCODE_ATTRIBUTE_CODE_SIGNING_REQUIRED=NO -DCMAKE_XCODE_ATTRIBUTE_CODE_SIGN_IDENTITY= -DCMAKE_XCODE_ATTRIBUTE_CODE_SIGNING_ALLOWED=NO -DBUILD_SHARED_LIBS=OFF -DLLAMA_BUILD_EXAMPLES=OFF -DLLAMA_BUILD_TESTS=OFF -DLLAMA_BUILD_SERVER=OFF -DGGML_METAL_EMBED_LIBRARY=ON -DGGML_BLAS_DEFAULT=ON -DGGML_METAL=ON -DGGML_METAL_USE_BF16=ON -DCMAKE_OSX_DEPLOYMENT_TARGET=14.0 -DIOS=ON -DCMAKE_SYSTEM_NAME=iOS -DCMAKE_OSX_SYSROOT=iphonesimulator '-DCMAKE_OSX_ARCHITECTURES=arm64;x86_64' -DCMAKE_XCODE_ATTRIBUTE_SUPPORTED_PLATFORMS=iphonesimulator -S .
    -- The C compiler identification is AppleClang 16.0.0.16000026
    -- The CXX compiler identification is AppleClang 16.0.0.16000026
    -- Detecting C compiler ABI info
    -- Detecting C compiler ABI info - done
    -- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang - skipped
    -- Detecting C compile features
    -- Detecting C compile features - done
    -- Detecting CXX compiler ABI info
    -- Detecting CXX compiler ABI info - done
    -- Check for working CXX compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang++ - skipped
    -- Detecting CXX compile features
    -- Detecting CXX compile features - done
    -- Found Git: /usr/bin/git (found version "2.39.5 (Apple Git-154)")
    -- Setting GGML_NATIVE_DEFAULT to OFF
    -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
    -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
    -- Found Threads: TRUE
    -- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
    -- CMAKE_SYSTEM_PROCESSOR:
    -- Including CPU backend
    -- Accelerate framework found
    -- Could NOT find OpenMP_C (missing: OpenMP_C_FLAGS OpenMP_C_LIB_NAMES)
    -- Could NOT find OpenMP_CXX (missing: OpenMP_CXX_FLAGS OpenMP_CXX_LIB_NAMES)
    -- Could NOT find OpenMP (missing: OpenMP_C_FOUND OpenMP_CXX_FOUND)
    CMake Warning at ggml/src/ggml-cpu/CMakeLists.txt:53 (message):
    OpenMP not found
    Call Stack (most recent call first):
    ggml/src/CMakeLists.txt:318 (ggml_add_cpu_backend_variant_impl)
    ...

/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:741:22: warning: implicit
conversion loses integer precision: 'int64_t' (aka 'long long') to 'int' [-Wshorten-64-to-32]
741 | const int nb = k / QK8_0;
| ~~ ~~^~~~~~~
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:1023:22: warning:
implicit conversion loses integer precision: 'int64_t' (aka 'long long') to 'int' [-Wshorten-64-to-32]
1023 | const int nb = k / QK8_1;
| ~~ ~~^~~~~~~
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:2107:46: error:
always_inline function 'vdotq_s32' requires target feature 'dotprod', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without
support for 'dotprod'
2107 | const int32x4_t p_0 = ggml_vdotq_s32(ggml_vdotq_s32(vdupq_n_s32(0), v0_0ls, v1_0l), v0_0hs, v1_0h);
| ^
In file included from /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:7:
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-impl.h:323:33: note: expanded from
macro 'ggml_vdotq_s32'
323 | #define ggml_vdotq_s32(a, b, c) vdotq_s32(a, b, c)
| ^
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:2107:31: error:
always_inline function 'vdotq_s32' requires target feature 'dotprod', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without
support for 'dotprod'
2107 | const int32x4_t p_0 = ggml_vdotq_s32(ggml_vdotq_s32(vdupq_n_s32(0), v0_0ls, v1_0l), v0_0hs, v1_0h);
| ^
In file included from /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:7:
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-impl.h:323:33: note: expanded from
macro 'ggml_vdotq_s32'
323 | #define ggml_vdotq_s32(a, b, c) vdotq_s32(a, b, c)
| ^
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:2108:46: error:
always_inline function 'vdotq_s32' requires target feature 'dotprod', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without
support for 'dotprod'
2108 | const int32x4_t p_1 = ggml_vdotq_s32(ggml_vdotq_s32(vdupq_n_s32(0), v0_1ls, v1_1l), v0_1hs, v1_1h);
| ^
In file included from /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:7:
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-impl.h:323:33: note: expanded from
macro 'ggml_vdotq_s32'
323 | #define ggml_vdotq_s32(a, b, c) vdotq_s32(a, b, c)
| ^
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:2108:31: error:
always_inline function 'vdotq_s32' requires target feature 'dotprod', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without
support for 'dotprod'
2108 | const int32x4_t p_1 = ggml_vdotq_s32(ggml_vdotq_s32(vdupq_n_s32(0), v0_1ls, v1_1l), v0_1hs, v1_1h);
| ^
In file included from /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:7:
/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-impl.h:323:33: note: expanded from
macro 'ggml_vdotq_s32'
323 | #define ggml_vdotq_s32(a, b, c) vdotq_s32(a, b, c)
| ^
2 warnings and 4 errors generated.

CompileC /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-traits.o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-traits.cpp normal arm64 c++ com.apple.compilers.llvm.clang.1_0.compiler (in target 'ggml-cpu' from project 'llama.cpp')
cd /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp

Using response file: /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/82b82416624d2658e5098eb0a28c15c5-common-args.resp

/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x c++ -ivfsstatcache /var/folders/z_/mj3r_mwn0rdcppk76mg6b3mr0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphoneos18.2-22C146-d5b9239ec3bf5b3adbecdf21472871e3.sdkstatcache -fmessage-length\=161 -fdiagnostics-show-note-include-stack -fmacro-backtrace-limit\=0 -fcolor-diagnostics -Wno-trigraphs -Wno-missing-field-initializers -Wno-missing-prototypes -Wno-return-type -Wno-non-virtual-dtor -Wno-overloaded-virtual -Wno-exit-time-destructors -Wno-missing-braces -Wparentheses -Wswitch -Wno-unused-function -Wno-unused-label -Wno-unused-parameter -Wno-unused-variable -Wunused-value -Wno-empty-body -Wno-uninitialized -Wno-unknown-pragmas -Wno-shadow -Wno-four-char-constants -Wno-conversion -Wno-constant-conversion -Wno-int-conversion -Wno-bool-conversion -Wno-enum-conversion -Wno-float-conversion -Wno-non-literal-null-conversion -Wno-objc-literal-conversion -Wshorten-64-to-32 -Wno-newline-eof -Wno-c++11-extensions -Wno-implicit-fallthrough -fstrict-aliasing -Wdeprecated-declarations -Winvalid-offsetof -Wno-sign-conversion -Wno-infinite-recursion -Wno-move -Wno-comma -Wno-block-capture-autoreleasing -Wno-strict-prototypes -Wno-range-loop-analysis -Wno-semicolon-before-method-body -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi @/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/82b82416624d2658e5098eb0a28c15c5-common-args.resp -MMD -MT dependencies -MF /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-traits.d --serialize-diagnostics /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-traits.dia -c /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-traits.cpp -o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-traits.o

CompileC /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-hbm.o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-hbm.cpp normal arm64 c++ com.apple.compilers.llvm.clang.1_0.compiler (in target 'ggml-cpu' from project 'llama.cpp')
cd /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp

Using response file: /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/82b82416624d2658e5098eb0a28c15c5-common-args.resp

/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x c++ -ivfsstatcache /var/folders/z_/mj3r_mwn0rdcppk76mg6b3mr0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphoneos18.2-22C146-d5b9239ec3bf5b3adbecdf21472871e3.sdkstatcache -fmessage-length\=161 -fdiagnostics-show-note-include-stack -fmacro-backtrace-limit\=0 -fcolor-diagnostics -Wno-trigraphs -Wno-missing-field-initializers -Wno-missing-prototypes -Wno-return-type -Wno-non-virtual-dtor -Wno-overloaded-virtual -Wno-exit-time-destructors -Wno-missing-braces -Wparentheses -Wswitch -Wno-unused-function -Wno-unused-label -Wno-unused-parameter -Wno-unused-variable -Wunused-value -Wno-empty-body -Wno-uninitialized -Wno-unknown-pragmas -Wno-shadow -Wno-four-char-constants -Wno-conversion -Wno-constant-conversion -Wno-int-conversion -Wno-bool-conversion -Wno-enum-conversion -Wno-float-conversion -Wno-non-literal-null-conversion -Wno-objc-literal-conversion -Wshorten-64-to-32 -Wno-newline-eof -Wno-c++11-extensions -Wno-implicit-fallthrough -fstrict-aliasing -Wdeprecated-declarations -Winvalid-offsetof -Wno-sign-conversion -Wno-infinite-recursion -Wno-move -Wno-comma -Wno-block-capture-autoreleasing -Wno-strict-prototypes -Wno-range-loop-analysis -Wno-semicolon-before-method-body -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi @/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/82b82416624d2658e5098eb0a28c15c5-common-args.resp -MMD -MT dependencies -MF /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-hbm.d --serialize-diagnostics /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-hbm.dia -c /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-hbm.cpp -o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-hbm.o

CompileC /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-c0b5ffc4095df3ac83da75871c8389dd.o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu.c normal arm64 c com.apple.compilers.llvm.clang.1_0.compiler (in target 'ggml-cpu' from project 'llama.cpp')
cd /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp

Using response file: /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/7187679823f38a2a940e0043cdf9d637-common-args.resp

/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x c -ivfsstatcache /var/folders/z_/mj3r_mwn0rdcppk76mg6b3mr0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphoneos18.2-22C146-d5b9239ec3bf5b3adbecdf21472871e3.sdkstatcache -fmessage-length\=161 -fdiagnostics-show-note-include-stack -fmacro-backtrace-limit\=0 -fcolor-diagnostics -Wno-trigraphs -Wno-missing-field-initializers -Wno-missing-prototypes -Wno-return-type -Wno-missing-braces -Wparentheses -Wswitch -Wno-unused-function -Wno-unused-label -Wno-unused-parameter -Wno-unused-variable -Wunused-value -Wno-empty-body -Wno-uninitialized -Wno-unknown-pragmas -Wno-shadow -Wno-four-char-constants -Wno-conversion -Wno-constant-conversion -Wno-int-conversion -Wno-bool-conversion -Wno-enum-conversion -Wno-float-conversion -Wno-non-literal-null-conversion -Wno-objc-literal-conversion -Wshorten-64-to-32 -Wpointer-sign -Wno-newline-eof -Wno-implicit-fallthrough -fstrict-aliasing -Wdeprecated-declarations -Wno-sign-conversion -Wno-infinite-recursion -Wno-comma -Wno-block-capture-autoreleasing -Wno-strict-prototypes -Wno-semicolon-before-method-body -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror\=implicit-int -Werror\=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wdouble-promotion @/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/7187679823f38a2a940e0043cdf9d637-common-args.resp -MMD -MT dependencies -MF /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-c0b5ffc4095df3ac83da75871c8389dd.d --serialize-diagnostics /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-c0b5ffc4095df3ac83da75871c8389dd.dia -c /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu.c -o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-c0b5ffc4095df3ac83da75871c8389dd.o

CompileC /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-aarch64.o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-aarch64.cpp normal arm64 c++ com.apple.compilers.llvm.clang.1_0.compiler (in target 'ggml-cpu' from project 'llama.cpp')
cd /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp

Using response file: /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/82b82416624d2658e5098eb0a28c15c5-common-args.resp

/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x c++ -ivfsstatcache /var/folders/z_/mj3r_mwn0rdcppk76mg6b3mr0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphoneos18.2-22C146-d5b9239ec3bf5b3adbecdf21472871e3.sdkstatcache -fmessage-length\=161 -fdiagnostics-show-note-include-stack -fmacro-backtrace-limit\=0 -fcolor-diagnostics -Wno-trigraphs -Wno-missing-field-initializers -Wno-missing-prototypes -Wno-return-type -Wno-non-virtual-dtor -Wno-overloaded-virtual -Wno-exit-time-destructors -Wno-missing-braces -Wparentheses -Wswitch -Wno-unused-function -Wno-unused-label -Wno-unused-parameter -Wno-unused-variable -Wunused-value -Wno-empty-body -Wno-uninitialized -Wno-unknown-pragmas -Wno-shadow -Wno-four-char-constants -Wno-conversion -Wno-constant-conversion -Wno-int-conversion -Wno-bool-conversion -Wno-enum-conversion -Wno-float-conversion -Wno-non-literal-null-conversion -Wno-objc-literal-conversion -Wshorten-64-to-32 -Wno-newline-eof -Wno-c++11-extensions -Wno-implicit-fallthrough -fstrict-aliasing -Wdeprecated-declarations -Winvalid-offsetof -Wno-sign-conversion -Wno-infinite-recursion -Wno-move -Wno-comma -Wno-block-capture-autoreleasing -Wno-strict-prototypes -Wno-range-loop-analysis -Wno-semicolon-before-method-body -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi @/Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/82b82416624d2658e5098eb0a28c15c5-common-args.resp -MMD -MT dependencies -MF /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-aarch64.d --serialize-diagnostics /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-aarch64.dia -c /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-aarch64.cpp -o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-aarch64.o

note: Run script build phase 'Generate CMakeFiles/ZERO_CHECK' will be run during every build because the option to run the script phase "Based on dependency analysis" is unchecked. (in target 'ZERO_CHECK' from project 'llama.cpp')
note: Run script build phase 'Generate CMakeFiles/ALL_BUILD' will be run during every build because the option to run the script phase "Based on dependency analysis" is unchecked. (in target 'ALL_BUILD' from project 'llama.cpp')
** BUILD FAILED **

The following build commands failed:
CompileC /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/build-ios-device/build/ggml-cpu.build/Release-iphoneos/Objects-normal/arm64/ggml-cpu-quants.o /Users/blainemiller/Documents/mine/Personal/Apps/Ari/llama.cpp-temp/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c normal arm64 c com.apple.compilers.llvm.clang.1_0.compiler (in target 'ggml-cpu' from project 'llama.cpp')
(1 failure)

...

-- The C compiler identification is unknown
-- The CXX compiler identification is unknown

@schlu commented Feb 25, 2025

I am getting an error when I try to compile this. Lots of lines that look like

/Users/schlu/d/ai/danllama/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:2108:31: error: always_inline function 'vdotq_s32' requires target feature 'dotprod',
      but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without support for 'dotprod'
 2108 |         const int32x4_t p_1 = ggml_vdotq_s32(ggml_vdotq_s32(vdupq_n_s32(0), v0_1ls, v1_1l), v0_1hs, v1_1h);
      |                               ^
In file included from /Users/schlu/d/ai/danllama/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-quants.c:7:
/Users/schlu/d/ai/danllama/llama.cpp/ggml/src/ggml-cpu/ggml-cpu-impl.h:323:33: note: expanded from macro 'ggml_vdotq_s32'
  323 | #define ggml_vdotq_s32(a, b, c) vdotq_s32(a, b, c)

I also see these lines in the output

-- ARM -mcpu not found, -mcpu=native will be used
-- Performing Test GGML_MACHINE_SUPPORTS_dotprod
-- Performing Test GGML_MACHINE_SUPPORTS_dotprod - Failed

I should note that I tested this before you added macOS support (which I am very happy about) and it worked fine. I haven't tried reinstalling all my brew packages, because that seems a bit heavy-handed since it was working before.

@blaineam

So I noticed one of the errors was about disk I/O, so I moved the llama.cpp repo to a folder that is not synced to iCloud Drive, and now builds are consistent and mostly working each time. It still fails to build for macOS with the same issues @schlu pointed out every time. iOS and the iOS simulator seem to build OK now that I'm not using an iCloud-synced folder.

@danbev (Collaborator Author) commented Feb 26, 2025

@schlu Thanks for reporting this, I'm looking into this now.

@danbev (Collaborator Author) commented Feb 26, 2025

@blaineam @schlu I've tried to address the issue both of you ran into with 5cecc8a. Would you be able to try building/running again with the latest changes and see if this works for you?

@schlu commented Feb 26, 2025

Much worse now. It stops right away. I rolled back to commit 14d48be to make sure it was still working, and it was still compiling the iOS version fine.

Here is the entire output

Checking for required tools...
-- The C compiler identification is AppleClang 16.0.0.16000026
-- The CXX compiler identification is AppleClang 16.0.0.16000026
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - failed
-- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang
-- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang - broken
CMake Error at /opt/homebrew/Cellar/cmake/3.26.0/share/cmake/Modules/CMakeTestCCompiler.cmake:67 (message):
  The C compiler

    "/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang"

  is not able to compile a simple test program.

  It fails with the following output:

    Change Dir: /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6

    Run Build Command(s):/usr/bin/xcodebuild -project CMAKE_TRY_COMPILE.xcodeproj build -target cmTC_50b6a -parallelizeTargets -configuration Debug -hideShellScriptEnvironment && Command line invocation:
        /Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild -project CMAKE_TRY_COMPILE.xcodeproj build -target cmTC_50b6a -parallelizeTargets -configuration Debug -hideShellScriptEnvironment

    User defaults from command line:
        HideShellScriptEnvironment = YES
        IDEPackageSupportUseBuiltinSCM = YES

    2025-02-26 16:32:56.451 xcodebuild[4771:8451641]  DVTDeviceOperation: Encountered a build number "" that is incompatible with DVTBuildVersion.
    2025-02-26 16:32:56.512 xcodebuild[4771:8451625] [MT] DVTDeviceOperation: Encountered a build number "" that is incompatible with DVTBuildVersion.
    ComputeTargetDependencyGraph
    note: Building targets in dependency order
    note: Target dependency graph (1 target)
        Target 'cmTC_50b6a' in project 'CMAKE_TRY_COMPILE' (no dependencies)

    GatherProvisioningInputs

    CreateBuildDescription

    ExecuteExternalTool /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -v -E -dM -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.2.sdk -x c -c /dev/null

    ExecuteExternalTool /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -v -E -dM -arch arm64 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.2.sdk -x c -c /dev/null

    ExecuteExternalTool /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/libtool -V

    ExecuteExternalTool /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -v -E -dM -arch x86_64 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.2.sdk -x c -c /dev/null

    Build description signature: 74c41cc0e5195433588abbec7f07194f
    Build description path: /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/XCBuildData/74c41cc0e5195433588abbec7f07194f.xcbuilddata
    CreateBuildDirectory /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/CMAKE_TRY_COMPILE.xcodeproj
        builtin-create-build-directory /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build

    CreateBuildDirectory /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/Debug
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/CMAKE_TRY_COMPILE.xcodeproj
        builtin-create-build-directory /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/Debug

    ClangStatCache /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang-stat-cache /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.2.sdk /var/folders/ys/6bjnnn5s47l4p3c_87mgcsth0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphonesimulator18.2-22C146-07b28473f605e47e75261259d3ef3b5a.sdkstatcache
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/CMAKE_TRY_COMPILE.xcodeproj
        /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang-stat-cache /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.2.sdk -o /var/folders/ys/6bjnnn5s47l4p3c_87mgcsth0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphonesimulator18.2-22C146-07b28473f605e47e75261259d3ef3b5a.sdkstatcache

    CreateBuildDirectory /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/EagerLinkingTBDs/Debug-iphonesimulator
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/CMAKE_TRY_COMPILE.xcodeproj
        builtin-create-build-directory /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/EagerLinkingTBDs/Debug-iphonesimulator

    WriteAuxiliaryFile /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/cmTC_50b6a.DependencyMetadataFileList (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6
        write-file /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/cmTC_50b6a.DependencyMetadataFileList

    WriteAuxiliaryFile /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/cmTC_50b6a.LinkFileList (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6
        write-file /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/cmTC_50b6a.LinkFileList

    WriteAuxiliaryFile /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/cmTC_50b6a.LinkFileList (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6
        write-file /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/cmTC_50b6a.LinkFileList

    WriteAuxiliaryFile /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/7187679823f38a2a940e0043cdf9d637-common-args.resp (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6
        write-file /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/7187679823f38a2a940e0043cdf9d637-common-args.resp
    -target arm64-apple-ios16.4-simulator -fpascal-strings -O0 '-DCMAKE_INTDIR="Debug-iphonesimulator"' -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.2.sdk -g -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/Debug/include -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/DerivedSources-normal/arm64 -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/DerivedSources/arm64 -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/DerivedSources -F/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/Debug '-mcpu=apple-a12'

    WriteAuxiliaryFile /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/7187679823f38a2a940e0043cdf9d637-common-args.resp (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6
        write-file /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/7187679823f38a2a940e0043cdf9d637-common-args.resp
    -target x86_64-apple-ios16.4-simulator -fpascal-strings -O0 '-DCMAKE_INTDIR="Debug-iphonesimulator"' -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.2.sdk -fasm-blocks -g -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/Debug/include -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/DerivedSources-normal/x86_64 -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/DerivedSources/x86_64 -I/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/DerivedSources -F/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/Debug '-mcpu=apple-a12'

    CompileC /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/testCCompiler.o /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/testCCompiler.c normal arm64 c com.apple.compilers.llvm.clang.1_0.compiler (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6

        Using response file: /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/7187679823f38a2a940e0043cdf9d637-common-args.resp

        /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x c -ivfsstatcache /var/folders/ys/6bjnnn5s47l4p3c_87mgcsth0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphonesimulator18.2-22C146-07b28473f605e47e75261259d3ef3b5a.sdkstatcache -fmessage-length\=0 -fdiagnostics-show-note-include-stack -fmacro-backtrace-limit\=0 -fno-color-diagnostics -Wno-trigraphs -Wno-missing-field-initializers -Wno-missing-prototypes -Wno-return-type -Wno-missing-braces -Wparentheses -Wswitch -Wno-unused-function -Wno-unused-label -Wno-unused-parameter -Wno-unused-variable -Wunused-value -Wno-empty-body -Wno-uninitialized -Wno-unknown-pragmas -Wno-shadow -Wno-four-char-constants -Wno-conversion -Wno-constant-conversion -Wno-int-conversion -Wno-bool-conversion -Wno-enum-conversion -Wno-float-conversion -Wno-non-literal-null-conversion -Wno-objc-literal-conversion -Wshorten-64-to-32 -Wpointer-sign -Wno-newline-eof -Wno-implicit-fallthrough -fstrict-aliasing -Wdeprecated-declarations -Wno-sign-conversion -Wno-infinite-recursion -Wno-comma -Wno-block-capture-autoreleasing -Wno-strict-prototypes -Wno-semicolon-before-method-body -Wno-macro-redefined -Wno-shorten-64-to-32 -Wno-unused-command-line-argument @/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/7187679823f38a2a940e0043cdf9d637-common-args.resp -MMD -MT dependencies -MF /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/testCCompiler.d --serialize-diagnostics /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/testCCompiler.dia -c /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/testCCompiler.c -o /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/arm64/testCCompiler.o

    CompileC /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/testCCompiler.o /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/testCCompiler.c normal x86_64 c com.apple.compilers.llvm.clang.1_0.compiler (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
        cd /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6

        Using response file: /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/7187679823f38a2a940e0043cdf9d637-common-args.resp

        /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -x c -ivfsstatcache /var/folders/ys/6bjnnn5s47l4p3c_87mgcsth0000gn/C/com.apple.DeveloperTools/16.2-16C5032a/Xcode/SDKStatCaches.noindex/iphonesimulator18.2-22C146-07b28473f605e47e75261259d3ef3b5a.sdkstatcache -fmessage-length\=0 -fdiagnostics-show-note-include-stack -fmacro-backtrace-limit\=0 -fno-color-diagnostics -Wno-trigraphs -Wno-missing-field-initializers -Wno-missing-prototypes -Wno-return-type -Wno-missing-braces -Wparentheses -Wswitch -Wno-unused-function -Wno-unused-label -Wno-unused-parameter -Wno-unused-variable -Wunused-value -Wno-empty-body -Wno-uninitialized -Wno-unknown-pragmas -Wno-shadow -Wno-four-char-constants -Wno-conversion -Wno-constant-conversion -Wno-int-conversion -Wno-bool-conversion -Wno-enum-conversion -Wno-float-conversion -Wno-non-literal-null-conversion -Wno-objc-literal-conversion -Wshorten-64-to-32 -Wpointer-sign -Wno-newline-eof -Wno-implicit-fallthrough -fstrict-aliasing -Wdeprecated-declarations -Wno-sign-conversion -Wno-infinite-recursion -Wno-comma -Wno-block-capture-autoreleasing -Wno-strict-prototypes -Wno-semicolon-before-method-body -Wno-macro-redefined -Wno-shorten-64-to-32 -Wno-unused-command-line-argument @/Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/7187679823f38a2a940e0043cdf9d637-common-args.resp -MMD -MT dependencies -MF /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/testCCompiler.d --serialize-diagnostics /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/testCCompiler.dia -c /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/testCCompiler.c -o /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/testCCompiler.o
    clang: error: unsupported option '-mcpu=' for target 'x86_64-apple-ios16.4-simulator'

    ** BUILD FAILED **


    The following build commands failed:
    	CompileC /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/build/cmTC_50b6a.build/Debug-iphonesimulator/Objects-normal/x86_64/testCCompiler.o /Users/schlu/d/ai/danllama/llama.cpp/build-ios-sim/CMakeFiles/CMakeScratch/TryCompile-OlEfl6/testCCompiler.c normal x86_64 c com.apple.compilers.llvm.clang.1_0.compiler (in target 'cmTC_50b6a' from project 'CMAKE_TRY_COMPILE')
    (1 failure)





  CMake will not be able to correctly generate this project.
Call Stack (most recent call first):
  CMakeLists.txt:2 (project)


-- Configuring incomplete, errors occurred!

@blaineam
Copy link

Ran into the same error as well. Didn't have time to test a previous commit though.

@danbev
Copy link
Collaborator Author

danbev commented Feb 26, 2025

Much worse now. It stops right away.

Sorry about this (I am running the build locally and checking that the builds work in various projects to try to avoid wasting your time). I've pushed another commit which I hope will address this 🤞

@blaineam
Copy link

It built fully this time with no errors that I could see.
It appears to be running nicely on my iPhone 16 Pro Max as well as my M4 Pro MacBook.
Thank you for fixing that!

@schlu
Copy link

schlu commented Feb 26, 2025

Sorry about this (I am running the build locally and checking that the builds work in various projects to try to avoid wasting your time). I've pushed another commit which I hope will address this 🤞

Don't worry, thank you for your time. I am getting this error now:

2025-02-26 18:06:00.833 xcodebuild[13215:8574823]  DVTDeviceOperation: Encountered a build number "" that is incompatible with DVTBuildVersion.
2025-02-26 18:06:00.840 xcodebuild[13215:8574786] [MT] DVTDeviceOperation: Encountered a build number "" that is incompatible with DVTBuildVersion.
-- The C compiler identification is unknown
-- The CXX compiler identification is unknown
System is unknown to cmake, create:
Platform/visionOS to use this system, please post your config file on discourse.cmake.org so it can be added to cmake
CMake Error at CMakeLists.txt:2 (project):
  No CMAKE_C_COMPILER could be found.



CMake Error at CMakeLists.txt:2 (project):
  No CMAKE_CXX_COMPILER could be found.



-- Configuring incomplete, errors occurred!

I went and downloaded the visionOS stuff in Xcode but the error persisted:

[Screenshot: CleanShot 2025-02-26 at 18 14 06@2x]
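One way to narrow this down is to check whether the visionOS SDKs are actually visible to the command-line tools and which CMake version is first on the PATH; a minimal check, assuming cmake and xcodebuild are both installed:

```console
# Check that the visionOS (xros) SDKs are visible to the command-line tools.
$ xcodebuild -showsdks | grep -i xros

# Check which cmake is picked up and its version; visionOS support was only
# added in CMake 3.28, so an older version reports the system as unknown.
$ which cmake
$ cmake --version
```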

@blaineam
Copy link

blaineam commented Feb 26, 2025

I do see this when adding my app with the llama.xcframework to App Store Connect.

The archive did not include a dSYM for the llama.framework with the UUIDs [137991C7-3209-3649-AF6D-A372E87A3501, CDD4EED7-ED99-30C3-9EBD-61640833600C]. Ensure that the archive's dSYM folder includes a DWARF file for llama.framework with the expected UUIDs.

And they emailed me these issues:

ITMS-90291: Malformed Framework - The framework bundle llama (sample.app/Contents/Frameworks/llama.framework) must contain a symbolic link 'llama' -> 'Versions/Current/llama'. Refer to the Anatomy of Framework Bundles for more information.

ITMS-90291: Malformed Framework - The framework bundle llama (sample.app/Contents/Frameworks/llama.framework) must contain a symbolic link 'Resources' -> 'Versions/Current/Resources'. Refer to the Anatomy of Framework Bundles for more information.

ITMS-90292: Malformed Framework - The framework bundle llama (sample.app/Contents/Frameworks/llama.framework) 'Versions' directory must contain a symbolic link 'Current' resolving to a specific version directory. Resolved link target: '${linkTarget}'. Refer to the Anatomy of Framework Bundles for more information.

Here is the link they reference, too:
Anatomy of Framework Bundles
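For the dSYM warning specifically, the UUIDs in the archive's framework binary and in its dSYM can be compared with dwarfdump; a minimal sketch, with hypothetical archive paths:

```console
# UUIDs baked into the framework binary inside the archive (hypothetical path).
$ dwarfdump --uuid sample.xcarchive/Products/Applications/sample.app/Contents/Frameworks/llama.framework/Versions/A/llama

# UUIDs of the dSYM that should accompany it in the archive's dSYMs folder.
$ dwarfdump --uuid sample.xcarchive/dSYMs/llama.framework.dSYM
```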

@danbev
Copy link
Collaborator Author

danbev commented Feb 26, 2025

@schlu Can you tell me the version of cmake you are using? I'm asking because I think CMake added official support for visionOS in version 3.28.0. Based on this part of the error you are seeing, perhaps that is the issue:

System is unknown to cmake, create:
Platform/visionOS to use this system, please post your config file on discourse.cmake.org so it can be added to cmake
CMake Error at CMakeLists.txt:2 (project):

If this is the case I'll add a check to the script.
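A check along these lines could work; this is only a sketch assuming a bash script and a minimum version of 3.28.0, not necessarily what will land in build-xcframework.sh:

```bash
# Require CMake >= 3.28.0, since older versions do not know the visionOS platform.
REQUIRED_CMAKE="3.28.0"
CURRENT_CMAKE=$(cmake --version | head -n1 | awk '{print $3}')

# sort -V puts the lowest version first; if that is the current version and it
# differs from the required one, the installed CMake is too old.
if [ "$(printf '%s\n%s\n' "$REQUIRED_CMAKE" "$CURRENT_CMAKE" | sort -V | head -n1)" != "$REQUIRED_CMAKE" ]; then
    echo "Error: CMake $REQUIRED_CMAKE or newer is required for the visionOS build (found $CURRENT_CMAKE)" >&2
    exit 1
fi
```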

@danbev
Copy link
Collaborator Author

danbev commented Feb 26, 2025

I do see this when adding my app with the llama.xcframework to App Store Connect.

I've not tried this but I'll take a closer look at this tomorrow 👍

danbev added 4 commits March 3, 2025 12:00
This commit adds the ability to create a GitHub release with the
xcframework build artifact.
This commit adds scripts that can validate the iOS, macOS, tvOS, and
VisionOS applications. The scripts create a simple test app project,
copy the llama.xcframework to the test project, build and archive the
app, create an IPA from the archive, and validate the IPA using altool.

The motivation for this is to provide some basic validation and
hopefully avoid having to manually validate apps in Xcode.
This commit removes the Package.swift file, as we are now building an
XCFramework for the project.
@danbev danbev force-pushed the xcframework-build-10747 branch from da0ea36 to 69a6d36 Compare March 3, 2025 11:02
@danbev
Copy link
Collaborator Author

danbev commented Mar 3, 2025

I've rebased this to hopefully make it a little easier to review after all the iterations.

I'm not sure how we should handle the Package.swift issue mentioned in this comment. At the moment Package.swift has been removed by a commit in this PR.

For local testing I've been using this as a Package.swift (not checked in):

// swift-tools-version: 5.10

import PackageDescription

let package = Package(
    name: "llama",
    platforms: [
        .iOS(.v14),
        .macOS(.v10_15),
        .visionOS(.v1)
    ],
    products: [
        .library(name: "llama.cpp", targets: ["llama-framework"])
    ],
    targets: [
        .binaryTarget(
            name: "llama-framework",
            //url: "https://github.com/ggml-org/llama.cpp/releases/download/bXXXX/llama-bXXXX-xcframework.zip",
            //checksum: "the-sha256-checksum-here"

            // The following can be used for local testing. Run the build-xcframework.sh
            // script first to generate the XCFramework.
            path: "build-apple/llama.xcframework"
        )
    ]
)

This was only to verify that I could create a Swift project and use the framework as a dependency from it. Perhaps we can investigate this further in a follow-up commit to figure out the best way to handle the checksum issue?

@danbev
Copy link
Collaborator Author

danbev commented Mar 4, 2025

@blaineam @tladesignz @schlu @ggerganov @slaren Thanks for all the testing and discussions!

If possible it would be nice to get this merged so people can start using it. I'll take care of any issues that might crop up (I'm thinking in particular of the removal of Package.swift, since I'm not sure whether it will affect other projects that might be using it).

The next release should then generate a llama-bxxxx-xcframework.zip and make it possible for users to specify something like the following in their Package.swift:

// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyLlamaPackage",
    targets: [
        .executableTarget(
            name: "MyLlamaPackage",
            dependencies: [
                "LlamaFramework"
            ]),
        .binaryTarget(
            name: "LlamaFramework",
            url: "https://github.com/ggml-org/llama.cpp/releases/download/bxxxx/llama-bxxxx-xcframework.zip",
            checksum: "<checksum>"
        )
    ]
)
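The <checksum> value can be computed with SwiftPM itself once the release zip has been downloaded; for example, using the placeholder file name from above:

```console
# Compute the checksum that goes into the binaryTarget declaration.
$ swift package compute-checksum llama-bxxxx-xcframework.zip
```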

@tladesignz
Copy link

Thank you @danbev, this turned out to be a massive amount of work! Well done!

@danbev danbev merged commit a057897 into ggml-org:master Mar 5, 2025
47 checks passed
mglambda pushed a commit to mglambda/llama.cpp that referenced this pull request Mar 8, 2025
* llama : add xcframework build script

This commit adds a script to build an XCFramework for Apple
ios, macos, visionos, and tvos platforms.

The generated XCFramework can then be added to a project and used in
the same way as a regular framework. The llama.swiftui example project
has been updated to use the XCFramework and can be started using the
following command:
```console
$ open examples/llama.swiftui/llama.swiftui.xcodeproj/
```

Refs: ggml-org#10747

* examples : remove llama.cpp (source dir ref) from project.pbxproj

This commit removes the reference to llama.cpp from the project.pbxproj
file since Package.swift has been removed.

* ci : updated build.yml to use build-xcframework.sh

* ci : add xcframework build to github releases

This commit adds the ability to create a GitHub release with the
xcframework build artifact.

* scripts : add apple app validation scripts

This commit adds scripts that can validate the iOS, macOS, tvOS, and
VisionOS applications. The scripts create a simple test app project,
copy the llama.xcframework to the test project, build and archive the
app, create an IPA from the archive, and validate the IPA using altool.

The motivation for this is to provide some basic validation and
hopefully avoid having to manually validate apps in Xcode.

* llama : remove Package.swift

This commit removes the Package.swift file, as we are now building an
XCFramework for the project.

* llama : remove Sources and spm-headers directories

* llama : use TargetConditionals.h for visionOS/tvOS
@pgorzelany
Copy link

Hi @danbev thank you for the great work you have done here! I was able to wrap the xcframework in a Swift package and use it on iOS. Unfortunately, when I try to build it on macOS I get a build error:
Command CodeSign failed with a nonzero exit code

Did you encounter a similar error by any chance?

@danbev
Copy link
Collaborator Author

danbev commented Mar 15, 2025

Did you encounter a similar error by any chance?

I have not run into this, at least not yet. I've pushed a very basic project that I'm able to build and run here. This does not do anything useful apart from accessing the llama.cpp API using the xcframework, so it might not be "advanced" enough to reproduce the error you are seeing. If you can figure out how I can reproduce this then please let me know and I'll look into it.

@blaineam
Copy link

Hi @danbev thank you for the great work you have done here! I was able to wrap the xcframework in a Swift package and use it on iOS. Unfortunately, when I try to build it on macOS I get a build error: Command CodeSign failed with a nonzero exit code

Did you encounter a similar error by any chance?

I have an iOS and macOS app on the App Store already that uses the .xcframework from this PR, and it is working nicely. I ran into a similar error when trying to use any SPM workaround for llama.cpp, including this .xcframework, but only in the response from submitting it for notarization, and the debug guide Apple links you to passes every time without issues. I could never figure out why Direct Distribution notarization is failing now, so I just gave up on it. The App Store is good enough for my app.

@mchew
Copy link

mchew commented Mar 15, 2025

Unfortunately, when I try to build it on macOS I get a build error: Command CodeSign failed with a nonzero exit code

I am also seeing this error when trying to build a macOS app. Additionally, Xcode gives me this warning:

Couldn't resolve framework symlink for '/Users/mchew/Library/Developer/Xcode/DerivedData/MyLlamaApp-ciauxdwporknvrfccbfrovlonuyt/SourcePackages/artifacts/MyLlamaPackage/llama/llama.xcframework/macos-arm64_x86_64/llama.framework/Versions/Current': readlink(/Users/mchew/Library/Developer/Xcode/DerivedData/MyLlamaApp-ciauxdwporknvrfccbfrovlonuyt/SourcePackages/artifacts/MyLlamaPackage/llama/llama.xcframework/macos-arm64_x86_64/llama.framework/Versions/Current): Invalid argument (22)

I've added the xcframework to my app as a URL binary target. I think the issue is with the structure of the framework's Versions directory. According to Apple's docs, Versions/Current should be a symlink to Versions/A; however, in the zip it is a separate directory. If I manually recreate it as a symlink in Xcode's build directory, then the error goes away.
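The manual workaround amounts to something like the following sketch, with hypothetical paths inside the unpacked macOS slice of the xcframework:

```console
# Hypothetical location of the macOS slice after SwiftPM has unpacked the zip.
$ cd llama.xcframework/macos-arm64_x86_64/llama.framework/Versions
# Replace the plain 'Current' directory with the symlink Apple expects.
$ rm -rf Current
$ ln -s A Current
```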

I see that building the xcframework locally creates the expected symlink, so I think zipping the release artifact with the -y option to preserve symlinks should fix this.
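In other words, the release artifact would need to be packaged with symlink preservation enabled, roughly like this (file names are placeholders):

```console
# --symlinks (or -y) stores symlinks as symlinks instead of following them,
# which keeps Versions/Current intact inside the zip.
$ zip -r --symlinks llama-bxxxx-xcframework.zip build-apple/llama.xcframework
```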

@danbev
Copy link
Collaborator Author

danbev commented Mar 16, 2025

@mchew Thanks for figuring this out!
I'd been testing locally just with the built xcframework, but if I zip and then unzip it, I also see this error. I'll push a fix shortly.

danbev added a commit that referenced this pull request Mar 16, 2025
This commit adds the --symlinks option to the zip command used to create
the xcframework zip file. This is necessary to preserve symlinks in the
zip file. Without this option, the Versions symlink is stored as a
regular directory entry in the zip file rather than as a symlink, which
causes the following error in Xcode:
```console
Couldn't resolve framework symlink for '/Users/danbev/work/ai/llama.cpp/tmp_1/build-apple/llama.xcframework/macos-arm64_x86_64/llama.framework/Versions/Current': readlink(/Users/danbev/work/ai/llama.cpp/tmp_1/build-apple/llama.xcframework/macos-arm64_x86_64/llama.framework/Versions/Current): Invalid argument (22)
```

Refs: #11996 (comment)
@pgorzelany
Copy link

Thank you! Can confirm it now works on macOS.

Labels
devops (improvements to build systems and github actions), examples, script (Script related)
9 participants