********************************************************************************
conan test cci-a3d3e2e9/recipes/llama-cpp/all/test_package/conanfile.py llama-cpp/b3542@#1d40bd238142cdda7a446e45a014a509 -pr /Users/jenkins/workspace/prod-v1/bsr/84630/dbeed/profile_osx_130_libcpp_apple-clang_release_armv8.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True -c tools.apple:sdk_path=/Applications/conan/xcode/13.0/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.3.sdk
********************************************************************************
Configuration:
[settings]
arch=armv8
build_type=Release
compiler=apple-clang
compiler.libcxx=libc++
compiler.version=13.0
os=Macos
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True
tools.apple:sdk_path=/Applications/conan/xcode/13.0/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX11.3.sdk

llama-cpp/b3542 (test package): Installing package
Requirements
    llama-cpp/b3542 from local cache - Cache
Packages
    llama-cpp/b3542:f1a36a2aea2da1148eb4bddd758d8db183c6db83 - Cache

Installing (downloading, building) binaries...
llama-cpp/b3542: Already installed!
llama-cpp/b3542 (test package): Generator 'VirtualRunEnv' calling 'generate()'
llama-cpp/b3542 (test package): Generator 'CMakeToolchain' calling 'generate()'
llama-cpp/b3542 (test package): Preset 'release' added to CMakePresets.json. Invoke it manually using 'cmake --preset release'
llama-cpp/b3542 (test package): If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW -DCMAKE_BUILD_TYPE=Release'
llama-cpp/b3542 (test package): Generator txt created conanbuildinfo.txt
llama-cpp/b3542 (test package): Generator 'CMakeDeps' calling 'generate()'
llama-cpp/b3542 (test package): Aggregating env generators
llama-cpp/b3542 (test package): Generated conaninfo.txt
llama-cpp/b3542 (test package): Generated graphinfo
Using lockfile: '/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release/generators/conan.lock'
Using cached profile from lockfile
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] 'fPIC' option not found
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b3542 (test package): Calling build()
llama-cpp/b3542 (test package): CMake command: cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/."
----Running------
> cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/."
-----------------
-- Using Conan toolchain: /Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake
-- The CXX compiler identification is AppleClang 13.0.0.13000029
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /Applications/conan/xcode/13.0/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Conan: Component target declared 'llama-cpp::common'
-- Conan: Component target declared 'llama-cpp::llama'
-- Conan: Target declared 'llama-cpp::llama-cpp'
-- Configuring done
-- Generating done
-- Build files have been written to: /Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release
llama-cpp/b3542 (test package): CMake command: cmake --build "/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release" '--' '-j8'
----Running------
> cmake --build "/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release" '--' '-j8'
-----------------
[ 50%] Building CXX object CMakeFiles/test_package.dir/test_package.cpp.o
[100%] Linking CXX executable test_package
[100%] Built target test_package
llama-cpp/b3542 (test package): Running test()
----Running------
> . "/Users/jenkins/workspace/prod-v1/bsr/cci-a3d3e2e9/recipes/llama-cpp/all/test_package/build/Release/generators/conanrun.sh" && ./test_package
-----------------
Main GPU: 0
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_POLICY_DEFAULT_CMP0091

WARN: *** Conan 1 is legacy and on a deprecation path ***
WARN: *** Please upgrade to Conan 2 ***
llama-cpp/b3542 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b3542 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
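The log references the test package's CMakeLists.txt and test_package.cpp only indirectly. As a sketch (the actual recipe files are not shown here; the target and component names are taken from the "Conan: ... target declared" lines above, and everything else is an assumption), a minimal CMakeLists.txt consistent with this configure step could look like:

```cmake
# Sketch only: reconstructed from the targets the log declares,
# not the actual test_package recipe file.
cmake_minimum_required(VERSION 3.15)
project(test_package LANGUAGES CXX)

# CMakeDeps generates a llama-cpp package config exposing the components
# llama-cpp::common and llama-cpp::llama, plus the aggregate target
# llama-cpp::llama-cpp (see the "-- Conan: ..." lines in the log).
find_package(llama-cpp REQUIRED CONFIG)

add_executable(${PROJECT_NAME} test_package.cpp)
target_link_libraries(${PROJECT_NAME} PRIVATE llama-cpp::llama-cpp)
```

With the Conan-generated toolchain file in place, this is configured exactly as the log shows: either via `cmake --preset release`, or via the explicit `-DCMAKE_TOOLCHAIN_FILE=.../conan_toolchain.cmake` fallback the output suggests for CMake versions older than 3.19.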