********************************************************************************
conan test cci-82f9f6c9/recipes/llama-cpp/all/test_package/conanfile.py llama-cpp/b3012@#bb0c40f9a2f73dd6ee7a7e297b66c4e4 -pr /home/conan/workspace/prod-v1/bsr/49909/fddad/profile_linux_9_libstdcpp11_gcc_release_64.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Configuration:
[settings]
arch=x86_64
build_type=Release
compiler=gcc
compiler.libcxx=libstdc++11
compiler.version=9
os=Linux
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True

llama-cpp/b3012 (test package): Installing package
Requirements
    llama-cpp/b3012 from local cache - Cache
Packages
    llama-cpp/b3012:b911f48570f9bb2902d9e83b2b9ebf9d376c8c56 - Cache

Installing (downloading, building) binaries...
llama-cpp/b3012: Already installed!
llama-cpp/b3012 (test package): Generator txt created conanbuildinfo.txt
llama-cpp/b3012 (test package): Generator 'CMakeToolchain' calling 'generate()'
llama-cpp/b3012 (test package): Preset 'release' added to CMakePresets.json. Invoke it manually using 'cmake --preset release'
llama-cpp/b3012 (test package): If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW -DCMAKE_BUILD_TYPE=Release'
llama-cpp/b3012 (test package): Generator 'VirtualRunEnv' calling 'generate()'
llama-cpp/b3012 (test package): Generator 'CMakeDeps' calling 'generate()'
llama-cpp/b3012 (test package): Aggregating env generators
llama-cpp/b3012 (test package): Generated conaninfo.txt
llama-cpp/b3012 (test package): Generated graphinfo
Using lockfile: '/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release/generators/conan.lock'
Using cached profile from lockfile
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] 'fPIC' option not found
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b3012 (test package): Calling build()
llama-cpp/b3012 (test package): CMake command: cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/."
----Running------
> cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/."
-----------------
-- Using Conan toolchain: /home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake
-- The CXX compiler identification is GNU 9.2.1
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Conan: Component target declared 'llama-cpp::llama'
-- Conan: Component target declared 'llama-cpp::common'
-- Conan: Target declared 'llama-cpp::llama-cpp'
-- Configuring done
-- Generating done
-- Build files have been written to: /home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release
llama-cpp/b3012 (test package): CMake command: cmake --build "/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release" '--' '-j3'
----Running------
> cmake --build "/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release" '--' '-j3'
-----------------
Scanning dependencies of target test_package
[ 50%] Building CXX object CMakeFiles/test_package.dir/test_package.cpp.o
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_POLICY_DEFAULT_CMP0091

/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/test_package.cpp: In function ‘int main(int, char**)’:
/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/test_package.cpp:21:29: error: too many arguments to function ‘void llama_backend_init()’
   21 |     llama_backend_init(false);
      |                             ^
In file included from /home/conan/workspace/prod-v1/bsr/49909/bffdd/.conan/data/llama-cpp/b3012/_/_/package/b911f48570f9bb2902d9e83b2b9ebf9d376c8c56/include/common/common.h:5,
                 from /home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/test_package.cpp:1:
/home/conan/workspace/prod-v1/bsr/49909/bffdd/.conan/data/llama-cpp/b3012/_/_/package/b911f48570f9bb2902d9e83b2b9ebf9d376c8c56/include/llama.h:389:20: note: declared here
  389 |     LLAMA_API void llama_backend_init(void);
      |                    ^~~~~~~~~~~~~~~~~~
make[2]: *** [CMakeFiles/test_package.dir/build.make:82: CMakeFiles/test_package.dir/test_package.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:95: CMakeFiles/test_package.dir/all] Error 2
make: *** [Makefile:103: all] Error 2
WARN: *** Conan 1 is legacy and on a deprecation path ***
WARN: *** Please upgrade to Conan 2 ***
llama-cpp/b3012 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b3012 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
ERROR: llama-cpp/b3012 (test package): Error in build() method, line 22
	cmake.build()
	ConanException: Error 2 while executing cmake --build "/home/conan/workspace/prod-v1/bsr/cci-82f9f6c9/recipes/llama-cpp/all/test_package/build/Release" '--' '-j3'
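
Note on the failure: the configure and link steps succeed, and the error is purely an API mismatch in the recipe's test_package.cpp. The test still calls the older llama.cpp API, llama_backend_init(bool numa), while the b3012 header quoted in the compiler note declares LLAMA_API void llama_backend_init(void). Below is a minimal sketch of a corrected test program, assuming the b3012 headers shown in this log; the llama_numa_init() call, the GGML_NUMA_STRATEGY_DISABLED value, and llama_backend_free() do not appear in this log and are my assumption of how upstream split out the old `false` NUMA flag, so verify them against the packaged llama.h before relying on this.

    #include <llama.h>

    int main(int /*argc*/, char** /*argv*/) {
        // b3012 declares: LLAMA_API void llama_backend_init(void);
        // so the old bool argument must be dropped.
        llama_backend_init();

        // Assumption: NUMA setup moved into a separate call; the disabled
        // strategy should mirror the old llama_backend_init(false).
        llama_numa_init(GGML_NUMA_STRATEGY_DISABLED);

        // Assumed teardown counterpart, not exercised in this log.
        llama_backend_free();
        return 0;
    }

Since the recipe's test package may build against older llama-cpp versions as well, the call may need to be version-gated in the recipe rather than replaced outright. The repeated build-profile warnings are unrelated to this failure; rerunning the conan test command with an explicit build profile (e.g. -pr:b=default, as the log itself suggests) should silence them, and the Conan 1 deprecation warnings are informational only.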