********************************************************************************
conan test cci-ed4b7c48/recipes/llama-cpp/all/test_package/conanfile.py llama-cpp/b2038@#85990c1ca60ea51e36a691c39412afb7 -pr /home/conan/w/prod-v1/bsr/102469/fcaba/profile_linux_9_libstdcpp11_gcc_release_64.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Configuration:
[settings]
arch=x86_64
build_type=Release
compiler=gcc
compiler.libcxx=libstdc++11
compiler.version=9
os=Linux
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True

llama-cpp/b2038 (test package): Installing package
Requirements
    llama-cpp/b2038 from local cache - Cache
Packages
    llama-cpp/b2038:f4a9ba174cced6ba89ea48b308d487b3c940b971 - Cache

Installing (downloading, building) binaries...
llama-cpp/b2038: Already installed!
llama-cpp/b2038 (test package): Generator txt created conanbuildinfo.txt
llama-cpp/b2038 (test package): Generator 'CMakeToolchain' calling 'generate()'
llama-cpp/b2038 (test package): Preset 'release' added to CMakePresets.json.
Invoke it manually using 'cmake --preset release'
llama-cpp/b2038 (test package): If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW -DCMAKE_BUILD_TYPE=Release'
llama-cpp/b2038 (test package): Generator 'CMakeDeps' calling 'generate()'
llama-cpp/b2038 (test package): Generator 'VirtualRunEnv' calling 'generate()'
llama-cpp/b2038 (test package): Aggregating env generators
llama-cpp/b2038 (test package): Generated conaninfo.txt
llama-cpp/b2038 (test package): Generated graphinfo
Using lockfile: '/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release/generators/conan.lock'
Using cached profile from lockfile
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] 'fPIC' option not found
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b2038 (test package): Calling build()
llama-cpp/b2038 (test package): CMake command: cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/."
----Running------
> cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/."
-----------------
-- Using Conan toolchain: /home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release/generators/conan_toolchain.cmake
-- The CXX compiler identification is GNU 9.2.1
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Conan: Component target declared 'llama-cpp::llama'
-- Conan: Component target declared 'llama-cpp::common'
-- Conan: Target declared 'llama-cpp::llama-cpp'
-- Configuring done
-- Generating done
-- Build files have been written to: /home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release
llama-cpp/b2038 (test package): CMake command: cmake --build "/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release" '--' '-j3'
----Running------
> cmake --build "/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release" '--' '-j3'
-----------------
Scanning dependencies of target test_package
[ 50%] Building CXX object CMakeFiles/test_package.dir/test_package.cpp.o
[100%] Linking CXX executable test_package
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_POLICY_DEFAULT_CMP0091

/usr/bin/ld: /home/conan/w/prod-v1/bsr/102469/adfeb/.conan/data/llama-cpp/b2038/_/_/package/f4a9ba174cced6ba89ea48b308d487b3c940b971/lib/libllama.a(llama.cpp.o): in function `llama_convert_tensor_internal(ggml_tensor*, std::vector, std::allocator > >&, std::vector >&, unsigned long, int)':
llama.cpp:(.text+0xf864): undefined reference to `pthread_create'
/usr/bin/ld: llama.cpp:(.text+0xfa75): undefined reference to `pthread_create'
/usr/bin/ld: /home/conan/w/prod-v1/bsr/102469/adfeb/.conan/data/llama-cpp/b2038/_/_/package/f4a9ba174cced6ba89ea48b308d487b3c940b971/lib/libllama.a(llama.cpp.o): in function `llama_model_quantize_internal(std::__cxx11::basic_string, std::allocator > const&, std::__cxx11::basic_string, std::allocator > const&, llama_model_quantize_params const*)':
llama.cpp:(.text+0x35e0c): undefined reference to `pthread_create'
/usr/bin/ld: llama.cpp:(.text+0x35f8b): undefined reference to `pthread_create'
/usr/bin/ld: /home/conan/w/prod-v1/bsr/102469/adfeb/.conan/data/llama-cpp/b2038/_/_/package/f4a9ba174cced6ba89ea48b308d487b3c940b971/lib/libllama.a(ggml.c.o): in function `ggml_graph_compute_thread':
ggml.c:(.text+0x3477b): undefined reference to `pthread_setaffinity_np'
/usr/bin/ld: /home/conan/w/prod-v1/bsr/102469/adfeb/.conan/data/llama-cpp/b2038/_/_/package/f4a9ba174cced6ba89ea48b308d487b3c940b971/lib/libllama.a(ggml.c.o): in function `ggml_graph_compute':
ggml.c:(.text+0x38503): undefined reference to `pthread_create'
/usr/bin/ld: ggml.c:(.text+0x385bf): undefined reference to `pthread_setaffinity_np'
/usr/bin/ld: ggml.c:(.text+0x38606): undefined reference to `pthread_join'
collect2: error: ld returned 1 exit status
make[2]: *** [CMakeFiles/test_package.dir/build.make:105: test_package] Error 1
make[1]: *** [CMakeFiles/Makefile2:95: CMakeFiles/test_package.dir/all] Error 2
make: *** [Makefile:103: all] Error 2
llama-cpp/b2038 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
ERROR: llama-cpp/b2038 (test package): Error in build() method, line 22
	cmake.build()
	ConanException: Error 2 while executing cmake --build "/home/conan/w/prod-v1/bsr/cci-ed4b7c48/recipes/llama-cpp/all/test_package/build/Release" '--' '-j3'