********************************************************************************
conan install llama-cpp/b3542@#1d40bd238142cdda7a446e45a014a509 --build=llama-cpp -pr /home/conan/workspace/prod-v1/bsr/84618/adddc/profile_linux_13_libstdcpp_clang_release_64.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Auto detecting your dev setup to initialize the default profile (/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/profiles/default)
CC and CXX: clang, clang++
Found clang 13.0
clang>=8, using the major as version
Default settings
    os=Linux
    os_build=Linux
    arch=x86_64
    arch_build=x86_64
    compiler=clang
    compiler.version=13
    compiler.libcxx=libstdc++
    build_type=Release
*** You can change them in /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/profiles/default ***
*** Or override with -s compiler='other' -s ... ***
Configuration:
[settings]
arch=x86_64
build_type=Release
compiler=clang
compiler.libcxx=libstdc++
compiler.version=13
os=Linux
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True
llama-cpp/b3542: Forced build from source
Installing package: llama-cpp/b3542
Requirements
    llama-cpp/b3542 from local cache - Cache
Packages
    llama-cpp/b3542:80d3c8d077516ce6e7dc913ef92912e9172eae4c - Build
Installing (downloading, building) binaries...
[HOOK - conan-center.py] pre_source(): [IMMUTABLE SOURCES (KB-H010)] OK
llama-cpp/b3542: Configuring sources in /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/source/src
llama-cpp/b3542: [HOOK - conan-center.py] post_source(): [LIBCXX MANAGEMENT (KB-H011)] OK
[HOOK - conan-center.py] post_source(): [CPPSTD MANAGEMENT (KB-H022)] OK
[HOOK - conan-center.py] post_source(): [SHORT_PATHS USAGE (KB-H066)] OK
llama-cpp/b3542: Copying sources to build folder
llama-cpp/b3542: Building your package in /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c
llama-cpp/b3542: Generator txt created conanbuildinfo.txt
llama-cpp/b3542: Calling generate()
llama-cpp/b3542: Preset 'release' added to CMakePresets.json. Invoke it manually using 'cmake --preset release'
llama-cpp/b3542: If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release/generators/conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW -DCMAKE_BUILD_TYPE=Release'
llama-cpp/b3542: Aggregating env generators
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK. 'fPIC' option found and apparently well managed
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b3542: Calling build()
llama-cpp/b3542: CMake command: cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/src"
----Running------
> cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/src"
-----------------
-- Using Conan toolchain: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release/generators/conan_toolchain.cmake
-- Conan toolchain: Setting CMAKE_POSITION_INDEPENDENT_CODE=ON (options.fPIC)
-- Conan toolchain: Setting BUILD_SHARED_LIBS = OFF
-- The C compiler identification is Clang 13.0.1
-- The CXX compiler identification is Clang 13.0.1
-- Check for working C compiler: /usr/local/bin/clang
-- Check for working C compiler: /usr/local/bin/clang -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/local/bin/clang++
-- Check for working CXX compiler: /usr/local/bin/clang++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /usr/bin/git (found version "2.43.2")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Found Threads: TRUE
-- Could NOT find OpenMP_C (missing: OpenMP_C_FLAGS OpenMP_C_LIB_NAMES)
-- Could NOT find OpenMP_CXX (missing: OpenMP_CXX_FLAGS OpenMP_CXX_LIB_NAMES)
-- Could NOT find OpenMP (missing: OpenMP_C_FOUND OpenMP_CXX_FOUND)
-- Using llamafile
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- x86 detected
-- Configuring done
-- Generating done
-- Build files have been written to: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release
llama-cpp/b3542: CMake command: cmake --build "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release" '--' '-j3'
----Running------
> cmake --build "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release" '--' '-j3'
-----------------
[ 4%] Generating build details from Git
-- Found Git: /usr/bin/git (found version "2.43.2")
Scanning dependencies of target ggml
[ 8%] Building C object ggml/src/CMakeFiles/ggml.dir/ggml.c.o
[ 12%] Building C object ggml/src/CMakeFiles/ggml.dir/ggml-alloc.c.o
Scanning dependencies of target build_info
[ 16%] Building CXX object common/CMakeFiles/build_info.dir/build-info.cpp.o
[ 16%] Built target build_info
[ 20%] Building C object ggml/src/CMakeFiles/ggml.dir/ggml-backend.c.o
[ 25%] Building C object ggml/src/CMakeFiles/ggml.dir/ggml-quants.c.o
[ 29%] Building CXX object ggml/src/CMakeFiles/ggml.dir/llamafile/sgemm.cpp.o
[ 33%] Building C object ggml/src/CMakeFiles/ggml.dir/ggml-aarch64.c.o
[ 37%] Linking CXX static library libggml.a
[ 37%] Built target ggml
Scanning dependencies of target llama
[ 41%] Building CXX object src/CMakeFiles/llama.dir/llama.cpp.o
[ 45%] Building CXX object src/CMakeFiles/llama.dir/llama-vocab.cpp.o
[ 50%] Building CXX object src/CMakeFiles/llama.dir/llama-grammar.cpp.o
[ 54%] Building CXX object src/CMakeFiles/llama.dir/llama-sampling.cpp.o
[ 58%] Building CXX object src/CMakeFiles/llama.dir/unicode.cpp.o
[ 62%] Building CXX object src/CMakeFiles/llama.dir/unicode-data.cpp.o
[ 66%] Linking CXX static library libllama.a
[ 66%] Built target llama
Scanning dependencies of target common
[ 70%] Building CXX object common/CMakeFiles/common.dir/common.cpp.o
[ 75%] Building CXX object common/CMakeFiles/common.dir/sampling.cpp.o
[ 79%] Building CXX object common/CMakeFiles/common.dir/console.cpp.o
[ 83%] Building CXX object common/CMakeFiles/common.dir/grammar-parser.cpp.o
[ 87%] Building CXX object common/CMakeFiles/common.dir/json-schema-to-grammar.cpp.o
[ 91%] Building CXX object common/CMakeFiles/common.dir/train.cpp.o
[ 95%] Building CXX object common/CMakeFiles/common.dir/ngram-cache.cpp.o
[100%] Linking CXX static library libcommon.a
[100%] Built target common
llama-cpp/b3542: Package '80d3c8d077516ce6e7dc913ef92912e9172eae4c' built
llama-cpp/b3542: Build folder /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release
llama-cpp/b3542: Generated conaninfo.txt
llama-cpp/b3542: Generated conanbuildinfo.txt
llama-cpp/b3542: Generating the package
llama-cpp/b3542: Package folder /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c
llama-cpp/b3542: Calling package()
llama-cpp/b3542: Copied 1 file: LICENSE
llama-cpp/b3542: CMake command: cmake --install "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release" --prefix "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c"
----Running------
> cmake --install "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/build/80d3c8d077516ce6e7dc913ef92912e9172eae4c/build/Release" --prefix "/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c"
-----------------
-- Install configuration: "Release"
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/lib/libggml.a
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-alloc.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-backend.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-blas.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-cann.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-cuda.h
-- Up-to-date: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-kompute.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-metal.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-rpc.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-sycl.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/ggml-vulkan.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/lib/libllama.a
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/include/llama.h
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/lib/cmake/llama/llama-config.cmake
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/lib/cmake/llama/llama-version.cmake
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/bin/convert_hf_to_gguf.py
-- Installing: /home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/lib/pkgconfig/llama.pc
llama-cpp/b3542: Copied 16 '.gguf' files
llama-cpp/b3542: Copied 13 '.out' files
llama-cpp/b3542: Copied 13 '.inp' files
llama-cpp/b3542: Copied 1 file: .editorconfig
llama-cpp/b3542: Copied 9 '.h' files
llama-cpp/b3542: Copied 2 '.hpp' files: json.hpp, base64.hpp
llama-cpp/b3542: Copied 1 '.a' file: libcommon.a
llama-cpp/b3542: Copied 1 '.cmake' file: llama-cpp-cuda-static.cmake
[HOOK - conan-center.py] post_package(): [PACKAGE LICENSE (KB-H012)] OK
[HOOK - conan-center.py] post_package(): [DEFAULT PACKAGE LAYOUT (KB-H013)] OK
[HOOK - conan-center.py] post_package(): [MATCHING CONFIGURATION (KB-H014)] OK
[HOOK - conan-center.py] post_package(): [SHARED ARTIFACTS (KB-H015)] OK
[HOOK - conan-center.py] post_package(): [STATIC ARTIFACTS (KB-H074)] OK
[HOOK - conan-center.py] post_package(): [EITHER STATIC OR SHARED OF EACH LIB (KB-H076)] OK
[HOOK - conan-center.py] post_package(): [PC-FILES (KB-H020)] OK
[HOOK - conan-center.py] post_package(): [CMAKE-MODULES-CONFIG-FILES (KB-H016)] OK
[HOOK - conan-center.py] post_package(): [PDB FILES NOT ALLOWED (KB-H017)] OK
[HOOK - conan-center.py] post_package(): [LIBTOOL FILES PRESENCE (KB-H018)] OK
[HOOK - conan-center.py] post_package(): [MS RUNTIME FILES (KB-H021)] OK
[HOOK - conan-center.py] post_package(): [SHORT_PATHS USAGE (KB-H066)] OK
[HOOK - conan-center.py] post_package(): [MISSING SYSTEM LIBS (KB-H043)] OK
[HOOK - conan-center.py] post_package(): [APPLE RELOCATABLE SHARED LIBS (KB-H077)] OK
llama-cpp/b3542 package(): Packaged 3 '.a' files: libcommon.a, libllama.a, libggml.a
llama-cpp/b3542 package(): Packaged 1 '.cmake' file: llama-cpp-cuda-static.cmake
llama-cpp/b3542 package(): Packaged 21 '.h' files
llama-cpp/b3542 package(): Packaged 2 '.hpp' files: json.hpp, base64.hpp
llama-cpp/b3542 package(): Packaged 2 files: LICENSE, .editorconfig
llama-cpp/b3542 package(): Packaged 1 '.py' file: convert_hf_to_gguf.py
llama-cpp/b3542 package(): Packaged 16 '.gguf' files
llama-cpp/b3542 package(): Packaged 13 '.out' files
llama-cpp/b3542 package(): Packaged 13 '.inp' files
llama-cpp/b3542: Package '80d3c8d077516ce6e7dc913ef92912e9172eae4c' created
llama-cpp/b3542: Created package revision 9a6764c78e9fe30457c751f8dda8279d
[HOOK - conan-center.py] post_package_info(): [CMAKE FILE NOT IN BUILD FOLDERS (KB-H019)] OK
[HOOK - conan-center.py] post_package_info(): [LIBRARY DOES NOT EXIST (KB-H054)] OK
[HOOK - conan-center.py] post_package_info(): [INCLUDE PATH DOES NOT EXIST (KB-H071)] OK
Aggregating env generators
fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
CMake Warning at ggml/src/CMakeLists.txt:167 (message):
  OpenMP not found
CMake Warning at common/CMakeLists.txt:30 (message):
  Git repository not found; to enable automatic generation of build info, make sure Git is installed and the project is a Git repository.
fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
WARN: *** Conan 1 is legacy and on a deprecation path ***
WARN: *** Please upgrade to Conan 2 ***
llama-cpp/b3542: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b3542: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
[HOOK - conan-center.py] post_package_info(): WARN: [CMAKE FILE NOT IN BUILD FOLDERS (KB-H019)] The *.cmake files have to be placed in a folder declared as `cpp_info.builddirs`. Currently folders declared: {'/home/conan/workspace/prod-v1/bsr/84618/facbe/.conan/data/llama-cpp/b3542/_/_/package/80d3c8d077516ce6e7dc913ef92912e9172eae4c/'}
[HOOK - conan-center.py] post_package_info(): WARN: [CMAKE FILE NOT IN BUILD FOLDERS (KB-H019)] Found files: ./lib/cmake/llama-cpp-cuda-static.cmake
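Note on the final KB-H019 warning: the hook reports that the packaged llama-cpp-cuda-static.cmake sits under lib/cmake, while only the package root is declared in cpp_info.builddirs. The recipe itself is not part of this log, so the snippet below is only a minimal, hypothetical sketch (Conan 1.x API; the class name, library list, and folder layout are assumptions, not taken from the actual conanfile) of how package_info() could declare that folder so the check passes:

    import os
    from conans import ConanFile  # Conan 1.x recipe API, matching the Conan 1 client used in this log

    class LlamaCppConan(ConanFile):  # hypothetical recipe skeleton, not the one that produced this log
        name = "llama-cpp"

        def package_info(self):
            # Assumed library names, matching the three '.a' files packaged above.
            self.cpp_info.libs = ["common", "llama", "ggml"]
            # Declare the folder holding the packaged *.cmake file so the KB-H019
            # check ("CMAKE FILE NOT IN BUILD FOLDERS") sees it in cpp_info.builddirs.
            self.cpp_info.builddirs.append(os.path.join("lib", "cmake"))

Whether the .cmake file should be packaged at all, or stripped in package(), is a separate recipe decision that this log does not settle.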