********************************************************************************
conan install llama-cpp/b2038@#de0a3da65cebc162725e089500a3c524 --build=llama-cpp -pr /home/conan/w/prod-v1/bsr/100782/bfadc/profile_linux_7_libstdcpp11_gcc_release_64.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Conan 1 is on a deprecation path, please consider migrating to Conan 2
Auto detecting your dev setup to initialize the default profile (/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/profiles/default)
CC and CXX: /usr/bin/gcc, /usr/bin/g++
Found gcc 7
gcc>=5, using the major as version
************************* WARNING: GCC OLD ABI COMPATIBILITY ***********************
Conan detected a GCC version > 5 but has adjusted the 'compiler.libcxx' setting to
'libstdc++' for backwards compatibility. Your compiler is likely using the new
CXX11 ABI by default (libstdc++11).
If you want Conan to use the new ABI for the default profile, run:

    $ conan profile update settings.compiler.libcxx=libstdc++11 default

Or edit '/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/profiles/default' and set compiler.libcxx=libstdc++11
************************************************************************************
Default settings
    os=Linux
    os_build=Linux
    arch=x86_64
    arch_build=x86_64
    compiler=gcc
    compiler.version=7
    compiler.libcxx=libstdc++
    build_type=Release
*** You can change them in /home/conan/w/prod-v1/bsr/100782/fbdac/.conan/profiles/default ***
*** Or override with -s compiler='other' -s ... ***
Configuration:
[settings]
arch=x86_64
build_type=Release
compiler=gcc
compiler.libcxx=libstdc++11
compiler.version=7
os=Linux
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True

llama-cpp/b2038: Forced build from source
Installing package: llama-cpp/b2038
Requirements
    llama-cpp/b2038 from local cache - Cache
Packages
    llama-cpp/b2038:66c5327ebdcecae0a01a863939964495fa019a06 - Build

Installing (downloading, building) binaries...
[HOOK - conan-center.py] pre_source(): [IMMUTABLE SOURCES (KB-H010)] OK
llama-cpp/b2038: Configuring sources in /home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/source/src
llama-cpp/b2038: [HOOK - conan-center.py] post_source(): [LIBCXX MANAGEMENT (KB-H011)] OK
[HOOK - conan-center.py] post_source(): [CPPSTD MANAGEMENT (KB-H022)] OK
[HOOK - conan-center.py] post_source(): [SHORT_PATHS USAGE (KB-H066)] OK
llama-cpp/b2038: Copying sources to build folder
llama-cpp/b2038: Building your package in /home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06
llama-cpp/b2038: Generator txt created conanbuildinfo.txt
llama-cpp/b2038: Calling generate()
llama-cpp/b2038: Preset 'release' added to CMakePresets.json.
Invoke it manually using 'cmake --preset release'
llama-cpp/b2038: If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE=/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release/generators/conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW -DCMAKE_BUILD_TYPE=Release'
llama-cpp/b2038: Aggregating env generators
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK. 'fPIC' option found and apparently well managed
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b2038: Calling build()
llama-cpp/b2038: CMake command: cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/package/66c5327ebdcecae0a01a863939964495fa019a06" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/src"
----Running------
> cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/package/66c5327ebdcecae0a01a863939964495fa019a06" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/src"
-----------------
-- Using Conan toolchain: /home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release/generators/conan_toolchain.cmake
-- Conan toolchain: Setting CMAKE_POSITION_INDEPENDENT_CODE=ON (options.fPIC)
-- Conan toolchain: Setting BUILD_SHARED_LIBS = OFF
-- The C compiler identification is GNU 7.5.0
-- The CXX compiler identification is GNU 7.5.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /usr/bin/git (found version "2.43.0")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Found Threads: TRUE
-- Warning: ccache not found - consider installing it or use LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- x86 detected
-- Configuring done
-- Generating done
-- Build files have been written to: /home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release
llama-cpp/b2038: CMake command: cmake --build "/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release" '--' '-j3'
----Running------
> cmake --build "/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release" '--' '-j3'
-----------------
[  6%] Generating build details from Git
-- Found Git: /usr/bin/git (found version "2.43.0")
Scanning dependencies of target ggml
[ 13%] Building C object CMakeFiles/ggml.dir/ggml-alloc.c.o
[ 20%] Building C object CMakeFiles/ggml.dir/ggml.c.o
Scanning dependencies of target build_info
[ 26%] Building CXX object common/CMakeFiles/build_info.dir/build-info.cpp.o
[ 26%] Built target build_info
[ 33%] Building C object CMakeFiles/ggml.dir/ggml-backend.c.o
[ 40%] Building C object CMakeFiles/ggml.dir/ggml-quants.c.o
CMakeFiles/ggml.dir/build.make:120: recipe for target 'CMakeFiles/ggml.dir/ggml-quants.c.o' failed
CMakeFiles/Makefile2:120: recipe for target 'CMakeFiles/ggml.dir/all' failed
Makefile:148: recipe for target 'all' failed
llama-cpp/b2038: fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
CMake Warning at common/CMakeLists.txt:24 (message):
  Git repository not found; to enable automatic generation of build info,
  make sure Git is installed and the project is a Git repository.
fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
fatal: not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/src/ggml-quants.c: In function ‘ggml_vec_dot_iq2_xs_q8_K’:
/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/src/ggml-quants.c:8600:42: error: implicit declaration of function ‘_mm256_set_m128i’; did you mean ‘_mm256_set_epi8’? [-Werror=implicit-function-declaration]
     const __m256i full_signs_1 = _mm256_set_m128i(full_signs_l, full_signs_l);
                                  ^~~~~~~~~~~~~~~~
                                  _mm256_set_epi8
/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/src/ggml-quants.c:8600:42: error: incompatible types when initializing type ‘__m256i {aka const __vector(4) long long int}’ using type ‘int’
/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/src/ggml-quants.c:8601:42: error: incompatible types when initializing type ‘__m256i {aka const __vector(4) long long int}’ using type ‘int’
     const __m256i full_signs_2 = _mm256_set_m128i(full_signs_h, full_signs_h);
                                  ^~~~~~~~~~~~~~~~
cc1: some warnings being treated as errors
make[2]: *** [CMakeFiles/ggml.dir/ggml-quants.c.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [CMakeFiles/ggml.dir/all] Error 2
make: *** [all] Error 2
llama-cpp/b2038: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b2038: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b2038: ERROR: Package '66c5327ebdcecae0a01a863939964495fa019a06' build failed
llama-cpp/b2038: WARN: Build folder /home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release
ERROR: llama-cpp/b2038: Error in build() method, line 70
	cmake.build()
	ConanException: Error 2 while executing cmake --build "/home/conan/w/prod-v1/bsr/100782/fbdac/.conan/data/llama-cpp/b2038/_/_/build/66c5327ebdcecae0a01a863939964495fa019a06/build/Release" '--' '-j3'
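Note on the root cause: the build fails because GCC 7's immintrin.h does not declare _mm256_set_m128i (it was added to GCC's headers in GCC 8), and the translation unit is compiled with implicit declarations promoted to errors. The straightforward fixes are to build with GCC >= 8 in the profile, or to patch the source with a compatibility definition built from intrinsics GCC 7 does provide. The following is a minimal sketch of such a shim (this is an illustration, not the upstream llama.cpp fix; the function set_m128i_roundtrip_ok is a hypothetical self-check, not part of ggml):

```c
#include <immintrin.h>

/* Sketch of a compatibility shim: GCC releases before 8 ship immintrin.h
 * without _mm256_set_m128i, so calls to it become implicit declarations and
 * fail under -Werror=implicit-function-declaration, as in the log above.
 * The intrinsic can be emulated with a cast plus a 128-bit lane insert,
 * both of which GCC 7 does declare. */
#if defined(__GNUC__) && !defined(__clang__) && (__GNUC__ < 8)
#define _mm256_set_m128i(hi, lo) \
    _mm256_insertf128_si256(_mm256_castsi128_si256((lo)), (hi), 1)
#endif

/* Hypothetical self-check: build a 256-bit vector from two 128-bit halves
 * and verify lane placement (low half occupies the lower 128 bits). */
__attribute__((target("avx")))
int set_m128i_roundtrip_ok(void) {
    __m128i lo = _mm_set1_epi32(1);
    __m128i hi = _mm_set1_epi32(2);
    __m256i v  = _mm256_set_m128i(hi, lo);
    int out[8];
    _mm256_storeu_si256((__m256i *)out, v);
    return out[0] == 1 && out[3] == 1 && out[4] == 2 && out[7] == 2;
}
```

On compilers where the real intrinsic exists the macro is never defined, so the shim is a no-op. The second WARN in the log (missing -pr:b=default build profile) is unrelated to this failure, though supplying a build profile is still advisable with the new toolchain integrations.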