********************************************************************************
conan install llama-cpp/b2038@#c5d8251b84afa3f7d5c06ad29b467c11 --build=llama-cpp -pr C:/J2/w/prod-v1/bsr@2/102588/faece/profile_windows_16_md_vs_release_64.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Conan 1 is on a deprecation path, please consider migrating to Conan 2
Auto detecting your dev setup to initialize the default profile (C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\profiles\default)
Found Visual Studio 17
Default settings
    os=Windows
    os_build=Windows
    arch=x86_64
    arch_build=x86_64
    compiler=Visual Studio
    compiler.version=17
    build_type=Release
*** You can change them in C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\profiles\default ***
*** Or override with -s compiler='other' -s ... ***
Configuration:
[settings]
arch=x86_64
build_type=Release
compiler=Visual Studio
compiler.runtime=MD
compiler.version=16
os=Windows
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True
llama-cpp/b2038: Forced build from source
Installing package: llama-cpp/b2038
Requirements
    llama-cpp/b2038 from local cache - Cache
Packages
    llama-cpp/b2038:3fb49604f9c2f729b85ba3115852006824e72cab - Build
Installing (downloading, building) binaries...
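The [settings], [options], and [conf] blocks above are resolved from the host profile passed with -pr. A hedged sketch of what such a profile file might contain, reconstructed from the resolved configuration shown in the log (the actual profile file was not captured):

```ini
; Hypothetical reconstruction of profile_windows_16_md_vs_release_64.llama-cpp-shared-False.txt
[settings]
os=Windows
arch=x86_64
compiler=Visual Studio
compiler.version=16
compiler.runtime=MD
build_type=Release

[options]
llama-cpp:shared=False

[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True
```

Note that only a host profile is supplied here; passing a build profile as well (e.g. -pr:b=default) would address the "Using the new toolchains and generators without specifying a build profile" warning that appears at the end of this log.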
[HOOK - conan-center.py] pre_source(): [IMMUTABLE SOURCES (KB-H010)] OK
llama-cpp/b2038: Configuring sources in C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\source\src
llama-cpp/b2038: [HOOK - conan-center.py] post_source(): [LIBCXX MANAGEMENT (KB-H011)] OK
[HOOK - conan-center.py] post_source(): [CPPSTD MANAGEMENT (KB-H022)] OK
[HOOK - conan-center.py] post_source(): [SHORT_PATHS USAGE (KB-H066)] OK
llama-cpp/b2038: Copying sources to build folder
llama-cpp/b2038: Building your package in C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab
llama-cpp/b2038: Generator txt created conanbuildinfo.txt
llama-cpp/b2038: Calling generate()
llama-cpp/b2038: Preset 'default' added to CMakePresets.json. Invoke it manually using 'cmake --preset default'
llama-cpp/b2038: If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE=C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\generators\conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW'
llama-cpp/b2038: Aggregating env generators
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] 'fPIC' option not found
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b2038: Calling build()
llama-cpp/b2038: CMake command: cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/build/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src"
----Running------
> cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/build/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src"
-----------------
-- Using Conan toolchain: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/build/generators/conan_toolchain.cmake
-- Conan toolchain: Setting BUILD_SHARED_LIBS = OFF
-- The C compiler identification is MSVC 19.29.30148.0
-- The CXX compiler identification is MSVC 19.29.30148.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.29.0.windows.1")
-- Looking for pthread.h
-- Looking for pthread.h - not found
-- Found Threads: TRUE
-- Warning: ccache not found - consider installing it or use LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM: x64
-- x86 detected
-- Performing Test HAS_AVX_1
-- Performing Test HAS_AVX_1 - Success
-- Performing Test HAS_AVX2_1
-- Performing Test HAS_AVX2_1 - Success
-- Performing Test HAS_FMA_1
-- Performing Test HAS_FMA_1 - Success
-- Performing Test HAS_AVX512_1
-- Performing Test HAS_AVX512_1 - Failed
-- Performing Test HAS_AVX512_2
-- Performing Test HAS_AVX512_2 - Failed
-- Configuring done
-- Generating done
-- Build files have been written to: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/build
llama-cpp/b2038: CMake command: cmake --build "C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build" --config Release
----Running------
> cmake --build "C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build" --config Release
-----------------
Microsoft (R) Build Engine version 16.11.2+f32259642 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
  Checking Build System
  Generating build details from Git
  -- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.29.0.windows.1")
  fatal: not a git repository (or any of the parent directories): .git
  fatal: not a git repository (or any of the parent directories): .git
  Building Custom Rule C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/src/common/CMakeLists.txt
  build-info.cpp
  build_info.vcxproj -> C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\common\build_info.dir\Release\build_info.lib
  Building Custom Rule C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/src/CMakeLists.txt
  ggml.c
  ggml-alloc.c
  ggml-backend.c
  ggml-quants.c
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(677,36): warning C4244: '=': conversion from 'float' to 'int8_t', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(895,46): warning C4244: '=': conversion from 'float' to 'int8_t', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(896,46): warning C4244: '=': conversion from 'float' to 'int8_t', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(9125,1): warning C4244: 'initializing': conversion from 'int' to 'float', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(9158,1): warning C4244: 'initializing': conversion from 'int' to 'float', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(9346,1): warning C4244: 'initializing': conversion from 'int' to 'float', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(9378,1): warning C4244: 'initializing': conversion from 'int' to 'float', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(9736,1): warning C4244: 'initializing': conversion from 'int' to 'float', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\ggml-quants.c(9769,1): warning C4244: 'initializing': conversion from 'int' to 'float', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.vcxproj]
  ggml.vcxproj -> C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\ggml.dir\Release\ggml.lib
  Building Custom Rule C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/src/CMakeLists.txt
  llama.cpp
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\llama.cpp(3169,69): warning C4566: character represented by universal-character-name '\u010A' cannot be represented in the current code page (1252) [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\llama.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\llama.cpp(11536,28): warning C4146: unary minus operator applied to unsigned type, result still unsigned [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\llama.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\llama.cpp(11571,28): warning C4146: unary minus operator applied to unsigned type, result still unsigned [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\llama.vcxproj]
  llama.vcxproj -> C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\Release\llama.lib
  Building Custom Rule C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/src/common/CMakeLists.txt
  common.cpp
  sampling.cpp
  console.cpp
  grammar-parser.cpp
  train.cpp
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\common\sampling.cpp(76,47): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\common\common.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\common\sampling.cpp(76,47): warning C4267: 'initializing': conversion from 'size_t' to 'const int', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\common\common.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\common\console.cpp(253,38): warning C4267: 'initializing': conversion from 'size_t' to 'DWORD', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\common\common.vcxproj]
C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\src\common\console.cpp(407,43): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\common\common.vcxproj]
  common.vcxproj -> C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\common\Release\common.lib
  Building Custom Rule C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/src/CMakeLists.txt
  ggml_static.vcxproj -> C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build\Release\ggml_static.lib
  Building Custom Rule C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/build/3fb49604f9c2f729b85ba3115852006824e72cab/src/CMakeLists.txt
llama-cpp/b2038: Package '3fb49604f9c2f729b85ba3115852006824e72cab' built
llama-cpp/b2038: Build folder C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build
llama-cpp/b2038: Generated conaninfo.txt
llama-cpp/b2038: Generated conanbuildinfo.txt
llama-cpp/b2038: Generating the package
llama-cpp/b2038: Package folder C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\package\3fb49604f9c2f729b85ba3115852006824e72cab
llama-cpp/b2038: Calling package()
llama-cpp/b2038: Copied 1 file: LICENSE
llama-cpp/b2038: CMake command: cmake --install "C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build" --config Release --prefix "C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab"
----Running------
> cmake --install "C:\J2\w\prod-v1\bsr@2\102588\bfcaa\.conan\data\llama-cpp\b2038\_\_\build\3fb49604f9c2f729b85ba3115852006824e72cab\build" --config Release --prefix "C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab"
-----------------
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/lib/cmake/Llama/LlamaConfig.cmake
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/lib/cmake/Llama/LlamaConfigVersion.cmake
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/include/ggml.h
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/include/ggml-alloc.h
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/include/ggml-backend.h
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/lib/llama.lib
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/include/llama.h
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/bin/convert.py
-- Installing: C:/J2/w/prod-v1/bsr@2/102588/bfcaa/.conan/data/llama-cpp/b2038/_/_/package/3fb49604f9c2f729b85ba3115852006824e72cab/bin/convert-lora-to-ggml.py
llama-cpp/b2038: Copied 1 file: .editorconfig
llama-cpp/b2038: Copied 10 '.gguf' files
llama-cpp/b2038: Copied 1 '.hpp' file: base64.hpp
llama-cpp/b2038: Copied 7 '.h' files
llama-cpp/b2038: Copied 2 '.lib' files: build_info.lib, common.lib
[HOOK - conan-center.py] post_package(): [PACKAGE LICENSE (KB-H012)] OK
[HOOK - conan-center.py] post_package(): [DEFAULT PACKAGE LAYOUT (KB-H013)] OK
[HOOK - conan-center.py] post_package(): [MATCHING CONFIGURATION (KB-H014)] OK
[HOOK - conan-center.py] post_package(): [SHARED ARTIFACTS (KB-H015)] OK
[HOOK - conan-center.py] post_package(): [STATIC ARTIFACTS (KB-H074)] OK
[HOOK - conan-center.py] post_package(): [EITHER STATIC OR SHARED OF EACH LIB (KB-H076)] OK
[HOOK - conan-center.py] post_package(): [PC-FILES (KB-H020)] OK
[HOOK - conan-center.py] post_package(): [CMAKE-MODULES-CONFIG-FILES (KB-H016)] OK
[HOOK - conan-center.py] post_package(): [PDB FILES NOT ALLOWED (KB-H017)] OK
[HOOK - conan-center.py] post_package(): [LIBTOOL FILES PRESENCE (KB-H018)] OK
[HOOK - conan-center.py] post_package(): [MS RUNTIME FILES (KB-H021)] OK
[HOOK - conan-center.py] post_package(): [SHORT_PATHS USAGE (KB-H066)] OK
[HOOK - conan-center.py] post_package(): [MISSING SYSTEM LIBS (KB-H043)] OK
[HOOK - conan-center.py] post_package(): [APPLE RELOCATABLE SHARED LIBS (KB-H077)] OK
llama-cpp/b2038 package(): Packaged 2 '.py' files: convert-lora-to-ggml.py, convert.py
llama-cpp/b2038 package(): Packaged 11 '.h' files
llama-cpp/b2038 package(): Packaged 1 '.hpp' file: base64.hpp
llama-cpp/b2038 package(): Packaged 3 '.lib' files: build_info.lib, common.lib, llama.lib
llama-cpp/b2038 package(): Packaged 2 files: LICENSE, .editorconfig
llama-cpp/b2038 package(): Packaged 10 '.gguf' files
llama-cpp/b2038: Package '3fb49604f9c2f729b85ba3115852006824e72cab' created
llama-cpp/b2038: Created package revision a1667994882f81645e844fc86f013321
[HOOK - conan-center.py] post_package_info(): [CMAKE FILE NOT IN BUILD FOLDERS (KB-H019)] OK
[HOOK - conan-center.py] post_package_info(): [LIBRARY DOES NOT EXIST (KB-H054)] OK
[HOOK - conan-center.py] post_package_info(): [INCLUDE PATH DOES NOT EXIST (KB-H071)] OK
Aggregating env generators
fatal: not a git repository (or any of the parent directories): .git
fatal: not a git repository (or any of the parent directories): .git
CMake Warning at common/CMakeLists.txt:24 (message):
  Git repository not found; to enable automatic generation of build info,
  make sure Git is installed and the project is a Git repository.
llama-cpp/b2038: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b2038: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior