********************************************************************************
conan install llama-cpp/b3040@#955fe526f9564a05a96653bd8fdfb322 --build=llama-cpp -pr C:/J/workspace/prod-v1/bsr/51178/fdfaa/profile_windows_16_mdd_vs_debug_64.llama-cpp-shared-True.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Auto detecting your dev setup to initialize the default profile (C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\profiles\default)
Found Visual Studio 17
Default settings
os=Windows
os_build=Windows
arch=x86_64
arch_build=x86_64
compiler=Visual Studio
compiler.version=17
build_type=Release
*** You can change them in C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\profiles\default ***
*** Or override with -s compiler='other' -s ... ***
Configuration:
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=16
os=Windows
[options]
llama-cpp:shared=True
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True
llama-cpp/b3040: Forced build from source
Installing package: llama-cpp/b3040
Requirements
    llama-cpp/b3040 from local cache - Cache
Packages
    llama-cpp/b3040:4f1710918aa542fccb5a54d7bd712e4b0750b50d - Build
Installing (downloading, building) binaries...
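For reference, the host profile passed with -pr (profile_windows_16_mdd_vs_debug_64.llama-cpp-shared-True.txt) presumably contains something like the following Conan 1.x profile — a sketch reconstructed from the Configuration block above, not the actual file contents:

```ini
[settings]
os=Windows
arch=x86_64
compiler=Visual Studio
compiler.version=16
compiler.runtime=MDd
build_type=Debug

[options]
llama-cpp:shared=True
```

The tools.system.package_manager conf values are supplied on the command line with -c, so they need not appear in the profile itself; note also that the profile (VS 16, Debug, MDd) overrides the auto-detected defaults (VS 17, Release) shown above.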
[HOOK - conan-center.py] pre_source(): [IMMUTABLE SOURCES (KB-H010)] OK
llama-cpp/b3040: Configuring sources in C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\source\src
llama-cpp/b3040: [HOOK - conan-center.py] post_source(): [LIBCXX MANAGEMENT (KB-H011)] OK
[HOOK - conan-center.py] post_source(): [CPPSTD MANAGEMENT (KB-H022)] OK
[HOOK - conan-center.py] post_source(): [SHORT_PATHS USAGE (KB-H066)] OK
llama-cpp/b3040: Copying sources to build folder
llama-cpp/b3040: Building your package in C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d
llama-cpp/b3040: Generator txt created conanbuildinfo.txt
llama-cpp/b3040: Calling generate()
llama-cpp/b3040: Preset 'default' added to CMakePresets.json. Invoke it manually using 'cmake --preset default'
llama-cpp/b3040: If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE=C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\generators\conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW'
llama-cpp/b3040: Aggregating env generators
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] 'fPIC' option not found
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b3040: Calling build()
llama-cpp/b3040: CMake command: cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src"
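A note on the -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" flag above: with policy CMP0091 set to NEW, CMake selects the MSVC runtime via the CMAKE_MSVC_RUNTIME_LIBRARY abstraction instead of injecting /MD-style flags into the default flag variables, which is how the profile's compiler.runtime=MDd reaches the compiler. An illustrative sketch of the mapping (not the actual conan_toolchain.cmake contents):

```cmake
# CMP0091=NEW: the MSVC runtime is chosen via CMAKE_MSVC_RUNTIME_LIBRARY,
# not via hard-coded /MDd flags in CMAKE_CXX_FLAGS_DEBUG.
cmake_policy(SET CMP0091 NEW)

# compiler.runtime=MDd  ->  dynamic debug CRT (/MDd)
set(CMAKE_MSVC_RUNTIME_LIBRARY "MultiThreadedDebugDLL")
```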
----Running------
> cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build/generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src"
-----------------
-- Using Conan toolchain: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build/generators/conan_toolchain.cmake
-- Conan toolchain: Setting BUILD_SHARED_LIBS = ON
-- The C compiler identification is MSVC 19.29.30148.0
-- The CXX compiler identification is MSVC 19.29.30148.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.29.0.windows.1")
-- Looking for pthread.h
-- Looking for pthread.h - not found
-- Found Threads: TRUE
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM: x64
-- x86 detected
-- Performing Test HAS_AVX_1
-- Performing Test HAS_AVX_1 - Success
-- Performing Test HAS_AVX2_1
-- Performing Test HAS_AVX2_1 - Success
-- Performing Test HAS_FMA_1
-- Performing Test HAS_FMA_1 - Success
-- Performing Test HAS_AVX512_1
-- Performing Test HAS_AVX512_1 - Failed
-- Performing Test HAS_AVX512_2
-- Performing Test HAS_AVX512_2 - Failed
-- Configuring done
-- Generating done
-- Build files have been written to: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build
llama-cpp/b3040: CMake command: cmake --build "C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build" --config Debug
----Running------
> cmake --build "C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build" --config Debug
-----------------
Microsoft (R) Build Engine version 16.11.2+f32259642 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
Checking Build System
Generating build details from Git
-- Found Git: C:/Program Files/Git/cmd/git.exe (found version "2.29.0.windows.1")
fatal: not a git repository (or any of the parent directories): .git
fatal: not a git repository (or any of the parent directories): .git
Building Custom Rule C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/src/common/CMakeLists.txt
build-info.cpp
build_info.vcxproj -> C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\build_info.dir\Debug\build_info.lib
Building Custom Rule C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/src/CMakeLists.txt
ggml.c
ggml-alloc.c
ggml-backend.c
ggml-quants.c
C:\Program Files (x86)\Windows Kits\10\Include\10.0.20348.0\ucrt\assert.h(21,1): warning C4005: 'static_assert': macro redefinition (compiling source file C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\ggml-quants.c) [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\ggml.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\ggml-common.h(58): message : see previous definition of 'static_assert' (compiling source file C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\ggml-quants.c) [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\ggml.vcxproj]
sgemm.cpp
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\sgemm.cpp(46,9): warning C4068: unknown pragma 'GCC' [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\ggml.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\sgemm.cpp(47,9): warning C4068: unknown pragma 'GCC' [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\ggml.vcxproj]
ggml.vcxproj -> C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\ggml.dir\Debug\ggml.lib
Building Custom Rule C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/src/CMakeLists.txt
llama.cpp
unicode.cpp
unicode-data.cpp
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\llama.cpp(13875,1): warning C4297: 'llama_grammar_init': function assumed not to throw an exception but does [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\llama.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\llama.cpp(13875,1): message : __declspec(nothrow), throw(), noexcept(true), or noexcept was specified on the function [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\llama.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\llama.cpp(18281,44): warning C4101: 'e': unreferenced local variable [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\llama.vcxproj]
Auto build dll exports
Creating library C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build/Debug/llama.lib and object C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build/Debug/llama.exp
llama.vcxproj -> C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\bin\Debug\llama.dll
Building Custom Rule C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/src/common/CMakeLists.txt
common.cpp
sampling.cpp
console.cpp
grammar-parser.cpp
json-schema-to-grammar.cpp
train.cpp
ngram-cache.cpp
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\sampling.cpp(97,47): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\sampling.cpp(97,47): warning C4267: 'initializing': conversion from 'size_t' to 'const int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\console.cpp(253,38): warning C4267: 'initializing': conversion from 'size_t' to 'DWORD', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\console.cpp(407,43): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(20,50): warning C4244: 'argument': conversion from 'int64_t' to 'const int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(100,5): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(147,36): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(147,36): warning C4267: 'initializing': conversion from 'size_t' to 'const int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(156,84): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(156,84): warning C4267: 'initializing': conversion from 'size_t' to 'const int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(170,79): warning C4267: 'initializing': conversion from 'size_t' to 'int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(170,79): warning C4267: 'initializing': conversion from 'size_t' to 'const int', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(202,52): warning C4267: 'initializing': conversion from 'size_t' to 'int32_t', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\ngram-cache.cpp(202,52): warning C4267: 'initializing': conversion from 'size_t' to 'const int32_t', possible loss of data [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\json-schema-to-grammar.cpp(369,60): warning C4101: 'e': unreferenced local variable [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\src\common\common.cpp(1629): warning C4715: 'string_random_prompt': not all control paths return a value [C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\common.vcxproj]
common.vcxproj -> C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\common\Debug\common.lib
Building Custom Rule C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/src/CMakeLists.txt
Auto build dll exports
Creating library C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build/Debug/ggml_shared.lib and object C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/build/Debug/ggml_shared.exp
ggml_shared.vcxproj -> C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\bin\Debug\ggml_shared.dll
Building Custom Rule C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/src/CMakeLists.txt
ggml_static.vcxproj -> C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build\Debug\ggml_static.lib
Building Custom Rule C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/build/4f1710918aa542fccb5a54d7bd712e4b0750b50d/src/CMakeLists.txt
llama-cpp/b3040: Package '4f1710918aa542fccb5a54d7bd712e4b0750b50d' built
llama-cpp/b3040: Build folder C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build
llama-cpp/b3040: Generated conaninfo.txt
llama-cpp/b3040: Generated conanbuildinfo.txt
llama-cpp/b3040: Generating the package
llama-cpp/b3040: Package folder C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\package\4f1710918aa542fccb5a54d7bd712e4b0750b50d
llama-cpp/b3040: Calling package()
llama-cpp/b3040: Copied 1 file: LICENSE
llama-cpp/b3040: CMake command: cmake --install "C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build" --config Debug --prefix "C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d"
----Running------
> cmake --install "C:\J\workspace\prod-v1\bsr\51178\bccbd\.conan\data\llama-cpp\b3040\_\_\build\4f1710918aa542fccb5a54d7bd712e4b0750b50d\build" --config Debug --prefix "C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d"
-----------------
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/lib/ggml_shared.lib
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/bin/ggml_shared.dll
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/lib/cmake/Llama/LlamaConfig.cmake
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/lib/cmake/Llama/LlamaConfigVersion.cmake
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/include/ggml.h
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/include/ggml-alloc.h
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/include/ggml-backend.h
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/lib/llama.lib
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/bin/llama.dll
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/include/llama.h
-- Installing: C:/J/workspace/prod-v1/bsr/51178/bccbd/.conan/data/llama-cpp/b3040/_/_/package/4f1710918aa542fccb5a54d7bd712e4b0750b50d/bin/convert.py
llama-cpp/b3040: Copied 1 file: .editorconfig
llama-cpp/b3040: Copied 18 '.gguf' files
llama-cpp/b3040: Copied 13 '.inp' files
llama-cpp/b3040: Copied 13 '.out' files
llama-cpp/b3040: Copied 2 '.hpp' files: base64.hpp, json.hpp
llama-cpp/b3040: Copied 9 '.h' files
llama-cpp/b3040: Copied 2 '.lib' files: build_info.lib, common.lib
[HOOK - conan-center.py] post_package(): [PACKAGE LICENSE (KB-H012)] OK
[HOOK - conan-center.py] post_package(): [DEFAULT PACKAGE LAYOUT (KB-H013)] OK
[HOOK - conan-center.py] post_package(): [MATCHING CONFIGURATION (KB-H014)] OK
[HOOK - conan-center.py] post_package(): [SHARED ARTIFACTS (KB-H015)] OK
[HOOK - conan-center.py] post_package(): [STATIC ARTIFACTS (KB-H074)] OK
[HOOK - conan-center.py] post_package(): [EITHER STATIC OR SHARED OF EACH LIB (KB-H076)] OK
[HOOK - conan-center.py] post_package(): [PC-FILES (KB-H020)] OK
[HOOK - conan-center.py] post_package(): [CMAKE-MODULES-CONFIG-FILES (KB-H016)] OK
[HOOK - conan-center.py] post_package(): [PDB FILES NOT ALLOWED (KB-H017)] OK
[HOOK - conan-center.py] post_package(): [LIBTOOL FILES PRESENCE (KB-H018)] OK
[HOOK - conan-center.py] post_package(): [MS RUNTIME FILES (KB-H021)] OK
[HOOK - conan-center.py] post_package(): [SHORT_PATHS USAGE (KB-H066)] OK
**********************************************************************
** Visual Studio 2019 Developer Command Prompt v16.11.26
** Copyright (c) 2021 Microsoft Corporation
**********************************************************************
[vcvarsall.bat] Environment initialized for: 'x64'
[HOOK - conan-center.py] post_package(): [MISSING SYSTEM LIBS (KB-H043)] OK
[HOOK - conan-center.py] post_package(): [APPLE RELOCATABLE SHARED LIBS (KB-H077)] OK
llama-cpp/b3040 package(): Packaged 1 '.py' file: convert.py
llama-cpp/b3040 package(): Packaged 2 '.dll' files: ggml_shared.dll, llama.dll
llama-cpp/b3040 package(): Packaged 13 '.h' files
llama-cpp/b3040 package(): Packaged 2 '.hpp' files: base64.hpp, json.hpp
llama-cpp/b3040 package(): Packaged 4 '.lib' files: build_info.lib, common.lib, ggml_shared.lib, llama.lib
llama-cpp/b3040 package(): Packaged 2 files: LICENSE, .editorconfig
llama-cpp/b3040 package(): Packaged 18 '.gguf' files
llama-cpp/b3040 package(): Packaged 13 '.inp' files
llama-cpp/b3040 package(): Packaged 13 '.out' files
llama-cpp/b3040: Package '4f1710918aa542fccb5a54d7bd712e4b0750b50d' created
llama-cpp/b3040: Created package revision 47925d660c4957b78eb97645b142e32a
[HOOK - conan-center.py] post_package_info(): [CMAKE FILE NOT IN BUILD FOLDERS (KB-H019)] OK
[HOOK - conan-center.py] post_package_info(): [LIBRARY DOES NOT EXIST (KB-H054)] OK
[HOOK - conan-center.py] post_package_info(): [INCLUDE PATH DOES NOT EXIST (KB-H071)] OK
Aggregating env generators
fatal: not a git repository (or any of the parent directories): .git
fatal: not a git repository (or any of the parent directories): .git
CMake Warning at common/CMakeLists.txt:29 (message):
  Git repository not found; to enable automatic generation of build info, make
  sure Git is installed and the project is a Git repository.
WARN: *** Conan 1 is legacy and on a deprecation path ***
WARN: *** Please upgrade to Conan 2 ***
llama-cpp/b3040: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b3040: WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior