********************************************************************************
conan test cci-09238596\recipes\llama-cpp\all\test_package\conanfile.py llama-cpp/b3542@#b9cee04f44b2458374fd718dd683caf6 -pr C:/J/workspace/prod-v1/bsr/84752/ecfef/profile_windows_16_mdd_vs_debug_64.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Configuration:
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=16
os=Windows
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True

llama-cpp/b3542 (test package): Installing package
Requirements
    llama-cpp/b3542 from local cache - Cache
Packages
    llama-cpp/b3542:d057732059ea44a47760900cb5e4855d2bea8714 - Cache

Installing (downloading, building) binaries...
llama-cpp/b3542: Already installed!
llama-cpp/b3542 (test package): Generator 'VirtualRunEnv' calling 'generate()'
llama-cpp/b3542 (test package): Generator txt created conanbuildinfo.txt
llama-cpp/b3542 (test package): Generator 'CMakeToolchain' calling 'generate()'
llama-cpp/b3542 (test package): Preset 'default' added to CMakePresets.json.
    Invoke it manually using 'cmake --preset default'
llama-cpp/b3542 (test package): If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE=C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\build\generators\conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW'
llama-cpp/b3542 (test package): Generator 'CMakeDeps' calling 'generate()'
llama-cpp/b3542 (test package): Aggregating env generators
llama-cpp/b3542 (test package): Generated conaninfo.txt
llama-cpp/b3542 (test package): Generated graphinfo
Using lockfile: 'C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\build\generators/conan.lock'
Using cached profile from lockfile
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] 'fPIC' option not found
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b3542 (test package): Calling build()
llama-cpp/b3542 (test package): CMake command: cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J/workspace/prod-v1/bsr/cci-09238596/recipes/llama-cpp/all/test_package/build/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\."
----Running------
> cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J/workspace/prod-v1/bsr/cci-09238596/recipes/llama-cpp/all/test_package/build/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\."
-----------------
-- Using Conan toolchain: C:/J/workspace/prod-v1/bsr/cci-09238596/recipes/llama-cpp/all/test_package/build/generators/conan_toolchain.cmake
-- The CXX compiler identification is MSVC 19.29.30148.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Conan: Component target declared 'llama-cpp::common'
-- Conan: Component target declared 'llama-cpp::llama'
-- Conan: Target declared 'llama-cpp::llama-cpp'
-- Configuring done
-- Generating done
-- Build files have been written to: C:/J/workspace/prod-v1/bsr/cci-09238596/recipes/llama-cpp/all/test_package/build
llama-cpp/b3542 (test package): CMake command: cmake --build "C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\build" --config Debug
----Running------
> cmake --build "C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\build" --config Debug
-----------------
Microsoft (R) Build Engine version 16.11.2+f32259642 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
  Checking Build System
  Building Custom Rule C:/J/workspace/prod-v1/bsr/cci-09238596/recipes/llama-cpp/all/test_package/CMakeLists.txt
  test_package.cpp
  test_package.vcxproj -> C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\build\Debug\test_package.exe
  Building Custom Rule C:/J/workspace/prod-v1/bsr/cci-09238596/recipes/llama-cpp/all/test_package/CMakeLists.txt
llama-cpp/b3542 (test package): Running test()
----Running------
> "C:\J\workspace\prod-v1\bsr\cci-09238596\recipes\llama-cpp\all\test_package\build\generators\conanrun.bat" && Debug\test_package
-----------------
Main GPU: 0
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_POLICY_DEFAULT_CMP0091

WARN: *** Conan 1 is legacy and on a deprecation path ***
WARN: *** Please upgrade to Conan 2 ***
llama-cpp/b3542 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b3542 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
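The run passes (the test binary builds, executes, and prints "Main GPU: 0"), but the log tail carries warnings a CI job may want to surface: the Conan 1 deprecation notice and the repeated missing-build-profile warning (`-pr:b=default`). A small hypothetical helper (not part of Conan; the `collect_warnings` name and the sample log below are illustrative) could deduplicate and report them:

```python
import re

def collect_warnings(log_text: str) -> list[str]:
    """Return unique WARN messages from a Conan log, in order of first appearance."""
    seen: list[str] = []
    for line in log_text.splitlines():
        # Conan prefixes warnings with "WARN:", sometimes after a
        # "pkg/version (test package):" label, so search anywhere in the line.
        match = re.search(r"WARN:\s*(.*)", line)
        if match and match.group(1) not in seen:
            seen.append(match.group(1))
    return seen

# Sample taken from the tail of the log above (the last warning appears twice).
log = """\
WARN: *** Conan 1 is legacy and on a deprecation path ***
WARN: *** Please upgrade to Conan 2 ***
llama-cpp/b3542 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b3542 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
"""

for warning in collect_warnings(log):
    print(warning)  # prints 3 lines: the duplicate warning is collapsed
```

Deduplicating before reporting keeps the CI summary readable when Conan emits the same warning once per context, as it does here.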