********************************************************************************
conan test cci-ba81f30b\recipes\llama-cpp\all\test_package\conanfile.py llama-cpp/b2038@#c5d8251b84afa3f7d5c06ad29b467c11 -pr C:/J/workspace/prod-v1/bsr/51168/bafac/profile_windows_16_mdd_vs_debug_64.llama-cpp-shared-False.txt -c tools.system.package_manager:mode=install -c tools.system.package_manager:sudo=True
********************************************************************************
Auto detecting your dev setup to initialize the default profile (C:\J\workspace\prod-v1\bsr\51168\bbfbe\.conan\profiles\default)
Found Visual Studio 17
Default settings
	os=Windows
	os_build=Windows
	arch=x86_64
	arch_build=x86_64
	compiler=Visual Studio
	compiler.version=17
	build_type=Release
*** You can change them in C:\J\workspace\prod-v1\bsr\51168\bbfbe\.conan\profiles\default ***
*** Or override with -s compiler='other' -s ...s***

Configuration:
[settings]
arch=x86_64
build_type=Debug
compiler=Visual Studio
compiler.runtime=MDd
compiler.version=16
os=Windows
[options]
llama-cpp:shared=False
[build_requires]
[env]
[conf]
tools.system.package_manager:mode=install
tools.system.package_manager:sudo=True

llama-cpp/b2038 (test package): Installing package
Requirements
    llama-cpp/b2038 from local cache - Cache
Packages
    llama-cpp/b2038:d057732059ea44a47760900cb5e4855d2bea8714 - Download

Installing (downloading, building) binaries...
llama-cpp/b2038: Retrieving package d057732059ea44a47760900cb5e4855d2bea8714 from remote 'conan-center'
Downloading conanmanifest.txt
Downloading conaninfo.txt
Downloading conan_package.tgz
llama-cpp/b2038: Package installed d057732059ea44a47760900cb5e4855d2bea8714
llama-cpp/b2038: Downloaded package revision ec9039033e7b9e7c497e7dcaf68d74ed
llama-cpp/b2038 (test package): Generator 'VirtualRunEnv' calling 'generate()'
llama-cpp/b2038 (test package): Generator 'CMakeDeps' calling 'generate()'
llama-cpp/b2038 (test package): Generator txt created conanbuildinfo.txt
llama-cpp/b2038 (test package): Generator 'CMakeToolchain' calling 'generate()'
llama-cpp/b2038 (test package): Preset 'default' added to CMakePresets.json. Invoke it manually using 'cmake --preset default'
llama-cpp/b2038 (test package): If your CMake version is not compatible with CMakePresets (<3.19) call cmake like: 'cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE=C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\generators\conan_toolchain.cmake -DCMAKE_POLICY_DEFAULT_CMP0091=NEW'
llama-cpp/b2038 (test package): Aggregating env generators
llama-cpp/b2038 (test package): Generated conaninfo.txt
llama-cpp/b2038 (test package): Generated graphinfo
Using lockfile: 'C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\generators/conan.lock'
Using cached profile from lockfile
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] 'fPIC' option not found
[HOOK - conan-center.py] pre_build(): [FPIC MANAGEMENT (KB-H007)] OK
llama-cpp/b2038 (test package): Calling build()
llama-cpp/b2038 (test package): CMake command: cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J/workspace/prod-v1/bsr/cci-ba81f30b/recipes/llama-cpp/all/test_package/build/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\."
----Running------
> cmake -G "Visual Studio 16 2019" -DCMAKE_TOOLCHAIN_FILE="C:/J/workspace/prod-v1/bsr/cci-ba81f30b/recipes/llama-cpp/all/test_package/build/generators/conan_toolchain.cmake" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" "C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\."
-----------------
-- Using Conan toolchain: C:/J/workspace/prod-v1/bsr/cci-ba81f30b/recipes/llama-cpp/all/test_package/build/generators/conan_toolchain.cmake
-- The CXX compiler identification is MSVC 19.29.30148.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Conan: Component target declared 'llama-cpp::llama'
-- Conan: Component target declared 'llama-cpp::common'
-- Conan: Target declared 'llama-cpp::llama-cpp'
-- Configuring done
-- Generating done
-- Build files have been written to: C:/J/workspace/prod-v1/bsr/cci-ba81f30b/recipes/llama-cpp/all/test_package/build
llama-cpp/b2038 (test package): CMake command: cmake --build "C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build" --config Debug
----Running------
> cmake --build "C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build" --config Debug
-----------------
Microsoft (R) Build Engine version 16.11.2+f32259642 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
  Checking Build System
  Building Custom Rule C:/J/workspace/prod-v1/bsr/cci-ba81f30b/recipes/llama-cpp/all/test_package/CMakeLists.txt
  test_package.cpp
llama.lib(llama.obj) : warning LNK4099: PDB 'llama.pdb' was not found with 'llama.lib(llama.obj)' or at 'C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\Debug\llama.pdb'; linking object as if no debug info [C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\test_package.vcxproj]
llama.lib(ggml.obj) : warning LNK4099: PDB 'ggml.pdb' was not found with 'llama.lib(ggml.obj)' or at 'C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\Debug\ggml.pdb'; linking object as if no debug info [C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\test_package.vcxproj]
llama.lib(ggml-alloc.obj) : warning LNK4099: PDB 'ggml.pdb' was not found with 'llama.lib(ggml-alloc.obj)' or at 'C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\Debug\ggml.pdb'; linking object as if no debug info [C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\test_package.vcxproj]
llama.lib(ggml-backend.obj) : warning LNK4099: PDB 'ggml.pdb' was not found with 'llama.lib(ggml-backend.obj)' or at 'C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\Debug\ggml.pdb'; linking object as if no debug info [C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\test_package.vcxproj]
llama.lib(ggml-quants.obj) : warning LNK4099: PDB 'ggml.pdb' was not found with 'llama.lib(ggml-quants.obj)' or at 'C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\Debug\ggml.pdb'; linking object as if no debug info [C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\test_package.vcxproj]
  test_package.vcxproj -> C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\Debug\test_package.exe
  Building Custom Rule C:/J/workspace/prod-v1/bsr/cci-ba81f30b/recipes/llama-cpp/all/test_package/CMakeLists.txt
llama-cpp/b2038 (test package): Running test()
----Running------
> "C:\J\workspace\prod-v1\bsr\cci-ba81f30b\recipes\llama-cpp\all\test_package\build\generators\conanrun.bat" && Debug\test_package
-----------------
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_POLICY_DEFAULT_CMP0091

WARN: *** Conan 1 is legacy and on a deprecation path ***
WARN: *** Please upgrade to Conan 2 ***
llama-cpp/b2038 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior
llama-cpp/b2038 (test package): WARN: Using the new toolchains and generators without specifying a build profile (e.g: -pr:b=default) is discouraged and might cause failures and unexpected behavior