Created February 1, 2026 20:07
 * IMPORTANT: 6 news items need reading for repository 'gentoo'.
 * Use eselect news read to view new items.

These are the packages that would be merged, in order:

Calculating dependencies ... done!
Dependency resolution took 4.07 s (backtrack: 0/20).

[ebuild   R   ~] sci-ml/ollama-0.14.2::guru USE="blas* cuda mkl vulkan -rocm" AMDGPU_TARGETS="-gfx90a -gfx803 -gfx900 -gfx906 -gfx908 -gfx940 -gfx941 -gfx942 -gfx1010 -gfx1011 -gfx1012 -gfx1030 -gfx1031 -gfx1100 -gfx1101 -gfx1102 -gfx1103 -gfx1150 -gfx1151 -gfx1200 -gfx1201" CPU_FLAGS_X86="avx avx2 avx_vnni bmi2 f16c fma3 sse4_2 -avx512_vnni -avx512f -avx512vbmi" 0 KiB

Total: 1 package (1 reinstall), Size of downloads: 0 KiB

>>> Verifying ebuild manifests
>>> Running pre-merge checks for sci-ml/ollama-0.14.2
>>> Emerging (1 of 1) sci-ml/ollama-0.14.2::guru
 * ollama-0.14.2.gh.tar.gz BLAKE2B SHA512 size ;-) ...        [ ok ]
 * ollama-0.14.2-deps.tar.xz BLAKE2B SHA512 size ;-) ...      [ ok ]
>>> Unpacking source...
>>> Unpacking 'ollama-0.14.2.gh.tar.gz' to /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work
>>> Unpacking 'ollama-0.14.2-deps.tar.xz' to /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work
go mod verify
all modules verified
>>> Source unpacked in /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work
>>> Preparing source in /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2 ...
 * Applying ollama-9999-use-GNUInstallDirs.patch ...
patching file CMakeLists.txt
Hunk #1 succeeded at 54 with fuzz 1 (offset 19 lines).
                                                              [ ok ]
 * Source directory (CMAKE_USE_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2"
 * Build directory (BUILD_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build"
 * Hardcoded definition(s) removed in CMakeLists.txt:
 *   set(CMAKE_BUILD_TYPE Release)
>>> Source prepared.
>>> Configuring source in /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2 ...
 * Source directory (CMAKE_USE_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2"
 * Build directory (BUILD_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build"
cmake -C /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build/gentoo_common_config.cmake -G Ninja -DCMAKE_INSTALL_PREFIX=/usr -DGGML_CCACHE=no -DGGML_BACKEND_DL=yes -DGGML_BACKEND_DIR=/usr/lib64/ollama -DGGML_BLAS=yes -DCMAKE_DISABLE_FIND_PACKAGE_Vulkan=OFF -DGGML_BLAS_VENDOR=Intel10_64lp -DCMAKE_CUDA_ARCHITECTURES=all-major -DCMAKE_HIP_COMPILER=NOTFOUND -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_TOOLCHAIN_FILE=/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build/gentoo_toolchain.cmake /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2
loading initial cache file /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build/gentoo_common_config.cmake
-- The C compiler identification is GNU 14.3.1
-- The CXX compiler identification is GNU 14.3.1
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/x86_64-pc-linux-gnu-gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/x86_64-pc-linux-gnu-g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- GGML_SYSTEM_ARCH: x86
-- Including CPU backend
-- x86 detected
-- Adding CPU backend variant ggml-cpu-x64:
-- x86 detected
-- Adding CPU backend variant ggml-cpu-sse42: -msse4.2 GGML_SSE42
-- x86 detected
-- Adding CPU backend variant ggml-cpu-sandybridge: -msse4.2;-mavx GGML_SSE42;GGML_AVX
-- x86 detected
-- Adding CPU backend variant ggml-cpu-haswell: -msse4.2;-mf16c;-mfma;-mbmi2;-mavx;-mavx2 GGML_SSE42;GGML_F16C;GGML_FMA;GGML_BMI2;GGML_AVX;GGML_AVX2
-- x86 detected
-- Adding CPU backend variant ggml-cpu-alderlake: -msse4.2;-mf16c;-mfma;-mbmi2;-mavx;-mavx2;-mavxvnni GGML_SSE42;GGML_F16C;GGML_FMA;GGML_BMI2;GGML_AVX;GGML_AVX2;GGML_AVX_VNNI
CMake Warning at ml/backend/ggml/ggml/src/ggml-blas/CMakeLists.txt:9 (find_package):
  By not providing "FindBLAS.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "BLAS", but
  CMake did not find one.

  Could not find a package configuration file provided by "BLAS" with any of
  the following names:

    BLASConfig.cmake
    blas-config.cmake

  Add the installation prefix of "BLAS" to CMAKE_PREFIX_PATH or set
  "BLAS_DIR" to a directory containing one of the above files.  If "BLAS"
  provides a separate development package or SDK, be sure it has been
  installed.

CMake Error at ml/backend/ggml/ggml/src/ggml-blas/CMakeLists.txt:84 (message):
  BLAS not found, please refer to
  https://cmake.org/cmake/help/latest/module/FindBLAS.html#blas-lapack-vendors
  to set correct GGML_BLAS_VENDOR

-- Configuring incomplete, errors occurred!
 * ERROR: sci-ml/ollama-0.14.2::guru failed (configure phase):
 *   cmake failed
 *
 * Call stack:
 *     ebuild.sh, line 143:  Called src_configure
 *   environment, line 2791:  Called cmake_src_configure
 *   environment, line 1462:  Called die
 * The specific snippet of code:
 *       "${CMAKE_BINARY}" "${cmakeargs[@]}" "${CMAKE_USE_DIR}" || die "cmake failed";
 *
 * If you need support, post the output of `emerge --info '=sci-ml/ollama-0.14.2::guru'`,
 * the complete build log and the output of `emerge -pqv '=sci-ml/ollama-0.14.2::guru'`.
 * The complete build log is located at '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/build.log'.
 * The ebuild environment file is located at '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/environment'.
 * Working directory: '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build'
 * S: '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2'

>>> Failed to emerge sci-ml/ollama-0.14.2, Log file:
>>>  '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/build.log'
 * Messages for package sci-ml/ollama-0.14.2:
 * ERROR: sci-ml/ollama-0.14.2::guru failed (configure phase):
 *   cmake failed
 * Call stack:
 *     ebuild.sh, line 143:  Called src_configure
 *   environment, line 2791:  Called cmake_src_configure
 *   environment, line 1462:  Called die
 * The specific snippet of code:
 *       "${CMAKE_BINARY}" "${cmakeargs[@]}" "${CMAKE_USE_DIR}" || die "cmake failed";
 * If you need support, post the output of `emerge --info '=sci-ml/ollama-0.14.2::guru'`,
 * the complete build log and the output of `emerge -pqv '=sci-ml/ollama-0.14.2::guru'`.
 * The complete build log is located at '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/build.log'.
 * The ebuild environment file is located at '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/environment'.
 * Working directory: '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build'
 * S: '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2'
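The configure phase dies in ggml's BLAS backend: `find_package(BLAS)` with the vendor hint `Intel10_64lp` (MKL, LP64 threaded) found neither the MKL libraries nor a `BLASConfig.cmake`, even though the build ran with USE="mkl". A minimal stand-alone probe can separate a missing or misconfigured MKL install from a problem in the ebuild itself. This is a hypothetical sketch, not part of the ebuild: the project name `blas_probe` is invented, the vendor value is copied from the configure line above, and the assumption that ggml's `GGML_BLAS_VENDOR` is passed through to CMake's `BLA_VENDOR` is inferred from the error message.

```cmake
# Hypothetical probe (not from the ebuild): reproduce the failing BLAS lookup
# outside Portage. Save as CMakeLists.txt in an empty directory and run
#   cmake -S . -B build
cmake_minimum_required(VERSION 3.16)
project(blas_probe LANGUAGES C)

# FindBLAS reads BLA_VENDOR; the same value the ollama build used as
# GGML_BLAS_VENDOR (assumption: ggml forwards it to FindBLAS).
set(BLA_VENDOR Intel10_64lp)

find_package(BLAS)
if(BLAS_FOUND)
  message(STATUS "BLAS found: ${BLAS_LIBRARIES}")
else()
  message(FATAL_ERROR "FindBLAS could not locate MKL with BLA_VENDOR=Intel10_64lp")
endif()
```

If the probe fails too, the MKL SDK is likely absent or its environment is not visible to Portage's sandboxed build; if it succeeds, the problem is more likely in how the ebuild invokes CMake. Rebuilding with USE="-mkl" so the ebuild selects a different BLAS provider may also sidestep the vendor lookup, assuming the guru ebuild maps USE=mkl to GGML_BLAS_VENDOR=Intel10_64lp as the configure line suggests.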