Created February 1, 2026 20:09
 * Package:    sci-ml/ollama-0.14.2:0
 * Repository: guru
 * Maintainer: negril.nx+gentoo@gmail.com
 * USE:        abi_x86_64 amd64 blas cpu_flags_x86_avx cpu_flags_x86_avx2 cpu_flags_x86_avx_vnni cpu_flags_x86_bmi2 cpu_flags_x86_f16c cpu_flags_x86_fma3 cpu_flags_x86_sse4_2 cuda elibc_glibc kernel_linux mkl vulkan
 * FEATURES:   network-sandbox preserve-libs sandbox userpriv usersandbox
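The USE line above enables both blas and mkl (alongside cuda and vulkan), which lines up with the configure call later in the log requesting Intel MKL as the BLAS provider (GGML_BLAS_VENDOR=Intel10_64lp). A minimal sketch of the corresponding per-package USE entry, assuming the flags are set locally rather than in make.conf (the file path and exact flag selection are illustrative, not taken from the system):

# /etc/portage/package.use/ollama  -- illustrative only; flag names taken from the USE line above
sci-ml/ollama blas mkl cuda vulkan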
>>> Unpacking source...
>>> Unpacking 'ollama-0.14.2.gh.tar.gz' to /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work
>>> Unpacking 'ollama-0.14.2-deps.tar.xz' to /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work
go mod verify
all modules verified
>>> Source unpacked in /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work
>>> Preparing source in /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2 ...
 * Applying ollama-9999-use-GNUInstallDirs.patch ...
patching file CMakeLists.txt
Hunk #1 succeeded at 54 with fuzz 1 (offset 19 lines).
 [ ok ]
 * Source directory (CMAKE_USE_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2"
 * Build directory (BUILD_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build"
 * Hardcoded definition(s) removed in CMakeLists.txt:
 *     set(CMAKE_BUILD_TYPE Release)
>>> Source prepared.
>>> Configuring source in /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2 ...
 * Source directory (CMAKE_USE_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2"
 * Build directory (BUILD_DIR): "/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build"
cmake -C /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build/gentoo_common_config.cmake -G Ninja -DCMAKE_INSTALL_PREFIX=/usr -DGGML_CCACHE=no -DGGML_BACKEND_DL=yes -DGGML_BACKEND_DIR=/usr/lib64/ollama -DGGML_BLAS=yes -DCMAKE_DISABLE_FIND_PACKAGE_Vulkan=OFF -DGGML_BLAS_VENDOR=Intel10_64lp -DCMAKE_CUDA_ARCHITECTURES=all-major -DCMAKE_HIP_COMPILER=NOTFOUND -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_TOOLCHAIN_FILE=/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build/gentoo_toolchain.cmake /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2
loading initial cache file /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build/gentoo_common_config.cmake
-- The C compiler identification is GNU 14.3.1
-- The CXX compiler identification is GNU 14.3.1
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/x86_64-pc-linux-gnu-gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/x86_64-pc-linux-gnu-g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- GGML_SYSTEM_ARCH: x86
-- Including CPU backend
-- x86 detected
-- Adding CPU backend variant ggml-cpu-x64:
-- x86 detected
-- Adding CPU backend variant ggml-cpu-sse42: -msse4.2 GGML_SSE42
-- x86 detected
-- Adding CPU backend variant ggml-cpu-sandybridge: -msse4.2;-mavx GGML_SSE42;GGML_AVX
-- x86 detected
-- Adding CPU backend variant ggml-cpu-haswell: -msse4.2;-mf16c;-mfma;-mbmi2;-mavx;-mavx2 GGML_SSE42;GGML_F16C;GGML_FMA;GGML_BMI2;GGML_AVX;GGML_AVX2
-- x86 detected
-- Adding CPU backend variant ggml-cpu-alderlake: -msse4.2;-mf16c;-mfma;-mbmi2;-mavx;-mavx2;-mavxvnni GGML_SSE42;GGML_F16C;GGML_FMA;GGML_BMI2;GGML_AVX;GGML_AVX2;GGML_AVX_VNNI
CMake Warning at ml/backend/ggml/ggml/src/ggml-blas/CMakeLists.txt:9 (find_package):
  By not providing "FindBLAS.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "BLAS", but
  CMake did not find one.

  Could not find a package configuration file provided by "BLAS" with any of
  the following names:

    BLASConfig.cmake
    blas-config.cmake

  Add the installation prefix of "BLAS" to CMAKE_PREFIX_PATH or set
  "BLAS_DIR" to a directory containing one of the above files.  If "BLAS"
  provides a separate development package or SDK, be sure it has been
  installed.


CMake Error at ml/backend/ggml/ggml/src/ggml-blas/CMakeLists.txt:84 (message):
  BLAS not found, please refer to
  https://cmake.org/cmake/help/latest/module/FindBLAS.html#blas-lapack-vendors
  to set correct GGML_BLAS_VENDOR


-- Configuring incomplete, errors occurred!
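The warning and error above both come from the BLAS lookup in ggml: the config-mode search finds no BLASConfig.cmake anywhere on the system, and the check at ggml-blas/CMakeLists.txt:84 then aborts because no BLAS matching the requested vendor (Intel10_64lp, i.e. Intel MKL) was located. Below is a minimal standalone probe of the same vendor selection, assuming GGML_BLAS_VENDOR is simply forwarded to CMake's standard BLA_VENDOR mechanism (which is what the linked FindBLAS documentation describes); the probe, its paths, and the vendor mapping are illustrative and not part of the ebuild:

# Illustrative probe: can CMake's FindBLAS module locate an MKL-flavoured BLAS on this box?
mkdir -p /tmp/blas-probe && cd /tmp/blas-probe
cat > CMakeLists.txt <<'EOF'
cmake_minimum_required(VERSION 3.16)
project(blas_probe LANGUAGES C)
set(BLA_VENDOR Intel10_64lp)   # same vendor string the ollama build requests
find_package(BLAS REQUIRED)    # module mode; no BLASConfig.cmake is needed
message(STATUS "BLAS libraries: ${BLAS_LIBRARIES}")
EOF
cmake -S . -B build            # expected to fail the same way if MKL is not visible to FindBLAS

If the probe fails as well, the problem is the MKL installation (or its visibility to CMake) rather than the ollama ebuild itself.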
 * ERROR: sci-ml/ollama-0.14.2::guru failed (configure phase):
 *   cmake failed
 *
 * Call stack:
 *   ebuild.sh, line 143:  Called src_configure
 *   environment, line 2791:  Called cmake_src_configure
 *   environment, line 1462:  Called die
 * The specific snippet of code:
 *       "${CMAKE_BINARY}" "${cmakeargs[@]}" "${CMAKE_USE_DIR}" || die "cmake failed";
 *
 * If you need support, post the output of `emerge --info '=sci-ml/ollama-0.14.2::guru'`,
 * the complete build log and the output of `emerge -pqv '=sci-ml/ollama-0.14.2::guru'`.
 * The complete build log is located at '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/build.log'.
 * The ebuild environment file is located at '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/environment'.
 * Working directory: '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2_build'
 * S: '/var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/work/ollama-0.14.2'
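The support message lists exactly what to collect before filing a report against the guru overlay. A short sketch of gathering that information, plus one optional sanity check on the BLAS side (the eselect module is an assumption; it is only available if an eselect-based BLAS switcher is installed):

# Commands quoted from the message above, output captured for attaching to a report
emerge --info '=sci-ml/ollama-0.14.2::guru' > emerge-info.txt
emerge -pqv '=sci-ml/ollama-0.14.2::guru'   > emerge-pqv.txt
cp /var/tmp/notmpfs/portage/sci-ml/ollama-0.14.2/temp/build.log .

# Optional (assumption: eselect-blas is installed): which BLAS provider is selected system-wide?
eselect blas list || true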