@EvilFreelancer
Created July 6, 2025 13:04
Intel Arc A770 log on llama.cpp
pasha@gpu01:~/containers/llama.cpp$ ./build/bin/llama-server -hf unsloth/Qwen3-1.7B-GGUF:Q4_K_M
WARNING: Small BAR detected for device 0000:06:00.0
WARNING: Small BAR detected for device 0000:84:00.0
WARNING: Small BAR detected for device 0000:06:00.0
WARNING: Small BAR detected for device 0000:84:00.0
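
The "Small BAR detected" warnings usually mean the Arc cards are running without Resizable BAR, so only a small window of VRAM is CPU-visible. That is separate from the crash below, but worth checking; a quick way to inspect the BAR sizes of the two devices named in the warnings is:

    lspci -vv -s 06:00.0 | grep -i region
    lspci -vv -s 84:00.0 | grep -i region
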
[New LWP 7785]
[New LWP 7784]
[New LWP 7783]
[New LWP 7782]
[New LWP 7781]
[New LWP 7780]
[New LWP 7779]
[New LWP 7778]
[New LWP 7777]
[New LWP 7776]
[New LWP 7775]
[New LWP 7774]
[New LWP 7773]
[New LWP 7772]
[New LWP 7771]
[New LWP 7770]
[New LWP 7769]
[New LWP 7768]
[New LWP 7767]
[New LWP 7766]
[New LWP 7765]
[New LWP 7764]
[New LWP 7763]
[New LWP 7762]
[New LWP 7761]
[New LWP 7760]
[New LWP 7759]
[New LWP 7758]
[New LWP 7757]
[New LWP 7756]
[New LWP 7755]
[New LWP 7754]
[New LWP 7753]
[New LWP 7752]
[New LWP 7751]
[New LWP 7750]
[New LWP 7749]
[New LWP 7748]
[New LWP 7747]
warning: File "/opt/intel/oneapi/compiler/2025.2/lib/libsycl.so.8.0.0-gdb.py" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load".
To enable execution of this file add
add-auto-load-safe-path /opt/intel/oneapi/compiler/2025.2/lib/libsycl.so.8.0.0-gdb.py
line to your configuration file "/home/pasha/.config/gdb/gdbinit".
To completely disable this security protection add
set auto-load safe-path /
line to your configuration file "/home/pasha/.config/gdb/gdbinit".
For more information about this security protection see the
"Auto-loading safe path" section in the GDB manual. E.g., run from the shell:
info "(gdb)Auto-loading safe path"
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
__syscall_cancel_arch () at ../sysdeps/unix/sysv/linux/x86_64/syscall_cancel.S:56
warning: 56 ../sysdeps/unix/sysv/linux/x86_64/syscall_cancel.S: No such file or directory
#0 __syscall_cancel_arch () at ../sysdeps/unix/sysv/linux/x86_64/syscall_cancel.S:56
56 in ../sysdeps/unix/sysv/linux/x86_64/syscall_cancel.S
#1 0x00007ff2f6e99668 in __internal_syscall_cancel (a1=<optimized out>, a2=<optimized out>, a3=<optimized out>, a4=<optimized out>, a5=a5@entry=0, a6=a6@entry=0, nr=61) at ./nptl/cancellation.c:49
warning: 49 ./nptl/cancellation.c: No such file or directory
#2 0x00007ff2f6e996ad in __syscall_cancel (a1=<optimized out>, a2=<optimized out>, a3=<optimized out>, a4=<optimized out>, a5=a5@entry=0, a6=a6@entry=0, nr=61) at ./nptl/cancellation.c:75
75 in ./nptl/cancellation.c
#3 0x00007ff2f6f04787 in __GI___wait4 (pid=<optimized out>, stat_loc=<optimized out>, options=<optimized out>, usage=<optimized out>) at ../sysdeps/unix/sysv/linux/wait4.c:30
warning: 30 ../sysdeps/unix/sysv/linux/wait4.c: No such file or directory
#4 0x00007ff2f7373456 in ggml_print_backtrace () from /home/pasha/containers/llama.cpp/build/bin/libggml-base.so
#5 0x00007ff2f73873d6 in ggml_uncaught_exception() () from /home/pasha/containers/llama.cpp/build/bin/libggml-base.so
#6 0x00007ff2f70b344a in ?? () from /lib/x86_64-linux-gnu/libstdc++.so.6
#7 0x00007ff2f70a15e9 in std::terminate() () from /lib/x86_64-linux-gnu/libstdc++.so.6
#8 0x00007ff2f70b36c8 in __cxa_throw () from /lib/x86_64-linux-gnu/libstdc++.so.6
#9 0x00007ff2f745c8e5 in dpct::dev_mgr::dev_mgr() () from /home/pasha/containers/llama.cpp/build/bin/libggml-sycl.so
#10 0x00007ff2f7435e60 in ggml_sycl_init() () from /home/pasha/containers/llama.cpp/build/bin/libggml-sycl.so
#11 0x00007ff2f74387ff in ggml_backend_sycl_reg () from /home/pasha/containers/llama.cpp/build/bin/libggml-sycl.so
#12 0x00007ff2f7939e0d in ggml_backend_registry::ggml_backend_registry() () from /home/pasha/containers/llama.cpp/build/bin/libggml.so
#13 0x00007ff2f79387ad in ggml_backend_load_best(char const*, bool, char const*) () from /home/pasha/containers/llama.cpp/build/bin/libggml.so
#14 0x00007ff2f7936f9d in ggml_backend_load_all_from_path () from /home/pasha/containers/llama.cpp/build/bin/libggml.so
#15 0x00000000004d1561 in common_params_parser_init(common_params&, llama_example, void (*)(int, char**)) ()
#16 0x00000000004cf4ef in common_params_parse(int, char**, common_params&, llama_example, void (*)(int, char**)) ()
#17 0x000000000041cc76 in main ()
[Inferior 1 (process 7745) detached]
terminate called after throwing an instance of 'std::runtime_error'
what(): can not find preferred GPU platform
Aborted
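
The exception is thrown from dpct::dev_mgr::dev_mgr() during ggml_sycl_init() (frames #9 and #10 above), i.e. the SYCL backend failed to enumerate a usable GPU platform before any model loading started. A plausible first diagnostic, assuming a standard oneAPI installation, is to source the oneAPI environment in the same shell and list the devices the SYCL runtime can actually see:

    source /opt/intel/oneapi/setvars.sh
    sycl-ls

If the two A770s do not show up as Level Zero devices in that listing, the problem is in the driver/runtime setup rather than in llama.cpp; if they do show up, explicitly pinning the selector (for example ONEAPI_DEVICE_SELECTOR=level_zero:*) before starting llama-server is a reasonable next step.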