[#407854] TESTED llama.cpp.git=7988-alt1
From: Girar awaiter (vt) @ 2026-02-10 20:47 UTC (permalink / raw)
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/407854/logs/events.1.1.log
https://packages.altlinux.org/tasks/407854
 subtask  name        aarch64  i586  x86_64
    #100  llama.cpp      6:14     -    5:07
2026-Feb-10 20:34:21 :: test-only task #407854 for sisyphus started by vt:
#100 build 7988-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Feb-10 20:34:19
2026-Feb-10 20:34:23 :: [x86_64] #100 llama.cpp.git 7988-alt1: build start
2026-Feb-10 20:34:23 :: [aarch64] #100 llama.cpp.git 7988-alt1: build start
2026-Feb-10 20:34:23 :: [i586] #100 llama.cpp.git 7988-alt1: build start
2026-Feb-10 20:34:30 :: [i586] #100 llama.cpp.git 7988-alt1: build SKIPPED
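The i586 skip usually just means the spec restricts build architectures (for example because the CUDA flavour is 64-bit only). A minimal way to confirm this from the source package, assuming access to the built .src.rpm (the file name below is illustrative, not taken from this log):

  # Print any arch restrictions recorded in the source package;
  # empty output means no ExcludeArch/ExclusiveArch is set.
  rpm -qp --qf '[excludearch: %{EXCLUDEARCH}\n][exclusivearch: %{EXCLUSIVEARCH}\n]' \
      llama.cpp-7988-alt1.src.rpm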
build/100/x86_64/log:[00:02:13] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:13] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
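These two warnings only note that the CUDA runtime libraries the build links against provide no debuginfo of their own, which is expected for proprietary libraries. One illustrative way to see that a shared object carries no reference to split debug info is to look for a .gnu_debuglink section (paths taken from the warnings above, binutils assumed installed):

  # No output means the library does not point at a separate debuginfo file.
  objdump -h /usr/lib64/libcublas.so.12 | grep gnu_debuglink
  objdump -h /usr/lib64/libcudart.so.12 | grep gnu_debuglink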
2026-Feb-10 20:39:30 :: [x86_64] #100 llama.cpp.git 7988-alt1: build OK
2026-Feb-10 20:40:37 :: [aarch64] #100 llama.cpp.git 7988-alt1: build OK
2026-Feb-10 20:40:45 :: 100: build check OK
2026-Feb-10 20:40:46 :: build check OK
2026-Feb-10 20:41:01 :: #100: llama.cpp.git 7988-alt1: version check OK
2026-Feb-10 20:41:02 :: build version check OK
--- llama.cpp-cpu-7988-alt1.x86_64.rpm.share 2026-02-10 20:41:05.621898788 +0000
+++ llama.cpp-cpu-7988-alt1.aarch64.rpm.share 2026-02-10 20:41:06.873910461 +0000
@@ -8,3 +8,3 @@
/usr/share/doc/llama.cpp/README.md 100644 UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text
/usr/share/doc/llama.cpp/docs 40755 directory
warning (#100): non-identical /usr/share part
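The warning is informational: the per-architecture /usr/share listings are compared, and the recorded file type for build-options.txt differs (the x86_64 copy is flagged "with very long lines", the aarch64 one is not), so the shared payload is not byte-identical across arches. A rough way to reproduce the comparison locally, assuming both binary packages named in the diff header are at hand:

  mkdir x86_64 aarch64
  ( cd x86_64  && rpm2cpio ../llama.cpp-cpu-7988-alt1.x86_64.rpm  | cpio -id )
  ( cd aarch64 && rpm2cpio ../llama.cpp-cpu-7988-alt1.aarch64.rpm | cpio -id )
  # Compare how file(1) classifies the differing document on each arch.
  file x86_64/usr/share/doc/llama.cpp/build-options.txt \
       aarch64/usr/share/doc/llama.cpp/build-options.txt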
2026-Feb-10 20:41:21 :: noarch check OK
2026-Feb-10 20:41:23 :: plan: src +1 -1 =21565, aarch64 +8 -8 =38149, x86_64 +10 -10 =39163
#100 llama.cpp 7819-alt1 -> 1:7988-alt1
Tue Feb 10 2026 Vitaly Chikunov <vt@altlinux> 1:7988-alt1
- Update to b7988 (2026-02-10).
2026-Feb-10 20:42:11 :: patched apt indices
2026-Feb-10 20:42:21 :: created next repo
2026-Feb-10 20:42:33 :: duplicate provides check OK
2026-Feb-10 20:43:16 :: dependencies check OK
2026-Feb-10 20:43:50 :: [x86_64 aarch64] ELF symbols check OK
2026-Feb-10 20:44:01 :: [x86_64] #100 libllama: install check OK
2026-Feb-10 20:44:06 :: [x86_64] #100 libllama-debuginfo: install check OK
x86_64: libllama-devel=1:7988-alt1 post-install unowned files:
/usr/lib64/cmake
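This note means libllama-devel installs files under /usr/lib64/cmake without owning (or depending on an owner of) that directory; the same note is reported for aarch64 below. A quick check on an installed system, using the path from the log:

  # Prints the owning package, or reports that the directory
  # is not owned by any package.
  rpm -qf /usr/lib64/cmake

The usual remedy is to have the -devel subpackage claim the directory in its file list, or to pull in a package that already owns it.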
2026-Feb-10 20:44:11 :: [x86_64] #100 libllama-devel: install check OK
2026-Feb-10 20:44:11 :: [aarch64] #100 libllama: install check OK
2026-Feb-10 20:44:22 :: [aarch64] #100 libllama-debuginfo: install check OK
2026-Feb-10 20:44:28 :: [x86_64] #100 llama.cpp: install check OK
aarch64: libllama-devel=1:7988-alt1 post-install unowned files:
/usr/lib64/cmake
2026-Feb-10 20:44:32 :: [aarch64] #100 libllama-devel: install check OK
2026-Feb-10 20:44:33 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Feb-10 20:44:42 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Feb-10 20:44:45 :: [aarch64] #100 llama.cpp: install check OK
2026-Feb-10 20:44:56 :: [aarch64] #100 llama.cpp-cpu: install check OK
2026-Feb-10 20:44:59 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Feb-10 20:45:12 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Feb-10 20:45:17 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Feb-10 20:45:23 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Feb-10 20:45:24 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2026-Feb-10 20:45:30 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Feb-10 20:45:37 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Feb-10 20:45:56 :: [x86_64-i586] generated apt indices
2026-Feb-10 20:45:56 :: [x86_64-i586] created next repo
2026-Feb-10 20:46:07 :: [x86_64-i586] dependencies check OK
2026-Feb-10 20:46:56 :: gears inheritance check OK
2026-Feb-10 20:46:56 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Feb-10 20:46:57 :: acl check OK
2026-Feb-10 20:47:08 :: created contents_index files
2026-Feb-10 20:47:17 :: created hash files: aarch64 src x86_64
2026-Feb-10 20:47:20 :: task #407854 for sisyphus TESTED