Subject: [#407752] TESTED llama.cpp.git=7974-alt1
From: Girar awaiter (vt) @ 2026-02-09 22:15 UTC (permalink / raw)
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/407752/logs/events.1.1.log
https://packages.altlinux.org/tasks/407752
subtask  name       aarch64  i586  x86_64
   #100  llama.cpp     5:52     -    5:07
2026-Feb-09 22:04:12 :: test-only task #407752 for sisyphus started by vt:
#100 build 7974-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Feb-09 22:04:09
2026-Feb-09 22:04:14 :: [x86_64] #100 llama.cpp.git 7974-alt1: build start
2026-Feb-09 22:04:14 :: [aarch64] #100 llama.cpp.git 7974-alt1: build start
2026-Feb-09 22:04:14 :: [i586] #100 llama.cpp.git 7974-alt1: build start
2026-Feb-09 22:04:21 :: [i586] #100 llama.cpp.git 7974-alt1: build SKIPPED
build/100/x86_64/log:[00:02:12] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:12] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Feb-09 22:09:21 :: [x86_64] #100 llama.cpp.git 7974-alt1: build OK
2026-Feb-09 22:10:06 :: [aarch64] #100 llama.cpp.git 7974-alt1: build OK
2026-Feb-09 22:10:15 :: 100: build check OK
2026-Feb-09 22:10:16 :: build check OK
2026-Feb-09 22:10:29 :: #100: llama.cpp.git 7974-alt1: version check OK
2026-Feb-09 22:10:30 :: build version check OK
--- llama.cpp-cpu-7974-alt1.x86_64.rpm.share 2026-02-09 22:10:33.291525391 +0000
+++ llama.cpp-cpu-7974-alt1.aarch64.rpm.share 2026-02-09 22:10:34.367534213 +0000
@@ -8,3 +8,3 @@
/usr/share/doc/llama.cpp/README.md 100644 UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text
/usr/share/doc/llama.cpp/docs 40755 directory
warning (#100): non-identical /usr/share part
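Note on the warning above: the architecture-independent payload under /usr/share is not identical between the x86_64 and aarch64 packages; in this task only the file(1) classification of build-options.txt differs ("with very long lines" on x86_64 only). As a rough local illustration only, not the builder's actual check, the sketch below compares the /usr/share listings of two binary RPMs; the rpm query format is an assumption, and it records only path and mode, whereas the listing above also includes the file(1) type that actually differs here.

    #!/usr/bin/env python3
    # Hypothetical sketch: diff the /usr/share listing of two binary RPMs,
    # similar in spirit to the per-arch "share" comparison shown above.
    # The rpm --qf format is an assumption; the real report also lists the
    # file(1) type, which is what differs for build-options.txt in this task.
    import difflib
    import subprocess

    def share_listing(rpm_path):
        """Sorted 'path mode' lines for everything under /usr/share."""
        out = subprocess.run(
            ["rpm", "-qp", "--qf", "[%{FILENAMES} %{FILEMODES:octal}\n]", rpm_path],
            capture_output=True, text=True, check=True).stdout
        return sorted(l for l in out.splitlines() if l.startswith("/usr/share/"))

    def diff_share(rpm_a, rpm_b):
        a, b = share_listing(rpm_a), share_listing(rpm_b)
        return "".join(difflib.unified_diff(
            [l + "\n" for l in a], [l + "\n" for l in b],
            fromfile=rpm_a, tofile=rpm_b, n=3))

    if __name__ == "__main__":
        d = diff_share("llama.cpp-cpu-7974-alt1.x86_64.rpm",
                       "llama.cpp-cpu-7974-alt1.aarch64.rpm")
        print(d if d else "identical /usr/share part")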
2026-Feb-09 22:10:47 :: noarch check OK
2026-Feb-09 22:10:49 :: plan: src +1 -1 =21583, aarch64 +8 -8 =38212, x86_64 +10 -10 =39228
#100 llama.cpp 7819-alt1 -> 1:7974-alt1
Mon Feb 09 2026 Vitaly Chikunov <vt@altlinux> 1:7974-alt1
- Update to b7974 (2026-02-09).
2026-Feb-09 22:11:30 :: patched apt indices
2026-Feb-09 22:11:39 :: created next repo
2026-Feb-09 22:11:49 :: duplicate provides check OK
2026-Feb-09 22:12:27 :: dependencies check OK
2026-Feb-09 22:13:01 :: [x86_64 aarch64] ELF symbols check OK
2026-Feb-09 22:13:12 :: [x86_64] #100 libllama: install check OK
2026-Feb-09 22:13:17 :: [x86_64] #100 libllama-debuginfo: install check OK
2026-Feb-09 22:13:20 :: [aarch64] #100 libllama: install check OK
x86_64: libllama-devel=1:7974-alt1 post-install unowned files:
/usr/lib64/cmake
2026-Feb-09 22:13:21 :: [x86_64] #100 libllama-devel: install check OK
2026-Feb-09 22:13:31 :: [aarch64] #100 libllama-debuginfo: install check OK
2026-Feb-09 22:13:38 :: [x86_64] #100 llama.cpp: install check OK
aarch64: libllama-devel=1:7974-alt1 post-install unowned files:
/usr/lib64/cmake
2026-Feb-09 22:13:42 :: [aarch64] #100 libllama-devel: install check OK
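Note on the two "post-install unowned files" lines above: after installing libllama-devel and its dependencies, the directory /usr/lib64/cmake is present on the filesystem but not claimed by any installed package. The toy sketch below only illustrates what such a check reports; all paths except /usr/lib64/cmake are made up for the example.

    #!/usr/bin/env python3
    # Toy illustration of an "unowned files" check: every path present after
    # installation should be owned by some installed package. Example paths
    # are hypothetical; only /usr/lib64/cmake mirrors the report above.
    def unowned(present, owned_by_packages):
        """Paths on disk that no installed package claims."""
        return sorted(set(present) - set(owned_by_packages))

    if __name__ == "__main__":
        present = ["/usr/lib64/cmake",
                   "/usr/lib64/cmake/llama/llama-config.cmake"]
        owned = ["/usr/lib64/cmake/llama/llama-config.cmake"]
        for path in unowned(present, owned):
            print("post-install unowned file:", path)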
2026-Feb-09 22:13:44 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Feb-09 22:13:53 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Feb-09 22:13:53 :: [aarch64] #100 llama.cpp: install check OK
2026-Feb-09 22:14:04 :: [aarch64] #100 llama.cpp-cpu: install check OK
2026-Feb-09 22:14:10 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Feb-09 22:14:20 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Feb-09 22:14:27 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Feb-09 22:14:31 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2026-Feb-09 22:14:33 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Feb-09 22:14:40 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Feb-09 22:14:45 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Feb-09 22:15:03 :: [x86_64-i586] generated apt indices
2026-Feb-09 22:15:03 :: [x86_64-i586] created next repo
2026-Feb-09 22:15:14 :: [x86_64-i586] dependencies check OK
2026-Feb-09 22:15:15 :: gears inheritance check OK
2026-Feb-09 22:15:15 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: disapproved by vt
2026-Feb-09 22:15:16 :: acl check IGNORED
2026-Feb-09 22:15:28 :: created contents_index files
2026-Feb-09 22:15:36 :: created hash files: aarch64 src x86_64
2026-Feb-09 22:15:39 :: task #407752 for sisyphus TESTED