* [#394153] TESTED llama.cpp.git=6397-alt1
From: Girar awaiter (vt) @ 2025-09-06 6:25 UTC
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/394153/logs/events.1.1.log
https://packages.altlinux.org/tasks/394153
subtask  name       aarch64  i586  x86_64
#100     llama.cpp     8:02     -    7:28
2025-Sep-06 06:11:23 :: test-only task #394153 for sisyphus started by vt:
#100 build 6397-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Sep-06 06:11:21
2025-Sep-06 06:11:25 :: [x86_64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-06 06:11:25 :: [i586] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-06 06:11:25 :: [aarch64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-06 06:11:38 :: [i586] #100 llama.cpp.git 6397-alt1: build SKIPPED
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Sep-06 06:18:53 :: [x86_64] #100 llama.cpp.git 6397-alt1: build OK
2025-Sep-06 06:19:27 :: [aarch64] #100 llama.cpp.git 6397-alt1: build OK
2025-Sep-06 06:19:51 :: #100: llama.cpp.git 6397-alt1: build check OK
2025-Sep-06 06:19:52 :: build check OK
2025-Sep-06 06:20:04 :: noarch check OK
2025-Sep-06 06:20:06 :: plan: src +1 -1 =20597, aarch64 +9 -8 =36404, x86_64 +11 -10 =37232
#100 llama.cpp 6121-alt1 -> 1:6397-alt1
Sat Sep 06 2025 Vitaly Chikunov <vt@altlinux> 1:6397-alt1
- Update to b6397 (2025-09-06).
- Python-based model conversion scripts are sub-packaged. Note that they are
not supported and are provided as-is.
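The changelog notes that the Python model-conversion scripts are shipped as an unsupported subpackage (checked later in this log as llama.cpp-convert). As a minimal usage sketch only -- assuming the subpackage installs upstream's convert_hf_to_gguf.py on PATH; the exact script name and install path are not shown in this log -- converting a Hugging Face checkpoint to GGUF typically looks like:

    # Hypothetical sketch; script and model paths are assumptions, not taken from this log.
    # Convert a local Hugging Face model directory to an f16 GGUF file, then quantize it.
    python3 convert_hf_to_gguf.py /path/to/hf-model \
        --outfile model-f16.gguf --outtype f16
    llama-quantize model-f16.gguf model-q4_k_m.gguf Q4_K_M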
2025-Sep-06 06:20:45 :: patched apt indices
2025-Sep-06 06:20:54 :: created next repo
2025-Sep-06 06:21:03 :: duplicate provides check OK
2025-Sep-06 06:21:40 :: dependencies check OK
2025-Sep-06 06:22:10 :: [x86_64 aarch64] ELF symbols check OK
2025-Sep-06 06:22:25 :: [x86_64] #100 libllama: install check OK
2025-Sep-06 06:22:33 :: [x86_64] #100 libllama-debuginfo: install check OK
2025-Sep-06 06:22:33 :: [aarch64] #100 libllama: install check OK
x86_64: libllama-devel=1:6397-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Sep-06 06:22:40 :: [x86_64] #100 libllama-devel: install check OK
2025-Sep-06 06:22:46 :: [aarch64] #100 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:6397-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Sep-06 06:22:57 :: [aarch64] #100 libllama-devel: install check OK
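Both architectures report /usr/lib64/cmake as unowned after installing libllama-devel. A quick way to check directory ownership on an installed system (a generic rpm query, not output from this task) is:

    # Ask which installed package, if any, owns the directory;
    # "file /usr/lib64/cmake is not owned by any package" matches the condition the check flags.
    rpm -qf /usr/lib64/cmake

One possible resolution -- an assumption, not necessarily the maintainer's chosen fix -- is to own the directory in the -devel %files list (e.g. %dir %_libdir/cmake) or to depend on a package that already owns it.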
2025-Sep-06 06:23:05 :: [x86_64] #100 llama.cpp: install check OK
2025-Sep-06 06:23:12 :: [aarch64] #100 llama.cpp: install check OK
2025-Sep-06 06:23:13 :: [x86_64] #100 llama.cpp-convert: install check OK
2025-Sep-06 06:23:22 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-Sep-06 06:23:26 :: [aarch64] #100 llama.cpp-convert: install check OK
2025-Sep-06 06:23:36 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Sep-06 06:23:39 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-Sep-06 06:23:59 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-Sep-06 06:24:00 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Sep-06 06:24:14 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-Sep-06 06:24:24 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-Sep-06 06:24:31 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Sep-06 06:24:33 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-Sep-06 06:24:44 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Sep-06 06:25:00 :: [x86_64-i586] generated apt indices
2025-Sep-06 06:25:00 :: [x86_64-i586] created next repo
2025-Sep-06 06:25:11 :: [x86_64-i586] dependencies check OK
2025-Sep-06 06:25:12 :: gears inheritance check OK
2025-Sep-06 06:25:12 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Sep-06 06:25:13 :: acl check OK
2025-Sep-06 06:25:25 :: created contents_index files
2025-Sep-06 06:25:32 :: created hash files: aarch64 src x86_64
2025-Sep-06 06:25:35 :: task #394153 for sisyphus TESTED
* [#394153] DONE (try 2) llama.cpp.git=6397-alt1
From: Girar pender (vt) @ 2025-09-07 4:05 UTC
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/archive/done/_384/394153/logs/events.2.2.log
https://packages.altlinux.org/tasks/394153
2025-Sep-07 03:59:06 :: task #394153 for sisyphus resumed by vt:
#100 build 6397-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Sep-06 06:11:21
2025-Sep-07 03:59:07 :: [i586] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-07 03:59:07 :: [x86_64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-07 03:59:07 :: [aarch64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-07 03:59:26 :: [aarch64] #100 llama.cpp.git 6397-alt1: build OK (cached)
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Sep-07 03:59:28 :: [x86_64] #100 llama.cpp.git 6397-alt1: build OK (cached)
2025-Sep-07 03:59:34 :: [i586] #100 llama.cpp.git 6397-alt1: build SKIPPED
2025-Sep-07 03:59:56 :: #100: llama.cpp.git 6397-alt1: build check OK
2025-Sep-07 03:59:58 :: build check OK
2025-Sep-07 04:00:11 :: noarch check OK
2025-Sep-07 04:00:13 :: plan: src +1 -1 =20597, aarch64 +9 -8 =36382, x86_64 +11 -10 =37210
#100 llama.cpp 6121-alt1 -> 1:6397-alt1
Sat Sep 06 2025 Vitaly Chikunov <vt@altlinux> 1:6397-alt1
- Update to b6397 (2025-09-06).
- Python-based model conversion scripts are sub-packaged. Note that they are
not supported and are provided as-is.
2025-Sep-07 04:00:59 :: patched apt indices
2025-Sep-07 04:01:09 :: created next repo
2025-Sep-07 04:01:20 :: duplicate provides check OK
2025-Sep-07 04:02:01 :: dependencies check OK
2025-Sep-07 04:02:35 :: [x86_64 aarch64] ELF symbols check OK
2025-Sep-07 04:02:47 :: [x86_64] #100 libllama: install check OK (cached)
2025-Sep-07 04:02:52 :: [x86_64] #100 libllama-debuginfo: install check OK (cached)
2025-Sep-07 04:02:55 :: [aarch64] #100 libllama: install check OK (cached)
x86_64: libllama-devel=1:6397-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Sep-07 04:02:57 :: [x86_64] #100 libllama-devel: install check OK (cached)
2025-Sep-07 04:03:02 :: [aarch64] #100 libllama-debuginfo: install check OK (cached)
2025-Sep-07 04:03:03 :: [x86_64] #100 llama.cpp: install check OK (cached)
2025-Sep-07 04:03:08 :: [x86_64] #100 llama.cpp-convert: install check OK (cached)
aarch64: libllama-devel=1:6397-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Sep-07 04:03:10 :: [aarch64] #100 libllama-devel: install check OK (cached)
2025-Sep-07 04:03:12 :: [x86_64] #100 llama.cpp-cpu: install check OK (cached)
2025-Sep-07 04:03:17 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2025-Sep-07 04:03:18 :: [aarch64] #100 llama.cpp: install check OK (cached)
2025-Sep-07 04:03:24 :: [x86_64] #100 llama.cpp-cuda: install check OK (cached)
2025-Sep-07 04:03:26 :: [aarch64] #100 llama.cpp-convert: install check OK (cached)
2025-Sep-07 04:03:30 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK (cached)
2025-Sep-07 04:03:34 :: [aarch64] #100 llama.cpp-cpu: install check OK (cached)
2025-Sep-07 04:03:35 :: [x86_64] #100 llama.cpp-vulkan: install check OK (cached)
2025-Sep-07 04:03:40 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2025-Sep-07 04:03:42 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2025-Sep-07 04:03:50 :: [aarch64] #100 llama.cpp-vulkan: install check OK (cached)
2025-Sep-07 04:03:58 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2025-Sep-07 04:04:17 :: [x86_64-i586] generated apt indices
2025-Sep-07 04:04:17 :: [x86_64-i586] created next repo
2025-Sep-07 04:04:29 :: [x86_64-i586] dependencies check OK
2025-Sep-07 04:04:30 :: gears inheritance check OK
2025-Sep-07 04:04:30 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Sep-07 04:04:31 :: acl check OK
2025-Sep-07 04:04:44 :: created contents_index files
2025-Sep-07 04:04:53 :: created hash files: aarch64 src x86_64
2025-Sep-07 04:04:56 :: task #394153 for sisyphus TESTED
2025-Sep-07 04:04:57 :: task is ready for commit
2025-Sep-07 04:05:02 :: repo clone OK
2025-Sep-07 04:05:02 :: packages update OK
2025-Sep-07 04:05:08 :: [x86_64 aarch64] update OK
2025-Sep-07 04:05:09 :: repo update OK
2025-Sep-07 04:05:20 :: repo save OK
2025-Sep-07 04:05:20 :: src index update OK
2025-Sep-07 04:05:21 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2025-Sep-07 04:05:40 :: gears update OK
2025-Sep-07 04:05:40 :: task #394153 for sisyphus DONE