* [#400709] TESTED llama.cpp.git=7127-alt1
@ 2025-11-21 20:41 Girar awaiter (vt)
2025-11-21 21:29 ` [#400709] TESTED (try 2) llama.cpp.git=7127-alt1 Girar awaiter (vt)
` (2 more replies)
0 siblings, 3 replies; 4+ messages in thread
From: Girar awaiter (vt) @ 2025-11-21 20:41 UTC (permalink / raw)
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/400709/logs/events.1.1.log
https://packages.altlinux.org/tasks/400709
 subtask  name       aarch64  i586  x86_64
    #100  llama.cpp     8:18     -    7:53
2025-Nov-21 20:27:12 :: test-only task #400709 for sisyphus started by vt:
#100 build 7127-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Nov-21 20:27:10
2025-Nov-21 20:27:14 :: [x86_64] #100 llama.cpp.git 7127-alt1: build start
2025-Nov-21 20:27:14 :: [aarch64] #100 llama.cpp.git 7127-alt1: build start
2025-Nov-21 20:27:14 :: [i586] #100 llama.cpp.git 7127-alt1: build start
2025-Nov-21 20:27:25 :: [i586] #100 llama.cpp.git 7127-alt1: build SKIPPED
build/100/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Nov-21 20:35:07 :: [x86_64] #100 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 20:35:32 :: [aarch64] #100 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 20:35:40 :: 100: build check OK
2025-Nov-21 20:35:42 :: build check OK
2025-Nov-21 20:35:55 :: #100: llama.cpp.git 7127-alt1: version check OK
2025-Nov-21 20:35:55 :: build version check OK
2025-Nov-21 20:36:08 :: noarch check OK
2025-Nov-21 20:36:10 :: plan: src +1 -1 =21047, aarch64 +8 -9 =37062, x86_64 +10 -11 =37900
#100 llama.cpp 6869-alt1 -> 1:7127-alt1
Fri Nov 21 2025 Vitaly Chikunov <vt@altlinux> 1:7127-alt1
- Update to b7127 (2025-11-21).
- spec: Remove llama.cpp-convert package.
- model: detect GigaChat3-10-A1.8B as deepseek lite.
2025-Nov-21 20:36:52 :: patched apt indices
2025-Nov-21 20:37:01 :: created next repo
2025-Nov-21 20:37:11 :: duplicate provides check OK
2025-Nov-21 20:37:51 :: dependencies check OK
2025-Nov-21 20:38:26 :: [x86_64 aarch64] ELF symbols check OK
2025-Nov-21 20:38:41 :: [x86_64] #100 libllama: install check OK
2025-Nov-21 20:38:49 :: [x86_64] #100 libllama-debuginfo: install check OK
2025-Nov-21 20:38:49 :: [aarch64] #100 libllama: install check OK
x86_64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 20:38:56 :: [x86_64] #100 libllama-devel: install check OK
2025-Nov-21 20:39:02 :: [aarch64] #100 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 20:39:14 :: [aarch64] #100 libllama-devel: install check OK
2025-Nov-21 20:39:20 :: [x86_64] #100 llama.cpp: install check OK
2025-Nov-21 20:39:28 :: [aarch64] #100 llama.cpp: install check OK
2025-Nov-21 20:39:29 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-Nov-21 20:39:41 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-Nov-21 20:39:43 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 20:40:02 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 20:40:07 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-Nov-21 20:40:16 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-Nov-21 20:40:32 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-Nov-21 20:40:34 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 20:40:41 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-Nov-21 20:40:52 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 20:41:11 :: [x86_64-i586] generated apt indices
2025-Nov-21 20:41:11 :: [x86_64-i586] created next repo
2025-Nov-21 20:41:22 :: [x86_64-i586] dependencies check OK
2025-Nov-21 20:41:23 :: gears inheritance check OK
2025-Nov-21 20:41:24 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Nov-21 20:41:24 :: acl check OK
2025-Nov-21 20:41:37 :: created contents_index files
2025-Nov-21 20:41:45 :: created hash files: aarch64 src x86_64
2025-Nov-21 20:41:49 :: task #400709 for sisyphus TESTED
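The "unowned files" warning in the log above reports that libllama-devel installs into /usr/lib64/cmake while no installed package owns that directory, so the post-install check flags it. One common remedy is to declare directory ownership in the spec's %files list; a sketch only (hypothetical subpackage layout, assuming the cmake config files land under %_libdir/cmake/llama):

```spec
# Hypothetical %files fragment for libllama-devel: own the cmake
# directory so the post-install unowned-files check finds no
# unowned parent directories.
%files -n libllama-devel
%dir %_libdir/cmake
%_libdir/cmake/llama/
```

Alternatively, the subpackage could require a package that already owns %_libdir/cmake, avoiding duplicate directory ownership; which route fits depends on the repository's packaging policy.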
^ permalink raw reply [flat|nested] 4+ messages in thread
* [#400709] TESTED (try 2) llama.cpp.git=7127-alt1
2025-11-21 20:41 [#400709] TESTED llama.cpp.git=7127-alt1 Girar awaiter (vt)
@ 2025-11-21 21:29 ` Girar awaiter (vt)
2025-11-21 21:47 ` [#400709] TESTED (try 3) llama.cpp.git=7127-alt1 Girar awaiter (vt)
2025-11-23 3:02 ` [#400709] DONE (try 4) llama.cpp.git=7127-alt1 Girar pender (vt)
2 siblings, 0 replies; 4+ messages in thread
From: Girar awaiter (vt) @ 2025-11-21 21:29 UTC (permalink / raw)
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/400709/logs/events.2.1.log
https://packages.altlinux.org/tasks/400709
 subtask  name       aarch64  i586  x86_64
    #200  llama.cpp     8:28     -    7:52
2025-Nov-21 21:14:49 :: test-only task #400709 for sisyphus resumed by vt:
#100 removed
#200 build 7127-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Nov-21 21:14:47
2025-Nov-21 21:14:51 :: [i586] #200 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:14:51 :: [x86_64] #200 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:14:51 :: [aarch64] #200 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:15:03 :: [i586] #200 llama.cpp.git 7127-alt1: build SKIPPED
build/200/x86_64/log:[00:04:16] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/200/x86_64/log:[00:04:16] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Nov-21 21:22:43 :: [x86_64] #200 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 21:23:19 :: [aarch64] #200 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 21:23:27 :: 200: build check OK
2025-Nov-21 21:23:29 :: build check OK
2025-Nov-21 21:23:43 :: #200: llama.cpp.git 7127-alt1: version check OK
2025-Nov-21 21:23:43 :: build version check OK
2025-Nov-21 21:23:57 :: noarch check OK
2025-Nov-21 21:23:59 :: plan: src +1 -1 =21047, aarch64 +8 -9 =37062, x86_64 +10 -11 =37900
#200 llama.cpp 6869-alt1 -> 1:7127-alt1
Fri Nov 21 2025 Vitaly Chikunov <vt@altlinux> 1:7127-alt1
- Update to b7127 (2025-11-21).
- spec: Remove llama.cpp-convert package.
- model: detect GigaChat3-10-A1.8B as deepseek lite.
2025-Nov-21 21:24:43 :: patched apt indices
2025-Nov-21 21:24:52 :: created next repo
2025-Nov-21 21:25:03 :: duplicate provides check OK
2025-Nov-21 21:25:44 :: dependencies check OK
2025-Nov-21 21:26:22 :: [x86_64 aarch64] ELF symbols check OK
2025-Nov-21 21:26:36 :: [x86_64] #200 libllama: install check OK
2025-Nov-21 21:26:44 :: [x86_64] #200 libllama-debuginfo: install check OK
2025-Nov-21 21:26:46 :: [aarch64] #200 libllama: install check OK
x86_64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 21:26:52 :: [x86_64] #200 libllama-devel: install check OK
2025-Nov-21 21:27:00 :: [aarch64] #200 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 21:27:12 :: [aarch64] #200 libllama-devel: install check OK
2025-Nov-21 21:27:16 :: [x86_64] #200 llama.cpp: install check OK
2025-Nov-21 21:27:24 :: [x86_64] #200 llama.cpp-cpu: install check OK
2025-Nov-21 21:27:26 :: [aarch64] #200 llama.cpp: install check OK
2025-Nov-21 21:27:38 :: [x86_64] #200 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 21:27:40 :: [aarch64] #200 llama.cpp-cpu: install check OK
2025-Nov-21 21:28:02 :: [x86_64] #200 llama.cpp-cuda: install check OK
2025-Nov-21 21:28:02 :: [aarch64] #200 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 21:28:17 :: [aarch64] #200 llama.cpp-vulkan: install check OK
2025-Nov-21 21:28:27 :: [x86_64] #200 llama.cpp-cuda-debuginfo: install check OK
2025-Nov-21 21:28:35 :: [aarch64] #200 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 21:28:36 :: [x86_64] #200 llama.cpp-vulkan: install check OK
2025-Nov-21 21:28:46 :: [x86_64] #200 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 21:29:05 :: [x86_64-i586] generated apt indices
2025-Nov-21 21:29:05 :: [x86_64-i586] created next repo
2025-Nov-21 21:29:15 :: [x86_64-i586] dependencies check OK
2025-Nov-21 21:29:17 :: gears inheritance check OK
2025-Nov-21 21:29:17 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #200: llama.cpp: allowed for vt
2025-Nov-21 21:29:18 :: acl check OK
2025-Nov-21 21:29:30 :: created contents_index files
2025-Nov-21 21:29:37 :: created hash files: aarch64 src x86_64
2025-Nov-21 21:29:40 :: task #400709 for sisyphus TESTED
^ permalink raw reply [flat|nested] 4+ messages in thread
* [#400709] TESTED (try 3) llama.cpp.git=7127-alt1
2025-11-21 20:41 [#400709] TESTED llama.cpp.git=7127-alt1 Girar awaiter (vt)
2025-11-21 21:29 ` [#400709] TESTED (try 2) llama.cpp.git=7127-alt1 Girar awaiter (vt)
@ 2025-11-21 21:47 ` Girar awaiter (vt)
2025-11-23 3:02 ` [#400709] DONE (try 4) llama.cpp.git=7127-alt1 Girar pender (vt)
2 siblings, 0 replies; 4+ messages in thread
From: Girar awaiter (vt) @ 2025-11-21 21:47 UTC (permalink / raw)
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/400709/logs/events.3.1.log
https://packages.altlinux.org/tasks/400709
 subtask  name       aarch64  i586  x86_64
    #300  llama.cpp     8:11     -    7:54
2025-Nov-21 21:33:25 :: test-only task #400709 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 build 7127-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Nov-21 21:33:22
2025-Nov-21 21:33:26 :: [i586] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:33:26 :: [x86_64] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:33:26 :: [aarch64] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:33:37 :: [i586] #300 llama.cpp.git 7127-alt1: build SKIPPED
build/300/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/300/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Nov-21 21:41:20 :: [x86_64] #300 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 21:41:37 :: [aarch64] #300 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 21:41:45 :: 300: build check OK
2025-Nov-21 21:41:47 :: build check OK
2025-Nov-21 21:42:00 :: #300: llama.cpp.git 7127-alt1: version check OK
2025-Nov-21 21:42:00 :: build version check OK
2025-Nov-21 21:42:13 :: noarch check OK
2025-Nov-21 21:42:15 :: plan: src +1 -1 =21047, aarch64 +8 -9 =37062, x86_64 +10 -11 =37900
#300 llama.cpp 6869-alt1 -> 1:7127-alt1
Fri Nov 21 2025 Vitaly Chikunov <vt@altlinux> 1:7127-alt1
- Update to b7127 (2025-11-21).
- spec: Remove llama.cpp-convert package.
- model: detect GigaChat3-10-A1.8B as deepseek lite.
2025-Nov-21 21:42:58 :: patched apt indices
2025-Nov-21 21:43:08 :: created next repo
2025-Nov-21 21:43:18 :: duplicate provides check OK
2025-Nov-21 21:43:58 :: dependencies check OK
2025-Nov-21 21:44:33 :: [x86_64 aarch64] ELF symbols check OK
2025-Nov-21 21:44:48 :: [x86_64] #300 libllama: install check OK
2025-Nov-21 21:44:56 :: [x86_64] #300 libllama-debuginfo: install check OK
2025-Nov-21 21:44:57 :: [aarch64] #300 libllama: install check OK
x86_64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 21:45:03 :: [x86_64] #300 libllama-devel: install check OK
2025-Nov-21 21:45:11 :: [aarch64] #300 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 21:45:23 :: [aarch64] #300 libllama-devel: install check OK
2025-Nov-21 21:45:28 :: [x86_64] #300 llama.cpp: install check OK
2025-Nov-21 21:45:36 :: [x86_64] #300 llama.cpp-cpu: install check OK
2025-Nov-21 21:45:37 :: [aarch64] #300 llama.cpp: install check OK
2025-Nov-21 21:45:50 :: [x86_64] #300 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 21:45:51 :: [aarch64] #300 llama.cpp-cpu: install check OK
2025-Nov-21 21:46:13 :: [aarch64] #300 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 21:46:14 :: [x86_64] #300 llama.cpp-cuda: install check OK
2025-Nov-21 21:46:27 :: [aarch64] #300 llama.cpp-vulkan: install check OK
2025-Nov-21 21:46:39 :: [x86_64] #300 llama.cpp-cuda-debuginfo: install check OK
2025-Nov-21 21:46:45 :: [aarch64] #300 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 21:46:48 :: [x86_64] #300 llama.cpp-vulkan: install check OK
2025-Nov-21 21:46:59 :: [x86_64] #300 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 21:47:16 :: [x86_64-i586] generated apt indices
2025-Nov-21 21:47:16 :: [x86_64-i586] created next repo
2025-Nov-21 21:47:27 :: [x86_64-i586] dependencies check OK
2025-Nov-21 21:47:28 :: gears inheritance check OK
2025-Nov-21 21:47:28 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #300: llama.cpp: allowed for vt
2025-Nov-21 21:47:29 :: acl check OK
2025-Nov-21 21:47:41 :: created contents_index files
2025-Nov-21 21:47:49 :: created hash files: aarch64 src x86_64
2025-Nov-21 21:47:52 :: task #400709 for sisyphus TESTED
^ permalink raw reply [flat|nested] 4+ messages in thread
* [#400709] DONE (try 4) llama.cpp.git=7127-alt1
2025-11-21 20:41 [#400709] TESTED llama.cpp.git=7127-alt1 Girar awaiter (vt)
2025-11-21 21:29 ` [#400709] TESTED (try 2) llama.cpp.git=7127-alt1 Girar awaiter (vt)
2025-11-21 21:47 ` [#400709] TESTED (try 3) llama.cpp.git=7127-alt1 Girar awaiter (vt)
@ 2025-11-23 3:02 ` Girar pender (vt)
2 siblings, 0 replies; 4+ messages in thread
From: Girar pender (vt) @ 2025-11-23 3:02 UTC (permalink / raw)
To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus
https://git.altlinux.org/tasks/archive/done/_391/400709/logs/events.4.1.log
https://packages.altlinux.org/tasks/400709
 subtask  name       aarch64  i586  x86_64
    #300  llama.cpp     8:13     -   10:21
2025-Nov-23 02:43:59 :: task #400709 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 build 7127-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Nov-21 21:33:22
2025-Nov-23 02:44:01 :: [x86_64] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-23 02:44:01 :: [aarch64] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-23 02:44:01 :: [i586] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-23 02:44:14 :: [i586] #300 llama.cpp.git 7127-alt1: build SKIPPED
2025-Nov-23 02:52:14 :: [aarch64] #300 llama.cpp.git 7127-alt1: build OK
build/300/x86_64/log:[00:06:06] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/300/x86_64/log:[00:06:06] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Nov-23 02:54:22 :: [x86_64] #300 llama.cpp.git 7127-alt1: build OK
2025-Nov-23 02:54:31 :: 300: build check OK
2025-Nov-23 02:54:32 :: build check OK
2025-Nov-23 02:54:46 :: #300: llama.cpp.git 7127-alt1: version check OK
2025-Nov-23 02:54:47 :: build version check OK
2025-Nov-23 02:54:59 :: noarch check OK
2025-Nov-23 02:55:01 :: plan: src +1 -1 =21047, aarch64 +8 -9 =37063, x86_64 +10 -11 =37901
#300 llama.cpp 6869-alt1 -> 1:7127-alt1
Fri Nov 21 2025 Vitaly Chikunov <vt@altlinux> 1:7127-alt1
- Update to b7127 (2025-11-21).
- spec: Remove llama.cpp-convert package.
- model: detect GigaChat3-10-A1.8B as deepseek lite.
2025-Nov-23 02:55:42 :: patched apt indices
2025-Nov-23 02:55:51 :: created next repo
2025-Nov-23 02:56:01 :: duplicate provides check OK
2025-Nov-23 02:56:40 :: dependencies check OK
2025-Nov-23 02:57:16 :: [x86_64 aarch64] ELF symbols check OK
2025-Nov-23 02:57:38 :: [x86_64] #300 libllama: install check OK
2025-Nov-23 02:57:40 :: [aarch64] #300 libllama: install check OK
2025-Nov-23 02:57:53 :: [aarch64] #300 libllama-debuginfo: install check OK
2025-Nov-23 02:57:54 :: [x86_64] #300 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-23 02:58:05 :: [aarch64] #300 libllama-devel: install check OK
x86_64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-23 02:58:06 :: [x86_64] #300 libllama-devel: install check OK
2025-Nov-23 02:58:20 :: [aarch64] #300 llama.cpp: install check OK
2025-Nov-23 02:58:34 :: [aarch64] #300 llama.cpp-cpu: install check OK
2025-Nov-23 02:58:36 :: [x86_64] #300 llama.cpp: install check OK
2025-Nov-23 02:58:46 :: [x86_64] #300 llama.cpp-cpu: install check OK
2025-Nov-23 02:58:55 :: [aarch64] #300 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-23 02:59:03 :: [x86_64] #300 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-23 02:59:10 :: [aarch64] #300 llama.cpp-vulkan: install check OK
2025-Nov-23 02:59:28 :: [aarch64] #300 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-23 02:59:31 :: [x86_64] #300 llama.cpp-cuda: install check OK
2025-Nov-23 02:59:59 :: [x86_64] #300 llama.cpp-cuda-debuginfo: install check OK
2025-Nov-23 03:00:10 :: [x86_64] #300 llama.cpp-vulkan: install check OK
2025-Nov-23 03:00:22 :: [x86_64] #300 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-23 03:00:40 :: [x86_64-i586] generated apt indices
2025-Nov-23 03:00:40 :: [x86_64-i586] created next repo
2025-Nov-23 03:00:52 :: [x86_64-i586] dependencies check OK
2025-Nov-23 03:00:53 :: gears inheritance check OK
2025-Nov-23 03:00:54 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #300: llama.cpp: allowed for vt
2025-Nov-23 03:00:54 :: acl check OK
2025-Nov-23 03:01:07 :: created contents_index files
2025-Nov-23 03:01:16 :: created hash files: aarch64 src x86_64
2025-Nov-23 03:01:19 :: task #400709 for sisyphus TESTED
2025-Nov-23 03:01:20 :: task is ready for commit
2025-Nov-23 03:01:26 :: repo clone OK
2025-Nov-23 03:01:26 :: packages update OK
2025-Nov-23 03:01:32 :: [x86_64 aarch64] update OK
2025-Nov-23 03:01:32 :: repo update OK
2025-Nov-23 03:01:44 :: repo save OK
2025-Nov-23 03:01:44 :: src index update OK
2025-Nov-23 03:01:47 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2025-Nov-23 03:02:05 :: gears update OK
2025-Nov-23 03:02:05 :: task #400709 for sisyphus DONE
^ permalink raw reply [flat|nested] 4+ messages in thread
end of thread, other threads:[~2025-11-23 3:02 UTC | newest]
Thread overview: 4+ messages
2025-11-21 20:41 [#400709] TESTED llama.cpp.git=7127-alt1 Girar awaiter (vt)
2025-11-21 21:29 ` [#400709] TESTED (try 2) llama.cpp.git=7127-alt1 Girar awaiter (vt)
2025-11-21 21:47 ` [#400709] TESTED (try 3) llama.cpp.git=7127-alt1 Girar awaiter (vt)
2025-11-23 3:02 ` [#400709] DONE (try 4) llama.cpp.git=7127-alt1 Girar pender (vt)
ALT Linux Girar Builder robot reports