ALT Linux Girar Builder robot reports
* [#377221] [test-only] FAILED llama.cpp.git=4855-alt1
@ 2025-03-08  2:04 Girar awaiter (vt)
  2025-03-08  2:21 ` [#377221] [test-only] FAILED (try 2) llama.cpp.git=4855-alt1 Girar awaiter (vt)
                   ` (5 more replies)
  0 siblings, 6 replies; 7+ messages in thread
From: Girar awaiter (vt) @ 2025-03-08  2:04 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/377221/logs/events.1.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #100  llama.cpp   failed     -    8:05

2025-Mar-08 01:56:15 :: test-only task #377221 for sisyphus started by vt:
#100 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-08 01:56:12
2025-Mar-08 01:56:17 :: [aarch64] #100 llama.cpp.git 4855-alt1: build start
2025-Mar-08 01:56:17 :: [x86_64] #100 llama.cpp.git 4855-alt1: build start
2025-Mar-08 01:56:17 :: [i586] #100 llama.cpp.git 4855-alt1: build start
2025-Mar-08 01:56:30 :: [i586] #100 llama.cpp.git 4855-alt1: build SKIPPED
[aarch64] Processing files: libllama-4855-alt1
[aarch64] error: No such file or directory: /usr/src/tmp/llama.cpp-buildroot/usr/lib64/libggml-cuda.so
[aarch64] RPM build errors:
[aarch64]     No such file or directory: /usr/src/tmp/llama.cpp-buildroot/usr/lib64/libggml-cuda.so
2025-Mar-08 02:00:49 :: [aarch64] llama.cpp.git 4855-alt1: remote: build failed
2025-Mar-08 02:00:50 :: [aarch64] #100 llama.cpp.git 4855-alt1: build FAILED
2025-Mar-08 02:00:50 :: [aarch64] requesting cancellation of task processing
build/100/x86_64/log:[00:05:07] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:05:07] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-08 02:04:22 :: [x86_64] #100 llama.cpp.git 4855-alt1: build OK
2025-Mar-08 02:00:50 :: [aarch64] build FAILED
2025-Mar-08 02:04:23 :: task #377221 for sisyphus FAILED
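The aarch64 failure above is a packaging error rather than a compile error: the %files list for libllama references /usr/lib64/libggml-cuda.so, but the CUDA backend library is only produced on x86_64, where the CUDA toolkit is available. A minimal spec sketch of an arch-conditional entry (the surrounding file list and the non-CUDA library names are assumptions, not taken from this log):

    %files -n libllama
    # library names other than libggml-cuda.so are assumptions
    %_libdir/libllama.so*
    %_libdir/libggml-base.so
    %_libdir/libggml-cpu.so
    %ifarch x86_64
    # the CUDA backend is built only where the CUDA toolkit exists
    %_libdir/libggml-cuda.so
    %endif

Note that the final revision in this task (try 7) instead ships the CUDA backend in a separate llama.cpp-cuda subpackage.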



* [#377221] [test-only] FAILED (try 2) llama.cpp.git=4855-alt1
  2025-03-08  2:04 [#377221] [test-only] FAILED llama.cpp.git=4855-alt1 Girar awaiter (vt)
@ 2025-03-08  2:21 ` Girar awaiter (vt)
  2025-03-08  2:43 ` [#377221] TESTED (try 3) llama.cpp.git=4855-alt1 Girar awaiter (vt)
                   ` (4 subsequent siblings)
  5 siblings, 0 replies; 7+ messages in thread
From: Girar awaiter (vt) @ 2025-03-08  2:21 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/377221/logs/events.2.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #200  llama.cpp     5:37     -    6:17

2025-Mar-08 02:12:57 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-08 02:12:54
2025-Mar-08 02:12:58 :: [x86_64] #200 llama.cpp.git 4855-alt1: build start
2025-Mar-08 02:12:58 :: [aarch64] #200 llama.cpp.git 4855-alt1: build start
2025-Mar-08 02:12:58 :: [i586] #200 llama.cpp.git 4855-alt1: build start
2025-Mar-08 02:13:20 :: [i586] #200 llama.cpp.git 4855-alt1: build SKIPPED
2025-Mar-08 02:18:35 :: [aarch64] #200 llama.cpp.git 4855-alt1: build OK
build/200/x86_64/log:[00:03:36] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/200/x86_64/log:[00:03:36] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-08 02:19:15 :: [x86_64] #200 llama.cpp.git 4855-alt1: build OK
2025-Mar-08 02:19:31 :: #200: llama.cpp.git 4855-alt1: build check OK
2025-Mar-08 02:19:32 :: build check OK
2025-Mar-08 02:20:00 :: noarch check OK
2025-Mar-08 02:20:02 :: plan: src +1 -1 =19911, aarch64 +5 -5 =34598, x86_64 +5 -5 =35399
#200 llama.cpp 3441-alt1 -> 1:4855-alt1
 Fri Mar 07 2025 Vitaly Chikunov <vt@altlinux> 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - test: Enable NVIDIA GPU.
2025-Mar-08 02:20:39 :: patched apt indices
2025-Mar-08 02:20:48 :: created next repo
2025-Mar-08 02:20:58 :: duplicate provides check OK
	x86_64: NEW unmet dependencies detected:
 libllama-debuginfo#1:4855-alt1:sisyphus+377221.200.2.1@1741400312  debug64(libcuda.so.1)
	ACLs of affected packages (1):
 llama.cpp  vt @everybody
2025-Mar-08 02:21:35 :: unmets: x86_64 +1 -0 =1
2025-Mar-08 02:21:35 :: dependencies check FAILED
2025-Mar-08 02:21:35 :: task #377221 for sisyphus FAILED
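Try 2 builds on both architectures but fails the repository dependency check: the generated libllama-debuginfo package acquires a debug64(libcuda.so.1) requirement, and libcuda.so.1 belongs to the proprietary NVIDIA driver, which has no debuginfo package in Sisyphus (the "not yet debuginfo-enabled" warnings for libcublas and libcudart above are the same class of issue, only non-fatal). One possible spec-side workaround is to filter that requirement out; the macro and pattern below follow ALT rpm-build's requires-filtering convention as an assumption and are not taken from this task:

    # drop the unsatisfiable debuginfo requirement on the proprietary
    # driver library (assumption: %filter_from_requires is available)
    %filter_from_requires /^debug64(libcuda\.so\.1)/d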



* [#377221] TESTED (try 3) llama.cpp.git=4855-alt1
  2025-03-08  2:04 [#377221] [test-only] FAILED llama.cpp.git=4855-alt1 Girar awaiter (vt)
  2025-03-08  2:21 ` [#377221] [test-only] FAILED (try 2) llama.cpp.git=4855-alt1 Girar awaiter (vt)
@ 2025-03-08  2:43 ` Girar awaiter (vt)
  2025-03-08 20:27 ` [#377221] [test-only] FAILED (try 4) llama.cpp.git=4855-alt1 Girar awaiter (vt)
                   ` (3 subsequent siblings)
  5 siblings, 0 replies; 7+ messages in thread
From: Girar awaiter (vt) @ 2025-03-08  2:43 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/377221/logs/events.3.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #300  llama.cpp     5:36     -    6:06

2025-Mar-08 02:31:55 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-08 02:31:52
2025-Mar-08 02:31:57 :: [aarch64] #300 llama.cpp.git 4855-alt1: build start
2025-Mar-08 02:31:57 :: [i586] #300 llama.cpp.git 4855-alt1: build start
2025-Mar-08 02:31:57 :: [x86_64] #300 llama.cpp.git 4855-alt1: build start
2025-Mar-08 02:32:16 :: [i586] #300 llama.cpp.git 4855-alt1: build SKIPPED
2025-Mar-08 02:37:33 :: [aarch64] #300 llama.cpp.git 4855-alt1: build OK
build/300/x86_64/log:[00:03:31] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/300/x86_64/log:[00:03:31] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-08 02:38:03 :: [x86_64] #300 llama.cpp.git 4855-alt1: build OK
2025-Mar-08 02:38:18 :: #300: llama.cpp.git 4855-alt1: build check OK
2025-Mar-08 02:38:19 :: build check OK
2025-Mar-08 02:38:45 :: noarch check OK
2025-Mar-08 02:38:47 :: plan: src +1 -1 =19911, aarch64 +5 -5 =34598, x86_64 +5 -5 =35399
#300 llama.cpp 3441-alt1 -> 1:4855-alt1
 Fri Mar 07 2025 Vitaly Chikunov <vt@altlinux> 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - test: Enable NVIDIA GPU.
2025-Mar-08 02:39:27 :: patched apt indices
2025-Mar-08 02:39:35 :: created next repo
2025-Mar-08 02:39:45 :: duplicate provides check OK
2025-Mar-08 02:40:23 :: dependencies check OK
2025-Mar-08 02:40:57 :: [x86_64 aarch64] ELF symbols check OK
2025-Mar-08 02:41:19 :: [aarch64] #300 libllama: install check OK
2025-Mar-08 02:41:24 :: [x86_64] #300 libllama: install check OK
2025-Mar-08 02:41:33 :: [aarch64] #300 libllama-debuginfo: install check OK
	aarch64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-08 02:41:44 :: [aarch64] #300 libllama-devel: install check OK
2025-Mar-08 02:41:46 :: [x86_64] #300 libllama-debuginfo: install check OK
2025-Mar-08 02:41:58 :: [aarch64] #300 llama.cpp: install check OK
	x86_64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-08 02:42:06 :: [x86_64] #300 libllama-devel: install check OK
2025-Mar-08 02:42:28 :: [x86_64] #300 llama.cpp: install check OK
2025-Mar-08 02:42:30 :: [aarch64] #300 llama.cpp-debuginfo: install check OK
2025-Mar-08 02:43:02 :: [x86_64] #300 llama.cpp-debuginfo: install check OK
2025-Mar-08 02:43:20 :: [x86_64-i586] generated apt indices
2025-Mar-08 02:43:20 :: [x86_64-i586] created next repo
2025-Mar-08 02:43:30 :: [x86_64-i586] dependencies check OK
2025-Mar-08 02:43:31 :: gears inheritance check OK
2025-Mar-08 02:43:31 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #300: llama.cpp: disapproved by vt
2025-Mar-08 02:43:32 :: acl check IGNORED
2025-Mar-08 02:43:44 :: created contents_index files
2025-Mar-08 02:43:51 :: created hash files: aarch64 src x86_64
2025-Mar-08 02:43:54 :: task #377221 for sisyphus TESTED
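The "post-install unowned files" notes above are non-fatal, but they flag that libllama-devel installs CMake package files under /usr/lib64/cmake while no installed package owns that directory. A small sketch of one way to address it in the -devel file list (the header name, the cmake subdirectory name, and the alternative of depending on a directory-owning package are assumptions):

    %files -n libllama-devel
    %_includedir/llama.h
    # own the parent directory so it is not left unowned after install
    %dir %_libdir/cmake
    %_libdir/cmake/llama/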



* [#377221] [test-only] FAILED (try 4) llama.cpp.git=4855-alt1
  2025-03-08  2:04 [#377221] [test-only] FAILED llama.cpp.git=4855-alt1 Girar awaiter (vt)
  2025-03-08  2:21 ` [#377221] [test-only] FAILED (try 2) llama.cpp.git=4855-alt1 Girar awaiter (vt)
  2025-03-08  2:43 ` [#377221] TESTED (try 3) llama.cpp.git=4855-alt1 Girar awaiter (vt)
@ 2025-03-08 20:27 ` Girar awaiter (vt)
  2025-03-09  1:07 ` [#377221] TESTED (try 5) llama.cpp.git=4855-alt1 Girar awaiter (vt)
                   ` (2 subsequent siblings)
  5 siblings, 0 replies; 7+ messages in thread
From: Girar awaiter (vt) @ 2025-03-08 20:27 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/377221/logs/events.4.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #400  llama.cpp   failed     -    5:27

2025-Mar-08 20:22:15 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 removed
#400 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-08 20:22:12
2025-Mar-08 20:22:16 :: [x86_64] #400 llama.cpp.git 4855-alt1: build start
2025-Mar-08 20:22:16 :: [aarch64] #400 llama.cpp.git 4855-alt1: build start
2025-Mar-08 20:22:16 :: [i586] #400 llama.cpp.git 4855-alt1: build start
2025-Mar-08 20:22:28 :: [i586] #400 llama.cpp.git 4855-alt1: build SKIPPED
[aarch64] 27/27 Test #27: test-autorelease ..................   Passed    0.00 sec
[aarch64] 100% tests passed, 0 tests failed out of 27
[aarch64] Label Time Summary:
2025-Mar-08 20:25:40 :: [aarch64] llama.cpp.git 4855-alt1: remote: build failed
2025-Mar-08 20:25:40 :: [aarch64] #400 llama.cpp.git 4855-alt1: build FAILED
2025-Mar-08 20:25:40 :: [aarch64] requesting cancellation of task processing
build/400/x86_64/log:[00:03:00] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/400/x86_64/log:[00:03:00] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-08 20:27:43 :: [x86_64] #400 llama.cpp.git 4855-alt1: build OK
2025-Mar-08 20:25:40 :: [aarch64] build FAILED
2025-Mar-08 20:27:43 :: task #377221 for sisyphus FAILED



* [#377221] TESTED (try 5) llama.cpp.git=4855-alt1
  2025-03-08  2:04 [#377221] [test-only] FAILED llama.cpp.git=4855-alt1 Girar awaiter (vt)
                   ` (2 preceding siblings ...)
  2025-03-08 20:27 ` [#377221] [test-only] FAILED (try 4) llama.cpp.git=4855-alt1 Girar awaiter (vt)
@ 2025-03-09  1:07 ` Girar awaiter (vt)
  2025-03-09 10:49 ` [#377221] [test-only] FAILED (try 6) llama.cpp.git=4855-alt1 Girar awaiter (vt)
  2025-03-09 11:35 ` [#377221] TESTED (try 7) llama.cpp.git=4855-alt1 Girar awaiter (vt)
  5 siblings, 0 replies; 7+ messages in thread
From: Girar awaiter (vt) @ 2025-03-09  1:07 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/377221/logs/events.5.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #500  llama.cpp     4:37     -    6:49

2025-Mar-09 00:55:41 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 removed
#400 removed
#500 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-09 00:55:38
2025-Mar-09 00:55:42 :: [i586] #500 llama.cpp.git 4855-alt1: build start
2025-Mar-09 00:55:42 :: [x86_64] #500 llama.cpp.git 4855-alt1: build start
2025-Mar-09 00:55:42 :: [aarch64] #500 llama.cpp.git 4855-alt1: build start
2025-Mar-09 00:55:54 :: [i586] #500 llama.cpp.git 4855-alt1: build SKIPPED
2025-Mar-09 01:00:19 :: [aarch64] #500 llama.cpp.git 4855-alt1: build OK
build/500/x86_64/log:[00:03:44] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/500/x86_64/log:[00:03:44] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-09 01:02:31 :: [x86_64] #500 llama.cpp.git 4855-alt1: build OK
2025-Mar-09 01:02:55 :: #500: llama.cpp.git 4855-alt1: build check OK
2025-Mar-09 01:02:56 :: build check OK
2025-Mar-09 01:03:20 :: noarch check OK
2025-Mar-09 01:03:22 :: plan: src +1 -1 =19914, aarch64 +5 -5 =34622, x86_64 +5 -5 =35423
#500 llama.cpp 3441-alt1 -> 1:4855-alt1
 Fri Mar 07 2025 Vitaly Chikunov <vt@altlinux> 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - test: Enable NVIDIA GPU.
2025-Mar-09 01:04:03 :: patched apt indices
2025-Mar-09 01:04:13 :: created next repo
2025-Mar-09 01:04:23 :: duplicate provides check OK
2025-Mar-09 01:05:03 :: dependencies check OK
2025-Mar-09 01:05:35 :: [x86_64 aarch64] ELF symbols check OK
2025-Mar-09 01:05:51 :: [x86_64] #500 libllama: install check OK
2025-Mar-09 01:05:57 :: [aarch64] #500 libllama: install check OK
2025-Mar-09 01:06:00 :: [x86_64] #500 libllama-debuginfo: install check OK
	x86_64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-09 01:06:08 :: [x86_64] #500 libllama-devel: install check OK
2025-Mar-09 01:06:10 :: [aarch64] #500 libllama-debuginfo: install check OK
	aarch64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-09 01:06:21 :: [aarch64] #500 libllama-devel: install check OK
	x86_64: llama.cpp=1:4855-alt1 post-install unowned files:
 /usr/lib/llama
2025-Mar-09 01:06:32 :: [x86_64] #500 llama.cpp: install check OK
	aarch64: llama.cpp=1:4855-alt1 post-install unowned files:
 /usr/lib/llama
2025-Mar-09 01:06:35 :: [aarch64] #500 llama.cpp: install check OK
2025-Mar-09 01:07:05 :: [aarch64] #500 llama.cpp-debuginfo: install check OK
2025-Mar-09 01:07:08 :: [x86_64] #500 llama.cpp-debuginfo: install check OK
2025-Mar-09 01:07:25 :: [x86_64-i586] generated apt indices
2025-Mar-09 01:07:25 :: [x86_64-i586] created next repo
2025-Mar-09 01:07:36 :: [x86_64-i586] dependencies check OK
2025-Mar-09 01:07:36 :: gears inheritance check OK
2025-Mar-09 01:07:37 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #500: llama.cpp: disapproved by vt
2025-Mar-09 01:07:37 :: acl check IGNORED
2025-Mar-09 01:07:48 :: created contents_index files
2025-Mar-09 01:07:56 :: created hash files: aarch64 src x86_64
2025-Mar-09 01:07:58 :: task #377221 for sisyphus TESTED



* [#377221] [test-only] FAILED (try 6) llama.cpp.git=4855-alt1
  2025-03-08  2:04 [#377221] [test-only] FAILED llama.cpp.git=4855-alt1 Girar awaiter (vt)
                   ` (3 preceding siblings ...)
  2025-03-09  1:07 ` [#377221] TESTED (try 5) llama.cpp.git=4855-alt1 Girar awaiter (vt)
@ 2025-03-09 10:49 ` Girar awaiter (vt)
  2025-03-09 11:35 ` [#377221] TESTED (try 7) llama.cpp.git=4855-alt1 Girar awaiter (vt)
  5 siblings, 0 replies; 7+ messages in thread
From: Girar awaiter (vt) @ 2025-03-09 10:49 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/377221/logs/events.6.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #600  llama.cpp   failed     -  failed

2025-Mar-09 10:43:15 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 removed
#400 removed
#500 removed
#600 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-09 10:43:13
2025-Mar-09 10:43:16 :: [i586] #600 llama.cpp.git 4855-alt1: build start
2025-Mar-09 10:43:16 :: [aarch64] #600 llama.cpp.git 4855-alt1: build start
2025-Mar-09 10:43:16 :: [x86_64] #600 llama.cpp.git 4855-alt1: build start
2025-Mar-09 10:43:40 :: [i586] #600 llama.cpp.git 4855-alt1: build SKIPPED
[aarch64] find-requires: running scripts (cpp,debuginfo,files,lib,pam,perl,pkgconfig,pkgconfiglib,python,python3,rpmlib,shebang,shell,static,symlinks,systemd-services)
[aarch64] /usr/src/tmp/llama.cpp-buildroot/usr/include/ggml-cpp.h:4:2: error: #error "This header is for C++ only"
[aarch64]     4 | #error "This header is for C++ only"
[aarch64] /usr/src/tmp/llama.cpp-buildroot/usr/include/ggml-cpp.h:11:10: fatal error: memory: No such file or directory
[aarch64]    11 | #include <memory>
[aarch64] --
[aarch64]     /usr/share/llama.cpp/requirements/requirements-tool_bench.txt
[aarch64] error: File list check failed, terminating build
[aarch64] RPM build errors:
[aarch64]     Installed (but unpackaged) file(s) found:
[aarch64]     File list check failed, terminating build
2025-Mar-09 10:47:13 :: [aarch64] llama.cpp.git 4855-alt1: remote: build failed
2025-Mar-09 10:47:13 :: [aarch64] #600 llama.cpp.git 4855-alt1: build FAILED
2025-Mar-09 10:47:13 :: [aarch64] requesting cancellation of task processing
[x86_64] find-requires: running scripts (cpp,debuginfo,files,lib,pam,perl,pkgconfig,pkgconfiglib,python,python3,rpmlib,shebang,shell,static,symlinks,systemd-services)
[x86_64] /usr/src/tmp/llama.cpp-buildroot/usr/include/ggml-cpp.h:4:2: error: #error "This header is for C++ only"
[x86_64]     4 | #error "This header is for C++ only"
[x86_64] /usr/src/tmp/llama.cpp-buildroot/usr/include/ggml-cpp.h:11:10: fatal error: memory: No such file or directory
[x86_64]    11 | #include <memory>
[x86_64] --
[x86_64]     /usr/share/llama.cpp/requirements/requirements-tool_bench.txt
[x86_64] error: File list check failed, terminating build
[x86_64] RPM build errors:
[x86_64]     Installed (but unpackaged) file(s) found:
[x86_64]     File list check failed, terminating build
2025-Mar-09 10:49:21 :: [x86_64] llama.cpp.git 4855-alt1: remote: build failed
2025-Mar-09 10:49:21 :: [x86_64] #600 llama.cpp.git 4855-alt1: build FAILED
2025-Mar-09 10:47:13 :: [aarch64] build FAILED
2025-Mar-09 10:49:21 :: [x86_64] build FAILED
2025-Mar-09 10:49:21 :: task #377221 for sisyphus FAILED
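Two distinct problems are visible in try 6. The ggml-cpp.h messages come from the find-requires cpp script, which feeds installed headers through the C preprocessor; ggml-cpp.h is a C++-only header (it includes <memory>), so the check cannot parse it. The error that actually terminates the build is the file-list check: newly installed files such as /usr/share/llama.cpp/requirements/requirements-tool_bench.txt are not listed in any %files section. A hedged sketch of spec-side handling for both (the skiplist macro follows ALT rpm-build convention; the exact paths are assumptions):

    # keep the C-preprocessor-based requires check away from the
    # C++-only header (assumption: %add_findreq_skiplist is available)
    %add_findreq_skiplist %_includedir/ggml-cpp.h

    # then either package the newly installed files ...
    %files
    %_datadir/llama.cpp/requirements/
    # ... or remove them in %install if they are not meant to ship:
    # rm -rf %buildroot%_datadir/llama.cpp/requirements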



* [#377221] TESTED (try 7) llama.cpp.git=4855-alt1
  2025-03-08  2:04 [#377221] [test-only] FAILED llama.cpp.git=4855-alt1 Girar awaiter (vt)
                   ` (4 preceding siblings ...)
  2025-03-09 10:49 ` [#377221] [test-only] FAILED (try 6) llama.cpp.git=4855-alt1 Girar awaiter (vt)
@ 2025-03-09 11:35 ` Girar awaiter (vt)
  5 siblings, 0 replies; 7+ messages in thread
From: Girar awaiter (vt) @ 2025-03-09 11:35 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/377221/logs/events.7.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #700  llama.cpp     4:38     -    5:43

2025-Mar-09 11:24:33 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 removed
#400 removed
#500 removed
#600 removed
#700 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-09 11:24:31
2025-Mar-09 11:24:35 :: [aarch64] #700 llama.cpp.git 4855-alt1: build start
2025-Mar-09 11:24:35 :: [i586] #700 llama.cpp.git 4855-alt1: build start
2025-Mar-09 11:24:35 :: [x86_64] #700 llama.cpp.git 4855-alt1: build start
2025-Mar-09 11:24:46 :: [i586] #700 llama.cpp.git 4855-alt1: build SKIPPED
2025-Mar-09 11:29:13 :: [aarch64] #700 llama.cpp.git 4855-alt1: build OK
build/700/x86_64/log:[00:03:11] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/700/x86_64/log:[00:03:11] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-09 11:30:18 :: [x86_64] #700 llama.cpp.git 4855-alt1: build OK
2025-Mar-09 11:30:35 :: #700: llama.cpp.git 4855-alt1: build check OK
2025-Mar-09 11:30:36 :: build check OK
2025-Mar-09 11:30:56 :: noarch check OK
2025-Mar-09 11:30:58 :: plan: src +1 -1 =19921, aarch64 +6 -5 =34637, x86_64 +8 -5 =35440
#700 llama.cpp 3441-alt1 -> 1:4855-alt1
 Fri Mar 07 2025 Vitaly Chikunov <vt@altlinux> 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - Enable CUDA backend (for NVIDIA GPU) in llama.cpp-cuda package.
 - Disable BLAS backend (issues/12282).
2025-Mar-09 11:31:36 :: patched apt indices
2025-Mar-09 11:31:45 :: created next repo
2025-Mar-09 11:31:54 :: duplicate provides check OK
2025-Mar-09 11:32:31 :: dependencies check OK
2025-Mar-09 11:33:01 :: [x86_64 aarch64] ELF symbols check OK
2025-Mar-09 11:33:15 :: [x86_64] #700 libllama: install check OK
2025-Mar-09 11:33:23 :: [x86_64] #700 libllama-debuginfo: install check OK
2025-Mar-09 11:33:23 :: [aarch64] #700 libllama: install check OK
	x86_64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-09 11:33:30 :: [x86_64] #700 libllama-devel: install check OK
2025-Mar-09 11:33:36 :: [aarch64] #700 libllama-debuginfo: install check OK
	aarch64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-09 11:33:47 :: [aarch64] #700 libllama-devel: install check OK
2025-Mar-09 11:33:52 :: [x86_64] #700 llama.cpp: install check OK
2025-Mar-09 11:34:01 :: [x86_64] #700 llama.cpp-cpu: install check OK
2025-Mar-09 11:34:01 :: [aarch64] #700 llama.cpp: install check OK
2025-Mar-09 11:34:15 :: [aarch64] #700 llama.cpp-cpu: install check OK
2025-Mar-09 11:34:20 :: [x86_64] #700 llama.cpp-cpu-debuginfo: install check OK
2025-Mar-09 11:34:42 :: [x86_64] #700 llama.cpp-cuda: install check OK
2025-Mar-09 11:34:45 :: [aarch64] #700 llama.cpp-cpu-debuginfo: install check OK
2025-Mar-09 11:35:06 :: [x86_64] #700 llama.cpp-cuda-debuginfo: install check OK
2025-Mar-09 11:35:21 :: [x86_64-i586] generated apt indices
2025-Mar-09 11:35:21 :: [x86_64-i586] created next repo
2025-Mar-09 11:35:31 :: [x86_64-i586] dependencies check OK
2025-Mar-09 11:35:32 :: gears inheritance check OK
2025-Mar-09 11:35:33 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #700: llama.cpp: allowed for vt
2025-Mar-09 11:35:33 :: acl check OK
2025-Mar-09 11:35:44 :: created contents_index files
2025-Mar-09 11:35:52 :: created hash files: aarch64 src x86_64
2025-Mar-09 11:35:54 :: task #377221 for sisyphus TESTED
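For context on the try 7 changelog ("Enable CUDA backend (for NVIDIA GPU) in llama.cpp-cuda package", "Disable BLAS backend"), these correspond to llama.cpp's GGML_CUDA and GGML_BLAS CMake options, and the separate llama.cpp-cpu and llama.cpp-cuda install checks above show the backends shipped as distinct subpackages. A minimal sketch of the configure step implied by the changelog (the %cmake/%cmake_build macro names and anything beyond the two GGML options are assumptions):

    %build
    %cmake \
        -DGGML_CUDA=ON \
        -DGGML_BLAS=OFF
    %cmake_build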

