ALT Linux Girar Builder robot reports
* [#383621] TESTED llama.cpp.git=5318-alt1
@ 2025-05-09  1:42 Girar awaiter (vt)
  2025-05-09  5:09 ` [#383621] TESTED (try 2) llama.cpp.git=5318-alt1 Girar awaiter (vt)
From: Girar awaiter (vt) @ 2025-05-09  1:42 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/383621/logs/events.1.1.log
https://packages.altlinux.org/tasks/383621

subtask  name       aarch64  i586  x86_64
   #100  llama.cpp     9:40     -    7:30

2025-May-09 01:26:17 :: test-only task #383621 for sisyphus started by vt:
#100 build 5318-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-May-09 01:26:14
2025-May-09 01:26:19 :: [i586] #100 llama.cpp.git 5318-alt1: build start
2025-May-09 01:26:19 :: [x86_64] #100 llama.cpp.git 5318-alt1: build start
2025-May-09 01:26:19 :: [aarch64] #100 llama.cpp.git 5318-alt1: build start
2025-May-09 01:26:38 :: [i586] #100 llama.cpp.git 5318-alt1: build SKIPPED
build/100/x86_64/log:[00:03:48] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:03:48] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-May-09 01:33:49 :: [x86_64] #100 llama.cpp.git 5318-alt1: build OK
2025-May-09 01:35:59 :: [aarch64] #100 llama.cpp.git 5318-alt1: build OK
2025-May-09 01:36:19 :: #100: llama.cpp.git 5318-alt1: build check OK
2025-May-09 01:36:21 :: build check OK
2025-May-09 01:36:47 :: noarch check OK
2025-May-09 01:36:48 :: plan: src +1 -1 =20210, aarch64 +8 -6 =35092, x86_64 +10 -8 =35893
#100 llama.cpp 4855-alt1 -> 1:5318-alt1
 Fri May 09 2025 Vitaly Chikunov <vt@altlinux> 1:5318-alt1
 - Update to b5318 (2025-05-08).
 - Enable Vulkan backend (for GPU) in llama.cpp-vulkan package.
2025-May-09 01:37:36 :: patched apt indices
2025-May-09 01:37:45 :: created next repo
2025-May-09 01:37:56 :: duplicate provides check OK
2025-May-09 01:38:37 :: dependencies check OK
2025-May-09 01:39:11 :: [x86_64 aarch64] ELF symbols check OK
2025-May-09 01:39:26 :: [x86_64] #100 libllama: install check OK
2025-May-09 01:39:34 :: [aarch64] #100 libllama: install check OK
2025-May-09 01:39:34 :: [x86_64] #100 libllama-debuginfo: install check OK
	x86_64: libllama-devel=1:5318-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-May-09 01:39:41 :: [x86_64] #100 libllama-devel: install check OK
2025-May-09 01:39:47 :: [aarch64] #100 libllama-debuginfo: install check OK
	aarch64: libllama-devel=1:5318-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-May-09 01:39:58 :: [aarch64] #100 libllama-devel: install check OK
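The `post-install unowned files: /usr/lib64/cmake` notices above typically mean the package installs files under /usr/lib64/cmake without declaring ownership of that directory in its `%files` list. A minimal sketch of a spec-file fix, assuming the subpackage layout (the `llama` subdirectory name and surrounding entries are hypothetical; only the unowned directory path comes from the log):

```spec
# Hypothetical %files fragment for libllama-devel: declare the cmake
# directory itself with %dir so the post-install check finds no unowned
# files. The llama/ subdirectory name is an assumption, not from the log.
%files -n libllama-devel
%dir %{_libdir}/cmake
%{_libdir}/cmake/llama/
```

Listing the directory with `%dir` makes the package (or a dependency that already owns it) responsible for /usr/lib64/cmake, so it is cleanly removed when the last owning package is erased.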
2025-May-09 01:40:06 :: [x86_64] #100 llama.cpp: install check OK
2025-May-09 01:40:13 :: [aarch64] #100 llama.cpp: install check OK
2025-May-09 01:40:16 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-May-09 01:40:27 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-May-09 01:40:37 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-May-09 01:40:57 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-May-09 01:41:03 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-May-09 01:41:12 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-May-09 01:41:29 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-May-09 01:41:30 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-May-09 01:41:40 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-May-09 01:41:51 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-May-09 01:42:10 :: [x86_64-i586] generated apt indices
2025-May-09 01:42:10 :: [x86_64-i586] created next repo
2025-May-09 01:42:22 :: [x86_64-i586] dependencies check OK
2025-May-09 01:42:23 :: gears inheritance check OK
2025-May-09 01:42:24 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: disapproved by vt
2025-May-09 01:42:25 :: acl check IGNORED
2025-May-09 01:42:37 :: created contents_index files
2025-May-09 01:42:45 :: created hash files: aarch64 src x86_64
2025-May-09 01:42:48 :: task #383621 for sisyphus TESTED



* [#383621] TESTED (try 2) llama.cpp.git=5318-alt1
  2025-05-09  1:42 [#383621] TESTED llama.cpp.git=5318-alt1 Girar awaiter (vt)
@ 2025-05-09  5:09 ` Girar awaiter (vt)
From: Girar awaiter (vt) @ 2025-05-09  5:09 UTC (permalink / raw)
  To: Vitaly Chikunov; +Cc: sisyphus-incominger, girar-builder-sisyphus

https://git.altlinux.org/tasks/383621/logs/events.2.1.log
https://packages.altlinux.org/tasks/383621

subtask  name       aarch64  i586  x86_64
   #200  llama.cpp     9:47     -    8:13

2025-May-09 04:52:50 :: test-only task #383621 for sisyphus resumed by vt:
#100 removed
#200 build 5318-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-May-09 04:52:48
2025-May-09 04:52:53 :: [aarch64] #200 llama.cpp.git 5318-alt1: build start
2025-May-09 04:52:53 :: [i586] #200 llama.cpp.git 5318-alt1: build start
2025-May-09 04:52:53 :: [x86_64] #200 llama.cpp.git 5318-alt1: build start
2025-May-09 04:53:07 :: [i586] #200 llama.cpp.git 5318-alt1: build SKIPPED
build/200/x86_64/log:[00:04:11] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/200/x86_64/log:[00:04:11] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-May-09 05:01:06 :: [x86_64] #200 llama.cpp.git 5318-alt1: build OK
2025-May-09 05:02:40 :: [aarch64] #200 llama.cpp.git 5318-alt1: build OK
2025-May-09 05:03:02 :: #200: llama.cpp.git 5318-alt1: build check OK
2025-May-09 05:03:04 :: build check OK
2025-May-09 05:03:30 :: noarch check OK
2025-May-09 05:03:32 :: plan: src +1 -1 =20210, aarch64 +8 -6 =35092, x86_64 +10 -8 =35893
#200 llama.cpp 4855-alt1 -> 1:5318-alt1
 Fri May 09 2025 Vitaly Chikunov <vt@altlinux> 1:5318-alt1
 - Update to b5318 (2025-05-08).
 - Enable Vulkan backend (for GPU) in llama.cpp-vulkan package.
2025-May-09 05:04:16 :: patched apt indices
2025-May-09 05:04:25 :: created next repo
2025-May-09 05:04:37 :: duplicate provides check OK
2025-May-09 05:05:19 :: dependencies check OK
2025-May-09 05:05:57 :: [x86_64 aarch64] ELF symbols check OK
2025-May-09 05:06:12 :: [x86_64] #200 libllama: install check OK
2025-May-09 05:06:20 :: [aarch64] #200 libllama: install check OK
2025-May-09 05:06:21 :: [x86_64] #200 libllama-debuginfo: install check OK
	x86_64: libllama-devel=1:5318-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-May-09 05:06:29 :: [x86_64] #200 libllama-devel: install check OK
2025-May-09 05:06:33 :: [aarch64] #200 libllama-debuginfo: install check OK
	aarch64: libllama-devel=1:5318-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-May-09 05:06:44 :: [aarch64] #200 libllama-devel: install check OK
2025-May-09 05:06:58 :: [aarch64] #200 llama.cpp: install check OK
2025-May-09 05:07:07 :: [x86_64] #200 llama.cpp: install check OK
2025-May-09 05:07:13 :: [aarch64] #200 llama.cpp-cpu: install check OK
2025-May-09 05:07:19 :: [x86_64] #200 llama.cpp-cpu: install check OK
2025-May-09 05:07:40 :: [x86_64] #200 llama.cpp-cpu-debuginfo: install check OK
2025-May-09 05:07:42 :: [aarch64] #200 llama.cpp-cpu-debuginfo: install check OK
2025-May-09 05:07:57 :: [aarch64] #200 llama.cpp-vulkan: install check OK
2025-May-09 05:08:06 :: [x86_64] #200 llama.cpp-cuda: install check OK
2025-May-09 05:08:15 :: [aarch64] #200 llama.cpp-vulkan-debuginfo: install check OK
2025-May-09 05:08:33 :: [x86_64] #200 llama.cpp-cuda-debuginfo: install check OK
2025-May-09 05:08:43 :: [x86_64] #200 llama.cpp-vulkan: install check OK
2025-May-09 05:08:55 :: [x86_64] #200 llama.cpp-vulkan-debuginfo: install check OK
2025-May-09 05:09:14 :: [x86_64-i586] generated apt indices
2025-May-09 05:09:14 :: [x86_64-i586] created next repo
2025-May-09 05:09:26 :: [x86_64-i586] dependencies check OK
2025-May-09 05:09:28 :: gears inheritance check OK
2025-May-09 05:09:28 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #200: llama.cpp: allowed for vt
2025-May-09 05:09:29 :: acl check OK
2025-May-09 05:09:41 :: created contents_index files
2025-May-09 05:09:49 :: created hash files: aarch64 src x86_64
2025-May-09 05:09:52 :: task #383621 for sisyphus TESTED


