From: "Girar awaiter (vt)" <girar-builder@altlinux.org>
To: Vitaly Chikunov <vt@altlinux.org>
Cc: sisyphus-incominger@lists.altlinux.org, girar-builder-sisyphus@altlinux.org
Subject: [#383655] TESTED llama.cpp.git=5332-alt1
Date: Fri, 9 May 2025 22:33:52 +0000
Message-ID: <girar.task.383655.1.1@gyle.mskdc.altlinux.org> (raw)

https://git.altlinux.org/tasks/383655/logs/events.1.1.log
https://packages.altlinux.org/tasks/383655

subtask  name       aarch64  i586  x86_64
 #100    llama.cpp     9:43     -    7:20

2025-May-09 22:17:44 :: test-only task #383655 for sisyphus started by vt:
#100 build 5332-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-May-09 22:17:41
2025-May-09 22:17:47 :: [aarch64] #100 llama.cpp.git 5332-alt1: build start
2025-May-09 22:17:47 :: [i586] #100 llama.cpp.git 5332-alt1: build start
2025-May-09 22:17:47 :: [x86_64] #100 llama.cpp.git 5332-alt1: build start
2025-May-09 22:17:58 :: [i586] #100 llama.cpp.git 5332-alt1: build SKIPPED
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-May-09 22:25:07 :: [x86_64] #100 llama.cpp.git 5332-alt1: build OK
2025-May-09 22:27:30 :: [aarch64] #100 llama.cpp.git 5332-alt1: build OK
2025-May-09 22:27:50 :: #100: llama.cpp.git 5332-alt1: build check OK
2025-May-09 22:27:52 :: build check OK
2025-May-09 22:28:15 :: noarch check OK
2025-May-09 22:28:17 :: plan: src +1 -1 =20214, aarch64 +8 -6 =35105, x86_64 +10 -8 =35906
#100 llama.cpp 4855-alt1 -> 1:5332-alt1

Fri May 09 2025 Vitaly Chikunov <vt@altlinux> 1:5332-alt1
- Update to b5332 (2025-05-09).
- Enable Vulkan backend (for GPU) in llama.cpp-vulkan package.
2025-May-09 22:28:56 :: patched apt indices
2025-May-09 22:29:05 :: created next repo
2025-May-09 22:29:15 :: duplicate provides check OK
2025-May-09 22:29:52 :: dependencies check OK
2025-May-09 22:30:25 :: [x86_64 aarch64] ELF symbols check OK
2025-May-09 22:30:39 :: [x86_64] #100 libllama: install check OK
2025-May-09 22:30:47 :: [x86_64] #100 libllama-debuginfo: install check OK
2025-May-09 22:30:48 :: [aarch64] #100 libllama: install check OK
x86_64: libllama-devel=1:5332-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-May-09 22:30:54 :: [x86_64] #100 libllama-devel: install check OK
2025-May-09 22:31:01 :: [aarch64] #100 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:5332-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-May-09 22:31:12 :: [aarch64] #100 libllama-devel: install check OK
2025-May-09 22:31:18 :: [x86_64] #100 llama.cpp: install check OK
2025-May-09 22:31:27 :: [aarch64] #100 llama.cpp: install check OK
2025-May-09 22:31:27 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-May-09 22:31:41 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-May-09 22:31:47 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-May-09 22:32:11 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-May-09 22:32:11 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-May-09 22:32:26 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-May-09 22:32:37 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-May-09 22:32:43 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-May-09 22:32:46 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-May-09 22:32:58 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-May-09 22:33:16 :: [x86_64-i586] generated apt indices
2025-May-09 22:33:16 :: [x86_64-i586] created next repo
2025-May-09 22:33:26 :: [x86_64-i586] dependencies check OK
2025-May-09 22:33:28 :: gears inheritance check OK
2025-May-09 22:33:28 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: disapproved by vt
2025-May-09 22:33:29 :: acl check IGNORED
2025-May-09 22:33:41 :: created contents_index files
2025-May-09 22:33:49 :: created hash files: aarch64 src x86_64
2025-May-09 22:33:52 :: task #383655 for sisyphus TESTED
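The install check above flags /usr/lib64/cmake as an unowned directory left behind by libllama-devel. The usual remedy is to have the package's %files list claim the directory explicitly. The fragment below is only a hypothetical sketch: the actual spec layout and subdirectory name under /usr/lib64/cmake are not shown in this report, so both are assumptions.

```spec
# Hypothetical %files fragment for libllama-devel (not taken from the
# actual llama.cpp spec). %dir claims the directory itself so it is no
# longer reported as unowned after install.
%files -n libllama-devel
%dir %{_libdir}/cmake
%{_libdir}/cmake/llama/
```

Alternatively, the package could depend on a common package that already owns %{_libdir}/cmake, if the repository provides one.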
Thread overview: 3+ messages
2025-05-09 22:33 [#383655] TESTED llama.cpp.git=5332-alt1 Girar awaiter (vt) [this message]
2025-05-10  0:09 ` [#383655] TESTED (try 2) llama.cpp.git=5332-alt1 Girar awaiter (vt)
2025-05-10  0:59 ` [#383655] DONE (try 3) llama.cpp.git=5332-alt1 Girar pender (vt)