From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
X-Spam-Checker-Version: SpamAssassin 3.4.1 (2015-04-28) on sa.local.altlinux.org
X-Spam-Level:
X-Spam-Status: No, score=-3.3 required=5.0 tests=BAYES_00,RP_MATCHES_RCVD
	autolearn=unavailable autolearn_force=no version=3.4.1
Date: Fri, 21 Nov 2025 21:47:52 +0000
From: "Girar awaiter (vt)"
To: Vitaly Chikunov
Subject: [#400709] TESTED (try 3) llama.cpp.git=7127-alt1
Message-ID:
Mail-Followup-To: Girar awaiter robot
References:
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
In-Reply-To:
X-girar-task-id: 400709
X-girar-task-owner: vt
X-girar-task-repo: sisyphus
X-girar-task-try: 3
X-girar-task-iter: 1
X-girar-task-status: TESTED
X-girar-task-URL: https://git.altlinux.org/tasks/400709/
X-girar-task-log: logs/events.3.1.log
X-girar-task-summary: [#400709] TESTED (try 3) llama.cpp.git=7127-alt1
User-Agent: Mutt/1.10.1 (2018-07-13)
Cc: sisyphus-incominger@lists.altlinux.org, girar-builder-sisyphus@altlinux.org
X-BeenThere: sisyphus-incominger@lists.altlinux.org
X-Mailman-Version: 2.1.12
Precedence: list
Reply-To: ALT Devel discussion list
List-Id: ALT Linux Girar Builder robot reports
List-Unsubscribe: ,
List-Archive:
List-Post:
List-Help:
List-Subscribe: ,
X-List-Received-Date: Fri, 21 Nov 2025 21:47:55 -0000
Archived-At:
List-Archive:

https://git.altlinux.org/tasks/400709/logs/events.3.1.log
https://packages.altlinux.org/tasks/400709

 subtask  name       aarch64  i586  x86_64
  #300    llama.cpp     8:11     -    7:54

2025-Nov-21 21:33:25 :: test-only task #400709 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 build 7127-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Nov-21 21:33:22
2025-Nov-21 21:33:26 :: [i586] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:33:26 :: [x86_64] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:33:26 :: [aarch64] #300 llama.cpp.git 7127-alt1: build start
2025-Nov-21 21:33:37 :: [i586] #300 llama.cpp.git 7127-alt1: build SKIPPED
build/300/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/300/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Nov-21 21:41:20 :: [x86_64] #300 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 21:41:37 :: [aarch64] #300 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 21:41:45 :: 300: build check OK
2025-Nov-21 21:41:47 :: build check OK
2025-Nov-21 21:42:00 :: #300: llama.cpp.git 7127-alt1: version check OK
2025-Nov-21 21:42:00 :: build version check OK
2025-Nov-21 21:42:13 :: noarch check OK
2025-Nov-21 21:42:15 :: plan: src +1 -1 =21047, aarch64 +8 -9 =37062, x86_64 +10 -11 =37900

#300 llama.cpp 6869-alt1 -> 1:7127-alt1
 Fri Nov 21 2025 Vitaly Chikunov 1:7127-alt1
 - Update to b7127 (2025-11-21).
 - spec: Remove llama.cpp-convert package.
 - model: detect GigaChat3-10-A1.8B as deepseek lite.

2025-Nov-21 21:42:58 :: patched apt indices
2025-Nov-21 21:43:08 :: created next repo
2025-Nov-21 21:43:18 :: duplicate provides check OK
2025-Nov-21 21:43:58 :: dependencies check OK
2025-Nov-21 21:44:33 :: [x86_64 aarch64] ELF symbols check OK
2025-Nov-21 21:44:48 :: [x86_64] #300 libllama: install check OK
2025-Nov-21 21:44:56 :: [x86_64] #300 libllama-debuginfo: install check OK
2025-Nov-21 21:44:57 :: [aarch64] #300 libllama: install check OK
x86_64: libllama-devel=1:7127-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Nov-21 21:45:03 :: [x86_64] #300 libllama-devel: install check OK
2025-Nov-21 21:45:11 :: [aarch64] #300 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:7127-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Nov-21 21:45:23 :: [aarch64] #300 libllama-devel: install check OK
2025-Nov-21 21:45:28 :: [x86_64] #300 llama.cpp: install check OK
2025-Nov-21 21:45:36 :: [x86_64] #300 llama.cpp-cpu: install check OK
2025-Nov-21 21:45:37 :: [aarch64] #300 llama.cpp: install check OK
2025-Nov-21 21:45:50 :: [x86_64] #300 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 21:45:51 :: [aarch64] #300 llama.cpp-cpu: install check OK
2025-Nov-21 21:46:13 :: [aarch64] #300 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 21:46:14 :: [x86_64] #300 llama.cpp-cuda: install check OK
2025-Nov-21 21:46:27 :: [aarch64] #300 llama.cpp-vulkan: install check OK
2025-Nov-21 21:46:39 :: [x86_64] #300 llama.cpp-cuda-debuginfo: install check OK
2025-Nov-21 21:46:45 :: [aarch64] #300 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 21:46:48 :: [x86_64] #300 llama.cpp-vulkan: install check OK
2025-Nov-21 21:46:59 :: [x86_64] #300 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 21:47:16 :: [x86_64-i586] generated apt indices
2025-Nov-21 21:47:16 :: [x86_64-i586] created next repo
2025-Nov-21 21:47:27 :: [x86_64-i586] dependencies check OK
2025-Nov-21 21:47:28 :: gears inheritance check OK
2025-Nov-21 21:47:28 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #300: llama.cpp: allowed for vt
2025-Nov-21 21:47:29 :: acl check OK
2025-Nov-21 21:47:41 :: created contents_index files
2025-Nov-21 21:47:49 :: created hash files: aarch64 src x86_64
2025-Nov-21 21:47:52 :: task #400709 for sisyphus TESTED
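
[Editor's note] The install check flags "post-install unowned files: /usr/lib64/cmake" on both architectures: libllama-devel installs files under /usr/lib64/cmake without any package owning that directory. A minimal sketch of one common fix in the package's %files section; the subpackage name is taken from the log, but the exact subdirectory layout under %{_libdir}/cmake is an assumption, not read from the actual spec:

```
# %files section of libllama-devel (hypothetical excerpt).
# Claim ownership of the directory itself, not only the files
# inside it, so the "unowned files" install check passes.
%files -n libllama-devel
%dir %{_libdir}/cmake
%{_libdir}/cmake/llama/
```

Alternatively, the -devel subpackage could Require a package that already owns %{_libdir}/cmake (on many distributions that is cmake itself); owning the directory directly avoids pulling in that dependency.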