From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
X-Spam-Checker-Version: SpamAssassin 3.4.1 (2015-04-28) on sa.local.altlinux.org
X-Spam-Level:
X-Spam-Status: No, score=-3.3 required=5.0 tests=BAYES_00,RP_MATCHES_RCVD autolearn=unavailable autolearn_force=no version=3.4.1
Date: Sat, 6 Sep 2025 06:25:35 +0000
From: "Girar awaiter (vt)"
To: Vitaly Chikunov
Subject: [#394153] TESTED llama.cpp.git=6397-alt1
Message-ID:
Mail-Followup-To: Girar awaiter robot
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
X-girar-task-id: 394153
X-girar-task-owner: vt
X-girar-task-repo: sisyphus
X-girar-task-try: 1
X-girar-task-iter: 1
X-girar-task-status: TESTED
X-girar-task-URL: https://git.altlinux.org/tasks/394153/
X-girar-task-log: logs/events.1.1.log
X-girar-task-summary: [#394153] TESTED llama.cpp.git=6397-alt1
User-Agent: Mutt/1.10.1 (2018-07-13)
Cc: sisyphus-incominger@lists.altlinux.org, girar-builder-sisyphus@altlinux.org
X-BeenThere: sisyphus-incominger@lists.altlinux.org
X-Mailman-Version: 2.1.12
Precedence: list
Reply-To: ALT Devel discussion list
List-Id: ALT Linux Girar Builder robot reports
List-Unsubscribe: ,
List-Archive:
List-Post:
List-Help:
List-Subscribe: ,
X-List-Received-Date: Sat, 06 Sep 2025 06:25:39 -0000
Archived-At:
List-Archive: https://git.altlinux.org/tasks/394153/logs/events.1.1.log

https://packages.altlinux.org/tasks/394153

 subtask  name       aarch64  i586  x86_64
  #100    llama.cpp     8:02     -    7:28

2025-Sep-06 06:11:23 :: test-only task #394153 for sisyphus started by vt:
#100 build 6397-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Sep-06 06:11:21
2025-Sep-06 06:11:25 :: [x86_64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-06 06:11:25 :: [i586] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-06 06:11:25 :: [aarch64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-06 06:11:38 :: [i586] #100 llama.cpp.git 6397-alt1: build SKIPPED
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Sep-06 06:18:53 :: [x86_64] #100 llama.cpp.git 6397-alt1: build OK
2025-Sep-06 06:19:27 :: [aarch64] #100 llama.cpp.git 6397-alt1: build OK
2025-Sep-06 06:19:51 :: #100: llama.cpp.git 6397-alt1: build check OK
2025-Sep-06 06:19:52 :: build check OK
2025-Sep-06 06:20:04 :: noarch check OK
2025-Sep-06 06:20:06 :: plan: src +1 -1 =20597, aarch64 +9 -8 =36404, x86_64 +11 -10 =37232

#100 llama.cpp 6121-alt1 -> 1:6397-alt1
 Sat Sep 06 2025 Vitaly Chikunov 1:6397-alt1
 - Update to b6397 (2025-09-06).
 - Python-based model conversion scripts are sub-packaged. Note that they
   are not supported and are provided as-is.

2025-Sep-06 06:20:45 :: patched apt indices
2025-Sep-06 06:20:54 :: created next repo
2025-Sep-06 06:21:03 :: duplicate provides check OK
2025-Sep-06 06:21:40 :: dependencies check OK
2025-Sep-06 06:22:10 :: [x86_64 aarch64] ELF symbols check OK
2025-Sep-06 06:22:25 :: [x86_64] #100 libllama: install check OK
2025-Sep-06 06:22:33 :: [x86_64] #100 libllama-debuginfo: install check OK
2025-Sep-06 06:22:33 :: [aarch64] #100 libllama: install check OK
x86_64: libllama-devel=1:6397-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Sep-06 06:22:40 :: [x86_64] #100 libllama-devel: install check OK
2025-Sep-06 06:22:46 :: [aarch64] #100 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:6397-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Sep-06 06:22:57 :: [aarch64] #100 libllama-devel: install check OK
2025-Sep-06 06:23:05 :: [x86_64] #100 llama.cpp: install check OK
2025-Sep-06 06:23:12 :: [aarch64] #100 llama.cpp: install check OK
2025-Sep-06 06:23:13 :: [x86_64] #100 llama.cpp-convert: install check OK
2025-Sep-06 06:23:22 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-Sep-06 06:23:26 :: [aarch64] #100 llama.cpp-convert: install check OK
2025-Sep-06 06:23:36 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Sep-06 06:23:39 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-Sep-06 06:23:59 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-Sep-06 06:24:00 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Sep-06 06:24:14 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-Sep-06 06:24:24 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-Sep-06 06:24:31 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Sep-06 06:24:33 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-Sep-06 06:24:44 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Sep-06 06:25:00 :: [x86_64-i586] generated apt indices
2025-Sep-06 06:25:00 :: [x86_64-i586] created next repo
2025-Sep-06 06:25:11 :: [x86_64-i586] dependencies check OK
2025-Sep-06 06:25:12 :: gears inheritance check OK
2025-Sep-06 06:25:12 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Sep-06 06:25:13 :: acl check OK
2025-Sep-06 06:25:25 :: created contents_index files
2025-Sep-06 06:25:32 :: created hash files: aarch64 src x86_64
2025-Sep-06 06:25:35 :: task #394153 for sisyphus TESTED