Date: Sat, 24 Jan 2026 02:30:49 +0000
From: "Girar awaiter (vt)"
To: Vitaly Chikunov
Subject: [#405966] TESTED llama.cpp.git=7819-alt1
Mail-Followup-To: Girar awaiter robot
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
X-girar-task-id: 405966
X-girar-task-owner: vt
X-girar-task-repo: sisyphus
X-girar-task-try: 1
X-girar-task-iter: 1
X-girar-task-status: TESTED
X-girar-task-URL: https://git.altlinux.org/tasks/405966/
X-girar-task-log: logs/events.1.1.log
X-girar-task-summary: [#405966] TESTED llama.cpp.git=7819-alt1
User-Agent: Mutt/1.10.1 (2018-07-13)
Cc: sisyphus-incominger@lists.altlinux.org, girar-builder-sisyphus@altlinux.org
Reply-To: ALT Devel discussion list
List-Id: ALT Linux Girar Builder robot reports

https://git.altlinux.org/tasks/405966/logs/events.1.1.log
https://packages.altlinux.org/tasks/405966

 subtask  name       aarch64  i586  x86_64
    #100  llama.cpp    10:01     -   11:49

2026-Jan-24 02:12:01 :: test-only task #405966 for sisyphus started by vt:
#100 build 7819-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Jan-24 02:11:58
2026-Jan-24 02:12:03 :: [aarch64] #100 llama.cpp.git 7819-alt1: build start
2026-Jan-24 02:12:03 :: [x86_64] #100 llama.cpp.git 7819-alt1: build start
2026-Jan-24 02:12:03 :: [i586] #100 llama.cpp.git 7819-alt1: build start
2026-Jan-24 02:12:15 :: [i586] #100 llama.cpp.git 7819-alt1: build SKIPPED
2026-Jan-24 02:22:04 :: [aarch64] #100 llama.cpp.git 7819-alt1: build OK
build/100/x86_64/log:[00:07:19] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:07:19] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Jan-24 02:23:52 :: [x86_64] #100 llama.cpp.git 7819-alt1: build OK
2026-Jan-24 02:24:00 :: 100: build check OK
2026-Jan-24 02:24:02 :: build check OK
2026-Jan-24 02:24:16 :: #100: llama.cpp.git 7819-alt1: version check OK
2026-Jan-24 02:24:17 :: build version check OK
--- llama.cpp-cpu-7819-alt1.x86_64.rpm.share 2026-01-24 02:24:20.197980156 +0000
+++ llama.cpp-cpu-7819-alt1.aarch64.rpm.share 2026-01-24 02:24:21.248989789 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md 100644 UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text
 /usr/share/doc/llama.cpp/docs 40755 directory
warning (#100): non-identical /usr/share part
2026-Jan-24 02:24:34 :: noarch check OK
2026-Jan-24 02:24:36 :: plan: src +1 -1 =21507, aarch64 +8 -8 =38069, x86_64 +10 -10 =39088
#100 llama.cpp 7388-alt1 -> 1:7819-alt1
 Sat Jan 24 2026 Vitaly Chikunov 1:7819-alt1
 - Update to b7819 (2026-01-23).
 - Responses API support (partial).
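The changelog entry above mentions partial Responses API support in the new llama.cpp build. A minimal sketch of exercising that feature against a locally running llama-server, assuming the server listens on localhost:8080 and exposes an OpenAI-style /v1/responses route (the port, path, model name, and request fields here are assumptions, not taken from this report):

    import json
    import urllib.request

    # Assumed endpoint of a locally running llama-server instance.
    URL = "http://localhost:8080/v1/responses"

    # Request body in the OpenAI Responses API shape; support is marked
    # "partial" in the changelog, so some fields may be ignored or rejected.
    payload = {
        "model": "default",  # placeholder; a single-model server may ignore it
        "input": "Say hello in one short sentence.",
    }

    req = urllib.request.Request(
        URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.dumps(json.load(resp), indent=2))

Because the support is partial, printing the raw JSON reply rather than assuming a fixed response layout is the safer way to see what this build actually returns.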
2026-Jan-24 02:25:19 :: patched apt indices
2026-Jan-24 02:25:27 :: created next repo
2026-Jan-24 02:25:37 :: duplicate provides check OK
2026-Jan-24 02:26:16 :: dependencies check OK
2026-Jan-24 02:26:50 :: [x86_64 aarch64] ELF symbols check OK
2026-Jan-24 02:27:07 :: [x86_64] #100 libllama: install check OK
2026-Jan-24 02:27:14 :: [aarch64] #100 libllama: install check OK
2026-Jan-24 02:27:16 :: [x86_64] #100 libllama-debuginfo: install check OK
x86_64: libllama-devel=1:7819-alt1 post-install unowned files: /usr/lib64/cmake
2026-Jan-24 02:27:24 :: [x86_64] #100 libllama-devel: install check OK
2026-Jan-24 02:27:27 :: [aarch64] #100 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:7819-alt1 post-install unowned files: /usr/lib64/cmake
2026-Jan-24 02:27:39 :: [aarch64] #100 libllama-devel: install check OK
2026-Jan-24 02:27:53 :: [aarch64] #100 llama.cpp: install check OK
2026-Jan-24 02:27:54 :: [x86_64] #100 llama.cpp: install check OK
2026-Jan-24 02:28:06 :: [aarch64] #100 llama.cpp-cpu: install check OK
2026-Jan-24 02:28:06 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Jan-24 02:28:26 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Jan-24 02:28:27 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Jan-24 02:28:41 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2026-Jan-24 02:28:58 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Jan-24 02:28:59 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Jan-24 02:29:32 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Jan-24 02:29:42 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Jan-24 02:29:54 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Jan-24 02:30:12 :: [x86_64-i586] generated apt indices
2026-Jan-24 02:30:12 :: [x86_64-i586] created next repo
2026-Jan-24 02:30:23 :: [x86_64-i586] dependencies check OK
2026-Jan-24 02:30:25 :: gears inheritance check OK
2026-Jan-24 02:30:25 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Jan-24 02:30:26 :: acl check OK
2026-Jan-24 02:30:38 :: created contents_index files
2026-Jan-24 02:30:46 :: created hash files: aarch64 src x86_64
2026-Jan-24 02:30:49 :: task #405966 for sisyphus TESTED
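Regarding the "post-install unowned files: /usr/lib64/cmake" lines above: libllama-devel installs files under /usr/lib64/cmake, but no installed package owns that directory itself. A rough Python illustration of what an unowned-directory check looks for (this is not the actual girar check, and the CMake file names below are hypothetical):

    import os

    # Toy data: paths owned by installed packages. Note that
    # /usr/lib64/cmake itself is missing from the set.
    owned = {
        "/usr",
        "/usr/lib64",
        "/usr/lib64/cmake/llama",
        "/usr/lib64/cmake/llama/llama-config.cmake",
    }

    def unowned_parents(path, owned):
        """Yield every parent directory of `path` that no package owns."""
        parent = os.path.dirname(path)
        while parent not in ("/", ""):
            if parent not in owned:
                yield parent
            parent = os.path.dirname(parent)

    print(sorted(unowned_parents(
        "/usr/lib64/cmake/llama/llama-config.cmake", owned)))
    # -> ['/usr/lib64/cmake']

The usual remedy is either to own the directory in the -devel subpackage's file list or to depend on a package that already owns it.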