From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
X-Spam-Checker-Version: SpamAssassin 3.4.1 (2015-04-28) on sa.local.altlinux.org
X-Spam-Level:
X-Spam-Status: No, score=-3.3 required=5.0 tests=BAYES_00,RP_MATCHES_RCVD autolearn=unavailable autolearn_force=no version=3.4.1
Date: Mon, 10 Mar 2025 02:45:13 +0000
From: "Girar pender (vt)"
To: Vitaly Chikunov
Subject: [#377221] DONE (try 9) llama.cpp.git=4855-alt1
Message-ID:
Mail-Followup-To: Girar pender robot
References:
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
In-Reply-To:
X-girar-task-id: 377221
X-girar-task-owner: vt
X-girar-task-repo: sisyphus
X-girar-task-try: 9
X-girar-task-iter: 1
X-girar-task-status: DONE
X-girar-task-URL: https://git.altlinux.org/tasks/archive/done/_368/377221/
X-girar-task-log: logs/events.9.1.log
X-girar-task-summary: [#377221] DONE (try 9) llama.cpp.git=4855-alt1
User-Agent: Mutt/1.10.1 (2018-07-13)
Cc: sisyphus-incominger@lists.altlinux.org, girar-builder-sisyphus@altlinux.org
X-BeenThere: sisyphus-incominger@lists.altlinux.org
X-Mailman-Version: 2.1.12
Precedence: list
Reply-To: ALT Devel discussion list
List-Id: ALT Linux Girar Builder robot reports
List-Unsubscribe: ,
List-Archive:
List-Post:
List-Help:
List-Subscribe: ,
X-List-Received-Date: Mon, 10 Mar 2025 02:45:16 -0000
Archived-At:
List-Archive:

https://git.altlinux.org/tasks/archive/done/_368/377221/logs/events.9.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
 #1100   llama.cpp     4:48     -    7:06

2025-Mar-10 02:31:08 :: task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 removed
#400 removed
#500 removed
#600 removed
#700 removed
#1000 removed
#1100 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-10 02:31:06
2025-Mar-10 02:31:10 :: [x86_64] #1100 llama.cpp.git 4855-alt1: build start
2025-Mar-10 02:31:10 :: [i586] #1100 llama.cpp.git 4855-alt1: build start
2025-Mar-10 02:31:10 :: [aarch64] #1100 llama.cpp.git 4855-alt1: build start
2025-Mar-10 02:31:22 :: [i586] #1100 llama.cpp.git 4855-alt1: build SKIPPED
2025-Mar-10 02:35:58 :: [aarch64] #1100 llama.cpp.git 4855-alt1: build OK
build/1100/x86_64/log:[00:04:00] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/1100/x86_64/log:[00:04:00] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-10 02:38:16 :: [x86_64] #1100 llama.cpp.git 4855-alt1: build OK
2025-Mar-10 02:38:33 :: #1100: llama.cpp.git 4855-alt1: build check OK
2025-Mar-10 02:38:34 :: build check OK
2025-Mar-10 02:38:55 :: noarch check OK
2025-Mar-10 02:38:57 :: plan: src +1 -1 =19927, aarch64 +6 -5 =34648, x86_64 +8 -5 =35451

#1100 llama.cpp 3441-alt1 -> 1:4855-alt1
 Mon Mar 10 2025 Vitaly Chikunov 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - Enable CUDA backend (for NVIDIA GPU) in llama.cpp-cuda package.
 - Disable BLAS backend (issues/12282).
 - Install bash-completions.

2025-Mar-10 02:39:35 :: patched apt indices
2025-Mar-10 02:39:43 :: created next repo
2025-Mar-10 02:39:54 :: duplicate provides check OK
2025-Mar-10 02:40:31 :: dependencies check OK
2025-Mar-10 02:41:03 :: [x86_64 aarch64] ELF symbols check OK
2025-Mar-10 02:41:20 :: [x86_64] #1100 libllama: install check OK
2025-Mar-10 02:41:26 :: [aarch64] #1100 libllama: install check OK
2025-Mar-10 02:41:35 :: [x86_64] #1100 libllama-debuginfo: install check OK
2025-Mar-10 02:41:38 :: [aarch64] #1100 libllama-debuginfo: install check OK
x86_64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-10 02:41:48 :: [x86_64] #1100 libllama-devel: install check OK
aarch64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-10 02:41:49 :: [aarch64] #1100 libllama-devel: install check OK
2025-Mar-10 02:42:03 :: [aarch64] #1100 llama.cpp: install check OK
2025-Mar-10 02:42:18 :: [aarch64] #1100 llama.cpp-cpu: install check OK
2025-Mar-10 02:42:21 :: [x86_64] #1100 llama.cpp: install check OK
2025-Mar-10 02:42:36 :: [x86_64] #1100 llama.cpp-cpu: install check OK
2025-Mar-10 02:42:47 :: [aarch64] #1100 llama.cpp-cpu-debuginfo: install check OK
2025-Mar-10 02:42:57 :: [x86_64] #1100 llama.cpp-cpu-debuginfo: install check OK
2025-Mar-10 02:43:21 :: [x86_64] #1100 llama.cpp-cuda: install check OK
2025-Mar-10 02:43:47 :: [x86_64] #1100 llama.cpp-cuda-debuginfo: install check OK
2025-Mar-10 02:44:03 :: [x86_64-i586] generated apt indices
2025-Mar-10 02:44:03 :: [x86_64-i586] created next repo
2025-Mar-10 02:44:14 :: [x86_64-i586] dependencies check OK
2025-Mar-10 02:44:15 :: gears inheritance check OK
2025-Mar-10 02:44:15 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #1100: llama.cpp: allowed for vt
2025-Mar-10 02:44:16 :: acl check OK
2025-Mar-10 02:44:28 :: created contents_index files
2025-Mar-10 02:44:35 :: created hash files: aarch64 src x86_64
2025-Mar-10 02:44:38 :: task #377221 for sisyphus TESTED
2025-Mar-10 02:44:38 :: task is ready for commit
2025-Mar-10 02:44:43 :: repo clone OK
2025-Mar-10 02:44:43 :: packages update OK
2025-Mar-10 02:44:48 :: [x86_64 aarch64] update OK
2025-Mar-10 02:44:48 :: repo update OK
2025-Mar-10 02:44:58 :: repo save OK
2025-Mar-10 02:44:58 :: src index update OK
2025-Mar-10 02:45:02 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2025-Mar-10 02:45:13 :: gears update OK
2025-Mar-10 02:45:13 :: task #377221 for sisyphus DONE