From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
X-Spam-Checker-Version: SpamAssassin 3.4.1 (2015-04-28) on sa.local.altlinux.org
X-Spam-Level:
X-Spam-Status: No, score=-3.3 required=5.0 tests=BAYES_00,RP_MATCHES_RCVD autolearn=ham autolearn_force=no version=3.4.1
Date: Sun, 27 Jul 2025 12:14:55 +0000
From: "Girar pender (amakeenk)"
To: Vitaly Chikunov
Subject: [#388454] p11 DONE (try 3) tinyllamas-gguf.git=0-alt1 llama.cpp.git=5753-alt1
Message-ID:
Mail-Followup-To: Girar pender robot
References:
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
In-Reply-To:
X-girar-task-id: 388454
X-girar-task-owner: vt
X-girar-task-repo: p11
X-girar-task-try: 3
X-girar-task-iter: 1
X-girar-task-status: DONE
X-girar-task-URL: https://git.altlinux.org/tasks/archive/done/_379/388454/
X-girar-task-log: logs/events.3.1.log
X-girar-task-summary: [#388454] p11 DONE (try 3) tinyllamas-gguf.git=0-alt1 llama.cpp.git=5753-alt1
User-Agent: Mutt/1.10.1 (2018-07-13)
Cc: sisyphus-incominger@lists.altlinux.org, girar-builder-p11@altlinux.org, girar-builder-p11@lists.altlinux.org
X-BeenThere: sisyphus-incominger@lists.altlinux.org
X-Mailman-Version: 2.1.12
Precedence: list
Reply-To: ALT Devel discussion list
List-Id: ALT Linux Girar Builder robot reports
List-Unsubscribe: ,
List-Archive:
List-Post:
List-Help:
List-Subscribe: ,
X-List-Received-Date: Sun, 27 Jul 2025 12:14:57 -0000
Archived-At:
List-Archive:

https://git.altlinux.org/tasks/archive/done/_379/388454/logs/events.3.1.log
https://packages.altlinux.org/tasks/388454

subtask  name             aarch64  i586  x86_64
 #40     tinyllamas-gguf       38    23      25
#100     llama.cpp          12:34     -    9:46

2025-Jul-27 11:53:03 :: task #388454 for p11 resumed by amakeenk:
2025-Jul-27 11:53:03 :: message: update
 #40 build 0-alt1 from /gears/t/tinyllamas-gguf.git fetched at 2025-Jun-29 22:38:31 from sisyphus
#100 build 5753-alt1 from /gears/l/llama.cpp.git fetched at 2025-Jun-29 22:36:16 from sisyphus
2025-Jul-27 11:53:04 :: created build repo
2025-Jul-27 11:53:05 :: [i586] #40 tinyllamas-gguf.git 0-alt1: build start
2025-Jul-27 11:53:05 :: [x86_64] #40 tinyllamas-gguf.git 0-alt1: build start
2025-Jul-27 11:53:05 :: [aarch64] #40 tinyllamas-gguf.git 0-alt1: build start
2025-Jul-27 11:53:28 :: [i586] #40 tinyllamas-gguf.git 0-alt1: build OK
2025-Jul-27 11:53:29 :: [i586] #100 llama.cpp.git 5753-alt1: build start
2025-Jul-27 11:53:30 :: [x86_64] #40 tinyllamas-gguf.git 0-alt1: build OK
2025-Jul-27 11:53:30 :: [x86_64] #100 llama.cpp.git 5753-alt1: build start
2025-Jul-27 11:53:43 :: [aarch64] #40 tinyllamas-gguf.git 0-alt1: build OK
2025-Jul-27 11:53:43 :: [aarch64] #100 llama.cpp.git 5753-alt1: build start
2025-Jul-27 11:53:45 :: [i586] #100 llama.cpp.git 5753-alt1: build SKIPPED
build/100/x86_64/log:[00:05:28] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:05:28] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Jul-27 12:03:16 :: [x86_64] #100 llama.cpp.git 5753-alt1: build OK
2025-Jul-27 12:06:17 :: [aarch64] #100 llama.cpp.git 5753-alt1: build OK
2025-Jul-27 12:06:25 :: #40: tinyllamas-gguf.git 0-alt1: build check OK
2025-Jul-27 12:06:46 :: #100: llama.cpp.git 5753-alt1: build check OK
2025-Jul-27 12:06:48 :: build check OK
2025-Jul-27 12:07:19 :: noarch check OK
2025-Jul-27 12:07:20 :: plan: src +2 -1 =19730, aarch64 +8 -2 =34804, noarch +1 -0 =20864, x86_64 +10 -2 =35575

#100 llama.cpp 20240225-alt1 -> 1:5753-alt1
 Wed Jun 25 2025 Vitaly Chikunov 1:5753-alt1
 - Update to b5753 (2025-06-24).
 - Install an experimental rpc backend and server. The rpc code is a
   proof-of-concept, fragile, and insecure.
 Sat May 10 2025 Vitaly Chikunov 1:5332-alt1
 - Update to b5332 (2025-05-09), with vision support in llama-server.
 - Enable Vulkan backend (for GPU) in llama.cpp-vulkan package.
 Mon Mar 10 2025 Vitaly Chikunov 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - Enable CUDA backend (for NVIDIA GPU) in llama.cpp-cuda package.
[...]
2025-Jul-27 12:07:21 :: llama.cpp: closes bugs: 50962
2025-Jul-27 12:08:04 :: patched apt indices
2025-Jul-27 12:08:14 :: created next repo
2025-Jul-27 12:08:24 :: duplicate provides check OK
2025-Jul-27 12:09:04 :: dependencies check OK
2025-Jul-27 12:09:36 :: [x86_64 aarch64] ELF symbols check OK
2025-Jul-27 12:09:49 :: [i586] #40 tinyllamas-gguf: install check OK
2025-Jul-27 12:09:58 :: [x86_64] #100 libllama: install check OK
2025-Jul-27 12:10:00 :: [aarch64] #100 libllama: install check OK
2025-Jul-27 12:10:11 :: [x86_64] #100 libllama-debuginfo: install check OK
2025-Jul-27 12:10:13 :: [aarch64] #100 libllama-debuginfo: install check OK
x86_64: libllama-devel=1:5753-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Jul-27 12:10:20 :: [x86_64] #100 libllama-devel: install check OK
aarch64: libllama-devel=1:5753-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Jul-27 12:10:24 :: [aarch64] #100 libllama-devel: install check OK
2025-Jul-27 12:10:39 :: [aarch64] #100 llama.cpp: install check OK
2025-Jul-27 12:10:50 :: [x86_64] #100 llama.cpp: install check OK
2025-Jul-27 12:10:53 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-Jul-27 12:11:00 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-Jul-27 12:11:23 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Jul-27 12:11:25 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Jul-27 12:11:40 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-Jul-27 12:11:49 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-Jul-27 12:11:58 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Jul-27 12:12:09 :: [aarch64] #40 tinyllamas-gguf: install check OK
2025-Jul-27 12:12:17 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-Jul-27 12:12:27 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-Jul-27 12:12:40 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Jul-27 12:12:47 :: [x86_64] #40 tinyllamas-gguf: install check OK
2025-Jul-27 12:13:08 :: [x86_64-i586] generated apt indices
2025-Jul-27 12:13:08 :: [x86_64-i586] created next repo
2025-Jul-27 12:13:21 :: [x86_64-i586] dependencies check OK
2025-Jul-27 12:13:22 :: gears inheritance check OK
2025-Jul-27 12:13:23 :: srpm inheritance check OK
girar-check-perms: access to @maint ALLOWED for cas: member of approved group
girar-check-perms: access to @tester ALLOWED for amakeenk: member of approved group
check-subtask-perms: #40: tinyllamas-gguf: approved by cas, approved by amakeenk
girar-check-perms: access to @maint ALLOWED for cas: member of approved group
girar-check-perms: access to @tester ALLOWED for amakeenk: member of approved group
check-subtask-perms: #100: llama.cpp: approved by cas, approved by amakeenk
2025-Jul-27 12:13:25 :: acl check OK
2025-Jul-27 12:13:50 :: created contents_index files
2025-Jul-27 12:14:01 :: created hash files: aarch64 noarch src x86_64
2025-Jul-27 12:14:04 :: task #388454 for p11 TESTED
2025-Jul-27 12:14:05 :: task is ready for commit
2025-Jul-27 12:14:11 :: repo clone OK
2025-Jul-27 12:14:11 :: packages update OK
2025-Jul-27 12:14:19 :: [x86_64 aarch64 noarch] update OK
2025-Jul-27 12:14:19 :: repo update OK
2025-Jul-27 12:14:32 :: repo save OK
2025-Jul-27 12:14:32 :: src index update OK
2025-Jul-27 12:14:32 :: updated /gears/l/llama.cpp.git branch `p11'
2025-Jul-27 12:14:32 :: created /gears/t/tinyllamas-gguf.git branch `p11'
2025-Jul-27 12:14:55 :: gears update OK
2025-Jul-27 12:14:55 :: task #388454 for p11 DONE