From: "Girar awaiter (protvin)" <girar-builder@altlinux.org>
To: Constantin Sunzow <protvin@altlinux.org>
Cc: sisyphus-incominger@lists.altlinux.org,
	girar-builder-p11@altlinux.org,
	girar-builder-p11@lists.altlinux.org
Subject: [#390615] p11 EPERM (try 2) tinyllamas-gguf.git=0-alt1 llama.cpp.git=5753-alt1 ...
Date: Wed, 23 Jul 2025 10:17:15 +0000
Message-ID: <girar.task.390615.2.1@gyle.mskdc.altlinux.org> (raw)
In-Reply-To: <girar.task.390615.1.1@gyle.mskdc.altlinux.org>

https://git.altlinux.org/tasks/390615/logs/events.2.1.log
https://packages.altlinux.org/tasks/390615

subtask  name                              aarch64  i586  x86_64
   #40   tinyllamas-gguf                        38    21      21
  #100   llama.cpp                           12:11     -    8:20
  #200   python3-module-asgi-lifespan           48    30      29
  #300   python3-module-sse-starlette         1:01    40      40
  #400   python3-module-starlette-context       48    28      28
  #500   python3-module-llama-cpp-python      1:02     -      38

2025-Jul-23 09:52:14 :: task #390615 for p11 resumed by protvin:
2025-Jul-23 09:52:14 :: message: Update llama.cpp
 #40 build 0-alt1 from /gears/t/tinyllamas-gguf.git fetched at 2025-Jul-23 09:52:01 from sisyphus
#100 build 5753-alt1 from /gears/l/llama.cpp.git fetched at 2025-Jul-23 09:41:26 from sisyphus
#200 build 2.1.0-alt1.1.gff5d3d0 from /gears/p/python3-module-asgi-lifespan.git fetched at 2025-Jul-23 09:42:01 from sisyphus
#300 build 2.3.5-alt1 from /gears/p/python3-module-sse-starlette.git fetched at 2025-Jul-23 09:44:27 from sisyphus
#400 build 0.4.0-alt1 from /gears/p/python3-module-starlette-context.git fetched at 2025-Jul-23 09:44:31 from sisyphus
#500 build 0.3.14-alt1 from /gears/p/python3-module-llama-cpp-python.git fetched at 2025-Jul-23 09:45:27 from sisyphus
2025-Jul-23 09:52:14 :: created build repo
2025-Jul-23 09:52:15 :: #100: force rebuild
2025-Jul-23 09:52:15 :: #200: force rebuild
2025-Jul-23 09:52:15 :: [i586] #40 tinyllamas-gguf.git 0-alt1: build start
2025-Jul-23 09:52:15 :: [aarch64] #40 tinyllamas-gguf.git 0-alt1: build start
2025-Jul-23 09:52:15 :: [x86_64] #40 tinyllamas-gguf.git 0-alt1: build start
2025-Jul-23 09:52:36 :: [i586] #40 tinyllamas-gguf.git 0-alt1: build OK
2025-Jul-23 09:52:36 :: [i586] #100 llama.cpp.git 5753-alt1: build start
2025-Jul-23 09:52:36 :: [x86_64] #40 tinyllamas-gguf.git 0-alt1: build OK
2025-Jul-23 09:52:37 :: [x86_64] #100 llama.cpp.git 5753-alt1: build start
2025-Jul-23 09:52:49 :: [i586] #100 llama.cpp.git 5753-alt1: build SKIPPED
2025-Jul-23 09:52:49 :: [i586] #200 python3-module-asgi-lifespan.git 2.1.0-alt1.1.gff5d3d0: build start
2025-Jul-23 09:52:53 :: [aarch64] #40 tinyllamas-gguf.git 0-alt1: build OK
2025-Jul-23 09:52:53 :: [aarch64] #100 llama.cpp.git 5753-alt1: build start
2025-Jul-23 09:53:19 :: [i586] #200 python3-module-asgi-lifespan.git 2.1.0-alt1.1.gff5d3d0: build OK
2025-Jul-23 09:53:19 :: [i586] #300 python3-module-sse-starlette.git 2.3.5-alt1: build start
2025-Jul-23 09:53:59 :: [i586] #300 python3-module-sse-starlette.git 2.3.5-alt1: build OK
2025-Jul-23 09:54:00 :: [i586] #400 python3-module-starlette-context.git 0.4.0-alt1: build start
2025-Jul-23 09:54:28 :: [i586] #400 python3-module-starlette-context.git 0.4.0-alt1: build OK
2025-Jul-23 09:54:28 :: [i586] #500 python3-module-llama-cpp-python.git 0.3.14-alt1: build start
2025-Jul-23 09:54:42 :: [i586] #500 python3-module-llama-cpp-python.git 0.3.14-alt1: build SKIPPED
build/100/x86_64/log:[00:04:42] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:04:42] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Jul-23 10:00:57 :: [x86_64] #100 llama.cpp.git 5753-alt1: build OK
2025-Jul-23 10:00:57 :: [x86_64] #200 python3-module-asgi-lifespan.git 2.1.0-alt1.1.gff5d3d0: build start
2025-Jul-23 10:01:26 :: [x86_64] #200 python3-module-asgi-lifespan.git 2.1.0-alt1.1.gff5d3d0: build OK
2025-Jul-23 10:01:26 :: [x86_64] #300 python3-module-sse-starlette.git 2.3.5-alt1: build start
2025-Jul-23 10:02:06 :: [x86_64] #300 python3-module-sse-starlette.git 2.3.5-alt1: build OK
2025-Jul-23 10:02:06 :: [x86_64] #400 python3-module-starlette-context.git 0.4.0-alt1: build start
2025-Jul-23 10:02:34 :: [x86_64] #400 python3-module-starlette-context.git 0.4.0-alt1: build OK
2025-Jul-23 10:02:34 :: [x86_64] #500 python3-module-llama-cpp-python.git 0.3.14-alt1: build start
2025-Jul-23 10:03:12 :: [x86_64] #500 python3-module-llama-cpp-python.git 0.3.14-alt1: build OK
2025-Jul-23 10:05:04 :: [aarch64] #100 llama.cpp.git 5753-alt1: build OK
2025-Jul-23 10:05:04 :: [aarch64] #200 python3-module-asgi-lifespan.git 2.1.0-alt1.1.gff5d3d0: build start
2025-Jul-23 10:05:52 :: [aarch64] #200 python3-module-asgi-lifespan.git 2.1.0-alt1.1.gff5d3d0: build OK
2025-Jul-23 10:05:52 :: [aarch64] #300 python3-module-sse-starlette.git 2.3.5-alt1: build start
2025-Jul-23 10:06:53 :: [aarch64] #300 python3-module-sse-starlette.git 2.3.5-alt1: build OK
2025-Jul-23 10:06:53 :: [aarch64] #400 python3-module-starlette-context.git 0.4.0-alt1: build start
2025-Jul-23 10:07:41 :: [aarch64] #400 python3-module-starlette-context.git 0.4.0-alt1: build OK
2025-Jul-23 10:07:41 :: [aarch64] #500 python3-module-llama-cpp-python.git 0.3.14-alt1: build start
2025-Jul-23 10:08:43 :: [aarch64] #500 python3-module-llama-cpp-python.git 0.3.14-alt1: build OK
2025-Jul-23 10:08:50 :: #40: tinyllamas-gguf.git 0-alt1: build check OK
2025-Jul-23 10:09:08 :: #100: llama.cpp.git 5753-alt1: build check OK
2025-Jul-23 10:09:15 :: #200: python3-module-asgi-lifespan.git 2.1.0-alt1.1.gff5d3d0: build check OK
2025-Jul-23 10:09:23 :: #300: python3-module-sse-starlette.git 2.3.5-alt1: build check OK
2025-Jul-23 10:09:30 :: #400: python3-module-starlette-context.git 0.4.0-alt1: build check OK
2025-Jul-23 10:09:35 :: #500: python3-module-llama-cpp-python.git 0.3.14-alt1: build check OK
2025-Jul-23 10:09:37 :: build check OK
2025-Jul-23 10:10:09 :: noarch check OK
2025-Jul-23 10:10:11 :: plan: src +6 -1 =19730, aarch64 +9 -2 =34800, noarch +4 -0 =20866, x86_64 +11 -2 =35571
#100 llama.cpp 20240225-alt1 -> 1:5753-alt1
 Wed Jun 25 2025 Vitaly Chikunov <vt@altlinux> 1:5753-alt1
	- Update to b5753 (2025-06-24).
	- Install an experimental rpc backend and server. The rpc code is a
	  proof-of-concept, fragile, and insecure.
 Sat May 10 2025 Vitaly Chikunov <vt@altlinux> 1:5332-alt1
	- Update to b5332 (2025-05-09), with vision support in llama-server.
	- Enable Vulkan backend (for GPU) in llama.cpp-vulkan package.
 Mon Mar 10 2025 Vitaly Chikunov <vt@altlinux> 1:4855-alt1
	- Update to b4855 (2025-03-07).
	- Enable CUDA backend (for NVIDIA GPU) in llama.cpp-cuda package.
[...]
2025-Jul-23 10:10:11 :: llama.cpp: closes bugs: 50962
2025-Jul-23 10:10:57 :: patched apt indices
2025-Jul-23 10:11:06 :: created next repo
2025-Jul-23 10:11:16 :: duplicate provides check OK
2025-Jul-23 10:11:54 :: dependencies check OK
2025-Jul-23 10:12:31 :: [x86_64 aarch64] ELF symbols check OK
2025-Jul-23 10:12:45 :: [i586] #200 python3-module-asgi-lifespan: install check OK
2025-Jul-23 10:12:45 :: [x86_64] #100 libllama: install check OK
2025-Jul-23 10:12:53 :: [i586] #300 python3-module-sse-starlette: install check OK
2025-Jul-23 10:12:53 :: [x86_64] #100 libllama-debuginfo: install check OK
2025-Jul-23 10:12:54 :: [aarch64] #100 libllama: install check OK
x86_64: libllama-devel=1:5753-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Jul-23 10:13:00 :: [x86_64] #100 libllama-devel: install check OK
2025-Jul-23 10:13:01 :: [i586] #400 python3-module-starlette-context: install check OK
2025-Jul-23 10:13:07 :: [aarch64] #100 libllama-debuginfo: install check OK
2025-Jul-23 10:13:07 :: [i586] #40 tinyllamas-gguf: install check OK
aarch64: libllama-devel=1:5753-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Jul-23 10:13:18 :: [aarch64] #100 libllama-devel: install check OK
2025-Jul-23 10:13:24 :: [x86_64] #100 llama.cpp: install check OK
2025-Jul-23 10:13:32 :: [aarch64] #100 llama.cpp: install check OK
2025-Jul-23 10:13:33 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-Jul-23 10:13:47 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-Jul-23 10:13:54 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Jul-23 10:14:18 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Jul-23 10:14:18 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-Jul-23 10:14:33 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-Jul-23 10:14:44 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-Jul-23 10:14:51 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Jul-23 10:14:53 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-Jul-23 10:15:04 :: [aarch64] #200 python3-module-asgi-lifespan: install check OK
2025-Jul-23 10:15:05 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Jul-23 10:15:13 :: [x86_64] #200 python3-module-asgi-lifespan: install check OK
2025-Jul-23 10:15:24 :: [aarch64] #500 python3-module-llama-cpp-python: install check OK
2025-Jul-23 10:15:25 :: [x86_64] #500 python3-module-llama-cpp-python: install check OK
2025-Jul-23 10:15:34 :: [x86_64] #300 python3-module-sse-starlette: install check OK
2025-Jul-23 10:15:37 :: [aarch64] #300 python3-module-sse-starlette: install check OK
2025-Jul-23 10:15:41 :: [x86_64] #400 python3-module-starlette-context: install check OK
2025-Jul-23 10:15:48 :: [x86_64] #40 tinyllamas-gguf: install check OK
2025-Jul-23 10:15:50 :: [aarch64] #400 python3-module-starlette-context: install check OK
2025-Jul-23 10:16:00 :: [aarch64] #40 tinyllamas-gguf: install check OK
2025-Jul-23 10:16:21 :: [x86_64-i586] generated apt indices
2025-Jul-23 10:16:21 :: [x86_64-i586] created next repo
2025-Jul-23 10:16:32 :: [x86_64-i586] dependencies check OK
2025-Jul-23 10:16:39 :: gears inheritance check OK
2025-Jul-23 10:16:39 :: srpm inheritance check OK
girar-check-perms: access to tinyllamas-gguf DENIED for protvin: project `tinyllamas-gguf' is not listed in the acl file for repository `p11', and the policy for such projects in `p11' is to deny
check-subtask-perms: #40: tinyllamas-gguf: needs approvals from members of @maint and @tester groups
girar-check-perms: access to llama.cpp DENIED for protvin: project `llama.cpp' is not listed in the acl file for repository `p11', and the policy for such projects in `p11' is to deny
check-subtask-perms: #100: llama.cpp: needs approvals from members of @maint and @tester groups
girar-check-perms: access to python3-module-asgi-lifespan DENIED for protvin: project `python3-module-asgi-lifespan' is not listed in the acl file for repository `p11', and the policy for such projects in `p11' is to deny
check-subtask-perms: #200: python3-module-asgi-lifespan: needs approvals from members of @maint and @tester groups
girar-check-perms: access to python3-module-sse-starlette DENIED for protvin: project `python3-module-sse-starlette' is not listed in the acl file for repository `p11', and the policy for such projects in `p11' is to deny
check-subtask-perms: #300: python3-module-sse-starlette: needs approvals from members of @maint and @tester groups
girar-check-perms: access to python3-module-starlette-context DENIED for protvin: project `python3-module-starlette-context' is not listed in the acl file for repository `p11', and the policy for such projects in `p11' is to deny
check-subtask-perms: #400: python3-module-starlette-context: needs approvals from members of @maint and @tester groups
girar-check-perms: access to python3-module-llama-cpp-python DENIED for protvin: project `python3-module-llama-cpp-python' is not listed in the acl file for repository `p11', and the policy for such projects in `p11' is to deny
check-subtask-perms: #500: python3-module-llama-cpp-python: needs approvals from members of @maint and @tester groups
2025-Jul-23 10:16:42 :: acl check FAILED
2025-Jul-23 10:17:03 :: created contents_index files
2025-Jul-23 10:17:12 :: created hash files: aarch64 noarch src x86_64
2025-Jul-23 10:17:15 :: task #390615 for p11 EPERM
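The per-architecture figures in the summary table at the top are simply the elapsed time between each subtask's "build start" and "build OK" events in the log above. A minimal sketch that reproduces them from the timestamps (the `duration` helper is mine for illustration, not part of girar):

```python
from datetime import datetime

# Timestamp format used by the girar event log, e.g. "2025-Jul-23 09:52:53".
FMT = "%Y-%b-%d %H:%M:%S"

def duration(start: str, finish: str) -> str:
    """Elapsed time between two log timestamps, rendered like the
    summary table: plain seconds under a minute, otherwise M:SS."""
    delta = datetime.strptime(finish, FMT) - datetime.strptime(start, FMT)
    minutes, seconds = divmod(int(delta.total_seconds()), 60)
    return f"{minutes}:{seconds:02d}" if minutes else str(seconds)

# llama.cpp (#100) on aarch64: build start 09:52:53, build OK 10:05:04
print(duration("2025-Jul-23 09:52:53", "2025-Jul-23 10:05:04"))  # prints 12:11
# llama.cpp (#100) on x86_64: build start 09:52:37, build OK 10:00:57
print(duration("2025-Jul-23 09:52:37", "2025-Jul-23 10:00:57"))  # prints 8:20
```

Both values match the "12:11" and "8:20" cells in the table; the "-" cells correspond to the i586 subtasks that ended in "build SKIPPED" rather than "build OK".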
Thread overview: 2+ messages (mbox.gz / Atom feed)

2025-07-23  9:47 [#390615] p11 FAILED llama.cpp.git=5753-alt1 Girar awaiter (protvin)
2025-07-23 10:17 ` Girar awaiter (protvin) [this message]
Reply instructions:

You may reply publicly to this message via plain-text email using any one of
the following methods:

* Save the following mbox file, import it into your mail client, and
  reply-to-all from there: mbox

  Avoid top-posting and favor interleaved quoting:
  https://en.wikipedia.org/wiki/Posting_style#Interleaved_style

* Reply using the --to, --cc, and --in-reply-to switches of
  git-send-email(1):

  git send-email \
    --in-reply-to=girar.task.390615.2.1@gyle.mskdc.altlinux.org \
    --to=girar-builder@altlinux.org \
    --cc=devel@lists.altlinux.org \
    --cc=girar-builder-p11@altlinux.org \
    --cc=girar-builder-p11@lists.altlinux.org \
    --cc=protvin@altlinux.org \
    --cc=sisyphus-incominger@lists.altlinux.org \
    /path/to/YOUR_REPLY

  https://kernel.org/pub/software/scm/git/docs/git-send-email.html

* If your mail client supports setting the In-Reply-To header via
  mailto: links, try the mailto: link
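For the mailto: method, the link only has to carry the In-Reply-To header (this message's Message-ID in angle brackets) so the reply threads correctly. A sketch of how such a link could be assembled; the `reply_mailto` helper and the subject text are mine, not part of the archive:

```python
from urllib.parse import quote

def reply_mailto(to: str, message_id: str, subject: str) -> str:
    """Build a mailto: URL that sets In-Reply-To so mail clients
    thread the reply under the original message."""
    params = f"In-Reply-To={quote(f'<{message_id}>')}&Subject={quote(subject)}"
    return f"mailto:{to}?{params}"

link = reply_mailto(
    "girar-builder@altlinux.org",
    "girar.task.390615.2.1@gyle.mskdc.altlinux.org",
    "Re: [#390615] p11 EPERM (try 2)",
)
print(link)
```

Whether the headers are honored depends on the mail client; some ignore In-Reply-To in mailto: links entirely.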
ALT Linux Girar Builder robot reports

This inbox may be cloned and mirrored by anyone:

	git clone --mirror http://lore.altlinux.org/sisyphus-incominger/0 sisyphus-incominger/git/0.git

	# If you have public-inbox 1.1+ installed, you may
	# initialize and index your mirror using the following commands:
	public-inbox-init -V2 sisyphus-incominger sisyphus-incominger/ \
		http://lore.altlinux.org/sisyphus-incominger \
		sisyphus-incominger@lists.altlinux.org sisyphus-incominger@lists.altlinux.ru sisyphus-incominger@lists.altlinux.com
	public-inbox-index sisyphus-incominger

Example config snippet for mirrors.
Newsgroup available over NNTP: nntp://lore.altlinux.org/org.altlinux.lists.sisyphus-incominger

AGPL code for this site: git clone https://public-inbox.org/public-inbox.git