Actions: ggerganov/llama.cpp

CI
11,510 workflow runs

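The same run listing can be queried from the command line with the GitHub CLI. A minimal sketch, assuming `gh` is installed and authenticated; the flags shown are standard `gh run list` options that correspond to the page's event, status, branch, and actor filters:

```shell
# List recent runs of the CI workflow on master, newest first.
gh run list \
  --repo ggerganov/llama.cpp \
  --workflow CI \
  --branch master \
  --limit 20

# Narrow further by trigger event, conclusion, or actor:
gh run list --repo ggerganov/llama.cpp --workflow CI \
  --event pull_request --status failure --user ochafik
```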

ggml : fixes for AVXVNNI instruction set with MSVC and Clang (#11027)
CI #18012: Commit 0827b2c pushed by slaren
December 31, 2024 14:23 51m 29s master
server : clean up built-in template detection (#11026)
CI #18011: Commit 45095a6 pushed by ngxson
December 31, 2024 14:22 47m 28s master
server : add OAI compat for /v1/completions (#10974)
CI #18006: Commit 5896c65 pushed by ngxson
December 31, 2024 11:34 52m 47s master
common, examples, ggml : fix MSYS2 GCC compiler errors and warnings w…
CI #18002: Commit 6e1531a pushed by slaren
December 31, 2024 00:46 34m 16s master
vulkan: optimize mul_mat for small values of N (#10991)
CI #18001: Commit 716bd6d pushed by 0cc4m
December 30, 2024 17:27 36m 34s master
Add VisionOS compatibility by adding missing type definitions
CI #18000: Pull request #11019 opened by sinkingsugar
December 30, 2024 13:29 Action required shards-lang:master
android : fix llama_batch free (#11014)
CI #17999: Commit c250ecb pushed by ggerganov
December 30, 2024 12:35 1h 8m 23s master
llamafile_sgemm API - INT8 implementation
CI #17998: Pull request #10912 synchronize by amritahs-ibm
December 30, 2024 07:27 21m 19s amritahs-ibm:sgemm_q8
llamafile_sgemm API - INT8 implementation
CI #17997: Pull request #10912 synchronize by amritahs-ibm
December 30, 2024 07:24 2m 53s amritahs-ibm:sgemm_q8
Vulkan: Destroy Vulkan instance on exit
CI #17996: Pull request #10989 synchronize by 0cc4m
December 30, 2024 05:15 23m 51s 0cc4m/vulkan-instance-cleanup
Add Jinja template support
CI #17995: Pull request #11016 synchronize by ochafik
December 30, 2024 04:59 26m 40s ochafik:jinja
Add Jinja template support
CI #17994: Pull request #11016 synchronize by ochafik
December 30, 2024 04:50 9m 13s ochafik:jinja
Add Jinja template support
CI #17993: Pull request #11016 synchronize by ochafik
December 30, 2024 04:39 11m 8s ochafik:jinja
Add Jinja template support
CI #17992: Pull request #11016 synchronize by ochafik
December 30, 2024 04:19 20m 13s ochafik:jinja
Add Jinja template support
CI #17991: Pull request #11016 synchronize by ochafik
December 30, 2024 04:10 9m 13s ochafik:jinja
Add Jinja template support
CI #17990: Pull request #11016 synchronize by ochafik
December 30, 2024 03:51 19m 55s ochafik:jinja
Add Jinja template support
CI #17989: Pull request #11016 opened by ochafik
December 30, 2024 03:48 3m 12s ochafik:jinja
fix https://github.com/ggerganov/llama.cpp/issues/9946
CI #17988: Pull request #11014 opened by ag2s20150909
December 30, 2024 02:46 34m 19s ag2s20150909:patch-2