llama_cpp_sdk v0.1.0

The first concrete self-hosted inference backend for llama.cpp. It owns llama-server boot specs, readiness probes, stop semantics, and endpoint publication through self_hosted_inference_core.
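As a sketch of the lifecycle the description covers: a llama.cpp backend typically boots `llama-server` and polls its `/health` endpoint until the model has finished loading, and only then publishes the endpoint. The binary, model path, host, and port below are assumptions for illustration, not values taken from this package.

```shell
# Boot llama-server (the llama.cpp HTTP server) in the background.
# ./model.gguf and port 8080 are placeholder assumptions.
llama-server -m ./model.gguf --host 127.0.0.1 --port 8080 &

# Readiness probe: llama-server's /health endpoint returns an error
# status until the model is loaded, so poll it before going live.
until curl -sf http://127.0.0.1:8080/health > /dev/null; do
  sleep 1
done
```

Stop semantics would then amount to terminating the supervised server process once the endpoint is withdrawn.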
Dependencies of llama_cpp_sdk (1 dependency)

| Package | Requirement | Status |
|---|---|---|
| self_hosted_inference_core | ~> 0.1.0 | |
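Going the other direction, a project depending on this package would declare it in `mix.exs`. This snippet just follows the standard Hex dependency convention for the package name and version shown on this page.

```elixir
# In mix.exs, inside the project's module:
defp deps do
  [
    {:llama_cpp_sdk, "~> 0.1.0"}
  ]
end
```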
Package Details

Downloads (last 30 days)
- this version (0.1.0): 0
- yesterday: 0
- last 7 days: 0
- all time: 0