self_hosted_inference_core

Service-runtime kernel for self-hosted inference backends, owning readiness, health, lease reuse, and endpoint publication above the transport seam.

Dependents

1 package depends on self_hosted_inference_core:
  • First concrete self-hosted inference backend for llama.cpp, owning llama-server boot specs, readiness probes, stop semantics, and endpoint publication through self_hosted_inference_core.

    Updated 1 month ago · 73 recent downloads · 73 total downloads

Dependency Config

Supported dependency formats: mix.exs, rebar.config, Gleam, erlang.mk.
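On the package page, each dependency-config tab carries a copy-paste snippet. As a sketch for the mix.exs case, adding this package to an Elixir project would look like the following; the version constraint `"~> 0.1"` is an assumption, so check the package page for the current release:

```elixir
# mix.exs — declare self_hosted_inference_core as a dependency
defp deps do
  [
    # "~> 0.1" is an assumed version constraint; use the latest published version
    {:self_hosted_inference_core, "~> 0.1"}
  ]
end
```

After editing mix.exs, run `mix deps.get` to fetch the package from Hex.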

Package Details

Downloads (last 30 days, all versions; chart omitted)

  • this version: 102
  • yesterday: 0
  • last 7 days: 2
  • all time: 102

Last Updated

Apr 07, 2026

License

MIT

Build Tools

mix

Publisher

nshkrdotcom