llama-cpp-hs: Haskell FFI bindings to the llama.cpp LLM inference library
Haskell bindings for llama.cpp, a performant, C++-based inference engine for running large language models (LLMs) like LLaMA, Mistral, Qwen, and others directly on local hardware.
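The package exposes llama.cpp's C API to Haskell through the foreign function interface. As a rough illustration of that pattern only (the Haskell-side names below are invented for this sketch; the package's own wrappers live in the Llama.Internal.Foreign modules), two stable llama.cpp entry points could be bound and called like this, provided the program links against libllama:

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
module Main (main) where

import Foreign.C.String (CString, peekCString)

-- Initialise llama.cpp's global backend state (takes no arguments in
-- recent llama.cpp releases). Haskell-side names here are illustrative.
foreign import ccall unsafe "llama_backend_init"
  c_llama_backend_init :: IO ()

-- Returns a static C string describing the CPU/GPU features llama.cpp
-- was built with.
foreign import ccall unsafe "llama_print_system_info"
  c_llama_print_system_info :: IO CString

main :: IO ()
main = do
  c_llama_backend_init
  info <- peekCString =<< c_llama_print_system_info
  putStrLn info
```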
Modules
- Llama
- Llama.Adapter
- Llama.Backend
- Llama.ChatTemplate
- Llama.Context
- Llama.Decode
- Llama.Internal.Foreign
- Llama.Internal.Foreign.Adapter
- Llama.Internal.Foreign.Backend
- Llama.Internal.Foreign.ChatTemplate
- Llama.Internal.Foreign.Context
- Llama.Internal.Foreign.Decode
- Llama.Internal.Foreign.GGML
- Llama.Internal.Foreign.KVCache
- Llama.Internal.Foreign.Model
- Llama.Internal.Foreign.Performance
- Llama.Internal.Foreign.Sampler
- Llama.Internal.Foreign.Split
- Llama.Internal.Foreign.State
- Llama.Internal.Foreign.Tokenize
- Llama.Internal.Foreign.Vocab
- Llama.Internal.Types
- Llama.KVCache
- Llama.Model
- Llama.Performance
- Llama.Sampler
- Llama.Split
- Llama.State
- Llama.Tokenize
- Llama.Vocab
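Taken together, the high-level modules above (Llama.Backend, Llama.Model, Llama.Context, Llama.Tokenize, Llama.Sampler, ...) cover the usual local-inference workflow: initialise the backend, load a GGUF model, create a context, tokenize a prompt, then decode and sample. The sketch below shows that flow in outline only; every function name in it is a placeholder invented for illustration, not a verified export of this package, so consult the module documentation for the real signatures.

```haskell
-- Placeholder sketch only: none of these function names are confirmed
-- exports of llama-cpp-hs 0.1.0.0; they stand in for whatever the
-- corresponding modules actually provide.
module Main (main) where

import qualified Llama.Backend  as Backend
import qualified Llama.Model    as Model
import qualified Llama.Context  as Context
import qualified Llama.Tokenize as Tokenize

main :: IO ()
main = do
  Backend.initBackend                                       -- hypothetical
  model <- Model.loadModel "models/mistral-7b-q4_k_m.gguf"  -- hypothetical
  ctx   <- Context.newContext model                         -- hypothetical
  toks  <- Tokenize.tokenize ctx "Why is the sky blue?"     -- hypothetical
  print (length toks)
```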
Downloads
- llama-cpp-hs-0.1.0.0.tar.gz (Cabal source package)
- Package description (as included in the package)
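To use the release from a Cabal project rather than the raw tarball, llama-cpp-hs can be declared as a dependency in the project's .cabal file. The stanza below is a minimal example; the executable name and exact version bound are illustrative:

```cabal
executable my-llm-app
  main-is:          Main.hs
  build-depends:    base >=4.7 && <5
                  , llama-cpp-hs ==0.1.0.0
  default-language: Haskell2010
```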
| Property | Value |
|---|---|
| Versions | 0.1.0.0 |
| Change log | CHANGELOG.md |
| Dependencies | base (>=4.7 && <5), bytestring (>=0.9 && <0.13), derive-storable (>=0.2 && <0.4) |
| License | MIT |
| Copyright | 2025 tushar |
| Author | tushar |
| Maintainer | tusharadhatrao@gmail.com |
| Category | ai, ffi, natural-language-processing |
| Home page | https://github.com/tusharad/llama-cpp-hs#readme |
| Bug tracker | https://github.com/tusharad/llama-cpp-hs/issues |
| Source repo | head: git clone https://github.com/tusharad/llama-cpp-hs |
| Uploaded | by tusharad at 2025-05-21T09:13:36Z |
| Distributions | |
| Downloads | 17 total (4 in the last 30 days) |
| Rating | 2.0 (votes: 1) [estimated by Bayesian average] |
| Status | Docs uploaded by user; build status unknown (no reports yet) |