cabal-version:      2.2
name:               llama-cpp-haskell
version:            0.2.1
synopsis:           Haskell bindings for the llama.cpp llama-server and a simple CLI
description:        A library for interacting with the llama-server RPC API using Haskell concepts.
                    It also includes a `llamacall` binary, so you can make the same calls from your favourite command-line shell and use them in scripts.
license:            AGPL-3.0-only
license-file:       LICENSE
author:             Sergey Alirzaev
maintainer:         l29ah@riseup.net
-- copyright:
category:           Text, LLM, Llama, Machine Learning, AI, Network, CLI
build-type:         Simple
tested-with:        GHC == 9.12.2
-- extra-source-files:

source-repository head
  type:     git
  location: https://github.com/l29ah/llama-cpp-haskell.git

source-repository this
  type:     git
  location: https://github.com/l29ah/llama-cpp-haskell.git
  tag:      0.2.1

common stuff
  ghc-options:      -Wall
  default-language: Haskell2010
  build-depends:    base >= 4 && < 5
                  , conduit ^>= 1.3.5
                  , conduit-extra ^>= 1.3.7
                  , exceptions ^>= 0.10.9
                  , http-client ^>= 0.7.19
                  , http-conduit ^>= 2.3
                  , aeson ^>= 2.2
                  , text ^>= 2.1
                  , http-types ^>= 0.12
                  , bytestring ^>= 0.12.2.0
                  , attoparsec ^>= 0.14.4

library
  import:          stuff
  exposed-modules: Llama
                   Llama.Streaming

executable llamacall
  import:          stuff
  main-is:         Main.hs
  build-depends:   optparse-generic ^>= 1.5.2
  other-modules:   Llama
                   Llama.Streaming
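
-- Illustrative sketch only, kept as comments so this file remains valid cabal
-- and nothing here affects the build. Assuming a llama-server instance is
-- listening on http://localhost:8080 and exposes its standard /completion
-- endpoint (JSON fields "prompt", "n_predict", "content"), it shows how the
-- dependencies above (http-conduit and aeson) can issue one non-streaming
-- completion request. It does not reproduce the actual Llama or
-- Llama.Streaming module interfaces.
--
--   {-# LANGUAGE OverloadedStrings #-}
--   import Data.Aeson (Value, object, (.=))
--   import Network.HTTP.Simple
--
--   main :: IO ()
--   main = do
--     -- Build a POST request against the (assumed) local llama-server.
--     req <- parseRequest "POST http://localhost:8080/completion"
--     let body = object [ "prompt"    .= ("Haskell is" :: String)
--                       , "n_predict" .= (32 :: Int) ]
--     -- Send the JSON body and decode the JSON reply; the generated text
--     -- is returned in the response's "content" field.
--     resp <- httpJSON (setRequestBodyJSON body req)
--     print (getResponseBody resp :: Value)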