# 🦙 Ollama Haskell
**`ollama-haskell`** is an unofficial Haskell client for [Ollama](https://ollama.com), inspired by [`ollama-python`](https://github.com/ollama/ollama-python). It enables interaction with locally running LLMs through the Ollama HTTP API, directly from Haskell.

---

## ✨ Features

* 💬 Chat with models
* ✍️ Text generation (with streaming)
* ✅ Chat with structured messages and tools
* 🧠 Embeddings
* 🧰 Model management (list, pull, push, show, delete)
* 🗃️ In-memory conversation history
* ⚙️ Configurable timeouts, retries, streaming handlers

---

## ⚡ Quick Example

```haskell
{-# LANGUAGE OverloadedStrings #-}

module Main where

import Data.Ollama.Generate
import qualified Data.Text.IO as T

main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "What is the meaning of life?"
          }
  eRes <- generate ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> do
      putStr "LLM response: "
      T.putStrLn (genResponse r)
```

---

## 📦 Installation

Add to your `.cabal` file:

```cabal
build-depends:
    base >=4.7 && <5,
    ollama-haskell
```

Or use with `stack`/`nix-shell`.

---

## 📚 More Examples

See [`examples/OllamaExamples.hs`](examples/OllamaExamples.hs) for:

* Chat with conversation memory
* Structured JSON output
* Embeddings
* Tool/function calling
* Multimodal input
* Streaming and non-streaming variants

---

## 🛠 Prerequisite

Make sure you have [Ollama installed and running locally](https://ollama.com/download). Run `ollama pull llama3` to download a model.

---

## 🧪 Dev & Nix Support

Use Nix:

```bash
nix-shell
```

This will install `stack` and Ollama.

---

## 👨‍💻 Author

Created and maintained by [@tusharad](https://github.com/tusharad). PRs and feedback are welcome!

---

## 🤝 Contributing

Have ideas or improvements? Feel free to [open an issue](https://github.com/tusharad/ollama-haskell/issues) or submit a PR!
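
---

## 💬 Chat Example (sketch)

Chat follows the same pattern as the `generate` call in the quick example. The sketch below is a rough illustration, not a definitive snippet: it assumes `Data.Ollama.Chat` exposes `defaultChatOps` with `chatModelName` and `messages` fields, a `userMessage` helper, and a response whose `message`/`content` accessors are as named here. Field and helper names vary between versions, so check the Haddocks or [`examples/OllamaExamples.hs`](examples/OllamaExamples.hs) for the exact API you have installed.

```haskell
{-# LANGUAGE OverloadedStrings #-}

module Main where

import Data.List.NonEmpty (NonEmpty (..))
import Data.Ollama.Chat
import qualified Data.Text.IO as T

main :: IO ()
main = do
  -- NOTE: chatModelName, messages, userMessage, message, and content are
  -- assumed names; confirm them against the Haddocks for your version.
  let ops =
        defaultChatOps
          { chatModelName = "gemma3"
          , messages = userMessage "Why is the sky blue?" :| []
          }
  eRes <- chat ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r ->
      case message r of
        Nothing  -> putStrLn "No message in response"
        Just msg -> T.putStrLn (content msg)
```

Run it the same way as the quick example, swapping `gemma3` for any model you have pulled locally.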