# Ollama Haskell
**`ollama-haskell`** is an unofficial Haskell client for [Ollama](https://ollama.com), inspired by [`ollama-python`](https://github.com/ollama/ollama-python). It enables interaction with locally running LLMs through the Ollama HTTP API, directly from Haskell.
---
## Features
* Chat with models (see the sketch below)
* Text generation (with streaming)
* Chat with structured messages and tools
* Embeddings
* Model management (list, pull, push, show, delete)
* In-memory conversation history
* Configurable timeouts, retries, streaming handlers
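
For the chat API, a minimal single-turn sketch is below. It is hedged: the `Data.Ollama.Chat` module, the `defaultChatOps` fields (`chatModelName`, `messages`), the `userMessage` helper, and the optional configuration argument to `chat` are taken from recent versions of the package and may differ in yours, so check the Haddock docs.

```haskell
{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.List.NonEmpty (NonEmpty (..))
import Data.Ollama.Chat

main :: IO ()
main = do
  let -- A single user turn; `userMessage` (assumed helper) builds a Message
      -- with the User role.
      msgs = userMessage "Why is the sky blue?" :| []
      ops =
        defaultChatOps
          { chatModelName = "gemma3" -- field names assumed from recent versions
          , messages = msgs
          }
  -- Nothing = use the default client configuration.
  eRes <- chat ops Nothing
  either print print eRes
```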
---
## Quick Example
```haskell
{-# LANGUAGE OverloadedStrings #-}
module Main where
import Data.Ollama.Generate
import qualified Data.Text.IO as T
main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "What is the meaning of life?"
          }
  eRes <- generate ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> do
      putStr "LLM response: "
      T.putStrLn (genResponse r)
```
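
The second argument to `generate` (here `Nothing`) is an optional client configuration covering things like the host URL, timeouts, and retries. The sketch below shows roughly how overriding it might look; `defaultOllamaConfig`, the `timeout`/`retryCount` fields, and the top-level `Ollama` re-export module are assumptions based on recent releases, so consult the package's Haddock docs for your version.

```haskell
{-# LANGUAGE OverloadedStrings #-}
module Main where

-- The top-level Ollama module is assumed to re-export the generate API
-- together with the client configuration; module and field names may differ.
import Ollama

main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "Summarise the Haskell type system in one sentence."
          }
      cfg =
        defaultOllamaConfig
          { timeout = 120       -- request timeout, assumed to be in seconds
          , retryCount = Just 2 -- retry failed requests, assumed Maybe Int field
          }
  eRes <- generate ops (Just cfg)
  either print (print . genResponse) eRes
```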
---
## Installation
Add to your `.cabal` file:
```cabal
build-depends:
base >=4.7 && <5,
ollama-haskell
```
Or use with `stack`/`nix-shell`.
---
## More Examples
See [`examples/OllamaExamples.hs`](examples/OllamaExamples.hs) for:
* Chat with conversation memory
* Structured JSON output
* Embeddings
* Tool/function calling
* Multimodal input
* Streaming and non-streaming variants
---
## Prerequisite
Make sure you have [Ollama installed and running locally](https://ollama.com/download), and pull the model you want to use first, e.g. `ollama pull gemma3` for the Quick Example above.
---
## Dev & Nix Support
Use Nix:
```bash
nix-shell
```
This will install `stack` and Ollama.
---
## Author
Created and maintained by [@tusharad](https://github.com/tusharad). PRs and feedback are welcome!
---
## Contributing
Have ideas or improvements? Feel free to [open an issue](https://github.com/tusharad/ollama-haskell/issues) or submit a PR!