A suite of CLI tools designed for LLM agents to interact with SaaS APIs. JSON output by default; a `--human` flag for readable output.
| Binary | Service | API |
|---|---|---|
| `llm-cli` | Dispatcher | Execs `llm-cli-<subcommand>` from `$PATH` |
| `llm-cli-linear` | Linear | GraphQL |
| `llm-cli-discourse` | Discourse | REST |
| `llm-cli-slack` | Slack | REST |
- 1Password — all API keys are stored in 1Password (never in config files)
- 1Password CLI (`op`) — used to retrieve credentials at call time
- The 1Password desktop app must be running and unlocked, with CLI integration enabled
```sh
./install.sh
```

This discovers all binary crates in the workspace and runs `cargo install --path` for each.
Run the interactive setup wizard to generate your config file:
```sh
llm-cli init
```

This detects which `llm-cli-*` tools are installed, provides instructions for creating API keys, and prompts for the required configuration fields.
All tools read from `~/.config/llm-cli/config.toml` (or `$XDG_CONFIG_HOME/llm-cli/config.toml`).
```toml
[linear]
op_item_id = "your-1password-item-id"

[discourse.my-forum]
base_url = "https://forum.example.com"
op_item_id = "your-1password-item-id"
api_username = "your-username"

[slack]
op_item_id = "your-1password-item-id"
```

API keys are never stored in config files. Instead, each tool retrieves credentials from 1Password at call time using the `op` CLI.
Under the hood, each invocation runs:
```sh
op item get <op_item_id> --field <op_field> --reveal
```

- Install the 1Password CLI — follow the getting started guide
- Create an API key for the service (Linear, Discourse, or Slack)
- Store it in 1Password — create an item (e.g. type "API Credential" or "Login") and paste the key into a field named `credential`
- Find the item ID — open the item in 1Password.app and copy the ID from the URL bar (it looks like `a1b2c3d4e5f6g7h8`), or run: `op item list | grep "Linear"`
- Add the item ID to your config as `op_item_id`
- `op_item_id` (required) — the 1Password item ID containing your API key
- `op_field` (optional, default: `"credential"`) — the field name within the 1Password item to read the key from. Set this if you stored the key in a different field.
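For example, a config section where the key is stored in a field named `token` rather than the default (the item ID below is a placeholder, not a real item):

```toml
[linear]
op_item_id = "a1b2c3d4e5f6g7h8"  # placeholder 1Password item ID
op_field = "token"               # key lives in a "token" field, not "credential"
```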
The 1Password desktop app must be running and unlocked for `op` to work. If you use 1Password in the browser only, you'll need to enable CLI integration.
```sh
# Dispatcher
llm-cli linear issues list
llm-cli discourse posts latest
llm-cli slack messages read --channel general

# Direct invocation
llm-cli-linear issues list --limit 10 --mine --team ENG
llm-cli-linear issues list --priority 1 --label bug
llm-cli-linear issues list --cursor <next_cursor>
llm-cli-linear issues get --id PROJ-123
llm-cli-linear issues create --title "Bug" --team ENG
llm-cli-linear issues create --input issue.json
llm-cli-linear issues close --id PROJ-123

llm-cli-discourse posts latest --page 2
llm-cli-discourse posts get --id 42
llm-cli-discourse posts create --title "Topic" --category general --raw "Body"
llm-cli-discourse posts create --input topic.json
llm-cli-discourse comments create --topic-id 42 --raw "Reply"

llm-cli-slack messages send --channel general --text "hello"
llm-cli-slack messages send --input message.json
llm-cli-slack messages read --channel general --oldest 1711900000 --latest 1711990000
llm-cli-slack messages read --channel general --cursor <next_cursor>
llm-cli-slack messages dm --user U12345 --text "hey"
llm-cli-slack messages mentions
llm-cli-slack summary --channel general
```

`llm-cli completions` generates completions for the dispatcher and all installed `llm-cli-*` subcommands in a single script. One file gives you tab-completion for everything.
```sh
llm-cli completions --shell bash > ~/.local/share/bash-completion/completions/llm-cli
```

```sh
# Ensure completions directory exists and is in fpath.
# Add to ~/.zshrc if not already present:
# fpath=(~/.zfunc $fpath)
# autoload -Uz compinit && compinit
mkdir -p ~/.zfunc
llm-cli completions --shell zsh > ~/.zfunc/_llm-cli
```

```sh
llm-cli completions --shell fish > ~/.config/fish/completions/llm-cli.fish
```

Re-run after installing new subcommands to pick up their completions.
- `--human` — human-readable output instead of JSON
- `--debug` — log HTTP requests/responses to stderr
- `--debug=pretty` — pretty-print JSON bodies and GraphQL queries
- `--debug=curl` — print reproducible curl commands (secrets redacted by default)
- `--debug=dangerous_no_redact` — show secrets in debug output
- `--debug=curl,dangerous_no_redact` — curl commands with secrets exposed
- `--debug=pretty,curl` — pretty + curl, secrets redacted
```json
{
  "success": true,
  "data": { ... }
}
```

List commands include a pagination object when more results are available:
```json
{
  "success": true,
  "data": { ... },
  "pagination": {
    "has_more": true,
    "next_cursor": "WyIyMDI2LTA0LTAxIl0"
  }
}
```

Errors are output as structured JSON to stdout (not stderr) with a non-zero exit code:
```json
{
  "success": false,
  "error": {
    "code": "CONFIG_NOT_FOUND",
    "message": "Config file not found at ~/.config/llm-cli/config.toml",
    "suggestion": "Create a config file with..."
  }
}
```

In `--human` mode, errors go to stderr as plain text.
| Code | Meaning |
|---|---|
| 0 | Success |
| 1 | Unknown/general error |
| 2 | Configuration error (missing config file, bad TOML, missing section) |
| 3 | Authentication error (1Password CLI missing, credential retrieval failed) |
| 4 | API error (HTTP failure, bad response) |
| 5 | Invalid CLI input (bad debug mode) |
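The envelope and exit codes above suggest a simple handling pattern for callers. A minimal Python sketch (field names follow the envelope examples in this README; the dispatch logic is illustrative, not part of the tools):

```python
import json

def handle_result(stdout: str, exit_code: int) -> dict:
    """Parse an llm-cli JSON envelope and decide what to do next.

    Sketch only: assumes the envelope shapes shown above.
    """
    envelope = json.loads(stdout)
    if exit_code == 0:
        result = {"data": envelope["data"]}
        # List commands may include a pagination object; surface the
        # cursor so the caller can pass it back via --cursor.
        pagination = envelope.get("pagination", {})
        if pagination.get("has_more"):
            result["next_cursor"] = pagination["next_cursor"]
        return result
    # Non-zero exit: errors are structured JSON on stdout, not stderr.
    err = envelope["error"]
    # Exit codes 2 (config) and 3 (auth) are not retryable; 4 (API) may be.
    return {
        "error_code": err["code"],
        "message": err["message"],
        "suggestion": err.get("suggestion"),
        "retryable": exit_code == 4,
    }
```

An agent can loop on `next_cursor` until it disappears from the result, and use `retryable` plus `suggestion` to decide whether to retry or surface the error.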
Create commands accept `--input <file>` for structured JSON input instead of individual flags. Use `--input -` to read from stdin:

```sh
echo '{"title": "Bug", "team": "ENG"}' | llm-cli-linear issues create --input -
llm-cli-slack messages send --input message.json
```

Each API tool has a `schema` subcommand that outputs a JSON description of available commands and arguments:
```sh
llm-cli-linear schema
llm-cli-discourse schema
llm-cli-slack schema
```

All API crates retry once with a 1-second backoff on transient HTTP errors (429 rate limits, 5xx server errors). Slack respects the `Retry-After` header when present. Destructive operations (delete) are not retried.
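The retry policy can be pictured as follows. This is an illustrative Python sketch of the behavior described above (the tools themselves are Rust); `send` is a hypothetical stand-in for the HTTP call:

```python
import time

TRANSIENT = {429} | set(range(500, 600))  # rate limits and server errors

def call_with_retry(send, *, destructive=False, sleep=time.sleep):
    """One retry with a 1-second backoff on transient HTTP errors.

    `send()` stands in for the HTTP request; it returns
    (status_code, retry_after_seconds_or_None, body).
    Destructive operations are never retried.
    """
    status, retry_after, body = send()
    if status in TRANSIENT and not destructive:
        # Honor Retry-After when the server provides one (Slack does),
        # otherwise back off for 1 second before the single retry.
        sleep(retry_after if retry_after is not None else 1)
        status, retry_after, body = send()
    return status, body

# Example: first attempt is rate-limited, the single retry succeeds.
responses = iter([(429, 2, None), (200, None, "ok")])
status, body = call_with_retry(lambda: next(responses), sleep=lambda s: None)
# status == 200, body == "ok"
```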
See PRINCIPLES.md for the CLI design philosophy. These tools are agent-first: JSON output, structured errors with suggestions, named flags, no interactive prompts.
```
packages/
  llm-cli/            # Dispatcher (std only, no deps)
  llm-cli-linear/     # Linear GraphQL client
  llm-cli-discourse/  # Discourse REST client
  llm-cli-slack/      # Slack REST client
docs/
  plans/              # Design documents
```