ocli
A simple command-line interface for the Ollama API, written in Rust.
Features
- Communication with the Ollama API
- Streaming of model responses
- Session handling (saving and reusing context)
- Interactive mode
- Configuration via TOML file (optional)
- Verbose mode for debugging and performance metrics
Installation
cargo install --git https://github.com/sionik/ocli
Or clone the repository and install it locally:
git clone https://github.com/sionik/ocli
cd ocli
cargo install --path .
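After installation, you can check that the binary is on your PATH. Assuming the usual auto-generated --help flag (standard for Rust CLI frameworks, though not shown in the examples below):
# Verify the installation
ocli --help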
Usage
Basic Usage
# Make a request to the default model
ocli ask "Explain quantum physics in simple terms"
# Make a request with a specific model
ocli -m llama3 ask "What's the difference between Rust and Go?"
# Make a request in non-streaming mode
ocli ask --no-stream "Tell me a story"
# Save context in a session
ocli ask -s mySession "Who was Albert Einstein?"
ocli ask -s mySession "What were his most important contributions?"
# Enable verbose mode to see performance metrics
ocli -v ask "How does quantum computing work?"
# Use a different Ollama API URL
ocli -u http://ollama.example.com:11434 ask "Hello, how are you?"
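These flags can also be combined in a single call (the prompt is just an example):
# Use a specific model, keep context in a session, and show metrics
ocli -m llama3 -v ask -s mySession "Compare Rust's ownership model to Go's garbage collector"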
Interactive Mode
# Start interactive mode
ocli interactive
# Start interactive mode with a specific session
ocli interactive -s mySession
# Start interactive mode with verbose metrics
ocli -v interactive
# Start interactive mode with a custom Ollama API URL
ocli -u http://remote-ollama:11434 interactive
Session Management
# List all saved sessions
ocli sessions
# Delete a session
ocli delete-session mySession
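Putting the session commands together, a typical lifecycle looks like this (the session name is just an example):
# Build up context in a session, inspect it, then clean up
ocli ask -s mySession "Who was Albert Einstein?"
ocli sessions
ocli delete-session mySession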
Configuration
By default, ocli runs with built-in settings and needs no configuration file. To create one, print the default configuration and redirect it to a file:
# Create configuration file in the default location
mkdir -p ~/.config/ocli
ocli print-config > ~/.config/ocli/config.toml
# Print your current configuration (including CLI overrides)
ocli -m llama3 -u http://custom-ollama:11434 print-config > my-config.toml
# Print all configuration options (even defaults)
ocli print-config --all > full-config.toml
# Or on Windows
mkdir %APPDATA%\ocli
ocli print-config > %APPDATA%\ocli\config.toml
The default locations for the configuration file are:
- Linux/macOS: ~/.config/ocli/config.toml
- Windows: C:\Users\<Username>\AppData\Roaming\ocli\config.toml
You can also explicitly specify the path to the configuration file:
ocli --config /path/to/config.toml ask "Hello, how are you?"
Example of a configuration file:
# Configuration file for ocli
# api_url = "http://localhost:11434"
model = "gemma3"
# num_ctx = 4096
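As the print-config examples above suggest, command-line flags override values from the configuration file. With the file above in place, the following call would use llama3 instead of the configured gemma3:
# -m takes precedence over model = "gemma3" in config.toml
ocli -m llama3 ask "Hello, how are you?"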
Parameters
-m, --model <MODEL>
: The model to use
-u, --api-url <URL>
: Ollama API URL (default: http://localhost:11434)
-n, --num_ctx <NUM_CTX>
: Size of the context window (number of tokens)
-s, --session <SESSION>
: Session ID for storing context
--config <CONFIG>
: Path to the configuration file
--no-stream
: Disables streaming mode for responses (only with the ask command)
-v, --verbose
: Enables verbose output with performance metrics
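Global flags and command flags can be mixed freely. A combined call might look like this (flag placement follows the examples above; placing -n before the command, like -m and -u, is an assumption):
# llama3 with an 8192-token context, a stored session, metrics, and no streaming
ocli -m llama3 -n 8192 -v ask -s research --no-stream "Summarize Rust's borrow checker"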
Commands
ask
: Makes a request to the model
interactive
: Starts interactive mode
sessions
: Lists all saved sessions
delete-session
: Deletes a session
print-config
: Outputs the current configuration
--all
: Include all settings, even those with default values