4 Commits
v1.0.0 ... main

3 changed files with 334 additions and 8 deletions

LICENSE.md (new file, +21)

@@ -0,0 +1,21 @@
# MIT License
Copyright (c) 2026 Ollama CLI Contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md (new file, +250)

@@ -0,0 +1,250 @@
# Ollama CLI
A comprehensive command-line interface for interacting with Ollama servers, featuring configuration management, profile support, and context persistence.
## Installation
Place the `ollama` script in your PATH and ensure it's executable:
```bash
chmod +x ollama
export PATH="$PATH:$(pwd)"
```
## Configuration
The script loads configuration from `~/.ollama/config.json`. If the file doesn't exist, it uses built-in defaults.
### Configuration File Format
```json
{
"api_url": "https://ollama-api.example.com/api",
"user": "ollama",
"password": "direct_password_here",
"password_path": "path/to/password/in/pass",
"default_model": "llama2"
}
```
**Configuration Options:**
- `api_url`: URL to the Ollama API server
- `user`: Username for API authentication
- `password`: Direct password (takes precedence over password_path)
- `password_path`: Path to retrieve password from `pass` command
- `default_model`: Default model to use for operations
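As a sketch, you can bootstrap a minimal config containing only the keys you need (the URL below is a placeholder; any key you omit falls back to the script's built-in default):

```shell
# Create ~/.ollama/config.json with only the keys you need;
# omitted keys fall back to the script's built-in defaults.
mkdir -p "$HOME/.ollama"
cat > "$HOME/.ollama/config.json" <<'EOF'
{
  "api_url": "https://ollama-api.example.com/api",
  "default_model": "llama2"
}
EOF
```

Note that this overwrites any existing config file.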
### Environment Variable Overrides
All configuration values can be overridden via environment variables:
- `OLLAMA_API_URL_ENV`
- `OLLAMA_USER_ENV`
- `OLLAMA_PASS_PATH_ENV`
- `OLLAMA_PASSWORD_ENV`
- `DEFAULT_MODEL_ENV`
Or use direct variables (lower precedence):
- `OLLAMA_API_URL`
- `OLLAMA_USER`
- `OLLAMA_PASS_PATH`
- `OLLAMA_PASSWORD`
- `DEFAULT_MODEL`
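The resulting precedence (`_ENV` variable, then config/direct value, then built-in default) mirrors the script's nested bash parameter expansion. A self-contained sketch with illustrative values:

```shell
# Illustrative values; the script applies the same ${A:-${B:-$C}} chain.
DEFAULT_OLLAMA_API_URL="https://ollama.local"        # built-in default
OLLAMA_API_URL=""                                    # as if config.json had no api_url
OLLAMA_API_URL_ENV="https://override.example/api"    # highest precedence
OLLAMA_API_URL="${OLLAMA_API_URL_ENV:-${OLLAMA_API_URL:-$DEFAULT_OLLAMA_API_URL}}"
echo "$OLLAMA_API_URL"    # prints the _ENV override
```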
### Directory Structure
The script maintains the following files and directories:
- `~/.ollama/context.json` - Conversation context for chat (can be overridden with `OLLAMA_CONTEXT_FILE`)
- `~/.ollama/profiles/` - Profile definitions (can be overridden with `OLLAMA_PROFILE_DIR`)
## Usage
### General Help
```
Usage: ollama <command> [options]
Commands:
generate <prompt> [options] Generate a response from a prompt (supports stdin)
chat [options] Chat with context memory (supports stdin)
embed <text> [options] Generate embeddings for text (supports stdin)
model <subcommand> Manage models
profile <subcommand> Manage profiles
version Show script and Ollama server version
help [command] Show this help message or help for a specific command
For detailed help on a command, run:
ollama help <command>
Examples:
ollama help generate
ollama help chat
ollama help embed
ollama help profile
ollama help model
Environment Variables:
OLLAMA_CONTEXT_FILE Override default context file location
OLLAMA_PROFILE_DIR Override default profile directory location
```
### Generate Command
```
Usage: ollama generate <prompt> [options]
Generate a single response from a prompt. Supports reading from stdin.
Options:
-m, --model MODEL Use a specific model (default: llama3:8b)
-p, --profile NAME Use a profile's system prompt (default: default)
Examples:
ollama generate "What is 2+2?"
ollama generate "Write a poem" -m deepseek-r1:8b
echo "Tell me a joke" | ollama generate
echo "How do I read a file?" | ollama generate -p python
ollama generate -p json "List colors"
```
### Chat Command
```
Usage: ollama chat [options]
Chat with context memory. Maintains conversation history between invocations.
Options:
-m, --model MODEL Use a specific model (default: llama3:8b)
-c, --context FILE Use a specific context file (default: ~/.ollama/context.json)
-p, --profile NAME Use a profile's system prompt (default: default)
-r, --reset Clear the context and start fresh
Examples:
ollama chat
echo "Hello!" | ollama chat
ollama chat -m deepseek-r1:14b
ollama chat -r # Reset context
ollama chat -c /tmp/custom_context.json # Use custom context file
OLLAMA_CONTEXT_FILE=/tmp/session.json ollama chat # Use env variable
ollama chat -p default
echo "What is Docker?" | ollama chat -p json
echo "How do I write a loop?" | ollama chat -p python
```
### Embed Command
```
Usage: ollama embed <text> [options]
Generate embeddings for text using a specified model. Supports reading from stdin.
Options:
-m, --model MODEL Use a specific model (default: llama3:8b)
-p, --profile NAME Use a profile's model selection (default: default)
-o, --output FORMAT Output format: text (default) or json
Output Formats:
text Display embedding summary with first 10 values
json Display full API response including all embedding values
Examples:
ollama embed "What is machine learning?"
ollama embed "Hello world" -m nomic-embed-text
echo "Some text to embed" | ollama embed
echo "Multiple lines
of text" | ollama embed -o json
ollama embed "Search query" -m nomic-embed-text -o json | jq '.embedding | length'
```
### Model Command
```
Usage: ollama model <subcommand>
Manage and query available models.
Subcommands:
list List all available models with specs
get MODEL Show detailed model information
ps List currently running/loaded models
Examples:
ollama model list
ollama model get llama3:8b
ollama model get deepseek-r1:8b
ollama model ps
```
### Profile Command
```
Usage: ollama profile <subcommand> [options]
Manage profiles for system prompts, pre/post scripts, and model selection.
Subcommands:
list List all profile names
get NAME Show profile details
add NAME PROMPT [OPTIONS] Create a new profile
update NAME [PROMPT] [OPTIONS] Update a profile's system prompt or scripts
delete NAME Delete a profile
Profile Options:
--pre-script SCRIPT Script to execute on input before sending to ollama
--post-script SCRIPT Script to execute on output after receiving from ollama
-m, --model MODEL Model to use for this profile
Examples:
ollama profile list # List all profiles
ollama profile get json # Show profile details
ollama profile add work "You are a professional..." # Create profile
ollama profile add uppercase "Respond in all caps" --pre-script "tr '[:lower:]' '[:upper:]'"
ollama profile add pretty "Return formatted JSON" --post-script "jq ."
ollama profile add deep "You are a thinking model" -m deepseek-r1:8b
ollama profile update python "You are a Python expert..." # Update profile
ollama profile update json -m gemma:7b-text # Change model
ollama profile delete work # Delete profile
```
### Version Command
```
Usage: ollama version [options]
Display the CLI script version and the Ollama server version.
Options:
-o, --output FORMAT Output format: text (default) or json
Examples:
ollama version
ollama version -o json
ollama version -o json | jq '.ollama_version'
```
## Requirements
The script requires the following commands to be available:
- `curl` - Always required
- `jq` - Always required
- `column` - Always required
- `less` - Always required
- `pass` - Only required if using `password_path` for authentication
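A rough self-check along the lines of the script's own dependency probe (the `check_deps` helper name is illustrative; `pass` is only needed when `password_path` is used):

```shell
# Report which of the given commands are missing from PATH.
check_deps() {
    local missing=()
    for cmd in "$@"; do
        command -v "$cmd" >/dev/null 2>&1 || missing+=("$cmd")
    done
    echo "missing: ${missing[*]:-none}"
}
check_deps curl jq column less
```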
## Features
- **Configuration Management**: Load settings from JSON config file with environment variable overrides
- **Multiple Authentication Methods**: Direct password, pass integration, or no authentication
- **Profiles**: Create and manage system prompts with optional pre/post-processing scripts
- **Context Persistence**: Maintain conversation history across chat invocations
- **Flexible Output**: Support for both text and JSON output formats
- **Stdin Support**: Most commands accept input from stdin for easy piping
## Version History
- **v1.1.0** - Added configuration file support with environment variable overrides
- **v1.0.0** - Initial release with version command fix
## License
MIT License - See LICENSE.md for details

ollama (71 changed lines)

@@ -1,26 +1,74 @@
 #!/bin/bash
 set -euo pipefail
+SCRIPT_VERSION="1.1.1"
+# Default values (should be configured in ~/.ollama/config.json)
+DEFAULT_OLLAMA_API_URL="https://ollama.local"
+DEFAULT_OLLAMA_USER="ollama"
+DEFAULT_OLLAMA_PASS_PATH=""
+DEFAULT_OLLAMA_PASSWORD=""
+DEFAULT_MODEL_DEFAULT="llama2"
+# Configuration file
+CONFIG_FILE="${HOME}/.ollama/config.json"
+# Load configuration from config file
+load_config() {
+    if [[ -f "$CONFIG_FILE" ]]; then
+        # Validate JSON first
+        if ! jq empty "$CONFIG_FILE" 2>/dev/null; then
+            echo "Error: Invalid JSON in config file: $CONFIG_FILE" >&2
+            return 1
+        fi
+        # Load values from config file, with fallback to empty if keys don't exist
+        OLLAMA_API_URL=$(jq -r '.api_url // empty' "$CONFIG_FILE") || return 1
+        OLLAMA_USER=$(jq -r '.user // empty' "$CONFIG_FILE") || return 1
+        OLLAMA_PASS_PATH=$(jq -r '.password_path // empty' "$CONFIG_FILE") || return 1
+        OLLAMA_PASSWORD=$(jq -r '.password // empty' "$CONFIG_FILE") || return 1
+        DEFAULT_MODEL=$(jq -r '.default_model // empty' "$CONFIG_FILE") || return 1
+    fi
+    return 0
+}
+# Apply configuration with environment variable overrides
+load_config || exit 1
+OLLAMA_API_URL="${OLLAMA_API_URL_ENV:-${OLLAMA_API_URL:-$DEFAULT_OLLAMA_API_URL}}"
+OLLAMA_USER="${OLLAMA_USER_ENV:-${OLLAMA_USER:-$DEFAULT_OLLAMA_USER}}"
+OLLAMA_PASS_PATH="${OLLAMA_PASS_PATH_ENV:-${OLLAMA_PASS_PATH:-$DEFAULT_OLLAMA_PASS_PATH}}"
+OLLAMA_PASSWORD="${OLLAMA_PASSWORD_ENV:-${OLLAMA_PASSWORD:-$DEFAULT_OLLAMA_PASSWORD}}"
+DEFAULT_MODEL="${DEFAULT_MODEL_ENV:-${DEFAULT_MODEL:-$DEFAULT_MODEL_DEFAULT}}"
+# Allow direct env var overrides (OLLAMA_API_URL, OLLAMA_USER, OLLAMA_PASS_PATH, OLLAMA_PASSWORD, DEFAULT_MODEL)
+OLLAMA_API_URL="${OLLAMA_API_URL:-$DEFAULT_OLLAMA_API_URL}"
+OLLAMA_USER="${OLLAMA_USER:-$DEFAULT_OLLAMA_USER}"
+OLLAMA_PASS_PATH="${OLLAMA_PASS_PATH:-$DEFAULT_OLLAMA_PASS_PATH}"
+OLLAMA_PASSWORD="${OLLAMA_PASSWORD:-$DEFAULT_OLLAMA_PASSWORD}"
+DEFAULT_MODEL="${DEFAULT_MODEL:-$DEFAULT_MODEL_DEFAULT}"
-OLLAMA_API_URL="https://ollama-api.hq.ars9.space/api"
-OLLAMA_USER="ollama"
-OLLAMA_PASS_PATH="device/god-stronghold/authelia/users/ollama/password"
-DEFAULT_MODEL="llama3:8b"
 CONTEXT_FILE="${OLLAMA_CONTEXT_FILE:-$HOME/.ollama/context.json}"
 PROFILE_DIR="${OLLAMA_PROFILE_DIR:-$HOME/.ollama/profiles}"
 OUTPUT_FORMAT="text" # Can be "text" or "json"
-SCRIPT_VERSION="1.0.0"
 # Check for required dependencies
 check_dependencies() {
     local missing=()
-    for cmd in pass curl jq column less; do
+    # Always required
+    for cmd in curl jq column less; do
         if ! command -v "$cmd" &> /dev/null; then
             missing+=("$cmd")
         fi
     done
+    # Only required if using pass for password
+    if [[ -n "$OLLAMA_PASS_PATH" ]]; then
+        if ! command -v pass &> /dev/null; then
+            missing+=("pass")
+        fi
+    fi
     if [[ ${#missing[@]} -gt 0 ]]; then
         echo "Error: missing required command(s): ${missing[*]}" >&2
         return 1
@@ -112,9 +160,16 @@ ensure_default_profile() {
     fi
 }
-# Get password from pass
+# Get password from direct config or from pass
+# Returns empty string if no password configured (for servers without auth)
 get_password() {
-    pass "$OLLAMA_PASS_PATH"
+    if [[ -n "$OLLAMA_PASSWORD" ]]; then
+        echo "$OLLAMA_PASSWORD"
+    elif [[ -n "$OLLAMA_PASS_PATH" ]]; then
+        pass "$OLLAMA_PASS_PATH"
+    else
+        echo ""
+    fi
 }
 # Generate a completion