# Compare commits

2 commits:

- `2c4d570184`
- `b7a3a07458`
**LICENSE.md** — new file, 21 lines
```diff
@@ -0,0 +1,21 @@
+# MIT License
+
+Copyright (c) 2026 Ollama CLI Contributors
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
```
**README.md** — new file, 250 lines
# Ollama CLI

A comprehensive command-line interface for interacting with Ollama servers, featuring configuration management, profile support, and context persistence.

## Installation

Place the `ollama` script in your PATH and ensure it's executable:

```bash
chmod +x ollama
export PATH="$PATH:$(pwd)"
```
## Configuration

The script loads configuration from `~/.ollama/config.json`. If the file doesn't exist, it uses built-in defaults.

### Configuration File Format

```json
{
  "api_url": "https://ollama-api.example.com/api",
  "user": "ollama",
  "password": "direct_password_here",
  "password_path": "path/to/password/in/pass",
  "default_model": "llama2"
}
```

**Configuration Options:**

- `api_url`: URL of the Ollama API server
- `user`: Username for API authentication
- `password`: Direct password (takes precedence over `password_path`)
- `password_path`: Path used to retrieve the password via the `pass` command
- `default_model`: Default model to use for operations
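The config lookup can be sketched with `jq`. This is a minimal illustration, not the script's actual code; the `read_setting` helper and the fallback values are hypothetical:

```shell
# Hypothetical sketch of config loading (not the script's actual code).
# Reads a key from ~/.ollama/config.json via jq, falling back to a
# built-in default when the file or the key is missing.
CONFIG_FILE="${OLLAMA_CONFIG_FILE:-$HOME/.ollama/config.json}"

read_setting() {
    # $1 = key name, $2 = default value
    if [ -f "$CONFIG_FILE" ] && command -v jq >/dev/null 2>&1; then
        jq -r --arg def "$2" ".$1 // \$def" "$CONFIG_FILE"
    else
        printf '%s\n' "$2"
    fi
}

api_url=$(read_setting api_url "https://ollama.local")
default_model=$(read_setting default_model "llama2")
echo "api_url=$api_url default_model=$default_model"
```

The `// $def` alternative operator makes a `null` or absent key fall through to the default in a single jq expression.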
### Environment Variable Overrides

All configuration values can be overridden via environment variables:

- `OLLAMA_API_URL_ENV`
- `OLLAMA_USER_ENV`
- `OLLAMA_PASS_PATH_ENV`
- `OLLAMA_PASSWORD_ENV`
- `DEFAULT_MODEL_ENV`

Or use direct variables (lower precedence):

- `OLLAMA_API_URL`
- `OLLAMA_USER`
- `OLLAMA_PASS_PATH`
- `OLLAMA_PASSWORD`
- `DEFAULT_MODEL`
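This precedence (override variable, then direct variable, then config-file value) maps naturally onto shell default expansion. A sketch for `api_url`, with an illustrative config value:

```shell
# Sketch of the documented precedence for api_url:
# OLLAMA_API_URL_ENV beats OLLAMA_API_URL, which beats the config-file value.
config_api_url="https://ollama-api.example.com/api"   # illustrative value from config.json

API_URL="${OLLAMA_API_URL_ENV:-${OLLAMA_API_URL:-$config_api_url}}"
echo "using API URL: $API_URL"
```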
### Directory Structure

The script maintains the following files and directories:

- `~/.ollama/context.json` - Conversation context for chat (can be overridden with `OLLAMA_CONTEXT_FILE`)
- `~/.ollama/profiles/` - Profile definitions (can be overridden with `OLLAMA_PROFILE_DIR`)
## Usage

### General Help

```
Usage: ollama <command> [options]

Commands:
  generate <prompt> [options]   Generate a response from a prompt (supports stdin)
  chat [options]                Chat with context memory (supports stdin)
  embed <text> [options]        Generate embeddings for text (supports stdin)
  model <subcommand>            Manage models
  profile <subcommand>          Manage profiles
  version                       Show script and Ollama server version
  help [command]                Show this help message or help for a specific command

For detailed help on a command, run:
  ollama help <command>

Examples:
  ollama help generate
  ollama help chat
  ollama help embed
  ollama help profile
  ollama help model

Environment Variables:
  OLLAMA_CONTEXT_FILE   Override default context file location
  OLLAMA_PROFILE_DIR    Override default profile directory location
```
### Generate Command

```
Usage: ollama generate <prompt> [options]

Generate a single response from a prompt. Supports reading from stdin.

Options:
  -m, --model MODEL     Use a specific model (default: llama3:8b)
  -p, --profile NAME    Use a profile's system prompt (default: default)

Examples:
  ollama generate "What is 2+2?"
  ollama generate "Write a poem" -m deepseek-r1:8b
  echo "Tell me a joke" | ollama generate
  echo "How do I read a file?" | ollama generate -p python
  ollama generate -p json "List colors"
```
### Chat Command

```
Usage: ollama chat [options]

Chat with context memory. Maintains conversation history between invocations.

Options:
  -m, --model MODEL     Use a specific model (default: llama3:8b)
  -c, --context FILE    Use a specific context file (default: ~/.ollama/context.json)
  -p, --profile NAME    Use a profile's system prompt (default: default)
  -r, --reset           Clear the context and start fresh

Examples:
  ollama chat
  echo "Hello!" | ollama chat
  ollama chat -m deepseek-r1:14b
  ollama chat -r                                      # Reset context
  ollama chat -c /tmp/custom_context.json             # Use custom context file
  OLLAMA_CONTEXT_FILE=/tmp/session.json ollama chat   # Use env variable
  ollama chat -p default
  echo "What is Docker?" | ollama chat -p json
  echo "How do I write a loop?" | ollama chat -p python
```
### Embed Command

```
Usage: ollama embed <text> [options]

Generate embeddings for text using a specified model. Supports reading from stdin.

Options:
  -m, --model MODEL     Use a specific model (default: llama3:8b)
  -p, --profile NAME    Use a profile's model selection (default: default)
  -o, --output FORMAT   Output format: text (default) or json

Output Formats:
  text   Display embedding summary with first 10 values
  json   Display full API response including all embedding values

Examples:
  ollama embed "What is machine learning?"
  ollama embed "Hello world" -m nomic-embed-text
  echo "Some text to embed" | ollama embed
  echo "Multiple lines
  of text" | ollama embed -o json
  ollama embed "Search query" -m nomic-embed-text -o json | jq '.embedding | length'
```
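The full-precision vectors from `-o json` lend themselves to further post-processing. As one sketch, cosine similarity between two embedding vectors can be computed with plain awk; the two literal vectors below are small stand-ins for real `.embedding` arrays obtained from the command above:

```shell
# Sketch: cosine similarity between two embedding vectors with awk.
# In practice the numbers would come from something like:
#   ollama embed "some text" -o json | jq -r '.embedding | join(" ")'
# The literal vectors here are illustrative stand-ins.
a="0.1 0.2 0.3"
b="0.2 0.1 0.3"
cosine=$(awk -v a="$a" -v b="$b" 'BEGIN {
    n = split(a, x, " "); split(b, y, " ")
    for (i = 1; i <= n; i++) { dot += x[i]*y[i]; na += x[i]*x[i]; nb += y[i]*y[i] }
    printf "%.4f", dot / (sqrt(na) * sqrt(nb))
}')
echo "cosine similarity: $cosine"
```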
### Model Command

```
Usage: ollama model <subcommand>

Manage and query available models.

Subcommands:
  list         List all available models with specs
  get MODEL    Show detailed model information
  ps           List currently running/loaded models

Examples:
  ollama model list
  ollama model get llama3:8b
  ollama model get deepseek-r1:8b
  ollama model ps
```
### Profile Command

```
Usage: ollama profile <subcommand> [options]

Manage profiles for system prompts, pre/post scripts, and model selection.

Subcommands:
  list                              List all profile names
  get NAME                          Show profile details
  add NAME PROMPT [OPTIONS]         Create a new profile
  update NAME [PROMPT] [OPTIONS]    Update a profile's system prompt or scripts
  delete NAME                       Delete a profile

Profile Options:
  --pre-script SCRIPT    Script to execute on input before sending to ollama
  --post-script SCRIPT   Script to execute on output after receiving from ollama
  -m, --model MODEL      Model to use for this profile

Examples:
  ollama profile list                                    # List all profiles
  ollama profile get json                                # Show profile details
  ollama profile add work "You are a professional..."    # Create profile
  ollama profile add uppercase "Respond in all caps" --pre-script "tr '[:lower:]' '[:upper:]'"
  ollama profile add pretty "Return formatted JSON" --post-script "jq ."
  ollama profile add deep "You are a thinking model" -m deepseek-r1:8b
  ollama profile update python "You are a Python expert..."   # Update profile
  ollama profile update json -m gemma:7b-text                 # Change model
  ollama profile delete work                                  # Delete profile
```
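Conceptually, a profile's pre-script filters the input before it reaches the model and the post-script filters the model's output. The `eval`-based pipeline below is an illustrative sketch of that flow, not the script's actual implementation, with a stand-in for the API call:

```shell
# Conceptual sketch (not the actual implementation): input passes through
# the pre-script, the result is sent to the model, and the response passes
# through the post-script.
pre_script="tr '[:lower:]' '[:upper:]'"   # as in the "uppercase" example above
post_script="cat"                          # stand-in for e.g. "jq ."

input="hello"
prepared=$(printf '%s' "$input" | eval "$pre_script")
# ... $prepared would be sent to the Ollama API here ...
response="$prepared"                       # stand-in for the model's response
final=$(printf '%s' "$response" | eval "$post_script")
echo "$final"
```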
### Version Command

```
Usage: ollama version [options]

Display the CLI script version and the Ollama server version.

Options:
  -o, --output FORMAT   Output format: text (default) or json

Examples:
  ollama version
  ollama version -o json
  ollama version -o json | jq '.ollama_version'
```
## Requirements

The script requires the following commands to be available:

- `curl` - Always required
- `jq` - Always required
- `column` - Always required
- `less` - Always required
- `pass` - Only required if using `password_path` for authentication
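A dependency check along these lines can fail fast with a clear message instead of a cryptic "command not found" mid-run. The `require` helper name is illustrative, not part of the actual script:

```shell
# Sketch: verify required commands exist before doing any work.
require() {
    for cmd in "$@"; do
        if ! command -v "$cmd" >/dev/null 2>&1; then
            echo "error: required command '$cmd' not found" >&2
            return 1
        fi
    done
    return 0
}

if require curl jq column less; then
    echo "all core dependencies found"
fi
# `pass` would only be checked when password_path authentication is configured.
```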
## Features

- **Configuration Management**: Load settings from a JSON config file with environment variable overrides
- **Multiple Authentication Methods**: Direct password, `pass` integration, or no authentication
- **Profiles**: Create and manage system prompts with optional pre/post-processing scripts
- **Context Persistence**: Maintain conversation history across chat invocations
- **Flexible Output**: Support for both text and JSON output formats
- **Stdin Support**: Most commands accept input from stdin for easy piping
## Version History

- **v1.1.0** - Added configuration file support with environment variable overrides
- **v1.0.0** - Initial release with version command fix

## License

MIT License - See LICENSE.md for details.
**ollama** — 2 changed lines
```diff
@@ -1,7 +1,7 @@
 #!/bin/bash
 
 set -euo pipefail
-SCRIPT_VERSION="1.1.0"
+SCRIPT_VERSION="1.1.1"
 
 # Default values (should be configured in ~/.ollama/config.json)
 DEFAULT_OLLAMA_API_URL="https://ollama.local"
```