Initial commit.

Release / release (push): successful in 3m13s
Author: Jay
Date: 2025-10-25 21:23:20 -04:00
Commit: 91c61ca4c8
10 changed files with 1406 additions and 0 deletions

name: Release

on:
  push:
    tags:
      - "v*"

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      # Checkout repository with full history so tags resolve
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          ssh-key: "${{ secrets.SSH_PRIVATE_KEY }}"
          ssh-known-hosts: "${{ secrets.SSH_KNOWN_HOST }}"

      # Fetch annotated tag objects
      - name: Fetch annotated tags
        run: |
          echo "Fetching annotated tag objects from remote..."
          git fetch --tags --force origin

      # Install Go
      - name: Install Go
        uses: actions/setup-go@v3
        with:
          go-version: ">=1.23.5"

      # Export version metadata for the build step
      - name: Get version info
        run: |
          echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_ENV
          echo "COMMIT=$(git rev-parse --short HEAD)" >> $GITHUB_ENV
          echo "DATE=$(date -u +"%Y-%m-%dT%H:%M:%SZ")" >> $GITHUB_ENV
          echo "PACKAGE=git.wisehodl.dev/jay/aicli" >> $GITHUB_ENV
          echo "Release metadata:"
          echo "  Version: ${GITHUB_REF#refs/tags/}"
          echo "  Commit: $(git rev-parse --short HEAD)"
          echo "  Date: $(date -u +"%Y-%m-%dT%H:%M:%SZ")"

      # Extract the annotated tag message for the changelog
      - name: Extract tag message
        run: |
          TAG_NAME="${GITHUB_REF#refs/tags/}"
          echo "Extracting annotation from $TAG_NAME..."
          echo "TAG_MESSAGE<<EOF" >> $GITHUB_ENV
          git for-each-ref --format='%(contents)' "refs/tags/${TAG_NAME}" >> $GITHUB_ENV
          echo "EOF" >> $GITHUB_ENV
          echo "Tag message:"
          git for-each-ref --format='%(contents)' "refs/tags/${TAG_NAME}"

      # Build binaries for all target platforms
      - name: Build binaries
        run: |
          echo "Building binaries for ${{ env.VERSION }}..."
          chmod +x ./scripts/build.sh
          VERSION=${{ env.VERSION }} PACKAGE=${{ env.PACKAGE }} DATE=${{ env.DATE }} COMMIT=${{ env.COMMIT }} ./scripts/build.sh

      # Create release
      - name: Create Release
        uses: https://gitea.com/actions/release-action@main
        with:
          files: dist/*
          api_key: "${{ secrets.RELEASE_TOKEN }}"
          title: "aicli ${{ env.VERSION }}"
          body: |
            ## Changelog
            ${{ env.TAG_MESSAGE }}

            ## Installation
            See [INSTALL.md](INSTALL.md) for detailed installation instructions.

            ## Available Downloads
            - Linux (amd64, arm64, 386, armv7, armv6)
            - macOS (Intel and Apple Silicon)
            - Windows (64-bit and 32-bit)
            - FreeBSD, OpenBSD, NetBSD, and Solaris (amd64)
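A note on the tag-message step above: `TAG_MESSAGE<<EOF` uses the multi-line value syntax for `$GITHUB_ENV`, which is what lets the annotated tag message keep its line breaks. The mechanics can be tried locally with any scratch file standing in for `GITHUB_ENV` (the `env.demo` name here is illustrative):

```bash
# Append a multi-line variable using GITHUB_ENV's delimiter syntax
GITHUB_ENV=env.demo
{
  echo "TAG_MESSAGE<<EOF"
  printf 'Release notes\nspanning two lines\n'
  echo "EOF"
} >> "$GITHUB_ENV"
cat "$GITHUB_ENV"
rm "$GITHUB_ENV"
```

The runner reads everything between the `TAG_MESSAGE<<EOF` line and the matching `EOF` line as the variable's value.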

INSTALL.md
# Installing AICLI
This document provides instructions for installing and configuring AICLI on various platforms.
## Pre-built Binaries
The easiest way to install AICLI is to download a pre-built binary from the [releases page](https://git.wisehodl.dev/jay/aicli/releases).
### Platform Selection Guide
Choose the appropriate binary for your platform:
- **Linux (64-bit x86)**: `aicli-linux-amd64`
- **Linux (32-bit x86)**: `aicli-linux-386`
- **Linux (64-bit ARM)**: `aicli-linux-arm64`
- **Linux (ARMv7)**: `aicli-linux-armv7`
- **Linux (ARMv6)**: `aicli-linux-armv6`
- **macOS (Intel)**: `aicli-darwin-amd64`
- **macOS (Apple Silicon)**: `aicli-darwin-arm64`
- **Windows (64-bit)**: `aicli-windows-amd64.exe`
- **Windows (32-bit)**: `aicli-windows-386.exe`
- **FreeBSD (64-bit)**: `aicli-freebsd-amd64`
- **OpenBSD (64-bit)**: `aicli-openbsd-amd64`
- **NetBSD (64-bit)**: `aicli-netbsd-amd64`
- **Solaris (64-bit)**: `aicli-solaris-amd64`
### Linux/macOS Installation
```bash
# Download the appropriate binary (replace with actual version and platform)
curl -LO https://git.wisehodl.dev/jay/aicli/releases/download/v1.0.0/aicli-linux-amd64
# Make executable
chmod +x aicli-linux-amd64
# Move to a directory in your PATH
sudo mv aicli-linux-amd64 /usr/local/bin/aicli
# Verify installation
aicli --version
```
### Windows Installation
1. Download the appropriate EXE file for your system
2. Rename the executable to `aicli.exe` if desired
3. Add the directory to your PATH or move the executable to a directory in your PATH
4. Open Command Prompt or PowerShell and verify the installation:
```
aicli --version
```
## Configuration
### API Key Setup
You'll need an API key to use AICLI. Set it up using one of these methods:
```bash
# Direct method (less secure)
export AICLI_API_KEY="your-api-key"

# File method (more secure)
echo "your-api-key" > ~/.aicli_key
chmod 600 ~/.aicli_key  # keep the key readable only by you
export AICLI_API_KEY_FILE=~/.aicli_key

# Or specify in config file (see below)
```
### Configuration File
Create a configuration file at `~/.aicli.yaml` or use the sample config provided in the release:
```bash
# Download the sample config
curl -LO https://git.wisehodl.dev/jay/aicli/raw/branch/main/sample-config.yml
# Copy to your home directory
cp sample-config.yml ~/.aicli.yaml
# Edit with your preferred editor
nano ~/.aicli.yaml
```
See the README for detailed configuration options.
## Verification
Verify the downloaded files against the provided checksums:
```bash
# Download the checksum file
curl -LO https://git.wisehodl.dev/jay/aicli/releases/download/v1.0.0/SHA256SUMS
# Verify your downloaded binary
sha256sum -c SHA256SUMS --ignore-missing
```
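If checksum verification is new to you, the mechanics can be tried locally with a throwaway file (the `aicli-demo.bin` name is purely illustrative):

```bash
# Create a file, record its checksum, then verify it the same way
printf 'demo payload\n' > aicli-demo.bin
sha256sum aicli-demo.bin > SHA256SUMS.demo
sha256sum -c SHA256SUMS.demo   # reports "aicli-demo.bin: OK"
rm aicli-demo.bin SHA256SUMS.demo
```

The `--ignore-missing` flag in the real command matters because the release's `SHA256SUMS` lists every platform's binary while you typically download only one.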
## Building from Source
If you prefer to build from source:
```bash
# Clone the repository
git clone https://git.wisehodl.dev/jay/aicli.git
cd aicli
# Build
go build -o aicli
# Install
sudo mv aicli /usr/local/bin/
```
## Next Steps
See the [README.md](README.md) for usage instructions and examples.

LICENSE
MIT License
Copyright (c) 2025 nostr:npub10mtatsat7ph6rsq0w8u8npt8d86x4jfr2nqjnvld2439q6f8ugqq0x27hf
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md
# AICLI
A flexible command-line interface for interacting with LLM chat APIs.
AICLI provides a streamlined way to interact with language models from your terminal. Send prompts and files to chat models like OpenAI's GPT or local Ollama models, customize system prompts, and receive responses right in your terminal or save them to files.
## Features
- Query OpenAI-compatible APIs or Ollama models directly
- Send files as context with your prompts
- Customize system prompts
- Configure via environment variables, config files, or CLI flags
- Save responses to files
- Automatic model fallbacks if primary models fail
- Flexible input handling (stdin, files, direct prompts)
## Installation
### Pre-built Binaries
Download the latest binary for your platform from the [Releases](https://git.wisehodl.dev/jay/aicli/releases) page, then make it executable and move it somewhere on your PATH (Linux/macOS):
```bash
chmod +x aicli-linux-amd64
sudo mv aicli-linux-amd64 /usr/local/bin/aicli  # or any directory in your PATH
```
### Building from Source
Requires Go 1.23.5 or newer (see `go.mod`):
```bash
git clone https://git.wisehodl.dev/jay/aicli.git
cd aicli
go build -o aicli
```
## Configuration
AICLI can be configured in multiple ways, with the following precedence (highest to lowest):
1. Command-line flags
2. Environment variables
3. Config file
4. Default values
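The same order can be sketched in plain shell; `resolve` below is a hypothetical helper, not part of aicli, and simply returns the first non-empty source:

```bash
# First non-empty value wins: flag, then environment, then config file, then default
resolve() {
  flag="$1"; envval="$2"; fileval="$3"; default="$4"
  echo "${flag:-${envval:-${fileval:-$default}}}"
}
resolve "" "gpt-4o" "claude-3-sonnet" "gpt-4o-mini"   # env beats config file: gpt-4o
resolve "gpt-4.1-mini" "gpt-4o" "" "gpt-4o-mini"      # flag beats env: gpt-4.1-mini
```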
### API Key Setup
Set up your API key using one of these methods:
```bash
# Command line
aicli --key "your-api-key" ...
# Environment variable
export AICLI_API_KEY="your-api-key"
# Key file
echo "your-api-key" > ~/.aicli_key
export AICLI_API_KEY_FILE=~/.aicli_key
```
### Environment Variables
```bash
# API Configuration
export AICLI_API_KEY="your-api-key"
export AICLI_API_KEY_FILE="~/.aicli_key"
export AICLI_PROTOCOL="openai" # or "ollama"
export AICLI_URL="https://api.ppq.ai/chat/completions" # custom endpoint
# Model Selection
export AICLI_MODEL="gpt-4o-mini"
export AICLI_FALLBACK="gpt-4.1-mini,gpt-3.5-turbo"
# Prompts
export AICLI_SYSTEM="You are a helpful AI assistant."
export AICLI_DEFAULT_PROMPT="Analyze the following:"
# File Paths
export AICLI_CONFIG_FILE="~/.aicli.yaml"
export AICLI_PROMPT_FILE="~/prompts/default.txt"
export AICLI_SYSTEM_FILE="~/prompts/system.txt"
```
### Config File (YAML)
Create a YAML config file (e.g., `~/.aicli.yaml`):
```yaml
protocol: openai
url: https://api.ppq.ai/chat/completions
model: gpt-4o-mini
fallback: gpt-4.1-mini,gpt-3.5-turbo
key_file: ~/.aicli_key
system_file: ~/prompts/system.txt
```
## Basic Usage
### Simple Queries
```bash
# Direct question
aicli -p "Explain quantum computing in simple terms"
# Using stdin
echo "What is the capital of France?" | aicli
# Save response to file
aicli -p "Write a short poem about coding" -o poem.txt
```
### Working with Files
```bash
# Analyze a code file
aicli -f main.go -p "Review this code for bugs and improvements"
# Analyze multiple files
aicli -f main.go -f utils.go -p "Explain how these files work together"
# Using stdin as a file
cat log.txt | aicli -F -p "Find problems in this log"
# Combining stdin file with regular files
cat log.txt | aicli -F -f config.json -p "Find problems in this log and config"
```
### Customizing Prompts
```bash
# Multiple prompt sections
aicli -p "Analyze this data:" -p "Focus on trends over time:" -f data.csv
# Combining prompts from files and flags
aicli -pf prompt_template.txt -p "Apply this to the finance sector" -f report.txt
# Using system prompt
aicli -s "You are a security expert" -p "Review this code for vulnerabilities" -f app.js
```
### API Configuration
```bash
# Using Ollama with local model
aicli -l ollama -u http://localhost:11434/api/chat -m llama3 -p "Explain Docker"
# Custom OpenAI-compatible endpoint
aicli -u https://api.company.ai/v1/chat/completions -p "Generate a marketing slogan"
# With fallback models
aicli -m claude-3-opus -b claude-3-sonnet,gpt-4o -p "Write a complex algorithm"
```
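Fallback models are tried strictly in order until one succeeds. Conceptually, the loop looks like this (with `try_model` as a stand-in for the real API call; here it pretends the primary model is down):

```bash
try_model() {
  # Stand-in for an API request: simulate an outage for the primary model
  if [ "$1" = "claude-3-opus" ]; then return 1; fi
  echo "answer from $1"
}
for model in claude-3-opus claude-3-sonnet gpt-4o; do
  if out=$(try_model "$model"); then
    echo "$out"
    break
  fi
  echo "model $model failed, trying next..." >&2
done
```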
## Advanced Examples
### Code Review Workflow
```bash
# Review pull request changes
git diff main..feature-branch | aicli -p "Review these changes. Identify potential bugs and suggest improvements."
```
### Data Analysis
```bash
# Analyze CSV data
aicli -p "Analyze this CSV data and provide insights:" -f data.csv -o analysis.md
```
### Content Generation with Context
```bash
# Generate documentation with multiple context files
aicli -s "You are a technical writer creating clear documentation" \
-p "Create a README for this project" \
-f main.go -f api.go -f config.go -o README.md
```
### Translation Pipeline
```bash
# Translate text from a file
cat text.txt | aicli -p "Translate this text to Spanish" > text_es.txt
```
### Meeting Summarization
```bash
# Summarize meeting transcript
aicli -pf summarization_prompt.txt -f transcript.txt -o summary.md
```
## Tips and Tricks
### Creating Reusable Prompt Files
Store common prompt patterns in files:
```
# ~/prompts/code-review.txt
Review the following code:
1. Identify potential bugs or edge cases
2. Suggest performance improvements
3. Highlight security concerns
4. Recommend better patterns or practices
```
Then use them:
```bash
aicli -pf ~/prompts/code-review.txt -f main.go
```
### Environment Variable Shortcuts
Set up aliases with pre-configured environment variables:
```bash
# In your .bashrc or .zshrc
alias aicli-code="AICLI_SYSTEM_FILE=~/prompts/system-code.txt aicli"
alias aicli-creative="AICLI_SYSTEM='You are a creative writer' AICLI_MODEL=gpt-4o aicli"
```
### Combining with Other Tools
```bash
# Generate commit message from changes
git diff --staged | aicli -q -p "Generate a concise commit message for these changes:" | git commit -F -
# Analyze logs
grep ERROR /var/log/app.log | aicli -p "Identify patterns in these error logs"
```
## Full Command Reference
```
Usage: aicli [OPTION]... [FILE]...
Send files and prompts to LLM chat endpoints.
With no FILE, or when FILE is -, read standard input.
Global:
--version display version information and exit
Input:
-f, --file PATH input file (repeatable)
-F, --stdin-file treat stdin as file contents
-p, --prompt TEXT prompt text (repeatable, can be combined with --prompt-file)
-pf, --prompt-file PATH prompt from file (combined with any --prompt flags)
System:
-s, --system TEXT system prompt text
-sf, --system-file PATH system prompt from file
API:
-l, --protocol PROTO API protocol: openai, ollama (default: openai)
-u, --url URL API endpoint (default: https://api.ppq.ai/chat/completions)
-k, --key KEY API key (if present, --key-file is ignored)
-kf, --key-file PATH API key from file (used only if --key is not provided)
Models:
-m, --model NAME primary model (default: gpt-4o-mini)
-b, --fallback NAMES comma-separated fallback models (default: gpt-4.1-mini)
Output:
-o, --output PATH write to file instead of stdout
-q, --quiet suppress progress output
-v, --verbose enable debug logging
Config:
-c, --config PATH YAML config file
```
## License
MIT License - see LICENSE file for details.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

go.mod
module git.wisehodl.dev/jay/aicli
go 1.23.5
require gopkg.in/yaml.v3 v3.0.1

go.sum
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

main.go
package main
import (
"bytes"
"encoding/json"
"flag"
"fmt"
"io"
"net/http"
"os"
"strings"
"time"
"git.wisehodl.dev/jay/aicli/version"
"gopkg.in/yaml.v3"
)
const defaultPrompt = "Analyze the following:"
type stdinRole int
const (
stdinAsPrompt stdinRole = iota
stdinAsPrefixedContent
stdinAsFile
)
type Config struct {
Protocol string
URL string
Key string
Model string
Fallbacks []string
SystemText string
PromptText string
Files []FileData
OutputPath string
Quiet bool
Verbose bool
}
type FileData struct {
Path string
Content string
}
type flagValues struct {
files []string
prompts []string
promptFile string
system string
systemFile string
key string
keyFile string
protocol string
url string
model string
fallback string
output string
config string
stdinFile bool
quiet bool
verbose bool
showVersion bool
}
const usageText = `Usage: aicli [OPTION]... [FILE]...
Send files and prompts to LLM chat endpoints.
With no FILE, or when FILE is -, read standard input.
Global:
--version display version information and exit
Input:
-f, --file PATH input file (repeatable)
-F, --stdin-file treat stdin as file contents
-p, --prompt TEXT prompt text (repeatable, can be combined with --prompt-file)
-pf, --prompt-file PATH prompt from file (combined with any --prompt flags)
System:
-s, --system TEXT system prompt text
-sf, --system-file PATH system prompt from file
API:
-l, --protocol PROTO API protocol: openai, ollama (default: openai)
-u, --url URL API endpoint (default: https://api.ppq.ai/chat/completions)
-k, --key KEY API key (if present, --key-file is ignored)
-kf, --key-file PATH API key from file (used only if --key is not provided)
Models:
-m, --model NAME primary model (default: gpt-4o-mini)
-b, --fallback NAMES comma-separated fallback models (default: gpt-4.1-mini)
Output:
-o, --output PATH write to file instead of stdout
-q, --quiet suppress progress output
-v, --verbose enable debug logging
Config:
-c, --config PATH YAML config file
Environment variables:
AICLI_API_KEY API key
AICLI_API_KEY_FILE Path to file containing API key (used only if AICLI_API_KEY is not set)
AICLI_PROTOCOL API protocol
AICLI_URL API endpoint
AICLI_MODEL primary model
AICLI_FALLBACK fallback models
AICLI_SYSTEM system prompt
AICLI_DEFAULT_PROMPT default prompt override
AICLI_CONFIG_FILE Path to config file
AICLI_PROMPT_FILE Path to prompt file
AICLI_SYSTEM_FILE Path to system file
API Key precedence: --key flag > --key-file flag > AICLI_API_KEY > AICLI_API_KEY_FILE > config file
Examples:
echo "What is Rust?" | aicli
cat file.txt | aicli -F -p "Analyze this file"
aicli -f main.go -p "Review this code"
aicli -c ~/.aicli.yaml -f src/*.go -o analysis.md
aicli -p "First prompt" -pf prompt.txt -p "Last prompt"
`
func printUsage() {
fmt.Fprint(os.Stderr, usageText)
}
type fileList []string
func (f *fileList) String() string {
return strings.Join(*f, ", ")
}
func (f *fileList) Set(value string) error {
*f = append(*f, value)
return nil
}
type promptList []string
func (p *promptList) String() string {
return strings.Join(*p, "\n")
}
func (p *promptList) Set(value string) error {
*p = append(*p, value)
return nil
}
func main() {
if err := run(); err != nil {
fmt.Fprintf(os.Stderr, "error: %v\n", err)
os.Exit(1)
}
}
func run() error {
// Check for verbose flag early
verbose := false
for _, arg := range os.Args {
if arg == "-v" || arg == "--verbose" {
verbose = true
break
}
}
// Check for config file in environment variable before parsing flags
configFilePath := os.Getenv("AICLI_CONFIG_FILE")
flags := parseFlags()
if flags.showVersion {
fmt.Printf("aicli %s\n", version.GetVersion())
return nil
}
if flags.config == "" && configFilePath != "" {
flags.config = configFilePath
}
envVals := loadEnvVars(verbose)
fileVals, err := loadConfigFile(flags.config)
if err != nil {
return err
}
merged := mergeConfigSources(verbose, flags, envVals, fileVals)
if err := validateConfig(merged); err != nil {
return err
}
if promptFilePath := os.Getenv("AICLI_PROMPT_FILE"); promptFilePath != "" && flags.promptFile == "" {
content, err := os.ReadFile(promptFilePath)
if err != nil {
if verbose {
fmt.Fprintf(os.Stderr, "[verbose] Failed to read AICLI_PROMPT_FILE at %s: %v\n", promptFilePath, err)
}
} else {
merged.PromptText = string(content)
}
}
if systemFilePath := os.Getenv("AICLI_SYSTEM_FILE"); systemFilePath != "" && flags.systemFile == "" && flags.system == "" {
content, err := os.ReadFile(systemFilePath)
if err != nil {
if verbose {
fmt.Fprintf(os.Stderr, "[verbose] Failed to read AICLI_SYSTEM_FILE at %s: %v\n", systemFilePath, err)
}
} else {
merged.SystemText = string(content)
}
}
stdinContent, hasStdin := detectStdin()
role := determineStdinRole(flags, hasStdin)
inputData, err := resolveInputStreams(merged, stdinContent, hasStdin, role, flags)
if err != nil {
return err
}
config := buildCompletePrompt(inputData)
if config.Verbose {
logVerbose("Configuration resolved", config)
}
startTime := time.Now()
response, usedModel, err := sendChatRequest(config)
duration := time.Since(startTime)
if err != nil {
return err
}
return writeOutput(response, usedModel, duration, config)
}
func parseFlags() flagValues {
fv := flagValues{}
var files fileList
var prompts promptList
flag.Usage = printUsage
flag.Var(&files, "f", "")
flag.Var(&files, "file", "")
flag.Var(&prompts, "p", "")
flag.Var(&prompts, "prompt", "")
flag.StringVar(&fv.promptFile, "pf", "", "")
flag.StringVar(&fv.promptFile, "prompt-file", "", "")
flag.StringVar(&fv.system, "s", "", "")
flag.StringVar(&fv.system, "system", "", "")
flag.StringVar(&fv.systemFile, "sf", "", "")
flag.StringVar(&fv.systemFile, "system-file", "", "")
flag.StringVar(&fv.key, "k", "", "")
flag.StringVar(&fv.key, "key", "", "")
flag.StringVar(&fv.keyFile, "kf", "", "")
flag.StringVar(&fv.keyFile, "key-file", "", "")
flag.StringVar(&fv.protocol, "l", "", "")
flag.StringVar(&fv.protocol, "protocol", "", "")
flag.StringVar(&fv.url, "u", "", "")
flag.StringVar(&fv.url, "url", "", "")
flag.StringVar(&fv.model, "m", "", "")
flag.StringVar(&fv.model, "model", "", "")
flag.StringVar(&fv.fallback, "b", "", "")
flag.StringVar(&fv.fallback, "fallback", "", "")
flag.StringVar(&fv.output, "o", "", "")
flag.StringVar(&fv.output, "output", "", "")
flag.StringVar(&fv.config, "c", "", "")
flag.StringVar(&fv.config, "config", "", "")
flag.BoolVar(&fv.stdinFile, "F", false, "")
flag.BoolVar(&fv.stdinFile, "stdin-file", false, "")
flag.BoolVar(&fv.quiet, "q", false, "")
flag.BoolVar(&fv.quiet, "quiet", false, "")
flag.BoolVar(&fv.verbose, "v", false, "")
flag.BoolVar(&fv.verbose, "verbose", false, "")
flag.BoolVar(&fv.showVersion, "version", false, "")
flag.Parse()
fv.files = files
// The usage text promises positional FILE arguments; include them as input files
fv.files = append(fv.files, flag.Args()...)
fv.prompts = prompts
return fv
}
func loadEnvVars(verbose bool) map[string]string {
env := make(map[string]string)
if val := os.Getenv("AICLI_PROTOCOL"); val != "" {
env["protocol"] = val
}
if val := os.Getenv("AICLI_URL"); val != "" {
env["url"] = val
}
if val := os.Getenv("AICLI_API_KEY"); val != "" {
env["key"] = val
}
if env["key"] == "" {
if val := os.Getenv("AICLI_API_KEY_FILE"); val != "" {
content, err := os.ReadFile(val)
if err != nil {
if verbose {
fmt.Fprintf(os.Stderr, "[verbose] Failed to read AICLI_API_KEY_FILE at %s: %v\n", val, err)
}
} else {
env["key"] = strings.TrimSpace(string(content))
}
}
}
if val := os.Getenv("AICLI_MODEL"); val != "" {
env["model"] = val
}
if val := os.Getenv("AICLI_FALLBACK"); val != "" {
env["fallback"] = val
}
if val := os.Getenv("AICLI_SYSTEM"); val != "" {
env["system"] = val
}
if val := os.Getenv("AICLI_DEFAULT_PROMPT"); val != "" {
env["prompt"] = val
}
return env
}
func loadConfigFile(path string) (map[string]interface{}, error) {
if path == "" {
return nil, nil
}
data, err := os.ReadFile(path)
if err != nil {
return nil, fmt.Errorf("read config file: %w", err)
}
var config map[string]interface{}
if err := yaml.Unmarshal(data, &config); err != nil {
return nil, fmt.Errorf("parse config file: %w", err)
}
return config, nil
}
func mergeConfigSources(verbose bool, flags flagValues, env map[string]string, file map[string]interface{}) Config {
cfg := Config{
Protocol: "openai",
URL: "https://api.ppq.ai/chat/completions",
Model: "gpt-4o-mini",
Fallbacks: []string{"gpt-4.1-mini"},
Quiet: flags.quiet,
Verbose: flags.verbose,
}
// Config file values apply first, so environment variables can override them,
// matching the documented precedence: flags > env > config file > defaults.
if file != nil {
if v, ok := file["protocol"].(string); ok {
cfg.Protocol = v
}
if v, ok := file["url"].(string); ok {
cfg.URL = v
}
if v, ok := file["model"].(string); ok {
cfg.Model = v
}
if v, ok := file["fallback"].(string); ok {
cfg.Fallbacks = strings.Split(v, ",")
}
if v, ok := file["system_file"].(string); ok {
content, err := os.ReadFile(v)
if err != nil {
if verbose {
fmt.Fprintf(os.Stderr, "[verbose] Failed to read system_file at %s: %v\n", v, err)
}
} else {
cfg.SystemText = string(content)
}
}
if v, ok := file["key_file"].(string); ok {
content, err := os.ReadFile(v)
if err != nil {
if verbose {
fmt.Fprintf(os.Stderr, "[verbose] Failed to read key_file at %s: %v\n", v, err)
}
} else {
cfg.Key = strings.TrimSpace(string(content))
}
}
}
// Environment variables override the config file
if env["protocol"] != "" {
cfg.Protocol = env["protocol"]
}
if env["url"] != "" {
cfg.URL = env["url"]
}
if env["key"] != "" {
cfg.Key = env["key"]
}
if env["model"] != "" {
cfg.Model = env["model"]
}
if env["fallback"] != "" {
cfg.Fallbacks = strings.Split(env["fallback"], ",")
}
if env["system"] != "" {
cfg.SystemText = env["system"]
}
if flags.protocol != "" {
cfg.Protocol = flags.protocol
}
if flags.url != "" {
cfg.URL = flags.url
}
if flags.model != "" {
cfg.Model = flags.model
}
if flags.fallback != "" {
cfg.Fallbacks = strings.Split(flags.fallback, ",")
}
if flags.system != "" {
cfg.SystemText = flags.system
}
if flags.systemFile != "" {
content, err := os.ReadFile(flags.systemFile)
if err != nil {
if cfg.Verbose {
fmt.Fprintf(os.Stderr, "[verbose] Failed to read system file at %s: %v\n", flags.systemFile, err)
}
} else {
cfg.SystemText = string(content)
}
}
if flags.key != "" {
cfg.Key = flags.key
} else if flags.keyFile != "" {
content, err := os.ReadFile(flags.keyFile)
if err != nil {
if cfg.Verbose {
fmt.Fprintf(os.Stderr, "[verbose] Failed to read key file at %s: %v\n", flags.keyFile, err)
}
} else {
cfg.Key = strings.TrimSpace(string(content))
}
}
if flags.output != "" {
cfg.OutputPath = flags.output
}
return cfg
}
func validateConfig(cfg Config) error {
if cfg.Key == "" {
return fmt.Errorf("API key required: use --key, --key-file, AICLI_API_KEY, AICLI_API_KEY_FILE, or key_file in config")
}
if cfg.Protocol != "openai" && cfg.Protocol != "ollama" {
return fmt.Errorf("protocol must be 'openai' or 'ollama', got: %s", cfg.Protocol)
}
return nil
}
func detectStdin() (string, bool) {
stat, err := os.Stdin.Stat()
if err != nil {
return "", false
}
if (stat.Mode() & os.ModeCharDevice) != 0 {
return "", false
}
content, err := io.ReadAll(os.Stdin)
if err != nil {
return "", false
}
return string(content), true
}
func determineStdinRole(flags flagValues, hasStdin bool) stdinRole {
if !hasStdin {
return stdinAsPrompt
}
if flags.stdinFile {
return stdinAsFile
}
hasExplicitPrompt := len(flags.prompts) > 0 || flags.promptFile != ""
if hasExplicitPrompt {
return stdinAsPrefixedContent
}
return stdinAsPrompt
}
func resolveInputStreams(cfg Config, stdinContent string, hasStdin bool, role stdinRole, flags flagValues) (Config, error) {
hasPromptFlag := len(flags.prompts) > 0 || flags.promptFile != ""
hasFileFlag := len(flags.files) > 0
// Handle case where only stdin as file is provided
if !hasPromptFlag && !hasFileFlag && hasStdin && flags.stdinFile {
cfg.Files = append(cfg.Files, FileData{Path: "input", Content: stdinContent})
return cfg, nil
}
if !hasStdin && !hasFileFlag && !hasPromptFlag {
return cfg, fmt.Errorf("no input provided: supply stdin, --file, or --prompt")
}
for _, path := range flags.files {
if path == "" {
return cfg, fmt.Errorf("empty file path provided")
}
}
if flags.system != "" && flags.systemFile != "" {
return cfg, fmt.Errorf("cannot use both --system and --system-file")
}
if len(flags.prompts) > 0 {
cfg.PromptText = strings.Join(flags.prompts, "\n")
}
if flags.promptFile != "" {
content, err := os.ReadFile(flags.promptFile)
if err != nil {
return cfg, fmt.Errorf("read prompt file: %w", err)
}
if cfg.PromptText != "" {
cfg.PromptText += "\n\n" + string(content)
} else {
cfg.PromptText = string(content)
}
}
if hasStdin {
switch role {
case stdinAsPrompt:
cfg.PromptText = stdinContent
case stdinAsPrefixedContent:
if cfg.PromptText != "" {
cfg.PromptText += "\n\n" + stdinContent
} else {
cfg.PromptText = stdinContent
}
case stdinAsFile:
cfg.Files = append(cfg.Files, FileData{Path: "input", Content: stdinContent})
}
}
for _, path := range flags.files {
content, err := os.ReadFile(path)
if err != nil {
return cfg, fmt.Errorf("read file %s: %w", path, err)
}
cfg.Files = append(cfg.Files, FileData{Path: path, Content: string(content)})
}
return cfg, nil
}
func buildCompletePrompt(inputData Config) Config {
result := inputData
promptParts := []string{}
// Use inputData's prompt if set, otherwise check for overrides
if inputData.PromptText != "" {
promptParts = append(promptParts, inputData.PromptText)
} else if override := os.Getenv("AICLI_DEFAULT_PROMPT"); override != "" {
promptParts = append(promptParts, override)
} else if len(inputData.Files) > 0 {
promptParts = append(promptParts, defaultPrompt)
}
// Format files if present
if len(inputData.Files) > 0 {
fileSection := formatFiles(inputData.Files)
if len(promptParts) > 0 {
promptParts = append(promptParts, "", fileSection)
} else {
promptParts = append(promptParts, fileSection)
}
}
result.PromptText = strings.Join(promptParts, "\n")
return result
}
func formatFiles(files []FileData) string {
var buf strings.Builder
for i, f := range files {
if i > 0 {
buf.WriteString("\n\n")
}
buf.WriteString(fmt.Sprintf("File: %s\n\n```\n%s\n```", f.Path, f.Content))
}
return buf.String()
}
func sendChatRequest(cfg Config) (string, string, error) {
models := append([]string{cfg.Model}, cfg.Fallbacks...)
for i, model := range models {
if !cfg.Quiet && i > 0 {
fmt.Fprintf(os.Stderr, "Trying fallback model %s...\n", model)
}
response, err := tryModel(cfg, model)
if err == nil {
return response, model, nil
}
if !cfg.Quiet {
fmt.Fprintf(os.Stderr, "Model %s failed: %v\n", model, err)
}
}
return "", "", fmt.Errorf("all models failed")
}
func tryModel(cfg Config, model string) (string, error) {
payload := buildPayload(cfg, model)
body, err := json.Marshal(payload)
if err != nil {
return "", fmt.Errorf("marshal payload: %w", err)
}
if cfg.Verbose {
fmt.Fprintf(os.Stderr, "Request payload: %s\n", string(body))
}
req, err := http.NewRequest("POST", cfg.URL, bytes.NewReader(body))
if err != nil {
return "", fmt.Errorf("create request: %w", err)
}
req.Header.Set("Content-Type", "application/json")
req.Header.Set("Authorization", fmt.Sprintf("Bearer %s", cfg.Key))
client := &http.Client{Timeout: 5 * time.Minute}
resp, err := client.Do(req)
if err != nil {
return "", fmt.Errorf("execute request: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
bodyBytes, _ := io.ReadAll(resp.Body)
return "", fmt.Errorf("HTTP %d: %s", resp.StatusCode, string(bodyBytes))
}
respBody, err := io.ReadAll(resp.Body)
if err != nil {
return "", fmt.Errorf("read response: %w", err)
}
if cfg.Verbose {
fmt.Fprintf(os.Stderr, "Response: %s\n", string(respBody))
}
return parseResponse(respBody, cfg.Protocol)
}
func buildPayload(cfg Config, model string) map[string]interface{} {
if cfg.Protocol == "ollama" {
payload := map[string]interface{}{
"model": model,
"prompt": cfg.PromptText,
"stream": false,
}
if cfg.SystemText != "" {
payload["system"] = cfg.SystemText
}
return payload
}
messages := []map[string]string{}
if cfg.SystemText != "" {
messages = append(messages, map[string]string{
"role": "system",
"content": cfg.SystemText,
})
}
messages = append(messages, map[string]string{
"role": "user",
"content": cfg.PromptText,
})
return map[string]interface{}{
"model": model,
"messages": messages,
}
}
func parseResponse(body []byte, protocol string) (string, error) {
var result map[string]interface{}
if err := json.Unmarshal(body, &result); err != nil {
return "", fmt.Errorf("parse response: %w", err)
}
if protocol == "ollama" {
if response, ok := result["response"].(string); ok {
return response, nil
}
return "", fmt.Errorf("no response field in ollama response")
}
choices, ok := result["choices"].([]interface{})
if !ok || len(choices) == 0 {
return "", fmt.Errorf("no choices in response")
}
firstChoice, ok := choices[0].(map[string]interface{})
if !ok {
return "", fmt.Errorf("invalid choice format")
}
message, ok := firstChoice["message"].(map[string]interface{})
if !ok {
return "", fmt.Errorf("no message in choice")
}
content, ok := message["content"].(string)
if !ok {
return "", fmt.Errorf("no content in message")
}
return content, nil
}
func writeOutput(response, model string, duration time.Duration, cfg Config) error {
if cfg.OutputPath == "" {
if !cfg.Quiet {
fmt.Println("--- aicli ---")
fmt.Println()
fmt.Printf("Used model: %s\n", model)
fmt.Printf("Query duration: %.1fs\n", duration.Seconds())
fmt.Println()
fmt.Println("--- response ---")
fmt.Println()
}
fmt.Println(response)
return nil
}
if err := os.WriteFile(cfg.OutputPath, []byte(response), 0644); err != nil {
return fmt.Errorf("write output file: %w", err)
}
if !cfg.Quiet {
fmt.Printf("Used model: %s\n", model)
fmt.Printf("Query duration: %.1fs\n", duration.Seconds())
fmt.Printf("Wrote response to: %s\n", cfg.OutputPath)
}
return nil
}
func logVerbose(msg string, cfg Config) {
fmt.Fprintf(os.Stderr, "[verbose] %s\n", msg)
fmt.Fprintf(os.Stderr, " Protocol: %s\n", cfg.Protocol)
fmt.Fprintf(os.Stderr, " URL: %s\n", cfg.URL)
fmt.Fprintf(os.Stderr, " Model: %s\n", cfg.Model)
fmt.Fprintf(os.Stderr, " Fallbacks: %v\n", cfg.Fallbacks)
fmt.Fprintf(os.Stderr, " Files: %d\n", len(cfg.Files))
}

sample-config.yml
# AICLI Sample Configuration
# Save this file and specify its path with --config flag or AICLI_CONFIG_FILE
# environment variable
# API Configuration
protocol: openai # API protocol: openai or ollama
url: https://api.ppq.ai/chat/completions # API endpoint URL
key_file: ~/.aicli_key # Path to file containing your API key
# Model Configuration
model: gpt-4o-mini # Primary model to use
fallback: gpt-4.1-mini,o3 # Comma-separated fallback models
# Prompt Configuration
system_file: ~/.aicli_system # Path to file containing system prompt

scripts/build.sh
#!/usr/bin/env bash
set -euo pipefail
# Build script for AICLI
# This script builds binaries for multiple platforms and architectures
# Set default values if not provided by environment
: "${VERSION:="dev"}"
: "${PACKAGE:="git.wisehodl.dev/jay/aicli"}"
: "${DATE:=$(date -u +"%Y-%m-%dT%H:%M:%SZ")}"
: "${COMMIT:=$(git rev-parse --short HEAD 2>/dev/null || echo "unknown")}"
# Output directories
DIST_DIR="$(pwd)/dist"
mkdir -p "$DIST_DIR"
# Build info
LDFLAGS="-s -w -X '${PACKAGE}/version.Version=${VERSION}' -X '${PACKAGE}/version.CommitHash=${COMMIT}' -X '${PACKAGE}/version.BuildDate=${DATE}'"
echo "Building AICLI version ${VERSION} (${COMMIT}) built at ${DATE}"
TARGETS=(
"linux/amd64"
"linux/arm64"
"linux/386"
"linux/arm/7" # ARMv7
"linux/arm/6" # ARMv6
"darwin/amd64"
"darwin/arm64"
"windows/amd64"
"windows/386"
"freebsd/amd64"
"openbsd/amd64"
"netbsd/amd64"
"solaris/amd64"
)
# Build all targets
for target in "${TARGETS[@]}"; do
os=$(echo "$target" | cut -d/ -f1)
arch=$(echo "$target" | cut -d/ -f2)
# The optional third field selects the ARM variant (e.g. linux/arm/7)
arm_version=$(echo "$target" | cut -d/ -f3)
if [[ -n "$arm_version" ]]; then
echo "Building for ${os}/${arch} (ARM version ${arm_version})"
else
echo "Building for ${os}/${arch}"
fi
# Set output filename
if [[ "$os" == "windows" ]]; then
output="${DIST_DIR}/aicli-${os}-${arch}.exe"
elif [[ -n "$arm_version" ]]; then
output="${DIST_DIR}/aicli-${os}-armv${arm_version}"
else
output="${DIST_DIR}/aicli-${os}-${arch}"
fi
# Set GOOS, GOARCH, and GOARM
export GOOS=$os
export GOARCH=$arch
if [[ -n "$arm_version" ]]; then
export GOARM=$arm_version
else
unset GOARM
fi
# Build the binary
echo "Building ${output}..."
go build -ldflags "${LDFLAGS}" -o "$output" .
# Make binary executable
if [[ "$os" != "windows" ]]; then
chmod +x "$output"
fi
done
echo "All binaries built to ${DIST_DIR}"
# Generate checksums
echo "Generating checksums..."
(cd "$DIST_DIR" && sha256sum ./* >SHA256SUMS)
echo "Build complete! Binaries and checksums available in ${DIST_DIR}"

version/version.go
package version
// These variables are set by the build process using ldflags
var (
// Version is the semantic version of the release
Version = "dev"
// CommitHash is the git commit hash of the build
CommitHash = "unknown"
// BuildDate is the date when the binary was built
BuildDate = "unknown"
)
// GetVersion returns a formatted version string
func GetVersion() string {
return Version + " (" + CommitHash + ") built at " + BuildDate
}