# Configuration
All environment variables and configuration options for the ReductrAI binary.
## Core Settings
| Variable | Description | Default |
|---|---|---|
| REDUCTRAI_LICENSE (required) | Your license key | — |
| REDUCTRAI_PORT (optional) | Telemetry ingestion port | 8080 |
| REDUCTRAI_DATA_DIR (optional) | DuckDB storage location | ~/.reductrai |
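Since only the license key is required, a quick sketch of how the defaults from the table resolve (the license value is a placeholder; this is not a built-in ReductrAI command):

```shell
# Export the one required variable; the others fall back to the
# documented defaults when unset.
export REDUCTRAI_LICENSE="RF-xxx-xxx-xxx"

# Parameter expansion with :- applies the default from the table above.
echo "port=${REDUCTRAI_PORT:-8080} data_dir=${REDUCTRAI_DATA_DIR:-$HOME/.reductrai}"
```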
## LLM Configuration
Configure which LLM provider to use for AI investigations. See LLM Providers for detailed setup guides.
| Variable | Description | Default |
|---|---|---|
| LLM_ENDPOINT (optional) | LLM API endpoint (OpenAI-compatible) | http://localhost:11434/v1 |
| LLM_API_KEY (optional) | API key for LLM provider | — |
| LLM_MODEL (optional) | Model name to use | llama3.2 |
All LLM inference runs locally (BitNet or Ollama). No external AI services required.
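One way to confirm the local endpoint is up before enabling investigations; a sketch that assumes the default Ollama endpoint and its OpenAI-compatible /models route, not a ReductrAI feature:

```shell
# Resolve the endpoint the same way the table above does.
LLM_ENDPOINT="${LLM_ENDPOINT:-http://localhost:11434/v1}"

# -s silences progress output, -f makes curl exit nonzero on HTTP errors.
if curl -sf "${LLM_ENDPOINT}/models" >/dev/null; then
  echo "LLM endpoint reachable at ${LLM_ENDPOINT}"
else
  echo "LLM endpoint not reachable at ${LLM_ENDPOINT}"
fi
```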
## Autonomy Levels
Autonomy level is configured in the dashboard:
| Level | Behavior | Availability |
|---|---|---|
| Approve | AI investigates and proposes remediation, human approves (DEFAULT) | All tiers |
| Auto | AI auto-resolves cleared anomalies only | BUSINESS+ |
All LLM-generated remediation actions require human approval. The "auto" level only auto-resolves issues that have already cleared.
## Storage & Retention
Data retention is automatic based on your license tier:
| Tier | Retention | Compression |
|---|---|---|
| FREE | 30 days | 91-95% automatic |
| PRO | 90 days | 91-95% automatic |
| BUSINESS | 180 days | 91-95% automatic |
| ENTERPRISE | 365 days | 91-95% automatic |
Data is automatically tiered: HOT (<1 hour) → WARM (compressed) → COLD (archived). Compression achieves 91-95% storage reduction.
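As a back-of-envelope check on the 91-95% figure: 100 GB of raw telemetry would land between 5 and 9 GB on disk. The numbers here are illustrative arithmetic, not measured output:

```shell
RAW_GB=100
# 95% reduction keeps 5% of the raw size; 91% keeps 9%.
LOW=$(awk "BEGIN{print $RAW_GB * 0.05}")
HIGH=$(awk "BEGIN{print $RAW_GB * 0.09}")
echo "${RAW_GB} GB raw -> ${LOW}-${HIGH} GB stored"
```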
## Example Configuration
```shell
# Core (required)
REDUCTRAI_LICENSE=RF-xxx-xxx-xxx

# Optional overrides
REDUCTRAI_PORT=8080
REDUCTRAI_DATA_DIR=~/.reductrai

# Self-hosted LLM (optional - for air-gapped deployments)
LLM_ENDPOINT=http://localhost:11434/v1
LLM_MODEL=llama3.2
```
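One way to apply a settings file like the one above; the .env filename and the source-then-export pattern are conventions assumed here, not something ReductrAI requires:

```shell
# Write the example settings to a file (illustrative values).
cat > .env <<'EOF'
REDUCTRAI_LICENSE=RF-xxx-xxx-xxx
REDUCTRAI_PORT=8080
EOF

# set -a exports every variable assigned while sourcing, so the
# agent process started afterwards inherits them.
set -a
. ./.env
set +a
echo "Configured port: ${REDUCTRAI_PORT}"
```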
## CLI Commands
```shell
# Start the agent
reductrai start

# Stop the agent
reductrai stop

# Check status
reductrai status

# Query local DuckDB
reductrai query "SELECT * FROM metrics LIMIT 10"

# Show database schema and example queries
reductrai schema

# Show version
reductrai version
```