This guide walks you through installing Opsh, completing the first-run setup, and running your first natural-language command. By the end, you will have a working Opsh session and understand how to review and confirm the commands it generates.
1. Install Opsh

Run the install script in your terminal:
curl -fsSL https://opsh.dxu.one/install.sh | bash
The installer downloads the latest Opsh binary for your platform (macOS or Linux, arm64 or x64), places it in ~/.opsh/bin, and adds an auto-start block to your shell configuration files.
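The exact contents of the auto-start block depend on your shell, but a snippet in a file like ~/.bashrc or ~/.zshrc would look roughly like the following. This is an illustrative sketch, not the literal lines the installer writes:

```
# Hypothetical auto-start block (the installer's actual lines may differ):
export PATH="$HOME/.opsh/bin:$PATH"                 # make the opsh binary findable
case $- in
  *i*) command -v opsh >/dev/null 2>&1 && opsh ;;   # launch only in interactive shells
esac
```

If Opsh ever misbehaves at shell startup, this block is what you would comment out to disable auto-start.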
2. Open a new terminal

Close your current terminal window and open a new one. The auto-start block in your shell rc file launches Opsh automatically when you open an interactive session. If Opsh does not start automatically, you can launch it manually:
opsh
3. Complete first-run setup

On the first launch, Opsh detects that no configuration exists and immediately starts the interactive setup. You will be asked four questions.

Enable warp mode? Warp mode lets Opsh auto-run commands it classifies as safe and gives you a plain-English summary of the result. You can change this later.
? Enable warp mode by default? Warp auto-runs safer commands and answers from the result. (y/N)
Require confirmation before execution? When warp mode is off, this controls whether Opsh asks you to confirm before running every command. The default is yes.
? Require confirmation before execution? (Y/n)
Choose an AI provider:
? Active LLM provider
❯ OpenRouter
  OpenAI API
  Anthropic API
  Gemini (Google AI API)
  Ollama (local)
Opsh supports OpenRouter, OpenAI, Anthropic, Gemini, and Ollama. If you want to run models locally without an API key, select Ollama (local); Ollama must already be installed and running on your machine.

Choose a model: After selecting a provider, Opsh shows the available models for that provider. For example, with OpenRouter selected:
? Model
❯ Auto Router        openrouter/auto          — OpenRouter picks a strong model automatically
  Claude Sonnet 4.5  anthropic/claude-...      — Strong default for coding and general use
  Claude Opus 4.5    anthropic/claude-...      — Highest-end Anthropic option
  GPT-5.1            openai/gpt-5.1            — Latest flagship GPT-5 family model
  Gemini 2.5 Flash   google/gemini-2.5-flash   — Fast price-performance option
Enter your API key (this prompt is not shown when you select Ollama):
? API key (stored in ~/.opsh/config.json) ************
Your API key is stored locally in ~/.opsh/config.json and is never sent anywhere other than your chosen provider’s API endpoint.
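For reference, the resulting file is plain JSON. A hypothetical shape is sketched below; the field names here are illustrative assumptions, so open the real ~/.opsh/config.json to see the actual schema:

```json
{
  "provider": "openrouter",
  "model": "openrouter/auto",
  "apiKey": "sk-...",
  "warpMode": false,
  "confirmBeforeRun": true
}
```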
4. Type your first natural-language request

After setup completes, Opsh drops you into its interactive REPL. Type a task in plain English:
~ > list all files modified in the last 24 hours
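For a request like the one above, the kind of command Opsh tends to propose is ordinary `find`. This is an illustrative guess at the generated command; the model may produce something different:

```shell
# Regular files under the current tree modified in the last 24 hours.
# -mtime -1 means "modified less than 1 day ago".
find . -type f -mtime -1
```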
You can also pass a one-shot request directly from your regular shell prompt:
opsh "show disk usage for each folder in the current directory"
5. Review the plan and confirm

Opsh shows you the generated command, a risk classification, and a brief explanation before running anything:
Command   find . -maxdepth 1 -type d -exec du -sh {} +
Risk      safe
Explain   Lists disk usage for each top-level directory in the current folder.

Run this command? [Y/n]
Press Enter or type y to run the command. Type n to cancel and try a different request.

Reconfigure at any time

To change your provider, model, API key, or other settings, run:
opsh --init
This re-runs the full interactive setup and saves the updated configuration to ~/.opsh/config.json.
Use opsh --print-only to preview generated commands without executing them. This is useful when you want to inspect what Opsh would run before committing to it, or when you want to copy the command and use it elsewhere.
opsh --print-only "compress all jpg files in this directory"