I Built an AI CLI That Lets You Chat With Your APIs

Tired of writing Postman test scripts? I built Octrafic – a CLI tool where you describe API tests in plain English and AI executes them. It’s open-source, runs locally, and supports multiple LLM providers.

Why I Built This

API testing sucks. You’re either:

  • Clicking through Postman manually (doesn’t scale)
  • Writing JavaScript test scripts (tedious boilerplate)
  • Using curl and forgetting the syntax every time

I wanted something faster: load your API spec, describe what you want tested, get results. The AI figures out the right HTTP requests, payloads, and validations.

How It Works

1. Install (single binary, no dependencies)

# Linux/macOS
curl -fsSL https://octrafic.com/install.sh | bash

# Homebrew
brew install octrafic/tap/octrafic

# Windows
iex (iwr -useb https://octrafic.com/install.ps1)

2. Configure your AI provider
Octrafic supports Claude, OpenAI, OpenRouter, Ollama, or llama.cpp. You bring your own API key – nothing goes through my servers.

# Run
octrafic

3. Load your API

octrafic -u https://api.example.com -s openapi.json -n "My API"

4. Start testing
The AI understands your API structure and generates appropriate requests from your natural-language descriptions.

Technical Stack

Go – Single binary distribution across Linux, macOS, Windows. No runtime dependencies.

Bubble Tea – Terminal UI framework for interactive chat interface. Arrow key navigation, command history, proper scrolling.

Multi-LLM Support – Built adapters for Claude, OpenAI, OpenRouter, Ollama, and llama.cpp. Switch providers without changing your workflow.

Real Use Cases

API Exploration – Point it at an unfamiliar API and ask questions: "What does this endpoint do?" "What parameters does it accept?" Faster than reading docs.

Quick Testing – "Test this endpoint with edge cases" generates multiple requests with boundary conditions, missing fields, and invalid data types.

Manual Testing Alternative – Instead of clicking through Postman, describe what you want tested. The AI executes and shows results.

Local Models – Privacy-focused? Run Ollama or llama.cpp locally. Your API data never leaves your machine.
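The edge-case generation described above can be illustrated with a small Go sketch: start from one valid payload and derive variants with each field dropped plus one with wrong types. The `edgeCases` helper is hypothetical, not Octrafic's actual logic — in the real tool the AI decides which variants to try:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// edgeCases takes a valid example payload and derives variants:
// each field dropped in turn, plus one with every type wrong.
func edgeCases(valid map[string]any) []map[string]any {
	var out []map[string]any
	for field := range valid {
		v := map[string]any{}
		for k, val := range valid {
			if k != field {
				v[k] = val
			}
		}
		out = append(out, v) // missing-field variant
	}
	wrong := map[string]any{}
	for k := range valid {
		wrong[k] = true // wrong-type variant: every field becomes a bool
	}
	return append(out, wrong)
}

func main() {
	valid := map[string]any{"name": "Ada", "age": 36}
	for _, v := range edgeCases(valid) {
		b, _ := json.Marshal(v)
		fmt.Println(string(b)) // one request body per variant
	}
}
```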

Current State

It works. I use it daily for API testing, and a few other developers are using it too.
That said, it's alpha software: some features are rough. But the core is solid, and I'm actively fixing issues.

GitHub repo: https://github.com/Octrafic/octrafic-cli
Website: https://octrafic.com
Try it. Open issues for bugs. PRs welcome if you want to contribute.
