AI-powered log analysis from your terminal
Stop staring at walls of text at 2am. Get instant answers.
One-time purchase
Pay once, own forever. No subscriptions.
Demo
Try it out
Free
- ✓ Basic analysis
- ✓ Ollama backend
Lite
Personal use
$29
- ✓ Basic analysis
- ✓ Ollama backend
- ✓ 2 devices
Pro
POPULAR
Full features
$69
- ✓ Everything in Lite
- ✓ OpenAI/Anthropic
- ✓ JSON/MD output
- ✓ 3 devices
COMING SOON
Team Pro
Enterprise
TBC
- ✓ Everything in Team Lite
- ✓ SSO
- ✓ Audit logging
Get product updates
New features, tips, and occasional discounts. No spam.
It's this easy...
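A rough sketch of a typical session. The log path is only an example, and piping into logtorque analyze - is the invocation shown elsewhere on this page:

$ tail -n 200 /var/log/nginx/error.log | logtorque analyze -
# logtorque reads the piped lines and replies in plain English:
# what broke, why, and what to try next.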
What you get
⚡
Instant analysis
Single binary, zero setup, works in moments
💬
Plain English
Not just "error detected" — explains why and what to do
🔒
Works offline
Local LLMs via Ollama, your logs never leave your machine
☁️
Cloud LLM support
Bring your own OpenAI or Anthropic API key
📁
Multiple formats
Nginx, Apache, Docker, Kubernetes, syslog, JSON logs
🔗
Pipe anything
kubectl logs pod | logtorque analyze -
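The same pipe pattern works with other sources. A quick sketch, where the container and service names are placeholders:

$ docker logs my-api 2>&1 | logtorque analyze -
$ journalctl -u nginx --since today | logtorque analyze -
$ kubectl logs deploy/web --tail=500 | logtorque analyze -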
FAQ
What log formats are supported?
Nginx, Apache, Docker, Kubernetes, syslog, and any text-based log file. JSON logs are also supported.
Do I need an API key?
No. You can use Ollama for free, local analysis. API keys are optional for cloud LLMs.
How does licensing work?
One purchase = one license, valid on up to 2 devices (Lite) or 3 devices (Pro). Team licenses include 5+ seats.
Can I use it offline?
Yes! With Ollama, everything runs locally. Your logs never leave your machine.
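A rough sketch of that offline setup, assuming a local Ollama install; the model name is only an example, and how logtorque selects the model is not shown here:

$ ollama pull llama3.2                                  # one-time model download
$ grep -i error /var/log/syslog | logtorque analyze -   # analysis runs against the local model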
What's the difference between tiers?
Demo is free with basic features. Lite adds more analysis depth. Pro adds cloud LLMs and output formats. Team adds watch mode and multiple seats.