LLMverify Documentation
Complete guides for the local-first LLM output verification toolkit. Zero network requests. Zero telemetry. v1.5.2.
$ npm install llmverify
Getting Started (5 min)
Installation, the setup wizard, and your first verification in 30 seconds.

API Reference
Complete programmatic API documentation for llmverify v1.5.2+.

CLI Reference
Command-line interface with presets, the setup wizard, doctor, and baseline management.

Server Mode (5 min)
Run llmverify as an HTTP server on port 9009 for IDE and tool integration; a rough client sketch follows this list.

Algorithms & Detection (10 min)
How pattern-based detection works, its accuracy figures, and the detection methodology.

Limitations (5 min)
What llmverify cannot do, false positive rates, and known constraints.
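The Server Mode guide covers running llmverify as a local HTTP service. As a rough sketch of what a client call could look like, assuming a JSON verification endpoint on port 9009 (the serve subcommand, the /verify path, and the payload and response shapes here are illustrative assumptions, not the documented contract):

// Start the server first, e.g. with something along the lines of:
//   npx llmverify serve   (hypothetical subcommand; see the CLI Reference)
// Then post content to the assumed /verify endpoint on port 9009.
const res = await fetch('http://localhost:9009/verify', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ content: 'Your AI output here' }),
});
const report = await res.json(); // assumed to mirror the programmatic verify() result
console.log(report);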
Quick Start
# Install
npm install llmverify
# Run the setup wizard
npx llmverify wizard
# Quick verification
npx llmverify run "Your AI output here" --preset dev
# Or use programmatically
import { verify, isInputSafe, redactPII } from 'llmverify';
const result = await verify({ content: aiOutput });
console.log(result.risk.level); // 'low' | 'moderate' | 'high' | 'critical'
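The Quick Start imports isInputSafe and redactPII but only calls verify. A minimal sketch of how the other two helpers might fit around it, assuming isInputSafe synchronously returns a boolean verdict and redactPII returns a redacted copy of the string; userPrompt and aiOutput stand in for your own data, and the actual signatures are in the API Reference:

import { verify, isInputSafe, redactPII } from 'llmverify';

// Hypothetical pre-check: stop early if the prompt itself looks unsafe.
if (!isInputSafe(userPrompt)) {
  throw new Error('Prompt failed the input safety check');
}

// Verify the model output and branch on the reported risk level.
const result = await verify({ content: aiOutput });
if (result.risk.level === 'high' || result.risk.level === 'critical') {
  // Hypothetical handling: redact PII before surfacing the flagged output.
  console.warn('Flagged output:', redactPII(aiOutput));
}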
Want to see llmverify in action? Try the Live Demo.