Add AI transparency to your workflow — track every line of code AI writes and every decision agents make. Pick your tool, install, and you're done.
VibeCheck is the validator, reporter, and attestation tool for the VIBES ecosystem. It validates your .ai-audit/ data, generates reports, signs cryptographic attestations, and computes risk scores.
A fast Rust binary: it validates VIBES data, generates reports, signs attestations, and scores risk. Clone the repo and build with Cargo.
View on GitHub →
Then run vibecheck init in any project to bootstrap a .ai-audit/ directory.
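A minimal build-from-source sketch. The repository URL below is a placeholder (use the one linked from the GitHub page), and the exact Cargo commands assume a standard Rust project layout:

```shell
# Clone the VibeCheck repository (URL is illustrative -- use the real one
# from the GitHub link above) and build it with Cargo.
git clone https://github.com/example/vibecheck.git
cd vibecheck
cargo build --release        # binary lands in target/release/
cargo install --path .       # optional: put vibecheck on your PATH

# Bootstrap VIBES tracking in any project
cd ~/my-project
vibecheck init
```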
VibeCheck validates and reports on VIBES data, but it doesn't generate that data. To produce .ai-audit/ records automatically, you need a VIBES integration for the AI tool you use. Each tool requires its own integration, installed separately.
A hook for Claude Code that automatically generates .ai-audit/ data as you work. Records code, decisions, and reasoning.
A VIBES integration for Google Gemini CLI. Tracks model variants, tool use, and file modifications with automatic annotations.
A VIBES plugin for OpenAI Codex CLI. Records AI provenance, sandbox execution, and shell commands alongside code changes.
Instrumentation for IDE-integrated tools is in development. Join the waitlist to be notified when support ships.
Don't see your tool? Check the full tooling page for the latest support, or ask your provider to add native VIBES support. Tool builders can use the Implementors Guide to add VIBES in hours.
Want the full VIBES ecosystem without any setup? Maestro comes with everything built in — tracking for code and agent decisions, verification, risk scoring, and governance — ready out of the box.
Maestro is a full implementation of the VIBES ecosystem. No manual instrumentation, no configuration — just start working and everything gets tracked automatically. Code generation, agent decisions, delegation chains — all of it.
Full audit trail + cryptographic signing + risk scoring + agent governance
Get started at runmaestro.ai →
Once your AI tool has a VIBES integration installed (or you're using Maestro), here's what happens in your project.
.ai-audit/ folder appears in your project
This is where all VIBES data lives. It's a normal folder in your repository — committed to git, versioned alongside your code, and portable across any system. Run vibecheck init to bootstrap it, or let your tool's VIBES integration create it automatically.
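To make the idea concrete, here is a purely hypothetical sketch of what a committed audit record could look like. The real layout and record schema are produced by vibecheck init or your tool's integration; the filename and JSON fields below are illustrative placeholders, not the actual VIBES format:

```shell
# Hypothetical illustration only -- in practice `vibecheck init` or your
# tool's integration creates and populates this directory for you.
mkdir -p .ai-audit
cat > .ai-audit/example-record.json <<'EOF'
{
  "tool": "example-ai-tool",
  "file": "src/main.rs",
  "action": "code_generated",
  "reasoning": "illustrative placeholder record"
}
EOF

# The directory is ordinary files, so it commits and diffs like code:
ls .ai-audit
```

Because the folder is plain files in the repository, audit data travels with every clone and shows up in normal git history and code review.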
The VIBES integration in your AI tool records what it writes and what it was asked. When an agent makes a decision — approving a PR, scaling infrastructure, modifying a pipeline — the integration captures the decision and reasoning. No manual work required.
vibecheck
Run vibecheck verify to validate your audit data, or vibecheck --format html to see a visual summary of all AI involvement — which models touched which files, risk scores, and coverage statistics.
Run vibecheck attest to cryptographically sign your audit data and submit it to the public registry. This proves your data hasn't been tampered with and creates a verifiable public record.
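Put together, the verify, report, and attest steps above form a short workflow. The command names are taken from this page; available flags and output will depend on your VibeCheck version:

```shell
# Validate the audit data in .ai-audit/
vibecheck verify

# Generate a visual HTML summary of AI involvement
vibecheck --format html

# Cryptographically sign the audit data and submit it
# to the public registry
vibecheck attest
```

Running verify before attest is the natural order: you sign and publish only data that has already passed validation.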
Show that your project tracks AI involvement. Add a VIBES badge to your README — it tells contributors and users that every AI action in your project is recorded and auditable.
Add this to your README.md:
See all badge options and customization → · Trust-tier attestation badges →
You've got everything you need. Pick a path and start tracking what AI does in your workflow today.
Install VibeCheck to validate, report on, and attest your VIBES audit data. Then add a VIBES integration for your AI tool when available.
Learn how VERIFY, PRISM, and EVOLVE extend VIBES with signing, risk scoring, and agent governance.
The full ecosystem →
Full VIBES ecosystem — VIBES, VERIFY, PRISM, and EVOLVE — built in and ready to go.
runmaestro.ai →