AI Verifications
Automatically review data and catch errors before they hit your system

Manual checks are no longer the only way to ensure clean, accurate data extraction. Our latest feature, AI Verifications, mimics the "manual review" step that teams often rely on to guarantee data quality.
What It Does
AI Verifications is built to replicate the logic of a human teammate: look over the extracted data and flag anything that seems off. It works in two key ways:
1. Generic Verification
After TableFlow extracts data from a file (PDF, CSV, Excel, etc.), it checks whether the output looks correct. Think of it as a second set of eyes: "This item number doesn't match the source file," or "This row might have been misread." It's a fast sanity check to catch common extraction issues.
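To make the "second set of eyes" idea concrete, here is a minimal sketch of the kind of generic sanity checks a verification pass might run over extracted rows. The function, the regex, and the sample data are illustrative assumptions, not TableFlow's actual internal checks:

```python
import re

def sanity_check(rows, expected_columns):
    """Flag common extraction issues: malformed rows and misread item numbers."""
    issues = []
    for i, row in enumerate(rows):
        # Flag rows with a different shape than the rest of the table.
        if len(row) != len(expected_columns):
            issues.append(f"row {i}: expected {len(expected_columns)} fields, got {len(row)}")
        # Flag item numbers that don't look like item numbers (assumed format).
        item = row[0] if row else ""
        if not re.fullmatch(r"[A-Z0-9-]+", str(item)):
            issues.append(f"row {i}: item number {item!r} looks misread")
    return issues

print(sanity_check([["AB-123", "Widget", 4], ["a?b", "Gadget"]], ["item", "name", "qty"]))
```

A clean table returns an empty list; anything flagged gets surfaced for review instead of silently passing through.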
2. Custom Verification
For more complex documents, you can define specific rules the extracted data should follow. For example, you can cross-check hundreds of extracted rows against total values that appear elsewhere in the PDF. If the totals don't match? We flag it before anything moves forward.
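The totals cross-check described above can be sketched in a few lines. This is a hypothetical rule, assuming the extracted line items and the document's stated total are already available as plain values:

```python
def verify_total(line_items, stated_total, tolerance=0.01):
    """Cross-check the sum of extracted row amounts against a total
    printed elsewhere in the document."""
    extracted_sum = sum(item["amount"] for item in line_items)
    if abs(extracted_sum - stated_total) > tolerance:
        return f"error: extracted rows sum to {extracted_sum:.2f}, document says {stated_total:.2f}"
    return None  # totals match; nothing to flag

items = [{"amount": 19.99}, {"amount": 5.00}, {"amount": 12.50}]
print(verify_total(items, 37.49))  # totals match, nothing flagged
print(verify_total(items, 40.00))  # mismatch gets flagged
```

The small tolerance absorbs rounding differences between the printed total and the extracted rows; an exact-match rule would also work for documents that never round.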
How It Works in Flows
AI Verifications is built right into Flows, our tool for building repeatable document workflows. Here's a simple setup:
1. Add an Extraction step
2. Add a Verification step
If verification passes, move forward. If it fails, TableFlow can flag the issue, stop the flow, and even auto-correct based on what it learns.
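The two-step flow above can be sketched as follows. The step functions here are stand-ins, not TableFlow's real API, and the auto-correct behavior is omitted for brevity:

```python
def run_flow(document, extract, verify):
    """Run an extraction step, then a verification step that can halt the flow."""
    data = extract(document)   # 1. Extraction step
    issues = verify(data)      # 2. Verification step
    if issues:
        # Verification failed: flag the issues and stop the flow.
        return {"status": "stopped", "issues": issues}
    # Verification passed: data moves forward.
    return {"status": "ok", "data": data}

result = run_flow(
    "invoice.pdf",
    extract=lambda doc: [{"item": "AB-123", "amount": 19.99}],
    verify=lambda rows: [] if rows else ["no rows extracted"],
)
print(result["status"])  # → "ok"
```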
Built for Real-World Ops
Docs don't always follow perfect rules—and we've designed AI Verifications to reflect that. A few other helpful features:
Verification Score: Get a clear, quantified view of how confident the system is in the extracted data.
Issues and Severities: Define rules with different levels: info, warning, or error. For example, if a summary total doesn't match your extracted table, that's an error—and the flow will stop until it's fixed.
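The severity tiers can be sketched as a simple threshold check. The three levels come from the feature description above; the rule shape and function names are assumptions for illustration:

```python
SEVERITY_ORDER = {"info": 0, "warning": 1, "error": 2}

def evaluate(issues, stop_at="error"):
    """Return (should_stop, issues): stop if any issue meets the threshold.

    `issues` is a list of (severity, message) pairs; info and warning
    pass through by default, while an error halts the flow.
    """
    threshold = SEVERITY_ORDER[stop_at]
    should_stop = any(SEVERITY_ORDER[sev] >= threshold for sev, _ in issues)
    return should_stop, issues

issues = [
    ("info", "header row skipped"),
    ("error", "summary total 40.00 != extracted total 37.49"),
]
stop, _ = evaluate(issues)
print(stop)  # → True
```

Stricter pipelines could lower `stop_at` to `"warning"` so that borderline issues also block the flow.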
Why It Matters
Your team shouldn't have to review every single extraction. AI Verifications gives you confidence that your extracted data is right, before it ever hits your system.