Estimate AI-Likeness with Explainable Metrics

Paste any excerpt, toggle academic or strict modes, and receive a probability score with highlighted passages and shareable reports.

Overview

AI-generated prose is everywhere, but blunt detection claims often lack transparency. Our checker surfaces the signals behind its confidence score so editors, educators, and compliance teams can weigh context before making decisions.

The analyser runs entirely in your browser. No text leaves your device, which is critical when reviewing embargoed briefs, student submissions, or confidential copy drafts.

Key features

  • Explainable scoring

    See burstiness, repetition, entropy, rare-word usage, and stopword ratios alongside the confidence score so you understand why the tool reached its result; a rough sketch of how a few of these signals are computed follows this list.

  • Highlight overlays

    View colour-coded highlights for repetitive segments or statistically flat passages so you can revise or request rewrites quickly.

  • Academic & strict modes

    Toggle heuristics tuned for formal writing or tighten thresholds when you need conservative outputs for compliance reviews.

  • Shareable exports

    Copy plain-language reports, export JSON audits, or share annotated HTML to discuss results with colleagues or learners.
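
For readers who want a sense of what the explainable metrics measure, here is a minimal TypeScript sketch of how a few of them (burstiness, stopword ratio, repetition) are commonly computed. The function names, stopword list, and formulas are illustrative assumptions, not the checker's actual implementation.

```ts
// Illustrative only: rough versions of a few of the signals named above.
// The checker's real formulas and weightings are internal to the tool.
const STOPWORDS = new Set(["the", "a", "an", "and", "or", "of", "to", "in", "is", "it"]);

function tokenize(text: string): string[] {
  return text.toLowerCase().match(/[a-z']+/g) ?? [];
}

// Burstiness: variation in sentence length. Human prose tends to mix short
// and long sentences; very uniform lengths produce a low value.
function burstiness(text: string): number {
  const lengths = text
    .split(/[.!?]+/)
    .map((s) => tokenize(s).length)
    .filter((n) => n > 0);
  if (lengths.length < 2) return 0;
  const mean = lengths.reduce((a, b) => a + b, 0) / lengths.length;
  const variance = lengths.reduce((a, b) => a + (b - mean) ** 2, 0) / lengths.length;
  return Math.sqrt(variance) / mean; // coefficient of variation
}

// Stopword ratio: share of tokens drawn from a list of very common words.
function stopwordRatio(tokens: string[]): number {
  return tokens.length === 0 ? 0 : tokens.filter((t) => STOPWORDS.has(t)).length / tokens.length;
}

// Repetition: fraction of word trigrams that appear more than once.
function repetition(tokens: string[]): number {
  const counts = new Map<string, number>();
  for (let i = 0; i + 2 < tokens.length; i++) {
    const tri = tokens.slice(i, i + 3).join(" ");
    counts.set(tri, (counts.get(tri) ?? 0) + 1);
  }
  if (counts.size === 0) return 0;
  return [...counts.values()].filter((n) => n > 1).length / counts.size;
}

const sample = "The report was generated quickly. The report was generated quickly. It was fine.";
const tokens = tokenize(sample);
console.log({
  burstiness: burstiness(sample),
  stopwordRatio: stopwordRatio(tokens),
  repetition: repetition(tokens),
});
```

The repetitive sample above scores low on burstiness and high on repetition, which is the kind of statistically flat pattern the highlight overlays point to.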

How it works

  1. Paste or type text

    Import text via paste, keyboard, or the sample button. Longer passages (300+ words) return more reliable signals.

  2. Choose analysis mode

    Enable academic or strict mode to adjust thresholds before running the analysis, especially for technical or highly structured writing; a hypothetical sketch of how such presets might differ follows this list.

  3. Run the analysis

    Click “Analyze” to generate the probability score, key metrics, and highlight overlays instantly on the right-hand panel.

  4. Review & export

    Copy the summary, export JSON for record keeping, or download highlighted HTML to annotate feedback inside your LMS or CMS.
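
The thresholds behind academic and strict modes are internal to the tool, but conceptually the toggle in step 2 swaps one set of cut-offs for another. The sketch below is a hypothetical illustration of that idea: the mode names match the UI, but every field name and number is an assumption, not the tool's real configuration.

```ts
// Hypothetical mode presets; the real thresholds are internal to the tool.
type AnalysisMode = "default" | "academic" | "strict";

interface Thresholds {
  minBurstiness: number; // below this, sentence lengths look suspiciously uniform
  maxRepetition: number; // above this, trigram reuse is flagged
  flagScore: number;     // overall probability at which a passage is highlighted
}

const MODE_PRESETS: Record<AnalysisMode, Thresholds> = {
  // Baseline cut-offs for general prose.
  default: { minBurstiness: 0.35, maxRepetition: 0.10, flagScore: 0.70 },
  // Formal writing is naturally more uniform, so tolerate flatter signals.
  academic: { minBurstiness: 0.25, maxRepetition: 0.15, flagScore: 0.75 },
  // Conservative settings for compliance reviews: flag earlier, tolerate less.
  strict: { minBurstiness: 0.45, maxRepetition: 0.08, flagScore: 0.60 },
};

function thresholdsFor(mode: AnalysisMode): Thresholds {
  return MODE_PRESETS[mode];
}
```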

Use cases

Editorial QA

Check contributed articles or sponsored posts for AI-heavy phrasing before publication and request further human revision where needed.

Academic integrity checks

Scan essays or dissertations to spot sections that mimic machine-written cadence and provide students with improvement guidance.

Compliance reviews

Document due diligence when verifying disclosure statements or regulated content, keeping exports on file for auditors.

Examples & tips

Audit a press release

Paste the announcement copy, review highlighted boilerplate, and share the HTML export with PR leads for manual refinement.

Assess community submissions

Queue user-generated stories, run strict mode, and archive the JSON output as part of your moderation notes.

Coach a student rewrite

Highlight repetitive phrases, discuss alternative structures, and encourage mixing sentence lengths to improve burstiness.

Pro tips

  • Combine the checker with the Word Counter to review readability before publishing.
  • Maintain a manual audit trail whenever AI detection influences policy or grading decisions.

Frequently asked questions

Is the score definitive?
No detection method is perfect. Treat the score as a signal and combine it with human judgement and other evidence.

Why is short text unreliable?
Short passages provide limited data for statistical analysis. For best results, analyse 300+ words.

Does the tool store submissions?
Nothing is uploaded or stored. Refreshing the page clears your text and results.

Can I automate this workflow?
Use the JSON export to feed results into your moderation systems or data pipelines.
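
As a sketch of what such a pipeline step could look like, the snippet below reads an exported audit and routes high-scoring items to manual review. The field names and threshold are assumptions for illustration, not a documented schema; adjust them to match the file you actually download.

```ts
import { readFileSync } from "node:fs";

// Hypothetical shape of an exported audit; these field names are assumptions,
// not a documented schema — check your own export before relying on them.
interface AuditExport {
  score: number; // overall AI-likeness probability, 0..1
  metrics: Record<string, number>;
  highlights: { start: number; end: number; reason: string }[];
}

// Read an exported audit and decide whether it needs human review.
const audit = JSON.parse(readFileSync("audit.json", "utf8")) as AuditExport;

const NEEDS_REVIEW = 0.7; // example threshold for a moderation queue
if (audit.score >= NEEDS_REVIEW) {
  console.log(`Flag for manual review: score ${audit.score.toFixed(2)}, ${audit.highlights.length} highlighted passages`);
} else {
  console.log(`No action needed: score ${audit.score.toFixed(2)}`);
}
```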

Resources & internal links

What's next?

Run your sample text above, export the report for documentation, and follow up with the Word Counter to adjust tone and readability before hitting publish.

Have feedback?

Found a bug or have an idea to improve this tool?