
Create a Custom AI Test Case

Write your own tests using natural language assertions.

Written by Jakka Pranav swaroop Naidu

When to write a custom test

Use a custom test when the built-in suites do not catch something specific to your brand, content, or compliance needs. Examples: "Every product page must mention free shipping," "No page should reference our competitor by name," "All blog posts must include an author bio."

Open the Custom Test form

Go to Quality Lab > Test Cases and click Add Custom Test. Fill in the fields:

  • Name: a short label that will appear in the issue title.

  • Prompt or Input: the assertion in plain English. Be specific about what should be true.

  • Expected Outcome: what passing looks like.

  • Run Scope: pick specific projects or All Projects (org-wide). Members with the Tester role can only select specific projects; Admins and Owners can select All Projects.

  • Category tags: optional tags that help you filter your test library later.
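Taken together, the form fields describe a simple record. A minimal sketch in Python, assuming a plain dictionary representation (the field names here are illustrative, not Jakka's actual schema):

```python
# Hypothetical sketch of a custom test definition. Field names mirror the
# form fields above but are illustrative only, not Jakka's real schema.
custom_test = {
    "name": "Free shipping mention",          # short label shown in the issue title
    "prompt": "Every product page must mention free shipping.",
    "expected_outcome": "Each product page contains an explicit free-shipping statement.",
    "run_scope": ["storefront"],              # specific projects; "All Projects" needs Admin/Owner
    "category_tags": ["brand", "shipping"],   # optional, used for filtering the library
}
```

The example assertion is one of the sample use cases from the top of this article.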

Judge Agent dry run

Before the test is enabled, Jakka runs a one-time Judge Agent verification. It checks four things:

  • Meaningful: the assertion is testable.

  • Relevant: it applies to web content.

  • Unique: it does not duplicate an existing built-in test.

  • Appropriate: it is well-scoped, not too broad or too narrow.

A passing dry run unlocks the Add to Suite button along with a token estimate per page. A failing dry run returns specific reasons and editing tips.
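The dry run behaves like a gate over the four checks: all must pass to unlock Add to Suite, and any failure comes back with a reason. A minimal sketch of that gate, assuming the Judge Agent returns a verdict and a reason per criterion (the function and data shapes are hypothetical):

```python
# Hypothetical sketch of the dry-run gate. All four Judge Agent checks
# must pass before "Add to Suite" unlocks; the verdict format is assumed,
# not Jakka's real output.
def dry_run(verdicts: dict) -> dict:
    # Collect the specific reasons for every failing check.
    failures = {name: reason for name, (ok, reason) in verdicts.items() if not ok}
    return {
        "passed": not failures,   # True unlocks the Add to Suite button
        "failures": failures,     # reasons and editing tips on failure
    }

result = dry_run({
    "meaningful": (True, ""),
    "relevant": (True, ""),
    "unique": (False, "Duplicates an existing built-in test."),
    "appropriate": (True, ""),
})
# One failing check is enough to block the test from being enabled.
```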

Versioning and edits

Each edit creates a new version (v1, v2, v3...), and authorship is tracked per version. Built-in Core and eCommerce tests are read-only.
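The versioning described above can be pictured as an append-only history. A simplified, hypothetical model (how Jakka actually stores versions is not documented here):

```python
# Simplified, hypothetical model of custom-test versioning: every edit
# appends a new version record and records who made the change.
class CustomTest:
    def __init__(self, name, prompt, author):
        self.name = name
        self.versions = [{"version": "v1", "prompt": prompt, "author": author}]

    def edit(self, new_prompt, author):
        n = len(self.versions) + 1
        self.versions.append({"version": f"v{n}", "prompt": new_prompt, "author": author})

bio_test = CustomTest("Author bio check",
                      "All blog posts must include an author bio.",
                      "alice")
bio_test.edit("All blog posts must include an author bio above the fold.", "bob")
# The history now holds v1 (alice) and v2 (bob); earlier versions are kept.
```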

Enable or disable

Use the status toggle to enable, disable, or pause a custom test without deleting it.
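The toggle above implies a status field with three states rather than deletion. A sketch of that idea (the states come from the toggle; the representation is an assumption):

```python
from enum import Enum

# Hypothetical model of the status toggle: the three states named above.
# Keeping status as a field means a test can be switched off or paused
# without deleting its definition or version history.
class TestStatus(Enum):
    ENABLED = "enabled"
    DISABLED = "disabled"
    PAUSED = "paused"

status = TestStatus.PAUSED  # the test still exists; it just does not run
```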
