Subcategory · AI Citation Index

Who AI is citing in CI/CD

Vercel leads CI/CD evaluation queries with a 71 avg score; no brand clears 50% shortlist coverage across the 15 prompts audited.

Brands tracked

50

Avg AEO score

66/100

Citation coverage

0%

of brands cited at least once

Dominant brands

0

cited in >50% of queries

Discovery stage

The shortlist

When buyers ask AI for the best CI/CD software

No discovery-stage prompts have been scored against this category yet. Once the shortlist cron runs, this section will surface which brands AI cites most.

Evaluation stage

The battleground

How brands fare on comparison queries · category median 15/100

Vercel leads head-to-head comparisons with a 71 avg evaluation score across 4 queries. Netlify trails at 41, IBM Terraform at 33, and IBM Business Automation Workflow at 30. The median evaluation score is 15, meaning most brands earn weak or zero citations in direct matchups. GitLab, TeamCity, and Render all scored 0 despite appearing in comparison queries.

Each brand's evaluation score averages how AI responds to head-to-head comparison queries that mention them. Above-median brands win their comparisons more often than they lose.
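In sketch form, the aggregation described above might look like the following. The brand names are real, but the per-query score lists are illustrative placeholders, not the audit's actual data.

```python
from statistics import median

# Hypothetical per-brand results: each entry lists the scores (0-100)
# a brand earned on the comparison queries that mention it.
eval_results = {
    "Vercel":  [71, 71, 71, 71],  # illustrative values only
    "Netlify": [41, 41, 41, 41],
    "GitLab":  [0, 0, 0, 0],
}

# A brand's evaluation score is the mean of its per-query scores.
brand_scores = {b: sum(s) / len(s) for b, s in eval_results.items()}

# The category median separates brands that tend to win their
# head-to-head matchups from those that tend to lose them.
category_median = median(brand_scores.values())

above_median = [b for b, s in brand_scores.items() if s > category_median]
```

With these placeholder inputs, Vercel averages 71 against a category median of 41, so it lands in the above-median group.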

Trends

Over the last 12 months

Too few audits to establish a trend: only two months of data exist (March and April 2026, with 8 and 19 prompts respectively). Vercel rose 15 points month over month; Render fell 5. The category average hovered near 66.

[Chart: category average AEO score by month, 2026-03 to 2026-04]

Editorial picks

Brands worth watching

FAQ

CI/CD questions, answered

Which CI/CD tool does AI cite most often in head-to-head comparisons?
Vercel leads with a 71 avg evaluation score across 4 comparison queries. Netlify is second at 41, and IBM Terraform third at 33. The median score is 15, so most brands earn weak or zero citation.
Is there a dominant CI/CD platform in AI-generated shortlists?
No. Zero brands achieved dominant shortlist presence across the 15 category prompts audited. The field is fragmented across 50 tracked tools, and even top scorers like Vercel and Jira Product Discovery (both 78 featured-candidate scores) lack consistent discovery citations.
How is GitLab performing in AI citations for CI/CD?
GitLab scored 72 as a featured candidate but 0 in evaluation queries (4 comparisons). It appears in discovery contexts but is ignored when users ask for direct tool matchups.
What is the best CI/CD software for small teams?
Vercel and Netlify lead evaluation citations in the category, with Vercel at 71 and Netlify at 41. Both also score 72+ as featured candidates. GitLab scores well in discovery (72) but drops to 0 in head-to-head queries.
Are CI/CD citation trends shifting toward newer platforms?
Too few audits to establish a trend: only data from March and April 2026 exists. Vercel gained 15 points month over month; Render fell 5. The category avg held near 66.
Why does TeamCity score zero in evaluation queries?
TeamCity appeared in 6 comparison queries and earned a 0 avg evaluation score on all of them, the largest zero-score sample in the category. It holds a 66 featured-candidate score, suggesting some discovery presence but no head-to-head traction.

Related

More in Developer Tools

Want to know if AI cites your brand for CI/CD?

Free audit. ChatGPT, Perplexity, Gemini, Claude.

Run an audit →

See the full CI/CD leaderboard →