Subcategory · AI Citation Index
Who AI is citing in Wiki & Docs
No brand owns AI shortlists in Wiki & Docs; Seismic Knowledge leads head-to-head comparisons at 52.
Brands tracked
36
Avg AEO score
60/100
Citation coverage
0%
of brands cited at least once
Dominant brands
0
cited in >50% of queries
Discovery stage
The shortlist
When buyers ask AI for the best Wiki & Docs software
No discovery-stage prompts have been scored against this category yet — once the shortlist cron runs, this section will surface which brands AI cites most.
Evaluation stage
The battleground
How brands fare on comparison queries · category median 38/100
Seismic Knowledge leads comparison queries with an average score of 52 across six evaluations. Notion follows at 48 (four queries), then M Files at 46 (five queries). The median evaluation score sits at 38, suggesting most brands struggle to differentiate in direct matchups.
Each brand's evaluation score averages how AI responds to head-to-head comparison queries that mention them. Above-median brands win their comparisons more often than they lose.
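As a minimal sketch of how these brand averages and the category median could be computed (the per-query scores below are made up for illustration; only the resulting averages match the figures above):

```python
from statistics import mean, median

# Hypothetical per-query evaluation scores (0-100) for comparison
# queries that mention each brand; values are illustrative only.
scores = {
    "Seismic Knowledge": [60, 55, 48, 50, 52, 47],  # six queries
    "Notion": [50, 46, 49, 47],                     # four queries
    "M Files": [44, 48, 46, 45, 47],                # five queries
}

# A brand's evaluation score is the mean of its query-level scores.
brand_scores = {brand: mean(s) for brand, s in scores.items()}

# The category median is taken across the brand-level scores.
category_median = median(brand_scores.values())

# Above-median brands win their comparisons more often than they lose.
above_median = [b for b, s in brand_scores.items() if s > category_median]
```

Averaging per query (rather than per matchup) weights brands that appear in more comparisons by sample size, which is why six evaluations give Seismic Knowledge's 52 more stability than a brand scored on one or two queries.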
Trends
Over the last 12 months
Too few audits to establish a trend. Snapshot data shows Seismic Knowledge up 16 points and Notion down 5 between March and April 2026, but only two months of coverage is too thin for pattern recognition.
Editorial picks
Brands worth watching
Seismic Knowledge
Leads all comparison queries at 52 average and posted the largest single-month gain (+16) in the limited trend window. Appears in six head-to-head matchups — more than any peer.
Read brand profile →
Nuclino
Scores highest in featured placement at 71, despite a 44 evaluation average. The gap suggests strong semantic relevance to general wiki queries even when direct comparisons are weaker.
Read brand profile →
Archbee
Shows up in six comparison queries — tied for most — with a 41 average. Featured score of 66 places it third, indicating broad but mid-tier citation strength.
Read brand profile →
Notion
Second in evaluation score (48) but dropped five points month-over-month. Featured score of 58 is below median, suggesting AI may be narrowing its definition of 'wiki' away from all-in-one tools.
Read brand profile →
Glean
Evaluation average of 37 ranks eighth, but featured score matches Notion at 58. Appears in five comparison queries, signaling emerging consideration despite weak head-to-head performance.
Read brand profile →
FAQ
Wiki & Docs questions, answered
What is the best wiki software for internal documentation?
How do Notion and Nuclino compare for team wikis?
Which documentation tool is rising in AI citations?
What is the typical score for a wiki platform in AI comparisons?
Are there any dominant brands in AI responses for wiki software?
How does Archbee perform in AI citations?
Related
More in Content & Knowledge
Want to know if AI cites your brand for Wiki & Docs?
Free audit. ChatGPT, Perplexity, Gemini, Claude.
Run an audit →