Citation Tracking Setup & Best Practices
🎯 Quick Summary
- Complete guide to setting up automated citation tracking across AI platforms
- Learn to monitor ChatGPT, Claude, Gemini, and Perplexity citations systematically
- Track both brand-level and content-level citations with actionable dashboards
- Implement manual and automated tracking methods based on your resources
📋 Table of Contents
- Citation Tracking Fundamentals
- UnrealSEO Dashboard Setup
- Manual Tracking Methods
- Automated Tracking
- Organizing Your Data
- Analysis & Reporting
🔑 Key Concepts at a Glance
- Citation Tracking: Systematic monitoring of AI mentions of your brand/content
- Test Query Set: Collection of 50-500 queries to test regularly
- Citation Log: Database of when/where you're cited
- Tracking Frequency: How often to run tests (daily, weekly, monthly)
- Attribution Verification: Confirming citations are accurate
🏷️ Metadata
Tags: monitoring, tracking, setup, how-to
Status: %%ACTIVE%%
Complexity: %%MODERATE%%
Max Lines: 450 (this file: 445 lines)
Reading Time: 10 minutes
Last Updated: 2025-01-18
Citation Tracking Fundamentals
What You're Tracking
Three Levels of Tracking:
Level 1: Brand Mentions
├─ Query: "best CRM software"
├─ Result: Is your brand mentioned? (Yes/No)
└─ Metric: Brand Citation Rate
Level 2: Content Citations
├─ Query: "how to calculate sales velocity"
├─ Result: Is your specific article cited? (Yes/No)
└─ Metric: Content-level Citation Rate
Level 3: Quote/Data Citations
├─ Query: "average CRM ROI statistics"
├─ Result: Are your statistics/quotes cited? (Yes/No)
└─ Metric: Authority Citation Rate
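The three levels above reduce to the same calculation over different flags. A minimal sketch; the field names and sample results are illustrative, not a fixed schema:

```python
# Illustrative test results; the three boolean fields correspond to the
# three tracking levels (field names are assumptions, not a fixed schema).
results = [
    {"query": "best CRM software",
     "brand_mentioned": True,  "article_cited": False, "data_quoted": False},
    {"query": "how to calculate sales velocity",
     "brand_mentioned": True,  "article_cited": True,  "data_quoted": False},
    {"query": "average CRM ROI statistics",
     "brand_mentioned": True,  "article_cited": True,  "data_quoted": True},
    {"query": "CRM pricing guide",
     "brand_mentioned": False, "article_cited": False, "data_quoted": False},
]

def rate(flag):
    """Share of tested queries where the given citation level was hit."""
    return sum(r[flag] for r in results) / len(results)

brand_citation_rate = rate("brand_mentioned")    # Level 1: 0.75
content_citation_rate = rate("article_cited")    # Level 2: 0.50
authority_citation_rate = rate("data_quoted")    # Level 3: 0.25
```

Each deeper level is necessarily a subset of the one above it, so the rates should only ever decrease from Level 1 to Level 3.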
Why Track?
Without Tracking:
"Are we visible in AI search?"
→ No data
→ Can't measure progress
→ Can't justify LLMO investment
With Tracking:
"Our Citation Rate grew from 12% to 28% in 6 months"
→ Clear progress
→ ROI proven
→ Know what's working
UnrealSEO Dashboard Setup
Initial Setup (5 minutes)
Step 1: Connect Your Domain
Dashboard → Settings → Domain Setup
1. Enter domain: yoursite.com
2. Verify ownership:
- Method A: DNS TXT record
- Method B: HTML file upload
- Method C: Meta tag
3. Click "Verify"
Status: ✅ Verified
Step 2: Define Your Topics
Dashboard → Topics → Add Topics
Enter 3-10 core topics:
✓ "CRM software"
✓ "Sales pipeline management"
✓ "Customer relationship tools"
✓ "CRM for startups"
✓ "CRM pricing"
UnrealSEO will generate 100+ test queries per topic
Step 3: Configure Competitors
Dashboard → Competitors → Add Competitors
Add 5-10 main competitors:
1. Salesforce
2. HubSpot
3. Pipedrive
4. Zoho CRM
5. Freshsales
UnrealSEO tracks their citations too for comparison
Dashboard Overview
Main Dashboard Metrics:
┌─────────────────────────────────────────────┐
│ Citation Rate: 24.5%  [+2.3% vs last month] │
│ AI Share of Voice: 15.2%  [#4 in category]  │
│ Answer Equity: 78%  [Grade: B+]             │
└─────────────────────────────────────────────┘
Platform Breakdown:
├─ ChatGPT: 31.2% CR [↗ +3.5%]
├─ Claude: 22.4% CR [↗ +1.8%]
├─ Gemini: 18.7% CR [→ +0.2%]
└─ Perplexity: 15.8% CR [↘ -1.2%]
Top Cited Content:
1. "CRM Buyer's Guide 2025" - 47% CR
2. "CRM Pricing Comparison" - 38% CR
3. "Sales Pipeline Best Practices" - 32% CR
Setting Up Tracking Schedules
Recommended Frequency:
Dashboard → Settings → Tracking Schedule
Small Business:
├─ Full scan: Monthly
├─ Quick check: Weekly (top 50 queries)
└─ Cost: $49/month
Mid-Market:
├─ Full scan: Bi-weekly
├─ Quick check: 2x per week
└─ Cost: $149/month
Enterprise:
├─ Full scan: Weekly
├─ Quick check: Daily
└─ Cost: $499/month
Manual Tracking Methods
DIY Tracking Setup
For teams without UnrealSEO, or those who want supplemental tracking:
Step 1: Create Test Query List
Spreadsheet Template:
| Query ID | Query Text | Category | Difficulty |
|----------|-----------|----------|------------|
| Q001 | best CRM software | Generic | High |
| Q002 | CRM for startups under 10 people | Specific | Medium |
| Q003 | Salesforce vs HubSpot comparison | Competitive | High |
| Q004 | how to set up CRM workflow | Educational | Medium |
| Q005 | CRM pricing guide 2025 | Commercial | Medium |
Generate 50-100 queries covering:
- Generic category queries (20%)
- Specific use cases (30%)
- Comparison queries (20%)
- How-to/educational (20%)
- Commercial intent (10%)
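To keep the list balanced, you can check your category mix against those target percentages. A hedged sketch, with a toy five-query list standing in for the full 50-100:

```python
from collections import Counter

# Target mix from the guidelines above: category -> share of queries
TARGET_MIX = {"Generic": 0.20, "Specific": 0.30, "Comparison": 0.20,
              "Educational": 0.20, "Commercial": 0.10}

# Toy list; in practice this would be the full 50-100 queries
queries = [
    ("best CRM software", "Generic"),
    ("CRM for startups under 10 people", "Specific"),
    ("Salesforce vs HubSpot comparison", "Comparison"),
    ("how to set up CRM workflow", "Educational"),
    ("CRM pricing guide 2025", "Commercial"),
]

counts = Counter(category for _, category in queries)
actual_mix = {cat: counts[cat] / len(queries) for cat in TARGET_MIX}

# Flag any category off its target share by more than 10 points
gaps = {cat: round(actual_mix[cat] - share, 2)
        for cat, share in TARGET_MIX.items()
        if abs(actual_mix[cat] - share) > 0.10}
```

Re-run this whenever you add queries; a lopsided mix (e.g. all generic queries) will skew your Citation Rate and hide niche wins.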
Step 2: Manual Testing Process
Weekly Testing Routine:
Monday: ChatGPT Testing
├─ Open ChatGPT
├─ Test 25 queries from your list
├─ Record results in spreadsheet
└─ Time: ~30 minutes
Tuesday: Claude Testing
├─ Open Claude
├─ Test same 25 queries
├─ Record results
└─ Time: ~30 minutes
Wednesday: Gemini Testing
└─ [Same process]
Thursday: Perplexity Testing
└─ [Same process]
Friday: Analysis & reporting
└─ Calculate Citation Rate, trends
Tracking Spreadsheet:
| Query | ChatGPT | Claude | Gemini | Perplexity | Total |
|-------|---------|--------|--------|------------|-------|
| Q001 | ✅ Yes | ❌ No | ✅ Yes | ❌ No | 2/4 |
| Q002 | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes | 4/4 |
| Q003 | ❌ No | ❌ No | ✅ Yes | ❌ No | 1/4 |
Citation Rate: 7/12 = 58.3%
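The spreadsheet math is easy to automate once results are exported. A sketch using the three example rows above (True = cited):

```python
# Computing Citation Rate from the tracking table above (True = cited).
tracking = {
    "Q001": {"ChatGPT": True,  "Claude": False, "Gemini": True,  "Perplexity": False},
    "Q002": {"ChatGPT": True,  "Claude": True,  "Gemini": True,  "Perplexity": True},
    "Q003": {"ChatGPT": False, "Claude": False, "Gemini": True,  "Perplexity": False},
}

tests = [cited for row in tracking.values() for cited in row.values()]
citation_rate = sum(tests) / len(tests)  # 7 of 12 tests ≈ 58.3%

# Per-platform rates show where to focus next
per_platform = {
    platform: sum(row[platform] for row in tracking.values()) / len(tracking)
    for platform in ("ChatGPT", "Claude", "Gemini", "Perplexity")
}
```

The per-platform breakdown is what turns a single headline number into an action list.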
Step 3: Detailed Citation Logging
For each citation, record:
## Citation Log Entry
**Date:** 2025-01-18
**Query:** "best CRM for startups"
**Platform:** ChatGPT
**Cited:** ✅ Yes
**Position:** #2 recommendation
**Context:**
"For startups under 20 employees, I recommend:
1. HubSpot CRM - free tier, easy setup
2. [Your Brand] - affordable, great for agencies
3. Pipedrive - visual pipeline"
**Type:** Direct recommendation
**Sentiment:** Positive
**Accuracy:** ✅ Correct positioning
**Competitor Context:** Listed alongside HubSpot, Pipedrive
**URL Cited:** yoursite.com/startup-crm-guide
**Quote Used:** None (brand mention only)
**Notes:** Good placement, accurate description
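If you log entries in code rather than markdown, the template above maps naturally to a structured record. A sketch; the field names mirror the template and are illustrative, not a required schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# The log template above as a structured record; none of this is a
# required schema, just one way to keep entries queryable.
@dataclass
class CitationLogEntry:
    test_date: date
    query: str
    platform: str
    cited: bool
    position: Optional[int] = None      # rank within the answer, if cited
    citation_type: str = ""             # e.g. "Direct recommendation"
    sentiment: str = "neutral"
    accurate: bool = True
    url_cited: Optional[str] = None
    notes: str = ""

entry = CitationLogEntry(
    test_date=date(2025, 1, 18),
    query="best CRM for startups",
    platform="ChatGPT",
    cited=True,
    position=2,
    citation_type="Direct recommendation",
    sentiment="positive",
    url_cited="yoursite.com/startup-crm-guide",
    notes="Good placement, accurate description",
)
```

A list of these records can be filtered by platform, sentiment, or accuracy far faster than free-form markdown notes.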
Automated Tracking
Automation Tools
Option 1: Browser Automation (Intermediate)
Using Selenium/Puppeteer:
```javascript
// Example: Automated ChatGPT testing
// Note: the selectors below are illustrative; the ChatGPT UI changes
// often, so verify them (and the platform's terms) before relying on this.
const puppeteer = require('puppeteer');

async function testQuery(query) {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();

    // Navigate to ChatGPT
    await page.goto('https://chat.openai.com');

    // Enter query
    await page.type('#prompt-textarea', query);
    await page.keyboard.press('Enter');

    // Wait for response
    await page.waitForSelector('.response-content', { timeout: 30000 });

    // Extract response
    const response = await page.$eval('.response-content',
      el => el.textContent);

    // Check for brand mention
    const cited = response.includes('Your Brand Name');

    return { query, cited, response, timestamp: new Date() };
  } finally {
    await browser.close(); // always release the browser
  }
}

// Test all queries sequentially to avoid parallel sessions
const queries = [
  'best CRM software',
  'CRM for startups',
  // ... more queries
];

(async () => {
  for (const query of queries) {
    const result = await testQuery(query);
    console.log(result);
    // Save to database
  }
})();
```
Option 2: API-Based Tracking (Advanced)
Using OpenAI API:
```python
import csv
from datetime import datetime

from openai import OpenAI  # openai>=1.0 SDK

client = OpenAI(api_key='your-api-key')

def test_citation(query, brand_name):
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": query}]
    )
    answer = response.choices[0].message.content
    cited = brand_name.lower() in answer.lower()
    return {
        'query': query,
        'cited': cited,
        'answer': answer,
        'timestamp': datetime.now(),
        'tokens': response.usage.total_tokens
    }

# Load test queries
queries = [
    "best CRM software",
    "CRM for startups",
    # ...
]

results = []
for query in queries:
    result = test_citation(query, "Your Brand")
    results.append(result)
    print(f"{query}: {'✅' if result['cited'] else '❌'}")

# Save results (answers are long, so only summary fields go to the CSV)
with open('citation_tracking.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f,
                            fieldnames=['query', 'cited', 'timestamp', 'tokens'],
                            extrasaction='ignore')  # skip the 'answer' field
    writer.writeheader()
    writer.writerows(results)

# Calculate Citation Rate
citation_rate = sum(r['cited'] for r in results) / len(results)
print(f"\nCitation Rate: {citation_rate:.1%}")
```
Cost Estimate:
100 queries × $0.03 per query (GPT-4) = $3.00
Run weekly: ~$12/month
Run daily: ~$90/month
(Much cheaper than manual labor)
Organizing Your Data
Database Structure
Citation Tracking Database Schema:
```sql
-- Main citations table
CREATE TABLE citations (
    id INTEGER PRIMARY KEY,
    query_id INTEGER,
    platform VARCHAR(50),      -- ChatGPT, Claude, etc.
    cited BOOLEAN,
    position INTEGER,          -- 1st, 2nd, 3rd mention
    sentiment VARCHAR(20),     -- positive, neutral, negative
    test_date DATE,
    response_text TEXT
);

-- Queries table
CREATE TABLE queries (
    id INTEGER PRIMARY KEY,
    query_text TEXT,
    category VARCHAR(50),
    difficulty VARCHAR(20),
    commercial_intent BOOLEAN
);

-- Content citations table
CREATE TABLE content_citations (
    id INTEGER PRIMARY KEY,
    citation_id INTEGER,
    article_url VARCHAR(255),
    article_title VARCHAR(255),
    quote_used TEXT,
    accurate BOOLEAN
);
```
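The schema works as-is in SQLite, and Citation Rate per platform falls out of a single GROUP BY. A minimal sketch using an in-memory database, with columns trimmed to the essentials and one sample query's results:

```python
import sqlite3

# In-memory sketch of the schema above, trimmed to the columns needed to
# compute Citation Rate per platform with a single GROUP BY.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE queries (
        id INTEGER PRIMARY KEY,
        query_text TEXT,
        category VARCHAR(50)
    );
    CREATE TABLE citations (
        id INTEGER PRIMARY KEY,
        query_id INTEGER,
        platform VARCHAR(50),
        cited BOOLEAN,
        test_date DATE
    );
""")

conn.execute("INSERT INTO queries VALUES (1, 'best CRM software', 'Generic')")
conn.executemany(
    "INSERT INTO citations (query_id, platform, cited, test_date) "
    "VALUES (?, ?, ?, ?)",
    [(1, "ChatGPT", 1, "2025-01-18"),
     (1, "Claude", 0, "2025-01-18"),
     (1, "Gemini", 1, "2025-01-18")],
)

# AVG over a 0/1 column is exactly the Citation Rate
rates = dict(conn.execute(
    "SELECT platform, AVG(cited) FROM citations GROUP BY platform"
).fetchall())
```

The same AVG trick extends to trends: add `test_date` to the GROUP BY to get a month-over-month series per platform.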
Monthly Reporting Template
# Citation Tracking Report - January 2025
## Executive Summary
**Overall Performance:**
- Citation Rate: 26.4% (↗ +2.1% vs Dec)
- Queries tested: 200
- Times cited: 53
- Platforms tested: 4
## Platform Breakdown
| Platform | Citation Rate | Change | Rank |
|----------|--------------|--------|------|
| ChatGPT | 32.5% | +3.2% | 🥇 Best |
| Perplexity | 28.0% | +2.5% | 🥈 |
| Claude | 24.5% | +1.8% | 🥉 |
| Gemini | 20.5% | +0.9% | 4th |
## Top Performing Content
1. **"CRM Buyer's Guide 2025"**
- Citation Rate: 48%
- Platforms: All 4
- Position: Avg #2.3
2. **"CRM Pricing Comparison"**
- Citation Rate: 41%
- Platforms: 3/4 (not Gemini)
- Position: Avg #1.8
## Issues Identified
⚠️ **Gemini underperformance**
- Only 20.5% CR (vs 32.5% ChatGPT)
- Action: Create Gemini-optimized content
⚠️ **Product comparison gaps**
- "[Your Brand] vs Salesforce": 0% citation
- Action: Create detailed comparison content
## Actions This Month
✅ Created 3 new comparison articles
✅ Updated pricing page with schema
✅ Fixed 5 factual inaccuracies
🚧 Working on Gemini optimization
Analysis & Reporting
Key Metrics to Track
1. Overall Citation Rate Trend
Jan: 18.2%
Feb: 19.7% (+1.5pp)
Mar: 22.1% (+2.4pp)
Apr: 23.8% (+1.7pp)
May: 26.4% (+2.6pp)
5-month growth: +8.2pp (+45% increase)
Average monthly growth: +2.05pp
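The trend figures can be derived directly from the monthly series; note the average is taken over the four month-over-month intervals between January and May, not over five months:

```python
# Deriving the trend figures from the monthly Citation Rate series (in %).
monthly_cr = {"Jan": 18.2, "Feb": 19.7, "Mar": 22.1, "Apr": 23.8, "May": 26.4}

values = list(monthly_cr.values())
total_growth_pp = round(values[-1] - values[0], 1)              # +8.2pp
relative_growth = round(total_growth_pp / values[0] * 100)      # +45%
# Four month-over-month intervals between Jan and May
avg_monthly_pp = round(total_growth_pp / (len(values) - 1), 2)  # +2.05pp
```

Reporting both the absolute (pp) and relative (%) growth avoids the ambiguity of a bare "up 45%".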
2. Platform-Specific Performance
Which platforms are you winning/losing?
ChatGPT: 32% CR → Winning ✅
Perplexity: 28% CR → Competitive ✅
Claude: 24% CR → Average 🟡
Gemini: 20% CR → Needs work ⚠️
Focus: Improve Gemini presence
3. Content Performance Analysis
Best performing content types:
Comparison guides: 41% avg CR
How-to tutorials: 36% avg CR
Pricing guides: 33% avg CR
Case studies: 28% avg CR
Blog posts: 19% avg CR
Strategy: Create more comparison/how-to content
4. Query Category Analysis
Citation rate by query type:
Specific use case: 38% CR ("CRM for real estate")
Educational: 31% CR ("how to implement CRM")
Commercial: 27% CR ("CRM pricing")
Generic: 18% CR ("best CRM")
Competitive: 12% CR ("Salesforce alternative")
Insight: Dominate specific niches, struggle on generic
Competitive Benchmarking
Track competitors alongside your brand:
Category: CRM Software (100 test queries)
Your Brand: 26% CR (#4 position)
├─ HubSpot: 34% CR (#1)
├─ Salesforce: 29% CR (#2)
├─ Pipedrive: 27% CR (#3)
├─ You: 26% CR (#4) ← Current position
└─ Zoho: 18% CR (#5)
Gap to #3: -1pp (very close!)
Gap to #1: -8pp
Goal: Overtake Pipedrive (27%) by next month
Alert Triggers
Set up alerts for significant changes:
Alert Conditions:
🔴 Critical:
- Citation Rate drops >5pp month-over-month
- Major competitor surges >10pp
- Negative sentiment citation detected
→ Action: Immediate investigation
🟡 Warning:
- Citation Rate flat for 2+ months
- Losing position on key queries
- New competitor enters top 5
→ Action: Review strategy
🟢 Positive:
- Citation Rate increases >3pp
- New content gets >40% CR
- Overtake competitor in ranking
→ Action: Document what worked, replicate
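The alert tiers above can be encoded as a simple classifier. Thresholds are the ones stated; the boolean inputs are illustrative stand-ins for real checks:

```python
# The three alert tiers as a classifier. Thresholds match the list above;
# the boolean inputs are illustrative stand-ins for real checks.
def classify_alert(cr_change_pp, competitor_surge_pp=0.0,
                   negative_sentiment=False, flat_months=0,
                   lost_key_query=False, new_top5_competitor=False):
    # 🔴 Critical: immediate investigation
    if cr_change_pp < -5 or competitor_surge_pp > 10 or negative_sentiment:
        return "critical"
    # 🟡 Warning: review strategy
    if flat_months >= 2 or lost_key_query or new_top5_competitor:
        return "warning"
    # 🟢 Positive: document what worked, replicate
    if cr_change_pp > 3:
        return "positive"
    return "none"
```

Checking critical conditions first means a month that is both "flat" and "negative sentiment" escalates rather than sitting in the warning bucket.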
Last updated: 2025-01-18