AEO Prompt Data

Access historical answers, citations, and mentions for the prompts you're tracking across AI answer engines.

The AEO Prompt Data step lets you pull historical brand visibility data for your tracked prompts across AI answer engines. It shows how your brand appears in AI-generated responses through three report types: answers, citations, and mentions with sentiment scores.

Overview

What it does: Access answers, citations, and mentions for prompts you're tracking for your brand across answer engines like ChatGPT, Perplexity, Gemini, Google AI Mode, and Google AI Overview. Data is available for the last 7, 30, or 90 days.

When to use it:

  • Monitor brand visibility trends across AI platforms

  • Track competitive positioning in AI-generated responses

  • Identify which prompts drive the most brand mentions

  • Analyze sentiment shifts over time

  • Build automated reports on AI search performance

Key benefits:

  • Go from insight to action: Connect your AEO analytics directly to content optimization workflows

  • Unified cross-platform monitoring across all major AI answer engines from a single workflow step

  • Competitive benchmarking with automated sentiment analysis and mention rate comparisons

  • Flexible time ranges to analyze trends over 7, 30, or 90 days

  • Actionable data that feeds directly into content creation and optimization workflows

Supported AI Platforms

The AEO Prompt Data step works with five major AI platforms:

| Platform | Description |
| --- | --- |
| ChatGPT | OpenAI's conversational AI assistant |
| Perplexity | AI-powered answer engine with real-time web search |
| Gemini | Google's multimodal AI assistant |
| Google AI Mode | Google Search's AI-powered response mode |
| Google AI Overview | AI-generated summaries in Google Search results |

Default behavior: All platforms are selected automatically to ensure comprehensive coverage. You can filter to specific platforms using the multi-select dropdown in the step configuration.

Data aggregation: Results are combined across all selected platforms, giving you unified metrics rather than platform-specific siloed data.

Available Report Types

1. Answers Report

Access actual AI-generated responses for context and positioning analysis

Purpose: Review the actual content of AI responses to understand how your brand is positioned, what context it appears in, and identify opportunities for improvement.

Report Fields

| Field | Type | Description | Format |
| --- | --- | --- | --- |
| answer | String | Complete AI-generated response text (truncated at 10,000 characters if needed) | Full response content |
| date | String | When the AI response was generated | MM-DD-YYYY |

Understanding the Data

  • Content Analysis: Examine how AI platforms describe your brand, products, or services in their responses

  • Contextual Positioning: Understand what topics and questions trigger mentions of your brand

  • Competitive Framing: See how you're positioned relative to competitors in direct comparisons

  • Messaging Consistency: Identify variations in how your brand is described across different prompts

Sample Response
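
The exact output depends on your prompt, platform selection, and time range; the example below is purely illustrative and shows only the two documented fields:

```json
[
  {
    "answer": "For teams evaluating tools in this category, Your Brand and Competitor A come up most often. Your Brand is typically recommended for...",
    "date": "03-15-2025"
  },
  {
    "answer": "Several options are worth considering. Competitor A is known for..., while Your Brand focuses on...",
    "date": "03-02-2025"
  }
]
```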


2. Citations Report

Monitor which URLs are referenced as authoritative sources

Purpose: Track which URLs are being cited by AI platforms as authoritative sources for specific prompts, ranked by how frequently they appear across AI responses.

Report Fields

| Field | Type | Description | Notes |
| --- | --- | --- | --- |
| url | String | The complete URL being cited (query parameters removed) | Each URL appears only once |
| citation_rate | Float | Frequency this URL appears in answers with citations | 0.0 to 1.0 (higher = more frequent) |

Understanding the Data

  • Citation Rate: Defined as (number of answers citing this URL) / (number of answers containing at least one citation). A citation rate of 0.67 means this URL appeared in 67% of responses that included citations; see the sketch after this list.

  • URL Normalization: Query parameters are automatically stripped from URLs to avoid duplicate entries

  • Authority Ranking: Results are sorted by citation rate (descending), showing which sources AI platforms reference most frequently
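
To make the citation-rate formula concrete, here is a minimal Python sketch, assuming you have each answer's cited URLs (already normalized) as a list. It illustrates the math above; it is not the platform's implementation.

```python
from collections import Counter

def citation_rates(answers_citations):
    """Compute citation_rate per URL.

    answers_citations: one list of cited URLs per answer (query parameters
    already stripped, so duplicates collapse to the same URL).
    """
    # Only answers containing at least one citation count toward the denominator.
    cited_answers = [set(urls) for urls in answers_citations if urls]
    if not cited_answers:
        return {}

    # Count how many answers cite each URL (each answer counts a URL once).
    counts = Counter(url for urls in cited_answers for url in urls)

    # citation_rate = answers citing this URL / answers with at least one citation,
    # sorted descending to mirror the report's authority ranking.
    rates = {url: n / len(cited_answers) for url, n in counts.items()}
    return dict(sorted(rates.items(), key=lambda item: item[1], reverse=True))

# 2 of the 3 answers with citations reference /guide, so its rate is ~0.67.
print(citation_rates([
    ["https://example.com/guide", "https://example.com/blog"],
    ["https://example.com/guide"],
    [],  # no citations; excluded from the denominator
    ["https://example.com/docs"],
]))
```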

Sample Response
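
A hypothetical example of the citations report output, using the two documented fields (URLs and rates are illustrative):

```json
[
  { "url": "https://www.example.com/best-tools-comparison", "citation_rate": 0.67 },
  { "url": "https://www.example.com/pricing-guide", "citation_rate": 0.42 },
  { "url": "https://docs.example.com/getting-started", "citation_rate": 0.18 }
]
```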


3. Mentions Report

Track brand and competitor visibility with sentiment analysis

Purpose: Monitor how frequently your brand and competitors appear in AI responses for specific prompts, with automated sentiment scoring to understand perception trends.

Report Fields

| Field | Type | Description | Range/Format |
| --- | --- | --- | --- |
| brand_name | String | Brand or competitor name from your Brand Kit | Domain's name field |
| mention_rate | Float | Relative frequency of mentions for this brand | 0.0 to 1.0 (proportion of total mentions) |
| sentiment_rate | Float | Average sentiment score of mentions | -1.0 to 1.0 (-1 = negative, 1 = positive) |

Understanding the Data

  • Mention Rate: Calculated as (this brand's mentions) / (total mentions across all brands). A rate of 0.25 means this brand accounts for 25% of all brand mentions in the analyzed responses; see the sketch after this list.

  • Sentiment Rate: Averaged across all mentions using sentiment analysis. Values range from -1.0 (completely negative) to 1.0 (completely positive), with 0.0 being neutral.

  • Competitive Context: Results are sorted by mention rate (descending), allowing you to quickly identify brands with the largest share of voice.
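
As a concrete illustration of both formulas, here is a minimal Python sketch that aggregates raw (brand, sentiment) pairs into the two report metrics. The sentiment scores are assumed inputs for the example; this is not the platform's internal implementation.

```python
from collections import defaultdict

def mention_report(mentions):
    """Aggregate per-brand mention_rate and sentiment_rate.

    mentions: list of (brand_name, sentiment_score) pairs, where each
    sentiment score is in [-1.0, 1.0].
    """
    by_brand = defaultdict(list)
    for brand, sentiment in mentions:
        by_brand[brand].append(sentiment)

    total = sum(len(scores) for scores in by_brand.values())
    rows = [
        {
            "brand_name": brand,
            # share of all brand mentions in the analyzed responses
            "mention_rate": len(scores) / total,
            # average sentiment across this brand's mentions
            "sentiment_rate": sum(scores) / len(scores),
        }
        for brand, scores in by_brand.items()
    ]
    # Sorted by mention rate (descending), as in the report output.
    return sorted(rows, key=lambda row: row["mention_rate"], reverse=True)

# "Your Brand" has 2 of 4 mentions -> mention_rate 0.5, sentiment_rate 0.6.
print(mention_report([
    ("Your Brand", 0.8), ("Your Brand", 0.4),
    ("Competitor A", -0.2), ("Competitor B", 0.6),
]))
```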

Sample Response
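
A hypothetical example of the mentions report output (brand names and scores are illustrative; mention rates sum to 1.0 across brands):

```json
[
  { "brand_name": "Your Brand", "mention_rate": 0.42, "sentiment_rate": 0.65 },
  { "brand_name": "Competitor A", "mention_rate": 0.33, "sentiment_rate": 0.30 },
  { "brand_name": "Competitor B", "mention_rate": 0.25, "sentiment_rate": -0.10 }
]
```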

Step Configuration Guide

When setting up your AEO Prompt Data step, configure these fields:

1. Report Type (Required)

Description: Choose the type of analysis you want to perform.

Options:

  • Answers: Access actual AI response content for positioning analysis

  • Citations: Track which URLs are cited as authoritative sources

  • Mentions: Monitor brand visibility with sentiment analysis

2. Platform (Multi-select)

Description: Select which AI platforms to include in your analysis.

Options: ChatGPT, Perplexity, Gemini, Google AI Mode, Google AI Overview (all selected by default)

3. Brand Kit (Required)

Description: Select the Brand Kit containing your brand information and competitor list. This determines which brands are analyzed and compared.

4. Question (Required)

Description: Select a tracked prompt/question from your Brand Kit to analyze. This focuses the analysis on a specific prompt that you're monitoring.

5. Time Range (Required)

Description: Determines the date range for the data.

Options:

  • Last 7 days: Most recent snapshot of AI visibility

  • Last 30 days: Monthly trend analysis

  • Last 90 days: Quarterly performance review
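
You configure these fields in the step's UI rather than in code, but expressed as data a complete configuration might look like the following (field names and values are illustrative, not the step's internal schema):

```json
{
  "report_type": "Mentions",
  "platforms": ["ChatGPT", "Perplexity", "Gemini", "Google AI Mode", "Google AI Overview"],
  "brand_kit": "Acme Brand Kit",
  "question": "What is the best project management tool for small teams?",
  "time_range": "Last 30 days"
}
```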

Use Cases

Content Optimization Pipeline

Pull prompt data to identify gaps in AI responses, then trigger content creation or refresh workflows:

  1. Use AEO Prompt Data to find prompts where competitors are mentioned more frequently

  2. Route to a content analysis step to identify missing topics

  3. Generate optimized content targeting those prompts
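
A minimal sketch of step 1, assuming the mentions report has already been pulled for each prompt into a dictionary keyed by prompt text (the structure and names here are hypothetical):

```python
def competitor_gap_prompts(mentions_by_prompt, your_brand):
    """Return prompts where a competitor's mention_rate exceeds your brand's."""
    gaps = []
    for prompt, rows in mentions_by_prompt.items():
        yours = next(
            (r["mention_rate"] for r in rows if r["brand_name"] == your_brand), 0.0
        )
        rivals = [r for r in rows if r["brand_name"] != your_brand]
        if not rivals:
            continue
        top = max(rivals, key=lambda r: r["mention_rate"])
        if top["mention_rate"] > yours:
            gaps.append((prompt, top["brand_name"], top["mention_rate"] - yours))
    # Largest visibility gap first: these prompts are the strongest content targets.
    return sorted(gaps, key=lambda gap: gap[2], reverse=True)
```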

Competitive Monitoring Dashboard

Build automated reports that track competitive positioning:

  1. Pull mentions data across key prompts

  2. Compare sentiment rates between your brand and competitors

  3. Output to Google Sheets or Notion for stakeholder review
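
Step 2 can be as simple as reshaping the mentions report into rows for a spreadsheet or Notion step downstream (a sketch, assuming the report format shown earlier):

```python
def sentiment_comparison_rows(mentions_report, your_brand):
    """Build spreadsheet rows comparing each brand's sentiment to your own."""
    yours = next(r for r in mentions_report if r["brand_name"] == your_brand)
    rows = [["Brand", "Share of mentions", "Sentiment", f"Sentiment vs. {your_brand}"]]
    for r in mentions_report:
        rows.append([
            r["brand_name"],
            f"{r['mention_rate']:.0%}",
            f"{r['sentiment_rate']:+.2f}",
            f"{r['sentiment_rate'] - yours['sentiment_rate']:+.2f}",
        ])
    return rows
```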

Citation Authority Tracking

Monitor which content performs best in AI citations:

  1. Pull citations data for high-priority prompts

  2. Identify your most-cited URLs

  3. Analyze patterns to inform content strategy
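
A sketch of steps 2 and 3, filtering the citations report down to your own domain and ranking your pages by citation rate (the domain and report shape are assumptions):

```python
from urllib.parse import urlparse

def own_most_cited(citations_report, your_domain="example.com"):
    """Return your URLs from the citations report, ranked by citation_rate."""
    own = []
    for row in citations_report:
        host = urlparse(row["url"]).hostname or ""
        if host == your_domain or host.endswith("." + your_domain):
            own.append(row)
    return sorted(own, key=lambda row: row["citation_rate"], reverse=True)
```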
