Analytics

Track traffic, bot activity, AI crawler attribution, and per-path request logs for every domain you manage.

SerpWise Analytics provides visibility into the traffic flowing through your gateway. Because all requests pass through the gateway, analytics captures both human and bot traffic with high fidelity — including AI crawlers that most analytics tools miss.

Analytics are available per-domain from the Analytics tab in your domain dashboard.

Overview Metrics

The overview panel shows aggregate statistics for the selected date range:

  • Total Requests: All requests processed by the gateway for this domain
  • Bot Requests: Requests identified as coming from known bots or crawlers
  • Bot Percentage: Bot requests as a share of total traffic
  • Avg. Response Time: Average time from the gateway receiving the request to returning the response (includes origin fetch time)
  • Unique Paths: Count of distinct URL paths that received traffic
  • Blocked Requests: Requests rejected by rate limiting (HTTP 429). The trend indicator is inverted — an upward trend shows in red because increasing blocks may indicate an ongoing attack or a limit that's too restrictive.
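To make the metric definitions concrete, here is a minimal sketch of how these aggregates could be computed from raw log entries. The field names and sample data are illustrative, not SerpWise's actual schema:

```python
from statistics import mean

# Hypothetical log entries; field names are illustrative, not SerpWise's schema.
logs = [
    {"path": "/", "bot": "Googlebot", "status": 200, "ms": 120},
    {"path": "/pricing", "bot": None, "status": 200, "ms": 95},
    {"path": "/blog", "bot": "GPTBot", "status": 429, "ms": 3},
    {"path": "/", "bot": None, "status": 200, "ms": 110},
]

total = len(logs)
bot_requests = sum(1 for e in logs if e["bot"])
overview = {
    "total_requests": total,
    "bot_requests": bot_requests,
    "bot_percentage": round(100 * bot_requests / total, 1),
    "avg_response_time_ms": round(mean(e["ms"] for e in logs), 1),
    "unique_paths": len({e["path"] for e in logs}),
    "blocked_requests": sum(1 for e in logs if e["status"] == 429),
}
print(overview)
```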

Date Range Filtering

Filter analytics by date range using the date picker at the top of the Analytics tab. Available presets:

  • Last 24 hours
  • Last 7 days
  • Last 30 days
  • Custom range

All charts and tables update to reflect the selected range.

Traffic Over Time

The time-series chart shows request volume broken down by day (for longer ranges) or hour (for the last 24 hours). Two series are shown:

  • Total requests — all traffic
  • Bot requests — crawler and bot traffic only

This makes it easy to spot traffic spikes and distinguish human traffic events (launches, campaigns) from bot crawl activity.
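The hour-versus-day bucketing described above can be sketched as follows. The 24-hour cutoff mirrors the chart's behavior; the function names and the exact threshold are assumptions for illustration:

```python
from datetime import datetime, timedelta

def bucket(ts: datetime, range_hours: int) -> str:
    # Hour buckets for a 24-hour view, day buckets for longer ranges
    # (mirrors the chart's behavior; the exact cutoff is an assumption).
    if range_hours <= 24:
        return ts.strftime("%Y-%m-%d %H:00")
    return ts.strftime("%Y-%m-%d")

def series(entries, range_hours):
    # Build {bucket: (total_requests, bot_requests)} for the two chart series.
    out = {}
    for ts, is_bot in entries:
        key = bucket(ts, range_hours)
        tot, bots = out.get(key, (0, 0))
        out[key] = (tot + 1, bots + int(is_bot))
    return out

now = datetime(2024, 5, 1, 12, 0)
entries = [(now, True), (now, False), (now + timedelta(hours=1), True)]
print(series(entries, 24))
```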

Bot Breakdown

The bot breakdown table shows a ranked list of identified bots, with request counts for the selected period.

SerpWise identifies 50+ known bots by matching the User-Agent header against a continuously updated registry, including:

Search engine crawlers:

  • Googlebot, Google-InspectionTool
  • Bingbot, AdIdxBot
  • DuckDuckBot
  • Baiduspider, YandexBot

AI training crawlers:

  • GPTBot (OpenAI)
  • ClaudeBot (Anthropic)
  • CCBot (Common Crawl)
  • PerplexityBot
  • Amazonbot
  • Meta-ExternalAgent

SEO tools:

  • AhrefsBot, SemrushBot, MajesticBot, DotBot
  • Screaming Frog, Sitebulb

Monitoring and uptime:

  • UptimeRobot, Pingdom, StatusCake

Requests that match no known bot pattern are treated as human traffic.
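Conceptually, the identification step is a pattern match against a registry. The excerpt below is a tiny illustrative sketch — the real SerpWise registry is larger and continuously updated, and the specific patterns and AI flags here are assumptions:

```python
import re

# A tiny excerpt of a bot registry; patterns and the is_ai flag are illustrative.
BOT_REGISTRY = [
    ("Googlebot", re.compile(r"Googlebot", re.I), False),
    ("GPTBot",    re.compile(r"GPTBot", re.I),    True),
    ("ClaudeBot", re.compile(r"ClaudeBot", re.I), True),
    ("AhrefsBot", re.compile(r"AhrefsBot", re.I), False),
]

def identify_bot(user_agent: str):
    for name, pattern, is_ai in BOT_REGISTRY:
        if pattern.search(user_agent):
            return {"bot": name, "is_ai": is_ai}
    return None  # no match: treated as human traffic

print(identify_bot("Mozilla/5.0 AppleWebKit/537.36 (compatible; GPTBot/1.0)"))
print(identify_bot("Mozilla/5.0 (Windows NT 10.0) Chrome/124.0"))
```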

AI Bot Attribution

AI crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) are tagged separately in the bot breakdown. This gives you insight into how frequently AI companies are crawling your content and which pages they're most interested in.

Why AI crawler visibility matters

AI training crawlers fetch raw HTML and don't execute JavaScript, so they never show up in traditional analytics tools that rely on a JavaScript tracking snippet. SerpWise captures these requests at the gateway level, giving you a complete picture of who is consuming your content.

Top Paths

The top paths table shows the most-requested URL paths for the selected date range, with separate columns for total requests and bot requests.

Use this to identify:

  • Your highest-traffic pages (for prioritizing rule coverage)
  • Pages receiving disproportionate bot traffic
  • Paths that are being aggressively crawled or scanned
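The total/bot split in the top paths table amounts to a pair of per-path counters. A minimal sketch, using made-up sample requests:

```python
from collections import Counter

# Sample (path, bot_name_or_None) requests; data is illustrative.
requests = [
    ("/blog/post-1", "Googlebot"),
    ("/blog/post-1", None),
    ("/pricing", None),
    ("/blog/post-1", "GPTBot"),
    ("/pricing", "AhrefsBot"),
]

totals, bots = Counter(), Counter()
for path, bot in requests:
    totals[path] += 1
    if bot:
        bots[path] += 1

# Rank by total requests, like the top paths table.
for path, count in totals.most_common():
    print(f"{path}: {count} total, {bots[path]} bot")
```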

Request Logs

The request log viewer provides a paginated, filterable log of individual requests. Each log entry includes:

  • Timestamp: When the request was processed
  • Path: The URL path requested
  • Method: HTTP method (GET, HEAD, etc.)
  • Status: Response status code (from origin or gateway)
  • Response Time: Total gateway processing time in milliseconds
  • Bot: Identified bot name, or blank for human traffic
  • Is AI Bot: Flag for known AI training crawlers
  • User Agent: Full User-Agent string

Filtering Logs

Filter the log table by:

  • Date range — synced with the global date picker
  • Bot only — show only bot requests
  • AI bots only — show only AI crawler requests
  • Status code — filter to specific HTTP status codes
  • Path — partial match on URL path

How Analytics Are Collected

Request logs are written asynchronously — they don't add latency to the request processing pipeline. The gateway records each request after the response is sent.
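The fire-and-forget pattern described here can be sketched with a queue and a background writer: the handler returns immediately and the write happens off the request path. All names are illustrative, not SerpWise's implementation:

```python
import asyncio

async def handle_request(path: str, log_queue: asyncio.Queue) -> str:
    response = f"response for {path}"     # stand-in for the origin fetch
    log_queue.put_nowait({"path": path})  # enqueue the log entry; never blocks
    return response

async def log_writer(log_queue: asyncio.Queue, store: list):
    # Background task drains the queue, so writes add no request latency.
    while True:
        entry = await log_queue.get()
        store.append(entry)               # stand-in for the analytics store
        log_queue.task_done()

async def main() -> list:
    log_queue: asyncio.Queue = asyncio.Queue()
    store: list = []
    writer = asyncio.create_task(log_writer(log_queue, store))
    await handle_request("/pricing", log_queue)
    await log_queue.join()                # demo only: wait for pending writes
    writer.cancel()
    return store

written = asyncio.run(main())
print(written)
```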

Analytics data is retained for 90 days. Older logs are automatically purged.
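A retention purge like the one described amounts to dropping entries older than a 90-day cutoff. A minimal sketch, assuming each entry carries a timezone-aware timestamp:

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # per the documented retention window

def purge_old_logs(logs, now=None):
    # Keep only entries newer than the retention cutoff; assumes each
    # entry carries a timezone-aware "timestamp" field (illustrative schema).
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [e for e in logs if e["timestamp"] >= cutoff]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
logs = [
    {"path": "/", "timestamp": now - timedelta(days=5)},
    {"path": "/old", "timestamp": now - timedelta(days=120)},
]
print(purge_old_logs(logs, now=now))  # keeps only the 5-day-old entry
```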

Using Analytics for SEO

A few ways analytics data helps your SEO work:

Identify crawl budget waste — If Googlebot is hitting thousands of parameterized URLs (faceted navigation, session IDs), use that insight to create noindex rules or redirects for those paths.

Verify rule coverage — Cross-reference top paths with your rule trigger statistics to confirm your most-visited pages are covered by the right rules.

Monitor AI crawler access — Track which content AI companies are indexing. Use this data to make informed decisions about set_response_header rules with X-Robots-Tag: noai for content you want to restrict.

Spot anomalies — Sudden spikes in bot traffic to specific paths can indicate aggressive scraping. Use Shield or IP-based rules to manage this.
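One simple way to operationalize spike-spotting is to compare today's bot-request count for a path against its recent baseline. The z-score threshold below is an illustrative choice, not a SerpWise feature:

```python
from statistics import mean, pstdev

def is_spike(history, current, z=3.0):
    # Flag a path when the current bot-request count exceeds the historical
    # mean by z standard deviations. Threshold choice is illustrative.
    mu, sigma = mean(history), pstdev(history)
    return current > mu + z * max(sigma, 1.0)

daily_bot_hits = [40, 35, 52, 44, 38, 41, 47]  # last 7 days for one path
print(is_spike(daily_bot_hits, 420))  # a 10x jump: likely aggressive scraping
print(is_spike(daily_bot_hits, 55))
```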

Per-URL Analytics

Domain analytics shows aggregate data across all paths. If you need to drill down into traffic, bot activity, response times, and referrers for a single URL, use the Analytics tab on any URL Detail page. See Per-URL Analytics for details.
