HTTP Deep Check

Run a full HTTP audit: headers, security posture, cookies, cache policy, compression, protocol, and special paths like robots.txt and sitemap.xml.

If a site blocks scanner traffic, configure your server or CDN to allow requests that carry a specific custom header, then set that header here.
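As a minimal sketch of the bypass-header idea, the snippet below builds a request that carries one custom header. The header name `X-Scan-Token` and its value are illustrative assumptions, not a standard; use whatever name your CDN allowlist is configured to accept.

```python
from urllib.request import Request

# Assumed example header name/value -- match these to your CDN allowlist rule.
SCAN_HEADER = "X-Scan-Token"
SCAN_VALUE = "example-secret"

def build_probe(url: str) -> Request:
    """Build a request carrying the custom bypass header."""
    return Request(url, headers={SCAN_HEADER: SCAN_VALUE})

req = build_probe("https://example.com/")
```

Note that `urllib` normalizes header capitalization when storing it, so the header is sent as `X-scan-token`; header names are case-insensitive on the wire, so this does not affect matching.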

What this report helps you diagnose

HTTP Deep Check is useful when standard uptime checks pass but traffic quality is still poor. It surfaces policy-level issues that affect performance, security, and crawlability, including caching mistakes and weak header defaults.

Security posture
Review missing headers and their score impact to prioritize the fixes with the highest risk reduction.
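The logic behind a security-posture score can be sketched as a weighted check over the response headers. The header list and weights below are illustrative assumptions, not the tool's actual scoring model:

```python
# Illustrative weights only -- not the tool's real scoring model.
SECURITY_HEADERS = {
    "strict-transport-security": 25,
    "content-security-policy": 25,
    "x-content-type-options": 15,
    "x-frame-options": 15,
    "referrer-policy": 10,
    "permissions-policy": 10,
}

def audit(headers: dict) -> tuple:
    """Return (score, missing_headers) for a response-header dict."""
    present = {k.lower() for k in headers}
    missing = [h for h in SECURITY_HEADERS if h not in present]
    score = 100 - sum(SECURITY_HEADERS[h] for h in missing)
    return score, missing

score, missing = audit({
    "Strict-Transport-Security": "max-age=63072000",
    "X-Content-Type-Options": "nosniff",
})
```

Fixing the highest-weight missing header first gives the largest single score (and risk) improvement, which is the triage order the report suggests.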
Cache behavior
Validate Cache-Control, ETag, and Expires consistency so browsers and CDNs cache responses as intended.
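A consistency check of this kind can be sketched as a few rules over the cache-related headers. The specific conflicts flagged below (chosen here as common examples, not an exhaustive or official rule set) are: `no-store` alongside an `ETag`, `max-age` alongside `Expires`, and no freshness policy at all:

```python
def cache_inconsistencies(headers: dict) -> list:
    """Flag common conflicts between Cache-Control, ETag, and Expires."""
    h = {k.lower(): v for k, v in headers.items()}
    cc = h.get("cache-control", "").lower()
    issues = []
    if "no-store" in cc and "etag" in h:
        issues.append("no-store with ETag: the validator will never be used")
    if "max-age" in cc and "expires" in h:
        issues.append("max-age and Expires both set: max-age wins, Expires is redundant")
    if not cc and "expires" not in h:
        issues.append("no freshness policy: caches fall back to heuristic expiry")
    return issues

issues = cache_inconsistencies({"Cache-Control": "no-store", "ETag": '"abc123"'})
```

These conflicts rarely break a page outright, which is why they slip past uptime checks while still degrading cache hit rates.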
Crawl readiness
Confirm robots.txt and sitemap.xml availability on the same environment that serves public traffic.
For page-level metadata validation, continue with Robots + X-Robots Checker and Canonical Checker.
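The crawl-readiness step boils down to fetching robots.txt from the public environment and confirming it parses and references a reachable sitemap. The parsing half can be sketched as follows (a simplified reader, not a full robots.txt implementation):

```python
def parse_robots(text: str) -> dict:
    """Extract Sitemap URLs and Disallow rules from a robots.txt body."""
    sitemaps, disallows = [], []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("sitemap:"):
            sitemaps.append(line.split(":", 1)[1].strip())
        elif line.lower().startswith("disallow:"):
            disallows.append(line.split(":", 1)[1].strip())
    return {"sitemaps": sitemaps, "disallows": disallows}

info = parse_robots(
    "User-agent: *\n"
    "Disallow: /admin\n"
    "Sitemap: https://example.com/sitemap.xml\n"
)
```

Running this against the same host that serves public traffic (rather than a staging copy) is what catches deployment-level regressions, such as a staging robots.txt with `Disallow: /` shipped to production.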

FAQ

What is included?
Status, protocol, security headers, cookies, cache directives, compression, and special crawl files.
How should I use the score?
Treat score as triage, then use per-header impact details to define concrete implementation tasks.
Why are robots.txt and sitemap.xml checked here?
They are core crawl signals, so checking them in the same run catches deployment-level indexing regressions early.

Related guides