Content Safeguard
Protect your content from AI training and unauthorized usage with advanced directives
What is Content Safeguard?
Content Safeguard is ProRank SEO's advanced protection system that helps you control how your content is used by AI systems, search engines, and other crawlers. It goes beyond traditional SEO to protect your intellectual property in the age of AI.
This feature implements noai and noimageai meta directives alongside traditional SEO controls like snippet length and image preview settings. Note: adoption of these directives varies by AI provider; robots.txt blocking (via the AI Protection tab) remains the most effective method.
AI Content Protection
AI Directives (Experimental)
noai Directive
Instructs AI systems not to use your content for training, analysis, or generation purposes.
<meta name="robots" content="noai" />
noimageai Directive
Prevents AI systems from using your images for training computer vision models or image generation.
<meta name="robots" content="noimageai" />
Combined Protection
Use both directives together for complete AI protection:
<meta name="robots" content="noai, noimageai" />
Adoption of these meta directives varies across AI providers. For broader protection, combine with robots.txt rules. ProRank SEO automatically applies these when AI protection is enabled.
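As a complement to the meta directives, a minimal robots.txt sketch blocking a few widely documented AI crawler tokens (GPTBot, CCBot, and Google-Extended are published by their operators; the list changes over time, so verify current tokens before relying on it):

```
# robots.txt — block selected AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```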
Content Control Settings
Available Controls
Snippet Control
- max-snippet: Control text snippet length in search results (-1 for unlimited, 0 for none, or specific character count)
- max-video-preview: Control video preview duration in seconds
Image Control
- max-image-preview: Control image thumbnail size (none, standard, or large)
- noimageindex: Prevent images from being indexed separately
Translation & Archive
- notranslate: Prevent automatic translation of your content
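Combined in a single meta robots tag, these controls might look like the following (values are illustrative; pick limits that fit your content):

```
<!-- 160-character snippets, large image previews, no automatic translation -->
<meta name="robots" content="max-snippet:160, max-image-preview:large, notranslate" />
```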
Configuration Guide
- Navigate to Content Safeguard: Go to ProRank SEO → Technical SEO → Robots & Indexing → Content Safeguard tab
- Enable AI Protection: Use "Add noai meta tag to all pages" and "Add noimageai meta tag to all pages" toggles
- Configure Snippet Settings: Set maximum snippet length (characters), video preview (seconds), and image preview size
- Set Additional Protections: Enable notranslate, noimageindex, or nositelinkssearchbox as needed
- Apply Per-Page Overrides: The ProRank SEO meta box on individual posts/pages supports noindex, nofollow, noarchive, nosnippet, and noimageindex overrides
- Save and Test: Save settings and check page source to verify meta tags are applied
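The "Save and Test" check can be scripted. A minimal sketch using Python's standard-library HTML parser to pull the directives out of a page's robots meta tag (the sample markup below is illustrative, not actual ProRank SEO output):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            # Split "noai, noimageai" into individual directives
            self.directives += [d.strip() for d in attrs.get("content", "").split(",")]

def robots_directives(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives

# Sample page source (stand-in for a fetched page)
sample = '<html><head><meta name="robots" content="noai, noimageai, max-snippet:160" /></head></html>'
print(robots_directives(sample))  # ['noai', 'noimageai', 'max-snippet:160']
```

In practice you would feed the parser the HTML fetched from your own URLs and confirm the expected directives appear on each page type.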
Implementation Examples
E-commerce Product Protection
Protect product descriptions and images from AI scraping while maintaining SEO visibility:
<!-- Applied to product pages -->
<meta name="robots" content="index, follow, noai, noimageai, max-snippet:160, max-image-preview:large" />
Premium Content Protection
Maximum protection for exclusive content:
<!-- Applied to premium content -->
<meta name="robots" content="noindex, noai, noimageai, notranslate, noimageindex" />
<!-- X-Robots-Tag HTTP header also applied -->
X-Robots-Tag: noindex, noai, noimageai, notranslate
Blog Post Balanced Approach
Allow search indexing but protect from AI training:
<!-- Applied to blog posts -->
<meta name="robots" content="index, follow, noai, max-snippet:320, max-image-preview:standard" />
Feature Availability
Content Safeguard features are available in the free tier with no page limits. Scope depends on the directive. Most settings (noai, noimageai, max-snippet, notranslate, noimageindex, nositelinkssearchbox) apply broadly to frontend pages when enabled. Rule-based noindex and unavailable_after are limited to singular content.
Free (all tiers)
- Noindex controls (no page limit)
- AI protection directives (noai, noimageai)
- Standard meta robots tags
- Snippet and preview controls
- Rule-based noindex for singular content (e.g. thin or aging posts)
- X-Robots-Tag HTTP headers
X-Robots-Tag Headers
HTTP Header Implementation
X-Robots-Tag headers provide an additional layer of protection that works even for non-HTML resources like PDFs, images, and other media files.
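At the server level, such headers are typically set in the web server configuration. A sketch for Apache, assuming mod_headers is enabled (nginx's add_header directive works equivalently; adjust file patterns for your site):

```
<IfModule mod_headers.c>
    # All responses
    Header set X-Robots-Tag "noai, noimageai"

    # PDFs: keep out of the index entirely
    <FilesMatch "\.pdf$">
        Header set X-Robots-Tag "noindex, nofollow, noai"
    </FilesMatch>

    # Images: block separate image indexing and AI image use
    <FilesMatch "\.(png|jpe?g|gif|webp)$">
        Header set X-Robots-Tag "noimageindex, noimageai"
    </FilesMatch>
</IfModule>
```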
Advantages over Meta Tags
- Works on all file types (PDFs, images, videos)
- Sent outside the page body, so scrapers can't strip them from the HTML
- Processed before content download
- Applies to direct file access
Example Headers
# Applied to all responses
X-Robots-Tag: noai, noimageai
# Applied to specific file types
X-Robots-Tag: noindex, nofollow # PDFs
X-Robots-Tag: noimageindex # Images
X-Robots-Tag: noindex # Videos
Common Use Cases
Recommended For
- ✓ Original research and data
- ✓ Premium/paid content
- ✓ Proprietary images and graphics
- ✓ Legal and medical content
- ✓ Personal or sensitive information
- ✓ Creative works and art
Consider Impact On
- ⚠ SEO visibility (with noindex)
- ⚠ Translation availability and international reach (with notranslate)
- ⚠ Image search traffic and indexing (with noimageindex)
- ⚠ AI-powered search features
Verification Methods
How to Verify Settings
- Check Page Source: View page source and search for meta name="robots" tags
- Inspect HTTP Headers: Use browser DevTools Network tab to check X-Robots-Tag headers
- Google Search Console: Use URL Inspection tool to see how Google interprets your directives
- Third-Party Tools: Use SEO crawlers to verify robots meta tags across your site
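The header check can also be automated. A small sketch as a pure function over a header mapping (so it can be exercised without a live request; in practice you would pass the headers from an HTTP response):

```python
def x_robots_directives(headers):
    """Return the individual X-Robots-Tag directives from a header mapping.

    Header names are compared case-insensitively, as HTTP requires.
    """
    directives = []
    for name, value in headers.items():
        if name.lower() == "x-robots-tag":
            directives += [d.strip() for d in value.split(",")]
    return directives

# Sample headers, as DevTools or `curl -I` would show them
sample_headers = {
    "Content-Type": "text/html",
    "X-Robots-Tag": "noai, noimageai",
}
print(x_robots_directives(sample_headers))  # ['noai', 'noimageai']
```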
Example Verification
<!-- What you should see in page source -->
<meta name="robots" content="index, follow, noai, noimageai, max-snippet:160" />
<!-- In HTTP headers -->
X-Robots-Tag: noai, noimageai
Important Considerations
AI Directive Support: Not all AI companies currently respect noai/noimageai directives. ProRank SEO combines these with robots.txt blocking for comprehensive protection.
SEO Impact: Content Safeguard settings can affect your search visibility. Test changes on less important pages first before applying site-wide.
Legal Protection: While these technical measures help protect your content, they don't replace proper copyright notices and legal protections.
Best Practices
Recommended Approach
- Start with AI Protection: Enable noai and noimageai globally as a baseline
- Layer Additional Controls: Add snippet and preview limits based on content type
- Per-Page Meta Robots: Apply stricter settings to premium or sensitive content
- Monitor Impact: Track search traffic and adjust settings if needed
- Combine with Robots.txt: Use robots.txt to block known AI crawlers as backup
- Regular Reviews: Update settings as new AI systems and directives emerge