User-Agent Cloaking

Penalties · Advanced

Definition

A cloaking technique that detects the browser or bot user-agent to serve different content to search engines.

User-agent cloaking is a concealment method that inspects the user-agent string sent by the browser or indexing bot in the HTTP request header. When the server identifies a user-agent matching a crawler (Googlebot, Bingbot, etc.), it serves a version of the page specifically optimized for SEO, often overloaded with keywords and text content. Visitors using a standard browser (Chrome, Firefox, Safari) receive a completely different version. This is the most widespread form of cloaking because it is technically simple to implement via server-side conditions (PHP, Python, .htaccess). It is, however, easily detected by Google, which can change its user-agent strings at will to probe suspected sites.

Also known as: UA Cloaking, User-Agent Cloaking, User-Agent Based Cloaking

Key Points

  • The most common form of cloaking because it is technically simple to implement
  • Relies on analyzing the user-agent string in HTTP headers
  • Google can easily detect this technique by modifying its user-agent
  • Not to be confused with dynamic rendering, which serves the same content in a different format

Practical Examples

Detection via HTTP headers

A PHP script checks $_SERVER['HTTP_USER_AGENT'] and, if it contains 'Googlebot', serves an HTML page with 2,000 words optimized for the target keyword, while standard browsers see a page with little content.
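The server-side check described above can be sketched in a few lines. This is an illustrative Python sketch of the same logic (the article describes it in PHP); `CRAWLER_TOKENS`, `is_crawler`, and `choose_page` are hypothetical names, and the token list is deliberately minimal.

```python
# Hypothetical sketch of the user-agent check described above (Python in
# place of PHP). Shown to illustrate how the technique works, not to deploy.

CRAWLER_TOKENS = ("googlebot", "bingbot", "duckduckbot")

def is_crawler(user_agent: str) -> bool:
    """Return True if the user-agent string contains a known crawler token."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def choose_page(user_agent: str) -> str:
    """Cloaking branch: keyword-stuffed page for bots, thin page for humans."""
    return "seo_optimized.html" if is_crawler(user_agent) else "visitor.html"

print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # → seo_optimized.html
print(choose_page("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"))  # → visitor.html
```

The simplicity of this branch is exactly why user-agent cloaking is so common, and why Google defeats it by simply sending requests with a non-bot user-agent.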

.htaccess rules

Rewrite rules in the .htaccess file use RewriteCond to detect bot user-agents and redirect them to satellite pages invisible to normal users.
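A minimal sketch of such a rule, assuming hypothetical file names (`page.html`, `satellite-page.html`); it is shown only to illustrate the pattern, not as something to deploy:

```apache
# Hypothetical .htaccess fragment: internally rewrite bot requests
# to a satellite page that normal visitors never see.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot) [NC]
RewriteRule ^page\.html$ /satellite-page.html [L]
```

The `[NC]` flag makes the match case-insensitive, and `[L]` stops further rewriting once the bot has been rerouted.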

Conditional JavaScript rendering

A site detects the user-agent server-side and serves pre-rendered static HTML stuffed with keywords to bots, while browsers receive a JavaScript application with different content.

Frequently Asked Questions

Is dynamic rendering considered user-agent cloaking?

No. Dynamic rendering is tolerated by Google provided that the content served to bots is identical to what users see, simply pre-rendered as static HTML to ease JavaScript indexing. User-agent cloaking, by contrast, serves fundamentally different content to manipulate rankings.

How can I check whether a site uses user-agent cloaking?

Compare the content visible in your browser with what Google sees, using the URL Inspection tool in Google Search Console, or by changing your user-agent in the browser's developer tools to simulate Googlebot. Crawlers such as Screaming Frog also let you compare the two versions.
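The comparison step can be automated. A sketch, assuming the helper names `fetch_as` and `looks_cloaked` (both hypothetical) and an arbitrary 0.5 similarity threshold; note that a cloaking site may itself detect a spoofed Googlebot user-agent, so the Search Console view remains the authoritative check.

```python
# Sketch: fetch a page with two different User-Agent headers and flag
# large differences between the responses. Threshold is illustrative.
import difflib
import urllib.request

def fetch_as(url: str, user_agent: str) -> str:
    """Fetch a URL presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(browser_html: str, bot_html: str, threshold: float = 0.5) -> bool:
    """Flag a page whose bot and browser versions differ substantially."""
    ratio = difflib.SequenceMatcher(None, browser_html, bot_html).ratio()
    return ratio < threshold
```

Usage would be `looks_cloaked(fetch_as(url, chrome_ua), fetch_as(url, googlebot_ua))`; a low similarity ratio between the two versions is a strong signal worth investigating manually.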

Go Further with LemmiLink

Discover how LemmiLink can help you put these SEO concepts into practice.

Last updated: 2026-02-07