There seems to be some confusion here on the issue of cloaking. The distinction is not between users and search engines, but between users coming from one place and users coming from another. Google apparently wants its customers to have more access to your site than customers from elsewhere - which shouldn't come as a surprise.
*However*, if you allow googlebot to index the whole page, but only show a limited page to browsers (whether referred from Google or not), then it is cloaking.
Google would possibly (probably?) argue that it's okay as long as the googlebot-indexed page has the same content as the page seen by users *when they click through from Google*.
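The distinction being debated can be sketched in a few lines. This is a hypothetical illustration, not anyone's actual implementation: `page_for` is an invented helper, and the header checks are deliberately naive (real crawler detection would verify the bot, not just match the User-Agent string).

```python
# Hypothetical sketch of the two behaviors discussed above:
# referrer-based gating vs. user-agent cloaking.
def page_for(request_headers, full_page, limited_page):
    ua = request_headers.get("User-Agent", "")
    referrer = request_headers.get("Referer", "")
    if "Googlebot" in ua:
        # Serving the full page to the crawler: if browsers can never
        # see this same content, that is cloaking.
        return full_page
    if "google." in referrer:
        # Referrer-based gating: visitors who click through from Google
        # see the same full page the crawler indexed, so the indexed
        # copy and the Google-referred user's view stay identical.
        return full_page
    # Everyone else gets the limited page.
    return limited_page
```

On this reading, the scheme is defensible only because the second branch exists: the crawler's view and the Google-referred user's view match, and only direct or otherwise-referred visitors see less.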
I wonder why anybody would want to implement this. Maybe some of the above enthusiasts could explain what good they see here?