GOOGLEBOT
Google has recently confirmed that Googlebot can now recognise duplicate content even before a page is crawled. The confirmation came from Google's John Mueller during one of the company's Webmaster Central hangouts, in response to a question from a site owner who wondered whether Google would treat content written in French as a duplicate of its English version.

Beyond confirming that the capability exists, Mueller's answer struck a vital chord among SEO practitioners. His statement suggested that Google may predict a page is a duplicate when its URL parameters follow the same pattern as URLs that have already been found to serve identical content. Many SEO experts worry that this may not work in site owners' favour: pages with genuinely unique content could share the same URL parameter pattern as pages that really do contain duplicates, and Google would then be likely to mark them as duplicates as well.

Even so, other site owners and SEO practitioners are optimistic that this risk can be avoided by paying closer attention to how their websites generate URL parameters. Finally, Mueller admitted that unique content being labelled as duplicate is not always the webmasters' fault, since bugs on Google's side are sometimes responsible.
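Google has not published how this pre-crawl prediction works, but the idea Mueller described can be sketched roughly in code: group URLs by their parameter pattern, and treat a new URL as a likely duplicate if its pattern matches URLs that have already been found to serve identical content. The sketch below is purely illustrative; the signature logic, the sample URLs, and the canonicalize helper are assumptions, not Google's actual implementation.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def parameter_signature(url: str) -> tuple:
    """Reduce a URL to its path plus the sorted names of its query parameters."""
    parts = urlsplit(url)
    names = tuple(sorted(name for name, _ in parse_qsl(parts.query)))
    return (parts.path, names)

# Hypothetical crawl history: URL -> whether its content turned out to be duplicate.
crawled = {
    "https://example.com/products?id=1&sessionid=abc": True,
    "https://example.com/products?id=1&sessionid=xyz": True,
    "https://example.com/articles?lang=fr": False,
}

# Parameter patterns that have previously produced duplicate content.
duplicate_signatures = {
    parameter_signature(url) for url, is_dup in crawled.items() if is_dup
}

def likely_duplicate(url: str) -> bool:
    """Guess, before crawling, whether a URL will serve duplicate content."""
    return parameter_signature(url) in duplicate_signatures

def canonicalize(url: str, drop=frozenset({"sessionid", "utm_source"})) -> str:
    """Drop parameters that do not change the page, so unique pages stop
    sharing a 'duplicate-looking' URL pattern (illustrative parameter list)."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(likely_duplicate("https://example.com/products?id=2&sessionid=def"))  # True
print(likely_duplicate("https://example.com/articles?lang=en"))             # False
print(canonicalize("https://example.com/products?id=2&sessionid=def"))
# https://example.com/products?id=2
```

The second half of the sketch is the part site owners actually control: stripping parameters such as session IDs or tracking tags from generated links keeps unique pages from sharing a URL pattern with known duplicates.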
Google Confirms Googlebot that Automatically Detects Duplicate Content HTTPS://ANYTHINGSEO.WORDPRESS.COM/