Baidu Spider is an automated program, a piece of software used by the Baidu search engine. Like every crawler, Baidu Spider visits web pages on the internet and indexes them in a database based on keywords. When a user searches for a particular keyword, the most relevant pages are displayed at the top of the search results page.
User agents for Baidu Spider
Baidu Spider uses different user agents for different purposes, as shown below:
| Product Name | Baidu User Agent |
|---|---|
| Business Search (Advertisements) | Baiduspider-ads |
Baiduspider-cpro and Baiduspider-ads only crawl the web to perform the operations agreed with the customer; they do not index any pages and do not comply with the standard robots.txt protocol.
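Because the cpro and ads agents behave differently from the main search spider, a site might want to identify them separately in server-side logic. A minimal sketch in Python (the function name and regex are illustrative assumptions, not part of Baidu's documentation):

```python
import re

# Match "Baiduspider" optionally followed by a product suffix
# such as "-ads" or "-cpro" (pattern is an assumption for illustration).
BAIDU_UA_PATTERN = re.compile(r"Baiduspider(?:-(\w+))?", re.IGNORECASE)

def classify_baidu_agent(user_agent: str):
    """Return the Baiduspider product suffix ('ads', 'cpro', ...),
    an empty string for the main web-search spider,
    or None if the user agent is not a Baidu crawler."""
    match = BAIDU_UA_PATTERN.search(user_agent)
    if match is None:
        return None
    return (match.group(1) or "").lower()

print(classify_baidu_agent("Mozilla/5.0 (compatible; Baiduspider-ads)"))
print(classify_baidu_agent(
    "Mozilla/5.0 (compatible; Baiduspider/2.0; "
    "+http://www.baidu.com/search/spider.html)"))
print(classify_baidu_agent("Googlebot/2.1"))
```

A server could use the returned suffix, for example, to log or rate-limit the ads/cpro crawlers separately, since they do not honor robots.txt.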
Crawling Control of Baidu Spider
Baidu Spider automatically crawls your content to find the latest updates on your site. If Baidu Spider's crawling affects your site's performance, you can change the crawl rate in your Baidu Webmaster Tools account.
Using a robots.txt file will stop Baidu Spider from crawling your web pages. Note that if you set your robots.txt file to block access to a page that is already indexed, it may take several months for that page to be removed from the search results.
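For example, a robots.txt rule like the following blocks Baidu's main search spider from a directory while leaving other crawlers unaffected (the `/private/` path is illustrative):

```
User-agent: Baiduspider
Disallow: /private/
```

Remember that, as noted above, Baiduspider-cpro and Baiduspider-ads do not honor these rules.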
You also have the option of using a meta tag to prevent Baidu from showing a cached snapshot of your web pages in its search results.
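Based on Baidu's published webmaster guidance, the snapshot can be suppressed with a `noarchive` meta tag placed in the page's `<head>`; verify the exact value against Baidu's current documentation before relying on it:

```html
<meta name="Baiduspider" content="noarchive">
```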