
HOW TO CREATE AND ADD A ROBOTS.TXT FILE IN BLOGGER FOR INDEXING AND CRAWLING

  • When we search for a keyword in Google or any other search engine, how does it show the results, and how were those results crawled by the search engine?
  • The answer lies in a small file that acts as a protocol between your website and the search engine: the robots.txt file. A robots.txt file is a plain text file available on almost every website, and search engines read it to decide which pages to crawl and index.
What is the robots.txt file?
  • The robots.txt file is a set of rules that tells search engine crawlers which pages of a website they may crawl and index.
  • In Blogger, the default file contains directives such as User-agent: Mediapartners-Google, User-agent: *, and Disallow, as shown in the example below.
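For reference, the default robots.txt file that Blogger generates for a blog typically looks like the sketch below. The blog address in the Sitemap line is a placeholder; your own blog's address will appear there instead.

    # AdSense crawler: the empty Disallow lets it crawl everything
    User-agent: Mediapartners-Google
    Disallow:

    # All other crawlers: block /search URLs, allow everything else
    User-agent: *
    Disallow: /search
    Allow: /

    # Placeholder address; Blogger fills in your real blog URL here
    Sitemap: https://yourblog.blogspot.com/sitemap.xml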
  • Disallow
  • By adding this directive you tell crawlers and robots not to crawl particular pages or paths.
  • Disallow: /search
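For example, the Disallow: /search rule from the default file keeps crawlers away from every URL that starts with /search, which in Blogger covers search result pages and label pages:

    User-agent: *
    # Blocks URLs such as /search?q=seo and /search/label/SEO
    Disallow: /search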
  • Can I disallow some pages in Blogger?
  • Yes! You can disallow some pages in Blogger by using the robots.txt file.
  • You can disallow pages like contact, privacy policy, etc.
  • Search engines will not crawl these pages. You can also disallow all label pages.
  • Only disallow labels and posts when it is necessary; otherwise you can lose your blog traffic. A sketch follows after this list.
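As a rough sketch, a robots.txt that blocks a contact page, a privacy policy page, and all label pages could look like this. The page file names are made-up examples; replace them with the actual URLs of your own pages.

    User-agent: *
    # Hypothetical static pages; use your real page URLs
    Disallow: /p/contact-us.html
    Disallow: /p/privacy-policy.html
    # Blocks every label page
    Disallow: /search/label/
    Allow: /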
How to disallow pages using the robots.txt file
  • You can easily stop search engines from indexing and crawling your pages and posts by using robots.txt.
  • If you wish to disallow a specific post, go to the robots.txt file and add:
  • Disallow: /year/month/URL.html
  • For static pages, add:
  • Disallow: /p/url.html
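For instance, to block one specific post and one specific static page, the rules might look like this. The year, month, and file names are placeholders that follow the URL patterns above.

    User-agent: *
    # A single post: Blogger post URLs follow /year/month/title.html
    Disallow: /2020/05/example-post.html
    # A single static page: Blogger page URLs follow /p/title.html
    Disallow: /p/example-page.html
    Allow: /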
How to add a robots.txt file in Blogger
  • You can easily add a robots.txt file in Blogger: log in to Blogger >> go to Settings >> Search preferences >> click Custom robots.txt in the Crawling and indexing section.
  • Click Edit, enable the custom robots.txt option, and paste your file there.
  • Save the changes. A complete example you can adapt is shown below.
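Putting it all together, here is a minimal custom robots.txt you could paste into that box, assuming a blog at yourblog.blogspot.com and the hypothetical contact page from the earlier sketch. Adjust the addresses and paths to match your own blog before saving.

    # AdSense crawler
    User-agent: Mediapartners-Google
    Disallow:

    # All other crawlers
    User-agent: *
    Disallow: /search
    Disallow: /p/contact-us.html
    Allow: /

    # Replace with your own blog's address
    Sitemap: https://yourblog.blogspot.com/sitemap.xml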
