A search spider enters a website's homepage, saves the page content, and then follows every link it finds in the code. It then crawls each page linked from the homepage and repeats the process until it has discovered every page on the site.
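The crawl-and-follow process described above is essentially a breadth-first traversal of the site's link graph. A minimal sketch, using a hypothetical in-memory site (the `site` dictionary stands in for fetching and parsing real pages), which also records broken links along the way:

```python
from collections import deque

# Toy site: each page maps to the links found in its HTML (assumed structure).
site = {
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/blog/post-1", "/missing"],
    "/blog/post-1": ["/"],
}

def crawl(start="/"):
    """Breadth-first crawl: save each page once, then follow every link."""
    seen, queue, broken = {start}, deque([start]), []
    while queue:
        page = queue.popleft()
        for link in site.get(page, []):
            if link not in site:          # link target does not exist
                broken.append((page, link))
            elif link not in seen:        # not yet crawled
                seen.add(link)
                queue.append(link)
    return seen, broken

pages, broken = crawl()
```

A real spider would fetch each URL over HTTP and extract links from the HTML, but the traversal logic is the same: track what has been seen, queue what has not.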
An XML sitemap is a good way to tell search spiders exactly where all your content pages are located; the sitemap itself should be referenced from your robots.txt file.
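Referencing the sitemap from robots.txt takes a single line; the domain below is a placeholder. A minimal robots.txt might look like:

```
User-agent: *
Sitemap: https://www.example.com/sitemap.xml
```

Spiders that fetch robots.txt before crawling will pick up the `Sitemap:` directive and know where to find the full list of pages.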
Check your broken links for search spiders
Posted by realwebseo under Marketing. From http://www.realwebseo.com 5242 days ago