The spider starts at a website's homepage, saves the page's content, and then follows every link it finds in the code. It then crawls each page linked from the homepage and repeats the process until it has discovered every page on the site.
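The crawl described above is essentially a breadth-first traversal of the site's link graph. The sketch below illustrates the idea with a hypothetical in-memory "site" (the `SITE` dictionary is invented for the example); a real spider would fetch each URL over HTTP and parse the links out of the HTML instead.

```python
from collections import deque

# Hypothetical in-memory site: URL -> (page content, links found on that page).
# A real spider would fetch each URL and extract <a href> targets instead.
SITE = {
    "/": ("homepage", ["/about", "/blog"]),
    "/about": ("about page", ["/"]),
    "/blog": ("blog index", ["/blog/post-1", "/about"]),
    "/blog/post-1": ("first post", ["/"]),
}

def crawl(start):
    """Breadth-first crawl: save each page's content, then follow its
    links, skipping URLs already seen, until no new pages remain."""
    saved = {}
    queue = deque([start])
    seen = {start}
    while queue:
        url = queue.popleft()
        content, links = SITE[url]
        saved[url] = content          # save content before following links
        for link in links:
            if link not in seen:      # never re-crawl a page
                seen.add(link)
                queue.append(link)
    return saved

pages = crawl("/")
print(sorted(pages))  # every page on the site, each visited exactly once
```

The `seen` set is what keeps the spider from looping forever on sites whose pages link back to each other, which almost all do.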

An XML sitemap tells search spiders exactly where all of your content pages are located, and it should be referenced from your robots.txt file so crawlers can find it.
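For example, a robots.txt file at the root of your site can point crawlers to the sitemap with a `Sitemap` directive (example.com below is a placeholder for your own domain):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap` line is independent of the `User-agent` rules, so any spider reading robots.txt will see it.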
