The robots.txt file is retrieved by search engines such as Google and Bing and tells their crawlers which pages may be crawled and indexed and which must be ignored.
A robot (also known as a spider or web crawler) is a program that automatically traverses the web's hypertext structure by retrieving a document and then recursively retrieving every document it references. Giving search engine bots clear instructions adds visibility to your site and lets people reach your content more efficiently by excluding irrelevant pages from search results.
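For reference, a minimal robots.txt might look like the sketch below; the `/private/` and `/tmp/` paths and the sitemap URL are hypothetical placeholders, not values produced by the plugin.

```
# Applies to all crawlers (Google, Bing, etc.)
User-agent: *
# Hypothetical directories to keep out of search results
Disallow: /private/
Disallow: /tmp/
# Everything else may be crawled
Allow: /

# Optional: point crawlers at the sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file simply lives at the root of your site; crawlers fetch it before requesting other pages and honor the rules that match their user agent.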
- Brand new interface.
- Completely rewritten plugin.
- Support for multiple directives for each rule.
- Mac OS X 10.5 or later
- RapidWeaver 4 or later