Publisher description
Robots.txt is a visual editor for robot exclusion files and a log analyzer. It lets a user quickly and easily create the robots.txt files that instruct search engine spiders which parts of a Web site are not to be indexed and made searchable by the general Web public, and then identify spiders that do not obey those instructions. The program lets the user log onto an FTP or local network server and select the documents and directories that are not to be made searchable. With this program you can visually generate industry-standard robots.txt files; identify malicious and unwanted spiders and ban them from your site; direct search engine crawlers to the appropriate pages of multilingual sites; use robots.txt files for doorway page management; keep spiders out of sensitive and private areas of your Web site; upload correctly formatted robots.txt files directly to your FTP server without leaving Robots.txt Editor; track spider visits; create spider visit reports in HTML, Microsoft Excel CSV, and XML formats; and more. Program updates and upgrades are free and unrestricted in time, and the program works with an unlimited number of Web sites.
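To illustrate the kind of output the editor produces, here is a minimal sketch of how a generated robots.txt file controls spider access, checked with Python's standard urllib.robotparser module. The rules and the "BadBot" user agent are hypothetical examples, not output taken from the program itself:

```python
from urllib import robotparser

# A small robots.txt of the kind a visual editor might generate
# (hypothetical paths and user agents for illustration):
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /drafts/

User-agent: BadBot
Disallow: /
"""

# Parse the rules and query them the way a well-behaved spider would.
rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A public page is fetchable by any compliant spider.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
# A page under /private/ is blocked for all spiders.
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
# The banned spider is excluded from the whole site.
print(rp.can_fetch("BadBot", "https://example.com/index.html"))    # False
```

Spiders that ignore these rules can then be spotted in the server logs, which is what the log-analysis side of the program is for.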
Related Programs
ROBO Optimizer Search Engine Optimization 2.5.5
Powerful search engine optimization wizard
WebPosition 4 Pro SEO Software Search Engine Optimization 4.0b.785
SEO Software WebPosition Gold 4
ROBO Optimizer Pro Search Engine Optimization 2.5.3
Serious search engine optimization software
1-Hour Search Engine Optimization Crash Course 1.5
Free search engine optimization crash course
Urfin - File Search Engine for LAN 1.1
Urfin is a network file search engine