SEO PowerSuite Introduces Robots.txt Generation

Posted on 4.01.2012

Link-Assistant.Com, the creator of the SEO PowerSuite toolkit, has announced that its on-page SEO tool WebSite Auditor has been enhanced with a robots.txt generation function.

Robots.txt files contain instructions that tell search engine crawlers which Web pages they may access. Website owners can use these files to hide confidential pages (e.g. pages meant for internal use only), to conceal pages with outdated information (e.g. past sales) and to improve overall SEO results by ensuring that only the best-converting pages show up in search results.
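For illustration, a minimal robots.txt file of the kind such a tool might generate could look like this (the paths and user-agent shown are hypothetical examples, not output from WebSite Auditor):

```
User-agent: *
Disallow: /internal/
Disallow: /sales-2011/
```

The first line addresses all crawlers; each Disallow line asks them not to fetch URLs beginning with that path.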

Users can set up the new function simply by hitting the robots.txt button, selecting a page they want to protect from crawling in their WebSite Auditor project, opting for disallow and choosing a search robot from the list.
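Once a robots.txt file is generated and uploaded, its effect on a given crawler can be checked programmatically. A quick sketch using Python's standard-library robots.txt parser (the rules and URLs here are made-up examples, not tied to WebSite Auditor):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical generated robots.txt, supplied as lines.
rules = """\
User-agent: *
Disallow: /internal/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A disallowed path is blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/internal/report.html"))  # False
# ...while everything else remains crawlable.
print(rp.can_fetch("*", "https://example.com/products.html"))  # True
```

In practice the parser would read the live file via `rp.set_url(...)` and `rp.read()`; parsing from a string just keeps the example self-contained.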

More information on the update is available here.

“We update every tool in the SEO PowerSuite toolkit on a weekly basis, making them compliant with every search engine update,” explains Viktar Khamianok, CEO of Link-Assistant.Com. “Taking into account recent and coming Google updates, SEOs should pay particular attention to best on-page SEO practices – that’s where WebSite Auditor comes in handy.”

Apart from WebSite Auditor, the SEO PowerSuite consists of the following tools:

• Rank Tracker (rank checking and keyword research tool)
• SEO SpyGlass (backlink checker)
• LinkAssistant (link building software)
