Meet our Kitextbot crawler

Kitextbot crawls the web with a mobile user-agent variant to support our research needs in text analytics and natural language processing.

User Agent: Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X) Mobile Safari/604.1 (compatible; Kitextbot/1.0; +https://kitext.com/bot.html)
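
If you need to identify Kitextbot requests in your server logs or application code, matching on the "Kitextbot" product token is sufficient. A minimal Python sketch (the function name is our own illustration, not part of any published API):

	def is_kitextbot(user_agent: str) -> bool:
	    # Every Kitextbot request carries the "Kitextbot/" product token.
	    return "Kitextbot/" in user_agent

	ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X) "
	      "Mobile Safari/604.1 (compatible; Kitextbot/1.0; +https://kitext.com/bot.html)")
	print(is_kitextbot(ua))  # True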

Controlling Crawling and Crawl Rates

To control how Kitextbot interacts with your website, you can add Allow and Disallow rules for it to your robots.txt file, for example:


	User-agent: Kitextbot
	Allow: /recipes/
	Disallow: /

This restricts Kitextbot to crawling only pages under the /recipes/ path.
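
Before deploying such rules, you can check how they apply to specific URLs with Python's standard urllib.robotparser module; a minimal sketch, assuming the example rules above and a hypothetical example.com site:

	from urllib.robotparser import RobotFileParser

	rules = [
	    "User-agent: Kitextbot",
	    "Allow: /recipes/",
	    "Disallow: /",
	]

	rp = RobotFileParser()
	rp.parse(rules)

	print(rp.can_fetch("Kitextbot", "https://example.com/recipes/soup"))  # True
	print(rp.can_fetch("Kitextbot", "https://example.com/about"))         # False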

Kitextbot crawls at most one page every two seconds and no more than 20 pages per minute. If you wish to slow this rate further, you can add a "Crawl-delay" rule, for example:


	User-agent: Kitextbot
	Crawl-delay: 10

This reduces the rate to at most one page every ten seconds.
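
The same urllib.robotparser module exposes any Crawl-delay value, so you can verify the rule parses as intended; a minimal sketch, assuming the rules just shown:

	from urllib.robotparser import RobotFileParser

	rules = [
	    "User-agent: Kitextbot",
	    "Crawl-delay: 10",
	]

	rp = RobotFileParser()
	rp.parse(rules)

	# crawl_delay() returns the delay in seconds, or None if no rule is set.
	print(rp.crawl_delay("Kitextbot"))  # 10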

Controlling How Our Public Search Engine Displays Descriptions of Your Pages

If you do not wish our public search engine to show a description or a preview thumbnail (where applicable) for a page, add a "robots" meta tag to the HTML head, as follows:


	<meta name="robots" content="nosnippet">

Kitextbot does not archive or cache pages for display in search engine results. It supports the following "robots" meta tags, saving a copy of any it finds in the HTML head for later reference:


	<meta name="robots" content="noindex">
	<meta name="robots" content="nofollow">
	<meta name="robots" content="noarchive">
	<meta name="robots" content="nocache">
	<meta name="robots" content="noodp">
	<meta name="robots" content="nosnippet">
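
To check which directives a given page declares, you can extract its "robots" meta tags with Python's standard html.parser module; a minimal sketch (the class name is our own illustration):

	from html.parser import HTMLParser

	class RobotsMetaParser(HTMLParser):
	    # Collects the directives found in <meta name="robots" content="..."> tags.
	    def __init__(self):
	        super().__init__()
	        self.directives = set()

	    def handle_starttag(self, tag, attrs):
	        if tag != "meta":
	            return
	        attrs = dict(attrs)
	        if (attrs.get("name") or "").lower() == "robots":
	            content = attrs.get("content") or ""
	            self.directives.update(
	                d.strip().lower() for d in content.split(",") if d.strip())

	parser = RobotsMetaParser()
	parser.feed('<head><meta name="robots" content="noindex, nosnippet"></head>')
	print(parser.directives)  # {'noindex', 'nosnippet'}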

It also saves a copy of robots.txt and sitemap XML files, which can be used later for traceability.

Reporting Problems

If you notice crawl issues with Kitextbot, please contact us through our Contact page.