#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
#        Used:    http://example.com/robots.txt
#        Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
#        http://www.robotstxt.org/robotstxt.html

# Disable custom / unwanted crawlers and generic HTTP clients.
User-agent: FeedDemon
Disallow: /

User-agent: BOT/0.1 (BOT for JCE)
Disallow: /

User-agent: CrawlDaddy
Disallow: /

User-agent: Java
Disallow: /

User-agent: Jullo
Disallow: /

User-agent: Feedly
Disallow: /

User-agent: UniversalFeedParser
Disallow: /

User-agent: ApacheBench
Disallow: /

User-agent: Swiftbot
Disallow: /

User-agent: YandexBot
Disallow: /

User-agent: jikeSpider
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: ZmEu phpmyadmin
Disallow: /

User-agent: WinHttp
Disallow: /

User-agent: EasouSpider
Disallow: /

User-agent: HttpClient
Disallow: /

User-agent: Microsoft URL Control
Disallow: /

User-agent: YYSpider
Disallow: /

User-agent: jaunty
Disallow: /

User-agent: oBot
Disallow: /

User-agent: Python-urllib
Disallow: /

User-agent: Indy Library
Disallow: /

User-agent: FlightDeckReports Bot
Disallow: /

User-agent: Linguee Bot
Disallow: /

# All other robots are allowed everywhere (a group needs at least one
# rule, so an empty Disallow makes the allow-all explicit).
User-agent: *
Disallow:

Sitemap: https://www.firefly.world/sitemap.xml
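
# A quick way to verify these rules is Python's standard-library
# urllib.robotparser. The sketch below (kept as comments so this file
# stays valid robots.txt) assumes a local copy saved as "robots.txt";
# the test agents are illustrative, not part of this file:
#
#   from urllib.robotparser import RobotFileParser
#
#   rp = RobotFileParser()
#   # Parse the local copy instead of fetching it over the network.
#   with open("robots.txt") as f:
#       rp.parse(f.read().splitlines())
#
#   # A listed crawler is disallowed everywhere:
#   print(rp.can_fetch("FeedDemon", "https://www.firefly.world/"))   # False
#   # Any other agent falls through to the permissive "*" group:
#   print(rp.can_fetch("Googlebot", "https://www.firefly.world/"))   # True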