robots.txt Detection

Check whether a website has a robots.txt file. Once configured, the extension can also detect whether other specified paths exist.

What is robots.txt Detection?
Robots.txt Detection is a Firefox add-on that checks whether a website has a robots.txt file and can detect other paths as well. When the file is found, it prompts the user with an icon in the address bar. The extension can be configured to check additional paths on each website being viewed.
By: Werner
Users: 25 ▲ 1
Version: 0.3 (Last updated: 2019-11-16)
Creation date: 2019-11-12
Weekly download count: 2
Firefox on Android: No
Risk impact: High risk impact
Risk likelihood: Low risk likelihood
Manifest version: 2
  • tabs
  • storage
  • <all_urls>
Size: 50.89K
URLs: Website

Other platforms

Not available on Chrome
Not available on Edge

The purpose of this extension is to automatically detect whether a specified path exists on each website being viewed. By default, only robots.txt is detected. If the path exists, an icon appears in the address bar to prompt the user.
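The detection flow described above can be sketched as a Manifest V2 background script: probe the current origin for the path, and show a page-action icon on success. This is a minimal illustration, not the add-on's actual source; the function names (`buildProbeUrl`, `probePath`) and the HEAD-request approach are assumptions.

```javascript
// Build the probe URL for a path (default "robots.txt") on the page's origin.
// Hypothetical helper; the real add-on's internals are not published here.
function buildProbeUrl(pageUrl, path = "robots.txt") {
  const origin = new URL(pageUrl).origin;
  return `${origin}/${path}`;
}

// Probe the URL; treat any 2xx response as "the path exists".
// A HEAD request is one plausible choice to avoid downloading the body.
async function probePath(pageUrl, path = "robots.txt") {
  const response = await fetch(buildProbeUrl(pageUrl, path), { method: "HEAD" });
  return response.ok;
}

// In a WebExtension context: on each completed page load, probe the origin
// and reveal the address-bar icon if the file exists. Requires the "tabs"
// permission and a page_action entry in manifest.json (Manifest V2).
if (typeof browser !== "undefined") {
  browser.tabs.onUpdated.addListener(async (tabId, changeInfo, tab) => {
    if (changeInfo.status !== "complete" || !tab.url?.startsWith("http")) return;
    if (await probePath(tab.url)) {
      browser.pageAction.show(tabId);
    }
  });
}
```

The `<all_urls>` host permission listed above is what would allow such cross-origin `fetch` probes from the background script, which also explains the "high risk impact" rating: the same permission could read data from any site.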

Risk impact

robots.txt Detection is risky to use, as it requires a number of sensitive permissions that could potentially harm your browser or steal your data. Review the permissions carefully, and only install robots.txt Detection if you trust the publisher.

Risk likelihood

robots.txt Detection has earned a fairly good reputation and likely can be trusted.
