RED is a robot that checks HTTP resources to see how they'll behave, pointing out common problems and suggesting improvements. Although it is not an HTTP conformance tester, it can find a number of HTTP-related issues.
RED interacts with the resource at the provided URL to check for a large number of common HTTP problems.
Additionally, it will tell you how well your resource supports a number of HTTP features.
See the source (available from the project page) for more details; new capabilities are added to RED all the time.
redbot.org may be used to check your own resources as well as those owned by others. However, please refrain from generating abusive amounts of traffic (e.g., repeatedly checking the same resource). All requests are logged.
This service may not be framed, scraped or otherwise repurposed. Automated queries (e.g., from a script) are not allowed. REDbot can be downloaded, deployed and reused, as long as the license terms are met; see the RED project page for more information.
redbot.org keeps logs of requests to help debug the service, and it retains additional information when a problem is detected. If your requests include sensitive information, it may be stored as a result.
We undertake not to abuse or share this information. If you are concerned about this, you should consider running your own instance of RED.
Additionally, keep in mind that while RED uses SSL/TLS to encrypt requests to HTTPS URLs, communication between your browser and redbot.org is unencrypted.
You can download RED from the RED project page and run it locally.
If you believe RED's output is incorrect, please raise a detailed issue on the RED project page.
If you are a webmaster and don't want this service to be able to access your pages, please add the "RED" User-Agent to your robots.txt file.
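For example, a robots.txt entry that blocks RED from your entire site might look like the sketch below; the Disallow path is an illustration, and you can narrow it to cover only the areas you want to protect:

```
# Block the RED checker from the whole site
User-agent: RED
Disallow: /
```

Place this file at the root of your site (e.g., at /robots.txt) so RED can find it before fetching other resources.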
Other discussion can occur on the redbot-users mailing list; see the RED project page.