| Commit message (Collapse) | Author | Age | Files | Lines | |
|---|---|---|---|---|---|
| * | More Bots excluded in robots.txt (DataForSeoBot SemrushBot Applebot GPTBot ↵ | Bruno Cornec | 2026-03-09 | 1 | -0/+5 |
| | | | | | SeznamBot) | ||||
| * | Use more appropriate decoding/encoding when redirecting | Dan Fandrich | 2025-06-06 | 1 | -2/+2 |
| | | |||||
| * | Properly escape the target in the anti-robot redirect | Dan Fandrich | 2025-06-06 | 1 | -2/+2 |
| | | | | | Any additional URL parameters after a & were previously dropped. | ||||
| * | Use an absolute URL when redirecting | Dan Fandrich | 2025-05-23 | 1 | -3/+4 |
| | | | | | | This reduces the possibility of a malicious URL redirecting to another domain. | ||||
| * | Use a fixed random number in the cookie | Dan Fandrich | 2025-05-23 | 1 | -1/+1 |
| | | | | | | The intent of this cookie isn't actually to track sessions, so eliminate any privacy impact by using a fixed number instead. | ||||
| * | Add another allowed character for cookie redirects | Dan Fandrich | 2025-05-23 | 1 | -1/+1 |
| | | |||||
| * | Block expensive svnweb operations without a cookie | Dan Fandrich | 2025-05-23 | 1 | -0/+27 |
| | | | | | | | | | If an expensive request comes in from anyone without a cookie attached, redirect to a page where the cookie is set using JavaScript, then redirect back. This should block robots from these paths, most of which do not support JavaScript. The collateral damage is that a JavaScript browser is now required for users to access those paths. The contents of the cookie are not currently checked, merely that it is set. | ||||
| * | svnweb: add facebookexternalhit to robots.txt | Jani Välimaa | 2024-07-04 | 1 | -0/+1 |
| | | |||||
| * | svnweb: add more bots to robots.txt | Jani Välimaa | 2024-06-17 | 1 | -0/+4 |
| | | |||||
| * | Add two more robots to the svnweb robots.txt file | Dan Fandrich | 2024-04-25 | 1 | -1/+3 |
| | | | | | | These two are hammering the site right now, causing severe slowdowns. Restrict the paths they are allowed to access. | ||||
| * | viewvc/robots.txt: ban AhrefsBot | Nicolas Vigier | 2013-08-14 | 1 | -1/+2 |
| | | | | | It doesn't respect Crawl-delay or disallowed URLs, so try to disallow everything for it. | ||||
| * | viewvc/robots.txt: add more URLs | Nicolas Vigier | 2013-01-26 | 1 | -2/+2 |
| | | |||||
| * | viewvc/robots.txt: add Crawl-delay | Nicolas Vigier | 2013-01-26 | 1 | -0/+2 |
| | | |||||
| * | viewvc/robots.txt: add more URLs | Nicolas Vigier | 2013-01-26 | 1 | -2/+4 |
| | | |||||
| * | viewvc/robots.txt: add more URLs | Nicolas Vigier | 2013-01-26 | 1 | -1/+3 |
| | | |||||
| * | viewvc/robots.txt: add more URLs | Nicolas Vigier | 2013-01-26 | 1 | -0/+2 |
| | | |||||
| * | viewvc/robots.txt: filter ?sortby= urls | Nicolas Vigier | 2013-01-03 | 1 | -0/+1 |
| | | |||||
| * | viewvc/robots.txt: fix tags path | Nicolas Vigier | 2013-01-03 | 1 | -1/+1 |
| | | |||||
| * | viewvc: add robots.txt | Nicolas Vigier | 2013-01-03 | 1 | -0/+12 |
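The robots.txt commits above all use the standard `User-agent`/`Disallow` pattern. A minimal sketch combining the kinds of rules the log describes — a full ban for a crawler that ignores the rules, plus a Crawl-delay and path restrictions for everyone else. The bot name is taken from the log; the paths and delay value are hypothetical, not the site's actual rules:

```
# Fully ban a crawler that ignores Crawl-delay and Disallow rules
User-agent: AhrefsBot
Disallow: /

# Rate-limit everyone else and keep them off expensive query URLs
User-agent: *
Crawl-delay: 10
Disallow: /viewvc/*?sortby=
Disallow: /viewvc/*/tags/
```

Note that `Crawl-delay` is a widely-supported extension rather than part of the robots.txt standard, which is consistent with the 2013 commit observing that some bots ignore it.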
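The 2025-05-23 cookie-gate commit describes a mechanism: requests to expensive paths without a cookie get an interstitial page that sets a fixed-value cookie via JavaScript and redirects back, which filters out robots that don't execute JavaScript. A minimal sketch of that flow in Python; the cookie name, value, and path list are hypothetical, not taken from the actual svnweb code:

```python
# Sketch of a JavaScript cookie gate: non-cookie requests to expensive
# paths get a page that sets a cookie client-side and reloads the target.
# Per the log, the cookie value is fixed (not a session ID), so it cannot
# be used for tracking, and only its presence is checked, not its value.
from urllib.parse import quote

COOKIE_NAME = "svnweb_gate"   # hypothetical cookie name
COOKIE_VALUE = "1234567890"   # fixed value, same for every visitor

def needs_gate(path: str, cookie_header: str) -> bool:
    """Gate only expensive paths, and only when the cookie is absent."""
    expensive = path.startswith(("/svnweb/", "/viewvc/"))
    has_cookie = f"{COOKIE_NAME}=" in cookie_header
    return expensive and not has_cookie

def gate_page(target: str) -> str:
    """HTML that sets the cookie via JavaScript, then redirects back.

    The target is percent-encoded so characters like '&' survive the
    round trip -- the 2025-06-06 fix in the log addresses exactly the
    case where parameters after a '&' were being dropped.
    """
    enc = quote(target, safe="")
    return (
        "<html><body><script>"
        f'document.cookie = "{COOKIE_NAME}={COOKIE_VALUE}; path=/";'
        f'location.replace(decodeURIComponent("{enc}"));'
        "</script></body></html>"
    )
```

As the log notes, the collateral damage of this design is that human users of those paths now need a JavaScript-capable browser too.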
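The two redirect-hardening commits from 2025 make two separate fixes: percent-escape the target so parameters after a `&` are preserved, and rebuild the redirect as an absolute URL so a crafted target cannot send the visitor to another domain. A sketch of how both fit together, assuming the incoming target is not already percent-encoded; the host name is a stand-in, not the real site:

```python
# Sketch of redirect hardening: any scheme or host smuggled into the
# target (e.g. "//evil.example/x") is discarded, and the path and query
# are re-encoded onto a fixed, known-good origin.
from urllib.parse import quote, urlsplit, urlunsplit

SITE_SCHEME = "https"
SITE_HOST = "svnweb.example.org"   # hypothetical canonical host

def safe_redirect(target: str) -> str:
    """Return an absolute same-origin URL for the given redirect target."""
    parts = urlsplit(target)
    # Keep only path and query; dropping parts.netloc is what prevents
    # an attacker-supplied target from redirecting off-site.
    path = quote(parts.path, safe="/")
    # Leave '=' and '&' intact so multi-parameter queries survive,
    # escaping everything else.
    query = quote(parts.query, safe="=&")
    return urlunsplit((SITE_SCHEME, SITE_HOST, path, query, ""))
```

For example, a protocol-relative target like `//evil.example/x` comes back pinned to the trusted host, while an ordinary `/log/?rev=1&limit=5` round-trips with both parameters intact.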
