Feature request: @block_robots decorator for views #42
I think it's out of the scope of this application.
This code defines a block_robots decorator that first checks whether the User-Agent of the incoming request matches any of the blocked agents in the list. If a match is found, an HTTP 403 Forbidden response is returned; if not, the request continues to the wrapped view. Feel free to customize the list of blocked agents to your requirements. The code uses regular expressions with a case-insensitive search, enabling partial matches, so you can easily include wildcards in the blocked-agents list as needed. Keep in mind that even though this workaround stops misbehaving bots from reaching your views, the ideal way to restrict access is still a properly configured robots.txt file. Here is sample code to create a custom @block_robots decorator that blocks robots from views based on User-Agent:
You would need to define the BLOCKED_ROBOTS list in your Django settings file with the user-agent strings of the robots you want to block. The @check_robots_txt decorator is included to ensure that a view respects the robots.txt file; you can add it to any view that should respect robots.txt, even one that doesn't need to block robots. Here's an example of how you could define BLOCKED_ROBOTS in your Django settings file (settings.py):
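A sketch of that settings entry; the setting name is the one proposed in this thread (it is not read by django-robots itself), and the specific patterns are illustrative assumptions:

```python
# settings.py
# Hypothetical setting consumed by a custom block_robots decorator.
# Entries are regular-expression patterns matched against the incoming
# request's User-Agent header.
BLOCKED_ROBOTS = [
    r"googlebot",     # matches any User-Agent containing "googlebot"
    r"bingbot",
    r".*scraper.*",   # wildcard-style partial match
]
```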
Note that this example is case-insensitive, so any user-agent string containing "googlebot" will be blocked whether it is spelled in uppercase or lowercase. If you want case-sensitive matching, remove the case-folding step (for example, a lower() call or a case-insensitive flag) from the matching code.
It would be nice if django-robots included a decorator to block robots from views based on User-Agent (like robots.txt). It would help Django apps outright prevent robots, even misbehaving ones that don't follow robots.txt, from accessing views that they shouldn't.