SITE_BY_REQUEST missing from docs #111

Open
thenewguy opened this issue Apr 1, 2021 · 1 comment
Comments

@thenewguy

Used in the following places:

@some1ataplace

SITE_BY_REQUEST is a setting in django-robots that determines whether the Site used to render robots.txt is resolved from the current request's domain rather than from the SITE_ID setting.

When SITE_BY_REQUEST is True, the view in robots/views.py looks up the current Site by the domain of the incoming request, so the robots.txt content is built from the rules attached to that site. This can be useful if you have multiple sites running on the same Django instance, each with its own robots.txt.

To enable it, set ROBOTS_SITE_BY_REQUEST to True in your Django settings file:

ROBOTS_SITE_BY_REQUEST = True
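
Internally, the package reads this into a module-level constant with a False default. A minimal sketch of the relevant line in robots/settings.py (note the ROBOTS_ prefix on the user-facing name versus the bare internal name):

from django.conf import settings

# User-facing setting ROBOTS_SITE_BY_REQUEST; defaults to False
SITE_BY_REQUEST = getattr(settings, "ROBOTS_SITE_BY_REQUEST", False)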

The check itself lives in the RuleList view in robots/views.py, which consults SITE_BY_REQUEST when resolving the current site. Roughly (a paraphrased sketch of the relevant logic, not a verbatim copy of the source):

from django.contrib.sites.models import Site
from django.contrib.sites.shortcuts import get_current_site
from django.views.generic import ListView

from robots.settings import SITE_BY_REQUEST


class RuleList(ListView):
    # Serves the generated robots.txt

    def get_current_site(self, request):
        if SITE_BY_REQUEST:
            # Match the Site whose domain equals the requested host
            return Site.objects.get(domain=request.get_host())
        # Otherwise fall back to the sites framework (SITE_ID)
        return get_current_site(request)
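
With the stock package you don't need a custom view at all; the usual wiring (per the django-robots README) is to include robots.urls in your URLconf:

# urls.py: route /robots.txt to django-robots' built-in view
from django.urls import include, path

urlpatterns = [
    path("robots.txt", include("robots.urls")),
]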

In this logic, when SITE_BY_REQUEST is True the current site is looked up by the request's host, and robots.txt is rendered from the rules attached to that Site. When it is False, the site is resolved through the standard sites framework instead.

By default, SITE_BY_REQUEST is False, so if you serve a single site you don't need to do anything. However, if you have multiple sites running on the same Django instance, each with its own robots.txt rules, you can enable it so the correct rules are served for each request's domain.
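
To make the multi-site case concrete, here is a hypothetical setup (domains and patterns invented for illustration), assuming the Rule and Url models from robots.models:

from django.contrib.sites.models import Site

from robots.models import Rule, Url

# Two domains served by the same Django instance (example values)
shop = Site.objects.create(domain="shop.example.com", name="Shop")
blog = Site.objects.create(domain="blog.example.com", name="Blog")

# A path pattern that should be disallowed on the shop site only
admin_url = Url.objects.create(pattern="/admin/")

shop_rule = Rule.objects.create(robot="*")
shop_rule.sites.add(shop)
shop_rule.disallowed.add(admin_url)

blog_rule = Rule.objects.create(robot="*")
blog_rule.sites.add(blog)

# With ROBOTS_SITE_BY_REQUEST = True, a request for
# http://shop.example.com/robots.txt is matched to the "shop" Site by
# its Host header, so only shop_rule is rendered there.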
