Integrating Hosted Files

After setting up the sitemap and the robots.txt file, make sure your robots.txt file contains the corresponding Sitemap entry. The goal is to make both the sitemap and the robots.txt file available under your own shop domain.
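
As a minimal sketch, a robots.txt served from your shop domain could reference the hosted sitemap like this (www.example.com and the /sitemap.xml path are hypothetical; use the actual URLs of your shop):

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml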

For the frontend integration, you have two implementation options: a proxy pass or a redirect. Both are shown in the examples below.

Get Sitemap and Robots URL

Sitemap

  1. Go to Add-Ons > Sitemap Add-On > Sitemap Management.
  2. In the Schedule section, the publicly accessible AWS link is shown.
    Copy the link and embed it in your templates.

Robots

  1. Go to Add-Ons > Sitemap Add-On > Robots Management.
  2. For each generated robots file, the publicly accessible AWS link is shown in the respective section.
    Copy the link and embed it in your templates.

Examples

Proxy Pass

Nginx

To set up a proxy pass in Nginx, ensure that the ngx_http_proxy_module is available. It is included in standard Nginx builds unless Nginx was compiled with --without-http_proxy_module.

Add the following configuration to your Nginx server block:

location = /robots.txt {
    proxy_pass https://{{tenant-space}}-sitemaps-sitemap.s3.eu-west-1.amazonaws.com/robots_files/ac/{{locale}}-robots.txt;
    # Do not override the Host header here: S3 resolves the bucket from the
    # hostname, so Nginx's default (the upstream host) must be preserved.
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
}
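
The same pattern works for the sitemap. As a sketch, with {{sitemap-aws-url}} standing in as a placeholder for the publicly accessible AWS link shown in Sitemap Management:

location = /sitemap.xml {
    proxy_pass {{sitemap-aws-url}};
}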

Apache

To set up a proxy pass in Apache, you need to ensure that the mod_proxy, mod_proxy_http, and mod_ssl modules are enabled. Because the target is an HTTPS URL, SSLProxyEngine must also be switched on. Then, you can add the following configuration:

SSLProxyEngine On
ProxyPass /robots.txt https://{{tenant-space}}-sitemaps-sitemap.s3.eu-west-1.amazonaws.com/robots_files/ac/{{locale}}-robots.txt
ProxyPassReverse /robots.txt https://{{tenant-space}}-sitemaps-sitemap.s3.eu-west-1.amazonaws.com/robots_files/ac/{{locale}}-robots.txt
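
On Debian-based systems, the required modules can usually be enabled as follows (a sketch; commands vary by distribution):

a2enmod proxy proxy_http ssl
systemctl restart apache2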

Redirect

Nginx

To set up a redirect in Nginx, you can add the following configuration to your server block:

location = /robots.txt {
    return 301 https://{{tenant-space}}-sitemaps-sitemap.s3.eu-west-1.amazonaws.com/robots_files/ac/{{locale}}-robots.txt;
}

Apache

The Apache configuration uses a rewrite rule (mod_rewrite must be enabled) to redirect the request to the AWS URL. Add the following configuration to your Apache .htaccess file or server configuration:

RewriteEngine On
RewriteRule ^/?robots\.txt$ https://{{tenant-space}}-sitemaps-sitemap.s3.eu-west-1.amazonaws.com/robots_files/ac/{{locale}}-robots.txt [R=301,L]
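
After reloading your web server, you can verify the integration from the command line (www.example.com is a hypothetical shop domain). A proxy pass should answer with 200 and the file contents, a redirect with 301 and a Location header pointing at the AWS URL:

curl -I https://www.example.com/robots.txt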

Benefits

  • Transparency: Users and search engines see the sitemap and robots.txt as served from your own domain.
  • Ease of Update: Changes to the hosted files take effect immediately, with no need to update your web server files.