
Shopify Robots.txt

One of the most important foundational pieces of a website is the robots.txt file. This document tells search engine crawlers which pages on your site they may crawl and which ones they should ignore.
Unlike most websites, Shopify stores have additional features such as faceted navigation, which can create duplicate pages and increase the number of crawlable URLs exponentially.

You, therefore, have to control what search engines see and crawl. Doing this goes a long way in preventing low-quality pages from ending up in the search engine results pages (SERPs).
Most importantly, it helps Shopify owners manage crawl budget - the number of pages Google crawls on your site each day.

This article outlines everything you need to know about the Shopify robots.txt file - from what it is to how you can edit it to what correct syntax looks like. By the end of this post, you'll have the knowledge you need to make sure your Shopify store is being crawled and indexed correctly.

What Is a Robots.txt File?


A Shopify robots.txt file is a text file that tells search engine crawlers which pages on your store they can crawl and which ones are off-limits. Keeping crawlers away from low-quality pages helps keep those pages out of search results, which improves your store's SEO.

The robots.txt file is generated from a template known as robots.txt.liquid, which is located in the templates directory of your theme.

A Look at Shopify's Default Robots.txt File

When you check out your new Shopify store, you'll find a robots.txt file already configured. The quickest way to view it is to go to: domain.com/robots.txt. In the Shopify admin, the file is generated by the robots.txt.liquid template, which lives in the templates folder of your theme (more on creating it below).

The default robots.txt file looks something like this (an excerpt - your store's file will include additional rules and its own sitemap URL):

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /collections/*+*
Disallow: /search
Sitemap: https://your-store.com/sitemap.xml

The first line, User-agent: *, means that these rules apply to all robots. The Disallow lines that follow tell crawlers which paths to avoid, and the Sitemap line points them to your store's XML sitemap.

The default robots.txt file contains a wide range of rules. Some of the most important ones include:

Disallow: /search: This rule tells the robot not to visit the internal search results page. The internal search results page is the page that shows up when you search your store.

Disallow: /checkout: Your checkout page is where your customers enter their payment information. It is unique to each shopper and has no value in search results, so the default file keeps crawlers out of it.

Disallow: /cart: The cart page is where your customers can see the items they've added to their cart. Like the checkout page, it is user-specific, so you don't want robots crawling it.

Disallow: /account: The account page is where your customers can manage their account information. The default file keeps crawlers away from these private, user-specific pages as well.

Disallow: /collections/*+*: Faceted navigation can create many near-duplicate category page URLs in your Shopify store. These pages are not meant to be crawled, and if they end up in the index, they can cause problems with your store's SEO.

Sitemap: [Sitemap Links]: You will see a list of all the sitemaps for your store. You can find more information about sitemaps in our article on XML sitemaps.

The default robots.txt file is a great starting point. In fact, most store owners never need to change it. The default configuration is enough to keep most robots out of areas of your store that you don't want them to be in.

However, there are some situations where you might need to edit your robots.txt file. This includes adding or removing some rules to suit your unique needs. We will look at these potential use cases later in the article.

How to Create the Shopify robots.txt.liquid File


The process for creating the robots.txt.liquid file is quite simple and only requires a few steps:

  • Log in to your Shopify admin and go to Online Store > Themes.
  • Click the Actions button and select Edit code.
  • In the Templates folder, click Add a new template.
  • From the template type dropdown, select robots.txt.
  • Click Create template.

You can now edit your robots.txt.liquid file.
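When Shopify creates the template, it comes pre-filled with Liquid that loops through the default rule groups and prints them out. Based on the structure shown in Shopify's documentation, the default template looks roughly like this:

{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

Leaving this loop in place keeps all of Shopify's default rules; the customizations described below are added inside it.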

How to Edit the Shopify Robots.txt File

Adding a Rule

You may need to add a rule to your robots.txt file for a number of reasons. The most common reason is to allow or block a specific robot.

To add a rule, you add a new block of Liquid code to your robots.txt.liquid file. Each block targets the rule group for a particular user-agent (crawler) and looks like this:

{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: [URLPath]' }}
{%- endif -%}

For example, if you want to block all robots from crawling your checkout page, you would add the following code:

{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /checkout' }}
{%- endif -%}
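These if blocks don't stand on their own: in robots.txt.liquid they sit inside the for loop over the default groups shown earlier, which is where the group variable comes from. For example, the checkout rule would be placed like this (abridged sketch following the documented template structure):

{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- comment -%} Custom rule: block /checkout for all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /checkout' }}
  {%- endif -%}
{% endfor %}

Keeping the default loop intact ensures Shopify's built-in rules stay in place while your custom rules are appended to each group.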

If you want to block multiple directories, such as your /cart/ and /account/ pages, you can list the Disallow rules one after the other inside a single block. Note that robots.txt uses simple path prefixes and * wildcards, not regular expressions:

{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /cart' }}
{{ 'Disallow: /account' }}
{{ 'Disallow: /pages' }}
{%- endif -%}

You can also add rules that target specific robots. For example, to explicitly allow Googlebot to crawl your store, you would add the following code:

{%- if group.user_agent.value == 'Googlebot' -%}
{{ 'Allow: /' }}
{%- endif -%}


Potential Use Cases

Earlier, we mentioned that the default Shopify robots.txt file is usually enough to keep most robots out of areas of your store that you don't want them to be in. However, there are some situations where you might need to edit your robots.txt file. These potential use cases include:

Internal Site Search

At GoldenWeb, we advise our clients to block internal search pages from being indexed by Google and other search engines. This is a good SEO practice because internal search pages generally don't have much content on them, and they can compete with your actual product and category pages in the SERPs. This leads to a huge number of low-quality pages competing for the same keywords, which can result in keyword cannibalization.

To block internal search pages, you would add the following lines to your robots.txt file (Shopify's default file already includes this rule):

User-agent: *
Disallow: /search

However, we sometimes find Shopify stores that don't use Shopify's default internal search. They use search technologies like Swiftype or site search solutions like Searchspring instead. The result pages these tools generate are not covered by the default robots.txt file, so we add custom rules to block search engines from crawling the /pages/search directory.
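Because custom rules go through robots.txt.liquid, blocking these pages means adding another if block inside the default groups loop. A sketch, assuming the search tool renders results under /pages/search (adjust the path to match your setup):

{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /pages/search' }}
{%- endif -%}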

Faceted Navigation


Shopify's default robots.txt file also blocks search engines from crawling faceted navigation pages. Faceted navigation is a way of allowing customers to filter products by different criteria, such as price, color, size, etc. While this can be a great way to improve the user experience on your site, it can also result in a lot of duplicate content.

To block faceted navigation pages, we add the following lines to your robots.txt file:

User-agent: *
Disallow: /collections/all/price-*
Disallow: /collections/all/color-*

This blocks all pages that are generated by the price and color filters. However, we might want some of these pages to stay indexed if they have unique content on them. In this case, we add an Allow directive for each page that should remain crawlable.

For example, if we want to allow the /collections/all/color-blue page to be indexed, we add the following line to your robots.txt file:

Allow: /collections/all/color-blue
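On Shopify, these directives are added through robots.txt.liquid rather than typed into robots.txt directly. A sketch of the facet rules inside the default groups loop, using the example paths above:

{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /collections/all/price-*' }}
{{ 'Disallow: /collections/all/color-*' }}
{{ 'Allow: /collections/all/color-blue' }}
{%- endif -%}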

We use Google Search Console to identify pages with the most organic traffic so that we can add Allow commands for them.

Sorting Navigation

Sorting navigation allows customers to sort products by different criteria, such as price, name, alphabetical order, etc. This is a great way to keep your store organized and make it easy for customers to find what they're looking for.

However, like faceted navigation, it can also result in a lot of duplicate content. This is because the products on the page are usually the same, but they're just sorted in a different order.

At GoldenWeb, we advise our clients to block sorting navigation pages from being indexed by search engines. These pages generally don't have much unique content, and they can compete with your actual product and category pages in the SERPs.

To block sorting navigation pages, we add the following lines to your robots.txt file:

User-agent: *
Disallow: /collections/all?sort_by=*
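Note that Shopify's default rules already disallow /collections/*sort_by*, so check your live robots.txt before duplicating them. If you do need a custom variant, a sketch inside the default groups loop:

{%- if group.user_agent.value == '*' -%}
{{ 'Disallow: /collections/all?sort_by=*' }}
{%- endif -%}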


Partner with GoldenWeb for All Your SEO Needs

Editing your robots.txt file is just the tip of the iceberg when it comes to SEO. There are hundreds of Shopify technical SEO issues that need to be fixed in order for your store to rank higher in Google.

At GoldenWeb, our comprehensive SEO audit will identify all of the technical SEO issues that are holding your store back. We then create a personalized action plan to fix these issues so that you can start ranking higher and getting more organic traffic.

We identify over 285 SEO and CRO potential growth opportunities that can help increase traffic, conversion rate, and revenue. They range from easy to complex and are prioritized by the level of impact they have.

Schedule a free mini-audit to get an idea of what we can do for you and your Shopify store. We'll go over the results with you and answer any questions you have.

Deven Davis

RomaDesignerJewelry.com

"GoldenWeb’s work has been amazing for us.

We already had a decent base with content and structure, but their work has positioned us to continue to make gains in very competitive areas."

Alex Smith

ShopSolarKits.com

"Benjamin and his team have been great to work with, have proven themselves with a ton of results.

Benjamin goes out of his way to provide value in all areas of the business as he's an operator himself."