Overview

One of the great features of WooCommerce is the amount of flexibility you have when creating your storefront. While this level of customisation is great, you also need to consider how your site will perform and how you want bots to crawl it.

With the explosion of AI services now crawling websites to scrape content, on top of traditional search engine crawlers, a poorly configured site may become slow or even completely unresponsive. With page speed and buyer attention spans key to customer conversion, ensuring your WooCommerce site is correctly configured is paramount to your success.

Guidance

Ensure you have page caching

The easiest way to ensure most of your site loads without issue is simply to enable page caching. We still recommend W3 Total Cache, though any correctly configured page cache is fine. Page caching serves a static version of a page where possible, allowing for the fastest possible page load speeds while not taking resources away from customers who may be buying.
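One thing to watch is that dynamic pages such as the cart and checkout are never served from the cache. Most caching plugins (W3 Total Cache included) exclude these by default, but as a belt-and-braces sketch, and not any particular plugin's own API, you can have WordPress send standard no-cache headers on WooCommerce's dynamic pages:

add_action( 'template_redirect', function () {
	// Mark cart, checkout and account pages as uncacheable.
	if ( function_exists( 'is_cart' ) && ( is_cart() || is_checkout() || is_account_page() ) ) {
		nocache_headers(); // Sends Cache-Control / Expires headers telling caches not to store the page.
	}
} );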

Ensure your theme is coded correctly

Ideally, you want to tell bots and crawlers which pages should be indexed and which shouldn't. Links such as add-to-cart buttons should be marked up so bots know not to follow them. Similarly, links to filters for colours, brands and the like can, if not configured correctly, create thousands of extra URLs for crawlers to work through on your website.

We recommend talking to your web developer to double-check that your theme has been correctly configured here.
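As an illustration, WooCommerce passes its catalogue add-to-cart links through the woocommerce_loop_add_to_cart_link filter, so a theme or small snippet can make sure those links carry rel="nofollow". This is a minimal sketch, assuming your theme renders the links via the standard filter; recent WooCommerce templates already include the attribute:

add_filter( 'woocommerce_loop_add_to_cart_link', function ( $html ) {
	// Only add rel="nofollow" if the link doesn't already declare a rel attribute.
	if ( false === strpos( $html, 'rel=' ) ) {
		$html = str_replace( '<a ', '<a rel="nofollow" ', $html );
	}
	return $html;
} );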

Ensure a static robots.txt file exists

The robots.txt file tells the bots and crawlers which pages they should crawl and which they shouldn’t. While this can be ignored by some AI bots, it’s at least a starting point to ensure all of the legitimate bots don’t hit parts of your website which shouldn’t be indexed.

If the file doesn't exist, WordPress will attempt to generate it dynamically, placing even more load on your site. It's highly recommended to have a static file, which the most popular SEO plugins also support.

You can then add further guidance (beyond "nofollow" / "noindex" flags in your theme code) to explicitly tell bots and crawlers which pages they shouldn't even request.

Here's a simple (though not necessarily comprehensive) set of rules you can add to your robots.txt:

User-agent: *

# Block filters and Woo links
Disallow: /*add-to-cart=
Disallow: /*add-to-wishlist=
Disallow: /*filter_*
Disallow: /*orderby=
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/

# Block WP-Admin
Disallow: /wp-admin/

# Block search pages
Disallow: /*?s=
Disallow: /search/

Allow: /wp-admin/admin-ajax.php
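If you can't place a static file on your server, one fallback is to append these rules to the dynamically generated file via WordPress's core robots_txt filter. This is a sketch only, and a static file remains preferable because it avoids loading WordPress for every robots.txt request:

add_filter( 'robots_txt', function ( $output ) {
	// Append WooCommerce-specific rules to whatever WordPress generates.
	$output .= "\nDisallow: /*add-to-cart=\n";
	$output .= "Disallow: /cart/\n";
	$output .= "Disallow: /checkout/\n";
	return $output;
} );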

Won’t this affect SEO?

In general, no. Because most product variations in WooCommerce are things such as size and colour, the content on those pages isn't unique, and it is therefore highly unlikely to appear in search engine results anyway.
