E-COMMERCE SEO 5 min read

Faceted Navigation

Filter navigation in webshops that creates SEO challenges through thousands of URL combinations.

Reinier Sierag, Founder Kobalt

Faceted navigation (filter navigation) is the system in webshops that lets visitors filter products by attributes like color, size, price, brand, and more. Each filter combination often generates a unique URL, which can lead to thousands or millions of indexable pages.

The SEO problem

Unmanaged faceted navigation causes crawl budget waste, duplicate content (the same products in a different order), and thin content (filter pages with few results). Together these can undermine your site's overall SEO performance.

Solutions

Use robots.txt or noindex for non-valuable filter combinations. Implement canonical tags pointing to the main category page. Make deliberate choices: which filter combinations are SEO-valuable landing pages, and which should be blocked? Ajax filtering without URL changes is an alternative for non-essential filters.

Decision tree: index or block?

Not every filter combination deserves a spot in the search index. Use this decision tree to determine the right approach per filter type:

  1. Does this filter combination have search volume?
    • Yes (e.g., "Nike running shoes women") → go to step 2
    • No (e.g., "size 9 + color blue + brand Nike + material mesh") → block
  2. Does the filter page deliver unique, valuable content?
    • Yes (enough products, unique text possible) → go to step 3
    • No (fewer than 3 products, no unique content) → block
  3. Is it a primary attribute (brand, category, type) or secondary (color, size, price)?
    • Primary attribute → make indexable as a separate landing page with unique H1, meta title, and introductory text
    • Secondary attribute → block, unless there is demonstrable search volume
  4. Combination filters (multiple facets at once)?
    • Almost always block. The number of URLs explodes exponentially and most combinations have no search volume.
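The decision tree above can be sketched as a small Python function. This is an illustrative sketch, not platform code: the field names and the 3-product threshold simply mirror the steps, so adapt them to your own catalog.

```python
from dataclasses import dataclass

@dataclass
class FilterPage:
    has_search_volume: bool    # step 1: demonstrable demand for this filter
    product_count: int         # step 2: number of products behind the filter
    has_unique_content: bool   # step 2: unique intro text is possible
    facet_count: int = 1       # step 4: number of facets combined in one URL

def decide(page: FilterPage) -> str:
    """Return 'index' or 'block' for a filter combination."""
    # Step 4: combination filters (multiple facets at once) -> almost always block
    if page.facet_count > 1:
        return "block"
    # Step 1: no search volume -> block
    if not page.has_search_volume:
        return "block"
    # Step 2: thin content (too few products, no unique text) -> block
    if page.product_count < 3 or not page.has_unique_content:
        return "block"
    # Step 3: with search volume already confirmed in step 1, both primary
    # and secondary attributes qualify as indexable landing pages
    return "index"
```

For example, `decide(FilterPage(True, 25, True))` returns `"index"` for a brand page with plenty of products, while any two-facet combination such as size + color returns `"block"`.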

Code examples: robots.txt rules

Block non-valuable filter paths in your robots.txt. Note: this prevents crawling, but not necessarily indexing (Google can still index a blocked URL if other pages link to it).

# Block all filter URLs with parameters
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*?price=
Disallow: /*?sort=
Disallow: /*&color=
Disallow: /*&size=
Disallow: /*&price=
Disallow: /*&sort=

# Allow valuable brand filters
Allow: /shoes/nike/
Allow: /shoes/adidas/

# Block pagination of filter results
Disallow: /*?page=

Code examples: canonical tags

Implement canonical tags on filter pages that point to the main category page. This is often more effective than robots.txt because Google can still crawl the page and receives a clear signal about the preferred version.

<!-- On the filtered page /shoes?color=blue&size=9 -->
<link rel="canonical" href="https://www.example.com/shoes" />

<!-- On a valuable brand landing page /shoes/nike -->
<!-- Self-referencing canonical: this page IS the preferred version -->
<link rel="canonical" href="https://www.example.com/shoes/nike" />

<!-- For pages that must stay out of the index, use noindex instead -->
<!-- (avoid combining noindex with a canonical: the signals conflict) -->
<meta name="robots" content="noindex, follow" />

Impact on crawl budget

A webshop with 500 products and 5 filter types (brand, color, size, price, material), each with an average of 10 options, can see its URL count explode:

Scenario                      | Number of URLs           | Crawl budget impact
No filters (categories only)  | 50                       | Minimal
1 filter at a time            | 50 + (50 × 50) = 2,550   | Manageable
2 filters combined            | ~125,000                 | Problematic
3+ filters combined           | ~5,000,000+              | Catastrophic
+ sorting × pagination        | Tens of millions         | Google stops crawling

Every URL Google has to crawl consumes crawl budget that would otherwise go to your important pages. For large webshops, this can mean new products or updated pages take weeks to get indexed.
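The table's orders of magnitude can be reproduced with a back-of-the-envelope calculation. This sketch assumes 50 category pages, treats the 5 facets × 10 options as 50 filter values, and counts each parameter order as a distinct URL (which is how a crawler sees `?color=..&size=..` versus `?size=..&color=..`):

```python
categories = 50           # category pages without any filter
filter_values = 5 * 10    # 5 facets x 10 options each

no_filters = categories                                  # 50
one_filter = categories + categories * filter_values     # 50 + 2,500 = 2,550
# Parameter order creates distinct URLs, so combined filters
# multiply rather than merely combine:
two_filters = categories * filter_values ** 2            # 125,000
three_filters = categories * filter_values ** 3          # 6,250,000

print(no_filters, one_filter, two_filters, three_filters)
```

Adding sorting and pagination multiplies each of these counts again, which is how a 500-product shop ends up with tens of millions of crawlable URLs.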

Frequently asked questions

Should I use Ajax filtering to prevent SEO problems?

Ajax filtering (where the URL doesn't change) is an effective solution for secondary filters. Content is loaded dynamically without generating a new URL, preventing duplicate content and crawl budget waste. Downside: if the filter combination has search volume, you can't capture that traffic. Use Ajax only for filters without search volume (price range, sorting, size) and static URLs for valuable filters (brand, category).

Is noindex or robots.txt better for faceted navigation?

Both have pros and cons. robots.txt prevents Google from crawling the page, but if other pages link to it, Google can still index the URL (without knowing the content). noindex requires Google to first crawl the page to see the tag, which costs crawl budget, but guarantees the page won't end up in the index. The best approach is a combination: noindex, follow on the page itself, so Google can still follow the links on the page but won't index it.

How do I know which filter pages Google has already indexed?

Use Google Search Console (Pages > "Indexed, not submitted in sitemap") or search in Google with site:example.com inurl:color= to see which filter pages are in the index. Screaming Frog can also perform a full crawl to map all faceted URLs.

What if my webshop platform offers no control over faceted URLs?

Many platforms (Shopify, Magento, WooCommerce) offer plugins or configuration options for faceted navigation. If your platform doesn't offer direct control, you can still add canonical tags and meta robots tags server-side through middleware or template modifications. Note that Google retired Search Console's URL Parameters tool in 2022, so a server-side approach (or, as a last resort, a redirect strategy) is the remaining route.
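If you can hook into a template or middleware layer, the canonical URL can be derived server-side by stripping filter parameters from the request URL. A minimal stdlib sketch; the FILTER_PARAMS set is an assumption, so use your shop's actual parameter names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumed filter parameters; replace with your platform's real ones.
FILTER_PARAMS = {"color", "size", "price", "sort", "page"}

def canonical_for(url: str) -> str:
    """Strip filter parameters so the canonical points at the clean page."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

def canonical_tag(url: str) -> str:
    """Render the <link rel="canonical"> tag for a (possibly filtered) URL."""
    return f'<link rel="canonical" href="{canonical_for(url)}" />'
```

For example, `canonical_for("https://www.example.com/shoes?color=blue&size=9")` yields `https://www.example.com/shoes`, while a clean URL like `/shoes/nike` passes through unchanged, producing a self-referencing canonical.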

RELATED TERMS

Crawling

The automated scanning of websites by search engines and AI bots to discover content.

Reinier Sierag

Indexing

The storing and cataloging of web content by search engines so it becomes findable.

Bas Vermeer

Canonical URL

An HTML tag that tells search engines which version of a page is the original when duplicate content exists.

Bas Vermeer

Reinier Sierag

Founder Kobalt

I have been building websites for over twenty years. That sounds like a long time, and it is. What started as a fascination with fast, accessible sites grew into Kobalt. Hundreds of websites built, o...