By Christian Fillion E-Commerce Strategist & Founder, Marketing Media
You installed the “Layered Navigation” module (Faceted Search) on your PrestaShop store. It looks great. Your customers can filter by Size, Color, Price, and Brand.
But while your customers are happy, Google is confused.
You look at your Search Console coverage report. You see thousands of URLs like:
- yoursite.com/2-home?q=Categories-Women
- yoursite.com/5-tshirts?size=s&color=blue&price=10-20
- yoursite.com/5-tshirts?size=s&color=blue&price=10-21
Google is crawling every single possible combination of your filters. Instead of indexing your 500 products, it is trying to index 500,000 unique filter URLs that all show the same content.
You are paying a “Crawl Budget Tax.”
Googlebot has a limit on how many pages it will crawl on your site per day. If it spends 90% of its budget crawling useless “Blue T-shirts sorted by Price” pages, it leaves before it ever sees your new products.
- You are diluting your keyword relevance (Duplicate Content).
- You are slowing down the indexing of new inventory.
- You are wasting server resources serving pages to bots that generate no value.
You are sending the crawler into a maze with no exit.
This is why we audit the Faceted Search Configuration on every PrestaShop build. We ensure the filters help users, but don’t trap bots.
1. The “Infinite Loop” vs. The “Indexable Limit”
PrestaShop’s native module (ps_facetedsearch) is powerful, but dangerous out of the box.
- The Friction: By default, PrestaShop may generate a unique, indexable URL for every filter attribute. If you have 5 sizes and 5 colors, that’s 25 combinations. Add a “Price Range” slider, and the number of possible URLs becomes effectively unbounded.
- The Fix: Module Configuration. We go into the Faceted Search module settings. We explicitly set “Indexable” to NO for low-value attributes (like “Price” or “Weight”). We tell PrestaShop: “Let humans use this, but do not create an SEO URL for it.”
You stop the creation of useless pages at the source.
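To see why this explodes so fast, the math above can be sketched in a few lines. The attribute counts below are hypothetical examples, not figures from a real store; each filter can either be left unselected or set to one of its values.

```python
# Rough sketch of the combinatorial explosion behind faceted-search URLs.
# Counts are hypothetical; a real store's numbers come from its catalog.
from math import prod

filters = {
    "size": 5,          # e.g. S, M, L, XL, XXL
    "color": 5,
    "brand": 10,
    "price_bucket": 20,  # a price slider can emit many discrete ranges
}

# Each attribute is either "not selected" (+1) or one of its values;
# subtract 1 to exclude the unfiltered base category page.
combinations = prod(n + 1 for n in filters.values()) - 1
print(combinations)  # 6 * 6 * 11 * 21 - 1 = 8315
```

Four modest filters already yield over 8,000 crawlable URL variations of a single category page. A few more attributes pushes that into the hundreds of thousands.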
The Optimization ROI: We cleaned up a PrestaShop store that had 1.2 million indexed pages (mostly useless filter results). After configuring the module to stop indexing price sliders, their “Valid Page” count dropped to 4,000, and their organic traffic increased because the authority was finally concentrated on the main categories.
2. The “Robots.txt” Bouncer vs. The Open Door
Sometimes, you can’t stop the URL generation, so you have to block the entry.
- The Friction: Even if you turn off “Indexable,” PrestaShop might still let bots crawl the query strings (?q=). Googlebot is curious. If it sees a link, it follows it.
- The Fix: Strict Disallow Rules. We edit the robots.txt file (in your store’s root directory). We add specific rules like Disallow: /*?q= or Disallow: /*selected_filters to tell compliant crawlers not to enter the filter parameters at all.
You put a security guard at the door of your filter results.
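As a concrete illustration, the rules might look like the fragment below. The exact parameter names depend on your PrestaShop version, URL scheme, and module configuration, so verify them against the actual filter URLs in your Search Console report before copying anything:

```
# robots.txt — block crawling of faceted-search query strings
# (parameter names are examples; match them to your store's real URLs)
User-agent: *
Disallow: /*?q=
Disallow: /*&q=
Disallow: /*selected_filters=
```

One caveat worth knowing: robots.txt stops crawling, not indexing. URLs that are already indexed may linger in Google’s index for a while even after you add these rules.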
3. The “Canonical” Shield vs. The Duplicate Penalty
What if a bot does get in?
- The Friction: A bot lands on /men-shoes?color=black. It looks 90% identical to the main /men-shoes category page. Google treats this as duplicate content, and the ranking signals get split between the two pages instead of consolidating on one.
- The Fix: Dynamic Canonicals. We ensure your theme is coded to point the rel="canonical" tag of every filter page back to the root category URL.
We tell Google: “Even if you are looking at the filtered version, please give all the SEO credit to the main Category Page.”
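In a typical PrestaShop 1.7+ theme this comes down to one line in the theme’s head template. The template path and Smarty variable below are assumptions based on the default classic theme; your theme may name them differently, so treat this as a sketch rather than a drop-in fix:

```html
<!-- In the theme's head template (e.g. templates/_partials/head.tpl):
     emit the clean canonical URL so filtered views credit the category.
     $page.canonical is an assumption; check what your theme exposes. -->
{if $page.canonical}
  <link rel="canonical" href="{$page.canonical}">
{/if}
```

The key design point is that the canonical URL must be computed from the category itself, never echoed back from the filtered request URL.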
Stop The Spider Trap
In the physical world, you wouldn’t send a health inspector to check every single tile on your floor individually. You would show them the room.
In the digital world, your filter combinations are those individual tiles.
- You control the crawl path.
- You control the index.
- You control the authority.
If your “Excluded” tab in Search Console is exploding with thousands of URLs, your Faceted Search is leaking.
[Schedule Your Strategy Call with Christian Fillion]