Post by account_disabled on Feb 27, 2024 1:57:33 GMT -6
Limiting the URL from being changed is commonly done with JavaScript. The reason is simple: it provides ease of browsing and filtering products while potentially generating only a single URL. However, this can go too far in the opposite direction, and you will need to manually ensure that you have indexable landing pages for key facet combinations (e.g. black dresses). Here's a rundown outlining what I wrote above in a more digestible way.

Options:
- Noindex, follow — solves duplicate content: yes; solves crawl budget: no; recycles link equity: no; passes equity from external links: yes; allows internal link equity flow: yes.
- Canonicalization — solves duplicate content: yes; solves crawl budget: no; recycles link equity: yes; passes equity from external links: yes; allows internal link equity flow: yes. Can only be used on pages that are similar.
- Robots.txt — solves duplicate content: yes; solves crawl budget: yes; recycles link equity: no; passes equity from external links: no; allows internal link equity flow: no. Technically, pages that are blocked in robots.txt can still be indexed.
- Nofollow internal links to undesirable facets — solves duplicate content: no; solves crawl budget: yes; recycles link equity: no; passes equity from external links: yes; allows internal link equity flow: no.
- JavaScript setup — solves duplicate content: yes; solves crawl budget: yes; recycles link equity: yes; passes equity from external links: yes; allows internal link equity flow: yes. Requires more work to set up in most cases.

But what's the ideal setup? First off, it's important to understand there is no one-size-fits-all solution. To get to your ideal setup, you will most likely need to use a combination of the above options. I'm going to highlight an example fix below that should work for most sites, but it's important to understand that your solution might vary based on how your site is built, how your URLs are structured, etc. We get to an ideal solution by asking ourselves one question.
Do we care more about our crawl budget or our link equity? By answering this question, we're able to get closer to an ideal solution. Consider this: you have a website with a faceted navigation that allows the indexation and discovery of every single facet and facet combination. You aren't concerned about link equity, but clearly Google is spending valuable time crawling millions of pages that don't need to be crawled. What we care about in this scenario is crawl budget. In this specific scenario, I would recommend the following solution: category, subcategory, and sub-subcategory pages should remain discoverable and indexable (e.g. /clothing, /clothing/womens, /clothing/womens/dresses). For each category with…
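One way to sketch the crawl-budget approach above: handle most facet filtering client-side (so no new URL is generated), but keep a whitelist of key facet combinations that get real, indexable landing pages. Everything here is a hypothetical illustration, assuming a simple key/value facet model; the function names and whitelist entries are made up, not a real API.

```typescript
// Hypothetical helper: decide whether a facet combination deserves its own
// indexable URL, or should stay a client-side (JavaScript) filter that
// never changes the URL. The whitelist entries are illustrative assumptions.
type Facets = Record<string, string>;

// Key combinations we want as indexable landing pages (e.g. "black dresses").
// Keys are stored in sorted "k=v&k=v" form so lookups are order-independent.
const INDEXABLE_COMBOS = new Set(["category=dresses&color=black"]);

function facetKey(facets: Facets): string {
  return Object.keys(facets)
    .sort()
    .map((k) => `${k}=${facets[k]}`)
    .join("&");
}

function isIndexableCombo(facets: Facets): boolean {
  // Single-facet pages and whitelisted combinations get real URLs;
  // everything else is filtered in the browser without changing the URL.
  return Object.keys(facets).length <= 1 || INDEXABLE_COMBOS.has(facetKey(facets));
}
```

In practice the router would give whitelisted combinations a clean, crawlable path and keep all other combinations on the base category URL, which is what keeps Google away from the millions of low-value facet permutations.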
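To make the robots.txt row of the comparison above concrete, here is an illustrative fragment that blocks parameterized facet URLs from being crawled. The parameter names are hypothetical; remember that blocked pages can still end up indexed if they are linked to, so this controls crawling, not indexing.

```
User-agent: *
# Block parameterized facet URLs (hypothetical parameter names)
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*?brand=
```

For the noindex, follow and canonicalization rows, the equivalent page-level signals are a `<meta name="robots" content="noindex, follow">` tag and a `<link rel="canonical" href="...">` tag pointing at the preferred version of the page.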