Best Practices for Adding Dynamic URLs to an XML Sitemap
-
Hi Guys,
I'm working on an ecommerce website where all the product pages use dynamic URLs (we also have a few static pages, but there is no issue with them).
The products on the site are updated every couple of hours (because we sell out or a special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools, which I'm trying to avoid if possible.
I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages but am not sure what is the best approach.
The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are the two approaches I was considering:
1. Include the dynamic product URLs in the same sitemap as the static URLs by listing just the folder http://www.xyz.com/products/ - this way spiders have access to the folder the products live in, and I don't have to create an automated sitemap for every product.
OR
2. Create a separate, automated sitemap that updates whenever a product changes, with the change frequency set to hourly - this way spiders always have a sitemap that is as close to up to date as possible whenever they crawl it.
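To make option 2 concrete, here is a minimal sketch of the kind of script I have in mind - the product lookup, domain, and file name are just placeholders for whatever the store actually uses, not working code from our site:

```python
from datetime import datetime, timezone
from xml.etree import ElementTree as ET


def get_active_product_slugs():
    # Placeholder: in reality this would query the store's database for
    # products that are currently live (not sold out / not expired).
    return ["product1-is-really-cool", "product2-is-even-cooler"]


def build_product_sitemap(path="sitemap-products.xml"):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    lastmod = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    for slug in get_active_product_slugs():
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = f"http://www.xyz.com/products/{slug}"
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = "hourly"
    # Write the sitemap file that the crawlers will be pointed at.
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)


if __name__ == "__main__":
    build_product_sitemap()
```

The idea would be to run this from cron every hour, or trigger it whenever a product is added or removed, and submit it alongside the existing static sitemap.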
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this.
Thanks heaps,
LW
-
Hi LW
I agree with Mark re archiving products. Although our products don't expire as quickly as yours appear to, I use http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html on a cron job to keep our sitemap fresh.
I also use that tool to exclude some of our overly dynamic URLs from appearing in the sitemap.
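The generator has its own exclusion settings, but if you ever end up rolling your own script instead, the filtering step boils down to something like this (the patterns below are only examples, not the rules we actually use):

```python
import re

# Example patterns only -- "overly dynamic" means something different for
# every store (session IDs, sort/filter parameters, internal search, etc.).
EXCLUDE_PATTERNS = [
    re.compile(r"[?&](sessionid|sort|filter)="),
    re.compile(r"/search\?"),
]


def keep_in_sitemap(url):
    """Return True if the URL should be written to the sitemap."""
    return not any(pattern.search(url) for pattern in EXCLUDE_PATTERNS)


if __name__ == "__main__":
    urls = [
        "http://www.xyz.com/products/product1-is-really-cool",
        "http://www.xyz.com/products/list?sort=price-desc",
        "http://www.xyz.com/search?q=cool",
    ]
    # Only the first URL survives the filter.
    print([u for u in urls if keep_in_sitemap(u)])
```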
Dean
-
Hi LW,
What system is backing the online store? Are you using a CMS-driven e-commerce solution?
My suggestion would be to create an automated sitemap for the products. Pay careful attention to the priorities you assign and the update frequencies (hourly or daily is fine). I definitely think you'd spend far too much time updating the sitemap if you had to do it manually.
This way you'll have a more accurate sitemap whenever the site is crawled.
Also, if you are planning on offering the same product again in future, it might be an idea not to remove the product page altogether, but rather to show a page saying "This offer is currently not available" or something along those lines.
Another option might be an archive category of products, where all your expired offers are placed but are not available for order. This would let you keep your indexed pages, avoid 404s, and use the product pages to direct new visitors to related or newer products should they come across them in the archive.
Just thinking out loud.
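If it helps to picture the archive idea, here is a very rough, Flask-style sketch of how a product route could keep expired offers alive instead of returning 404s. The helper and template names are purely illustrative and would depend on whatever system is backing your store:

```python
from flask import Flask, render_template

app = Flask(__name__)


def lookup_product(slug):
    """Placeholder: fetch the product record (or None) from whatever backs the store."""
    return None  # hypothetical data-access call goes here


@app.route("/products/<slug>")
def product_page(slug):
    product = lookup_product(slug)
    if product is None:
        # Genuinely unknown URL: a 404 is correct here.
        return render_template("not_found.html"), 404
    if product.expired:
        # Expired offer: keep the URL (and its rankings) alive by serving an
        # "offer no longer available" page that links to related/newer products.
        return render_template("product_archived.html", product=product)
    return render_template("product.html", product=product)
```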

I'd be interested to see the website and the solution that you do eventually implement.
Regards
Mark
Related Questions
-
What's the best way to test an Angular JS-heavy page for SEO?
Hi Moz community, our tech team has recently decided to try switching our product pages to be JavaScript dependent; this includes links, product descriptions, and things like breadcrumbs rendered in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but that generally shows what a crawler should be able to crawl, not what Googlebot will actually crawl and index. Any thoughts on this? Is this concern valid? Thanks!
Technical SEO | znotes
-
New theme adds ?v=1d20b5ff1ee9 to all URLs as part of caching. How does this affect SEO?
The new theme I am working in adds ?v=1d20b5ff1ee9 to every URL. The theme developer says it's a server setting issue. GoDaddy support says it's part of caching and is becoming prevalent in new themes. How does this impact SEO?
Technical SEO | DML-Tampa
-
Include or exclude noindex URLs in sitemap?
We just added noindex tags to our pages with thin content. Should we include or exclude those URLs from our sitemap.xml file? I've read conflicting recommendations.
Technical SEO | vcj
-
Automate XML Sitemaps
Quick question: what is the best method people have for automating sitemaps? We publish around 200 times a day, and I would like to make sure that as soon as we publish, the sitemap gets updated. What is the best way to update a sitemap so that it reflects new content immediately after it is published?
Technical SEO | mattdinbrooklyn
-
Adding a 'noindex' meta tag to PrestaShop module & search pages
Hi, I'm looking for a fix for the PrestaShop platform - the definitive answer on how best to stop the indexing of PrestaShop modules such as "send to a friend", "Best Sellers", and site search pages. We want to be able to add a meta noindex to pages ending in /search?tag=ball&p=15 or /modules/sendtoafriend/sendtoafriend-form.php. We already have the following in robots.txt (Google seems to ignore these):
Disallow: /search.php
Disallow: /modules/
But as a further tool we would like to include the noindex on all these pages too, to stop duplicated pages. I assume this needs to go in either the head.tpl or the .php file of each PrestaShop module? Or is there a general site-wide fix to apply a noindex meta tag to certain files? Current meta code here:
Please reply with where to add the code and what the code should be. Thanks in advance.
Technical SEO | reallyitsme
-
What's with the backslash in the URL showing up as duplicate content?
Is this a bug or something that needs to be addressed? If so, just use a redirect?
Technical SEO | Boogily
-
Sitemap for dynamic website with over 10,000 pages
If I have a website with thousands of products, is it a good idea to create a sitemap for the search engines that shows maybe 250 products per page, so it's easy for the search engine to find each part and also puts that part closer to the home page? It seems like Google likes pages that are closest to the home page (the fewer clicks the better).
Technical SEO | roundbrix
-
How best to redirect URLs from expired classified ads?
We have a problem because our content is classified ads. Every ad expires after one or two months and then becomes inactive; we keep its page for one more month as the same page, but we add a notice that the ad is inactive. After that we delete the ad and its page, but we need to redirect that URL to a search results page containing similar ads, because we don't want to lose the traffic from those pages. What is the best way to redirect the ad URLs? Our thinking was to redirect internally without 301 redirects, because the .htaccess file would become very big after a while, and we are thinking of trying canonicalization because we don't want search engines to think we have too much duplicate content.
Technical SEO | Donaab