Sitemaps: Best Practice
-
What should and what shouldn't go in the sitemap?
In particular, what about pages like 'subscribe to our newsletter' and 'unsubscribe from our newsletter'? Is there really any benefit in highlighting those pages to the search engines?
Thanks for any advice or anecdotes.

-
Sometimes people think adding a sitemap to their company website is something that's very difficult to do. For example, they may think they need a web designer to do it for them, yet often you can do it yourself; it's very simple.
If your business has a WordPress website, adding a sitemap can be a piece of cake. Yoast is a free plugin that generates a sitemap for you, and you can then submit that sitemap to Google Search Console for indexing.
We did this for a large garden room company in Bristol, and it helps make sure every single page and blog post gets indexed.
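If you want to sanity-check the sitemap before submitting it in Search Console, a quick script can fetch it and list every URL it references. This is only a rough sketch, assuming the default Yoast location of /sitemap_index.xml and the requests library; swap in your own domain:

```python
# Rough sketch: fetch a sitemap (or sitemap index) and print every <loc> it contains.
# The URL below is hypothetical -- Yoast typically serves the index at /sitemap_index.xml.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap_index.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def list_locs(url):
    """Return every <loc> value found in a sitemap or sitemap index."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

if __name__ == "__main__":
    for loc in list_locs(SITEMAP_URL):
        print(loc)
```

If the index points at child sitemaps, you can run the same function on each child URL to get the page-level addresses.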
-
Pages that I like to call 'core' site URLs should go in your sitemap: unique (canonical) pages which aren't heavily duplicated and which you would actually want Google to rank. I would include those core addresses.
I wouldn't include uploaded documents, installers, archives, resources (images, JS modules, CSS sheets, SWF objects), pagination URLs or parameter-based children of canonical pages (e.g. example.com/some-page is OK to rank, but not example.com/some-page?tab=tab3). Parameters are the additional funky stuff added to URLs following "?" or "&".
There are exceptions to these rules: some sites use parameters to render their on-page content, even for canonical addresses. Those old architecture types are fast dying out, though. If you're on WordPress I would index categories, but not tags, which are non-hierarchical and messy (they really clutter up your SERPs).
Try crawling your site using Screaming Frog. Export all the URLs (or a large sample of them) into an Excel file, then filter it to see which types of addresses exist on your site and which technologies are being used. Feed Google the unique, high-value pages that you know it should be ranking.
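As a rough illustration of that filtering step (a generic sketch, not Screaming Frog's own tooling; it assumes you've saved the crawl export as a CSV with an "Address" column), here's one way to strip out parameterised, paginated and resource URLs so only candidate 'core' pages remain:

```python
# Rough sketch: filter a crawl export (CSV with an "Address" column -- adjust to match
# your export) down to candidate "core" URLs worth putting in the XML sitemap.
import csv
from urllib.parse import urlparse

RESOURCE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".css", ".js", ".pdf", ".swf", ".zip")

def is_core_url(url):
    """Keep clean, parameter-free HTML pages; drop resources, parameters and pagination."""
    parsed = urlparse(url)
    if parsed.query:                                    # drops ?tab=tab3 style children
        return False
    if parsed.path.lower().endswith(RESOURCE_EXTENSIONS):
        return False
    if "/page/" in parsed.path:                         # a common pagination pattern
        return False
    return True

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    core_urls = [row["Address"] for row in csv.DictReader(f) if is_core_url(row["Address"])]

print(f"{len(core_urls)} candidate core URLs found")
```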
I have said not to feed pagination URLs to Google, but that doesn't mean they should be completely de-indexed. I just think that XML sitemaps should be pretty lean and streamlined. You can allow things which aren't in your XML sitemap to have a chance of indexation, but if you have used something like a Meta no-index tag or a robots.txt edit to block access to a page - **do not** then feed it to Google in your XML. Try to keep **all** of your indexation signals in line with each other!
No page which points to another, separate address via a canonical tag (thus declaring itself 'non-canonical') should be in your XML sitemap. No page that is blocked via Meta no-index or robots.txt should be in your XML sitemap either.
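As a rough way to check that last point automatically (a hedged sketch rather than an official tool; it only covers robots.txt, not Meta no-index, and assumes you already have your sitemap URLs in a list), you could compare the sitemap against robots.txt like this:

```python
# Rough sketch: flag sitemap URLs that robots.txt blocks -- the two should never disagree.
# The domain and URLs below are hypothetical; feed in your real sitemap list instead.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

sitemap_urls = [
    "https://www.example.com/some-page",
    "https://www.example.com/members-area",  # imagine this path is Disallowed
]

for url in sitemap_urls:
    if not rp.can_fetch("*", url):
        print(f"Blocked by robots.txt but listed in the sitemap: {url}")
```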
If you end up with too many pages (a single sitemap file is capped at 50,000 URLs and 50MB uncompressed), think about creating a sitemap XML index instead, which links through to other, separate sitemap files.
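For what it's worth, here is a minimal sketch of generating that kind of index, assuming you've already split your URLs into separate child sitemap files (the filenames are made up for illustration):

```python
# Rough sketch: write a sitemap index file that points at separate child sitemaps.
# The child filenames are hypothetical -- substitute the ones your site actually uses.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

child_sitemaps = [
    "https://www.example.com/sitemap-pages.xml",
    "https://www.example.com/sitemap-posts.xml",
    "https://www.example.com/sitemap-categories.xml",
]

index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
for child in child_sitemaps:
    sitemap = ET.SubElement(index, "sitemap")
    ET.SubElement(sitemap, "loc").text = child

ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)
```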
Hope that helps!
-
To follow on from this, we have some parameter URLs in our sitemap which make me uneasy. Should url.com/blah.html?option=1 be in the sitemap? If so, what benefit is that giving us?