Exclude Child URLs from XML Sitemap Generator (WordPress)
-
Hi all,
I was recommended the XML Sitemap Generator for WordPress plugin by the very helpful Keith Bloemendaal and John Pring - however, I can't seem to exclude child URLs from it.
There is an "Exclude items" section with an "Exclude posts" subsection. I tried entering the URLs of the pages I don't want in the sitemap, but that didn't work. Then I read that you have to enter a list of post "IDs" instead - I have no idea where on earth to find those, so I tried the page name and the post= number from the URL, but neither worked.
I hope somebody can point me in the right direction - and apologies, I am a WordPress novice. I got no answers from the WordPress forums, so I turned right back to SEOmoz!
Cheers.
-
AH! You did it, Keith - I thought clicking "Update Options" at the bottom would do it, but there's a little link hidden in the text at the top that says "rebuild the sitemap manually".
Finally it's done, thanks so much for your help!
Mark
-
Did you try generating a new sitemap after clicking "Update Options", and then submitting it to Webmaster Tools?
Generally it will only update on its own when you add or delete pages.
-
I'm just trying to exclude these child URLs from the sitemap - in future I may block them entirely, but I certainly don't want to submit a sitemap with these URLs and then contradict that in robots.txt.
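(If I do block them later, I gather it would just be a robots.txt rule along these lines - the path here is made up for illustration:)

```
User-agent: *
Disallow: /parent-page/child-page/
```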
I have used the post IDs to exclude the pages from the sitemap; however, they remain in place.
Thanks once again for your assistance and quick responses!
-
It may take some time for it to propagate to Google if that is what you are asking. Are you trying to block the pages/posts completely from search engines?
-
Hi Keith,
Thanks once again for a quick response. I have actually tried that method; however, when I check the live sitemap, I can still see the pages in it. Very frustrating! Does the sitemap not update straight away? And just to confirm, I am clicking "Update Options" at the bottom - quite often it turns out to be something stupid like that!

Thanks,
Mark
-
Great question, and WP really should make this easier!
This article explains one way to find it: http://businessaccent.com/2009/06/08/what-is-my-wordpress-post-id-number-and-how-can-i-find-it/ Also, if you open the post/page in the admin panel to edit it, the URL in your browser's address bar will contain the post ID, e.g. www.yoursite.com/wp-admin/post.php?post=615&action=edit (615 is the post ID).
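If you're comfortable with a bit of PHP, you can also look IDs up by slug. A rough sketch - the slugs are made up, and it assumes you run the file from your WordPress root directory:

```php
<?php
// Quick one-off lookup: print the ID of a page given its slug path.
// Run from the WordPress root directory; adjust the slugs to your site.
require 'wp-load.php'; // boots WordPress so its functions are available

$page = get_page_by_path( 'parent-page/child-page' ); // hypothetical slugs
echo $page ? 'Post ID: ' . $page->ID : 'No page found for that path';
```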
Hope that helped

Related Questions
-
How do I deindex URL parameters?
Google indexed a bunch of our URL parameters. I'm worried about duplicate content. I used the URL parameter tool in Webmaster Tools to set it so future parameters don't get indexed. What can I do to remove the ones that have already been indexed? For example, site.com/products and site.com/products?campaign=email have both been indexed as separate pages, even though they are the same page. If I use a noindex, I'm worried about deindexing the product page. What can I do to deindex just the URL parameter version? Thank you!
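For illustration, the commonly suggested fix in this situation is a canonical tag on the parameterised version pointing at the clean URL (assuming site.com/products is the preferred page), so the two get folded together:

```html
<link rel="canonical" href="http://site.com/products" />
```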
Technical SEO | BT2009
-
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at this point. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when the sitemap file the tool is working on becomes too large, so it looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but that isn't needed, since it comes with both of the tools mentioned above. I know about DeepCrawl.com too, but that one is paid, and it would be very expensive at this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal on multiple websites, but that obviously doesn't make sense for me - it needs to be free, more or less). Screaming Frog's SEO Spider is not good for large websites. So, in general, what is the best and most time-efficient way to work on something like this? Are there any other options? Thanks.
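For scale reference, the core of such a scraper is tiny - a minimal PHP sketch for a single URL (a real crawl of millions of pages would need queuing, politeness delays, and error handling on top of this):

```php
<?php
// Fetch one page and extract its <title> tag - the smallest possible version.
$html = file_get_contents( 'http://www.example.com/' );
if ( $html !== false && preg_match( '/<title[^>]*>(.*?)<\/title>/is', $html, $m ) ) {
    echo trim( $m[1] );
}
```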
Technical SEO | blrs12
-
301 Redirects Relating to Your XML Sitemap
Let's say you've got a website and it had quite a few pages that, for lack of a better term, were like an infomercial - 6-8 pages on slightly different topics all essentially saying the same thing. You could all but call it spam. www.site.com/page-1 www.site.com/page-2 www.site.com/page-3 www.site.com/page-4 www.site.com/page-5 www.site.com/page-6 Now you've decided to consolidate all of that information into one well-written page, and while the previous pages may have been a bit spammy, they did indeed have SOME juice to pass through. Your new page is: www.site.com/not-spammy-page You then 301 redirect the previous 'spammy' pages to the new page. Now the question: do I immediately re-submit an updated XML sitemap to Google, which would NOT contain the old URLs - making me assume Google would miss the 301 redirects and the SEO juice? Or do I wait a week or two, allow Google to re-crawl the site and see the existing 301s, and submit an updated sitemap once they've taken notice of the changes? Probably a stupid question, I understand, but I want to ensure I'm following best practice in this situation. Thanks, guys and girls!
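For reference, the 301s described would typically be one Apache directive per old URL, e.g. in .htaccess (assuming an Apache server):

```
Redirect 301 /page-1 http://www.site.com/not-spammy-page
Redirect 301 /page-2 http://www.site.com/not-spammy-page
```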
Technical SEO | Emory_Peterson
-
Generating a signature and expires in Java
Hello, I am developing a tool for my company to get stats from SEOmoz using your API. During development, I have been using the example signature and expires values which are auto-generated for me. Now that testing is complete, my code will need to generate these values. I have been Googling for a resource demonstrating how to do this in Java, but I have not found a good example. I was hoping that someone at SEOmoz would have a resource or an example they could share. The email associated with this account belongs to a non-developer, so if a response is provided via email in addition to the forum, sending it to my email would be much appreciated. Thank you, Anthony [email protected]
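If memory serves, the documented scheme is an HMAC-SHA1 of your access ID plus an expiry timestamp, Base64- and then URL-encoded. A hedged PHP sketch of those steps (placeholder credentials; check the current API docs) - in Java, the equivalents are javax.crypto.Mac with "HmacSHA1" plus a Base64 encoder:

```php
<?php
// Sketch of Mozscape-style signed authentication - parameter names are from
// memory, so verify against the API documentation before relying on this.
$accessId  = 'member-xxxxxxxx';   // hypothetical access ID
$secretKey = 'your-secret-key';   // hypothetical secret key
$expires   = time() + 300;        // signature valid for ~5 minutes

// Sign "accessId\nexpires" with HMAC-SHA1, then Base64- and URL-encode it.
$raw       = hash_hmac( 'sha1', $accessId . "\n" . $expires, $secretKey, true );
$signature = urlencode( base64_encode( $raw ) );

echo "AccessID={$accessId}&Expires={$expires}&Signature={$signature}";
```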
Technical SEO | TRich50
-
Do I need an XML sitemap?
I have an established website that ranks well in Google. However, I have just noticed that no XML sitemap has been registered in Google Webmaster Tools, so the likelihood is that one hasn't been registered with the other search engines either. There is, however, an HTML sitemap listed on the website. Seeing as the website is already ranking well, do I still need to generate and submit an XML sitemap? Could there be any detriment to current rankings in doing so?
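For reference, a minimal sitemap is only a few lines of XML (placeholder URL and date below), so there is little cost to adding one even on a site that already ranks:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
</urlset>
```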
Technical SEO | pugh
-
Using Sitemap Generator - Good/Bad?
Hi all, I recently purchased the full licence of XML Sitemap Generator (http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html) but have not yet used it. The idea is that I can deploy the package on each large e-commerce website I build; the sitemap will be regenerated as often as I set it to be, and the search engines will be pinged automatically to inform them of the update. No more manual XML sitemap creation for me! It sounds great, but I don't know enough about pinging search engines with XML sitemap updates on a regular basis to say whether this is a good or bad thing. Can it have any detrimental effect when the sitemap is changing (potentially) every day as new product URLs are added to the site? Any thoughts or opinions would be greatly appreciated. Kris
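For context, the "ping" such tools send is just an HTTP GET against each engine's endpoint, with your sitemap URL as a parameter - at the time, something like:

```
http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml
http://www.bing.com/ping?sitemap=http://www.example.com/sitemap.xml
```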
Technical SEO | yousayjump
-
Where does WordPress store the 301 redirects?
Hi, I've just created a campaign for my new WordPress blog and found eleven 301 redirects I was not aware of. It looks like WordPress created them automatically. Does anyone know how WordPress handles this, or where the redirects are stored so I can delete them? They are of no use to me. Nine of these redirects point to the same URL with an added '/' and are on pages; one is on a post. I've changed the permalink structure and some URLs several times, and maybe on one of those occasions WordPress automatically created the 301 redirect. But why? I do not want to keep the old URLs. The last redirect is very strange: it goes from http://www.mydomain.com/folder to http://www.mydomain.com, where 'folder' is the folder in which I installed WordPress. But again, I don't want anyone to type the URL with the folder name or even know the folder exists. Any comment on this would be greatly appreciated. Thanks a lot, David
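For illustration: these automatic 301s usually aren't stored anywhere - WordPress generates them at request time via its redirect_canonical() function (the trailing-slash redirect being the classic case). If you really needed to switch that behaviour off, the usual snippet is a single line in the theme's functions.php - rarely advisable, since these redirects exist to consolidate duplicate URLs:

```php
<?php
// Disable WordPress's runtime canonical redirects (trailing slash, etc.).
// Use with caution - they prevent duplicate-content URLs from resolving.
remove_action( 'template_redirect', 'redirect_canonical' );
```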
Technical SEO | dballari
-
Is "last modified" time in XML Sitemaps important?
My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing problems for those sites. His concern centres on the fact that the script generates a sitemap indicating that every URL on the site was last modified at the exact same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemap generators I use for other client sites let me choose whether or not to take last-modified from the server response. What is the best way to generate the sitemap: last-modified from the actual time each page was modified, or everything set to one date and time?
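For what it's worth, pulling the real modified date per post is cheap in WordPress - a rough sketch of emitting one sitemap <url> entry (the function names are standard WordPress; the post ID and surrounding loop are up to the script):

```php
<?php
// Sketch: one sitemap <url> entry using the post's actual modified time.
// Assumes WordPress is loaded (e.g. run from a plugin or via wp-load.php).
$post_id = 615; // hypothetical post ID
$loc     = get_permalink( $post_id );
$lastmod = get_post_modified_time( 'c', true, $post_id ); // ISO 8601, GMT
echo "<url><loc>{$loc}</loc><lastmod>{$lastmod}</lastmod></url>\n";
```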
Technical SEO | ShaMenz