Old URLs Appearing in SERPs
-
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content, as there was no place to redirect to.
Unfortunately, all of these pages still appear in Google's SERPs (not Bing's) - both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs, you do get redirected, so we have ruled out any problems with the 301s.
We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain.
We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input.
- Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise. (See the quick status-check sketch after this list.)
- Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
- Update robots.txt to block access to the redirecting directories.
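For reference, here is roughly how we could spot-check what each old URL currently returns (a rough sketch only; the URLs are placeholders rather than our real paths, and it assumes the Python requests library is available):

```python
# Rough sketch: print the raw status each old URL returns, without
# following redirects, so 301 vs 404 vs 410 is visible at a glance.
# The URLs below are placeholders, not the real paths.
import requests

OLD_URLS = [
    "https://www.example.com/old-page-1/",
    "https://www.example.com/old-page-2/",
]

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {target}".rstrip())
```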
Thank you.
Rosemary
-
You're right - I'm worrying about something that isn't yet a problem.
Thank you
-
In my experience, the best way to absolutely get rid of them is to use the 410 (Gone) status code, then resubmit them for indexation (possibly via an XML sitemap submission, and you can also use Google's crawl testing tool in Search Console to double-check). That said, even with a 410, Google can take its time.
The other option is to recreate the pages there (returning a 200) and use the meta robots noindex tag on each page to specifically exclude them. The temporary block in Google Search Console can work too, but it's temporary, and I can't say whether it will actually affect how long the redirected pages keep appearing in the index via the site: command.
All that said, if the pages only show via a site: command, there's almost no chance anyone will see them.
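On the XML sitemap point above: one easy way to nudge a recrawl of the removed URLs is to submit a throwaway sitemap listing only those URLs, so Google revisits them and sees the 410 (or the noindex). A rough sketch, using only the Python standard library and placeholder URLs:

```python
# Rough sketch: write a throwaway sitemap.xml containing only the removed
# URLs, ready to submit in Search Console to prompt a recrawl.
# The URLs below are placeholders; swap in the real ones.
from xml.sax.saxutils import escape

REMOVED_URLS = [
    "https://www.example.com/old-page-1/",
    "https://www.example.com/old-page-2/",
]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in REMOVED_URLS)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("removed-urls-sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

print(f"Wrote {len(REMOVED_URLS)} URLs to removed-urls-sitemap.xml")
```

Once the 410s have been recrawled, the throwaway sitemap can be removed again.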
-
Ok, Rand - one last question.
I do think one year is a long time to have old results hanging around. If I were going to run a test to get Google to stop showing them in their SERPs, what would you do? Let's say a client asked you to make these URLs disappear.
The 79 pages that appear in the /eichler/ directory are from a personal site so I don't care what happens with those pages in the SERPs.
My ideas are:
- Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise.
- Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
- Update robots.txt to block access to the redirecting directories.
-
14 months! Wow. That is a long time indeed. Although, now that I look, Moz redirected OpenSiteExplorer just about a year ago, and we still have URLs showing for the site: command in Google too (https://www.google.com/search?q=site%3Aopensiteexplorer.org) so I suppose it's not that uncommon.
Glad to hear traffic and rankings are solid. Let us know if we can help out in the future!
-
Thank you Rand. It has been 14 months since these pages were moved, and I've never seen Google retain pages anywhere near this long.
You're right, of course: there has been no impact on our traffic, as these pages weren't about our search business.
Thanks for taking a look at our issue.
Rosemary
-
Oh gosh - it's my pleasure! Thanks for being part of the Moz community.
I'm honored to help out.
As for the URLs - looks like everything's fine. Google often maintains old URLs in a searchable index form long after they've been 301'd, but for every query I tried, they're clearly pulling up the correct/new version of the page, so those redirects seem to be working just great. You're simply seeing the vestigial remnants of them still in Google, which isn't unusual: we had URLs from seomoz.org findable via site: queries for many months after moving to Moz, but the right, new pages were all ranking for normal queries and traffic wasn't being hurt.
Some examples:
- https://www.google.com/search?q=Enter+the+World+of+Eichler+Design
- https://www.google.com/search?q=Eichler+History+flashbacks
- https://www.google.com/search?q=eichler+resources+on+the+web+books
Unless you're also seeing a loss in search traffic/rankings, I wouldn't sweat it much. They'll disappear eventually from the site: query, too. It just takes a while.
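And if you ever want to spot-check the redirect chains in bulk rather than clicking through the SERPs, something along these lines does the job (a rough sketch with placeholder URLs; it assumes the Python requests library):

```python
# Rough sketch: follow each old URL's redirect chain and print every hop,
# to confirm the 301s end at a 200 on the intended new page.
# The URLs below are placeholders.
import requests

OLD_URLS = [
    "https://www.example.com/old-page-1/",
    "https://www.example.com/old-page-2/",
]

for url in OLD_URLS:
    resp = requests.get(url, timeout=10)  # follows redirects by default
    print(url)
    for hop in resp.history:  # each redirect response along the way
        print(f"  {hop.status_code} -> {hop.headers.get('Location', '')}")
    print(f"  final: {resp.status_code} {resp.url}")
```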
-
Wow - do I ever feel privileged to have you respond! Thank you Rand.
You can see a batch of redirected URLs here < site:totheweb.com eichler >
I appreciate any suggestions.
Rosemary
-
Hi Rosemary - can you share some examples of the URLs and the queries that bring them up in search results? If so, we can likely do a diagnosis of what might be going on with Google and why the pages aren't correctly showing the redirected-to URLs.