Proper 301 in Place but Old Site Still Indexed In Google
-
So I have stumbled across an interesting issue with a new SEO client. They recently launched a new website and implemented a proper 301 redirect strategy at the page level for the new domain. What is interesting is that the new website is now indexed in Google, but the old domain is also still indexed. I even checked the Google cache date, and it shows the new website with a cache date of today.
The redirect strategy has been in place for about 30 days. Any thoughts or suggestions on how to get the old domain de-indexed in Google and get all authority passed to the new website?
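For reference, a page-level 301 of the kind described here is commonly implemented with Apache rewrite rules along these lines (a minimal sketch, assuming Apache with mod_rewrite; the domain names are hypothetical):

    # .htaccess on the old domain's document root (hypothetical domains)
    RewriteEngine On
    # Match any host variant of the old domain...
    RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
    # ...and send a 301 for every request to the same path on the new domain
    RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]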
-
How big is the site in question? How many pages are there to de-index?
What does Google Webmaster Tools tell you about the old domain? Does it show pages being removed from the index over time?
If you do a site:{old domain} query, can you see that the number of results being returned is gradually decreasing?
How have you implemented the redirects? Are they true 301s rather than 302s or redirect chains? (A quick header check is sketched after this list.)
Have you submitted a change of address request in Webmaster Tools?
On the new website, have you submitted a sitemap from the old website as well as the new one?
What does the backlink profile on the old domain look like? Can you start to get authoritative links to the old site updated? What about any embedded internal links in your content - have they also been updated?
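On the redirect question above, the quickest sanity check is to look at the raw response headers for a sample of old URLs and confirm a single 301 hop, not a 302 or a chain (a sketch; the URLs shown are hypothetical):

    # Expect a 301 status and a Location header pointing at the new domain
    curl -sI http://www.old-domain.com/some-page/
    # HTTP/1.1 301 Moved Permanently
    # Location: http://www.new-domain.com/some-page/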
More guidance is available in Google's own documentation on moving a site.
-
It could just be me, kchandler, but I've seen it take as long as 8 months for old pages to get purged from Google's index, redirected or not. The redirect and the indexing are independent of one another.
-
Kyle -
Sorry this is so puzzling. The only other thing I can think of is that perhaps the older pages still exist and/or are being served by the server. For example, the .htaccess file might have /old-page.php redirecting to /new-page.php, but somehow old-page.php is still accessible. I'd look at caching, too. For example, our site, www.CustomerParadigm.com, uses Varnish for caching, so if we make a change to the site, we need to clear out that page or the change won't be reflected publicly.
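To illustrate, a stale copy of an old URL can usually be flushed from Varnish with a ban (a sketch, assuming shell access to the Varnish admin interface; the domain and path are hypothetical):

    # See what the server is actually returning for the old URL
    curl -sI http://www.example.com/old-page.php
    # If a cached 200 still comes back, ban the URL from the Varnish cache
    varnishadm "ban req.url ~ ^/old-page\.php$"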
Hope this helps?
-- Jeff
-
Hi Jeff, thank you for the quick response, it is truly appreciated.
Unfortunately I am not able to publicly release their URL in forums due to our contract. However, I can provide some feedback on your ideas.
- Different web servers - the website is the same and on the same hosting platform; they just updated their branding and, along with it, their domain name.
- WWW vs. non-WWW - I did a quick check and it looks like both versions of the old domain properly 301 redirect no matter what the subdomain. I am checking that with both my Chrome developer tools and checkmyheaders.com (a scripted version of this check is sketched below).
- Robots.txt on old server - as it relates to my first bullet, it is technically the same website and server, so the robots.txt is the same for the new website, just reflecting the new domain.
Are there any other things that I could look at for a sanity check? I have never seen a website not get de-indexed after a 301 redirect. Do you think I would need to submit something to Google Webmaster Tools for the old URLs/domains?
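The host-variant check mentioned above can be scripted so that no combination is missed (a sketch; the domain is hypothetical):

    # Check each common variant of the old host for a 301 + Location header
    for url in http://old-domain.com http://www.old-domain.com; do
      echo "== $url"
      curl -sI "$url" | grep -iE '^(HTTP|Location)'
    done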
Regards, Kyle
-
Without seeing the new and old sites, my first impression is that this might be caused by the older site still living on a different server from the new one. If that's the case, and the older server is still online, I'd check your DNS zone files to make sure the older site isn't somehow still accessible. I've seen cases where there are two A records for the www. version of a domain; not ideal, and it can cause issues. I'd also set the older site's pages to noindex/nofollow (via a meta robots tag or an X-Robots-Tag header rather than robots.txt - blocking crawling in robots.txt would actually keep Google from ever seeing the 301s).
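Checking the zone from the outside takes seconds (a sketch; the hostnames are hypothetical):

    # More than one A record here would confirm the duplicate-record scenario
    dig +short A www.old-domain.com
    dig +short A old-domain.com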
Hope this helps?
Thanks,
- Jeff