Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How long for Google to de-index old pages on my site?
-
I launched my redesigned website 4 days ago. I submitted a new sitemap and requested indexing in Search Console (Google Webmaster Tools).
I see that when I google my site, my new Open Graph settings are coming up correctly.
Still, a lot of my old site pages are definitely still indexed in Google. How long will it take for Google to drop or "de-index" my old pages?
Because of the way I restructured my website, a lot of the items are no longer available on my site. This is on purpose. I'm a graphic designer, and with the new change I removed many old portfolio items, as well as any references to web design, since I will no longer be offering that service.
My site is the following:
http://studio35design.com -
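Since the resubmitted sitemap is the main de-indexing signal here, it's worth double-checking that it lists only the pages that should remain indexed. A minimal sketch of what that sitemap might look like (the paths below are placeholders, not the site's actual pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only live URLs; omit removed portfolio and web design pages -->
  <url>
    <loc>http://studio35design.com/</loc>
  </url>
  <url>
    <loc>http://studio35design.com/portfolio/</loc>
  </url>
</urlset>
```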
Awesome! Thanks, Bas. That's a great idea. I'll give it a shot.
-
Hi Ruben,
Have you tried removing these old pages from the index via Google Webmaster Tools?
https://www.google.com/webmasters/tools/url-removal
You can only remove them temporarily, but that can bridge the gap while the permanent de-indexing you've already set in motion (by uploading a new sitemap) takes effect.
I did that about a week ago and the effect was noticeable within a couple of days.
Bas
-
Hi Martijn. Thanks for your response. My primary concern is the links that appear below my main link in the SERP. See screenshot. Half of those no longer work. Sure, they 301 redirect, but it's still messy.
-
Hi Mark. Thanks for your response. As far as I can tell, all links now have 301s. I'm sure there might be the odd page that I forgot, but I'll be monitoring Search Console for errors.
Your suggestion about the specific page to redirect web design traffic is a good one. I'll think about it.
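To catch the odd forgotten page before Search Console reports it, a quick script can check the raw response code of each old URL. A minimal sketch, assuming you keep a list of the old URLs (the paths below are placeholders):

```python
import http.client
from urllib.parse import urlparse

def status_of(url):
    """Return the raw HTTP status of a URL, without following redirects."""
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

def classify(status):
    """Map an HTTP status to a verdict for an old, removed URL."""
    if status in (301, 308):
        return "redirected"   # good: permanent redirect in place
    if status in (404, 410):
        return "gone"         # acceptable: Google will drop it over time
    return "check"            # anything else (200, 302, 500...) needs a look

# Placeholder list of old URLs to audit:
old_urls = [
    "http://studio35design.com/old-portfolio-item",
    "http://studio35design.com/web-design",
]

# Usage (hits the network):
#   for url in old_urls:
#       print(url, "->", classify(status_of(url)))
```

Anything flagged as "check" is a candidate for a missing 301.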
-
Hi,
Yes, this really depends on how frequently Google crawls your site. Do these pages now return a 404 error? If so, I would suggest 301 redirecting them to other pages on your site. See this useful Moz blog post about 301 redirects: https://a-moz.groupbuyseo.org/blog/heres-how-to-keep-301-redirects-from-ruining-your-seo
You also mentioned that you don't offer the web design service anymore. If you still get some traffic there, you could make a specific page stating that you no longer offer web design, but pointing visitors to other relevant services.
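On an Apache host, these 301 redirects are usually a few lines of .htaccess. A sketch with placeholder paths (substitute the real old and new URLs, and note that mod_alias must be enabled):

```apache
# Permanent redirects for removed pages (placeholder paths)
Redirect 301 /web-design /services
Redirect 301 /portfolio/old-item /portfolio

# Or pattern-based, if a whole section moved:
RedirectMatch 301 ^/web-design/.*$ /services
```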
-
This can sometimes take a very long time. For bigger sites I could see it taking months; for smaller sites it depends on how frequently Google crawls your site. If Google is not very active on your site (because the content isn't the kind that updates often), it may decide not to come back very often, saving its crawl resources for other content elsewhere on the web.
In your case I would focus on making sure the new site and structure work flawlessly, and less on de-indexing the old pages. I can't imagine they still receive a ton of traffic. Without any doubt, 4 days is still very early for Google to pick up the changes.
Hope this helps!