Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How long after an HTTPS migration does Google show a new sitemap as indexed in Search Console?
-
We migrated to HTTPS 4 days ago and followed best practices.
In Search Console, 80% of our sitemaps still appear as "pending", and among the sitemaps that have been processed, less than 1% of submitted pages appear as indexed. Is this normal?
How long does it take for Google to index pages from a sitemap?
Before the HTTPS migration nearly all our pages were indexed, and the crawl stats show that since the migration Google has crawled a number of pages each day that corresponds to the number of pages submitted in the sitemap. The sitemap and crawl stats show no errors.
-
Thanks Stephan.
It took nearly a month for Search Console to show the majority of the pages in our sitemap as indexed, even though the pages showed up much earlier in the SERPs. We had split it into 30 different sitemaps. Later we also published a sitemap index and saw a nice increase in indexed pages a few days later, which may have been related.
Google is now indexing 88% of our sitemap.
Do you think 88% is a roughly normal percentage for a site of this size, or would you normally expect a higher percentage of indexed sitemap pages and investigate further for pages that Google may consider thin content? I can rule out navigation as a reason. -
Did the "pending" message go away in the end? Unfortunately you're fairly limited in what you can do about this. The message likely indicates that one of the following was true:
- Google had difficulty accessing the sitemap (though you did say there were no errors)
- Processing was taking a long time because of the large number of URLs
You could try splitting your sitemap up into several smaller ones, and using a sitemap index. Or have you done this already? By splitting it into several sitemaps, you can at least see whether some index and some don't, whether there do turn out to be issues with some of the URLs listed there, etc.
You can also prioritise the most important pages by putting them into their own sitemap (linked to from the sitemap index, of course), and submitting that one first. So at least if everything else takes longer you'll get your most important landing pages indexed.
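The splitting approach suggested above can be sketched in a few lines (a minimal illustration, not a definitive implementation; the file names and base URL are hypothetical, and the 50,000-URL-per-file cap comes from the sitemaps.org protocol):

```python
# Sketch: split a large URL list into sitemap files (max 50,000 URLs each,
# per the sitemaps.org protocol) and emit a sitemap index referencing them.
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000

def build_sitemap(urls):
    """Return one <urlset> sitemap document for up to 50,000 URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f'{entries}\n</urlset>')

def build_sitemap_index(sitemap_urls):
    """Return a <sitemapindex> document listing the individual sitemap files."""
    entries = "\n".join(f"  <sitemap><loc>{escape(u)}</loc></sitemap>"
                        for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f'{entries}\n</sitemapindex>')

def split_into_sitemaps(all_urls, base="https://www.example.com"):
    """Chunk the URL list; return (index_xml, [(filename, sitemap_xml), ...])."""
    chunks = [all_urls[i:i + MAX_URLS_PER_SITEMAP]
              for i in range(0, len(all_urls), MAX_URLS_PER_SITEMAP)]
    files = [(f"sitemap-{n}.xml", build_sitemap(chunk))
             for n, chunk in enumerate(chunks, start=1)]
    index = build_sitemap_index(f"{base}/{name}" for name, _ in files)
    return index, files
```

With the 170,000 pages mentioned in this thread, this would produce four sitemap files plus one index file to submit in Search Console.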
-
Update: 10 days have now passed since our migration to HTTPS and the sitemap upload, and the situation is still the same.
-
Google has been crawling all our pages over the last few days; I can see it in the crawl stats.
My concern is that:
- the majority of my sitemaps are still showing as "pending" 3 days after I originally submitted them;
- the sitemaps that have been processed show less than 1% of my submitted pages as indexed.
We have around 170,000 pages in our sitemap.
So I wonder whether this is an unusual or a normal delay from Google Search Console.
-
It's difficult to say. It depends on many factors (the importance of your site in Google's eyes, when they last crawled your site, the relevance of the topic in general, etc.), BUT you can speed up the process a lot, i.e. initiate it on your own. You don't have to wait until Google recrawls your site at random. Did you know?
Go to Search Console > Crawl > Fetch as Google, add your site's URL or the URL of a particular subpage, and press Fetch.
Google will recrawl that page very quickly. When I do that with a particular page (not the entire domain), it usually takes 1-2 days at most for it to be recrawled and indexed again.
Hope this helps.
Related Questions
-
Staging website got indexed by google
Our staging website got indexed by Google, and now Moz is showing all inbound links from the staging site. How should I remove those links and make the site noindex? Note: we have already added a meta NOINDEX in the head tag.
Intermediate & Advanced SEO | Asmi-Ta -
How old is 404 data from Google Search Console?
I was wondering how old the 404 data in Google Search Console actually is. Does anyone know over what kind of timespan a site's 404 data is compiled? How long do the 404s tend to take to disappear from Google Search Console once they are fixed?
Intermediate & Advanced SEO | McTaggart -
Which search engines should we submit our sitemap to?
Other than Google and Bing, which search engines should we submit our sitemap to?
Intermediate & Advanced SEO | NicheSocial -
How can I make a list of all URLs indexed by Google?
I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap. The site should have around 3,500 pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google. Anyone? (I basically want to build a sitemap of all the indexed spider-trap URLs, set up 301s on those, then ping Google with the "defective" sitemap so they can see what the site really looks like and remove those URLs, shrinking the site back to around 3,500 pages.)
Intermediate & Advanced SEO | Bryggselv.no -
Does Google index URLs that are always 302 redirected?
Hello community. Due to the architecture of our site, we have a bunch of URLs that are 302 redirected to the same URL plus a query string appended to it. For example: www.example.com/hello.html is 302 redirected to www.example.com/hello.html?___store=abc. The www.example.com/hello.html?___store=abc page also has a canonical link tag pointing to www.example.com/hello.html. In the above example, can www.example.com/hello.html ever be indexed by Google, as I assume Googlebot will always be redirected to www.example.com/hello.html?___store=abc and will never see www.example.com/hello.html? Thanks in advance for the help!
Intermediate & Advanced SEO | EcommRulz -
"Null" appearing as top keyword in "Content Keywords" under Google index in Google Search Console
Hi, "Null" is appearing as the top keyword in Google Search Console > Google Index > Content Keywords for our site http://goo.gl/cKaQ4K . We do not use "null" as a keyword on the site, and we are not able to find out why Google is treating it as one. Is anyone else facing this issue? Thanks & Regards
Intermediate & Advanced SEO | vivekrathore -
Google Indexing Feedburner Links???
I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance: http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html and http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com) Now my Feedburner feed is set to "noindex" and it's always been that way. The canonical tag on the webpage is set to: `<link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' />` The robots tag is set to: `<meta name="robots" content="index,follow,noodp" />` I found out that there are scraper sites that are linking to my content using the Feedburner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?
Intermediate & Advanced SEO | sbrault74 -
Our login pages are being indexed by Google - How do you remove them?
Each of our login pages shows up under a different subdomain of our website. Currently these are accessible to Google, which is a huge competitive advantage for our competitors looking for our client list. We've done a few things to try to rectify the problem:
- added noindex/noarchive to each login page
- added a robots.txt to all subdomains to block search engines
- gone into Webmaster Tools, added the subdomain of one of our bigger clients, and requested its removal from Google (this would be great to do for every subdomain, but we have a LOT of clients and it would require tons of backend work)
Other than the last option, is there something we can do that will remove the subdomains from being viewed in search engines? We know the robots.txt files are working, since the message on search results says: "A description for this result is not available because of this site's robots.txt – learn more." But we'd like the whole link to disappear. Any suggestions?
Intermediate & Advanced SEO | desmond.liang
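The setup described in the question above can be sketched roughly as follows (a minimal illustration, not the poster's actual files). One caveat worth noting: a robots.txt `Disallow` stops Googlebot from fetching the page at all, so it can never see the meta noindex, which is consistent with the URL staying listed with only the "no description available" message.

```
# robots.txt served on each client subdomain (blocks crawling entirely)
User-agent: *
Disallow: /

<!-- In the <head> of each login page; only honoured if crawling is allowed -->
<meta name="robots" content="noindex, noarchive">
```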