Fetch as Google temporarily lifting a penalty?
-
Hi, I was wondering if anyone has seen this behaviour before? I haven't!
We have around 20 sites, and every one of them has lost all of its rankings (not in the index at all) since the Medic update, except where a location is appended to the end of a keyword.
I set to work trying to identify a common issue on each site, and began by fixing the speed issues flagged in PageSpeed Insights. On one site I realised that after I had improved the speed score and then clicked "Fetch as Google", the rankings for that site all returned within seconds.
I did the same on a different site, with exactly the same result. Cue me jumping around the office in delight! The pressure is off, people's jobs are safe, have a cup of tea and relax.
Unfortunately, the relief only lasted 6-12 hours, and then the rankings went again. To me it seems as though the sites are all suffering from some kind of on-page penalty that is lifted until the page can be assessed again, at which point the penalty is reapplied.
Not one to give up, I set about methodically making changes until I found the issue. So far I have completely rewritten one site, reduced the overuse of keywords, and added over 2,000 words to the homepage. I clicked Fetch as Google and the site came back... for 6 hours. So I then gave the site a completely fresh redesign, clicked Fetch as Google again, and got the same result. Since doing all that, I have switched over to HTTPS with 301 redirects, and now the site is completely gone and won't come back after fetching as Google. Ugh!
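In case the HTTPS migration itself is the suspect, here is a minimal sketch of how the redirect chain can be checked hop by hop (the URL is just a placeholder for one of our affected pages):

```python
import requests
from urllib.parse import urljoin

# Placeholder URL -- substitute one of the affected pages.
url = "http://example.com/"

for _ in range(10):  # guard against redirect loops
    resp = requests.get(url, allow_redirects=False, timeout=30)
    print(resp.status_code, url)
    location = resp.headers.get("Location")
    if not location:
        break
    url = urljoin(url, location)  # the Location header may be relative

# A clean HTTP -> HTTPS move should show a single 301 straight to the
# https:// version of the same path; 302s, chains, or loops are worth fixing.
```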
So before I dig myself even deeper, does anyone have any ideas?
Thanks.
-
Unfortunately it's going to be difficult to dig deeper into this without knowing the site - are you able to share the details?
I'm with Martijn that there should be no connection between these features. The only thing I can come up with that could plausibly cause anything like what you are seeing is something related to JavaScript execution (and this would not be a feature working as intended). We know that there is a delay between initial indexing and JavaScript indexing. It seems plausible that if there were a serious enough issue with JS execution or indexing, either that step could fail, or it could make the site look spammy enough to get penalised, and we could conceivably see the behaviour you describe: the site ranks until Google executes the JS.
My first step in investigating this would be to look at the JS requirements on your site and compare the page with and without JS rendering (and check whether there is any issue with the Chrome version that, as far as we know, executes the JS render on Google's side); a rough sketch of that comparison is below.
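For example, something along these lines would let you diff the raw HTML against the rendered DOM (a sketch only: example.com is a placeholder, and a current headless Chromium is merely an approximation of the older Chrome build that Google's renderer uses):

```python
import requests
from playwright.sync_api import sync_playwright

# Placeholder URL -- substitute a page that lost its rankings.
url = "https://example.com/"

# 1. The HTML as a non-rendering crawler sees it.
raw_html = requests.get(url, timeout=30).text

# 2. The DOM after JavaScript has run, via headless Chromium.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(url, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print(f"raw: {len(raw_html)} chars, rendered: {len(rendered_html)} chars")
# A large gap suggests that important content, links, or meta tags only
# exist after JS execution -- exactly the kind of thing that could behave
# differently between initial indexing and the delayed JS render.
```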
Interested to hear if you discover anything more.
-
Hey, it’s good to have a fresh pair of eyes on it. There may be something simple that I’ve missed.
I check rankings on my Mac, on two web servers in different locations via Remote Desktop, and with two independent rank checkers, and they all show the same thing... recovery for 12-ish hours, along with traffic and conversions. Then nothing.
When the rankings disappear I can hit Fetch as Google again straight away and they come straight back. If I leave it for days, they’re gone for days, until I hit fetch, and then they’re straight back.
Cheers
-
OK, that still doesn't mean that they're not personalized. But I'll skip that part for now.
In the end, the changes that you're seeing aren't triggered by what you're doing with Fetch as Google. I'll leave it to others to see if they can shine a light on the situation.
-
Hi,
Thanks,
They’re not personalised, as my rank checkers don’t show personalised results.
-
Hi,
I'm afraid I have to wake you from this dream: there is no connection whatsoever between rankings and the Fetch as Google feature in Google Search Console. What is likely happening is that you're already getting personalized results: within a certain timeframe the results will look different, because Google thinks you've already seen the first set of results the first time you searched.
Fetch as Google doesn't provide any signal to the regular ranking engines to say: "Hey, we've fetched something new and now it's going to make an impact." Certainly not at the speed that you're describing (within seconds). If you want to rule personalization out, check in a clean session with personalization switched off, as in the sketch below.
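A rough sketch of building such a depersonalized check URL (the pws=0 parameter has historically disabled personalized results, and gl/hl pin the country and language; the query is a placeholder):

```python
from urllib.parse import urlencode

# Placeholder query -- substitute one of your keyword + location terms.
params = {
    "q": "keyword location",
    "pws": "0",   # historically: switch off personalised web search
    "gl": "uk",   # country to geo-target the results to
    "hl": "en",   # interface language
}
print("https://www.google.co.uk/search?" + urlencode(params))
```

Open that URL in a private/incognito window and compare what you see with what your rank checkers report.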
Martijn.