Cached version of my site is not showing content?
-
Hi mozzers,
I am a bit worried: I looked at the cached version of my site, and somehow the content is only partially showing up and the navigation has completely disappeared.
Where could this come from? What should I be doing?
Thanks!
-
Thanks guys!
When using Fetch and Render in Google Search Console everything looks normal, and I do see the navigation items, which is good, but the fetch comes back with a "Partial" status. Is that OK?
-
Hi There,
The cached version shows how your website looked when Google last cached it. It is possible that your site's scripts did not load at that time, which left the page only partially available to Google. I would recommend running tests and trying to resolve this before the page gets cached again. The two most likely reasons for the partial load are: 1. a caching issue on your website, or 2. scripts that take too long to load.
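If it helps, one quick way to narrow down which of those two it is: fetch the raw HTML (no JavaScript executed, roughly what a crawler gets before rendering) and check whether your navigation markup is present and how long the response takes. A minimal sketch, where the URL and the markers are placeholders for your own site:

```python
import time
import requests

URL = "https://www.example.com/"       # placeholder: the page whose cached copy looks broken
MARKERS = ["<nav", "main-navigation"]  # placeholders: strings that only appear in your navigation markup

start = time.time()
resp = requests.get(URL, timeout=30)
elapsed = time.time() - start

print(f"Status {resp.status_code}, fetched in {elapsed:.2f}s, {len(resp.text)} bytes")
for marker in MARKERS:
    print(f"'{marker}' present in raw HTML: {marker in resp.text}")
# If the markers are visible in your browser but missing here, the navigation is
# injected by JavaScript and may not survive into Google's cached copy.
```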
I hope this helps. Let me know if you have further questions.
Regards,
Vijay
-
Sometimes cached pages aren't an accurate representation of the page you're looking at, because some of the assets can't be loaded. What I do is go into Google Search Console and run Fetch as Google on the same page you just looked up as a cached version, then check the response and whether it matches what you see. Often the page fetches just as you see it as a user. With some of the more modern front-end techniques, though, it's possible that the page won't render correctly in the cached version even when the live version is fine.
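In addition to Fetch as Google, it can be worth checking whether the server or CDN returns different HTML to a crawler user-agent than to a browser, since that is another way the cached copy can end up differing from what you see. A minimal sketch, with the URL and the navigation marker as placeholders:

```python
import requests

URL = "https://www.example.com/"  # placeholder: the page you checked in the cache
NAV_MARKER = "<nav"               # placeholder: markup that should always be present

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=30)
    print(f"{name}: status {resp.status_code}, {len(resp.text)} bytes, "
          f"nav present: {NAV_MARKER in resp.text}")
# Large differences in size or status, or markup missing only for the crawler
# user-agent, suggest the server or CDN is serving crawlers a different page.
```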
-
Are you using CloudFlare? If so, you'll have to play with the different settings; you might have something interacting badly with your content.
Purge the cache for starters (see the sketch below), then tweak the settings: remove AMP, uncheck the minified HTML/CSS/JavaScript options, and so on.
If you have another type of CDN and want me to take a look and see if I can find a solution, feel free to let me know. Of course, if you're hiding your IP behind it and don't want to leak which CDN you're using, I totally understand.
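For what it's worth, the purge doesn't have to be done from the dashboard; Cloudflare also exposes it via its API. A minimal sketch, assuming the zone ID and API token placeholders are replaced with your own credentials:

```python
import requests

ZONE_ID = "YOUR_ZONE_ID"      # placeholder: shown on the Cloudflare dashboard overview page
API_TOKEN = "YOUR_API_TOKEN"  # placeholder: a token with cache purge permission

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/purge_cache",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"purge_everything": True},  # or {"files": ["https://www.example.com/page"]} for specific URLs
    timeout=30,
)
print(resp.json())  # "success": true means the cache was purged
```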
Related Questions
-
Site Migration - Pagination
Hi, We are migrating our website and one issue we are facing is how to handle paginated content in our categories. Our new website will have the same structure but with different URLs. Should we 301 redirect all the paginated content (if crawled by Google) to the URL of the main category? To put this into an example:
Old URLs: www.example.com/technology/tvs (main category of TVs and also page 1), www.example.com/technology/tvs?v=0&page=2 (page 2 of TVs).
New URLs: www.example.com/soundvision/tvs (main category of TVs and also page 1), www.example.com/soundvision/tvs?page=2 (page 2 of TVs).
Should we redirect all of the old TV URLs (including the paginated ones) to www.example.com/soundvision/tvs? There is no rel next/prev tag on our site and no canonicals. Also, there is a view-all-products page in each category, BUT it doesn't contain all the products (the maximum is 100 per page; yes, the view-all page is also paginated). The same paginated view-all-products page will exist on the new website as well. I checked Google Search Console, and Google has decided to treat the first page, www.example.com/technology/tvs, as the canonical page. Also, all the organic traffic to our categories goes to these pages (the main category page, i.e. the first page). I would appreciate any thoughts on this.
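If the decision is to point every old paginated URL at the new main category page, the mapping described above boils down to a simple rule: swap /technology/ for /soundvision/ and drop the query string. A hypothetical sketch of that redirect map (the URL patterns are the examples from the question; this only illustrates the proposed 301 target, it is not a recommendation):

```python
from urllib.parse import urlsplit

OLD_SECTION = "/technology/"
NEW_SECTION = "/soundvision/"

def redirect_target(old_url: str) -> str:
    """Map an old category URL (paginated or not) to the new main category URL."""
    parts = urlsplit(old_url)
    # Swap the section and drop any query string (?v=0&page=2 etc.),
    # so every paginated variant 301s to the new main category page.
    new_path = parts.path.replace(OLD_SECTION, NEW_SECTION, 1)
    return f"{parts.scheme}://{parts.netloc}{new_path}"

print(redirect_target("https://www.example.com/technology/tvs?v=0&page=2"))
# -> https://www.example.com/soundvision/tvs
```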
Intermediate & Advanced SEO | HellasSITES
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) about this topic, as there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (1 subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve:
Avoid duplicated content issues across the board considering that both the ecommerce part of the site and the blog bit are being replicated for each subfolders in their own language. Correct me if I'm wrong but using our copywriters to translate the content and adding the right hreflang tags should do. But then comes the second problem: how to deal with duplicated content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. Need for such structure to be maintained (it's not possible to consolidate the same language within one single subfolder, for example),
2. Articles from one subfolder to another can't be canonicalized as it would mess with our internal tracking tools,
3. The amount of content being published prevents us from getting bespoke content for each region of the world with the same spoken language. Given those constraints, I can't see a way to solve this, and it seems that I'm cursed to live with those duplicated content red flags right up my nose.
Am I right, or can you think of anything to sort that out? Many thanks, Ghill
Intermediate & Advanced SEO | GhillC
-
Removing duplicate content
Due to URL changes and parameters on our ecommerce sites, we have a massive amount of duplicate pages indexed by Google, sometimes up to 5 duplicate pages with different URLs.
1. We've instituted canonical tags site-wide.
2. We are using the parameters function in Webmaster Tools.
3. We are using 301 redirects on all of the obsolete URLs.
4. I have had many of the pages fetched so that Google can see and index the 301s and canonicals.
5. I created HTML sitemaps with the duplicate URLs, and had Google fetch and index the sitemap so that the dupes would get crawled and deindexed.
None of these seems to be terribly effective. Google is indexing pages with parameters in spite of the parameter (clicksource) being called out in GWT. Pages with obsolete URLs are indexed in spite of them having 301 redirects. Google also appears to be ignoring many of our canonical tags, despite the pages being identical. Any ideas on how to clean up the mess?
Intermediate & Advanced SEO | AMHC
-
Regional and Global Site
We have numerous versions of what is basically the same site, targeting different countries such as the United States, the United Kingdom, and South Africa. These websites use TLDs to designate the region, for example .co.uk and .co.za. I believe this is sufficient (with a little help from Google Webmaster Tools) to convince the search engines which site is for which region. My question is how we tell the search engines to send traffic from regions other than the above to our global site, which would have a .com TLD. For example, we don't have a Brazilian site, so how do we drive traffic from Brazil to our global .com site? Many thanks, Jason
Intermediate & Advanced SEO | Clickmetrics
-
Other domains hosted on same server showing up in SERP for 1st site's keywords
For the website in question, the first domain alphabetically on the shared hosting space, strange search results are appearing on the SERP for keywords associated with the site. Here is an example: a search for "unique company name" shows www.uniquecompanyname.com as the top result. But on pages 2 and 3, we are getting results for the same content but for domains hosted on the same server. Here are some examples with the domain names replaced:

UNIQUE DOMAIN NAME PAGE TITLE
ftp.DOMAIN2.com/?action=news&id=63
META DESCRIPTION TEXT

UNIQUE DOMAIN NAME PAGE TITLE 2
www.DOMAIN3.com/?action=news&id=120
META DESCRIPTION TEXT2

UNIQUE DOMAIN NAME PAGE TITLE 2
www.DOMAIN4.com/?action=news&id=120
META DESCRIPTION TEXT2

UNIQUE DOMAIN NAME PAGE TITLE 3
mail.DOMAIN5.com/?action=category&id=17
META DESCRIPTION TEXT3

ns5.DOMAIN6.com/?action=article&id=27

There are more, but those are just some examples. These other domain names being listed are other customer domains on the same VPS shared server. When clicking the result, the browser URL still shows the other customer's domain name, but the content is usually the 404 page. The page title and meta description on that page are not displayed the same as on the SERP. As far as we can tell, this is the only domain this is occurring for. So far, no crawl errors detected in Webmaster Tools, and the Moz crawl is not completed yet.
Intermediate & Advanced SEO | Motava
-
Micro sites?
Hi, I have been speaking to SEO firms regarding strategies, and they mentioned setting up microsites under domains that are relevant, e.g. setting up armanidomain.co.uk and using it as a blog-type site to post all info, product reviews, and news relating to Armani. What are people's thoughts on this? Does it work? Is it worth the effort? I'm not so sure, but obviously looking for ideas. Cheers
Intermediate & Advanced SEO | YNWA
-
Copying my Facebook content to website considered duplicate content?
I write career advice on Facebook on a daily basis. On my homepage, users can see the most recent 4-5 feeds (using the FB social media plugin). I am thinking of creating a page on my website where visitors can see all my previous FB feeds. Would this be considered duplicate content if I copy and paste the info, whereas if I use a Facebook social media plugin it would not be? I am working on increasing content on my website and feel incorporating FB feeds would make sense. Thank you
Intermediate & Advanced SEO | knielsen
-
Badges for a B2B site
I love this SEO tactic, but it seems hard to get people to adopt it. Has anyone seen a successful badge campaign for a B2B site? Please provide examples if you can.
Intermediate & Advanced SEO | DavidKonigsberg