Reasons Why Our Website Pages Randomly Load Without Content
-
I know this is not a marketing question, but this community is very dev-savvy, so I'm hoping someone can help me. At random times we're finding that our website pages load without the main body content. The header, footer and navigation load just fine. Refreshing fixes it, but that's not a solution.
- Happens on Chrome, IE and Firefox, testing with multiple browser versions
- Happens across various page types - but seems to be only the main content section/container
- Happens while on the company network, as well as externally
- Happens after deleting cookies, temporary internet files and restarting computer
- We are using a CMS that is virtually unheard of - Bridgeline/Iapps
- Codebase is .NET
Our IT/Dev group keeps pushing back, blaming it on cookies or Chrome plugins because they apparently are unable to "recreate the problem". This has been going on for months and it's a terrible experience for users. It's also not great when landing PPC visitors on pages that load with no content. If anyone has ideas as to why this may be happening, I would really appreciate it.
I'm not sure if links are allowed, but today the issue happened on this page: serversdirect.com/dm/geek-biz
Linking to an image example below
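Since the dev team can't recreate the problem, one option is to let the page report the failure itself whenever it happens to a real visitor. Below is a minimal client-side sketch; the #main-content selector and /log-empty-page endpoint are hypothetical placeholders, not anything from this site, so substitute the real container selector and whatever logging mechanism is available:

<script>
// Sketch: after the page settles, check whether the main content container
// ended up empty and fire a logging beacon if so. Both the selector and the
// endpoint below are hypothetical placeholders.
window.addEventListener('load', function () {
  setTimeout(function () {
    var main = document.querySelector('#main-content');
    var text = main ? (main.textContent || main.innerText || '') : '';
    if (main && text.replace(/\s+/g, '') === '') {
      var beacon = new Image(); // an image GET works even in old IE
      beacon.src = '/log-empty-page?url=' + encodeURIComponent(location.href) +
                   '&t=' + Date.now();
    }
  }, 2000); // allow late-running scripts a moment to inject content
});
</script>

Even a few days of these logs would give IT/Dev hard numbers on how often the blank loads occur and on which pages.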
-
Thanks Gazzerman - this helps as well. Yes, I agree the site definitely needs SEO attention. It was not until recently that a more experienced team was brought on to 'fix the site'.
-
I took one look at the code and was amazed by how much of it is inline. Most of it should be in external JS and CSS files. I noticed that many of the solutions pages don't even have title tags! What's going on there? That needs SEO attention and is a big no-no for coding. This leads me to believe the code needs very close inspection. It surely can't take that long to strip a whole bunch of that code out of each page or template.
If a page is slow to load, or pulls in external JS files, scripts can try to execute before the code they depend on has loaded. This is probably what is happening some of the time, and it would explain the inconsistent nature of the failures.
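As an illustration, any inline script that fills a container can be wrapped in a readiness guard so it never runs before the document is parsed and its dependencies exist. This is a generic sketch; renderMainContent is a made-up stand-in for whatever function the CMS actually calls:

// Sketch: defer DOM-dependent work until the document is parsed.
// renderMainContent is a hypothetical stand-in for the page's real code.
function whenReady(fn) {
  if (document.readyState === 'loading') {
    document.addEventListener('DOMContentLoaded', fn);
  } else {
    fn(); // the DOM is already parsed, run immediately
  }
}

whenReady(function () {
  if (typeof renderMainContent === 'function') {
    renderMainContent(); // skip silently if the external file never loaded
  }
});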
Run the site through a few validators. I am sure you will find plenty of issues.
-
Oddly enough, it only seemed to happen on the /Servers page - for me at least. The first time I loaded the page, I got a broken image and an X-Out/Close box. That may help you home in on the problem.
-
Travis - thanks for your response. This gives me a better idea of where to start looking. Honestly, this site was very poorly developed in my opinion. It was before my time, and much of it was outsourced to junior teams overseas, which is evident when looking at the source code. But I am stuck with it for now.
-
Found the site. I'm able to recreate the problem. Check the image attachment.
I ran it through tools.pingdom.com; cast_sender.js, a script injected by a Chrome extension, isn't loading. But something makes me think there's an issue with the CSS that only crops up from time to time, as if the JS blocks it every once in a while. There's both a Global.css and a Global.js in the head. There are also folder levels called /Script%20Library/ and /Style%20Library/.
There's a ton of JavaScript on the site, which I don't have time to get into. And honestly, I'm more of a WordPress guy. I managed to make the page load fail a few times out of a dozen or so tries. I think something in the JavaScript is causing the issue.
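One way to confirm that theory is to log every uncaught script error and see what fires on the loads that come up blank. A sketch along these lines could be dropped into the head; the /log-js-error endpoint is invented for the example:

// Sketch: record uncaught script errors so a failed load leaves a trace.
// '/log-js-error' is a hypothetical endpoint; any logging mechanism works.
window.onerror = function (message, source, lineno) {
  var beacon = new Image();
  beacon.src = '/log-js-error?msg=' + encodeURIComponent(message || '') +
               '&src=' + encodeURIComponent(source || '') +
               '&line=' + (lineno || 0);
  return false; // still let the browser surface the error in the console
};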
Related Questions
-
Reason for robots.txt file blocking products on category pages?
Hi, I have a website with thousands of products. On the category pages, all the products are linked to with the parameter "?cgid" in the URL. But "?cgid" is also blocked in the robots.txt file for some reason, so I'm thinking it's stopping all my products getting crawled by Google. Am I right here? Is there any reason why a website would want to block so many URLs? I've only been here a week and the site's getting great traffic, so I don't want to go breaking it! Thanks
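For reference, a blanket block on that parameter would typically look something like the lines below in robots.txt (the exact pattern on the live file may differ, so check it before changing anything):

User-agent: *
Disallow: /*?cgid

If the products have no other crawlable URL, a rule like this would indeed keep Google from crawling them, so it's worth testing a few product URLs in Search Console's robots.txt tester before touching it.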
Web Design | Frankie-BTDublin0
-
Is having a site map page necessary?
Hello all! I know it's important to reference your XML sitemap in your robots.txt file, and to submit the XML sitemap to Google and Bing. However, I am wondering whether it's also beneficial for your site's SEO to have a sitemap page displayed on your website, or is that just redundant if you have already done the two things above with your XML sitemap? Thanks in advance!
Web Design | Myles920
-
Website Redesign and Migration to Squarespace killed my Ranking
My old website was dated, ugly, impossible to update and a mess of hard-coded pages and WordPress, but we were ranking #1 in organic search for our keywords. I just redesigned the website using Squarespace. I kept most of the same text on the pages (for keywords) and kept the same meta tags and title tags for each page as much as possible. Once I was satisfied that I had done as much on-page optimization as I could, I changed the DNS record at our domain registrar so that it would point to our new website on the Squarespace host, and our new website was live! ...Then I watched in dismay as our ranking fell into oblivion. I think this might have something to do with not doing any 301 redirects from the old website and losing all of my link juice. Is this the case? And, if so, how do I fix it? Our website URL is www.kanataskinclinic.ca. Thanks
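For what it's worth, Squarespace handles 301s through its URL Mappings panel, one redirect per line in the format sketched below; the paths here are illustrative, so map each real old URL to its closest new equivalent:

/old-services-page -> /services 301
/about-us.html -> /about 301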
Web Design | StillLearning1
-
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice?

I contacted the host provider, and his initial thought was that WordPress was causing the https problem: e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL - it's just waiting as part of the hosting package, just in case - but I found that the SSL certificate is still showing up during a crawl. It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version.

I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts on that?

As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.

One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the htaccess only to then create a https to http rewrite rule; https shouldn't even be a crawlable function of the site at all:

RewriteEngine On
RewriteCond %{HTTPS} off

Or disable the SSL completely for now until it becomes a necessity for the website.

I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term. Kind regards
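For reference, a typical Apache rule for forcing everything back to http looks like the sketch below (test it on a staging copy first, since a wrong rewrite can loop):

RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]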
Web Design | SEOguy10
-
Incorporating Spanish Page/Site
We bought an exact-match domain (in Spanish) to incorporate into our regular website for a particular keyword. This is our first attempt at this, and while we do have Spanish-speaking staff who will translate/create a nice, quality page, we're not going to redo every page in Spanish. Any advice on how to implement this? Do I need to create a whole other website in Spanish? Will that be duplicate content if I do? Can I just set it up to show the first page in Spanish, but have anything else they click on redirect to our site? I'm pretty clueless on this, so if anything I've suggested is off-the-wall or a violation, I'm really just spit-balling, trying to figure out how to implement this. Thanks, Ruben
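One common pattern, sketched below with placeholder URLs, is to publish the Spanish page(s) alongside the English site and link the language versions with hreflang annotations in the head of each page pair, which tells Google they are translations rather than duplicates:

<link rel="alternate" hreflang="en" href="https://www.example.com/services/" />
<link rel="alternate" hreflang="es" href="https://www.example.com/es/servicios/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/services/" />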
Web Design | KempRugeLawGroup0
-
Spanish website indexed in English: redirect to Spanish or English version if I do a new website design?
Hi Moz users, I have this problem. We have a website in the Spanish language, but Google crawls it in English (the reasons aren't important). We remade the entire website and now we are planning the move. The new website will have different language versions: English, Spanish and Portuguese. Somebody told me that we have to redirect the old URLs (crawled in English) to the new English versions, not to the Spanish ones (the actual language of the original pages). Example: URL1, language Spanish, crawled in English --> redirect to the English version. The other option would be to redirect to the new Spanish version, which is what the visitor expects to find: URL1, language Spanish, crawled in English --> redirect to the Spanish version. What do you think? Which is the better option?
Web Design | NachoRetta0
-
Subdomains, duplicate content and microsites
I work for a website that generates a large amount of unique, quality content. This website, though, has had development issues with our web builder, and they are going to separate the site into different subdomains upon launch. It's a scholarly site, so the subdomains will be things like history and science. Don't ask why we aren't using subdirectories, because trust me, I wish we could. So we have to use subdomains, and I have a couple of questions. Will the duplication of code, since all subdomains will have the same design and look, heavily penalize us, and is there any way around that? Also, if we generate a good amount of high-quality content on each site, could we link all those sites to our other site as a possible benefit for link building? And finally, would footer links linking all the subdomains be a good thing to put in?
Web Design | mdorville0
-
Page Size
Hello Mozers, what is the best page size (or the maximum page size, in KB) for a home page or a second-level page? Thank you - I appreciate you looking at this question. Vijay
Web Design | vijayvasu0