Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
404 errors firing on nonexistent or unpublished pages – should we be concerned for SEO?
-
Hello!
We keep getting "critical crawler" notifications in Moz because of pages firing 404 codes. We've checked each page and know that we are not linking to them anywhere on our site; they are not published, and they are not indexed in Google. This has only happened since we migrated our blog to HubSpot, so we think it has something to do with the test pages their developers had set up, which are just lingering in our code somewhere.
However, we are still concerned that having these codes fire implies negative consequences for our SEO. Is this the case? Should we be concerned about these 404 codes even though the pages at those URLs don't actually exist?
Thank you!
Chloe -
If the errors are detected by both Moz's crawler and Google Search Console at the same time, then I'd be much more concerned. It also depends on the volume: if there are only three or so, it's probably not worth your time to sort out; if there are hundreds or thousands, you'll want to look into it.
If there are hidden links in the code which Moz is picking up on (that's how Moz's crawler works, by following links), then you can't really say "We've checked each page and know that we are not linking to them anywhere on our site" - the fact that the crawler found the links means they exist (even if you can't see or find them). You can check this for yourself with the sketch below. That is, of course, unless your site is on one of the unusual architectures that Rogerbot (Moz's crawler) has difficulties with. That shouldn't be your first assumption, though - he usually knows where he's going.
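A minimal sketch for hunting those links down yourself - this assumes Python with the third-party requests and beautifulsoup4 packages installed, and the starting URL is a placeholder you'd swap for a page Moz flagged:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

# Placeholder: swap in a page that Moz flagged as the source of the 404 links.
START_URL = "https://www.example.com/blog/"

resp = requests.get(START_URL, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# Gather every same-site link in the raw HTML - including ones buried in
# templates, footers, or leftover test markup that never render visibly.
site = urlparse(START_URL).netloc
links = {
    urljoin(START_URL, a["href"])
    for a in soup.find_all("a", href=True)
    if urlparse(urljoin(START_URL, a["href"])).netloc == site
}

for url in sorted(links):
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status == 404:
        # This URL *is* linked from START_URL, even if you can't see the link.
        print("404:", url)
```

If it prints any 404s, search the page source for the offending URL - leftover test-page links may well be hiding in a template partial or footer rather than the visible content.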
Where you say this:
"since we migrated our blog to Hubspot so we think it has something to do with the test pages their developers had set up" - pull them up on it! If their developers coded a load of errors into your site, that's their fault not yours and it should be their expense (not yours) to fix it
This is the page regarding their CMS:
https://www.hubspot.com/products/marketing/content-management-system
It does say "A Content Management System Built for Professional Marketers" - so migrating to it shouldn't cause loads of SEO problems, as SEO is still the largest chunk of most sites' online marketing and traffic. That should be nailed down: no problems, or at least fewer problems than your prior system.
In fact, HubSpot know that SEO is important for a CMS: https://www.hubspot.com/cms-and-seo - "Every marketer has been told that they need to consider SEO when creating content. But what makes SEO a unique marketing strategy that marketers should prioritize? And why should your CMS have tools that help you execute your SEO strategy?" - I would argue that a load of 404 errors could not be considered "tools that help you execute your SEO strategy".
Whether their developers messed up or their CMS is at fault is not really relevant. The main point is that the responsibility to sort it out should be on their side (not yours, IMO).
Related Questions
-
Local SEO - ranking the same page for multiple locations
Hi everyone, I am aware that the issue of local SEO has been approached numerous times, but the situation I'm dealing with is slightly different, so I'd love to receive your expert advice. I'm running the website of a property management company which services multiple locations (www.homevault.com). From our local offices in the city center, we also service neighboring towns and communities (e.g. we have an office in Charlotte, NC, from which we service Charlotte plus a dozen other towns nearby). We wanted to avoid creating dozens of extra local service pages, particularly since our offers are identical per metropolitan area and we're talking about 20-30 additional local pages for each area. Instead, we decided to create local service pages only for the main locations. Needless to say, we're now ranking for the main locations, but we're missing out on all searches for property management in neighboring towns (we're doing well on searches such as 'charlotte property management', but we're practically invisible for 'davidson property management', although we service that area as well). What we've done so far to try and fix the situation:
1. The current location pages do include descriptions of the areas we serve.
2. We've included 1-2 keywords for the satellite locations in the main location pages, but we're nowhere near the optimization needed to rank for local searches in neighboring towns (i.e. some main local service pages rank on pages 2-4 for satellite towns, so not good enough).
3. We've included the serviced areas in our local GMBs, directories, social media profiles, etc.
None of these solutions appears to work well. Should I go ahead and create the classic local pages for each and every town and optimize them for those particular keywords, even if the offer is practically the same and the number of pages risks going out of control? Any other better ideas? Many thanks in advance!
Intermediate & Advanced SEO | HomeVaultPM
Should I apply Canonical Links from my Landing Pages to Core Website Pages?
I am working on an SEO project for the website: https://wave.com.au/ There are some core website pages which we want to target for organic traffic, like this one: https://wave.com.au/doctors/medical-specialties/anaesthetist-jobs/ Then we basically have another version that is set up as a landing page and used for CPC campaigns: https://wave.com.au/anaesthetists/ Essentially, my question is: should I apply canonical links from the landing page versions to the core website pages (especially as I know they are only being utilised for CPC campaigns) so as to push link equity/juice across? Here is the GA data from January 1 - April 30, 2019 (Behavior > Site Content > All Pages):
Intermediate & Advanced SEO | Wavelength_International
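Before changing anything, it can help to see what canonical the landing page currently declares. A minimal sketch, assuming Python with the third-party requests and beautifulsoup4 packages (the URL pair comes from the question above):

```python
import requests
from bs4 import BeautifulSoup

# Landing-page -> core-page pair taken from the question.
pairs = {
    "https://wave.com.au/anaesthetists/":
        "https://wave.com.au/doctors/medical-specialties/anaesthetist-jobs/",
}

for landing, core in pairs.items():
    html = requests.get(landing, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    declared = tag["href"] if tag else "(no canonical tag found)"
    print(landing)
    print("  currently declares:", declared)
    print("  candidate canonical:", core)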
URL structure - Page Path vs No Page Path
We are currently rebuilding our URL structure for ecommerce websites. We have seen a lot of sites removing the page path on product pages, e.g. https://www.theiconic.co.nz/liberty-beach-blossom-shirt-680193.html versus what would normally be https://www.theiconic.co.nz/womens-clothing-tops/liberty-beach-blossom-shirt-680193.html Should we be removing the page path for a product page to keep the URL shorter, or should we keep it? I can see that we would lose the hierarchy juice to a product page, but I'm not sure what the right thing to do is.
Intermediate & Advanced SEO | Ashcastle
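If the path does get flattened, one option for keeping the lost hierarchy signal is BreadcrumbList structured data on the product page. A minimal sketch using only the Python standard library - the URLs come from the example in the question, while the category name is an assumed illustration:

```python
import json

# BreadcrumbList structured data (schema.org) can carry the category
# hierarchy even when the URL path no longer does.
breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Women's Clothing - Tops",
         "item": "https://www.theiconic.co.nz/womens-clothing-tops/"},
        {"@type": "ListItem", "position": 2, "name": "Liberty Beach Blossom Shirt",
         "item": "https://www.theiconic.co.nz/liberty-beach-blossom-shirt-680193.html"},
    ],
}

# Embed the output in the product page inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(breadcrumbs, indent=2))
```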
Do 403 Forbidden errors from website pages hurt rankings?
Hi All, I noticed that our website has a lot of 403 errors across different pages, found using the tool http://www.deadlinkchecker.com/. Do these errors hurt website rankings? Thanks
Intermediate & Advanced SEO | vtmoz
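A quick way to reproduce a link checker's findings and rule out false positives - a minimal sketch assuming Python with the third-party requests package; the URL list is a placeholder for your own sitemap or crawl export. Note that some servers return 403 only to bot-like requests, so real browsers (and Googlebot) may be unaffected:

```python
import concurrent.futures
import requests

# Placeholder list - in practice, feed in URLs from a sitemap or crawl export.
urls = [
    "https://www.example.com/",
    "https://www.example.com/some-page/",
]

def status_of(url):
    try:
        # Send a browser-style User-Agent to separate genuine 403s
        # from servers that merely block unknown bots.
        r = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
        return url, r.status_code
    except requests.RequestException as exc:
        return url, f"error: {exc}"

with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    for url, status in pool.map(status_of, urls):
        if status == 403:
            print("403 Forbidden:", url)
```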
SEO strategy for conversion-optimised home page
I'm working on a very conventional type of site with a home page (why come to us), methods we use, pricing, reviews, FAQs and contact us. After reading the Moz case study at http://www.conversion-rate-experts.com/seomoz-case-study/, I have been working on a conversion-optimised home page that consolidates much of the content from all these pages. At the bottom of the home page, I then plan to add a list of blog posts - "Want to read more? We have a lot of useful information on our blog. Here are the most popular articles:" - with articles that explain more about the methods we use, for example (content that was formerly on our methods page). Obviously this new blog will also have more interesting information (though a lot of it could actually be turned into pages). This radically changes the site into just a home page full of selling points and calls-to-action, plus a blog. I have some questions about this strategy:
1. How do we keep our search engine rankings for keywords such as "[our service] prices" or "[a particular method] London"? We rank quite well on Google for these, and searchers go straight to the relevant page. Shall we keep those pages active somewhere, even though the information is also on the home page?
2. Is a blog actually necessary here (SEO-wise)? The things I'm planning to write could easily be made into more pages.
3. Am I going about this completely wrong by using the CRO guide? Should this sort of page be reserved for landing pages?
The reason I'm considering a conversion-generating home page is that we pretty much only sell one service (although there are differences in how we do it on children vs. adults), and because we are quite niche, most of our traffic comes from organic sources. Thank you
Intermediate & Advanced SEO | LondonAli
Do 404 Pages from Broken Links Still Pass Link Equity?
Hi everyone, I've searched the Q&A section, and also Google, for about the past hour and couldn't find a clear answer on this. When inbound links point to a page that no longer exists, thus producing a 404 error page, is link equity/domain authority lost? We are migrating a large eCommerce website and have hundreds of pages with little to no traffic that have legacy 301 redirects pointing to their URLs. I'm trying to decide how necessary it is to keep these redirects. I'm not concerned about the page authority of the pages with little traffic; I'm concerned about the overall domain authority of the site, since that certainly plays a role in how the site ranks overall in Google (especially for pages with no links pointing to them - the perfect example is Amazon: thousands of pages with no external links that rank #1 in Google for their product name). Anyone have a clear answer? Thanks!
Intermediate & Advanced SEO | M_D_Golden_Peak
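Before dropping any of those legacy redirects, it may help to see which ones still resolve cleanly. A minimal sketch, assuming Python with the third-party requests package and placeholder URLs in place of a real redirect-map export:

```python
import requests

# Placeholder legacy URLs - in practice, export these from the old redirect map.
legacy_urls = [
    "https://www.example.com/old-product-1/",
    "https://www.example.com/old-product-2/",
]

for url in legacy_urls:
    r = requests.get(url, allow_redirects=True, timeout=10)
    hops = [h.url for h in r.history] + [r.url]
    # A legacy URL whose chain ends in 404 is a redirect you've effectively
    # dropped: any inbound links still pointing at it pass equity nowhere.
    print(f"{r.status_code}: {' -> '.join(hops)}")
```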
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo, but no real information about the image - definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like such: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... is this possible? Thanks! Leona
Intermediate & Advanced SEO | HD_Leona
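One caveat worth verifying against Google's current documentation: as I understand its matching rules, Googlebot-Image falls back to the Googlebot group when no Googlebot-Image group exists, so blocking Googlebot alone would likely block image crawling too - an explicit group for Googlebot-Image is the safer bet. A minimal sketch testing the proposed rules locally with Python's standard-library robotparser (which matches groups in file order, hence the more specific group is listed first):

```python
from urllib.robotparser import RobotFileParser

# Proposed rules: block the main crawler from the photo pages, but add an
# explicit group so Googlebot-Image can still fetch them (it otherwise
# falls back to the Googlebot group, per Google's documentation).
rules = """\
User-agent: Googlebot-Image
Allow: /community/photos/

User-agent: Googlebot
Disallow: /community/photos/
""".splitlines()

rp = RobotFileParser()
rp.modified()  # mark as "fetched" so can_fetch() will evaluate the rules
rp.parse(rules)

url = "https://www.example.com/community/photos/someuser/picture111111.aspx"
print(rp.can_fetch("Googlebot", url))        # expect False
print(rp.can_fetch("Googlebot-Image", url))  # expect True
```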
Are there any negative effects to using a 301 redirect from a page to another internal page?
For example, from http://www.dog.com/toys to http://www.dog.com/chew-toys. In my situation, the main purpose of the 301 redirect is to replace the page with a new internal page that has a better-optimized URL. This will be executed across multiple pages (about 20). None of these pages holds any search rankings, but they do carry a decent amount of page authority.
Intermediate & Advanced SEO | Visually
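If you do roll these out, it's worth confirming that each old URL answers with a single 301 hop straight to its new URL, since redirect chains are generally thought to dilute whatever authority passes through. A minimal sketch, assuming Python with the third-party requests package; the mapping mirrors the dog.com example above and is otherwise hypothetical:

```python
import requests

# Hypothetical old -> new mapping, mirroring the dog.com example.
redirects = {
    "http://www.dog.com/toys": "http://www.dog.com/chew-toys",
    # ...roughly 20 pairs in total
}

for old, new in redirects.items():
    r = requests.get(old, allow_redirects=False, timeout=10)
    location = r.headers.get("Location")
    ok = r.status_code == 301 and location == new
    # A single clean 301 hop is the safest way to pass the authority along;
    # chains or temporary 302s are generally thought to dilute it.
    print(("OK " if ok else "FIX") + f" {old} -> {location} ({r.status_code})")
```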