How to remove the 4XX Client Error, the "Too Many Links in a Single Page" warning, and Canonical notices.
-
Firstly, I am getting around 12 errors in the 4xx Client Error category. The description says that this is either a bad or a broken link. How can I repair this?
Secondly, I am getting lots of warnings about too many links on a single page. I want to know how to tackle this.
Finally, I don't understand the basics of Canonical notices. I have around 12 notices of this kind, which I want to remove too.
Please help me out in this regard.
Thank you in advance.
Amit Ganguly
http://aamthoughts.blogspot.com - Sustainable Sphere
-
I'm working on clearing my 4xx client errors and, following the instructions, the offending referrer is my sitemap.xml.
Is it as simple as opening that file up in WordPad, removing all the broken links, and uploading it back to my site?
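If hand-editing turns out to be too fiddly, something like the rough sketch below is what I had in mind instead; it assumes the broken URLs are pasted into a plain text file, and both file names are just placeholders:

```python
# A rough sketch: rebuild sitemap.xml with the broken URLs removed, instead of
# hand-editing it. Assumes "broken_urls.txt" holds one bad URL per line (e.g.
# pasted from the crawl export); both file names here are just placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

with open("broken_urls.txt") as f:
    broken = {line.strip() for line in f if line.strip()}

tree = ET.parse("sitemap.xml")
root = tree.getroot()

for url in list(root.findall(f"{{{NS}}}url")):
    loc = url.find(f"{{{NS}}}loc")
    if loc is not None and loc.text and loc.text.strip() in broken:
        root.remove(url)  # drop the whole <url> entry for a broken link

tree.write("sitemap.cleaned.xml", encoding="utf-8", xml_declaration=True)
```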
-
Thank you so much, Cyrus. This certainly helps a lot.
Best regards,
Amit Ganguly
-
Hi Amit,
This is an important question, and how you address these errors and warnings depends on your experience level and the needs of your site. It's also a tremendous opportunity to further your SEO education.
For many folks like yourself, the best thing to do is to tackle each one of these issues one at a time, learn from online resources until you are a near expert, then move on to the next one.
Each site is different, so there's no "one size fits all" solution. The exact "fix" will always depend on too many variables to list here, but here are some tips to get you started.
1. 4xx Errors. The best thing to do is download the CSV of your crawl report and open it in a spreadsheet program. Find the URLs that cause the error, and in the last column find the "referrer". This referrer will tell you the URL that the bad link was found on. If you go to this page, you can usually find where the broken link originated and decide if it needs fixing.
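If the report is large, a small script can do that grouping for you. Here is a minimal sketch; it assumes the export has columns named "URL", "Status Code" and "Referrer", so check the header row of your own file and adjust the names to match:

```python
# A minimal sketch: group the 4xx URLs in a crawl-report CSV by the page that
# links to them. The column names ("URL", "Status Code", "Referrer") and the
# file name are assumptions; check the header row of your own export and adjust.
import csv
from collections import defaultdict

broken_by_referrer = defaultdict(list)

with open("crawl_report.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        status = (row.get("Status Code") or "").strip()
        if status.startswith("4"):  # 400, 404, 410, ...
            referrer = (row.get("Referrer") or "").strip()
            broken_by_referrer[referrer].append((row.get("URL") or "").strip())

# Print each page that contains broken links, followed by the links themselves.
for referrer, urls in sorted(broken_by_referrer.items()):
    print(referrer or "(no referrer recorded)")
    for url in urls:
        print("   ->", url)
```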
2. Too Many Links - This is a warning, not an error, so you may choose not to fix this. To understand the warning further, I recommend reading this article by Dr. Pete:
http://www.seomoz.org/blog/how-many-links-is-too-many
If you decide that you should address the pages with too many links, you can then start to decide which links you should remove.
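If you just want a quick way to see the raw link count for any one page, a short script like the rough sketch below will tally the anchor tags. It only counts the <a href> tags in the downloaded HTML, so treat it as an approximation of what the crawler reports rather than an exact match:

```python
# A rough sketch: count the <a href> links on a single page using only the
# standard library. This approximates what a crawler counts; it will not match
# the crawl report exactly (e.g. it ignores links added by JavaScript).
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Example: print(count_links("http://aamthoughts.blogspot.com"))
```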
3. Canonical - Finally, these are notices, which aren't necessarily bad; we just want you to know they are there. For a little background, you might want to read the following:
http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
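And if you're curious which canonical URL (if any) a given page actually declares, a minimal sketch along these lines will report it, again using only the standard library:

```python
# A minimal sketch: report the rel="canonical" URL a page declares, if any,
# using only the standard library.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr_map = dict(attrs)
            rel = (attr_map.get("rel") or "").lower()
            if "canonical" in rel.split():
                self.canonical = attr_map.get("href")

def find_canonical(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical  # None means the page declares no canonical URL

# Example: print(find_canonical("http://aamthoughts.blogspot.com"))
```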