403 Forbidden errors - how to solve them?
-
Hi, I have been using a great tool today called Screaming Frog, which was shown to me by Thomas Zickell.
When I used the tool I found some worrying things for my site, www.in2town.co.uk. What I have found is that I have a large number of 403 Forbidden statuses on my home page, and I do not know why.
Here is an example:
http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom
It loads fine in the browser, but the tool shows it as an error, with no meta tags or anything, even though the meta tags are there.
Can anyone please let me know how to solve this and why it has happened?
Many thanks
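One way to see why a page loads fine in the browser but errors in a crawler is to request it with different User-Agent headers and compare the status codes, since anti-bot rules often key off the User-Agent or the request rate. A minimal sketch in Python using the requests library - the crawler User-Agent string here is a stand-in, not necessarily what Screaming Frog actually sends:

    import requests

    URL = "http://www.in2town.co.uk/emmerdale/emmerdale-debbie-hits-rock-bottom"

    # Compare what a browser-like client and a crawler-like client get back.
    # The crawler User-Agent below is a placeholder, not Screaming Frog's exact string.
    user_agents = {
        "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "crawler": "Screaming Frog SEO Spider",  # placeholder UA
    }

    for name, ua in user_agents.items():
        resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
        print(f"{name}: HTTP {resp.status_code}, {len(resp.content)} bytes")

If the browser request returns 200 and the crawler request returns 403, something on the server is filtering by client rather than by page.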
-
Hi Tim,
Glad it helped. It might be worth asking your host what kind of features they have for preventing flooding attacks; there are various ways of addressing them on the server side that most hosts will have enabled in one way or another. Unless you have a specific issue with these kinds of attacks, it seems to me that this part of the module is causing more harm than good as it is now.
-
Thank you for this. I have turned it off and will speak to sh404SEF to find out what they can do about it, as I am worried about going without the security feature. But as you said, that was the problem, and now the site is showing fine - there are no errors showing.
Many thanks for this. I hope other people who are having this problem get to read this post, as they must be going through what I am going through. Many thanks for all your help and the solution.
-
Hi Tim,
Did you ever get to the bottom of the issue mentioned in this question? It is almost certainly the same problem.
Have a look at this page and try either turning off the sh404SEF anti-flooding feature or else boosting the maximum number of requests allowed: http://forum.joomla.org/viewtopic.php?p=1368937
The anti-flooding part of this component blocks requests for pages if it thinks someone is trying to run a DoS attack on your site. The current setup seems to be too sensitive and is blocking Screaming Frog after the first few requests, quite possibly blocking the Google bots, and maybe blocking the Moz crawler as well, so it is certainly something you should address.
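If you want to confirm that this anti-flooding block is what the crawler is hitting, you can fire a burst of requests at one page and watch for the response flipping from 200 to 403. A minimal sketch in Python using the requests library - the URL and the burst size are just examples:

    import time
    import requests

    URL = "http://www.in2town.co.uk/"  # any page on the affected site

    # Send a burst of requests and note when (if ever) the server starts
    # answering 403 - that is roughly the anti-flooding threshold.
    for i in range(1, 21):
        resp = requests.get(URL, timeout=10)
        print(f"request {i:2d}: HTTP {resp.status_code}")
        if resp.status_code == 403:
            print(f"blocked after {i} requests - anti-flooding likely triggered")
            break
        time.sleep(0.1)  # small delay; lower it to simulate a faster crawler

If the block kicks in after only a handful of requests, normal crawlers will almost certainly trip it too, which supports raising the threshold or disabling the feature.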
Related Questions
-
"5XX (Server Error)" - How can I fix this?
Hey Mozers! Moz Crawl tells me I am having an issue with my Wordpress category - it is returning a 5XX error and I'm not sure why? Can anyone help me determine the issue?
Crawl Issues and Notices for: http://www.refusedcarfinance.com/news/category/news
We found 1 crawler issue(s) for this page.
High Priority Issues: 1 5XX (Server Error)
5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
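Before digging into server configuration, it is worth checking whether the 5XX is persistent or intermittent, since a crawler can catch a page during a brief outage or a resource-limit spike. A small Python sketch that polls the URL from the report a few times:

    import time
    import requests

    URL = "http://www.refusedcarfinance.com/news/category/news"

    # Poll the page a few times, spaced out, to see whether the 5XX is
    # constant (a real server/config problem) or intermittent
    # (resource limits, caching hiccups, etc.).
    for attempt in range(5):
        try:
            resp = requests.get(URL, timeout=10)
            print(f"attempt {attempt + 1}: HTTP {resp.status_code}")
        except requests.RequestException as exc:
            print(f"attempt {attempt + 1}: request failed ({exc})")
        time.sleep(5)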
Technical SEO | RocketStats
-
Schema Markup Errors - Priority or Not?
Greetings All... I've been digging through the search console on a few of my sites and I've been noticing quite a few structured data errors. Most of the errors are related to: hcard, hentry and hatom. Most of them are missing author & entry-title, while the other one is missing: fn. I recently saw an article on SEL about Google's focus on spammy mark-up. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize, then fix them. My question is whether or not this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have. Any thoughts?
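One way to scope the work before handing it to the vendors is to scan a few templates and list which required microformat properties are actually missing. A rough Python sketch using requests and BeautifulSoup - the URL is a placeholder, and the class names are the usual hatom properties (an hcard "fn" check would work the same way):

    import requests
    from bs4 import BeautifulSoup

    URL = "http://www.example.com/blog/some-post/"  # placeholder

    # For each hentry block, check for the child properties the
    # structured data report expects (author, entry-title, updated).
    resp = requests.get(URL, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    for i, entry in enumerate(soup.select(".hentry"), start=1):
        missing = [cls for cls in ("author", "entry-title", "updated")
                   if not entry.select_one(f".{cls}")]
        print(f"hentry #{i}: missing {missing or 'nothing'}")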
Technical SEO | AfroSEO
-
Best strategy to handle over 100,000 404 errors.
I have recently been given a site that has over one hundred thousand 404 error codes listed in Google Webmasters. It is really odd, because according to Google Webmasters, the pages that are linking to these 404 pages are also pages that no longer exist (they are 404 pages themselves). These errors were a result of a site migration that had occurred. I would appreciate any input on how one might go about auditing and repairing large numbers of 404 errors. Thank you.
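With volumes that large, a common first step is to export the broken URLs and group them by path pattern, so the bulk can be handled with a few pattern-based 301 rules rather than a hundred thousand individual redirects. A rough Python sketch, assuming a one-column CSV export of the 404 URLs (the filename is a placeholder):

    import csv
    from collections import Counter
    from urllib.parse import urlparse

    # Group 404 URLs by their first path segment so the bulk of them
    # can be handled with a few pattern-based 301 rules.
    pattern_counts = Counter()

    with open("crawl_errors_404.csv", newline="") as f:  # placeholder filename
        for row in csv.reader(f):
            if not row:
                continue
            path = urlparse(row[0]).path
            first_segment = path.strip("/").split("/")[0] or "(root)"
            pattern_counts[first_segment] += 1

    for segment, count in pattern_counts.most_common(20):
        print(f"/{segment}/... : {count} broken URLs")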
Technical SEO | SEO_Promenade
-
Error report in Bing: Evaluated size of HTML...
Hi, Whilst checking Bing's SEO analyser I got this error message for our page www.tidy-books.co.uk/childrens-bookcases: "Evaluated size of HTML is estimated to be over 125 KB and risks not being fully cached. (Issue marker for this rule is not visible in the current view)" Just wondering what needs to be done about it and what it actually means? Thanks
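The message means Bing estimates the raw HTML of the page to be over 125 KB, past the point it is willing to fully cache, so trimming the markup (or moving inline scripts and styles into external files) is the usual fix. A quick Python sketch to measure the actual HTML weight of the page in question:

    import requests

    URL = "http://www.tidy-books.co.uk/childrens-bookcases"

    # Measure the size of the raw HTML document, which is what the
    # 125 KB guideline refers to - not images, CSS or other assets.
    resp = requests.get(URL, timeout=10)
    size_kb = len(resp.content) / 1024
    print(f"HTML size: {size_kb:.1f} KB")
    if size_kb > 125:
        print("over the ~125 KB guideline - consider trimming inline "
              "scripts/styles or moving them to external files")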
Technical SEO | tidybooks
-
Error: Missing Meta Description Tag on pages I can't find in order to correct
This seems silly, but I have errors on blog URLs in our WordPress site that I don't know how to access because they are not in our Dashboard. We are using All in One SEO. The errors are for blog archive dates, authors and just simply 'blog'. Here are samples:
http://www.fateyes.com/2012/10/
http://www.fateyes.com/author/gina-fiedel/
http://www.fateyes.com/blog/
Does anyone know how to input descriptions for pages like these? Thanks!!
Technical SEO | gfiedel
-
Google's "cache:" operator is returning a 404 error.
I'm doing the "cache:" operator on one of my sites and Google is returning a 404 error. I've swapped out the domain with another and it works fine. Has anyone seen this before? I'm wondering if G is crawling the site now? Thx!
Technical SEO | AZWebWorks
-
403 forbidden error website
Hi Mozzers, I got a question about a new website from a new customer, http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find what the problem is. I have checked on: http://gsitecrawler.com/tools/Server-Status.aspx
Result:
URL=http://www.eindexamensite.nl/ Result code: 403 (Forbidden / Forbidden)
When I delete the .htaccess from the server there is a 200 OK :-). So it is in the .htaccess. .htaccess code:

    ErrorDocument 404 /error.html

    RewriteEngine On
    RewriteRule ^home$ / [L]
    RewriteRule ^typo3$ - [L]
    RewriteRule ^typo3/.*$ - [L]
    RewriteRule ^uploads/.*$ - [L]
    RewriteRule ^fileadmin/.*$ - [L]
    RewriteRule ^typo3conf/.*$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-l
    RewriteRule .* index.php

    # Start rewrites for static file caching
    RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
    RewriteRule ^home$ / [L]

    # Don't pull *.xml, *.css etc. from the cache
    RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
    RewriteCond %{REQUEST_FILENAME} !^.*\.css$
    RewriteCond %{REQUEST_FILENAME} !^.*\.php$

    # Check for Ctrl-Shift reload
    RewriteCond %{HTTP:Pragma} !no-cache
    RewriteCond %{HTTP:Cache-Control} !no-cache

    # No backend user is logged in
    RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]

    # No frontend user is logged in
    RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]

    # Only redirect GET requests
    RewriteCond %{REQUEST_METHOD} GET

    # Only redirect URIs without query strings
    RewriteCond %{QUERY_STRING} ^$

    # Only redirect if a cache file actually exists
    RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
    RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
    # End static file caching

    DirectoryIndex index.html

The CMS is TYPO3. Any ideas? Thanks!
Maarten
Technical SEO | MaartenvandenBos
-
Should there be a canonical tag on my 404 error page?
In my crawl diagnostics, I notice some 4xx client errors. They are appearing for pages that no longer exist, so I'm not sure what the problem is. Shouldn't they just be dealt with as 404s? Anyway, on closer inspection I noticed that my 404 error page contains a canonical tag which points to the missing page. Could this be the issue? Is it a good idea to remove the canonical tag from this error page? Thanks.
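A quick way to audit this is to request a deliberately non-existent URL and inspect both the status code and any canonical tag in the response: if the error page returns 200, or canonicalises to the missing URL, crawlers may treat it as a real page. A small Python sketch using requests and BeautifulSoup - the domain and path are placeholders:

    import requests
    from bs4 import BeautifulSoup

    # Request a URL that should not exist and see how the error page behaves.
    URL = "http://www.example.com/definitely-not-a-real-page/"  # placeholder

    resp = requests.get(URL, timeout=10)
    print(f"status code: {resp.status_code}")  # should be 404, not 200

    soup = BeautifulSoup(resp.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    if canonical:
        # A 404 template generally should not carry a canonical tag at all.
        print(f"canonical tag found, pointing to: {canonical.get('href')}")
    else:
        print("no canonical tag on the error page - good")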
Technical SEO | Leighm