Hacked website - Dealing with 301 redirects and a large .htaccess file
-
One of my client's websites was recently hacked and I've been dealing with the aftereffects. The website is now clean of malware and I have already appealed to Google about the malware issue. The current problem is dealing with the 20,000+ crawl errors - garbage links that were created by the hack.
How does one go about creating the 301 redirects needed for all of these 404 crawl errors? I'm already noticing increased load times on the website because the .htaccess file already contains a couple of thousand 301 redirects, and I'm worried my client's site performance and SEO will take a hit as a result.
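To illustrate, the .htaccess is essentially thousands of one-off rules along these lines (the paths below are made up, but representative):
# one Redirect line per junk URL created by the hack
Redirect 301 /cheap-watches-outlet-123.html http://www.example.com/
Redirect 301 /replica-handbags-456.html http://www.example.com/
Redirect 301 /buy-pills-online-789.html http://www.example.com/
# ...and a couple of thousand more like these
Since Apache re-reads .htaccess and evaluates each of these rules on every request, the file gets slower to process as it grows.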
-
This is the correct answer.
To expand on this slightly, just make sure none of the 404s are internal (i.e. there are no links on your own site pointing to one of these dodgy pages as a result of the hack) and you're all good.
Remove the entries from your .htaccess file to avoid having them parsed on every request, and let any external links to dodgy pages 404. This sort of circumstance is exactly what 404s are made for!
The only site at risk of a ranking drop from these 404s is the one pointing to those dodgy pages - who cares about your hackers' rankings?
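Nothing extra is needed in the server config for this: when a URL simply doesn't exist, Apache returns a 404 on its own. At most you might point the error at a friendly page - a one-line sketch, assuming a /404.html page exists on the site:
# serve a custom page while still returning a 404 status
ErrorDocument 404 /404.html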

-
The robots.txt part could come at the end, but in my case it worked fine this way too.
-
Just a correction here. I agree with all the items above, with one very, very, very, very, very important change.
DO NOT disallow the affected URLs in your robots.txt.
If you do not allow Google to crawl the pages, Google will not see that the links have been removed, that the pages now return 4xx, and so on. If you disallow all of those pages, all the clean-up work you have done will go unseen by Google and will have been for naught.
If you later want to disallow those pages, that would be fine, but you need to let Google see your clean up work first.
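When that time comes, the block itself is only a couple of lines in robots.txt - a sketch, with a made-up /fake-dir/ path standing in for wherever the junk URLs actually live:
# add this only AFTER Google has recrawled and recorded the 404s/410s
User-agent: *
Disallow: /fake-dir/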
-
Hi
I just finished a similar job.
What you should do:
- collect all the bad "pages" and the links pointing to them
- find a pattern, such as a common directory
- set them (the directories, I believe) to return 410, not 404 - a rough sketch of this is below
- set robots.txt to disallow those directories
- push all the pages and links to be reindexed
- remove them from the Google index
- done (you'll need to wait some time)
The important thing is to get rid of all the bad links pointing to those pages. If you do that, there'll be no issues. However, this could also be ongoing negative SEO. If you need help with that, PM me.
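A rough sketch of the 410 step on an Apache server (the /fake-dir/ path is just an example - use whatever pattern your bad pages share):
# answer 410 Gone for everything under the spam directory
RedirectMatch gone "^/fake-dir/"
A single pattern rule like this replaces thousands of individual lines in .htaccess.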
Krzysztof
-
If they are garbage links, why are you redirecting them? Let them 404. Pages that aren't found do not, in and of themselves, lead to penalties.
Related Questions
-
How do you 301 redirect URLs with a hashbang (#!) format? We just lost a ton of pagerank because we thought javascript redirect was the only way! But other sites have been able to do this – examples and details inside
Hi Moz, Here's more info on our problem, and thanks for reading! We're trying to create 301 redirects for 44 pages on site.com. We're having trouble 301 redirecting these pages, possibly because they are AJAX and have hashbangs in the URLs. These are locations pages. The old locations URLs are in the following format: www.site.com/locations/#!new-york and the new URLs that we want to redirect to are in this format: www.site.com/locations/new-york We have not been able to create these redirects using the Yoast WordPress SEO plugin v1.5.3.2. The CMS is WordPress version 3.9.1. The reason we want to 301 redirect these pages is because we have created new pages to replace them, and we want to pass pagerank from the old pages to the new. A 301 redirect is the ideal way to pass pagerank. Examples of pages that are able to 301 redirect hashbang URLs include http://www.sherrilltree.com/Saddles#!Saddles and https://twitter.com/#!RobOusbey.
Intermediate & Advanced SEO | | DA20130 -
Too many 301 redirects?
Hey, my company currently has one chief website, plus about 500-600 other domains that all feature the same material as the chief website. These domains have been around for about 5 years and have actually picked up some link traffic. I have all of these identical web pages using rel=canonical, but I was wondering whether I would be better served, for SEO purposes, to 301 redirect all of these sites to their respective pages on our chief website. If I add 500 301 redirects, will the major search engines consider this to be black-hat link-building, even though the sites are related and technically already feature the same content? For an example, the chief website is www.1099pro.com and I would 301 redirect the below sites to the chief site: 1099softwarepro.com 1099softwarepro.info 1099softwarepro.net 1099softwarepro.biz 1099softwareprofessionals.com 1099softwareprofessionals.info ...you get the point
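For reference, the kind of rule I imagine adding for each domain would look something like this (a sketch only, assuming each secondary domain can carry its own .htaccess):
# send any request for 1099softwarepro.com to the same path on the chief site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?1099softwarepro\.com$ [NC]
RewriteRule ^(.*)$ http://www.1099pro.com/$1 [R=301,L]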
Intermediate & Advanced SEO | | Stew2220 -
301 Redirection and apostrophes in URLs
Hi, I am having trouble getting any redirects with apostrophes in the URLs to 301 redirect in order to eliminate 404 errors. I have tried replacing the instance of the apostrophe in the source URL field with %27, and variations of this, but to no avail. The site is a WordPress site (the old URLs are legacies from the old Business Catalyst site) and I am using the Redirection plug-in. I have gone into some detail with a helpful soul here http://wordpress.org/support/topic/how-to-deal-with-apostrophes-in-source-url but unfortunately with no result. If anyone has any idea how to solve this puzzle I would be grateful for the help. Example: http://www.tesselaars.com/blog/Inside_Flowers/post/Online_Marketing_for_Florists_Part_1%E2%80%93_A_Website_You_Won%27t_Regret/
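Would a regex rule in .htaccess, bypassing the plugin so the apostrophe never has to be typed, be a sane fallback? Something like this sketch, where the target URL is only a placeholder:
# match the post by its stem so the apostrophe and dash never appear in the rule
RedirectMatch 301 "^/blog/Inside_Flowers/post/Online_Marketing_for_Florists_Part_1" http://www.tesselaars.com/blog/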
Intermediate & Advanced SEO | | Seamoose0 -
Can I make 301 redirects on a Windows server (without access to IIS)?
Hey everyone, I've been trying to figure out a way to set up some 301 redirects to handle the broken links left behind after a site restructuring, but I can only ever find information on 2 methods that I can't use (as far as I can tell). The first method is to do some stuff with an htaccess file, but that looks like it only works on Linux-based servers. The method described for Windows servers is generally to install this IIS rewrite/redirect module and run that, but I don't think our web hosting company allows users to log directly into the server, so I wouldn't be able to use the IIS thing. Is there any other way to get a 301 redirect set up? And is this uncommon for a web hosting company to do, or do you all just run your sites on Linux-based servers or your own Windows machines? Thanks!
Intermediate & Advanced SEO | | BrianAlpert780 -
Multiple 301 Redirects for the Same Page
Hi Mozzers, What happens if I have a trail of 301 redirects for the same page? For example: SiteA.com/10 --> SiteA.com/11 --> SiteA.com/13 --> SiteA.com/14. I know I lose a little bit of link juice by 301 redirecting. The question is, would the link juice look like this for the example above: 100% --> 90% --> 81% --> 72.9%? Or just 100% --> 90%? Does this link juice refer to juice from inbound links or links between internal pages on my site? Thanks!
Intermediate & Advanced SEO | | Travis-W0 -
301 doesn't redirect a page that ends in %20, and others being appended with ?q=
I have a product page that ends in /product-name%20 that I'm trying to redirect in this way: Redirect 301 /products/product-name%20 http://www.site.com/products/product-name And it doesn't redirect at all. The others, those with %20, are being redirected to a URL hybrid of old and new: http://www.site.com/products/product-name?q=old-url I'm using Drupal CMS, and it may be creating rules that counter my entries.
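One thing I'm wondering: since Apache matches Redirect rules against the decoded path, perhaps the rule needs a literal, quoted trailing space rather than %20 - a sketch, untested:
# the old path is quoted because, once decoded, it ends in a real space
Redirect 301 "/products/product-name " http://www.site.com/products/product-name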
Intermediate & Advanced SEO | | Brocberry0 -
Is 301 redirect suggested on pagination pages
Hi - Due to pagination, the default page of the site is available at two URLs, the sub-URL with ?page=1 appended and the plain /sub-url. Is a 301 a recommended solution for these pagination URLs? Also, is it required to create a separate title and meta description for every pagination page? We are asking specifically in the context of our discounts and offers section http://www.mycarhelpline.com/index.php?option=com_offers&view=list&Itemid=9
Intermediate & Advanced SEO | | Modi0 -
301 redirect from .html to non .html?
Previously our site was using this as our URL structure: www.site.com/page.html. A few months ago we updated our URL structure to this: www.site.com/page, and we're not using the .html. I've read over this guide and don't see anywhere that it discusses this: http://www.seomoz.org/learn-seo/redirection. I currently have a programmer looking into it, but am always a bit wary of their workarounds, as I'd previously had them cause more problems than they fixed. Here is the solution he is looking to do: The way that I am doing the redirect is fine. The problem is of where to put the code. The issue is that the files are .html files that need to be redirected to the same url with out a .html on them. I can see if I can add that to the 404 redirect page if there is one inside of there and see if that does the trick. That way if there is no page that exists without the .html then it will still be a 404 page. However if it is there then it will work as normal. I will see what I can find and get back. Any help would be greatly appreciated. Thanks, BJ
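P.S. Is something like the following mod_rewrite sketch the right general direction? Untested, just my understanding - it assumes the .html files are still physically present so the extensionless URLs can be served from them:
RewriteEngine On
# 1) external 301: /page.html -> /page (THE_REQUEST is the original client request, so this won't loop)
RewriteCond %{THE_REQUEST} \s/([^\s?]+)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
# 2) internal map: quietly serve page.html when /page is requested and that file exists
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]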
Intermediate & Advanced SEO | | seointern0