Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
-
My client had an old site hacked (let's call it "myolddomain.com"), and the hackers created many links on other hacked sites, such as http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html
The old myolddomain.com site has since been redirected to a different new site, but we still see over a thousand spam links showing up in the new site's Search Console 404 crawl errors report. Also, using the link: operator in Google search, we see many spam results.
Should we be worried about these bad links pointing to our old site and redirecting to 404s on the new site? What is the best way to clean them up: ignore them, serve 410s, or something else? I'm seeing conflicting advice out there.
The old site is hosted by the client's previous web developer, who won't clean anything up on their end without an ongoing hosting contract, and the client doesn't want to pay for additional hosting. So beyond turning redirects on or off, we don't have much control over anything related to "myolddomain.com".

Thanks in advance for any assistance!
-
Hey, this is Russ here at Moz.
Do the redirects point to the homepage or to the corresponding URL on the new site? For example, does http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html redirect to http://newsite.com or to http://newsite.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html?
If it does redirect to the same URL on newsite.com, I would try using wildcard robots.txt entries to simply block the offending content altogether. For example, if all the spam hangs off the styless.asp page, you could block styless.asp in your robots.txt and prevent Google from ever crawling those spammy URLs.
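A minimal sketch of what that robots.txt on the new site might look like, assuming the spam URLs all resolve under /styless.asp as in the example above (Disallow matches by prefix, so this also covers the ?jordan-12-... query-string variants; Google additionally supports the * wildcard if you need finer patterns):

```
User-agent: *
Disallow: /styless.asp
```

Note that this only stops crawling of those URLs; it doesn't remove the links themselves.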
However, if you are redirecting everything to the homepage, I think you will need to go back to the old webmaster and figure something out. While Google is great at detecting spam, once you are under a penalty it can be difficult to recover. No one is perfect, including Google, and you don't want to be one of their "mistakes".
-
Hi usDragons,
Having that many crawl errors is not healthy. A few deleted pages now and then is normal, but hundreds or thousands of 404s mean something is wrong with the website, and from your description it's obvious that something is. In fact, redirecting unnatural/thin-content pages to your website can harm it: the redirects are, in effect, links sending traffic (through 301s) to your website, so you need to disavow them.
Because you have no control over the old website, you should treat it as an external site that is spamming you, not as a site you own but can't access.
The disavow tool lets you create a .txt file with comments explaining why you disavow each group of domains/links. So you should explain that these are bad links that send you traffic, and that you tried to request their removal but got no help from whoever controls the site, which I gather is true in your case.
Try to explain everything in your comments (in the .TXT file) (See attached)
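A sketch of what such a disavow file might look like. The domains and dates here are hypothetical placeholders; lines starting with # are comments that Google ignores, and entries are either whole domains (domain:) or individual URLs:

```
# Spam links created on hacked third-party sites, pointing at our
# old hacked domain and 301-redirected to this one.
# Requested removal on 2014-05-01 and 2014-05-15; no response.
domain:spamsite1.example
domain:spamsite2.example
http://hackedsite.example/styless.asp?jordan-12-taxi-kids-cheap-T8927.html
```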
Good luck, and I hope this helps in some way.
-
Thanks. We've been through this kind of bad-link cleanup before, but not this exact situation. Some advice I've read says Google doesn't care about those 404s because they're obviously unrelated spam, but I would think having so many crawl errors can't be healthy for the site, and I don't like the idea of redirecting them to the new site.
Now the trick is that we don't control the old site, so we can't verify it in Google Search Console. The old site is just a redirect to the current site, so there is no website to work with. The disavow tool wants you to select a website property, but we can only use the new domain name. Will the disavow tool understand that these bad links to the old domain are redirected to the new domain?
-
usDragons, the best way to deal with these links is to use Google's Disavow Links tool to disavow them.
First, you need to identify all of the links, and you can do that by downloading your links from Open Site Explorer, Majestic.com, Ahrefs.com, and Google Search Console. Combine the lists and remove the duplicates.
You'll want to manually review all of them, make a list of the ones you want Google to ignore, and then upload a list of the domain names using Google's disavow links tool. Google has more info about the disavow tool here: https://support.google.com/webmasters/answer/2648487?hl=en
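The combine-and-dedupe step can be sketched in a few lines of Python, assuming each tool's export has been saved as a plain text file with one backlink URL per line (the file names and reading logic are illustrative, not tied to any tool's actual export format):

```python
from urllib.parse import urlparse


def combine_link_exports(files):
    """Merge link exports (one URL per line) and return unique root domains."""
    domains = set()
    for path in files:
        with open(path) as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue  # skip blank lines in the export
                host = urlparse(url).netloc.lower()
                if host.startswith("www."):
                    host = host[4:]  # treat www and non-www as one domain
                if host:
                    domains.add(host)
    return sorted(domains)
```

The resulting list can then be reviewed by hand and turned into domain: lines for the disavow file.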
-
Hi there,
Seems to me that you should follow the standard process for unnatural links. You should:
- Compile a list of the linking domains and URLs.
- Contact the webmasters of these domains, requesting removal of the links (include the pages where the links appear in your email).
- Save all emails sent to and received from webmasters.
- For those that don't reply, email them once more a couple of weeks later.
- Create a disavow file for the domains you couldn't get links removed from, stating the reason and the dates of your emails.
- Submit the disavow file via the disavow tool.
I know it's neither straightforward nor fast, but that's how you maintain the public link profile of any website since the Penguin updates started.
I hope it helps