Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
How to make second site in same niche and do white hat SEO
-
Hello,
As much as we would like it to, there's a possibility that our site will never recover from its Google penalties.
Our team has decided to launch a new site in the same niche.
What do we need to do so that Google will not mind us having 2 sites in the same niche? (Menu differences, coding differences, content differences, etc.)
We won't have duplicate content, but it's hard to make the sites not similar.
Thanks
-
I'm sorry to hear that. I would recommend asking the people with high-quality, powerful links pointing to your existing site to update those backlinks to point to your new site.
The advantage of dealing with people who run legitimate sites is that they are much easier to find and will actually help you with this kind of request. It's not the nightmare of trying to get hold of a black-hat webmaster.
Beyond that, focus on creating a 100% legitimate website in a slightly different niche (content marketing, inbound marketing, whatever buzzword you want to use) for the hopefully short time it takes to get your most powerful white-hat links pointing to your new website.
Removeem.com is a wonderful tool for finding the names and contact info of webmasters. You can use it to make a polite request, explaining that you have a new domain and would appreciate it if they would update the link pointing at your site.
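To give a feel for that outreach step, here is a minimal sketch of drafting those polite requests from a contact list. The CSV columns ("name", "email", "linking_page") are hypothetical, not the actual export format of any tool:

```python
# Sketch: draft polite link-update requests from a contact list.
# The CSV column names below are an assumption for illustration only.
import csv
import io

TEMPLATE = (
    "Hi {name},\n\n"
    "We have moved to a new domain. Your page {linking_page} links to our "
    "old site; we would really appreciate it if you could update the link "
    "to point to {new_url}.\n\nThanks!"
)

def draft_requests(csv_text: str, new_url: str) -> dict:
    """Return a mapping of contact email -> drafted request message."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {
        row["email"]: TEMPLATE.format(
            name=row["name"],
            linking_page=row["linking_page"],
            new_url=new_url,
        )
        for row in reader
    }
```

The point is simply that once you have names and contact details, personalising each request is trivial to automate; actually sending the mail is left out on purpose.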
Once you have moved the best backlinks away from your existing site, I would switch over to your new site.
I would also be upfront about the move: place text in a conspicuous location on your site saying that you are changing domain names.
If you feel that your livelihood is being jeopardized by this, I can definitely understand. I would put 110% into creating top-notch content and a user-friendly, mobile-ready design for your new brand. When you go live, you want something genuinely better than what you had before.
I'm sorry I don't know any methods that would be instant, but I would consider using pay-per-click to soften the blow.
I hope this is of help,
Thomas
-
Tom,
I appreciate the responses and they make sense, but I don't see a solution. I don't see our current site ever pulling out of the penalty no matter what I do, and we've got an income riding on it.
Any ideas?
-
This one is older, but:
http://googlewebmastercentral.blogspot.com/2010/11/best-practices-for-running-multiple.html
https://www.webmasterworld.com/google/4557285.htm
And here is a discussion of tactics that are now considered black hat:
http://www.nichepursuits.com/should-you-host-all-your-niche-sites-on-the-same-hosting-account/
It is not OK in Google AdWords either.
Sorry for all the posts,
Tom
-
With all that said, I think if you go after a slightly different niche or offer things from a different angle, you're obviously doing twice the work.
Are you concerned that if you 301 redirect you will be bringing the penalty over?
Sincerely,
Tom
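For reference, if you did decide a 301 was safe, a domain-level redirect is a single server rule. A minimal sketch for Apache, assuming mod_rewrite is enabled and this lives in the old domain's .htaccess (both domain names are placeholders):

```apache
# Send every URL on the old domain to the same path on the new one.
# old-domain.example and new-domain.example are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^(.*)$ https://new-domain.example/$1 [R=301,L]
```

The worry raised above is exactly that this passes the old domain's signals, and potentially its penalty, wholesale to the new one.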
-
To be clear, I'm talking about taking the new site and building it with the white-hat tactics you implemented after the penalty, the penalty the original site has yet to recover from. I know that creating sites that are essentially the same but contain unique content, just to get better rankings, is against the rules.
Once the second site (I should say domain, because that's what it comes down to, right?) is built using the white-hat methods currently employed on the existing site, it would be in your best interest to take the first site down when the second one goes live.
I know this is not the ideal situation, because you probably have some good backlinks on the original, but two sites owned by the same person or company competing for the same place in the SERPs for the same niche would, I believe, be considered a method of rigging the system.
Having one site is completely fine; having a second site that goes after a different niche is also completely fine.
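If you do take the first domain down rather than redirect it, you can tell crawlers the removal is deliberate. A minimal sketch, assuming Apache with mod_alias and an .htaccess at the retired domain's root:

```apache
# Serve 410 Gone for every path on the retired domain,
# signalling that the content was removed intentionally and permanently.
Redirect gone /
```

A 410 is a stronger signal than letting the old pages 404, and it avoids the 301 question entirely.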
I am basing this on an e-commerce client of mine whose competitor was selling the exact same product, with unique content, across three domains.
The client reported this to Google, and either the spam team acted or there was an incredible coincidence, because two months later the reported sites could not be found in Google's index.
I hope that is of help,
Tom