Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Outranking a crappy outdated site with domain age & keywords in URL.
-
I'm trying to outrank a website with the following:
Website with #1 ranking for a search query with "City & Brand"
- Domain Authority - 2
- Domain Age - 11 years & 9 months old
- Has both the City & brand in the URL name.
- The site is crap and outdated, probably last designed in the '90s: old layout, not a lot of content, and NO keywords in the titles or descriptions on any page.
My site ranks 5th for the same keyword, BEHIND 4 pages from the site described above.
- Domain Authority - 2
- Domain Age - 4 years & 2 months old
- Has only the CITY in the URL.
- Brand new site design this past year, new content & individual keywords in the titles, descriptions on each page.
My main question is: do you think it would be beneficial to buy a new domain name with both the BRAND and the CITY in the URL, and 301 redirect my 4-year-old domain to the new domain to pass along the authority it has gained?
Will having the brand in the URL make much of a difference?
Do you think that small step would even help to beat the crappy but old site out?
Thanks for any help & suggestions on how to beat this old site or at least show up second.
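For reference, if I did make the move, my understanding is that a domain-wide 301 is usually set server-side. A minimal Apache .htaccess sketch, with placeholder domains standing in for mine (not the real business names):

```apache
# Send every URL on the old domain to the same path on the new one.
# "old-city-example.com" and "citybrand-example.com" are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-city-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.citybrand-example.com/$1 [R=301,L]
```

The R=301 flag marks the redirect permanent so any authority the old domain earned is signalled to pass to the new one.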
-
Thanks all. This is what I had recommended to the client to begin with. I just needed some backup from all you smart SEOs out there.
Unfortunately the URL would not be for sale as it's a brick and mortar business.
Thanks again!
-
I personally lean more towards EGOL's reaction. If you put enough effort into it, keep it strictly white hat, and do all the steps suggested in the SEO blog section (like link earning instead of link building), then over time you can outrank that site for sure. But keep working on it. Be social: share everything you can on Facebook, Twitter, and of course the big G+.
Tricks can help you in the short run but hurt you in the long run, so I wouldn't go for that straight away.
You could also try registering the other domain you mentioned, put up some content, and build it alongside your existing website (without copying text, etc.). It could play a supporting role for your primary website if you wish. But I would focus on improving my primary website first.
regards
Jarno
-
I like Irving's suggestion to see if the webmaster is willing to sell the site. What's the link profile like? Any particular high-authority links that might be giving it the advantage over your site?
-
**Will having the brand in the URL make much of a difference?**
The brand? Yes, if people know you.
Really, I would not change domains for the tiny advantage that you think a keyword in the domain might bring. It is very possible that you will lose more link juice in the redirect than you will gain from the keyword in the domain.
**Do you think that small step would even help to beat the crappy but old site out?**
Heh... that crappy old site is beating you because they are beating you.

It is easier to beat an old crappy site with "work" than it is to beat them with "tricks".
-
Absolutely not; your site is aged. A new site is like starting all over, even if you 301 the old site to the new one.
a) Work on improving the on-page SEO on your site.
b) If that new domain is available, you could play around with setting it up as a standalone site and see if you can get it ranked #1; it could take 6-12 months before Google really trusts it enough.
c) If the other site is that old and outdated, maybe the owner would sell it at a reasonable price, if it's worth that much to you?
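For (a), the missing titles and descriptions you mentioned are usually the first fix. A rough sketch of what each page's head could look like, with made-up city and brand values standing in for the real business:

```html
<head>
  <!-- Placeholder brand/city/service values; use the real ones, unique per page -->
  <title>Acme Widgets | Widget Repair in Springfield</title>
  <meta name="description"
        content="Acme Widgets offers same-day widget repair in Springfield. Family-owned and operated since 2009.">
</head>
```

A unique, descriptive title and meta description on every page is exactly the advantage the old site is giving away.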