URL with hyphen, or .co?
-
Given a choice, for your #1 keyword, would you pick a .com with one or two hyphens (chicago-real-estate.com) or a .co with the full name as the URL (chicagorealestate.co)?
Is there an accepted best practice regarding hyphenated URLs, and/or any decent results regarding the effectiveness of the .co?
Thank you in advance!
-
Hi Joe, this is for sure an awesome question, with so many different points of view. The problem I see with .co is this one:
"Sites with country-coded top-level domains (such as .ie) are already associated with a geographic region, in this case Ireland. In this case, you won't be able to specify a geographic location."
Source: http://www.google.com/support/webmasters/bin/answer.py?answer=62399
So if I understand this correctly, and you want to target real estate clients in the Chicago area (which I love and will be visiting for the U2 concert on July 4th) as well as across the US and worldwide, a .co domain is probably not the way to go here.
There has been a lot of talk about .co (the ccTLD for Colombia), same as with .ws, supposedly "WebSite" but actually Western Samoa. So I would advise doing the obvious: look at your competitors. Does anyone have a .co domain and rank in Chicago? Are any of the top 100 results anything but .com? Try different keywords just to check whether there are any .co sites ranking in the real estate market.
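If you want to make that competitor check a little more systematic, here is a minimal sketch (assuming Python, and a hypothetical list of result URLs you copied by hand from a manual search) that tallies which TLDs actually show up:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical result URLs, copied by hand from a search for your keyword.
results = [
    "http://www.chicago-realty-example.com/",
    "http://www.example-homes.com/chicago/",
    "http://chicagoareaexample.org/listings",
]

# Tally the last label of each hostname (.com, .org, .co, ...).
tlds = Counter(urlparse(url).hostname.rsplit(".", 1)[-1] for url in results)
print(tlds)  # e.g. Counter({'com': 2, 'org': 1})
```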
Hope that helps!
-
Thanks for the feedback. That's the beauty of SEO. The only way to figure out what is most effective is to try multiple approaches and measure. Then, as soon as you get it and have a conclusion, the rules change...

-
At the risk of getting a bunch of thumbs down, between the choices you have specifically asked about, I am going to throw in with the .co.
I think the issue is going to be how you promote the site, where you host it and where you get your links from.
If you host it in the USA and build a solid local link building campaign, no one is going to have any trouble figuring out where you should be relevant, least of all the major search engines.
The other concern would be when someone tries to type in your URL directly: there will be a tendency to automatically add an "m" to the end. But will that be any more of a problem than trying to get people to put a hyphen in the right place?
If people really find your site helpful, they'll just bookmark it in my experience.
-
Trust me when I say that I didn't think of the .co because of the Super Bowl ad.
I have heard mixed results on the .co, but I really haven't seen it in search results. Then again, I don't see too many hyphenated URLs either. Maybe I will just add a word to the .com?
-
They had an ad in the Super Bowl; since then I've heard from 5 different clients asking whether they should buy the .co.
-
This link might help as well...
-
Completely disagree with you, Korgo. The average user doesn't even know that a .co TLD exists.
They have been available for a while. I spend a lot of time online through work and play and have never seen a site using one, so I am not sure why you think they will take off if they haven't already, despite virtually every domain seller pushing them heavily last year.
-
I agree with James and would aim for one hyphen on the .com TLD. I did some unscientific user testing in this area: one hyphen was fine, but two or more was a turn-off for the user.
The same users expected a site to be .co.uk (I'm in the UK) or .com, and some were confused by the existence of different TLDs, wondering where the .co.uk or .com was and thinking the URL might not work without them.
-
I would pick hyphenated over anything but .com. I wouldn't even use .net; .org is the only one I would consider, and only for a true non-profit organisation.
I have some hyphenated domains for ecommerce websites and have found no big problem with them personally. Of course, go with non-hyphenated .coms if you can!
-
I don't like hyphens, but I dislike foreign domain extensions even more (Colombia!), despite what they say about it meaning "company". No, no. They pulled the same stunt with .me; it's not on.
It depends how competitive the niche is and how much you want it. I have a feeling EMDs (exact-match domains) won't be as strong in the coming months for long-tail searches like this, but for now I guess it will give you the edge. What I'm trying to say is: if you don't like the domain, don't go with it. Follow what you feel is most logical, as that is probably best for long-term SEO success. The EMD benefit is nowhere near the same (in my experience) with hyphenated or foreign domains. Don't get me wrong, they are a benefit, but a .com, .org or .net will always outrank them (for now).
So in response to your question: if I were you I would buy them both (so competitors can't steal them later), make them both blogs, and get a nice brandable domain for your business, using the two blogs as feeders for it.
-
Thanks for your reply.
-
Thanks! I figured two hyphens wouldn't be a good idea but it's sure tempting.
-
According to the book The Art of SEO, my personal SEO bible, if you're not concerned with type-in traffic, branding, or name recognition, you don't need to worry about this. However, to build a successful website long term you need to own the .com address, and if you then want to use the .co, the .com should redirect to it. According to the book, with the exception of the geeky, most people who use the web still assume that .com is all that's available, or that these are the domains that are most trustworthy. So don't lose traffic by having another address!
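Once that .com-to-.co redirect is live, a quick way to sanity-check it is a one-off script; this is a minimal sketch, assuming Python with the `requests` library installed and using the hypothetical domains from the question:

```python
import requests

# Hypothetical domains from the question; swap in your own.
resp = requests.get("http://chicagorealestate.com/", allow_redirects=False)

# A permanent, domain-level redirect should answer 301 and point the
# Location header at the .co address you actually want visitors to land on.
print(resp.status_code)              # expect 301
print(resp.headers.get("Location"))  # expect something like http://chicagorealestate.co/
```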
-
Hi Joe,
I won't go after 2 hyphens; usually if the .com is not available, I go after a .net.
But in your case, I would go with a .co.