Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
Should I Add Location to ALL of My Client's URLs?
-
Hi Mozzers,
My first Moz post! Yay! I'm excited to join the squad
My client is a full service entertainment company serving the Washington DC Metro area (DC, MD & VA) and offers a host of services for those wishing to throw events/parties. Think DJs for weddings, cool photo booths, ballroom lighting etc.
I'm wondering what the right URL structure should be. I've noticed that some of our competitors do put DC area keywords in their URLs, but with SERPs moving to focus much more on quality over keyword density, I'm wondering if we should keep location-based keywords in the traditional on-page areas (e.g. title tags, headers, metas, content) instead of also putting them in the URLs. So, on every product-related page, should we do something like:
example.com/weddings/planners-washington-dc-md-va
example.com/weddings/djs-washington-dc-md-va
example.com/weddings/ballroom-lighting-washington-dc-md-va
OR
example.com/weddings/planners
example.com/weddings/djs
example.com/weddings/ballroom-lighting
In both cases, we'd put the necessary location-based keywords in the proper places on-page. If we follow the location-in-URL tactic, we'd use DC area terms in all subsequent product page URLs as well. Essentially, every page outside of the home page would have a location in it.
Thoughts?
Thank you!!
-
No website in particular that springs to mind, I'm afraid. But it's not uncommon practice, and I'm sure you'll find plenty within your industry from a little competitor research.
Good luck!
-
This is great stuff. Thank you! Would you happen to have an example of a site that does this well? I think you're spot on in your suggestions and would love to see it in practice.
-
(I had posted my response, but Moz didn't fancy saving it for some reason and it's just gone. So I'll try and remember what I typed and repost it...)
I wouldn't dilute the site authority by using subdomains for your locations.
From a user's perspective, I'd recommend your main site navigation list the different event types (weddings, parties, corporate, etc.) and branch your locations from there.
e.g.
- Weddings - /weddings/ (Weddings)
  - Miami - /weddings/miami/ (Weddings in Miami)
    - Planners - /weddings/miami/planners/ (Wedding Planners in Miami)
    - DJs - /weddings/miami/djs/ (Wedding DJs in Miami)
    - Ballroom Lighting - /weddings/miami/ballroom-lighting/ (Ballroom Lighting for Weddings in Miami)
That structure seems the most logical to me, but you should do your own research to back this up. Conduct thorough keyword research for each service in each location and structure your landing page content accordingly. For example, the main category pages could broadly target the root keyword while displaying "cards" or sections that link to each location, without optimising those category pages for the locations themselves - save that for the location-based landing pages. This sub-navigation would sit in the body rather than in the main navigation, for user-friendliness.
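To make that hierarchy concrete, here's a minimal sketch of a URL builder that generates the nested paths. The function names and example data are purely illustrative, not from this thread:

```python
# Build nested landing-page URLs in the form /<event-type>/<location>/<service>/
# Illustrative sketch only - names and data are hypothetical.

def slugify(text: str) -> str:
    """Lowercase and hyphenate a label, e.g. 'Ballroom Lighting' -> 'ballroom-lighting'."""
    return "-".join(text.lower().split())

def build_urls(event_types, locations, services):
    """Return every category, location, and service URL in the hierarchy."""
    urls = []
    for event in event_types:
        urls.append(f"/{slugify(event)}/")                     # e.g. /weddings/
        for loc in locations:
            urls.append(f"/{slugify(event)}/{slugify(loc)}/")  # e.g. /weddings/miami/
            for svc in services:
                urls.append(f"/{slugify(event)}/{slugify(loc)}/{slugify(svc)}/")
    return urls

urls = build_urls(["Weddings"], ["Miami"], ["Planners", "DJs", "Ballroom Lighting"])
print("\n".join(urls))
```

The point of the sketch is that the hierarchy stays consistent as you add cities or services: a new location just adds a branch under each event type, with no renaming of existing URLs.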
I think with something like events, you don't want to shove the locations in the user's face first thing. Let them see what you offer (the different event types), then delve down into the locations, and the specific services within those locations.
People are free to disagree with me, and I welcome critique on these thoughts. I do think that with SEO, once you get past "best practices," it often comes down to personal preference.
-
-
Excellent advice, Ria. I'll likely give that advice to the client.
Another question that brewed from this: how should the main navigation be handled as we expand? Obviously we can't have D.C.-centric keywords in the main navigation as the business grows into new cities. I think we could create unique content and landing pages for each individual service and location, but how would that be incorporated into the overall user flow and URL structure?
Would it be more of a sitemap play? If someone goes to www.example.com, should they be given the option to choose their location, then be routed to that specific city's subdomain and browse from there?
I guess my main question is: how exactly should we structure the site navigation for users from multiple cities to please both UX and the big G?
Thank you!
-
For a handful of different locations, it's quite common to structure them as different subdirectories, as you said. site.com/weddings/miami/planners or /miami/weddings/planners - whichever makes the most sense for your customer base and how you're targeting the content.
Just ensure that these aren't considered doorway pages and don't appear too templated. Make each landing page for each location unique and tailored specifically to your customers in that location. If you have nothing unique to say, then you don't need separate pages - it would be better to target the different locations on the same landing pages. But since you're the expert in the industry, I imagine it'll be easy enough to cater to each audience specifically, especially as you're not dealing with tens, hundreds, or thousands of different towns.
If you are certain about expanding to different cities soon, then it might be best to begin the URL structure with a /washington-dc/ subdirectory now, so you don't have to change it later.
-
Thank you, Ria. That's very helpful.
I'm curious: when the business expands to different cities in the coming months (for example, Miami and Chicago are being considered, though not yet finalized), I assume we'd need the location in the URL path for the sake of designation and differentiation. That might be a subfolder in and of itself, though. Thoughts?
-
I'd avoid adding the location in the URL if you only work with those services for a single location. It looks messy to the user, and can look spammy to Google. And it would save you from having to change the URL and set up redirects, if you need to remove the location keywords from the URL at a later date in order to please the Big G. Optimising for location within the content, title and meta can be easily tweaked with time. Tweaking URLs can be a lot messier.
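As a rough illustration of why URL changes are messier than content tweaks: if location keywords ever had to come out of the URLs, every old indexed path would need a 301 mapping to its new home. A minimal sketch, using the example URLs from earlier in the thread (the `resolve` helper is hypothetical):

```python
# Hypothetical old -> new 301 map for stripping location keywords from URLs.
# Every indexed URL needs an entry, and inbound links keep hitting the old paths.
REDIRECTS = {
    "/weddings/planners-washington-dc-md-va": "/weddings/planners",
    "/weddings/djs-washington-dc-md-va": "/weddings/djs",
    "/weddings/ballroom-lighting-washington-dc-md-va": "/weddings/ballroom-lighting",
}

def resolve(path):
    """Return (final_path, status): 301 if the path is remapped, 200 otherwise."""
    if path in REDIRECTS:
        return REDIRECTS[path], 301
    return path, 200

print(resolve("/weddings/djs-washington-dc-md-va"))  # ('/weddings/djs', 301)
```

Maintaining (and never breaking) a map like this for every product page is the hidden cost of baking locations into URLs, which is why keeping locations in titles, metas, and content is the lower-risk option.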