Should I Add Location to ALL of My Client's URLs?
-
Hi Mozzers,
My first Moz post! Yay! I'm excited to join the squad.

My client is a full service entertainment company serving the Washington DC Metro area (DC, MD & VA) and offers a host of services for those wishing to throw events/parties. Think DJs for weddings, cool photo booths, ballroom lighting etc.
I'm wondering what the right URL structure should be. I've noticed that some of our competitors do put DC-area keywords in their URLs, but with SERPs now rewarding quality far more than keyword density, I'm wondering if we should focus on location-based keywords in the traditional on-page areas (e.g. title tags, headers, metas, content) instead of putting them in the URLs as well. So, on every product-related page, should we do something like:
example.com/weddings/planners-washington-dc-md-va
example.com/weddings/djs-washington-dc-md-va
example.com/weddings/ballroom-lighting-washington-dc-md-va

OR
example.com/weddings/planners
example.com/weddings/djs
example.com/weddings/ballroom-lighting

In both cases, we'd put the necessary location-based keywords in the proper places on-page. If we follow the location-in-URL tactic, we'd use DC-area terms in all subsequent product page URLs as well. Essentially, every page outside of the home page would have a location in it.
Thoughts?
Thank you!!
-
No website in particular that springs to mind, I'm afraid. But it's not uncommon practice, and I'm sure you'll find plenty within your industry from a little competitor research.
Good luck!
-
This is great stuff. Thank you! Would you happen to have an example of a site that does this well? I think you're spot on in your suggestions and would love to see it in practice.
-
(I had posted my response, but Moz didn't fancy saving it for some reason and it's just gone. So I'll try and remember what I typed and repost it...)
I wouldn't dilute the site authority by using subdomains for your locations.
As a user, I would recommend that your main site navigation list the different event types (weddings, parties, corporate, etc.) and branch your locations out from there.
e.g.
- Weddings - /weddings/ (Weddings)
  - Miami - /weddings/miami/ (Weddings in Miami)
    - Planners - /weddings/miami/planners/ (Wedding Planners in Miami)
    - DJs - /weddings/miami/djs/ (Wedding DJs in Miami)
    - Ballroom Lighting - /weddings/miami/ballroom-lighting/ (Ballroom Lighting for Weddings in Miami)
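If it helps to see it written out, here's a rough sketch of how every combination of event type, location, and service maps onto one tidy path. This is purely illustrative - the event types, locations, and services are placeholders, and Python is just a convenient way to show the pattern:

```python
# Purely illustrative: builds the kind of nested landing page paths
# described above from placeholder event types, locations, and services.
event_types = ["weddings", "parties", "corporate"]       # placeholders
locations = ["washington-dc", "miami"]                   # placeholders
services = ["planners", "djs", "ballroom-lighting"]      # placeholders

paths = []
for event in event_types:
    paths.append(f"/{event}/")                           # e.g. /weddings/
    for city in locations:
        paths.append(f"/{event}/{city}/")                # e.g. /weddings/miami/
        for service in services:
            paths.append(f"/{event}/{city}/{service}/")  # e.g. /weddings/miami/djs/

print("\n".join(paths))
```

The point isn't the code itself, just that each combination resolves to exactly one path, and each path then gets its own genuinely unique, location-specific landing page content, as I go into below.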
That structure seems the most logical to me, but you should do your own research to back this up. Conduct thorough keyword research for each service in each location and structure your landing page content accordingly. For example, have the main category pages broadly target the root keyword, but display "cards" or sections that link through to each location without optimising those main category pages for the locations - save that for the location-based landing pages. That way the sub-navigation sits in the body, rather than in the main navigation, which is friendlier for users.
I think with something like events, you don't want to shove the locations in the user's face first thing. Let them see what you offer (the different event types), then delve down into the locations, and the specific services within those locations.
People are free to disagree with me, and I welcome critique on these thoughts. I do think that with SEO, once you get past "best practices", it comes down more to personal preference.
-
Excellent advice, Ria. I'll likely pass it along to the client.
Another question that brewed from this: how should the main navigation be handled as we expand? Obviously we can't keep D.C.-centric keywords in the main navigation as the business grows into new cities. I think we could create unique content and landing pages for each individual service and location, but how would that be incorporated into the overall user flow and URL structure?
Would it be more of a sitemap play? If someone goes to www.example.com, should they be given an option to choose their location, then be routed to that specific city's subdomain and browse from there?
I guess my main question is: how exactly should we structure the site navigation for users from multiple cities to please both UX and the big G?
Thank you!
-
For a handful of different locations, it's quite common to structure them as different subdirectories, as you said. site.com/weddings/miami/planners or /miami/weddings/planners - whichever makes the most sense for your customer base and how you're targeting the content.
Just ensure that these aren't treated as doorway pages and don't appear too templated. Make each landing page unique and tailored specifically to your customers in that location. If you have nothing unique to say, then you don't need separate pages - it would be better to target the different locations on the same landing page. But since you're the expert in the industry, I imagine it'll be easy enough to cater to each audience specifically, especially as you're not dealing with tens, hundreds, or thousands of different towns.
If you are certain about expanding to different cities soon, then it might be best to build a /washington-dc/ subdirectory into the URL structure from the start, so you don't have to change it later.
-
Thank you, Ria. That's very helpful.
I'm curious: when the business expands to other cities in the coming months (Miami and Chicago are being considered, for example, though nothing is finalized), I assume we'd need the location in the URL path for the sake of designation and differentiation. That may be a subfolder in and of itself, though. Thoughts?
-
I'd avoid adding the location to the URL if you only offer those services in a single location. It looks messy to the user and can look spammy to Google. It also saves you from having to change the URLs and set up redirects if you need to remove the location keywords at a later date to please the big G. Optimising for location within the content, title, and meta can be tweaked easily over time; tweaking URLs is a lot messier.
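To show why it's messier in practice: if you started with location-stuffed URLs and later had to clean them up, every old path would need a permanent redirect. Here's a minimal sketch of what that might look like - assuming, purely for illustration, a Python/Flask site and the example paths from the original question; the real stack and paths would differ:

```python
# Minimal, illustrative sketch: 301-redirecting old location-stuffed URLs
# to their clean equivalents. The Flask setup and the paths are assumptions.
from flask import Flask, redirect

app = Flask(__name__)

# Old URL -> new URL, one entry per page that changed.
REDIRECTS = {
    "/weddings/planners-washington-dc-md-va": "/weddings/planners",
    "/weddings/djs-washington-dc-md-va": "/weddings/djs",
    "/weddings/ballroom-lighting-washington-dc-md-va": "/weddings/ballroom-lighting",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    new_path = REDIRECTS.get("/" + old_path)
    if new_path:
        return redirect(new_path, code=301)  # permanent redirect
    return "Not found", 404  # anything unmapped simply 404s in this sketch
```

Compare that with tweaking a title tag or a line of copy, which needs no redirects at all - that's the gap I mean.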