Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Should I Add Location to ALL of My Client's URLs?
-
Hi Mozzers,
My first Moz post! Yay! I'm excited to join the squad.
My client is a full service entertainment company serving the Washington DC Metro area (DC, MD & VA) and offers a host of services for those wishing to throw events/parties. Think DJs for weddings, cool photo booths, ballroom lighting etc.
I'm wondering what the right URL structure should be. I've noticed that some of our competitors put DC-area keywords in their URLs, but with SERPs moving to focus much more on quality over keyword density, I'm wondering if we should target location-based keywords only in the traditional on-page areas (e.g. title tags, headers, metas, content) rather than also putting them in the URLs. So, on every product-related page, should we do something like:
example.com/weddings/planners-washington-dc-md-va
example.com/weddings/djs-washington-dc-md-va
example.com/weddings/ballroom-lighting-washington-dc-md-va

OR
example.com/weddings/planners
example.com/weddings/djs
example.com/weddings/ballroom-lighting

In both cases, we'd put the necessary location-based keywords in the proper places on-page. If we follow the location-in-URL tactic, we'd use DC-area terms in all subsequent product page URLs as well. Essentially, every page outside of the home page would have a location in it.
Thoughts?
Thank you!!
-
No website in particular that springs to mind, I'm afraid. But it's not uncommon practice, and I'm sure you'll find plenty within your industry from a little competitor research.
Good luck!
-
This is great stuff. Thank you! Would you happen to have an example of a site that does this well? I think you're spot on in your suggestions and would love to see it in practice.
-
(I had posted my response, but Moz didn't fancy saving it for some reason and it's just gone. So I'll try and remember what I typed and repost it...)
I wouldn't dilute the site authority by using subdomains for your locations.
As a user, I would recommend your main site navigation lists the different event types (weddings, parties, corporate, etc) and branch your locations from there.
e.g.
- Weddings - /weddings/ (Weddings)
  - Miami - /weddings/miami/ (Weddings in Miami)
    - Planners - /weddings/miami/planners/ (Wedding Planners in Miami)
    - DJs - /weddings/miami/djs/ (Wedding DJs in Miami)
    - Ballroom Lighting - /weddings/miami/ballroom-lighting/ (Ballroom Lighting for Weddings in Miami)
That structure seems the most logical to me, but you should do your own research to back this up. Conduct thorough keyword research for each service in each location and structure your landing page content accordingly. For example, have the main category pages broadly target the root keyword, but display "cards" or sections that link to each location, without optimising those category pages for the locations themselves; save that for the location-based landing pages. So this sub-navigation sits in the body, rather than in the main navigation, for user-friendliness.
I think with something like events, you don't want to shove the locations in the user's face first thing. Let them see what you offer (the different event types), then delve down into the locations, and the specific services within those locations.
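To make the hierarchy above concrete (event type, then location, then service), here's a toy sketch that builds the full URL tree. The event types, locations, and services are hypothetical placeholders, not anyone's real taxonomy:

```python
# Toy sketch of the event-type -> location -> service URL hierarchy.
def build_urls(event_types, locations, services):
    """Return every URL in the /event/location/service tree."""
    urls = []
    for event in event_types:
        urls.append(f"/{event}/")                      # category page
        for loc in locations:
            urls.append(f"/{event}/{loc}/")            # location hub
            for svc in services:
                urls.append(f"/{event}/{loc}/{svc}/")  # service landing page
    return urls

for url in build_urls(["weddings"], ["miami"], ["planners", "djs", "ballroom-lighting"]):
    print(url)
```

The point of the sketch is that every page has exactly one logical parent, so the breadcrumb and the URL tell the same story.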
People are free to disagree with me, and I welcome critique on these thoughts. I do think with SEO, it gets to a point after "best practices" that it comes down to more of personal preferences.
-
Excellent advice, Ria. I'll likely give that advice to the client.
Another question that brewed from this: how should the main navigation be handled as we expand? Obviously we can't have DC-centric keywords in the main navigation as the business grows. I think we could create unique content and landing pages for each individual service and location, but how would that be incorporated into the overall user flow and URL structure?
Would it be more of a sitemap play? If someone goes to www.example.com, should they be given an option to choose their location, then be routed to that specific city's subdomain and browse from there?
I guess my main question is, how exactly should we structure the site navigation for users from multiple cities to please both UX and the big G?
Thank you!
-
For a handful of different locations, it's quite common to structure them as different subdirectories, as you said. site.com/weddings/miami/planners or /miami/weddings/planners - whichever makes the most sense for your customer base and how you're targeting the content.
Just ensure that these are not considered doorway pages and don't appear too templated. Make each landing page unique and tailored specifically to your customers in that location. If you have nothing unique to say, then you don't need separate pages; it would be better to target the different locations on the same landing page. But since you're the expert in the industry, I imagine it'll be easy enough to cater to each audience specifically, especially when you're not dealing with tens, hundreds, or thousands of different towns.
If you are certain about expanding to different cities soon, then it might be best to include a /washington-dc/ subdirectory in the URL structure from the start, so you don't have to change it later.
-
Thank you, Ria. That's very helpful.
I'm curious: when the business expands to different cities in the coming months (for example, Miami and Chicago are being considered, though not yet finalised), I assume we'd need the location in the URL path for the sake of designation and differentiation. It may be a subfolder in and of itself, though. Thoughts?
-
I'd avoid adding the location in the URL if you only work with those services for a single location. It looks messy to the user, and can look spammy to Google. And it would save you from having to change the URL and set up redirects, if you need to remove the location keywords from the URL at a later date in order to please the Big G. Optimising for location within the content, title and meta can be easily tweaked with time. Tweaking URLs can be a lot messier.
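To illustrate why tweaking URLs is messier: dropping the location keywords later means building a redirect map from every old URL to its clean equivalent. A hedged sketch of that mapping is below; the suffix pattern is an assumption based on the example URLs in the original question, and a real migration would serve these as 301s at the server level:

```python
import re

# Assumed suffix pattern, taken from the question's example URLs.
LOCATION_SUFFIX = re.compile(r"-washington-dc-md-va/?$")

def redirect_target(old_path: str) -> str:
    """Map a keyword-stuffed path to its clean 301 target."""
    return LOCATION_SUFFIX.sub("/", old_path)

print(redirect_target("/weddings/planners-washington-dc-md-va"))  # /weddings/planners/
```

Every such redirect is a URL change Google has to re-crawl and re-evaluate, which is exactly the churn you avoid by starting with clean URLs.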