Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Are IDs in URLs good for SEO? Will SEO submission sites allow such URLs?
-
Example URL: http://public.beta.travelyaari.com/vrl-travels-13555-online
This is our site's beta URL, and we are going to implement it across the site. After implementation, it will be live on travelyaari.com like this: "https://www.travelyaari.com/vrl-travels-13555-online".
We have added the keywords ("VRL Travels") to the URL. The problem is that there are multiple VRL Travels operators, so we made the URL unique with an ID: "13555". That way we know exactly which VRL Travels is meant, and it also solves the URL duplication issue.
Also, from a user/SEO point of view, the URL contains readable text and keywords: "vrl travels online".
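For reference, here is a rough sketch of how such a slug could be generated (simplified; the `make_slug` helper and the second ID are made up for illustration):

```python
import re

def make_slug(name: str, record_id: int, suffix: str = "online") -> str:
    """Build a keyword slug and append a unique record ID so that
    several operators sharing a name still get distinct URLs."""
    # Lowercase, then collapse any run of non-alphanumerics to a hyphen
    base = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    return f"{base}-{record_id}-{suffix}"

# Two different "VRL Travels" records resolve to two distinct URLs
print(make_slug("VRL Travels", 13555))  # vrl-travels-13555-online
print(make_slug("VRL Travels", 21042))  # vrl-travels-21042-online (hypothetical ID)
```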
Can some Moz experts tell me whether this will affect SEO performance in any way? Will SEO submission sites accept this URL?
Meanwhile, I tried submitting this URL to Reddit and similar sites, and it was accepted.
-
Are URLs with numbers accepted by the various SEO submission sites? If not, which sites mainly reject them? Can you kindly list a few if possible?
-
In our case, this is comparatively the better URL solution after a lot of research; otherwise, URL duplication and identity issues would arise.
Example URL: http://public.beta.travelyaari.com/vrl-travels-13555-online (keywords, readable text, and an ID are all there)
Can you suggest the pros and cons, and which outweighs the other? If it does not have much impact, we will continue with it.
-
Hello,
I would keep numbers out of page URLs if at all possible. The more a URL looks like computer code, the worse it is, in my view. Try to keep the page URL readable by humans so that it also conveys a message about the content of the page.
Best Regards
Related Questions
-
Opinion on Gotch SEO methods & services
I would love to get everyone's take on Gotch SEO. I am gearing up to build links for a site in the next several months and have been reading up from sources other than Moz in preparation. (I need to re-read Moz's guide too, but I already read it last year.) I'm reading Gotch SEO's main link building method articles right now and am wondering what you all think. Do you think they have a good approach and are generally reliable? Likewise, has anyone used their service to get a link? What was your experience? Or, if you haven't used the service, any quick takes on it?
White Hat / Black Hat SEO | scienceisrad
-
URL Masking or Cloaking?
Hi guys, on our webshop we link from the menu to the categories we want to rank for in Google. Because the menu is sitewide, I guess Google treats the categories in the menu as important and maybe lets them score better (internal links). The problem I'm facing is that we differentiate by gender. In the menu we have Man and Woman, and the menu links go to /category?gender=1 and /category?gender=2, but we don't want to rank for the gender variants, only for the default URL. For example: focus keyword = shoes; menu Man link: /shoes?gender=1; menu Woman link: /shoes?gender=2. We only want to rank for /shoes/, but that URL is not placed in the menu, and every URL containing "?" is set to follow/noindex. So I was thinking of linking both Man and Woman in the menu to /shoes/ and appending ?gender= on mousedown (programmed that way). Is this cloaking in Google's eyes? Alternatively, we could add a canonical pointing to the /shoes/ page, but I don't know whether the ?gender pages would still pass internal linking value through a canonical. Hope it makes sense 🙂 Advice is also welcome, such as: place all the default URLs in the footer.
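For illustration, here is a minimal sketch of deriving the canonical for the gendered variants by stripping the parameter (the domain and the `canonical_url` helper are assumptions for the example, not the shop's actual code):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url: str, drop_params=("gender",)) -> str:
    """Return the canonical form of a URL by stripping the given
    query parameters, e.g. /shoes?gender=1 -> /shoes."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

url = "https://example-shop.com/shoes?gender=1"  # hypothetical domain
print(canonical_url(url))  # https://example-shop.com/shoes
# The result is what belongs in <link rel="canonical" href="..."> on
# both gendered variants, consolidating signals on the clean /shoes URL.
```

Note that combining noindex with a canonical tag is generally discouraged, since the two send conflicting signals; a common approach is to drop the noindex on the ?gender pages and rely on the canonical alone.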
White Hat / Black Hat SEO | Happy-SEO
-
Good vs Bad Web directories
Hi, in this blog post Rand mentions a list of bad web directories. I asked a couple of years ago whether there is an updated list, as some of these (Alive Directory, for example) no longer seem to be blacklisted and are coming up in Google searches. It seems that, given the age of the blog post (seven years), the comments are no longer being responded to. Would anyone be able to advise which of these directories are good to use? https://a-moz.groupbuyseo.org/blog/what-makes-a-good-web-directory-and-why-google-penalized-dozens-of-bad-ones
White Hat / Black Hat SEO | IsaCleanse
-
Traffic exchange referral URLs
We have a client who once per month is being hit by easyihts4u.com, and it is creating huge spikes in their referrals. All the hits go to one specific page. From the research we have done, this site and others like it are not spam bots. We cannot understand how they choose sites to target, or what good it does them, or our client, to have all the hits land on one page in a single day. We created a filter in Analytics to get what we think is a more accurate reflection of traffic. Should we block them at the server level as well?
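If you do decide to block at the server level, the idea is simply to refuse requests whose Referer header names the spam domain. A minimal sketch, written as Flask middleware purely for illustration (in practice this is usually done with nginx or .htaccess rules):

```python
from flask import Flask, request, abort

app = Flask(__name__)

# Known traffic-exchange domains; extend as new ones appear
BLOCKED_REFERRERS = ("easyihts4u.com",)

@app.before_request
def block_referrer_spam():
    # Reject any request whose Referer header matches a blocked domain
    referrer = request.headers.get("Referer", "")
    if any(domain in referrer for domain in BLOCKED_REFERRERS):
        abort(403)
```

Blocking at the server prevents the page (and its analytics tag) from being served at all, whereas an Analytics filter only hides the hits from reports.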
White Hat / Black Hat SEO | Teamzig
-
Does type of hosting affect SEO rankings?
Hello, I was wondering if hosting on shared, versus VPS, versus dedicated ... matters at all for a website's rankings, given that all other factors are exactly equal. I know this is a big question with many variables, but mainly I am wondering whether, for example, it is more the risk of resource exhaustion, which may take a site down under heavy traffic and make it un-crawlable if that happens at the moment a bot is trying to index the site (factoring out the UX of a downed site). Any and all comments are greatly appreciated! Best regards,
Mark
White Hat / Black Hat SEO | uworlds
-
How to make a second site in the same niche and do white hat SEO
Hello, as much as we would like it to, there's a possibility that our site will never recover from its Google penalties. Our team has decided to launch a new site in the same niche. What do we need to do so that Google will not mind us having two sites in the same niche (menu differences, coding differences, content differences, etc.)? We won't have duplicate content, but it's hard to make the sites not similar. Thanks
White Hat / Black Hat SEO | | BobGW0 -
Forcing Google to Crawl a Backlink URL
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e., Penguin recovery and reconsideration requests). My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after four months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
White Hat / Black Hat SEO | Choice
-
Why do expired domains still work for SEO?
Hi everyone, I've been running an experiment for more than a year to see whether it's possible to buy expired domains and use them. I know it's considered black hat, but like I said, I wanted to experiment; that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WP setup, fill each site with content relevant to the expired domain, and then start building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any sign that the expired domains, or the sites I link to, have been punished by Google. The sites I'm linking to rank great with ONLY those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir