Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
M.ExampleSite vs mobile.ExampleSite vs ExampleSite.com
-
Hi,
I have a call with a potential client tomorrow where all I know is that they are wigged-out about canonicalization, indexing and architecture for their three sites:
The sites are pretty large... 350k pages for the mobile sites and 5 million for the main site. They're a retailer with endless products. Their main site is not mobile-responsive, which is evidently why they have the m. and mobile. sites. Why two, I don't know.
This is how they currently handle this:
What would you suggest they do about this? The most comprehensive fix would be making the main site mobile-responsive and 301-redirecting the old mobile subdomains to the main site. That's probably too much work for them. So, what else would you suggest, and why?
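If they did go the redirect route, the rules themselves are simple. A minimal sketch, assuming Apache with mod_rewrite and placeholder hostnames (the real hostnames and paths are whatever the client actually uses):

```apache
# Hypothetical rules for the main server config or .htaccess.
# Any request arriving on either mobile hostname is sent to the
# same path on the main site with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(m|mobile)\.examplesite\.com$ [NC]
RewriteRule ^(.*)$ https://www.examplesite.com/$1 [R=301,L]
```

This assumes the mobile URLs mirror the main site's paths one-to-one; if they don't, a URL-mapping step would be needed before redirecting, which is where most of the real work on a 5-million-page site would be.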
Your thoughts? Best... Mike
P.S.,
Beneath my hand-drawn portrait avatar above it says "Staff" at this moment, which I am not. Some kind of bug I guess.
-
Hi Donna,
Thanks for all the help. I really appreciate it.
Best... Mike
-
Give them two options in a grid format:
(1) do nothing;
(2) redirect mobile to desktop
To the right of that, use two columns to convey pros and cons.
I guess you could do nothing and measure the impact for a few months, comparing this year to last. If things don't look good, then execute option 2. Might be hard to isolate the impact of the mobile index versus anything else though, but it's probably the best you can do.
And then there's the 3rd option... go responsive, but as you've said, you don't have time or budget for that unfortunately.
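The year-over-year comparison in option 1 could be as simple as pulling monthly organic mobile sessions from an analytics export and computing the delta. A sketch with made-up numbers (the figures and month range are purely illustrative):

```python
# Hypothetical monthly organic mobile sessions, exported from analytics.
this_year = {"Jan": 41000, "Feb": 39500, "Mar": 43200}
last_year = {"Jan": 44800, "Feb": 42100, "Mar": 41900}

def yoy_change(current, prior):
    """Percent change per month, current vs. the same month last year."""
    return {
        month: round((current[month] - prior[month]) / prior[month] * 100, 1)
        for month in current
    }

print(yoy_change(this_year, last_year))
# → {'Jan': -8.5, 'Feb': -6.2, 'Mar': 3.1}
```

As Donna notes, the hard part isn't the arithmetic but attribution: a monthly delta like this can't by itself separate the mobile-index effect from seasonality or anything else that changed.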
-
Hi Donna,
Thanks for the speedy reply. Yes, the thing that makes me nervous about that recommendation from the article is how do I even begin to weigh the odds of it being a net gain, and then convey that to management? I mean, it's one thing for me to think, "yeah, let's roll the dice," and another to convey the trade-offs to a very typical management team in something like numbers.
Thank you for noticing my avatar portrait. I did it over a summer in the south of France. It will probably be worth a fortune once I am gone and regarded as a giant of the early 21st-century art world.
I wrote Moz about the "Staff" thing and it looks like they deleted the title... all titles really.
Best... Mike
-
What you have to weigh is the user impact. How much traffic are you currently getting from mobile devices? Will the desktop version of the website look awful, be hard to interact with or understand on a mobile phone or tablet device? You'll also lose the "mobile friendly" designation which might lower your rankings and click-thru rates.
It's a trade-off decision only you can make.
PS - I don't see "Staff" under your cool avatar.
-
Hi Donna,
Thanks for the insight and resource. While waiting for next year's mobile-responsive site, what do you think of 301-redirecting the two existing mobile sites to the desktop site? How would one begin to estimate the effect of that? Thanks again.
Best... Mike
-
Michael,
I read a helpful article that touched on this exact topic yesterday. It's https://www.searchenginejournal.com/mobile-first-index-actually-mean/178017/ . As you've already pointed out, a responsive solution is best, but if the website's mobile and desktop content are the same, you may not have to do anything right away.
Check it out.
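For context on the "you may not have to do anything right away" option: the setup Google documents for separate mobile URLs is a pair of annotations tying the two versions together. A sketch with hypothetical URLs (only one mobile subdomain can sensibly play this role, which is itself an argument for consolidating the two):

```html
<!-- On the desktop page (www.examplesite.com/widgets) -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.examplesite.com/widgets">

<!-- On the mobile page (m.examplesite.com/widgets) -->
<link rel="canonical" href="https://www.examplesite.com/widgets">
```

The rel="alternate" on the desktop page tells crawlers where the mobile equivalent lives, and the rel="canonical" on the mobile page points ranking signals back at the desktop URL.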