Best practice for a multilanguage website (PHP feature based on browser language or geolocation)
-
Hi Moz Experts,
I would like to know what the best practice is for setting the default language on a multilanguage website.
There are several PHP techniques for getting users into the right language when they arrive from search or from direct traffic: presenting the default language based on the browser language, based on geolocation, and so on. Which one is the most appropriate for a Quebec company trying to expand outside Canada? Pros and cons, please.
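To be concrete, the browser-language option I mean is something along these lines - just a rough sketch of the idea (assuming only English and French, with English at the root and French under /fr), not our actual code:

```php
<?php
// Rough sketch only - not production code.
// Assumes two languages: English at the root and French under /fr.
$supported = array('en', 'fr');
$lang = 'en'; // default

$header = isset($_SERVER['HTTP_ACCEPT_LANGUAGE']) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : '';
foreach (explode(',', $header) as $part) {
    // Each entry looks like "fr-CA;q=0.8" - keep only the two-letter primary subtag.
    $code = strtolower(substr(trim($part), 0, 2));
    if (in_array($code, $supported, true)) {
        $lang = $code;
        break;
    }
}

// Send the first-time visitor to the matching language section.
header('Location: ' . ($lang === 'fr' ? '/fr/' : '/'), true, 302);
exit;
```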
Thank you in advance.
-
I love it, Kate Morris.
It makes things simple with the small questions.
-
Thank you. So I will need to add hreflang for the first strategy.
For Webmaster Tools, I'm not so sure, because we don't have separate sites - we have www.mycompany.com and www.mycompany.com/fr, and at the moment I only have Webmaster Tools set up for www.mycompany.com. We also have a Google+ page, but we don't have a content marketing strategy yet.
Is that all we have in place for the moment?
-
I built a tool to help people understand how to best go about international expansion. It's here: katemorris.com/issg
-
Not sure if you could really call it a best practice, but in Belgium (three different languages) the usual configuration is not to determine the default language automatically, but to present first-time visitors with a "choose your language" page and store their choice in a cookie for future visits. This mainly applies to direct visits.
People coming in via search engines use queries in one of those languages, so Google will normally send them to pages in that language. Again, on the first visit, that implicit choice of language is stored in a cookie.
All pages contain a link to switch to the other language(s), which also updates the choice stored in the cookie.
The disadvantage of this system is that you add an extra layer to the site (the language-choice page); the advantage is that the error margin is zero.
Systems based on IP address, browser language, etc. are not 100% reliable, which can lead to unwanted results (in Belgium it is quite a sensitive issue if you serve a Dutch page to a French-speaking visitor, and vice versa).
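A very rough sketch of the cookie mechanics, with hypothetical cookie and URL names (assuming an English root and a French section under /fr):

```php
<?php
// Rough sketch of the cookie-based approach - hypothetical names, adapt as needed.
$supported = array('en', 'fr');

// 1. The "switch language" link on every page points at something like
//    /set-language.php?lang=fr - store the choice and send the visitor on.
if (isset($_GET['lang']) && in_array($_GET['lang'], $supported, true)) {
    setcookie('site_lang', $_GET['lang'], time() + 365 * 24 * 60 * 60, '/');
    header('Location: ' . ($_GET['lang'] === 'fr' ? '/fr/' : '/'));
    exit;
}

// 2. Direct visitor with no cookie yet: show the "choose your language" page.
if (!isset($_COOKIE['site_lang'])) {
    header('Location: /choose-language.php');
    exit;
}

// 3. Returning visitor: serve the language stored in the cookie.
$lang = in_array($_COOKIE['site_lang'], $supported, true) ? $_COOKIE['site_lang'] : 'en';
```

The important point is that the visitor, not the server, makes the choice; the code only remembers it.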
Hope this helps,
Dirk
-
Hi there
Look into hreflang attributes - if you serve the same site in multiple languages, they are a best practice.
You can also look into setting up Webmaster Tools profiles for the different language versions (if they live on different country-code domains or in separate subdirectories) and geotarget each one to the country it is meant to serve.
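As a sketch of what the hreflang markup can look like (hypothetical URLs - assuming an English version at the root of www.example.com and a French version under /fr mirroring the same paths):

```php
<?php
// Sketch: print hreflang alternates in the <head> of every page.
// Hypothetical URLs and structure - adjust to your own site.
$path = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/';
$path = preg_replace('#^/fr(/|$)#', '/', $path); // normalise away the /fr prefix

echo '<link rel="alternate" hreflang="en" href="https://www.example.com' . $path . '" />' . "\n";
echo '<link rel="alternate" hreflang="fr" href="https://www.example.com/fr' . $path . '" />' . "\n";
echo '<link rel="alternate" hreflang="x-default" href="https://www.example.com' . $path . '" />' . "\n";
```

Each language version should output the same set of alternates, including a self-referencing one, and the URLs must point at real, equivalent pages.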
Does this help? Let me know - good luck!