Best practice for a multilanguage website (PHP feature based on browser language or geolocation)
-
Hi Moz Experts
I would like to know: what is the best practice for the default language of a multilanguage website?
There are several PHP features to help users land on the right language version, whether they arrive from search or come directly: presenting the default language based on browser language, based on geolocation, etc. Which one is the most appropriate for a Quebec company trying to expand outside Canada? What are the pros and cons of each?
Thank you in advance.
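For reference, the browser-language option mentioned in the question usually boils down to reading the Accept-Language request header. Below is a minimal PHP sketch of that approach, assuming an English default and a French version under /fr; the language list, the /fr/ path, and the redirect are assumptions for illustration, and q-weight ordering is ignored for brevity.

```php
<?php
// Minimal sketch of browser-language detection via the Accept-Language
// header. The supported-language list and the /fr/ path are assumptions
// for illustration; q-weight ordering is ignored for brevity.

function detectLanguage(array $supported, string $default): string
{
    $header = $_SERVER['HTTP_ACCEPT_LANGUAGE'] ?? '';
    // The header looks like "fr-CA,fr;q=0.9,en-US;q=0.8,en;q=0.7"
    foreach (explode(',', $header) as $part) {
        $lang = strtolower(trim(explode(';', $part)[0])); // drop any ;q= weight
        $lang = substr($lang, 0, 2);                       // "fr-ca" -> "fr"
        if (in_array($lang, $supported, true)) {
            return $lang;
        }
    }
    return $default;
}

$lang = detectLanguage(['en', 'fr'], 'en');

if ($lang === 'fr') {
    // Send first-time visitors with a French browser to the French version.
    header('Location: /fr/', true, 302);
    exit;
}
```

As the answers below point out, header-based (or IP-based) detection is a reasonable default but not 100% reliable, so it should be easy for the visitor to override it.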
-
I love it, Kate Morris.
It makes it simple with the small questions.
-
Thank you. So, I will need to add hreflang for the first strategy.
For Webmaster Tools, I'm not so sure, because we don't have separate sites: we have www.mycompany.com and www.mycompany.com/fr. At the moment I have Webmaster Tools set up for www.mycompany.com. We also have a Google+ page, but we don't have a content marketing strategy.
That's all we have in place for the moment.

-
I built a tool to help people understand how to best go about international expansion. It's here: katemorris.com/issg
-
Not sure if you could really call it a best practice, but in Belgium (3 different languages) the normal configuration is not to determine the default language automatically, but rather to present a first-time visitor with a "choose language" page and store the choice in a cookie for future visits. This is mainly for direct visits.
People coming in via search engines use queries in one of the languages, so normally Google will direct them to pages in that language. Again, on the first visit, the implicit choice of language is stored in a cookie.
All pages contain a link to switch to the other language(s), which also changes the choice stored in the cookie.
The disadvantage of this system is that you add an additional layer to the site (the language choice); the advantage is that the error margin is zero.
Systems based on IP, browser language, etc. are not 100% reliable, which can lead to unwanted results (in Belgium it's quite a sensitive issue if you serve a Dutch page to a French-speaking person, and vice versa).
Hope this helps,
Dirk
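A rough PHP sketch of the cookie approach described above, for a home page handling direct visits; the cookie name, the /choose-language URL, the ?lang= parameter, and the one-year lifetime are all assumptions for illustration.

```php
<?php
// Sketch of the cookie-based approach described above: remember an
// explicit language choice and only ask once. The cookie name, the
// /choose-language URL and the 1-year lifetime are assumptions.

$supported = ['en', 'fr'];

// The "switch language" link on every page points at ?lang=xx and
// updates the stored choice.
if (isset($_GET['lang']) && in_array($_GET['lang'], $supported, true)) {
    setcookie('lang', $_GET['lang'], time() + 365 * 24 * 3600, '/');
    $_COOKIE['lang'] = $_GET['lang']; // make the new choice visible right away
}

if (isset($_COOKIE['lang']) && in_array($_COOKIE['lang'], $supported, true)) {
    // Returning visitor: serve the language stored in the cookie.
    $lang = $_COOKIE['lang'];
} else {
    // First-time direct visitor: show the "choose your language" page.
    // (On the language pages themselves, reached via search, the cookie
    // would instead be set implicitly from the page's language.)
    header('Location: /choose-language', true, 302);
    exit;
}
```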
-
Hi there
Look into hreflang attributes - if you have the same site with multiple languages, this is a best practice.
You can also look into setting up Webmaster Tools profiles for the different sites (if they have different country-code domains or subdirectories) and geotarget each one for the country it is supposed to appear in.
Does this help? Let me know - good luck!
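For the site structure mentioned in the thread (www.mycompany.com for English, www.mycompany.com/fr for French), the hreflang annotations could be emitted from PHP roughly as below. This is a sketch only: the https scheme, the `$path` parameter, and the plain "en"/"fr" codes are assumptions, and region-specific codes such as fr-CA may be more appropriate depending on the targeting.

```php
<?php
// Sketch: emit hreflang alternates for an English default site with a
// French version under /fr, as in the thread's example URLs. $path is
// the current page's path relative to each language root (assumed).

function hreflangTags(string $path): string
{
    $alternates = [
        'en'        => 'https://www.mycompany.com' . $path,
        'fr'        => 'https://www.mycompany.com/fr' . $path,
        'x-default' => 'https://www.mycompany.com' . $path,
    ];

    $tags = '';
    foreach ($alternates as $lang => $url) {
        $tags .= sprintf(
            "<link rel=\"alternate\" hreflang=\"%s\" href=\"%s\">\n",
            htmlspecialchars($lang, ENT_QUOTES),
            htmlspecialchars($url, ENT_QUOTES)
        );
    }
    return $tags;
}

// The same full set of tags goes in the <head> of both language versions,
// so each page references itself and its alternate.
echo hreflangTags('/contact');
```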