English pages given preference over local language
We recently launched a redesign of our website, and for SEO purposes we decided to offer it in both English and Dutch. However, when I look at our keyword rankings in Moz, the English pages appear to be preferred over the Dutch ones for many of our keywords. That was never the case with the old design. It mainly happens for pages that target an English keyword, but even then the Dutch page used to rank.
I'm trying to figure out why the English pages are now being preferred, and whether that could actually damage our rankings, since search engines generally prefer copy in the local language.
An example is this page: https://www.bluebillywig.com/nl/html5-video-player/ for the keywords "HTML5 player" and "HTML5 video player".
Possible Reasons for English Page Preference:
Technical SEO:
Hreflang tags: Double-check your hreflang implementation to ensure it correctly points Dutch users to the Dutch version (a quick validation sketch follows this list).
Content differences: Check whether the content on the Dutch page mirrors the English page, including title tags, meta descriptions, and headings. Even slight differences can impact rankings.
Mobile responsiveness: Ensure both versions are mobile-friendly and optimized for different screen sizes.
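To make the hreflang check concrete, here is a minimal sketch in Python (assuming the requests and beautifulsoup4 packages are installed) that prints the hreflang annotations a page actually serves in its HTML. The URL is the example page from the question; treat this as a spot-check aid, not a substitute for a full crawl.

```python
import requests
from bs4 import BeautifulSoup

URL = "https://www.bluebillywig.com/nl/html5-video-player/"

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Collect every <link rel="alternate" hreflang="..."> in the document.
alternates = {
    link.get("hreflang"): link.get("href")
    for link in soup.find_all("link", rel="alternate")
    if link.get("hreflang")
}

for lang, href in alternates.items():
    print(f"{lang:10} -> {href}")

# Points to confirm manually:
# 1. An "nl" (or "nl-NL") entry exists and points at this Dutch URL.
# 2. The "en" entry points at the English equivalent, not back at /nl/.
# 3. An "x-default" entry points at the page you want shown when no language matches.
# 4. The English page returns the mirror-image set; hreflang must be reciprocal.
```

Note that if your hreflang annotations are delivered via the XML sitemap or HTTP headers rather than in the page head, this check will come back empty, so adapt it accordingly.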
Content Quality:
Keyword targeting: Analyze keyword usage in both versions. Are the Dutch pages properly optimized for Dutch keywords? (The comparison sketch after this list pulls the title, meta description, and H1 from both versions.)
Unique content: While mirroring content is acceptable, unique value in the Dutch version can attract Dutch users and improve rankings.
User engagement: Check your analytics to see whether users engage more with the English page (e.g., higher time on page, lower bounce rate); this can signal user preference to search engines.
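As a quick way to compare keyword targeting across the two versions, the sketch below (same Python setup as above) pulls the title, meta description, H1, and lang attribute from both pages. The English URL is an assumption about your URL structure; swap in the real equivalent.

```python
import requests
from bs4 import BeautifulSoup

PAGES = {
    "nl": "https://www.bluebillywig.com/nl/html5-video-player/",
    "en": "https://www.bluebillywig.com/html5-video-player/",  # assumed English equivalent
}

for lang, url in PAGES.items():
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "") if meta else ""
    h1 = soup.h1.get_text(strip=True) if soup.h1 else ""
    html_lang = soup.html.get("lang", "") if soup.html else ""

    print(f"--- {lang} ({url}) ---")
    print("lang attribute  :", html_lang)    # the Dutch page should declare nl / nl-NL
    print("title           :", title)        # should be written in Dutch and target Dutch keywords
    print("meta description:", description)
    print("h1              :", h1)
```

If the Dutch title and H1 turn out to be identical to the English ones, that is usually the first thing to fix.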
Potential Impacts and Actions:
Ranking Damage: While having English pages rank for Dutch keywords isn't necessarily damaging, it can divert traffic from the intended audience. Ideally, the Dutch page should rank for Dutch keywords in the Netherlands.
Investigate: Use tools like Google Search Console and Moz to analyze specific keyword rankings, crawl errors, and user engagement metrics for both versions (see the Search Console sketch after this list).
Optimize Dutch Pages: Ensure proper technical SEO, optimize content for Dutch keywords, and consider adding unique value to attract Dutch users.
Monitor and Refine: Track progress and adjust your approach based on ongoing analysis and results.
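For the "investigate" step, here is a hedged sketch of pulling Netherlands-only data from the Search Console API, so you can see which URL (English or /nl/) Google is actually serving for Dutch queries. It assumes a service-account credential with access to the property and the google-api-python-client package; the property URL, credential filename, and date range are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder credential file
)
service = build("searchconsole", "v1", credentials=creds)

request_body = {
    "startDate": "2024-01-01",            # placeholder date range
    "endDate": "2024-03-31",
    "dimensions": ["query", "page"],
    "dimensionFilterGroups": [{
        "filters": [
            # "nld" = Netherlands (ISO 3166-1 alpha-3, lower case)
            {"dimension": "country", "operator": "equals", "expression": "nld"}
        ]
    }],
    "rowLimit": 250,
}

response = service.searchanalytics().query(
    siteUrl="https://www.bluebillywig.com/",  # placeholder property URL
    body=request_body,
).execute()

# Each row shows which URL Google actually served for a Dutch query.
for row in response.get("rows", []):
    query, page = row["keys"]
    print(f"{query!r:<40} -> {page}  "
          f"({row['clicks']} clicks, {row['impressions']} impressions)")
```

Filtering the same report down to queries like "html5 video player" will show whether the English page is outranking the /nl/ page for Dutch searchers specifically, which is the pattern described above.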