Multilingual website
-
My website is https://www.india-visa-gov.in and we are making it multilingual.
There are three options:
1. Separate country TLDs, e.g. india-visa-gov.fr (French) and india-visa-gov.de (German)
2. Subdomains, e.g. fr.india-visa-gov.in (French) and de.india-visa-gov.in (German)
3. Folders, e.g. https://www.india-visa-gov.in/fr/ (French) and https://www.india-visa-gov.in/de/ (German)
We have tried the third option but need to know whether it is better for long-term SEO health. Does Moz DA carry over better with subdomains, separate TLDs, or folders? What does Moz recommend for maintaining DA?
Thanks
-
Andreas,
Thanks for sharing your story. You did genuine outreach, and that works best both for human users and for Google.
Insightful
-
Links from xyz.wordpress.com and abc.wordpress.com are two different links from two different domains. Neither of them carries the "DA" of www.wordpress.com, because that is yet another domain.
You could say Google both does and doesn't see it that way: in Search Console the linking domain would show up as "wordpress.com". Nonetheless the subdomains differ in a lot of respects, and DA is the most important one.
The more links you get from the same domain, the less each additional link counts (this applies to every new link, not the old ones). In my tests (WordPress and Blogspot), the diminishing effect applied to the one linking subdomain, not to all of the subdomains. That makes some sense, right? So the answer is "it depends".
The test is not 100% accurate and hard to compare, but that seems to be the way it is.
What I have also noticed: when I see a big jump in organic traffic, I have changed nothing, and no update is rolling out, it is usually down to one new organic link. You can build links as much as you want, but organic links beat everything, and I don't know how Google can figure that out so clearly.
So once you have a good amount of links, you should focus on your users. That is what I do, which is why I can't tell you exactly how Moz or any other tool handles it; I simply don't care that much.
One of my former competitors has ten times more links than we have, and today I call him a "former" competitor. Former, because we reached a new level and he didn't. We now have nine times more organic traffic, and he has been stuck where he is for a year. In fact, we have a single page with more monthly organic visitors than his whole domain gets. We started at the same level 1.5 years ago. I did not build a single link; I just focused on users. A lot of users: we are talking about hundreds of thousands of organic visitors, which is a lot for Germany in this niche. OK, now we are drifting into other topics, but that's not the point here.
All of these are finance topics, so maybe it is a YMYL-specific thing, but I also have several domains in non-YMYL topics that perform better and better without any link building. There is a lot of great content on them, which is what earns the domains their links. At the very least you need an entry point, a door opener, to get links coming in.
Whoops, too much, hah.
So you are right in a way, but don't think too much about links.
Anyway, good luck!
-
Thanks, Andreas.
Also, if my website were to get backlinks from, say, a couple of different subdomains pointing back to it, would Moz count them as backlinks from two domains?
Does Google treat it the same way?
Meaning, if I get backlinks from WordPress and Blogspot subdomains, are they counted as individual domains?
-
1.) This is, imho, the option you shouldn't choose, or at least that's what the majority would tell you, and that is still the case. Either way you end up with new domains: Google told us that they treat subdomains like separate domains. Of course they do, they always have, but still, new domains.
2.) In the long term, I am sure all of the options work well. In the short term it is easier with country TLDs, e.g. .de or .fr. In Germany we see a lot of .de domains in the SERPs and fewer .com/de or de.xyz.com ones. But you end up managing more and more domains.
3.) This is what everybody recommends in terms of links and PageRank: content earns links in Germany and the French pages benefit too. But correctly linked content works just as well with subdomains or new TLDs (see the hreflang sketch below).
Amazon uses the second approach, and a lot of SEO tool providers (e.g. Ryte) use .com/folder, and both work. I mention Ryte because they also had a .org before and never used .de for Germany. That should be a hint: do what fits you and your needs best. SEO is not that important here; it is important, but not in terms of which of these structures you choose.
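For what it's worth, here is a minimal sketch of what "correctly linked" can look like with the folder setup from the question; the URLs are assumptions based on option 3, and the same pattern works with subdomains or separate country TLDs, just with different href values. Each language version would carry these tags in its head, listing every version including itself:

<!-- Sketch only: hreflang annotations for the assumed /fr/ and /de/ folder structure. -->
<link rel="alternate" hreflang="en" href="https://www.india-visa-gov.in/" />
<link rel="alternate" hreflang="fr" href="https://www.india-visa-gov.in/fr/" />
<link rel="alternate" hreflang="de" href="https://www.india-visa-gov.in/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.india-visa-gov.in/" />

The same annotations can also be supplied through an XML sitemap if editing page templates is awkward.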
Related Questions
-
Website ranking declined after connecting to CDN
Hi! We are trying to rank https://windowmart.ca for various local search terms. Our head office is in Edmonton, where we try to rank https://windowmart.ca/edmonton-windows-doors/ for such terms as "windows Edmonton", "replacement windows Edmonton", and "windows and doors Edmonton", as well as others. The website was the leader in its niche for around two years. Then we had some server-related issues, moved to a new server, and connected the CDN Nitropack, which really improved our Google speed test results. Recently we noticed that our rankings started to drop. Do you know if Nitropack can negatively affect local SEO rankings? Thank you!
Technical SEO | vaskrupp
-
Why did my website's DA drop?
Hello, could you please let me know why my website's DA might have fallen in just a week? What might be the reason? I also noticed that traffic from Google dropped in the very same week. I will be very thankful for any advice!
Technical SEO | kirupa
-
Will Google crawl and rank our ReactJS website content?
We have 250+ products dynamically inserted and sorted on our site daily (more specifically on our homepage... yes, it's a long page). Our dev team would like to explore rendering the page server-side using ReactJS. We currently use a CDN to cache all the content, which of course we would like to continue using. So... will Google be able to crawl that content? We've read some articles with different ideas (including prerendering): http://andrewhfarmer.com/react-seo/ and http://www.seoskeptic.com/json-ld-big-day-at-google/
If we were to only load the schema important to the page (like product title, image, price, description, etc.) from the server and then let the client render the remaining content (comments, suggested products, etc.), would that go against best practices? It seems like that might be seen as showing Googlebot one version and showing the site visitor a different (more complete) version.
Technical SEO | Jane.com
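As a rough illustration of the hybrid approach described above, the schema-critical data could be emitted as JSON-LD in the server-rendered HTML while the rest of the page hydrates client-side. This is only a sketch; the product values below are placeholders, not data from the actual site:

<!-- Sketch only: server-rendered product schema; all values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example product",
  "image": "https://www.example.com/images/example-product.jpg",
  "description": "Short description rendered on the server.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
-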
Rel=canonical on Godaddy Website builder
Hey crew! First off, this is a last resort asking this question here; GoDaddy has not been able to help, so I need my Moz fam on this one. So, a common problem: my crawl report is showing I have duplicate home pages, www.answer2cancer.org and www.answer2cancer.org/home.html. I understand this is a common issue with Apache web servers, which is why the wonderful rel=canonical tag was created! I don't want to go through the hassle of a 301 redirect, of course, for such a simple issue. Now here's the issue: the GoDaddy website builder does not make any sense to me. In WordPress I could just add the tag to the head in the back end, but no such thing exists in GoDaddy. You have to use this weird drag-and-drop HTML block, drag it somewhere on the site, and plug in the code. I think I placed it before the code instead of just putting it in there. So I did that, but when I publish and inspect in Chrome I cannot see the tag in the head! This is confusing, I know; the guy at GoDaddy didn't stand a chance, lol. Anyway, much love for any replies!
Technical SEO | Answer2cancer
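For reference, the tag being discussed would look something like the line below; the URL and the https scheme are assumptions, and it needs to end up inside the head of both the root page and /home.html, pointing at whichever version should be canonical:

<!-- Sketch only: canonical tag for the duplicate home page described above; URL assumed. -->
<link rel="canonical" href="https://www.answer2cancer.org/" />
-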
Images, CSS and Javascript on subdomain or external website
Hi guys, I came across webshops that put images, CSS and JavaScript on different websites or subdomains. Does this boost SEO results? On our WordPress webshop, all assets are served from our own domain name:
www.ourdomainname.com/wp-includes/js/jquery/jquery.js?ver=1.11.3
www.ourdomainname.com/wp-content/uploads/2015/09/example.jpg
Examples from other websites:
Website 1: https://www.zalando.nl/heren-home/
Source code:
https://secure-i3.ztat.net//camp/03/d5/1a0168ac81f2ffb010803d108221.jpg
https://secure-media.ztat.net/media/cms/adproduct/ad-product.min.css?_=1447764579000
Website 2: https://www.bol.com/nl/index.html
Source code:
https://s.s-bol.com/nl/static/css/main/webselfservice.1358897755.css
//s.s-bol.com/nl/upload/images/logos/bol-logo-500500.jpg
Website 3: http://www.wehkamp.nl/
Source code:
https://static.wehkamp.nl/assets/styles/themes/wehkamp.color.min.css?v=f47bf1
http://assets.wehkamp.com/i/wehkamp/350-450-layer-SDD-wk51-v3.jpg
Technical SEO | Happy-SEO
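To make the pattern concrete, referencing assets from a separate static host looks roughly like this in the page source; static.example.com and the file paths are made up for illustration, not taken from the shops above:

<!-- Sketch only: static assets served from a separate hostname. -->
<link rel="stylesheet" href="https://static.example.com/css/main.min.css">
<script src="https://static.example.com/js/app.min.js" defer></script>
<img src="https://static.example.com/images/product-123.jpg" alt="Product photo">
-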
How to Remove a website from your Bing Webmaster Tools account
I have a site in Bing Webmaster Tools that I no longer work on. I can't seem to find where to delete this website from my Webmaster Tools account. Does anyone know how? (There doesn't seem to be anything obvious under Bing Help or in a Google search.)
Technical SEO | TopFloor
-
403 Forbidden error on a website
Hi Mozzers, I have a question about a new website from a new customer, http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find what the problem is. I checked it with http://gsitecrawler.com/tools/Server-Status.aspx
Result:
URL=http://www.eindexamensite.nl/ Result code: 403 (Forbidden / Forbidden)
When I delete the .htaccess from the server I get a 200 OK :-). So the cause is in the .htaccess.
.htaccess code:
ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php
# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]
# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$
# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache
# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]
# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]
# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET
# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$
# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching
DirectoryIndex index.html
The CMS is TYPO3. Any ideas? Thanks!
Maarten
Technical SEO | MaartenvandenBos
-
Has Google penalized us? If so, why? How do I know if our website is penalized?
A year ago we ranked on the first page, among the top 5 positions, for most of our pages. Then one fine day Google decided to drop us from the results, although it keeps indexing our pages. Google indexes our pages regularly but doesn't show them in its results. All the Google traffic we receive is for our own site name and its variations. I wanted to know: how do we know whether Google has penalized us? Why would Google have penalized us? If they have penalized us, what can we do to get out of it? Also, is there any tool that will help me identify such a thing? We have not done any link building. Our site's PageRank is 4 (it was 5 a few months ago). All we did was on-page optimization. Thanks for your help!
Technical SEO | seoidea