Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Is single H1 tag still best practice?
-
Hi Guys,
Is having a single h1 tag still best practice for SEO?
I'm guessing multiple h1 tags dilute the value of the tag and the keywords within it.
Thoughts?
Cheers.
-
That's just laziness on their part; they should update the site to have an H1 on every page.
-
Moz Site Crawl indicated we had missing H1s as well as missing canonical tags on almost all of our pages. We work with a website company that has been building websites for many years, and they have very useful tools for printers, but they seem to have everything listed as H2s. I was told it would take a lot of effort on their part to add the H1s. Are H2s sufficient?
-
From what I read here, the use of multiple H1 tags depends on what HTML version you are using. If it's HTML4 or XHTML, only use one H1 per page, but if you are using HTML5 you can have one per section. HTML5 uses the new semantic elements like <header> and <footer> to work out the hierarchy, with headings only affecting hierarchy within one of those tags.
Source: https://a-moz.groupbuyseo.org/community/q/how-will-it-effect-seo-to-have-multiple-h1-tags-on-a-page
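As a minimal sketch of the HTML5 pattern that answer describes (the site name and headings are invented for illustration), each sectioning element gets its own H1, and headings only set the hierarchy inside their own section:

<!-- hypothetical HTML5 page with one H1 per sectioning element -->
<body>
  <header>
    <h1>Acme Printing Supplies</h1>       <!-- page-level heading -->
  </header>
  <section>
    <h1>Laser Printers</h1>               <!-- heading scoped to this section -->
    <h2>Best sellers</h2>
  </section>
  <section>
    <h1>Ink Cartridges</h1>
    <h2>Find cartridges by printer model</h2>
  </section>
  <footer>
    <p>Contact and shipping information</p>
  </footer>
</body>

In HTML4 or XHTML, by contrast, you would keep a single H1 at the top of the page and demote the section headings to H2.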
-
Hi there!
There are some theories that include a single H1 as a best practice.
However, a few months ago Google said that it's just fine to have several H1 tags (see "My site's template has multiple H1 tags" on the Google Webmasters YouTube channel). Of course, those multiple H1 tags must be used correctly - Google understands those tags and can tell whether they are used correctly. Try using h2, h3, or other headings, as long as the content requires them.
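As a minimal sketch of that kind of hierarchy (the topic and headings are invented for illustration), a single H1 states what the page is about and the H2/H3 tags break the content down beneath it:

<!-- hypothetical article page with one H1 and supporting subheadings -->
<h1>The Complete Guide to Espresso Machines</h1>
<h2>Choosing a machine</h2>
<h3>Manual vs. automatic</h3>
<h3>Budget considerations</h3>
<h2>Maintenance tips</h2>
<h3>Descaling schedule</h3>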
Your principal concern should be creating exceptional content, not having more than one H1 tag. Hope it helps.
Best of luck.
GR
Related Questions
-
Is it alright to repeat a keyword in the title tag?
I know at first glance the answer to this is a resounding NO, that it can be construed as keyword stuffing, but please hear me out. I am working on optimizing a client's website and although MOST of the title tags can be optimized without repeating a keyword, occasionally I run into one where it doesn't read right if I don't repeat the keyword. Here's an example:
Current title: Photoshop on the Cloud | Adobe Photoshop Webinars | Company Name
What I am considering using as the optimized title: Adobe Photoshop on the Cloud | Adobe Photoshop Webinars | Company Name
Yes, I know both titles are longer than recommended. In both instances, only the company name gets truncated, so I am not too worried about that. So I guess what I want to know is this: am I right in my original assumption that it is NEVER okay to repeat keywords in a title tag, or is it alright when it makes sense to do so?
Intermediate & Advanced SEO | MIGandCo
-
Help! Forum (user generated content) SEO best practices
Hello Moz folks! For the very first time I'm dealing with a massive community that relies on UGC (user generated content). Their forum has a great deal of duplicate content, broken links, duplicate titles, and other on-site issues. I have advanced SEO knowledge related to ecommerce and blogging, but I'm new to forums and UGC. I would really love to learn, or get links to resources, that would let me see and understand the best practices in terms of SEO. Any help is greatly appreciated. Best, Yan
Intermediate & Advanced SEO | ydesjardins2000
-
Duplicate Content www vs. non-www and best practices
I have a customer who had prior help on his website, and I noticed a 301 redirect in his .htaccess. Rule for duplicate content removal (www.domain.com vs domain.com):
RewriteCond %{HTTP_HOST} ^MY-CUSTOMER-SITE.com [NC]
RewriteRule (.*) http://www.MY-CUSTOMER-SITE.com/$1 [R=301,L,NC]
The result of this rule is that if I type MY-CUSTOMER-SITE.com in the browser, it redirects to www.MY-CUSTOMER-SITE.com. I wonder if this is causing issues in SERPs. If I have some inbound links pointing to www.MY-CUSTOMER-SITE.com and some pointing to MY-CUSTOMER-SITE.com, I would think that this rewrite isn't necessary, as it would seem that Googlebot is smart enough to know that these aren't two sites.
Can you comment on whether this is a best practice for all domains?
I've run a report for backlinks. If my thought is true that some links point to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?
Intermediate & Advanced SEO | EnvoyWeb
-
Meta tags - are they case sensitive?
I just ran the Wordtracker tool and noticed something interesting: the tool didn't pick up our meta description. It's strange, as our meta descriptions appear in organic search results and Moz never reported missing meta descriptions. After reviewing other pages, I noticed our meta description tag is written as the following:
name="Description" content="
I never thought about this, but are meta tags case sensitive? Should it be written as:
name="description" content="
Thoughts?
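For reference, here is a minimal sketch of the two forms being compared, written out as complete tags (the description text is invented for illustration). HTML metadata names are generally treated as case-insensitive, so both are valid markup; whether every third-party tool honors that is exactly what this question is asking.

<!-- capitalized name attribute, as currently on the site -->
<meta name="Description" content="Example page description for illustration.">
<!-- lowercase name attribute -->
<meta name="description" content="Example page description for illustration.">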
Intermediate & Advanced SEO | Bio-RadAbs
-
Meta NoIndex tag and Robots Disallow
Hi all, I hope you can spend some time to answer the first of a few questions 🙂 We are running a Magento site, and the layered/faceted navigation nightmare has created thousands of duplicate URLs! Anyway, during my process to tackle the issue, I disallowed in robots.txt anything in the query string that was not a p (allowed for pagination). After checking some pages in Google, I did a site:www.mydomain.com/specificpage.html and a few duplicates came up along with the original, with "There is no information about this page because it is blocked by robots.txt". So I had also added a meta noindex, follow on all these duplicates, but I guess it wasn't being read because of robots.txt.
So coming to my question: did robots.txt block access to these pages? If so, were they already in the index, and after disallowing them in robots.txt, could Googlebot not read the meta noindex? Does meta noindex, follow on pages actually help Googlebot decide to remove those pages from the index? I thought robots.txt would stop and prevent indexation? But I've read this: "Noindex is a funny thing, it actually doesn't mean 'You can't index this', it means 'You can't show this in search results'. Robots.txt disallow means 'You can't index this' but it doesn't mean 'You can't show it in the search results'." I'm a bit confused about how to use these, both to prevent duplicate content in the first place and to address dupe content once it's already in the index. Thanks! B
Intermediate & Advanced SEO | bjs2010
E-commerce site, one product multiple categories best practice
Hi there, We have an e-commerce shopping site with over 8000 products and over 100 categories. Some subcategories belong to multiple categories - for example, Christmas trees can be under "Gardening > Plants > Trees" and under "Gifts > Holidays > Christmas > Trees". The product itself (example: Scandinavian Xmas Tree) can naturally belong to both of these categories as well. Naturally, these two (or more) categories have different breadcrumbs, different navigation bars, etc. From an SEO point of view, to avoid duplicate content issues, I see the following options:
1. Use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path - kind of cloaking.
2. Use the same URL and display only one "main" version of breadcrumbs and menus. Possibly add the other "not main" categories as links on the category / product page.
3. Use a different URL based on where we came from and do nothing (this will create essentially the same content on different URLs except breadcrumbs and menus - there's a possibility to change the category text and page title as well).
4. Use a different URL based on where we came from, with different menus and breadcrumbs, and use rel=canonical pointing to the "main" category / product pages.
This is a very interesting issue and I would love to hear what you guys think, as we are finalizing plans for a new website and would like to get the most out of it. Thank you all!
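As a minimal sketch of option 4 above (the URLs are invented for illustration), the Christmas tree product served under the "Gifts" path would carry a canonical link pointing at whichever version is chosen as the main one:

<!-- on https://www.example-shop.com/gifts/holidays/christmas/trees/scandinavian-xmas-tree -->
<link rel="canonical" href="https://www.example-shop.com/gardening/plants/trees/scandinavian-xmas-tree">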
Intermediate & Advanced SEO | arikbar
-
Should I remove Meta Keywords tags?
Hi, Do you recommend removing Meta Keywords or is there "nothing to lose" with having them? Thanks
Intermediate & Advanced SEO | BeytzNet
-
Best Practice for Inter-Linking to CCTLD brand domains
Team, I am wondering what people recommend as best SEO practice for inter-linking language-specific brand domains, e.g. amazon.com, amazon.de, amazon.fr, amazon.it. Currently I have 18 ccTLDs for one brand in different languages (no DC). I am linking from each content page to each other language domain, providing a link to the equivalent content in a separate language on a different ccTLD domain. However, with Google's discouragement of site-wide links I am reviewing this practice. I am tending towards making the language redirects on each page JavaScript-driven and linking only from my home page to the other pages with optimized link titles. Anyone have any thoughts/opinions on this topic they are open to sharing? /Thomas
Intermediate & Advanced SEO | tomypro
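A minimal sketch of the home-page-only approach described in the question above (the domains come from the example in the question; the anchor text and titles are invented for illustration):

<!-- on the home page of the .com site only: plain, crawlable links to each language version -->
<nav>
  <a href="https://www.amazon.de/" title="Deutsche Version">Deutsch</a>
  <a href="https://www.amazon.fr/" title="Version française">Français</a>
  <a href="https://www.amazon.it/" title="Versione italiana">Italiano</a>
</nav>

The design point raised in the question is that these visible home-page links would replace the site-wide cross-links, while the per-page language switch becomes JavaScript-driven.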