
        GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2

        Technical SEO
        google http https indexation crawl
        • AKCAC
          AKCAC last edited by

The whole website moved to the https://www. version, served over HTTP/2, three years ago.

When we review the log files, it is clear that - for the home page - GoogleBot continues to access only via the HTTP/1.1 protocol (a minimal log-parsing sketch follows at the end of this post).

• Robots file is correct (simply allowing all and referring to the https://www. sitemap)

          • Sitemap is referencing https://www. pages including homepage

• Hosting provider has confirmed the server is correctly configured to support HTTP/2 and has provided evidence that access over HTTP/2 works

          • 301 redirects set up for non-secure and non-www versions of website all to https://www. version

          • Not using a CDN or proxy

          • GSC reports home page as correctly indexed (with https://www. version canonicalised) but does still have the non-secure version of website as the referring page in the Discovery section. GSC also reports homepage as being crawled every day or so.

We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to use only HTTP/1.1, not HTTP/2.

A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERPs ... except the home page. It never makes it to page 1 (other than for the brand name), despite rating several times higher on content, speed, etc. than other pages, which still get indexed in preference to the home page.

          Any thoughts, further tests, ideas, direction or anything will be much appreciated!
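A log check along these lines can be scripted. A minimal sketch, assuming a combined-format access log at a placeholder path and matching Googlebot by user-agent only (a rigorous check would also verify the crawler via reverse DNS):

```python
# Minimal sketch: count which HTTP protocol versions Googlebot used when
# requesting the homepage, from an access log.
# Assumptions: combined log format; log path and request path are placeholders;
# Googlebot is matched by user-agent only (reverse-DNS verification skipped).
import re
from collections import Counter

# combined format: ... "GET /path HTTP/1.1" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'"(?P<method>[A-Z]+) (?P<path>\S+) (?P<proto>HTTP/[\d.]+)".*"(?P<ua>[^"]*)"\s*$'
)

def googlebot_protocols(log_path: str, page: str = "/") -> Counter:
    counts: Counter = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LINE_RE.search(line)
            if m and "Googlebot" in m.group("ua") and m.group("path") == page:
                counts[m.group("proto")] += 1
    return counts

if __name__ == "__main__":
    # e.g. Counter({'HTTP/1.1': 42}) would match what the log review found
    print(googlebot_protocols("/var/log/nginx/access.log"))
```

In nginx and Apache combined logs the request line typically records HTTP/2 requests as HTTP/2.0, so a Googlebot count dominated by HTTP/1.1 would line up with what the log review above found.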

          • UmerIdrisi1
            UmerIdrisi1 @James-Avery last edited by

To ask this again: why is this happening with our pages too? Is Google going crazy, or what?


            • Peter_Cox
              Peter_Cox @john1408 last edited by


              It's baffling that GoogleBot persists with HTTP/1.1 for the homepage despite proper setup. Consider exploring Google Search Console further for indexing insights, and reach out to Google Support for assistance in resolving this unusual behavior.

              • James-Avery
                James-Avery @AKCAC last edited by


                First off, it's great that your entire website made the transition to HTTPS and HTTP/2 three years ago. That's definitely a step in the right direction for performance and security.

                Since your hosting provider has confirmed that the server is configured correctly for HTTP/2 and you've got the 301 redirects set up properly, it's puzzling why GoogleBot is still sticking to HTTP/1.1 for accessing the homepage. One thing you might want to double-check is if there are any specific directives in your server configuration that could be affecting how GoogleBot accesses your site. Sometimes, even seemingly minor configurations can have unintended consequences.
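One quick external test for UA-conditional behaviour of that kind is to compare which protocol the server negotiates for a plain client versus a Googlebot-style user-agent. A minimal sketch, assuming the httpx library with its HTTP/2 extra is installed (pip install "httpx[http2]") and using a placeholder URL:

```python
# Minimal sketch: compare the negotiated HTTP version for a default client
# versus a Googlebot-style user-agent, to rule out UA-conditional directives.
# Assumes httpx with its HTTP/2 extra; example.com is a placeholder.
import httpx

URL = "https://www.example.com/"  # placeholder: substitute the real homepage

# Documented Googlebot user-agent string (used here only to test server behaviour)
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def negotiated(user_agent: str | None = None) -> str:
    headers = {"User-Agent": user_agent} if user_agent else {}
    with httpx.Client(http2=True, headers=headers, follow_redirects=True) as client:
        resp = client.get(URL)
        return f"{resp.http_version}  status={resp.status_code}  final={resp.url}"

if __name__ == "__main__":
    print("default UA  :", negotiated())
    print("Googlebot UA:", negotiated(GOOGLEBOT_UA))
```

Bear in mind that the server supporting HTTP/2 does not by itself mean Googlebot will use it: Google decides per site whether crawling over HTTP/2 is worthwhile, and it cannot be forced from the server side (a site can only opt out, by answering the HTTP/2 crawl attempt with a 421).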

                Regarding the non-secure version of your website still showing up in the Discovery section of Google Search Console (GSC), despite the homepage being correctly indexed with the HTTPS version, it could be a matter of Google's index taking some time to catch up. However, it's worth investigating further to ensure there aren't any lingering issues causing this discrepancy.

                As for the home page not ranking as well in SERPs compared to other pages, despite having better content and speed, this could be due to a variety of factors. It's possible that Google's algorithms are prioritizing other pages for certain keywords or that there are specific technical issues with the homepage that are affecting its visibility.

                In terms of next steps, I'd recommend continuing to monitor the situation closely and perhaps reaching out to Google's support team for further assistance. They may be able to provide additional insights or suggestions for resolving these issues.

                Overall, it sounds like you've done a thorough job of troubleshooting so far, but sometimes these technical SEO mysteries require a bit of persistence to unravel. Keep at it, and hopefully, you'll be able to get to the bottom of these issues soon!

                • john1408
                  john1408 @AKCAC last edited by


It seems like you've taken several steps to ensure the correct protocol (HTTP/2) for your website, and it's puzzling that GoogleBot still accesses the home page via HTTP/1.1. A few additional suggestions:

• Crawl Rate Settings: Check your Google Search Console (GSC) crawl rate settings. Google might be intentionally crawling your site slowly.

• Server Logs: Reanalyze server logs to confirm that GoogleBot is indeed accessing the home page via HTTP/1.1. This could help identify patterns or anomalies.

• Mobile Usability: Ensure your home page is mobile-friendly. Google tends to prioritize mobile indexing.

• Fetch and Render Tool: Use GSC's Fetch and Render tool to see how Google renders your home page. It might provide insights into how Google sees your page.

• Structured Data and Markup: Ensure structured data and markup on your home page are correct and up to date (a first-pass check is sketched after this reply).

• Manual Submission: Consider manually requesting indexing for your home page through GSC.

Regarding the new pages performing well compared to the home page, it might be worth revisiting your on-page SEO elements and analyzing the competition for relevant keywords.
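On the structured data point above, a first-pass sanity check is simply to pull the JSON-LD blocks out of the homepage HTML and confirm they parse. A minimal sketch using only the standard library and a placeholder URL (Google's Rich Results Test remains the authoritative validator):

```python
# Minimal first-pass check: extract JSON-LD blocks from the homepage and
# confirm they parse as JSON. example.com is a placeholder; this is not a
# substitute for a full structured-data validation.
import json
import re
import urllib.request

URL = "https://www.example.com/"  # placeholder homepage

JSONLD_RE = re.compile(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.IGNORECASE | re.DOTALL,
)

def jsonld_blocks(url: str):
    req = urllib.request.Request(url, headers={"User-Agent": "structured-data-check/0.1"})
    html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")
    for raw in JSONLD_RE.findall(html):
        try:
            yield json.loads(raw)
        except json.JSONDecodeError as exc:
            yield {"parse_error": str(exc), "snippet": raw.strip()[:80]}

if __name__ == "__main__":
    for block in jsonld_blocks(URL):
        label = block.get("@type") if isinstance(block, dict) else type(block).__name__
        print(label, "->", str(block)[:120])
```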

