The Moz Q&A Forum


        Google Indexing Of Pages As HTTPS vs HTTP

        Intermediate & Advanced SEO
• vikasnwu

We recently updated our site to be mobile optimized. As part of the update, we had also planned on adding SSL security to the site. However, a lot of our pages use an iframe from a third-party vendor for real estate listings, and that iframe was not SSL friendly; the vendor does not have a solution yet. So those iframes weren't displaying their content.

As a result, we had to shift gears and go back to plain HTTP rather than the HTTPS we were hoping for.

However, Google seems to have indexed a lot of our pages as HTTPS, which gives a security error to any visitors. The new site was launched about a week ago, and there was code in the htaccess file that was pushing to www and HTTPS. I have fixed the htaccess file so it no longer forces HTTPS.

My question is: will Google "reindex" the site once it recognizes the new htaccess commands in the next couple of weeks?

• ThompsonPaul @vikasnwu

            That's not going to solve your problem, vikasnwu. Your immediate issue is that you have URLs in the index that are HTTPS and will cause searchers who click on them not to reach your site due to the security error warnings. The only way to fix that quickly is to get the SSL certificate and redirect to HTTP in place.

You've sent the search engines a number of very conflicting signals. Waiting while they try to work out which URLs they're supposed to use, and then waiting while they reindex them, is likely to cause significant traffic issues and ongoing ranking harm before the SEs figure it out for themselves. The whole point of what I recommended is that it doesn't depend on the SEs figuring anything out - you will have provided directives that force them to do what you need.

            Paul

• AgenciaSEO.eu @vikasnwu

Remember, you can request indexing using Google Search Console 😉

• AgenciaSEO.eu @GastonRiera

                Nice answer!

                But you forgot to mention:

1. Updating the sitemap files with the correct URLs
2. Uploading them to Google Search Console
3. Requesting indexing in Google Search Console

                Thanks,

                Roberto

• GastonRiera @ThompsonPaul

Paul,
I just provided the solution to de-index the HTTPS version. I understood that to be what was wanted, as they need their vendor to fix their end.

And of course there is no way to noindex by protocol; I agree with what you're saying.

Thanks a lot for explaining further and providing other ways to help solve the issue. I'm inspired by users like you to help others and make a great community.

                  GR.

• vikasnwu @ThompsonPaul

I'm first going to see what happens if I just upload a sitemap with HTTP URLs, since there wasn't a sitemap in Webmaster Tools before. I'll give you an update then.

• ThompsonPaul @vikasnwu

                      Great! I'd really like to hear how it goes when you get the switch back in.

                      P.

• vikasnwu

Paul, that does make sense - I'll add the SSL certificate back and then redirect from HTTPS to HTTP via the htaccess file.

• ThompsonPaul @GastonRiera

You can't noindex a URL by protocol, Gaston - adding noindex would stop the page being returned as a search result regardless of whether it's requested over HTTP or HTTPS, essentially making those important pages invisible and wasting whatever link equity they may have. (You can't block by protocol in robots.txt either, in my experience.)

• ThompsonPaul

                            There's a very simple solution to this issue - and no, you absolutely do NOT want to artificially force removal of those HTTPS pages from the index.

You need to make sure the SSL certificate is still in place, then re-add the 301-redirect in the site's htaccess file, but this time redirecting all HTTPS URLs back to their HTTP equivalents.

                            You don't want to forcibly "remove" those URLs from the SERPs, because they are what Google now understands to be the correct pages. If you remove them, you'll have to wait however long it takes for Google and other search engines to completely re-understand the conflicting signals you've sent them about your site. And traffic will inevitably suffer in that process. Instead, you need to provide standard directives that the search engines don't have to interpret and can't ignore. Once the search engines have seen the new redirects for long enough, they'll start reverting the SERP listings back to the HTTP URLs naturally.

                            The key here is the SSL cert must stay in place. As it stands now, a visitor clicking a page in the search engine is trying to make an HTTPS connection to your site. If there is no certificate in place, they will get the harmful security warning. BUT! You can't just put in a 301-redirect in that case. The reason for this is that the initial connection from the SERP is coming in over the "secure channel". That connection must be negotiated securely first, before the redirect can even be read. If that first connection isn't secure, the browser will return the security warning without ever trying to read the redirect.

                            Having the SSL cert in place even though you're not running all pages under HTTPS means that first connection can still be made securely, then the redirect can be read back to the HTTP URL, and the visitor will get to the page they expect in a seamless manner. And search engines will be able to understand and apply authority without misunderstandings/confusion.
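
As a minimal sketch (the thread never shows the site's actual htaccess, so this assumes a standard Apache setup with mod_rewrite enabled), the redirect Paul describes might look like:

    # The SSL certificate must stay installed so the HTTPS handshake
    # can complete; only then can the browser ever read this redirect.
    RewriteEngine On
    # If the request arrived over HTTPS...
    RewriteCond %{HTTPS} on
    # ...send a 301 to the same host and path on plain HTTP.
    RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]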

                            Hope that all makes sense?

                            Paul

• GastonRiera @vikasnwu

Nope, robots.txt works at the website level. This means there has to be one file for the HTTP website and another for the HTTPS website.
And there is no need to wait until the whole site is indexed.

Just to clarify, robots.txt itself does not remove pages that are already indexed. It just blocks bots from crawling a website and/or specific pages within it.
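
As a minimal sketch of that "one file per protocol" idea on a single Apache host (robots_https.txt is a hypothetical filename, and mod_rewrite is assumed):

    RewriteEngine On
    # Requests for robots.txt made over HTTPS get a protocol-specific file...
    RewriteCond %{HTTPS} on
    RewriteRule ^robots\.txt$ robots_https.txt [L]
    # ...while plain-HTTP requests fall through to the regular robots.txt.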

• vikasnwu @GastonRiera

                                GR - thanks for the response.

Given that our site is just 65 pages, would it make sense to list all of the site's "https" URLs in the robots.txt file as "noindex" now, rather than waiting for all the pages to get indexed as "https" and then removing them?

And then upload a sitemap to Webmaster Tools with the URLs as "http://"?

                                VW

• GastonRiera

                                  Hello vikasnwu,

As what you're looking for is to remove those pages from the index, follow these steps:

1. Allow the whole website to be crawlable in robots.txt
2. Add the robots meta tag with "noindex,follow" parameters (see the sketch after these steps)
3. Wait several weeks; 6 to 8 weeks is a fairly good window, or just do a follow-up on those pages
4. Once you get the results you want (all the desired pages de-indexed), re-block those pages with robots.txt
5. DO NOT erase the meta robots tag.
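
For step 2, the tag is a one-line addition to each page's <head>. A minimal sketch, with a server-level alternative via the X-Robots-Tag response header (the header variant assumes Apache with mod_headers and is a substitution of mine, not something named in the thread):

    # What step 2 describes, placed in each page's <head>:
    #   <meta name="robots" content="noindex,follow">
    # Equivalent directive sent as a response header from htaccess:
    Header set X-Robots-Tag "noindex, follow"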

Remember that http://site.com and https://site.com are different websites to Google.
When your vendor's HTTPS support is fixed, follow these steps:

1. Allow the whole website (or the parts you want indexed) to be crawlable in robots.txt
2. Remove the robots meta tag
3. 301-redirect HTTP to HTTPS (sketched below)
4. Sit and wait.
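
For step 3, the rule is the mirror image of the HTTPS-to-HTTP sketch earlier in the thread (again assuming Apache with mod_rewrite):

    RewriteEngine On
    # Once HTTPS works end to end, send all HTTP traffic to HTTPS.
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]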

                                  Information about the redirection to HTTPS and a cool checklist:
                                  The Big List of SEO Tips and Tricks for Using HTTPS on Your Website - Moz Blog
                                  The HTTP to HTTPs Migration Checklist in Google Docs to Share, Copy & Download - AleydaSolis
                                  Google SEO HTTPS Migration Checklist - SERoundtable

Hope this helps.
Best of luck.
GR.
