Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Why does Google stubbornly keep indexing my http URLs instead of the https ones?
-
I moved everything to https in November, but there are plenty of pages that are still indexed by Google as http instead of https, and I am wondering why.
Example: http://www.gomme-auto.it/pneumatici/barum correctly redirects permanently (301) to https://www.gomme-auto.it/pneumatici/barum
Nevertheless, if you search for pneumatici barum: https://www.google.it/search?q=pneumatici+barum&oq=pneumatici+barum
The third organic result listed is still http.
Since we moved to https, the Google crawler has visited that page tens of times, most recently two days ago, yet it doesn't seem to update the protocol in Google's index.
Does anyone know why?
My concern is that when I use APIs like SEMrush and Ahrefs I have to query everything twice to cover both http and https; for a total of around 65k URLs, that wastes a lot of my quota.
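As an aside, the first redirect hop for a batch of URLs can be sanity-checked with a small script; a minimal Python sketch, where `classify_hop` and its verdict strings are invented for illustration and not part of any tool mentioned in this thread:

```python
# Minimal sketch: classify one redirect hop when auditing an http -> https
# move. The helper and its verdict strings are hypothetical.
def classify_hop(status, location, expected_https_url):
    """Return a short verdict for a single redirect response."""
    if status != 301:
        return "not-permanent"      # a 302/307 may not consolidate signals
    if location != expected_https_url:
        return "wrong-target"
    return "ok"

# The Barum page from the question, assuming a 301 to its https twin.
print(classify_hop(
    301,
    "https://www.gomme-auto.it/pneumatici/barum",
    "https://www.gomme-auto.it/pneumatici/barum",
))  # ok
```

A real check would issue one HEAD request per URL with redirects disabled and feed the status code and `Location` header into the helper.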
-
Thanks again Dirk! In the end I used Xenu's Link Sleuth and I am happy with the result.
-
Hi Massimiliano,
In Screaming Frog there is the option Bulk Export > All Inlinks, which generates the full list of your internal links with both source and destination. In Excel you just put a filter on the "Destination" column to show only the URLs starting with "http://", and you get all the info you need. This will probably not solve the issues with the images; for those, the next solution below can be used.
The list can be quite long, depending on the total number of URLs on your site. An alternative would be to add a custom filter under Configuration > Custom, including only URLs that contain "http://www.gomme-auto.it" or "http://blog.gomme-auto.it" in the source. In your case this wouldn't be very helpful as it stands, because all the pages on your site contain this URL in the JavaScript part; but if you change the URLs in the JavaScript to https, the filter could be used to find references to non-https images.
If you want to do it manually, that's also an option: in the crawler's 'Internal' view, put "http://" in the search field to show the list of all the http:// URLs. You then have to select the http URLs one by one; for each, select "Inlinks" at the bottom of the screen to see all the URLs linking to the http version. This works for both the HTML and the images.
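The Excel filter step can also be scripted; a minimal Python sketch, assuming the All Inlinks export has "Source" and "Destination" columns (column names may differ between Screaming Frog versions):

```python
# Sketch of the Excel filter step in Python. Assumes the Screaming Frog
# "All Inlinks" export has "Source" and "Destination" columns.
import csv
import io

def http_inlinks(csv_text):
    """Return (source, destination) pairs whose destination is plain http."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(row["Source"], row["Destination"])
            for row in reader
            if row["Destination"].startswith("http://")]

# Tiny made-up sample in the assumed export format.
sample = """Source,Destination
https://www.gomme-auto.it/pneumatici/continental,http://www.gomme-auto.it/pneumatici/barum
https://www.gomme-auto.it/a,https://www.gomme-auto.it/b
"""
for src, dst in http_inlinks(sample):
    print(src, "->", dst)
```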
Hope this helps,
rgds
Dirk
-
Forgot to mention: yes, I checked the scheme of the SERP results for those pages. It's not just Google not displaying it; it really does still have the http version indexed.
-
Hi DC,
In Screaming Frog I can see the old http links. They are usually manually inserted links and images in WordPress posts, and I am more than eager to edit them; my problem is how to find all the pages containing them. In Screaming Frog I can see the links, but I don't see the referrer, i.e. which page they appear in. Is there a way to see that in Screaming Frog, or in some other crawling software?
-
Hi,
First of all, are you sure that Google didn't take the migration into account? I just did a quick check on other https sites. Example: when I search for "Google Analytics" in Google, the first three results all point to the Google Analytics site, yet only the third result shows https, even though all three are on https. So it's possible this is just a display issue rather than a real issue.
Second, I did a quick crawl of your site and noticed that some pages still link to the http version of your site (the links are redirected, but it's better to keep your internal links clean, without redirects).
When I checked one of these pages (https://www.gomme-auto.it/pneumatici/pneumatici-cinesi) I noticed it has some issues, as it seems to load elements that are not served over https; possibly there are others as well.
example: /pneumatici/pneumatici-cinesi:1395 Mixed Content: The page at 'https://www.gomme-auto.it/pneumatici/pneumatici-cinesi' was loaded over HTTPS, but requested an insecure image 'http://www.gomme-auto.it/i/pneumatici-cinesi.jpg'. This content should also be served over HTTPS.
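A rough way to surface such insecure references is a regex scan of the HTML; a Python sketch (regex-based, so it misses URLs built in CSS or JavaScript, and a thorough audit should parse the DOM instead):

```python
# Rough sketch: scan a page's HTML for http:// asset references that would
# trigger mixed-content warnings like the console message quoted above.
import re

ASSET_RE = re.compile(
    r'(?:src|href)=["\'](http://[^"\']+\.(?:jpe?g|png|gif|css|js))["\']')

def insecure_assets(html):
    """Return the http:// asset URLs referenced by src/href attributes."""
    return ASSET_RE.findall(html)

page = '<img src="http://www.gomme-auto.it/i/pneumatici-cinesi.jpg">'
print(insecure_assets(page))  # ['http://www.gomme-auto.it/i/pneumatici-cinesi.jpg']
```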
The page you mention as an example: the http version still receives two internal links, from https://www.gomme-auto.it/blog/pneumatici-barum-gli-economici-che-assicurano-ottime-prestazioni and https://www.gomme-auto.it/pneumatici/continental, with anchor texts 'pneumatici Barmum' & 'Barum'.
I guess Google reasons: if the owner of the site is not updating his internal links, I'm not going to update my index.

On all your pages there is a part of the source that contains calls to the http version. It's inside a script, so I'm not sure if it really matters, but you could try changing it to https as well.
My advice would be to crawl your site with Screaming Frog, check where links to http versions exist, and update those links to https (or use relative links, which is advised by Google: https://support.google.com/webmasters/answer/6073543?hl=en, see the part 'common pitfalls').
rgds
Dirk
-
Mhhh, you are right, theoretically it could be the crawl budget. But if that were the case I should see it in the logs: I should be missing crawler visits on those pages. Instead, the crawler is happily visiting them.
By the way, how would you "force" the crawler to parse these pages?
I am going to check the sitemap now to remove that port number and try to split it up. Thanks.
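The log check mentioned above can be sketched as follows, assuming a common-log-format access log (the field positions are an assumption; a real audit should also verify Googlebot by reverse DNS, since the user-agent string can be spoofed):

```python
# Sketch: count Googlebot hits per path in a common-log-format access log,
# to confirm the crawler really is visiting the pages in question.
from collections import Counter

def googlebot_hits(log_lines):
    """Return a Counter mapping request path -> number of Googlebot hits."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        if len(parts) > 1:
            request = parts[1].split()      # e.g. ['GET', '/path', 'HTTP/1.1']
            if len(request) >= 2:
                hits[request[1]] += 1
    return hits

log = [
    '66.249.66.1 - - [10/Jan/2015] "GET /pneumatici/barum HTTP/1.1" 200 1234 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [12/Jan/2015] "GET /pneumatici/barum HTTP/1.1" 200 1234 "-" "Googlebot/2.1"',
]
print(googlebot_hits(log))  # Counter({'/pneumatici/barum': 2})
```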
-
Darn it, you are right: you add a new site, not a change of address. Sorry about that; apparently my coffee is no longer effective!
-
As far as I know, a change of address from http to https doesn't work; the protocol is not accepted when you do a change of address. And somewhere I read Google itself saying that when moving to https you should not do a change of address.
But they do suggest adding a new site for the https version in GWT, which I did, and in fact traffic slowly transitioned from the http site to the https site in GWT in the weeks following the move.
-
Are you sure? On https://support.google.com/webmasters/answer/6033080?hl=en&ref_topic=6033084 it says: "No need to submit a change of address if you are only moving your site from HTTP to HTTPS."
I don't think you are given the option to select the same domain for a change of address in GWT.
-
Looks like you are doing everything right (301 redirects set up, all links on the site updated, canonical URLs updated); you just need to get the crawlers to re-parse those pages. Perhaps the crawler is hitting its budget before it gets around to recrawling all of your new URLs?
You should also update your sitemap, as it contains a bunch of links that look like: https://www.gomme-auto.it:443/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72
I recommend creating several sitemaps for different sections of the site and watching how each is indexed via GWT.
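Stripping the redundant :443 port from sitemap URLs like the one above can be done programmatically; a minimal Python sketch:

```python
# Sketch: drop an explicit default port (:443 for https, :80 for http)
# from a URL, so sitemap entries match the canonical URLs.
from urllib.parse import urlsplit, urlunsplit

def drop_default_port(url):
    """Return the URL without a redundant default port in the host."""
    parts = urlsplit(url)
    host = parts.netloc
    if (parts.scheme, parts.port) in (("https", 443), ("http", 80)):
        host = parts.hostname
    return urlunsplit((parts.scheme, host, parts.path,
                       parts.query, parts.fragment))

print(drop_default_port(
    "https://www.gomme-auto.it:443/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72"
))
# https://www.gomme-auto.it/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72
```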
-
Did you do a change of address in Google Webmaster Tools? Http and https are considered different URLs, and you would have to do a change of address if you switched to a fully https site.