Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Few pages without SSL
Hi,
Hi,
A website is not fully secured with an SSL certificate.
Approximately 97% of the pages on the website are secured. A few pages are unfortunately not secured with an SSL certificate, because some functions on those pages do not work over a secure connection.
It's a website where you can play online games, and these games do not work over an SSL connection.
Is there anything we have to consider or optimize?
Is there anything we have to consider or optimize?
For example, when we click on the lock icon in the browser, the following notice appears:
"Your connection to this site is not fully secured"
Can this harm the Google ranking?
Regards,
Tom
It may potentially affect the rankings of:
- pages without SSL
- pages linking to pages without SSL
At first not drastically, but you'll find that you fall further and further behind until you wish you had just embraced HTTPS.
The exception, of course, is if no one competing for the same keywords is fully embracing SSL. If the majority of the ranking sites in your query space are insecure, then even though Google frowns upon that, there's not much they can do (they can't rank no one!)
So you need to do some legwork. See if your competitors suffer from the same issue. If they all do, maybe don't be so concerned at this point. If they're all showing signs of fully moving over to HTTPS, be more worried.
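A quick way to do that legwork is to crawl a competitor's site (or pull its sitemap) and measure how much of it is served over HTTPS. A minimal sketch, assuming you already have the crawled URL list; the example-competitor.com URLs below are made up for illustration:

```python
from urllib.parse import urlparse

def https_coverage(urls):
    """Return (fraction of URLs on HTTPS, list of the insecure holdouts)."""
    insecure = [u for u in urls if urlparse(u).scheme != "https"]
    secure_count = len(urls) - len(insecure)
    return secure_count / len(urls), insecure

# Hypothetical crawl result; in practice this would come from a
# crawler or the competitor's XML sitemap.
crawled = [
    "https://example-competitor.com/",
    "https://example-competitor.com/games",
    "http://example-competitor.com/play/arcade",   # still plain HTTP
    "https://example-competitor.com/about",
]

fraction, holdouts = https_coverage(crawled)
print(f"{fraction:.0%} secured; insecure: {holdouts}")
```

If most competitors also show a secured share well below 100%, the pressure to migrate the game pages immediately is lower; if they are all at 100%, take the warning above seriously.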
Just to be safe, I would secure every page with an SSL certificate. When Google finds out that not every page is secure, it may raise some eyebrows and even affect the whole site's rankings.
Yes, that can hurt Google rankings. Insecure pages tend to rank less well, and over time that trend is only set to increase (with Google becoming less and less accepting of insecure pages, they will probably end up treated as a 'bad neighborhood', like gambling and porn sites). Additionally, URLs which link out to insecure (non-HTTPS) pages can also see adverse ranking effects, as Google knows those pages are likely to direct users to insecure areas of the web.
At the moment you can probably get by with some concessions, namely accepting that the insecure URLs probably won't rank very well compared with pages offering the same entertainment or functionality that have fully embraced secure browsing (pages on HTTPS which are still responsive and don't link to insecure addresses).
If you're confident that the functionality you offer fundamentally can't be delivered over HTTPS, then this may be only a minor concern, as all your competitors are bound by the same restrictions. If you're wrong, though, you're going to have a bad time. Being 'wrong' now may be more appealing than being 'dead wrong' later.
Google will not remove the warnings on your pages unless you play ball. If you think that won't bother your users, or that your competition is fundamentally incapable of a better, more secure integration, fair enough. Either way, Google is set to take more and more action on this over time.
P.S.: if your main ranking pages are secure and don't directly link to this small subset of insecure pages, then you'll probably be OK (at least in the short term).
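To verify that P.S. point, you can scan the HTML of your main ranking pages for references that still use plain http://. A minimal stdlib-only sketch (the sample HTML below is hypothetical); it flags both insecure outbound links and insecure resource loads, the latter being what triggers the "not fully secured" lock warning:

```python
from html.parser import HTMLParser

class InsecureRefFinder(HTMLParser):
    """Collects (tag, url) pairs for any href/src using plain http://."""
    URL_ATTRS = {"href", "src"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in self.URL_ATTRS and value and value.startswith("http://"):
                self.insecure.append((tag, value))

def find_insecure_refs(html):
    parser = InsecureRefFinder()
    parser.feed(html)
    return parser.insecure

# Hypothetical page snippet for illustration:
sample = (
    '<a href="https://example.com/games">secure link</a>'
    '<a href="http://games.example.com/play">insecure link</a>'
    '<img src="http://cdn.example.com/logo.png">'
)
print(find_insecure_refs(sample))
```

Run this over each important landing page; if it returns an empty list for your main ranking pages, they neither load mixed content nor link directly into the insecure subset.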