Loading images below the fold? Impact on SEO
-
I got this from my developers. Does anyone know if this will be an SEO issue?
We hope to lazy-load images below the fold where possible, to increase render speed - are you aware of any potential issues with this approach from an SEO point of view?
-
Happy to help!
-
Thanks Tom!
As always, an amazing response.
Best
-
Hi Chris, sorry for the late reply. Absolutely, you can do this by using a plugin, Cloudflare, or PHP code:
- https://wordpress.org/plugins/wp-deferred-javascripts/
- https://wordpress.org/plugins/defer-css-addon-for-bwp-minify/
Another plugin that does the same job, while providing an administration area to configure it manually, is Autoptimize, which lets you define specific CSS independently of your theme's stylesheet:
- http://www.oxhow.com/optimize-defer-javascript-wordpress/
- https://seo-hacker.com/optimizing-site-speed-asynchronous-deferred-javascript/
- http://www.laplacef.com/how-to-defer-parsing-javascript-in-wordpress/
The solution to this problem is removing those render-blocking scripts. But if you simply remove them, some plugins may not work properly. So the best solution for smooth rendering is:
1. Remove them from your page source.
2. Use a single script, hosted by Google, as the alternative.
3. Push the new script down to the end of the page (before the closing </body> tag).
Here is how to do it.
Copy the following code and paste it into your theme's functions.php file.
function optimize_jquery() {
    if (!is_admin()) {
        // Remove the locally hosted scripts so they no longer block rendering in the head
        wp_deregister_script('jquery');
        wp_deregister_script('jquery-migrate');
        wp_deregister_script('comment-reply');

        // Match the page protocol when loading the Google-hosted copy
        $protocol = 'http:';
        if (isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] == 'on') {
            $protocol = 'https:';
        }

        // Register the Google-hosted jQuery and load it in the footer (last argument: true)
        wp_register_script('jquery', $protocol . '//ajax.googleapis.com/ajax/libs/jquery/1.9.0/jquery.min.js', false, '1.9.0', true);
        wp_enqueue_script('jquery');
    }
}
add_action('template_redirect', 'optimize_jquery');

Save the file and you are done. Now recheck the source of any page: those scripts are gone from the head section, and the Google-hosted jQuery script is referenced at the end of the page instead.
That’s all! Now the visible section of your page will be rendered smoothly.
Defer Loading JavaScript
Another suggestion from the Google PageSpeed tool is to defer JavaScript. This issue comes up when you use inline JavaScript, such as the scripts for the Facebook Like box or button, the Google Plus button, the Twitter button, etc. If you defer the JavaScript, those scripts are triggered only after the entire document has loaded.
How to defer JavaScript in WordPress
1. Create a JavaScript file and name it defer.js.
2. Place the JavaScript code that you want to defer into the defer.js file. For instance, if you want to defer the Facebook Like box script, paste the following into that file.
(function(d, s, id) {
    var js, fjs = d.getElementsByTagName(s)[0];
    if (d.getElementById(id)) return;
    js = d.createElement(s);
    js.id = id;
    js.src = "//connect.facebook.net/en_GB/all.js#xfbml=1&appId=326473900710878";
    fjs.parentNode.insertBefore(js, fjs);
}(document, 'script', 'facebook-jssdk'));

3. Save the file and upload it to your theme folder.
4. Now, copy the following code and paste it into the head section of the page. In WordPress, open your theme's header.php file and paste the code just before the closing </head> tag.
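A minimal sketch of what that loader can look like, assuming the widely used pattern of injecting defer.js only after the window's load event (the function name downloadJSAtOnload and the theme path are illustrative):

<script type="text/javascript">
    // Download defer.js only after the rest of the page has finished loading
    function downloadJSAtOnload() {
        var element = document.createElement("script");
        element.src = "/wp-content/themes/theme_name/defer.js"; // adjust to your theme's actual path
        document.body.appendChild(element);
    }
    // Attach to the load event in a cross-browser way
    if (window.addEventListener) {
        window.addEventListener("load", downloadJSAtOnload, false);
    } else if (window.attachEvent) {
        window.attachEvent("onload", downloadJSAtOnload);
    } else {
        window.onload = downloadJSAtOnload;
    }
</script>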
Make sure to use the correct path to defer.js. For example, the source path should look like this:
/wp-content/themes/theme_name/defer.js
I hope that helps,
Tom
-
Happy I could help!
-
Thomas,
Can this be implemented on a WordPress site?
Apologies for hijacking!
-
What a great response! Just what I was looking for. Thank you!
-
Lazy loading images is not as good as deferring them, because lazy loading can introduce JavaScript issues that simply do not occur if you defer the images instead.
If you defer images you will have an easier time, and the method discussed here does not hurt search engine optimization. In fact it helps, because an increased load speed, or what people perceive as an increased load speed, always benefits the end user.
Here is the best way
https://www.feedthebot.com/pagespeed/defer-images.html
This is where we defer the images without lazy loading
In the scenario of a one-page template, there is no reason to do all the things that lazy loading does (observe, monitor and react to a scroll position).
Why not just defer those images and have them load immediately after the page has loaded?
How to do it
To do this we need to mark up our images and add a small and extremely simple piece of JavaScript. I will show the method I actually use for this site and others. It uses a base64 image, but do not let that scare you.
The HTML
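A sketch of what that markup can look like, assuming the feedthebot-style approach: a standard 1x1 transparent GIF placeholder sits in src, while the real file waits in a data-src attribute (the image path and alt text here are illustrative):

<!-- Tiny base64 placeholder in src; the real image waits in data-src until the page has loaded -->
<img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
     data-src="/images/below-the-fold-photo.jpg"
     alt="Descriptive alt text">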
The JavaScript
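And a minimal sketch of the accompanying script, assuming the same pattern: once the window has loaded, copy each data-src into src so the real images download only after the page has rendered (the function name initDeferredImages is illustrative):

<script>
    // After the page has loaded, swap data-src into src for every deferred image
    function initDeferredImages() {
        var imgDefer = document.getElementsByTagName('img');
        for (var i = 0; i < imgDefer.length; i++) {
            if (imgDefer[i].getAttribute('data-src')) {
                imgDefer[i].setAttribute('src', imgDefer[i].getAttribute('data-src'));
            }
        }
    }
    window.addEventListener('load', initDeferredImages);
</script>

Place the script near the end of the page so it does not block rendering of the content above it.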
-
I have looked for information on this in the past and come up empty-handed. With page speed, Google really pits you against best SEO practices; I think if you follow most of the PageSpeed Insights suggestions, you can severely limit your SEO. How many images are you talking about, and how does Google render the page in Fetch as Google?