JavaScript to manipulate Google's bounce rate and time on site?
-
I was referred to this "awesome" solution to high bounce rates.
It is supposed to "fix" bounce rates and lower them with this simple script. The claim is that once the bounce rate goes way down, rankings dramatically increase (an interesting study, but not my question).
I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me.
Can someone with experience in JS help me by explaining what this script does?
I think it manipulates the data it reports to GA, but I'm not sure. The idea was to place it in the footer of the page, then sit back and watch the dollars fly in.
-
Stephen,
Thanks for the explanation - I just had a client ask me about this script. Based on your explanation, this script will change your bounce rate. This is because once the event is triggered, the visit will no longer be considered a bounce, even if the user only visits one page. So it's an artificial/false decrease in bounce rate, not a "fix" as others claim.
I wrote a short blog post on this (and referenced your description)!
~Adam
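For what it's worth, the side effect Adam describes comes from the fact that a classic GA event counts as an interaction by default. The legacy ga.js event call accepts an optional non-interaction flag as its last argument; when that flag is true, the event is recorded without marking the visit as engaged, so the bounce calculation is left alone. A minimal sketch of that variant (the category, action, and values here are made up for illustration, not taken from the script in question):

```javascript
// Minimal sketch of a non-interaction event using the legacy async ga.js queue.
// Assumes the standard Google Analytics (ga.js) snippet is already on the page.
// The final "true" is the non-interaction flag: the event is logged, but the
// visit can still be counted as a bounce.
window._gaq = window._gaq || [];
window._gaq.push(['_trackEvent', 'Engagement', 'HeartbeatPing', '0:10', 10, true]);
```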
-
Thanks for the encouragement Martin.
As it turns out, with the help of the two previous answers, the script is actually based on a valid adjustment that might genuinely help some people with their reports. What my client thought, though, was that this was an easy, quick way to get more traffic. The article they found claimed it would dramatically change the results in GA and then directly affect their site's ranking in the SERPs.
They had "proof" in the form of some GA screenshots, so I needed more information on what the script actually does. I was able to let my client know exactly what this was and recommend against doing it unless there was a problem in the GA reports that they wanted fixed.
Thanks again for your reply.
-
Don't do it - just improve your content. You know it's wrong to try to cheat the system. Think about what would happen if you were banned from the results.
Look, I don't mean to be harsh, but I always balance risks against rewards. In this situation, the risk is too high.
-
Thanks for that link.
The site my client referred me to (link in the previous reply) was manipulating the way they reported the results. The closer I looked at it, the more I realized it was just a small spike that went right back down. Knowing them, they probably just paid a bunch of people to visit the site.
This stuff is annoying and gives us SEOs a bad name.
-
The code was from this site: http://millionairevolution.com/cut-bounce-rate-by-80/. Looking at the dates and the analytics shown on the page, this is nothing more than a misrepresentation of the facts and data.
I knew Google doesn't use data from GA, but the graph on that page seemed to contradict that, and I didn't know exactly what the script was doing.
-
First, Google Analytics reporting does not, to my knowledge, influence SERP rankings. Altering the data collected through Google Analytics should not affect SEO indicators.
Second, this is from here: http://briancray.com/posts/time-on-site-bounce-rate-get-the-real-numbers-in-google-analytics/
Once this code is installed, your site will update Google Analytics every 10 seconds under the Event Category "Time", the Event Action "Log", and the Event Value will be based on the pattern of 0:10, 0:20, 0:30, 0:40, 0:50, 1:00, 1:10, etc.
The script does not change your bounce rate; it just gives you additional information.
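For anyone wondering what a script like that typically looks like, here is a rough sketch using the legacy async ga.js queue (_gaq). It is not the exact code from the linked post, just an illustration of the mechanism: a timer that pushes an event every 10 seconds with a running time string in the 0:10, 0:20, 0:30 pattern.

```javascript
// Rough sketch only - not the exact script from the linked post.
// Assumes the standard Google Analytics (ga.js) snippet is already on the page.
(function () {
  var seconds = 0;

  window.setInterval(function () {
    seconds += 10;

    // Format the elapsed time as m:ss to produce 0:10, 0:20, ... 1:00, 1:10, etc.
    var minutes = Math.floor(seconds / 60);
    var rest = seconds % 60;
    var label = minutes + ':' + (rest < 10 ? '0' : '') + rest;

    // Push a "Time" / "Log" event to Google Analytics every 10 seconds.
    window._gaq = window._gaq || [];
    window._gaq.push(['_trackEvent', 'Time', 'Log', label]);
  }, 10000);
})();
```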
-
You're correct that it's a GA hack. Avoid it.
Google has publicly stated that they don't use your site-specific GA metrics to influence organic search rankings. For example, they're not taking data from your GA profile and feeding it to the Search Quality team to determine whether your site should rank better or worse. They have MANY better ways to accurately track anonymous user interactions with sites at scale (e.g. Chrome).
The only thing you'll accomplish with this code is turning all of your own internal metrics into garbage. Accurate metrics are important. If your bounce rate is high, knowing that allows you to take action to improve your site and reduce it.
The more people who stay on your site for more than 1 pageview, the more money your business is likely to make. Improve your bounce rate to improve the profitability of your website, not for some supposed correlation between bounce rate and organic search ranking.