Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Google Analytics: how to filter out pages with low bounce rate?
-
Hello here,
I am trying to find out how I can filter out pages in Google Analytics according to their bounce rate.
The way I am doing it now is the following:
1. I am working inside the Content > Site Content > Landing Pages report
2. Once there, I click the "advanced" link on the right of the filter field.
3. Once there, I set the filter to "include" "Bounce Rate" "Greater than" "0.50", which should show me which pages have a bounce rate higher than 0.50%.... instead I get the following warning on the graph:
"Search constraints on metrics can not be applied to this graph"
I am afraid I am using the wrong approach... any ideas are very welcome!
Thank you in advance.
-
Thank you Mark! Yes, I knew about that option and it helps a great deal!
I appreciated your help. Thanks!
-
Thank you Lynn, and yes, sorry, I meant 50%, not 0.50% (I got confused with the conversion rate). I also hadn't noticed that the data under the graph was actually updated; it looks like it is just the graph that doesn't show that kind of filtered data (too bad!).
Thank you again, I appreciated your help!
-
I would also look into sorting the data by the metrics you want, and using weighted sort instead of the default. Weighted sort takes other metrics into account as well, so when you sort by bounce rate it doesn't just put pages with a 100% bounce rate at the top (a page with only one visit is skewed), but gives you a much better idea of which pages are performing poorly while actually getting visits.
You can read more about weighted sort here on the GA blog - http://analytics.blogspot.co.il/2010/08/introducing-weighted-sort.html
Hope this helps,
Mark
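Google doesn't publish the exact formula behind weighted sort, but the idea Mark describes resembles shrinkage toward the site-wide mean: low-traffic pages get pulled strongly toward the average, high-traffic pages keep roughly their observed rate. A minimal illustrative sketch (the `weight` pseudo-visit constant and the numbers are assumptions, not GA's actual parameters):

```python
def weighted_bounce_rate(visits, bounce_rate, site_mean, weight=100):
    # Shrink a page's observed bounce rate toward the site-wide mean.
    # `weight` acts like a number of "pseudo-visits" at the mean rate:
    # a page with 1 real visit is dominated by the mean, while a page
    # with thousands of visits keeps roughly its observed rate.
    return (visits * bounce_rate + weight * site_mean) / (visits + weight)

pages = [
    ("/one-hit-wonder", 1, 100.0),   # 1 visit, 100% bounce
    ("/busy-page", 5000, 80.0),      # 5000 visits, 80% bounce
]
site_mean = 55.0

# A plain sort by bounce rate would rank /one-hit-wonder first;
# the weighted sort surfaces the page that actually gets traffic.
ranked = sorted(
    pages,
    key=lambda p: weighted_bounce_rate(p[1], p[2], site_mean),
    reverse=True,
)
print([name for name, _, _ in ranked])  # → ['/busy-page', '/one-hit-wonder']
```

With these numbers the single-visit page's 100% rate shrinks to about 55.4%, while the busy page stays near 79.5%, so the busy page ranks first.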
-
Hi Fabrizio,
If you put 50 in the 'bounce rate greater than' box instead of 0.5, then the table shown below the graph shows the data you want (only bounce rates over 50%, which I think is what you are after, right?). I guess the graph cannot show this filtered data, although you can click the 'select a metric' link next to the visits dropdown at the top left of the graph and add bounce rate to see the average bounce rate by visits per day, if that helps with getting a baseline.
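If the in-report filter still gets in the way, the same over-50% threshold is easy to reproduce offline on an exported report. A minimal sketch, assuming you export the Landing Pages table; the key names and sample values below are illustrative, not GA's actual export headers:

```python
# Reproduce GA's "Bounce Rate greater than" advanced filter on exported
# report rows. Note the threshold is a percentage (50), not a fraction
# (0.50) -- the same pitfall discussed in the thread above.
def filter_by_bounce_rate(rows, threshold_pct):
    """Return rows whose bounce rate exceeds threshold_pct (0-100)."""
    return [row for row in rows if row["bounce_rate"] > threshold_pct]

pages = [
    {"landing_page": "/home",    "visits": 1200, "bounce_rate": 38.2},
    {"landing_page": "/pricing", "visits": 300,  "bounce_rate": 61.5},
    {"landing_page": "/blog",    "visits": 950,  "bounce_rate": 72.0},
]

high_bounce = filter_by_bounce_rate(pages, 50)
for page in high_bounce:
    print(page["landing_page"], page["bounce_rate"])
```

Passing 0.5 instead of 50 would match every page here, which mirrors why the 0.50 threshold in the original filter seemed to do nothing useful.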