Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Have you seen an increase in your Rankings after improving your TTFB rate?
-
After reading the Post by ZOOMPF http://a-moz.groupbuyseo.org/blog/improving-search-rank-by-optimizing-your-time-to-first-byte I was wondering if any of you have optimized your TTFB and noticed improvements in rankings. Have you optimized and not noticed any differences?
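For anyone who wants to put a number on their own TTFB before and after optimizing, here is a minimal sketch of measuring it from Python over plain HTTP using only the standard library. The `measure_ttfb` helper is my own illustration, not an official tool; real measurement tools (curl's `time_starttransfer`, WebPageTest) also break out DNS and TLS time, which this sketch ignores:

```python
import socket
import time
from urllib.parse import urlparse

def measure_ttfb(url, timeout=10):
    """Rough time-to-first-byte for a plain-HTTP URL (illustrative only).

    Sends a minimal GET over a raw socket and times the gap between
    finishing the request and receiving the first response byte.
    """
    parsed = urlparse(url)
    host = parsed.hostname
    port = parsed.port or 80
    path = parsed.path or "/"

    start = time.monotonic()
    sock = socket.create_connection((host, port), timeout=timeout)
    connected = time.monotonic()

    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))
    sent = time.monotonic()

    sock.recv(1)  # block until the very first byte of the response arrives
    first_byte = time.monotonic()
    sock.close()

    return {
        "connect_ms": (connected - start) * 1000,
        "ttfb_ms": (first_byte - sent) * 1000,
    }
```

Run it a handful of times and take the median, since a single sample is noisy (network jitter, server caches warming up).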
-
The best information I came across is what Geoff wrote here: http://a-moz.groupbuyseo.org/blog/site-speed-are-you-fast-does-it-matter
Under "How important is Site Speed?", the two-word summary of Google's (and Matt Cutts') reply is that it's a factor, but a very small one among the many factors involved in rankings.
I'm as interested in this question as you are, but in my own efforts I haven't seen a noticeable change in rankings attributable solely to speed, and that includes TTFB as well as other speed improvements.
Having said that, TTFB and overall speed optimization are important for conversions on your site and ultimately for providing a better user experience, which always matters.
This leads to another question you can ask: did improving TTFB improve conversions, lower bounce rate, etc.? That's also a great question, and one I'm curious about as well. On my end, I haven't had a client who was that worried about it, or whose TTFB was so horrible that I could measure a noticeable difference after fixing it, but I'd be very interested to hear if someone has seen this effect.
Hope this helps.
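On the conversions/bounce-rate question, one simple way to check whether a before/after difference is real rather than noise is a two-proportion z-test on your analytics figures. A minimal sketch in Python with entirely hypothetical numbers (the session and bounce counts below are made up for illustration):

```python
import math

def bounce_rate_z_test(bounces_a, sessions_a, bounces_b, sessions_b):
    """Two-proportion z-test: did the bounce rate change significantly
    between period A (before the TTFB fix) and period B (after)?

    Returns the z statistic; |z| > 1.96 is significant at the 5% level.
    """
    p_a = bounces_a / sessions_a
    p_b = bounces_b / sessions_b
    # Pooled proportion under the null hypothesis of "no change"
    pooled = (bounces_a + bounces_b) / (sessions_a + sessions_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sessions_a + 1 / sessions_b))
    return (p_b - p_a) / se

# Hypothetical example: 4,200 bounces in 10,000 sessions before the fix,
# 3,900 bounces in 10,000 sessions after.
z = bounce_rate_z_test(4200, 10000, 3900, 10000)
print(f"z = {z:.2f}")  # prints z = -4.32; |z| > 1.96, so the drop is unlikely to be noise
```

Even then, correlation isn't causation: seasonality or other site changes can move bounce rate too, so an A/B split (serving the faster configuration to half of traffic) is the cleaner test if you can arrange it.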