Asynchronous loading of product prices bad for SEO?
-
We are currently looking into improving the time to first byte (TTFB) on our ecommerce site.
A huge improvement would be to load the product prices asynchronously on the product list pages. The product detail page (on which the product is ordered) will be left untouched.
The idea is that all content like product data, images, and other static content is sent to the browser first (the first byte). The product prices depend on a set of user variables like delivery location, VAT inclusive/exclusive, etc., so they would be requested via an AJAX call to reduce the TTFB.
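To make the idea concrete, here is a minimal sketch of the approach, assuming a hypothetical /api/prices endpoint and data-product-id markup (the names are illustrative, not our actual code):

```typescript
// Runs on the product list page after the initial HTML (product data,
// images, static content) has been sent and parsed.
interface PriceResponse {
  productId: string;
  price: string; // pre-formatted server-side, e.g. "€12.50 incl. VAT"
}

async function loadPrices(productIds: string[]): Promise<void> {
  // Delivery location, VAT mode, etc. live in the session, so sending
  // the session cookie is enough to get user-specific prices.
  const response = await fetch(`/api/prices?ids=${productIds.join(",")}`, {
    credentials: "same-origin",
  });
  const prices: PriceResponse[] = await response.json();

  for (const { productId, price } of prices) {
    const el = document.querySelector(`[data-product-id="${productId}"] .price`);
    if (el) el.textContent = price;
  }
}

// Kick off the price request as soon as the static markup is parsed.
document.addEventListener("DOMContentLoaded", () => {
  const ids = Array.from(
    document.querySelectorAll<HTMLElement>("[data-product-id]"),
    (el) => el.dataset.productId ?? ""
  );
  void loadPrices(ids);
});
```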
My question is whether Google considers this black-hat SEO.
-
Thanks for your response. We'll definitely go for this improvement.
But can you please explain what you mean by "an unintuitive UX idea"?
-
I don't see any reason why this would be seen as black hat. On the contrary, I see it as an unintuitive UX idea and you should definitely do it.
The only information you're withholding (and you're not even cloaking it) is a price that depends on a lot of factors. You're not hiding any content or links, so there's no worry there. Even if you were hiding content, it wouldn't be a problem unless it was completely irrelevant and there just to rank the page.
The one effect this could have: if you defer elements on the page to improve time to first byte, Google may not read them as it crawls, so the content it sees on the page may be depleted, affecting your ability to rank. But for something like a deferred price tag, this isn't relevant at all.
I'd say go for it; I think it would be a great idea for user experience.
-
Definitely not black hat, but it could impact SEO and negate any schema markup you have.
I would go to GWT > Crawl > Fetch as Google and see what HTML Googlebot receives.
If all the async elements are there, you should be gravy.
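To illustrate the schema point: if Product structured data is rendered server-side, it will be missing its Offer price until the AJAX call resolves. One option, sketched below with hypothetical selectors and data attributes, is to patch the JSON-LD client-side once the price arrives; whether Googlebot actually picks up the late-injected markup is exactly what the Fetch as Google check would confirm.

```typescript
// Illustrative only: patch server-rendered Product JSON-LD with the
// price returned by the async call, so the structured data stays complete.
function patchOfferPrice(productId: string, price: number, currency: string): void {
  const script = document.querySelector<HTMLScriptElement>(
    `script[type="application/ld+json"][data-product-id="${productId}"]`
  );
  if (!script) return;

  const data = JSON.parse(script.textContent ?? "{}");
  data.offers = {
    "@type": "Offer",
    price: price.toFixed(2),
    priceCurrency: currency, // e.g. "EUR"
  };
  script.textContent = JSON.stringify(data);
}
```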
Related Questions
-
Inbound links to internal search with pharma spam anchor text. Negative SEO attack
Suddenly in October I had a spike in inbound links from forums and spam sites. Each one had set up hundreds of links. The links go to the WordPress internal search. Example: mysite.com/es/?s=⚄
White Hat / Black Hat SEO | Arlinaite470
-
What is the best strategy for SEO on discontinued products on ecommerce sites?
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for over 3 years. We want to clean up the catalog and remove all the old listings, older than 2 years, that do not generate any sales. What is the best practice for removing thousands of listings from an ecommerce site? Do we 404 these products and show similar items? Your help and thoughts are much appreciated.
White Hat / Black Hat SEO | JimJ3
-
Is toggle content good for SEO?
Hi there, I have a client who doesn't want to show his content publicly, so the team decided to use a toggle so that Google can still see the content. But I want to be sure: will Google really cache that content? Will it hurt my website's ranking? Any help would be appreciated, as this is urgent. Thanks in advance, Falguni
White Hat / Black Hat SEO | iepl20010
-
Does type of hosting affect SEO rankings?
Hello, I was wondering whether hosting on shared, versus VPS, versus dedicated servers matters at all for a website's rankings, given that all other factors are exactly equal. I know this is a big question with many variables, but mainly I am wondering whether the real factor is the risk of resource usage: too much traffic could take a site down and make it un-crawlable if that happens at the moment a bot is trying to index the site (factoring out the UX of a downed site). Any and all comments are greatly appreciated! Best regards, Mark
White Hat / Black Hat SEO | uworlds
-
The use of a ghost site for SEO purposes
Hi guys, we have just taken on a new client (.co.uk domain) and during our research have identified that they also have a .com domain which is a replica of the existing site, but with all links leading to the .co.uk domain. As a result, the .com replica is pushing 5,000,000+ links to the .co.uk site. After speaking to the client, it appears they were approached by a company who said they could get the .com site ranking for local search queries and then push all that traffic to the .co.uk. From analytics we can see that very little referrer traffic is coming from the .com. It sounds remarkably dodgy to us: surely the duplicate site is an issue anyway for obvious reasons, and these links could also be deemed to have been created for SEO gain? Does anyone have any experience of this as a tactic? Thanks, Dan
White Hat / Black Hat SEO | SEOBirmingham810
-
Unique page URLs and SEO titles
www.heartwavemedia.com / WordPress / All in One SEO Pack. I understand Google values unique titles and content, but I'm unclear on the difference between changing the page URL slug and the SEO title. For example: I have an about page with the URL "www.heartwavemedia.com/about" and the SEO title "San Francisco Video Production | Heartwave Media | About". I've noticed some of my competitors using URL structures more like "www.competitor.com/san-francisco-video-production-about". Would it be wise to follow their lead? Will my landing page rank higher if each subsequent page uses a similar keyword-packed, long-tail URL, or is that considered black hat? If advisable, would a URL structure that includes "san-francisco-video-production-_____" be seen as too similar even if it varies by one word at the end? Furthermore, will I be penalized for using similar SEO descriptions, i.e. "San Francisco Video Production | Heartwave Media | Portfolio" and "San Francisco Video Production | Heartwave Media | Contact", or is the difference of one word ("portfolio" vs. "contact") sufficient to read as unique? Finally... am I making any sense? Any and all thoughts appreciated.
White Hat / Black Hat SEO | keeot0
-
Finding and removing bad backlinks
OK, here goes. Over the past 2 years our traffic and rankings have slowly declined, most importantly for keywords we ranked #1 and #2 on for years. With the Penguin updates this year we never saw a huge drop, just a constant slow loss. My boss has tasked me with cleaning up our bad links and reshaping our link profile so that it is cleaner and more natural. I currently have access to Google Analytics and Webmaster Tools, SEOmoz, and Link Builder.
1. What is the best program or process for identifying bad backlinks, and what exactly am I looking for? Too many links from one domain? Links from low-PR or low "trust URL" sites? I have read conflicting information on this, with some saying that too many good (high-PR) links can look unnatural without some lower-PR links mixed in, so I want to make sure I am not asking for links to be removed that we need to maintain our link profile.
2. What is the best program or process for viewing our link profile, and what exactly am I looking for? What constitutes a healthy link profile after the recent Google algorithm updates, and what is the best way to change ours?
3. Where do I start with this task: remove spammy links first, or figure out our profile first and then go after bad links?
4. We have quite a few (1,000+) backlinks to the old .aspx pages we moved off our old platform 2 years ago. Some of these pages were redirected, and some of the redirects were broken at some point. Is there any residual juice in these backlinks? Should we fix the broken redirects, or does it do nothing? My boss says the redirects won't do anything now that Google no longer indexes the old pages, but other people have said differently. What's the deal: should we still fix the redirects even though the pages are no longer indexed?
I really appreciate any advice, as basically if we can't get our site and sales turned around, my job is at stake. Our site is www.k9electronics.com if you want to take a look. We just moved hosts, so there are some redirect issues and other things going on that we know about.
White Hat / Black Hat SEO | k9byron0
-
Negative SEO - case studies prove results. De-rank your competitors
Reading these two articles made me feel sick. People are actually offering a service to de-rank a website. I could have sworn I heard Matt Cutts say this was not possible; well, the results are in. This really opens up a whole new can of worms for Google. http://trafficplanet.com/topic/2369-case-study-negative-seo-results/ http://trafficplanet.com/topic/2372-successful-negative-seo-case-study/ This is only going to get worse, as news like this will spread like wildfire. In one sense it's good these people have done this to prove it to Google; it's just a pity they did it on real businesses that rely on traffic.
White Hat / Black Hat SEO | dean19860