Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Asynchronous loading of product prices bad for SEO?
-
We are currently looking into improving our TTFB on our ecommerce site.
A huge improvement would be to load the product prices asynchronously on the product list pages. The product detail page (on which the product is ordered) will be left untouched.
The idea is that all content, like product data, images and other static content, is sent to the browser first (first byte). The product prices depend on a set of user variables like delivery location, VAT inclusive/exclusive, etc., so they would be requested via an AJAX call to reduce the TTFB.
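A minimal sketch of what that deferred price fetch might look like in the browser. The `/api/prices` endpoint, its query parameters, and the response shape are all hypothetical; the point is only that the static product list ships in the first response and prices are merged in afterwards:

```javascript
// Sketch of deferring prices to an AJAX call (endpoint name, parameters
// and response format are hypothetical -- adapt to your own pricing API).

// Build the price-request URL from the user variables the prices depend on.
function buildPriceUrl(baseUrl, userVars) {
  const params = new URLSearchParams({
    location: userVars.deliveryLocation,
    vat: userVars.vatInclusive ? "incl" : "excl",
  });
  return `${baseUrl}?${params.toString()}`;
}

// Merge prices returned by the API into the product objects that were
// already rendered with the first (fast) response.
function applyPrices(products, prices) {
  return products.map((p) => ({ ...p, price: prices[p.sku] ?? null }));
}

// After the static HTML has been sent, the browser fills in the prices:
async function loadPrices(products, userVars) {
  const res = await fetch(buildPriceUrl("/api/prices", userVars));
  const prices = await res.json(); // e.g. { "SKU-1": 19.95, ... }
  return applyPrices(products, prices);
}
```

The TTFB win comes from the server no longer having to resolve per-user pricing before it can start streaming the page.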
My question is whether Google would consider this black hat SEO or not.
-
Thanks for your response. We'll definitely go for this improvement.
But can you please explain what you mean by "an unintuitive UX idea"?
-
I don't see any reason why this would be seen as black hat. On the contrary, I see it as an unintuitive UX idea and you should definitely do it.
The only information you're withholding (and you're not even cloaking it) is a price that depends on a lot of factors. You're not hiding any content or links, so there's no worry there. Even if you were hiding content it wouldn't be a problem, unless it was completely irrelevant and only there to rank the page.
The only effect this could have is that if you're deferring elements on the page to improve Time To First Byte, Google may not read them as it crawls, so the content it sees on the page may be depleted, affecting your ability to rank the page. But for something like deferring a price tag, this isn't relevant at all.
I'd say go for it. I think it would be a great idea for user experience.
-
Definitely not black hat, but it could impact SEO and negate any schema markup you have.
I would go to GWT > Crawl > Fetch as Google and see what HTML is received by Googlebot.
If all the async elements are there, you should be gravy.
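On the schema markup point: one way to keep structured data intact while the visible price loads asynchronously is to render the schema.org Product/Offer JSON-LD server-side, if a default price (say, the VAT-exclusive list price) can be computed without the per-user variables. A hedged sketch; the product values and helper name are made up:

```javascript
// Sketch: generate schema.org Product/Offer JSON-LD on the server so the
// structured data survives even when the visible price element is filled
// in later via AJAX. Values are illustrative only.
function productJsonLd(product) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Product",
    name: product.name,
    sku: product.sku,
    offers: {
      "@type": "Offer",
      price: product.price.toFixed(2), // default/list price, not per-user
      priceCurrency: product.currency,
      availability: "https://schema.org/InStock",
    },
  });
}

// Embed the result in the initial HTML response, e.g.:
// <script type="application/ld+json">{ ...output of productJsonLd... }</script>
```

That way the crawler always sees an Offer in the first byte of HTML, regardless of whether it executes the deferred price request.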