Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Does the use of sliders for on-page text affect SEO in any way?
-
The concept of using text sliders on an e-commerce site as a solution for placing SEO text above or in between products, and high up on the page, seems too good to be true... or is it?
How would a text slider for an FAQ or other on-page text, done with sliding paragraphs (similar to, but not exactly, this code: http://demo.tutorialzine.com/2010/08/dynamic-faq-jquery-yql-google-docs/faq.html), affect on-page SEO? Does Google consider it hidden text?
Would there be any other concerns or best practices with this design concept?
-
Fredrik,
This is very helpful and gives me a clearer understanding of how to make this work properly. The example was just that, and meant to explain basic functionality. We'll make sure we end up using an indexable, HTML-based version.
Many thanks for your advice.
ron
-
Hi Ron
As Paul stated, there are many ways of doing sliders. Most of the new sliders out there do work with JavaScript, but they often use already-loaded DOM elements for the slides. That means the actual content is in the HTML and the JavaScript is only used to animate or style it. This content would then be indexed just as a normal div would.
You can also use http://www.seobrowser.com/ (the simple option is free) to see the page as Google would see it. If you can then read your content, it should be possible to index it.
One thing to think of is that sliders, as the name implies, often contain more than one slide. If the slider has headings in it, it might be a good idea to make the first heading an H1 and the headings of secondary slides H2s. This way you can place your most important content in the first slide.
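To make that concrete, here is a rough sketch of what indexable slider markup could look like. The class names, headings and copy are purely illustrative and not taken from any particular slider plugin:

<!-- All slides exist in the HTML on page load; a script only animates between them. -->
<div class="slider">
  <div class="slide">
    <h1>Handmade leather bags</h1>
    <p>The most important copy sits in the first slide, readable by crawlers.</p>
  </div>
  <div class="slide">
    <h2>Care and delivery FAQ</h2>
    <p>Secondary copy in later slides uses lower-level headings.</p>
  </div>
</div>

A crawler that never executes the script still sees every slide's text and headings, because they are all in the initial HTML.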
Not sure if you use jQuery, but if you do, http://jquerytools.org/ offers great power and flexibility. Please note that I am NOT connected to them, nor do I work for them. We have just used their scripts on various of our projects.
I had a quick look at your example and unfortunately that would have a very hard time getting indexed, since the content is in the JavaScript. I would consider putting all the content in the HTML and then just hiding and showing sections using jQuery instead.
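As a rough illustration of that approach (the markup, selectors and text here are made up for the example, not taken from your page), the FAQ content stays in the HTML and jQuery only toggles its visibility:

<!-- The question and answer are both in the HTML, so they can be crawled. -->
<div class="faq-item">
  <h3 class="faq-question">How do I track my order?</h3>
  <div class="faq-answer"><p>The answer text lives in the markup, not in a script.</p></div>
</div>

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script>
  // Collapse every answer on load, then slide it open or closed when its question is clicked.
  $(function () {
    $('.faq-answer').hide();
    $('.faq-question').click(function () {
      $(this).next('.faq-answer').slideToggle();
    });
  });
</script>

Any similar pattern works - the key point is that the text is present in the initial HTML and the script only controls its visibility.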
Have a great day and good luck
Fredrik
-
Hi Paul,
Thank you so much for the detailed answer. Deep down I worried this might be the case.
The truth is that the text in question is pretty much there for SEO reasons only. Do you know of a better way, or another kind of script, that would allow the text to be indexed?
Ron
-
The answer is that it actually depends very much on exactly what kind of coding is used to accomplish the effect, Ron.
In most cases, this kind of slider effect is accomplished using some variation of JavaScript. While Google has said it is "trying" to have its crawlers recognize text from scripts, it almost never works that way.
So it won't be flagged as "hidden" text, because in fact Google won't even consider it to exist on the page.
An easy way to test is to view the source for the page in question - you'll see that none of the slider text actually exists anywhere in the code.
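For contrast, this hypothetical snippet shows the kind of pattern being described: the visible text is assembled entirely by JavaScript, so viewing the raw HTML source shows nothing but an empty container.

<!-- Nothing indexable here: the container is empty until the script runs. -->
<div id="faq-text"></div>
<script>
  var paragraphs = [
    'Shipping usually takes 3 to 5 business days.',
    'Returns are free within 30 days.'
  ];
  var html = '';
  for (var i = 0; i < paragraphs.length; i++) {
    html += '<p>' + paragraphs[i] + '</p>';
  }
  document.getElementById('faq-text').innerHTML = html;
</script>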
For the ultimate example of this - go into Google Webmaster Tools and use the Fetch as Googlebot tool to fetch the page. Then you'll see exactly the content that googlebot will see. It won't see the text, therefore it can't index and rank it. Ergo, no SEO benefit at all.
Where you could get into trouble is if you did have text on the page designed to make googlebot think the page is about one thing, while using this kind of scripted text to try to show the visitor something completely different and unrelated. Google could then suspect you of cloaking and penalize accordingly. (Cloaking is when you intentionally show googlebot one thing and the user something different, for nefarious purposes.)
But if you're adding the text as a usability enhancement for your visitors in a way that googlebot doesn't happen to understand, you won't get any SEO benefit from it, but you also shouldn't be penalized for it.
Paul