eCommerce Filtering's Effect on SEO
-
I'm building an eCommerce website which has an advanced filter on the left-hand side of the category pages.
It allows users to tick boxes for colours, sizes, materials, and so on. When they've made their choices, they submit (this will likely be an AJAX thing in a future release, but isn't at the time of writing).
The new filtered page has a new URL, which is made up of the IDs of the filters they've ticked - it's a bit like /department/2/17-7-4/10/
My concern is that the filtered pages are, for the most part, going to be the same as the parent, which may lead to duplicate content.
My other concern is that these two URLs would lead to the exact same page (although the system would never generate the 'wrong' URL)
- /department/2/17-7-4/10/
- /department/2/**10**/17-7-4/
But I can't think of a way of canonicalising that automatically.
Tricky.
So the meat of the question is this: should I worry about this causing SEO issues, or can I trust Google to work it out?
-
Andie -
We work on a lot of eCommerce sites with similar left-hand navigation filters.
I think the thing to keep in mind is that these pages are often like search results pages, and it takes a human choosing options to create those URLs. As a result, they shouldn't be pages that a typical crawl bot would find on its own.
That said, each eCommerce system acts differently, and it's possible that permanent links are being created and added to a sitemap. Or it's possible that Google's bots are starting to check boxes on eCommerce filters to better mimic human behavior. After all, Google has created self-driving cars.
The data-driven approach: I would check whether any of these pages are showing up in Google Webmaster Tools, to see if it is indeed an issue, before going to great lengths over duplicate content.
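If any are showing up, one way to deal with the two-orderings problem is to normalise the order of the filter segments server-side and point a rel="canonical" at that normalised URL, so both orderings resolve to a single page. This is only a rough sketch (the URL shape follows the example in the question; the helper names and domain are made up rather than taken from any particular platform):

```typescript
// Rough sketch: normalise the filter segments so that /department/2/17-7-4/10/
// and /department/2/10/17-7-4/ share one canonical URL. The URL shape comes from
// the question; everything else here is illustrative.

/** Each path segment after the department ID is a group of filter IDs joined by "-". */
function canonicalFilterPath(departmentId: number, filterGroups: number[][]): string {
  const segments = filterGroups
    .map(group => [...group].sort((a, b) => a - b).join("-")) // stable order inside a group
    .sort();                                                  // stable order between groups
  return `/department/${departmentId}/${segments.join("/")}/`;
}

/** The <link rel="canonical"> tag to emit in the <head> of every filtered page. */
function canonicalLinkTag(departmentId: number, filterGroups: number[][]): string {
  const href = `https://www.example.com${canonicalFilterPath(departmentId, filterGroups)}`;
  return `<link rel="canonical" href="${href}" />`;
}

// Both orderings collapse to the same canonical URL:
console.log(canonicalFilterPath(2, [[17, 7, 4], [10]])); // /department/2/10/4-7-17/
console.log(canonicalFilterPath(2, [[10], [17, 7, 4]])); // /department/2/10/4-7-17/
```

Using the same normalised form anywhere the site itself links to a filtered view also means only one version of each filtered URL ever gets exposed in the first place.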
Hope this helps,
-- Jeff
Related Questions
-
Is the NitroPack plugin Black Hat SEO for speed optimization?
We are getting ready to launch our redesigned WP site and were considering using the NitroPack performance optimization plugin, until some of our developers started raising the alarm. Here is what some in the SEO community are saying about the tool: the rendering of a website made with the NitroPack plugin in the Page Metric Test Tools is based entirely on the inline CSS and JS in the HTML file, without taking into account the numerous additional CSS or JS files loaded on the page. As a result, the final metric score does not include the evaluation and parsing of CSS and JavaScript files. So what they are saying is that a lot of websites with the NitroPack plugin never become interactive in the Page Metric Tools, because all interactivity is derived from JavaScript and CSS execution, so their "Time to Interactive" and "Speed Index" should be reported as equal to infinity. Would Google consider this Black Hat SEO and start serving manual actions to sites using NitroPack? We are not ready to lose our hard-earned Google ranking. Please let me know your thoughts on the plugin. Is it simply JS and CSS "lazy loading" that offers the first real-world implementation that works and yields fantastic results, or is it truly a Black Hat attempt at cheating Google PageSpeed Insights numbers? Thank you!
On-Page Optimization | opiates
-
Does blogging with a wysiwyg negatively affect SEO (vs. hand coding)?
Many bloggers use a wysiwyg editor to write posts. Are there any drawbacks to wysiwyg vs plain text? When I write blogs I prefer to hand code my text to be sure everything is optimized. My feeling is that wysiwyg leads to code bloat and generally fewer optimization opportunities. I have no real evidence. Is there any reason not to use the wysiwyg editor?
On-Page Optimization | Jason-Rogers
-
Can "window.location" javascript on homepage affect seo?
Hi! I need to add a splash page to my WordPress site. I use "window.location" JavaScript on the homepage to redirect to the splash page (controlled by a cookie so it only redirects on the first visit). Can this technique affect the homepage's SEO? Thanks in advance!
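The pattern described above is roughly the following (a simplified sketch; the cookie name, cookie lifetime, and splash-page URL are placeholders, not the real site's values):

```typescript
// Simplified sketch of a cookie-controlled splash redirect; cookie name,
// lifetime, and splash URL are placeholders.

function hasSeenSplash(): boolean {
  return document.cookie.split("; ").some(c => c.startsWith("splash_seen="));
}

function redirectFirstVisitToSplash(): void {
  if (!hasSeenSplash()) {
    // Remember the visit for 30 days so the redirect only fires once per visitor.
    document.cookie = `splash_seen=1; max-age=${60 * 60 * 24 * 30}; path=/`;
    window.location.href = "/splash/";
  }
}

// Note: a client that doesn't persist cookies (which includes most crawlers)
// will be redirected on every request it makes to the homepage.
redirectFirstVisitToSplash();
```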
On-Page Optimization | StudioCiteroni
-
Using Escaped Fragments with SEO
Our e-commerce platform is in the process of changing to what we call app-based stores (essentially running in the browser as a single-page web app). These new stores are being built in HTML5 and use escaped fragments.
Currently, merchants usually run two stores until we launch the app site at 100%. My questions really concern the app stores, which right now sit on a subdomain but will essentially take over the primary domain. Here are examples: app.tikimater.com and app.sportsworld.com. Since I am not a developer, I'm really having a hard time understanding escaped fragments. I'm using this guide, https://developers.google.com/webmasters/ajax-crawling/docs/getting-started, but I'm not sure what my actual URLs should look like or what the canonical should be set to. Right now they have been removed, but previously they had http:app.tikimaster.com#!v=1. Also, how should I be setting up my meta information for Google so that 1) pages are indexed in a timely manner and 2) pages are indexed with the correct information? I am still setting the meta titles and descriptions, but in some instances Google uses other info. With the new platform we are moving away from on-page content (written paragraphs), but category pages would have related products embedded. Should I still be pushing to have some type of intro text, since it would be solely for SEO and not the shopper's experience? All product pages have content (product description etc.). Thank you for any advice.
On-Page Optimization | marketing_zoovy.com
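For reference, under the AJAX crawling scheme covered by the guide linked above (a scheme Google has since deprecated), a "#!" URL is requested by the crawler with the fragment moved into an _escaped_fragment_ query parameter. A hypothetical sketch of that mapping, using a URL like the example in the question:

```typescript
// Hypothetical sketch of the URL mapping in the AJAX crawling scheme:
// the crawler fetches a "#!" URL with the fragment moved into an
// _escaped_fragment_ query parameter.

function escapedFragmentUrl(prettyUrl: string): string {
  const [base, fragment = ""] = prettyUrl.split("#!");
  // Certain special characters in the fragment are percent-escaped by the
  // crawler; a simple key=value fragment passes through unchanged.
  return `${base}?_escaped_fragment_=${fragment}`;
}

console.log(escapedFragmentUrl("http://app.tikimaster.com/#!v=1"));
// -> http://app.tikimaster.com/?_escaped_fragment_=v=1
// The server is expected to answer that request with a fully rendered HTML
// snapshot of what the "#!" URL shows in the browser.
```
-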
Does 'XXX' in a domain get filtered by Google?
I have a friend who has xxx in their domain. They are a religious-based sex/porn addiction company, but they don't show up for the queries they are optimized for. They have a 12+ year old domain and all good health signs in quality links and press from trusted companies. Google sends them adult traffic, mostly 'trolls', and not the users they are looking for. Has anyone experienced domain word filtering, and do you have a workaround or solution? I posted in the Google Webmaster help forums and that community seems a little 'high on their horses' and is trying too hard to be cool. I am not too religious and don't necessarily support the views of the website; I'm just trying to help a friend of a friend with a topic that I have never encountered. Here is the URL: xxxchurch.com Thanks, Brian
On-Page Optimization | Add3.com
-
Analyzing word count for on-page SEO
Hey guys, quick question: when I am analyzing/doing a word count for a particular keyword and want to make sure I am nowhere near keyword stuffing, does Google consider the keywords in image alt and title tags as part of the keyword count when looking for on-page keyword stuffing? For example, let's say I have a page I just created with 1,000 words, and only 2 of the words are my target keywords. If I then add a picture and add the keyword to the image's alt tag, title tag, and description, does Google now consider the page to have a total of 5 keywords? Also, a lot has changed recently since Penguin and Panda; is there a good rule of thumb for what keyword-to-text ratio to stay under?
On-Page Optimization | david305
-
Are blank Product Review pages bad for SEO?
Hi there, I'm running a new e-commerce site (BoatOutfitters.com) and have a question about our product review pages. On our current campaign, we have a lot of duplicate page content errors. When we export the data, it's almost all blank product review pages (since we are new, we don't have that many product reviews yet). Our product reviews aren't run through javascript, so we originally did not add them to a robots.txt file - however, I'm now wondering if it's worse to have all of these duplicate blank pages, or is it not affecting our SEO at all? Should we just wait until these products have reviews which will benefit our SEO and then they won't be considered "duplicate pages" - right? Sorry if this has been answered before - new here at SEO Moz and just looking for some help. Thanks!
On-Page Optimization | BoatOutfitters
-
Best SEO structure for blog
What is the best SEO page/link structure for a blog with, say, 100 posts that grows at a rate of 4 per month? Each post is 500+ words with charts/graphics; they're not simple one-paragraph postings. Rather than use a CMS, I have a hand-crafted HTML/CSS blog (for tighter integration with the parent site, some dynamic data effects, and in general to have total control). I have a sidebar with headlines from all prior posts, and my blog home page is a one-line summary of each article. I feel that after 100 articles the sidebar and home page have too many links on them. What is the optimal way to split them up? They are all covering the same niche topic that my site is about. I thought of making the sidebar and home page show only the most recent 25 postings, and then creating an archive directory for older posts. But categorizing by time doesn't really help someone looking for a specific topic. I could tag each entry with 2-3 keywords and then make the sidebar a sorted list of tags. Clicking on a tag would then show an intermediate index of all articles that have that tag, and then you could click on an article title to read the whole article. Or is there some other strategy that is optimal for SEO and the indexing robots? Is it bad to have a blog that is too hierarchical (where articles are 3 levels down from the root domain) or too flat (if there are 100s of entries)? Thanks for any thoughts or pointers.
On-Page Optimization | scanlin