What are the potential SEO downsides of using a service like Unbounce for content pages?
-
I'm thinking of using unbounce.com to create some content-driven pages. Unbounce is simple, easy to use, and makes it very easy for non-devs at my company to create variations on pages.
I know it lets you add meta descriptions, title tags, etc., and allows pages to be indexed by Google, but I was wondering whether there are any potential downsides to using Unbounce as opposed to hosting the pages myself.
Any help would be appreciated!
-
Hi,
I'm the person behind SEO at Unbounce.

There is no technical SEO drawback. Unbounce gives you direct control over all of your on-page SEO elements. You can even use rel="canonical", if you are so inclined, to indicate which variation of a page Google should treat as the primary version.
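For anyone unfamiliar with the tag, here is a minimal sketch of what that looks like in a page's head section (both the domain and the path are invented for illustration, not real Unbounce pages):

```html
<!-- Placed in the <head> of each test variation, pointing search
     engines at the version you want treated as primary. -->
<link rel="canonical" href="https://example.com/landing-page/">
```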
If you have any questions feel free to contact me: [email protected]
-
Hi Ben,
Thanks for the answer! Sorry if I wasn't clear in my original question, but we are actually already using Unbounce for PPC testing.
The pages we are planning to create are not necessarily landing pages. It's just much faster for us to create pages and content on Unbounce at the moment than it is to create actual pages on our site. (That way non-devs can create new pages as well.)
In your opinion, would there be any major downsides to creating some pages on Unbounce? Obviously it's not ideal, but if there are no major issues we might use their service, albeit temporarily.
Thanks!
Seiya
-
That's some solid advice right there.

-
This process may lend itself to PPC a bit more than to SEO. When split testing, you will need to be aware of duplicate content, and considering that your ultimate goal is to figure out which landing pages are most effective, you will end up removing some of the pages anyway. At larger scale this approach isn't going to be as effective.
I would consider running a PPC campaign to test these pages without having them indexed. Then, once you have a landing page that performs well, create it on your own site and promote it with SEO.
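If you do go that route, one way to keep the test pages out of the index is a robots meta tag in each page's head; a minimal sketch, assuming the platform lets you add markup to the head:

```html
<!-- Keeps a PPC test page out of search results while still
     allowing paid traffic to land on it. -->
<meta name="robots" content="noindex">
```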
-
Seiya -
If you are looking for a simple and easy way to start taking advantage of A/B testing and getting landing pages created quickly without waiting for a developer, it is a very cost-effective model. You can generate landing pages quickly and easily without developers, you can set up testing easily, and the system will provide metrics to measure the results without a lot of in-depth work.
There really are no downsides other than the cost.
We have found with larger clients that as they develop some expertise and some clarity about which landing pages work better, they start to bring the process in-house to gain more control, save money, and become more knowledgeable about A/B testing and landing page development. But Unbounce is a great step in that process.
Good luck. Hope it helps.
Mark
Related Questions
-
Duplicate Content on a Page Due to Responsive Version
What are the implications if a web designer codes the content of a site twice into the page in order to make the site responsive? I can't add the URL, I'm afraid, but the H1 and the content appear twice in the code in order to produce both a responsive version and a desktop version. This is a WordPress site. Is Google clever enough to distinguish between the two versions and treat them individually? Or will Google think that the content has been repeated on the same page?
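For readers picturing the setup, here is a hypothetical reconstruction of the pattern being described: the same heading and copy rendered twice and toggled with CSS (class names and breakpoint are invented):

```html
<!-- Desktop block: hidden on small screens by the media query below. -->
<div class="desktop-only">
  <h1>Example Heading</h1>
  <p>The same page copy…</p>
</div>
<!-- Mobile block: a duplicate of the content above, shown only on small screens. -->
<div class="mobile-only">
  <h1>Example Heading</h1>
  <p>The same page copy…</p>
</div>
<style>
  .mobile-only { display: none; }
  @media (max-width: 768px) {
    .desktop-only { display: none; }
    .mobile-only { display: block; }
  }
</style>
```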
Technical SEO | Wagada
-
JavaScript page loader - SEO impact
Hello all,
I am working on a site that has a bizarre page load system. All pages get loaded through the same JavaScript snippet; for example, changing the values in a form changes the page that is loaded. The most incredible thing is that, against my expectations, the pages do get indexed by Google.
My question is: "Does loading pages dynamically using JavaScript affect overall SEO performance?" And why are the pages getting indexed? Thank you for shedding light on this.
Cheers,
Luca
Technical SEO | Lvet
-
Blog Page Titles - Page 1, Page 2 etc.
Hi all, I have a couple of crawl errors coming up in Moz that I am trying to fix. They are duplicate page title issues in my blog area. For example, we have a URL of www.ourwebsite.com/blog/page/1, and as we have quite a few blog posts they spill over onto another page, e.g. www.ourwebsite.com/blog/page/2. Both of these URLs have the same heading, title, meta description, etc. I was wondering whether this is an actual SEO problem, and if so, whether there is a way to fix it. I am using WordPress, for reference, but I can't see anywhere to access the settings for these pages. Thanks
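One common remedy, sketched as plain markup (the titles shown are invented; in WordPress an SEO plugin would typically generate them): give each paginated page a distinct title, and optionally declare the sequence with rel links, which Google supported at the time of this thread.

```html
<!-- In the <head> of www.ourwebsite.com/blog/page/2 -->
<title>Blog - Page 2 | Our Website</title>
<link rel="prev" href="http://www.ourwebsite.com/blog/page/1">
<link rel="next" href="http://www.ourwebsite.com/blog/page/3">
```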
Technical SEO | O2C
-
Will Adding the Publish Date at the End of Page Titles for Blog Posts Hurt SEO?
I'd like to be able to easily track blog posts by month, but in Google reports, when you set a date range, older blog posts still appear, and with the number of blog posts we generate, without seeing the date in the title it's not obvious what was published when. For example, if a blog title were "/dangers-of-sharing-KM-knowledge-01-11-15", would it hurt SEO? The reason is I'd like a quick way to know how new posts do each month compared to older content.
Technical SEO | inhouseninja
-
How Does Dynamic Content for a Specific URL Impact SEO?
Example URL: http://www.sja.ca/English/Community-Services/Pages/Therapy Dog Services/default.aspx The above page is generated dynamically depending on which province the visitor visits from. For example, a visitor from BC would see something quite different from a visitor from Nova Scotia; the intent is that the information shown should be relevant to users in that province. How does this affect SEO? How (or from what location) does Googlebot decide to crawl the page? I have considered a subdirectory for each province, though that comes with its own challenges. One such challenge is duplicate content, since different provinces may have the same information on some pages. Any suggestions for this?
Technical SEO | ey_sja
-
SEO for User Authenticated Content
Hi everyone - I have a potential client who is seeking SEO for a site where about 95% of the content is only accessible behind user authentication. Does anyone have tips for getting this indexed without having to open it up to the public? I was considering adding "snippets" into the robots.txt or creating an additional page with snippets linking to the login page. I'd appreciate any thoughts! Thanks!
Technical SEO | manutx
-
How to prevent duplicate content on a calendar page
Hi, I've a calendar page which changes every day. The main URL is /calendar. For every day there is also a dated URL:
/calendar/2012/09/12
/calendar/2012/09/13
/calendar/2012/09/14
So, when the 13th of September arrives, the content of the page /calendar/2012/09/13 will also be shown at /calendar. So, it's duplicate content. What to do in this situation?
a) Redirect from /calendar to /calendar/2012/09/13 with a 301? (But the redirect changes the day after, to /calendar/2012/09/14.)
b) Redirect from /calendar to /calendar/2012/09/13 with a 302? (But will I lose the link juice of /calendar?)
c) Add a canonical tag at /calendar (pointing to /calendar/2012/09/13)? (But I will lose the power of /calendar, and it will change every day.)
Any ideas or other suggestions?
Best wishes, Georg.
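To make option (c) concrete, here is a minimal sketch of the tag the questioner describes (domain invented for illustration; the tag would have to be regenerated each day):

```html
<!-- In the <head> of /calendar on 13 September 2012. -->
<link rel="canonical" href="http://example.com/calendar/2012/09/13">
```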
Technical SEO | GeorgFranz
-
How to handle a sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There are not in fact this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems inefficient to me to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Use canonical links? Should I manually remove the pages from the sitemap? Or should I continue as is? Thanks a ton for any input you have!
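Of the options listed, the canonical route is the easiest to picture; a hedged sketch (the URL and parameter names are invented for illustration):

```html
<!-- On a filtered variation such as /tool?sort=price&color=blue,
     point search engines back at the base tool page. -->
<link rel="canonical" href="https://example.com/tool">
```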
Technical SEO | 5225Marketing