Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
What is the best way to optimize/setup a teaser "coming soon" page for a new product launch?
-
Within the context of a physical product launch, what are some ideas around creating a /coming-soon page that "teases" the launch? Ideally I'd like to optimize a page around the product, but the client wants to try to build consumer anticipation without giving too many details away. Any thoughts?
-
Thanks for our laugh this morning, Mike. We had no idea we were ranking for that!
-
Keri's example of Moz's "top secret project" is a good one.
If you Google "top secret project", they appear at the bottom of the second page of results.
-
Just signed up

-
I wouldn't know. I've never heard of a company that gave out teasers when something new was coming along.

-
The tricky part is the client wants to keep the product a "secret". They don't want to create the product page as the teaser URL, since the optimized URL for the new product would reveal the new product itself.
I agree with your approach, but the trick here is going to be optimizing a /coming-soon page that then redirects to the new product page a week or two before the launch.
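For the mechanics of that switchover, assuming the site runs on Apache (the thread doesn't say), the handoff can be a one-line permanent redirect added at launch, so whatever links and search equity the /coming-soon page has earned pass to the product URL. A minimal .htaccess sketch with a placeholder product path:

# Sketch, assuming Apache with mod_alias; /products/new-widget is a
# placeholder for the real URL revealed at launch. A 301, not a 302
# or meta refresh, passes the teaser page's accumulated link equity.
Redirect 301 /coming-soon /products/new-widget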
-
Not knowing the product kind of makes it difficult to go on that; however, some of these ideas may work.
I would create a landing page for the physical product, then additional pages optimized around the product launch:
- [product name] Release Date
- [product name] Specifications or Specs
- [product name] News
- [product name] Price
- [product name] Pre-order
- [product name] Sneak Peek
- etc.
Create a landing page that is specific to any question that a visitor would/could have about the product.
Does that help/give you some ideas?
Mike
Related Questions
-
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know the best practice for doing so. For reference, we have about 500,000 URLs that we want deindexed because of mis-formatted HTML code, and unfortunately Google indexed them much faster than it is taking to deindex them. We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years at Google's current crawl rate of our site.
Intermediate & Advanced SEO | teddef
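For what it's worth, a standard bulk approach here is an X-Robots-Tag response header, sketched below under assumptions the question doesn't confirm: an Apache 2.4+ server, and that the mis-formatted URLs share a recognizable path (/broken/ is a placeholder). Google treats the header like a meta robots noindex and drops each URL as it recrawls it:

# Sketch, assuming Apache 2.4+ with mod_headers; /broken/ stands in
# for whatever path pattern the 500,000 mis-formatted URLs share.
<If "%{REQUEST_URI} =~ m#^/broken/#">
    Header always set X-Robots-Tag "noindex"
</If>

This is still bounded by Google's recrawl rate, so it doesn't beat the bottleneck the question describes; it just avoids spending a sitemap on the hack.
-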
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low-quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low-quality content than looking at each page individually. Is there an ideal way to find the pages that are worth noindexing that will speed up the process but not potentially harm any valuable pages? The current plan of action is to pull data from analytics, and if a URL hasn't brought any traffic in the last 12 months then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them, and I want to make sure we don't lose that link juice. But, assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Recommendations on the best way to prune content efficiently on sites with hundreds of thousands of pages? Also, is there a benefit to noindexing the pages vs. deleting them? What is the preferred method, and why?
Intermediate & Advanced SEO | atomiconline
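On the closing noindex-versus-delete question: a third option sometimes worth naming is the 410 Gone status, which tells Google a page was removed on purpose and is typically dropped from the index faster than a 404, at the cost of losing any link equity the page had. A sketch, assuming an Apache server (not stated in the question) and a placeholder URL pattern:

# Sketch, assuming Apache with mod_alias; /old-article/ is a
# placeholder pattern for pruned thin-content URLs. Pages that do
# have inbound links are better noindexed or 301-redirected instead.
RedirectMatch gone ^/old-article/
-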
What is best practice for "Sorting" URLs to prevent indexing and for best link juice ?
We are now introducing five links on all our category pages for different sorting options of the category listings. The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000. Until now Google has been indexing our site well, but I would like to prevent the "sorting URLs" from leading to less complete crawling of our core pages, especially since we are planning a further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not really work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking link juice optimization into consideration? On a technical level, the sorting is implemented in a way that reloads the whole page, for which there may be better options as well.
Intermediate & Advanced SEO | lcourse
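A common way to handle sort parameters like these - sketched here under assumptions the question doesn't confirm: an Apache 2.4+ server and a query parameter literally named "sort" - is a noindex, follow robots header, which keeps the sort variants out of the index while still letting link juice flow through them:

# Sketch, assuming Apache 2.4+ with mod_headers and a hypothetical
# "sort" query parameter on category URLs. noindex keeps the variants
# out of the index; follow preserves link equity flowing through them.
<If "%{QUERY_STRING} =~ /(^|&)sort=/">
    Header always set X-Robots-Tag "noindex, follow"
</If>

The other standard option is a rel=canonical from each sorted view back to the unsorted category URL; either way crawling is not blocked, which sidesteps the unreliable Search Console parameter tool.
-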
Page Rank Worse After Optimization
For a long time, we had terrible on-page SEO. No keyword targeting, no meta titles or descriptions. Just a brief 2-4 sentence product description and shipping information. Strangely, we weren't ranking too badly. For one product, we were ranking on page 1 of Google for a certain keyword. My goal to reach the top of page 1 would be easy (or so I thought). I have now optimized this page to rank better for the same keyword. I have a 276-word description with detailed specifications and shipping information. I have a strong title and meta description with keywords and modifiers. I have also included a video demonstration, additional photos and a PDF of the owner's manual. In my eyes, the page is 100% better than it ever was. In the eyes of Moz, it's better also. I've got an A with the On-Page Grader. Why is this page now ranking on page 8 of Google? What have I done wrong? What can I do to correct it?
Intermediate & Advanced SEO | dkeipper
-
Best way to block a sub-domain from being indexed
Hello, the search engines have indexed a sub-domain I did not want indexed. It's on old.domain.com and dev.domain.com - I was going to password-protect them, but is there a best-practice way to block them? My main domain's default robots.txt says:
Sitemap: http://www.domain.com/sitemap.xml
# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?
Intermediate & Advanced SEO | JohnW-UK
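One point worth adding when reading this question: the main domain's robots.txt above does not apply to old.domain.com or dev.domain.com at all - each hostname needs its own robots.txt - and robots.txt cannot remove URLs that are already indexed. A sketch of one fix, assuming an Apache server (the question doesn't say) and reusing the hostnames from the question:

# Sketch, assuming Apache with mod_headers; repeat for old.domain.com.
# The header has to remain crawlable to be seen, so don't also block
# the host in robots.txt until its URLs have dropped out of the index.
<VirtualHost *:80>
    ServerName dev.domain.com
    DocumentRoot /var/www/dev
    Header always set X-Robots-Tag "noindex, nofollow"
</VirtualHost>

Password protection, which the question already considers, is the stronger permanent guard once the URLs are out.
-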
Dynamic pages - ecommerce product pages
Hi guys, Before I dive into my question, let me give you some background. I manage an ecommerce site and we've got thousands of product pages. The pages contain dynamic blocks, and the information in these blocks is fed by another system. So in a nutshell, our product team enters the data in a software tool and, boom, the information is generated in these page blocks. But that's not all: these pages then redirect to a duplicate version with a custom URL. This is cached, and this is what the end user sees. It was done to speed up load times; rather than have the system generate a dynamic page on the fly, the cached page is loaded and the user sees it super fast. Another benefit happened as well: after going live with the cached pages, they started getting indexed and ranking in Google. The problem is that the redirect to the duplicate cached page isn't a permanent one; it's a meta refresh, a 302 that happens in a second. So yeah, I've got 302s kicking about. The development team can set up a 301, but then there won't be any caching; pages will just load dynamically. Google records pages that are cached, but does it cache a dynamic page though? Without a cached page, I'm wondering if I would drop in traffic. The view source might just show a list of dynamic blocks, no content! How would you tackle this? I've already set up canonical tags on the cached pages, but removing cache... Thanks
Intermediate & Advanced SEO | Bio-RadAbs
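One angle on the 301-versus-caching dilemma in this question: the permanent redirect and the cached copy don't have to be mutually exclusive. A sketch, assuming Apache with mod_rewrite and placeholder URL patterns (the question names neither), that answers the dynamic product URL with a 301 to its pre-rendered cached file whenever one exists:

# Sketch (.htaccess, assuming Apache with mod_rewrite; paths are
# placeholders). The cached file still serves fast, but the hop is
# a crawlable 301 rather than the meta refresh Google reads as a 302.
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/cache/product-$1.html -f
RewriteRule ^product/(\d+)$ /cache/product-$1.html [R=301,L]
-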
How to properly link to products from category pages?
Hi All, We have an e-commerce website, and the category pages are built so that there is a product image and, below it, the title. The image and the title are each in their own href. I encountered the following unfinished discussion here at Moz: http://www.seomoz.org/q/how-to-optimize-achor-text-links-on-ecommerce-category-page#post-93758 The discussion states that this is improper. The first question is: if it is wrong, then why? (Maybe because Google will give its weight to the image anchor instead of the text anchor, since it appears higher in the page.) The other question is how to resolve the matter - should I add nofollow to the image href? Thanks
Intermediate & Advanced SEO | BeytzNet
-
NOINDEX listing pages: Page 2, Page 3... etc?
Would it be beneficial to NOINDEX category listing pages except for the first page? For example, this site: http://flyawaysimulation.com/downloads/101/fsx-missions/ has lots of pages such as Page 2, Page 3, Page 4, etc.: http://www.google.com/search?q=site%3Aflyawaysimulation.com+fsx+missions Would there be any SEO benefit of NOINDEX on these pages? Of course, FOLLOW is the default, so links would still be followed and juice applied. Your thoughts and suggestions are much appreciated.
Intermediate & Advanced SEO | Peter264