Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Templates for Meta Description, Good or Bad?
Hello, we have a website where users can browse photos in different categories. For each photo we use a meta description template such as:
Are you looking for a nice and cool photo? [Photo name] is the photo which might be of interest to you.
And in the keywords tags we are using:
[Photo name] photos, [Photo name] free photos, [Photo name] best photos.
I'm wondering: is this a safe method? It's very difficult to write a manual description for each photo when you have 3,000+ photos in the database.
Thanks!
I really like Dana's response. It covers the primary consideration: how much time would it REALLY take to write unique meta descriptions? If the true answer is "unrealistically much", then a template COULD work. The trick, though, is addressing the issues Dana describes.
If you only use the primary product name as the variable, you run risks. If you have a second database field that adds some differentiation between otherwise identical products, that can help, as long as you keep total length in mind.
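To make that trade-off concrete, here is a hypothetical sketch of such a template. The field names and the marketing copy are invented for illustration, not taken from the thread; the point is folding in a second, differentiating field when one exists and guarding against over-length descriptions.

```python
# Hypothetical template sketch: a second field differentiates otherwise
# identical items, and a length guard trims at a word boundary.
# Field names and copy are illustrative assumptions, not from the post.

MAX_LENGTH = 155  # rough character budget before search engines truncate


def build_description(name, differentiator=None):
    """Build a meta description, folding in a second field when available."""
    if differentiator:
        text = f"{name} ({differentiator}): free high-quality photos to browse and download."
    else:
        text = f"{name}: free high-quality photos to browse and download."
    # Trim at a word boundary if the names push the copy past the budget.
    if len(text) > MAX_LENGTH:
        text = text[:MAX_LENGTH].rsplit(" ", 1)[0].rstrip(",;:") + "…"
    return text


print(build_description("Sunset Beach"))
print(build_description("Sunset Beach", "Hawaii"))
```

Even a sketch like this only mitigates the duplication risk; two items with the same name and the same differentiator still collide.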
I think this is an excellent question. It's something that was in place where I am the in-house SEO when I came on board. After two years of kicking and screaming, I finally got buy-in on doing away with the template. Here's why I didn't like it:
- It caused a lot of duplicate content problems. We have products that might be alike in every way with the exception of a microphone frequency band. Often, this information wasn't included in the product name/title, and consequently, when it was used to populate the meta description "template" we ended up with tons of duplicates.
 - Problems with length. We had templated copy that worked just fine for about 75% of our brands and products, but some of our brand names and product names were much longer, so the templated descriptions ran too long and got truncated, totally defeating their purpose.
 - Poor user experience. Many of our competitors use templated meta descriptions, specifically Sweetwater, Musician's Friend and Guitar Center. Nearly all of their descriptions are 100% identical with the exception of products swapped in and out. From a searcher's standpoint, this kind of sucks because it doesn't tell me anything interesting about the product.
 - Lost marketing opportunity - Are you really going to use the same marketing message for every single product on your site? That's a huge opportunity lost I think.
 
Okay, maybe if we were a huge brand like Sweetwater it just wouldn't matter, and we could get away with this because brand recognition would be strong enough to outweigh the fact that there was nothing of unique interest in the description. But we aren't Sweetwater, so making every marketing opportunity count is crucial for us. We have about 3,000 SKUs and a tiny marketing department. Somehow we're managing to crank out those unique descriptions just fine; 3,000 really isn't that many. If it does get to be too much, scaling this with freelancers would be extremely easy and cheap, provided you lay down clear parameters for exactly what you want.
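Dana's first two objections, duplicate descriptions and over-length copy, are both easy to audit automatically before deciding whether a template is hurting you. A hypothetical sketch (the URLs and product copy are made up for illustration):

```python
# Hypothetical audit sketch: flag the two template failure modes described
# above, duplicate meta descriptions and over-length ones.
# The sample URLs and copy are invented for illustration.
from collections import Counter

MAX_LENGTH = 155  # rough character budget before truncation in the SERP

descriptions = {
    "/mic-a": "Pro condenser microphone with shock mount.",
    "/mic-b": "Pro condenser microphone with shock mount.",  # duplicate of /mic-a
    "/amp-x": "Boutique hand-wired tube amplifier head " + "with matching cab " * 10,
}

counts = Counter(descriptions.values())
duplicates = [url for url, d in descriptions.items() if counts[d] > 1]
too_long = [url for url, d in descriptions.items() if len(d) > MAX_LENGTH]

print("duplicate descriptions:", duplicates)
print("over-length descriptions:", too_long)
```

In practice you would feed this from a crawl export rather than a hard-coded dict, but the check itself is this simple.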
My advice? Take the time to add unique descriptions...oh, and forget about populating the meta keywords. You don't need to do that any more.
Hope that's helpful!
Dana
 
Related Questions
Google is indexing bad URLS
Hi All, The site I am working on is built on WordPress. The Revolution Slider plugin was downloaded; while no longer utilized, it still remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page, and I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/ I have done the following to prevent these URLs from being created and indexed: 1. Added a directive in my .htaccess to 404 all of these URLs 2. Blocked /wp-content/uploads/revslider/ in my robots.txt 3. Manually de-indexed each URL using the GSC tool 4. Deleted the plugin However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!
Technical SEO | | Tom3_150
Can you force Google to use meta description?
Is it possible to force Google to use only the Meta description put in place for a page and not gather additional text from the page?
Technical SEO | | A_Q0
Bing webmaster tools incorrectly showing missing title and description tags
Hey all, Was wondering if anyone else has come across this issue. Bing is showing title and description tags missing in the head of my wordpress blog. I can't seem to find any documentation on this. Thanks, Roman
Technical SEO | | Dynata_panel_marketing0
How can I block incoming links from a bad web site ?
Hello all, We recently got a new client who had a warning from Google Webmaster Tools for a manual soft penalty. I did a lot of searching and found one particular site that sends roughly 100k links to one page and is potentially a high-risk site. I wish to block those links from coming in to my site, but their webmaster is nowhere to be seen and I do not want to use the disavow tool. Is there a way I can use code in our .htaccess file, or any other method? Would appreciate anyone's immediate response. Kind Regards
Technical SEO | | artdivision0
Are 404 Errors a bad thing?
Good Morning... I am trying to clean up my e-commerce site and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use or if for some reason one of them is still spidered in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | | Prime850
Move established site from .co.uk to .org - good or bad idea?
I am currently considering moving our site from the current .co.uk domain to the .org version which we also own. The site is established and indexed for 7 years, ranks well and has circa 10k traffic per month which is mainly UK & US traffic. The reason for the change to the .org domain is to make the site more global facing and give us the opportunity to develop the site into multi language within directories (.org/es/ etc.) and then target those to the local search engines. For the kind of site it is (community based) it wouldn’t really work to split this into lots of separate country targeted domains. So the choice is to either stick with the .co.uk and add the other foreign language specific content in directories within the .co.uk or move to the .org and do the same (there is also a potential third option of purchasing the .com which is currently unused but that could be pricey!) We are also planning a big overhaul of the site with redesign, lots of added content and reorganisation of the site – but are thinking that it would be better to move the domain on a 1:1 basis first with the current design, content and URL structure in place and then do the other changes 2 or 3 months down the line. I have read up on SEOmoz, google guidelines etc on moving a site to a new domain and understand the theoretical approach of moving the site and the steps to take (1to1 301 redirects, sitemaps on old and new etc) and I will retain ownership of the .co.uk so the redirects can remain in place indefinitely. However having worked so hard to get the site to where it is in the search engines and traffic levels I am very worried about whether the domain change is a good move. I am more than happy to accept a temporary fluctuation in rankings & traffic for 1 – 4 weeks as reported may happen as long as I can be sure it will return after a temporary period and be as strong (or almost as strong) as the previous rankings / traffic. 
Looking for people's experiences to give me the confidence/reassurance to go ahead with this, or any info on why I shouldn't. Thanks in advance for your advice. Adrian.
Technical SEO | | Zilla0
How Can I Block Archive Pages in Blogger when I am not using classic/default template
Hi, I am trying to block all the archive pages of my blog, as Google is indexing them and this could lead to duplicate content issues. I am not using the default Blogger theme or the classic theme, and therefore I cannot use this code therein: Please suggest how I can instruct Google not to index the archive pages of my blog? Looking for a quick response.
Technical SEO | | SoftzSolutions0