Help: Forum (user-generated content) SEO best practices
-
Hello Moz folks!
For the very first time, I'm dealing with a massive community that relies on UGC (user-generated content). Their forum has a great deal of duplicate content, broken links, duplicate titles, and other on-site issues. I have advanced SEO knowledge related to ecommerce and blogging, but I'm new to forums and UGC.
I would really love to learn about, or get links to resources on, SEO best practices for forums. Any help is greatly appreciated.
Best,
Yan
-
Should logged-in users and anonymous users get the same behavior?
For the most part, yes; however, it depends on the forum you are running. The important piece to understand is that whatever is hidden behind a login wall remains hidden to search engines. So you have to weigh that factor when deciding which content to display to everyone versus which content to display only to logged-in users.
How do you suggest handling canonicals in a UGC world?
Canonicalization isn't too hard to manage. Your forum software should include canonical URLs; if not, you will want them implemented in the template as soon as possible. The rel=prev and rel=next tags are also highly recommended for paginated threads. They allow you to keep the main forum thread as the canonical URL, and Google understands that the subsequent pages are related to the main page and how they add value.
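As a minimal sketch of what that might look like in the head of the second page of a paginated thread (the URLs are hypothetical placeholders, and whether the canonical points at the page itself or at the main thread page depends on your pagination strategy):

```html
<!-- Hypothetical markup for page 2 of a paginated forum thread. -->
<link rel="canonical" href="https://example.com/forum/thread-123?page=2">
<!-- rel=prev/next describe this page's position in the paginated series. -->
<link rel="prev" href="https://example.com/forum/thread-123">
<link rel="next" href="https://example.com/forum/thread-123?page=3">
```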
Do you have specific editorial guidelines enforced on UGC?
Again, that's up to you and your community. What works editorially for one forum may not be the most desirable for another (e.g., the use of profanity). As long as the content being added is of value, I consider it good content. With forums, you can be a lot looser with the guidelines and allow users to interact as they desire.
Don't let your forum become infested with spam or obvious self-promoting threads, and make sure all user-submitted links are nofollowed. Many forums place restrictions on users with regard to links, and only when users prove themselves can they add links to their posts. Link and spam management are very important for forums.
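As a sketch, a forum template might render user-submitted links like this (the URL and anchor text are placeholders):

```html
<!-- Hypothetical template output for a link inside a user's post. -->
<!-- rel="nofollow" tells search engines not to pass link equity through -->
<!-- links the site does not editorially vouch for. -->
<a href="https://example.com/some-user-link" rel="nofollow">the user's link</a>
```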
-
Thanks Ray-PP,
Is there anything specific? For example, should logged-in users and anonymous users get the same behavior? How do you suggest handling canonicals in a UGC world? Do you have specific editorial guidelines enforced on UGC? For example, should we noindex a post with a three-word question and an image?
Cheers,
Yan
-
Hello Yan,
Fortunately, the on-site SEO for UGC is not very different from the on-site SEO of other forms of content. We can still apply those best practices to forums and the UGC you find on them.
Duplicate content / on-page factors
- Make sure the forum is using proper canonicalization
- Use rel=prev/next for paginated threads
- Use semantic markup where appropriate
- Make sure all on-page SEO factors are optimized (title, headings, images, etc.); a sketch follows this list
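As an illustrative sketch of those on-page factors (all titles, URLs, and file names below are hypothetical):

```html
<!-- Hypothetical on-page elements for a single forum thread page. -->
<head>
  <title>How do I fix duplicate title tags? - Example Forum</title>
  <meta name="description" content="Community answers on fixing duplicate title tags in forum software.">
</head>
<body>
  <!-- One descriptive h1 that matches the thread topic. -->
  <h1>How do I fix duplicate title tags?</h1>
  <!-- Descriptive alt text helps search engines understand uploaded images. -->
  <img src="/uploads/title-report.png" alt="Screenshot of a duplicate title report">
</body>
```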
Broken links
- Use Moz or a tool like Screaming Frog SEO Spider to identify 404 pages. Redirect any important dead pages with a 301 to their nearest related pages (this preserves SEO authority from the important dead pages).
Are there more specific issues you are experiencing with the forum?
Related Questions
-
Can I use duplicate content in different US cities without hurting SEO?
So, I have major concerns with this plan. My company has hundreds of facilities located all over the country. Each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line. If/when any facility offers that service, they then upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service. They claim "Google is smart, it knows the content is all from the same company, and because it's in different local markets, it will still rank." My contention is that duplicate content is duplicate content, and unless it is localized, Google is going to prioritize one page of it and the rest will get very little exposure in the rankings no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction. SEO experts, your help is genuinely appreciated!
Intermediate & Advanced SEO | MJTrevens1
-
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know the best practice for doing that. For reference, the reason we are looking for methods that could help us speed it up is that we have about 500,000 URLs we want deindexed because of mis-formatted HTML code, and unfortunately Google indexed them much faster than it is taking to deindex them. We don't want to risk clogging up our limited crawl log/budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years at Google's current crawl rate for our site.
Intermediate & Advanced SEO | teddef0
-
Submitting Same Press Release Content to Multiple PR Sites - Good or Bad Practice?
I see some PR (press release) sites that distribute the same content across many different sites, giving the source link at the end. Is that good or bad SEO practice? If it is good practice, how do Google Panda and other algorithms treat it?
Intermediate & Advanced SEO | KaranX0
-
Splitting One Site Into Two Sites Best Practices Needed
Okay, working with a large site that, for business reasons beyond organic search, wants to split an existing site in two. So, the old domain name stays and a new one is born with some of the content from the old site, along with some new content of its own. The general idea, for more than just search reasons, is that it makes both the old site and new sites more purely about their respective subject matter. The existing content on the old site that is becoming part of the new site will be 301'd to the new site's domain. So, the old site will have a lot of 301s and links to the new site. No links coming back from the new site to the old site anticipated at this time. Would like any and all insights into any potential pitfalls and best practices for this to come off as well as it can under the circumstances. For instance, should all those links from the old site to the new site be nofollowed, kind of like a non-editorial link to an affiliate or advertiser? Is there weirdness for Google in 301ing to a new domain from some, but not all, content of the old site. Would you individually submit requests to remove from index for the hundreds and hundreds of old site pages moving to the new site or just figure that the 301 will eventually take care of that? Is there substantial organic search risk of any kind to the old site, beyond the obvious of just not having those pages to produce any more? Anything else? Any ideas about how long the new site can expect to wander the wilderness of no organic search traffic? The old site has a 45 domain authority. Thanks!
Intermediate & Advanced SEO | 945010
-
Help article / Knowledge base SEO consideration
Hi everyone, I am in the process of building the knowledge base for our SaaS product, and I am afraid it could impact us negatively on the SEO side because of: thin content on pages containing short answers to specific questions, and keyword cannibalisation between some of our blog articles and the knowledge base articles. I didn't find much on the impact of knowledge bases on SEO when I searched on Google, so I'm hoping we can use this thread to share a few thoughts and best practices on this topic. Below is a bit more detail on the issues I face; any tips on how to address them would be most welcome.
1. Thin content: Some articles will have thin content by design: the H1 will be a specific question and there will be only 2 or 3 lines of text answering it in the article. I think creating a dedicated article per question is better than grouping 20 questions in one article from a UX point of view, because this will enable us to direct users more quickly to the answer when they use the live search function inside the software (help widget) or on the knowledge base (it saves them having to scroll through a long article to find the answer). Now the issue is that this will result in lots of pages with thin content. A workaround could be to have both a detailed FAQ-style page with all the questions and answers, and individual articles for each question on top of that. The FAQ-style page could be indexed in Google while the individual articles would have either a noindex directive or a rel canonical to the FAQ-style page. Have any of you faced similar issues when setting up your knowledge base? Which approach would you recommend?
2. Keyword cannibalisation: There will be, to some extent, a level of keyword cannibalisation between our blog articles (which rank well) and some of the knowledge base articles. While we want both types of articles to appear in search, we don't want the "How to do XYZ" blog article containing practical tips to compete with the "How to do XYZ in the software" knowledge base article. Do you have any advice on how to achieve that? Having a specific Schema.org (or equivalent) type of markup to differentiate between the two types of articles would have been ideal, but I couldn't find anything relating to help articles specifically when I searched.
Intermediate & Advanced SEO | tbps0
-
Best server-side sitemap generators
I've been looking into sitemap generators recently and have got a good knowledge of what creating a sitemap for a small website of below 500 URLs involves. I have successfully generated a sitemap for a very small site, but I’m trying to work out the best way of crawling a large site with millions of URLs. I’ve decided that the best way to crawl such a large number of URLs is to use a server side sitemap, but this is an area that doesn’t seem to be covered in detail on SEO blogs / forums. Could anyone recommend a good server side sitemap generator? What do you think of the automated offerings from Google and Bing? I’ve found a list of server side sitemap generators from Google, but I can’t see any way to choose between them. I realise that a lot will depend on the type of technologies we use server side, but I'm afraid that I don't know them at this time.
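For reference, whatever generator ends up being used, the output it must produce is the standard sitemaps.org protocol format; a minimal sketch (with placeholder URLs and dates) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap in the sitemaps.org protocol format. A single sitemap
     is limited to 50,000 URLs, so a site with millions of URLs needs many
     sitemaps referenced from a sitemap index file. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/some-page</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
</urlset>
```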
Intermediate & Advanced SEO | RG_SEO0
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers I know that it’s best practice to block Google from indexing internal search pages, but what’s best practice when “the damage is done”? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3 %). I would like to block Google from indexing the search pages via the meta noindex,follow tag because: Google Guidelines: “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.” http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769 Bad user experience The search pages are (probably) stealing rankings from our real landing pages Webmaster Notification: “Googlebot found an extremely high number of URLs on your site” with links to our internal search results I want to use the meta tag to keep the link juice flowing. Do you recommend using the robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I’m looking forward to your answer! Edit: Google have currently indexed several million of our internal search pages.
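For what it's worth, the meta tag approach described above is a one-line addition to the head of each internal search results page (a sketch, assuming your template lets you set robots directives per page type):

```html
<!-- Hypothetical markup for an internal search results page. noindex asks
     Google to drop the page from its index over time, while follow lets
     link equity keep flowing through the links on the page. -->
<meta name="robots" content="noindex, follow">
```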
Intermediate & Advanced SEO | HrThomsen0
-
Do links in the nav bar help SEO?
If I am building a nav bar, should I use my keywords or make it easier for the user to find what they are looking for? IMO, one should ALWAYS build a site based on user experience. If Google and other SEs do count nav links, would it be best to place more important keywords first?
Intermediate & Advanced SEO | SEODinosaur0