Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How to beat a Wikipedia article for the top spot on SERPs?
-
Hi Guys,
One of our clients has a good website with lots of content that already ranks #2 for its top keyword (singular and plural) on Google UK. The keyword itself is a competitive one. The top spot is occupied by a Wikipedia article that doesn't have much content in general. Can anyone suggest a strategy we could apply to displace that article? Thanks!
-
Thank you guys!
-
Wikipedia can be really, really, really hard to beat... did I say really, really, really hard to beat?
Just keep working on your site as if Wikipedia were any other competitor. Build great content that gets liked and tweeted... stuff that engages your visitors.
There is no silver bullet for beating Wikipedia. You simply have to overpower them through the brute force of a great website.
Good luck.
-
I have seen some results for Wikipedia where you pull your hair out, hehe, but the thing is, the site has such high authority and so much internal link value.
To be honest, I have had experience outranking Wikipedia when dealing with big brand websites, where you can target the site very precisely.
Really do some analysis on the links the Wikipedia page has, and see what you may be missing if content is not the problem.
Keep pushing fresh content and social signals too, and if you can, encourage people to search for your website directly to drive a higher CTR on the SERP.
-
I wouldn't suspect so. Wikipedia is seen as an incredibly authoritative site and has many high-quality links pointing to it, so its high rankings come down mainly to the site being so authoritative and huge.
Wikipedia fulfils many of the factors in the Periodic Table of SEO Ranking Factors at http://searchengineland.com/seotable. It's a difficult site to beat, though beating it can be and certainly is achieved.
Glad you like the suggestions; they will help you get there.
Regards
Simon
-
Thanks Simon, will try those. Do you think that Google applies different ranking factors when it comes to Wikipedia in general?
-
Hi Ivaylo
I shall share a few pointers with you here for consideration:
-
Perform an on-page analysis of the website to identify and resolve any issues that come up, such as too many on-page links, too many nofollowed links pointing in, or problems with titles and descriptions (the SEOmoz toolset is great at helping with this).
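To make the kind of check above concrete, here is a minimal Python sketch of a single-page audit using only the standard library. It is an illustration, not any particular tool's method: the `OnPageAudit` class and `audit` function are hypothetical names, and it only counts the basics (title length, meta description length, total and nofollowed links).

```python
from html.parser import HTMLParser


class OnPageAudit(HTMLParser):
    """Collects the title, meta description, and link counts from one page's HTML."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.links = 0
        self.nofollow_links = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")
        elif tag == "a" and attrs.get("href"):
            self.links += 1
            if "nofollow" in attrs.get("rel", ""):
                self.nofollow_links += 1

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False


def audit(html):
    """Return a small report dict for one page's HTML source."""
    parser = OnPageAudit()
    parser.feed(html)
    return {
        "title_len": len(parser.title.strip()),
        "description_len": len(parser.description),
        "links": parser.links,
        "nofollow_links": parser.nofollow_links,
    }
```

You would fetch the page HTML however you prefer and then compare `title_len` and `description_len` against whatever length guidelines you work to; a real audit tool checks far more than this.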
-
Research which valuable links point to the Wiki page and try to earn some of the same links for your client's site (new followed links from different reputable websites will help a lot). Also identify existing links where the anchor text could be improved.
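Link-research tools differ, but the comparison itself is just a set difference: which referring domains link to the competing page and not to yours? A minimal Python sketch, assuming (hypothetically) that each tool can export a one-column CSV of referring domains; the file names and `link_gap` function are placeholders, not from the thread:

```python
import csv


def referring_domains(path):
    """Read a one-column CSV export of referring domains into a normalized set."""
    with open(path, newline="") as f:
        return {row[0].strip().lower() for row in csv.reader(f) if row}


def link_gap(competitor_csv, client_csv):
    """Domains that link to the competitor's page but not to the client's site."""
    return sorted(referring_domains(competitor_csv) - referring_domains(client_csv))
```

The resulting list is a starting point for outreach: each domain already links to a page on your topic, so it is a plausible candidate to link to your client's page too.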
-
Keep the content fresh, relevant and interesting.
-
Depending on what your client's site offers, consider whether there are any tools or widgets that could be developed to make the site more useful.
-
Consider building upon the social aspect, such as engaging with people on Twitter, in forums and through guest blogging, to attract more visitors and more sharing of your content.
Hope that helps,
Regards
Simon
-