Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Why do some sites have a higher Page Authority than Domain Authority in OSE?
-
Hi,
I have noticed that when using OSE and entering a domain, you very often see a higher Page Authority than Domain Authority.
If someone could explain why this happens I would be very grateful. It's my current understanding that Page Authority should ALWAYS be LESS THAN Domain Authority, but that is not always the case (I have seen cases where PA is more than 10 points higher than DA).
Here's an example where PA > DA
http://www.opensiteexplorer.org/links.html?site=www.primelocation.com
Thanks
-
No, there are too many open variables: age of pages, localization, keyword popularity, etc.
Edit: look at the mozTrust of individual URLs and the domain mozTrust of various sites. This will show you that a + b does not equal c.
Hope that helps clarify.
-
Hi Rob & Rand,
Many thanks for helping explain the concept. What I am trying to understand is whether DA is a calculation based on the PA of all pages on that domain (or whether it is possible to work out DA from the PA of all pages on that domain), or a totally separate calculation which incorporates other metrics. E.g. a domain has 2 pages: A (PA 50) and B (PA 40). Is it possible to work out what the DA is for this domain from these 2 values alone? Many thanks
-
An individual may be rated higher than the team he plays for; a page can be rated higher than the domain it belongs to.
-
Well,
Sorry, your understanding is incorrect. Domain Authority is just that: an overall value for the domain. So, if you go to a site where all of the links (or the vast majority) point to the home page, you are going to see a high PA for that home page. If that same site has 50 pages with few links, you are going to see a lot of pages with PA = 1.
So, while the home page is important, it does not outweigh the fact that overall the site is not that highly trafficked. This is one reason sites sometimes have trouble with conversions on inner pages: all the value is placed on the home page, and then someone has 500 internal links from that page, so it does not even help the inner pages.
Remember: in the search engine world, pages are ranked, not sites. So PA is more important from that point of view.
Hope this clarifies,
Best
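Moz does not publish the actual DA formula, so the snippet below is not how DA is computed; it is only a toy aggregate, under assumed page scores, that shows why a single strongly linked page can sit far above any metric averaged over the whole domain.

```python
# Purely illustrative - Moz's real PA/DA scores are proprietary (not a simple
# average of page scores). This toy "domain score" only shows the shape of the
# effect described in the answer above.

# Hypothetical page-level scores for a small site where nearly all external
# links point at the home page and the inner pages attract almost none.
page_scores = {
    "/": 55,            # home page: most inbound links land here
    "/about": 1,
    "/contact": 1,
    "/products/a": 1,
    "/products/b": 1,
}

toy_domain_score = sum(page_scores.values()) / len(page_scores)

print(f"Home page score:  {page_scores['/']}")      # 55
print(f"Toy domain score: {toy_domain_score:.1f}")  # 11.8
```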
Related Questions
-
Is one page with long content better than multiple pages with shorter content?
(Note: the site links are from a sandbox site and have very low DA or PA.) If you look at this page, you will see at the bottom a lengthy article detailing all of the properties of the product categories in the links above: http://www.aspensecurityfasteners.com/Screws-s/432.htm My question is: is there more SEO value in having the one long article on the general product category page, or in breaking up the content and moving the sub-topics as content to the more specific sub-category pages? e.g.
http://www.aspensecurityfasteners.com/Screws-Button-Head-Socket-s/1579.htm
http://www.aspensecurityfasteners.com/Screws-Cap-Screws-s/331.htm
http://www.aspensecurityfasteners.com/Screws-Captive-Panel-Scre-s/1559.htm
Moz Pro | AspenFasteners
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder? I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO and the campaigns I slapped together a few months ago need to be set up better, such as all on the same day, making sure I've set it to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what all the options are to get good data in light of the 50,000 page crawl limit. Here is an example of what I mean: To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages -- if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1450 pages, so if I choose this for what I track in a campaign, the crawl will cover that subfolder completely, and I am getting a complete picture of this air-focused sub-folder ... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true - I'd also be tracking other sites as competitors - e.g. non-profits that advocate in air quality, industry air quality sites - and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?) Any opinions on which I should do in general on this kind of situation? The small sub-folder vs. the full humongous site vs. is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
-
How long for authority to transfer from an old page to a new page via a 301 redirect? (& Moz PA score update?)
Hi, how long approximately does G take to pass authority via a 301 from an old page to its new replacement page? Does Moz Page Authority reflect this in its score once G has passed it? All Best, Dan
Moz Pro | Dan-Lawrence
-
Can increasing website pages decrease domain authority?
Hello Mozzers! Say there is a website with 100 pages and a domain authority of 25. If the number of pages on this website increases to 10,000, can that decrease its domain authority or affect it in any way?
Moz Pro | MozAddict
-
My site's domain authority is 1. Why is that?
Hi guys, my website's domain authority is 1 no matter whether I try www or non-www. Why is that? Can you guys please help? Thanks a lot in advance.
http://www.opensiteexplorer.org/links?site=autoproject.com.au
http://www.opensiteexplorer.org/links?site=www.autoproject.com.au
Jazz
Moz Pro | JazzJack
-
Comparing Domain Authority Scores
Since your scale (like PageRank) is a logarithmic scale, it makes it hard to judge the distance between 2 scores. Can you give me a rule of thumb? For PageRank, each jump is an exponential jump - so that a PR6 is perhaps 10 times stronger than a PR5. What is the log base that SEOmoz uses? Should I assume that a 60 is 10 times stronger than a 50? This is important when it comes to measuring progress, because growth is going to get more difficult as you move up the scale and I need to communicate the distance between our current Authority score and our goal. Thank You!
Moz Pro | apo11o177
-
What is the logarithmic scale used for domain authority?
I want to quantify how much better a score of 80 is compared to 60. Or 60 compared to 30 etc.... What is the logarithm base? Thanks, Rik
Moz Pro | garypropellernet
-
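Moz has never published the exact base of the DA scale, so neither of the two questions above has a definitive numeric answer. Purely as an illustration, the sketch below shows how you would compare two scores if you assume a fixed number of points per tenfold jump in underlying strength (an assumed parameter, not an official Moz constant).

```python
# The "points_per_10x" parameter is an assumption for illustration only;
# Moz has not disclosed the real scaling of the DA/PA logarithmic scale.

def relative_strength(score_a: float, score_b: float, points_per_10x: float) -> float:
    """How many times 'stronger' score_a is than score_b, IF the scale is
    logarithmic and every points_per_10x points equals a tenfold jump."""
    return 10 ** ((score_a - score_b) / points_per_10x)

# Assuming every 10 points is one order of magnitude (a PageRank-style guess):
print(relative_strength(60, 50, points_per_10x=10))  # 10.0
print(relative_strength(80, 60, points_per_10x=10))  # 100.0
# A gentler assumption, e.g. 25 points per tenfold jump, shrinks the gap a lot:
print(relative_strength(60, 50, points_per_10x=25))  # ~2.5
```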
Domain.com and domain.com/index.html duplicate content in reports even with rewrite on
I have a site that was recently hit by the Google Penguin update and dropped a page back. When running the site through SEOmoz tools, I keep getting duplicate content in the reports for domain.com and domain.com/index.html, even though I have a 301 rewrite condition. When I test the site, domain.com/index.html redirects to domain.com for all directories and the root. I don't understand how my index page can still get flagged as duplicate content. I also have a redirect from domain.com to www.domain.com. Is there anything else I need to do or add to my htaccess file? Appreciate any clarification on this.
Moz Pro | anthonytjm
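For reference, a minimal .htaccess sketch of the two redirects described in that question (index.html to the bare directory URL, and non-www to www) could look like the following; example.com is a placeholder and Apache's mod_rewrite is assumed to be enabled.

```apache
# Illustrative sketch only - adapt the host name to your own site.
RewriteEngine On

# 301 any /index.html request to the bare directory URL. Matching THE_REQUEST
# (the raw request line) avoids loops caused by internal DirectoryIndex rewrites.
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/+(.*/)?index\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# 301 the non-www host to the www version.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```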