Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How to find all crawlable links on a particular page?
-
Hi! This might sound like a newbie question, but I'm trying to find all crawlable links (that Googlebot sees) on a particular page of my website. I'm trying to use Screaming Frog, but that gives me all the links on that particular page AND all subsequent pages in the given sub-directory. What I want is ONLY the crawlable links pointing away from a particular page. What is the best way to go about this? Thanks in advance.
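One lightweight way to approximate this without a full crawler is to fetch just that one page and extract its followable anchor links yourself. Below is a minimal Python sketch using only the standard library; note it is a simplification, since Googlebot additionally evaluates robots.txt, meta robots, canonicals, and JavaScript-rendered links, none of which are handled here. The sample HTML is made up for illustration.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

class LinkExtractor(HTMLParser):
    """Collects followable <a href> links, skipping rel="nofollow" ones."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        # Skip empty, mailto:, javascript:, and same-page fragment links
        if not href or href.startswith(("mailto:", "javascript:", "#")):
            return
        # Googlebot generally won't follow/pass signals through nofollow links
        if "nofollow" in (attrs.get("rel") or "").lower().split():
            return
        # Resolve relative URLs and drop #fragments
        absolute, _fragment = urldefrag(urljoin(self.base_url, href))
        if absolute not in self.links:
            self.links.append(absolute)

def extract_crawlable_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# Hypothetical page snippet for demonstration:
sample = """
<a href="/gifts-for">Gifts</a>
<a href="https://example.com/partner" rel="nofollow">Partner</a>
<a href="about.html#team">About</a>
"""
print(extract_crawlable_links(sample, "https://www.wishpicker.com/"))
# ['https://www.wishpicker.com/gifts-for', 'https://www.wishpicker.com/about.html']
```

In practice you would fetch the live page's HTML first (e.g. with urllib); the parsing step is the same.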
-
Thanks for sharing this information, Thomas. I appreciate your time and help here. Regards.
-
I understand now; you're asking about crawl depth (how far a page is from the home page) rather than a URL parameter. Here's some information on a tool I'm using right now:
http://www.internetmarketingninjas.com/seo-tools/google-sitemap-generator/
The tool produces an HTML file of the results; you can see each URL's distance from home ("how far from home") in the left-hand column. I suggest you run the tool yourself so you can see the full results.
Using the IMN Google Site Map Generator
Links are critically important to webpages, not only for connecting to other, related pages to help end users find the information they want, but also for optimizing the pages for SEO. The Find Broken Links, Redirects & Google Sitemap Generator Free Tool allows webmasters and search engine optimizers to check the status of both external links and internal links on an entire website. The resulting report gives webmasters and SEOs insight into the link structure of a website and identifies link redirects and errors, all of which help in planning a link optimization strategy. We always offer the downloadable results and the sitemap generator free for everyone.
Get started
To start with the free sitemap generator, type (or paste) the full home page URL of the website you want scanned. Select the number of pages you want to scan (up to 500, up to 1,000, or up to 10,000). Note that the job starts immediately and runs in real time. For larger sites containing numerous pages, the process can take up to 30 minutes to crawl and gather data on 1,000 pages (and longer still for very large sites). You can set the Google sitemap generator tool to send you an email once the crawl is completed and the data report is prepared. The online sitemap generator offers several options and also acts as an XML sitemap generator or an HTML sitemap generator.
Note that the results table data of the online sitemap generator is interactive. Most of the data items are linked, either to the URLs referenced or to details about the data. For most cells that contain non-URL data, pause the mouse over the cell to see the full results.
Results Bar
When the tool starts, a results bar appears at the top of the page showing the following information:
- Status of the tool (Crawling or Done)
- Number of Internal URLs crawled
- Number of External links found
- Number of Internal HTTP Redirects found
- Number of External HTTP Redirects found
- Number of Internal HTTP error codes found
- Number of External HTTP error codes found
For those who need sitemaps provided by either an HTML sitemap generator or an XML sitemap generator, there are corresponding options offered here. Also shown are the following:
- Download XML Sitemap button
- Download tool results in Excel format
- Download tool results in HTML format
Lastly, if you love the free sitemap generator tool, you can tell the world by clicking any of the following social media buttons:
- Facebook Like
- Google+
Email notification
Next, you can submit your email address to have a copy of the report emailed to you if you choose not to wait for it to finish crawling. We offer this feature as well as the sitemap generator free to all users.
Tool results data
When results are ready, the HTML sitemap generator will organize the data into six tables:
- Internal links
- External links
- Internal errors (a subset of Internal Links)
- Internal redirects (another subset of Internal Links)
- External errors (a subset of External Links)
- External redirects (another subset of External Links)
The table data is typically linked to either page URLs or to details about the data. Click on column headers to sort the results.
1. Internal Links table
The Internal links table created by the XML sitemap generator includes the following data fields:
- URLs crawled on the site
- Link to The On Page Optimization Analysis Free SEO Tool for that URL
- URL’s level from the domain root
- URL’s returned HTTP status code
- Number of internal links the URL has within the site (click to see the list of URLs)
- Link text used for the URL
- Number of internal links on the page (click to see the list of URLs)
- Number of external links on the page (click to see the list of URLs)
- Size of the page in kilobytes (click to see page load speed test results for this URL from Google)
- Link to the Check Image Sizes, Alt Text, Header Checks and More Free SEO Tool for that URL
- The tag text from the URL’s page
- The description tag text from the URL’s page
- The keywords tag text from the URL’s page
- Contents, if used, of the anchor tag’s “rel=” attribute
2. External Links table
The External links table includes the following data fields:
- URL’s returned HTTP status code
- Number of times that URL is linked to from within the site (click to see the list of affected URLs)
- External URL used in the link
- Link text used for the URL
- Internal page URL on which the link was first found
3. Internal HTTP code errors table
The Internal errors table gathers all of the pages returning HTTP code errors (4xx and 5xx level codes) in one place to help organize the effort to resolve the problems. It includes the following data fields:
- URL’s returned HTTP status code
- Number of times that URL is linked to from within the site (click to see the list of affected URLs)
- Internal URL used in the link
- Link text used for the URL
- Internal page URL on which the link was first found
The Internal errors table is a subset of the Internal links table showing just those pages returning HTTP status code errors.
4. Internal HTTP redirects table
The Internal redirects table combines all of the pages returning HTTP redirects in one list so you can easily review them. You should not need to rely on redirects internally; instead, fix the source code containing the redirected link. This table contains the following data fields:
- URL’s returned HTTP status code (click it to go to the HTTP Response Code Checker tool)
- Number of times that URL is linked to from within the site (click to see the list of affected URLs)
- Internal URL used in the link
- Link text used for the URL
- Redirect’s target URL
- Internal page URL on which the link was first found
The Internal redirects table is a subset of the Internal links table showing just those pages returning 301 and 302 HTTP status code redirects.
5. External HTTP code errors table
The External errors table gathers all of the pages returning HTTP code errors (4xx and 5xx level codes) in one place to help organize the effort to resolve the problems. It includes the following data fields:
- URL’s returned HTTP status code (click it to go to the HTTP Response Code Checker tool)
- Number of times that URL is linked to from within the site (click to see the list of affected URLs)
- Internal URL used in the link
- Link text used for the URL
- Redirect’s target URL
- Internal page URL on which the link was first found
The External errors table is a subset of the External links table showing just those pages returning HTTP status code errors.
6. External HTTP redirects table
The External redirects table combines all of the pages returning HTTP redirects in one list so you can easily review them. As the redirect to the targeted page does not affect your page, fixing these URLs is a lower priority. This table contains the following data fields:
- URL’s returned HTTP status code (click it to go to the HTTP Response Code Checker tool)
- Number of times that URL is linked to from within the site (click to see the list of affected URLs)
- External URL used in the link
- Link text used for the URL
- Redirect’s target URL
- Internal page URL on which the link was first found
The External redirects table is a subset of the External links table showing just those pages returning 301 and 302 HTTP status code redirects.
-
Hi Thomas! When I say 1 click, I mean all links that can directly be reached from www.wishpicker.com. For example
wishpicker.com/gifts-for can be reached directly from wishpicker.com
wishpicker.com/gifts-for/boyfriend cannot be reached directly from wishpicker.com. I would first need to go to wishpicker.com/gifts-for, and then go to wishpicker.com/gifts-for/boyfriend. So wishpicker.com/gifts-for is 1 click away, and wishpicker.com/gifts-for/boyfriend is 2 clicks away from wishpicker.com.
I am looking to crawl all links that are only 1 click away. Thanks for your help here. Really appreciate it.
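The "clicks away" measure is just breadth-first search depth: everything linked directly from the start page is 1 click away, and everything newly discovered from those pages is 2 clicks away. A minimal Python sketch of a depth-limited crawl follows; the fetch function is injected so any HTTP client could be plugged in, the PAGES dictionary is a made-up stand-in for wishpicker.com's real structure, and link extraction uses a crude regex where a real crawler would parse the HTML properly.

```python
import re
from collections import deque
from urllib.parse import urljoin, urlparse

HREF_RE = re.compile(r'href="([^"]+)"')

def crawl_depths(start_url, fetch, max_depth=1):
    """BFS from start_url; returns {url: clicks_from_start}.
    Pages at max_depth are recorded but not expanded further."""
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        if depths[url] >= max_depth:
            continue  # record this page's depth, but don't follow its links
        for href in HREF_RE.findall(fetch(url)):
            link = urljoin(url, href)
            if link not in depths:
                depths[link] = depths[url] + 1
                # only expand internal links; external ones are recorded only
                if urlparse(link).netloc == urlparse(start_url).netloc:
                    queue.append(link)
    return depths

# Hypothetical site structure standing in for live network fetches:
PAGES = {
    "https://www.wishpicker.com/": '<a href="/gifts-for">x</a> <a href="https://twitter.com/wp">t</a>',
    "https://www.wishpicker.com/gifts-for": '<a href="/gifts-for/boyfriend">y</a>',
}
depths = crawl_depths("https://www.wishpicker.com/", PAGES.get, max_depth=1)
one_click = [u for u, d in depths.items() if d == 1]
print(one_click)
# ['https://www.wishpicker.com/gifts-for', 'https://twitter.com/wp']
```

With `max_depth=1`, /gifts-for/boyfriend never appears, which is exactly the "only 1 click away" cut described above.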
-
When you say "one click away", are you talking about a URL parameter?
I will run this through Screaming Frog and a couple of other tools and see if I can get your answer.
-
Hi Thomas
Thanks for your response. Here is my website: www.wishpicker.com
What I am looking for is all the links present only 1 click away from the page www.wishpicker.com (both internal and external).
Performing a crawl with Screaming Frog gives me all links (1, 2, 3, 4, and more clicks away). I'm not sure how to limit the crawl to show links that are only 1 click away and exclude links that are 2 or more clicks away from this page.
Look forward to your response.
Thanks!
-
Hi,
Screaming Frog does in fact show you the links that would be considered external links. Here is a great guide:
http://www.seerinteractive.com/blog/screaming-frog-guide
If you look at the external section of Screaming Frog, you'll find what you're looking for; however, you may also do this using either the campaign tool or the browser plug-in.
I would suggest reading the Seer Interactive guide and sticking with Screaming Frog; it is an outstanding tool.
Here are some other tools which I hope will help you if that is not the route you wish to go.
If you could post a screenshot of what you are looking for, or of what you mean by it only showing you the internal link count, that would help. I just want to see which screen you're looking at so I can point you to the answer you're looking for.
Here are some more tools that will let you scan up to 1,000 pages of your website for free and will tell you the information you're looking for.
http://www.internetmarketingninjas.com/tools
If you cannot find what you're looking for in there, you might want to try
http://www.quicksprout.com/2013/02/04/how-to-perform-a-seo-audit-free-5000-template-included/
distilled.net/U might be the best way to learn these types of things; however, it is a complete search engine optimization training course.
Sincerely,
Thomas
Related Questions
-
Customer Reviews on Product Page / Pagination / Crawl 3 review pages only
Hi experts, I present customer feedback, basically reviews, on my website for the products that are sold. With this comes the ability to read reviews, obviously with pagination to display the available reviews. I want users to be able to flick through and read the reviews to help them satisfy whatever curiosity they have. My only concern is that the page that contains the reviews will present roughly the same content with each click of the pagination. The only thing that changes is the title tag, which will contain the page number in the H1. I'm thinking this could be duplication, but I have yet to be notified by Google in my Search Console... Should I block crawlers from crawling beyond page 3 of reviews? Thanks
Technical SEO | Train4Academy.co.uk
-
How to find orphan pages
Hi all, I've been checking these forums for an answer on how to find orphaned pages on my site, and I can see a lot of people are saying that I should cross-check my XML sitemap against a Screaming Frog crawl of my site. However, the sitemap is created using Screaming Frog in the first place... (I'm sure this is the case for a lot of people too). Are there any other ways to get a full list of orphaned pages? I assume it would be a developer request, but where can I ask them to look / extract? Thanks!
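Whatever the tool, the underlying check is a set difference: orphan pages are URLs present in some independent inventory (a CMS export, server logs, or a database dump, not a sitemap built from the same crawl, for exactly the reason raised above) but absent from the crawl results. A small Python sketch with a made-up sitemap and crawl list:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract the <loc> URLs from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

# Hypothetical sitemap (from a CMS export) and crawl results:
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/a</loc></url>
  <url><loc>https://example.com/orphan</loc></url>
</urlset>"""
crawled = {"https://example.com/", "https://example.com/a"}

# URLs in the inventory that no crawl path reached:
orphans = sitemap_urls(sitemap_xml) - crawled
print(sorted(orphans))  # ['https://example.com/orphan']
```

The crawled set would come from a Screaming Frog export (or any crawler); the key is that the inventory side must be generated independently of the crawl.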
Technical SEO | KJH-HAC
-
Blog Page Titles - Page 1, Page 2 etc.
Hi All, I have a couple of crawl errors coming up in MOZ that I am trying to fix. They are duplicate page title issues with my blog area. For example we have a URL of www.ourwebsite.com/blog/page/1 and as we have quite a few blog posts they get put onto another page, example www.ourwebsite.com/blog/page/2 both of these urls have the same heading, title, meta description etc. I was just wondering if this was an actual SEO problem or not and if there is a way to fix it. I am using Wordpress for reference but I can't see anywhere to access the settings of these pages. Thanks
Technical SEO | O2C
-
Should I disavow links from pages that don't exist any more
Hi. I'm doing a backlinks audit for two sites, one with 48k and the other with 2M backlinks. Both are very old sites and both have tons of backlinks from old pages and websites that don't exist any more, but these backlinks still exist in the Majestic Historic index. I cleaned up the obvious useless links and passed the rest through Screaming Frog to check if those old pages/sites even exist. There are tons of link-sending pages that return a 0, 301, 302, 307, 404 etc. error. Should I consider all of these pages as bad backlinks and add them to the disavow file? Just a clarification: I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page generating an error at ping, e.g. originpage.com/page-gone sends me a link to mysite.com/product1. Screaming Frog pings originpage.com/page-gone and returns a status error. Do I add originpage.com/page-gone to the disavow file or not? Hope I'm making sense 🙂
Technical SEO | IgorMateski
-
Can you noindex a page, but still index an image on that page?
If a blog is centered around visual images, and we have specific pages with high quality content that we plan to index and drive our traffic, but we have many pages with our images...what is the best way to go about getting these images indexed? We want to noindex all the pages with just images because they are thin content... Can you noindex,follow a page, but still index the images on that page? Please explain how to go about this concept.....
Technical SEO | WebServiceConsulting.com
-
How do I find which pages are being deindexed on a large site?
Is there an easy way or any way to get a list of all deindexed pages? Thanks for reading!
Technical SEO | DA2013
-
Find all links in the site and anchor text
Hi, I need to find all links on the site along with their anchor text, and I need this done on my own website so I know whether we have links anchored to numbers and punctuation that are not visible at all. Thanks
Technical SEO | mtthompsons