Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Solved: How to solve orphan pages on a job board
-
I'm working on a website that has a job board with over 4000 active job ads. All of these ads are listed on a single "job board" page, and obviously they don't all load at the same time.
The ads are not linked to from anywhere else, so SEO tools are listing all of these job ad pages as orphans.
How much of a red flag are these orphan pages? Do sites like Indeed have this same issue? Their job ads are completely dynamic, so how do those pages get indexed?
We use Google's Search API to handle any expired jobs, so those are not the issue; it's the active but orphaned pages we are looking to solve. The site is hosted on WordPress.
What is the best way to solve this issue? Should we just create a job category page and link to each individual job ad from there? Are there any simpler and perhaps more obvious solutions? What does the website structure need to look like to solve the problem? I'd appreciate any advice you can share!
-
@cyrus-shepard-0 Thanks so much for your input! The categorization option was what we were thinking about as well, but not sure if the client will be ready to invest the time. Will definitely suggest it to them.
Not majorly concerned about the jobs being found via Google search as individual posts, it's more about avoiding the orphans, as I'm sure they will be seen as a red flag.
Also, yes, the job posts are covered in a sitemap, you are correct.
-
@michael_m Seems like you have a number of options.
Can you categorize the jobs into more specific types (e.g. region, job type, etc.) and then add them to more category-specific "job board" pages? Even if you had duplication across job boards, it seems like you'd get better crawl + indexation coverage. Anything to create a clearer crawl path to those pages helps. Even 20-50 job categories (or other sort/filter features) might provide benefit, and those category pages probably have a better chance of ranking on their own.
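As a rough illustration of the category idea on WordPress: a minimal sketch, assuming a custom post type named job_listing and a hypothetical job_region taxonomy (the real names depend on the job board plugin in use):

```php
<?php
// A possible approach: register a public taxonomy so every job ad is
// reachable from crawlable term archive pages. The "job_listing" post
// type name is an assumption -- match it to the plugin in use.
add_action( 'init', function () {
    register_taxonomy( 'job_region', 'job_listing', array(
        'label'        => 'Job Regions',
        'public'       => true,  // term archives at /job-region/<term>/
        'hierarchical' => true,
        'show_in_rest' => true,
        'rewrite'      => array( 'slug' => 'job-region' ),
    ) );
} );
```

Each term archive then lists and links to its jobs, giving crawlers a short path from the homepage to every ad.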
Cross-linking from similar/related jobs might also be a good option to explore, much like how we link to related questions here in the Q&A.
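In the same hypothetical setup, related-job links could be pulled from shared taxonomy terms. A sketch, reusing the assumed job_listing/job_region names:

```php
<?php
// Output up to five "related jobs" links on a single job page, based on
// shared job_region terms. A sketch only -- names are assumptions.
function related_jobs_links( $post_id ) {
    $terms = wp_get_post_terms( $post_id, 'job_region', array( 'fields' => 'ids' ) );
    if ( empty( $terms ) || is_wp_error( $terms ) ) {
        return;
    }
    $related = new WP_Query( array(
        'post_type'      => 'job_listing',
        'post__not_in'   => array( $post_id ),
        'posts_per_page' => 5,
        'tax_query'      => array(
            array(
                'taxonomy' => 'job_region',
                'field'    => 'term_id',
                'terms'    => $terms,
            ),
        ),
    ) );
    while ( $related->have_posts() ) {
        $related->the_post();
        printf( '<a href="%s">%s</a><br>', esc_url( get_permalink() ), esc_html( get_the_title() ) );
    }
    wp_reset_postdata(); // restore the main query's global post data
}
```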
Orphaned pages aren't always a problem, as long as the pages are getting indexed and ranked. I imagine the search volume is pretty low for some of those jobs, but Google's sitemap indexation report is going to be your friend here.
Hope that helps!
Are the job postings covered in a sitemap? Since SEO tools are flagging them as orphaned, I assume they are discovering the pages via sitemaps.
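Worth noting on the sitemap point: WordPress 5.5+ generates core XML sitemaps at /wp-sitemap.xml and includes every public post type automatically. A small sketch of the filter that controls inclusion, again assuming a public job_listing post type:

```php
<?php
// Core sitemaps include all public post types by default, so a public
// "job_listing" type gets its own wp-sitemap-posts-job_listing-*.xml
// with no extra code. The filter below only matters if you need to
// exclude something ("internal_docs" is a hypothetical example).
add_filter( 'wp_sitemaps_post_types', function ( $post_types ) {
    unset( $post_types['internal_docs'] ); // hypothetical private type
    return $post_types;
} );
```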
Related Questions
-
Best redirect destination for 18k highly-linked pages
Technical SEO question regarding redirects; I'd appreciate any insights on the best way to handle it. Situation: We're decommissioning several major content sections on a website, comprising ~18k webpages. This is a well-established site (10+ years), and many of the pages within these sections have high-quality inbound links from .orgs and .edus. Challenge: We're trying to determine the best place to redirect these 18k pages. For user experience, we believe the best option is the homepage, which has a statement about the changes to the site and links to the most important remaining sections. It's also the most important page on the site, so the boost from 301-redirected links doesn't seem bad. However, someone on our team is concerned that that many new redirected pages and links going to our homepage will trigger a negative SEO flag for the homepage, and recommends instead that they all go to our custom 404 page (which also includes links to important remaining sections). What's the right approach here to preserve the remaining SEO value of these soon-to-be-redirected pages without triggering Google penalties?
Technical SEO | davidvogel1
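One common pattern for this is section-level redirects: map each retired section to its most relevant surviving hub rather than sending all 18k URLs to a single page (Google has said that large-scale, irrelevant redirects to the homepage can be treated as soft 404s). A minimal sketch, assuming a WordPress stack (not stated in the post) and hypothetical section slugs:

```php
<?php
// Section-level 301s: send each decommissioned section to its closest
// surviving hub page. Runs on requests WordPress can't resolve (404s).
// All slugs and targets below are hypothetical placeholders.
add_action( 'template_redirect', function () {
    if ( ! is_404() ) {
        return;
    }
    $map = array(
        '/old-section-a/' => '/resources/', // hypothetical
        '/old-section-b/' => '/guides/',    // hypothetical
    );
    $path = (string) wp_parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );
    foreach ( $map as $prefix => $target ) {
        if ( 0 === strpos( $path, $prefix ) ) {
            wp_safe_redirect( home_url( $target ), 301 );
            exit;
        }
    }
} );
```

At 18k URLs, server-level rules (e.g. Apache rewrite maps) scale better, but the idea is the same: relevance-matched targets generally preserve more link equity than a blanket homepage redirect.
-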
Unsolved: Why are my site's pages getting a video index viewport issue?
Hello, I have been publishing a good number of blogs on my site, Flooring Flow. However, some of my articles are showing a video viewport error. I have tried fixing it, but the error is still showing in Google Search Console. Can anyone help me fix it?
Technical SEO | mitty270
-
Unsolved: Capturing Source Dynamically for UTM Parameters
Does anyone have a tutorial on how to dynamically capture the referring source to populate UTM parameters for Google Analytics? We want to syndicate content and be able to see all of the websites that provide referral traffic for this specific objective. We want to set a fixed utm_medium and utm_campaign but have the utm_source be dynamic and capture the referring website. If we set a permanent utm_source, it would appear the same for all incoming traffic. Thanks in advance!
Technical SEO | peteboyd0
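One way to approach this (a sketch, not a full tutorial): route syndicated links through a small redirect endpoint that reads the Referer header and fills in utm_source on the fly. The endpoint name and landing URL here are hypothetical, and note that browsers' referrer policies can strip or trim the header:

```php
<?php
// go.php (hypothetical endpoint): syndication partners link here, and
// visitors are forwarded to the landing page with utm_source set to the
// referring host. Browsers may strip the Referer header, so fall back
// to "unknown".
$landing = 'https://www.example.com/landing-page/'; // hypothetical

$host = isset( $_SERVER['HTTP_REFERER'] )
    ? parse_url( $_SERVER['HTTP_REFERER'], PHP_URL_HOST )
    : null;

$url = $landing . '?' . http_build_query( array(
    'utm_source'   => $host ? $host : 'unknown', // e.g. "partnerblog.com"
    'utm_medium'   => 'syndication',
    'utm_campaign' => 'content-syndication',     // hypothetical name
) );

header( 'Location: ' . $url, true, 302 );
exit;
```

Partners would then link to the go.php URL instead of the landing page directly, so every referral arrives tagged with its true source.
-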
Need some help understanding SEO - please help before I pull out all my hair
I'm new to SEO, and am stubbornly trying to educate myself. I have a telescope shop in Canada; it's a small business that we run on the side. We're driving lots of traffic through FB and our outreach programs, but I really want to increase our presence on search. We released a new website back in January and it killed some of our rankings. We're working our way back with a very specific set of efforts on regular SEO:
- Metadata and titles, although it seems that's not super relevant
- Building high-quality backlinks and eliminating any spammy backlinks
- Rewriting product listings so that they are original content, though I'm not sure how important this is in e-commerce
- Writing high-quality articles and blog posts
- Working relevant keywords into our product pages and titles
I understand that good SEO is about pushing on all the levers and trying to make sure that your site is as valuable to the end user as possible. We're making some good progress, but I'm puzzled by the #1 shop in Canada. They don't put any apparent effort into SEO and they still rank #1 on every key product we compete with them on. I've worked with two separate, highly ranked and regarded SEO firms on this, and neither has been able to tell me why this other site ranks so highly. Here's a specific example on a popular product that we both sell, the Celestron NexStar 8SE. Here's the link to Telescopes Canada's page for their Celestron 8SE: https://telescopescanada.ca/products/celestron-nexstar-8se-computerized-telescope-11069 and here's a link to the Celestron 8SE page from the manufacturer website: https://www.celestron.com/products/nexstar-8se-computerized-telescope Telescopes Canada has just copied and pasted. There is no original content aside from adding the shipping and return policy to the tab, and having some options for selecting accessories on the page. Here is our page: https://all-startelescope.com/products/celestron-nexstar-8se We have higher page authority and higher domain authority, and the keyword analyzer in Moz says that our page is higher quality than the Telescopes Canada page. I can't find a single metric on any tool (Ubersuggest, Moz, Ahrefs, Semrush) that says Telescopes Canada is a better site, or has a better NexStar 8SE product page. But they keep ranking ahead of us, and right at the top of Google search. Our titles are good, our metadata is good (but I don't think that's been a serious ranking factor for about ten years). Our text is original, it's relevant, and we have healthy internal links to the page. According to Moz's page ranker it's 20 points higher than Telescopes Canada's page. We have invested in some excellent blog content, and we're adding new products to the website so that we rank for more keywords. All of those things are helping, but I fundamentally don't understand why Telescopes Canada is #1 almost across the board on every key product in our market. There is something that I'm not seeing here. Can you see any metric, any tool in your toolbox, that indicates why they rank at the top, or even higher than we do, for these search terms specific to that product:
- Celestron NexStar 8SE
- NexStar 8SE
- Celestron NexStar 8SE Canada
- NexStar 8SE Canada
I have a feeling it's something technical that I'm missing, but I'm not sure how obvious it is with two 'professional' firms not finding it. I'd really appreciate any help or insight that you can offer.
Intermediate & Advanced SEO | nkennett
-
Blocking standard pages with robots.txt (T&Cs, shipping policy, pricing & privacy policies, etc.)
Hi, I've just had a best-practice site migration completed for my old e-commerce store into a Shopify environment, and I see in GSC that it's reporting my standard pages as blocked by robots.txt, such as the examples below. Surely I don't want these blocked? Is that likely due to my migration team or to default settings with Shopify, does anyone know?
- T&Cs
- Shipping policy
- Pricing policy
- Privacy policy
So in summary: Shall I unblock these? What caused it: Shopify default settings or, more likely, my migration team? All Best, Dan
Reporting & Analytics | Dan-Lawrence0
-
Help Blocking Crawlers. Huge Spike in "Direct Visits" with 96% Bounce Rate & Low Pages/Visit.
Hello, I'm hoping one of you search geniuses can help me. We have a successful client who started seeing a HUGE spike in direct visits as reported by Google Analytics. This traffic now represents approximately 70% of all website traffic. These "direct visits" have a bounce rate of 96%+ and only 1-2 pages/visit. This is skewing our analytics in a big way and rendering them pretty much useless. I suspect this is some sort of crawler activity, but we have no access to the server log files to verify this or identify the culprit. The client's site is on a GoDaddy Managed WordPress hosting account. The way I see it, there are a couple of possibilities:
1.) Our client's competitors are scraping the site on a regular basis to stay on top of site modifications, keyword emphasis, etc. It seems like whenever we make meaningful changes to the site, one of their competitors does a knock-off a few days later. Hmmm.
2.) Our client's competitors have this crawler hitting the site thousands of times a day to raise bounce rates and decrease the average time on site, which could likely have a negative impact on SEO. Correct me if I'm wrong, but I don't believe Google is going to reward sites with 90% bounce rates, 1-2 pages/visit and an 18-second average time on site.
The bottom line is that we need to identify these bogus "direct visits" and find a way to block them. I've seen several WordPress plugins that claim to help with this, but I certainly don't want to block valid crawlers, especially Google, from accessing the site. If someone out there could please weigh in on this and help us resolve the issue, I'd really appreciate it. Heck, I'll even name my third-born after you. Thanks for your help. Eric
Reporting & Analytics | EricFish
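For the blocking half of the question, a rough sketch of user-agent filtering on a WordPress site; the bot names below are hypothetical placeholders, and real offenders should be identified first (e.g. via an access-log plugin, since server logs aren't available here), taking care never to block Googlebot:

```php
<?php
// mu-plugin sketch: return 403 to known-bad user agents before WordPress
// renders anything. "BadBot" and "ScraperX" are hypothetical -- replace
// them with strings observed in your own traffic, never Googlebot.
add_action( 'init', function () {
    $ua        = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
    $blocklist = array( 'BadBot', 'ScraperX' ); // hypothetical names
    foreach ( $blocklist as $bot ) {
        if ( '' !== $ua && false !== stripos( $ua, $bot ) ) {
            status_header( 403 ); // refuse the request outright
            exit;
        }
    }
}, 0 );
```

That said, traffic that shows up as "direct" often carries an ordinary browser user agent, in which case server-level controls (IP ranges, rate limiting) or Google Analytics' known-bots filtering setting are the more realistic fix.
-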
Why would page views per visitor suddenly increase?
My website traffic is growing by about 1% a week. It has a fairly stable page views/visitor figure of about 1.69; there's normally very little variability in this, as we sell an industrial product. Today page views jumped by 50%, and so did page views/visitor, but visitor numbers stayed the same. I don't have a useful hypothesis to explain this. Analytics shows me that the traffic source, country of origin and pages viewed are pretty much the same as normal. There's been no substantive change to the site (today we changed the text in a widget to link to a new page, and no one visited it). It doesn't look like one person has gone through the whole site, as that would skew the distribution of page views by country. So why would user behaviour suddenly change? I'll look at it for the rest of the week, but in 7 years of looking after this website I haven't seen anything like this before.
Reporting & Analytics | Zippy-Bungle0
-
Sudden Increase In Number of Pages Indexed By Google Webmaster When No New Pages Added
Greetings MOZ Community: On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, going from 676 to 851 pages. No new pages had been added to the domain in the previous month. The number of pages blocked by robots increased at that time from 332 (June 1st) to 551 (June 22nd), yet the number of indexed pages still increased to 851. The following changes occurred between June 5th and June 15th:
- A new redesigned version of the site was launched on June 4th, with some links to social media and the blog removed on some pages, but with no new URLs added. The design platform was and is WordPress.
- Google GTM code was added to the site.
- An exception was made by our hosting company to ModSecurity on our server (for iframes) to allow GTM to function.
In the last ten days my web traffic has declined about 15%; however, the quality of traffic has declined enormously, and the number of new inquiries we get is off by around 65%. Click-through rates have declined from about 2.55 pages to about 2 pages. Obviously this is not a good situation. My SEO provider, a reputable firm endorsed by MOZ, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They are noticing an additional issue: the site's Contact Us form will not work if the GTM script is enabled. They find it curious that both issues occurred around the same time. Our domain is www.nyc-officespace-leader. Does anyone have any idea why these extra pages are appearing and how they can be removed? Anyone have experience with GTM causing issues with this? Thanks everyone!!!
Alan
Reporting & Analytics | Kingalan1