PageSpeed Insights DNS Issue
-
Hi
Anyone else having problems with Google's PageSpeed tool? I am trying to benchmark a couple of my sites but, according to Google, the sites are not loading. They will work when I run them through the test at one point, but if I try again, say 15 minutes later, I get the following error message:
An error has occurred
DNS error while resolving DOMAIN. Check the spelling of the host, and ensure that the page is accessible from the public Internet. You may refresh to try again.
If the problem persists, please visit the PageSpeed Insights mailing list for support.
This isn't too much of an issue for testing page speed, but I am concerned that if Google is getting this error on the speed test it will also get it when trying to crawl and index the pages.
I can confirm the sites are up and running. The sites are pointed at the server via A records and haven't been changed for many weeks, so it cannot be a DNS propagation issue. I am at a loss to explain it.
Any advice would be most welcome. Thanks.
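One way to check whether the failures are reproducible from the public internet is to poll a public resolver directly and log any misses. A rough sketch, assuming dig is available; example.com stands in for the affected domain:
~]$ while true; do dig +short @8.8.8.8 example.com A | grep -q . || echo "$(date): A lookup failed"; sleep 60; done
If this logs intermittent failures too, the problem lies with the domain's nameservers rather than with the PageSpeed tool.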
-
Hi
Thanks for looking at the issue. There should be four working nameservers: I have four set in both WHM and at my domain registrar. I added two more (3 and 4) recently, so maybe they are taking a while to propagate around the web.
Will look at the SOA, thanks. Server and domain setup isn't at the top of my skill set. The domain you mention in this thread is just a testing domain to see what happens with a certain kind of content, so it hasn't been treated too seriously, to be honest.
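A quick way to see whether all four nameservers are answering yet is to query each one directly for the domain. A rough sketch, assuming dig is available and that the two new servers follow the existing naming pattern (ns3 and ns4 here are hypothetical):
~]$ for ns in ns1 ns2 ns3 ns4; do echo "== $ns =="; dig +short A yoactive.co.uk @$ns.ccgdigitalmarketing.com; done
Any server that returns nothing, or times out, has not picked up the zone yet.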
-
It looks like you have some NS problems according to IntoDNS. The biggest issue is that you have four nameservers listed with your registrar but only two listed in your DNS zone:
~]$ dig NS yoactive.co.uk
;; ANSWER SECTION:
yoactive.co.uk. 86400 IN NS ns2.ccgdigitalmarketing.com.
yoactive.co.uk. 86400 IN NS ns1.ccgdigitalmarketing.com.
You should have an NS record for each DNS server. Also, your SOA is set to expire in... 5 weeks?! That's crazy long.
If you can't get this resolved with your host, you can always try CloudFlare, a free DNS + proxy service.
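For anyone wanting to verify the mismatch themselves: compare the NS set the parent registry hands out (the delegation) with the NS set the zone itself publishes. A rough sketch, assuming dig is available (dns1.nic.uk should be one of the .uk registry's nameservers):
~]$ dig NS yoactive.co.uk @dns1.nic.uk
~]$ dig NS yoactive.co.uk @ns1.ccgdigitalmarketing.com
~]$ dig +short SOA yoactive.co.uk
In the first query the delegation shows up in the AUTHORITY section (the registry server is not authoritative for the zone itself); it should match the ANSWER section of the second. In the +short SOA output, the fourth numeric field is the expire time in seconds; five weeks is roughly 3024000. If the registry delegates to four servers but only two actually answer, resolvers will sometimes pick a dead one and time out, which would explain the intermittent failures PageSpeed Insights is reporting.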
-
Thanks, very useful tool. None of my domains passed, but I can still access them. I will try moving away from the A record setup and use nameservers instead.
-
Hi, you can check your DNS config here:
http://dnscheck.pingdom.com. If there are problems resolving your domain, ask your hosting provider to set the DNS right; otherwise you have to do it yourself in your hosting configuration tool, like DirectAdmin or Plesk.
Greetings, Leonie