Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Blocking standard pages with robots.txt (T&Cs, shipping policy, pricing & privacy policies, etc.)
Hi
I've just had a best-practice site migration completed, moving my old e-commerce store into a Shopify environment, and I see in GSC that it's reporting my standard pages as blocked by robots.txt - the examples below. Surely I don't want these blocked? Does anyone know whether that's likely down to my migration team or a default setting in Shopify?
- T&Cs
- Shipping policy
- Pricing policy
- Privacy policy
- etc.
So in summary:
- Shall I unblock these?
- What caused it: Shopify default settings, or more likely my migration team?
All Best
Dan
-
Thanks for your advice Alex - yes, I agree. I'll ask Shopify whether this was them (re: default settings) or whether my migrators have been over-enthusiastic, contrary to best practice.
Have a great BH weekend!
All Best
Dan
-
I wouldn't block them. While it's unlikely to affect the rank of your other pages, it may result in a poorer user experience - e.g. if someone were to search for one of your policies in Google, it would not be returned.
I'm afraid I'm not an expert on Shopify at all, so I can't answer why they would have been blocked.
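For anyone reading this thread later: a quick way to confirm which of these paths the live robots.txt actually disallows is to fetch the file and test the URLs against it. Shopify generates a default robots.txt (and newer stores can reportedly customise it via a robots.txt.liquid template), so it's worth checking the file itself before blaming the migration. Below is a minimal TypeScript sketch (Node 18+ for the global fetch); the store domain and the /policies/ paths are placeholders, and the check is a simplified prefix match that ignores Allow rules, wildcards and per-bot groups - a rough indicator, not a full parser.

```typescript
// Placeholder store domain -- replace with your own.
const STORE = "https://your-store.example.com";

// Placeholder paths for the policy pages reported as blocked in GSC.
const pagesToCheck = [
  "/policies/terms-of-service",
  "/policies/shipping-policy",
  "/policies/privacy-policy",
];

async function reportDisallowedPaths(): Promise<void> {
  const res = await fetch(`${STORE}/robots.txt`);
  const body = await res.text();

  // Collect every Disallow value in the file (simplified: ignores which
  // user-agent group a rule belongs to, plus Allow rules and wildcards).
  const disallows = body
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.toLowerCase().startsWith("disallow:"))
    .map((line) => line.slice("disallow:".length).trim())
    .filter((rule) => rule.length > 0);

  for (const path of pagesToCheck) {
    const blocked = disallows.some((rule) => path.startsWith(rule));
    console.log(`${path}: ${blocked ? "matches a Disallow rule" : "no Disallow match"}`);
  }
}

reportDisallowedPaths().catch(console.error);
```

If the policy pages do turn out to be disallowed, unblocking them is generally the safer default, as the answer above suggests.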
Related Questions
If website users don't accept GDPR cookie consent, does that prevent GA-GTM from tracking pageviews and any traffic from that user that would cause significant traffic decreases?
I've been doing a lot of research on GDPR impact and implementation with GTM-GA for clients, but in the 12 months since GDPR went live I haven't found anything on how GA traffic has been impacted when users don't accept cookie consent. However, I'm personally seeing GA accounts taking huge losses in traffic since implementing GDPR cookie solutions (because GTM/GA tags aren't firing until cookies are accepted). Is it common for websites to see significant decreases in traffic due to too many users not accepting cookie consent? Are there alternative solutions to avoid traffic loss like that and still maintain GDPR compliance? It seems to me that the industry underestimated how many people won't accept cookie consent. Most of the documentation and articles around GDPR's start (May 2018) didn't foresee or cover that aspect properly; everything seems to be technically focused, with the assumption that if implemented properly most people would accept cookie consent, but I'm personally not seeing that trend and it's destroying GA data (lost traffic, minimal source attribution, inaccurate behavior data, etc.). Thanks.
Reporting & Analytics | Kickboard2
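A note for readers of the consent question above: Google has since introduced Consent Mode, which lets tags load with storage denied by default and be updated once the visitor accepts, so some of the otherwise-lost traffic can be modelled rather than dropped entirely. This postdates the original question, so verify the details against current documentation. A minimal sketch, assuming the standard gtag.js/GTM snippet is already on the page and that onConsentAccepted is a hypothetical callback wired to your consent banner:

```typescript
// Assumes the standard gtag.js / GTM snippet is already on the page,
// which defines window.dataLayer and the gtag() helper.
declare function gtag(...args: unknown[]): void;

// Before any tags fire: deny storage by default for visitors who have not consented.
gtag("consent", "default", {
  ad_storage: "denied",
  analytics_storage: "denied",
});

// Hypothetical callback wired to the consent banner's "accept" action.
function onConsentAccepted(): void {
  gtag("consent", "update", {
    ad_storage: "granted",
    analytics_storage: "granted",
  });
}
```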
Google Analytics Question - Impressions & Queries Up, Sessions Down
I'm working with a client where, according to the Google query report, impressions and queries are up since we started work with them about 6 months ago, but Google sessions are down. In Moz, we're seeing a gradual but steady increase in search visibility, specifically with Google. Note: this is all organic. In the first month we were tracking queries there were 43,581 impressions and 690 click-throughs for the month. This past month there were 98,293 queries and 1,015 click-throughs for the month (granted, not year-over-year data) - of these 1,015 clicks, 995 of them were from web. However, for those same time periods, sessions from Google are down over 30% - 1,750 vs. 1,189. I'm not sure how to interpret this. I realize that clicks and sessions are not a straightforward comparison, but I would think that if clicks were up according to the query report, sessions would also be up. Is it that some of these clicks are bouncing and therefore not being tracked as a session? Is there a potential issue with how data is being tracked?
Reporting & Analytics | Corporate_Communications0
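One way to reconcile the figures in the question above: clicks grew more slowly than impressions, so CTR fell, and GSC clicks are in any case measured differently from GA sessions. A quick sketch of the arithmetic, using the numbers quoted (not independently verified):

```typescript
// Figures quoted in the question above.
const firstMonth = { impressions: 43_581, clicks: 690 };
const lastMonth = { impressions: 98_293, clicks: 1_015 };

// Click-through rate as a percentage.
const ctr = (m: { impressions: number; clicks: number }): number =>
  (100 * m.clicks) / m.impressions;

console.log(ctr(firstMonth).toFixed(2)); // ~1.58% CTR
console.log(ctr(lastMonth).toFixed(2));  // ~1.03% CTR

// Clicks grew ~47% while impressions grew ~126%, so a falling CTR can coexist
// with rising impressions -- and GA sessions are counted by a different system
// entirely, so the 30% drop is not a direct contradiction of the GSC numbers.
```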
Help Blocking Crawlers. Huge Spike in "Direct Visits" with 96% Bounce Rate & Low Pages/Visit.
Hello, I'm hoping one of you search geniuses can help me. We have a successful client who started seeing a HUGE spike in direct visits as reported by Google Analytics. This traffic now represents approximately 70% of all website traffic. These "direct visits" have a bounce rate of 96%+ and only 1-2 pages/visit. This is skewing our analytics in a big way and rendering them pretty much useless. I suspect this is some sort of crawler activity but we have no access to the server log files to verify this or identify the culprit. The client's site is on a GoDaddy Managed WordPress hosting account. The way I see it, there are a couple of possibilities:
1.) Our client's competitors are scraping the site on a regular basis to stay on top of site modifications, keyword emphasis, etc. It seems like whenever we make meaningful changes to the site, one of their competitors does a knock-off a few days later. Hmmm.
2.) Our client's competitors have this crawler hitting the site thousands of times a day to raise bounce rates and decrease the average time on site, which could likely have a negative impact on SEO.
Correct me if I'm wrong, but I don't believe Google is going to reward sites with 90% bounce rates, 1-2 pages/visit and an 18-second average time on site. The bottom line is that we need to identify these bogus "direct visits" and find a way to block them. I've seen several WordPress plugins that claim to help with this but I certainly don't want to block valid crawlers, especially Google, from accessing the site. If someone out there could please weigh in on this and help us resolve the issue, I'd really appreciate it. Heck, I'll even name my third-born after you. Thanks for your help. Eric
Reporting & Analytics | EricFish
Google Analytics Goal/Event/SOMETHING to show only Wordpress "Posts", not pages, etc
Hi all, Our site is built on Wordpress and formerly the post URLs had the typical date format at the beginning. This made it easy for me to look at, for example, all search traffic to the blog. I would just view URLs containing /2014/ and /2015/ and boom. We have since removed the dates from the URLs, with proper redirects etc., which is great, but now I can't figure out a way to look at ONLY the blog in GA. I like to track a KPI of 'search visits to blog posts' and I can't figure out how to now. Can I set up a GA event that only fires when the post type template for blog posts loads? Some other solution? I'm lost here, and there's gotta be a good way to do it...
Reporting & Analytics | 3DR0
Why would page views per visitor suddenly increase?
My website traffic is growing by about 1% a week. It has a fairly stable page views/visitor figure of about 1.69, and there's normally very little variability in this, as we sell an industrial product. Today page views jumped by 50% and so did page views/visitor, but visitor numbers stayed the same. I don't have a useful hypothesis to explain this. Analytics shows me that the traffic source, country of origin and pages viewed are pretty much the same as normal. There's been no substantive change to the site (today we changed the text in a widget to link to a new page - and no one visited it). It doesn't look like one person has gone through the whole site, as that would skew the distribution of page views by country. So why would user behaviour suddenly change? I'll look at it for the rest of the week, but in 7 years of looking after this website I haven't seen anything like this before.
Reporting & Analytics | Zippy-Bungle0
Sudden Increase In Number of Pages Indexed By Google Webmaster When No New Pages Added
Greetings MOZ Community: On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, going from 676 to 851 pages. No new pages had been added to the domain in the previous month. The number of pages blocked by robots increased at that time from 332 (June 1st) to 551 (June 22nd), yet the number of indexed pages still increased to 851. The following changes occurred between June 5th and June 15th:
- A new redesigned version of the site was launched on June 4th, with some links to social media and the blog removed on some pages, but with no new URLs added. The design platform was and is Wordpress.
- Google GTM code was added to the site.
- An exception was made by our hosting company to ModSecurity on our server (for i-frames) to allow GTM to function.
In the last ten days my web traffic has declined about 15%, however the quality of traffic has declined enormously and the number of new inquiries we get is off by around 65%. Click-through rates have declined from about 2.55 pages to about 2 pages. Obviously this is not a good situation. My SEO provider, a reputable firm endorsed by MOZ, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They are noticing an additional issue: the site's Contact Us form will not work if the GTM script is enabled. They find it curious that both issues occurred around the same time. Our domain is www.nyc-officespace-leader. Does anyone have any idea why these extra pages are appearing and how they can be removed? Anyone have experience with GTM causing issues with this? Thanks everyone!!! Alan
Reporting & Analytics | Kingalan1
Is it possible to use Google Tag Manager to pass a user’s text input into a form field to Google analytics?
Hey Everyone, I finally figured out how to use auto event tracking with Google Tag Manager, but didn't get the data I wanted. I want to see what users are typing into the search field on my site (the URL structure of my site isn't set up properly to use GA's built-in site search tracking). So, I set up the form submit event tracking in Google Tag Manager and used the following as my event tracking parameters:
Category: Search
Action: Search Value
When I test and look in Google Analytics I just see: "search" and "search value." I wanted to see the text that I searched on my site, not just the Action and Category of the event... Is what I'm trying to do even possible? Do I need to set up a different event tracking parameter? Thanks everyone!
Reporting & Analytics | DaveGuyMan0
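For the site-search question above, the usual workaround when GA's built-in Site Search can't read the query from the URL is to push the typed term into the dataLayer on submit and read it in GTM through a Data Layer Variable mapped to, say, the event label. A rough TypeScript sketch; the #site-search selector, the q input name, and the site_search/searchTerm keys are placeholders for your own setup:

```typescript
// Assumes the GTM container snippet is already on the page, which defines dataLayer.
declare const dataLayer: Array<Record<string, unknown>>;

// Placeholder selector and input name -- adjust to match your own search form.
const searchForm = document.querySelector<HTMLFormElement>("#site-search");

if (searchForm) {
  searchForm.addEventListener("submit", () => {
    const input = searchForm.querySelector<HTMLInputElement>('input[name="q"]');
    dataLayer.push({
      event: "site_search",           // use as a Custom Event trigger in GTM
      searchTerm: input?.value ?? "", // expose via a Data Layer Variable in GTM
    });
  });
}
```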
Big variation in the number of search results. (person's name)
Hi, I have been noticing a really dramatic variation in the number of results Google is returning for the name "Carolyn Hadlock." Most of the time it seems to be around 2,000, but then it will jump up to over 10,000. Does anyone know why there would be such a big jump? And then why it would go back? I've tested both logged into Google and not, as well as having others log in as themselves. That does not seem to be it. Any thoughts would be much appreciated.
Reporting & Analytics | yandl0