

Add Expires Headers? Why You Should Think Twice Before Using Them
This article explains what expires headers are and how they benefit SEO, covers the dangers of improper implementation, and offers some insight on preventing issues.
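As a quick illustration of the topic (a hypothetical Python helper, not code from the article itself), this sketch shows roughly what far-future caching headers look like on the wire: an `Expires` HTTP-date plus its modern `Cache-Control` equivalent.

```python
import time
from email.utils import formatdate

ONE_YEAR = 365 * 24 * 60 * 60  # seconds

def caching_headers(max_age=ONE_YEAR):
    """Return (name, value) header pairs a server might send so that
    browsers cache a static asset for roughly one year."""
    return [
        # An HTTP-date one year from now, e.g. "Tue, 01 Jan 2030 00:00:00 GMT"
        ("Expires", formatdate(time.time() + max_age, usegmt=True)),
        # The modern equivalent; it takes precedence over Expires when both are set
        ("Cache-Control", f"public, max-age={max_age}"),
    ]

print(dict(caching_headers()))
```

The `caching_headers` name and the one-year window are illustrative assumptions; the article's point is that values like these must be chosen carefully, since an overly aggressive expiry can keep stale content in front of users.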
Traditionally, the phrase Technical SEO refers to optimizing your site for crawling and indexing, but it can also include any technical process meant to improve search visibility.
Technical SEO is a broad and exciting field, covering everything from sitemaps and meta tags to JavaScript indexing, linking, keyword research, and more.
If you’re new to SEO, we recommend starting with the chapter on Technical SEO in our Beginner’s Guide. Below are the latest posts on technical SEO, and we’ve included a few top articles here.
On-Site SEO : What are the technical on-page factors that influence your rankings? Our free learning center will get you started in the right direction.
The Web Developer's SEO Cheat Sheet : This handy—and printable—cheat sheet is invaluable for anyone building websites. Contains several useful references that cover a ton of technical SEO best practices.
MozBar : This free Chrome extension is an advanced SEO toolbar that helps you to examine and diagnose several technical SEO issues.
The Technical SEO Renaissance : Is it true that technical SEO isn't necessary, because Google is smart enough to figure your website out? Mike King puts this rumor to rest, and shows you what to focus on.
Technical SEO: The One Hour Guide to SEO : Want a quick introduction to the basics of technical SEO? Our guru Rand has you covered—all in about 10 minutes.
Whether your website exists to reach out to customers, engage with them, build a community, sell products or act as a voice for your offline business, all of your efforts at maintaining your digital presence may be in vain if your website itself doesn’t display correctly!
Hi Mozzers, I've recently had to deal with several indexing problems that a few clients were experiencing. After digging deeper into the problems, I figured I'd write a post for Moz to share my experience so others don't have to spend as much time digging for answers to indexation problems. All it means is that your site, or parts of it, is not getting added to the Google index (or one of the other guys'), which means that nobody will ever find your content in the search results.
Handling expired content can be an overwhelming experience for any SEO in charge of a dynamic website, whether it's an e-commerce site, a classifieds site (for example: job search or real estate listings), or a seasonal/promotional site (for example: New York Fashion Week).
A lot of things can go wrong when you change most of the URLs on a website with thousands or millions of pages. But this is the story of how something went a little too "right", and how it was fixed by doing something a little bit "wrong". On February 28, 2012, FreeShipping.org relaunched with a new design and updated site architecture. The site'...
First of all, thank you to everyone who listened in to the Microformats and Schema.org webinar with Richard Baxter and myself. If you are a PRO member and haven't had a chance to listen in, be sure to check it out! During and after the webinar we received a ton of great feedback and q...
“It’s official, Google is broken and my career is over. Time to hide under my desk.” A bit extreme? Yes. But, if you saw what I saw a month ago, your reaction would’ve been exactly the same. Let me explain.
Rich snippets -- we see them everywhere in the SERPs, with some verticals having a higher abundance of them than others. For the average searcher, these rich snippets help show that what they're searching for is within reach on a particular site.
Domain migrations are one of those activities that can benefit an SEO process in the long term -- especially if the new domain is more relevant, already has high authority, or gives better geolocation signals with a ccTLD -- but they also represent a risk for SEO, because multiple tasks must be performed correctly in order to avoid potentially non-trivial crawling and indexing problems and the consequent loss of rankings and organic traffic.
Building websites using AJAX to load content can make them fast, responsive and very user friendly. However, it's not always been possible to do this without introducing # or #! symbols into URLs - and breaking the way URLs are 'supposed' to work. The method outlined here will let you build fast AJAX-based websites that also work well for SEO.
I've deliberately put myself in some hot water to demonstrate how I would do a technical SEO site audit in 1 hour to look for quick fixes (and I've actually timed myself just to make it harder). For the pros out there, here's a look into a fellow SEO's workflow; for the aspiring, here's a base set of checks you can do quickly. I've got some lovely volu...
In this post I will explain how to handle cases of planned downtime. That is, a short period of time wherein you purposely make your website inaccessible. This can be due to significant changes to the site or because of server maintenance.
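A common recommendation for planned downtime is to answer every request with HTTP 503 plus a Retry-After header, so crawlers treat the outage as temporary rather than de-indexing pages. As a minimal sketch (a hypothetical WSGI app, not the post's own code), that might look like:

```python
def maintenance_app(environ, start_response):
    """Minimal WSGI app for planned downtime: answer every request with
    503 Service Unavailable so search engines know to come back later
    instead of dropping the URLs from their index."""
    start_response("503 Service Unavailable", [
        ("Content-Type", "text/plain; charset=utf-8"),
        ("Retry-After", "3600"),  # hint: retry in about an hour
    ])
    return [b"Down for scheduled maintenance. Back within the hour."]
```

You would swap this app in for the real one only for the duration of the maintenance window; the `3600`-second retry hint is an illustrative value.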
A website's code is like a play that tells a story to the search engine. If you have ink blotches and pages ripped or missing from the script, it is hard for the search engine to understand the plot; and if the search engine misses the plot, it cannot tell others about it.
Matt Cutts announced at Pubcon that Googlebot is "getting smarter." He also announced that Googlebot can crawl AJAX to retrieve Facebook comments, coincidentally only hours after I unveiled Joshua Giardino's research at SearchLove New York suggesting Googlebot is actually a headless browser based on the Chromium codebase. I'm going to challenge Matt Cutts's statements: Googlebot hasn't just recently gotten smarter; it actually hasn't been a text-based crawler for some time now, nor has BingBot or Slurp, for that matter. There is evidence that search robots are headless web browsers, and the search engines have had this capability since 2004.