Grouping content into large clusters that all revolve around the same topic builds more relevance for the keywords you're trying to rank for within that topic, and it makes it much easier for Google to associate your content with it. Not only that, it also makes interlinking between related pages far simpler, pushing more internal links through your website.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only tells well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. For one thing, search engines can still reference the URLs you block (showing just the URL, with no title or snippet) if links to those URLs exist somewhere on the Internet (for example, in referrer logs). Non-compliant or rogue crawlers that don't acknowledge the Robots Exclusion Standard may simply ignore the instructions in your robots.txt. Finally, a curious user can read the directories and subdirectories listed in your robots.txt file and guess the URLs of the content you don't want seen.
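To make that limitation concrete, here is a minimal Python sketch (the site, crawler name, and blocked path are hypothetical, assumed purely for illustration) showing that a Disallow rule is only advisory: a polite crawler checks robots.txt voluntarily, while a direct request to the same URL is still answered by the server.

```python
import urllib.error
import urllib.request
import urllib.robotparser

# Hypothetical site and path -- assumptions for illustration only.
SITE = "https://example.com"
BLOCKED_PATH = "/private/report.html"

# A well-behaved crawler reads robots.txt and respects its Disallow rules...
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()
print("Polite crawler allowed?", rp.can_fetch("MyCrawler", SITE + BLOCKED_PATH))

# ...but robots.txt does not stop the server from serving the page.
# Any client that simply ignores robots.txt can still request the URL directly.
try:
    with urllib.request.urlopen(SITE + BLOCKED_PATH) as resp:
        print("Direct request status:", resp.status)  # served if the page exists
except urllib.error.HTTPError as err:
    print("Server responded with:", err.code)
```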
Thanks for sharing these tips, Brian. Agree with all of these, except maybe #3, "Delete zombie pages." A better strategy would be to update those pages with fresh content and convert them into long-form blog posts or guides. Deleting them entirely means each URL either returns a 404 or needs a 301 redirect – both of which can hurt your organic traffic in the short run.
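For anyone weighing that trade-off, here is a rough sketch of the redirect side, assuming a Flask app and made-up URL mappings: removed URLs are pointed at their closest live replacements with a 301, and anything without a replacement falls back to a 404.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping of removed "zombie" URLs to their closest live replacements.
REDIRECTS = {
    "/old-thin-post": "/updated-long-form-guide",
    "/2016-roundup": "/evergreen-roundup",
}

@app.route("/<path:old_path>")
def handle_removed_page(old_path):
    # In a real site, live pages would have their own routes; this catch-all
    # only handles URLs that no longer exist.
    target = REDIRECTS.get("/" + old_path)
    if target:
        # 301 tells browsers and search engines the move is permanent.
        return redirect(target, code=301)
    # No replacement available: let the URL return a 404.
    return "Page not found", 404
```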
Google’s local search results are displayed within Google Maps after a user searches for a keyword that includes a location, such as “pizza New York.” Landing in Google’s local search results requires a Google My Business account. This is key for brick-and-mortar businesses because it puts your business in front of local people who are ready to buy.