Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
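If you want to sanity-check this yourself, here is a minimal sketch using Python's standard-library robots.txt parser. The domain and resource URLs are placeholders for illustration, not taken from any real site:

```python
# Minimal sketch: check whether Googlebot is allowed to fetch a page's
# CSS/JS/image resources according to robots.txt.
# The domain and resource URLs below are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://example.com/robots.txt")
robots.read()

resources = [
    "https://example.com/assets/site.css",
    "https://example.com/assets/app.js",
    "https://example.com/images/hero.jpg",
]

for url in resources:
    if not robots.can_fetch("Googlebot", url):
        print(f"Blocked for Googlebot: {url}")
```

Anything the script flags as blocked is a resource Googlebot cannot use when deciding whether the page is mobile-friendly.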
Hi Brian, a very useful post, thanks for sharing. These things turned out to be very useful for us: blocking thin content/pages from the Google index, adjusting meta titles/descriptions and the content of popular articles, improving internal links, improving page speed, implementing schema (didn’t notice a big difference here), optimizing images and their alt tags, making sure the site is mobile friendly and there are no accessibility issues (Lighthouse in Google Chrome helped a lot), some link building activity (long term), and of course keyword research and mapping. Thanks again for providing valuable info, regards.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
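If your comment system lets you touch the HTML before it is published, a small script can apply nofollow to untrusted links for you. Here is a minimal sketch using BeautifulSoup (my own choice of library; the function name and sample markup are illustrative, not from the article):

```python
# Minimal sketch: add rel="nofollow" to outbound links in user-submitted
# comment HTML so none of your site's reputation is passed along.
from bs4 import BeautifulSoup

def nofollow_external_links(comment_html: str, own_domain: str) -> str:
    soup = BeautifulSoup(comment_html, "html.parser")
    for a in soup.find_all("a", href=True):
        if own_domain not in a["href"]:   # leave internal links untouched
            a["rel"] = "nofollow"
    return str(soup)

print(nofollow_external_links(
    '<p>Check out <a href="http://spammy-site.example">this site</a></p>',
    "yourblog.example",
))
```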
I did want to ask you about the section on “Don’t Focus on Long Tail Keywords”. This is topical for me as I actually have a tab open from a recent post on the MOZ blog from Rand Fishkin that details reasons why you should focus on long tail keywords. I know you have said that “they have their place”, but as I say, as a newbie to all of this, ever so slightly differing opinions from two authoritative people in the industry (that’s you and Rand of course 🙂) frazzle my brain somewhat and I’m not sure whether to turn left or right!
This content will help you boost your rankings in two primary ways. First, more content means more keywords, and therefore more opportunities for Google to return your site in the search results. Second, the more content you have, the more links you generally accumulate. Plus, having lots of content is great for getting visitors to stay on your site longer. Win-win!
Hi Brian! I’ve been reading your blog for quite a while and had a good amount of success. 🙂 I have a request, similar to what someone else mentioned. It would be nice to learn the HOW behind the tips you give, including details like which pro designer you hired, or at least where you found them and what their credentials should be (for which tasks), etc. Example: you used a custom coder to change the “last updated” date. So how do we find our own custom coder, and what coding language would they need to know? Same thing with jump links. Also, which pro graphics designer do you use, or at least, where did you find them, and what type of graphics skills do they need?
As keywords are essentially the backbone of on-page SEO, you need to pay a lot of attention to them. There is no reason not to include them in your URLs, and the inclusion has its benefits. When you work the targeted keyword into the URL, you give Google another reason to consider your article relevant for that particular phrase.
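As a quick illustration, here is a small Python sketch that folds a target keyword into a URL slug. The helper name and the example keyword are made up for illustration:

```python
# Minimal sketch: build a URL slug that contains the target keyword.
import re

def keyword_slug(title: str, keyword: str) -> str:
    # Prepend the keyword only if the title doesn't already contain it.
    base = title if keyword.lower() in title.lower() else f"{keyword} {title}"
    return re.sub(r"[^a-z0-9]+", "-", base.lower()).strip("-")

print(keyword_slug("Our Complete Guide", "on-page seo"))
# -> "on-page-seo-our-complete-guide"
```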
It’s a simple Google Chrome extension. First, you have to install the extension in your Google Chrome browser. Once installed, it will appear as a little checkmark icon beside your address bar. When you click on it, it will immediately start scanning all the links on a particular web page. If a link is broken or dead, it will be highlighted in red, and the error will be shown right beside the text (e.g., “404”).
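For context, here is a rough Python sketch of the same idea the extension implements (this is not how the extension itself is built; requests and BeautifulSoup are my own choices, and the page URL is a placeholder):

```python
# Rough sketch of a broken-link check: fetch a page, collect its links,
# and flag any that return an error status such as 404.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://example.com/some-post/"
html = requests.get(page, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    url = urljoin(page, a["href"])
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "error"
    if status == 404 or status == "error":
        print(f"Broken: {url} ({status})")
```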
To find the right people I downloaded a list of some of the most popular users within the community. To do this, I used Screaming Frog SEO Spider to gather a list of all the URLs on the website. I then exported this list into an Excel spreadsheet and filtered the URLs to only show those that were user profile pages. I could do this because all of the profile pages had /user/ within the URL.
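If you would rather skip Excel, the same filter is a few lines of Python. This assumes the crawl was exported as a CSV with an "Address" column holding each URL; the exact file and column names depend on which export you choose:

```python
# Minimal sketch: filter a crawl export down to user-profile URLs,
# i.e. every URL containing /user/.
import csv

profile_urls = []
with open("internal_all.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row.get("Address", "")
        if "/user/" in url:
            profile_urls.append(url)

print(f"Found {len(profile_urls)} profile pages")
```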
I have been following Brian Dean for quite some time, and his Skyscraper Technique brought some new changes to the field of SEO. This article contains valuable information that will help beginners and people who are new to SEO understand the real meaning of search engine optimization. Thanks for sharing these significant tips with the community.
If you’re updating the whole post, I think bloggers should create a brand new post and link the old to the new. I know, you’re starting over with no “link juice,” but at least it’s clear to the reader that the post has gotten a makeover. I remember reading a new post of yours a few months ago. I was about 25% through it and thought, “Man, this sounds familiar.” So I checked it out on archive.org and realized that you had updated and republished it.

If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user agent. If you are using separate URLs, signal the relationship between the two URLs by adding link tags with rel="canonical" and rel="alternate" elements.
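To spot-check two of these signals on a live page, here is a small Python sketch using requests and BeautifulSoup (my own tooling choice, and the URL is a placeholder). It looks for the viewport meta tag used by responsive designs and for a Vary header that mentions the user agent, as dynamic serving requires:

```python
# Minimal sketch: check a page for a viewport meta tag (responsive design)
# and a "Vary: User-Agent" header (dynamic serving).
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"
response = requests.get(url, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

viewport = soup.find("meta", attrs={"name": "viewport"})
print("viewport meta:", viewport["content"] if viewport else "missing")

vary = response.headers.get("Vary", "")
print("Vary header includes User-Agent:", "user-agent" in vary.lower())
```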
For example, a plumber could first find a service that has search volume on Google but may not be covered in depth on competitors’ websites. In this example, a service like “leak detection” may be listed with a small blurb on your competitors’ sites, but none of them have elaborated on every angle or created FAQs, videos, or images. This represents an opportunity to dominate that topic.
Next, log into Google AdWords and click “Tools” > “Keyword Planner.” Once you’re on the Keyword Planner menu, click “Search for new keywords using a phrase, website or category.” Complete the form that appears; to start with, search for your type of business and location. For example, if you own a hair salon in Chicago, you would want to enter “hair salon Chicago.”