Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
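In HTML, nofollow is simply a value of a link's rel attribute. As a minimal sketch (the URL and anchor text are placeholders), a comment system might mark up a user-submitted link like this:

    <!-- A user-submitted link in a comment, marked so search engines
         don't treat it as an editorial endorsement -->
    <a href="https://example.com/" rel="nofollow">commenter's site</a>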
An easy way to keep your website current and relevant is to maintain an active blog. Blogging lets you create posts that use your keywords while signaling to Google that your site is being kept fresh, without your having to rework existing pages. Consider writing on topics that answer frequently asked questions, or share your expertise in your industry.
This is another masterpiece. You’re not only a top SEO, but your writing skills are also amazing. Good to see that I am already doing most of the things you mentioned under the On-Page SEO tips. The only thing I am currently struggling with is getting my content published on top sites. Can you come up with a detailed article about how to approach top sites in your niche and get your content approved? Thanks
I had never read a post this informative until I came across this one, and I am hooked. It helped me recognize broken links that I had no idea were sitting there. Also, right in the beginning about having a link-worthy site, it's like you were talking to me about writing "see the link here" anchor text, and the LSI advice was a good tip I knew nothing about. Thank you so much and all the best.

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
One of your tips is “Make your content actionable”… I feel like this is not always the case, depending on the context of the blog post. If I were to write a post on, say, “35 Incredible Linear Actuator Applications”, I feel like that’s something extremely hard to be actionable about. Rather, I feel my purpose is to be informative. Both of us are going for backlinks, but your blog, I feel, is aimed at giving awesome tips and advice to SEOs, whereas MY BLOG is about informing people about linear actuators and their applications.
After you hit “Get ideas,” you will see the average monthly searches for the term you entered, plus other related search terms that you may want to use as keywords on your website. Create a list of the keywords and search terms you want to rank for, and fold them into your website copy and content as naturally as possible. As Google sees these keywords used naturally across your site, it is more likely to treat your pages as relevant, quality results for those searches.
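Where those keywords land matters: the title tag and description meta tag are two natural homes for them. A minimal sketch, using a hypothetical "hair salon Chicago" target term and a made-up business name:

    <head>
      <!-- Hypothetical example: the target term appears naturally in the
           title and description rather than being repeated unnaturally -->
      <title>Hair Salon in Chicago | Example Salon</title>
      <meta name="description"
            content="Example Salon is a hair salon in downtown Chicago offering cuts, color, and styling.">
    </head>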

Ever heard of Maslow's hierarchy of needs? It's a theory of psychology that prioritizes the most fundamental human needs (like air, water, and physical safety) over more advanced needs (like esteem and social belonging). The theory is that you can't achieve the needs at the top without ensuring the more fundamental needs are met first. Love doesn't matter if you don't have food.

Expertise and authoritativeness increase a site's quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources helps users gauge an article's expertise. Representing well-established consensus on scientific topics is a good practice where such consensus exists.
I have been reading your emails for a long time, and the tips and techniques are always wonderful. I have used many of the suggestions and highly recommend that others do the same. SEO really is like playing with search engines: keeping your eye on everything going on and all the changes, and even bringing your own sense of humour to writing metas, sometimes guessing what users will type when searching for products or services.
Link roundups are curated, regularly published lists of links in which bloggers share their favorite content from a given period. Roundups are mutually beneficial relationships: curating content is a lot of work, so the bloggers who create these roundups are actively seeking content to link to, and you can land links in bunches. After you pitch the blogger who curates a roundup, connect with them on social media; that way, they'll discover your future updates naturally, and over time you will gain roundup coverage without having to pitch at all. I've gained some backlinks from link roundups.



The syntax and appearance of wikilinks may vary. Ward Cunningham's original wiki software, the WikiWikiWeb, used CamelCase for this purpose. CamelCase was also used in early versions of Wikipedia and is still used in some wikis, such as TiddlyWiki, Trac, and PmWiki. A common markup syntax is the use of double square brackets around the term to be wikilinked. For example, the input "[[zebras]]" is converted by wiki software using this markup syntax into a link to a zebras article. Hyperlinks used in wikis are commonly classified by their target: internal links within the same wiki, interwiki links to other wikis, and external links to other websites.
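As a concrete illustration of the double-bracket syntax, here is wikitext input alongside the HTML a MediaWiki-style engine would typically produce; the /wiki/ URL scheme and the first-letter capitalization are assumptions that vary by wiki engine:

    Input wikitext:      [[zebras]]          [[zebras|striped equids]]
    Typical HTML output: <a href="/wiki/Zebras">zebras</a>
                         <a href="/wiki/Zebras">striped equids</a>

The piped form lets the visible link text differ from the name of the target page.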
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
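For example, a robots.txt file like the following (directory names are hypothetical) hides exactly the assets Googlebot needs for rendering; the fix is usually just to remove such lines:

    # Problematic: blocks stylesheets, scripts, and images from all
    # crawlers, so Googlebot can't render pages the way users see them
    User-agent: *
    Disallow: /css/
    Disallow: /js/
    Disallow: /images/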

Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
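A minimal sketch of such a page follows; the links are placeholders, and the server should still return an actual 404 status code for missing URLs rather than a 200:

    <!DOCTYPE html>
    <html lang="en">
      <head>
        <meta charset="utf-8">
        <title>Page not found</title>
      </head>
      <body>
        <h1>Sorry, we couldn't find that page.</h1>
        <!-- Placeholder links: point these at your real homepage and popular content -->
        <p><a href="/">Return to the homepage</a></p>
        <p>Or try a popular page: <a href="/blog/">Blog</a>, <a href="/about/">About</a></p>
      </body>
    </html>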
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
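To see why, remember that robots.txt itself is publicly readable at a fixed URL, so its rules effectively advertise the locations you want hidden (the paths below are hypothetical). Genuinely sensitive content belongs behind authentication instead:

    # https://example.com/robots.txt can be fetched by anyone, so these
    # rules point curious users straight at the "hidden" directories
    User-agent: *
    Disallow: /private-reports/
    Disallow: /staging/admin/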
The great thing about the long tail for new sites that have no backlinks and no authority is that it is possible to rank for these terms, assuming great on-page SEO, quality content, etc. Therefore, focusing on the long tail is a strategy that is often recommended, and in fact Rand himself (and indeed others of good repute) has cited 4+ word phrases and lower LMS (local monthly searches) as the way to avoid the medium-to-high-volume keywords due to their keyword difficulty. Have I completely missed the point in your guide, or do you indeed have a slightly different view on the long tail?
If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user agent. If you are using separate URLs, signal the relationship between the two URLs with <link> tags carrying rel="canonical" and rel="alternate" attributes.
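Concretely, the annotations look like this for the responsive and separate-URL cases (example.com and m.example.com are placeholders):

    <!-- Responsive Web Design: viewport tag on every page -->
    <meta name="viewport" content="width=device-width, initial-scale=1.0">

    <!-- Separate URLs: on the desktop page (https://example.com/page) -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">

    <!-- Separate URLs: on the mobile page (https://m.example.com/page) -->
    <link rel="canonical" href="https://example.com/page">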
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.

Hi Brian! I’ve been reading your blog for quite a while and have had a good amount of success. 🙂 I have a request, similar to what someone else mentioned. It would be nice to learn the HOW behind the tips you give, including details like which pro designer you hired, or at least where you found them, and what their credentials should be (for which tasks). Example: you used a custom coder to change the “last updated” date. So how do we find our own custom coder, and what coding language would they need to know? Same thing with jump links. Also, which pro graphics designer do you use, or at least, where did you find them, and what type of graphics skills do they need?
Thank you Brian. This is SOLID stuff. I appreciate your mindset for SEO and SEM. What’s the use of SEO and all of this backlinking effort if it can’t stand the test of time? Plus, these backlinks are like little promotions, little ads throughout the internet, rather than just links. It also makes a lot of sense to maximize promotion efforts so that our on-page content is liked by search engines while also giving users the clarity they’re looking for, getting them excited to share and link back. Man, I’ve got a lot of work to do! Thank you!

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
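For instance, a hand-crafted tag for one page might look like the snippet below (the wording is hypothetical), while a large site might fill the content attribute from a template, e.g. from a product database or the page's opening sentence:

    <!-- Hypothetical hand-written description for a single page -->
    <meta name="description"
          content="Step-by-step guide to repotting succulents, with photos and a printable tool checklist.">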
Stellar post as always, Brian! Marketers who have the time and budget can do all of these things. However, most people who run small or local businesses simply don’t have the time, and it’s not cost-feasible to pay someone else, because that would require a full-time position or paying an agency $xxxx/month, which they can’t afford either. It’s a quandary they find themselves in, and it makes it almost impossible to compete in modern-day search against established giants. I wish Google would place more emphasis on relevancy and what’s actually on the page versus domain authority. Maybe one day they’ll move toward that, but for now, all I see in the SERPs is giant sites first, then relevancy.

Next, log into Google AdWords and click “Tools” > “Keyword Planner.” Once you’re on the Keyword Planner menu, click “Search for new keywords using a phrase, website or category.” Complete the form that appears; to start with, search for your type of business and location. For example, if you own a hair salon in Chicago, you would want to enter “hair salon Chicago.”