Thanks for sharing these tips, Brian. I agree with all of these, except maybe #3, Delete zombie pages. A better strategy would be to update these pages with fresh content and convert them into long-form blog posts or guides. Deleting them entirely would mean setting up either a 404 or a 301 redirect, both of which can hurt your organic traffic in the short run.

To find the right people I downloaded a list of some of the most popular users within the community. To do this, I used Screaming Frog SEO Spider to gather a list of all the URLs on the website. I then exported this list into an Excel spreadsheet and filtered the URLs to only show those that were user profile pages. I could do this because all of the profile pages had /user/ within the URL.
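The same filtering step can be done in a few lines of Python instead of Excel. This is only a sketch: it assumes the Screaming Frog export is a CSV whose URL column is named "Address" (the name Screaming Frog uses for internal URL exports); adjust the column name to match your export.

```python
import csv

def profile_urls(csv_path, marker="/user/"):
    """Return URLs from a crawl export whose path contains the marker.

    Assumes the export has a column named "Address" holding the URL,
    as in a Screaming Frog internal-URL export.
    """
    with open(csv_path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        return [row["Address"] for row in reader if marker in row["Address"]]
```

The same approach works for any URL pattern: swap `/user/` for whatever path segment identifies the page type you are after.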
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
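As an illustration, a minimal robots.txt along these lines might look like the following (the paths are hypothetical placeholders; the file lives at the root of each host, e.g. `https://example.com/robots.txt`):

```
# Applies to all compliant crawlers
User-agent: *
# Keep internal search results and temporary files out of the crawl
Disallow: /search/
Disallow: /tmp/
```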
The document containing a hyperlink is known as its source document. For example, in an online reference work such as Wikipedia, many words and terms in the text are hyperlinked to definitions of those terms. Hyperlinks are often used to implement reference mechanisms such as tables of contents, footnotes, bibliographies, indexes, letters, and glossaries.
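In HTML, such a link is simply an anchor element in the source document; the target URL below is just an illustration:

```html
<!-- The page containing this markup is the link's source document;
     the page at href is its target. -->
<p>See the article on
  <a href="https://en.wikipedia.org/wiki/Hyperlink">hyperlinks</a>
  for more detail.</p>
```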
Many small businesses fail to write clear, concise headlines on their websites. Headlines are a big ranking factor for Google and other search engines. Because headlines are big and important looking, many small business owners are tempted to write clever or fun headlines, but this is a mistake. Instead, write headlines that convey a single who, what, where, when, or why statement that summarizes the content that follows. Imagine someone only reads the headlines—will they understand the content on your page? Clearly written headlines will help your readers and search engines understand your content.
This content will help you boost your rankings in two primary ways. First, more content means more keywords, and therefore more opportunities for Google to return your site in the search results. Second, the more content you have, the more links you generally accumulate. Plus, having lots of content is great for getting visitors to stay on your site longer. Win-win!
In order to optimize your site, you need to make sure you are including the keywords that you want to rank for multiple times throughout your site. Also, ensure your site has complete and up-to-date contact information, that you’re using appropriate meta tags, and that you include pertinent business information as text—not text on images that Google can’t search.
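The point about text versus images can be made concrete. In the hypothetical markup below, the first block is readable by search engines, while the same details baked into an image are not:

```html
<!-- Crawlable: search engines can read this address directly. -->
<address>
  Example Bakery, 12 High Street, Springfield, (555) 010-1234
</address>

<!-- Not crawlable as text: the same details rendered inside an image
     are invisible to search unless repeated in alt text or nearby copy. -->
<img src="contact-card.png" alt="Example Bakery contact details">
```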
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
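The "well-behaved crawlers only" point is easy to demonstrate: a compliant client checks robots.txt voluntarily before fetching, and nothing enforces the check. Python's standard-library `urllib.robotparser` makes the advisory nature explicit (the rules and URLs below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# A well-behaved crawler parses robots.txt and consults it per URL.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))           # True

# Nothing here stops a non-compliant client from simply requesting
# /private/report.html anyway: the file is advice, not access control.
```

For genuinely sensitive content, use server-side authentication or `noindex` mechanisms rather than robots.txt.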

Hey Brian, first of all I want to say thanks for this epic post. I can't say how much I've learnt from your posts; you really are a genius when it comes to SEO and link building. I have one question, though: I am currently working on a project that doesn't have any blog, so I only have links and social signals to boost my rankings. Can you please tell me what strategies I should follow for higher rankings without a blog?
Local results favor the most relevant results for each search, and businesses with complete and accurate information are easier to match with the right searches. Make sure that you’ve entered all of your business information in Google My Business, so customers know more about what you do, where you are, and when they can visit you. Provide information like (but not limited to) your physical address, phone number, category, and attributes. Make sure to keep this information updated as your business changes.
I have been reading your emails for a long time, and the tips and techniques are always wonderful. I have used many of your suggestions and highly recommend that others do the same. SEO really is like playing with the search engines: you keep your eye on everything going on and every change, and even bring your own sense of humour to writing metas, sometimes guessing what users will type to search for products or services.
These are all great, and I am working on implementing most of them. My biggest issue is that my site is brand new (2 months old). I am ranking for a lot of keywords but seem to be limited because, I am assuming, Google will not give enough trust to a new site. What should I be doing to overcome the newness of my site? I buy houses in the Dallas Fort Worth area, and if you are not number 1 on Google, you might as well be on page 10! Any advice would be well received, and please keep up the great work!
Brian, I have a question, and I would be glad if you could answer it. You said that it is important to use keywords in the title and headings, but I notice that plugins like Yoast recommend inserting the keyword into more and more subheadings. For example, on a long post where I have 50 subheadings and 8 of them contain my target keyword, Yoast is still not happy and wants me to add the keyword to even more subheadings. I felt that 8 was already a bit too much. Does Google look at my content from the same angle as Yoast? Ideally, as a good practice, how many subheadings should contain the target keyword?
The W3C Recommendation called XLink describes hyperlinks that offer a far greater degree of functionality than those offered in HTML. These extended links can be multidirectional, linking from, within, and between XML documents. XLink can also describe simple links, which are unidirectional and therefore offer no more functionality than hyperlinks in HTML.
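A minimal sketch of a "simple" XLink follows. Only the `xlink:*` namespace and attributes come from the W3C Recommendation; the `crossref` element name and the target are hypothetical:

```xml
<!-- A simple XLink: unidirectional, equivalent in power to an HTML <a>. -->
<crossref xmlns:xlink="http://www.w3.org/1999/xlink"
          xlink:type="simple"
          xlink:href="glossary.xml#hyperlink">
  hyperlink
</crossref>
```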
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
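A description meta tag sits in the page's `<head>`; a minimal example follows (the business and wording are purely illustrative):

```html
<head>
  <title>Fresh flowers, delivered same day | Example Florist</title>
  <!-- Google may show this text as the snippet if it matches the query well. -->
  <meta name="description"
        content="Example Florist delivers fresh bouquets across Springfield,
                 with same-day delivery on orders placed before 2pm.">
</head>
```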
Organic search results are free and display below Google Ads and sometimes local results. Google uses a sophisticated algorithm to determine which sites rank highest for organic search results based on keyword usage, relevance, site design, and other factors. Generally, Google provides the highest quality, most relevant results based on the keyword(s) used by the searcher.