The tip that resonates with me the most is to publish studies, which you back up by linking to the study you collaborated on. That is spot on. It feels like having genuinely useful in depth content is THE strategy that will not be “Google updated” at any point. (Because if you were building a search engine, that’s the content you’d want to serve your users when they search for a topic.)
These are all great. I am working on implementing most of these. My biggest issue is that my site is brand new (2 months). I am ranking for a lot of terms but seem to be limited because, I am assuming, Google will not give enough trust to a new site. What should I be doing to overcome the newness of my site? I buy houses in the Dallas Fort Worth area, and if you are not number 1 on Google then you might as well be on page 10! Any advice would be well received, and please keep up the great work!
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
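To make this concrete, here is a minimal robots.txt sketch that keeps page resources crawlable; the /assets/ and /images/ paths are hypothetical placeholders for wherever your site actually stores its CSS, JavaScript, and image files:

    # Hypothetical robots.txt sketch: let Googlebot fetch rendering resources
    User-agent: Googlebot
    Allow: /assets/css/
    Allow: /assets/js/
    Allow: /images/
    # A rule like "Disallow: /assets/" would hide CSS and JS from Googlebot
    # and could keep the page from being recognized as mobile-friendly.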

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
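As a hedged illustration of what that code can look like, here is a minimal JSON-LD sketch using the schema.org LocalBusiness type; the business name, address, and phone number are invented placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Home Buyers",
      "address": {
        "@type": "PostalAddress",
        "addressLocality": "Fort Worth",
        "addressRegion": "TX"
      },
      "telephone": "+1-555-0100"
    }
    </script>

Search engines that understand schema.org markup can use fields like these to show richer results, such as a business's location and contact details.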
In a series of books and articles published from 1964 through 1980, Nelson transposed Bush's concept of automated cross-referencing into the computer context, made it applicable to specific text strings rather than whole pages, generalized it from a local desk-sized machine to a theoretical proprietary worldwide computer network, and advocated the creation of such a network. Though Nelson's Xanadu Corporation was eventually funded by Autodesk in the 1980s, it never created this proprietary public-access network. Meanwhile, working independently, a team led by Douglas Engelbart (with Jeff Rulifson as chief programmer) was the first to implement the hyperlink concept for scrolling within a single document (1966), and soon after for connecting between paragraphs within separate documents (1968), with NLS. Ben Shneiderman, working with graduate student Dan Ostroff, designed and implemented the highlighted link in the HyperTIES system in 1983. HyperTIES was used to produce the world's first electronic journal, the July 1988 Communications of the ACM, which was cited as the source for the link concept in Tim Berners-Lee's Spring 1989 manifesto for the Web. In 1988, Ben Shneiderman and Greg Kearsley used HyperTIES to publish "Hypertext Hands-On!", the world's first electronic book.
I was wondering if by this reply you really meant 410? Also, what is your take on the many suggestions out there saying that 301 redirects are always better than deleting pages? I understand the reasoning is that redirecting everything is (or risks being) spammy. I'm also guessing that too many redirects will slow down the site.
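For readers weighing the two options, here is a minimal Apache .htaccess sketch of what each looks like in practice (the paths are hypothetical):

    # 301: the page has moved permanently; visitors and crawlers are sent on
    Redirect 301 /old-post /new-post
    # 410: the page is gone for good; crawlers are told to drop it from the index
    Redirect gone /deleted-post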
Hi Brian! I’ve been reading your blog for quite a while and have had a good amount of success. 🙂 I have a request, similar to what someone else mentioned. It would be nice to learn the HOW behind the tips you give, including details like which pro designer you hired, or at least where you found them, and what their credentials should be (for which tasks). Example: you used a custom coder to change the “last updated” date. So how do we find our own custom coder, and what coding language would they need to know? Same thing with jump links. Also, which pro graphics designer do you use, or at least, where did you find them, and what type of graphics skills do they need?

In short, press request alerts are requests for sources of information from journalists. Let's say you're a journalist putting together an article on wearable technology for The Guardian. Perhaps you need a quote from an industry expert or some products that you can feature within your article? Well, all you need to do is send out a request to a press service and you can wait for someone to get back to you.


Awesome work, man. I am going to start my blog soon, and your blog is the only blog I’m following day and night. Who says your blog is only for advanced SEO techniques? You are just amazing for absolute beginners like me as well. Though I am pretty confused about how link building works in the beauty blogosphere, you always give me the courage to make these kinds of huge, gigantic list posts.
Google’s aim is to provide the most relevant result for any given query. Their entire business model relies on them being able to do this, consistently, across hundreds of billions of searches. For that reason, they’ve invested heavily into understanding the intent of queries, i.e., the reason a person typed a specific thing into Google in the first place.

Killer post, Brian. Really want to know how to create those charts you keep using throughout your posts. I searched Google for them but couldn’t find them, so I’m guessing they’re custom designed, but with what tool is the question… Would love to see a flat architecture diagram for blogs and non-ecommerce sites. Look forward to your upcoming Blab with Dmitry.


Brian, I’ve got a question. I will be glad if you could answer it. So, you said that it is important to use keywords in the title and headings. But sometimes I observe that plugins like Yoast will recommend inserting keywords into more and more subheadings. For example, on a long post with 50 subheadings, 8 of which contain my target keyword, Yoast is actually not happy and wants me to add the keyword to even more subheadings. I felt that 8 was already a bit too much. Does Google look at my content from the same angle as Yoast? Ideally, as a good practice, how many subheadings should contain the target keyword?
One of my favorite ways to give content a boost is to run ads on Facebook targeting people with interests that are relevant to the content. It’s fairly low cost since you are offering a free piece of content. By targeting people with relevant interests to your content, you drive the right people to the content and into the top of your funnel. And if your content resonates with them, they’ll share, link, and engage with the content in ways that will help Google see its value.
You may find that your business doesn’t appear for relevant searches in your area. To maximize how often your customers see your business in local search results, complete the following tasks in Google My Business. Providing and updating business information in Google My Business can help your business’s local ranking on Google and enhance your presence in Search and Maps.
The first step that I take is to do a quick Google search to find pages on my domain where I've mentioned the keyword in question so that I can add an internal link. To do this, I'll use the following search query, replacing DOMAIN with your domain name (e.g. matthewbarby.com) and KEYWORD with the keyword you're targeting (e.g. "social media strategy"):
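The exact query from the original isn't reproduced here, but a standard form using Google's site: operator and a quoted phrase would look like this (domain and keyword are the example values above):

    site:matthewbarby.com "social media strategy"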
If you’re updating the whole post, I think bloggers should create a brand new post and link the old one to the new. I know, you’re starting over with no “link juice,” but at least it’s clear to the reader that the post has gotten a makeover. I remember reading a new post of yours a few months ago. I was about 25% through it and thought, “man, this sounds familiar.” So I checked it on archive.org and realized that you had updated and republished it.
HyperCard, a database program released in 1987 for the Apple Macintosh, allowed hyperlinking between various pages within a document. In 1990, Windows Help, introduced with Microsoft Windows 3.0, made widespread use of hyperlinks to link different pages in a single help file together; in addition, it had a visually different kind of hyperlink that caused a popup help message to appear when clicked, usually to give definitions of terms introduced on the help page. The first widely used open protocol that included hyperlinks from any Internet site to any other Internet site was the Gopher protocol from 1991. It was soon eclipsed by HTML after the 1993 release of the Mosaic browser (which could handle Gopher links as well as HTML links). HTML's advantage was the ability to mix graphics, text, and hyperlinks, unlike Gopher, which had only menu-structured text and hyperlinks.

Search engine optimization (SEO) tools take the guesswork out of SEO by giving you insights about your keywords, analyzing your website, helping you grow your domain authority through directories, and more. Optimizing a website to rank on Google can be tricky if you’re not a search engine optimization or web development pro, but there are tools available to make it a lot easier for small businesses.
If you find any broken links on topically related websites, you can immediately contact the website owner and inform them about it. Since you will be doing them a favor by pointing out a broken link, you can also kindly request a replacement with a link to your relevant resource. Of course, the replacement – your article – must be informative and useful for their audience.
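As a hedged sketch of how you might find those broken links in the first place (this is not from the original post), the following Python script fetches a page and HEAD-checks each outbound link; it assumes the requests and beautifulsoup4 packages are installed:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def find_broken_links(page_url):
        """Return (url, status) pairs for outbound links that fail or return >= 400."""
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        broken = []
        for a in soup.find_all("a", href=True):
            link = urljoin(page_url, a["href"])
            if not link.startswith("http"):
                continue  # skip mailto:, javascript:, and fragment links
            try:
                status = requests.head(link, allow_redirects=True, timeout=10).status_code
            except requests.RequestException:
                status = None  # connection failures count as broken
            if status is None or status >= 400:
                broken.append((link, status))
        return broken

    # Example: check a hypothetical resources page for dead links
    for link, status in find_broken_links("https://example.com/resources"):
        print(status, link)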
Keep in mind, this will often mean shifting the focus of your business from more general to more specific products or services. For instance, instead of exclusively offering general home renovation services, you could consider specializing in "one day bathroom renos" or "custom kitchen makeovers." These more specific keyword phrases will likely be much easier to rank for, which will mean you can start ranking that much faster.


Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
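A minimal HTML sketch of the difference (the URL and guide name are hypothetical):

    <!-- Descriptive anchor text: users and Google can tell what the target covers -->
    <a href="/guides/keyword-research">our keyword research guide</a>

    <!-- Vague anchor text: conveys nothing about the destination -->
    <a href="/guides/keyword-research">click here</a>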
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
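As a hedged example, a description meta tag sits in the page's head element; the title and copy below are invented for illustration:

    <head>
      <title>One-Day Bathroom Renovations in Dallas–Fort Worth</title>
      <meta name="description"
            content="We renovate bathrooms in a single day across the Dallas–Fort Worth area. Get a free quote and browse recent before-and-after projects.">
    </head>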
The document containing a hyperlink is known as its source document. For example, in an online reference work such as Wikipedia, many words and terms in the text are hyperlinked to definitions of those terms. Hyperlinks are often used to implement reference mechanisms such as tables of contents, footnotes, bibliographies, indexes, letters, and glossaries.
Use LSI (latent semantic indexing) keywords, and answer the additional questions that users may think of after viewing the content. Simply offering the content a user searched for is no longer enough; pages need to supply the additional information a user may be seeking. Providing that extra information helps retain the user and tells search engines that the page is not only answering the search query but providing value that other pieces of content do not.

The syntax and appearance of wikilinks may vary. Ward Cunningham's original wiki software, the WikiWikiWeb, used CamelCase for this purpose. CamelCase was also used in the early version of Wikipedia and is still used in some wikis, such as TiddlyWiki, Trac, and PmWiki. A common markup syntax is the use of double square brackets around the term to be wikilinked. For example, the input "[[zebras]]" is converted by wiki software using this markup syntax to a link to a zebras article. Hyperlinks used in wikis are commonly classified as follows: internal wikilinks, which lead to pages within the same wiki; interwiki links, which lead to pages on an associated wiki; and external links, which lead to other websites.
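As a rough sketch of how wiki software might expand that double-bracket markup (the /wiki/ URL scheme is an assumption modeled on MediaWiki), consider this short Python function:

    import re

    def render_wikilinks(text):
        """Convert [[term]] markup into HTML links, MediaWiki-style."""
        def to_anchor(match):
            term = match.group(1)
            return f'<a href="/wiki/{term.replace(" ", "_")}">{term}</a>'
        return re.sub(r"\[\[([^\]]+)\]\]", to_anchor, text)

    print(render_wikilinks("Plains [[zebras]] live in Africa."))
    # Plains <a href="/wiki/zebras">zebras</a> live in Africa.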


When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
Creation of new windows is probably the most common use of the "target" attribute. To prevent accidental reuse of a window, the reserved name "_blank" is available and always causes a new window (or tab) to be created; the name "_new" is sometimes used for the same purpose, but it is treated as an ordinary window name, so the first click creates a window named "_new" and later clicks reuse it. It is especially common to see this type of link when one large website links to an external page. The intention in that case is to make the person browsing aware that the site being linked to is not endorsed by the site that linked to it. However, the attribute is sometimes overused and can cause many windows to be created even while browsing a single site.
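A minimal HTML sketch of the difference (example.com is a placeholder):

    <!-- "_blank": a reserved name; each click opens a new browsing context.
         rel="noopener" keeps the new page from scripting the opener. -->
    <a href="https://example.com" target="_blank" rel="noopener">External site</a>

    <!-- "_new": an ordinary window name; the first click creates a window
         named "_new", and later clicks reuse that same window. -->
    <a href="https://example.com" target="_new">External site</a>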

You should build a website to benefit your users, and any optimization should be geared toward making the user experience better. One of those users is a search engine, which helps other users discover your content. Search Engine Optimization is about helping search engines understand and present content. Your site may be smaller or larger than our example site and offer vastly different content, but the optimization topics we discuss below should apply to sites of all sizes and types. We hope our guide gives you some fresh ideas on how to improve your website, and we'd love to hear your questions, feedback, and success stories in the Google Webmaster Help Forum.
One of your tips is “Make your content actionable”… I feel like this is not always possible depending on the context of the blog post. If I am to write a post on, say, “35 Incredible Linear Actuator Applications”, I feel like that’s something extremely hard to be actionable about. Rather, I feel my purpose is to be informative. Both of us are going for backlinks, but your blog, I feel, is more aimed towards giving awesome tips & advice for SEOs. Whereas MY BLOG is about informing people about “linear actuators and their applications”.