Reviews are important to your small business because having reviews—especially positive reviews—is a ranking factor on Google. People are also more likely to click and visit your business if it’s listed with a lot of good reviews. To get reviews, start by asking your loyal customers and even your staff to leave reviews on major sites such as Google and Yelp.

Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" versions (for example, "www.example.com" versus just "example.com"). When adding your website to Search Console, we recommend adding both the http:// and https:// versions, as well as the "www" and "non-www" versions: for example.com, that means adding http://example.com, https://example.com, http://www.example.com, and https://www.example.com.
To get started, you will need to either claim your current business page or create a new one, depending on whether or not a page already exists for your business. If you are unsure, you can go to Google.com/business to search for the name of your business. For more information, read our step-by-step guide to setting up your Google My Business account.
Start testing Progressive Web Apps, or PWAs, alongside the new mobile-first index. A PWA is an ordinary website enhanced with modern browser features, such as service workers, so it delivers a fast, app-like experience when a user comes to your site from a smartphone or tablet. Major brands like The Weather Channel and Lyft are already testing this, and you don't want to get left behind. PWAs are especially beneficial to brands that drive revenue through ads, and they are an excellent alternative to AMP.
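At its core, a PWA starts with registering a service worker. Here is a minimal sketch of that step; the /sw.js path is a hypothetical file on your own origin that would hold your caching logic, and navigator.serviceWorker is the standard browser API for this:

    <script>
      // Register a service worker so the page can load fast and work offline.
      // Assumes a (hypothetical) service worker script exists at /sw.js.
      if ('serviceWorker' in navigator) {
        navigator.serviceWorker.register('/sw.js');
      }
    </script>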
The world is mobile today. Most people searching on Google now do so from a mobile device. The desktop version of a site can be difficult to view and use on a mobile device, so having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.

The tip that resonates with me the most is to publish studies, which you back up by linking to the study you collaborated on. That is spot on. It feels like having genuinely useful in depth content is THE strategy that will not be “Google updated” at any point. (Because if you were building a search engine, that’s the content you’d want to serve your users when they search for a topic.)


Keep in mind that the number of average monthly searches for each suggested keyword is an estimate. However, it does represent the popularity of that keyword or search term. This makes a difference when doing your keyword research, as it gives you insight into what people in your market are searching for. Understanding what they want allows you to better position your business to provide them with relevant content and information.

If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your response changes depending on the user agent. If you are using separate URLs, signal the relationship between the two URLs with link tags carrying the rel="canonical" and rel="alternate" attributes.
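As a rough sketch, the three configurations look like this (example.com and m.example.com are placeholder hostnames):

    <!-- Responsive Web Design: one URL; viewport tag in the page head -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <!-- Dynamic Serving: same URL, different HTML per device; send this HTTP header -->
    Vary: User-Agent

    <!-- Separate URLs: on the desktop page (https://example.com/page) -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">
    <!-- ...and on the mobile page (https://m.example.com/page) -->
    <link rel="canonical" href="https://example.com/page">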


As the industry continues to evolve, SiteLink brings you the right tools using today's technology. We listen to our customers' suggestions to enhance and add features. SiteLink users enjoy the collective experience of more than 15,000 operators. We exceed the strict SSAE 16 (SOC 1) Type II and PCI Level 1 certification requirements to deliver peace of mind. SiteLink is cloud-based, so you can do business from anywhere, and it lets owners build the best websites so tenants can pay, reserve, and rent online 24/7 on any device.
I have read every one of your blog posts several times. They have all helped me significantly in ranking the websites I manage! In regards to link building, how would you go about it for a lawn care service website? I have gotten most of my dofollow links from local landscape groups, but other than that, I haven't had any luck with links. I have started blogs, but in this industry, there doesn't seem to be much interest in the topics that I write about.

The document containing a hyperlink is known as its source document. For example, in an online reference work such as Wikipedia, many words and terms in the text are hyperlinked to definitions of those terms. Hyperlinks are often used to implement reference mechanisms such as tables of contents, footnotes, bibliographies, indexes, letters, and glossaries.
Ensure that the pictures on your website have file names that include the target keyword, and make the target keyword part of each image's alt text. This improves the optimization of your article and gives search engines a clearer picture of the relevancy of your article or page. Images are an important component of any website, as they make pages visually attractive as well as informative, and optimizing them should naturally boost your ranking. Well-optimized images can also rank well in Google Image Search.
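For example, a well-optimized image tag might look like the following; the filename and alt text here are hypothetical and should reflect your own target keyword:

    <img src="/images/blue-womens-running-shoes.jpg"
         alt="Blue women's running shoes on a white background">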
In a graphical user interface, the appearance of a mouse cursor may change into a hand motif to indicate a link. In most graphical web browsers, links are displayed in underlined blue text when they have not been visited, but underlined purple text when they have. When the user activates the link (e.g., by clicking on it with the mouse) the browser displays the link's target. If the target is not an HTML file, depending on the file type and on the browser and its plugins, another program may be activated to open the file.
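The classic defaults described above can be sketched in CSS roughly like this (modern browsers ship their own built-in equivalents, so this is illustrative only):

    <style>
      a:link    { color: blue;   text-decoration: underline; }  /* not yet visited */
      a:visited { color: purple; text-decoration: underline; }  /* already visited */
      a:hover   { cursor: pointer; }                            /* hand motif over links */
    </style>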

Ranking highly on Google is achieved by strategically using keywords and design elements on your website that are favored by Google. Learning how to rank on Google will help give you the ability to capture the top organic and local results. Every business should be consistent in their efforts to increase and/or hold their top position on Google, as online searches are a primary way for customers to find companies and products.
The most common destination anchor is a URL used in the World Wide Web. This can refer to a document, e.g. a webpage or other resource, or to a position in a webpage. The latter is achieved by means of an HTML element with a "name" or "id" attribute at that position of the HTML document. The URL of the position is the URL of the webpage with a fragment identifier appended: a "#" followed by the value of the attribute.
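For example, using a hypothetical page and id:

    <!-- In https://example.com/guide.html, an element marks the position: -->
    <h2 id="pricing">Pricing</h2>

    <!-- A link to that position appends the fragment identifier: -->
    <a href="https://example.com/guide.html#pricing">Jump to the pricing section</a>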
The effect of following a hyperlink may vary with the hypertext system and may sometimes depend on the link itself; for instance, on the World Wide Web most hyperlinks cause the target document to replace the document being displayed, but some are marked to cause the target document to open in a new window (or, perhaps, in a new tab[2]). Another possibility is transclusion, in which the link target is a document fragment that replaces the link anchor within the source document. Hyperlinks are not followed only by people browsing a document: they may also be followed automatically by programs. A program that traverses the hypertext, following each hyperlink and gathering all the retrieved documents, is known as a Web spider or crawler.
Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
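As a quick illustration (the URL is a placeholder), compare a generic link with a descriptive one:

    <!-- Vague anchor text says little about the target page: -->
    <a href="https://example.com/seo-guide">click here</a>

    <!-- Descriptive anchor text helps both users and Google: -->
    <a href="https://example.com/seo-guide">beginner's guide to SEO</a>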
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
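If the goal is merely to keep a page out of search results rather than to truly secure it, a more reliable option than robots.txt is the noindex robots meta tag in the page's head; genuinely confidential content belongs behind authentication:

    <meta name="robots" content="noindex">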
Using HTML frames to surround linked content is a completely different matter (the W3C's page on link myths discusses this in more detail). Some companies have successfully sued to have their pages removed from these frames, because framing can make some readers believe that the linked page is actually part of the originating site, and possibly owned or authored by that same site. But in most cases, if the owner of a linked site objects to the frame and it is removed, there isn't any further legal recourse.

While most of the links to your site will be added gradually, as people discover your content through search or other ways and link to it, Google understands that you'd like to let others know about the hard work you've put into your content. Effectively promoting your new content will lead to faster discovery by those who are interested in the same subject. As with most points covered in this document, taking these recommendations to an extreme could actually harm the reputation of your site.

One of my best pieces of advice when it comes to SEO for small businesses is to truly spend some time understanding your audience and their intent. Even if your website is perfectly optimized, if it’s done for the wrong audience, you will not see good traffic. Google is taking audience intent into account more and more, as updates like RankBrain try to understand the semantics of a search query and not just the literal definition of the words. If you can comprehensively answer the questions your audience is asking, your site will rank highly in Google organically.
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.
This is a great list of SEO tips. One tip you can add to your broken links section is that there is a free tool called Xenu Link Sleuth. It will automatically crawl your website (or a competitor's site) and find all the broken links (and lots of other things). It's very helpful if you have a large site or need to find broken links you aren't aware of (like images and .pdf files).
As keywords are essentially the backbone of on-page SEO, you need to pay a lot of attention to them, and there is no reason not to include them in your URLs. The inclusion has its benefits: when you work the targeted keyword into the URL (for example, example.com/blog/keyword-research-guide rather than example.com/blog/post-123), you give Google another way to recognize your article as relevant for that particular phrase.
It appears that the reason this page from a little-known website is able to rank amongst the bigger players is that the content itself is more focussed. It talks about how to name images for SEO, whereas most of the other pages are more general guides to image SEO—which all presumably mention the importance of naming images correctly, amongst other things.

We expect advertisements to be visible. However, you should not let advertisements distract users or prevent them from consuming the site content. For example, advertisements, supplementary content, or interstitial pages (pages displayed before or after the content the user is expecting) can make it difficult to use the website. Learn more about this topic.
Hey Brian, this is my first ever comment on Backlinko. I am a silent lover of Backlinko. Whenever I get stuck and feel numb about what to do next for my site's ranking, I always come and find peace on planet Backlinko. You are awesome, man. Just a quick suggestion: whenever I click on "Show only Brian's favorite tips: NO", it still shows your favorite tips too. It might be a problem! See, I learned from you and now I am helping you 😛 LOL … Now give me an authority link for my site 😀 Just kidding!
Another example where the "nofollow" attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the "nofollow" attribute. If you create a widget for functionality or content that you provide, make sure to include nofollow on the links in the default code snippet.
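A minimal sketch of what that looks like in practice; the widget name and URL here are hypothetical:

    <!-- Widget attribution link with reputation flow disabled: -->
    <a href="https://widget-vendor.example.com" rel="nofollow">Powered by ExampleWidget</a>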

In a series of books and articles published from 1964 through 1980, Nelson transposed Bush's concept of automated cross-referencing into the computer context, made it applicable to specific text strings rather than whole pages, generalized it from a local desk-sized machine to a theoretical proprietary worldwide computer network, and advocated the creation of such a network. Though Nelson's Xanadu Corporation was eventually funded by Autodesk in the 1980s, it never created this proprietary public-access network. Meanwhile, working independently, a team led by Douglas Engelbart (with Jeff Rulifson as chief programmer) was the first to implement the hyperlink concept for scrolling within a single document (1966), and soon after for connecting between paragraphs within separate documents (1968), with NLS. Ben Shneiderman, working with graduate student Dan Ostroff, designed and implemented the highlighted link in the HyperTIES system in 1983. HyperTIES was used to produce the world's first electronic journal, the July 1988 Communications of the ACM, which was cited as the source for the link concept in Tim Berners-Lee's Spring 1989 manifesto for the Web. In 1988, Ben Shneiderman and Greg Kearsley used HyperTIES to publish "Hypertext Hands-On!", the world's first electronic book.
Wikilinks are visibly distinct from other text, and if an internal wikilink leads to a page that does not yet exist, it usually has a different specific visual appearance. For example, in Wikipedia wikilinks are displayed in blue, except those that link to pages that don't yet exist, which are instead shown in red.[6] Another possibility for linking is to display a highlighted clickable question mark after the wikilinked term.
This content will help you boost your rankings in two primary ways. First, more content means more keywords, and therefore more opportunities for Google to return your site in the search results. Second, the more content you have, the more links you generally accumulate. Plus, having lots of content is great for getting visitors to stay on your site longer. Win-win!
Hi Brian, great list. I noticed you mentioned using Kraken to optimize your images. A couple of other tools I've found to work really well are ImageOptim (an app that you can download for your computer) and Optimus (a WordPress plugin that will optimize images when they are uploaded to your site). I'm not sure what your policy is on including links in comments, so I won't link to them here (even though I have no affiliation with either one). Hope those resources can help someone!
The great thing about the long tail for new sites that have no backlinks and no authority is that it is possible to rank for these terms, assuming great on-page SEO, quality content, etc. Focusing on the long tail is therefore a strategy that is often recommended, and in fact Rand himself (and indeed others of good repute) has cited 4+ word phrases and lower LMS as a way to avoid the medium-to-high volume keywords due to their keyword difficulty. Have I completely missed the point in your guide, or do you indeed have a slightly different view on the long tail?

Longer content not only lets you naturally include more keywords, it also puts a stronger emphasis on information. The perceived depth and authority of a post increase with longer text, which means Google is more likely to treat it as more relevant than a short, thin piece. And since search patterns today skew toward long-tail queries, longer text also improves the chances of your article or website ranking higher than others.
Because the act of linking to a site does not imply ownership or endorsement, you do not need to ask permission to link to a site that is publicly accessible. For example, if you found a site's URL through a search engine, then linking to it shouldn't have legal ramifications. There have been one or two cases in the U.S. that implied that the act of linking without permission is legally actionable, but these have been overturned every time.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
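For instance, a comment platform might render user-submitted links like this (the URL is a placeholder; Google also recognizes the more specific rel="ugc" value for user-generated content):

    <a href="https://commenter-site.example.com" rel="nofollow ugc">my website</a>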
If you check out some of the suggestions below this though, you're likely to find some opportunities. You can also plug in a few variations of the question to find some search volume; for example, I could search for "cup of java" instead of "what is the meaning of a cup of java" and I'll get a number of keyword opportunities that I can align to the question.
The scientific literature is a place where link persistence is crucial to public knowledge. A 2013 study in BMC Bioinformatics analyzed 15,000 links in abstracts from Thomson Reuters' Web of Science citation index, finding that the median lifespan of Web pages was 9.3 years, and that just 62% were archived.[10] The median lifespan of a Web page in general is highly variable, but its order of magnitude is usually a few months.[11]
I was wondering if by this reply you really meant 410? Also, what is your take on the many suggestions out there saying that making 301 redirects is always better than deleting pages? I understand the reasoning is that just redirecting everything is (or risks being) spammy. Also, I'm guessing that too many redirects will slow down the page.
Hi Brian, a very useful post, thanks for sharing. These things turned to be very useful for us: blocking thin content/pages from Google index, adjusting meta titles/descriptions and content of popular articles, improving internal links, improving page speed, implementing schema (didn’t notice a big difference here), optimizing images and their alt tags, making sure the site is mobile friendly and there are no accessibility issues (Lighthouse in Google Chrome helped a lot), some link building activity (long term) and of course keyword research and mapping. Thanks again for providing valuable info, regards.