Organic results include a wider range of websites, not just local businesses. While location may still be a factor in the organic results, it is generally not the dominant one. The main factors are the authority of the website (the credibility of the site), external links back to the site, the length of time the site has been live, and other considerations. This is why you will often see sites such as Yelp, Angie’s List, and Facebook show up high in search results.
In a series of books and articles published from 1964 through 1980, Nelson transposed Bush's concept of automated cross-referencing into the computer context, made it applicable to specific text strings rather than whole pages, generalized it from a local desk-sized machine to a theoretical proprietary worldwide computer network, and advocated the creation of such a network. Though Nelson's Xanadu Corporation was eventually funded by Autodesk in the 1980s, it never created this proprietary public-access network. Meanwhile, working independently, a team led by Douglas Engelbart (with Jeff Rulifson as chief programmer) was the first to implement the hyperlink concept for scrolling within a single document (1966), and soon after for connecting between paragraphs within separate documents (1968), with NLS. Ben Shneiderman, working with graduate student Dan Ostroff, designed and implemented the highlighted link in the HyperTIES system in 1983. HyperTIES was used to produce the world's first electronic journal, the July 1988 issue of Communications of the ACM, which was cited as the source for the link concept in Tim Berners-Lee's Spring 1989 manifesto for the Web. In 1988, Ben Shneiderman and Greg Kearsley used HyperTIES to publish "Hypertext Hands-On!", the world's first electronic book.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
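As an illustration, a robots.txt along the lines of the hypothetical fragment below would hide a site's CSS and JavaScript from Googlebot; explicitly allowing those resource folders avoids the problem (the paths here are made up for this example):

```
# Problematic: blocks the resource folder, so Googlebot cannot
# render the page and may not detect that it is mobile-friendly
User-agent: Googlebot
Disallow: /assets/

# Better: let crawlers fetch the CSS/JS needed to render the page
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
```

Google Search Console's URL Inspection tool can confirm whether a given resource is actually reachable by Googlebot.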
This is a great list of SEO tips. One tip you can add to your broken links section is that there is a free tool called Xenu Link Sleuth. It will automatically crawl your website (or a competitor’s site) and find all the broken links (and lots of other things). It’s very helpful if you have a large site or need to find broken links you aren’t aware of (like images and .pdf files).

Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users gauge an article’s expertise. Representing well-established consensus on scientific topics is good practice where such consensus exists.


Basically, what I’m talking about here is finding websites that have mentioned your brand name but haven’t actually linked to you. For example, someone may have mentioned my name in an article they wrote (“Matthew Barby did this…”) but didn’t link to matthewbarby.com. By checking for websites like this, you can find quick opportunities to get them to add a link.
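A minimal sketch of that check in Python (the brand name, domain, and regex are illustrative defaults, and a real workflow would fetch each candidate page's HTML first):

```python
import re

def has_unlinked_mention(html, brand="Matthew Barby", domain="matthewbarby.com"):
    """Return True if a page's HTML mentions the brand but never
    links to the domain -- i.e., a link-reclamation opportunity.
    The defaults are just the examples used in the text above."""
    mentions_brand = brand.lower() in html.lower()
    links_to_site = re.search(
        r'href=["\'][^"\']*' + re.escape(domain), html, re.IGNORECASE
    )
    return mentions_brand and links_to_site is None
```

In practice you would feed this a list of URLs found via a brand-mention search, then reach out to the sites where it returns True.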


2. It can be easier to get a backlink with a jump link, especially to a long article/page, since adding linkable content to an existing long page is easier than creating a totally new page. And for the site linking to you, the link is more relevant if it goes directly to the part of the page they are referring to.
Additionally, the title must also be interesting enough that people will actually want to click on it! A good example of this would be PT from PTMoney.com, who wrote a great post about "making extra money." However, rather than a boring title, like "Make Extra Money," he titled it "52 Ways to Make Extra Money." Now that is something I would want to read.
Thanks Zarina. 1. I’m actually not sure how to remove dates from comments. We don’t display dates on comments by default, so I’ve never had to change them. I wouldn’t change the dates on comments though. If the person left a comment on such and such a date, it’s not really kosher to change the date on that. 2. Good question. The same rule applies for any website: you need to publish awesome stuff. 3. Thank you 🙂

Thanks for sharing these tips, Brian. Agree with all of these, except maybe #3 Delete zombie pages. A better strategy would be to update these pages with fresh content and convert them into long-form blog posts/guides. Deleting them entirely would mean setting up either a 404 or a 301 redirect, both of which can hurt your organic traffic in the short run.

When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
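That equivalence rule can be sketched in Python with the standard library (the normalization shown here is an illustration of the rule in the paragraph, not a full canonicalization):

```python
from urllib.parse import urlsplit

def same_resource(url_a, url_b):
    """Treat a trailing slash after the bare hostname as equivalent,
    but keep it significant elsewhere in the path."""
    def normalize(url):
        parts = urlsplit(url)
        path = parts.path or "/"  # "https://example.com" -> path "/"
        return (parts.scheme, parts.netloc, path, parts.query)
    return normalize(url_a) == normalize(url_b)
```

So `same_resource("https://example.com/", "https://example.com")` is true, while `same_resource("https://example.com/fish", "https://example.com/fish/")` is false, matching the examples above.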

Killer post, Brian. Really want to know how to create those charts you keep using throughout your posts. I searched Google for them but couldn’t find them so I’m guessing they’re custom designed but with what tool is the question… Would love to see a flat architecture diagram for blogs and non-ecommerce sites. Look forward to your upcoming Blab with Dmitry.


Permalinks are URLs that are intended to remain unchanged for many years into the future, yielding hyperlinks that are less susceptible to link rot. Permalinks are often rendered simply, that is, as friendly URLs, so as to be easy for people to type and remember. Permalinks are used to point and redirect readers to the same web page, blog post, or other online digital media.[9]
Links to online articles and websites improve the richness of online text and increase its search engine optimization. You can reference almost any website by copying and pasting the link into your email, text message, or document. The procedure may differ slightly depending upon the computer, device or program you are using. If the address is very long, you can use a link shortening service.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
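For platforms that don't nofollow comments automatically, the transformation looks roughly like this sketch (a simplified regex illustration; a production site should use a proper HTML sanitizer rather than regular expressions):

```python
import re

def nofollow_links(comment_html):
    """Add rel="nofollow" to anchor tags in user-generated HTML.
    Anchors that already carry a rel attribute are left alone
    (a simplification for this example)."""
    def add_rel(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", add_rel, comment_html)
```

Applied to a comment containing `<a href="http://spam.example">x</a>`, this yields `<a href="http://spam.example" rel="nofollow">x</a>`, telling search engines not to pass reputation through the link.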

Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimal or maximal length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
In a graphical user interface, the appearance of a mouse cursor may change into a hand motif to indicate a link. In most graphical web browsers, links are displayed in underlined blue text when they have not been visited, but underlined purple text when they have. When the user activates the link (e.g., by clicking on it with the mouse) the browser displays the link's target. If the target is not an HTML file, depending on the file type and on the browser and its plugins, another program may be activated to open the file.
As keywords are essentially the backbone of on-page SEO, you need to pay close attention to them. There is no reason not to include them in your URLs, and the inclusion has its benefits. When you work the targeted keyword into the URL, you give Google another reason and way to consider your article more relevant for a particular phrase.
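Generating a keyword-bearing URL slug from a post title can be sketched as follows (the stop-word list and word limit are illustrative choices, not a standard):

```python
import re

def keyword_slug(title, max_words=6):
    """Build a short, keyword-rich URL slug from a post title.
    Drops common stop words so the target phrase dominates the URL."""
    stop_words = {"a", "an", "the", "of", "to", "and", "in", "for"}
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in stop_words][:max_words]
    return "-".join(kept)
```

For example, `keyword_slug("52 Ways to Make Extra Money")` returns `"52-ways-make-extra-money"`, keeping the target phrase "make extra money" intact in the URL.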
Hi Brian, a very useful post, thanks for sharing. These things turned to be very useful for us: blocking thin content/pages from Google index, adjusting meta titles/descriptions and content of popular articles, improving internal links, improving page speed, implementing schema (didn’t notice a big difference here), optimizing images and their alt tags, making sure the site is mobile friendly and there are no accessibility issues (Lighthouse in Google Chrome helped a lot), some link building activity (long term) and of course keyword research and mapping. Thanks again for providing valuable info, regards.
Next, log into Google AdWords and click “Tools” > “Keyword Planner.” Once you’re on the Keyword Planner menu, click “Search for new keywords using a phrase, website or category.” Complete the form that appears; to start with, search for your type of business and location. For example, if you own a hair salon in Chicago, you would want to enter “hair salon Chicago.”