For example, a plumber could first find a service that has search volume on Google but isn't covered in depth on competitors' websites. In this example, a service like “leak detection” may be listed with a small blurb on your competitors' sites, but none of them have elaborated on every angle or created FAQs, videos, or images. This represents an opportunity to dominate that topic.

You may find that your business doesn’t appear for relevant searches in your area. To maximize how often your customers see your business in local search results, complete the following tasks in Google My Business. Providing and updating business information in Google My Business can help your business’s local ranking on Google and enhance your presence in Search and Maps.


If you find any broken links on topically related websites, you can immediately contact the website owner and inform them about it. Since you are doing them a favor by pointing out a broken link, you can also kindly request a replacement with a link to your relevant resource. Of course, the replacement – your article – must be informative and useful for their audience.
To maximize your ROI when optimizing your site for mobile search, consult with a marketing professional to make sure your SEO efforts are in order. Use Mayple to be matched with a marketing expert from your industry, so you know your second set of eyes is a professional's. Visit Mayple’s site, fill out a brief identifying your business’s goals, and receive a FREE full audit of your marketing campaigns.

I have two tabs open. This article and another one. Both written in June. Each with a different opinion on keywords in URLs. It’s so hard to follow SEO nowadays when everyone says they have the best data to prove stuff yet they contradict each other. The only way around it is to test, test, test on my own stuff, but it would be great if there was consensus.


When the cursor hovers over a link, depending on the browser and graphical user interface, some informative text about the link can be shown, popping up, not in a regular window, but in a special hover box, which disappears when the cursor is moved away (sometimes it disappears anyway after a few seconds, and reappears when the cursor is moved away and back). Mozilla Firefox, IE, Opera, and many other web browsers all show the URL. In addition, the URL is commonly shown in the status bar.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
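One way to auto-generate description meta tags, as suggested above, is to truncate each page's copy at a word boundary to roughly the length Google displays. The sketch below is a minimal, hypothetical approach (the 155-character limit and the helper names are assumptions, not anything prescribed by Google):

```python
import re

def auto_description(page_text, max_len=155):
    """Collapse whitespace and truncate page copy at a word boundary
    to roughly meta-description length."""
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) <= max_len:
        return text
    cut = text.rfind(" ", 0, max_len)
    return text[: cut if cut > 0 else max_len].rstrip() + "…"

def meta_tag(page_text):
    # Hypothetical helper: emit the tag for a page template.
    return '<meta name="description" content="%s">' % auto_description(page_text)

print(meta_tag("Leak detection services for homes and businesses. " * 4))
```

A real implementation would also HTML-escape the content attribute and prefer a hand-written summary whenever one exists.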

Note: Here I only recommend one thing: before you access any dark web links, please focus on your security. If you don’t know how to do that, check out my other post on how to access the dark web. If you want a brief introduction to the dark web: I searched a lot on the deep web and found accounts saying the Tor network also has loopholes, and in some cases a hacker can track your identity on the network. That’s why you first need to buy a premium VPN service that can provide security within the Tor environment. The VPN I always use for my personal tasks is NordVPN.
Ensure that the pictures on your website have file names which include the target keyword. Also, your target keyword should be part of your image’s alt text. This will improve optimization for your article and give search engines a clearer picture of the relevance of your article/page. Images are an important component of any website, as they make pages visually attractive as well as informative. Optimizing your images should naturally boost your ranking, and your images will also rank well in Google image search.
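The naming convention above can be sketched as a small helper. This is a hypothetical example (the function names, the `/images/` path, and the slug rules are assumptions), showing a keyword-led file name plus an alt attribute that includes the keyword naturally:

```python
import re

def seo_image_name(keyword, original_name):
    """Build a hyphenated, lowercase file name that leads with the
    target keyword, keeping the original file extension."""
    ext = original_name.rsplit(".", 1)[-1].lower()
    slug = re.sub(r"[^a-z0-9]+", "-", keyword.lower()).strip("-")
    return f"{slug}.{ext}"

def img_tag(keyword, original_name, alt_text):
    # Alt text should describe the image and work the keyword in naturally.
    return f'<img src="/images/{seo_image_name(keyword, original_name)}" alt="{alt_text}">'

print(img_tag("leak detection", "IMG_0042.JPG",
              "Plumber performing leak detection under a kitchen sink"))
```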
The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile ready site is critical to your online presence. In fact, starting in late 2016, Google has begun experiments to primarily use the mobile version of a site's content[42] for ranking, parsing structured data, and generating snippets.
Hi Brian, great list. I noticed you mentioned using Kraken to optimize your images. A couple other tools I’ve found to work really well are ImageOptim (an app that you can download for your computer) and Optimus (a WordPress plugin that will optimize them when uploaded to your site). I’m not sure what your policy is on including links in comments so I won’t link to them here (even though I have no affiliate with either one.) Hope those resources can help someone!
Ever heard of Maslow's hierarchy of needs? It's a theory of psychology that prioritizes the most fundamental human needs (like air, water, and physical safety) over more advanced needs (like esteem and social belonging). The theory is that you can't achieve the needs at the top without ensuring the more fundamental needs are met first. Love doesn't matter if you don't have food.
An easy way to keep your website current and relevant is by maintaining an active blog. This allows you to create posts that use your keywords while also telling Google your website is up-to-date without actually having to update your web pages. Consider writing on topics that answer frequently asked questions or sharing your expertise in your industry.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
I have been reading your emails for a long time, and the tips and techniques are always wonderful. I have used many of the suggestions and highly recommend that others do the same. Really, SEO is like playing with search engines: keeping your eye on everything going on and on the changes, even using our own sense of humour while preparing metas, sometimes guessing what users will type to search for products or services.
Take your competitors’ SEO work and apply it to yourself. For example, when writing your meta titles and descriptions, look at your competitors’ paid ads on Google for your keywords. Do they all mention a word or phrase (“complimentary” or “free estimates,” for example)? Try using those to improve your titles and descriptions. After all, they spent money testing theirs out.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam[40], for example by using CAPTCHAs and turning on comment moderation.
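If your platform doesn't nofollow comments for you, the transformation is simple to sketch. The helper below is a rough illustration, not production code: it uses a regex to tag anchors that lack a `rel` attribute, whereas a real site would do this in its HTML parser or template layer:

```python
import re

def nofollow_untrusted(html):
    """Add rel="nofollow" to anchor tags that don't already carry a
    rel attribute. A regex sketch only — real user-generated content
    should go through a proper HTML sanitizer."""
    return re.sub(r'<a\b(?![^>]*\brel=)', '<a rel="nofollow"', html)

comment = 'Nice post! <a href="http://spam.example">cheap pills</a>'
print(nofollow_untrusted(comment))
```

Anchors that already declare a `rel` value (say, `rel="ugc"`) are left untouched, so you can still vouch for trusted commenters by marking their links yourself.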

The effect of following a hyperlink may vary with the hypertext system and may sometimes depend on the link itself; for instance, on the World Wide Web most hyperlinks cause the target document to replace the document being displayed, but some are marked to cause the target document to open in a new window (or, perhaps, in a new tab[2]). Another possibility is transclusion, for which the link target is a document fragment that replaces the link anchor within the source document. Hyperlinks are not followed only by people browsing documents; they may also be followed automatically by programs. A program that traverses the hypertext, following each hyperlink and gathering all the retrieved documents, is known as a Web spider or crawler.
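The crawler described above can be sketched in a few lines. To keep the example self-contained, the "web" here is just an in-memory dict of URL → HTML (the dict name and page contents are invented for illustration); a real spider would fetch each URL over HTTP and respect robots.txt:

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(pages, start):
    """Breadth-first traversal: visit a page, extract its links,
    queue any not yet seen, and repeat until the frontier is empty."""
    seen, queue, order = {start}, deque([start]), []
    while queue:
        url = queue.popleft()
        order.append(url)
        parser = LinkExtractor()
        parser.feed(pages.get(url, ""))
        for link in parser.links:
            if link in pages and link not in seen:
                seen.add(link)
                queue.append(link)
    return order

site = {
    "/": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/">home</a>',
    "/b": '<a href="/a">A</a>',
}
print(crawl(site, "/"))  # breadth-first: ['/', '/a', '/b']
```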


Warning: Before anything else, I want to tell you that some dark web links contain mind-disturbing, fraudulent, or unpleasant content: porn, child porn, drugs, weapons, illicit buying and selling of gadgets, and more horrible things. That’s why you alone are responsible for all of your activity. We are not recommending any dark web sites here, and if you lose or damage anything, we are not responsible. I am adding these links for education and research purposes only.

I was wondering if by this reply you really meant 410? Also, what is your take on the many suggestions out there saying that making 301 redirects is always better than deleting pages? I understand the reason is that it is (or risks being) spammy to just redirect everything. Also, I’m guessing that too many redirects will slow down the page.
Links to online articles and websites improve the richness of online text and increase its search engine optimization. You can reference almost any website by copying and pasting the link into your email, text message, or document. The procedure may differ slightly depending upon the computer, device or program you are using. If the address is very long, you can use a link shortening service.
The term "hyperlink" was coined in 1965 (or possibly 1964) by Ted Nelson at the start of Project Xanadu. Nelson had been inspired by "As We May Think", a popular 1945 essay by Vannevar Bush. In the essay, Bush described a microfilm-based machine (the Memex) in which one could link any two pages of information into a "trail" of related information, and then scroll back and forth among pages in a trail as if they were on a single microfilm reel.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner[34] that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report[35].
Because the act of linking to a site does not imply ownership or endorsement, you do not need to ask permission to link to a site that is publicly accessible. For example, if you found a site's URL through a search engine, then linking to it shouldn't have legal ramifications. There have been one or two cases in the U.S. that implied that the act of linking without permission is legally actionable, but these have been overturned every time.
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.


Contentious in particular are deep links, which do not point to a site's home page or other entry point designated by the site owner, but to content elsewhere, allowing the user to bypass the site's own designated flow, and inline links, which incorporate the content in question into the pages of the linking site, making it seem part of the linking site's own content unless an explicit attribution is added.[13]

I had never read such an informative post until I came across this one, and I am hooked. It helped me recognize broken links that I had no idea were sitting there. Also, right at the beginning about having a link-worthy site, it’s like you were talking to me about writing “see the link here,” and the LSI was a good tip I knew nothing about. Thank you so much and all the best.
An inline link may display a modified version of the content; for instance, instead of an image, a thumbnail, low resolution preview, cropped section, or magnified section may be shown. The full content is then usually available on demand, as is the case with print publishing software – e.g., with an external link. This allows for smaller file sizes and quicker response to changes when the full linked content is not needed, as is the case when rearranging a page layout.

Basically, what I’m talking about here is finding websites that have mentioned your brand name but they haven’t actually linked to you. For example, someone may have mentioned my name in an article they wrote (“Matthew Barby did this…”) but they didn’t link to matthewbarby.com. By checking for websites like this you can find quick opportunities to get them to add a link.


A little more than 11% of search results have a featured snippet. These are the results that show up on search engine results pages typically after the ads but before the ranked results. They’re usually alongside an image, table, or a video, making them stand out even more and putting them in an even better position to steal clicks from even the highest ranked results.
Hi Brian! I’ve been reading your blog for quite a while and had a good amount of success. 🙂 I have a request, similar to what someone else mentioned. It would be nice to learn the HOW to the tips you give, including details like which pro designer you hired, or at least where you found them, what their credentials should be (for which tasks), etc. Example: you used a custom coder to change the “last updated” date. So how do we find our own custom coder, and what coding language would they need to know. Same thing with jump links. Also, which pro graphics designer do you use, or at least, where did you find them, and what type of graphics skills do they need?
Stellar post as always Brian! Marketers who have the time and budget can do all of these things, but most people who run small/local businesses simply don’t have the time to do most of them, and it’s not cost-feasible to pay someone else, because that would require a full-time position or paying an agency $xxxx/month, which they can’t afford either. It’s a quandary of a position they find themselves in, and it makes it almost impossible to compete in modern-day search against established giants. I wish Google would place more emphasis on relevancy and what’s actually on the page versus domain authority. Maybe one day they’ll move towards that, but for now, all I see in the SERPs is giant sites first, then comes relevancy.
I did want to ask you about the section on “Don’t Focus on Long Tail Keywords”. This is topical for me, as I actually have a tab open from a recent post on the Moz blog from Rand Fishkin that details reasons why you should focus on long-tail keywords. I know you have said that “they have their place”, but as a newbie to all of this, ever so slightly differing opinions from two authoritative people in the industry (that’s you and Rand of course 🙂) frazzle my brain somewhat, and I’m not sure whether to turn left or right!
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
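You can check whether a robots.txt rule would block Googlebot from a page resource using Python's standard library. The robots.txt content below is a made-up example of exactly the problematic pattern described above: an `/assets/` directory of CSS and JavaScript blocked to all user agents:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the CSS/JS directory — the
# pattern that can make a page look broken or non-mobile-friendly
# to Googlebot.
robots_txt = """\
User-agent: *
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/index.html"))       # True
print(parser.can_fetch("Googlebot", "https://example.com/assets/site.css"))  # False
```

If the second call returns `False` for a stylesheet or script your pages rely on, Googlebot can fetch the HTML but not render it as users see it; removing the `Disallow` line (or scoping it more narrowly) fixes that.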

Interact with customers by responding to reviews that they leave about your business. Responding to reviews shows that you value your customers and the feedback that they leave about your business. High-quality, positive reviews from your customers will improve your business’s visibility and increase the likelihood that a potential customer will visit your location. Encourage customers to leave feedback by creating a link they can click to write reviews.

Organic results include a wider range of websites—not just local businesses. While location may still be a factor in the organic results, it is not generally the dominant factor. The main factors are the authority (or credibility) of the website, external links back to the site, the length of time the site has been live, and other considerations. This is why you will often see sites such as Yelp, Angie’s List, and Facebook show up high in search results.
