I love SiteLink! I switched to SiteLink from another software. Quick response from SiteLink support (when rarely needed). Through the years of transitioning to the web edition, Bob was always responsive to my requests/suggestions- adding some of them to the programming as early as 1 day later! Today, Tavis politely provided very helpful information on the lesser-used features that are available. Great program. Great service! Meladi Morris, Manager at Self Storage
I am blind myself and run a disability community and many of our users have different accessibility issues and needs – eg I can’t see your infographic, deaf people can’t listen to your podcast without transcript or captions. Adding this text will benefit Google and your SEO but also make life much better for people with different disabilities so thanks for thinking of us!
Hi Brian – I couldn’t agree more on the tip “delete zombie pages” to raise rankings. We’ve been blogging for 11 years now, and have been through the dark times when you were supposed to publish 400-600 blog posts per month in order to rank. Needless to say, we had a lot of thin content… A few years back we embarked on a journey to cut out the dead wood, combine the good stuff, and create the long-form content you espouse on your website. And guess what? Over those 2 years, traffic is up 628%. We’re down to around 72 pages/posts and couldn’t be happier. It gives us time to update the content when necessary and keep it fresh, rather than scratching our heads trying to figure out what new and exciting way to spin divorce mediation!
The effect of following a hyperlink may vary with the hypertext system and may sometimes depend on the link itself; for instance, on the World Wide Web most hyperlinks cause the target document to replace the document being displayed, but some are marked to cause the target document to open in a new window (or, perhaps, in a new tab[2]). Another possibility is transclusion, in which the link target is a document fragment that replaces the link anchor within the source document. Hyperlinks are not followed only by people browsing a document; they may also be followed automatically by programs. A program that traverses the hypertext, following each hyperlink and gathering all the retrieved documents, is known as a Web spider or crawler.
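A minimal sketch of the first step such a crawler performs — extracting the hyperlinks it will follow from a fetched page — using only Python's standard library (the page content and URLs here are hypothetical):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects the href targets of <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links are resolved against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

# A real crawler would fetch each page, parse it like this, and enqueue the new links.
page = '<p><a href="/about">About</a> <a href="https://other.example/">Other</a></p>'
collector = LinkCollector("https://example.com/")
collector.feed(page)
print(collector.links)  # ['https://example.com/about', 'https://other.example/']
```

A full spider would add a visited-set and a politeness delay around this core, but the parse-and-resolve loop is the essential mechanism.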

But sometimes there are site-wide technical issues that get in the way of ranking on Google. Luckily, fixing technical issues is not a required step for every single piece of content you create. However, as you create more and more content you should be aware of duplicate content, broken links, or problems with crawling and indexing. These issues can set you back in search results.

To do this, I often align the launch of my content with a couple of guest posts on relevant websites to drive a load of relevant traffic to it, as well as some relevant links. This has a knock-on effect toward the organic amplification of the content and means that you at least have something to show for the content (in terms of ROI) if it doesn't do as well as you expect organically.
The most valuable tip I give small business owners who are looking to improve their ranking is to optimize their website for voice search. As of January 2018 alone, there were an estimated one billion voice searches per month. This number has grown exponentially over the past year and it will continue to grow in 2019. Optimizing their sites now will give them an edge in all aspects of their marketing.

Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
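One practical consequence of the four host variants is that links to your site should all resolve to a single canonical form. A small sketch of that normalization, assuming a hypothetical canonical host of `example.com` served over https:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, canonical_host="example.com"):
    """Map any of the four variants (http/https x www/non-www) of our own
    host onto one canonical https URL; leave foreign hosts untouched.
    The host name is an assumption for illustration."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host != canonical_host:
        return url  # not our site; don't rewrite
    return urlunsplit(("https", canonical_host, parts.path, parts.query, parts.fragment))

print(canonicalize("http://www.example.com/page?a=1"))  # https://example.com/page?a=1
```

In production this choice is usually enforced with a server-side 301 redirect rather than in application code, but the mapping is the same.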
I was wondering whether by this reply you really meant a 410? Also, what is your take on the many suggestions out there saying that making 301 redirects is always better than deleting pages? I understand the reasoning is that redirecting everything is (or risks being seen as) spammy. I’m also guessing that too many redirects will slow down the page.

You should build a website to benefit your users, and any optimization should be geared toward making the user experience better. One of those users is a search engine, which helps other users discover your content. Search Engine Optimization is about helping search engines understand and present content. Your site may be smaller or larger than our example site and offer vastly different content, but the optimization topics we discuss below should apply to sites of all sizes and types. We hope our guide gives you some fresh ideas on how to improve your website, and we'd love to hear your questions, feedback, and success stories in the Google Webmaster Help Forum1.
Quora is a website where the content is generated entirely by users. They post questions as threads and other users answer them. It’s basically a Yahoo Answers-style social network that works like an internet forum. Both questions and answers can receive “upvotes,” which signify that an answer was worthy and popular. The answers with the most upvotes are placed at the top of the thread.
Hi Brian, a very useful post, thanks for sharing. These things turned to be very useful for us: blocking thin content/pages from Google index, adjusting meta titles/descriptions and content of popular articles, improving internal links, improving page speed, implementing schema (didn’t notice a big difference here), optimizing images and their alt tags, making sure the site is mobile friendly and there are no accessibility issues (Lighthouse in Google Chrome helped a lot), some link building activity (long term) and of course keyword research and mapping. Thanks again for providing valuable info, regards.
The W3C Recommendation called XLink describes hyperlinks that offer a far greater degree of functionality than those offered in HTML. These extended links can be multidirectional, linking from, within, and between XML documents. It can also describe simple links, which are unidirectional and therefore offer no more functionality than hyperlinks in HTML.
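A sketch of what an extended XLink can look like (the file names and labels are hypothetical). The two `arc` elements point in opposite directions, which is what makes the link multidirectional:

```xml
<chapter xmlns:xlink="http://www.w3.org/1999/xlink">
  <!-- A simple link: unidirectional, like an HTML <a href> -->
  <see xlink:type="simple" xlink:href="glossary.xml">Glossary</see>

  <!-- An extended link: locators name the resources, arcs connect them -->
  <related xlink:type="extended">
    <res xlink:type="locator" xlink:href="chapter1.xml" xlink:label="ch1"/>
    <res xlink:type="locator" xlink:href="notes.xml"    xlink:label="notes"/>
    <go  xlink:type="arc" xlink:from="ch1"   xlink:to="notes"/>
    <go  xlink:type="arc" xlink:from="notes" xlink:to="ch1"/>
  </related>
</chapter>
```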
Using HTML frames to surround linked content is a completely different matter. For an example of this, click on this link to the W3C about link myths. Some companies have successfully sued to have their pages removed from these frames, because framing can make some readers believe that the linked page is actually a part of the originating site, and possibly owned or authored by that same site. But in most cases, if the frame is removed once the owner of the linked site objects, there isn't any legal recourse.
Use LSI keywords, and answer additional questions that users may think of after viewing the content. Simply offering only the content that a user searches for is no longer enough. Pages need to supply additional information a user may be seeking. Providing additional information will help retain the user, and tell search engines that the page’s content is not only answering the search query but providing additional value that other pieces of content may not be.

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
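To see what robots.txt actually does (and does not do), here is a small demonstration with Python's standard `urllib.robotparser`. The file below only advises compliant crawlers; nothing about it stops a browser or a rogue bot from requesting `/private/` directly:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: it asks crawlers to skip /private/,
# but it is advice, not access control.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/public/page"))   # True
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
```

For genuinely sensitive content, use server-side authentication instead; robots.txt is only a courtesy signal.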
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page and get good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags18 and better snippets for your users19. We also have a handy Help Center article on how to create good titles and snippets20.
An important factor in ranking is review signals, which refers to the quality, quantity, velocity, and diversity of reviews you get from customers. This rank factor is intriguing as it has jumped up year-over-year in importance. Google reviews are the most important, followed by third-party reviews (Yelp, Facebook, and other sites). It’s also important to get your product/service mentioned in the review. There is even some suggestion that responses to reviews are a factor in rank.
Hi Brian, great list. I noticed you mentioned using Kraken to optimize your images. A couple other tools I’ve found to work really well are ImageOptim (an app that you can download for your computer) and Optimus (a WordPress plugin that will optimize them when uploaded to your site). I’m not sure what your policy is on including links in comments so I won’t link to them here (even though I have no affiliate with either one.) Hope those resources can help someone!
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page30 that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors31.
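A minimal custom 404 page along these lines might look like the fragment below (the page names and paths are hypothetical):

```html
<!-- 404.html: served with HTTP status 404, not 200 -->
<h1>Page not found</h1>
<p>Sorry, we couldn't find that page. Try one of these instead:</p>
<ul>
  <li><a href="/">Home page</a></li>
  <li><a href="/popular/">Popular content</a></li>
</ul>
```

Note that the server should still return a 404 status code with this page; serving it with a 200 creates "soft 404s" that search engines may index.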
Our team of more than 70 programmers, sales and customer support members all work under one roof with one goal: provide the best self-storage software. We invest heavily in personnel, training and technology to respond to your calls and deploy updates regularly. We love it when customers notice how we turn their suggestions into new features in a few weeks' time.
In short, press request alerts are requests for sources of information from journalists. Let's say you're a journalist putting together an article on wearable technology for The Guardian. Perhaps you need a quote from an industry expert or some products that you can feature within your article? Well, all you need to do is send out a request to a press service and you can wait for someone to get back to you.
Structured data21 is code that you can add to your sites' pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
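The most common form of structured data today is a JSON-LD script block in the page's head. A hedged sketch, built in Python so the JSON stays well-formed (the business details are hypothetical; `SelfStorage` is a real schema.org type):

```python
import json

# Hypothetical LocalBusiness-family markup for an example storage business.
business = {
    "@context": "https://schema.org",
    "@type": "SelfStorage",
    "name": "Example Self Storage",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Rd",
        "addressLocality": "Springfield",
    },
}

# Embed the markup in the page's <head> as a JSON-LD script block.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(business, indent=2)
print(snippet)
```

Generating the block from a dict (or from your CMS's database) keeps it valid as the underlying data changes; hand-edited JSON-LD tends to drift out of sync.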

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
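A sketch of that automatic generation: collapse the page's text and truncate it at a word boundary. The 155-character limit is a common rule of thumb for snippet length, not an official Google limit:

```python
import re
from html import escape

def meta_description(page_text, limit=155):
    """Build a <meta name="description"> tag from a page's visible text,
    truncated at a word boundary so it fits typical snippet lengths."""
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) > limit:
        text = text[:limit].rsplit(" ", 1)[0].rstrip(",;:") + "…"
    return '<meta name="description" content="%s">' % escape(text, quote=True)

print(meta_description("Rates, sizes and availability   for our storage units."))
```

Even an automated description like this beats a missing or duplicated one, though hand-written descriptions for key pages are still worth the effort.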
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup28 when showing breadcrumbs.
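That breadcrumb markup uses the schema.org `BreadcrumbList` type. A small sketch that builds it from a root-to-page trail (the trail itself is hypothetical):

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList markup from a root-to-page trail
    of (name, url) pairs; positions are numbered from 1 at the root."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }

trail = [("Home", "https://example.com/"),
         ("Guides", "https://example.com/guides/"),
         ("SEO basics", "https://example.com/guides/seo-basics")]
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

The generated dict mirrors the visible breadcrumb row, so deriving both from the same trail data keeps the markup and the on-page links consistent.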
Embedded content linking. This is most often done with either iframes or framesets — and most companies do not allow their content to be framed in such a way that it looks like someone else owns the content. If you're going to do that, you should be very aware that this annoys people. Furthermore, if you're not willing to remove the content in an iframe or the frameset around the linked page, you may be risking a lawsuit.
Unfortunately, high rankings rarely happen by chance. Even the most skilled and knowledgeable marketers struggle with getting the top-ranking spot. So, how can a regular business owner hope to achieve this feat? While there's no way to absolutely guarantee high rankings, this post will look at some strategies anyone can use to seriously increase their chances of claiming that #1 spot.

You may find that your business doesn’t appear for relevant searches in your area. To maximize how often your customers see your business in local search results, complete the following tasks in Google My Business. Providing and updating business information in Google My Business can help your business’s local ranking on Google and enhance your presence in Search and Maps.
As an alternative to tackling this on your own, try using a service like Mayple to match you with your industry’s best marketing experts. Mayple is a platform that connects business owners to vetted marketing experts, so you can focus on running your business and delegate the rest to experienced professionals — all you need to do is fill out a brief explaining your business’ goals. It even monitors your project’s progress and ensures your expert always delivers the best results. Get started today.