Simply look around at other sites—your clients, your partners, your providers, associations you’re a part of, local directories, or even some local communities or influencers. These are all websites that can have a huge impact on your SEO as well as help you get traffic and raise awareness for your business. You are probably already doing business with most of them, so simply ask for a mention, a case study, a testimonial, or other types of backlinks.

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, with no title or snippet) if there happen to be links to those URLs somewhere on the Internet (such as referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could ignore the instructions in your robots.txt file. Finally, a curious user could examine the directories or subdirectories listed in your robots.txt file and guess the URL of the content that you don't want seen.
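For illustration, here is a minimal robots.txt file (the paths are hypothetical). Note the problem described above: the file politely asks compliant crawlers to skip these paths, but in doing so it also advertises them to anyone who fetches /robots.txt directly.

```
# Applies to all crawlers that honor the Robots Exclusion Standard
User-agent: *
# These lines do NOT protect the content -- they only request that
# well-behaved bots stay away, while revealing the paths to everyone.
Disallow: /private/
Disallow: /internal-reports/
```

For genuinely sensitive content, server-side access control (authentication, or at minimum a `noindex` response header) is the appropriate mechanism, with robots.txt reserved for crawl-traffic management.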
Google’s aim is to provide the most relevant result for any given query. Their entire business model relies on them being able to do this, consistently, across hundreds of billions of searches. For that reason, they’ve invested heavily into understanding the intent of queries, i.e., the reason a person typed a specific thing into Google in the first place.
When someone searches for the name of your business specifically, Google will pull information from your Google My Business page and display it in a panel on the right-hand side of the search results, increasing your business’ exposure. This is great for small businesses, because not only do you get a lot of space on the first page of Google’s organic search results, but you are also able to immediately show what your business is about. Again, the panel is only available to those who have set up their free Google My Business page.
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
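As a sketch of what such markup can look like, here is a JSON-LD snippet using the schema.org BreadcrumbList type (the site, page names, and URLs are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Books",
      "item": "https://example.com/books"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Science Fiction",
      "item": "https://example.com/books/science-fiction"
    }
  ]
}
</script>
```

The `position` values mirror the left-to-right order of the visible breadcrumb trail, from the most general page to the most specific.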
The syntax and appearance of wikilinks may vary. Ward Cunningham's original wiki software, the WikiWikiWeb, used CamelCase for this purpose. CamelCase was also used in early versions of Wikipedia and is still used in some wikis, such as TiddlyWiki, Trac, and PmWiki. A common markup syntax is the use of double square brackets around the term to be wikilinked. For example, the input "[[zebras]]" is converted by wiki software using this markup syntax into a link to a zebras article. Hyperlinks used in wikis are commonly classified as follows:
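The double-bracket conversion described above can be sketched in a few lines. This is an illustrative toy, not the parser of any particular wiki engine: the `/wiki/<title>` URL scheme and the space-to-underscore slug rule are assumptions modeled loosely on common practice.

```python
import re

def render_wikilinks(text: str) -> str:
    """Convert [[target]] markup into HTML links.

    A minimal sketch: real wiki engines also handle piped links
    ([[target|label]]), namespaces, and missing-page styling.
    """
    def to_link(match: re.Match) -> str:
        target = match.group(1)
        # Many engines replace spaces with underscores in the URL slug.
        slug = target.strip().replace(" ", "_")
        return f'<a href="/wiki/{slug}">{target}</a>'

    return re.sub(r"\[\[([^\[\]]+)\]\]", to_link, text)

print(render_wikilinks("See [[zebras]] for details."))
# → See <a href="/wiki/zebras">zebras</a> for details.
```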
In a series of books and articles published from 1964 through 1980, Nelson transposed Bush's concept of automated cross-referencing into the computer context, made it applicable to specific text strings rather than whole pages, generalized it from a local desk-sized machine to a theoretical proprietary worldwide computer network, and advocated the creation of such a network. Though Nelson's Xanadu Corporation was eventually funded by Autodesk in the 1980s, it never created this proprietary public-access network. Meanwhile, working independently, a team led by Douglas Engelbart (with Jeff Rulifson as chief programmer) was the first to implement the hyperlink concept for scrolling within a single document (1966), and soon after for connecting between paragraphs within separate documents (1968), with NLS. Ben Shneiderman, working with graduate student Dan Ostroff, designed and implemented the highlighted link in the HyperTIES system in 1983. HyperTIES was used to produce the world's first electronic journal, the July 1988 Communications of the ACM, which was cited as the source for the link concept in Tim Berners-Lee's Spring 1989 manifesto for the Web. In 1988, Ben Shneiderman and Greg Kearsley used HyperTIES to publish "Hypertext Hands-On!", the world's first electronic book.
Hi Brian – I couldn’t agree more on the tip “delete zombie pages” to raise rankings. We’ve been blogging for 11 years now, and have been through the dark times when you were supposed to publish 400-600 blog posts per month in order to rank. Needless to say we had a lot of thin content… A few years back we embarked on a journey to cut out the dead wood, combine the good stuff, and create the long-form content you espouse on your website. And guess what? Over those 2 years, traffic is up 628%. We’re down to around 72 pages / posts and couldn’t be happier. It gives us time to update the content when necessary and keep it fresh, rather than scratching our heads trying to figure out what new and exciting way to spin divorce mediation!
Keywords are the words and phrases that customers type into Google when looking for information. Use the Google Keyword Planner tool, available through your Google Ads account, to find the most popular keywords people use when searching for your type of business. Optimize your website for those keywords by adding them to blog posts and web pages.