Hi Brian, a very useful post, thanks for sharing. These things turned out to be very useful for us: blocking thin content/pages from Google's index, adjusting the meta titles/descriptions and content of popular articles, improving internal links, improving page speed, implementing schema (didn't notice a big difference here), optimizing images and their alt tags, making sure the site is mobile friendly and has no accessibility issues (Lighthouse in Google Chrome helped a lot), some link building activity (long term), and of course keyword research and mapping. Thanks again for providing valuable info, regards.
An easy way to keep your website current and relevant is by maintaining an active blog. This allows you to create posts that use your keywords while also telling Google your website is up-to-date without actually having to update your web pages. Consider writing on topics that answer frequently asked questions or sharing your expertise in your industry.
One of your tips is "Make your content actionable"…I feel like this is not always the case, depending on the context of the blog post. If I were to write a post on, say, "35 Incredible Linear Actuator Applications", I feel like that's something extremely hard to make actionable. Rather, I feel my purpose is to be informative. Both of us are going for backlinks, but your blog, I feel, is more aimed at giving awesome tips & advice to SEOs. Whereas MY BLOG is about informing people about "linear actuators and their applications".
This is a great list of SEO tips. One tip you can add to your broken links section is that there is a free tool called Xenu Link Sleuth. It will automatically crawl your website (or a competitor's site) and find all the broken links (and lots of other things). It's very helpful if you have a large site or need to find broken links you aren't aware of (like images and .pdf files).
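At its core, a link checker like the one described above does two things: extract every link from a page's HTML, then request each URL and flag the ones that return errors. Here is a minimal sketch of the extraction step using Python's standard-library HTML parser; the sample HTML is invented, and the status-checking step is only outlined in comments so the example stays runnable offline.

```python
# Sketch of the link-extraction step of a broken-link checker.
# The page content below is a made-up sample, not from a real site.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href/src targets from <a>, <img>, and similar tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

page = '<a href="/about">About</a><img src="/logo.png"><a href="report.pdf">Report</a>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)

# A full crawler would now fetch each link (e.g. with urllib.request),
# follow internal pages recursively, and record any 404/410 responses
# as broken links -- including images and .pdf files, as noted above.
```

Note that this catches embedded resources like images because it reads `src` attributes as well as `href`, which is why tools like Xenu surface broken image and PDF links too.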
I have two tabs open. This article and another one. Both written in June. Each with a different opinion on keywords in URLs. It's so hard to follow SEO nowadays when everyone says they have the best data to prove stuff yet they contradict each other. The only way around it is to test test test on my own stuff, but it would be great if there was consensus.
To find the right people I downloaded a list of some of the most popular users within the community. To do this, I used Screaming Frog SEO Spider to gather a list of all the URLs on the website. I then exported this list into an Excel spreadsheet and filtered the URLs to only show those that were user profile pages. I could do this because all of the profile pages had /user/ within the URL.
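The filtering step described above is simple enough to do outside Excel as well. Here is a minimal sketch in Python: given a crawled URL list (the sample URLs below are hypothetical), it keeps only the user profile pages by matching the `/user/` path segment mentioned in the comment.

```python
# Filter a crawled URL list down to user profile pages.
# The sample URLs are invented; the /user/ segment comes from the text above.
urls = [
    "https://example.com/user/alice",
    "https://example.com/blog/seo-tips",
    "https://example.com/user/bob",
    "https://example.com/about",
]

# Keep only URLs whose path contains the /user/ segment.
profile_pages = [u for u in urls if "/user/" in u]

print(profile_pages)
```

The same pattern works on an export from Screaming Frog: read the URL column from the CSV and apply the one-line filter.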
A few links down and I've noticed that Brian has a link from WordPress.org. Not bad! Turns out that his content has been referenced within one of WordPress's codex posts. If I were to reach out and offer some additional insight, citing one of my articles, there's a chance I could bag a similar link, especially considering they have a 'Useful Resources' section.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
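For reference, a robots.txt file is just a plain-text file at the root of the domain (or subdomain). A minimal example, with hypothetical paths chosen purely for illustration, might look like this:

```
# Hypothetical robots.txt -- paths are illustrative, not from the original post.
# Block all crawlers from a private directory, allow everything else.
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

Because crawlers only look for the file at the root of each hostname, `blog.example.com` would need its own copy, which is why the guide above notes that each subdomain requires a separate robots.txt file.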
Embedded content linking. This is most often done with either iframes or framesets — and most companies do not allow their content to be framed in such a way that it looks like someone else owns the content. If you're going to do that, you should be very aware that this annoys people. Furthermore, if you're not willing to remove the content in an iframe or the frameset around the linked page, you may be risking a lawsuit.
The syntax and appearance of wikilinks may vary. Ward Cunningham's original wiki software, the WikiWikiWeb, used CamelCase for this purpose. CamelCase was also used in early versions of Wikipedia and is still used in some wikis, such as TiddlyWiki, Trac, and PmWiki. A common markup syntax is the use of double square brackets around the term to be wikilinked. For example, the input "[[zebras]]" is converted by wiki software using this markup syntax into a link to a zebras article. Hyperlinks used in wikis are commonly classified as follows:
Search engine optimization (SEO) tools enable you to take the guesswork out of search engine optimization by giving you insights about your keywords, analyzing your website, helping you grow your domain authority through directories, and more. Optimizing a website to rank on Google can be tricky if you’re not a search engine optimization or web development pro. However, there are tools available to help make it a lot easier for small businesses.
Quora is a website where the content is generated entirely by users. They post questions as threads and other users answer them. It's basically a Yahoo Answers-style social network that works like an internet forum. Both questions and answers can receive "upvotes," which signify that the post was worthy and popular. The answers with the most upvotes appear at the top of the thread.
In order to rank higher on Google in 2019, consider starting from the ground up with your website and SEO strategy. Try hiring experts like Optuno to build you a custom SEO-friendly website for your business. The professionals at Optuno also provide hosting, monthly maintenance, and a dedicated team to take care of the site. It also offers a 100% money-back guarantee if you’re not satisfied. Click here for a free consultation.