When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content, and this can result in suboptimal rankings.
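To make this concrete, here is a minimal robots.txt sketch. The directory names (`/assets/css/`, `/scripts/`) are hypothetical examples, not from the original text; the point is simply that rendering assets should be crawlable:

```
# Hypothetical example — /assets/css/ and /scripts/ are placeholder paths.
# A configuration like this would BLOCK Googlebot from rendering assets:
#
#   User-agent: Googlebot
#   Disallow: /assets/css/
#   Disallow: /scripts/
#
# Instead, make sure the CSS, JavaScript, and image files are allowed:
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
Allow: /*.png$
Allow: /*.jpg$
```

You can verify how Googlebot sees a given page with the URL Inspection tool in Google Search Console.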

Good stuff, Brian. The tip about getting longer (4-line) descriptions is awesome. I hadn't noticed that much in the SERPs, although now I'm on a mission to find some examples in my niche and study how to achieve these longer descriptions. I also like the tip about using brackets in the post's title. One other thing that works well in certain niches is adding a CAPITALIZED word somewhere in the title. Based on some early tests, it appears to improve CTR.
Google’s aim is to provide the most relevant result for any given query. Their entire business model relies on them being able to do this, consistently, across hundreds of billions of searches. For that reason, they’ve invested heavily into understanding the intent of queries, i.e., the reason a person typed a specific thing into Google in the first place.
Google updates its search algorithm frequently. For example, on February 23rd, 2016, Google made significant changes to AdWords, removing right-column ads entirely and rolling out four-ad top blocks on many commercial searches. While this was a paid search update, it had significant implications for CTR on both paid and organic results, especially for competitive keywords.
To rank higher on Google in 2019, consider starting from the ground up with your website and SEO strategy. Try hiring experts like Optuno to build you a custom, SEO-friendly website for your business. The professionals at Optuno also provide hosting, monthly maintenance, and a dedicated team to take care of the site, and they offer a 100% money-back guarantee if you're not satisfied. Click here for a free consultation.