Robots.txt is not an appropriate or effective way to block sensitive or confidential material. It only tells well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. One reason is that search engines can still reference the URLs you block (showing just the URL, with no title or snippet) if links to those URLs exist somewhere on the Internet (for example, in referrer logs). Also, non-compliant or rogue crawlers that don't honor the Robots Exclusion Standard may simply ignore the instructions in your robots.txt. Finally, a curious user can read the directories or subdirectories listed in your robots.txt file and guess the URLs of the content you don't want seen.
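To see why this matters, consider a minimal robots.txt (the directory names here are hypothetical):

```txt
User-agent: *
Disallow: /private/
Disallow: /staging/
```

Anyone who fetches yoursite.com/robots.txt sees exactly these paths, so the file itself advertises the locations you are trying to hide. For genuinely sensitive content, use server-side authentication or a noindex directive instead.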
Additionally, the title must also be interesting enough that people will actually want to click on it! A good example of this would be PT from PTMoney.com, who wrote a great post about "making extra money." However, rather than a boring title, like "Make Extra Money," he titled it "52 Ways to Make Extra Money." Now that is something I would want to read.
But sometimes site-wide technical issues get in the way of ranking on Google. Luckily, fixing technical issues is not a required step for every single piece of content you create. However, as you create more and more content, you should watch for duplicate content, broken links, and problems with crawling and indexing. These issues can set you back in search results.

I was wondering if by this reply you really meant 410? Also, what is your take on the many suggestions out there saying that making 301 redirects is always better than deleting pages? I understand the reason against this is that just redirecting everything is (or risks being) spammy. I'm also guessing that too many redirects will slow down the page.
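For context, the two options being weighed here might look like this on an Apache server (the paths are hypothetical, and this assumes mod_alias is enabled):

```apache
# Option 1: permanently redirect a removed page to its closest replacement
Redirect 301 /old-service.html /services/leak-detection.html

# Option 2: tell crawlers the page is intentionally and permanently gone (410)
Redirect gone /retired-promo.html
```

A 301 preserves link equity when a genuinely equivalent page exists; a 410 is the honest signal when there is no relevant replacement, which is one reason blanket redirects can read as spammy.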


An important factor in ranking is review signals, which refers to the quality, quantity, velocity, and diversity of reviews you get from customers. This ranking factor is intriguing, as it has jumped in importance year over year. Google reviews are the most important, followed by third-party reviews (Yelp, Facebook, and other sites). It's also important to get your product/service mentioned in the review. There is even some suggestion that responses to reviews are a ranking factor.
These are all great. I am working on implementing most of these. My biggest issue is my site is brand new (2 months). I am ranking for a lot but seem to be limited because, I am assuming, Google will not give enough trust to a new site. What should I be doing to overcome the newness of my site? I buy houses in the Dallas Fort Worth area, and if you are not number 1 on Google then you might as well be on page 10! Any advice would be well received, and please keep up the great work!
One of your tips is "Make your content actionable"… I feel like this is not always the case, depending on the context of the blog post. If I am to write a post on, say, "35 Incredible Linear Actuator Applications", I feel like that's something extremely hard to be actionable about. Rather, I feel my purpose is to be informative. Both of us are going for backlinks, but your blog, I feel, is more aimed at giving awesome tips and advice for SEOs, whereas my blog is about informing people about linear actuators and their applications.
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
The syntax and appearance of wikilinks may vary. Ward Cunningham's original wiki software, the WikiWikiWeb, used CamelCase for this purpose. CamelCase was also used in early versions of Wikipedia and is still used in some wikis, such as TiddlyWiki, Trac, and PmWiki. A common markup syntax is the use of double square brackets around the term to be wikilinked. For example, the input "[[zebras]]" is converted by wiki software using this markup syntax to a link to a zebras article. Hyperlinks used in wikis are commonly classified as follows:
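A minimal sketch of how this double-bracket markup might be converted into an HTML link, written in Python. The /wiki/ URL scheme and the space-to-underscore convention follow MediaWiki and are assumptions here; real wiki engines also handle piped labels, namespaces, and escaping.

```python
import re

def render_wikilinks(text):
    # Convert [[term]] into <a href="/wiki/term">term</a>.
    # Spaces in the target become underscores, as in MediaWiki URLs.
    return re.sub(
        r"\[\[([^\]|]+)\]\]",
        lambda m: '<a href="/wiki/{0}">{1}</a>'.format(
            m.group(1).replace(" ", "_"), m.group(1)),
        text,
    )

print(render_wikilinks("Plains [[zebras]] live in Africa."))
# → Plains <a href="/wiki/zebras">zebras</a> live in Africa.
```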
For centuries, the myth of the starving artist has dominated our culture, seeping into the minds of creative people and stifling their pursuits. But the truth is that the world’s most successful artists did not starve. In fact, they capitalized on the power of their creative strength. In Real Artists Don’t Starve, Jeff Goins debunks the myth of the starving artist by unveiling the ideas that created it and replacing them with fourteen rules for artists to thrive.
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
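As a concrete illustration (the filenames and URL here are hypothetical), an image link with descriptive alt text and a descriptive filename might look like:

```html
<!-- The alt text acts much like anchor text for this image link,
     and the descriptive filename helps image search understand it. -->
<a href="/plumbing-guide">
  <img src="/images/leak-detection-diagram.png"
       alt="Diagram of common household leak detection points">
</a>
```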
An inline link may display a modified version of the content; for instance, instead of an image, a thumbnail, low resolution preview, cropped section, or magnified section may be shown. The full content is then usually available on demand, as is the case with print publishing software – e.g., with an external link. This allows for smaller file sizes and quicker response to changes when the full linked content is not needed, as is the case when rearranging a page layout.
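On the web, this thumbnail pattern is commonly an inline image that links to the full-resolution file (filenames hypothetical):

```html
<!-- Only the small thumbnail loads with the page; the full image
     is fetched on demand when the reader clicks through. -->
<a href="/images/zebra-full.jpg">
  <img src="/images/zebra-thumb.jpg" width="160"
       alt="Zebra herd (thumbnail; click for full resolution)">
</a>
```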
For example, a plumber could first find a service that has search volume on Google but is not covered in depth on competitors' websites. In this example, a service like "leak detection" may be listed with a small blurb on your competitors' sites, but none of them have elaborated on every angle or created FAQs, videos, or images. This represents an opportunity to dominate that topic.
If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a Web agency, or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index, and understand your content.

2. It could be easier to get a backlink with a jump link, especially to a long article/page, as it is easier to add linkable content to your current long page instead of creating a totally new page. And for the site that links to you, it is more relevant if the link goes directly to the part of the page they are referring to.
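A jump link of this kind is just a fragment identifier pointing at an element id on the long page (the URL and id below are hypothetical):

```html
<!-- On the long page, give the section an id: -->
<h2 id="leak-detection">Leak Detection</h2>

<!-- A referring site can then link straight to that section: -->
<a href="https://example.com/services#leak-detection">leak detection guide</a>
```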
To maximize your ROI when optimizing your site for mobile search, consult with a marketing professional to make sure your SEO efforts are in order. Use Mayple to be matched with a marketing expert from your industry, so you know your second set of eyes is a professional's. Visit Mayple's site, fill out a brief identifying your business's goals, and receive a FREE full audit of your marketing campaigns.