SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]
Meta tags. Meta tags still play a vital role in SEO. If you type any keyword into a search engine, you’ll see how that keyword is reflected in the titles of the top results. Google looks at your page title as a signal of relevance for that keyword. The same holds true for the description of that page. (Don't worry about the keywords meta tag -- Google has publicly said that it doesn't pay attention to it, since it has been abused by webmasters trying to rank for certain keywords.)

Just a suggestion, but maybe you could write an article about generating traffic to a brand new blog. As you know, when you start out, you have only a couple of posts and very little credibility with other bloggers, and the search engines will take considerable time to be of any benefit initially. Would be interesting to know how Brian Dean approaches that dilemma!
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
Hey Brian. Even though our own website has ranked consistently (last 3 years now) for SEO Companies at Number 1 of Google (obviously when searching from London UK or nearby, that is), I still keep reading other people’s posts and sending my own out when I find a gold nugget. However, within your clearly written article I have noticed multiple golden nuggets, and was very impressed by your ‘thinking out the box’ approach, and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉

Great post. I know most of the stuff experienced people read and think “I know that already”… but actually there are lots of things we tend to forget even though we know them. So it's always good to read those. What I liked most was the broken link solution. Not only creating a substitute for the broken link but actually going beyond that. I know some people do this as an SEO technique, but it's actually also useful for the internet, as you repair those broken links that others find somewhere else.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
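In practice, a blog platform can apply this by adding `rel="nofollow"` to every user-submitted link before rendering the comment. A minimal sketch in Python (the function name and URLs are hypothetical; a real platform would do this in its templating layer):

```python
from html import escape

def render_comment_link(url, text):
    """Render a user-submitted link with rel="nofollow" so the page
    does not vouch for (pass reputation to) the target site."""
    return '<a href="%s" rel="nofollow">%s</a>' % (escape(url), escape(text))

print(render_comment_link("https://spammy.example/win", "click here"))
# -> <a href="https://spammy.example/win" rel="nofollow">click here</a>
```

Escaping the URL and anchor text also guards against comment authors injecting their own markup.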
Holy Engagement! This was an awesome post, full of great info… and then I realized that 3/4 of the actual page was comments… which is even better for shares, SEO and overall engagement. I was lucky enough to attend an event where Neil Patel was giving some great blogging training and a lot of what you covered was there. The simple fact that you comment back is awesome.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
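The spider-then-indexer pipeline described above can be sketched in a few lines of Python. This is purely illustrative: a real indexer also weights words by position and markup, and the scheduler, fetching, and storage layers are omitted.

```python
from html.parser import HTMLParser
from collections import Counter

class PageIndexer(HTMLParser):
    """Minimal sketch of the indexer step: collect the words a page
    contains (with frequencies as a crude 'weight') and the outgoing
    links to hand back to the crawl scheduler."""
    def __init__(self):
        super().__init__()
        self.words = Counter()   # word -> frequency
        self.links = []          # URLs for the scheduler to crawl later

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.update(w.lower() for w in data.split() if w.isalpha())

indexer = PageIndexer()
indexer.feed('<html><body><h1>SEO basics</h1>'
             '<a href="https://example.com/next">next page</a></body></html>')
# indexer.words now holds the page's word counts;
# indexer.links feeds the scheduler for crawling at a later date.
```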
SEO often involves improving the quality of the content, ensuring that it is rich in relevant keywords and organizing it by using subheads, bullet points, and bold and italic characters. SEO also ensures that the site’s HTML is optimized such that a search engine can determine what is on the page and display it as a search result in relevant searches. These standards involve the use of metadata, including the title tag and meta description. Cross linking within the website is also important.
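The title tag and meta description mentioned above are easy to sanity-check programmatically. A minimal sketch; note the 60- and 155-character limits are common industry rules of thumb for snippet truncation, not official Google numbers:

```python
def check_snippet_tags(title, description,
                       max_title=60, max_description=155):
    """Flag title tags and meta descriptions that are missing or long
    enough that they may be truncated in search results."""
    issues = []
    if not title:
        issues.append("missing title tag")
    elif len(title) > max_title:
        issues.append(f"title is {len(title)} chars (may be truncated)")
    if not description:
        issues.append("missing meta description")
    elif len(description) > max_description:
        issues.append(f"description is {len(description)} chars (may be truncated)")
    return issues

print(check_snippet_tags("Easy Chocolate Chip Cookies | Example Bakery",
                         "A complete, step-by-step cookie recipe."))
# -> []
```

Run over a site crawl, a check like this quickly surfaces pages whose metadata needs attention.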
Social media. The algorithms have truly changed since social media first emerged. Many content websites are community-oriented -- Digg began allowing users to vote on which stories make the front page, and YouTube factors views and user ratings into its front page rankings. Therefore, e-commerce stores must establish a strong social media presence on sites like Facebook, Pinterest, Twitter, etc. These social media sites send search engines signals of influence and authority.

In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript, thus permitting PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[32]
Having an industry influencer publish a blog post on your site or turning an interview with them into a blog post can help to drive traffic both through organic search but also via that influencer promoting the content to their audience (see the backlinks section above). This can also help to add more variety to your content and show your visitors that you are active in your field.

Promoting your websites by publishing articles to various article directories is by no means a new idea, but it is still an extremely effective way to drive traffic. If you write content and publish it to websites like Article Base and Article Dashboard, website owners will pick it up and post it. This idea is similar to guest blogging, except that you only have to write one piece of content, which can end up on hundreds or even thousands of blogs and websites. The same rule applies here: don’t be boring – be creative and interesting, and use common keywords in your article and title so website owners can find it!

Search engines find and catalog web pages through spidering (also known as webcrawling) software. Spidering software "crawls" through the internet and grabs information from websites which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.

So many great tips! There are a couple of things I’ve implemented recently to try and boost traffic. One is to make a pdf version of my post that people can download. It’s a great way to build a list:) Another way is to make a podcast out of my post. I can then take a snippet of it and place it on my Facebook page as well as syndicate it. As far as video I’ve started to create a video with just a few key points from the post. The suggestion about going back to past articles is a tip I am definitely going to use especially since long-form content is so important. Thanks!

Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.

When Googlebot crawls a page, it should see the page the same way an average user does15. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
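Concretely, this means your robots.txt should not disallow the directories that hold rendering assets. A sketch (the paths are hypothetical examples, not a standard layout):

```
# robots.txt -- keep JavaScript, CSS, and images crawlable so Googlebot
# can render pages the way users see them
User-agent: *
Disallow: /admin/
# Do NOT add lines like "Disallow: /assets/js/" or "Disallow: /assets/css/",
# which would block the files needed for rendering and indexing.
```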
Hello Brian, really such an informative article, and it is more meaningful as you provided screen shots. I have noticed that articles with images bring more value to understanding things. I have just started my career in this industry and thus keep looking for good articles/blogs that are meaningful and help me to implement tips in my work apart from my seniors' instructions. I guess this way I can prove my caliber to them 🙂
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags18 and better snippets for your users19. We also have a handy Help Center article on how to create good titles and snippets20.
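In markup, the description meta tag sits in the page's head alongside the title. A hypothetical example (the page and wording are invented for illustration):

```html
<head>
  <title>Easy Chocolate Chip Cookies | Example Bakery</title>
  <!-- Google might use this as the snippet shown under the result -->
  <meta name="description"
        content="A complete, step-by-step chocolate chip cookie recipe with photos and common substitutions.">
</head>
```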
Relevancy is the first qualifier of a quality link opportunity. The next qualifying factor is the authority of the opportunity. Since Google doesn’t update PageRank (PR) anymore, you must rely on third-party metrics. I recommend you use Domain Authority (DA) from Open Site Explorer, Domain Rating (DR) from Ahrefs, or Trust Flow from Majestic to determine the quality of your link opportunities. You should use all three tools if you can.
Think of this document as a living communication between you and your client or boss. It is a document you should refer to often. It keeps all parties on the same page and aligned. I recommend sharing it in a collaborative platform so updates are shared between all viewers without having to constantly send out new copies (nothing sucks the life out of efficiency faster than "versioning" issues).
I read your post on my mobile phone while on a bus ride, and it stirred me, because I’ve been doing SEO lately the poor man’s way: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don’t know if any of these things still work today, since I’ve been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this one. I found out that maybe I’ve been doing so much spamming lately that my site is still not ranking for my desired keywords. And also, you have pointed out that the Keyword Planner is not the only means of getting keywords, since there are others, as you have said, like Wikipedia and the like. I am planning to use this article as my guide in starting a new one. I bookmarked it… honestly 🙂 And since I have read a lot of articles regarding SEO tips from other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for long years. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to communicate with you through email and I hope you can coach me Brian… please 🙂