This is excellent and should be intuitive for marketers (and SEO pros are marketers!), but we often take the shortcut and neglect critical details. What would also reinforce the strategy is a way of providing solid projections for SEO (these could be based on industry trends and statistics). Clients now ask for ways to calculate ROI, and they need numbers to get budget approvals: an increase in traffic by X, an increase in qualified traffic and leads, conversions, etc. Some way of quantifying the expected return.
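To make that concrete, here is a minimal back-of-the-envelope sketch of such a projection. Every figure in it is a placeholder assumption, not industry data; swap in your own analytics and benchmarks:

```python
# Rough SEO ROI projection -- all figures are placeholder assumptions.
monthly_visits = 10_000    # projected organic visits after the campaign
lead_rate = 0.02           # share of visits that become qualified leads
close_rate = 0.25          # share of leads that become customers
avg_deal_value = 500.0     # average revenue per customer (USD)
monthly_cost = 3_000.0     # monthly SEO spend (USD)

revenue = monthly_visits * lead_rate * close_rate * avg_deal_value
roi = (revenue - monthly_cost) / monthly_cost
print(f"Projected monthly revenue: ${revenue:,.0f}, ROI: {roi:.0%}")
```

Even a rough model like this gives a client a defensible number to take into a budget meeting, and you can refine each rate as real data comes in.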
Commenting on blog posts written by industry experts with lots of followers can bring your website a lot of traffic. When you post a comment, most blogs allow you to leave a link back to your site for other readers to check out; as long as you leave an insightful comment, you WILL get traffic from your blog comments. Make sure you comment as quickly as possible when new blog posts go up: the higher in the comments you are, the more clicks you'll get. I have Google Reader set up to alert me when new blog posts are made on the industry blogs I follow, and I comment immediately to lock in my first-place spot.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[54] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[55]

Every time I write new content I post it to Twitter. If you use the right keywords and make your tweet interesting enough, you can get a lot of clickthroughs just from people searching. For example, if I write an article about SEO and Google, I can tag the end of the tweet with #SEO #Google, and anyone who searches for those keywords on Twitter can see my tweet about the post that I wrote. Be sure to write creative headlines for your posts so people feel the urge to click on them.
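If you want to automate that step, here is a minimal sketch using the tweepy library. The credentials, headline, and URL below are all placeholders:

```python
# Announce a new post on Twitter with topic hashtags via tweepy.
# Every credential, title, and URL here is a placeholder assumption.
import tweepy

client = tweepy.Client(
    consumer_key="YOUR_CONSUMER_KEY",
    consumer_secret="YOUR_CONSUMER_SECRET",
    access_token="YOUR_ACCESS_TOKEN",
    access_token_secret="YOUR_ACCESS_TOKEN_SECRET",
)

title = "How I Write Headlines That Get Clicked"  # illustrative headline
url = "https://example.com/headline-tips"         # illustrative URL
client.create_tweet(text=f"{title} {url} #SEO #Google")
```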
I completely agree that definition of a target audience is a great first step, but would ask if adding competitors to the analysis (mentioned here as a later step) helps draw out who your target audience would be via comparisons, i.e. showing who you are and who you are not. I would be very interested to hear opinions on how this tactic can be used within the overall step in coordination with targeted keyword discovery.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt with Disallow: /*.html, but it's been almost a year and I've found that Googlebot still often fetches these pages. How can I quickly get Google to completely remove them? I have already removed these URLs from Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls these pages.
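A likely cause: once a URL is disallowed in robots.txt, Googlebot can no longer fetch it, so it never sees the 404/410 response (or a noindex), and the URL can linger in the index. A hedged sketch of the usual fix, with illustrative paths:

```
# robots.txt -- remove (or comment out) the blanket rule so Googlebot
# can recrawl the deleted URLs and see that they are gone:
# Disallow: /*.html

# Then make sure each deleted page returns 410 Gone (or 404), e.g.:
#   curl -I https://example.com/deleted-page.html   <- illustrative URL
#   HTTP/1.1 410 Gone
```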
There is no magic formula for content marketing success, despite what some would have you believe. For this reason, vary the length and format of your content to make it as appealing as possible to different kinds of readers. Intersperse shorter, news-based blog posts with long-form content as well as video, infographics and data-driven pieces for maximum impact.
Like many SEOs, I was hired with one vague responsibility: to set up an SEO program and achieve results. Like many SEOs, I jumped right in and started spewing out SEO audits, rewriting title tags, offering up link suggestions, rewriting URLs, and so on. And like many SEOs, I promised results. But what I didn't do, until that fateful launch, was develop a comprehensive strategy.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
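If content genuinely must stay private, put it behind authentication rather than behind robots.txt. A minimal sketch, assuming an Apache server (the paths are illustrative):

```
# .htaccess in the private directory -- crawlers and browsers alike
# get a 401 without valid credentials, unlike a robots.txt hint.
AuthType Basic
AuthName "Private area"
AuthUserFile /home/site/.htpasswd
Require valid-user
```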

Fantastic stuff, as usual, Brian. The First Link Priority Rule is always one that causes me great angst. I often get torn between search engines and usability when it comes to the main navigation bar. And I've never known what the heck to do about the "Home" link. You can hardly target your keywords with that one without it sounding awkward.
Think of it this way: the more specific your content, the more specific the needs of your audience are -- and the more likely you'll convert this traffic into leads. This is how Google finds value in the websites it crawls; the pages that dig into the inner workings of a general topic are seen as the best answer to a person's query, and will rank higher.
Backlinks can actually serve as a proxy for interest. In Google's vision of a democratic web, they considered links to function like votes. Google wants editorial votes to influence their algorithm. So, if we assume all links are potentially editorial, then looking up backlink data can illustrate content that's truly beloved. Grab your favorite backlink data provider (hey — Moz has one!) and pull a report on a competitor's domain. Take a look at the linked pages, and with a little filtering, you'll see top linked pages emerge. Dive into those pages and develop some theories on why they're popular link targets.
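As a rough illustration of that workflow, here is a sketch that ranks a competitor's pages by how many unique domains link to them, assuming you have exported a backlink report to CSV. The filename and column names ("target_url", "source_domain") are assumptions; match them to your tool's export:

```python
# Count which of a competitor's pages attract the most linking domains,
# from a CSV exported by your backlink tool.
import csv
from collections import defaultdict

linking_domains = defaultdict(set)

with open("competitor_backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        linking_domains[row["target_url"]].add(row["source_domain"])

# Top 10 most-linked pages -- good candidates to study for content ideas.
top_pages = sorted(linking_domains.items(),
                   key=lambda kv: len(kv[1]), reverse=True)[:10]
for url, domains in top_pages:
    print(f"{len(domains):>4} linking domains  {url}")
```

Counting unique linking domains rather than raw links keeps one aggressive linker from skewing the picture of which pages are genuinely beloved.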
Black hat SEO involves techniques such as paying to post links to a website on link farms, stuffing the metadata with unrelated keywords, and using text that is invisible to readers to attract search engines. These and many other black hat SEO tactics may boost traffic, but search engines frown on the use of such measures. Search engines may punish sites that employ these methods by reducing their page rank or delisting them from search results.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to better match pages to the meaning of the whole query rather than to a few words.[39] With regard to the changes this made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve these issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on 'trusted' authors.
For example, let’s say I have a health site. I have several types of articles on health, drug information, and information on types of diseases and conditions. My angle on the site is that I’m targeting seniors. If I find out seniors are primarily interested in information on prescription drug plans and cheap blood pressure medication, then I know that I want to provide information specifically on those things. This allows me to home in on that market’s needs and de-prioritize or bypass other content.

We often see posts on how to get blog topic ideas or ideas on creating visuals, but nobody ever talks about new link building ideas. Some of the ways you showed here are absolutely unheard of to me. You know what, I think you should write a post on how to come up with your own link building ideas: where to start, how to proceed, how to know whether an idea is foolproof... it surely comes with lots of experiments, but the point is starting. I know it sounds weird, but I know you will come up with something 🙂