Hey Brian, I landed on this blog while traveling through blog land. I must appreciate your effort to put up such informative content. As an Internet Marketing Consultant, I would like to add a few thoughts of my own to your valuable content. There are many people who want a HUGE amount of traffic in no time at all. But in my experience, SEO has become a SLOW-BUT-STEADY process in recent times. After so many Google algorithm updates, I think that if we do anything wrong with our websites, we will pay for it. So, without taking any risks, we need to work ethically so that the website slowly gains authority and grabs the targeted traffic. What do you think, mate? I am eagerly looking forward to your reply and would love to see more valuable write-ups from your side. Why don't you write about some important points about Google's Hummingbird update? It would be a good read. Right, brother? 🙂
Hey Brian. Even though our own website has ranked consistently (for the last 3 years now) at Number 1 on Google for "SEO Companies" (when searching from London, UK or nearby, that is), I still keep reading other people's posts and sending out my own when I find a gold nugget. Within your clearly written article I noticed multiple golden nuggets, and I was very impressed by your 'thinking outside the box' approach and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉

There are community forums set up online for virtually every niche, industry, or topic you can imagine. The internet is a prime place for like-minded people to talk to each other. Nine times out of ten you can find a forum for your industry just by typing in [your industry]forum.com or searching for "[Your Industry] Forum" on Google. Find the forums in your industry with the largest user base, start posting there, and become an active community member. Most forums will allow you to leave a link to your website in your post signature, so the more you post, the more traffic you get.

We now have a dedicated SEO strategist who, among other things, develops 90-day plans for our websites. Ninety days isn't long-term planning, but at least we have a strategic objective for the quarter. He also works closely with our UX team to identify the target audience: the crew that does the persona research and focus groups prior to the wireframe stage.
Relevancy is the first qualifier of a quality link opportunity. The next qualifying factor is the opportunity's authority. Since Google no longer updates PageRank (PR), you must rely on third-party metrics. I recommend using Domain Authority (DA) from Moz's Open Site Explorer, Domain Rating (DR) from Ahrefs, or Trust Flow from Majestic to determine the quality of your link opportunities. Use all three tools if you can.
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[61] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[62] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[63] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
I completely agree that defining a target audience is a great first step, but I would ask whether adding competitors to the analysis (mentioned here as a later step) helps draw out who your target audience would be via comparison, i.e., showing who you are and who you are not. I would be very interested to hear opinions on how this tactic can be used within the overall step, in coordination with targeted keyword discovery.
Brian hello! First off I want to THANK YOU for this fantastic post. I can’t emphasize that enough. I have this bookmarked and keep going through it to help boost our blog. I totally nerded out on this, especially the LSI keywords which made my day. I know, pathetic, right? But when so much changes in SEO all the time, these kinds of posts are so helpful. So thanks for this. So no question – just praise, hope that’s ok 😁

Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
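As a rough illustration of why alt text matters to machines, here is a minimal sketch (standard library only; the file names are made up for the example) that flags images an image-search crawler would have nothing to describe:

```python
# Hypothetical sketch: scan an HTML snippet for <img> tags that are
# missing alt text, using only Python's standard library.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src values of images without alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

html = ('<a href="/dogs"><img src="puppy.jpg" alt="Dalmatian puppy"></a>'
        '<img src="decoration.png">')
checker = AltTextChecker()
checker.feed(html)
print(checker.missing_alt)  # images a crawler cannot describe
```

The first image doubles as a link, so its alt text also acts like anchor text; the second gives image search nothing to work with.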

Hey Mischelle, thanks for the input! It’s true, SEO is definitely a long game. You need to lay the foundation and keep improving your site, publish new content and promote what you already have. However, if you keep at it, it can pay off nicely over time. And you are right, picking the right keywords is one of the foundations for SEO success. Thanks for commenting!

For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it's been almost a year and I've found that Google's robot still often crawls these pages. How can I quickly get Google to remove these pages completely? I have already removed these URLs in Google Webmaster Tools via Google Index → Remove URLs, but Google still crawls them.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
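The crawl rules described above can be exercised with Python's standard-library robots.txt parser. This is a minimal sketch with made-up paths; note that the stdlib parser treats Disallow values as literal path prefixes, while wildcard patterns such as /*.html are a search-engine extension it does not understand:

```python
# Minimal sketch of how a crawler consults robots.txt, using the
# standard library's urllib.robotparser. Paths are hypothetical.
import urllib.robotparser

rules = """
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/cart/checkout"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))      # True
```

One caveat worth knowing: robots.txt only stops crawling, not indexing. A noindex meta tag can only be honored if the page remains crawlable, which is why blocking a page in robots.txt alone often fails to remove it from the index.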
Squidoo is a website full of 100% user-generated content that allows you to create what's called a "lens." A lens is a page about a specific topic that you choose to write about (usually something you're knowledgeable in). After creating your lens, other people can find it by searching for terms and keywords related to it. Let me just start off by saying Squidoo is an absolute powerhouse in the search engines. It's very easy to rank Squidoo lenses for competitive terms that would prove to be a challenge for websites with less authority. Creating a lens on Squidoo gives you two traffic opportunities:

This information hits the mark. “If you want your content to go viral, write content that influencers in your niche will want to share.” I love the information about share triggers too. I’m wondering, though, if you could share your insights on how influencers manage to build such vast followings. At some point, they had to start without the support of other influencers. It would seem that they found a way to take their passion directly to a “ready” world. Excellent insights. Thanks for sharing.

In the real world, it's not so easy. For example, I have two niches where I'm trying to use your technique. By keywords, they are "software for moving" and "free moving quotes." I have two websites, one for each of them: emoversoftware.com (emover-software.com initially; they link together) and RealMoving.com (for the latter keyword). To begin with, neither of those niches has a Wikipedia article, so your first suggestion will not work. As a general suggestion, you advise getting backlinks (from authoritative sites, of course). But check this out: my site emover-software.com has only 4(!) backlinks (https://openlinkprofiler.org/r/emover-software.com#.VXTaOs9VhBc) and yet is listed at #1 (or #2) for my keywords (moving software, software for moving, software for moving company). RealMoving.com has more than 600 backlinks and is way back in the rankings (170 and beyond) for my keyword. Even though those sites have different competition, it still makes no sense! It doesn't seem like Google cares about your backlinks at all! I also checked one of my competitors' backlinks: more than 12,000, yet his rank for keywords related to moving quotes is even worse than mine!
Achievable: Make sure you're grounding your goal in reality. Sure, you can't control a massive Google update, but using the history of your sales and competitive data, you can make some inferences. You also need to make sure you have agreed-upon goals. Get buy-in before you set the goal in stone, leveraging the thoughts from the leaders, merchandisers, analysts, and anyone who might be able to provide insight into the likelihood of hitting your goal.
I understand that some SEO agencies and departments are not built for the big SEO campaigns. Strategic work takes time, and speeding (or scaling) through the development stage will likely do more harm than good. It's like cramming for a test — you're going to miss information that's necessary for a good grade. It would be my pleasure if this post inspired some change in your departments.
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the usage of nofollow led to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[32]
This post and the Skyscraper Technique changed my mind about how I approach SEO. I'm not a marketing expert and I haven't ranked sites that monetize really well; I'm just a guy trying to get some projects moving, and I'm not even in the marketing business. I just wanted to say that the way you write makes the information accessible, even to non-native English speakers like myself.
SEO often involves improving the quality of the content, ensuring that it is rich in relevant keywords and organizing it by using subheads, bullet points, and bold and italic characters. SEO also ensures that the site’s HTML is optimized such that a search engine can determine what is on the page and display it as a search result in relevant searches. These standards involve the use of metadata, including the title tag and meta description. Cross linking within the website is also important.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. Picture each website as a bubble in a diagram, with arrows representing the links between sites; programs sometimes called spiders examine which sites link to which others. Websites receiving more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B receives numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it has only one inbound link, has an inbound link from a highly popular site (B), while site E does not.
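The "carry through" behaviour described above is essentially PageRank, which can be sketched in a few lines of Python. The exact link graph below is an assumption for illustration; only B's popularity and the strong B-to-C link come from the text:

```python
# Toy power-iteration PageRank over a five-site example graph.
# The graph itself is hypothetical: A, D, and E's links are assumptions.
damping = 0.85
out = {"A": ["B"], "B": ["C"], "C": ["B"], "D": ["B"], "E": ["A"]}
rank = {page: 1 / len(out) for page in out}

for _ in range(50):  # iterate until ranks stabilise
    new = {page: (1 - damping) / len(out) for page in out}
    for page, targets in out.items():
        share = damping * rank[page] / len(targets)
        for t in targets:
            new[t] += share  # each outlink passes on a share of rank
    rank = new

order = sorted(rank, key=rank.get, reverse=True)
print(order)  # B collects the most link equity; C rides B's single strong link
```

B ends up first because three sites point at it, and C ends up second despite having only one inbound link, because that link comes from B: exactly the carry-through effect the paragraph describes.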
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
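As a rough sketch of the nofollow edit described above (illustrative only: a real site should use a proper HTML sanitizer, and this naive regex would duplicate an existing rel attribute):

```python
# Hypothetical sketch: add rel="nofollow" to links in untrusted
# user-generated content before rendering it.
import re

def nofollow_links(comment_html: str) -> str:
    # Insert rel="nofollow" right after each opening <a tag.
    return re.sub(r"<a\b", '<a rel="nofollow"', comment_html)

print(nofollow_links('Nice post! <a href="http://spam.example">cheap pills</a>'))
```

This way, links dropped by commenters pass no PageRank, so there is less incentive to spam the comment section in the first place.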