A solid content marketing and SEO strategy is also the most scalable way to promote your business to a wide audience. And this generally has the best ROI, as there is no cost per click — so you are scaling your marketing without directly scaling your costs. This kind of SEO strategy is not right for every business, but when it is a good fit, it’s almost unbeatable.

I completely agree that defining a target audience is a great first step, but I would ask whether adding competitors to the analysis (mentioned here as a later step) helps draw out who your target audience is via comparisons, i.e. showing who you are and who you are not. I'd be very interested to hear opinions on how this tactic can be used within the overall step, in coordination with targeted keyword discovery.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as the results of internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
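For illustration, a minimal robots.txt along those lines might look like the following; the directory names here are hypothetical, and any real file depends on the site's own layout:

```
# Placed at the root of the domain, e.g. https://www.example.com/robots.txt
User-agent: *          # applies to all crawlers
Disallow: /cart/       # shopping-cart pages
Disallow: /search/     # internal search results
Disallow: /account/    # user-specific content
```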
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
Hey Brian, love your site + content. Really awesome stuff! I have a question about dead link building on Wikipedia. I actually got a “user talk” message from someone moderating a Wikipedia page I replaced a dead link on. They claimed that “Wikipedia uses nofollow tags” so “additions of links to Wikipedia will not alter search engine rankings.” Any thoughts here?
Thanks for sharing these great tips last August! I've recently adopted them and I have a question (that's kind of connected to the last post): how important is promoting content when using this strategy, for example through Google AdWords? I guess that would depend on the circumstances, but I'm trying to discover if there's a 'formula' here. Thanks in advance!
The world is mobile today. Most people are searching on Google using a mobile device, and the desktop version of a site can be difficult to view and use on one. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
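As one standard building block of a mobile-ready page (an illustration, not a complete recipe), responsive sites typically declare a viewport in the page's head so mobile browsers render it at device width:

```html
<!-- Render at the device's width instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```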
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
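To make that hierarchy concrete, here is a sketch using purely hypothetical URLs for a store; the point is the general-to-specific path, not these particular paths:

```
example.com/                                  root page
example.com/kitchen/                          related-topic listing
example.com/kitchen/coffee-makers/            subcategory listing
example.com/kitchen/coffee-makers/model-x     specific product page
```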
In addition, you can use tools like Majestic SEO to see who is linking to your competitors. Once you identify the links to your competitors' sites, you can analyze them, learn how your competitors earned them, and implement a similar strategy for your website. For example, did they donate to a charity that then linked to their site? You can do the same thing.

In our research with what we have done for ourselves and our clients, there is a definite correlation between content greater than 1,000 words and better rankings. In fact, we are finding amazing ranking jumps when you have content over 3,000 words, about 12 original images (images not found anywhere else online), one H1 (not keyword-stuffed), 12 sub-headlines (H2), 12 relevant internal links, 6 relevant external links and one bullet list. I know it sounds like a lot of work and a Big Mac recipe, but this does work.


Wow Brian, you have solved my problem. A few days back I was looking for ways to increase traffic on my tech blog, and I found this blog post of yours while looking for possible tricks to increase traffic. I must say that a few of the tricks mentioned above really worked for me. For example, I updated a few old posts on my blog, I tried the broken link building technique, and lastly I reposted my content on Medium.
Hey Brian. Even though our own website has ranked constantly (for the last 3 years now) at number 1 on Google for SEO Companies (obviously when searching from London, UK or nearby), I still keep reading other people's posts and sending my own out when I find a gold nugget. However, within your clearly written article I have noticed multiple golden nuggets, and was very impressed by your 'thinking outside the box' approach and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉
In the real world, it's not so easy. For example, I have two niches where I'm trying to use your technique. By keywords, they are Software for Moving and Free Moving Quotes. I have two websites, one for each of them: emoversoftware.com (emover-software.com initially; they link together) and RealMoving.com (for the latter keyword). To begin with, neither of those niches has a Wikipedia article, so your first suggestion will not work. Among your general suggestions, you advise getting backlinks (from authoritative sites, of course). But check this out: my site emover-software.com has only 4(!) backlinks (https://openlinkprofiler.org/r/emover-software.com#.VXTaOs9VhBc) and yet is listed #1 (or #2) for my keywords (moving software, software for moving, software for moving company). RealMoving.com has more than 600 backlinks and is way back in the rankings (170 and up) for my keyword. Even though those sites have different competition, it still makes no sense! It doesn't seem like Google cares about your backlinks at all! I also checked one of my competitors' backlinks; he has more than 12,000, yet his rank for keywords related to moving quotes is even worse than mine!
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]

If you do great work with your search engine optimization (SEO), it could mean a significant amount of revenue for your business. At the same time, however, it is also an ongoing initiative. Once you generate a steady stream of traffic from SEO, you need to constantly be maintaining and improving your SEO in order to keep those rankings you worked so hard for.

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
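As a toy illustration of that spider-and-indexer loop, here is a minimal sketch using only Python's standard library; it shows the concept (download, extract words and links, queue the links), not how any real engine is implemented:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects outgoing links and visible words from a single page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

    def handle_data(self, data):
        self.words.extend(data.split())


def crawl(url):
    """Spider step: download the page, then index words and extract links."""
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = PageParser(url)
    parser.feed(html)
    # Indexer step: record each word with one position where it occurs.
    index = {word: pos for pos, word in enumerate(parser.words)}
    # In a real engine, the links would go to a scheduler for later crawling.
    return index, parser.links


if __name__ == "__main__":
    index, links = crawl("https://example.com/")
    print(len(index), "distinct words;", len(links), "links extracted")
```

A production crawler would add politeness delays, robots.txt checks, deduplication, and a persistent scheduler for the extracted links.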
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
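For instance (the path, filename, and alt text below are invented for illustration), an image used as a link carries its description in the alt attribute, which search engines treat much like the anchor text of a text link:

```html
<!-- The alt text describes the link target, similar to anchor text -->
<a href="/kitchen/coffee-makers/model-x">
  <img src="/images/model-x-coffee-maker.jpg" alt="Model X drip coffee maker">
</a>
```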
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice; Google's new system punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] As for the changes this made to search engine optimization: for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts and other content much sooner after publishing than before, Caffeine was a change to the way Google updated its index, making new content show up on Google more quickly than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]

Let me tell you a story. Early in my tenure at Yahoo, we tried to get into the site dev process at the early stages in order to work SEO into the Product Requirements Documents (PRDs) before wireframing began. But as a fairly new horizontal group not reporting into any of the products, this was often difficult. Nay, damn near impossible. So usually we made friends with the product teams and got in where we could.


To prevent users from linking to one version of a URL and others linking to a different version (which could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from the non-preferred URLs to the dominant URL is a good solution. You can also use the rel="canonical" link element if you cannot redirect.
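As a sketch, assuming an Apache server and placeholder URLs, the two options look like this. The 301 redirect can be declared in .htaccess:

```apache
# .htaccess: permanently redirect the non-preferred URL to the dominant one
Redirect 301 /old-page https://www.example.com/preferred-page
```

And when a redirect isn't possible, the duplicate page can point at the preferred URL from its head:

```html
<!-- Tells search engines which URL is the canonical version of this content -->
<link rel="canonical" href="https://www.example.com/preferred-page">
```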


Just a suggestion, but maybe you could write an article about generating traffic to a brand new blog. As you know, when you start out you have only a couple of posts and very little credibility with other bloggers, and the search engines will take considerable time to be of any benefit initially. It would be interesting to know how Brian Dean approaches that dilemma!
So I'm not good at English. Well, I'm running a new vacation rental and travel website in French, but it seems that in the francophone area people are reluctant to give backlinks. I do need links to rank because I have strong competitors, so I've decided to ask anglophone website owners for those links. Since my content is in French, I thought I could ask for links to pages containing only photos of tourist spots. What do you think of that?

I read your post on my mobile phone during a bus trip, and it stirred me, because I've lately been doing SEO the poor man's way: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don't know if any of these things still work today, since I've been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this. I realized that maybe I've been doing a lot of spamming lately, and that's why my site is still not ranking for my desired keywords. You have also pointed out that the Keyword Planner is not the only way to find keywords, since there are others, as you said, like Wikipedia and the like. I am planning to use this article as my guide in starting the new site. I bookmarked it... honestly. 🙂 And since I have read a lot of SEO tip articles from other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for years to come. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to communicate with you through email and I hope you can coach me, Brian... please. 🙂