What kind of advice would you give if your site is growing but seems to be attracting the wrong kind of traffic? My visitor numbers are going up, but all other indicators, such as bounce rate, time on page, and pages per visit, seem to be moving in the wrong direction. Not sure if that’s to be expected or if there is something I should be doing to counter that development?

This was all free information I found online in less than an hour, and it gives me some great ideas for content, partnerships, and potential tools to build into my site so it stays relevant and useful to my target audience. Of course, this is just some quick, loose data, so I'll emphasize again: be careful where your data comes from (validate it when possible), and think about how to use your data wisely.
You have also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now, and I don’t think they’re worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that do come through, the average time on site is something like 0.02 seconds, compared to more than 2 minutes for other traffic sources on my website. I have heard from a few others who had a similar experience with Quuu, so I thought I should let you know.
This philosophy is beautiful in its simplicity, and it serves to correct the “more, more, more” mentality of link building. We only want links from relevant sources. Often, this means that in order to scale our link-building efforts beyond the obvious tactics, we need to create something that deserves links. You have links where it makes sense for you to have links. Simple.

If you do great work with your search engine optimization (SEO), it could mean a significant amount of revenue for your business. At the same time, however, it is also an ongoing initiative. Once you generate a steady stream of traffic from SEO, you need to constantly be maintaining and improving your SEO in order to keep those rankings you worked so hard for.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
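In practice, that request to a syndication partner boils down to two small pieces of markup on their copy of the article. Here is a minimal sketch; the URL and page layout are placeholders, not anyone's real site:

    <!-- On the syndication partner's copy of the article -->
    <head>
      <!-- Keeps the syndicated copy out of search engines' indexes -->
      <meta name="robots" content="noindex">
    </head>
    <body>
      <article>
        <!-- ...syndicated article body... -->
        <p>This article originally appeared on
           <a href="https://www.example.com/original-article">example.com</a>.</p>
      </article>
    </body>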
Hats off to your detail and intelligence. I thoroughly enjoyed reading the post; it's very informative and engaging, and I'm actually applying the tips to see the results. I also found a platform called soovledotcom, which pulls keywords from Amazon, eBay, Yahoo Answers, Wikipedia, Google, and Bing, but your illustrations here will certainly yield superior results for organic SEO and finding keywords.
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]

Hi Brian, awesome content as ever! I’m very interested in your idea of creating an ‘uber’ resource list or expert roundup post, i.e. linking out to lots of other authorities in my niche within one post. But should you always create ‘no-follow’ links to these authority sites to prevent juice from passing to them? And similarly, if you sprinkle a few outbound authority links in other posts, should they all be ‘no-follow’, or do you think big G ignores ‘no-follow’ these days?

If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a Web agency or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index and understand your content.


I consulted a few years ago before Yahoo and CNET and my clients were all small businesses, even friends' sites. No matter the size of the project, you can still try to get some insight into your target audiences and what they need or want. I mentioned in a previous comment that I used Search once to determine sentiment on a site vs. its competitors by searching for a feature the site and its competitors all had, along with "like", "love", "hate", "wish", etc. I also took note of who the people were who said those things and where they were talking (forums, Twitter, etc.). It's a hacked manual approach, and although it's nowhere near the quality of a good market research report, at least I have a little bit of insight before going out to make site recommendations based solely on tags & links. If you're recommending the site build things that people want (and fix or remove things that they don't), you're more likely to gain links and traffic naturally.
Wow, I wish I had the comments you do. So you’re saying that re-visiting and re-writing old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
I second Rand's comment! Congrats on moving from the corporate world to the independent consultant. This is my goal for the near future. I too have been testing the waters of independent consulting, but it doesn't quite pay the bills yet! Sometimes I feel like I should find a mentor who has been where I am now and is where I want to go. Perhaps I'll find a few in this community over time!
A quick search for “SEO ranking factors” will give you all of these answers and myriad others. There is a lot of information out there. And the reality is, while there are likely hundreds of variables working together to determine final placement, much of what is suggested is guesswork. And certainly, not all ranking factors are relevant to every business.
You’re spot on, thanks again for sharing these terrific hacks. I remember you said in a video or post that you don’t write all the time. Right, that’s why you always deliver such valuable stuff. I have to tell you, Backlinko is one of my top 3 favorite resources. I’ve just uncovered SeedKeywords and Flippa. As LSI becomes more crucial, SeedKeywords seems like a tool worth considering.
Thanks so much for this entry, Laura! I loved the way your post is so practical, straightforward, newbie-friendly - and most importantly, how it emphasizes the bottom line at all times. It's easy to get "lost in the fog" of SEO with so many looming tasks and forget the main purpose, so it's wonderful to have a straightforward outline of what to do and why certain tasks need to be done. I look forward to reading your future insights!
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Hey Sammy, I would always advise against buying traffic, social followers, or anything else in that area. It mostly ends up being a vanity metric without business benefits. It’s always better to earn the traffic by creating a valuable, high-quality website and marketing it properly. When you do that, you attract the kind of visitors who are interested in what you have to offer, which is usually better for the bottom line.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
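To make that concrete, here is a hypothetical robots.txt sketch; the directory names are made up for the example, so adapt them to your own site:

    # BEFORE (problematic): these rules would hide the CSS and JavaScript
    # Googlebot needs to render the page, so it may not be recognized as
    # mobile-friendly.
    #
    #   User-agent: *
    #   Disallow: /assets/css/
    #   Disallow: /assets/js/

    # AFTER: only block what really shouldn't be crawled, and leave page
    # resources (CSS, JavaScript, images) reachable.
    User-agent: *
    Disallow: /admin/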
I have been trying to produce more content because I believed the lack of traffic was due to the small amount of content, but after reading your blog post, I’m beginning to doubt whether or not this is quality content. I will definitely do more research on influencers in my niche; now I have to figure out how to get their attention with my kind of content.
There are people all over the world using different social applications, and if you are active and perform well on them, you can increase visitors to your blog. Social media is key to increasing blog traffic. If you want more visitors, write quality content and share it on Facebook, Instagram, Twitter, and LinkedIn. If you are not able to write good content, hire someone who can write and post to social media on a daily basis; that way you don't have to worry about growing your blog's traffic yourself. Through social media, most people will first discover what your site is about, and your posts take them directly to your blog, where they can learn more and find other information that might help them.
As I had a teacher at school who was always really picky about how to draw conclusions, I must say that the conclusions you drew for your health site example might be true, but they are risky. For example: if slightly more women than men suffer from health conditions, it could be wise to write the information toward women. But if you take search behaviour into account, things could look a lot different: it might turn out that men search more than women, or that (senior) men are more present on the net than women.

We now have a dedicated SEO strategist who, among other things, develops 90-day plans for our websites. 90 days isn't long-term planning, but at least we have a strategic objective for the quarter. He also works closely with our UX team to identify the target audience - the crew that does the persona research and focus groups prior to the wireframe stage.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Moving into 2016, your website needs to be mobile-ready. There are three options Google accepts for a mobile site: responsive design, a separate mobile subdomain, or dynamic serving. Google also now ranks websites higher that apply SEO to their apps, so if you have an app, make sure you take the time to implement app SEO.
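If you choose responsive design, the core of it is a viewport meta tag plus CSS that adapts to the screen width. A bare-bones sketch (the class name is arbitrary):

    <!-- Responsive design: one URL, one set of HTML, CSS adapts to the device -->
    <head>
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <style>
        .content { width: 90%; margin: 0 auto; }

        /* Widen the layout on larger screens */
        @media (min-width: 768px) {
          .content { width: 720px; }
        }
      </style>
    </head>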

Thanks for the great post. I am confused about the #1 idea about Wikipedia dead links… it seems like you didn’t finish explaining what to do with the link once you found it. You indicated to put the dead link in Ahrefs, and you found a bunch of sites linking to it that you could contact… but then what? What do you contact them about, and how do you get your page as the replacement link? I’m obviously not getting something 🙁

A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
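For reference, breadcrumb structured data is commonly expressed as JSON-LD in the page's HTML; a minimal sketch with placeholder names and URLs might look like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home",
          "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Articles",
          "item": "https://www.example.com/articles/" },
        { "@type": "ListItem", "position": 3, "name": "How Breadcrumbs Help Navigation" }
      ]
    }
    </script>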
SEO is short for "search engine optimization." To have your site optimized for the search engines means to attempt to have top placement in the results pages whenever a specific keyword is typed into the query box. There are many search engine optimization services to choose from, so here are some things to keep in mind when seeking SEO services or developing an SEO strategy of your own.

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]


Google’s Gary Illyes sent this tweet on August 18, 2015, saying, “If you're an SEO and you're recommending against going HTTPS, you're wrong, and you should feel bad.” The “S” in HTTPS stands for secure, and if your URL leads with HTTPS (https://example.com) instead of HTTP (http://example.com), then the connection to your website is encrypted. Google wants you to move your site to HTTPS so badly that they are now giving a ranking boost to websites that are secure. As we move into 2016, we will see many new websites transferring to HTTPS.
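If you do migrate, every HTTP URL should permanently (301) redirect to its HTTPS counterpart. One common approach on an Apache server with mod_rewrite enabled looks roughly like this; adapt it to your own stack:

    # .htaccess sketch: send all HTTP traffic to HTTPS with a permanent redirect
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]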
The intent behind “SEO agency” is obvious… The searcher is looking for an SEO agency. Most of these searchers aren’t looking for life lessons from an SEO agency owner. Instead, they are just looking for the best SEO agency to get them more traffic and customers from Google. Plain and simple. I knew this when I created that page, but my SEO ego was too big.
I read your post on my mobile phone during a bus trip, and it stirred me, because lately I’ve been doing SEO the poor man’s way: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don’t know if any of these things still work today, since I’ve been practicing them since 2008. These 25 SEO tactics that you have shared got my interest, and I am actually planning to build a new site right now after reading this one. I realized that maybe I’ve been doing so much spamming that my site is still not ranking for my desired keywords. You have also pointed out that Keyword Planner is not the only way to find keywords, since there are other sources like, as you said, Wikipedia and the like. I am planning to use this article as my guide in starting the new one. I bookmarked it… honestly 🙂 And since I have read a lot of SEO tips on other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for years to come. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to communicate with you through email, and I hope you can coach me, Brian… please 🙂
If you haven't seen it already, check out the links in shor's comment below - there are some great resources in there. In some cases you can also consider surveying your current audience or customers through email, on-site surveys, or SurveyMonkey. Be sure to ask for some profiling information that you can use for determining specific persona needs, like age, sex, location, etc. (Probably best not to make it sound like a creepy text chat like I just did, though...) :)
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
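Whether hand-written or generated, the tag itself is a single line in the page's head; the store name and wording below are only an invented example:

    <head>
      <!-- Unique, page-specific summary that search engines may show as the snippet -->
      <meta name="description"
            content="Acme Baseball Cards: vintage and modern cards, collecting news, and price guides.">
    </head>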
#16 is interesting because no one really knows about it. A former colleague and I tested it about 4 years ago and published our results, which confirmed what you are saying. Since then I’ve been careful to follow this rule. The only issue is that often the exact keyword doesn’t “work” for navigation anchor text. But with a little CSS trickery one can get the code for the nav bar to sit lower in the source, prioritizing contextual links. I’ve also seen sites add links to 3-5 specific and important internal pages with keyword-rich anchor text at the very top of the page in order to get those important internal links indexed first.

Thanks for the very, very in-depth article. I am a real estate agent in Miami, Florida and have been blogging all-original content for the past 21 months on my website and watched traffic increase over time. I have been trying to grow my readership/leads/clients exponentially and have always heard about standard SEO backlink techniques and writing for my reader, not influencers. Recently, I have had a few of my articles picked up and backlinked by 2 of the largest real estate blogs in the country, which skyrocketed visits to my site. Realizing what I wrote about, that appealed to them, and now reading your article, I am going to continue writing in a way that will leverage those influencers to help me with quality backlinks.
For example, let’s say I have a health site. I have several types of articles on health, drug information, and information on types of diseases and conditions. My angle on the site is that I’m targeting seniors. If I find out seniors are primarily interested in information on prescription drug plans and cheap blood pressure medication, then I know that I want to provide information specifically on those things. This allows me to hone in on that market’s needs and de-prioritize or bypass other content.

Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
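If your mobile pages live on separate URLs (for example, an m. subdomain), one documented way to tie the two versions together is with rel=alternate and rel=canonical annotations; the hostnames below are placeholders:

    <!-- On the desktop page: point to the mobile version -->
    <link rel="alternate"
          media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">

    <!-- On the corresponding mobile page: point back to the desktop version -->
    <link rel="canonical" href="https://www.example.com/page">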
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
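If material is genuinely confidential, put it behind authentication rather than relying on robots.txt. As a rough sketch, on an Apache server you could protect a directory with basic authentication; the paths and names here are hypothetical:

    # .htaccess inside the protected directory (sketch only; requires an
    # .htpasswd file created separately, e.g. with the htpasswd utility)
    AuthType Basic
    AuthName "Restricted reports"
    AuthUserFile /home/example/.htpasswd
    Require valid-user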



Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes, used in order, create a hierarchical structure for your content, making it easier for users to navigate through your document.
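In markup terms, that hierarchy is simply heading levels used in order; a short illustration (the indentation is only for readability):

    <h1>Beginner's Guide to SEO</h1>
      <h2>How Search Engines Work</h2>
        <h3>Crawling</h3>
        <h3>Indexing</h3>
      <h2>On-Page Basics</h2>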


See the screenshot below for some of the sections for specific recommendations that you can add which will provide the meat of the document. Keep in mind this is a very flexible document – add recommendations that make sense (for example you may not always have specific design considerations for a project). Remember, it will be different every time you do it.