The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.

Spider-driven search engines such as Google®, Yahoo!® and MSN® use "robots" or "crawlers" to score websites across the Internet. Robots "spider" or "crawl" each site and score pages based on how relevant they are. A website's score or placement within a spider-driven search engine is derived from hundreds of variables, such as link popularity, density and frequency of keywords in page content, HTML code, site themes and more. You will want to focus on many criteria in your SEO strategy to position yourself well among the major search engines.


Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to visitors. It's been 60 days since the launch of the new site, and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords... but we haven't... yet!

My company has been working on a large link building project. We’ve already performed extensive keyword research and link analysis and now we’re considering executing an email outreach campaign. However, all the content we’ve created up until this point is geared more towards our target audience as opposed to the key influencers of our target audience. Do you think it would be worth it to try to build backlinks to our existing content or are we better off creating new content that directly appeals to the influencers of our target audience?
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
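As a rough sketch of that automation, the helper below builds a description meta tag from a page's text. The 155-character limit, the function names, and the sample text are illustrative assumptions, not a Google-documented requirement:

```python
# Sketch: auto-generating description meta tags for a large site.
# The 155-character budget is a common rule of thumb, not a spec.
import html

def make_meta_description(page_text: str, limit: int = 155) -> str:
    """Collapse whitespace, then truncate at a word boundary."""
    text = " ".join(page_text.split())
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit + 1)
    return text[: cut if cut > 0 else limit].rstrip()

def meta_tag(page_text: str) -> str:
    """Render the <meta> element, escaping quotes in the content."""
    return '<meta name="description" content="%s">' % html.escape(
        make_meta_description(page_text), quote=True
    )

tag = meta_tag("Fiberglass pools combine low maintenance "
               "with fast installation and a smooth finish. " * 5)
```

In a real pipeline you would feed each page's lead paragraph (or a templated summary of its structured data) into a function like this rather than truncating arbitrary body text.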
For example, let’s say I have a health site. I have several types of articles on health, drug information, and information on types of diseases and conditions. My angle on the site is that I’m targeting seniors. If I find out seniors are primarily interested in information on prescription drug plans and cheap blood pressure medication, then I know that I want to provide information specifically on those things. This allows me to home in on that market’s needs and de-prioritize or bypass other content.
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[41] in addition to its URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
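The XML Sitemap feed mentioned above is just a plain XML file listing your URLs. A minimal example (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

Once hosted (commonly at /sitemap.xml), the file can be submitted through Google Search Console so pages that are not well linked internally still get discovered.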
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]

For example, if a swimming pool business is trying to rank for "fiberglass pools" -- which is receiving 110,000 searches per month -- this short-tail keyword can be the one that represents the overarching topic on which they want to create content. The business would then identify a series of long-tail keywords that relate to this short-tail keyword, have reasonable monthly search volume, and help to elaborate on the topic of fiberglass pools. We'll talk more about these long-tails in the next step of this process.
Studies have shown that top placement in search engines generally provides a more favorable return on investment than traditional forms of advertising such as snail mail, radio commercials and television. Search engine optimization is the primary method of earning top-10 search engine placement. Learn more about the search engine optimization process and discuss an SEO strategy for your site by contacting a search engine specialist today.
5) Post at the right time. Let’s say you want to post in the r/Entrepreneur/ subreddit, but there’s already a post in the #1 spot with 200 upvotes, and it was posted 4 hours ago. If you post at that time, you probably won’t overtake that #1 spot, and you’ll get less traffic. However, if you wait a day, check back, and see that the new #1 spot only has 12-15 upvotes, you’ll have a golden opportunity. It will be much easier for you to hit the #1 spot and get hundreds of upvotes.
Schema.org is a type of markup that you can put in the code of your website. Using schema.org, you can tell Google which picture on your site is your logo, where your reviews are, where your videos are, what type of company you are, where you are located, and much more. Google has hinted over the last year that schema.org will help your website rank better in Google search. Recently, Google’s John Mueller said in a Google Hangout on Sept. 11 (at the 21:40 mark) that “over time, I think it [structured markup] is something that might go into the rankings as well.”
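As a rough sketch, schema.org data is commonly added as a JSON-LD block in a page's head; the company name, URL, and logo path below are hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co.",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/images/logo.png"
}
</script>
```

Other schema.org types (Review, VideoObject, LocalBusiness) follow the same pattern, which is what lets you point Google at your reviews, videos, and location.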
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
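In practice, nofollowing a user-added link is a one-attribute change; the URL and anchor text here are hypothetical:

```html
<!-- A link left in a blog comment: rel="nofollow" tells search
     engines not to pass this page's reputation to the target. -->
<a href="https://example.com/some-unvetted-site" rel="nofollow">great deals</a>
```

Most blog platforms can apply this attribute automatically to all links inside comment areas.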
Laura, great post. This touches something I wish more SEOs practiced: conversion optimization. I think most SEOs think of what they do as a service for clients, instead of a partnership with them. The end result should never be raw traffic, but value obtained through targeted, CONVERTING traffic. You make excellent points about market research, product input, content creation, and other functions many SEOs and SEMs neglect. More and more SEO providers focus only on assembly-line basics and worn-out techniques instead of challenging themselves to learn product marketing, usability, and conversion optimization. Your advice on market research is extremely valuable. Great start to a promising series. I look forward to more!

Just a suggestion, but maybe you could write an article about generating traffic to a brand-new blog. As you know, when you start out you have only a couple of posts and very little credibility with other bloggers, and the search engines will take considerable time to be of any benefit initially. It would be interesting to know how Brian Dean approaches that dilemma!
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt with Disallow: /*.html, but it’s been almost a year and I’ve found that Google’s robot still often crawls these pages. How can I quickly get Google to remove these pages completely? I have also removed these URLs in Google Webmaster Tools (Google Index → Remove URLs), but Google still crawls them.
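One likely cause here: robots.txt controls crawling, not indexing, and Google cannot see a removal signal on a page it is forbidden to fetch. A common approach, assuming the deleted URLs can still return a response, is to drop the Disallow rule and either serve a 404/410 status or a noindex directive:

```html
<!-- Served on each deleted page once the robots.txt Disallow: /*.html
     rule is removed, so Googlebot can fetch the page and see the
     directive that tells it to drop the URL from the index. -->
<meta name="robots" content="noindex">
```

With the Disallow rule in place, Googlebot never fetches the page, so it keeps the old copy in its index indefinitely.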
People all over the world use different social applications, and if you are active there, you can increase visitors to your blog. Social media is key to growing blog traffic. If you want more visitors, write quality content and share it on Facebook, Instagram, Twitter, and LinkedIn. If you are not able to write good content, you can hire someone to write it and post it to social media on a daily basis; that way you don't have to worry about growing your blog's traffic yourself. Through social media, people discover what your blog is about, and your posts take them directly to it, where they can learn more and find other information that might help them.

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
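A quick check is to make sure robots.txt is not disallowing the directories that hold your CSS, JavaScript, and images. The paths below are assumptions about a typical site layout, not required names:

```
# robots.txt — keep rendering resources crawlable so Googlebot
# can see the page the way a mobile browser would.
User-agent: Googlebot
Allow: /css/
Allow: /js/
Allow: /images/
```

The URL Inspection tool in Google Search Console will show which resources were blocked when it rendered a given page.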
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]

People searching for information on the internet are in a hurry; research shows that blog readers do not spend minutes on each link. They open a link, and only if they find good information there do they stay on your blog. So keep your blog updated and write content on a daily basis; that is an easy way to increase visitors to your blog.
I’ve just started blogging and there’s a ton of useful information here. I was wondering how to use reddit and you cleared that up for me, as well as when to post to social media. Quora I’m going to check out as I’ve never heard of them-thank you! In your opinion would you also deal with any of the free traffic generators to have people come and engage, or would you skip that step? Would you use meta tags, and if yes how? Thank you for your time and I look forward to hearing from you!

I’d add one thing to number 5: Writing good copy is crucial not just for your Title/snippet, but for your whole page, especially your landing page. You want people to stay on your page for a while and (hopefully) even navigate to other pages you have. Google looks at bounce rate and where they go after they hit your page. Learning to write good copy can not only increase conversion (if you’re selling something) but make your content more impactful and engaging. There are free books at most libraries or online to help.
Brian, great post as always! Question: Do you consider authority sites (industry portals) a form of “influencer marketing?” e.g. guest blogging, etc? In some niches there are not so many individuals who are influencers (outside of journalists) but there are sites that those in the industry respect. I am in the digital video space and for me one site is actually a magazine that is building a very strong digital presence. Thanks, keep up the good work!
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search," where the system pays more attention to each word in the query in order to match pages to the meaning of the query rather than to a few individual words.[39] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
Awesome tips Brian. Always enjoy your posts. My question is, how can I boost traffic significantly if my keyword has pretty low search volume (around 100 monthly searches based on keyword planner)? I’ve been trying to expand my keyword list to include broader terms like “customer experience” but as you know that is super competitive. Do you have any suggestions for me? Thanks in advance.
Great post. I know most experienced people read this stuff and think “I know that already”… but actually we tend to forget lots of things even though we know them, so it’s always good to read posts like these. What I liked most was the broken-link solution: not only creating a substitute for the broken link but actually going beyond that. I know some people do this as an SEO technique, but it’s actually also useful for the internet, as you repair broken links that others find elsewhere.