By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice; Google's new system, however, punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] With regard to the changes this made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from authors it can treat as 'trusted'.

Wow, I wish I had the comments you do. So you’re saying that re-visiting and re-writing old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.

You may not want certain pages of your site crawled because they might not be useful to users if they appear in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
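As a concrete illustration, here is a minimal sketch of how a crawler interprets such a file, using Python's standard-library `urllib.robotparser`; the rules and paths below are hypothetical examples, not taken from the guide above.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt; remember that each subdomain needs its own file.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler checks can_fetch() before requesting a URL.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # blocked
print(parser.can_fetch("*", "https://example.com/about/"))             # allowed
```

Note that blocking a page this way only stops compliant crawlers from fetching it; it does not guarantee the URL will never appear in search results.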


The best way to avoid this is proactively asking the right questions. Ask about resource support. Ask about historic roadblocks. Ask to be introduced to other players who otherwise hide behind an email here and there. Ask about the company's temperature regarding a bigger SEO strategy vs. short, quick-hit campaigns. Don't be your own biggest obstacle — I've never heard of anyone getting angry about over-communication unless it paralyzes progress.
“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
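The noindex tag mentioned in the quote is a standard robots meta tag placed in the page's head. Below is a hypothetical sketch of what a partner site's syndicated copy might look like; the URL is made up.

```html
<!-- Hypothetical syndicated copy hosted on a partner site -->
<html>
  <head>
    <!-- Keeps this duplicate copy out of search engine indexes -->
    <meta name="robots" content="noindex">
  </head>
  <body>
    <!-- The link back to the original article, as the quote recommends -->
    <p>Originally published at
       <a href="https://example.com/original-article/">example.com</a>.</p>
  </body>
</html>
```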
Thanks so much! Yes, this is a “mom blog” but I do give you blogging tips and how to make money blogging, as well as ways to increase blog traffic. I do tailor my content for mom bloggers though! Thanks for purchasing my course, Ready Set Blog for Traffic. It did go through a big update in late 2018, so I would jump in and check out the new module and video lessons on SEO in particular, and more on Pinterest marketing for RIGHT NOW!
Keywords. Keyword research is the first step to a successful SEO strategy. Those successful with SEO understand what people are searching for when discovering their business in a search engine. These are the keywords they use to drive targeted traffic to their products. Start brainstorming potential keywords, and see how the competition looks by using the Google AdWords Keyword Tool. If you notice that some keywords are too competitive in your niche, go with long-tail keywords (between two and five words), which will be easier for you to rank for. The longer the keyword, the less competition you will have for that phrase in the engines.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[29]

Hey Brian, this article is really, really awesome. Seriously, you have covered all 21 points, which I never read anywhere else on the internet. Everyone shares basics, but here you have shared awesome info, especially that Facebook keyword research and 1000+ words in a post, and the Wikipedia ideas are really good and helpful. Learned many things from this article. Keep sharing this kind of info, thanks!


This philosophy is beautiful in its simplicity, and it serves to correct the “more, more, more” mentality of link building. We only want links from relevant sources. Often, this means that in order to scale our link-building efforts beyond the obvious tactics, we need to create something that deserves links. You have links where it makes sense for you to have links. Simple.
Content gaps – make an inventory of the site’s key content assets: are they lacking any foundational/cornerstone content pieces, content types that don’t yet exist, or relevant topic areas that haven’t been covered? What topics or content do your competitors cover that you are missing? Can you beat your competitors’ information-rich content assets? Useful guides on Content Gap Analysis:
Great article, learned a lot from it! But I still don’t really get the Share Trigger and right-content part. For instance, the influencers now care a lot about the new Koenigsegg Agera RS >> https://koenigsegg.com/blog/ (car). I thought about an article like “10 Things You Need to Know About the Koenigsegg Agera RS”. The only problem is that I don’t know which keywords I should use and how I can put in Share Triggers.
Elna, I love it when pro bloggers write how-to posts that are highly highly valuable to their readers. This one is top notch … as you will see by how I share my NAME and blog with this comment. What a brilliant idea that I could never have thought of on my own EVER. This one is getting pinned all over the place. I love sharing content that really helps people.
Achievable: Make sure you're grounding your goal in reality. Sure, you can't control a massive Google update, but using the history of your sales and competitive data, you can make some inferences. You also need to make sure you have agreed-upon goals. Get buy-in before you set the goal in stone, leveraging the thoughts from the leaders, merchandisers, analysts, and anyone who might be able to provide insight into the likelihood of hitting your goal.
This was all free information I found online in less than an hour, that gives me some great ideas for content, partnerships and potential tools to build into my site to be relevant and useful to my target audience. Of course this is just some quick loose data, so I'll emphasize again: be careful where your data comes from (try to validate when possible), and think about how to use your data wisely.
Use the right anchor text. Using our previous example: if you wanted to internally link to the “how to make money” blog post, you could write a sentence in another post, like “Once you have mastered [how to make money], you can enjoy as much luxury as you can dream of.” In this case, the reader has a compelling case for clicking on the link because of both the anchor text (“how to make money”) and the context of the sentence. There is a clear benefit to clicking the link.
This post and the Skyscraper Technique changed my mind about how I approach SEO. I’m not a marketing expert and I haven’t ranked sites that monetize really well; I’m just a guy trying to get some projects moving, and I’m not even in the marketing business. I just wanted to say that the way you write makes the information accessible, even if you’re not a native English speaker, like myself.
Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
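As a small hypothetical sketch of the two link types, with descriptive anchor text rather than generic wording like “click here” (the URLs and page names are made up for illustration):

```html
<!-- Internal link: points to another page on the same site -->
<a href="/guides/keyword-research/">our keyword research guide</a>

<!-- External link: points to content on another site -->
<a href="https://www.example.org/seo-basics/">an introduction to SEO basics</a>
```

In both cases the anchor text itself tells the reader, and the search engine, what the destination page is about before the link is clicked.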
Hi! I really found this article to be valuable and helpful for improving our SEO techniques. But I am just wondering about the dead links: does that mean we can contact those who have dead links and offer to recreate the page? How does that improve the SEO of my own website? Can they add a citations or thank-you section that links to our website?

Meta tags. Meta tags still play a vital role in SEO. If you type any keyword into a search engine, you’ll see how that keyword is reflected in the title for that page. Google looks at your page title as a signal of relevance for that keyword. The same holds true for the description of that page. (Don't worry about the meta keywords tag -- Google has publicly said that it doesn't pay attention to it, since it has been abused by webmasters trying to rank for certain keywords.)
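A minimal, hypothetical head section showing the tags discussed above; the title and meta description are the ones search engines use, while the meta keywords tag is included only to show what can safely be ignored. All names and text here are made-up examples.

```html
<head>
  <!-- Shown as the clickable headline in search results; include the target keyword -->
  <title>How to Make Money Blogging: A Beginner's Guide</title>

  <!-- Commonly used for the snippet shown under the title in search results -->
  <meta name="description"
        content="A step-by-step guide to earning your first income from a blog.">

  <!-- Ignored by Google; there is no need to maintain it -->
  <meta name="keywords" content="make money blogging, blog income">
</head>
```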
It’s free to be active in online groups and on websites that are relevant to your business and community—and it helps you to obtain more traffic. Comment on blogs and social media posts, answer questions people are posting, and participate in conversations about your industry. The more you engage with your community, the more exposure and profile visits you get.
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
Hack #1: Hook readers in from the beginning. People have low attention spans. If you don’t have a compelling “hook” at the beginning of your blogs, people will click off in seconds. You can hook them in by teasing the benefits of the article (see the intro to this article for example!), telling a story, or stating a common problem that your audience faces.

Regarding internal linking, I believe that in the case of two links pointing to the same internal page, one of them being in the group I mentioned above, only the one which feeds the algorithm with more information will be considered. In sites that have the menu before the content, that will be the second link. I think that’s the smart way for them to analyze all the links to better understand the destination page’s content. And they are smart 😉 .
The goal of SEO is to earn a web page a high search engine ranking. The better a web page's search engine optimization, the higher a ranking it will achieve in search result listings. (Note that SEO is not the only factor that determines search engine page ranks.) This is especially critical because most people who use search engines only look at the first page or two of the results, so for a page to get high traffic from a search engine, it has to be listed on those first two pages; and the higher the rank, the closer a page is to the number one listing, the better. And whatever your web page's rank is, if your business sells products or services over the internet, you want your website to be listed before your competitors' websites.
WOW. I consider myself a total newbie to SEO, but I’ve been working on my Squarespace site for my small business for about 3 years and have read dozens of articles on how to improve SEO. So far, this has been the MOST USEFUL and information-packed resource I’ve found. I’m honestly shocked that this is free to access. I haven’t even completely consumed this content yet (I’ve bookmarked it to come back to!) but I’ve already made some significant changes to my SEO strategy, including adding a couple of infographics to blog posts, changing my internal and external linking habits, editing meta descriptions, and a bunch more. Thanks for all the time and passion you’ve put into this.
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[54] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[55]
#6 Go on podcasts! In 13 years of SEO and digital marketing, I’ve never had as much bang for the buck. You go on for 20 minutes, get access to a new audience and great natural links on high dwell time sites (hosts do all the work!). Thanks for including this tip Brian, I still don’t think the SEO community has caught on to the benefits of podcast guesting campaigns for SEO and more…it’s changed my business for sure.
On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords... but we haven't... yet!
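The 301 redirects mentioned above can be set up in several ways depending on the server. As one hypothetical sketch, assuming an Apache server with mod_alias enabled, a rule like this in .htaccess sends visitors and link equity from a removed page to its closest surviving equivalent (the paths are made up):

```apache
# Permanently redirect a removed page to its replacement,
# preserving inbound link value instead of blocking it via robots.txt.
Redirect 301 /old-removed-page.html https://www.example.com/new-page/
```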
On a dating niche site I took the ‘ego-bait’ post one step further and had girls perform a dance and strip to reveal the names of the major bloggers in my niche written on their bodies. As you can imagine, it got a lot of attention from the big players in my niche and from my audience, and it was a slightly more creative way of getting links, shares and traffic.
I read The Art of War, written by the Chinese general Sun Tzu (author of the quote above), in college. While his actual existence is debated, his work is often considered brilliant military strategy and philosophy. Thus, The Art of War is often co-opted into business for obvious reasons. Throughout the book, you'll realize tactics and strategy are not interchangeable terms.
On one specific project, one of the SEOs on my team was brought in during the wireframe stage. The entire product team held SEO-specific meetings every week to go over specific recommendations, taking them very seriously and leaning on every word our team said. We were thrilled. We were hailing their efforts, promising big wins for the relaunch, and even hyping up the launch and its projected SEO results in the company SEO newsletter.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it’s been almost a year and I’ve found that the Google robot still often crawls these pages. How can I quickly get Google to completely remove them? I have also removed these URLs from Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls these pages.