Another way to increase traffic to your website is to get listed in free online directories and review sites. For most of these sites, your profile will have a link to your website, so actively updating these listings and getting positive reviews is likely to result in more website traffic. In addition, many directories like Yelp have strong domain authority on Google. There’s a chance that your business’s free Yelp page could rank high for relevant searches.
Wow, I wish I had the comments you do. So you’re saying that revisiting and rewriting old posts garnered 111% more traffic? I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.
Great article as always. My wife is about to start a business teaching (mainly) Mums how to film and edit little movies of their loved ones for posterity (www.lovethelittlethings.com launching soon). We have always struggled with thinking of and targeting relevant keywords, because keywords like ‘videography’ and ‘family movies’ don’t really sum up what she is about. Your article ties in with other lessons we have come across: we obviously need to reach out to the right people and get them to share to get her product out there, because I don’t think purely focusing on keywords will get us anywhere.
Hats off to your detail and intelligence. I thoroughly enjoyed reading the post; it was very informative and engaging. I have actually been applying these ideas and seeing amazing results. I also found a platform called soovledotcom, which pulls keywords from Amazon, eBay, Yahoo Answers, Wikipedia, Google and Bing, but your illustrations here will certainly yield superior results for organic SEO and finding keywords.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site. 
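One common way to apply this is to automatically add the attribute to links in user-submitted comments before rendering them. The sketch below is a simplified illustration using a regex; a real comment pipeline should use a proper HTML sanitizer, since regexes miss edge cases.

```python
import re

def nofollow_links(comment_html: str) -> str:
    """Add rel="nofollow" to every <a> tag in user-submitted HTML.

    Simplified sketch only: production code should run comments
    through a real HTML sanitizer instead of a regex.
    """
    def add_rel(match):
        tag = match.group(0)
        if 'rel=' in tag:
            return tag  # leave any existing rel attribute alone
        return tag[:-1] + ' rel="nofollow">'

    return re.sub(r'<a\b[^>]*>', add_rel, comment_html)

print(nofollow_links('Nice post! <a href="http://spam.example">cheap pills</a>'))
# -> Nice post! <a href="http://spam.example" rel="nofollow">cheap pills</a>
```

This way, any link a commenter drops into your blog carries the nofollow hint, so your page's reputation is not passed on to sites you haven't vouched for.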

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]


In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings from the practice; Google's new system punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings.[37] Although Penguin was presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing falls under the newly recognized term 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[39] For content publishers and writers, Hummingbird is intended to resolve these issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
The world is mobile today. Most people search on Google using a mobile device, and the desktop version of a site might be difficult to view and use on one. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content[42] for ranking, parsing structured data, and generating snippets.
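One quick, rough signal of mobile readiness is whether a page declares a mobile viewport in its head. The sketch below checks a sample HTML string for that tag with Python's standard-library parser; it is only a first-pass check, not a substitute for a full mobile-friendliness audit.

```python
from html.parser import HTMLParser

class ViewportCheck(HTMLParser):
    """Scan HTML for a <meta name="viewport"> declaration, a common
    first signal that a page is built for mobile screens."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name", "").lower() == "viewport":
            self.has_viewport = True

# Sample page for illustration only.
page = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')
checker = ViewportCheck()
checker.feed(page)
print(checker.has_viewport)  # -> True
```

A page that lacks this tag will usually render as a zoomed-out desktop layout on phones, which is exactly the experience mobile-first indexing penalizes.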
So many great tips! There are a couple of things I’ve implemented recently to try and boost traffic. One is to make a PDF version of my post that people can download. It’s a great way to build a list. :) Another is to make a podcast out of my post; I can then take a snippet of it, place it on my Facebook page, and syndicate it. As far as video goes, I’ve started to create a video with just a few key points from the post. The suggestion about going back to past articles is a tip I’m definitely going to use, especially since long-form content is so important. Thanks!

Hi Brian! I enjoy reading your posts and use as much info as I possibly can. I build and sell storage sheds and cabins. The problem I have is that there are no top bloggers in my market, or Wikipedia articles with dead links that have to do with my market. 95% of my traffic and sales are generated via Facebook paid advertising. I would love to get more organic traffic and would be interested in your thoughts on this.
There are a few competitive tools we tend to gravitate towards in our industry. SEMrush is a fantastic tool that lets anyone look up a website and get an estimated search visibility and traffic share. Drilling in shows how well individual pages perform. Combing through exports can quickly reveal which topics are driving traffic, so you can replicate them or improve on them with your own version.
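Combing through an export usually means ranking keywords by their estimated traffic. The sketch below assumes a hypothetical CSV with "Keyword" and "Traffic" columns; real export formats vary by tool, so check the column names against your actual file.

```python
import csv
import io

# Hypothetical export for illustration; real tools use their own columns.
export = io.StringIO(
    "Keyword,Traffic\n"
    "storage sheds,1200\n"
    "shed plans,800\n"
    "garden cabins,450\n"
)

traffic_by_keyword = {}
for row in csv.DictReader(export):
    traffic_by_keyword[row["Keyword"]] = int(row["Traffic"])

# Rank keywords by estimated traffic to see what is driving visits.
for kw, hits in sorted(traffic_by_keyword.items(), key=lambda kv: -kv[1]):
    print(f"{kw}: {hits}")
```

Even a small script like this surfaces which topics carry most of a competitor's traffic, which is where replicating or improving content pays off first.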
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Every time I write new content I post it to twitter. If you use the right keywords and make your tweet interesting enough you can get a lot of clickthroughs just from people searching. For example if I write an article about SEO and Google I can tag the end of the tweet with #SEO #Google and anyone that searches for those keywords on Twitter can see my tweet about the post that I wrote. Be sure to write creative headlines for your posts so people feel the urge to click on them.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]

We often see posts on how to get blog topic ideas, or ideas for creating visuals, but nobody ever talks about new link building ideas. Some of the ways you showed here are completely new to me. You know what, I think you should write a post on how to come up with your own link building ideas: where to start, how to proceed, how to know an idea is foolproof. It surely takes lots of experiments, but the point is starting. I know it sounds weird, but I know you will come up with something 🙂
By the way, I was always under the impression that Digg and Delicious were dying, but I was really mistaken. Your (and Jason’s) thinking is foolproof, though. If these guys are already curating content, there’s no reason they wouldn’t want to do more of just that! SEO has become a lot of chasing and pestering; it’s good of you to remind us that there are people out there just waiting to share stuff, too. :)
Instead, in this instance, we started at the wireframe stage, plopping in keywords and meta tags. Of course, the site really needed those things, but although it launched technically “optimized”, that wasn’t enough to provide a better product than our top competitors: a product that people want to visit, revisit, email to friends, share on social networks, and link to more than our competitors’. It wasn’t even enough to move up in the rankings.

In addition to optimizing these six areas of your site, analyze your competitors and see what they are doing in terms of on-page optimization, off-page optimization (competitive link analysis) and social media. While you may be doing a lot of the same things they are, it’s incredibly important to think outside the box to get a leg up over the competition.
The best way to avoid this is proactively asking the right questions. Ask about resource support. Ask about historic roadblocks. Ask to be introduced to other players who otherwise hide behind an email here and there. Ask about the company's temperature regarding a bigger SEO strategy vs. short, quick-hit campaigns. Don't be your own biggest obstacle — I've never heard of anyone getting angry about over-communication unless it paralyzes progress.

Sending out regular newsletters and promoting offers through email is a great way to stay in touch with your customers, and it can also help drive traffic to your website. Provide useful information and links to pages on your website where readers can learn more, such as blog posts and landing pages for particular offers. Just make sure that you don’t continually bombard your readers with emails, or your customers will disengage, delete, or unsubscribe.