A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility.[48] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[48] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[49] or via 301 redirects, can help ensure that links to different versions of the URL all count towards the page's link popularity score.
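As a sketch of the canonicalization technique mentioned above: when the same page is reachable at several URLs, a `<link rel="canonical">` element in the page's `<head>` tells search engines which URL should receive the link credit (the domain and paths below are made up for illustration):

```html
<!-- Served at https://example.com/shoes?sort=price, /shoes?ref=nav, etc. -->
<head>
  <!-- All variants point search engines at one preferred URL -->
  <link rel="canonical" href="https://example.com/shoes">
</head>
```

The 301-redirect alternative achieves the same consolidation at the server level, by sending visitors and crawlers from the duplicate URLs to the preferred one.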
Awesome tips Brian. Always enjoy your posts. My question is, how can I boost traffic significantly if my keyword has pretty low search volume (around 100 monthly searches based on keyword planner)? I’ve been trying to expand my keyword list to include broader terms like “customer experience” but as you know that is super competitive. Do you have any suggestions for me? Thanks in advance.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
Thanks Brian. I’ve had an “a-ha” moment thanks to you! Great advice. I knew that backlinks would improve the organic SEO rankings of our client-targeted landing pages, but I never knew it was through getting influencers to backlink blogs. I always just assumed it was great content that users wanted to share with others. It was driving me mad that people love my content but never share it enough. Now I know!
Like many SEOs, I was hired with one vague responsibility: to set up an SEO program and achieve results. Like many SEOs, I jumped right in and started spewing out SEO audits, rewriting title tags, offering up link suggestions, rewriting URLs, and so on. And like many SEOs, I promised results. But what I didn’t do, until that fateful launch, was develop a comprehensive strategy.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
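To illustrate that last point: a robots.txt file is itself publicly fetchable, so every path listed in it is advertised to anyone who looks (the paths here are hypothetical):

```
# https://example.com/robots.txt — anyone can download this file
User-agent: *
Disallow: /private-reports/
Disallow: /admin/

# Well-behaved crawlers will skip these directories,
# but a curious user now knows exactly where to look.
```

For genuinely sensitive content, server-side authentication is the appropriate mechanism; for pages that should merely stay out of search results, a noindex directive works without revealing the URL structure in a public file.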
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
Hey Ted, thanks for the great questions! The peak times refer to your particular time zone, if you are targeting an audience that resides in the same zone as you. You can also use tools to find out when most of your audience is online. For example, Facebook has this built into their Page Insights. For Twitter, you can use https://followerwonk.com/. Many social posting tools also offer this functionality.
Relevancy is the first qualifier of a quality link opportunity. The next qualifying factor is the authority of the opportunity. Since Google doesn’t update PageRank (PR) anymore, you must rely on third-party metrics. I recommend you use Domain Authority (DA) from Open Site Explorer, Domain Rating (DR) from Ahrefs, or Trust Flow from Majestic to determine the quality of your link opportunities. You should use all three tools if you can.
I’d add one thing to number 5: Writing good copy is crucial not just for your title/snippet, but for your whole page, especially your landing page. You want people to stay on your page for a while and (hopefully) even navigate to other pages you have. Google looks at bounce rate and where visitors go after they hit your page. Learning to write good copy can not only increase conversion (if you’re selling something) but make your content more impactful and engaging. There are free books at most libraries or online to help.
If you create content that people enjoy, it can easily become popular or even go viral. The important thing is to put your website and content in front of people that are looking for it, right? Social bookmarking is a super easy way to do just that. Social bookmarking sites allow users to bookmark their favorite websites so that other people can publicly view and vote them up or down. If you bookmark useful content, other people will find it, share it, and vote it up so others can enjoy it. Oh yeah, and it only takes about 30 seconds to bookmark your site. The 3 most popular social bookmarking sites are Digg, Reddit, and Delicious. These 3 sites get over 8 MILLION unique visitors a month – funneling off a chunk of that traffic to your website is very doable. (There’s plenty to go around.) Just remember to create content that people will enjoy and/or find useful. The most popular content on social bookmarking sites is usually checklists, “Top 10” lists, tools & resources, and breaking news – so keep that in mind!
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
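A minimal sketch of that auto-generation idea, assuming each page's main content is available as plain text. The function names and the 155-character cutoff are illustrative choices, not a Google requirement:

```python
import html

def make_meta_description(page_text: str, max_len: int = 155) -> str:
    """Build a description value from a page's visible text.

    Collapses runs of whitespace, truncates at a word boundary,
    and escapes the result so it is safe inside an HTML attribute.
    """
    text = " ".join(page_text.split())  # collapse whitespace/newlines
    if len(text) > max_len:
        # Cut at the last full word that fits, then mark the truncation
        text = text[:max_len].rsplit(" ", 1)[0] + "…"
    return html.escape(text, quote=True)

def description_tag(page_text: str) -> str:
    """Render the complete meta tag for a page."""
    return f'<meta name="description" content="{make_meta_description(page_text)}">'
```

Running `description_tag` over each page template at build time gives every URL a distinct, content-derived description without hand-crafting millions of tags.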
On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a little bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords...but we haven't....yet!
Hi Brian, I am a young business owner who has had 4 different websites in the last 2 years, but none of them were as successful as I would have liked due to lack of SEO. Now I am in the process of starting another business, and I felt it was time for me to learn about SEO myself. I must say the information you have provided is invaluable and extremely helpful!! I am learning on the go and you are my biggest contributor. Thank you Sir!
Not sure exactly why, perhaps I used a number too big and since my page is about classifieds, it probably seemed too much to browse through 1500 ads, I assume? Somewhat like you would post 800 tips for better ranking? Don’t know, will try to change things a bit and see how it goes, but you really gave me some new suggestions to go for with this article. Thanks again 🙂

This post has been so helpful. It took me over an hour to read because I kept jotting down notes and getting lost in a loop of researching some points on google, haha. You always have such great in depth articles that really help me out as a new blogger. I have a small list of bloggers I follow for advice on blogging, but Twins Mommy is my favorite.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
Since heading tags typically make the text contained in them larger than normal text on the page, this is a visual cue to users that the text is important and could help them understand something about the type of content underneath the heading text. Using multiple heading sizes in order creates a hierarchical structure for your content, making it easier for users to navigate through your document.
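The hierarchy described above can be sketched as follows; the headings are placeholder text, and the indentation is only for readability (HTML ignores it):

```html
<h1>Guide to Growing Tomatoes</h1>      <!-- one top-level heading for the page -->
  <h2>Choosing a Variety</h2>           <!-- major section -->
    <h3>Cherry Tomatoes</h3>            <!-- subsection under the h2 -->
    <h3>Beefsteak Tomatoes</h3>
  <h2>Planting and Watering</h2>        <!-- next major section -->
```

Skipping levels (an h1 followed directly by an h4, say) weakens this structure, since the heading sizes no longer reflect a consistent outline of the content.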
I read your post on my mobile phone during a bus ride, and it stirred me, because lately I’ve been doing SEO the poor man’s way: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don’t know if any of these things still work today, since I’ve been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this one. I found out that maybe I’ve been doing a lot of spamming lately, and that is why my site is still not ranking for my desired keywords. And also, you have pointed out that Keyword Planner is not the only way to get keywords, since there are others, like Wikipedia and the like, as you have said. I am planning to make use of this article as my guide in starting a new one. I bookmarked it… honestly 🙂 And since I have read a lot of articles regarding SEO tips from other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for long years. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to communicate with you through email, and I hope you can coach me, Brian… please 🙂
Lastly, it's important to remember that paralysis by over-thinking is a real issue some struggle with. There's no pill for it (yet). Predicting perfection is a fool's errand. Get as close as you can within a reasonable timeframe, and prepare for future iteration. If you're working through your plan and discover a soft spot at any time, simply pivot. It takes many hours of upfront work to get your strategy built, but it's not too hard to tweak as you go.