When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
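In practice, that protection is just an attribute on the link itself. A minimal illustration (the comment text and URL below are made up):

```html
<!-- A user-submitted link in a blog comment: rel="nofollow" tells search engines
     not to pass this page's reputation on to the linked site -->
<p class="comment">
  Nice post! Check out
  <a href="https://example.com/some-page" rel="nofollow">my site</a> for more.
</p>
```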
Hack #1: Hook readers in from the beginning. People have low attention spans. If you don’t have a compelling “hook” at the beginning of your blog posts, people will click off in seconds. You can hook them in by teasing the benefits of the article (see the intro to this article for example!), telling a story, or stating a common problem that your audience faces.
The days when internet browsing was done exclusively on desktop PCs are long gone. Today, more people than ever before are using mobile devices to access the web, and if you force your visitors to pinch and scroll their way around your site, you’re basically telling them to go elsewhere. Ensure that your website is accessible and comfortably viewable across a range of devices, including smaller smartphones.
Another way to increase traffic to your website is to get listed in free online directories and review sites. For most of these sites, your profile will have a link to your website, so actively updating these listings and getting positive reviews is likely to result in more website traffic. In addition, many directories like Yelp have strong domain authority on Google. There’s a chance that your business’s free Yelp page could rank high for relevant searches.
For my AdSense plugin, which you can get here: https://wordpress.org/plugins/adsense-made-easy-best-simple-ad-inserter/, I’ve created a PRO version (https://www.seo101.net/adsense-made-easy-pro/) that is available to those who sign up for my mailing list. It’s not much, but it gets me 5 to 6 subscribers a day. And best of all, I know exactly what my subscribers are interested in… WordPress and AdSense :)
The expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users trust an article’s expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.
Hi, my name is Dimitrios and I am responsible for Crave Culinaire’s digital marketing. I would like to drive more traffic to Crave’s blog. Since Crave Culinaire is the only catering company that provides molecular cuisine, I thought about creating a blog post about that. The influencers in this niche have great success utilizing recipes on their blogs. I will share some recipes from Brian Roland, owner and head chef of Crave Culinaire.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
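As a rough sketch of that crawl-and-index loop (illustrative only, not how any real search engine is built; the seed URL and page limit are placeholders), the process looks roughly like this in Python:

```python
# Minimal spider/indexer loop: download a page, index its words, queue its links.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects outgoing links and visible words from one page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.lower().split())


def crawl(seed_url, max_pages=10):
    index = {}                 # word -> list of (url, position) pairs
    queue = deque([seed_url])  # the "scheduler": pages to crawl later
    seen = set()
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        # The "spider" downloads the page...
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        parser = PageParser()
        parser.feed(html)
        # ...the "indexer" records the words it contains and where they are located...
        for position, word in enumerate(parser.words):
            index.setdefault(word, []).append((url, position))
        # ...and every extracted link goes back into the scheduler.
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith(("http://", "https://")):
                queue.append(absolute)
    return index
```

Real crawlers add politeness delays, robots.txt handling, and per-word weighting, but the shape of the loop is the same.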
The Galactic Empire thought they could take over the galaxy with fear and brute force. They developed plans for a space station with firepower strong enough to destroy a planet. Under the command of Governor Tarkin, the Death Star was created. They tested the completed Death Star on Princess Leia's home planet of Alderaan, which gave Obi-Wan Kenobi shivers.
This was all free information I found online in less than an hour, and it gives me some great ideas for content, partnerships, and potential tools to build into my site to be relevant and useful to my target audience. Of course, this is just some quick, loose data, so I'll emphasize again: be careful where your data comes from (try to validate it when possible), and think about how to use your data wisely.

Influencers: Government Contracting Officers, Other GovCon (Government Contracting) consultants, Sellers of professional services for small businesses (certain CPAs, bonding companies, financial institutions, contract attorneys), large contracting firms (who need to hire small business subcontractors), Union/trade organizations, Construction and Engineering trade publications


I read The Art of War in college, written by the Chinese general Sun Tzu (author of the quote above). While his actual existence is debated, his work is often considered brilliant military strategy and philosophy. Thus, The Art of War is often co-opted into business for obvious reasons. Throughout the book, you'll realize tactics and strategy are not interchangeable terms.
It’s not enough to just share content through social channels – you need to actively participate in the community, too. Got a Twitter account? Then join in group discussions with relevant hashtags. Is your audience leaving comments on your Facebook posts? Answer questions and engage with your readers. Nothing turns people off quicker than using social media as a broadcast channel – use social media as it was intended and actually interact with your fans.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] Google held a similarly dominant market share in a number of other countries.

Expert roundups have been abused in the Internet Marketing industry, but they are effective for several reasons. First, you don’t have to create any content. The “experts” create all the content. Second, it is ego bait. Meaning, anyone who participated in the roundup will likely share it with their audience. Last, it is a great way to build relationships with influencers.
Ask a marketer or business owner what they’d like most in the world, and they’ll probably tell you “more customers.” What often comes after customers on a business’ wish list? More traffic to their site. There are many ways you can increase traffic on your website, and in today’s post, we’re going to look at 25 of them, including several ways to boost site traffic for FREE.
I read your post on my mobile phone during a bus ride, and it struck me because I’ve been doing SEO the poor man’s way lately: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don’t know if any of these things still work today, since I’ve been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this one. I realized that maybe I’ve been doing so much spamming lately that my site is still not ranking for my desired keywords. And you have also pointed out that Keyword Planner is not the only way to find keywords, since there are other sources like Wikipedia, as you said. I am planning to use this article as my guide in starting the new one. I bookmarked it… honestly… 🙂 And since I have read a lot of articles with SEO tips from other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for years to come. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to contact you through email and I hope you can coach me, Brian… please… 🙂
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.

SEO often involves improving the quality of the content, ensuring that it is rich in relevant keywords and organizing it by using subheads, bullet points, and bold and italic characters. SEO also ensures that the site’s HTML is optimized such that a search engine can determine what is on the page and display it as a search result in relevant searches. These standards involve the use of metadata, including the title tag and meta description. Cross linking within the website is also important.
There are many SEO tactics you can perform on each of your website pages to increase their rank in search engines and get more visitors. These include producing high-quality content that your audience is searching for and writing concise meta descriptions for your pages. The meta description appears below your URL in search results, and when users can tell from it what a page is about, they are much more likely to click through. On-page SEO tactics such as these are free, but they do take some time. For more help with on-page SEO, check out this blog post: Google Ranking Factors: On-Page vs Off-Page SEO.

Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different-sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
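For instance, a head section written with that advice in mind might look like this (the page topic and wording are hypothetical):

```html
<head>
  <title>Fresh Sourdough Baking Guide</title>
  <!-- Informative, specific, and short enough to be shown in full as a snippet -->
  <meta name="description"
        content="A step-by-step sourdough guide: building a starter, proofing the dough, and baking a crusty loaf at home, with fixes for dense or flat bread.">
</head>
```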
Excellent post Brian. I think the point about writing content that appeals to influencers is spot on. Could you recommend some good, manual strategies through which I can spot influencers in boring niches *B2B* where influencers are not really talking much online? Is it a good idea to rely on newspaper articles to get a feel for what a particular industry is talking about? Would love to hear your thoughts on that.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it’s been almost a year and I’ve found that the Google robot still often crawls these pages. How can I quickly get Google to completely remove these pages? I have also removed these URLs from Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls these pages.
“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
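A minimal robots.txt along those lines might look like this (the directory names are examples only, not a recommendation for any particular site):

```
# robots.txt, placed in the root directory of the domain
User-agent: *
Disallow: /cart/     # shopping-cart and other login-specific pages
Disallow: /search    # results from the site's internal search
```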
For example, let’s say I have a health site. I have several types of articles on health, drug information, and information on types of diseases and conditions. My angle on the site is that I’m targeting seniors. If I find out seniors are primarily interested in information on prescription drug plans and cheap blood pressure medication, then I know that I want to provide information specifically on those things. This allows me to hone in on that market’s needs and de-prioritize or bypass other content.
Good of you to compare SEO strategy to a battlefield. Tactics and strategy are two very confusing words when put together in the same boat. You are right. Every company relies on different strategies in the hope of the one and only outcome: SUCCESS. Every company should set realistic and achievable goals, but what about their vision? Companies stand out from others when they do things in a different and efficient manner. But I love your ideas and can’t wait to put my thinking hat on and start considering them.
Keywords. Keyword research is the first step to a successful SEO strategy. Those successful with SEO understand what people are searching for when discovering their business in a search engine. These are the keywords they use to drive targeted traffic to their products. Start brainstorming potential keywords, and see how the competition looks by using the Google AdWords Keyword Tool. If you notice that some keywords are too competitive in your niche, go with long-tail keywords (between two and five words), which will be easier for you to rank for. The longer the keyword, the less competition you will have for that phrase in the engines.
Fantastic information, extremely informative and highly valuable for anyone looking to increase website traffic. Our marketing team used a hybrid email marketing tool called EasySendy Pro for this, and we saw some vast improvement in our email open rate and click-through rate. Therefore, from my experience, I can confidently say that email marketing is very effective and also drives a good amount of traffic.
Attempting to replace a dead link with your own is easily and routinely identified as spam by the Wikipedia community, which expects dead links to be replaced with equivalent links at archive.org. Persistent attempts will quickly get your account blocked, and your website can be blacklisted (the Wikipedia blacklist is public, and there is evidence that Google uses it to determine rankings), which will have negative SEO consequences.
However, I feel that gathering all the things influencers share, filtering what’s relevant from what’s not… and ultimately niching it down to identify which exact type of content is hot in order to build our own is a bit fuzzy. Influencers share SO MUCH content on a daily basis. How exactly do you identify the topic base you’ll use to build great content that is guaranteed to be shared?
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to better match pages to the meaning of the query rather than to a few words.[39] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
Hi Brian, I am a young business owner who has had 4 different websites in the last 2 years, but none of them were as successful as I would have liked due to lack of SEO. Now I am in the process of starting another business and I felt it was time for me to learn about SEO myself. I must say the information you have provided is invaluable and extremely helpful!! I am learning on the go and you are my biggest contributor. Thank you, Sir!
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[32]
We expect advertisements to be visible. However, you should not let advertisements distract users or prevent them from consuming the site content. For example, avoid advertisements, supplementary content, or interstitial pages (pages displayed before or after the content you are expecting) that make it difficult to use the website.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
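A minimal sketch of that kind of automation (assuming pages are available as HTML strings; attribute escaping and smarter summarization are omitted for brevity):

```python
# Build a fallback description meta tag from a page's first paragraph (sketch only).
from html.parser import HTMLParser


class FirstParagraph(HTMLParser):
    """Collects the text of the first <p> element on the page."""
    def __init__(self):
        super().__init__()
        self.in_p, self.done, self.text = False, False, []

    def handle_starttag(self, tag, attrs):
        if tag == "p" and not self.done:
            self.in_p = True

    def handle_endtag(self, tag):
        if tag == "p" and self.in_p:
            self.in_p, self.done = False, True

    def handle_data(self, data):
        if self.in_p:
            self.text.append(data)


def description_tag(page_html, max_len=155):
    parser = FirstParagraph()
    parser.feed(page_html)
    summary = " ".join(" ".join(parser.text).split())  # collapse whitespace
    if len(summary) > max_len:
        summary = summary[:max_len].rsplit(" ", 1)[0] + "…"
    return f'<meta name="description" content="{summary}">'
```

Hand-written descriptions are still better where they are feasible; a generated one is only a fallback for the long tail of pages nobody will ever edit by hand.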