You’re spot on, and thanks again for sharing these terrific hacks. I remember you said in a video or post that you don’t write every day, and that’s why you always deliver such valuable stuff. I have to tell you, Backlinko is one of my top three favorite resources. I’ve just discovered SeedKeywords and Flippa. As LSI becomes more crucial, SeedKeywords seems like a tool worth considering.
Excellent post, Brian. I think the point about writing content that appeals to influencers is spot on. Could you recommend some good, manual strategies for spotting influencers in boring niches (B2B) where influencers aren’t really talking much online? Is it a good idea to rely on newspaper articles to get a feel for what a particular industry is talking about? Would love to hear your thoughts on that.
In the real world, it’s not so easy. For example, I have two niches where I’m trying to use your technique; by keyword, they are “software for moving” and “free moving quotes.” I have two websites, one for each: emoversoftware.com (emover-software.com originally; they link to each other) and RealMoving.com (for the latter keyword). To begin with, neither niche has Wikipedia articles, so your first suggestion won’t work. More generally, you advise getting backlinks (from authoritative sites, of course). But check this out: my site emover-software.com has only 4(!) backlinks (https://openlinkprofiler.org/r/emover-software.com#.VXTaOs9VhBc), and yet it’s listed at #1 (or #2) for my keywords (moving software, software for moving, software for moving company). RealMoving.com has more than 600 backlinks and is way down in the rankings (170 and below) for my keyword. Even though those sites face different competition, it still makes no sense! It doesn’t seem like Google cares about your backlinks at all. I also checked one of my competitors’ backlink profiles: more than 12,000 backlinks, yet his rank for keywords related to moving quotes is even worse than mine!

This philosophy is beautiful in its simplicity, and it serves to correct the “more, more, more” mentality of link building. We only want links from relevant sources. Often, this means that in order to scale our link-building efforts beyond the obvious tactics, we need to create something that deserves links. You have links where it makes sense for you to have links. Simple.


This is excellent and should be intuitive for marketers (and SEO pros are marketers!), but we often take shortcuts and neglect critical details. What would also reinforce the strategy is a way of providing solid projections for SEO (these could be based on industry trends and statistics). Clients now ask for ways to calculate ROI, and they need numbers to get budget approvals: an increase in traffic by X, an increase in qualified traffic and leads, conversions, and so on; some way of quantifying the expected return.


Holy Engagement! This was an awesome post, full of great info… and then I realized that 3/4 of the actual page was comments… which is even better for shares, SEO and overall engagement. I was lucky enough to attend an event where Neil Patel was giving some great blogging training and a lot of what you covered was there. https://www.thatbloggingthing.com/69-blogging-secrets-i-stole-from-neil-patel/ The simple fact that you comment back is awesome.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites receiving more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links “carry through,” such that website C, even though it has only one inbound link, has an inbound link from a highly popular site (B), while site E does not.
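The “links carry through” idea can be sketched as a simplified PageRank-style iteration. This is a minimal illustration, not Google’s actual algorithm, and the graph below is a hypothetical one constructed to mirror the diagram: several pages link to B, B links to C, and E’s only inbound link comes from an obscure page.

```python
# Simplified PageRank sketch: a link from a popular page is worth more
# than a link from an obscure one, so link value "carries through".
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline rank...
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        # ...and each page splits the rest of its rank among its out-links.
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical graph mirroring the diagram: A, D, and F link to B,
# B links to C, and E's only inbound link is from an obscure page G.
graph = {
    "A": ["B"], "D": ["B"], "F": ["B"],
    "B": ["C"],
    "G": ["E"],
}
ranks = pagerank(graph)
```

Running this, C outranks E even though both have exactly one inbound link, because C’s single link comes from the well-linked page B.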

Hack #1: Hook readers in from the beginning. People have low attention spans. If you don’t have a compelling “hook” at the beginning of your blogs, people will click off in seconds. You can hook them in by teasing the benefits of the article (see the intro to this article for example!), telling a story, or stating a common problem that your audience faces.

Attempting to replace a dead link with your own is easily and routinely identified as spam by the Wikipedia community, which expects dead links to be replaced with equivalent links at archive.org. Persistent attempts will quickly get your account blocked, and your website can be blacklisted (the Wikipedia blacklist is public, and there is evidence that Google uses it to determine rankings), which will have negative SEO consequences.
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
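As a minimal sketch of the idea, Python's built-in `http.server` lets you swap in a friendlier error page via the handler's `error_message_format` attribute. The template below is hypothetical; the `/popular/` link is just an illustrative placeholder for your own content.

```python
from http.server import BaseHTTPRequestHandler

# A friendlier 404 template: it names the error and guides the visitor
# back to the homepage and to popular content, rather than dead-ending.
FRIENDLY_404 = """\
<!DOCTYPE html>
<html>
<head><title>Page not found</title></head>
<body>
  <h1>Sorry, that page doesn't exist (%(code)d %(message)s)</h1>
  <p>Try the <a href="/">homepage</a> or our
     <a href="/popular/">most popular posts</a>.</p>
</body>
</html>
"""

class FriendlyHandler(BaseHTTPRequestHandler):
    # http.server fills in %(code)d, %(message)s, and %(explain)s
    # when it renders an error response.
    error_message_format = FRIENDLY_404
    error_content_type = "text/html;charset=utf-8"
```

On most real sites you would configure this in your web server or framework instead, but the principle is the same: the 404 response should point somewhere useful.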
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
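Both halves of this warning are easy to demonstrate with Python's standard-library robots.txt parser: a well-behaved crawler will honor the `Disallow` rules, but the file itself is plain text that anyone can read, so it advertises exactly the paths you hoped to hide. The paths below are hypothetical examples.

```python
import urllib.robotparser

# A hypothetical robots.txt that tries to "hide" sensitive directories.
# Note that the file itself publicly lists those directories.
robots_txt = """\
User-agent: *
Disallow: /private-reports/
Disallow: /admin/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler checks can_fetch() and skips the blocked paths...
blocked = parser.can_fetch("*", "https://example.com/private-reports/q3.pdf")
allowed = parser.can_fetch("*", "https://example.com/blog/")
```

A rogue crawler simply never calls `can_fetch()`, and a curious user can fetch `/robots.txt` directly, which is why access control (authentication, `noindex` headers) is the right tool for genuinely sensitive content.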
When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
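This distinction is easy to check with `urllib.parse`: the two homepage forms differ only in whether the path is empty or "/", while "/fish" and "/fish/" are genuinely different paths. The helper below is a small sketch, not a general URL-equivalence test.

```python
from urllib.parse import urlsplit

def is_same_homepage(a, b):
    """True when two URLs differ only by the optional root slash."""
    pa, pb = urlsplit(a), urlsplit(b)
    return ((pa.scheme, pa.netloc) == (pb.scheme, pb.netloc)
            and pa.path in ("", "/") and pb.path in ("", "/"))

# Homepage: trailing slash is cosmetic.
same = is_same_homepage("https://example.com/", "https://example.com")

# Inside the path: "/fish" and "/fish/" are distinct URLs.
distinct = (urlsplit("https://example.com/fish").path
            != urlsplit("https://example.com/fish/").path)
```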
A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility.[48] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[48] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[49] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
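The canonicalization step can be sketched as a normalization function: the same page is often reachable as http vs. https, with or without "www.", and with or without a trailing slash, and you want every variant to resolve (via `rel="canonical"` or a 301 redirect) to one canonical form. The specific rules below are illustrative assumptions, not a standard; real sites choose their own canonical scheme and host.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Collapse common variants of a URL onto one canonical form:
    https scheme, no "www." prefix, no trailing slash on a path."""
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    return urlunsplit(("https", host, path, parts.query, ""))

variants = [
    "http://www.example.com/fish/",
    "https://example.com/fish",
    "HTTPS://EXAMPLE.COM/fish/",
]
# All three variants map to the single canonical URL.
canon = {canonical_url(u) for u in variants}
```

Whatever rules you pick, the point is consistency: one canonical URL per page, with every variant redirecting or pointing to it.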
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]
This post and the Skyscraper Technique changed my mind about how I approach SEO. I’m not a marketing expert and I haven’t ranked sites that monetize really well; I’m just a guy trying to get some projects moving, and I’m not even in the marketing business. I just wanted to say that the way you write makes the information accessible, even if you’re not a native English speaker, like me.

Although this is a step-by-step series, everyone's methods will (and should) vary, so it really depends on how much time you think it will take (if you're billing hourly).  What tools do you have at your disposal vs. how much researching for information will you have to do on your own? Will you have to pay for research reports or companies? Do you pay a monthly service for data or research?
In a very crowded, noisy space – entrepreneurs and small business owners with a ton of “experts and influencers.” How do I get “above the noise?” I have built up a great brand and, I think, some great content based on a boatload of practical, real-life experience. I also have some products and services that I’m trying to sell, but I remain, “all dressed up, with no place to go.” Thoughts?
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam40, for example by using CAPTCHAs and turning on comment moderation.
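If your platform doesn't nofollow user-submitted links automatically, the rendering step is where you'd add it. Here is a minimal sketch; the function name and the `trusted` flag are hypothetical, and `rel="ugc"` is included alongside `nofollow` since Google also recognizes it for user-generated content.

```python
from html import escape

def render_comment_link(url, text, trusted=False):
    """Render a user-submitted link, marking it nofollow/ugc unless
    the commenter is one you're willing to vouch for."""
    rel = "" if trusted else ' rel="nofollow ugc"'
    return f'<a href="{escape(url, quote=True)}"{rel}>{escape(text)}</a>'

# Untrusted commenter: the link passes no reputation.
comment_html = render_comment_link("https://example.com/their-site", "great post!")
```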
You should build a website to benefit your users, and any optimization should be geared toward making the user experience better. One of those users is a search engine, which helps other users discover your content. Search Engine Optimization is about helping search engines understand and present content. Your site may be smaller or larger than our example site and offer vastly different content, but the optimization topics we discuss below should apply to sites of all sizes and types. We hope our guide gives you some fresh ideas on how to improve your website, and we'd love to hear your questions, feedback, and success stories in the Google Webmaster Help Forum1.
Wow, I wish I had the comments you do. So you’re saying that revisiting and rewriting old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That’s a lot of low-hanging fruit. Thanks for this one.
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][52] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[53] although the two are not identical.
However, I feel that batching all the things influencers share, filtering what’s relevant from what’s not, and ultimately niching it down to identify which exact type of content is hot in order to build our own is a bit fuzzy. Influencers share SO MUCH content on a daily basis. How exactly do you identify the topic base you’ll use to build great content that is guaranteed to be shared?
Use your keyword list to determine how many different pillar pages you should create. Ultimately, the number of topics for which you create pillar pages should coincide with how many different products, offerings, and locations your business has. This will make it much easier for your prospects and customers to find you in search engines no matter what keywords they use.
People love reading about results. That’s because it’s one of the best ways to learn. You can read information all day, but results show you the practical application of the information. Create content showing real life results. It’s easy in my industry because results are all that matter. But this can work in other industries as well. Here are some non-marketing examples:

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
