When Larry wrote about the kick in the proverbial teeth that eBay took from Google’s Panda update, we managed to secure a link from Ars Technica in the Editor’s Pick section alongside links to The New York Times and National Geographic. Not too shabby – and neither was the resulting spike in referral traffic. Learn what types of links send lots of referral traffic, and how to get them, in this post.
Excellent post Brian. I think the point about writing content that appeals to influencers is spot on. Could you recommend some good, manual strategies for spotting influencers in boring niches (B2B) where influencers aren't really talking much online? Is it a good idea to rely on newspaper articles to get a feel for what a particular industry is talking about? Would love to hear your thoughts on that.
My company has been working on a large link building project. We’ve already performed extensive keyword research and link analysis and now we’re considering executing an email outreach campaign. However, all the content we’ve created up until this point is geared more towards our target audience as opposed to the key influencers of our target audience. Do you think it would be worth it to try to build backlinks to our existing content or are we better off creating new content that directly appeals to the influencers of our target audience?
To gain more customer engagement, your website must reach its visitors efficiently. Obviously, you want visitors to read your site content, fill out your forms, and click through on your calls to action (CTAs) when they arrive on your page. These features set user engagement in motion, but it is just as essential to analyze that engagement in depth.
If you haven't seen it already, check out the links in shor's comment below – there are some great resources in there. In some cases you can also consider surveying your current audience or customers through email, on-site surveys, or SurveyMonkey. Be sure to ask for some profiling information that you can use for determining specific persona needs, like age, sex, location, etc. (Probably best not to make it sound like a creepy text chat like I just did, though...) :)
Although this is a step-by-step series, everyone's methods will (and should) vary, so it really depends on how much time you think it will take (if you're billing hourly). What tools do you have at your disposal, versus how much research will you have to do on your own? Will you have to pay for research reports or companies? Do you pay a monthly service for data or research?
A solid content marketing and SEO strategy is also the most scalable way to promote your business to a wide audience. And this generally has the best ROI, as there is no cost per click — so you are scaling your marketing without directly scaling your costs. This kind of SEO strategy is not right for every business, but when it is a good fit, it’s almost unbeatable.
Great article, learned a lot from it! But I still don't really get the share triggers and finding the right content. For instance, influencers right now care a lot about the new Koenigsegg Agera RS >> https://koenigsegg.com/blog/ (car). I thought about an article like “10 things you need to know about the Koenigsegg Agera RS”. The only problem is that I don’t know which keywords I should use and how I can put in share triggers.
Specifics: Be as specific as you can with your recommendations. For example, if you’re suggesting partnering with meal home delivery sites, find out which ones are going to provide the most relevant info, at what cost if possible, and what the ideal partnership would look like for content and SEO purposes. Even provide contact information if you can.
I am a little confused on your first point. Sorry if it is a simple one to understand and I’m just missing it. What good would finding dead links on Wiki do for my personal website? I thought you would explain how to find dead links faster within my own site… but it seems that your tip is way more valuable than that. I just don’t quite understand what I do to positively affect MY site with this. Any help would be great 🙂 THANKS!
Brian hello! First off I want to THANK YOU for this fantastic post. I can’t emphasize that enough. I have this bookmarked and keep going through it to help boost our blog. I totally nerded out on this, especially the LSI keywords which made my day. I know, pathetic, right? But when so much changes in SEO all the time, these kinds of posts are so helpful. So thanks for this. So no question – just praise, hope that’s ok 😁
Regarding internal linking, I believe that when two links point to the same internal page, with one of them in the group I mentioned above, only the link that feeds the algorithm more information will be counted. On sites where the menu comes before the content, that will be the second link. I think that's the smart way for them to analyze all the links and better understand the destination page's content. And they are smart 😉 .
I understand that some SEO agencies and departments are not built for big SEO campaigns. Strategic work takes time, and speeding (or scaling) through the development stage will likely do more harm than good. It's like cramming for a test — you're going to miss information that's necessary for a good grade. It would be my pleasure if this post inspired some change in your departments.
“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
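The random-surfer model above is commonly written as PR(p) = (1 - d)/N + d * Σ PR(q)/L(q), summed over the pages q that link to p, where d is a damping factor (typically around 0.85), N is the number of pages, and L(q) is the number of outbound links on q. Here is a minimal power-iteration sketch in Python; the four-page graph is made up, and this is a toy illustration of the published formula, not Google's implementation:

```python
# Toy PageRank via power iteration (illustrative only, not Google's code).
# Graph: page -> list of pages it links out to. The pages are hypothetical.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

d = 0.85                                  # damping factor (random-surfer model)
n = len(links)
pr = {page: 1.0 / n for page in links}    # start from a uniform distribution

for _ in range(50):                       # iterate until the scores settle
    new_pr = {}
    for page in links:
        # Sum the rank passed along by every page that links to `page`.
        inbound = sum(pr[q] / len(links[q]) for q in links if page in links[q])
        new_pr[page] = (1 - d) / n + d * inbound
    pr = new_pr

for page, score in sorted(pr.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.4f}")
```

Page "c", which has the most inbound links, ends up with the highest score: the "some links are stronger than others" effect described above.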
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
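To see how advisory the standard is, here is a small Python sketch using the standard library's robots.txt parser: a polite crawler consults the file and declines the fetch, but nothing stops a direct request from succeeding. The domain, the /private/ path, and the crawler name are all hypothetical placeholders:

```python
# Sketch: robots.txt is advice to polite crawlers, not access control.
# The domain and disallowed path below are hypothetical placeholders.
import urllib.robotparser
import urllib.request

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

url = "https://example.com/private/report.html"

if rp.can_fetch("MyPoliteCrawler", url):
    print("robots.txt allows this fetch")
else:
    # A well-behaved crawler stops here...
    print("robots.txt disallows this fetch; a polite crawler skips it")

# ...but the server will still answer anyone who simply ignores the file:
# urllib.request.urlopen(url) would return the page regardless.
```

The `can_fetch` check happens entirely on the client side, which is exactly why robots.txt cannot substitute for real access control on the server.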
Great post, your knowledge and innovative approach never fail to amaze me! This is certainly the first time I’ve heard someone suggest the Wikipedia dead link technique. It’s great that you’re getting people to think outside of the box. Sites like Reddit are great for finding keywords and can also be used for link building, although this can be difficult to get right. Even if you don’t succeed at using it for link building, it’s still a really valuable platform for getting useful information. Thanks!
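For anyone who wants to prospect for the Wikipedia dead link technique at scale, here is a rough Python sketch that queries the public MediaWiki search API for articles around a topic that contain dead-link markers. The topic keyword is a placeholder, and this is one possible starting point rather than the exact method from the post:

```python
# Sketch: find Wikipedia articles on a topic that contain dead-link markers,
# via the public MediaWiki search API. The topic keyword is a placeholder.
import json
import urllib.parse
import urllib.request

topic = "content marketing"   # hypothetical topic to prospect

params = urllib.parse.urlencode({
    "action": "query",
    "list": "search",
    # insource: searches the raw wikitext, where {{dead link}} tags live
    "srsearch": f'insource:"dead link" {topic}',
    "srlimit": "10",
    "format": "json",
})
url = f"https://en.wikipedia.org/w/api.php?{params}"

req = urllib.request.Request(url, headers={"User-Agent": "dead-link-prospector/0.1"})
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)

for hit in data["query"]["search"]:
    slug = urllib.parse.quote(hit["title"].replace(" ", "_"))
    print(f"https://en.wikipedia.org/wiki/{slug}")
```

Each result is a candidate page to check by hand for a dead citation you could offer to replace with a working resource of your own.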
Expert roundups have been abused in the Internet Marketing industry, but they are effective for several reasons. First, you don’t have to create any content. The “experts” create all the content. Second, it is ego bait. Meaning, anyone who participated in the roundup will likely share it with their audience. Last, it is a great way to build relationships with influencers.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, Google's market share in the UK was close to 90%, according to Hitwise.[67] A market share of that size has been recorded in a number of other countries as well.
On the other hand, I'd like to know how many people make up your new operation as an independent consultant. In fact, as others noted in the comments here, what you suggest is perfect for an in-house SEO situation or for a web marketing agency with at least 5–8 people. Even if everything you say is correct and is hopefully what everybody should do, I honestly find it quite difficult to dedicate that amount of time and attention to checking all the steps described in your post. Or, at least, I can't imagine myself doing it for all my clients.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
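As a rough illustration of the second mechanism, here is a small Python sketch that scans a page's HTML for a robots meta tag the way an indexer might; the sample HTML is made up:

```python
# Sketch: detecting a robots meta tag in a page, as an indexer might.
# The sample HTML below is made up for illustration.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives += attrs.get("content", "").lower().split(",")

html = """<html><head>
<meta name="robots" content="noindex, nofollow">
<title>Internal search results</title>
</head><body>...</body></html>"""

parser = RobotsMetaParser()
parser.feed(html)
directives = [d.strip() for d in parser.directives]

print("noindex" in directives)   # True: exclude this page from the index
print("nofollow" in directives)  # True: do not follow this page's links
```

Note the division of labor: robots.txt stops a compliant crawler from fetching a page at all, while the meta tag lets a fetched page opt out of the index.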
Firstly, a disclaimer – don’t spam Reddit and other similar sites hoping to “hit the jackpot” of referral traffic, because it’s not going to happen. Members of communities like Reddit are extraordinarily savvy to spam disguised as legitimate links, but every now and again, it doesn’t hurt to submit links that these audiences will find genuinely useful. Choose a relevant subreddit, submit your content, then watch the traffic pour in.
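If you do want to script a submission rather than use the site directly, the third-party PRAW library wraps Reddit's API. A minimal sketch, assuming you have registered an app for API credentials; the credentials, subreddit, title, and URL are all placeholders:

```python
# Sketch: submitting a link to a relevant subreddit with PRAW (pip install praw).
# Credentials, subreddit, title, and URL are all placeholders.
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
    user_agent="link-submitter/0.1 by YOUR_USERNAME",
)

subreddit = reddit.subreddit("bigseo")  # pick a genuinely relevant community
submission = subreddit.submit(
    title="A descriptive, non-spammy title for your content",
    url="https://example.com/your-genuinely-useful-post/",
)
print(submission.permalink)
```

Mind the disclaimer above: automating the mechanics doesn't change the rule that only genuinely useful links belong in these communities.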
Achievable: Make sure you're grounding your goal in reality. Sure, you can't control a massive Google update, but using your sales history and competitive data, you can make some inferences. You also need to make sure you have agreed-upon goals. Get buy-in before you set the goal in stone, drawing on input from leaders, merchandisers, analysts, and anyone else who might be able to gauge the likelihood of hitting your goal.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
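Since each scheme and host combination is served its own robots.txt, a crawler derives the file's location purely from the page's URL. A small Python sketch of that derivation, which is why a subdomain needs its own file (the URLs are hypothetical):

```python
# Sketch: each scheme + host combination gets its own robots.txt,
# so blocking pages on a subdomain needs a file on that subdomain.
from urllib.parse import urlparse

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL that governs the given page."""
    parts = urlparse(page_url)
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

# Hypothetical pages on a main domain and a subdomain:
print(robots_url("https://example.com/products/widget"))
# -> https://example.com/robots.txt
print(robots_url("https://blog.example.com/2014/06/post"))
# -> https://blog.example.com/robots.txt  (a separate file)
```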