A navigational page is a simple page on your site that displays the structure of your website, usually as a hierarchical listing of its pages. Visitors may turn to this page when they have trouble finding content on your site. While search engines will also visit this page, gaining good crawl coverage of your pages in the process, it's mainly aimed at human visitors.
While most of the links to your site will be added gradually, as people discover your content through search or other ways and link to it, Google understands that you'd like to let others know about the hard work you've put into your content. Effectively promoting your new content will lead to faster discovery by those who are interested in the same subject. As with most points covered in this document, taking these recommendations to an extreme could actually harm the reputation of your site.
#16 is interesting because no one really knows about it. A former colleague and I ran a test on it about 4 years ago and published our results, which concluded what you are saying. Since then I’ve been careful to follow this rule. The only issue is that oftentimes using the exact keyword does not “work” for navigation anchor texts. But with a little CSS trickery one can get the code for the nav bar to sit lower in the source, prioritizing contextual links. I’ve also seen sites add links to 3-5 specific and important internal pages with keyword-rich anchor texts at the very top of the page, in order to get those important internal links crawled first.
“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”
Let me tell you a story. Early in my tenure at Yahoo we tried to get into the site dev process in the early stages in order to work SEO into the Product Requirements Documents (PRDs) before wireframing began. But as a fairly new horizontal group not reporting into any of the products, this was often difficult. Nay, damn near impossible. So usually we made friends with the product teams and got in where we could.

Search engine spiders can only crawl text. They will use the content on your site to determine what your site is about, which in turn will help to decide how highly your site will be ranked for specific keyword phrases when visitors type them into the search engines. For this reason, keyword research is critical to obtaining natural search engine placement and should be at the top of your list when mapping out your SEO strategy.


Each organic search engine places emphasis on different ranking factors, such as design and layout, keyword density, and the number of relevant sites linking to a page. Search engines constantly update and refine their ranking algorithms in order to index the most relevant sites. Other variables that have an impact on search engine placement include the following:
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
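As a sketch of the difference this makes, compare two robots.txt configurations (the paths here are hypothetical): the first blanket-blocks an asset directory and prevents Googlebot from fetching the CSS and JavaScript it needs to render the page, while the second keeps page resources crawlable and only excludes genuinely private areas.

```
# Problematic: page resources can't be fetched, so rendering is incomplete
User-agent: *
Disallow: /assets/

# Better: explicitly keep resources crawlable; block only private areas
User-agent: *
Disallow: /admin/
Allow: /assets/
```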
Like you, I am a scientist, and like you did in the past, I am currently working on translating great scientific literature into tips. In my case it’s child development research into play tips for parents. I can already see that the outcome of my experiment is going to be the same as yours: great content, but who cares? I hadn’t even thought about my key influencers. I know some important ones, but don’t see how they would share my content. I thought I was writing content for my potential customers. Is your SEO That Works course the same as the content-that-gets-results course? Sorry if I sound a bit dim asking that question.
Although this is a step-by-step series, everyone's methods will (and should) vary, so it really depends on how much time you think it will take (if you're billing hourly).  What tools do you have at your disposal vs. how much researching for information will you have to do on your own? Will you have to pay for research reports or companies? Do you pay a monthly service for data or research?
Instead, in this instance, we started at wireframe stage, plopping in keywords and meta tags. Of course, the site really needed those things, and although it launched technically “optimized”, it wasn’t enough to provide a better product than our top competitor(s). A product that people want to visit, revisit, email to friends, share on social networks, and link to more than our competitors. It wasn’t even enough to move up in the rankings.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
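To illustrate how such rules are interpreted, Python's standard-library `urllib.robotparser` can parse a robots.txt body and answer crawl queries. A minimal sketch, using illustrative rules that block internal search results and shopping carts as described above:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: exclude internal search results and cart pages
rules = """
User-agent: *
Disallow: /search
Disallow: /cart
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Internal search results are disallowed for all crawlers
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))      # False
# Regular product pages remain crawlable
print(rp.can_fetch("*", "https://example.com/products/red-shoe"))   # True
```

A real crawler would fetch the live file with `rp.set_url(...)` and `rp.read()` rather than parsing a hard-coded string.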
Tip: Along with Delicious, I search on Scoop.it for similar opportunities. If they liked an article related to a year, say 2013, and you update the resource to 2014, chances are they’ll share it. Kind of a twist on your Delicious + Skyscraper technique. You don’t even have to make the content much different or better, just updated! Got some fantastic links recently because of it.
In a very crowded, noisy space – entrepreneurs and small business owners with a ton of “experts and influencers.” How do I get “above the noise?” I have built up a great brand and, I think, some great content based on a boatload of practical, real-life experience. I also have some products and services that I’m trying to sell, but I remain, “all dressed up, with no place to go.” Thoughts?
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]
You have also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now and I don’t think they’re worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that do come through, average time on page is like 0.02 seconds, compared to more than 2 minutes for other sources of traffic on my website. I have heard a few guys having a similar experience with Quuu, so I thought I should let you know.
This is truly amazing and I’m gonna share this with like-minded people. I loved the part about Flippa. What a great source to get ideas. Building links tends to be the hardest part, but a few good-quality links is all you need nowadays to get ranked. I currently rank for a very high-volume keyword with only 5 links, all with PR 3-4 and good DA and PA. Good links are hard to get, but you only need a few, which is encouraging! Props for this post!
Thanks for the great post. I am confused about the #1 idea about Wikipedia dead links… it seems like you didn’t finish what you were supposed to do with the link once you found it. You indicated to put the dead link in Ahrefs, and you found a bunch of sites for you to contact… but then what? What do you contact them about, and how do you get your page as the replacement link? I’m obviously not getting something 🙁