Attempting to replace a dead link with your own is easily and routinely identified as spam by the Wikipedia community, which expects dead links to be replaced with equivalent links at archive.org. Persistent attempts will quickly get your account blocked, and your website can be blacklisted (the Wikipedia blacklist is public, and there is evidence that Google uses it to determine rankings), which will have negative SEO consequences.
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. Because the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to trick the algorithms away from their intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.
Online marketing is a never-ending process: the harder you work at your blog, the more you benefit from it. Many visitors may arrive at your blog, but if they find that it does not give them useful information, they will stop coming back, and your traffic will decline.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings from the practice; Google's new system punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search," where the system pays attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words. With regard to search engine optimization for content publishers and writers, Hummingbird is intended to resolve these issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.
#16 is interesting because no one really knows about it. A former colleague and I ran a test on it about 4 years ago and published our results, which concluded what you are saying. Since then I’ve been careful to follow this rule. The only issue is that often the exact keyword does not “work” for navigation anchor texts. But with a little CSS trickery one can get the code for the nav bar to sit lower in the markup, prioritizing contextual links. I’ve also seen sites add links to 3-5 specific and important internal pages with keyword-rich anchor texts at the very top of the page, in order to get those important internal links indexed first.
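The CSS trick described above can be sketched roughly as follows. This is a hypothetical illustration, not taken from the comment's actual test: the class names and page structure are assumptions, and it uses flexbox `order` to display the nav at the top while keeping its markup after the main content, so crawlers reading source order encounter the contextual links first.

```html
<!-- Nav renders at the top visually, but its markup comes after the
     article body, so contextual links appear earlier in source order.
     All class names here are illustrative assumptions. -->
<body class="page">
  <main class="content">
    <p>Article text with a
       <a href="/important-page">keyword-rich contextual link</a>.</p>
  </main>
  <nav class="site-nav">
    <a href="/">Home</a>
    <a href="/about">About</a>
  </nav>
  <style>
    .page     { display: flex; flex-direction: column; }
    .site-nav { order: -1; } /* pull the nav above the content visually */
  </style>
</body>
```

Whether modern crawlers still weight source order this heavily is debatable, so treat this as a sketch of the technique being discussed rather than a recommendation.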