Attempting to replace a dead link with your own is easily and routinely identified as spam by the Wikipedia community, which expects dead links to be replaced with equivalent links at archive.org. Persistent attempts will quickly get your account blocked, and your website can be blacklisted (the Wikipedia blacklist is public, and there is evidence that Google uses it to determine rankings), which will have negative SEO consequences.
The idea of “link bait” refers to creating content so useful or entertaining that it compels people to link to it. Put yourself in the shoes of your target demographic and figure out what they would enjoy or what would help them the most. Is there a tool you can make to automate some tedious process? Can you find enough data to make an interesting infographic? Is there a checklist or cheat sheet that would prove handy to your audience? The possibilities are nearly endless – survey your visitors to see what is missing in your industry, and fill in the gaps.
I consulted a few years ago, before Yahoo and CNET, and my clients were all small businesses, even friends' sites. No matter the size of the project, you can still try to get some insight into your target audiences and what they need or want. I mentioned in a previous comment that I once used search to gauge sentiment on a site versus its competitors by searching for a feature the site and its competitors all had, along with "like", "love", "hate", "wish", etc. I also took note of who the people saying those things were and where they were talking (forums, Twitter, etc.). It's a hacked manual approach, and although not nearly as good as a proper market research report, it at least gave me a little insight before going out to make site recommendations based solely on tags and links. If you're recommending the site build things that people want (and fix or remove things that they don't), you're more likely to gain links and traffic naturally.
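The query-building part of that hacked manual approach is easy to script. A minimal sketch in Python (the site names, feature, and sentiment words are placeholder assumptions, not from the comment above) that prints search URLs for manual review:

```python
from urllib.parse import quote_plus

# Placeholder sites and feature -- swap in the real ones you're researching.
SITES = ["examplesite.com", "competitor-a.com", "competitor-b.com"]
FEATURE = "photo upload"
SENTIMENT_WORDS = ["like", "love", "hate", "wish"]

# Build one search URL per site/sentiment pair, then eyeball the results
# and note who is talking and where (forums, Twitter, etc.).
for site in SITES:
    for word in SENTIMENT_WORDS:
        query = f'{site} "{FEATURE}" {word}'
        print("https://www.google.com/search?q=" + quote_plus(query))
```

The output is just a list of searches to work through by hand – the judgment about sentiment and audience still has to be yours.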
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text instead if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
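A quick way to audit pages for missing or duplicated description meta tags is a small crawl script. A minimal sketch, assuming Python with the requests and BeautifulSoup libraries installed (the URLs are placeholders):

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs -- replace with pages from your own site.
PAGES = ["https://example.com/", "https://example.com/about"]

seen = {}  # description text -> first URL that used it
for url in PAGES:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("meta", attrs={"name": "description"})
    desc = tag["content"].strip() if tag and tag.get("content") else None
    if not desc:
        print(f"missing description: {url}")
    elif desc in seen:
        print(f"duplicate of {seen[desc]}: {url}")
    else:
        seen[desc] = url
```

Pages flagged as missing or duplicated are the ones where Google has no good per-page description to fall back on.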
Before you say it – no, true guest blogging isn’t dead, despite what you may have heard. Securing a guest post on a reputable site can increase traffic to your website and help build your brand into the bargain. Be warned, though – standards for guest blogging have changed radically during the past eighteen months, and spammy tactics could result in stiff penalties. Proceed with caution.
But I've recently heard some chatter voicing the polar opposite: the sentiment that we should wholly ignore certain data points because they don't represent the real person. To me, that's bad advice — directional data is better than the romantic notion of success based on your "gut" feel. Estimated search volume, clicks, and even impressions give credence not only to a keyword, but to a bigger theme. This starts to create direction and an understanding of need, which leads to your next few rounds of audience recognition.
What are the pain points? What things drive the members of this organization to drink? From customer support to the higher-ups, there are things that knock the company down. How do they get back up? What are the pains they're looking to work around? It may not be realistic to interview the whole company, but ideally you can get a representative to answer these.
This is excellent and should be intuitive for marketers (and SEO pros are marketers!), but we often take shortcuts and neglect critical details. What would also reinforce the strategy is a way of providing solid projections for SEO (these could be based on industry trends and statistics). Clients now ask for ways to calculate ROI, and they need numbers to get budget approvals: an increase in traffic by X, an increase in qualified traffic and leads, conversions, etc. – some way of quantifying the expected return.
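To make that quantification concrete, a projection can be as simple as multiplying an assumed traffic uplift through the funnel. A minimal sketch – every number below is a made-up assumption to be replaced with the client's own data, not an industry statistic:

```python
# Back-of-the-envelope SEO ROI projection (all inputs are assumptions).
monthly_visits = 20_000        # current organic traffic
projected_uplift = 0.30        # assumed 30% increase from the SEO work
conversion_rate = 0.02         # assumed visit-to-lead rate
value_per_conversion = 150.0   # assumed revenue per lead, in dollars

extra_visits = monthly_visits * projected_uplift
extra_conversions = extra_visits * conversion_rate
projected_monthly_return = extra_conversions * value_per_conversion

print(f"Extra visits/month:      {extra_visits:,.0f}")
print(f"Extra conversions/month: {extra_conversions:,.1f}")
print(f"Projected return/month:  ${projected_monthly_return:,.2f}")
```

Presenting the inputs explicitly like this also makes it easy for the client to challenge the assumptions rather than the arithmetic.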
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it’s been almost a year and I’ve found that Googlebot still often crawls these pages. How can I quickly get Google to completely remove these pages? I have also removed these URLs from Google webmaster tools via Google Index -> Remove URLs, but Google still crawls these pages.
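One note on that question: a robots.txt Disallow stops crawling but does not deindex pages Google already knows about; to drop out of the index, deleted pages generally need to return a 404 or 410 status, which Google can only observe if the URLs are not blocked. A minimal sketch for checking what the deleted URLs currently return, assuming Python with the requests library and placeholder URLs:

```python
import requests

# Placeholder URLs -- substitute the deleted .html pages in question.
REMOVED = [
    "https://example.com/old-page.html",
    "https://example.com/old-page-2.html",
]

for url in REMOVED:
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    ok = status in (404, 410)  # the signals Google needs to drop a page
    print(f"{status} {'OK' if ok else 'still serving content?'} {url}")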
Hello Brian, I am planning to start my blog soon and I'm in the preparation phase (investigating, learning, etc…). I have read a lot of books and posts about SEO, and I can say that this is the best post so far. It's not even a book, and you covered more than the books do. I would like to thank you for sharing your knowledge with me and the rest of the world; that's one of the most appreciated things that someone can do – even if you do it for your own “good”, you shared it! As soon as I start my site I'll write an article about you!!
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words. With regards to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content and rely on 'trusted' authors.
For example, we regularly create content on the topic of "SEO," but it's still very difficult to rank well on Google on such a popular acronym alone. We also risk competing with our own content by creating multiple pages that all target the exact same keyword -- and potentially the same search engine results page (SERP). Therefore, we also create content on conducting keyword research, optimizing images for search engines, creating an SEO strategy (which you're reading right now), and other subtopics within SEO.
The strength of your link profile isn’t solely determined by how many sites link back to you – it can also be affected by your internal linking structure. When creating and publishing content, be sure to keep an eye out for opportunities for internal links. This not only helps with SEO, but also results in a better, more useful experience for the user – the cornerstone of increasing traffic to your website.
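Spotting those internal-link opportunities can be semi-automated. A minimal sketch in Python – the posts directory, phrase, and target URL are placeholder assumptions – that flags published pages mentioning a phrase without yet linking to its target page:

```python
import pathlib
import re

# Placeholder setup: a folder of published HTML posts, plus a phrase you
# want to start linking to its canonical target page.
POSTS_DIR = pathlib.Path("site/posts")
PHRASE = "keyword research"
TARGET = "/blog/keyword-research"

for page in POSTS_DIR.glob("*.html"):
    text = page.read_text(encoding="utf-8")
    # Flag pages that mention the phrase but don't yet link to the target.
    if re.search(re.escape(PHRASE), text, re.IGNORECASE) and TARGET not in text:
        print(f"internal-link opportunity: {page}")
```

Each flagged page is a candidate for a contextual internal link – still worth a human pass so the links you add genuinely help the reader.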
Another way to increase traffic to your website is to get listed in free online directories and review sites. For most of these sites, your profile will have a link to your website, so actively updating these listings and getting positive reviews is likely to result in more website traffic. In addition, many directories like Yelp have strong domain authority on Google. There’s a chance that your business’s free Yelp page could rank high for relevant searches.