Let me tell you a story. Early in my tenure at Yahoo we tried to get into the site dev process early, in order to work SEO into the Product Requirements Documents (PRDs) before wireframing began. But as a fairly new horizontal group not reporting into any of the products, this was often difficult. Nay, damn near impossible. So usually we made friends with the product teams and got in where we could.
Since heading tags typically make the text they contain larger than normal text on the page, they give users a visual cue that this text is important and can help them understand something about the type of content beneath the heading. Multiple heading sizes, used in order, create a hierarchical structure for your content, making it easier for users to navigate through your document.
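That hierarchy is machine-readable as well as visual. As a rough illustration (a minimal sketch using Python's standard library, not any particular crawler's logic; the sample page is made up), you can walk a page's h1–h6 tags and print them indented by level to see the outline a well-structured page exposes:

```python
from html.parser import HTMLParser

HEADINGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

class HeadingOutline(HTMLParser):
    """Collects (level, text) pairs for every h1-h6 tag on a page."""

    def __init__(self):
        super().__init__()
        self.outline = []
        self._level = None  # set while we are inside a heading tag

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self._level = int(tag[1])  # "h2" -> 2

    def handle_endtag(self, tag):
        if tag in HEADINGS:
            self._level = None

    def handle_data(self, data):
        if self._level is not None and data.strip():
            self.outline.append((self._level, data.strip()))

# Hypothetical page fragment for illustration.
page = """
<h1>Healthy Eating</h1>
<h2>Breakfast</h2>
<h3>Oatmeal</h3>
<h2>Dinner</h2>
"""

parser = HeadingOutline()
parser.feed(page)
for level, text in parser.outline:
    print("  " * (level - 1) + text)
```

Run against a real page, a ragged or out-of-order outline here is often the first sign that the heading structure needs work.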
Content is king. That’s the saying, right? It’s true in a way. Your website is really just a wrapper for your content. Your content tells prospects what you do, where you do it, who you have done it for, and why someone should use your business. And if you’re smart, your content should also go beyond these obvious brochure-type elements and help your prospective customers achieve their goals.
Specifics: Be as specific as you can with your recommendations. For example, if you’re suggesting partnering with home meal delivery sites, find out which ones will provide the most relevant info, at what cost if possible, and what the ideal partnership would look like for content and SEO purposes. Even provide contact information if you can.
Nice work Laura! This is going to be a great series. I'm working my way through SEOmoz's Advanced SEO Training Series (videos) Vol. 1 & 2 to build upon the advice and guidance that you and your team provided to me during my time at Yahoo!. Now many others will benefit from your knowledge, experience and passion for SEO strategy and tactics. Best wishes for great success in your new role.
Tip: Along with Delicious, I search on Scoop.it for similar opportunities. If they liked an article tied to a year, say 2013, and you update the resource to 2014, chances are they’ll share it. Kind of a twist on your Delicious + Skyscraper technique. You don’t even have to make the content much different or better, just updated! Got some fantastic links recently because of it.
Hey Brian, I must say it’s awesome content you’re sharing. My question to you is: how did you transform from a nutrition expert to an SEO master? The two subjects are poles apart, so how did you learn SEO? Can you share your story? I find myself in a similar situation: I’m an engineer by profession and I’m starting an ecommerce business (the niche is apparel) with no experience whatsoever in blog writing or SEO. If you can point me to some resources where I can improve my skills, that would be a huge help.
For example, let’s say I have a health site. I have several types of articles on health, drug information, and information on types of diseases and conditions. My angle on the site is that I’m targeting seniors. If I find out seniors are primarily interested in information on prescription drug plans and cheap blood pressure medication, then I know that I want to provide information specifically on those things. This allows me to home in on that market’s needs and de-prioritize or bypass other content.
Regarding internal linking, I believe that when two links point to the same internal page, and one of them is in the group I mentioned above, only the link that feeds the algorithm more information will be counted. On sites that place the menu before the content, that will be the second link. I think that’s the smart way for them to analyse all the links and better understand the destination page’s content. And they are smart 😉.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
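A common approach to that automation (a simplified sketch, not any official recipe; the function name and sample text are made up) is to build the description from each page's own lead text, collapsing whitespace and truncating at a word boundary so the snippet stays within typical display limits:

```python
def auto_description(page_text, max_len=155):
    """Build a description meta tag value from a page's lead text.

    Collapses whitespace and truncates at a word boundary, keeping
    the result within a typical snippet display limit (~155 chars).
    """
    text = " ".join(page_text.split())
    if len(text) <= max_len:
        return text
    # Cut at the last full word that fits, then mark the truncation.
    cut = text[:max_len].rsplit(" ", 1)[0]
    return cut + "…"

# Hypothetical article lead, for illustration only.
article = (
    "Seniors comparing prescription drug plans can save hundreds of "
    "dollars a year by checking each plan's formulary, preferred "
    "pharmacies, and coverage-gap rules before the enrollment deadline."
)
print(f'<meta name="description" content="{auto_description(article)}">')
```

The key property is that each page gets a distinct, content-derived description rather than one boilerplate string repeated sitewide.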
Laura, great post. This touches something I wish more SEOs practiced: conversion optimization. I think most SEOs think of what they do as a service for, instead of a partnership with, clients. The end result should never be raw traffic, but value obtained through targeted, CONVERTING traffic. You make excellent points about market research, product input, content creation, and other functions many SEOs and SEMs neglect. More and more SEO providers focus only on assembly-line basics and worn-out techniques instead of challenging themselves to learn product marketing, usability, and conversion optimization. Your advice on market research is extremely valuable. Great start to a promising series. I look forward to more!
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One example was Google's February 2006 removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results pages.
On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site, and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords... but we haven't... yet!
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
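The advisory nature of robots.txt is easy to see in code: it is the crawler, not the server, that decides whether to honor the rules. A compliant client (sketched here with Python's standard `urllib.robotparser`; the rules and paths are made up) checks the file before fetching, but nothing stops a rogue client from skipping that check entirely:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, as a well-behaved crawler would read it.
# Note that it also advertises the very directory it tries to hide.
rules = """
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler consults the rules before each request...
print(parser.can_fetch("GoodBot", "/public/page.html"))       # True
print(parser.can_fetch("GoodBot", "/private/salaries.html"))  # False

# ...but enforcement ends there: a rogue client simply issues the
# HTTP request anyway, and the server will happily answer it.
```

For genuinely sensitive content, server-side access control (authentication, or at minimum noindex plus restricted permissions) is the appropriate tool; robots.txt is a politeness protocol, not a lock.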
This information hits the mark. “If you want your content to go viral, write content that influencers in your niche will want to share.” I love the information about share triggers too. I’m wondering, though, if you could share your insights on how influencers manage to build such vast followings. At some point, they had to start without the support of other influencers. It would seem that they found a way to take their passion directly to a “ready” world. Excellent insights. Thanks for sharing.
I read your post on my mobile phone during a bus ride, and it stirred me, because I’ve lately been doing SEO the poor man’s way: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don’t know if any of these things still work today, since I’ve been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this one. I realized that maybe I’ve been doing so much spamming that my site is still not ranking for my desired keywords. You have also pointed out that the Keyword Planner is not the only way to get keywords, since there are others, as you said, like Wikipedia and the like. I am planning to use this article as my guide in starting a new one. I bookmarked it… honestly… 🙂 And since I have read a lot of articles with SEO tips from other sites, I can compare them to your tactics, and these are more interesting and exciting. I want to build a quality site that can generate income for me for many years. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to reach you through email, and I hope you can coach me, Brian… please… 🙂
Think of it this way: the more specific your content, the more specific your audience's needs -- and the more likely you'll convert this traffic into leads. This is how Google finds value in the websites it crawls; pages that dig into the inner workings of a general topic are seen as the best answer to a person's query, and will rank higher.
In our research, both for ourselves and for our clients, there is a definite correlation between content longer than 1,000 words and better rankings. In fact, we are finding amazing ranking jumps with content over 3,000 words, about 12 original images (images not found anywhere else online), one H1 (not keyword-stuffed), 12 sub-headlines (H2), 12 relevant internal links, 6 relevant external links, and one bullet list. I know it sounds like a lot of work and a Big Mac recipe, but this does work.
Every website should have a content strategy focused on your top keywords. When you create content such as blog posts, videos, whitepapers, research reports, and webinars, it gives people something to link to. In addition, the content you create can rank on its own in the search engines. For example, if you write a blog post on “How to Pick an SEO Company,” there is a possibility it will rank for some of the keywords you use in the title and in the body of the post, especially if the post gets linked to from other websites or shared widely on social media. It also helps if your website as a whole already has significant high-quality links.