Not all strategies have to be big. Sometimes your window is small, and you're forced to build for a distinct (or tiny) opportunity. Maybe you don't have time for a proper large-scale strategy at all; a tactic or two might be all you can manage to eke out a win. Just make that very clear to your boss or client. Don't misrepresent what you're building as a full SEO campaign.
On the other hand, I'd like to know how many people your new setup as an independent consultant involves. In fact, as others noted in the comments here, what you suggest is perfect for an in-house SEO team, or for a web marketing agency with at least 5-8 people. Even if everything you say is correct, and hopefully what everybody should do, I honestly find it quite difficult to dedicate the amount of time and attention needed to complete all the steps described in your post. Or, at least, I can't imagine myself doing it for every client.
This toolbar is based on the LRT Power*Trust metric that we’ve been using to identify spammy and great links in LinkResearchTools and Link Detox since 2012; the free browser toolbar was just recently launched. It helps you promptly evaluate the power and trustworthiness of a website or page as you browse, far more accurately than Google PageRank ever did.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, following links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random web surfer.
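The random-surfer idea described above can be sketched with a few lines of code. This is a minimal, illustrative power-iteration version of PageRank; the tiny example graph and the damping factor (0.85, the value commonly cited for the original algorithm) are assumptions for demonstration, not details from this article.

```python
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Sum the rank "carried through" by every page linking to p,
            # split evenly among that linking page's outbound links.
            inbound = sum(rank[q] / len(links[q])
                          for q in pages if p in links[q])
            # (1 - d) models the surfer randomly jumping to any page.
            new_rank[p] = (1 - d) / n + d * inbound
        rank = new_rank
    return rank

# Hypothetical graph: A and D both link to B; B links to C; C links to A.
graph = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["B"]}
ranks = pagerank(graph)
# B, with two inbound links, ends up with the highest score; C benefits
# indirectly because its single inbound link comes from B.
```

Note how quantity and strength both matter: C has only one inbound link, yet it outranks D because that link comes from the strongest page in the graph.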
This is excellent and should be intuitive for marketers (and SEO pros are marketers!), but we often take shortcuts and neglect critical details. What would also reinforce the strategy is a way of providing solid projections for SEO (these could be based on industry trends and statistics). Clients now ask for ways to calculate ROI, and they need numbers to get budget approvals: an increase in traffic by X, an increase in qualified traffic and leads, conversions, etc. Some way of quantifying the expected return.
Hello Brian, I am planning to start my blog soon and I'm in the preparation phase (investigating, learning, etc…). I have read a lot of books and posts about SEO, and I can say that this is the best post so far. It's not even a book, and you covered more than the books do. I would like to thank you for sharing your knowledge with me and the rest of the world; that's one of the most appreciated things someone can do. Even if you do it for your own “good,” you shared it! As soon as I launch my site, I'll write an article about you!!
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites getting more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks higher in a web search. And the links "carry through": website C, even though it has only one inbound link, benefits because that link comes from a highly popular site (B), while site E does not.
Wow. This is really great stuff. I just stumbled across this on Pinterest (proving the power of that venue!). I have been blogging for over 10 years and listen to Gael and Mark too, so most of it was just a refresher for me. But your list was exceptionally well written, complete, and compelling. I actually thought pretty hard to come up with something so I could say, “Hey, I got one more thing for you,” and came up blank. So kudos to you!
Brian hello! First off I want to THANK YOU for this fantastic post. I can’t emphasize that enough. I have this bookmarked and keep going through it to help boost our blog. I totally nerded out on this, especially the LSI keywords which made my day. I know, pathetic, right? But when so much changes in SEO all the time, these kinds of posts are so helpful. So thanks for this. So no question – just praise, hope that’s ok 😁
Fortunately, Google puts more weight on the anchor text of external links anyway. So as long as some of your external links have your target anchors, you’re probably OK with a “Home” button. In fact, I’ve ranked homepages with a “Home” anchor text nav button for some seriously competitive terms. So it’s not a make-or-break ranking signal by any means.
It’s rare to come across new SEO tips worth trying. And this post has tons of them. I know that’s true BECAUSE…I actually read it all the way to the end and downloaded the PDF. What makes these great is that so many are little multi-step strategies, not just the one-off tactics that clients often stumble across and ask whether they're truly good for SEO. But there are also some nice one-off tips that I can easily start using without ramping up a new project.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search,' where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words. For content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content and rely on 'trusted' authors.
Lastly, it's important to remember that paralysis by overthinking is a real issue some struggle with. There's no pill for it (yet). Predicting perfection is a fool's errand. Get as close as you can within a reasonable timeframe, and prepare for future iteration. If you're working through your plan and discover a soft spot at any time, simply pivot. Building your strategy takes many hours of upfront work, but it's not too hard to tweak as you go.