A quick search for “SEO ranking factors” will give you all of these answers and myriad others. There is a lot of information out there. And the reality is, while there are likely hundreds of variables working together to determine final placement, much of what is suggested is guesswork. And certainly, not all ranking factors are relevant to every business.
There's often a top-down marketing strategy already baked in before you get to pitch your SEO work, so you may find yourself looking for opportunity on a battlefield where access hasn't been granted. It's reckless to assume you can walk into any established company and lob a strategy into their laps, expecting them to follow it without regard for their existing plans, politics, and red tape. Candidly, this may be the quickest way to get fired and to show you're not aligned with the existing business goals.
Thanks Brian. I’ve had an “a-ha” moment thanks to you! Great advice. I knew that backlinks would improve the organic SEO rankings of our client-targeted landing pages, but I never knew it was through getting influencers to link back from their blogs. I always just assumed it was great content that users wanted to share with others. It was driving me mad that people love my content but never share it enough. Now I know!
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings from the practice; Panda introduced a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings.[37] Although Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[39] With regard to search engine optimization, for content publishers and writers Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.

What are the pain points? What things drive the members of this organization to drink? From customer support to the higher-ups, there are things that knock the company down. How do they get back up? What are the pains they're looking to work around? It may not be realistic to interview the whole company, but ideally you can get a representative to answer these.


Imagine that you've created the definitive Web site on a subject -- we'll use skydiving as an example. Your site is so new that it's not even listed on any SERPs yet, so your first step is to submit your site to search engines like Google and Yahoo. The Web pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Even with the best information about skydiving on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up going to inferior Web sites because yours isn't in the top results.

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
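To make that concrete, a description meta tag lives in the page's head. Here's a minimal sketch; the site name and copy are made up for illustration:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Brandon's Baseball Cards - Buy Cards, Trading Tips</title>
    <!-- The description should accurately summarize this specific page;
         Google may use it as the snippet shown under your result. -->
    <meta name="description"
          content="Brandon's Baseball Cards: vintage and modern cards, price guides, and trading tips for collectors.">
  </head>
  <body>
    ...
  </body>
</html>
```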
I consulted a few years ago, before Yahoo and CNET, and my clients were all small businesses, even friends' sites.  No matter the size of the project, you can still try to get some insight into your target audiences and what they need or want.  I mentioned in a previous comment that I once used search to determine sentiment on a site vs. its competitors by searching for a feature the site and its competitors all had, along with "like", "love", "hate", "wish", etc.  I also took note of who the people were who said those things and where they were talking (forums, Twitter, etc.).  It's a hacked manual approach, and although nowhere near the quality of a good market research report, at least I have a little bit of insight before going out to make site recommendations based solely on tags & links.  If you're recommending the site build things that people want (and fix or remove things that they don't), you're more likely to gain links and traffic naturally.
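If you want to semi-automate building those queries, here's a tiny sketch in Python; it just prints query strings to paste into a search engine, and every value in it is a made-up placeholder:

```python
# A rough sketch of the manual sentiment search described above: combine
# product names with a shared feature and emotion words. The brands,
# feature, and emotion words below are all hypothetical placeholders.
from itertools import product

brands = ["ExampleApp", "RivalApp"]          # hypothetical site/product names
feature = "saved searches"                   # a feature all of them share
emotions = ["love", "hate", "like", "wish"]

for brand, emotion in product(brands, emotions):
    print(f'{brand} "{feature}" {emotion}')
```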
It’s an awesome post, one I like the most, and I’m commenting here for the first time. I’m Abhishek, founder of CouponMaal, and I want to know more about one of the points you made above: relaunching your old posts. What I want to know is whether there's any difference between changing the date, time, and year when we relaunch an old post, or whether we should relaunch the old post with the previous date, time, and year. I mean, does it matter or not?
On a dating niche site I took the ‘ego-bait’ post one step further and had sexy girls perform a dance and strip to reveal the names of the major bloggers in my niche written on their bodies. As you can imagine, it got a lot of attention from the big players in my niche and from my audience, and it was a slightly more creative way of getting links, shares and traffic.
Google has recently changed how you can use the Google Keyword Planner. Previously, everyone who signed up could see exact search volumes for keywords; now it only shows ranges of estimates. There is a way to get around this: create a Google AdWords campaign. The amount you spend doesn't matter. After you do that, you will regain access to the exact search volumes.
For example, if a swimming pool business is trying to rank for "fiberglass pools" -- which is receiving 110,000 searches per month -- this short-tail keyword can be the one that represents the overarching topic on which they want to create content. The business would then identify a series of long-tail keywords that relate to this short-tail keyword, have reasonable monthly search volume, and help to elaborate on the topic of fiberglass pools. We'll talk more about these long-tails in the next step of this process.
Ask a marketer or business owner what they’d like most in the world, and they’ll probably tell you “more customers.” What often comes after customers on a business’ wish list? More traffic to their site. There are many ways you can increase traffic on your website, and in today’s post, we’re going to look at 25 of them, including several ways to boost site traffic for FREE.
Maybe this feels a bit too scattershot for you. Buzzsumo also allows you to find and observe influencers. What are they sharing? By clicking the "view links shared" button, you'll get a display of all the unique pages shared. Sometimes "influencers" share all types of varying content across many topics, but sometimes they're pretty specific in the themes they share. Look for the latter in this competitive research stage.
Achievable: Make sure you're grounding your goal in reality. Sure, you can't control a massive Google update, but using the history of your sales and competitive data, you can make some inferences. You also need to make sure you have agreed-upon goals. Get buy-in before you set the goal in stone, leveraging the thoughts from the leaders, merchandisers, analysts, and anyone who might be able to provide insight into the likelihood of hitting your goal.
5) Post at the right time. Let’s say you want to post in the r/Entrepreneur/ subreddit, but there’s already a post in the #1 spot with 200 upvotes, and it was posted 4 hours ago. If you post at that time, you probably won’t overtake that #1 spot, and you’ll get less traffic. However, if you wait a day, check back, and see that the new #1 spot only has 12-15 upvotes, you’ll have a golden opportunity. It will be much easier for you to hit the #1 spot and get hundreds of upvotes.
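If you'd rather script that check than eyeball it, here's a rough sketch using the PRAW library (pip install praw). The credentials are placeholders, and the 20-upvote "easy to overtake" threshold is an assumption for illustration, not a rule:

```python
# Check whether the current #1 post in a subreddit looks beatable
# before you post. Subreddit and threshold are illustrative only.
import time

import praw  # pip install praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="post-timing-check by u/yourname",
)

# Skip stickied mod posts; they sit at the top regardless of votes.
posts = [p for p in reddit.subreddit("Entrepreneur").hot(limit=5) if not p.stickied]
top = posts[0]
age_hours = (time.time() - top.created_utc) / 3600

if top.score < 20:  # assumed "easy to overtake" threshold
    print(f"Good window: #1 post has only {top.score} upvotes ({age_hours:.1f}h old).")
else:
    print(f"Maybe wait: #1 post has {top.score} upvotes ({age_hours:.1f}h old).")
```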

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
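For example, a minimal robots.txt blocking the kinds of pages mentioned above might look like this; the paths are illustrative and should match your own site's URL structure:

```
# robots.txt — placed at the root of the domain
User-agent: *          # applies to all crawlers
Disallow: /cart/       # shopping-cart pages
Disallow: /search      # internal search results (per Google's 2007 guidance)
Disallow: /account/    # user-specific content
```

Note that robots.txt only controls crawling, not indexing; a page blocked this way can still be indexed if other sites link to it, which is why the <meta name="robots" content="noindex"> tag is the more reliable way to keep an individual page out of the index.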
I definitely learned tons of new things from your post. The post is old, but I didn’t get the chance to read all of it earlier. I’m totally amazed that these things actually exist in the SEO field. What I liked most were the dead-links scenario on Wikipedia, the Flippa thing, the Reddit keyword research, and, last of all, the Facebook ad keyword research. It's like Facebook is being trolled into providing us keywords while thinking it's promoting ads.

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
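As a sketch of what that automation might look like, here's one way to derive a description from a page's first paragraph. It assumes the BeautifulSoup library is installed, and the ~155-character budget is a common rule of thumb rather than a Google-documented limit:

```python
# A minimal sketch of auto-generating description meta tag values from
# page content, as suggested above. Assumes bs4 (pip install beautifulsoup4).
from bs4 import BeautifulSoup

MAX_LEN = 155  # rough snippet length before search engines truncate

def generate_description(html: str) -> str:
    """Build a description meta tag value from the page's first paragraph."""
    soup = BeautifulSoup(html, "html.parser")
    first_p = soup.find("p")
    text = " ".join(first_p.get_text().split()) if first_p else ""
    if len(text) > MAX_LEN:
        text = text[:MAX_LEN].rsplit(" ", 1)[0] + "…"  # cut at a word boundary
    return text

# Hypothetical page content, just to show the output format.
page = ("<html><body><p>Fiberglass pools install faster than concrete ones "
        "and need less maintenance over their lifetime.</p></body></html>")
print(f'<meta name="description" content="{generate_description(page)}">')
```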


Social media. The algorithms have truly changed since social media first emerged. Many content websites are community-oriented: Digg began allowing users to vote on which stories make the front page, and YouTube factors views and user ratings into its front-page rankings. E-commerce stores must therefore establish a strong social media presence on sites like Facebook, Pinterest, Twitter, etc. These social media sites send search engines signals of influence and authority.