In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollow links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow leads to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[32]
While short-tail keywords are often searched more frequently, it is more difficult to rank for them on search engines. Targeting long-tail keywords, on the other hand, gives you a better chance of ranking higher (even on the first page) for queries specific to your products and services—and higher ranking means more traffic. Plus, as search engines and voice-to-text capabilities advance, people are using more specific phrases to search online. There are many free tools available to help you find keywords to target, such as Answer the Public.
The best way to avoid this is proactively asking the right questions. Ask about resource support. Ask about historic roadblocks. Ask to be introduced to other players who otherwise hide behind an email here and there. Ask about the company's temperature regarding a bigger SEO strategy vs. short, quick-hit campaigns. Don't be your own biggest obstacle — I've never heard of anyone getting angry about over-communication unless it paralyzes progress.
I understand that some SEO agencies and departments are not built for big SEO campaigns. Strategic work takes time, and speeding (or scaling) through the development stage will likely do more harm than good. It's like cramming for a test — you're going to miss information that's necessary for a good grade. It would be my pleasure if this post inspired some change in your departments.
This post and the Skyscraper Technique changed my mind about how I approach SEO. I’m not a marketing expert, I haven’t ranked sites that monetize really well, and I’m not even in the marketing business; I’m just a guy trying to get some projects moving along. I just wanted to say that the way you write makes the information accessible, even for non-native English speakers like myself.
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" versions (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, add both the http:// and https:// versions, as well as the "www" and "non-www" versions.
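If you want to see how those variants actually behave for your own site, a quick script can help. Below is a minimal sketch (my own illustration, not an official Google tool) that uses Python's requests library and the placeholder domain example.com to show where each scheme/host variant lands after redirects:

```python
# A minimal sketch: check which host/scheme variants of a domain resolve
# to the same final URL. Assumes `requests` is installed; example.com is
# a placeholder for your own domain.
import requests

VARIANTS = [
    "http://example.com/",
    "https://example.com/",
    "http://www.example.com/",
    "https://www.example.com/",
]

for url in VARIANTS:
    try:
        # Follow redirects so we see the canonical destination each variant reaches.
        response = requests.get(url, allow_redirects=True, timeout=10)
        print(f"{url} -> {response.url} ({response.status_code})")
    except requests.RequestException as exc:
        print(f"{url} -> error: {exc}")
```

Ideally, every variant should end up at the same final URL; if they don't, that is a sign your redirects need attention.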

Hi Brian, Awesome content as ever! I’m very interested in your idea of creating an ‘uber’ resource list or expert roundup post, i.e. linking out to lots of other authorities in my niche within one post. But should you always make these authority links ‘nofollow’ to prevent juice from passing to them? And similarly, if you sprinkle a few outbound authority links in other posts, should they all be ‘nofollow’, or do you think big G ignores ‘nofollow’ these days?
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or involve deception. One black hat technique uses hidden text, either as text colored similarly to the background, in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid the site being penalized but do not go as far as producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Sure, we did keyword research, and we recommended partnerships, widgets, and architecture advice. But we didn’t step back and take a good look at our target audiences, what sites were meeting their specific needs in search results, and what we specifically could build into the product that would be far more desirable than what everyone else had (ideally something not even thought of yet), to make sure our entire site was superior, resulting in the inevitable stealing of search traffic from our competitors.
Sometimes it seems extremely hard to get those first 100 visitors to your articles, and it can be frustrating when this happens, especially when you are new to the world of Internet Marketing and you just don’t know what you are doing wrong. Well, I’m here to help you with these ’10 Ways To Bring Visitors To Your Articles’. Admittedly, this is not going to be a masterclass that drags on and on; this is all about helping you get to your first 100 visitors as quickly and easily as possible, without a ton of jargon. So here you go, guys, I hope this truly helps some of you out. If you enjoy this post, you should also check out David’s other posts, such as 10 WordPress Plugins for Bloggers!
Backlinks can actually serve as a proxy for interest. In Google's vision of a democratic web, they considered links to function like votes. Google wants editorial votes to influence their algorithm. So, if we assume all links are potentially editorial, then looking up backlink data can illustrate content that's truly beloved. Grab your favorite backlink data provider (hey — Moz has one!) and pull a report on a competitor's domain. Take a look at the linked pages, and with a little filtering, you'll see top linked pages emerge. Dive into those pages and develop some theories on why they're popular link targets.

Hey Brian, love your site + content. Really awesome stuff! I have a question about dead link building on Wikipedia. I actually got a “user talk” message from someone moderating a Wikipedia page I replaced a dead link on. They claimed that “Wikipedia uses nofollow tags” so “additions of links to Wikipedia will not alter search engine rankings.” Any thoughts here?
This is excellent and should be intuitive for marketers (and SEO pros are marketers!), but we often take the shortcut and neglect critical details. What would also reinforce the strategy is a way of providing solid projections for SEO (these could be based on industry trends and statistics). Clients now ask for ways to calculate ROI, and they need numbers to get budget approvals: an increase in traffic by X, an increase in qualified traffic and leads, conversions, etc. In short, some way of quantifying the expected return.
Every time I write new content, I post it to Twitter. If you use the right keywords and make your tweet interesting enough, you can get a lot of clickthroughs just from people searching. For example, if I write an article about SEO and Google, I can tag the end of the tweet with #SEO #Google, and anyone that searches for those keywords on Twitter can see my tweet about the post that I wrote. Be sure to write creative headlines for your posts so people feel the urge to click on them.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner[34] that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report[35].
Regarding internal linking, I believe that in the case of two links pointing to an internal page, one of them being in the group I mentioned above, only the one which feeds the algorithm with more information will be considered. In sites that have the menu before the content, that will be the second link. I think that’s the smart way for them to analyse all the links to better understand the destination page content. And they are smart 😉 .

On another note, we recently went through this same process with an entire site redesign. The executive team demanded we cut over 75% of the pages on our site because they were useless to the visitor. It's been 60 days since the launch of the new site, and I've still been able to increase rankings, long-tail keywords, and even organic traffic. It took a bit of a "cowboy" mentality to get some simple things done (like using 301s instead of blocking the old content with robots.txt!). I predicted we would lose a lot of our long-tail keywords... but we haven't... yet!
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Just a suggestion, but maybe you could write an article about generating traffic to a brand new blog. As you know, when you start out, you have only a couple posts and very little credibility with other bloggers, also the search engines will take considerable time to be of any benefit initially. Would be interesting to know how Brian Dean approaches that dilemma!

Structured data[21] is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
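Structured data is most commonly written as a JSON-LD block embedded in the page. As a minimal sketch (using only Python's standard library; the schema.org "Article" type is real, but the headline, author name, and date below are illustrative placeholders), you could generate such a snippet like this:

```python
# A minimal sketch of generating a JSON-LD structured data snippet.
# The field values are placeholders, not prescribed markup.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "10 Ways To Bring Visitors To Your Articles",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-01-15",
}

# Embed the JSON-LD in a <script> tag, which is how pages usually carry it.
snippet = f'<script type="application/ld+json">{json.dumps(article, indent=2)}</script>'
print(snippet)
```

The resulting <script> tag goes in the page's HTML, where search engines can read it alongside the visible content.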
Hello Brian, this is really such an informative article, and it's all the more meaningful because you provided screenshots. I have noticed that articles with images make things easier to understand. I have just started my career in this industry and thus keep looking for good articles/blogs that are meaningful and help me implement tips in my work beyond my seniors' instructions. I guess this way I can prove my caliber to them 🙂

After you have identified your target keywords, you need to create a page targeting that keyword. This is known as SEO content. In many cases, it makes sense to publish a blog post targeting keywords. However, you need to make decisions based on the search intent. If your target keyword phrase is “buy black Nike shoes”, then it doesn’t make sense to create a long-form piece of content.
Sorry for the long comment, I’m just really happy to see that after all those years of struggle you finally made a breakthrough, and you definitely deserve it, bro. I’ve had my own struggles as well, and just reading this got me a little emotional, because I know what it feels like to never want to give up on your dreams and to always have faith that one day your time will come. It’s all a matter of patience and learning from failures until you get enough experience to become someone who can generate traffic and bring value to readers to sustain long-term relationships.
If you're looking to upload an image to a blog post, for example, examine the file for its file size first. If it's anywhere in megabyte (MB) territory, even just 1 MB, it's a good idea to use an image compression tool to reduce the file size before uploading it to your blog. Sites like TinyPNG make it easy to compress images in bulk, while Google's very own Squoosh has been known to shrink image file sizes to microscopic levels.
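If you prefer to compress images in bulk from the command line, here is a minimal sketch using the Pillow library (pip install Pillow) as an offline alternative to tools like TinyPNG or Squoosh. The folder paths and quality setting are illustrative assumptions:

```python
# A minimal sketch of bulk image compression with Pillow.
from pathlib import Path
from PIL import Image

SOURCE_DIR = Path("images")        # originals, possibly several MB each
OUTPUT_DIR = Path("images_small")  # compressed copies for the blog
OUTPUT_DIR.mkdir(exist_ok=True)

for path in SOURCE_DIR.glob("*.jpg"):
    with Image.open(path) as img:
        # Convert to RGB in case the source has an alpha channel, then
        # re-encode as JPEG with reduced quality; "optimize" makes an
        # extra encoder pass to shrink the file further.
        rgb = img.convert("RGB")
        rgb.save(OUTPUT_DIR / path.name, "JPEG", quality=75, optimize=True)
        print(f"compressed {path.name}")
```

A quality setting around 75 usually cuts file size dramatically with little visible loss, but it's worth eyeballing a few outputs before uploading.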
The intent behind “SEO agency” is obvious… The searcher is looking for an SEO agency. Most of these searchers aren’t looking for life lessons from an SEO agency owner. Instead, they are just looking for the best SEO agency to get them more traffic and customers from Google. Plain and simple. I knew this when I created that page, but my SEO ego was too big.
I read all the words in your post. Believe me, I even read the duplicated “generated generated” at step 3. Okay, let’s come to the point. I believe that I’m producing the right content, and related to my niche too. When I started my blog, I made a list of influential bloggers and started following them. I produce the hot/trending content in the market and I share every one of their posts, but in return I see only about 5% engagement. I try to interact with them, but I don’t know why newcomers like me struggle to get a reply from influencers. In most cases it’s the same. Is it that they only care about their followers? Or that they already have enough sales and business?
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
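To see how a well-behaved crawler consults robots.txt before fetching a page, here is a minimal sketch using only Python's standard library (the domain and paths are placeholders):

```python
# A minimal sketch: check robots.txt rules the way a polite crawler would.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetch and parse the file, just as a crawler would

for path in ["/", "/cart", "/search?q=shoes"]:
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"{path}: {'crawlable' if allowed else 'disallowed'}")
```

Note that robots.txt is advisory: compliant crawlers honor it, but it is not an access control mechanism.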
Hi Brian, I absolutely love your content. My competitors and influencers are very strong; most of them are government bodies, organizations supported by government, or travel guides known worldwide. I constantly follow them and engage with them: like, share, comment, etc. They share photos that are submitted to them, and I do photography myself, which takes hours, and I still can’t reach a big audience. Any idea what I could create that my influencers would love to share? (It’s hard to find out what they care about; they get hundreds of photos submitted daily and collaborate with other big names...) Please help me.

Search engines find and catalog web pages through spidering (also known as web crawling) software. Spidering software "crawls" through the internet and grabs information from websites, which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
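The core mechanism is simple: fetch a page, extract its links, and queue those links for fetching in turn. The sketch below illustrates the idea in Python (assuming the requests and beautifulsoup4 packages; the start URL is a placeholder). Real search engine crawlers are vastly more sophisticated, with politeness rules, deduplication, and distributed queues:

```python
# A minimal sketch of the "spidering" idea described above.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(start_url: str, max_pages: int = 10) -> None:
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip unreachable pages
        # "Grab information" from the page: here we just collect outgoing links.
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            if link.startswith("http"):
                queue.append(link)
        print(f"indexed {url}")

crawl("https://example.com/")
```

A real crawler would also store the page text for indexing; this sketch only walks the link graph.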

Social media. The algorithms have truly changed since social media first emerged. Many content websites are community-oriented -- Digg began allowing users to vote on which stories make the front page, and YouTube factors views and user ratings into its front page rankings. Therefore, e-commerce stores must establish a strong social media presence on sites like Facebook, Pinterest, Twitter, etc. These social media sites send search engines signals of influence and authority.
What kind of advice would you give if your site is growing but seems to be attracting the wrong kind of traffic? My visitor numbers are going up, but all other indicators, such as bounce rate, time on page, and pages per visit, seem to be developing in the wrong direction. Not sure if that’s to be expected or if there is something that I should be doing to counter that development?

Guest blogging is a great way to generate free traffic – all you have to invest is the time to write an article. Get in touch with the most popular blogs in your industry and ask if they’ll let you write a guest post. Most website owners will have no objections to having other people write free content for them. (Ask Michael how I know… wink, wink.) If you’re having trouble finding blogs to guest post on, check out www.myblogguest.com – they have a full community of bloggers that are ready and waiting for your content.
This one is so obvious, we’re going to look at it first. Paid search, social media advertising and display advertising (try our Smart Ads Creator!) are all excellent ways of attracting visitors, building your brand and getting your site in front of people. Adjust your paid strategies to suit your goals – do you just want more traffic, or are you looking to increase conversions, too? Each paid channel has its pros and cons, so think carefully about your objectives before you reach for your credit card.
He is the co-founder of NP Digital. The Wall Street Journal calls him a top influencer on the web, Forbes says he is one of the top 10 marketers, and Entrepreneur Magazine says he created one of the 100 most brilliant companies. Neil is a New York Times bestselling author and was recognized as a top 100 entrepreneur under the age of 30 by President Obama and a top 100 entrepreneur under the age of 35 by the United Nations.
On the other hand, I'd like to know how many people make up your new operation as an independent consultant. In fact, as others noted in the comments here, what you suggest is perfect especially for an in-house SEO situation or for a web marketing agency with at least 5-8 people. Even if all you say is correct and is hopefully what everybody should do, I honestly find it quite difficult to dedicate the amount of time and dedication needed to check all the steps described in your post. Or, at least, I cannot imagine myself doing it for all my clients.
In fact, as stipulated by law, we cannot and do not make any guarantees about your ability to get results or earn any money with our ideas, information, tools, or strategies. We don’t know you and, besides, your results in life are up to you. Agreed? Your results will be impacted by numerous factors, not limited to your experience, background, discipline, and conscientiousness. Always do your own due diligence and use your own judgment when making buying decisions and investments for yourself or in your business.

Hey Ted, thanks for the great questions! The peak times refer to your particular time zone, if you are targeting an audience that resides in the same zone as you. You can also use tools to find out when most of your audience is online. For example, Facebook has this built into their Page Insights. For Twitter, you can use https://followerwonk.com/. Many social posting tools also offer this functionality.


Thank you Brian. I am so brand spanking new to all this, and I am really struggling to understand it all. I have tried to read so many things to help my website, and this was the first article to really make sense. However, being an urban streetwear menswear online store, I feel like my niche is too broad?.. Ahh, I feel like I am drowning; maybe I need to do your course! Thanks again for the read, I will be doing a lot more, that's for sure.

To prevent some users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect[32] from non-preferred URLs to the dominant URL is a good solution. You may also use the rel="canonical"[33] link element if you cannot redirect.
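As a minimal sketch of what a 301 redirect looks like at the application level, here is a Flask example (pip install flask); the route paths are placeholders, and in practice this rule usually lives in the web server or CDN configuration instead:

```python
# A minimal sketch of consolidating duplicate URLs with a 301 redirect.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # Permanently redirect the non-preferred URL to the dominant one, so
    # links to either version consolidate their reputation on one URL.
    return redirect("/new-page", code=301)

@app.route("/new-page")
def new_page():
    return "This is the canonical version of the page."

if __name__ == "__main__":
    app.run()
```

The status code matters: 301 signals a permanent move, which tells search engines to transfer the old URL's standing to the new one, whereas a 302 signals a temporary move and does not.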