In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "BackRub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random web surfer.
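To make the random-surfer idea concrete, here is a minimal power-iteration sketch in Python. The toy link graph, the 0.85 damping factor, and the fixed iteration count are illustrative assumptions, not Google's production algorithm.

```python
# A minimal sketch of PageRank's random-surfer model (illustrative only).

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # With probability (1 - damping) the surfer jumps to a random page.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank everywhere
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:  # otherwise the surfer follows one of the page's links
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Three-page toy web: A and C both link to B, so B earns the highest rank.
print(pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]}))
```

Running this shows B with the largest score, which matches the intuition above: the more (and stronger) the inbound links, the more often the random surfer lands on the page.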
Thank you, Brian. I am brand spanking new to all this and I am really struggling to understand it all. I have tried to read so many things to help my website, and this was the first article to really make sense. However, being an urban street-menswear online store, I feel like my niche is too broad. Ahh, I feel like I am drowning; maybe I need to do your course! Thanks again for the read. I will be doing a lot more, that's for sure.
Content is king. That’s the saying, right? It’s true in a way. Your website is really just a wrapper for your content. Your content tells prospects what you do, where you do it, who you have done it for, and why someone should use your business. And if you’re smart, your content should also go beyond these obvious brochure-type elements and help your prospective customers achieve their goals.

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
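As a rough illustration of that spider-and-indexer pipeline, here is a toy sketch in Python. It is deliberately simplified: a real crawler also honors robots.txt, throttles requests, deduplicates URLs, and feeds a real scheduler.

```python
# A toy spider + indexer, standard library only (not a production crawler).
import re
import urllib.request
from collections import defaultdict
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags as the page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_and_index(url, index):
    """Download a page (the 'spider'), extract its links, and record which
    words appear on it (the 'indexer')."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    parser = LinkExtractor()
    parser.feed(html)
    text = re.sub(r"<[^>]+>", " ", html)       # crude tag stripping
    for word in re.findall(r"[a-z]+", text.lower()):
        index[word].add(url)                   # inverted index: word -> pages
    return parser.links                        # hand links back to the scheduler

index = defaultdict(set)
frontier = crawl_and_index("https://example.com", index)
```

The returned `frontier` list is what the scheduler mentioned above would queue for crawling at a later date.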
“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
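If you want to sanity-check that a syndication partner is following this advice, a small script can do it. This is a hedged sketch with hypothetical URLs; it assumes the `requests` and `beautifulsoup4` packages are installed.

```python
# Check a syndicated copy for (a) a noindex robots meta tag and
# (b) a link back to the original article, per the quoted advice.
import requests
from bs4 import BeautifulSoup

def check_syndicated_copy(copy_url, original_url):
    soup = BeautifulSoup(requests.get(copy_url, timeout=10).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    has_noindex = bool(robots and "noindex" in robots.get("content", "").lower())
    links_back = any(a.get("href") == original_url for a in soup.find_all("a"))
    return has_noindex, links_back

# Hypothetical URLs, for illustration only:
noindex_ok, link_ok = check_syndicated_copy(
    "https://partner.example/republished-post",
    "https://yoursite.example/original-post",
)
```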

What kind of advice would you give if your site is growing but seems to be attracting the wrong kind of traffic? My visitor numbers are going up, but all other indicators, such as bounce rate, time on page, and pages per visit, seem to be developing in the wrong direction. Not sure if that's to be expected or if there is something I should be doing to counter that development?


Think of it this way: The more specific your content, the more specific the needs of your audience are -- and the more likely you'll convert this traffic into leads. This is how Google finds value in the websites it crawls; pages that dig into the inner workings of a general topic are seen as the best answer to a person's query and will rank higher.

You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
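In practice, many sites apply nofollow automatically to every link in user-generated content. Here is a minimal sketch of that idea, assuming comment bodies arrive as HTML and that `beautifulsoup4` is installed; it is not any particular CMS's built-in behavior.

```python
# Add rel="nofollow" to every link inside user-submitted comment HTML
# so the site's reputation is not passed to linked pages.
from bs4 import BeautifulSoup

def nofollow_comment_links(comment_html):
    soup = BeautifulSoup(comment_html, "html.parser")
    for a in soup.find_all("a"):
        a["rel"] = "nofollow"   # tell search engines not to endorse this link
    return str(soup)

print(nofollow_comment_links(
    'Great post! <a href="https://spammy.example">cheap pills</a>'))
# -> Great post! <a href="https://spammy.example" rel="nofollow">cheap pills</a>
```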
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
Thanks Brian. I’ve had an “a-ha” moment thanks to you! Great advice. I knew that backlinks would improve the organic SEO rankings of our client-targeted landing pages, but I never knew it was through getting influencers to link back to blog posts. I always just assumed it was great content that users wanted to share with others. It was driving me mad why people love my content but never share it enough. Now I know!
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] That market share is achieved in a number of countries.
Hi, my name is Dimitrios and I am responsible for Crave Culinaire’s digital marketing. I would like to drive more traffic to Crave’s blog. Since Crave Culinaire is the only catering company that provides molecular cuisine, I thought about crafting a blog post about that. The influencers in this niche have had great success utilizing recipes on their blogs. I will share some recipes from Brian Roland, owner and head chef of Crave Culinaire.
Hey Brian. Even though our own website has ranked consistently (for the last 3 years now) at Number 1 on Google for "SEO Companies" (obviously when searching from London, UK or nearby, that is), I still keep reading other people’s posts and sending my own out when I find a gold nugget. However, within your clearly written article I have noticed multiple golden nuggets, and I was very impressed by your ‘thinking outside the box’ approach and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉
Really, it’s just a matter of getting creative: grab a cup of caffeine and think for a minute about what resources you have to get some insight into your visitors (or target markets) and their needs before you dive in. Think about how much time it might take you (or what the reports would cost if you are going to buy market research reports), and tack that onto your billing as an optional service.
It’s rare to come across new SEO tips worth trying. And this post has tons of them. I know that’s true BECAUSE…I actually read it all the way to the end and downloaded the PDF. What makes these great is that so many are multi-step little strategies, not just the one-off things that clients often stumble across and ask whether they are truly good for SEO. But there are also some nice one-off tips that I can easily start using without ramping up a new project.
In this excellent post, "SEO and Digital Trends in 2017," Gianluca Fiorelli writes, "In a mobile-only world, the relevance of local search is even higher. This seems to be the strategic reason both for an update like Possum and all the tests we see in local, and also for the acquisition of a company like Urban Engines, whose purpose is to analyze the 'Internet of Moving Things.'"

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
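For instance, a simple generator might take the first ~155 characters of the page body, trimmed at a word boundary. The length budget and first-text heuristic below are common conventions, not Google requirements.

```python
# A hedged sketch of auto-generating description meta tags from page content.
import re

def generate_meta_description(page_text, max_len=155):
    """Take the opening of the page body, trimmed to max_len at a word boundary."""
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) <= max_len:
        return text
    cut = text[:max_len]
    space = cut.rfind(" ")                    # avoid chopping mid-word
    return (cut if space == -1 else cut[:space]) + "…"

body = ("PageRank estimates the likelihood that a given page will be reached "
        "by a web user who randomly surfs the web, following links from one "
        "page to another.")
print(f'<meta name="description" content="{generate_meta_description(body)}">')
```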

I first heard you talk about your techniques in Pat Flynn’s podcast. Must admit, I’ve been lurking a little ever since…not sure if I wanted to jump into these exercises or just dance around the edges. The clever and interesting angles you describe here took me all afternoon to get through and wrap my brain around. It’s a TON of information. I can’t believe this is free for us to devour! Thank you!! Talk about positioning yourself as THE expert! Deep bow.
Your posts are amazingly right on target. In this specific post, #3 resonated with me personally. I am a content manager as well as a blogger for the website mentioned. I promote through different blog sites and social media. In fact, I just finished an article about you, credited to you and your website, of course. Thank you for such amazing information. You make things sound so easy. Thanks again!
For example, we regularly create content on the topic of "SEO," but it's still very difficult to rank well on Google for such a popular topic on this acronym alone. We also risk competing with our own content by creating multiple pages that are all targeting the exact same keyword -- and potentially the same search engine results page (SERP). Therefore, we also create content on conducting keyword research, optimizing images for search engines, creating an SEO strategy (which you're reading right now), and other subtopics within SEO.
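One lightweight way to catch this kind of self-competition is to map each page to its target keyword and flag duplicates. The page-to-keyword map below is hypothetical, just to show the idea.

```python
# Flag pages that target the same keyword (potential cannibalization).
from collections import defaultdict

page_keywords = {   # hypothetical URL -> target keyword map
    "/blog/seo-strategy": "seo strategy",
    "/blog/keyword-research": "keyword research",
    "/blog/image-seo": "optimizing images for search engines",
    "/blog/what-is-seo": "seo strategy",   # duplicate target
}

targets = defaultdict(list)
for page, keyword in page_keywords.items():
    targets[keyword].append(page)

for keyword, pages in targets.items():
    if len(pages) > 1:
        print(f"Potential cannibalization on '{keyword}': {pages}")
```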