Informational searches are those in which the user hopes to find information on a given topic, such as Abraham Lincoln. It wouldn’t make much sense to place ads or other paid results on a SERP like this: the query “Abraham Lincoln” has very low commercial intent, the vast majority of searchers using it are not looking to buy something, and as such only informational results are displayed on the SERP.
The PageRank concept is that a page casts votes for one or more other pages. Nothing in the original PageRank document says anything about a page casting more than one vote for a single page. The idea seems contrary to the PageRank concept and would certainly be open to manipulation by unrealistically proportioning votes for target pages: for example, if an outbound link or a link to an unimportant page were necessary, a site could add many extra links to an important page to dilute the unwanted link’s effect.
Google’s SERPs can show various elements: the search results themselves (so-called snippets), a knowledge graph, a featured snippet, an answer box, images, shopping results and more. Depending on the type of query and the data Google finds, some of these elements will show up. You can add data to your page, so Google can show a ‘rich’ snippet, providing more information about your product or recipe, for instance.
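Rich snippets are usually driven by structured data embedded in the page as a JSON-LD block. As a minimal sketch, the following Python builds the payload you would place inside a `<script type="application/ld+json">` tag; the schema.org `Product` vocabulary is real, but the product details here are made-up placeholders:

```python
import json

# Minimal schema.org "Product" markup for a hypothetical product page.
# Serialized as JSON-LD, the format Google reads for rich snippets.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Lawnmower X100",   # placeholder product
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}
json_ld = json.dumps(product, indent=2)
print(json_ld)
```

The same approach works for recipes (`@type: Recipe`), articles, and other schema.org types; which elements actually appear on the SERP remains up to Google.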
2018 Update: Since 2012 we have run an informal poll to see how widely used digital marketing strategies are. The results have shown some big improvements over the years. A few years ago we found around two-thirds to three-quarters did not have a digital marketing plan. That number has shrunk to 49% in the latest survey, although that is still quite high, and means almost half are still doing digital with no strategy in place.
Simply put, search engine optimization (SEO) is the process of optimizing the content, technical set-up, and reach of your website so that your pages appear at the top of search engine results for a specific set of keyword terms. Ultimately, the goal is to attract visitors to your website when they search for products, services, or information related to your business.
Shifting the focus to the time span, we may need to measure some "Interim Metrics", which give us some insight during the journey itself, as well as some "Final Metrics" at the end of the journey to tell us whether the overall initiative was successful. As an example, most social media metrics and indicators such as likes, shares and engagement comments may be classified as interim metrics, while the final increase or decrease in sales volume is clearly from the final category.
Cautions: Whilst I thoroughly recommend creating and adding new pages to increase a site’s total PageRank so that it can be channeled to specific pages, there are certain types of pages that should not be added. These are pages that are all identical or very nearly identical and are known as cookie-cutters. Google considers them to be spam and they can trigger an alarm that causes the pages, and possibly the entire site, to be penalized. Pages full of good content are a must.
As I was telling Norman above, these days what we’ve come to call content marketing is really a big part of “link building.” You can’t buy links, and “you link to me, I’ll link to you” requests often fall on deaf ears. It’s really all about creating high quality content (videos, images, written blog posts) that appeals to the needs/wants of your target market, and then naturally earning inbound links from sources that truly find what you have to offer worth referencing.
People tend to view the first results on the first page.[5] Each page of search engine results usually contains 10 organic listings (though some results pages may have fewer). The listings on the first page are the most important, as they receive 91% of the click-through rate (CTR) for a given search. According to a 2013 study,[6] CTR on the first page declines sharply with position.
The process of harvesting search engine result pages data is usually called "search engine scraping" or, in a more general form, "web crawling", and generates the data SEO-related companies need to evaluate a website's competitive organic and sponsored rankings. This data can be used to track the position of websites and show the effectiveness of SEO, as well as keywords that may need more SEO investment to rank higher.
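To make the scraping step concrete, here is a small sketch using Python's standard-library `HTMLParser`. Real SERP markup is different, changes often, and scraping may violate a search engine's terms of service; the sample HTML and the `class="result"` convention below are invented purely for illustration:

```python
from html.parser import HTMLParser

# Toy extractor: collects (title, url) pairs from anchors marked with
# class="result". The markup convention is a made-up stand-in for
# whatever structure a real results page uses.
class ResultParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.results = []   # (title, url) in rank order
        self._href = None
        self._buf = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("class") == "result":
            self._href = attrs.get("href")
            self._buf = []

    def handle_data(self, data):
        if self._href is not None:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.results.append(("".join(self._buf).strip(), self._href))
            self._href = None

sample = """
<div><a class="result" href="https://example.com/a">First result</a></div>
<div><a class="result" href="https://example.com/b">Second result</a></div>
"""
parser = ResultParser()
parser.feed(sample)
print(parser.results)
```

A result's rank is simply its index in the list, which is the figure a rank tracker would record over time for each keyword.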
The majority of web traffic is driven by the major commercial search engines, Google, Bing, and Yahoo!. Although social media and other types of traffic can generate visits to your website, search engines are the primary method of navigation for most Internet users. This is true whether your site provides content, services, products, information, or just about anything else.
SERPs typically contain two types of content – “organic” results and paid results. Organic results are listings of web pages that appear as a result of the search engine’s algorithm (more on this shortly). Search engine optimization professionals, commonly known as SEOs, specialize in optimizing web content and websites to rank more highly in organic search results.
In the example above (a SERP for the search query “lawnmowers”), all of the results on the SERP – with the exception of the map and business listing beneath it – are paid results. The three large text-based ads at the top of the SERP (considered prime positioning for advertisers) are typical PPC ads. Of those three ads, the lower two (for Craftsman.com and Husqvarna.com) both feature ad extensions allowing prospective customers to navigate to specific pages on their websites directly from the ads.

There are simple and fast random walk-based distributed algorithms for computing the PageRank of nodes in a network.[31] They present a simple algorithm that takes O(log n/ε) rounds with high probability on any graph (directed or undirected), where n is the network size and ε is the reset probability (1 − ε is also called the damping factor) used in the PageRank computation. They also present a faster algorithm that takes O(√(log n)/ε) rounds on undirected graphs. Both algorithms are scalable, as each node processes and sends only a small (polylogarithmic in n, the network size) number of bits per round.
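The random-walk view itself can be illustrated with a tiny single-machine Monte-Carlo sketch (this is not the distributed algorithm from [31]): a walker follows a random outbound link, and with reset probability ε jumps to a uniformly random node; long-run visit frequencies approximate PageRank. The four-node graph below is made up for illustration:

```python
import random

# Monte-Carlo PageRank estimate on a toy directed graph.
# epsilon is the reset probability; 1 - epsilon is the damping factor.
graph = {
    "A": ["B"],
    "B": ["C"],
    "C": ["A"],
    "D": ["A", "C"],   # D links out but nothing links to D
}
epsilon = 0.15
steps = 200_000
random.seed(0)

visits = {node: 0 for node in graph}
node = "A"
for _ in range(steps):
    visits[node] += 1
    outs = graph[node]
    if not outs or random.random() < epsilon:
        node = random.choice(list(graph))   # reset: jump to a random node
    else:
        node = random.choice(outs)          # follow a random outbound link

pagerank = {n: v / steps for n, v in visits.items()}
print({n: round(p, 3) for n, p in sorted(pagerank.items())})
```

Because no page links to D, its only traffic comes from resets, so its estimated PageRank ends up well below the three pages in the cycle.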
The Google algorithm's most important feature is arguably the PageRank system, a patented automated process that determines where each search result appears on Google's search engine results page. Most users tend to concentrate on the first few search results, so getting a spot at the top of the list usually means more user traffic. So how does Google determine search result standings? Many people have taken a stab at figuring out the exact formula, but Google keeps the official algorithm a secret.
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit and the amount of traffic to the site dropped by 70%. On March 16, 2007 the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[69][70]
I think Google will always be working to discern and deliver “quality, trustworthy” content, and I think analyzing inbound links as endorsements is a solid tool the SE won’t be sunsetting anytime soon. Why would they? If the president of the United States links to your page, that is undoubtedly an endorsement that tells Google you’re a legitimate trusted source. I know that is an extreme example, but I think it illustrates the principles of a linking-as-endorsement model well.
Suppose instead that page B had a link to pages C and A, page C had a link to page A, and page D had links to all three pages. Thus, upon the first iteration, page B would transfer half of its existing value, or 0.125, to page A and the other half, or 0.125, to page C. Page C would transfer all of its existing value, 0.25, to the only page it links to, A. Since D had three outbound links, it would transfer one third of its existing value, or approximately 0.083, to A. At the completion of this iteration, page A will have a PageRank of approximately 0.458.
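The iteration above can be reproduced with a few lines of Python. This is the simplified PageRank update with no damping factor, using the link structure and the uniform 0.25 starting values from the worked example; A's own outbound links are not specified in the text, so it is given none here:

```python
# One iteration of the simplified PageRank update (no damping),
# reproducing the worked example: each page splits its current value
# evenly among the pages it links to.
links = {
    "A": [],             # A's outbound links are not given in the example
    "B": ["C", "A"],
    "C": ["A"],
    "D": ["A", "B", "C"],
}
rank = {page: 0.25 for page in links}   # four pages, uniform start

new_rank = {page: 0.0 for page in links}
for page, outs in links.items():
    if outs:
        share = rank[page] / len(outs)   # value split across outbound links
        for target in outs:
            new_rank[target] += share

print(round(new_rank["A"], 3))  # → 0.458
```

A receives 0.125 from B, 0.25 from C, and roughly 0.083 from D, matching the approximately 0.458 stated above.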
Say you're running a PPC ad for the keyword "Nikon D90 digital camera" -- a product you sell on your website. You set up the ad to run whenever this keyword is searched for on your chosen engine, and you use a URL that redirects readers who click on your ad to your site's home page. Now, this user must painstakingly click through your website's navigation to find this exact camera model -- if he or she even bothers to stick around.
Exhaustive – Your keyword research should include not only the most popular and frequently searched terms in your niche, but also the long tail of search. Long-tail keywords are more specific and less common, but they add up to account for the majority of search-driven traffic. In addition, they are less competitive, and therefore less expensive.
A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility.[47] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[47] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the canonical link element[48] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
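As an illustration of the normalization step, here is a small Python sketch that collapses common URL variants (scheme and host casing, default ports, trailing slashes, tracking parameters) to one canonical form. The specific rules chosen are illustrative assumptions, not a formal standard:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Collapse common duplicate-URL variants to a single canonical form.
# Rule choices (dropping utm_* parameters, default ports, trailing
# slashes) are illustrative, not part of any specification.
def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    scheme = parts.scheme.lower()
    netloc = parts.netloc.lower()
    if scheme == "http" and netloc.endswith(":80"):
        netloc = netloc[:-3]          # :80 is the default for http
    elif scheme == "https" and netloc.endswith(":443"):
        netloc = netloc[:-4]          # :443 is the default for https
    path = parts.path.rstrip("/") or "/"
    query = urlencode(
        [(k, v) for k, v in parse_qsl(parts.query)
         if not k.startswith("utm_")]  # drop tracking parameters
    )
    return urlunsplit((scheme, netloc, path, query, ""))

variants = [
    "HTTP://Example.com:80/shop/",
    "http://example.com/shop?utm_source=news",
    "http://example.com/shop",
]
print({canonicalize(u) for u in variants})  # → {'http://example.com/shop'}
```

All three variants map to a single URL, so any link equity pointing at the variants can be consolidated on one page via a canonical link element or 301 redirect.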