The PageRank algorithm outputs a probability distribution used to represent the likelihood that a person randomly clicking on links will arrive at any particular page. PageRank can be calculated for collections of documents of any size. It is assumed in several research papers that the distribution is evenly divided among all documents in the collection at the beginning of the computational process. The PageRank computations require several passes, called "iterations", through the collection to adjust approximate PageRank values to more closely reflect the theoretical true value.
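To make that iterative process concrete, here is a minimal Python sketch of it on a made-up four-page graph. The page names, damping value, and iteration count are illustrative assumptions, not Google's implementation:

```python
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start evenly divided among all documents
    for _ in range(iterations):                  # each pass is one "iteration"
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # dangling page (no outlinks): one common treatment is to
                # spread its rank evenly over every page
                for p in pages:
                    new_rank[p] += d * rank[page] / n
            else:
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += d * share
        rank = new_rank
    return rank

toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(toy_web))   # approximate values after 50 passes
```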

Search results are presented in an ordered list, and the higher up on that list a site can get, the more traffic the site will tend to receive. For example, for a typical search query, the number one result will receive 40-60% of the total traffic for that query, with the number two and three results receiving significantly less traffic. Only 2-3% of users click beyond the first page of search results.
Page Rank (named after Larry Page) is a link analysis algorithm used by Google that measures how many links point to a website or page, and, more importantly, the quality or importance of the sites that provide those links. It uses a numerical scale, with 0 being the least important and 10 being the most important. In an attempt to “cheat the system”, some website owners have tried to purchase links back to their website hoping for a higher Page Rank. However, those low quality links can have a negative impact and result in a lower Google Page Rank. In addition, a website may be penalized or blocked from search results entirely, as Google gives priority to websites and web pages that have quality backlinks and content that is valuable to humans.

Link page A to page E and click Calculate. Notice that the site’s total has gone down very significantly. But because the new link is dangling and would be removed from the calculations, we can ignore the new total and assume the previous total of 4.15 to be true. That’s the effect of functionally useful, dangling links in the site: there’s no overall PageRank loss.
In today’s world, QUALITY is more important than quantity. Google penalties have caused many website owners to not only stop link building, but start link pruning instead. Poor quality links (i.e., links from spammy or off-topic sites) are like poison and can kill your search engine rankings. Only links from quality sites, and pages that are relevant to your website, will appear natural and not be subject to penalty. So never try to buy or solicit links — earn them naturally or not at all.
How many times do we need to repeat the calculation for big networks? That’s a difficult question; for a network as large as the World Wide Web it can take a great many iterations. The “damping factor” is quite subtle: if it’s too high then it takes ages for the numbers to settle; if it’s too low then you get repeated over-shoot, both above and below the average, and the numbers just swing about the average like a pendulum and never settle down.
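As a rough back-of-the-envelope illustration (a sketch, assuming the standard formulation in which the iteration error shrinks roughly like the damping factor raised to the number of passes), you can estimate how many passes a given tolerance requires:

```python
import math

# Rough illustration: if the error shrinks roughly like d**k after k passes,
# a crude bound on the passes needed to reach tolerance `tol` is log(tol) / log(d).
tol = 1e-8
for d in (0.5, 0.85, 0.99):
    passes = math.ceil(math.log(tol) / math.log(d))
    print(f"damping {d}: roughly {passes} passes")
# damping 0.5: ~27 passes, 0.85: ~114, 0.99: ~1833
# -- the higher the damping factor, the longer the numbers take to settle.
```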
Google has a very large team of search quality raters who evaluate the quality of search results; their ratings are fed into a machine learning algorithm. Google’s search quality rater guidelines provide plenty of detail and examples of what Google classes as high or low quality content and websites, and of its emphasis on rewarding sites that clearly demonstrate their expertise, authoritativeness and trustworthiness (E-A-T).

Sometimes, you can find keyword ‘niches’ for which the top bid is a fantastic deal. These are longer, highly specific phrases that not everyone will have taken the time to pursue: “long-tail search terms”. In this case, PPC is a great option because you can generate highly targeted traffic to your site for a fraction of the cost of any other form of paid advertising.
Guest Blogging: although this practice has been discredited because it generated a flood of poor-quality articles that amounted to spam (since they were used only to place links), and Google has stopped encouraging the tactic, it can still work. If you can create a high-quality guest post that is relevant to your niche, then go ahead.
Larry Page and Sergey Brin developed PageRank at Stanford University in 1996 as part of a research project about a new kind of search engine.[11] Sergey Brin had the idea that information on the web could be ordered in a hierarchy by "link popularity": a page ranks higher as there are more links to it.[12] Rajeev Motwani and Terry Winograd co-authored with Page and Brin the first paper about the project, describing PageRank and the initial prototype of the Google search engine, published in 1998:[5] shortly after, Page and Brin founded Google Inc., the company behind the Google search engine. While just one of many factors that determine the ranking of Google search results, PageRank continues to provide the basis for all of Google's web-search tools.[13]
Typically, daily budgets are set up for each campaign, but sometimes you want those funds to shift between campaigns depending on what’s working. The shared budget feature saves the time spent managing and monitoring individual campaign budgets: with a shared budget, AdWords adjusts how a single daily amount is spent across the entire account or a group of campaigns within the account.

Junk traffic can also suck the life out of your campaign. Most, but not all, pay per click services or providers distribute a segment of their budget to several search engines and other sites via their search partners and content networks. While you certainly want your ads displayed on Google and/or Bing, you may not want your ads showing up and generating clicks from some of the deeper, darker corners of the Internet. The resulting traffic may look fine in high-level statistics reports, but you have to separate out partner network campaigns and carefully manage them if you’re going to get your money’s worth.

PageRank is a numeric value that represents how important a page is on the web. Google figures that when one page links to another page, it is effectively casting a vote for the other page. The more votes that are cast for a page, the more important the page must be. Also, the importance of the page that is casting the vote determines how important the vote itself is. Google calculates a page’s importance from the votes cast for it, weighting each vote by the importance of the page that cast it.
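As a toy illustration of that weighting (the 0.85 damping value and the numbers are assumptions, not figures from Google), a vote’s worth depends on the voting page’s own PageRank divided by how many links it casts:

```python
# Toy illustration of vote weighting: a vote from an important page with few
# outbound links is worth far more than one from a weak page with many links.
D = 0.85  # assumed damping factor

def vote_value(voter_pagerank, voter_outlink_count):
    return D * voter_pagerank / voter_outlink_count

print(vote_value(6.0, 3))   # 1.7    -- strong page, few links
print(vote_value(0.5, 50))  # 0.0085 -- weak page, many links
```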

What is ​search engine optimization, then? It's not secrets or tricks — just ranking methodologies to follow in order to help a site that offers value to users beat the competition in search results. Today, you must be committed not just to optimizing your domain, but also to making it a quality site that attracts links naturally and is worthy of ranking.


The term Digital Marketing was first coined in the 1990s.[10] With the debut of server/client architecture and the popularity of personal computers, Customer Relationship Management (CRM) applications became a significant part of marketing technology.[citation needed] Fierce competition forced vendors to include more services in their software, for example marketing, sales and service applications. After the Internet was born, marketers were also able to gather huge amounts of online customer data through eCRM software. Companies could update data on customer needs and obtain the priorities of their experience. This led to the first clickable banner ad going live in 1994, the "You Will" campaign by AT&T; over the first four months it was live, 44% of all people who saw it clicked on the ad.[11]

Optimizing digital marketing can be tricky, and a simple definition does not necessarily translate into something that is useful for achieving business objectives. That is where the RACE Digital Marketing Planning framework comes in, as it can help break down digital marketing into easier to manage areas that can then be planned, managed and optimized.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This is in between black hat and white hat approaches, where the methods employed avoid the site being penalized, but do not act in producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
The Google PageRank Checker on Small SEO Tools offers advanced insight not commonly found with any other free Google PageRank Checker or PR checker. The red results let you know when a Page Rank is fake, or false. You see, some shady individuals will use a variety of methods to create a “spoof” Page Rank. You can use this tool to check the validity of a website before you buy it, or buy advertising, and save yourself from getting scammed. In a nutshell, most tools will show you just a Page Rank and you could still be suckered into wasting your money on a scam. However, Small SEO Tools offers you a Google pagerank checker that provides you with a color scheme to identify the Fake PR from True PR, offering you peace of mind.
The advertiser signs a contract that allows them to compete against other advertisers in a private auction hosted by a publisher or, more commonly, an advertising network. Each advertiser informs the host of the maximum amount that he or she is willing to pay for a given ad spot (often based on a keyword), usually using online tools to do so. The auction plays out in an automated fashion every time a visitor triggers the ad spot.
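For illustration only, here is a simplified sketch of such an automated auction in Python. It assumes a second-price style rule (one common mechanism); real networks also factor in quality and relevance scores, and the advertiser names and bids below are hypothetical:

```python
# Simplified ad-auction sketch (hypothetical advertisers and bids).
def run_auction(bids):
    """bids: advertiser -> maximum amount they are willing to pay for the spot."""
    if not bids:
        return None
    ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
    winner, top_bid = ranked[0]
    runner_up = ranked[1][1] if len(ranked) > 1 else 0.0
    # Winner pays just above the runner-up's bid, never more than their own maximum.
    price = min(top_bid, runner_up + 0.01)
    return winner, round(price, 2)

# Runs automatically every time a visitor triggers the ad spot.
print(run_auction({"acme": 2.50, "globex": 1.75, "initech": 0.90}))  # ('acme', 1.76)
```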
For that reason, you're probably less likely to focus on ‘leads' in their traditional sense, and more likely to focus on building an accelerated buyer's journey, from the moment someone lands on your website, to the moment that they make a purchase. This will often mean your product features in your content higher up in the marketing funnel than it might for a B2B business, and you might need to use stronger calls-to-action (CTAs).

The Google Toolbar long had a PageRank feature which displayed a visited page's PageRank as a whole number between 0 and 10. The most popular websites displayed a PageRank of 10, and the least popular a PageRank of 0. Google has not disclosed the specific method for determining a Toolbar PageRank value, which is to be considered only a rough indication of the value of a website. In March 2016 Google announced it would no longer support this feature, and the underlying API would soon cease to operate.[32]


In the parlance of digital marketing, advertisers are commonly referred to as sources, while members of the targeted audience are commonly called receivers. Sources frequently target highly specific, well-defined receivers. For example, after extending the late-night hours of many of its locations, McDonald's needed to get the word out. It targeted shift workers and travelers with digital ads, because the company knew that these people made up a large segment of its late night business. McDonald's encouraged them to download a new Restaurant Finder app, targeting them with ads placed at ATMs and gas stations, as well as on websites that it knew its customers frequented at night.
When PageRank leaks from a site via a link to another site, all the pages in the internal link structure are affected. (This doesn’t always show after just one iteration.) The page that you link out from makes a difference to which pages suffer the most loss. Without a program to perform the calculations on specific link structures, it is difficult to decide on the right page to link out from, but the general rule is to link out from the page with the lowest PageRank.
The mathematics of PageRank are entirely general and apply to any graph or network in any domain. Thus, PageRank is now regularly used in bibliometrics, social and information network analysis, and for link prediction and recommendation. It's even used for systems analysis of road networks, as well as biology, chemistry, neuroscience, and physics.[43]
We can’t know the exact details of the scale because, as we’ll see later, the maximum PR of all pages on the web changes every month when Google does its re-indexing! If we presume the scale is logarithmic (although there is only anecdotal evidence for this at the time of writing) then Google could simply give the highest actual PR page a toolbar PR of 10 and scale the rest appropriately.
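Under that assumption, a hypothetical mapping from raw PageRank to the 0-10 toolbar scale might look like the following sketch. The log base and cut-offs are guesses for illustration; Google never published the real scale:

```python
import math

# Hypothetical logarithmic mapping from raw PageRank to the 0-10 toolbar scale.
def toolbar_pr(raw_pr, max_pr):
    if raw_pr <= 0:
        return 0
    score = 10 * math.log10(raw_pr) / math.log10(max_pr)
    return max(0, min(10, int(score)))

# The page with the highest raw PR maps to 10; everything else scales down.
print(toolbar_pr(1_000_000, 1_000_000))  # 10
print(toolbar_pr(1_000, 1_000_000))      # 5
print(toolbar_pr(1, 1_000_000))          # 0
```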
This section can be summed up in two words: GO BIG. “Blocking and Tackling” is a phrase that emphasizes the need to excel at fundamentals. We covered many PPC marketing fundamentals in the first two segments, and you should always strive to block and tackle your way to success. However, don’t let the routine of blocking and tackling impede your creative and innovative side. Constantly remind yourself that the end goal is customer acquisition, and your ongoing challenge is to constantly build a better mousetrap.
This shows the number of pages indexed by Google that match your keyword search. If your search is very general (such as “tulips”) you will get more pages of results than if you type something very specific. Of course, probably no one in the history of the Internet has ever paged through these to see the last page of results when there are thousands of pages of results. Most users stick to the first page of results, which is why your goal as a search engine optimizer should be to get on the first page of results. If users aren’t finding what they are looking for, instead of continuing to page through dozens of SERPs, they are more likely to refine their search phrase to make it more specific or better match their intention.
The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile ready site is critical to your online presence. In fact, starting in late 2016, Google has begun experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
The map and business listing are the only results on this SERP that are not explicitly paid results. This map is shown based on a user’s location, and features listings for local businesses that have set up their free Google My Business listing. Google My Business is a free directory of companies that can help smaller local businesses increase their visibility to searchers based on geolocation, a particularly important feature on mobile. Read this blog post for more information on Google My Business.
A PageRank results from a mathematical algorithm based on the webgraph, created by all World Wide Web pages as nodes and hyperlinks as edges, taking into consideration authority hubs such as cnn.com or usa.gov. The rank value indicates the importance of a particular page. A hyperlink to a page counts as a vote of support. The PageRank of a page is defined recursively and depends on the number and PageRank metric of all pages that link to it ("incoming links"). A page that is linked to by many pages with high PageRank receives a high rank itself.
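That recursive definition is usually written as follows (the standard textbook form; normalization conventions vary slightly between sources), where N is the number of pages, M(p_i) is the set of pages linking to p_i, L(p_j) is the number of outbound links on page p_j, and d is the damping factor:

$$PR(p_i) = \frac{1-d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}$$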
With this, appearing in Google’s local pack is now more important than ever. In 2014, Mediative conducted eye-tracking research studying where users look on Google’s SERP. The study showed that users often focus their attention near the top of the page, on the local search results, and on the first organic search result. In addition, several studies have concluded that organic search listings receive more than 90% of the clicks, with users favoring local search results the most.
Social Media Marketing - The term 'Digital Marketing' has a number of marketing facets, as it supports different channels, and among these is social media. When we use social media channels (Facebook, Twitter, Pinterest, Instagram, Google+, etc.) to market a product or service, the strategy is called Social Media Marketing. It is a procedure wherein strategies are made and executed to draw traffic to a website or to gain the attention of buyers on the web using different social media platforms.
Instead of relying on a group of editors or solely on the frequency with which certain terms appear, Google ranks every web page using a breakthrough technique called PageRank™. PageRank evaluates all of the sites linking to a web page and assigns them a value, based in part on the sites linking to them. By analyzing the full structure of the web, Google is able to determine which sites have been “voted” the best sources of information by those most interested in the information they offer.

Totally agree — more does not always equal better. Google takes a sort of ‘Birds of a Feather’ approach when analyzing inbound links, so it’s really all about associating yourself (via inbound links) with websites Google deems high quality and trustworthy so that Google deems YOUR web page high quality and trustworthy. As you mentioned, trying to cut corners, buy links, do one-for-one trades, or otherwise game/manipulate the system never works. The algorithm is too smart.
Size: (green) The size of the text portion of the web page. It is omitted for sites not yet indexed. In the screen shot, “5k” means that the text portion of the web page is 5 kilobytes. One kilobyte is 1,024 (2^10) bytes. One byte typically holds one character. In general, the average size of a word is six characters, so each 1k of text is about 170 words. A page containing 5k of text is thus about 850 words long.
WordStream Advisor, our intuitive, centralized digital marketing management platform, makes online advertising easy. With full integration with Facebook Advertising, a suite of specialized keyword research and diagnostic tools, and intuitive, customized reporting, WordStream Advisor gives you everything you need to own the SERP and grow your business through digital marketing.
Depending on the number of Web pages that contain a particular word or phrase, a SERP might show anywhere from zero (in the case of no matches at all) to millions of items. For example, entering the phrase "complex-number admittance" into the Google search engine yields few results. In contrast, entering the single word "hurricane" yields millions of results.
Google’s SERPs can show various elements: the search results themselves (so-called snippets), a knowledge graph, a featured snippet, an answer box, images, shopping results and more. Depending on the type of query and the data Google finds, some of these elements will show up. You can add data to your page, so Google can show a ‘rich’ snippet, providing more information about your product or recipe, for instance.
This definition emphasizes the focus of marketing on the customer, while at the same time implying a need to link to other business operations to achieve this profitability. Yet it's a weak definition in relation to digital marketing, since it doesn't emphasize communications, which are so important to digital marketing. In Digital Marketing Excellence, my co-author PR Smith and I note that digital marketing can be used to support these aims as follows:
Not all links are counted by Google. For instance, they filter out links from known link farms. Some links can cause a site to be penalized by Google. They rightly figure that webmasters cannot control which sites link to their sites, but they can control which sites they link out to. For this reason, links into a site cannot harm the site, but links from a site can be harmful if they link to penalized sites. So be careful which sites you link to. If a site has PR0, it is usually a penalty, and it would be unwise to link to it.
Facebook Ads offers an unparalleled targeting system (and also allows you to advertise on Instagram). It has two main strengths: retargeting based on segmented marketing and custom audiences, and the ability to introduce your brand to customers who didn’t know they wanted it. Google AdWords is all about demand harvesting, while Facebook Ads is all about demand generation.
It is clear that something new should emerge to fill that void. It is sometimes believed that search engines may use so-called implied links to rank a page. Implied links are, for example, references to your brand. They usually come with a tone: positive, neutral, or negative. The tone defines the reputation of your site, and this reputation serves as a ranking signal to search engines.
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.
Google spiders the directories just like any other site and their pages have decent PageRank and so they are good inbound links to have. In the case of the ODP, Google’s directory is a copy of the ODP directory. Each time that sites are added and dropped from the ODP, they are added and dropped from Google’s directory when they next update it. The entry in Google’s directory is yet another good, PageRank boosting, inbound link. Also, the ODP data is used for searches on a myriad of websites – more inbound links!
An omni-channel approach not only benefits consumers but also benefits business bottom line: Research suggests that customers spend more than double when purchasing through an omni-channel retailer as opposed to a single-channel retailer, and are often more loyal. This could be due to the ease of purchase and the wider availability of products.[24]
Another disadvantage is that even an individual or small group of people can harm the image of an established brand. For instance, "Doppelganger" is a term used for a disparaging image of a brand that is spread by anti-brand activists, bloggers, and opinion leaders. The word Doppelganger is a combination of two German words, Doppel (double) and Gänger (walker), so it means double walker, or, as it is said in English, alter ego. Generally, a brand creates an image for itself to emotionally appeal to its customers. However, some disagree with this image and alter it, presenting it in a funny or cynical way, thereby distorting the brand image and creating a Doppelganger image, blog or content (Rindfleisch, 2016).

You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.

I think Google will always be working to discern and deliver “quality, trustworthy” content and I think analyzing inbound links as endorsements is a solid tool the SE won’t be sunsetting anytime soon. Why would they? If the president of the United States links to your page that is undoubtedly an endorsement that tells Google you’re a legitimate trusted source. I know that is an extreme example, but I think it illustrates the principles of a linking-as-endorsement model well.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]