PageRank gets its name from Google cofounder Larry Page. You can read the original paper on the ranking system used to calculate PageRank here, if you want. Check out the original paper about how Google worked here, while you're at it. But for dissecting how Google works today, these documents from 1998 and 2000 won't help you much. Still, they've been pored over, analyzed, and unfortunately sometimes spouted as the gospel of how Google operates now.
It is easy to think of our site as a small, self-contained network of pages: when we do the PageRank calculations, we are dealing with our small network; if we make a link to another site, we lose some of our network's PageRank, and if we receive a link, our network gains some. But it isn't like that. For the PageRank calculations there is only one network: every page that Google has in its index. Each iteration of the calculation is done on the entire network, not on individual websites.
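To make that concrete, here is a minimal power-iteration sketch in Python. The three-page toy "web", the damping factor of 0.85, and the fixed iteration count are illustrative assumptions, not Google's actual parameters or code:

```python
# A minimal PageRank power-iteration sketch. The toy graph, damping
# factor d=0.85, and 50 iterations are assumptions for illustration.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_ranks = {}
        for page in pages:
            # Rank flowing in from every page that links here, split
            # evenly across each linking page's outbound links.
            incoming = sum(ranks[src] / len(outs)
                           for src, outs in links.items() if page in outs)
            new_ranks[page] = (1 - d) / n + d * incoming
        ranks = new_ranks  # the whole network updates in one pass
    return ranks

# Two tiny "sites" in one graph; the iteration treats them as a single
# network, so a cross-site link is just another edge.
toy_web = {
    "ours/home":   ["ours/about", "theirs/home"],
    "ours/about":  ["ours/home"],
    "theirs/home": ["ours/home"],
}
print(pagerank(toy_web))
```

Notice that two "sites" appear in the toy graph, but the calculation never distinguishes them: every page's score is recomputed in the same pass over the one network.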

Since heading tags typically make the text inside them larger than the normal text on the page, they give users a visual cue that this text is important and can help them understand something about the type of content underneath the heading. Multiple heading sizes, used in order, create a hierarchical structure for your content, making it easier for users to navigate through your document.
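As a rough illustration of that hierarchy, here is a small Python sketch (standard library only) that prints the outline implied by a page's h1 to h6 tags; the sample page is invented for the example:

```python
# A sketch that extracts the outline implied by h1-h6 headings, using
# only the standard library. The sample page is invented.
from html.parser import HTMLParser

class OutlineParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.level = 0          # heading level we are currently inside
        self.outline = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.level = int(tag[1])

    def handle_data(self, data):
        if self.level:
            # Indent each heading by its level to expose the hierarchy.
            self.outline.append("  " * (self.level - 1) + data.strip())
            self.level = 0

page = "<h1>PageRank</h1><h2>History</h2><h2>Algorithm</h2><h3>Damping</h3>"
parser = OutlineParser()
parser.feed(page)
print("\n".join(parser.outline))
```

Run on the sample, it prints each heading indented by its level, which is essentially the structure users skim when navigating a document.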


In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting via the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, using nofollow caused the PageRank that would have flowed through those links to simply evaporate. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript, thus still permitting PageRank sculpting. Several further solutions have been suggested, including the use of iframes, Flash, and JavaScript.[31]


Pay-per-click is commonly associated with first-tier search engines (such as Google AdWords and Microsoft Bing Ads). With search engines, advertisers typically bid on keyword phrases relevant to their target market; content sites, in contrast, commonly charge a fixed price per click rather than use a bidding system. PPC "display" advertisements, also known as "banner" ads, are shown on websites with related content that have agreed to show ads, and are typically not pay-per-click advertising. Social networks such as Facebook and Twitter have also adopted pay-per-click as one of their advertising models.
My recent blog post on digital marketing trends covers the latest innovations, but here we go back to basics to define digital marketing. This is important since, for some in business, particularly more traditional marketers or business owners, 'digital' is simplistically taken to mean 'our website' or 'our Facebook page'. That thinking limits the scope and opportunity of what's managed, and it means that activities that should be managed may be missed.
PPC is an advertising method in which a company places paid text or display advertisements on search engine results pages or on other websites. The owner of the ad pays a fee to the host website or search engine, through the advertising management platform, whenever a web user clicks the ad. Each click exposes your business profile, website, goods, or services to the visitor. In essence, you are "buying" visitors who may become clients.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
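As a sketch of how a site might apply this to user-generated content, here is a hypothetical Python helper that forces rel="nofollow" onto comment links. The regex rewriting is illustrative only; a production site would use a proper HTML sanitizer:

```python
# A hypothetical helper that adds rel="nofollow" to links in
# user-submitted comments. Regex rewriting is illustrative only;
# production code should use a real HTML sanitizer.
import re

def nofollow_links(comment_html):
    # Target <a ...> tags that do not already carry a rel attribute.
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ',
                  comment_html, flags=re.IGNORECASE)

spam = '<a href="http://spammy.example">cheap links!</a>'
print(nofollow_links(spam))
# -> <a rel="nofollow" href="http://spammy.example">cheap links!</a>
```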
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
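The advisory nature of robots.txt is easy to demonstrate: Python's standard urllib.robotparser is what a well-behaved crawler uses to consult the rules, and nothing else enforces them. The example.com URLs below are placeholders:

```python
# urllib.robotparser is how a *well-behaved* crawler consults
# robots.txt; nothing here stops a rogue client from fetching the page
# anyway. The example.com URLs are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the rules

# A polite crawler checks before requesting a URL...
print(rp.can_fetch("GoodBot", "https://www.example.com/private/"))

# ...but the server will still answer a direct request, e.g.
# urllib.request.urlopen("https://www.example.com/private/"),
# regardless of what robots.txt says.
```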
PageRank as a visible score has been dying a slow death since around 2010, I'd say. Pulling it from the Google Toolbar makes it official and puts the final nail in the visible PageRank score's coffin. Few people were still viewing it within Internet Explorer, itself a deprecated browser. The real impact of dropping it from the toolbar is that third parties can no longer find ways to pull those scores automatically.
Digital marketing activity is still growing across the world, according to the headline Global Marketing Index. A study published in September 2018 found that global outlays on digital marketing tactics are approaching $100 billion.[40] Digital media continues to grow rapidly; while marketing budgets are expanding, traditional media is declining (World Economics, 2015).[41] Digital media helps brands reach consumers and engage with their product or service in a personalised way. Five areas outlined as current industry practices that are often ineffective are: prioritizing clicks; balancing search and display; understanding mobiles; targeting, viewability, brand safety, and invalid traffic; and cross-platform measurement (Whiteside, 2016).[42] Why these practices are ineffective, and some ways to make them effective, are discussed around the following points.

Ok, this is some pretty high-level stuff here, but there is still a lot of good information. I remember when I was able to see the PageRank of a website on my Google Toolbar; I never really fully understood what it meant until later. What a person should focus on is good SEO practices that make search engines naturally want to feature your content in their search results.
The priority given to the placement of a link on the results page of a Web search. For example, Google's PageRank system, named after co-founder Larry Page, classifies a site based on the number of links that point to it from other sites (the "backlinks"). The concept is that if very prominent sites link to a site, the site has greater value. The more popular the backlink sites themselves are, the higher the ranking as well.
Google reasons that if your site has been linked to many times, it's because you're doing something good. For Google, links are a sign that people like what you do: your content is useful, high-quality, and relevant, and therefore you must have a certain authority, or be a quality reference, in the area you specialize in. That's why people cite your site or content.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
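As a sketch of that kind of automation, here is a hypothetical Python helper that builds a description meta tag from a page's text; the 155-character budget and the sample body text are assumptions for illustration:

```python
# A hypothetical generator for description meta tags; the
# 155-character budget and the sample body text are assumptions.
import html
import re

def make_meta_description(page_text, limit=155):
    """Collapse whitespace, truncate at a word boundary, escape quotes."""
    text = re.sub(r"\s+", " ", page_text).strip()
    if len(text) > limit:
        cut = text.rfind(" ", 0, limit)        # avoid chopping mid-word
        text = text[:cut if cut > 0 else limit].rstrip() + "…"
    return html.escape(text, quote=True)       # safe inside content="..."

body = "PageRank is a way of measuring the importance of website pages " * 5
print('<meta name="description" content="%s">'
      % make_meta_description(body))
```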

Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."