Get a link to your pages from a high-PR page and, yes, some of that PageRank importance is transmitted to your page. But that doesn't take into account the context of the link: the words in the link, the anchor text. If you don't understand anchor text, my post from last month, Google Now Reporting Anchor Text Phrases, will take you by the hand and explain it in more detail.
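To make the "importance is transmitted" idea concrete, here is a minimal PageRank power-iteration sketch over a hypothetical three-page link graph. The graph and iteration count are purely illustrative; only the 0.85 damping factor comes from the original PageRank paper.

```python
# Minimal PageRank power iteration over a hypothetical three-page link
# graph, illustrating how importance flows along links. The graph is
# made up for this example; 0.85 is the damping factor from the
# original PageRank paper.

links = {            # page -> pages it links out to (hypothetical graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the ranks settle
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

print(rank)  # C collects the most rank: two pages link to it
```

Note that a link's weight depends on the linking page's own rank and how many other links it casts, which is why one link from a high-PR page can outweigh many links from obscure ones.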
Large web pages are far less likely to be relevant to your query than smaller pages. For the sake of efficiency, Google searches only the first 101 kilobytes (approximately 17,000 words) of a web page and the first 120 kilobytes of a PDF file. Assuming 15 words per line and 50 lines per page, Google searches the first 22 pages of a web page and the first 26 pages of a PDF file. If a page is larger, Google will list the page as being 101 kilobytes, or 120 kilobytes for a PDF file. This means that Google's results won't reference any part of a web page beyond its first 101 kilobytes, or any part of a PDF file beyond the first 120 kilobytes.
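Those page counts follow directly from the stated assumptions. A quick back-of-envelope check, treating a kilobyte as 1,000 bytes and a word as roughly 6 bytes (which is what the 17,000-word figure implies), reproduces them:

```python
# Back-of-envelope check of the page estimates above. The constants are
# this article's assumptions, not figures published by Google.

BYTES_PER_WORD = 6            # implied by 101 KB ~= 17,000 words
WORDS_PER_PAGE = 15 * 50      # 15 words per line, 50 lines per page

def indexed_pages(cap_kb: int) -> int:
    """Whole printed pages covered by an indexing cap of cap_kb kilobytes."""
    words = cap_kb * 1000 / BYTES_PER_WORD
    return int(words / WORDS_PER_PAGE)

print(indexed_pages(101))  # 22 pages for a web page
print(indexed_pages(120))  # 26 pages for a PDF file
```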

Building site authority and trust (off-site optimization) is one of the most critical search engine ranking signals. Search engines measure the popularity, trust, and authority of your website by the number and quality of websites linking to your site. We work with our clients to develop an SEO strategy that stimulates organic link acquisition, and we supplement that strategy with additional services. Our content and editorial marketing identifies the highest-quality websites relevant to your business and positions you organically on authoritative, trusted sites.


In the example above (a SERP for the search query “lawnmowers”), all of the results on the SERP – with the exception of the map and business listing beneath it – are paid results. The three large text-based ads at the top of the SERP (considered prime positioning for advertisers) are typical PPC ads. Of those three ads, the lower two (for Craftsman.com and Husqvarna.com) both feature ad extensions allowing prospective customers to navigate to specific pages on their websites directly from the ads.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
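The crawl-then-index pipeline described above can be illustrated with a toy sketch. This is a deliberately simplified illustration, not how any real search engine is built; the seed URL is hypothetical and real indexers parse HTML far more carefully than a regular expression does.

```python
# Toy illustration of the spider/indexer split described above.
# Simplified sketch only: real crawlers parse HTML properly, respect
# robots.txt, and weight terms by markup (titles, headings, anchors).

import re
from collections import defaultdict
from urllib.parse import urljoin
from urllib.request import urlopen

def crawl(url: str) -> str:
    """The 'spider': download a page and return its raw HTML."""
    with urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

def extract_links(html: str, base_url: str) -> list[str]:
    """Pull href targets out of the page so they can be scheduled later."""
    return [urljoin(base_url, href)
            for href in re.findall(r'href="([^"]+)"', html)]

def index(html: str) -> dict[str, list[int]]:
    """The 'indexer': record each word and the positions where it occurs."""
    positions = defaultdict(list)
    for pos, word in enumerate(re.findall(r"[a-z0-9]+", html.lower())):
        positions[word].append(pos)
    return positions

seed = "https://example.com/"            # hypothetical starting point
html = crawl(seed)
schedule = extract_links(html, seed)     # links queued for a later crawl
postings = index(html)                   # word -> positions on the page
```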