In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings from the practice, but Google implemented a new system that punishes sites whose content is not unique.

The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings. Although Penguin has been presented as an algorithm aimed at fighting web spam in general, it really focuses on spammy links, gauging the quality of the sites those links come from.

The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing falls under the newly recognized term of 'conversational search', where the system pays attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words. As for the changes this made to search engine optimization, Hummingbird is intended to resolve issues for content publishers and writers by filtering out irrelevant content and spam, allowing Google to surface high-quality content and rely on its creators as 'trusted' authors.
Simply put, search engine optimization (SEO) is the process of optimizing the content, technical set-up, and reach of your website so that your pages appear at the top of a search engine result for a specific set of keyword terms. Ultimately, the goal is to attract visitors to your website when they search for products, services, or information related to your business.
The formula uses a model of a random surfer who gets bored after several clicks and switches to a random page. The PageRank value of a page reflects the chance that the random surfer will land on that page by clicking on a link. It can be understood as a Markov chain in which the states are pages and the transitions are the links between pages, with all outgoing links from a given page being equally probable.
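The random-surfer model above can be computed by simple iteration: start every page with equal rank, then repeatedly redistribute rank along links, mixing in the "bored surfer" random jump. The sketch below is a minimal illustration, assuming the commonly cited damping factor of 0.85 and a hypothetical three-page link graph; it is not Google's actual implementation.

```python
# Minimal PageRank sketch based on the random-surfer model.
# The damping factor (0.85) and the toy link graph are assumptions.

def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        # With probability (1 - d) the surfer gets bored and jumps anywhere.
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank everywhere
                for p in pages:
                    new_rank[p] += d * rank[page] / n
            else:
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share  # each link passes an equal share
        rank = new_rank
    return rank

# Toy graph: A links to B; B links to A and C; C links back to A.
ranks = pagerank({"A": ["B"], "B": ["A", "C"], "C": ["A"]})
print({p: round(r, 3) for p, r in ranks.items()})
```

Because A receives links from both B and C while C receives only half of B's vote, A ends up with the highest rank, matching the intuition that more (and better-placed) links mean more importance.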
The Google algorithm's most important feature is arguably the PageRank system, a patented automated process that determines where each result appears on Google's search engine results page. Most users tend to concentrate on the first few search results, so a spot at the top of the list usually means more user traffic. So how does Google determine search result standings? Many people have taken a stab at figuring out the exact formula, but Google keeps the official algorithm a secret. What we do know is this:
Paid search functions as an auction. Advertisers bid on keywords that are relevant to their business that can trigger the display of their ads when users search for those terms. A wide range of factors determine where an ad will be shown on the SERP. Some ads might be displayed above the organic search results (such as the Lowe’s, Craftsman, and Husqvarna examples in the “lawnmowers” SERP example above), whereas others may be shown to the right of the organic results. Some advertisers choose to limit the display of their ads to mobile searches only, whereas others exclude mobile results altogether. Some ads feature extensions, and some do not.
Let’s start with what Google says. In a nutshell, it considers links to be like votes. In addition, it considers that some votes are more important than others. PageRank is Google’s system of counting link votes and determining which pages are most important based on them. These scores are then used along with many other things to determine if a page will rank well in a search.
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.
And now it's 2016: a time when an effective local business search campaign needs to include all of these tactics, not just one. Half-baked efforts won't work anymore. A deep dive into each individual marketing tactic will show you that what was enough to maintain a decent search presence in the past won't produce the same outcome and potential ROI today. Local business owners need to employ these tactics correctly from the beginning and invest the time and money to build their brand online in an effort to achieve what I call "The Ideal SERP."
In order to provide the best possible search experience for its users, Google continues to push for better local content from businesses. Regardless of future algorithm updates from Google, investing in high quality and locally relevant content will help businesses compete in the local pack and rank higher on search engine results pages. To learn more about Google’s recent removal of right rail ads, check out our whitepaper: Goodbye Right Rail: What Google Paid Search Changes Mean for Local Marketers.
Where do you start if you want to develop a digital marketing strategy? It's a common challenge since many businesses know how vital digital and mobile channels are today for acquiring and retaining customers. Yet they don't have an integrated plan to grow and engage their audiences effectively. They suffer from the 10 problems I highlight later in this article and are losing out to competitors.
Now, how much weight does PageRank carry? Like almost every other part of the algorithm, it's debatable. If we listed all the ranking factors, I suspect it wouldn't be in the top five, but it's important to remember that the key to ranking well is to be LESS IMPERFECT than your competition, i.e., to have more of the right things sending the right signals in the right places so that Google sees you as a better, more relevant candidate for the top three on page one. If you and your competitor have both optimized (on-page and technically) for the same keyword phrase perfectly, PageRank could be the deal breaker that pushes your blue link an inch up.
Another challenge is the sheer scope and scale of digital marketing. There are so many great digital marketing techniques, ranging from search, social, and email marketing to improving the digital experience of your website. Our article, What is digital marketing?, shows how our RACE planning framework lets you define a more manageable number of digital marketing activities that cover the full customer journey. Within each digital marketing technique there are many detailed tactics that are important to success, so they need to be evaluated and prioritized: for example, from dynamic content for email automation and website personalization to programmatic advertising, retargeting, and skyscraper content for organic search.
In the flat-rate model, the advertiser and publisher agree upon a fixed amount that will be paid for each click. In many cases the publisher has a rate card that lists the pay-per-click (PPC) within different areas of their website or network. These various amounts are often related to the content on pages, with content that generally attracts more valuable visitors having a higher PPC than content that attracts less valuable visitors. However, in many cases advertisers can negotiate lower rates, especially when committing to a long-term or high-value contract.
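The flat-rate arithmetic above is straightforward: cost is simply clicks multiplied by the agreed per-click rate, minus any negotiated discount. A small sketch, using an entirely hypothetical rate card and discount:

```python
# Hypothetical rate card: fixed per-click prices by site section (flat-rate model).
rate_card = {
    "homepage": 0.75,        # broad audience, lower value per visitor
    "product_reviews": 2.50, # high purchase intent, so a higher PPC
}

def campaign_cost(section, clicks, negotiated_discount=0.0):
    """Cost = clicks x flat rate, optionally reduced by a negotiated discount."""
    rate = rate_card[section] * (1.0 - negotiated_discount)
    return clicks * rate

# 1,000 clicks on review pages with a 10% discount for a long-term contract.
print(campaign_cost("product_reviews", 1000, negotiated_discount=0.10))  # 2250.0
```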
More specifically, who gets to appear on the page, and where, is based on an advertiser's Ad Rank, a metric calculated by multiplying two key factors: CPC bid (the highest amount an advertiser is willing to spend) and Quality Score (a value that takes into account your click-through rate, relevance, and landing page quality, among other factors). In turn, your Quality Score affects your actual cost per click, or CPC.
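The auction mechanics can be sketched numerically. This is a simplified model based on the classic published formula (Ad Rank = bid × Quality Score, and the winner pays just enough to beat the ad below: next Ad Rank divided by its own Quality Score, plus one cent); the advertiser names echo the earlier "lawnmowers" example, but the bids and scores are invented for illustration.

```python
# Simplified Ad Rank auction. All bids and Quality Scores are hypothetical.

advertisers = [
    {"name": "Lowes",     "bid": 3.00, "qs": 8},
    {"name": "Craftsman", "bid": 4.00, "qs": 5},
    {"name": "Husqvarna", "bid": 2.00, "qs": 9},
]

# Ad Rank = CPC bid x Quality Score.
for a in advertisers:
    a["ad_rank"] = a["bid"] * a["qs"]

ranked = sorted(advertisers, key=lambda a: a["ad_rank"], reverse=True)

for pos, a in enumerate(ranked):
    if pos + 1 < len(ranked):
        # Pay just enough to beat the next ad down, plus $0.01.
        a["actual_cpc"] = round(ranked[pos + 1]["ad_rank"] / a["qs"] + 0.01, 2)
    else:
        a["actual_cpc"] = a["bid"]  # simplification for the last position
    print(pos + 1, a["name"], a["ad_rank"], a["actual_cpc"])
```

Note how the effect described in the text falls out of the numbers: Lowes wins the top spot with a lower bid than Craftsman because its higher Quality Score inflates its Ad Rank, and it pays a lower actual CPC ($2.51 vs. $3.61) for a better position.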
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
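The spider/indexer pipeline described above can be sketched in a few lines: download a page, record which words appear where (an inverted index), and queue the extracted links for a later crawl. This is a toy illustration using only the Python standard library; the URL and HTML are placeholders, and real crawlers add politeness rules, deduplication, and far more.

```python
# Minimal crawl-and-index sketch: a "spider" supplies a page's HTML, an
# "indexer" records word locations, and discovered links go to a schedule
# for crawling at a later date. URL and HTML below are illustrative.

from collections import defaultdict
from html.parser import HTMLParser

class LinkAndTextParser(HTMLParser):
    """Extracts outgoing links and the page's words, in order."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":  # collect links to other pages
            self.links += [v for k, v in attrs if k == "href"]
    def handle_data(self, data):
        self.words += data.lower().split()

def index_page(url, html, inverted_index, schedule):
    parser = LinkAndTextParser()
    parser.feed(html)
    for position, word in enumerate(parser.words):
        inverted_index[word].append((url, position))  # word -> where it occurs
    schedule.extend(parser.links)  # queue discovered links for a later crawl

inverted_index, schedule = defaultdict(list), []
html = '<html><body>Lawn mowers on sale. <a href="/reviews">Reviews</a></body></html>'
index_page("https://example.com/", html, inverted_index, schedule)
print(schedule)                  # ['/reviews']
print(inverted_index["mowers"])  # [('https://example.com/', 1)]
```

Storing word positions (not just presence) is what lets an engine weight matches by location, as the text notes; the schedule list plays the role of the crawler's queue of pages to visit next.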