And now it’s 2016, a time when an effective local business search campaign needs to include all of these tactics, not just one. Half-baked efforts won't work anymore. A deep dive into each individual marketing tactic will show you that what was enough to maintain a decent search presence in the past won't produce the same outcome and potential ROI today. Local business owners need to employ these tactics correctly from the beginning and invest the time and money to build their brand online in an effort to achieve what I call “The Ideal SERP.”
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
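As a rough illustration of how a well-behaved crawler applies these rules, the short Python sketch below uses the standard library's urllib.robotparser against a made-up robots.txt that blocks a shopping cart and internal search results. The rules, paths and URLs are invented for the example, not taken from any real site.

# Minimal sketch: how a crawler might check robots.txt before fetching a page.
# The rules and URLs below are hypothetical examples.
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Internal search results and shopping-cart pages are disallowed for all robots.
print(parser.can_fetch("*", "https://example.com/search?q=widgets"))   # False
print(parser.can_fetch("*", "https://example.com/cart/checkout"))      # False
print(parser.can_fetch("*", "https://example.com/products/widget"))    # True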
If the PageRank value differences between PR1, PR2, …, PR10 were equal, then that conclusion would hold up, but many people believe that the values between PR1 and PR10 (the maximum) are set on a logarithmic scale, and there is very good reason for believing it. Nobody outside Google knows for sure one way or the other, but the chances are high that the scale is logarithmic, or similar. If so, it means that it takes a lot more additional PageRank for a page to move up to the next PageRank level than it did to move up from the previous PageRank level. The result is that it reverses the previous conclusion, so that a link from a PR8 page that has lots of outbound links is worth more than a link from a PR4 page that has only a few outbound links.
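To make that arithmetic concrete, here is a small illustration in Python. The base of 5 per PageRank level and the outbound-link counts are assumed purely for the sake of the example, since Google has never published the actual scale.

# Illustrative only: assume each toolbar PageRank level represents roughly
# five times the underlying PageRank of the level below (the real base is unknown).
BASE = 5

def link_value(toolbar_pr, outbound_links):
    # Rough value passed by one link: the page's (assumed) raw PageRank
    # divided equally among its outbound links.
    return (BASE ** toolbar_pr) / outbound_links

pr8_link = link_value(8, 100)  # PR8 page with many outbound links: 3906.25
pr4_link = link_value(4, 5)    # PR4 page with only a few links: 125.0
print(pr8_link > pr4_link)     # True: the PR8 link still passes far more value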
Imagine that you've created the definitive Web site on a subject -- we'll use skydiving as an example. Your site is so new that it's not even listed on any SERPs yet, so your first step is to submit your site to search engines like Google and Yahoo. The Web pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Even with the best information about skydiving on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up going to inferior Web sites because yours isn't in the top results.
Search engine result pages are protected from automated access by a range of defensive mechanisms and by the terms of service.[10] Because these result pages are the primary data source for SEO companies, website placement for competitive keywords has become an important field of business and interest. Google has even used Twitter to warn users against this practice.[11]
The term Digital Marketing was first coined in the 1990s.[10] With the debut of server/client architecture and the popularity of personal computers, Customer Relationship Management (CRM) applications became a significant part of marketing technology.[citation needed] Fierce competition forced vendors to include more services in their software, for example marketing, sales, and service applications. After the Internet was born, marketers were also able to manage huge volumes of online customer data through eCRM software. Companies could update the data on customer needs and identify the priorities of their experience. This led to the first clickable banner ad going live in 1994, the "You Will" campaign by AT&T; over the first four months it was live, 44% of all people who saw it clicked on the ad.[11]
PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at considerably more than the sheer volume of votes, or links a page receives; for example, it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.” Using these and other factors, Google provides its views on pages’ relative importance.
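The voting idea can be sketched as a toy computation. The three-page graph, damping factor of 0.85, and iteration count below are illustrative assumptions for the classic power-iteration formulation, not Google's actual implementation.

# Toy PageRank power iteration over a hypothetical three-page web.
links = {
    "A": ["B", "C"],  # page A links to ("votes for") B and C
    "B": ["C"],
    "C": ["A"],
}
damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):
    new_rank = {}
    for p in pages:
        # A page's score is the sum of the votes it receives; each voter
        # splits its own score evenly across its outbound links.
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

print({p: round(r, 3) for p, r in rank.items()})
# Page C ends up with the highest score: it receives "votes" from both A and B.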

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]